1 change: 1 addition & 0 deletions documentation/docs/getting-started/providers.md
@@ -31,6 +31,7 @@ Goose is compatible with a wide range of LLM providers, allowing you to choose a
| [GCP Vertex AI](https://cloud.google.com/vertex-ai) | Google Cloud's Vertex AI platform, supporting Gemini and Claude models. **Credentials must be [configured in advance](https://cloud.google.com/vertex-ai/docs/authentication).** | `GCP_PROJECT_ID`, `GCP_LOCATION` and optionally `GCP_MAX_RATE_LIMIT_RETRIES` (5), `GCP_MAX_OVERLOADED_RETRIES` (5), `GCP_INITIAL_RETRY_INTERVAL_MS` (5000), `GCP_BACKOFF_MULTIPLIER` (2.0), `GCP_MAX_RETRY_INTERVAL_MS` (320_000). |
| [GitHub Copilot](https://docs.github.com/en/copilot/using-github-copilot/ai-models) | Access to GitHub Copilot's chat models including gpt-4o, o1, o3-mini, and Claude models. Uses device code authentication flow for secure access. | Uses GitHub device code authentication flow (no API key needed) |
| [Groq](https://groq.com/) | High-performance inference hardware and tools for LLMs. | `GROQ_API_KEY` |
| [LiteLLM](https://docs.litellm.ai/docs/) | LiteLLM proxy supporting multiple models with automatic prompt caching and unified API access. | `LITELLM_HOST`, `LITELLM_BASE_PATH` (optional), `LITELLM_API_KEY` (optional), `LITELLM_CUSTOM_HEADERS` (optional), `LITELLM_TIMEOUT` (optional) |
| [Ollama](https://ollama.com/) | Local model runner supporting Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](#local-llms).** | `OLLAMA_HOST` |
| [Ramalama](https://ramalama.ai/) | Local model runner using native [OCI](https://opencontainers.org/) container runtimes and [CNCF](https://www.cncf.io/) tools, with support for models as OCI artifacts. The Ramalama API is compatible with Ollama, so it can be used with the Goose Ollama provider. Supports Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](#local-llms).** | `OLLAMA_HOST` |
| [OpenAI](https://platform.openai.com/api-keys) | Provides gpt-4o, o1, and other advanced language models. Also supports OpenAI-compatible endpoints (e.g., self-hosted LLaMA, vLLM, KServe). **o1-mini and o1-preview are not supported because Goose uses tool calling.** | `OPENAI_API_KEY`, `OPENAI_HOST` (optional), `OPENAI_ORGANIZATION` (optional), `OPENAI_PROJECT` (optional), `OPENAI_CUSTOM_HEADERS` (optional) |
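For the providers in the table above that are configured through environment variables, setup usually amounts to exporting the required keys before launching Goose. A minimal sketch, assuming an OpenAI account and a local Ollama server on its default port; the API key value is a placeholder, not a real credential:

```shell
# Hypothetical sketch: configure providers via environment variables.
# The API key below is a placeholder; substitute your own.
export OPENAI_API_KEY="sk-placeholder"

# Optional override for OpenAI-compatible endpoints (e.g., self-hosted vLLM):
export OPENAI_HOST="https://api.openai.com"

# For the Ollama (or Ramalama) provider, point at the local server
# (Ollama listens on port 11434 by default):
export OLLAMA_HOST="http://localhost:11434"
```

Variables marked optional in the table can be omitted; Goose falls back to each provider's defaults.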
5 changes: 5 additions & 0 deletions documentation/docs/guides/interactive-chat/index.md
@@ -42,5 +42,10 @@ import styles from '@site/src/components/Card/styles.module.css';
description="MCP-UI servers return content that Goose Desktop renders as rich, embeddable UI."
link="/blog/2025/08/11/mcp-ui-post-browser-world"
/>
<Card
title="MCP-UI: The Future of Agentic Interfaces"
description="AI agents need to move beyond walls of text to rich and interactive UX."
link="/blog/2025/08/25/mcp-ui-future-agentic-interfaces"
/>
</div>
</div>
18 changes: 11 additions & 7 deletions documentation/docs/guides/recipes/session-recipes.md
@@ -456,15 +456,15 @@ You can turn your current Goose session into a reusable recipe that includes the

<Tabs groupId="interface">
<TabItem value="ui" label="Goose Desktop" default>
Share your recipe with Desktop users by copying the recipe URL from the recipe creation dialog.
Share your recipe with Desktop users by copying the recipe link:

To copy the recipe URL:
1. [Open the recipe](#use-recipe)
2. Click the <Bot className="inline" size={16} /> button at the bottom of the app
3. Click `View/Edit Recipe`
4. Scroll down and copy the link
1. Click the <PanelLeft className="inline" size={16} /> button in the top-left to open the sidebar
2. Click `Recipes` in the sidebar
3. Find your recipe in the Recipe Library
4. Click `Preview` next to the recipe you want to share
5. Under `Deeplink`, click `Copy` and then share the link with others

When someone clicks the URL, it will open Goose Desktop with your recipe configuration. They can also use your recipe URL to [import a recipe](/docs/guides/recipes/storing-recipes#storing-recipes) into their Recipe Library.
When someone clicks the link, it will open Goose Desktop with your recipe configuration. They can also use your recipe link to [import a recipe](/docs/guides/recipes/storing-recipes#storing-recipes) into their Recipe Library for future use.

</TabItem>
<TabItem value="cli" label="Goose CLI">
@@ -477,6 +477,10 @@ You can turn your current Goose session into a reusable recipe that includes the
</TabItem>
</Tabs>

:::info Privacy & Isolation
Each recipient gets their own private session when using your shared recipe; no data is shared between users, and your original session remains unaffected.
:::

## Schedule Recipe
<Tabs groupId="interface">
<TabItem value="ui" label="Goose Desktop" default>
5 changes: 3 additions & 2 deletions documentation/src/components/DesktopProviderSetup.js
@@ -5,8 +5,9 @@ export const DesktopProviderSetup = () => {
<>
<p>On the welcome screen, choose how to configure a provider:</p>
<ul>
<li><strong>OpenRouter</strong> (recommended) - One-click OAuth authentication provides instant access to multiple AI models with built-in rate limiting.</li>
<li><strong>Ollama</strong> - Free local AI that runs privately on your computer. If needed, the setup flow will guide you through installing Ollama and downloading the recommended model.</li>
<li><strong>Tetrate Agent Router</strong> - One-click OAuth authentication provides instant access to multiple AI models, starting credits, and built-in rate limiting.</li>
<li><strong>OpenRouter</strong> - One-click OAuth authentication provides instant access to multiple AI models with built-in rate limiting.</li>
<li><strong>Ollama</strong> - Free local AI that runs privately on your computer. If needed, the setup flow will guide you through installing Ollama and downloading the recommended model. May require powerful hardware.</li>
<li><strong>Other Providers</strong> - Choose from <a href="/goose/docs/getting-started/providers">~20 supported providers</a> including OpenAI, Anthropic, Google Gemini, and others through manual configuration. Be ready to provide your API key.</li>
</ul>
</>