The `--pid` flag is particularly useful when you want to analyze content
from a different pane. The pane ID is visible in your tmux status bar
(configured earlier).
### Using Alternative Model Providers
ShellSage supports using different LLM providers through base URL
configuration. This allows you to use local models or alternative API
endpoints:
``` sh
# Use a local Ollama endpoint
ssage --provider openai --model llama3.2 --base_url http://localhost:11434/v1 --api_key ollama what is rsync?
# Use together.ai
ssage --provider openai --model mistralai/Mistral-7B-Instruct-v0.3 --base_url https://api.together.xyz/v1 help me with sed # make sure you've set your together API key in your shell_sage conf
```
This is particularly useful for:

- Running models locally for privacy/offline use
- Using alternative hosting providers
- Testing different model implementations
- Accessing specialized model deployments
You can also set these configurations permanently in your ShellSage
config file (`~/.config/shell_sage/shell_sage.conf`). See next section
for details.
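
To persist the local-Ollama setup from the commands above, one option is to write the settings straight into that file. This is a sketch, not part of the official install flow — the key names mirror the CLI flags and the example in the Configuration section, and the heredoc overwrites any existing conf, so back yours up first:

``` sh
# Sketch: persist the local Ollama settings into the ShellSage conf.
# WARNING: this overwrites an existing ~/.config/shell_sage/shell_sage.conf.
mkdir -p ~/.config/shell_sage
cat > ~/.config/shell_sage/shell_sage.conf <<'EOF'
provider = openai
model = llama3.2
base_url = http://localhost:11434/v1
api_key = ollama
EOF
```

After this, plain `ssage what is rsync?` should use the local endpoint without the extra flags.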
## Configuration
ShellSage can be customized through its configuration file located at
`~/.config/shell_sage/shell_sage.conf`. For
example:

```
# Choose your AI model provider
provider = anthropic # or 'openai'
model = claude-3-sonnet # or 'gpt-4o-mini' for OpenAI
base_url = # leave empty to use the default OpenAI endpoint
api_key = # leave empty to default to your OPENAI_API_KEY env var
```