The `autoconfig` reads the desired configuration from `~/.netrc` and creates an appropriate instance of the LLM API. Your `~/.netrc` file must include at least the `provider` and `model` fields under a named service entry. For example:
```
machine thinker
provider provider:bedrock/foundation/converse
model us.anthropic.claude-3-7-sonnet-20250219-v1:0
```
* `provider` specifies the full path to the provider's capability (e.g., `provider:bedrock/foundation/converse`). The path resembles the import path of providers implemented by this library.
* `model` specifies the exact model name as recognized by the provider.
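As an illustration, the entry above could be read into a key/value map with a small hand-rolled scanner. Python's stdlib `netrc` module rejects non-standard tokens such as `provider`, so this sketch parses the file directly; the `parse_netrc` helper and its dict layout are hypothetical, not this library's actual API.

```python
def parse_netrc(text: str) -> dict:
    """Parse a .netrc-style file into {machine: {key: value}}.

    Hypothetical helper for illustration only; the library's
    real autoconfig logic may differ.
    """
    machines: dict = {}
    current = None
    tokens = text.split()
    i = 0
    while i < len(tokens):
        if tokens[i] == "machine":
            # start a new named service entry
            current = tokens[i + 1]
            machines[current] = {}
            i += 2
        elif current is not None:
            # any other token pair is a key/value under the current machine
            machines[current][tokens[i]] = tokens[i + 1]
            i += 2
        else:
            i += 1
    return machines

config = parse_netrc("""
machine thinker
provider provider:bedrock/foundation/converse
model us.anthropic.claude-3-7-sonnet-20250219-v1:0
""")
print(config["thinker"]["provider"])  # provider:bedrock/foundation/converse
```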
Each provider and model family may support additional options. These can also be added under the same `machine` entry and will be passed into the corresponding provider implementation.
```
region // used by Bedrock providers
host // used by OpenAI providers
secret // used by OpenAI providers
timeout // used by OpenAI providers
dimensions // used by embedding families
```
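For instance, a Bedrock entry could carry a region override alongside the required fields. The `region` value below is illustrative only:

```
machine thinker
provider provider:bedrock/foundation/converse
model us.anthropic.claude-3-7-sonnet-20250219-v1:0
region us-east-1
```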
### Example configurations
**For AWS Bedrock**, the `~/.netrc` config is:
```
machine thinker
provider provider:bedrock/foundation/converse
model us.anthropic.claude-3-7-sonnet-20250219-v1:0
```