Commit b085cc8

deploy: aa67888
0 parents  commit b085cc8

26 files changed: +8629 −0 lines changed

.nojekyll

Whitespace-only changes.

core.html

Lines changed: 1467 additions & 0 deletions
Large diffs are not rendered by default.

core.html.md

Lines changed: 1073 additions & 0 deletions
Large diffs are not rendered by default.

index.html

Lines changed: 796 additions & 0 deletions
Large diffs are not rendered by default.

index.html.md

Lines changed: 250 additions & 0 deletions
@@ -0,0 +1,250 @@
# msglm

<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->

### Installation

Install the latest version from PyPI:

``` sh
$ pip install msglm
```

## Usage

To use an LLM we need to structure our messages in a particular format.

Here’s an example of a text chat from the OpenAI docs.

``` python
from openai import OpenAI
client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "What's the Wild Atlantic Way?"}
    ]
)
```

Generating the correct format for a particular API can get tedious. The goal of *msglm* is to make it easier.
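
For instance, the call above could be written as follows (a minimal sketch, assuming the generic `mk_msg` helper used in the image examples below is importable from the package root):

``` python
from msglm import mk_msg
from openai import OpenAI

client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o",
    # mk_msg builds the {"role": "user", "content": ...} dict for the given api
    messages=[mk_msg("What's the Wild Atlantic Way?", api="openai")]
)
```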

The examples below will show you how to use *msglm* for text and image chats with OpenAI and Anthropic.
37+
38+
### Text Chats
39+
40+
For a text chat simply pass a list of strings and the api format
41+
(e.g. “openai”) to **mk_msgs** and it will generate the correct format.
42+
43+
``` python
44+
mk_msgs(["Hello, world!", "some assistant response"], api="openai")
45+
```
46+
47+
``` js
48+
[
49+
{"role": "user", "content": "Hello, world!"},
50+
{"role": "assistant", "content": "Some assistant response"}
51+
]
52+
```

#### anthropic

``` python
from msglm import mk_msgs_anthropic as mk_msgs
from anthropic import Anthropic
client = Anthropic()

r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.content[0].text)
```

#### openai

``` python
from msglm import mk_msgs_openai as mk_msgs
from openai import OpenAI

client = OpenAI()
r = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=mk_msgs(["Hello, world!", "some LLM response"])
)
print(r.choices[0].message.content)
```
83+
### Image Chats
84+
85+
For an image chat simply pass the raw image bytes in a list with your
86+
question to *mk_msgs* and it will generate the correct format.
87+
88+
``` python
89+
mk_msg([img, "What's in this image?"], api="anthropic")
90+
```
91+
92+
``` js
93+
[
94+
{
95+
"role": "user",
96+
"content": [
97+
{"type": "image", "source": {"type": "base64", "media_type": media_type, "data": img}}
98+
{"type": "text", "text": "What's in this image?"}
99+
]
100+
}
101+
]
102+
```

#### anthropic

``` python
import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic()

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

r = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[mk_msg([img, "Describe the image"])]
)
print(r.content[0].text)
```

#### openai

``` python
import httpx
from msglm import mk_msg_openai as mk_msg
from openai import OpenAI

img_url = "https://www.atshq.org/wp-content/uploads/2022/07/shutterstock_1626122512.jpg"
img = httpx.get(img_url).content

client = OpenAI()
r = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[mk_msg([img, "Describe the image"])]
)
print(r.choices[0].message.content)
```

### API Wrappers

To make life a little easier, msglm comes with API-specific wrappers for [`mk_msg`](https://AnswerDotAI.github.io/msglm/core.html#mk_msg) and [`mk_msgs`](https://AnswerDotAI.github.io/msglm/core.html#mk_msgs).

For Anthropic use

``` python
from msglm import mk_msg_anthropic as mk_msg, mk_msgs_anthropic as mk_msgs
```

For OpenAI use

``` python
from msglm import mk_msg_openai as mk_msg, mk_msgs_openai as mk_msgs
```
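
Once aliased, the wrappers are called without the `api` argument, just like the earlier examples. A minimal usage sketch:

``` python
from msglm import mk_msg_anthropic as mk_msg, mk_msgs_anthropic as mk_msgs

mk_msg("Hello, world!")                          # a single user message
mk_msgs(["Hello, world!", "some LLM response"])  # alternating user/assistant turns
```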

### Other use-cases

#### Prompt Caching

*msglm* supports [prompt caching](https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching) for Anthropic models. Simply pass *cache=True* to *mk_msg* or *mk_msgs*.

``` python
from msglm import mk_msg_anthropic as mk_msg

mk_msg("please cache my message", cache=True)
```

This generates the expected cache block below.

``` js
{
    "role": "user",
    "content": [
        {"type": "text", "text": "please cache my message", "cache_control": {"type": "ephemeral"}}
    ]
}
```
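
The same flag works for multi-turn conversations. A minimal sketch, assuming `mk_msgs_anthropic` accepts `cache=True` as noted above:

``` python
from msglm import mk_msgs_anthropic as mk_msgs

# cache=True attaches the same ephemeral cache_control block to the conversation
mk_msgs(["some long document to cache...", "assistant reply", "follow-up question"], cache=True)
```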

#### PDF chats

*msglm* offers PDF [support](https://docs.anthropic.com/en/docs/build-with-claude/pdf-support) for Anthropic. Just like an image chat, all you need to do is pass the raw PDF bytes in a list with your question to *mk_msg* and it will generate the correct format, as shown in the example below.

``` python
import httpx
from msglm import mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'})

url = "https://assets.anthropic.com/m/1cd9d098ac3e6467/original/Claude-3-Model-Card-October-Addendum.pdf"
pdf = httpx.get(url).content

r = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[mk_msg([pdf, "Which model has the highest human preference win rates across each use-case?"])]
)
print(r.content[0].text)
```

Note: this feature is currently in beta, so you’ll need to:

- use the Anthropic beta client
  (e.g. `anthropic.Anthropic(default_headers={'anthropic-beta': 'pdfs-2024-09-25'})`)
- use the `claude-3-5-sonnet-20241022` model

#### Citations

*msglm* supports Anthropic [citations](https://docs.anthropic.com/en/docs/build-with-claude/citations). All you need to do is pass the content of your document to *mk_ant_doc* and then pass the output to *mk_msg* along with your question, as shown in the example below.

``` python
from msglm import mk_ant_doc, mk_msg_anthropic as mk_msg
from anthropic import Anthropic

client = Anthropic()

doc = mk_ant_doc("The grass is green. The sky is blue.", title="My Document")

r = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[mk_msg([doc, "What color is the grass and sky?"])]
)
for o in r.content:
    if c := getattr(o, 'citations', None): print(f"{o.text}. source: {c[0]['cited_text']} from {c[0]['document_title']}")
    else: print(o.text)
```

*Note: The citations feature is currently available on Claude 3.5 Sonnet (new) and 3.5 Haiku.*

### Summary

We hope *msglm* will make your life a little easier when chatting to LLMs. To learn more about the package, please read this [doc](https://answerdotai.github.io/msglm/).

robots.txt

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
Sitemap: https://AnswerDotAI.github.io/msglm/sitemap.xml

search.json

Lines changed: 62 additions & 0 deletions
Large diffs are not rendered by default.

site_libs/bootstrap/bootstrap-2c2b2e643518224cf2d75a643cacf640.min.css

Lines changed: 12 additions & 0 deletions
Some generated files are not rendered by default.
