
Commit 56ddc20

Merge pull request #105 from ovchynnikov/dev

llama.cpp API instead of Ollama; JSON request format changed.

2 parents: d255788 + a174ff1

3 files changed, +11 -6 lines changed

.github/workflows/github-actions-push-image.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -45,7 +45,7 @@ jobs:
 
       - name: Build and push Docker image
         id: push
-        uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4
+        uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1
         with:
           context: .
           file: ./Dockerfile
```

src/main.py

Lines changed: 8 additions & 3 deletions
```diff
@@ -458,12 +458,17 @@ async def respond_with_llm_message(update):
     try:
         async with aiohttp.ClientSession() as session:
             async with session.post(
-                f"{LLM_API_ADDR}/api/generate",
-                json={"model": LLM_MODEL, "prompt": prompt, "stream": False, "num_predict": 200},
+                f"{LLM_API_ADDR}/completion",
+                json={
+                    "prompt": prompt,
+                    "n_predict": 200,
+                    "temperature": 0.7,
+                    "stop": ["</s>", "User:", "Assistant:"],
+                },
             ) as response:
                 if response.status == 200:
                     result = await response.json()
-                    bot_response = result.get("response", "Sorry, I couldn't generate a response.")
+                    bot_response = result.get("content", "Sorry, I couldn't generate a response.")
                 else:
                     bot_response = "Sorry, I encountered an error while processing your request."
```
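For context, the shape of the API change above can be exercised outside the bot. The following is a minimal sketch, assuming a llama.cpp server listening at `http://localhost:8080` (the bot reads `LLM_API_ADDR` from its environment); it uses synchronous `urllib` for brevity where the bot itself uses `aiohttp`, and the helper names (`build_completion_payload`, `extract_reply`, `ask_llm`) are illustrative, not part of the repository.

```python
import json
import urllib.request

# Assumed server address for illustration; the bot reads LLM_API_ADDR from env.
LLM_API_ADDR = "http://localhost:8080"

def build_completion_payload(prompt: str) -> dict:
    """JSON body for llama.cpp's /completion endpoint. Unlike Ollama's
    /api/generate, there is no "model" field and the token limit is
    named n_predict rather than num_predict."""
    return {
        "prompt": prompt,
        "n_predict": 200,                         # max tokens to generate
        "temperature": 0.7,
        "stop": ["</s>", "User:", "Assistant:"],  # cut generation at turn markers
    }

def extract_reply(result: dict) -> str:
    """llama.cpp returns the generated text under "content";
    Ollama returned it under "response"."""
    return result.get("content", "Sorry, I couldn't generate a response.")

def ask_llm(prompt: str) -> str:
    """Synchronous stand-in for the bot's aiohttp call."""
    req = urllib.request.Request(
        f"{LLM_API_ADDR}/completion",
        data=json.dumps(build_completion_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        if resp.status != 200:
            return "Sorry, I encountered an error while processing your request."
        return extract_reply(json.load(resp))
```

The key behavioral differences are the endpoint path (`/completion` vs `/api/generate`), the payload field names, and the response key (`content` vs `response`); the stop sequences compensate for llama.cpp not applying a chat template on this raw-completion endpoint.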

src/requirements.txt

Lines changed: 2 additions & 2 deletions
```diff
@@ -1,5 +1,5 @@
 python-telegram-bot[ext]==22.0
 python-dotenv==1.1.0
-yt-dlp==2025.3.31
-gallery-dl==1.29.4
+yt-dlp==2025.4.30
+gallery-dl==1.29.6
 aiohttp==3.11.18
```
