Ollama Support (#149) #156


Merged
merged 4 commits into main on Jan 6, 2025

Conversation

@daeisbae (Owner) commented Jan 6, 2025

Changes

  • Ollama Support for OpenRepoWiki.

How to use it?

In the .env file, set LLM_PROVIDER to ollama and set LLM_MODELNAME to one of the model names listed by the command ollama ls:
[Screenshot: `ollama ls` output listing the installed models]

LLM_PROVIDER=ollama
LLM_APIKEY=
LLM_MODELNAME=exaone3.5:latest
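
For reference, a provider configured this way would talk to a local Ollama server over its HTTP API. Below is a minimal sketch of such a call, assuming the default Ollama endpoint (http://localhost:11434) and its documented /api/generate route; the actual integration in this PR may be structured differently:

```typescript
// Minimal sketch: call a local Ollama server directly over its HTTP API.
// Assumes Ollama is running on its default port (11434) and that the model
// named in LLM_MODELNAME has already been pulled (check with `ollama ls`).
const OLLAMA_URL = process.env.OLLAMA_URL ?? 'http://localhost:11434'

async function summarize(code: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: process.env.LLM_MODELNAME ?? 'exaone3.5:latest',
      prompt: `Summarize the following code:\n\n${code}`,
      stream: false, // return a single JSON object instead of a token stream
    }),
  })
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`)
  const data = await res.json()
  return data.response // /api/generate returns the completion in `response`
}
```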

Potential problems

  • LLMs under 8B parameters fail to produce a proper summarization. (I set the environment variable TOKEN_PROCESSING_CHARACTER_LIMIT=15000, which was sized for a 32k context length; that covers approximately 500 lines of code. See the sketch below for how the character limit maps onto context length.) It is therefore recommended to use an LLM with a much larger parameter count.
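
To make the limit concrete, a character cap like this is typically enforced by truncating the input before it is sent to the model. A hypothetical sketch follows; the helper name and the rough four-characters-per-token heuristic are illustrative, not OpenRepoWiki's actual code:

```typescript
// Hypothetical sketch of how a character cap relates to context length.
// At roughly 4 characters per token, 15,000 characters is ~3,750 tokens,
// leaving most of a 32k-token context free for the prompt and the output.
const CHAR_LIMIT = Number(process.env.TOKEN_PROCESSING_CHARACTER_LIMIT ?? 15000)

function truncateForModel(source: string): string {
  // ~500 lines of typical code (~30 chars/line) fits within 15,000 characters
  return source.length > CHAR_LIMIT ? source.slice(0, CHAR_LIMIT) : source
}
```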

@daeisbae added the enhancement (New feature or request) label on Jan 6, 2025
@daeisbae linked an issue on Jan 6, 2025 that may be closed by this pull request
@daeisbae merged commit 504f7f3 into main on Jan 6, 2025
1 check passed
Successfully merging this pull request may close this issue: Support for ollama? (#149)