A comprehensive command-line interface for accessing various Large Language Models (LLMs) from multiple providers. Built with Java 17 and LangChain4j, this tool provides a unified interface for interacting with models from OpenAI, Anthropic, Google Gemini, Azure OpenAI, Groq, Ollama, and more.
- Multi-Provider Support: Connect to OpenAI, Anthropic, Google Gemini, Azure OpenAI, Groq, Ollama, and HuggingFace models
- Template System: Use predefined templates for common prompt patterns
- Conversation Management: Continue conversations with context and history
- Fragment Support: Prepend reusable text fragments to prompts
- Model Aliases: Create shortcuts for frequently used models
- Structured Output: Support for JSON schema-based responses
- Tool Integration: Make tools available to compatible models
- Embedding Models: Generate embeddings for text content
- Logging & History: Track conversations and model interactions
- H2 Database: Local storage for conversations and logs
- Java 17 or higher
- Maven 3.6+ for building the project
- API keys for the LLM providers you want to use
- Clone the repository:
git clone <repository-url>
cd chat
- Build the project:
mvn clean package
This creates an executable JAR file in the target/ directory.
- Set up API keys:
The application supports multiple providers. Set the relevant environment variables:
# Groq
export GROQ_API_KEY="your_groq_api_key"
# OpenAI
export OPENAI_API_KEY="your_openai_api_key"
# Anthropic
export ANTHROPIC_API_KEY="your_anthropic_api_key"
# Google Gemini
export GOOGLE_AI_GEMINI_API_KEY="your_gemini_api_key"
# Azure OpenAI
export AZURE_OPENAI_KEY="your_azure_key"
export AZURE_OPENAI_ENDPOINT="your_azure_endpoint"
# HuggingFace
export HUGGING_FACE_API_KEY="your_hf_api_key"
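Only the keys for providers you actually plan to use need to be set. As a convenience, a small shell function (a sketch using the variable names from the list above) can flag any that are missing:

```shell
# Print a warning for each provider API key that is not set.
check_keys() {
  for var in GROQ_API_KEY OPENAI_API_KEY ANTHROPIC_API_KEY \
             GOOGLE_AI_GEMINI_API_KEY AZURE_OPENAI_KEY \
             AZURE_OPENAI_ENDPOINT HUGGING_FACE_API_KEY; do
    if [ -z "$(printenv "$var")" ]; then
      echo "warning: $var is not set"
    fi
  done
}

check_keys
```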
Send a simple prompt to a model:
java -jar target/llm-java-1.0.jar -m groq/meta-llama/llama-4-scout-17b-16e-instruct "What is the capital of France?"
- -m, --model: Required - Specify the model to use
- -t, --template: Use a predefined template
- -p, --param: Parameters for templates (key-value pairs)
- --schema: JSON schema for structured responses
- --tool: Make a tool available to the model
- -f, --fragment: Prepend a text fragment to the prompt
- -c, --continue: Continue a previous conversation by ID
- --context: Number of previous messages to include (default: 10)
Using templates:
java -jar target/llm-java-1.0.jar -m openai/gpt-4 -t code-review -p file "MyClass.java" -p language "Java"
Continuing a conversation:
java -jar target/llm-java-1.0.jar -m anthropic/claude-3-sonnet-20240229 -c conv_123 "Can you elaborate on that?"
Using fragments:
java -jar target/llm-java-1.0.jar -m groq/mixtral-8x7b-32768 -f expert-prompt "Explain quantum computing"
Structured output with JSON schema:
java -jar target/llm-java-1.0.jar -m openai/gpt-4 --schema person.json "Extract person info from this text"
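The person.json file is not included in the repository; as an illustration only, a minimal JSON Schema for this kind of extraction might look like the following (the field names are hypothetical):

```json
{
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "age": { "type": "integer" },
    "occupation": { "type": "string" }
  },
  "required": ["name"]
}
```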
Chat mode:
java -jar target/llm-java-1.0.jar chat -m openai/gpt-4
Generate embeddings:
java -jar target/llm-java-1.0.jar embed -m openai/text-embedding-ada-002 "Text to embed"
Manage aliases:
java -jar target/llm-java-1.0.jar aliases list
java -jar target/llm-java-1.0.jar aliases set my-model openai/gpt-4
View logs:
java -jar target/llm-java-1.0.jar log list
java -jar target/llm-java-1.0.jar log show <conversation-id>
OpenAI:
- GPT-4, GPT-4 Turbo, GPT-3.5 Turbo
- Text embedding models

Anthropic:
- Claude 3 (Opus, Sonnet, Haiku)
- Claude 2.1, Claude 2.0

Google Gemini:
- Gemini Pro, Gemini Pro Vision

Azure OpenAI:
- All OpenAI models via Azure endpoints

Groq:
- LLaMA models, Mixtral, Gemma

Ollama:
- Local models via Ollama

HuggingFace:
- Various open-source models
src/main/java/com/example/llm/
├── Cli.java # Main CLI entry point
├── Llm.java # Core LLM interaction logic
├── ConversationManager.java # Conversation handling
├── TemplateManager.java # Template system
├── FragmentManager.java # Text fragment management
├── AliasManager.java # Model alias management
├── LogManager.java # Logging and history
├── ModelRegistry.java # Model configuration
└── provider/ # Provider-specific implementations
├── OpenAiChatModelWrapper.java
├── AnthropicChatModelWrapper.java
├── GeminiChatModelWrapper.java
└── ...
mvn test
mvn clean compile
mvn package
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
Maven dependency issues: If you encounter Maven dependency conflicts, try:
mvn dependency:tree
mvn clean install -U
API key errors: Ensure your API keys are properly set as environment variables and have sufficient permissions.
Model not found: Check that the model identifier is correct and supported by the provider.