minicoder-model-config(5) | File Formats Manual | minicoder-model-config(5)
NAME
minicoder-model-config - AI model configuration file format for minicoder(1)
SYNOPSIS
~/.config/minicoder/models.json
/etc/minicoder/models.json
DESCRIPTION
The minicoder command uses JSON configuration files to define the available AI models and their parameters. It supports multiple model providers through OpenAI-compatible APIs, including OpenAI, Anthropic (via proxy), Google, local models via Ollama, and many others through services such as OpenRouter.
Configuration files are searched in the following order:
1. User configuration: ~/.config/minicoder/models.json
2. System configuration: /etc/minicoder/models.json
3. Built-in defaults (if no configuration files exist)
The first model defined in the configuration file becomes the default model when no --model option is specified.
The configuration file must contain a JSON object where each key is a model name and the value is a model definition object.
Each model definition supports the following fields:

type (string, required)
    Provider type. "openai" selects an OpenAI-compatible chat completions API and is the only type used in the examples below.

endpoint (string, required for openai type)
    Full URL of the chat completions endpoint.

model (string, optional)
    Model identifier sent with each request, e.g. "gpt-4-turbo".

api_key (string, optional)
    API key written directly in the configuration file.

api_key_env (string, optional)
    Name of an environment variable from which to read the API key; see ENVIRONMENT below.

max_context_bytes (number, optional)
    Maximum context size, in bytes.

params (object, optional)
    Additional parameters passed through with each API request.
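Putting these fields together, a minimal model definition might look like the following sketch (the model name "example" and all values are illustrative placeholders, not shipped defaults):

    {
      "example": {
        "type": "openai",
        "endpoint": "https://api.example.com/v1/chat/completions",
        "model": "example-model",
        "api_key_env": "EXAMPLE_API_KEY",
        "max_context_bytes": 65536,
        "params": {
          "stream": true
        }
      }
    }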
For models of type "openai", the following parameters are commonly used in the params object:

stream (boolean)
    Whether the response should be streamed.

reasoning (object)
    Reasoning controls for reasoning-capable models, e.g. {"effort": "high"}.

temperature (number)
    Sampling temperature.

max_tokens (number)
    Maximum number of tokens to generate.
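For illustration, a params object exercising all four fields might look like the following sketch (the values are arbitrary placeholders; which fields a given backend honors depends on that backend):

    {
      "stream": true,
      "reasoning": {
        "effort": "medium"
      },
      "temperature": 0.2,
      "max_tokens": 4096
    }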
EXAMPLES
Basic configuration with two models:
{ "gpt4": { "type": "openai", "endpoint": "https://api.openai.com/v1/chat/completions", "model": "gpt-4-turbo", "api_key_env": "OPENAI_API_KEY", "max_context_bytes": 512000, "params": { "stream": true, "temperature": 0.7 } }, "local-llama": { "type": "openai", "endpoint": "http://localhost:8080/v1/chat/completions", "model": "llama-3.1-8b", "max_context_bytes": 32768, "params": { "stream": false } } }
Configuration using OpenRouter with reasoning models:
{ "o3": { "type": "openai", "endpoint": "https://openrouter.ai/api/v1/chat/completions", "model": "openai/o3", "api_key_env": "OPENROUTER_API_KEY", "max_context_bytes": 512000, "params": { "reasoning": { "effort": "high" }, "stream": true } }, "gemini": { "type": "openai", "endpoint": "https://openrouter.ai/api/v1/chat/completions", "model": "google/gemini-2.5-pro", "api_key_env": "OPENROUTER_API_KEY", "max_context_bytes": 2097152, "params": { "reasoning": { "effort": "high" }, "stream": true } } }
Configuration for Ollama (local model server):
{ "ollama-llama": { "type": "openai", "endpoint": "http://localhost:11434/v1/chat/completions", "model": "llama3.1:latest", "max_context_bytes": 32768, "params": { "stream": true, "temperature": 0.7 } }, "ollama-mistral": { "type": "openai", "endpoint": "http://localhost:11434/v1/chat/completions", "model": "mistral:latest", "max_context_bytes": 32768, "params": { "stream": true } } }
DEFAULT MODELS
If no configuration file is found, minicoder falls back to a built-in set of models, chosen according to which API keys are available in the environment.
The first model in the list becomes the default when no --model option is specified.
ENVIRONMENT
API keys can be specified via environment variables using the api_key_env field. Common variables:
OPENAI_API_KEY
OPENROUTER_API_KEY
ANTHROPIC_API_KEY
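As a sketch, the two ways of supplying a key look like this (the model names and placeholder key are illustrative only); api_key_env keeps the secret out of the configuration file and is usually preferable:

    {
      "keyed-via-env": {
        "type": "openai",
        "endpoint": "https://api.openai.com/v1/chat/completions",
        "model": "gpt-4-turbo",
        "api_key_env": "OPENAI_API_KEY"
      },
      "keyed-inline": {
        "type": "openai",
        "endpoint": "https://api.openai.com/v1/chat/completions",
        "model": "gpt-4-turbo",
        "api_key": "YOUR-API-KEY-HERE"
      }
    }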
SEE ALSO
minicoder(1)
AUTHORS
Maintained by the assist project contributors.