LLM Configuration
Instructions for configuring LLMs with different providers
Magnitude requires configuring two language models:
- “Planner” model (any good multi-modal LLM)
- “Executor” model (currently only Moondream is supported)
For the planner model, we currently support Google AI Studio, Google Vertex AI, Anthropic, AWS Bedrock, OpenAI, and OpenAI-compatible providers.
To configure your planner model, pass one of the client interfaces described below to your magnitude.config.ts, like:
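The sketch below shows one possible shape for this config, with Anthropic as the planner and Moondream as the executor. The MagnitudeConfig type name, the planner/executor field names, and the provider ids are assumptions here, so check them against the version you have installed.

```ts
// magnitude.config.ts — illustrative sketch; type name, field names, and provider ids are assumptions
import { type MagnitudeConfig } from 'magnitude-test';

export default {
  url: "http://localhost:5173",
  planner: {
    provider: 'anthropic', // any of the providers described below
    options: {
      model: 'claude-3-7-sonnet-latest',
      apiKey: process.env.ANTHROPIC_API_KEY
    }
  },
  executor: {
    provider: 'moondream', // currently the only supported executor
    options: {
      apiKey: process.env.MOONDREAM_API_KEY
    }
  }
} satisfies MagnitudeConfig;
```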
If no planner is configured, Magnitude will pick a provider and model based on available environment variables in this order:
- GOOGLE_API_KEY: gemini-2.5-pro-preview-03-25
- GOOGLE_APPLICATION_CREDENTIALS: gemini-2.5-pro-preview-03-25
- OPENROUTER_API_KEY: google/gemini-2.5-pro-preview-03-25
- ANTHROPIC_API_KEY: claude-3-7-sonnet-latest
- OPENAI_API_KEY: gpt-4.1-2025-04-14
Providers
Google AI Studio
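A sketch of a Google AI Studio planner client; the provider id and option names are assumptions, while the model matches the default listed above:

```ts
// Inside magnitude.config.ts — hypothetical Google AI Studio planner client
planner: {
  provider: 'google-ai', // assumption: provider id may differ in your version
  options: {
    model: 'gemini-2.5-pro-preview-03-25',
    apiKey: process.env.GOOGLE_API_KEY
  }
}
```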
Google Vertex AI
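A sketch of a Vertex AI planner client; the vertex-ai provider id appears below in this document, but the project and location option names are assumptions:

```ts
// Inside magnitude.config.ts — hypothetical Vertex AI planner client
planner: {
  provider: 'vertex-ai',
  options: {
    model: 'gemini-2.5-pro-preview-03-25',
    projectId: 'my-gcp-project', // assumption: your GCP project id
    location: 'us-central1'      // assumption: your GCP region
  }
}
```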
Authentication
The vertex-ai provider by default will try to authenticate using the following strategies:
- if GOOGLE_APPLICATION_CREDENTIALS is set, it will use the specified service account
- if you have run gcloud auth application-default login, it will use those credentials
- if running in GCP, it will query the metadata server to use the attached service account
- if gcloud is available on the PATH, it will use gcloud auth print-access-token
If you’re using Google Cloud application default credentials, you can expect authentication to work out of the box.
Setting options.credentials will take precedence and force vertex-ai to load service account credentials from that file path.
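For example, a sketch using a service account key file (the file path and the other option names are illustrative):

```ts
// Hypothetical: force vertex-ai to use a specific service account key file
planner: {
  provider: 'vertex-ai',
  options: {
    model: 'gemini-2.5-pro-preview-03-25',
    credentials: '/path/to/service-account.json' // takes precedence over the strategies above
  }
}
```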
Anthropic
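A sketch of an Anthropic planner client; the provider id and option names are assumptions:

```ts
// Inside magnitude.config.ts — hypothetical Anthropic planner client
planner: {
  provider: 'anthropic',
  options: {
    model: 'claude-3-7-sonnet-latest',
    apiKey: process.env.ANTHROPIC_API_KEY
  }
}
```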
OpenAI
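A sketch of an OpenAI planner client; the provider id and option names are assumptions:

```ts
// Inside magnitude.config.ts — hypothetical OpenAI planner client
planner: {
  provider: 'openai',
  options: {
    model: 'gpt-4.1-2025-04-14',
    apiKey: process.env.OPENAI_API_KEY
  }
}
```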
OpenAI-compatible (OpenRouter, Ollama, etc.)
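A sketch of an OpenAI-compatible planner client pointed at OpenRouter; the provider id and option names are assumptions, and for a local Ollama server you would swap the baseUrl, model, and key accordingly:

```ts
// Inside magnitude.config.ts — hypothetical OpenAI-compatible planner client (OpenRouter shown)
planner: {
  provider: 'openai-generic', // assumption: provider id may differ in your version
  options: {
    baseUrl: 'https://openrouter.ai/api/v1',
    model: 'google/gemini-2.5-pro-preview-03-25',
    apiKey: process.env.OPENROUTER_API_KEY
  }
}
```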
AWS Bedrock
Authenticate with Bedrock using environment variables:
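A sketch of a Bedrock planner client; the provider id, model id, and option names are assumptions, with credentials read from the standard AWS environment variables:

```ts
// Inside magnitude.config.ts — hypothetical AWS Bedrock planner client
// Reads AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION from the environment
planner: {
  provider: 'aws-bedrock', // assumption: provider id may differ in your version
  options: {
    model: 'us.anthropic.claude-3-7-sonnet-20250219-v1:0' // assumption: example Bedrock model id
  }
}
```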
Configuring Moondream
Moondream cloud is the easiest way to get set up, and offers 5,000 free requests per day. Get an API key here.
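A sketch of a Moondream cloud executor client; the option names are assumptions:

```ts
// Inside magnitude.config.ts — hypothetical Moondream cloud executor
executor: {
  provider: 'moondream',
  options: {
    apiKey: process.env.MOONDREAM_API_KEY
  }
}
```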
Moondream is open source and can also be self-hosted instead of using their cloud option. See here for instructions.
If self-hosting, configure the baseUrl to point to your server:
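A sketch assuming a locally hosted Moondream server; the port and path here are illustrative, so replace them with wherever your server is actually listening:

```ts
// Inside magnitude.config.ts — hypothetical self-hosted Moondream executor
executor: {
  provider: 'moondream',
  options: {
    baseUrl: 'http://localhost:2020/v1' // assumption: your self-hosted Moondream endpoint
  }
}
```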