langsight investigate uses an LLM to analyse health evidence and produce root cause reports. You can choose from four providers.

Quick setup

Add an investigate block to .langsight.yaml:
```yaml
investigate:
  provider: gemini           # anthropic | openai | gemini | ollama
  model: gemini-2.0-flash    # optional — sensible defaults per provider
```
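Since the model field is optional, the block can be reduced to just a provider name; a minimal sketch (using openai as the example provider):

```yaml
investigate:
  provider: openai   # model omitted; the provider's default is used
```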

Comparison

| | Claude | OpenAI | Gemini | Ollama |
|---|---|---|---|---|
| Free tier | No | No | ✅ 1,500/day | ✅ Unlimited |
| Data privacy | Anthropic | OpenAI | Google | On your machine |
| Context window | 200K | 128K | 1M | Varies |
| RCA quality | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐ |
| Setup time | 2 min | 2 min | 2 min | 5 min |

Environment variables

| Provider | Variable | Where to get it |
|---|---|---|
| Claude | `ANTHROPIC_API_KEY` | console.anthropic.com |
| OpenAI | `OPENAI_API_KEY` | platform.openai.com/api-keys |
| Gemini | `GEMINI_API_KEY` | aistudio.google.com/app/apikey |
| Ollama | (none required) | — |
Never put API keys directly in .langsight.yaml. Use environment variables. The api_key field is only for CI/CD secrets injection.
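For local use, exporting the key in your shell before running `langsight investigate` is enough; a sketch using the Gemini variable (the key value is a placeholder):

```shell
# Variable name taken from the table above; the value here is a placeholder.
export GEMINI_API_KEY="your-key-here"   # obtained from aistudio.google.com/app/apikey
```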

Rule-based fallback

If no provider is configured or the API call fails, langsight investigate automatically falls back to deterministic heuristics — no LLM needed.

Claude

Best quality RCA with adaptive thinking.

OpenAI

GPT-4o for teams already on OpenAI.

Gemini

Free tier, 1M context window.

Ollama

Local, free, air-gapped.
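For an air-gapped setup, the investigate block can point at Ollama instead; a sketch, where `llama3.1` is an illustrative model name rather than a langsight default:

```yaml
investigate:
  provider: ollama   # runs locally; no API key required
  model: llama3.1    # illustrative — any model pulled via `ollama pull` works
```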