LiteLLM MCP
Universal LLM gateway via MCP. Route agent requests to 100+ LLM providers — OpenAI, Anthropic, Bedrock, Gemini, Groq — through a single interface with cost tracking and load balancing.
MCP verified
Integration
| Transport | stdio |
| Auth | api-key |
| Endpoint | npx litellm-mcp-server |
| Install | npx litellm-mcp-server |
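The stdio transport and `npx` endpoint above map onto a standard MCP client configuration. A minimal sketch for a Claude Desktop-style `mcpServers` entry is shown below; the server key `litellm` and the `LITELLM_API_KEY` environment variable name are illustrative assumptions, not confirmed by this listing — check the server's own docs for the exact variable it reads.

```json
{
  "mcpServers": {
    "litellm": {
      "command": "npx",
      "args": ["litellm-mcp-server"],
      "env": {
        "LITELLM_API_KEY": "sk-..."
      }
    }
  }
}
```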
Use Cases
| 01 | Route agent calls to the cheapest or fastest LLM for each task |
| 02 | Track token spend and cost per model across a multi-agent system |
| 03 | Switch LLM providers without changing agent code |
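Use case 01 boils down to a routing decision over per-model cost. A minimal sketch of that idea, assuming a static cost table — the model identifiers follow LiteLLM's `provider/model` naming, but the prices here are placeholders, not real provider rates:

```python
# Hypothetical cost table: USD per 1K tokens. Values are illustrative
# placeholders, not actual provider pricing.
COST_PER_1K_TOKENS = {
    "openai/gpt-4o": 0.005,
    "anthropic/claude-3-5-sonnet": 0.003,
    "groq/llama-3.1-8b-instant": 0.00005,
}

def cheapest_model(candidates: dict[str, float] = COST_PER_1K_TOKENS) -> str:
    """Return the model id with the lowest per-token cost."""
    return min(candidates, key=candidates.get)

print(cheapest_model())  # groq/llama-3.1-8b-instant under these placeholder prices
```

A real gateway would also weigh latency, rate limits, and capability requirements, but the selection step reduces to a lookup like this.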
Tags
llm-gateway multi-model openai anthropic cost-tracking load-balancing
Machine-readable: /api/servers.json