LiteLLM MCP
Universal LLM gateway via MCP. Route agent requests to 100+ LLM providers — OpenAI, Anthropic, Bedrock, Gemini, Groq — through a single interface with cost tracking and load balancing.
MCP verified
Transport stdio
Auth api-key
Endpoint npx litellm-mcp-server
Install
npx litellm-mcp-server
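Since the transport is stdio, the server is typically registered in an MCP client's configuration rather than run by hand. A minimal sketch, assuming a standard `mcpServers`-style client config; the server key `litellm` and the `LITELLM_API_KEY` variable name are illustrative, not confirmed by this listing:

```json
{
  "mcpServers": {
    "litellm": {
      "command": "npx",
      "args": ["litellm-mcp-server"],
      "env": {
        "LITELLM_API_KEY": "sk-..."
      }
    }
  }
}
```

The api-key auth noted above would be supplied through the `env` block rather than on the command line.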
01 Route agent calls to the cheapest or fastest LLM for each task
02 Track token spend and cost per model across a multi-agent system
03 Switch LLM providers without changing agent code
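The cheapest-model routing and cost tracking described above can be sketched in a few lines of plain Python. This is an illustration of the idea, not the server's actual implementation; the price table, model names, and helper functions are all hypothetical, and the per-million-token figures are placeholders rather than live pricing:

```python
# Illustrative per-million-input-token prices (placeholder numbers).
PRICES_PER_M_INPUT_TOKENS = {
    "gpt-4o": 2.50,
    "claude-3-5-haiku": 0.80,
    "groq/llama-3.1-8b": 0.05,
}

# Running spend per model, the kind of ledger a gateway keeps
# for cost tracking across a multi-agent system.
spend_by_model: dict[str, float] = {}

def cheapest_model(candidates: list[str]) -> str:
    """Pick the candidate with the lowest input-token price."""
    return min(candidates, key=lambda m: PRICES_PER_M_INPUT_TOKENS[m])

def record_spend(model: str, input_tokens: int) -> float:
    """Accumulate the estimated cost of one call and return it."""
    cost = PRICES_PER_M_INPUT_TOKENS[model] * input_tokens / 1_000_000
    spend_by_model[model] = spend_by_model.get(model, 0.0) + cost
    return cost

# An agent asks for the cheapest of two candidates, then logs usage.
model = cheapest_model(["gpt-4o", "groq/llama-3.1-8b"])
record_spend(model, 12_000)
```

Because the routing decision lives in the gateway rather than in agent code, swapping providers (use case 03) is a change to the price/candidate table, not to the agents themselves.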
llm-gateway multi-model openai anthropic cost-tracking load-balancing