Together AI MCP
Together AI inference API via MCP. Run open-source LLMs — Llama, Mistral, DBRX, and more — with fast parallel inference at scale.
MCP verified
Transport stdio
Auth api-key
Endpoint npx @togetherai/mcp-server
Install npx @togetherai/mcp-server
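Since the server speaks stdio and authenticates with an API key, a typical MCP client registers it as a launched subprocess. A minimal sketch of such a client configuration (the `mcpServers` shape used by Claude Desktop's `claude_desktop_config.json`) is below; the `TOGETHER_API_KEY` variable name is an assumption here, so check the server's README for the exact environment variable it expects:

```json
{
  "mcpServers": {
    "together-ai": {
      "command": "npx",
      "args": ["@togetherai/mcp-server"],
      "env": {
        "TOGETHER_API_KEY": "<your-api-key>"
      }
    }
  }
}
```

With this entry in place, the client spawns `npx @togetherai/mcp-server` on startup and exchanges MCP messages over the process's stdin/stdout.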
01 Run Llama, Mistral, or DBRX inference in agent pipelines
02 Fine-tune open-source models on custom datasets
03 Parallel batch inference for high-throughput workflows
together-ai llm inference open-source llama
Machine-readable: /api/servers.json  ·  JSON-LD schema embedded in <head>