LiteLLM
About LiteLLM
"Unified API gateway for 100+ LLMs with one consistent OpenAI-compatible interface"
LiteLLM is an open-source library that exposes 100+ LLM APIs, including OpenAI, Anthropic, Gemini, Mistral, and local models, through a single OpenAI-compatible interface. Developers can switch providers by changing a single model string, configure fallbacks and load balancing, track costs across providers, and add rate limiting without touching application logic. LiteLLM also ships a self-hosted proxy server for teams that need centralized API key management, budget controls, and access logging across an organization.
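A minimal sketch of the pattern described above: one OpenAI-style call signature routed to different providers by model-name prefix. The routing table, stub responses, and function bodies here are illustrative stand-ins, not LiteLLM's actual implementation; in LiteLLM itself the equivalent entry point is its `completion` function, which accepts OpenAI-format `messages`.

```python
# Conceptual sketch of a unified gateway: one call shape, routed to a
# provider by the model-name prefix. Illustrative only, not LiteLLM code.

PROVIDER_PREFIXES = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "gemini-": "google",
    "mistral-": "mistral",
}

def resolve_provider(model: str) -> str:
    """Map a model name to its backing provider by prefix."""
    for prefix, provider in PROVIDER_PREFIXES.items():
        if model.startswith(prefix):
            return provider
    raise ValueError(f"unknown model: {model}")

def completion(model: str, messages: list[dict]) -> dict:
    """Return an OpenAI-shaped response; a real gateway would call
    the resolved provider's API here instead of returning a stub."""
    provider = resolve_provider(model)
    return {
        "model": model,
        "provider": provider,
        "choices": [
            {"message": {"role": "assistant",
                         "content": f"(stub reply via {provider})"}}
        ],
    }

# Switching providers is a one-string change; the call shape is unchanged.
reply = completion("claude-3-haiku", [{"role": "user", "content": "Hello"}])
```

The point of the sketch is the "one line change" claim: swapping `"claude-3-haiku"` for `"gpt-4o"` re-routes the same call to a different provider while the message format and response shape stay constant.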
Similar to LiteLLM
SambaNova Cloud
Ultra-fast inference for large frontier AI models on custom dataflow processors
Together AI
High-speed inference and fine-tuning platform for open-source AI models
Phi-4 Mini
Microsoft's compact 3.8B reasoning model that punches above its weight class
Mistral AI
Powerful open-source and commercial language models from Europe
Aya Expanse
Cohere's multilingual LLM covering 23 languages with state-of-the-art performance
LangSmith
Production observability platform for debugging and monitoring LLM applications