
Bifrost

Open-source AI gateway delivering 50x faster performance than alternatives. Access 1000+ models from 8+ providers with built-in governance, fallback, and observability.

This is a preview only. Bifrost will be published on April 21, 2026.


An ultra-high-performance AI gateway built for enterprise-scale applications. With just 20 microseconds of added latency and 5,000 requests per second of throughput, it delivers exceptional speed while maintaining enterprise-grade reliability.

Key performance advantages:

  • 50x faster than LiteLLM with 54x better P99 latency
  • 68% less memory usage for optimal resource efficiency
  • 9.5x higher throughput with 11.22% better success rates
  • 99.99% uptime through automatic provider fallback
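The automatic-fallback behavior behind that uptime figure can be sketched as a simple ordered retry over providers. This is an illustrative sketch of the general technique, not Bifrost's actual implementation; the provider names and `ProviderError` type are assumptions for the demo.

```python
class ProviderError(Exception):
    """Raised when a provider fails to serve a request (illustrative)."""


def call_with_fallback(providers, request):
    """Try providers in order; return the first successful response.

    `providers` is a list of (name, callable) pairs. If every provider
    fails, the last error is re-raised to the caller.
    """
    last_err = None
    for name, call in providers:
        try:
            return name, call(request)
        except ProviderError as err:
            last_err = err  # record the failure and fall through to the next provider
    raise last_err


# Demo: the primary provider is down, so the request falls back to the secondary.
def flaky(_request):
    raise ProviderError("primary down")

def stable(request):
    return f"ok:{request}"

name, resp = call_with_fallback([("primary", flaky), ("secondary", stable)], "ping")
```

A real gateway layers retries, timeouts, and health checks on top of this loop, but the core control flow is the same: failures cascade down the provider list instead of surfacing to the client.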

Comprehensive model access to 1,000+ AI models from 8+ providers, including OpenAI, Anthropic, Google, and custom deployments, through a unified interface. It works as a drop-in replacement requiring just a one-line code change, compatible with existing OpenAI, Anthropic, LiteLLM, LangChain, and Vercel AI SDK integrations.
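In practice, a drop-in replacement of this kind means pointing an existing OpenAI-style client at the gateway's base URL. The sketch below shows the shape of such a request using only the standard library; the local port, route path, and model name are assumptions for illustration, not confirmed Bifrost endpoints.

```python
import json
import urllib.request

# Assumption: a locally running gateway exposing an OpenAI-compatible
# chat-completions route. In existing code, this is typically the only
# line that changes (swapping the provider's base URL for the gateway's).
BIFROST_URL = "http://localhost:8080/v1/chat/completions"


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at the gateway."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        BIFROST_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_request("openai/gpt-4o-mini", "Hello")
# urllib.request.urlopen(req) would send it once the gateway is running.
```

Because the request body keeps the OpenAI wire format, the gateway can route it to any configured provider without the application changing its payloads.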

Enterprise-ready features include virtual key management with independent budgets, real-time guardrails for model protection, a built-in MCP gateway for tool management, and comprehensive governance with SSO integration. Built-in observability ships with OpenTelemetry support and a dashboard for monitoring without complex setup.
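Virtual keys with independent budgets amount to per-key spend accounting with a hard cap. The sketch below illustrates the idea; the field names and charging logic are hypothetical and not Bifrost's actual schema.

```python
from dataclasses import dataclass


@dataclass
class VirtualKey:
    """Per-key budget tracking (illustrative sketch, not Bifrost's data model)."""
    key_id: str
    budget_usd: float
    spent_usd: float = 0.0

    def charge(self, cost_usd: float) -> bool:
        """Record a request's cost; reject it if it would exceed the budget."""
        if self.spent_usd + cost_usd > self.budget_usd:
            return False  # request blocked: budget exhausted
        self.spent_usd += cost_usd
        return True


# Each team or application gets its own key and cap, independent of the others.
team_key = VirtualKey("team-a", budget_usd=1.00)
allowed = team_key.charge(0.75)   # within budget: accepted
blocked = team_key.charge(0.50)   # would exceed the cap: rejected
```

Keeping budgets on the key rather than the provider account is what lets one gateway enforce separate limits for many teams sharing the same upstream credentials.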

Open-source under the Apache 2.0 license, with active Discord community support and a 14-day free enterprise trial available.

