🚀 AISBF - Unified AI Proxy for Local & Cloud LLMs (BETA Release)
Posted by sporastefy@reddit | LocalLLaMA | 2 comments
AISBF is now in BETA - a smart proxy that gives you a single endpoint for both local LLMs (like Ollama) and cloud providers (OpenAI, Anthropic, Google, etc.).
Key features for local AI enthusiasts:

- 🔄 Seamless local-cloud mixing: run Ollama locally and automatically fall back to cloud providers when needed
- 💾 Intelligent caching: semantic caching reduces redundant local LLM calls
- ⚡ Provider-native caching: supports Ollama, plus Anthropic/Google/OpenAI optimizations
- 🤖 Auto-selection: AI-powered model selection based on your content
- 🔧 Unified API: OpenAI-compatible endpoint works with any local LLM setup
- 👥 Multi-user: perfect for teams sharing local LLM resources
- 🌐 Tor support: access your local LLM setup anonymously via Tor
- 💰 Cost saving: reduce API calls by caching repeated prompts
Try it:

- Hosted demo (no setup): https://aisbf.cloud
- Self-host: `pip install aisbf` (works with local Ollama out of the box)
- Source: https://git.nexlab.net/nexlab/aisbf.git
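Because the proxy exposes an OpenAI-compatible endpoint, talking to it from a script should look like a standard chat-completions call. A stdlib-only sketch, where the port, path, model name, and API key are assumptions rather than documented AISBF defaults:

```python
# Minimal client sketch for an OpenAI-compatible proxy endpoint.
# AISBF_URL and the model name are placeholders, not documented defaults.
import json
import urllib.request

AISBF_URL = "http://localhost:8000/v1/chat/completions"  # assumed address

def build_payload(prompt: str, model: str = "llama3") -> dict:
    # Standard OpenAI chat-completions request body, which any
    # OpenAI-compatible proxy accepts.
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def send_chat(prompt: str, api_key: str = "local") -> str:
    req = urllib.request.Request(
        AISBF_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# With a local AISBF instance running, send_chat("Hello!") would return
# the completion; here we only show the request body being built.
print(build_payload("Hello!"))
```

The same shape works with the official `openai` client by pointing its `base_url` at the proxy, which is what makes existing local-LLM tooling reusable.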
AISBF is free and open source (GPL-3.0). Would love feedback from anyone working with local LLMs!
Wonderful-Agency-210@reddit
what's new here?
we've been building Portkey's AI gateway for the past 3 years. it's already at 10k+ stars, MIT-licensed, and open source on GitHub: https://github.com/portkey-ai/gateway
sporastefy@reddit (OP)
for example, request classification with auto-selection/auto-routing? or direct OAuth support for use with CLI-based subscription accounts?
i don't see any of those features in portkey-ai :)