🚀 AISBF - Unified AI Proxy for Local & Cloud LLMs (BETA Release)

Posted by sporastefy@reddit | LocalLLaMA | View on Reddit | 2 comments

AISBF is now in BETA - a smart proxy that gives you a single endpoint for both local LLMs (like Ollama) and cloud providers (OpenAI, Anthropic, Google, etc.).

Key features for local AI enthusiasts:

- 🔄 Seamless local-cloud mixing: Run Ollama locally and automatically fall back to cloud providers when needed
- 💾 Intelligent caching: Semantic caching reduces redundant local LLM calls
- ⚡ Provider-native caching: Supports Ollama, plus Anthropic/Google/OpenAI optimizations
- 🤖 Auto-selection: AI-powered model selection based on your content
- 🔧 Unified API: OpenAI-compatible endpoint that works with any local LLM setup
- 👥 Multi-user: Perfect for teams sharing local LLM resources
- 🌐 Tor support: Access your local LLM setup anonymously via Tor
- 💰 Cost saving: Reduce API calls by caching repeated prompts

Try it:

- Hosted demo (no setup): https://aisbf.cloud
- Self-host: `pip install aisbf` (works with local Ollama out of the box)
- Source: https://git.nexlab.net/nexlab/aisbf.git
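Once self-hosted, the proxy exposes an OpenAI-compatible endpoint, so any OpenAI-style client should work against it. A hedged sketch of building such a request follows; the base URL, port, and model name are assumptions for illustration (check the AISBF docs for the actual defaults), and the final call is commented out since it needs the proxy running.

```python
# Sketch of calling an OpenAI-compatible chat endpoint, such as a
# self-hosted AISBF proxy. URL and model name below are assumptions.
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"  # assumed default for a local proxy

payload = {
    "model": "llama3",  # any model the proxy can route, e.g. via Ollama
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# with urllib.request.urlopen(req) as resp:   # uncomment with proxy running
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape is the standard OpenAI chat-completions format, existing tools and SDKs can be pointed at the proxy just by changing their base URL.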

AISBF is free and open source (GPL-3.0). Would love feedback from anyone working with local LLMs!