LLM Client
A Python client that abstracts away provider differences and implements intelligent fallback. Tries OpenRouter first (widest model access), falls back to Gemini, then Anthropic, then OpenAI. Handles rate limiting, quota exhaustion, and provider outages transparently. Used by every Python-based tool in the system.
llm-client
```python
client = UnifiedLLM()
result = client.complete(
    "analyze this market",
    model="auto",  # best available
)
```
Python · OpenRouter · Multi-Provider
Stack
- Python
- OpenRouter
- Gemini API
- Anthropic SDK
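The fallback chain described above can be sketched as follows. This is an illustrative sketch, not the actual client: the `UnifiedLLM` internals, the `backends` mapping, and the callable interface are all assumptions for demonstration.

```python
class UnifiedLLM:
    """Minimal sketch of provider fallback (hypothetical, not the real client)."""

    # Order reflects the chain described above:
    # OpenRouter first, then Gemini, Anthropic, OpenAI.
    PROVIDERS = ["openrouter", "gemini", "anthropic", "openai"]

    def __init__(self, backends=None):
        # backends maps provider name -> callable(prompt) -> str.
        # A real client would wrap each provider's SDK here.
        self.backends = backends or {}

    def complete(self, prompt, model="auto"):
        errors = {}
        for name in self.PROVIDERS:
            backend = self.backends.get(name)
            if backend is None:
                continue
            try:
                # First provider that succeeds wins.
                return backend(prompt)
            except Exception as exc:  # rate limit, quota exhaustion, outage
                errors[name] = exc
        raise RuntimeError(f"all providers failed: {errors}")
```

Because failures are caught per provider and the loop simply advances, rate limits and outages are handled transparently to the caller, as the description states.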