Is your feature request related to a problem?
Yes. The project is currently vendor-locked to Cloudflare Worker AI and Google Gemini. This prevents users from using other LLM providers, open-source models, or self-hosted solutions.
Describe the solution you'd like
Support the standard API formats (OpenAI-compatible, Anthropic, Gemini) as first-class options, and provide configuration fields that let users define:
- Custom Base URL (Endpoint)
- API Key
- Custom Model Name
This decoupling lets users connect any compatible hosted service (e.g., Groq, DeepSeek, Together AI) or local setup (e.g., Ollama, vLLM) without code changes; a sketch of what this could look like follows below.
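A minimal sketch of the idea (TypeScript; all names and field shapes here are hypothetical, not taken from this repo), assuming the provider exposes an OpenAI-compatible `/chat/completions` endpoint:

```ts
// Hypothetical provider-agnostic configuration; field names are illustrative only.
interface LLMConfig {
  baseUrl: string; // e.g. "https://api.groq.com/openai/v1" or "http://localhost:11434/v1" (Ollama)
  apiKey: string;  // provider API key; local servers often accept any placeholder value
  model: string;   // e.g. "llama-3.1-70b-versatile" or "deepseek-chat"
}

// Minimal OpenAI-compatible chat call using fetch (works in Workers and Node 18+).
async function chatCompletion(cfg: LLMConfig, prompt: string): Promise<string> {
  const res = await fetch(`${cfg.baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${cfg.apiKey}`,
    },
    body: JSON.stringify({
      model: cfg.model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`LLM request failed: ${res.status} ${await res.text()}`);
  const data = (await res.json()) as { choices: { message: { content: string } }[] };
  return data.choices[0].message.content;
}
```

With a shape like this, switching from a hosted provider to a local Ollama or vLLM instance is purely a configuration change (different `baseUrl` and `model`), which is the decoupling this request is asking for.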
Describe alternatives you've considered
- Hardcoding new providers: Manually adding support for Groq, Mistral, etc., one by one. This is unscalable and bloats the codebase.
- Requiring external proxies: Forcing users to set up middleware (like LiteLLM) to translate requests into the currently supported formats. This adds unnecessary infrastructure complexity.
Additional context
No response