Support OpenAI/Anthropic compatible APIs for broader LLM choice #4

@lehuygiang28

Description

Is your feature request related to a problem?

Yes. The project is currently vendor-locked to Cloudflare Workers AI and Google Gemini. This prevents users from adopting other LLM providers, open-source models, or self-hosted solutions.

Describe the solution you'd like

Support standard API formats (OpenAI-compatible, Anthropic, Gemini) as configurable options. Provide configuration fields that let users define:

  1. Custom Base URL (Endpoint)
  2. API Key
  3. Custom Model Name

This decoupling enables users to connect any compatible service (e.g., Groq, DeepSeek, Together AI) or local setups (e.g., Ollama, vLLM) effortlessly.
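To illustrate the decoupling, here is a minimal sketch of how those three fields could map onto the OpenAI-compatible `/chat/completions` request format. All names (`LLMConfig`, `build_chat_request`) are hypothetical and illustrative, not the project's actual API:

```python
from dataclasses import dataclass

@dataclass
class LLMConfig:
    """User-supplied settings that decouple the app from any one vendor."""
    base_url: str  # custom endpoint, e.g. a local Ollama or hosted Groq URL
    api_key: str
    model: str     # custom model name

def build_chat_request(cfg: LLMConfig, prompt: str) -> tuple[str, dict, dict]:
    """Map the config onto an OpenAI-compatible chat completion request."""
    url = cfg.base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {cfg.api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": cfg.model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, body

# The same code path serves a local Ollama instance...
local = LLMConfig("http://localhost:11434/v1", "ollama", "llama3")
# ...or a hosted provider, by changing only the configuration.
hosted = LLMConfig("https://api.groq.com/openai/v1", "<API_KEY>", "llama-3.1-8b-instant")
```

Because so many providers (Groq, DeepSeek, Together AI, vLLM, Ollama) expose this same request shape, one generic client plus these three fields covers them all without per-vendor code.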

Describe alternatives you've considered

  • Hardcoding new providers: Manually adding support for Groq, Mistral, etc., one by one. This does not scale and bloats the codebase.
  • Requiring external proxies: Forcing users to set up middleware (like LiteLLM) to translate requests into the currently supported formats. This adds unnecessary infrastructure complexity.

Additional context

No response
