Ollama proxy
OpenAI-compatible endpoint for Cursor & other clients
Cursor – Override OpenAI Base URL: …/v1
API Key: ollama · Model: qwen2.5-coder:14b
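Any OpenAI-compatible client can talk to the proxy, not just Cursor. A minimal sketch for checking the endpoint by hand, assuming a deployment URL of https://your-app.vercel.app (replace with your own); the model name and placeholder API key come from the settings above:

```python
import json
import urllib.request

BASE_URL = "https://your-app.vercel.app/v1"  # assumption: your Vercel deployment URL
API_KEY = "ollama"                           # placeholder key, as configured above
MODEL = "qwen2.5-coder:14b"

# Standard OpenAI chat-completions request body.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Say hello."}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
print(req.full_url)  # https://your-app.vercel.app/v1/chat/completions
# Uncomment once the backend is connected:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same base URL and key work in the official OpenAI SDKs by overriding the client's base URL, which is exactly what the Cursor setting above does.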
Set OLLAMA_BACKEND_URL in Vercel to your tunnel URL (e.g. https://xxx.trycloudflare.com/ollama/v1). Update the value whenever you restart the tunnel.