Ollama proxy

OpenAI-compatible endpoint for Cursor & other clients


Cursor – set "Override OpenAI Base URL" to this deployment's URL with /v1 appended.

API Key: ollama · Model: qwen2.5-coder:14b
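To check the endpoint outside Cursor, a quick curl works; the sketch below assumes a placeholder deployment URL (replace it with yours) and mirrors the payload an OpenAI-compatible client sends:

```shell
# Placeholder deployment URL -- substitute your Vercel or tunnel URL.
BASE_URL="https://your-deployment.vercel.app/v1"

# The same chat-completions payload Cursor would send.
PAYLOAD='{"model": "qwen2.5-coder:14b", "messages": [{"role": "user", "content": "Say hello"}]}'
echo "$PAYLOAD"

# Uncomment once the backend is connected:
# curl -s "$BASE_URL/chat/completions" \
#   -H "Authorization: Bearer ollama" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```

The API key is not validated against OpenAI; the proxy just expects the literal value "ollama".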

Send a test message below; the backend must be connected first.

Set OLLAMA_BACKEND_URL in Vercel to your tunnel URL, then redeploy (env changes only apply to new deployments).
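With the Vercel CLI this can be scripted; the tunnel URL below is a placeholder, and `vercel env add` reads the value from stdin when piped:

```shell
# Placeholder tunnel URL -- use the one cloudflared/ngrok actually printed.
echo "https://xxx.trycloudflare.com" | vercel env add OLLAMA_BACKEND_URL production

# Environment changes only take effect on new deployments, so redeploy.
vercel --prod
```

Setting the variable in the Vercel dashboard (Project → Settings → Environment Variables) is equivalent.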

Cursor stuck at "Planning next moves"? Point Cursor's Base URL directly at your tunnel URL (e.g. https://xxx.trycloudflare.com/ollama/v1), and update it each time you restart the tunnel, since quick tunnels get a new URL on every run.