Continue
A local- and SSH-friendly AI coding assistant that gives you maximum control over model and context.
When to use it
- You want to use local models (Ollama, etc.) or your own API keys without sending code to a vendor's cloud.
- You develop on remote machines via SSH and want the same assistant in your editor.
- You prefer an extension that works inside VS Code or JetBrains with minimal vendor lock-in.
Hands-on
- Prerequisites — VS Code (or another supported editor); optional: Ollama or API keys for Claude/OpenAI.
- Install — get the extension from continue.dev, then configure a model (local or API) in the Continue settings.
- Runnable example — build the same small API as in the Cursor/Copilot/Windsurf recipes (see Quick product recipe).
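For the "configure a model" step, Continue reads its model list from a config file in `~/.continue/`. A minimal sketch of mixing a local Ollama model with an API-key model might look like this — the exact field names and file format vary across Continue versions, so treat this as illustrative and check the Continue docs for the current schema:

```yaml
# ~/.continue/config.yaml — illustrative sketch, not a verified schema
models:
  - name: Llama 3 (local)      # served by a local Ollama instance
    provider: ollama
    model: llama3.1:8b
  - name: Claude (API)         # cloud model via your own API key
    provider: anthropic
    model: claude-3-5-sonnet-latest
```

With both entries present, you can switch between the local and cloud model from the Continue model picker and compare latency and quality directly.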
Telegram
Continue does not provide a Telegram gateway. For Telegram + AI, use OpenClaw or the custom bot (playgrounds/telegram-ai-bot/).
Quick product recipe
Build the same small API with Continue and compare the workflow (local/SSH) (~15 min).
- Open a folder (local or over SSH) in your editor with Continue enabled.
- Use Continue to create a minimal HTTP API (e.g. one GET endpoint plus a README). If you use local models, note the latency difference versus cloud models.
- Run and verify. Compare workflow and control vs Cursor, Copilot, or Windsurf.
No separate playground in this repo; the "playground" is any folder and the agent-generated API.