Configure Windsurf’s Cascade AI assistant to run on llm.kiwi. This guide shows how to point Windsurf at the llm.kiwi API as a custom model provider and how to register llm.kiwi’s MCP server so Cascade can call its tools.

llm.kiwi Integration

Windsurf is an agentic IDE by Codeium that emphasizes flow. You can connect it to llm.kiwi to use your own model tier and agentic capabilities such as image generation and transcription.

1. Custom Model Provider

  1. Open Windsurf Settings.
  2. Search for “Provider” or “LLM”.
  3. Select Custom OpenAI (or OpenAI) as the provider.
  4. Enter your Base URL: https://api.llm.kiwi/v1
  5. Enter your llm.kiwi API Key.
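
Before pointing Windsurf at the endpoint, you can sanity-check the base URL and key with any OpenAI-compatible client. The TypeScript sketch below uses the official openai npm package; the LLM_KIWI_API_KEY environment variable and the kiwi-pro model name are placeholders, so substitute your own key and whichever model your llm.kiwi tier exposes.

import OpenAI from "openai";

async function main() {
  // Point an OpenAI-compatible client at llm.kiwi.
  const client = new OpenAI({
    baseURL: "https://api.llm.kiwi/v1",
    apiKey: process.env.LLM_KIWI_API_KEY, // placeholder env var
  });

  // One small completion is enough to confirm the key and base URL work.
  const response = await client.chat.completions.create({
    model: "kiwi-pro", // placeholder — use a model from your tier
    messages: [{ role: "user", content: "Reply with OK." }],
  });

  console.log(response.choices[0].message.content);
}

main().catch(console.error);

If this prints a reply, the same base URL and key will work in Windsurf’s provider settings.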

2. MCP Integration

Windsurf has excellent support for the Model Context Protocol (MCP), allowing the IDE agent to call llm.kiwi tools directly. To add llm.kiwi as an MCP server:
  1. Open the MCP configuration in Windsurf.
  2. Add the following server configuration:
{
  "mcpServers": {
    "llm-kiwi": {
      "url": "https://api.llm.kiwi/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}
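
Replace YOUR_API_KEY with your llm.kiwi API key. To confirm the server is reachable and see which tools it exposes before Cascade starts using it, you can connect to the same endpoint with the TypeScript MCP SDK (@modelcontextprotocol/sdk). This is a minimal sketch; exact transport options can vary between SDK versions.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  // Same endpoint and Authorization header as the Windsurf config above.
  const transport = new StreamableHTTPClientTransport(
    new URL("https://api.llm.kiwi/mcp"),
    {
      requestInit: {
        headers: { Authorization: `Bearer ${process.env.LLM_KIWI_API_KEY}` }, // placeholder env var
      },
    }
  );

  const client = new Client({ name: "llm-kiwi-check", version: "0.1.0" });
  await client.connect(transport);

  // List the tools llm.kiwi exposes (e.g., image generation, transcription).
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  await client.close();
}

main().catch(console.error);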

Supported Workflows

  • Cascade Coding: Use the pro model to power Windsurf’s Cascade for deep codebase reasoning.
  • Agentic Tools: Trigger image generation or transcription by simply asking Cascade.
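
For example, once the MCP server is registered, you might ask Cascade something like the following; the exact tool names llm.kiwi exposes may differ, and Cascade decides which tool to call:

Generate a 1024x1024 hero illustration of a kiwi bird and save it to assets/hero.png.

If the llm-kiwi server is connected, Cascade can satisfy this by calling the corresponding MCP tool rather than asking you to run anything yourself.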