Article Information

Category: Development

Published: 10 March, 2026

Author: Chris de Gruijter

Reading Time: 10 min

Tags

MCP, AI, developer tools, Windsurf, productivity, Supabase, Cloudflare

Useful MCP Server Configs: The Tools That Actually Improve My AI Coding Workflow

The Model Context Protocol (MCP) is one of those things that sounds abstract until you set it up — and then you wonder how you ever coded without it. MCP servers extend your AI coding assistant with specialized tools: database access, documentation search, file operations, web search, and more.

After months of tweaking my configuration in Windsurf, I've landed on a setup that genuinely improves my workflow. Not every MCP server is worth enabling — some add noise, some are too slow, and some overlap with what the AI already does well. Here's what I actually use, why I use it, and what I've disabled.

What is MCP and Why Should You Care?

MCP (Model Context Protocol) is a standard that connects AI assistants to external tools and data sources. Think of it as giving your AI coding partner the ability to query databases, search documentation, read files, and interact with APIs — all within the context of your conversation.

Without MCP, your AI assistant relies entirely on its training data and whatever files you explicitly share. With MCP, it can pull live data from Supabase, search Cloudflare documentation, check your filesystem, or even perform web searches — all without you leaving your editor.

My Active MCP Configuration

Here's the MCP setup I use daily in Windsurf. I'll explain what each server does and why it earns its spot in my config.
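For context, each snippet below is an entry inside the top-level mcpServers object of your IDE's MCP config file (the exact file location varies by client, so check your IDE's MCP settings). A minimal skeleton looks like this:

```json
{
  "mcpServers": {
    "supabase-mcp-server": {
      "command": "npx",
      "args": ["-y", "@supabase/mcp-server-supabase@latest"]
    },
    "cloudflare-docs": {
      "serverUrl": "https://docs.mcp.cloudflare.com/mcp"
    }
  }
}
```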

1. Supabase MCP Server

"supabase-mcp-server": {
  "args": [
    "-y",
    "@supabase/mcp-server-supabase@latest",
    "--access-token",
    "your-access-token"
  ],
  "command": "npx"
}

This is the single most valuable MCP server in my setup. It gives the AI direct access to my Supabase project — it can list tables, run SQL queries, apply migrations, check advisors for security issues, and even deploy Edge Functions. When I'm working on the dashboard or the agentic SEO system, being able to say "check what's in the audit_logs table for the last 24 hours" and get an instant answer is incredibly powerful.

  • Best for: Database debugging, schema exploration, quick SQL queries
  • Watch out for: Make sure you use a scoped access token, not your service role key
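If you mostly use it for debugging and schema exploration, the Supabase server can be locked down further. As a sketch (flag names per the @supabase/mcp-server-supabase README — verify against the version you install), --read-only restricts it to read queries and --project-ref pins it to a single project:

```json
"supabase-mcp-server": {
  "command": "npx",
  "args": [
    "-y",
    "@supabase/mcp-server-supabase@latest",
    "--read-only",
    "--project-ref",
    "your-project-ref",
    "--access-token",
    "your-access-token"
  ]
}
```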

2. Cloudflare Docs

"cloudflare-docs": {
  "serverUrl": "https://docs.mcp.cloudflare.com/mcp"
}

A remote MCP server that searches Cloudflare's documentation. Since I deploy everything on Cloudflare (Workers, Pages, R2), having instant doc access is invaluable. The AI can look up Worker cron syntax, R2 API methods, or Pages configuration without me opening a browser tab.

3. Filesystem Server

"filesystem": {
  "command": "npx",
  "args": [
    "-y",
    "@modelcontextprotocol/server-filesystem",
    "/home/user/Projects"
  ]
}

Gives the AI read/write access to your project files. While the IDE already has file access, the filesystem MCP server provides additional capabilities like directory tree views, file metadata, and batch operations. I scope it to my /Projects directory to keep it contained.
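The filesystem server accepts multiple directory arguments, so you can widen or narrow its scope precisely. For example, exposing two project directories while keeping everything else off-limits (the second path is a hypothetical example):

```json
"filesystem": {
  "command": "npx",
  "args": [
    "-y",
    "@modelcontextprotocol/server-filesystem",
    "/home/user/Projects",
    "/home/user/ClientWork"
  ]
}
```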

4. Sequential Thinking

"sequential-thinking": {
  "args": ["-y", "@modelcontextprotocol/server-sequential-thinking"],
  "command": "npx"
}

This one's subtle but effective. It provides a structured thinking tool that helps the AI break down complex problems step by step. I notice better results on architectural decisions and multi-file refactors when this is enabled — the AI takes time to reason through the approach before diving into code.

5. Memory Server

"memory": {
  "command": "npx",
  "args": ["-y", "@modelcontextprotocol/server-memory"],
  "description": "Persistent memory across sessions"
}

Provides a knowledge graph that persists across sessions. The AI can create entities, add observations, and build up a knowledge base about your project over time. Useful for long-running projects where you want the AI to remember architectural decisions, project structure, or client-specific details.

6. SerpAPI for Web Search

"serpapi-mcp": {
  "type": "http",
  "url": "https://mcp.serpapi.com/your-key/mcp"
}

Enables real-time web search from within the IDE. When the AI needs to look up the latest API syntax, check npm package versions, or research a library, it can do a Google search without you switching contexts. I use this less frequently than the other servers, but it's a lifesaver when you need current information that's beyond the AI's training cutoff.

7. Nuxt and Nuxt SEO

"nuxt": {
  "serverUrl": "https://nuxt.com/mcp",
  "disabledTools": ["get-blog-post", "list-modules", "..."]
},
"nuxt-seo-pro": {
  "serverUrl": "https://nuxtseo.com/mcp/pro",
  "headers": { "Authorization": "Bearer your-token" }
}

Since I build Nuxt sites for clients and my own portfolio, having direct access to Nuxt documentation and the Nuxt SEO toolkit is valuable. The SEO Pro server can analyze Vue SFC files for SEO issues, generate Schema.org markup, and even query Google Search Console data. I disable the tools I don't need to reduce noise.

8. Stripe MCP

"stripe": {
  "serverUrl": "https://mcp.stripe.com",
  "disabledTools": ["list_customers", "list_products", "..."]
}

For the dashboard's payment integration, having Stripe docs accessible through MCP is handy. I disable most data-reading tools since I don't want the AI browsing customer data — I only keep the documentation search and integration recommender active.

What I've Tried and Disabled

Not every MCP server deserves a spot in your config. Here's what I've experimented with and ultimately disabled:

  • GitHub MCP Server — Requires Docker, adds latency, and the AI already handles git operations well through the terminal
  • Mixpanel — Too many irrelevant tools for my use case, and the analytics data wasn't actionable in a coding context
  • Sentry — Good concept, but I found I rarely needed error tracking data while coding — I check Sentry separately
  • ClickHouse — Disabled because I don't have a ClickHouse instance; only useful if you're running analytics queries
  • Cloudflare Workers Bindings — Interesting for complex Workers setups, but not needed for my relatively simple cron-based agents
  • Context7 — Live documentation lookup that overlaps with what the Nuxt and Cloudflare MCP servers already provide

Tips for Configuring MCP Servers

Disable Tools You Don't Need

Most MCP servers come with 10-30 tools. If you only use 3 of them, disable the rest with the disabledTools array. Fewer tools means less context overhead for the AI, faster responses, and fewer irrelevant tool calls.
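As a concrete sketch — assuming your client supports a per-server disabledTools array, as Windsurf does in the Nuxt and Stripe entries above — trimming the memory server down to non-destructive tools might look like this (tool names are from the @modelcontextprotocol/server-memory docs; check the server's actual tool list before copying):

```json
"memory": {
  "command": "npx",
  "args": ["-y", "@modelcontextprotocol/server-memory"],
  "disabledTools": [
    "delete_entities",
    "delete_observations",
    "delete_relations"
  ]
}
```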

Use Scoped Credentials

Never use admin or service role keys in MCP configs. For Supabase, use a personal access token. For APIs, use scoped keys with minimal permissions. The AI will see whatever the credentials allow — keep the blast radius small.

Remote Servers Are Faster Than Local

MCP servers that run via serverUrl (remote HTTP) are generally faster to initialize than local npx-based servers. The first call to an npx server has a cold start while it downloads and initializes the package. Remote servers like Cloudflare Docs and Nuxt respond instantly.

Keep a "Current" and "All Available" Config

I maintain two config files: current-mcp-config.json with my active setup, and all-available-mcp-config.json with every server I've ever tried (including disabled ones). This makes it easy to re-enable a server when I need it for a specific project without having to look up the configuration again.

The Actual Impact on My Workflow

With MCP servers configured, my typical development flow looks like this: I describe what I want to build, the AI uses Supabase MCP to check the current schema, uses filesystem MCP to read related files, uses Cloudflare Docs to look up deployment syntax, and uses sequential thinking to plan the implementation. All without me opening a single browser tab.

The biggest win isn't any single server — it's the compound effect of having all your tools accessible in one place. The AI becomes a genuinely useful pair programmer instead of a text generator that needs you to manually provide all context.

Getting Started

If you're using Windsurf (or any MCP-compatible IDE), start with just two servers: filesystem and sequential-thinking. They're free, require no API keys, and immediately improve the AI's ability to work with your codebase. Add database and documentation servers as your projects require them.
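A complete starter config with just those two servers — no API keys required, only the filesystem path to adjust:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/your/projects"
      ]
    },
    "sequential-thinking": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-sequential-thinking"]
    }
  }
}
```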

The full MCP specification is open-source and growing fast. New servers are published weekly on npm, and more SaaS providers are adding MCP endpoints. It's still early days, but the developer experience improvement is already substantial.

Frequently Asked Questions

What is MCP (Model Context Protocol)?

MCP is an open standard that lets AI assistants connect to external tools and data sources. It provides a consistent interface for things like database queries, documentation search, file operations, and API calls — all accessible from within your AI coding environment.

Which IDE supports MCP servers?

Windsurf, Cursor, and other AI-focused IDEs support MCP. The configuration format is a JSON file that specifies which servers to connect to, either via local npx commands or remote HTTP endpoints.

Are MCP servers free?

Many MCP servers are free and open-source (filesystem, sequential-thinking, memory). Some are free tiers of paid services (Supabase, Cloudflare Docs). A few require paid subscriptions (Nuxt SEO Pro, SerpAPI). Start with the free ones and add paid servers as needed.

Do MCP servers slow down my IDE?

Remote servers (via serverUrl) have negligible impact. Local npx servers have a cold start on first use but are fast afterward. The key is to disable tools you don't need — fewer tools means less overhead for the AI when deciding which tool to call.

Is it safe to put API keys in MCP config?

Use scoped, minimal-permission keys — never admin or service role keys. The config file is local to your machine, but treat it like any credentials file: add it to .gitignore if it's in a repo, and rotate keys periodically. For sensitive services, consider using environment variable references instead of hardcoded keys.
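One way to keep keys out of the config file itself: local npx-based servers typically accept an env block. Whether a server reads a particular variable name is server-specific (SUPABASE_ACCESS_TOKEN is the name the Supabase server documents), and whether your client expands ${VAR} placeholders from your shell environment varies by IDE — verify both before relying on this:

```json
"supabase-mcp-server": {
  "command": "npx",
  "args": ["-y", "@supabase/mcp-server-supabase@latest"],
  "env": {
    "SUPABASE_ACCESS_TOKEN": "${SUPABASE_ACCESS_TOKEN}"
  }
}
```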