16 MCP servers you should know

Whether it's database updates, ticket creation or payment processing: the following sections introduce 16 MCP servers that let an AI language model perform such tasks from a natural-language instruction. Each profile explains in a few sentences which actions the server enables and what it is particularly well suited for.

Before you dive into the list of MCP servers, it's worth reading our detailed article on the Model Context Protocol. It describes how a single manifest securely connects language models to external services. With that background, it is easier to pick the right server for your tech stack from the overview below.

All servers presented here are official MCP servers: they are developed and maintained directly by the respective providers and evolve in lockstep with the underlying APIs. Endpoints, authentication flows and rate limits therefore match the live platform, which makes them suitable for production workflows, whereas community or pure reference servers usually serve demo purposes and are maintained less regularly.
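Regardless of the vendor, every MCP server speaks the same wire format: JSON-RPC 2.0 requests sent over whatever transport the server offers. A minimal sketch of the tools/call request an MCP client sends; the send_message tool and its arguments are hypothetical placeholders:

```python
import json

def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 "tools/call" request as used by MCP clients."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

# Example: calling a hypothetical "send_message" tool
payload = build_tool_call(1, "send_message", {"channel": "#team", "text": "Hello"})
print(payload)
```

Every server profile below ultimately receives requests of this shape; only the tool names and argument schemas differ.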

Zapier

Zapier operates a fully hosted MCP server that gives your language model direct access to over 7,000 apps and more than 30,000 actions. After creating an individual endpoint, you authorize the required apps via OAuth or API key; Zapier handles authentication entirely and caps usage at 300 calls per month during the beta. Each tool call returns a structured JSON response, so you can send messages, create records or book appointments without writing your own integrations. That makes Zapier MCP a flexible orchestration layer for production workflows, with no additional infrastructure on your side.


Make

Make operates a cloud-based MCP server that exposes every scenario you release as a standalone tool. As soon as you have generated an access token, those scenarios appear as executable actions in the manifest. An automated lead flow, for example, can use this to validate data, create CRM entries and send notifications. During a run, all log entries appear in the browser in real time, so every intermediate step can be traced. The platform automatically scales execution up to the parallel-run limit of your plan and offers fine-grained access rights, so only authorized scenarios are published. This keeps your automation transparent, controllable and extensible at any time, without you having to manage infrastructure.

n8n

n8n has shipped native MCP support since version 1.88.0. If you activate the MCP Server Trigger in a workflow, n8n generates a unique access URL including a token and exposes every workflow step marked as a tool node in the manifest. Such a workflow can retrieve open tasks, transform data and write to a database while the language model handles the orchestration. Executions are displayed in real time in the editor and can be debugged or restarted. With self-hosting, your data stays entirely in your infrastructure; with n8n Cloud, it is stored in the environment managed by n8n. You define access rights per workflow, so you control precisely which automations are exposed.

→ 💪🏼 Do you want to get started with n8n and MCP? Then we have prepared a practical guide for you here. 

Cloudflare

Cloudflare provides a suite of MCP servers, including services for DNS analysis, Workers observability, KV storage, Logpush, and documentation lookup. With a suitably scoped API or OAuth token, these tools can be accessed with low latency over the global Cloudflare network. The model can thus retrieve logs, create new namespaces or obtain configuration examples. The responses follow the JSON-RPC 2.0 format; for log queries, they contain edge timestamps, which facilitates troubleshooting. Granular token scopes prevent unwanted changes, and Cloudflare's global infrastructure ensures stable accessibility even during peak loads.
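A sketch of what consuming such a log response could look like. The events field and its keys are illustrative assumptions, not Cloudflare's actual schema; only the JSON-RPC 2.0 envelope follows the format described above:

```python
import json
from datetime import datetime, timezone

# Hypothetical JSON-RPC 2.0 response for a log query; the "events"
# payload is illustrative, not Cloudflare's real schema.
raw_response = json.dumps({
    "jsonrpc": "2.0",
    "id": 7,
    "result": {
        "events": [
            {"edge_timestamp": 1718000000, "status": 503, "path": "/api/orders"},
            {"edge_timestamp": 1718000042, "status": 200, "path": "/api/orders"},
        ]
    },
})

def failed_requests(response_json: str) -> list[str]:
    """Extract human-readable lines for log events with an error status."""
    result = json.loads(response_json)["result"]
    lines = []
    for event in result["events"]:
        if event["status"] >= 400:
            # Edge timestamps make it easy to correlate with deploys
            ts = datetime.fromtimestamp(event["edge_timestamp"], tz=timezone.utc)
            lines.append(f"{ts.isoformat()} {event['status']} {event['path']}")
    return lines

print(failed_requests(raw_response))
```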


Linear

Linear extends its fast project management tool with an official MCP server. After OAuth authorization, tools such as find_issues (search), create_issue (create) and update_issue (modify) are available, accepting parameters such as title, description and priority. This lets a language model create, filter or update tickets while Linear logs every action in the issue history. Responses are delivered via a streaming endpoint, so results arrive while execution is still in progress. OAuth scopes limit write and admin access.
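The parameters named above map onto the arguments of a tools/call request. A sketch assuming create_issue accepts exactly the fields this section lists; the real schema may differ:

```python
import json

def create_issue_call(title: str, description: str, priority: int) -> dict:
    """JSON-RPC 2.0 request for Linear's create_issue tool. The parameter
    names follow the article; the exact tool schema is an assumption."""
    if not title:
        raise ValueError("Linear issues need a non-empty title")
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "create_issue",
            "arguments": {
                "title": title,
                "description": description,
                "priority": priority,  # illustrative: lower number = more urgent
            },
        },
    }

request = create_issue_call("Fix login timeout", "Sessions expire after 30 s.", 2)
print(json.dumps(request, indent=2))
```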

PayPal

PayPal operates an MCP server for merchant processes. After successful OAuth authorization, the server only exposes the tools for which the token has the corresponding scopes, for example to create invoices or refund payments. Each transaction receives a unique reference ID, and the response is streamed back to the calling model immediately. Scope-based tokens strictly limit access to approved actions and thus prevent unwanted debits. For later review, PayPal provides audit-proof reports that keep all financial transactions traceable for several years.


Notion

Notion is currently testing a hosted MCP server. After OAuth authorization, the server exposes only the pages and databases that have been explicitly shared with the integration as tools. Responses come in a compact, Markdown-based schema and contain block IDs, so a model can precisely reference and edit individual elements. Available commands range from creating pages to writing database entries to searching content. Changes show up in Notion's version history shortly afterwards, and because integrations can only access shared areas, confidential data stays hidden.

GitHub

GitHub operates an official MCP server through which pull requests, issues, GitHub Actions and repository content can be addressed via a toolset. After authentication with a personal access token, commands such as pull_requests.list, issues.create, actions.dispatch_workflow or repos.search_files are available. Responses are delivered in JSON-RPC 2.0 format and contain status information, reviewer details and timestamps. The toolset option can restrict the available range of functions to specific areas, which reduces the attack surface. Tools such as list_workflow_jobs and get_job_logs also provide check and log data that can be evaluated in parallel during execution.
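The toolset restriction described above boils down to filtering which tools the server advertises to the model. A small sketch of that idea, using the tool names from this section:

```python
# Sketch: shrink the advertised toolset to explicitly allowed areas
# before handing it to the model. Tool names follow the article; the
# filtering itself is an illustration of the concept, not GitHub's code.
ALL_TOOLS = [
    "pull_requests.list",
    "issues.create",
    "actions.dispatch_workflow",
    "repos.search_files",
    "list_workflow_jobs",
]

def restrict_toolset(tools: list[str], allowed_prefixes: set[str]) -> list[str]:
    """Keep only tools whose namespace is explicitly allowed."""
    return [t for t in tools if t.split(".")[0] in allowed_prefixes]

# Expose read-oriented areas only, reducing the attack surface
readonly = restrict_toolset(ALL_TOOLS, {"pull_requests", "repos"})
print(readonly)  # → ['pull_requests.list', 'repos.search_files']
```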


Supabase

The Supabase MCP server makes Postgres databases and - depending on the implementation - Auth and storage resources of a Supabase project directly usable for LLM clients. After authentication using a personal access token, the server reads the database schema (and user or bucket information for community servers) and provides more than 20 standardized tools. These can be used to create tables, execute SQL queries, pause or reactivate projects and retrieve logs - without opening the Supabase dashboard.
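When a model is allowed to run SQL directly, putting a guard in front of the tool call is a sensible precaution. A minimal sketch of such a read-only gate; this is a deliberately crude heuristic for illustration, not a full SQL parser:

```python
def is_read_only(sql: str) -> bool:
    """Crude safety gate before handing SQL to an execute-style MCP tool:
    allow plain reads, reject statements that mutate data or schema."""
    stripped = sql.strip()
    if not stripped:
        return False
    first_word = stripped.split(None, 1)[0].lower()
    # WITH covers CTE-based reads; EXPLAIN covers query planning
    return first_word in {"select", "with", "explain"}

print(is_read_only("SELECT * FROM orders LIMIT 10"))  # → True
print(is_read_only("DROP TABLE orders"))              # → False
```

A real deployment would rely on scoped tokens and database roles rather than string checks, but a client-side gate like this catches obvious mistakes early.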


Stripe

The Stripe MCP server makes payment, customer, product and subscription functions of the Stripe API directly usable for LLM agents. After authentication with a restricted API key, tools such as paymentIntents.create, customers.list, subscriptions.update or products.read are available. The server responses are delivered in JSON-RPC 2.0 format and contain the same fields (id, amount, currency, etc.) as the associated Stripe objects. In addition, the documentation.read tool provides excerpts from the Stripe documentation so that code examples can be called up without context switching.
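Since Stripe amounts are integers in the smallest currency unit, it helps to validate the arguments before invoking paymentIntents.create. A sketch; the customer ID is a placeholder and the exact tool schema may differ:

```python
def payment_intent_args(amount_cents: int, currency: str, customer: str) -> dict:
    """Arguments for the paymentIntents.create tool named in the article.
    Stripe amounts are integers in the smallest currency unit (e.g. cents),
    so a float like 19.99 would be a bug."""
    if amount_cents <= 0:
        raise ValueError("amount must be a positive integer in cents")
    return {
        "amount": amount_cents,
        "currency": currency.lower(),
        "customer": customer,  # hypothetical customer ID for illustration
    }

args = payment_intent_args(1999, "EUR", "cus_example123")
print(args)  # → {'amount': 1999, 'currency': 'eur', 'customer': 'cus_example123'}
```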


Figma Dev Mode

In Dev Mode, Figma provides a locally running MCP server that makes selected design frames accessible to language models. Once activated, tools can retrieve the associated React-with-Tailwind code or the design variables used in a frame. The responses contain the node ID reference, Tailwind classes and token values, so models can uniquely identify and process individual elements. Because the server runs in the desktop client, this metadata initially stays on your own computer and is only transferred when it is actively sent to a model. Dev Mode shortens the path from design to working interface and drastically reduces context switches.


DeepL

DeepL provides a reference implementation of an MCP server that seamlessly integrates translations and paraphrases into LLM workflows. After you have stored a DeepL API key, the server publishes tools for text translation, language list queries and paraphrasing. When translating, you can transfer text, a desired formality level and optionally a glossary ID. The responses follow the JSON-RPC 2.0 format and can contain a field with the billed characters if desired, while information on the remaining monthly quota is retrieved via a separate usage tool. In this way, DeepL functions can be embedded in automated processes without additional integration effort.

Firecrawl

Firecrawl combines search, scraping and link crawling in an MCP server that delivers web data in a structured manner. As soon as an API key is stored, tools are available for scraping individual pages, batch jobs, web search queries and crawling. Depending on the tool, the responses contain raw HTML, extracted text or rendered Markdown as well as metadata such as the page title and author or date details. Configurable parameters for depth, rate limits and timeouts allow you to keep costs under control, while webhook events provide information on the progress of long crawl jobs. This allows a model to retrieve up-to-date web content and back up its answers with fresh sources.
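The cost-control parameters can be bundled into a crawl-job configuration like the following sketch; the key names are assumptions for illustration, not Firecrawl's exact schema:

```python
def crawl_config(url: str, max_depth: int = 2, limit: int = 50,
                 timeout_s: int = 30) -> dict:
    """Sketch of a crawl-job configuration with the cost-control knobs the
    article mentions (depth, limits, timeouts); key names are assumptions."""
    if max_depth < 0 or limit <= 0:
        raise ValueError("depth must be >= 0 and limit must be positive")
    return {
        "url": url,
        "maxDepth": max_depth,   # how many link hops to follow from the start URL
        "limit": limit,          # hard cap on pages, bounding cost
        "timeout": timeout_s,    # per-page timeout in seconds
    }

job = crawl_config("https://example.com/docs", max_depth=1, limit=20)
print(job)
```

Keeping depth and page limits tight is usually the cheapest way to experiment before scaling a crawl up.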

HubSpot

HubSpot opens its Smart CRM to MCP clients. Once token scopes have been defined, tools are available to retrieve, create or update contacts, deals, tickets and tasks, as well as to manage associations. The model can summarize open deals or create new contacts, and every change is immediately reflected in the CRM history. Calls outside the granted rights are denied. Sales, marketing and support thus merge into a single language-driven interface, and consistent customer data remains centralized.

My colleague Alexander Sprogis, for example, uses HubSpot to generate a daily summary of the performance of all lead funnels.

Perplexity

Perplexity offers an MCP server that provides the Sonar Search API as a real-time research tool. The model poses search queries, structures the results and enriches them with citations, so answers are well founded and verifiable. Metadata such as source and publication date is provided as a compact JSON structure. Rate and budget limits prevent excessive use. Perplexity replaces static training knowledge with up-to-date web information and builds trust among professional users and teams.

Canva

Canva provides an MCP server that not only searches your design archive, but also creates new designs and adapts existing ones directly in the chat. After OAuth, two tool groups appear: one for researching and summarizing your presentations, docs and campaign assets, and a second for actions such as creating presentations, reformatting assets or updating designs. A short prompt such as "Create a pitch deck from this conversation" immediately creates a new presentation in your workspace.

Conclusion

The true power of MCP servers lies in combination: you can, for example, connect several of them to Claude Desktop and build an AI assistant that answers your questions and performs tasks for you. As soon as analysis, data management, automation and design appear as independent tools in the manifest, your language model orchestrates complete workflows in a single dialog. Instead of switching between apps, you simply describe the goal; the assistant calls the appropriate endpoints, summarizes results and delivers the finished artifact. In this way, the Model Context Protocol turns disparate services into a modular ecosystem that grows with each new server and noticeably shortens routine work.
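As a concrete example of such a combination, a client like Claude Desktop is pointed at several servers in its configuration file. A sketch that generates a claude_desktop_config.json with two entries; the launch commands and package names are placeholders you would replace with each vendor's documented setup:

```python
import json

# Sketch of a claude_desktop_config.json combining two MCP servers.
# Commands, package names and env variable names are placeholders,
# not the vendors' guaranteed launch instructions.
config = {
    "mcpServers": {
        "github": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-github"],
            "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>"},
        },
        "firecrawl": {
            "command": "npx",
            "args": ["-y", "firecrawl-mcp"],
            "env": {"FIRECRAWL_API_KEY": "<your-key>"},
        },
    }
}

print(json.dumps(config, indent=2))
```

With both entries in place, the client lists the combined toolset in one manifest, and a single prompt can draw on code data and fresh web content together.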

→ 💪🏼 If you want to learn more about MCP Server, then this way!
