MCP has been around for a while. Is it useful, or is it just hype? And what does it actually do?

Large Language Models (LLMs) have two big flaws.
First, they kind of live under a rock. Their knowledge is limited to their training data, which is often outdated.

Second, they cannot interact with external tools. For example, an LLM cannot read your database on its own.

MCP, or Model Context Protocol, is a protocol that provides context (and tools) to LLMs.
In other words, it makes LLMs less dumb by giving them access to external tools and info.
How does it do that?
Consider these two actions:
Reading from a MongoDB database
Reading a Figma file

Performing these actions takes different steps and configurations. For a hundred tools, there are a hundred ways to set them up.

So, it's difficult for LLMs to get them right.
What if we make the steps for reading from a database and a Figma file identical?

This is what MCP does.
It provides a standard way for the AI agent (the host/client) to communicate with external tools (servers).

Rather than having LLMs figure out the steps, the tool providers take care of the implementation and expose an API (an MCP server) that the LLM can easily understand.
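Under the hood, MCP standardizes this communication as JSON-RPC 2.0 messages. Here is a rough sketch of the request an agent sends to call a tool (the message shape follows the MCP spec; the tool names and arguments are illustrative):

```javascript
// Sketch of the standardized request shape (simplified; MCP uses JSON-RPC 2.0).
// The only thing that changes between a MongoDB call and a Figma call
// is the tool name and its arguments — the envelope stays the same.
function buildToolCall(id, toolName, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: toolName, arguments: args },
  };
}

// Same shape, different tools:
const mongoCall = buildToolCall(1, "find", { filter: { active: true } });
const figmaCall = buildToolCall(2, "getDesignContext", { nodeId: "123:456" });
```

Because every server accepts this same envelope, the agent never needs tool-specific plumbing.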
For example, the MongoDB MCP server might expose a list of tools like these:

```javascript
/* Run a find query */
@mcpTool
function find(query) {
  // implementation
  return documents;
}

/* Insert a new document into a collection */
@mcpTool
function insertOne(collection, document) {
  // implementation
  return insertedId;
}

// ...
```

For Figma, it might be these:

```javascript
/* Get design context for a Figma node */
@mcpTool
function getDesignContext(query) {
  // implementation
  return nodes;
}

// ...
```
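To make the pattern concrete, here is a minimal, self-contained sketch of what an MCP server does internally (assumption: a hand-rolled registry rather than the official SDK, and an in-memory array standing in for MongoDB):

```javascript
// Hand-rolled sketch of an MCP server's core: a tool registry.
// Each tool has a name, a description, and a handler function.
const tools = new Map();

function registerTool(name, description, handler) {
  tools.set(name, { name, description, handler });
}

// The two operations every MCP server supports:
// listing its tools and calling one of them by name.
function listTools() {
  return [...tools.values()].map(({ name, description }) => ({ name, description }));
}

function callTool(name, args) {
  const tool = tools.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(args);
}

// A hypothetical MongoDB-style "find" tool, backed by an in-memory array.
const users = [
  { name: "Ada", actions: 42 },
  { name: "Linus", actions: 7 },
];
registerTool("find", "Run a find query", ({ minActions }) =>
  users.filter((u) => u.actions >= minActions)
);
```

The descriptions matter: they are what the LLM reads to decide which tool fits the user's request.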
Let's say you want the AI agent to read data from your MongoDB database.
This is your prompt:
list the most active Users from my database
The AI agent:
Analyzes the prompt and understands that it should use an MCP server

Lists all the MCP tools and, based on their descriptions, determines that it needs the find tool from MongoDB

Sends a request to the server with the tool name and arguments

Sends you the response data
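The loop above can be sketched in code (assumption: `llm` and `mcpServer` are hypothetical stand-ins for a real model client and a real MCP connection, which would speak JSON-RPC):

```javascript
// Sketch of the agent loop described above.
async function answerWithTools(prompt, llm, mcpServer) {
  // 1. List the server's tools so the model can pick one by description.
  const tools = await mcpServer.listTools();

  // 2. Ask the model which tool to call and with what arguments.
  const { toolName, args } = await llm.chooseTool(prompt, tools);

  // 3. Send the request to the MCP server.
  const result = await mcpServer.callTool(toolName, args);

  // 4. Let the model turn the raw result into a user-facing answer.
  return llm.summarize(prompt, result);
}
```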

This process is the same for every external service. The LLM just calls the tool it needs and sends a request to the tool's MCP server.
When can you use MCP?
It’s useful whenever you need to give your LLM access to an external service or information.
Example use cases:
Using the Postgres MCP server to read and analyze your data
Using the Next.js MCP server to ask questions about the documentation
Using the Blender MCP server to create 3D models with code
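In most clients, hooking up one of these servers is just a config entry. Here is a sketch of the JSON format used by clients like Claude Desktop (the server package name and connection string are illustrative, and exact keys can vary by client):

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"]
    }
  }
}
```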
From Anime Coders
PS: Want to stay competitive as a developer by adding JavaScript to your resume? Try Gacha Coders, a Gacha game that teaches you how to code.
Write docs 4x faster. Without hating every second.
Nobody became a developer to write documentation. But the docs still need to get written — PRDs, README updates, architecture decisions, onboarding guides.
Wispr Flow lets you talk through it instead. Speak naturally about what the code does, how it works, and why you built it that way. Flow formats everything into clean, professional text you can paste into Notion, Confluence, or GitHub.
Used by engineering teams at OpenAI, Vercel, and Clay. 89% of messages sent with zero edits. Works system-wide on Mac, Windows, and iPhone.


