Model Context Protocol (MCP): The USB-C of AI Integrations



Every AI application needs to connect to external tools—databases, APIs, file systems. Until now, each integration was custom. Model Context Protocol (MCP) changes that with a universal standard.


The Integration Problem

Before MCP, connecting an LLM to tools looked like this:

Your App → Custom Code → Tool A
        → Different Code → Tool B  
        → More Code → Tool C

Every tool needed custom integration. Every AI platform did it differently.

What is MCP?

Model Context Protocol is an open standard that defines how LLMs communicate with external tools and data sources. Think of it as USB-C for AI—one connector, universal compatibility.


Core Concepts

MCP Server: Exposes tools and resources to the AI
MCP Client: The AI application that connects to servers
Transport: How the two sides communicate (stdio, HTTP)

AI App (Client) ←→ MCP Protocol ←→ MCP Server ←→ Database/API/Tool
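To make the client side concrete, here is a minimal sketch using the official Python SDK. It assumes a weather server script like the one built in the next section: it spawns the server over stdio, runs the initialization handshake, asks the server to advertise its tools, and then calls one of them.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Spawn the MCP server as a subprocess and talk to it over stdio
    params = StdioServerParameters(command="node", args=["./weather-server.js"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()            # protocol handshake
            tools = await session.list_tools()    # the server advertises its tools
            print([tool.name for tool in tools.tools])
            result = await session.call_tool("get_weather", {"city": "Seoul"})
            print(result.content)

asyncio.run(main())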

Building an MCP Server

Let’s create a simple MCP server that provides weather data:

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  ListToolsRequestSchema,
  CallToolRequestSchema
} from "@modelcontextprotocol/sdk/types.js";

const server = new Server({
  name: "weather-server",
  version: "1.0.0"
}, {
  capabilities: {
    tools: {}
  }
});

// Advertise the available tools
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [{
    name: "get_weather",
    description: "Get current weather for a city",
    inputSchema: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name" }
      },
      required: ["city"]
    }
  }]
}));

// Handle tool calls
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name !== "get_weather") {
    throw new Error(`Unknown tool: ${request.params.name}`);
  }
  const { city } = request.params.arguments;
  const weather = await fetchWeather(city); // your own weather API helper
  return {
    content: [{
      type: "text",
      text: JSON.stringify(weather)
    }]
  };
});

// Start the server over stdio
const transport = new StdioServerTransport();
await server.connect(transport);

MCP Resources vs Tools

MCP defines two main types of capabilities:

Tools

Functions the AI can call:

{
  name: "send_email",
  description: "Send an email",
  inputSchema: { /* params */ }
}

Resources

Data the AI can read:

{
  uri: "file:///path/to/document.md",
  name: "Project README",
  mimeType: "text/markdown"
}
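To see the difference in code, here is a minimal sketch using the FastMCP helper from the official Python SDK. The send_email body is a stand-in, and the README path is the placeholder from above.

from mcp.server.fastmcp import FastMCP

server = FastMCP("docs-server")

# Tool: a function the AI can call with arguments
@server.tool()
def send_email(to: str, subject: str, body: str) -> str:
    """Send an email"""
    return f"Sent '{subject}' to {to}"  # stand-in; wire up a real mailer here

# Resource: data the AI can read by URI
@server.resource("file:///path/to/document.md")
def project_readme() -> str:
    """Project README"""
    with open("/path/to/document.md") as f:
        return f.read()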

Connecting MCP to Claude

Claude Desktop and Claude Code natively support MCP. Configuration is simple:

// claude_desktop_config.json
{
  "mcpServers": {
    "weather": {
      "command": "node",
      "args": ["./weather-server.js"]
    },
    "database": {
      "command": "python",
      "args": ["./db-server.py"],
      "env": {
        "DATABASE_URL": "postgres://..."
      }
    }
  }
}

Now Claude can query weather and databases without custom code in your app.

Real-World MCP Servers

The ecosystem is growing fast:

Server                                       Purpose
@modelcontextprotocol/server-filesystem      File operations
@modelcontextprotocol/server-github          GitHub API
@modelcontextprotocol/server-postgres        Database queries
@modelcontextprotocol/server-brave-search    Web search
@modelcontextprotocol/server-slack           Slack integration
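For example, the filesystem server plugs into the same claude_desktop_config.json shown above; the directory argument is whichever folder you choose to expose:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/dir"
      ]
    }
  }
}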

Building a Database MCP Server

Here’s a more practical example: a PostgreSQL server built with the Python SDK’s FastMCP helper.

import json
import os

import asyncpg
from mcp.server.fastmcp import FastMCP

DATABASE_URL = os.environ["DATABASE_URL"]

server = FastMCP("postgres-mcp")

@server.tool()
async def query_database(sql: str) -> str:
    """Execute a read-only SQL query"""
    if not sql.strip().upper().startswith("SELECT"):
        return "Error: Only SELECT queries allowed"

    conn = await asyncpg.connect(DATABASE_URL)
    try:
        rows = await conn.fetch(sql)
        # default=str handles dates and decimals returned by asyncpg
        return json.dumps([dict(r) for r in rows], default=str)
    finally:
        await conn.close()

@server.tool()
async def list_tables() -> str:
    """List all tables in the database"""
    return await query_database(
        "SELECT table_name FROM information_schema.tables "
        "WHERE table_schema = 'public'"
    )

if __name__ == "__main__":
    server.run()  # stdio transport by default

Security Considerations

MCP servers have access to sensitive systems. Be careful:

1. Principle of Least Privilege

# Bad: Full database access
@server.tool()
async def execute_sql(sql: str): ...

# Good: Scoped read-only access
@server.tool()
async def get_user_orders(user_id: int): ...
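A fuller sketch of the scoped tool above, assuming a hypothetical orders table and reusing the asyncpg connection pattern from the PostgreSQL example. The key point is that the SQL text is fixed and the user-supplied value is passed as a bound parameter.

@server.tool()
async def get_user_orders(user_id: int) -> str:
    """Return orders for a single user (fixed, read-only query)"""
    conn = await asyncpg.connect(DATABASE_URL)
    try:
        rows = await conn.fetch(
            "SELECT id, total, created_at FROM orders WHERE user_id = $1",
            user_id,  # bound parameter, never interpolated into the SQL string
        )
        return json.dumps([dict(r) for r in rows], default=str)
    finally:
        await conn.close()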

2. Input Validation

from pathlib import Path

ALLOWED_DIR = Path("/path/to/allowed/dir").resolve()  # directory you choose to expose

@server.tool()
async def read_file(path: str) -> str:
    # Prevent path traversal: resolve the target and keep it inside ALLOWED_DIR
    safe_path = (ALLOWED_DIR / path).resolve()
    if not safe_path.is_relative_to(ALLOWED_DIR):
        raise ValueError("Access denied")
    return safe_path.read_text()

3. Rate Limiting

from limits import RateLimitItemPerMinute
from limits.storage import MemoryStorage
from limits.strategies import MovingWindowRateLimiter

limiter = MovingWindowRateLimiter(MemoryStorage())
search_limit = RateLimitItemPerMinute(100)  # at most 100 calls per minute

@server.tool()
async def web_search(query: str) -> str:
    # hit() records the call and returns False once the window is exhausted
    if not limiter.hit(search_limit, "web_search"):
        return "Rate limited. Try again later."
    ...

MCP vs Function Calling

How does MCP compare to OpenAI’s function calling?

Aspect       Function Calling        MCP
Scope        Single API call         Full server
Discovery    Defined per request     Server advertises tools
State        Stateless               Can maintain state
Standard     Vendor-specific         Open protocol

MCP is higher-level—it’s about servers that provide multiple tools, not individual function definitions.

The Future of MCP

As MCP adoption grows:

  1. Marketplace of servers - Install capabilities like npm packages
  2. Cross-platform compatibility - Write once, use with any MCP client
  3. Composable AI systems - Chain servers together

# Future: Installing AI capabilities
mcp install weather database github slack
claude --with-mcp="weather,database,github"

Getting Started

  1. Use existing servers: Check github.com/modelcontextprotocol/servers
  2. Build custom servers: Use the TypeScript or Python SDK (install commands below)
  3. Configure Claude: Add servers to your config
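For step 2, both SDKs ship as ordinary packages; assuming Node.js and Python environments, the installs look like this:

# TypeScript SDK
npm install @modelcontextprotocol/sdk

# Python SDK
pip install mcp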

MCP makes AI integrations portable, reusable, and standardized. If you’re building AI applications, it’s worth learning.


MCP is still evolving. Watch the spec and join the community to shape its future.

If you found this article helpful, please like it and click an ad :)