The universal language that lets AI models talk to your tools, databases, and file systems — explained from first principles.
The Model Context Protocol (MCP) is an open standard introduced by Anthropic that defines a universal way for AI models to connect to external tools, data sources, and services. Think of it as a common plug socket — just as all your appliances use the same wall socket regardless of who made them, MCP allows any AI model to connect to any tool using the same standard interface.
Before MCP, every integration between an AI and a tool was bespoke. A company wanting Claude to read their database would need to write custom code, different from the custom code needed to connect GPT-4 to that same database. MCP eliminates this duplication by defining a protocol that any model and any tool can implement once and then interoperate freely.
At its core, MCP defines three primitives that any server can expose:
- **Tools**: functions the AI can call to take actions — like running a SQL query, reading a file, or sending an API request. Tools are the verbs of MCP.
- **Resources**: data sources the AI can read from — like a file, a database table, or a URL. Resources are the nouns of MCP — they provide context to the model.
- **Prompts**: pre-written instruction templates that guide the AI on how to use the server effectively. Prompts are the instructions that come in the box.
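To make the primitives concrete, here is roughly what a server's reply to a `tools/list` request looks like. The field names follow the MCP specification; the `read_file` tool shown is the filesystem server's, used purely as an illustration:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "read_file",
        "description": "Read the full contents of a file",
        "inputSchema": {
          "type": "object",
          "properties": { "path": { "type": "string" } },
          "required": ["path"]
        }
      }
    ]
  }
}
```

The `inputSchema` is plain JSON Schema, which is how the model knows what arguments each tool accepts before ever calling it.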
The story of MCP is the story of a problem that every AI developer kept running into — and one team's decision to solve it properly rather than patch around it.
MCP follows a clean client–server architecture built on top of JSON-RPC 2.0. Every component has a defined role and communicates over a well-specified channel.
When you ask Claude to "query my database", here is exactly what happens:
1. Claude recognises that answering requires outside data and selects the appropriate tool from the list the server advertised at startup.
2. Claude emits a tools/call request to the MCP client.
3. The MCP client forwards the request to the MCP server over the transport layer.
4. The server runs the query and returns the result, which Claude weaves into its reply.

MCP supports two transport layers depending on where your server lives:
- **Stdio (standard input/output)**: the server runs as a local subprocess and exchanges JSON-RPC messages over stdin/stdout. This is the transport that local npx commands use. Zero network overhead, inherits the user's local permissions.
- **HTTP + SSE (Server-Sent Events)**: the server runs as a remote service and communicates over the network. Used when the tool lives on another machine or in the cloud.

You already know what an API is. An API is a contract — here are my endpoints, here are the inputs I accept, here are the outputs I return. MCP is essentially an API specification, but it's designed specifically for the interaction pattern between an AI model and a tool.
The critical difference is that MCP is model-agnostic and tool-agnostic. Traditional APIs are point-to-point: you write a specific integration between System A and System B. MCP is hub-and-spoke: you write one MCP server that exposes your tool's capabilities, and any MCP-compatible AI client can use it immediately — no further integration work.
Under the hood, MCP uses JSON-RPC 2.0 — a lightweight remote procedure call protocol. When Claude wants to list your database tables, it sends something like this over stdin to the MCP server process:
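A sketch of that request, assuming the SQLite server's `list_tables` tool (the `id` value is arbitrary — it just pairs the response with the request):

```json
{
  "jsonrpc": "2.0",
  "id": 42,
  "method": "tools/call",
  "params": {
    "name": "list_tables",
    "arguments": {}
  }
}
```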
The SQLite server processes this, queries sqlite_master, and returns:
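A sketch of the matching response — the `id` echoes the request, and the result arrives as text content; the table names here are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 42,
  "result": {
    "content": [
      {
        "type": "text",
        "text": "[\"bronze_sales\", \"silver_sales\", \"gold_daily_sales\"]"
      }
    ]
  }
}
```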
Imagine you have a very smart robot assistant. This robot is incredibly good at thinking and answering questions, but it lives inside a box and can't see or touch anything in the real world by itself. MCP is like a set of helper machines plugged into the box: one machine can read files, one can look things up in a database, one can send messages, and they all connect through the same standard cable.
The really clever part is that because all the helper machines speak the same language (MCP), you only have to build each helper machine once. After that, any robot — not just Claude, but any AI that speaks MCP — can use it. It's like building a LEGO connector piece that works with every LEGO set ever made.
MCP servers are configured in a JSON file that Claude reads on startup. The file tells Claude which servers to launch, how to launch them, and what arguments to pass.
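For Claude Desktop this file is `claude_desktop_config.json`. A minimal example with two stdio servers — the local paths are illustrative and should be replaced with your own:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Desktop"]
    },
    "sqlite": {
      "command": "npx",
      "args": ["-y", "mcp-sqlite", "/Users/you/data/retail.db"]
    }
  }
}
```

Each key under `mcpServers` is a name of your choosing; the `command` and `args` tell Claude how to spawn the server process.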
If you use Claude Code (the terminal-based agentic coding tool), you add MCP servers directly from the command line:
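For example, adding the community SQLite server might look like this — the server name and database path are illustrative:

```shell
# Everything after -- is the command used to launch the server
claude mcp add sqlite -- npx -y mcp-sqlite ./retail.db

# Confirm it was registered
claude mcp list
```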
For stdio-based MCP servers running via npx, you need Node.js 18+ installed on your machine. Verify with:
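```shell
node --version   # should report v18.0.0 or later
npx --version    # npx ships with npm / Node.js
```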
The two most commonly used local MCP servers are the filesystem server (official, by Anthropic) and the SQLite server (community-built). Here's what they expose and how they differ.
Built by Anthropic · TypeScript · github.com/modelcontextprotocol/servers
| Tool | What it does | Example use |
|---|---|---|
| read_file | Read full contents of a file | "Read my config.json" |
| write_file | Create or overwrite a file | "Save this code to main.py" |
| edit_file | Make targeted partial edits | "Fix line 42 in app.js" |
| list_directory | Show files and folders in a directory | "What's on my desktop?" |
| create_directory | Make a new folder | "Create a /reports folder" |
| move_file | Move or rename a file | "Rename draft.txt to final.txt" |
| search_files | Recursive search by name pattern | "Find all .csv files" |
| get_file_info | Metadata: size, dates, type | "When was report.pdf last modified?" |
Community-built · TypeScript · better-sqlite3 driver · npmjs.com/package/mcp-sqlite
| Tool | What it does | Example use |
|---|---|---|
| query | Execute any raw SQL statement | "SELECT * FROM silver_sales" |
| list_tables | List all user tables in the DB | "What tables exist?" |
| get_table_schema | Get column definitions for a table | "Show me the schema of gold_customer_360" |
| create_record | Insert a row via JSON object | "Add a new product record" |
| update_records | Update rows matching conditions | "Mark PROD010 as inactive" |
| delete_records | Delete rows matching conditions | "Remove the test batch" |
| db_info | File path, size, table count, modified date | "How big is my database?" |
SQLite is the simplest entry point — it's a local file, zero infrastructure, perfect for learning. But the MCP database ecosystem covers the full spectrum of production-grade databases. Because every MCP server exposes the same three primitives (Tools, Resources, Prompts), swapping from SQLite to PostgreSQL or Snowflake requires only a config file change — your Claude prompts and workflows stay identical.
| Database | Package / Server | Type | Best For |
|---|---|---|---|
| SQLite | mcp-sqlite | Community · Local | Local files, prototyping, learning MCP |
| PostgreSQL | @modelcontextprotocol/server-postgres | Official · Local/Remote | Production apps, full SQL, JSONB, extensions |
| MySQL / MariaDB | mcp-mysql | Community · Remote | Web apps, legacy systems, LAMP stacks |
| Microsoft SQL Server | mcp-mssql | Community · Remote | Enterprise, Azure, .NET ecosystems |
| MongoDB | mcp-mongodb | Community · Remote | Document stores, flexible schemas, real-time apps |
| Snowflake | mcp-snowflake | Community · Cloud | Data warehousing, analytics at scale, BI workloads |
| BigQuery | mcp-bigquery | Community · Cloud | Google Cloud analytics, petabyte-scale queries |
| Redis | mcp-redis | Community · Remote | Caching layers, session stores, pub/sub |
| DuckDB | mcp-duckdb | Community · Local | In-process analytics, Parquet/CSV querying, OLAP |
The medallion architecture we built in SQLite could be pointed at a production PostgreSQL database with one edit to the config file. The natural language instructions to Claude stay exactly the same.
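As a sketch, assuming the official PostgreSQL server and a hypothetical connection string, the entire migration is this one config entry:

```json
{
  "mcpServers": {
    "database": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://user:password@prod-host:5432/retail"
      ]
    }
  }
}
```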
You can connect multiple MCP database servers simultaneously. Claude will route each query to the right server automatically based on context. A real example: Claude reads raw CSV files from the filesystem MCP, queries a staging table in the PostgreSQL MCP, and writes results into a Snowflake warehouse MCP — all in one conversation.
A powerful demonstration of MCP in practice: using the SQLite MCP server to design and populate a full Bronze → Silver → Gold medallion data architecture — entirely through natural language instructions to Claude, with no SQL IDE open.
Using the SQLite MCP's query tool, Claude executed CREATE TABLE statements for all 10 tables with appropriate column types, constraints, default values, and indexes — all in response to plain English instructions like "create bronze layer tables for raw data ingestion with a load timestamp."
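A sketch of what one such bronze-layer table might look like — table and column names are hypothetical, but the pattern (everything as raw TEXT plus a load timestamp) is the one described above:

```sql
-- Bronze layer: raw ingestion, no cleaning, typed later in silver
CREATE TABLE IF NOT EXISTS bronze_sales (
    sale_id      TEXT,
    customer_id  TEXT,
    product_id   TEXT,
    quantity     TEXT,   -- kept as TEXT in bronze; cast in the silver layer
    unit_price   TEXT,
    sale_status  TEXT,
    _loaded_at   TEXT DEFAULT (datetime('now'))
);
```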
Claude inserted realistic but intentionally messy raw data: duplicate customer records, a sale with a negative quantity, a product where unit cost exceeded unit price, mixed-case text values. This simulated real-world ingestion problems.
Claude wrote INSERT INTO silver_* SELECT ... FROM bronze_* transforms that cast TEXT to proper types, derived computed columns (email domain, full name, margin %), normalised inconsistent values (ACTIVE/active/true/1 → 1), deduped records, and flagged bad rows with dq_is_valid = 0.
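A sketch of one such transform in SQLite SQL, with hypothetical column names, showing the cast–normalise–dedupe–flag pattern:

```sql
-- Silver layer: cast types, normalise values, dedupe, flag bad rows
INSERT INTO silver_sales
SELECT
    CAST(sale_id    AS INTEGER)                       AS sale_id,
    CAST(quantity   AS INTEGER)                       AS quantity,
    CAST(unit_price AS REAL)                          AS unit_price,
    CASE WHEN LOWER(sale_status) IN ('active', 'true', '1')
         THEN 1 ELSE 0 END                            AS is_active,
    CASE WHEN CAST(quantity AS INTEGER) > 0
          AND CAST(unit_price AS REAL) >= 0
         THEN 1 ELSE 0 END                            AS dq_is_valid
FROM (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY sale_id
                              ORDER BY _loaded_at DESC) AS rn
    FROM bronze_sales
)
WHERE rn = 1;   -- keep only the newest copy of each duplicated sale
```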
Claude joined and aggregated silver tables into four business-ready gold tables: daily sales with status pivot, customer 360 with RFM signals, product performance with stock coverage days, and category revenue with month-over-month growth — all verified to produce consistent revenue totals of $5,279.65 across every gold table.
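One such gold rollup might look like this — again with hypothetical columns, and note the `dq_is_valid` filter so bad rows never reach business reporting:

```sql
-- Gold layer: business-ready daily revenue from cleaned silver data
CREATE TABLE gold_daily_sales AS
SELECT
    sale_date,
    COUNT(*)                   AS order_count,
    SUM(quantity * unit_price) AS revenue
FROM silver_sales
WHERE dq_is_valid = 1
GROUP BY sale_date
ORDER BY sale_date;
```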
One of the most powerful — and underexplored — applications of MCP is using it to give AI agents a role inside your CI/CD pipeline. Rather than just generating code or answering questions, an agent equipped with MCP tools can autonomously execute pipeline steps: run migrations, validate data quality, commit schema changes, and trigger deployments.
When Claude Code (or any MCP-compatible agent) runs inside your pipeline, it connects to the same MCP servers you use interactively. This means the exact same capabilities available in a chat session are available to the agent running headlessly in GitHub Actions, GitLab CI, or Jenkins.
Triggered on every pull request that touches /migrations/, a Claude Code agent uses the PostgreSQL MCP to connect to a staging database, inspect the current schema, apply the migration SQL, verify the result, and post a structured report as a PR comment — all without a human manually running psql.
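A minimal sketch of such a workflow for GitHub Actions, assuming Claude Code's headless `-p` mode and a hypothetical prompt; flag names and secrets setup should be checked against the current Claude Code documentation:

```yaml
# Hypothetical CI job: run a Claude Code agent on migration PRs
name: migration-check
on:
  pull_request:
    paths: ["migrations/**"]
jobs:
  migrate-staging:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run agent against staging DB
        run: |
          claude -p "Apply the new migration to staging, verify the schema, and summarise the result" \
            --mcp-config .claude/mcp.json
        env:
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
```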
After each nightly ETL run, an agent connects via the SQLite or PostgreSQL MCP, queries all silver and gold tables, checks that dq_is_valid = 0 row counts are within acceptable thresholds, compares revenue totals across gold tables for consistency, and either approves the pipeline run or triggers a Slack alert via a third MCP server — no human reviewer needed for routine runs.
An agent uses the filesystem MCP to read newly committed SQL migration files, the database MCP to compare the expected schema against what's actually in production, and the GitHub MCP to post a detailed PR comment flagging any drift, missing indexes, or type mismatches — acting as an automated database reviewer on every pull request.
The agent is configured the same way as an interactive session: the MCP servers it may use are declared in a checked-in .claude/mcp.json.

The MCP ecosystem is growing rapidly. These are the highest-signal resources for learning more.