MCP: The USB‑C Standard for AI

Key Points

  • MCP (Model Context Protocol) is a new open standard, introduced by Anthropic in late 2024, that standardizes how AI applications connect LLMs to external data sources, similar to how USB‑C standardizes hardware connections.
  • The protocol defines an MCP host that runs multiple MCP clients, each opening a JSON‑RPC 2.0 session to communicate with MCP servers that expose specific capabilities such as database access, code repositories, or email services.
  • MCP addresses two core needs of LLM‑based AI agents: delivering external contextual data (e.g., documents, knowledge‑base entries, DB records) and enabling the agents to invoke tools or actions like web searches, service calls, or calculations.
  • By providing a common set of primitives for context retrieval and tool execution, MCP offers a plug‑and‑play alternative to bespoke APIs, allowing any compatible peripheral or service to be used by LLMs without custom integration work.

Full Transcript

# MCP: The USB‑C Standard for AI

**Source:** [https://www.youtube.com/watch?v=7j1t3UZA1TY](https://www.youtube.com/watch?v=7j1t3UZA1TY)
**Duration:** 00:12:58

## Sections

- [00:00:00](https://www.youtube.com/watch?v=7j1t3UZA1TY&t=0s) **MCP — USB‑C‑Style Standard for LLMs** - Anthropic’s Model Context Protocol (MCP) is an open-standard, JSON-RPC-based framework that lets AI applications plug into external data sources and services through a uniform “USB‑C”‑like interface, differing from traditional ad-hoc APIs.
- [00:03:08](https://www.youtube.com/watch?v=7j1t3UZA1TY&t=188s) **MCP Server Primitives for AI Agents** - Explains how an MCP server supplies AI agents with context and tool access through three primitives: tools, resources, and prompt templates.
- [00:06:14](https://www.youtube.com/watch?v=7j1t3UZA1TY&t=374s) **RESTful API Basics and Usage** - Explains how clients interact with abstracted server services via RESTful APIs over HTTP, using standard methods and JSON, with examples such as a library system and large-language-model endpoints.
- [00:09:35](https://www.youtube.com/watch?v=7j1t3UZA1TY&t=575s) **Model Context Protocol vs APIs** - The Model Context Protocol (MCP) is purpose-built for LLMs, offering standardized patterns and runtime discovery of functions so agents can automatically adapt to new capabilities, unlike conventional REST APIs that require manual updates.
- [00:12:41](https://www.youtube.com/watch?v=7j1t3UZA1TY&t=761s) **MCP Enables Unified Data Integration** - MCP provides standardized access to various services—including file systems, Google Maps, Docker, and Spotify—allowing AI agents to integrate enterprise data sources more easily.

## Full Transcript
0:00For large language models to be truly useful, they often need to interact with external data sources, services, and tools. 0:07And until recently, that was typically done with application programming interfaces, or APIs. 0:14Now, in late 2024, Anthropic introduced a new open standard protocol, the Model Context Protocol, or MCP. 0:27And it has already made quite the splash, and it standardizes how applications provide context to LLMs. 0:35So let's define these two terms, MCP and API, and take a look at their similarities and differences. 0:44Now, a good metaphor for MCP is that it's kind of like a USB-C port for your AI applications, 0:49and that's because it standardizes connections between AI applications, LLMs, 0:54and external data sources. So, if you think about 0:58just your standard laptop that you might be using. 1:03Well, that probably has a set of USB-C ports attached to it. 1:09That's a really old one. 1:11And in those ports, well, you can plug in all sorts of 1:16cables, and they will use the USB-C standard to interface with all sorts of peripherals. 1:22So perhaps you've plugged one of these things into a monitor. 1:26Another one is connected to an external disk drive, and perhaps you've also added in a power supply for the third one. 1:34It really doesn't matter who makes the peripherals; they all work together using this common standard. 1:41Well, MCP is kind of like that. 1:44So if we take a look at really what's in it, there is an MCP host, and that also runs a number of MCP clients. 1:56Now, each client opens a JSON-RPC 2.0 session using 2:01the protocol that comes with MCP, the MCP protocol, and that connects to external MCP servers. 2:13So we have a client-server relationship here. 2:17Now, servers expose capabilities. 2:20So perhaps we've got a server for access to a database, 2:25maybe we've got another one which gives us access to a code repository, 2:29and then maybe we have another server that gives us access to an email server.
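The session mechanics just described can be sketched in a few lines. This is a minimal illustration of JSON-RPC 2.0 framing, not code from any MCP SDK; the `tools/call` method and the `get_weather` tool name are stand-ins for whatever a real server advertises.

```python
import itertools
import json

# JSON-RPC 2.0 messages carry a fixed version marker, a method, params,
# and an id that lets the client match responses to its requests.
_next_id = itertools.count(1)

def make_request(method, params):
    """Frame one JSON-RPC 2.0 request as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_next_id),
        "method": method,
        "params": params,
    })

# Hypothetical request an MCP client might send over its session:
print(make_request("tools/call", {"name": "get_weather",
                                  "arguments": {"city": "Berlin"}}))
```

The same envelope shape is reused for every call, which is exactly what makes the sessions uniform across servers.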
2:36So if we go back to the USB-C analogy, we can think of the laptop as being kind of like the MCP host. 2:44The MCP protocol, 2:46this is really what's signified by the USB-C connection. 2:51And then the drive and the monitor and the power supply, 2:54we can think of those really as MCP servers. 2:59Okay, so that's the architecture, but what are the capabilities of MCP? 3:04Well, it addresses two main needs of LLM applications. 3:08And when I say LLM applications, I particularly mean AI agents. 3:15And those two needs: one is to provide context in the 3:20form of contextual data, and the other is to enable the usage of tools by these AI agents. 3:30So it provides a standard way for an AI agent to retrieve external context, 3:35which means things like documents and knowledge base entries and database records, that sort of thing, 3:40and it can also execute actions or tools, like maybe run a web search or call an external service or perform some calculations. 3:49Now, that's all done through this MCP server that I mentioned, and that advertises a bunch of primitives. 4:01So let's take a look at three of them. 4:04Now, one of the primitives is called tools, and tools are discrete actions or functions the AI can call. 4:13So a weather service might expose a get weather tool, or a calendar service may expose a create event tool. 4:21Now, the server advertises each tool's name, 4:24its description, and its input and output schema in its capabilities listing as well. 4:29Now, when an LLM uses an MCP client to invoke a tool, the MCP server executes the underlying function. 4:37So that's tools. 4:39Now, another primitive is resources. 4:43And resources are read-only data items or documents the server can provide, 4:49which the client can then retrieve on demand, so text files, database schemas, file contents, that sort of thing.
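A tool advertisement like the get weather example can be pictured as a small, schema-carrying record. The field names below (`inputSchema` and so on) are an approximation for illustration, not a quote from the protocol spec:

```python
# Illustrative descriptor for the get weather tool described above.
get_weather_tool = {
    "name": "get_weather",
    "description": "Return current weather for a city.",
    "inputSchema": {  # JSON Schema describing the expected arguments
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

def arguments_valid(tool, arguments):
    """Check that every required field of the tool's input schema is present."""
    required = tool["inputSchema"].get("required", [])
    return all(field in arguments for field in required)

print(arguments_valid(get_weather_tool, {"city": "Berlin"}))  # True
print(arguments_valid(get_weather_tool, {}))                  # False
```

Because the schema travels with the tool, a client can validate a call before sending it, without hard-coded knowledge of the server.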
4:55And then we also have, as an additional primitive, prompt templates, 5:01and those are predefined templates providing suggested prompts. 5:06Now, not every MCP server will use all three primitives. 5:11In fact, many just focus on tools currently, 5:15but the important thing to understand here 5:17is an AI agent can query an MCP server at runtime 5:21to discover what primitives are available and then invoke those capabilities in a uniform way. 5:28Because every MCP server publishes a machine-readable catalog, 5:32so tools/list and resources/list and prompts/list, 5:37agents can discover and then use new functionality without redeploying code. 5:44OK, so that's MCP. 5:45What about APIs? 5:46Well, APIs are another way of letting one system access another system's functionality or data. 5:52An application programming interface defines a set of rules or protocols describing how to request information or services. 5:59And by using APIs, developers can integrate capabilities from external systems instead of building everything from scratch. 6:06So an e-commerce site can use a payment API to process credit card payments, for example. 6:12Now, the API acts as an abstraction layer. 6:14So we have the requesting application, the client; 6:19well, that doesn't need to know the internal details of the service that it wants to invoke, the server. 6:27It's all kind of abstracted away from it, 6:30because the server processes the request, and the only thing we need to know is how to format the requests 6:35and understand the responses using the API. 6:40That's really all there is to it. 6:42Now, there are a lot of different API styles, but one of the most ubiquitous is the RESTful API style. 6:52You can think of that as essentially the web's default API. 6:57And a RESTful API communicates over HTTP. 7:01So this call here is an HTTP call with a RESTful API, where clients interact using standard HTTP methods.
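To make that request/response pattern concrete before walking through the individual methods, here is a toy, in-memory stand-in for a RESTful library service. There is no real HTTP here; the endpoints, dispatch logic, and data are invented for the sketch.

```python
# Toy in-memory REST service: route() plays the role of the server's
# dispatcher, mapping (HTTP method, path) onto handler logic.
BOOKS = {123: {"id": 123, "title": "Example Book"}}
LOANS = []

def route(method, path, body=None):
    if method == "GET" and path.startswith("/books/"):
        book = BOOKS.get(int(path.rsplit("/", 1)[1]))
        return (200, book) if book else (404, {"error": "not found"})
    if method == "POST" and path == "/loans":
        LOANS.append(body)
        return (201, {"loan": body})  # 201 Created
    return (404, {"error": "no such endpoint"})

print(route("GET", "/books/123"))
print(route("POST", "/loans", {"book_id": 123, "member": "alice"}))
```

The client only ever sees the (status, JSON body) pairs; how the server stores books and loans is hidden behind the interface, which is the abstraction the transcript describes.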
7:10So they might use GET, for example, to retrieve data. 7:14They might use 7:15POST to create data, PUT to update data, and DELETE to remove data. 7:24So, for example, a REST API for a library system might have an endpoint that looks something like GET /books 7:33/123 if we want to fetch book number 123's details. 7:40Or we might use a POST and say POST /loans 7:45if we want to borrow a book. 7:47Each such endpoint returns data, often in a JSON format, representing the result. 7:54And in fact, many commercial large language models are offered over REST: 8:00send a JSON prompt, get a JSON completion back. 8:05AI agents might also use REST APIs to perform a web search or interact with a company's internal REST services. 8:12So, MCP and APIs, they share 8:15many similarities, not least that they are both client-server architectures. 8:25So in a REST API, a client sends an HTTP request, like those GETs or POSTs 8:30I just mentioned, to a server, and then the server returns a response. In MCP, 8:35the MCP client sends a request like tools/call to an MCP server and receives a response. 8:41So they both offer a layer of abstraction so that one system doesn't need to know the low-level details of another's internals. 8:53The implementation details there, they're hidden. 8:55The client just follows the interface. 8:58So both MCP and APIs, they really help to simplify things, 9:04specifically simplifying integration, letting developers wire systems together instead of reinventing wheels. 9:12But MCP and APIs have some fundamental differences too. 9:19And let's start with purpose-built, 9:22which we can really consider as MCP's kind of area, 9:28versus general purpose, which we could really think of as being more of APIs' domain. 9:35So the Model Context Protocol was explicitly designed to integrate LLM applications 9:41with external data and tools.
9:43It standardizes patterns like providing context data and invoking tools in ways that align with how AI agents operate. 9:52APIs, on the other hand, weren't created specifically with AI or LLMs in mind, 9:57and that means that MCP bakes in certain assumptions that are useful for AI. 10:03Now, that includes one of MCP's strongest advantages, and that is the fact that it supports dynamic discovery. 10:13So what do I mean by that? 10:15Well, an MCP client can simply ask an MCP server, hey, what can you do? 10:20And it will get back a description of all available functions and data that server offers. 10:27Now, the client, or the LLM application using it, can then adapt to whatever happens to be available. 10:34Traditional REST APIs don't typically expose an equivalent runtime discovery mechanism, 10:38and if the API changes, say new endpoints are added, the client needs to be updated by a developer. 10:44MCP is kind of flipping this model, because an AI agent 10:48can retrieve the latest capabilities list from a server each time it connects, and then it can pick up new features automatically. 10:55Now, another big difference relates to standardization as well, 10:59specifically standardization of interface, 11:03and the difference here is that every MCP server, 11:07regardless of what service or what data it connects to, 11:11speaks the same protocol and follows the same patterns, whereas each API is unique. 11:17The specific endpoints and the parameter formats and the authentication schemes, they vary between services. 11:24So if an AI agent wants to use five different REST APIs, 11:27it might need five different adapters, whereas five MCP servers respond to the exact same calls. 11:34Build once, integrate many. 11:37Okay, so similar, but different, 11:40but here's the kicker. 11:42When it comes to MCP, many MCP servers, 11:48when we actually look at their implementation, actually use traditional APIs to do their work.
11:55In many cases, an MCP server is essentially a wrapper around an existing API, 12:03translating between the MCP format and the underlying service's native interface by using that API, 12:13like the MCP GitHub server, which exposes high-level tools such as repository/list as MCP primitives, 12:22but then internally translates each tool call into the corresponding GitHub REST API request. 12:28So MCP and APIs are not adversaries; they're layers, layers in an AI stack. 12:35MCP might use APIs under the hood while providing a more AI-friendly interface on top. 12:42And today you can find MCP servers for file systems, Google Maps, 12:46Docker, Spotify, and a growing list of enterprise data sources. 12:51And thanks to MCP, those services can now be better integrated into AI agents in a standardized way.
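The wrapper pattern, together with the runtime discovery discussed earlier, can be sketched end to end. Everything below is illustrative: the tool name, the REST endpoint, and the canned response stand in for a real MCP server fronting a real API.

```python
import json

def rest_get(url):
    # Stand-in for a real HTTP client; returns canned data so the
    # sketch runs offline. A real wrapper would issue the REST call here.
    canned = {"https://api.example.com/user/repos": [{"name": "demo-repo"}]}
    return canned[url]

def handle(request):
    """Translate MCP-style JSON-RPC calls into the underlying REST API."""
    method, req_id = request["method"], request["id"]
    if method == "tools/list":
        # Runtime discovery: advertise what this wrapper can do.
        return {"jsonrpc": "2.0", "id": req_id,
                "result": {"tools": [{"name": "list_repositories"}]}}
    if method == "tools/call" and request["params"]["name"] == "list_repositories":
        repos = rest_get("https://api.example.com/user/repos")
        return {"jsonrpc": "2.0", "id": req_id, "result": {"content": repos}}
    return {"jsonrpc": "2.0", "id": req_id,
            "error": {"code": -32601, "message": "method not found"}}

# An agent first discovers the available tools, then invokes one:
print(json.dumps(handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})))
print(json.dumps(handle({"jsonrpc": "2.0", "id": 2, "method": "tools/call",
                         "params": {"name": "list_repositories",
                                    "arguments": {}}})))
```

The agent-facing side speaks one uniform protocol, while the REST specifics stay buried inside `rest_get`, which is the layering the transcript closes on.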