Aug 08, 2025
4 min read
Rust, MCP, AI, LLM

Implementing MCP Protocol with Rust: Building External Tool Plugins for Large Models

This article introduces the concept behind the MCP (Model Context Protocol), explains how to develop an MCP service in Rust, walks through the implementation of a crate-documentation query tool with real code, and shows how to configure and use the resulting MCP server in VS Code.

The MCP (Model Context Protocol) has been around for a long time, but in the early days, there were very few clients in China that supported the MCP protocol, so there was no opportunity to develop an MCP server. Initially, only Claude’s own client supported MCP well, but without some technical means, you couldn’t even use Claude.

What is MCP?

Large models essentially have no real capabilities; they are fundamentally about inputting text => outputting text. However, this doesn’t prevent users from defining external tool functions, allowing large models to output data in a specific format when generating text (including which tool function to use and what the parameters are), and then executing the specified tool function. This is the Function Call (Tool Call) feature. But essentially, Function Call is still an internal function and doesn’t have external dynamic extension capabilities.
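
To make the idea concrete, here is a small sketch using serde_json; the payload shapes are illustrative assumptions, not any particular vendor's wire format:

use serde_json::json;

fn main() {
    // Instead of plain text, the model emits a structured request naming the
    // function to call and its arguments (the exact shape varies by vendor):
    let tool_call = json!({
        "name": "readme",
        "arguments": { "name": "serde", "version": "latest" }
    });

    // The application executes that function itself and feeds the result back
    // to the model as extra context for another round of generation.
    let tool_result = json!({ "content": "serde is a serialization framework ..." });

    println!("model asks: {tool_call}");
    println!("app returns: {tool_result}");
}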

MCP is essentially an external “Function Call”, a plugin for Agent applications. It turns function calls into externally pluggable standardized services, allowing clients to directly register whichever service they want to use.

The prerequisite for these capabilities to work is that large models must have very strong instruction-following abilities. I mainly use Qwen models, and I can confirm that Qwen models did not have this capability at this time last year (August 2024), and in fact, no domestic large model had this capability at that time, until the emergence of DeepSeek V3/R1.

Whether it’s function calls or MCP, the simplest call chain looks like this:

prompt -> LLM understanding -> Return output
                            -> Determine if a tool (or MCP) is needed -> Call tool (or MCP)
                               -> Merge results and call the LLM again -> Return output
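
In code, that branch is just a loop: call the model, and whenever it asks for a tool, execute the tool, merge the result into the context, and call the model again. A minimal self-contained sketch of the control flow (fake_llm and run_tool are hypothetical stand-ins, not a real API):

enum Output {
    Text(String),
    ToolCall { name: String, args: String },
}

// Stand-in for a real model call: asks for one tool, then answers.
fn fake_llm(context: &str) -> Output {
    if context.contains("[tool") {
        Output::Text("answer based on the tool result".into())
    } else {
        Output::ToolCall { name: "readme".into(), args: "serde".into() }
    }
}

// Stand-in for executing a local tool or a remote MCP tool.
fn run_tool(name: &str, args: &str) -> String {
    format!("(result of {name}({args}))")
}

fn answer(prompt: &str) -> String {
    let mut context = prompt.to_string();
    loop {
        match fake_llm(&context) {
            // The model answered directly: return the output.
            Output::Text(reply) => return reply,
            // The model requested a tool (or MCP) call: run it, merge the
            // result into the context, and call the model again.
            Output::ToolCall { name, args } => {
                let result = run_tool(&name, &args);
                context.push_str(&format!("\n[tool {name} result]: {result}"));
            }
        }
    }
}

fn main() {
    println!("{}", answer("How do I use serde?"));
}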

Implementing an MCP service in Rust is relatively simple because MCP officially provides a Rust SDK (the rmcp crate). MCP uses JSON-RPC as its data exchange protocol, and there are two transport channels:

  • Standard input/output (stdio)
  • Server-Sent Events (SSE), and more recently Streamable HTTP

The former is suitable for local use, while the latter is suitable for remote services.

For example, I think large models perform poorly at Rust programming and often just make things up. I would like a large model, when implementing a feature that uses some crate, to first look up that crate's latest documentation and then do the job properly. So the example below is a small MCP server exposing a readme tool that fetches a crate's page from docs.rs.

Dependencies (the full example also needs tokio, reqwest, and serde with the derive feature; only the rmcp line is shown here):

rmcp = { version = "0.5.0", features = ["transport-io", "macros"] }
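
The #[tool_router] block below also assumes a Docs service type that owns the generated router and a LibraryRequest parameter type. A minimal sketch of those pieces (the import paths follow the rmcp examples; verify them against the rmcp version you actually use):

use rmcp::{
    ErrorData as McpError, ServerHandler, ServiceExt,
    handler::server::{router::tool::ToolRouter, tool::Parameters},
    model::{CallToolResult, Content, ServerCapabilities, ServerInfo},
    schemars, tool, tool_handler, tool_router,
    transport::stdio,
};

// The service type only needs to carry the router generated by #[tool_router].
#[derive(Clone)]
struct Docs {
    tool_router: ToolRouter<Docs>,
}

// Parameters accepted by the readme tool: a crate name plus an optional version.
#[derive(serde::Deserialize, schemars::JsonSchema)]
struct LibraryRequest {
    name: String,
    version: Option<String>,
}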
#[tool_router]
impl Docs {
    fn new() -> Self {
        Self {
            // The router is generated by the #[tool_router] macro from the
            // #[tool] functions in this impl block.
            tool_router: Self::tool_router(),
        }
    }

    #[tool(name = "readme", description = "Get information about rust crate documentation")]
    async fn readme(&self, params: Parameters<LibraryRequest>) -> Result<CallToolResult, McpError> {
        let crate_name = params.0.name.as_str();
        // Default to the latest published version if none is given.
        let version = params.0.version.as_deref().unwrap_or("latest");
        // Fetch the crate's docs.rs page and return it as plain text.
        let docs = reqwest::get(format!("https://docs.rs/{crate_name}/{version}"))
            .await
            .unwrap()
            .text()
            .await
            .unwrap();
        Ok(CallToolResult::success(vec![Content::text(docs)]))
    }
}


#[tool_handler]
impl rmcp::ServerHandler for Docs {
    fn get_info(&self) -> ServerInfo {
        ServerInfo {
            instructions: Some("crate documentation".into()),
            capabilities: ServerCapabilities::builder().enable_tools().build(),
            ..Default::default()
        }
    }
}

These two impl blocks are all the tool-side code a user needs to write.

Register an MCP server with stdio as the transport layer:

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create and run the server over the stdio transport. Errors go to stderr,
    // since stdout is already used as the protocol channel.
    let service = Docs::new().serve(stdio()).await.inspect_err(|e| {
        eprintln!("Error starting server: {e}");
    })?;
    // Block until the client disconnects.
    service.waiting().await?;

    Ok(())
}

How to Use?

As far as I know, AI coding plugins for VS Code basically all support MCP servers now: Alibaba's lingma, Tencent's cloud code assistant, and ByteDance's trae, for example. We just need to run cargo build to produce the binary and register it as an MCP server in one of these assistants. lingma's configuration is the most convenient, although the overall quality of the plugin is average.
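
The exact configuration UI differs from plugin to plugin, but most MCP clients take a JSON entry that points at the compiled binary, roughly shaped like this (the server name and path are placeholders; check your plugin's documentation for the file location and schema):

{
  "mcpServers": {
    "rust-docs": {
      "command": "/path/to/target/release/your-mcp-binary",
      "args": []
    }
  }
}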