Commit 9a7a59a

Update README.md
1 parent 6382e00 commit 9a7a59a

1 file changed: +62 -1 lines changed

README.md

Lines changed: 62 additions & 1 deletion
# LangChain MCP Integration [![npm version](https://img.shields.io/npm/v/%40lambdaworks%2Flangchain-mcp-integration)](https://www.npmjs.com/package/@lambdaworks/langchain-mcp-integration) [![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)

## Overview

The [Model Context Protocol](https://modelcontextprotocol.io/introduction) allows applications to provide context to LLMs in a standardized way, separating the concern of supplying context from the LLM interaction itself. Many open-source MCP servers are available, each offering different tools for an LLM to use, but they aren't compatible with LangChain's tool representation out of the box. This library provides a thin wrapper around the interaction with MCP servers, exposing their tools in a LangChain-compatible form.
## Installation

```shell
npm i @lambdaworks/langchain-mcp-integration
```
## Usage

An example of how this library allows you to use MCP tools with a prebuilt ReAct agent from LangGraph:
```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

import { ChatOpenAI } from "@langchain/openai";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { HumanMessage } from "@langchain/core/messages";
import { MCPServerToolkit } from "@lambdaworks/langchain-mcp-integration";

const client = new Client({
  name: "example-client",
  version: "1.0.0"
});

const transport = new StdioClientTransport({
  command: "uvx",
  args: ["mcp-server-fetch", "--ignore-robots-txt"]
});

const toolkit = new MCPServerToolkit();

await toolkit.addTools({ client, transport });

const agentModel = new ChatOpenAI({ temperature: 0 });

const agent = createReactAgent({
  llm: agentModel,
  tools: toolkit.getTools()
});

const agentFinalState = await agent.invoke({
  messages: [new HumanMessage("Summarize the information from this URL: https://rs.linkedin.com/company/lambdaworksio")]
});

console.log(
  agentFinalState.messages[agentFinalState.messages.length - 1].content
);
```
You can also add tools from multiple MCP servers in a single call:

```typescript
await toolkit.addTools(
  { client: client1, transport: transport1 },
  { client: client2, transport: transport2 }
);
```

Each client and transport pair specifies how to connect to a particular MCP server.
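Conceptually, aggregating tools from several servers is a simple fan-in: each client/transport pair contributes its server's tool list to one combined array, which `getTools()` then hands to the agent. A minimal self-contained sketch of that pattern (the types and `ToyToolkit` class below are illustrative stand-ins, not the library's real implementation):

```typescript
// Stand-in types; the real library wraps MCP SDK clients and transports.
type Tool = { name: string; description: string };
type ServerConnection = { serverName: string; tools: Tool[] };

class ToyToolkit {
  private tools: Tool[] = [];

  // Mirrors the shape of addTools(pair1, pair2, ...): each connection
  // contributes its server's tools to one shared list.
  addTools(...connections: ServerConnection[]): void {
    for (const conn of connections) {
      this.tools.push(...conn.tools);
    }
  }

  getTools(): Tool[] {
    return this.tools;
  }
}

const toolkit = new ToyToolkit();
toolkit.addTools(
  { serverName: "fetch", tools: [{ name: "fetch", description: "Fetch a URL" }] },
  { serverName: "time", tools: [{ name: "now", description: "Current time" }] }
);

console.log(toolkit.getTools().map((t) => t.name).join(",")); // fetch,now
```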
