# MCP LLMS-TXT Documentation Server

The MCP LLMS-TXT Documentation Server is a specialized Model Context Protocol (MCP) server that delivers documentation directly from llms.txt files. It serves as a testbed for integrating documentation into IDEs via external **tools**, rather than relying solely on built-in features. While future IDEs may offer robust native support for llms.txt files, this server allows us to experiment with alternative methods, giving us full control over how documentation is retrieved and displayed.
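For context, an llms.txt file is simply a Markdown index of documentation links that an LLM or tool can fetch and follow. A minimal, hypothetical example (the project name and URLs are placeholders, not taken from any real docs) looks roughly like this:

```
# ExampleLib

> ExampleLib is a library for doing example things.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): Getting started guide
- [API Reference](https://example.com/docs/api.md): Full API documentation
```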
## Usage

### Cursor

1. Install Cursor: https://www.cursor.com/en
2. Launch the MCP server with the **SSE** transport.

   ```shell
   uvx --from mcpdoc mcpdoc \
       --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt \
       --transport sse \
       --port 8081 \
       --host localhost
   ```

3. Add the MCP server to Cursor. Remember to set the URL to **[host]/sse**, e.g. **http://localhost:8081/sse** (a sample configuration is shown after this list).
4. You should now be able to use it within Composer.
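Depending on your Cursor version, the server can also be registered through Cursor's MCP configuration file instead of the settings UI. The snippet below is only a sketch, assuming a `.cursor/mcp.json` file and the server name `langgraph-docs`; adjust both to match your setup:

```json
{
  "mcpServers": {
    "langgraph-docs": {
      "url": "http://localhost:8081/sse"
    }
  }
}
```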
### Claude Code
1. Install Claude Code: https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview
2. Install [uv](https://github.com/astral-sh/uv). This step is required if you want to run the MCP server using the `uvx` command, which is generally recommended because it simplifies dependency management for you (a sample install command is shown after this list).
3. Configure the MCP server with Claude Code:
   ```shell
   claude mcp add-json langgraph-docs '{"type":"stdio","command":"uvx","args":["--from", "mcpdoc", "mcpdoc", "--urls", "langgraph:https://langchain-ai.github.io/langgraph/llms.txt"]}' -s user
   ```
4. Launch Claude Code.
   ```shell
   claude code
   ```

   Verify that the server is running by typing `/mcp` in the chat window.

   ```
   > /mcp
   ```

5. Test it out!

   ```
   > Write a langgraph application with two agents that debate the merits of taking a shower.
   ```
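If you don't already have uv installed (step 2 above), the standalone installer from the uv documentation is one option on macOS and Linux; see the uv repository for other installation methods:

```shell
curl -LsSf https://astral.sh/uv/install.sh | sh
```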
This MCP server was configured with only the LangGraph documentation, but you can add more documentation sources by passing additional `--urls` arguments or by loading them from a JSON or YAML file.
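For example, here is a sketch of a launch command that registers a second documentation source alongside LangGraph; the second name and llms.txt URL are placeholders, so substitute the source you actually want to expose:

```shell
uvx --from mcpdoc mcpdoc \
    --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt \
    --urls OtherProject:https://example.com/llms.txt \
    --transport sse \
    --port 8081 \
    --host localhost
```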
### Command-line Interface