
Commit 9a577b3

Add code
1 parent cf339b6 commit 9a577b3

1 file changed (+2 −4 lines)

gradio-vton-mcp.md

Lines changed: 2 additions & 4 deletions
@@ -7,15 +7,13 @@ authors:
# Build an AI Shopping Assistant with Gradio MCP Servers

-Gradio is the fastest way to give your LLM superpowers. With Gradio's seamless Model Context Protocol (MCP) integration, your LLM can plug directly into the thousands of AI models and Spaces hosted on the Hugging Face [Hub](https://hf.co). By pairing the general reasoning capabilities of LLMs with the specialized abilities of models found on Hugging Face, your LLM can go beyond simply answering text questions to actually solving problems in your daily life.
+Gradio is the fastest way to give your LLM superpowers. With Gradio's Model Context Protocol (MCP) integration, your LLM can plug directly into the thousands of AI models and Spaces hosted on the Hugging Face [Hub](https://hf.co). By pairing the general reasoning capabilities of LLMs with the specialized abilities of models found on Hugging Face, your LLM can go beyond simply answering text questions to actually solving problems in your daily life.

Beyond this, Gradio makes implementing powerful MCP servers a breeze, offering features like:
* Automatic conversion of python functions into LLM tools.
* Real-time progress notifications.
* Automatic file uploads, including support for public URLs and handling of various file types.

-I hate shopping because it takes too much time and I hate trying on clothes myself. What if I had an LLM do this for me?
-
Imagine this: you hate shopping because it takes too much time, and you dread trying on clothes yourself. What if an LLM could handle this for you? In this post, we'll create an LLM-powered AI assistant that can browse online clothing stores, find specific garments, and then use a virtual try-on model to show you how those clothes would look on you. See the demo below:

<video src="https://github.com/user-attachments/assets/e5bc58b9-ca97-418f-b78b-ce38d4bb527e" controls></video>
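For readers skimming the diff, the feature list in the hunk above ("Automatic conversion of python functions into LLM tools") is the core of the post. Below is a minimal, hypothetical sketch of that kind of Gradio MCP server; the `virtual_try_on` function, its signature, and its placeholder body are illustrative assumptions rather than code from this commit, while `launch(mcp_server=True)` is Gradio's documented switch for exposing an app as an MCP server.

```python
import gradio as gr

def virtual_try_on(person_image: str, clothing_image: str) -> str:
    """Show how a garment would look on a person.

    Args:
        person_image: Filepath of a photo of the person.
        clothing_image: Filepath of a photo of the garment.
    """
    # Placeholder body (illustrative only): the real assistant would call a
    # virtual try-on model here and return the generated image's filepath.
    return person_image

demo = gr.Interface(
    fn=virtual_try_on,
    inputs=[gr.Image(type="filepath"), gr.Image(type="filepath")],
    outputs=gr.Image(type="filepath"),
    title="Virtual Try-On (illustrative sketch)",
)

# mcp_server=True serves the same app as an MCP server, so the function's
# type hints and docstring become the schema of an LLM-callable tool.
demo.launch(mcp_server=True)
```

When launched this way, any MCP client connected to the app can call the function as a tool, which is what the three bullets above (tool conversion, progress notifications, file handling) refer to.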
@@ -114,7 +112,7 @@ You can find this file by typing `MCP` in the command panel and selecting `MCP:
The playwright MCP server will let our AI assistant browse the web.

> [!TIP]
-> Make sure the URL of the `vton` server matches the URL printed to the console in the previous section.
+> Make sure the URL of the `vton` server matches the URL printed to the console in the previous section. To run the playwright MCP server, you need to have node [installed](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).

## Putting It All Together