gradio-vton-mcp.md: 2 additions & 4 deletions
@@ -7,15 +7,13 @@ authors:
# Build an AI Shopping Assistant with Gradio MCP Servers

- Gradio is the fastest way to give your LLM superpowers. With Gradio's seamless Model Context Protocol (MCP) integration, your LLM can plug directly into the thousands of AI models and Spaces hosted on the Hugging Face [Hub](https://hf.co). By pairing the general reasoning capabilities of LLMs with the specialized abilities of models found on Hugging Face, your LLM can go beyond simply answering text questions to actually solving problems in your daily life.
+ Gradio is the fastest way to give your LLM superpowers. With Gradio's Model Context Protocol (MCP) integration, your LLM can plug directly into the thousands of AI models and Spaces hosted on the Hugging Face [Hub](https://hf.co). By pairing the general reasoning capabilities of LLMs with the specialized abilities of models found on Hugging Face, your LLM can go beyond simply answering text questions to actually solving problems in your daily life.

Beyond this, Gradio makes implementing powerful MCP servers a breeze, offering features like:

* Automatic conversion of Python functions into LLM tools.
* Real-time progress notifications.
* Automatic file uploads, including support for public URLs and handling of various file types.

- I hate shopping because it takes too much time and I hate trying on clothes myself. What if I had an LLM do this for me?
-
Imagine this: you hate shopping because it takes too much time, and you dread trying on clothes yourself. What if an LLM could handle this for you? In this post, we'll create an LLM-powered AI assistant that can browse online clothing stores, find specific garments, and then use a virtual try-on model to show you how those clothes would look on you. See the demo below:
@@ -114,7 +112,7 @@ You can find this file by typing `MCP` in the command panel and selecting `MCP:
The Playwright MCP server will let our AI assistant browse the web.
> [!TIP]
- > Make sure the URL of the `vton` server matches the URL printed to the console in the previous section.
+ > Make sure the URL of the `vton` server matches the URL printed to the console in the previous section. To run the Playwright MCP server, you need to have Node.js [installed](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
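For reference, the MCP configuration file might look something like the following sketch. The `vton` URL assumes the Gradio app is running locally on the default port 7860 (use whatever URL your console actually prints), and the server names are illustrative:

```json
{
  "servers": {
    "vton": {
      "url": "http://127.0.0.1:7860/gradio_api/mcp/sse"
    },
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```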