Run models from Hugging Face Hub
npx -y @huggingface/mcp-server
Paste this into your MCP client configuration (Cursor, Windsurf, etc.).
{
  "mcpServers": {
    "huggingface": {
      "command": "npx",
      "args": ["-y", "@huggingface/mcp-server"],
      "env": {
        "HF_TOKEN": "your-token"
      }
    }
  }
}
MCP (Model Context Protocol) servers extend your AI assistant with external tools and data sources. Follow these steps to install Hugging Face MCP Server in your preferred AI coding tool.
For Claude Desktop, add the configuration above to claude_desktop_config.json.