Hugging Face Spaces Deployment Guide
This guide walks you through deploying the LangGraph Multi-Agent MCTS demo to Hugging Face Spaces.
Prerequisites
- Hugging Face Account
- Git installed locally
- Python 3.10+ (for local testing)
Step 1: Create a New Space
- Go to Hugging Face Spaces
- Click "Create new Space"
- Fill in the form:
- Owner: Your username or organization
- Space name: langgraph-mcts-demo (or your choice)
- License: MIT
- SDK: Gradio
- Hardware: CPU Basic (Free tier - sufficient for demo)
- Visibility: Public (or Private)
- Click "Create Space"
Step 2: Clone and Deploy
Option A: Git-based Deployment (Recommended)
# 1. Clone your new empty Space
git clone https://huggingface.co/spaces/YOUR_USERNAME/langgraph-mcts-demo
cd langgraph-mcts-demo
# 2. Copy demo files from this directory
cp -r /path/to/huggingface_space/* .
cp -r /path/to/huggingface_space/.gitignore .
# 3. Verify structure
ls -la
# Should show:
# - app.py
# - requirements.txt
# - README.md
# - .gitignore
# - demo_src/
#     - __init__.py
#     - agents_demo.py
#     - llm_mock.py
#     - mcts_demo.py
# 4. Commit and push
git add -A
git commit -m "Initial deployment of LangGraph Multi-Agent MCTS demo"
git push
# 5. Space will automatically build and deploy (takes 2-5 minutes)
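If you would rather push the files without a local git clone, a minimal sketch using huggingface_hub's upload_folder (the local path is a placeholder for wherever your copy of huggingface_space lives):

```python
# Minimal sketch: push the demo files to the Space without cloning it.
# Assumes you are logged in via huggingface-cli or have a write token configured.
from huggingface_hub import HfApi

api = HfApi()
api.upload_folder(
    folder_path="/path/to/huggingface_space",      # directory containing app.py, demo_src/, etc.
    repo_id="YOUR_USERNAME/langgraph-mcts-demo",
    repo_type="space",
    commit_message="Initial deployment of LangGraph Multi-Agent MCTS demo",
)
```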
Option B: Direct Upload via Web UI
- Navigate to your Space on Hugging Face
- Click "Files" tab
- Click "Add file" β "Upload files"
- Upload all files maintaining the directory structure:
app.pyrequirements.txtREADME.md.gitignoredemo_src/__init__.pydemo_src/agents_demo.pydemo_src/llm_mock.pydemo_src/mcts_demo.py
- Commit changes
Step 3: Monitor Deployment
- Go to your Space URL: https://huggingface.co/spaces/YOUR_USERNAME/langgraph-mcts-demo
- Click the "Logs" tab to monitor build progress
- Wait for the "Running on" message
- Your demo is now live!
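If you want to poll the build status from a script instead of the web UI, a minimal sketch with huggingface_hub (the stage names such as BUILDING and RUNNING come from the Hub's Space runtime API):

```python
# Minimal sketch: poll the Space until it reports a terminal stage.
# Assumes huggingface_hub is installed and you are authenticated if the Space is private.
import time
from huggingface_hub import HfApi

api = HfApi()
while True:
    runtime = api.get_space_runtime("YOUR_USERNAME/langgraph-mcts-demo")
    print("Space stage:", runtime.stage)              # e.g. BUILDING, RUNNING, RUNTIME_ERROR
    if str(runtime.stage) in ("RUNNING", "RUNTIME_ERROR", "BUILD_ERROR"):
        break
    time.sleep(30)                                     # builds typically take 2-5 minutes
```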
Step 4: Test the Demo
- Enter a query or select an example
- Enable/disable different agents
- Adjust MCTS parameters
- Click "Process Query"
- Review results and consensus scores
Optional: Enable Real LLM Responses
To use Hugging Face Inference API instead of mock responses:
1. Update requirements.txt
gradio>=4.0.0,<5.0.0
numpy>=1.24.0,<2.0.0
huggingface_hub>=0.20.0
2. Add Secret Token
- Go to Space Settings → Repository secrets
- Add a new secret:
  - Name: HF_TOKEN
  - Value: Your Hugging Face token (from Settings → Access Tokens)
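Secrets can also be set from a script. A minimal sketch using huggingface_hub, with the same HF_TOKEN key described above:

```python
# Minimal sketch: add the HF_TOKEN repository secret programmatically.
# Assumes you are authenticated with a token that has write access to the Space.
from huggingface_hub import HfApi

api = HfApi()
api.add_space_secret(
    repo_id="YOUR_USERNAME/langgraph-mcts-demo",
    key="HF_TOKEN",
    value="hf_...",   # your Hugging Face access token
)
```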
3. Update app.py Initialization
Change line ~290 in app.py:
# From:
framework = MultiAgentFrameworkDemo(use_hf_inference=False)

# To:
import os

framework = MultiAgentFrameworkDemo(
    use_hf_inference=True,
    hf_model="mistralai/Mistral-7B-Instruct-v0.2",
)
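The HF-backed mode relies on the Hugging Face Inference API. If you want to verify your token and model access independently of the demo, a minimal sketch using huggingface_hub's InferenceClient (a standalone check, not the demo's own code path):

```python
# Minimal sketch: sanity-check Inference API access with the same model.
# Assumes HF_TOKEN is available as an environment variable (Spaces expose
# repository secrets to the app as environment variables).
import os
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="mistralai/Mistral-7B-Instruct-v0.2",
    token=os.environ.get("HF_TOKEN"),
)
reply = client.text_generation(
    "Briefly explain Monte Carlo Tree Search.",
    max_new_tokens=128,
)
print(reply)
```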
4. Commit and Push
git add -A
git commit -m "Enable Hugging Face Inference API"
git push
Optional: Enable Weights & Biases Tracking
Track experiments and visualize metrics with W&B integration.
1. Get W&B API Key
- Sign up at wandb.ai
- Go to Settings → API Keys
- Copy your API key
2. Add W&B Secret to Space
- Go to Space Settings → Repository secrets
- Add a new secret:
  - Name: WANDB_API_KEY
  - Value: Your W&B API key
3. Use W&B in the Demo
- Expand "Weights & Biases Tracking" accordion in the UI
- Check "Enable W&B Tracking"
- Optionally set:
  - Project Name: Your W&B project (default: langgraph-mcts-demo)
  - Run Name: Custom name for this run (auto-generated if empty)
- Process your query
- View the W&B run URL in the results
4. What Gets Logged
- Agent Metrics: Confidence scores, execution times, response lengths
- MCTS Metrics: Best value, visits, tree depth, exploration paths
- Consensus Metrics: Agreement scores, agent combinations
- Performance: Total processing time
- Artifacts: Full JSON results as artifacts
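For reference, the kind of logging described above can be expressed with the standard wandb API. A minimal sketch of the pattern; the exact metric names and the internals of demo_src/wandb_tracker.py may differ:

```python
# Minimal sketch of the W&B logging pattern used by the demo.
# Metric names and values below are illustrative, not the demo's exact schema.
import json
import wandb

run = wandb.init(project="langgraph-mcts-demo", name=None)  # name auto-generated if None

# Per-run metrics (agents, MCTS, consensus, timing)
wandb.log({
    "agents/confidence": 0.82,
    "mcts/best_value": 0.74,
    "mcts/tree_depth": 5,
    "consensus/agreement_score": 0.68,
    "performance/total_time_s": 3.4,
})

# Full JSON results logged as an artifact
with open("results.json", "w") as f:
    json.dump({"query": "example", "consensus": 0.68}, f)
artifact = wandb.Artifact("run-results", type="results")
artifact.add_file("results.json")
run.log_artifact(artifact)

print("View this run at:", run.url)
run.finish()
```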
5. View Your Dashboard
After runs, visit your W&B project dashboard to:
- Compare different agent configurations
- Visualize consensus patterns
- Analyze MCTS exploration strategies
- Track performance over time
Customization Options
Change Gradio Theme
In app.py, modify:
with gr.Blocks(
    theme=gr.themes.Soft(),  # Try: Default(), Monochrome(), Glass()
    ...
) as demo:
Add Custom Examples
Update EXAMPLE_QUERIES list in app.py:
EXAMPLE_QUERIES = [
    "Your custom query 1",
    "Your custom query 2",
    ...
]
Adjust MCTS Parameters
Modify sliders in app.py:
mcts_iterations = gr.Slider(
    minimum=10,
    maximum=200,  # Increase for more thorough search
    value=50,     # Change default
    ...
)
Add More Agent Types
- Create a new agent in demo_src/agents_demo.py (see the sketch below)
- Add it to MultiAgentFrameworkDemo in app.py
- Add UI controls in the Gradio interface
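As a starting point, here is a hypothetical sketch of a new agent. The base interface (a run() method returning a dict) is an assumption; match whatever the existing agent classes in demo_src/agents_demo.py actually define:

```python
# Hypothetical sketch of an additional agent for demo_src/agents_demo.py.
# The run() interface and the llm.generate() call are assumptions; adapt them to the
# existing agent classes and mock LLM in this project.
import time


class DomainExpertAgent:
    """Illustrative agent that answers from a narrow domain perspective."""

    name = "domain_expert"

    def __init__(self, llm):
        self.llm = llm  # mock or HF-backed LLM client shared with the other agents

    def run(self, query: str) -> dict:
        start = time.time()
        response = self.llm.generate(f"As a domain expert, answer: {query}")
        return {
            "agent": self.name,
            "response": response,
            "confidence": 0.7,                 # placeholder heuristic
            "execution_time": time.time() - start,
        }
```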
Troubleshooting
Build Fails
- Check the Logs tab for error details
- Verify requirements.txt has compatible versions
- Ensure all imports in app.py are satisfied
Slow Performance
- Reduce default MCTS iterations
- Use mock LLM (no API calls)
- Simplify tree visualization
Memory Issues (Free Tier)
- Limit max MCTS iterations to 100
- Reduce tree depth in demo_src/mcts_demo.py
- Simplify response generation
Missing Files
Ensure directory structure:
your-space/
├── app.py
├── requirements.txt
├── README.md
├── .gitignore
└── demo_src/
    ├── __init__.py
    ├── agents_demo.py
    ├── llm_mock.py
    ├── mcts_demo.py
    └── wandb_tracker.py
Upgrading Hardware
For better performance:
- Go to Space Settings
- Under Hardware, select:
- CPU Upgrade ($0.03/hr) - Faster processing
- T4 Small ($0.60/hr) - GPU for neural models
- Save changes
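Hardware can also be changed from a script. A minimal sketch with huggingface_hub; the hardware identifiers such as "cpu-upgrade" and "t4-small" follow the Hub's naming:

```python
# Minimal sketch: request upgraded hardware for the Space programmatically.
# Assumes you are authenticated with write access; billing applies as in the web UI.
from huggingface_hub import HfApi

api = HfApi()
api.request_space_hardware(
    repo_id="YOUR_USERNAME/langgraph-mcts-demo",
    hardware="t4-small",   # or "cpu-upgrade"; "cpu-basic" reverts to the free tier
)
```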
Sharing Your Space
Embed in Website
<iframe
  src="https://YOUR_USERNAME-langgraph-mcts-demo.hf.space"
  frameborder="0"
  width="100%"
  height="600"
></iframe>
Direct Link
Share: https://huggingface.co/spaces/YOUR_USERNAME/langgraph-mcts-demo
API Access
Gradio automatically provides an API endpoint:
https://YOUR_USERNAME-langgraph-mcts-demo.hf.space/api/predict
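The most reliable way to call the endpoint from Python is the gradio_client package. A minimal sketch; the argument list and api_name depend on how app.py wires its event handlers, so check the Space's "Use via API" link for the exact signature:

```python
# Minimal sketch: call the deployed demo from Python with gradio_client.
# The api_name and arguments below are placeholders; see the Space's "Use via API"
# page for the real signature exposed by app.py.
from gradio_client import Client

client = Client("YOUR_USERNAME/langgraph-mcts-demo")
result = client.predict(
    "How should a team prioritize technical debt?",  # example query
    api_name="/predict",
)
print(result)
```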
Next Steps
- Collect Feedback: Enable flagging for user feedback
- Add Analytics: Track usage patterns
- Extend Agents: Add domain-specific reasoning modules
- Integrate RAG: Connect to vector databases for real context
- Add Visualization: Enhanced tree and consensus displays
Support
- Hugging Face Docs: https://huggingface.co/docs/hub/spaces
- Gradio Docs: https://www.gradio.app/docs
- Full Framework: https://github.com/ianshank/langgraph_multi_agent_mcts
Happy Deploying!