Quickstart

Install and download a taskset:
uv tool install hud-python
hud get hud-evals/basic-2048
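A taskset is a JSONL file: one JSON object per line, one task per object. A minimal sketch of peeking inside one with the standard library (the field names below are illustrative placeholders, not hud's actual task schema):

```python
import json

# Illustrative taskset contents: one JSON object per line.
# Field names here are placeholders, not hud's actual schema.
sample = "\n".join(
    json.dumps(task)
    for task in [
        {"id": "2048-easy", "prompt": "Reach the 64 tile."},
        {"id": "2048-hard", "prompt": "Reach the 2048 tile."},
    ]
)

# Parse JSONL: one json.loads per non-empty line.
tasks = [json.loads(line) for line in sample.splitlines() if line.strip()]
print(f"{len(tasks)} tasks, first id: {tasks[0]['id']}")
```

Swapping `sample.splitlines()` for the lines of basic-2048.jsonl gives you a quick count and sanity check before training.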

1) Simple: Train (remote by default)

hud rl basic-2048.jsonl
This launches training remotely and automatically provisions a vLLM server and a trainer for you. Monitor progress at https://app.hud.so. The server persists between runs, so you can rerun training or evaluate against the same endpoint.

Optionally, establish a baseline first (Claude or Operator):
hud eval basic-2048.jsonl
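The point of the baseline run is a number to beat: the comparison is just a mean over per-task rewards from each run. A sketch with made-up reward values (not real output from hud eval):

```python
def mean_reward(rewards):
    """Average reward across episodes; the usual headline metric."""
    return sum(rewards) / len(rewards)

# Made-up example numbers for a baseline run vs. a trained run.
baseline = [0.10, 0.20, 0.15, 0.05]
trained = [0.40, 0.55, 0.35, 0.50]

lift = mean_reward(trained) - mean_reward(baseline)
print(f"baseline={mean_reward(baseline):.3f} "
      f"trained={mean_reward(trained):.3f} lift={lift:+.3f}")
# → baseline=0.125 trained=0.450 lift=+0.325
```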

2) Run on your own machine or remote GPUs

Use any machine or provider with at least 2 GPUs (one for inference, one for training), and add the --local flag:
uv tool install hud-python
hud get hud-evals/basic-2048
hud rl basic-2048.jsonl --local
  • 2× A100: quick iteration, shorter runs
  • 8× A100: higher throughput for larger tasksets
Training throughput depends on task complexity and parallelism (max_parallel_episodes).
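A rough back-of-envelope for wall-clock time: episodes run in waves of up to max_parallel_episodes, so total time is roughly the number of waves times the per-episode duration. A sketch with illustrative numbers (not measured throughput):

```python
import math

def estimated_minutes(num_episodes, max_parallel_episodes, minutes_per_episode):
    """Rough wall-clock estimate: episodes run in waves of size max_parallel_episodes."""
    waves = math.ceil(num_episodes / max_parallel_episodes)
    return waves * minutes_per_episode

# 512 episodes at parallelism 64 is 8 waves; at ~3 min/episode that's ~24 min.
print(estimated_minutes(num_episodes=512, max_parallel_episodes=64,
                        minutes_per_episode=3))
# → 24
```

Doubling max_parallel_episodes halves the wave count, which is why larger GPU counts shorten runs on bigger tasksets.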

3) Build your own environment (hud init)

Create a new MCP environment, develop with hot-reload, and train on a production image:
hud init my-env && cd my-env
hud dev --interactive
# When ready to run:
hud rl
Edit tasks.json to add the tasks you want to train on. See hud init for options and details.
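Adding a task is just a JSON edit; a sketch with the standard library (the task fields shown are placeholders, not necessarily the schema hud init generates):

```python
import json
from pathlib import Path

path = Path("tasks.json")
# Start from an empty list if the file doesn't exist yet.
tasks = json.loads(path.read_text()) if path.exists() else []

# Placeholder fields: adapt to whatever schema your hud init template uses.
tasks.append({"id": "my-new-task", "prompt": "Describe the goal here."})
path.write_text(json.dumps(tasks, indent=2))
print(f"{len(tasks)} task(s) in {path}")
```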