Managed or local training
ART provides two backend classes:
ServerlessBackend - train remotely on autoscaling GPUs
LocalBackend - run your agent and training code on the same machine
If your agent is already running on a machine with a capable GPU, use LocalBackend. If your agent is running on a machine without an advanced GPU (this includes most personal computers and production servers), use ServerlessBackend instead. ServerlessBackend optimizes for speed and cost by autoscaling across managed clusters.
ServerlessBackend
Setting up ServerlessBackend requires a W&B API key. Once you have one, you can provide it to ServerlessBackend either as an environment variable or as an initialization argument.
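The lookup order can be sketched as follows. ServerlessBackend itself lives in the art package; here a plain function models the two ways of supplying the key (an explicit argument takes priority over the environment variable). The names WANDB_API_KEY and the argument-first precedence are assumptions, not confirmed ART behavior.

```python
import os

# Hypothetical sketch: an explicit key passed at initialization wins;
# otherwise fall back to the WANDB_API_KEY environment variable.
def resolve_wandb_key(explicit_key=None):
    return explicit_key or os.environ.get("WANDB_API_KEY")

os.environ["WANDB_API_KEY"] = "key-from-env"
key_from_env = resolve_wandb_key()           # environment variable
key_from_arg = resolve_wandb_key("key-arg")  # initialization argument
```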
ServerlessBackend automatically saves your LoRA checkpoints as W&B Artifacts and deploys them for production inference on W&B Inference.
LocalBackend
The LocalBackend class runs a vLLM server and either an Unsloth or torchtune instance on whatever machine your agent itself is executing on. This is a good fit if you're already running your agent on a machine with a GPU.
To declare a LocalBackend instance, follow the code sample below:
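A sketch of such a declaration, based on ART's published examples; the model name, project, and base_model values are placeholders, and the exact import path and parameters may vary by ART version.

```python
import art
from art.local import LocalBackend

# Declare a backend that trains and serves on this machine's GPU.
backend = LocalBackend()

# Register a trainable model with the backend (run inside an async
# context, e.g. a notebook cell, since register is awaitable).
model = art.TrainableModel(
    name="my-agent",            # placeholder model name
    project="my-project",       # placeholder project name
    base_model="Qwen/Qwen2.5-7B-Instruct",
)
await model.register(backend)
```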
When using PipelineTrainer, LocalBackend is currently supported only in dedicated mode, where training and inference run on separate GPUs. In colocated mode, LocalBackend pauses inference during training, so ART rejects that configuration for PipelineTrainer.
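The constraint can be expressed as a small validation rule. This is an illustrative sketch only: the mode names and the function are hypothetical, not ART's actual configuration API.

```python
# Hypothetical check mirroring the rule above: PipelineTrainer on
# LocalBackend requires dedicated mode (training and inference on
# separate GPUs), because colocated mode pauses inference during
# training.
def check_local_backend_mode(trainer: str, mode: str) -> None:
    if trainer == "PipelineTrainer" and mode != "dedicated":
        raise ValueError(
            "PipelineTrainer requires dedicated mode on LocalBackend"
        )

check_local_backend_mode("PipelineTrainer", "dedicated")  # accepted
```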
In dedicated mode, a new checkpoint becomes the default inference target only after its LoRA has been reloaded into vLLM. That checkpoint publication flow is backend-specific, so save_checkpoint does not have identical semantics across every ART backend.
Requests that are already in flight keep using the adapter they started with; the reload only affects subsequent routing to the latest served step.
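The routing rule above can be modeled in a few lines. This is a behavioral sketch, not ART's implementation: in-flight requests pin the adapter they started with, and publishing a checkpoint only changes what later requests see.

```python
# Hypothetical model of checkpoint publication: a request captures
# whichever adapter is live when it starts; a LoRA reload only
# redirects requests that begin afterward.
class AdapterRouter:
    def __init__(self, adapter: str):
        self.current = adapter

    def start_request(self) -> str:
        # An in-flight request keeps the adapter it started with.
        return self.current

    def publish_checkpoint(self, adapter: str) -> None:
        # After the reload, new requests route to the latest step.
        self.current = adapter

router = AdapterRouter("checkpoint-0001")
in_flight = router.start_request()
router.publish_checkpoint("checkpoint-0002")
```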
Using a backend
Once initialized, a backend can be used in the same way regardless of whether it runs locally or remotely. To see LocalBackend and ServerlessBackend in action, try the examples below.
2048 Notebook
Use ServerlessBackend to train an agent to play 2048.
Summarizer
Use LocalBackend to train a SOTA summarization agent.