Deleting low-performing checkpoints
To delete all but the most recent and best-performing checkpoints of a model, call the delete_checkpoints method.
delete_checkpoints ranks existing checkpoints by their val/reward score and deletes all of them except the highest-performing and the most recent. You can also configure delete_checkpoints to rank by any other metric you pass it.
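The pruning rule can be pictured with a short sketch. This is not the library's internal implementation; the checkpoint representation and the prune_checkpoints helper below are assumptions made purely for illustration of the "keep the best plus the most recent" behavior described above.

```python
# Hypothetical illustration of the pruning rule: rank checkpoints by a
# metric (here "val/reward") and keep only the highest-scoring one and
# the most recent one. All names here are illustrative assumptions.

def prune_checkpoints(checkpoints, metric="val/reward"):
    """checkpoints: list of dicts like {"step": int, "metrics": {...}}.
    Returns only the checkpoints to keep: best by `metric` plus latest."""
    if not checkpoints:
        return []
    best = max(checkpoints, key=lambda c: c["metrics"][metric])
    latest = max(checkpoints, key=lambda c: c["step"])
    keep_steps = {best["step"], latest["step"]}
    return [c for c in checkpoints if c["step"] in keep_steps]

ckpts = [
    {"step": 1, "metrics": {"val/reward": 0.2}},
    {"step": 2, "metrics": {"val/reward": 0.9}},  # best score
    {"step": 3, "metrics": {"val/reward": 0.5}},  # most recent
]
print([c["step"] for c in prune_checkpoints(ckpts)])  # [2, 3]
```

Passing a different metric name (for example a training-time reward) changes only the ranking key; the keep set is always the best-scoring checkpoint plus the latest one.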
Deleting within a training loop
Below is a simple example of a training loop that trains a model for 50 steps before exiting. By default, the LoRA checkpoint generated at each step is automatically saved in the storage mechanism your backend uses (in this case, W&B Artifacts).
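A minimal sketch of such a loop is shown below. The Model class here is a stand-in, not the library's real API: in practice the backend persists a LoRA checkpoint per step (e.g. to W&B Artifacts), while this stub reproduces the keep-best-plus-latest rule in memory, ranking by a train/reward metric as an assumption for illustration.

```python
# Hedged sketch of a 50-step training loop that prunes checkpoints as it
# goes. Model, save_checkpoint, and the metric name "train/reward" are
# illustrative stand-ins, not the real library's internals.

class Model:
    def __init__(self):
        self.checkpoints = {}  # step -> metrics dict

    def save_checkpoint(self, step, metrics):
        self.checkpoints[step] = metrics

    def delete_checkpoints(self, metric="train/reward"):
        """Keep only the best-scoring and the most recent checkpoint."""
        if not self.checkpoints:
            return
        best = max(self.checkpoints, key=lambda s: self.checkpoints[s][metric])
        latest = max(self.checkpoints)
        self.checkpoints = {
            s: m for s, m in self.checkpoints.items() if s in {best, latest}
        }

def train(model, steps=50):
    for step in range(1, steps + 1):
        reward = step / steps  # placeholder for the real training step
        model.save_checkpoint(step, {"train/reward": reward})
        model.delete_checkpoints()  # keep storage bounded as we go

model = Model()
train(model)
print(sorted(model.checkpoints))  # [50]
```

Because the placeholder reward increases monotonically, the final step is both the best-scoring and the most recent checkpoint, so only it survives; with a noisier real reward curve, up to two checkpoints would remain after each pruning call.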