Alternative to wandb for hyperparameter sweeps


I am looking for a free alternative to wandb for running hyperparameter sweeps on multiple GPUs in parallel. Ideally, it should integrate well with the Lightning CLI.

Do you have any suggestions?

I'm currently using the Optuna sweeper, which should be able to do what you want. There is a template that I quite like that uses Hydra for config files and has an implementation of a sweep using Optuna: GitHub - ashleve/lightning-hydra-template: PyTorch Lightning + Hydra. A very user-friendly template for ML experimentation. ⚔🔥⚔
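Whatever sweeper does the sampling (Optuna, Hydra's Optuna plugin, etc.), the multi-GPU part usually comes down to launching one Lightning CLI process per trial with each process pinned to a different GPU via `CUDA_VISIBLE_DEVICES`. As a rough, stdlib-only sketch of that launch pattern (the `train.py` entry point and the exact `--key=value` override names are hypothetical placeholders; Optuna would replace the naive random sampling shown here):

```python
import itertools
import random


def sample_config(search_space, rng):
    """Randomly sample one hyperparameter configuration.

    A real sweep would let Optuna suggest these values instead.
    """
    return {name: rng.choice(values) for name, values in search_space.items()}


def build_trial_commands(search_space, n_trials, gpus, seed=0):
    """Build one Lightning CLI command per trial, assigning GPUs round-robin.

    Each (env, command) pair is meant to be run as a subprocess with
    CUDA_VISIBLE_DEVICES pinned to a single GPU, so that up to len(gpus)
    trials can run in parallel.
    """
    rng = random.Random(seed)
    commands = []
    for _trial, gpu in zip(range(n_trials), itertools.cycle(gpus)):
        cfg = sample_config(search_space, rng)
        overrides = [f"--{key}={value}" for key, value in cfg.items()]
        env = {"CUDA_VISIBLE_DEVICES": str(gpu)}
        # "train.py" is a hypothetical script wrapping LightningCLI;
        # "fit" is the standard LightningCLI subcommand.
        commands.append((env, ["python", "train.py", "fit", *overrides]))
    return commands
```

From there, each pair could be passed to `subprocess.Popen(cmd, env={**os.environ, **env})`, keeping at most `len(gpus)` processes alive at once.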