Description
By default, the values for each metric at the latest step are saved in the summary. It might be interesting to add an option to additionally save the best value for each metric.
An example use case would be the usual deep learning training loop where the model is validated at the end of each epoch and the last epoch is not necessarily the one with the best performance. Having the best value saved in the summary could be more useful for comparing experiments (e.g. `dvc metrics diff --targets dvclive.json`).
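
To illustrate the idea, here is a minimal standalone sketch of how best values could be tracked alongside the latest values. This is not dvclive's actual implementation; the `update_summary` helper, the `best_<metric>` key naming, and the summary layout are assumptions for illustration only.

```python
import json

def update_summary(summary, metric, value, higher_is_better=True):
    """Store the latest value and, optionally, the best value seen so far."""
    summary[metric] = value  # latest value, as saved today
    best_key = f"best_{metric}"  # hypothetical key for the best value
    best = summary.get(best_key)
    if best is None or (value > best if higher_is_better else value < best):
        summary[best_key] = value
    return summary

# Simulated training loop where the last epoch is not the best one.
summary = {}
for epoch, acc in enumerate([0.71, 0.84, 0.79]):
    update_summary(summary, "val_accuracy", acc, higher_is_better=True)

with open("dvclive.json", "w") as f:
    json.dump(summary, f, indent=2)

# The summary would then contain both:
# {"val_accuracy": 0.79, "best_val_accuracy": 0.84}
```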
A potential problem would be how to expose to the user (#75 (comment)) options such as whether to use `save_best` or not, and what to consider as better (i.e. higher or lower).
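
As a purely hypothetical sketch of what such options might look like to the user (neither `save_best` nor `mode` exist in dvclive today; they only illustrate the discussion above):

```python
# Hypothetical user-facing API sketch for the proposed options.
import dvclive

dvclive.init("training_metrics", save_best=True)  # hypothetical flag

# Dummy per-epoch metrics standing in for a real training loop.
for val_loss, val_acc in [(0.52, 0.71), (0.31, 0.84), (0.39, 0.79)]:
    # `mode` would tell dvclive what "better" means for each metric.
    dvclive.log("val_loss", val_loss, mode="min")
    dvclive.log("val_accuracy", val_acc, mode="max")
    dvclive.next_step()
```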