Shorten docstring (for CLI compat) #19356

Merged (12 commits, Jan 30, 2024)
8 changes: 4 additions & 4 deletions src/lightning/pytorch/callbacks/batch_size_finder.py
@@ -31,10 +31,10 @@


class BatchSizeFinder(Callback):
"""The ``BatchSizeFinder`` callback tries to find the largest batch size for a given model that does not give an
out of memory (OOM) error. All you need to do is add it as a callback inside Trainer and call
``trainer.{fit,validate,test,predict}``. Internally it calls the respective step function ``steps_per_trial`` times
for each batch size until one of the batch sizes generates an OOM error.
"""Attempts to find the largest batch size for a given model that avoids an out of memory (OOM) error. All you need
to do is add it as a callback inside Trainer and call ``trainer.{fit,validate,test,predict}``. Internally it calls
the respective step function ``steps_per_trial`` times for each batch size until one of the batch sizes generates
an OOM error.

.. warning:: This is an :ref:`experimental <versioning:Experimental API>` feature.

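The search described in the docstring (run the step function ``steps_per_trial`` times per candidate batch size until one size triggers OOM) can be sketched in plain Python. This is a minimal illustration of the technique, not Lightning's actual implementation; ``run_trial``, ``find_largest_batch_size``, and the ``MemoryError``-based OOM stand-in are all hypothetical names introduced here for clarity.

```python
def run_trial(batch_size, steps_per_trial, step_fn):
    """Call the step function steps_per_trial times; return False on (simulated) OOM."""
    try:
        for _ in range(steps_per_trial):
            step_fn(batch_size)
        return True
    except MemoryError:  # stand-in: real code would catch the framework's CUDA OOM error
        return False

def find_largest_batch_size(step_fn, steps_per_trial=3, init_size=2, max_trials=25):
    """Double the batch size until a trial fails; return the last size that succeeded."""
    size, largest = init_size, None
    for _ in range(max_trials):
        if run_trial(size, steps_per_trial, step_fn):
            largest = size
            size *= 2  # power-of-two scaling between trials
        else:
            break  # first OOM ends the search
    return largest

# Usage with a fake step function that "runs out of memory" above batch size 64:
def fake_step(batch_size):
    if batch_size > 64:
        raise MemoryError

largest = find_largest_batch_size(fake_step)
# largest == 64
```

In the real callback the trials run against ``trainer.{fit,validate,test,predict}``; the sketch only shows the doubling-until-OOM control flow.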