
Commit 75f3d23

docs: README
1 parent 6393091 commit 75f3d23

File tree: 1 file changed, +30 −0 lines

README.md (+30 lines)
@@ -90,6 +90,16 @@ from pytorch_optimizer import get_supported_optimizers
 supported_optimizers = get_supported_optimizers()
 ```
 
+or you can also search them with wildcard filter(s).
+
+```python
+>>> get_supported_optimizers('adam*')
+['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw']
+
+>>> get_supported_optimizers(['adam*', 'ranger*'])
+['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw', 'ranger', 'ranger21']
+```
+
 | Optimizer | Description | Official Code | Paper | Citation |
 |---------------|---------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------|
 | AdaBelief | *Adapting Step-sizes by the Belief in Observed Gradients* | [github](https://github.com/juntang-zhuang/Adabelief-Optimizer) | <https://arxiv.org/abs/2010.07468> | [cite](https://ui.adsabs.harvard.edu/abs/2020arXiv201007468Z/exportcitation) |
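The `'adam*'` and `['adam*', 'ranger*']` arguments in the added lines look like glob-style wildcards over the registered optimizer names. Below is a minimal sketch of that kind of filtering, assuming `fnmatch`-style patterns and lowercase names; it is an illustration of the behavior, not the library's actual implementation.

```python
# Illustrative only: glob-style filtering over a list of names, similar in
# spirit to the filter argument shown above (not the library's actual code).
from fnmatch import fnmatch

def filter_names(names, filters=None):
    """Return the sorted subset of `names` matching any glob pattern in `filters`."""
    if filters is None:
        return sorted(names)
    if isinstance(filters, str):
        filters = [filters]
    return sorted({name for name in names for pattern in filters if fnmatch(name, pattern)})

print(filter_names(['adamp', 'adamw', 'ranger', 'sgd'], ['adam*', 'ranger*']))
# -> ['adamp', 'adamw', 'ranger']
```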
@@ -182,6 +192,16 @@ from pytorch_optimizer import get_supported_lr_schedulers
 supported_lr_schedulers = get_supported_lr_schedulers()
 ```
 
+or you can also search them with wildcard filter(s).
+
+```python
+>>> get_supported_lr_schedulers('cosine*')
+['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup']
+
+>>> get_supported_lr_schedulers(['cosine*', '*warm*'])
+['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup', 'warmup_stable_decay']
+```
+
 | LR Scheduler | Description | Official Code | Paper | Citation |
 |-----------------|---------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------|------------------------------------|----------------------------------------------------------------------------------------------------|
 | Explore-Exploit | *Wide-minima Density Hypothesis and the Explore-Exploit Learning Rate Schedule* | | <https://arxiv.org/abs/2003.03977> | [cite](https://ui.adsabs.harvard.edu/abs/2020arXiv200303977I/exportcitation) |
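One practical use of the returned list is validating a scheduler name from a config file before building anything with it. The sketch below assumes only the `get_supported_lr_schedulers` call shown in the added lines and that the returned names are lowercase strings; the helper itself is hypothetical.

```python
# Hypothetical validation helper; only get_supported_lr_schedulers from the
# snippet above is assumed, and names are assumed to be lowercase strings.
from pytorch_optimizer import get_supported_lr_schedulers

def validate_lr_scheduler_name(name: str) -> str:
    supported = get_supported_lr_schedulers()
    if name.lower() not in supported:
        raise ValueError(f'unknown lr scheduler {name!r}; expected one of {supported}')
    return name.lower()

# e.g. validate_lr_scheduler_name('cosine_annealing_with_warmup')
```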
@@ -199,6 +219,16 @@ from pytorch_optimizer import get_supported_loss_functions
 supported_loss_functions = get_supported_loss_functions()
 ```
 
+or you can also search them with wildcard filter(s).
+
+```python
+>>> get_supported_loss_functions('*focal*')
+['bcefocalloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']
+
+>>> get_supported_loss_functions(['*focal*', 'bce*'])
+['bcefocalloss', 'bceloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']
+```
+
 | Loss Functions | Description | Official Code | Paper | Citation |
 |-----------------|-------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------|------------------------------------|------------------------------------------------------------------------------|
 | Label Smoothing | *Rethinking the Inception Architecture for Computer Vision* | | <https://arxiv.org/abs/1512.00567> | [cite](https://ui.adsabs.harvard.edu/abs/2015arXiv151200567S/exportcitation) |
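Since the filter argument also accepts a list of patterns, related losses can be grouped in a single call. A small illustrative loop using only `get_supported_loss_functions` from the snippet above; the patterns here are arbitrary examples, not a definitive list.

```python
# Illustrative only: print the supported loss functions matching a few glob patterns.
from pytorch_optimizer import get_supported_loss_functions

for pattern in ('*focal*', 'bce*', '*cosine*'):
    print(pattern, '->', get_supported_loss_functions(pattern))
```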
