Conversation

@jmduarte jmduarte commented Sep 9, 2023

  • Add focal loss option for classification

Note: I explicitly keep the current implementation, F.cross_entropy, when self.options.classification_focal_gamma == 0, out of an abundance of caution, even though I checked that, numerically, the more general focal formula gives the same answer.

The difference is that F.cross_entropy fuses F.log_softmax and F.nll_loss, so it is supposed to be more numerically stable, which I wanted to keep. But if you think it's unnecessary, we can remove this special-case branch.
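The branching described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the function name, the `gamma` argument, and the shapes (class logits plus integer targets) are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def focal_cross_entropy(logits: torch.Tensor, targets: torch.Tensor,
                        gamma: float = 0.0) -> torch.Tensor:
    """Sketch of cross-entropy with an optional focal-loss modulation.

    Hypothetical helper illustrating the pattern in the comment above;
    logits: (N, C), targets: (N,) integer class indices.
    """
    if gamma == 0.0:
        # Keep the fused implementation: F.cross_entropy combines
        # F.log_softmax and F.nll_loss for better numerical stability.
        return F.cross_entropy(logits, targets)
    # Focal loss: FL(p_t) = -(1 - p_t)^gamma * log(p_t),
    # which reduces to plain cross-entropy when gamma == 0.
    log_p = F.log_softmax(logits, dim=-1)
    log_pt = log_p.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    pt = log_pt.exp()
    loss = -((1.0 - pt) ** gamma) * log_pt
    return loss.mean()
```

With gamma set to 0 the two branches agree numerically, which is the equivalence checked in the comment; gamma > 0 down-weights well-classified examples.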

@mstamenk

jmduarte commented Oct 5, 2023

@Alexanders101 is this PR OK with you? Can you merge it, or do you have any requested changes?
