Add optimisers #320

Draft · jatkinson1000 wants to merge 4 commits into main from optim

Conversation

@jatkinson1000 (Member) commented on Mar 12, 2025

This PR adds optimiser functionality to the code.

  • FTorch
    • Expose optimisers in FTorch
    • Write unit tests to cover these
  • Exercise:
    • README - WIP
    • requirements
    • python version of the exercise (see the sketch after this list)
    • Fortran version of the exercise - WIP
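
As a rough illustration of what the Python version of the exercise could look like (a sketch only; the data, model, and hyperparameters here are illustrative and not taken from this PR), a standard PyTorch training loop using an optimiser might be:

import torch

# Toy data: learn y = 2x + 1 with a single linear layer.
x = torch.linspace(-1.0, 1.0, 100).unsqueeze(1)
y = 2.0 * x + 1.0

model = torch.nn.Linear(1, 1)
optimiser = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

for epoch in range(200):
    optimiser.zero_grad()        # clear gradients from the previous step
    loss = loss_fn(model(x), y)  # forward pass and loss
    loss.backward()              # backpropagate
    optimiser.step()             # update parameters

print(model.weight.item(), model.bias.item())  # should approach 2.0 and 1.0

The Fortran version of the exercise would presumably mirror this same loop using the new FTorch optimiser bindings.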

Notes whilst in progress:

  • It may be useful to expose a few more tensor functions such as .mean() and .sum() - see #240 (Overload sum intrinsic for tensors)
    • it is handy that calling .backward() on a scalar needs no explicit gradient argument, so contracting a (loss) tensor into a scalar implicitly initialises the gradient to ones. Should this be added to the autograd example? (See the sketch after these notes.)
  • ftorch.F90 is going to grow with this. Perhaps now is the time to break it apart into sub-modules?
    • I'm thinking of having ftorch_optim as a module to hold these.
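
To illustrate the point about implicit gradients (a minimal PyTorch sketch, not code from this PR): calling .backward() on a non-scalar tensor requires an explicit gradient argument, whereas contracting it to a scalar with .sum() first gives the same gradients as passing a tensor of ones.

import torch

x = torch.ones(3, requires_grad=True)
y = x * 2.0                              # non-scalar output

# Passing an explicit gradient of ones...
y.backward(gradient=torch.ones_like(y))
g_explicit = x.grad.clone()

# ...gives the same result as contracting to a scalar first,
# where .backward() needs no gradient argument.
x.grad = None
y = x * 2.0
y.sum().backward()
assert torch.equal(x.grad, g_explicit)   # both are tensors of 2.0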

@jatkinson1000 jatkinson1000 added the enhancement New feature or request label Mar 12, 2025
@jatkinson1000 jatkinson1000 self-assigned this Mar 12, 2025
@jwallwork23 jwallwork23 added autograd Tasks towards the online training / automatic differentiation feature and removed autograd Tasks towards the online training / automatic differentiation feature labels Mar 12, 2025
@jatkinson1000 jatkinson1000 force-pushed the optim branch 3 times, most recently from 14adcf1 to 281f28a on March 12, 2025 17:07

Cpp-Linter Report ⚠️

Some files did not pass the configured checks!

clang-format (v12.0.0) reports: 2 file(s) not formatted
  • src/ctorch.cpp
  • src/ctorch.h


Comment on lines 102 to +103
fortitude check src/ftorch.F90
fortitude check src/ftorch_optim.F90
Collaborator
At some point it might make sense to separate out modules for tensor, model, optim.

Member Author

Yeah, since each optimiser (Adam, SGD, etc.) seems to be its own function, I thought I'd do this here for the Fortran at least.
I had been hoping that there was a general optimiser function specified by an enum, but alas no.

The others feel OK for now, but splitting them out is definitely something I was thinking about, especially as the module functions might grow soon.
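
For reference, the same pattern is visible in PyTorch's Python API (a small sketch, not code from this PR): each optimiser is its own class rather than a single constructor parameterised by an enum, although they all share the torch.optim.Optimizer base class.

import torch

params = [torch.zeros(2, requires_grad=True)]

# Each optimiser is its own class; there is no single constructor selected
# by an enum, which is why the Fortran interface ends up with one creation
# routine per optimiser.
sgd = torch.optim.SGD(params, lr=0.01)
adam = torch.optim.Adam(params, lr=0.001)

# They do share the torch.optim.Optimizer base class, so step() and
# zero_grad() can be called uniformly once an optimiser has been constructed.
assert isinstance(sgd, torch.optim.Optimizer)
assert isinstance(adam, torch.optim.Optimizer)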

Labels: enhancement (New feature or request)
2 participants