
Conversation

@ValerianRey
Contributor

ValerianRey commented Jan 8, 2026

This is just a draft, showing more or less how this change will impact the interface. What do you think so far, @PierreQuinton?

TODO:

  • Update tests of backward and mtl_backward
  • Add unit tests for jac_to_grad and AccumulateJac
  • Find solution to the dependency problem
  • Use utils package?
  • Finish the solution to the jac attribute not being defined on Tensor (the Pylance issue discussed below)
  • Check that the Jacobians all have the same first dim in jac_to_grad (see the sketch after this list)
  • Add a changelog entry explaining that this is a breaking change and how to make the transition
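
For the first-dim check mentioned above, a minimal sketch of what it could look like (the helper name and error message are placeholders, not part of this PR):

```python
import torch

def _check_jac_first_dims(params: list[torch.Tensor]) -> None:
    # Hypothetical helper, for illustration only: every stored .jac should share the
    # same first dimension, since each row corresponds to one of the objectives.
    first_dims = {p.jac.shape[0] for p in params if getattr(p, "jac", None) is not None}
    if len(first_dims) > 1:
        raise ValueError(
            f"Expected all Jacobians to share the same first dimension, got {sorted(first_dims)}."
        )
```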

ValerianRey added the feat (New feature or request) label Jan 8, 2026
ValerianRey self-assigned this Jan 8, 2026
Comment on lines +14 to +15
Aggregates the Jacobians stored in the ``.jac`` fields of ``params`` and accumulates the result
into their ``.grad`` fields.
Contributor

Are we set on "Accumulate" vs "store" in .grad?

I think that accumulate is the way to make it more torchy, and store is a new direction related to our (my?) dislike of sums.

Contributor Author

I think the main reason to accumulate is to be more torch-like. Also, if a user really wants to replace the .grad, they can do optimizer.zero_grad() and then call jac_to_grad(...).

The other way around (if we decided that jac_to_grad replaces the .grad field and a user instead wants to accumulate) would not be possible.
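
To make the asymmetry concrete with a self-contained toy (the function below is only an illustrative stand-in for jac_to_grad, and it aggregates with a plain mean just for the example):

```python
import torch

def _toy_jac_to_grad(params) -> None:
    # Illustrative stand-in: aggregate each .jac (here simply a mean over the first
    # dim) and *accumulate* the result into .grad, like torch's autograd does.
    for p in params:
        grad = p.jac.mean(dim=0)
        p.grad = grad if p.grad is None else p.grad + grad

p = torch.zeros(3, requires_grad=True)
p.jac = torch.ones(2, 3)  # pretend backward()/mtl_backward() stored a 2x3 Jacobian here

_toy_jac_to_grad([p])  # .grad is the aggregated Jacobian
_toy_jac_to_grad([p])  # accumulates: .grad is now twice that value

# A user who wants "replace" semantics only needs to clear .grad first
# (e.g. via optimizer.zero_grad()):
p.grad = None
_toy_jac_to_grad([p])
```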

Contributor

Pretty convinced, but let's keep this discussion open for a few days in case we think of other things.

@ValerianRey
Contributor Author

ValerianRey commented Jan 8, 2026

We have something a bit tricky: both jac_to_grad and the AccumulateGrad transform depend on the code from _accumulation.py.

To reuse some code in two different packages (utils and autojac), without making one depend on the other, we would need another protected package just for this code. So IMO we could:

  • Have the _accumulation.py code somewhere else in torchjd (either in a standalone protected Python file or in a protected accumulation module), as sketched after this list.
  • Duplicate some accumulation code.
  • Live with the fact that autojac depends on utils and utils depends on aggregation.
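
A rough sketch of what the first option could look like (paths and names are illustrative, not final):

```python
# Hypothetical layout, for illustration only:
#
#   src/torchjd/_accumulation.py                    # protected, shared accumulation helpers
#   src/torchjd/utils/_jac_to_grad.py
#   src/torchjd/autojac/_transform/_accumulate.py
#
# Both consumers would then import the shared code the same way, e.g.
#
#   from torchjd._accumulation import ...
#
# so neither utils nor autojac needs to import the other.
```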

@ValerianRey
Contributor Author

Lastly, it's very annoying to have this Pylance error all over the place:

Cannot access attribute "jac" for class "Tensor"
  Attribute "jac" is unknown

I'm not sure how we could reliably stop showing this error, or even use a custom tensor that would properly fix this.

@PierreQuinton
Contributor

Lastly, it's very annoying to have this Pylance error all over the place:

Cannot access attribute "jac" for class "Tensor"
  Attribute "jac" is unknown

I'm not sure how we could reliably stop showing this error, or even use a custom tensor that would properly fix this.

I think this is just because all of this is bad design. We made it more torch-compatible, and this is the price. Pylance is right: there is no jac field on Tensors.

@ValerianRey
Contributor Author

I think this is just because all of this is bad design. We made it more torch-compatible, and this is the price. Pylance is right: there is no jac field on Tensors.

Yeah but there are still ways to make it much cleaner. I'm working on something.
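
For reference, one common way to keep the type checker quiet without changing runtime behavior is a typing-only protocol plus a cast; this is only a sketch of that pattern, not necessarily what _tensor_with_jac.py will end up doing:

```python
from typing import Protocol, cast

import torch

class TensorWithJac(Protocol):
    # Typing-only view of a Tensor that is known to carry a .jac field.
    jac: torch.Tensor

def get_jac(param: torch.Tensor) -> torch.Tensor:
    # cast() has no runtime effect; it only tells Pylance that .jac exists
    # on this particular tensor.
    return cast(TensorWithJac, param).jac
```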

@PierreQuinton
Contributor

Yeah but there are still ways to make it much cleaner. I'm working on something.

I think we should keep in mind that we would like the users to have a good typing experience. It would be nasty to make them use .jac fields if they get typing errors.

@ValerianRey
Contributor Author

Yeah but there are still ways to make it much cleaner. I'm working on something.

I think we should keep in mind that we would like the users to have a good typing experience. It would be nasty to make them use .jac fields if they get typing errors.

Well, they should rarely need to use these .jac fields themselves. If they want access to the jacobians, they should use autojac.jac.

@PierreQuinton
Contributor

Right, that's very good, and if they do, they should get some amount of warning. You can delete this discussion if you want (it's annoying that we cannot have a thread).

- Remove test_aggregate.py
- Update test_accumulate.py and test_interactions.py to test on AccumulateGrad instead of Accumulate
- Fix tests in test_backward.py and test_mtl_backward.py to match the new interface: check the .jac field instead of the .grad field.
- Use _asserts.py for helper functions common to backward.py and mtl_backward.py
@codecov

codecov bot commented Jan 9, 2026

Codecov Report

❌ Patch coverage is 93.18182% with 6 lines in your changes missing coverage. Please review.

Files with missing lines             Patch %   Lines
src/torchjd/utils/_accumulation.py   87.50%    3 Missing ⚠️
src/torchjd/utils/_jac_to_grad.py    92.85%    3 Missing ⚠️

Files with missing lines                        Coverage Δ
src/torchjd/autojac/_backward.py                100.00% <100.00%> (ø)
src/torchjd/autojac/_mtl_backward.py            100.00% <100.00%> (ø)
src/torchjd/autojac/_transform/__init__.py      100.00% <100.00%> (ø)
src/torchjd/autojac/_transform/_accumulate.py   100.00% <100.00%> (ø)
src/torchjd/utils/__init__.py                   100.00% <100.00%> (ø)
src/torchjd/utils/_tensor_with_jac.py           100.00% <100.00%> (ø)
src/torchjd/utils/_accumulation.py              87.50% <87.50%> (ø)
src/torchjd/utils/_jac_to_grad.py               92.85% <92.85%> (ø)