CONTRIBUTING.md (+1 -1)
@@ -12,4 +12,4 @@ Apart from the conditions above, this repository follows the [ColPrac](https://g
 Its code is formatted using [Runic.jl](https://github.com/fredrikekre/Runic.jl).
 As part of continuous integration, a set of formal tests is run using [pre-commit](https://pre-commit.com/).
 We invite you to install pre-commit so that these checks are performed locally before you open or update a pull request.
-You can refer to the [dev guide](https://juliadiff.org/DifferentiationInterface.jl/DifferentiationInterface/dev/dev_guide/) for details on the package structure and the testing pipeline.
+You can refer to the relevant page of the development documentation for details on the package structure and the testing pipeline.
DifferentiationInterface/docs/src/dev/contributing.md (+16 -15)
@@ -1,4 +1,4 @@
-# Dev guide
+# Contributing
 
 This page is important reading if you want to contribute to DifferentiationInterface.jl.
 It is not part of the public API and the content below may become outdated, in which case you should refer to the source code as the ground truth.
@@ -7,26 +7,27 @@ It is not part of the public API and the content below may become outdated, in which case you should refer to the source code as the ground truth.
 
 The package is structured around 8 [operators](@ref Operators):
 
 - [`derivative`](@ref)
 - [`second_derivative`](@ref)
 - [`gradient`](@ref)
 - [`jacobian`](@ref)
 - [`hessian`](@ref)
 - [`pushforward`](@ref)
 - [`pullback`](@ref)
 - [`hvp`](@ref)
 
 Most operators have 4 variants, which look like this at first order: `operator`, `operator!`, `value_and_operator`, `value_and_operator!`.
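To make the naming scheme concrete, here is a minimal sketch of the four `gradient` variants as seen from the public API (the backend choice is arbitrary; `AutoForwardDiff` is used only for illustration, and signatures should be checked against the current docs):

```julia
using DifferentiationInterface
using ForwardDiff  # enables the AutoForwardDiff() backend

f(x) = sum(abs2, x)
backend = AutoForwardDiff()
x = [1.0, 2.0, 3.0]

grad = gradient(f, backend, x)                  # out-of-place
y, grad = value_and_gradient(f, backend, x)     # also returns f(x)

buf = similar(x)
gradient!(f, buf, backend, x)                   # in-place: writes into buf
y, _ = value_and_gradient!(f, buf, backend, x)  # in-place, also returns f(x)
```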
 
 ## New operator
 
 To implement a new operator for an existing backend, you need to write 5 methods: 1 for [preparation](@ref Preparation) and 4 corresponding to the variants of the operator (see above).
-For first-order operators, you may also want to support [in-place functions](@ref "Mutation and signatures"), which requires another 5 methods (defined on `f!` instead of `f`).
+For some operators, you will also need to support [in-place functions](@ref "Mutation and signatures"), which requires another 5 methods (defined on `f!` instead of `f`).
 
 The method `prepare_operator_nokwarg` must output a `prep` object of the correct type.
-For instance, `prepare_gradient(strict, f, backend, x)` must return a [`DifferentiationInterface.GradientPrep`](@ref).
-Assuming you don't need any preparation for said operator, you can use the trivial prep that are already defined, like `DifferentiationInterface.NoGradientPrep{SIG}`.
+For instance, `prepare_gradient_nokwarg(strict, f, backend, x)` must return a [`DifferentiationInterface.GradientPrep`](@ref).
+Assuming you don't need any preparation for said operator, you can use the trivial preparation types that are already defined, like `DifferentiationInterface.NoGradientPrep{SIG}`.
 Otherwise, define a custom struct like `MyGradientPrep{SIG} <: DifferentiationInterface.GradientPrep{SIG}` and put the necessary storage in there.
+Take inspiration from existing operators on how to enforce the signature `SIG`.
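Putting these requirements together, a gradient implementation for an imaginary `AutoSuperDiff` backend could be sketched as below. This is purely illustrative: `SIG` is hardcoded to `Nothing` instead of being enforced properly, and the finite-difference loop stands in for calls to a real AD package, so mirror an existing backend extension rather than this sketch.

```julia
import ADTypes
import DifferentiationInterface as DI

struct AutoSuperDiff <: ADTypes.AbstractADType end  # stand-in backend for illustration

struct MyGradientPrep{SIG,C} <: DI.GradientPrep{SIG}
    cache::C  # backend-specific storage reused across calls
end

# 1. Preparation: build the prep object once, reuse it for many inputs.
# SIG is hardcoded to Nothing here; real code should enforce the signature
# the way existing operators do.
function DI.prepare_gradient_nokwarg(strict, f, backend::AutoSuperDiff, x)
    cache = similar(x)
    return MyGradientPrep{Nothing,typeof(cache)}(cache)
end

# 2. value_and_gradient: forward differences as a placeholder for real AD calls.
function DI.value_and_gradient(f, prep::MyGradientPrep, backend::AutoSuperDiff, x)
    y = f(x)
    h = sqrt(eps(float(eltype(x))))
    for i in eachindex(x)
        xp = copy(x)
        xp[i] += h
        prep.cache[i] = (f(xp) - y) / h
    end
    return y, copy(prep.cache)
end

# 3-5. The remaining variants can delegate to value_and_gradient.
function DI.value_and_gradient!(f, grad, prep::MyGradientPrep, backend::AutoSuperDiff, x)
    y, g = DI.value_and_gradient(f, prep, backend, x)
    return y, copyto!(grad, g)
end

DI.gradient(f, prep::MyGradientPrep, backend::AutoSuperDiff, x) =
    last(DI.value_and_gradient(f, prep, backend, x))

DI.gradient!(f, grad, prep::MyGradientPrep, backend::AutoSuperDiff, x) =
    last(DI.value_and_gradient!(f, grad, prep, backend, x))
```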
 
 ## New backend
 
@@ -36,18 +37,18 @@ Your AD package needs to be registered first.
 
 ### Core code
 
 In the main package, you should define a new struct `SuperDiffBackend` which subtypes [`ADTypes.AbstractADType`](@extref ADTypes), and endow it with the fields you need to parametrize your differentiation routines.
-You also have to define [`ADTypes.mode`](@extref) and [`DifferentiationInterface.inplace_support`](@ref) on `SuperDiffBackend`.
+You also have to define [`ADTypes.mode`](@extref), [`DifferentiationInterface.check_available`](@ref) and [`DifferentiationInterface.inplace_support`](@ref) on `SuperDiffBackend`.
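As a hedged sketch, the core struct and its three trait methods might look as follows, assuming a forward-mode backend; the trait return values (`ForwardMode()`, `true`, `InPlaceSupported()`) are illustrative choices to be checked against the current ADTypes.jl and DifferentiationInterface.jl docs:

```julia
import ADTypes
import DifferentiationInterface as DI

struct SuperDiffBackend <: ADTypes.AbstractADType
    chunk_size::Int  # example field parametrizing the differentiation routines
end

ADTypes.mode(::SuperDiffBackend) = ADTypes.ForwardMode()
DI.check_available(::SuperDiffBackend) = true  # true iff the wrapped AD package loads and works
DI.inplace_support(::SuperDiffBackend) = DI.InPlaceSupported()
```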
 
 !!! info
 
     In the end, this backend struct will need to be contributed to [ADTypes.jl](https://github.com/SciML/ADTypes.jl).
     However, putting it in the DifferentiationInterface.jl PR is a good first step for debugging.
 
 In a [package extension](https://pkgdocs.julialang.org/v1/creating-packages/#Conditional-loading-of-code-in-packages-(Extensions)) named `DifferentiationInterfaceSuperDiffExt`, you need to implement at least [`pushforward`](@ref) or [`pullback`](@ref) (and their variants).
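A schematic skeleton of such an extension is sketched below; `SuperDiff` is a made-up AD package, and the pushforward signatures are assumptions modeled on the prep-based gradient example from the previous section, not copied from a real extension:

```julia
module DifferentiationInterfaceSuperDiffExt

import DifferentiationInterface as DI
import DifferentiationInterface: SuperDiffBackend  # assuming the struct lives in the main package
using SuperDiff  # hypothetical AD package wrapped by this extension

# Preparation for pushforward, mirroring the gradient example above.
function DI.prepare_pushforward_nokwarg(strict, f, backend::SuperDiffBackend, x, tx)
    # allocate tapes, caches or config objects here and wrap them
    # in a suitable PushforwardPrep subtype
end

function DI.value_and_pushforward(f, prep, backend::SuperDiffBackend, x, tx)
    # compute f(x) together with one JVP per tangent in tx, using SuperDiff.jl
end

# ...plus pushforward, pushforward!, and value_and_pushforward!,
# delegating as in the gradient sketch above...

end # module
```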
 The exact requirements depend on the differentiation mode you chose: