Backpropagation
Last but not least, KeOps fully supports automatic differentiation.
Most of the magic required is implemented by the F::DiffT
attributes of KeOps formulas and reductions, as discussed in
previous pages.
Backprop through a Sum reduction
Then, to implement the PyTorch backward
of the KeOps Genred
operator, we simply have to remember that if

    a_i = \sum_{j=1}^{N} F(x_i, y_j),    i = 1, \dots, M

denotes the output of a Genred
call with a Sum reduction, we can write that for all variations
\delta x_i and \delta y_j of the inputs,

    \delta a_i = \sum_{j=1}^{N} \partial_x F(x_i, y_j)\,\delta x_i + \partial_y F(x_i, y_j)\,\delta y_j .

Consequently, performing the appropriate permutations of sums, the gradients
of a scalar objective L with adjoint vectors g_i = \partial L / \partial a_i read:

    \partial L / \partial x_i = \sum_{j=1}^{N} [\partial_x F(x_i, y_j)]^\top g_i ,
    \partial L / \partial y_j = \sum_{i=1}^{M} [\partial_y F(x_i, y_j)]^\top g_i .
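As a sanity check, here is a minimal, purely dense PyTorch sketch of these two
formulas for the Gaussian kernel product a_i = \sum_j \exp(-\|x_i - y_j\|^2) b_j.
The class name GaussianConv and the tensor sizes are arbitrary and, unlike KeOps,
this toy version materializes the full (M, N) kernel matrix: it only illustrates
that the permuted sums above match the gradients expected by torch.autograd:

    import torch

    class GaussianConv(torch.autograd.Function):
        # Dense toy version of a_i = sum_j exp(-|x_i - y_j|^2) * b_j.
        # KeOps encodes the same forward and backward passes as symbolic
        # Map-Reduce schemes, without ever building the (M, N) matrices below.

        @staticmethod
        def forward(ctx, x, y, b):
            diff = x[:, None, :] - y[None, :, :]          # (M, N, D) differences
            K = (-(diff ** 2).sum(-1)).exp()              # (M, N) Gaussian kernel
            ctx.save_for_backward(diff, K, b)
            return K @ b                                  # (M, E): Sum reduction over j

        @staticmethod
        def backward(ctx, g):                             # g_i = dL/da_i, shape (M, E)
            diff, K, b = ctx.saved_tensors
            S = (g @ b.t()) * K                           # (M, N): <g_i, b_j> * K_ij
            grad_x = -2 * (S[:, :, None] * diff).sum(1)   # sum over j -> (M, D)
            grad_y = 2 * (S[:, :, None] * diff).sum(0)    # sum over i -> (N, D)
            grad_b = K.t() @ g                            # sum over i -> (N, E)
            return grad_x, grad_y, grad_b

    # Numerical check of the hand-written backward against finite differences:
    M, N, D, E = 5, 7, 3, 2
    x = torch.randn(M, D, dtype=torch.float64, requires_grad=True)
    y = torch.randn(N, D, dtype=torch.float64, requires_grad=True)
    b = torch.randn(N, E, dtype=torch.float64, requires_grad=True)
    assert torch.autograd.gradcheck(GaussianConv.apply, (x, y, b))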
Backprop through a Log-Sum-Exp reduction
Similarly, when

    a_i = \log \sum_{j=1}^{N} \exp F(x_i, y_j)

is the output of a Genred call with a Log-Sum-Exp reduction,
straightforward computations show that:

    \partial L / \partial x_i = g_i \sum_{j=1}^{N} \exp[ F(x_i, y_j) - a_i ] \, \partial_x F(x_i, y_j) ,
    \partial L / \partial y_j = \sum_{i=1}^{M} g_i \exp[ F(x_i, y_j) - a_i ] \, \partial_y F(x_i, y_j) ,

where the softmax-like weights \exp[ F(x_i, y_j) - a_i ] re-use the result a_i of the forward pass.
In other words, a backward pass through a Genred
call that involves
a Sum or a Log-Sum-Exp reduction can
always be written as a symbolic Map-Reduce computation.
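The Log-Sum-Exp case can be checked numerically in the same spirit. The short,
purely dense PyTorch snippet below (arbitrary sizes, scalar formula
F(x_i, y_j) = -\|x_i - y_j\|^2) verifies that the softmax-weighted sum above
matches the gradient returned by torch.autograd; KeOps evaluates the same
expression as a symbolic Map-Reduce scheme:

    import torch

    M, N, D = 4, 6, 3
    x = torch.randn(M, D, dtype=torch.float64, requires_grad=True)
    y = torch.randn(N, D, dtype=torch.float64)

    diff = x[:, None, :] - y[None, :, :]        # (M, N, D) differences
    F = -(diff ** 2).sum(-1)                    # (M, N) values of F(x_i, y_j)
    a = F.logsumexp(dim=1)                      # (M,) Log-Sum-Exp reduction over j

    # Gradient predicted by the formula above (with g_i = 1 for all i):
    # a softmax-weighted Sum reduction, i.e. another Map-Reduce over j.
    w = (F - a[:, None]).exp()                  # weights exp[F(x_i, y_j) - a_i]
    grad_formula = (w[:, :, None] * (-2 * diff)).sum(1)

    (grad_autograd,) = torch.autograd.grad(a.sum(), [x])
    assert torch.allclose(grad_formula, grad_autograd)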
Bootstrapping derivatives of arbitrary order
Applying these commutation rules between the differential operator
and our Map-Reduce reductions, we can provide full support for the
torch.autograd
package through KeOps LazyTensors. Thanks to recursive calls to the Genred
operator and to our
symbolic math engine, everything works just fine – even high-order
derivatives.
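In practice, assuming that pykeops is installed, chaining torch.autograd.grad
calls on a LazyTensor reduction is enough to obtain such high-order derivatives;
the sketch below (arbitrary point clouds and a Gaussian kernel) shows one way to do it:

    import torch
    from pykeops.torch import LazyTensor

    x = torch.randn(1000, 3, requires_grad=True)   # arbitrary sizes
    y = torch.randn(2000, 3)

    x_i = LazyTensor(x[:, None, :])                # symbolic (M, 1, 3) variable
    y_j = LazyTensor(y[None, :, :])                # symbolic (1, N, 3) variable

    K_ij = (-((x_i - y_j) ** 2).sum(-1)).exp()     # symbolic (M, N) Gaussian kernel
    a = K_ij.sum(dim=1)                            # (M, 1) Sum reduction: a Genred call

    # The backward pass of this reduction is itself a Genred call;
    # create_graph=True keeps it differentiable, so that a second
    # call to torch.autograd.grad goes through the same machinery.
    (g,) = torch.autograd.grad(a.sum(), [x], create_graph=True)
    (h,) = torch.autograd.grad((g ** 2).sum(), [x])
    print(h.shape)                                 # torch.Size([1000, 3])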