A rapidly developing framework for transforming numerical functions. Wait? What?
We know NumPy. The scientific computing package for Python with a bazillion numerical functions. It's fast and efficient, and it even taps into optimized linear algebra libraries like Intel MKL and OpenBLAS. NumPy is foundational to the Python ML ecosystem: frameworks like TensorFlow and PyTorch mirror its API and interoperate with its arrays.
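To make that concrete, here's a minimal NumPy sketch (the matrix sizes are arbitrary). A matrix multiply like this gets dispatched to whatever BLAS backend NumPy was built against, which is where the speed comes from:

```python
import numpy as np

# Two random matrices; the sizes here are arbitrary.
a = np.random.rand(1000, 1000)
b = np.random.rand(1000, 1000)

# np.dot dispatches to the BLAS library NumPy was compiled against
# (e.g. Intel MKL or OpenBLAS), so it runs far faster than naive Python loops.
c = np.dot(a, b)
print(c.shape)  # (1000, 1000)
```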
It turns out all software is just layers of other software - pretty nifty!
In the case of TensorFlow you have compilers like XLA that take atomic operations that would otherwise execute individually and fuse them together, skipping the intermediate write-then-read round trips to memory by streaming results directly from one op to the next. XLA manages code generation and execution across CPUs, GPUs, and TPUs.
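Here's a hedged sketch of what that fusion buys you, using `jax.jit` (which compiles through XLA). Eagerly, each op would materialize an intermediate array; under `jit`, XLA can fuse the multiply, add, and sum into one kernel. Any speedup you measure will of course vary by machine:

```python
import jax
import jax.numpy as jnp

def f(x):
    # Run eagerly, each op writes its result to memory and reads it back;
    # under jit, XLA can fuse these into a single kernel.
    y = x * 2.0
    z = y + 1.0
    return jnp.sum(z)

f_jit = jax.jit(f)

x = jnp.ones((1000, 1000))
print(f(x))      # eager: ops dispatched one by one
print(f_jit(x))  # compiled: XLA fuses the whole pipeline
```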
Machine learning algorithms like backpropagation are essential for training neural networks. Hint: it's a ton of math. Math like differentiation. Back in calculus we learned about differentiation and, in particular, a well-named thing called the chain rule. Basically, express the derivative of a complex function as a chain of derivatives of the simpler functions that compose it. Ugh, say it again but differently: I don't know the change in the output with respect to the input directly, but I do know the change at each intermediate step with respect to the step before it, so I multiply those pieces together.
Get it together Mike....
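Maybe code says it better. A minimal sketch: for f(x) = sin(x²), the chain rule says the derivative is cos(x²) · 2x, and `jax.grad` (the Autograd half of the story) recovers exactly that:

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x ** 2)  # a composition: sin(u) with u = x**2

df = jax.grad(f)  # autodiff applies the chain rule for us

x = 1.5
# Chain rule by hand: d/dx sin(x**2) = cos(x**2) * 2x
manual = jnp.cos(x ** 2) * 2 * x
print(df(x), manual)  # the two values agree
```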
Well, if Autograd and XLA met, hit it off, decided they liked each other, and became besties, then together they would be called JAX, and when you hear them talk it would sound like NumPy.
In the words of the developers: 'JAX is Autograd and XLA, brought together for high-performance machine learning research.'
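Putting the whole pitch together in one hedged sketch: `jax.numpy` reads like NumPy, `jax.grad` is the Autograd half, `jax.jit` is the XLA half, and they compose. (The toy least-squares loss below is just for illustration.)

```python
import jax
import jax.numpy as jnp

# Looks like NumPy...
def loss(w, x, y):
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# ...but it's differentiable (Autograd) and compilable (XLA), and the
# two transformations stack.
grad_loss = jax.jit(jax.grad(loss))

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 3))
w = jnp.zeros(3)
y = jnp.ones(32)

print(grad_loss(w, x, y))  # gradient of the loss w.r.t. w, compiled by XLA
```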