SoftJAX & SoftTorch: Empowering Automatic Differentiation Libraries with Informative Gradients
arXiv:2603.08824v1 Announce Type: new Abstract: Automatic differentiation (AD) frameworks such as JAX and PyTorch have enabled gradient-based optimization for a wide range of scientific fields. …
Anselm Paulus, A. René Geist, Vít Musil, Sebastian Hoffmann, Onur Beker, Georg Martius
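To illustrate the kind of machinery the abstract refers to, here is a minimal, self-contained sketch of automatic differentiation using forward-mode dual numbers in plain Python. This is an illustration of the general AD principle that frameworks like JAX and PyTorch implement at scale, not code from the paper; the `Dual` class and `grad` helper are hypothetical names introduced here.

```python
from dataclasses import dataclass

@dataclass
class Dual:
    """Dual number carrying a value and its derivative (tangent)."""
    val: float  # function value
    dot: float  # derivative w.r.t. the seeded input

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(other, 0.0)

    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._coerce(other)
        # Product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def grad(f, x):
    """Derivative of scalar function f at x via forward-mode AD."""
    # Seed the input with tangent 1.0 and read off the output tangent.
    return f(Dual(x, 1.0)).dot

# d/dx (3x^2 + 2x) at x = 2.0 is 6x + 2 = 14
print(grad(lambda x: 3.0 * x * x + 2.0 * x, 2.0))  # → 14.0
```

Reverse-mode AD (what `jax.grad` and PyTorch's `backward` use for many-input functions) records the computation and propagates adjoints backwards instead, but the chain-rule bookkeeping shown here is the same idea.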