17–19 Sept 2025
Technical University of Moldova
Europe/Bucharest timezone

Parallel and Distributed Computation of High-Order Derivatives in Neural Networks using Stochastic Taylor Derivative Estimator

18 Sept 2025, 16:00
15m
Room 1

Technical University of Moldova
Paper presentation | Grid, Cloud & High Performance Computing in Science | Doctoral Symposium

Speaker

Alex Deonise

Description

This paper presents a scalable framework for computing high-order derivatives in neural networks using the Stochastic Taylor Derivative Estimator (STDE) in parallel and distributed computing environments. Targeting Physics-Informed Neural Networks (PINNs), the work extends the theoretical and practical applicability of STDE, a method based on univariate Taylor-mode automatic differentiation and randomized jet sampling, by integrating it into the JAX ecosystem with distributed primitives such as pmap and pjit. The implementation achieves significant speedups and memory savings by avoiding the explicit construction of the large derivative tensors typically associated with high-order derivatives. Experimental benchmarks on the many-body Schrödinger equation demonstrate near-linear scalability, with runtime improvements of up to 6.86× over single-GPU baselines. Our results show that STDE, when combined with distributed computation, bridges a critical gap in scalable scientific machine learning by enabling efficient high-order automatic differentiation in massively parallel environments.
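
As a concrete illustration, the following minimal sketch (not the authors' implementation; u_fn, quad_form, and estimate_laplacian are hypothetical names) estimates the Laplacian of a scalar field via second-order jets from jax.experimental.jet and replicates the estimator across devices with jax.pmap. Random Rademacher directions contract the Hessian without materializing it, which is the essence of the randomized jet sampling used in STDE.

    import jax
    import jax.numpy as jnp
    from jax.experimental.jet import jet

    def u_fn(x):
        # Stand-in for a PINN forward pass: a smooth scalar field on R^d.
        return jnp.sin(jnp.sum(jnp.cos(x)))

    def quad_form(x, v):
        # Push the path t -> x + v*t through u_fn with a second-order jet.
        # The second output coefficient is the quadratic form v^T H v,
        # where H is the Hessian of u_fn at x; no d x d tensor is built.
        _, (_, d2u) = jet(u_fn, (x,), ((v, jnp.zeros_like(v)),))
        return d2u

    def estimate_laplacian(x, key, n_samples=64):
        # Rademacher directions satisfy E[v v^T] = I, so E[v^T H v] = tr(H),
        # the Laplacian -- the operator a PINN residual typically needs.
        vs = jax.random.rademacher(key, (n_samples, *x.shape), dtype=x.dtype)
        return jnp.mean(jax.vmap(lambda v: quad_form(x, v))(vs))

    # Replicate the estimator: each device averages its own batch of random
    # directions, and the per-device means are averaged on the host.
    p_estimate = jax.pmap(estimate_laplacian, in_axes=(None, 0))

    x = jnp.ones((16,))
    keys = jax.random.split(jax.random.PRNGKey(0), jax.device_count())
    print(jnp.mean(p_estimate(x, keys)))  # Monte Carlo estimate of the Laplacian at x

Higher-order differential operators follow the same pattern with longer jets, and pjit replaces pmap when the model itself must be sharded rather than replicated.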

Authors

Alex Deonise
Joanna Kolodziej (NASK and Cracow University of Technology, Poland)
Florin Pop (University Politehnica of Bucharest, Romania / National Institute for Research & Development in Informatics – ICI Bucharest, Romania)
