Like its predecessor at PPoPP’24, this workshop aims to bring together researchers working on methods, tools, and frameworks for automatic differentiation with practitioners who need derivatives for parallel or HPC workloads, across application areas spanning applied mathematics, scientific computing, computational engineering, and machine learning. The workshop will feature invited talks from both the framework developer and user communities, and solicits extended abstracts for contributed talks on topics including, but not limited to:

  • Automatic differentiation tool development,
  • Model, theory and method development for the differentiation of parallel computer programs,
  • Differentiable programming languages, as well as domain-specific languages or frameworks that support differentiation or gradient computation,
  • Case studies and experiences of computing derivatives of parallel or large-scale computations, or of scaling differentiable applications, and
  • Approaches closely related to differentiation, such as probabilistic programming, uncertainty quantification, or error estimation.

While we encourage novel submissions, we also welcome talks on previously published material. The workshop is intended as an informal venue for discussion between developers and users about ongoing or unfinished work, as well as about existing methods that are not yet widely adopted across the relevant communities. Accepted abstracts will be shared on the workshop website but will not otherwise constitute a formal publication.

Important Dates

  • Submission Deadline: December 20, 23:59 AoE
  • Author Notification: January 10, 23:59 AoE
  • Conference Date: March 1 or 2

Invited Speakers

Organizers
