Below are my current research directions and contributions to funded projects.
DOE Genesis Mission
Within the DOE Genesis Mission, this research develops agentic continuous-learning frameworks for Self-Improving Foundation Models, enabling robust, automated model updating in safety-critical scientific applications, with deployment carried out alongside model teams spanning multiple national laboratories.
ORNL Early Career Award
This research develops LLM-based agentic methods for automating high-stakes scientific simulation workflows, coordinating portfolios of solvers, surrogate models, and uncertainty quantification tools with minimal human intervention. By reducing expert wall-time in design-space exploration, this work accelerates innovation cycles across national energy and security applications.
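As a rough illustration of the orchestration idea, the sketch below shows a toy routing loop that sends each design point either to a cheap surrogate or to a high-fidelity solver, depending on the surrogate's uncertainty estimate. All function names, thresholds, and data structures are hypothetical placeholders and do not reflect the project's actual implementation.

```python
# Illustrative sketch only: route each design point to a surrogate or a full
# solver run based on an uncertainty-quantification (UQ) criterion. The names
# and the threshold are assumptions for exposition, not the project's code.
from dataclasses import dataclass
from typing import Callable, Sequence


@dataclass
class EvalResult:
    value: float
    source: str  # "surrogate" or "solver"


def evaluate_designs(
    designs: Sequence[dict],
    surrogate: Callable[[dict], tuple[float, float]],  # returns (mean, std)
    solver: Callable[[dict], float],                    # expensive high-fidelity run
    uq_threshold: float = 0.05,
) -> list[EvalResult]:
    """Choose surrogate or solver for each design point based on predicted uncertainty."""
    results = []
    for d in designs:
        mean, std = surrogate(d)
        if std <= uq_threshold * max(abs(mean), 1e-12):
            # Surrogate is confident enough: accept it and save solver and expert wall-time.
            results.append(EvalResult(mean, "surrogate"))
        else:
            # Uncertainty too high: fall back to the trusted high-fidelity solver.
            results.append(EvalResult(solver(d), "solver"))
    return results
```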
Householder Fellowship
This research develops scalable low-rank methods for compute- and energy-efficient deep learning at scale, focusing on adaptive rank compression during training and fine-tuning. By dynamically allocating rank where it matters most, these approaches reduce memory, compute, and communication costs while preserving accuracy and robustness, enabling efficient training and deployment of large neural networks on modern accelerator hardware.
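The following minimal sketch illustrates the underlying idea of rank-adaptive factorized layers in a PyTorch-style form; the class, the truncation rule, and all hyperparameters are placeholders for exposition, not the exact algorithms developed in this work.

```python
# Minimal sketch of a rank-adaptive factorized linear layer (illustrative only;
# the truncation rule and hyperparameters are assumptions, not the published method).
import torch
import torch.nn as nn


class LowRankLinear(nn.Module):
    """Linear layer stored as W ~ U @ S @ V^T with a truncatable rank."""

    def __init__(self, in_features, out_features, rank):
        super().__init__()
        self.U = nn.Parameter(torch.randn(out_features, rank) / rank**0.5)
        self.S = nn.Parameter(torch.eye(rank))
        self.V = nn.Parameter(torch.randn(in_features, rank) / rank**0.5)

    def forward(self, x):
        # Project onto the low-rank coordinates; memory and compute scale with the rank.
        return (x @ self.V) @ self.S.T @ self.U.T

    @torch.no_grad()
    def truncate(self, tol=1e-2):
        # Drop singular directions whose relative magnitude falls below `tol`,
        # keeping rank only where it matters for accuracy.
        u, s, vT = torch.linalg.svd(self.S)
        r = max(1, int((s > tol * s.sum()).sum()))
        self.U = nn.Parameter(self.U @ u[:, :r])
        self.S = nn.Parameter(torch.diag(s[:r]))
        self.V = nn.Parameter(self.V @ vT.T[:, :r])
```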
CHaRMNET Research Center
This research develops scalable, efficient, and trustworthy AI-based surrogate models for high-dimensional partial differential equations, with a focus on neutron and radiation transport applications within the CHaRMNET initiative. CHaRMNET is a DOE Mathematical Multifaceted Integrated Capability Center (MMICC) led by Michigan State University and Los Alamos National Laboratory, bringing together researchers from seven universities and four DOE national laboratories to address the curse of dimensionality in plasma and transport modeling.
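As a toy illustration of the surrogate-modeling idea, the sketch below trains a small network to replace an expensive solver evaluation with a cheap approximation; the architecture, data, and sizes are placeholders, not the models developed within CHaRMNET.

```python
# Illustrative sketch only: a small neural surrogate mapping transport/PDE
# parameters to a scalar quantity of interest. Data and architecture are toys.
import torch
import torch.nn as nn

# Stand-in data: parameters (e.g. cross sections, source strengths) and a
# synthetic quantity of interest that would normally require a kinetic solve.
params = torch.rand(1024, 8)
qoi = params.pow(2).sum(dim=1, keepdim=True)

surrogate = nn.Sequential(
    nn.Linear(8, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(surrogate(params), qoi)
    loss.backward()
    opt.step()

# Once trained, the surrogate replaces repeated solver calls in outer-loop tasks
# such as uncertainty quantification or design optimization.
```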
Most of my publications have an associated GitHub repository with code to reproduce the results. Below are some highlighted software projects.
Selected Appointments & Experience
Householder Fellow (Research Scientist) — Oak Ridge National Laboratory
Lead independent research on scalable low-rank methods for scientific simulation and efficient neural network training, with work spanning mathematical design, performance-critical HPC implementations, and integration into production scientific workflows.
Research Software Engineer — Math2Market GmbH
Worked on performance-critical numerical software for industrial modeling applications, spanning algorithmic development, high-performance C++/GPU implementations, and integration of learning-based components into simulation pipelines.
Bootstrapped Industry Platform (Technical Lead)
Sole technical owner of a privately deployed B2B data analytics system, designed, implemented, and operated end-to-end in production. The platform was engineered to handle noisy real-world inputs while meeting strict requirements on reliability, auditability, and deterministic behavior. These constraints strongly shaped my approach to building efficient, robust, and trustworthy computational systems.
Awards & Fellowships
Fellowships & PI Awards
• Alston S. Householder Fellowship, Oak Ridge National Laboratory, 2023
Distinguished staff fellowship in applied mathematics and scientific computing at ORNL funded by the US DOE Advanced Scientific Computing Research (ASCR) program.
• ORNL Early Career Award, Oak Ridge National Laboratory, 2024
Principal Investigator for a two-year project on agentic AI for scientific simulation workflows.
• DFG Priority Programme Fellow, SPP 2298 “Theoretical Foundations of Deep Learning”, 2021–2024
Paper Awards
• Oral Presentation, NeurIPS 2025 — Oral acceptance rate: 0.36%
• Distinguished Paper Award, ORNL Computational Sciences and Mathematics Division, 2025
• Best Student Paper Award, AIAA Aviation Forum 2020
Selected Presentations
• NeurIPS 2025 Oral: Dynamical Low-Rank Compression of Neural Networks with Robustness under Adversarial Attacks
• ASCR Workshop 2025: Uncertainty-aware inverse modeling for federated scientific discovery across DOE facilities
• ICML 2022 Spotlight: Structure Preserving Neural Networks: A Case Study in the Entropy Closure of the Boltzmann Equation
• AIAA AVIATION 2020 Best Student Paper: Windowing Regularization Techniques for Unsteady Aerodynamic Shape Optimization
Mentorship
• Chinmay Patwardhan, visiting researcher at ORNL [2026], Ph.D. student at KIT
• Thomas Snyder, summer intern at ORNL [2024 & 2025], undergrad at Yale
• Hanna Park, summer intern at ORNL [2025], Ph.D. student at Mississippi State
• Jakob Maisch, Bachelor Thesis at KIT [2023], M.Sc. student at KIT
• Tim Stoll, Bachelor Thesis at KIT [2022]
• Martin Sadric, Bachelor Thesis at KIT [2022]
• Gabriel Moser, Bachelor Thesis at KIT [2021]
• Florian Buk, Master Thesis at KIT [2020]
Service
Organization
• International Workshop on Moment Methods in Kinetic Theory IV
• Minisymposium "Advances in High Dimensional PDE Methods using Sparse Grids and Low-Rank" at SIAM CSE 2025
• Minisymposium "Federated Learning and Data Mining in Distributed Environments" at SIAM SEAS 2025
Reviewing
• Program committees and peer review for NeurIPS (top reviewer), ICML, ICLR, and AAAI. Journal reviewing includes Journal of Computational Physics, SIAM MMS, Journal of Scientific Computing, and Journal of Optimization Theory and Applications.
