Research
Core research themes and representative directions.
My research focuses on scientific machine learning (SciML) for modeling and inference in complex dynamical systems. I develop scalable methods for uncertainty quantification and data assimilation, robust time integration schemes for stiff and multiscale dynamics, and adaptive mesh refinement techniques for PDE simulation.
Scientific machine learning
Focus: SciML
Hybrid physics/ML methods for modeling, inference, and uncertainty quantification.
Uncertainty quantification & data assimilation
Focus: UQ/DA
Inverse problems, sensitivity analysis, and scalable data assimilation.
Time integration
Focus: Time stepping
Robust time-stepping for stiff and multiscale dynamics (IMEX, multirate, adjoints), with PETSc and DESolve implementations.
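The implicit-explicit (IMEX) idea named above can be sketched in a few lines. The following is an illustrative first-order IMEX Euler scheme on a made-up stiff test problem, not the PETSc TS or DESolve implementation:

```python
import numpy as np

def imex_euler(y0, t0, t1, h, lam, g):
    """First-order IMEX Euler for y' = lam*y + g(t).

    The stiff linear term lam*y is treated implicitly (backward Euler) and
    the non-stiff forcing g(t) explicitly (forward Euler), so each step is
    stable for large |lam| yet needs no nonlinear solve.
    """
    ts = np.arange(t0, t1 + h / 2, h)
    y = np.empty_like(ts)
    y[0] = y0
    for n in range(len(ts) - 1):
        # Solve (1 - h*lam) y_{n+1} = y_n + h*g(t_n)
        y[n + 1] = (y[n] + h * g(ts[n])) / (1.0 - h * lam)
    return ts, y

# Hypothetical stiff problem with exact solution y(t) = cos(t):
# y' = lam*(y - cos t) - sin t, with lam = -1000
lam = -1000.0
g = lambda t: -lam * np.cos(t) - np.sin(t)
ts, y = imex_euler(1.0, 0.0, 1.0, 1e-3, lam, g)
```

Splitting the right-hand side into implicitly and explicitly treated parts is the same design choice that IMEX Runge-Kutta and general linear methods generalize to higher order.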
PDE & AMR
Focus: PDE/AMR
High-fidelity PDE simulation with adaptive mesh refinement and scalable solvers.
Projects (selected)
Current project overviews and research-thrust pages.
Scientific machine learning
Hybrid ML-PDE methods for accelerated simulation
Learned weak-form and source-term corrections for finite element and DG solvers, designed for long-horizon accuracy at reduced computational cost.
UQ & data assimilation
Data assimilation and uncertainty-aware inference
Ensemble and variational data assimilation, physics-informed Gaussian processes, and uncertainty-aware forecasting for atmospheric chemistry and climate variability.
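As a minimal sketch of the ensemble approach (a textbook stochastic EnKF analysis step with an invented two-variable example, not this project's implementation):

```python
import numpy as np

def enkf_analysis(ensemble, y_obs, H, R, rng):
    """Stochastic ensemble Kalman filter analysis step.

    ensemble : (n_state, n_ens) forecast ensemble
    y_obs    : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    R        : (n_obs, n_obs) observation error covariance
    """
    n_state, n_ens = ensemble.shape
    x_mean = ensemble.mean(axis=1, keepdims=True)
    X = (ensemble - x_mean) / np.sqrt(n_ens - 1)      # scaled anomalies, P = X X^T
    HX = H @ X
    # Kalman gain K = P H^T (H P H^T + R)^{-1}, built from ensemble statistics
    K = X @ HX.T @ np.linalg.inv(HX @ HX.T + R)
    # Perturb observations so the analysis ensemble keeps the correct spread
    Y = y_obs[:, None] + rng.multivariate_normal(np.zeros(len(y_obs)), R, size=n_ens).T
    return ensemble + K @ (Y - H @ ensemble)

# Invented example: two-variable state, only the first component observed
rng = np.random.default_rng(0)
ens = rng.normal(0.0, 1.0, size=(2, 500))             # prior: mean 0, unit variance
H = np.array([[1.0, 0.0]])
R = np.array([[0.25]])
analysis = enkf_analysis(ens, np.array([1.0]), H, R, rng)
```

The analysis mean of the observed component moves toward the observation (here, toward 1.0) and the ensemble spread shrinks, which is the update behavior the variational and GP-based methods above refine in different ways.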
Time integration
Time integration for stiff and multiscale dynamics
SSP and general linear methods, IMEX and multirate schemes, and global error estimation for PDE discretizations.
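The multirate idea can be illustrated with a deliberately simple "frozen-slow" forward-Euler scheme; the partitioned Runge-Kutta methods developed in this thrust are higher order and treat the coupling more carefully. The test problem below is invented:

```python
import numpy as np

def multirate_step(ys, yf, H, m, f_slow, f_fast):
    """One slowest-first multirate forward-Euler macro-step.

    The slow variable takes a single step of size H; the fast variable
    takes m micro-steps of size H/m with the slow state frozen, so the
    fast dynamics never dictate the slow component's step size.
    """
    ys_new = ys + H * f_slow(ys, yf)
    h = H / m
    for _ in range(m):
        yf = yf + h * f_fast(ys, yf)
    return ys_new, yf

# Toy split system (uncoupled, for clarity): slow y' = -y, fast y' = -50 y
ys, yf = 1.0, 1.0
H, m = 0.02, 20                     # macro-step 0.02, micro-step 0.001
for _ in range(50):                 # integrate to t = 1
    ys, yf = multirate_step(ys, yf, H, m,
                            lambda ys, yf: -ys,
                            lambda ys, yf: -50.0 * yf)
```

The payoff is cost: the slow right-hand side is evaluated once per macro-step rather than once per micro-step, which matters when the slow component (e.g., an ocean model coupled to a fast atmosphere) dominates the cost per evaluation.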
Software
Open-source software contributions for scientific computing and machine learning workflows.
DESolve
Lead package
Time integration for stiff and multiscale systems.
PETSc TS
Core contributor
Scalable time stepping for ODEs and DAEs on HPC systems.
DAPack
Lead package
Data assimilation for UQ and inference.
UQGrid
Contributor
Power grid dynamics and UQ workflows.
Recent papers
Selected recent and featured publications.
- Junoh Jung and Emil M. Constantinescu. Learning differentiable weak-form corrections to accelerate finite element simulations. To appear, 2026. [arXiv] [PDF]
- Shinhoo Kang and Emil M. Constantinescu. Differentiable DG with neural operator source term correction. Submitted, 2025. [arXiv]
- Pi-Yueh Chuang, Ahmed Attia, and Emil M. Constantinescu. Distributional sensitivity analysis: Enabling differentiability in sample-based inference. Submitted, 2025. [arXiv]
- Haoyuan Chen, Emil M. Constantinescu, Vishwas Rao, and Cristiana Stan. Improving the predictability of the Madden-Julian oscillation at subseasonal scales with Gaussian process models. Journal of Advances in Modeling Earth Systems (JAMES), Vol. 17(5); Pages e2023MS004188, 2025. [DOI] [arXiv]
- Arkaprabha Ganguli, Nesar Ramachandra, Julie Bessac, and Emil M. Constantinescu. Enhancing interpretability in generative modeling: Statistically disentangled latent spaces guided by generative factors in scientific datasets. Machine Learning, Vol. 114(9); Pages 197, 2025. [DOI] [arXiv]
- Arkaprabha Ganguli, Anirban Samaddar, Florian Kéruzoré, Nesar Ramachandra, Julie Bessac, Sandeep Madireddy, and Emil M. Constantinescu. Uncovering physical drivers of dark matter halo structures with auxiliary-variable-guided generative models. Submitted, 2025. [arXiv]
- Johann Rudi, Max Heldman, Emil M. Constantinescu, Qi Tang, and Xian-Zhu Tang. Scalable implicit solvers with dynamic mesh adaptation for a relativistic drift-kinetic Fokker-Planck-Boltzmann model. Journal of Computational Physics, Vol. 507; Pages 112954, 2024. [DOI] [arXiv] [PDF]
- Hongli Zhao, Tyler E. Maltba, D. Adrian Maldonado, Emil M. Constantinescu, and Mihai Anitescu. Data-driven estimation of failure probabilities in correlated structure-preserving stochastic power system models. 2024. [arXiv]
- Shinhoo Kang, Alp Dener, Aidan Hamilton, Hong Zhang, Emil M. Constantinescu, and Robert Jacob. Multirate partitioned Runge-Kutta methods for coupled Navier-Stokes equations. Computers & Fluids, Vol. 264(15); Pages 105964, 2023. [DOI] [arXiv] [PDF]
- Shinhoo Kang and Emil M. Constantinescu. Learning subgrid-scale models with neural ordinary differential equations. Computers & Fluids, Vol. 261; Pages 105919, 2023. [DOI] [arXiv]
- Daniel Adrian Maldonado, Emil M. Constantinescu, Junbo Zhao, and Mihai Anitescu. Computationally efficient power system maximum transient linear growth estimation. Submitted, 2023. [arXiv] [PDF]
- Ahmed Attia, D. Adrian Maldonado, Emil M. Constantinescu, and Mihai Anitescu. Centralized calibration of power system dynamic models using variational data assimilation. 2023. [arXiv] [PDF]