Krishnanunni C G
POB 3SEo2B
201 E 24th St
Austin, Texas 78712
I am an incoming postdoctoral researcher at the Institute for Mathematics, Heidelberg University, where I will be working with Prof. Jakob Zech. I obtained my PhD in Aerospace Engineering from UT Austin in 2026 under the supervision of Prof. Tan Bui-Thanh, and was a member of the PHO-ICES group at the Oden Institute for Computational Engineering and Sciences from 2021 to 2026. Previously, I received a bachelor's degree in Civil Engineering from the National Institute of Technology, Calicut, in 2017 and a master's degree in Structural Engineering from the Indian Institute of Technology, Madras, in 2020.
I come from a mechanics background, where I undertook projects on signal processing, optimization, and mathematical modelling in solid mechanics with Prof. B. N. Rao at IIT Madras and Prof. Mohammed Ameen at NIT Calicut. I was fortunate to be awarded a research fellowship to work with Prof. Phoolan Prasad at the Indian Institute of Science, Bangalore, who exposed me to the rigorous mathematics behind nonlinear hyperbolic waves and, in particular, taught me to appreciate the beauty of mathematics in mechanics. My broad interest lies in anything creative and mathematically beautiful.
Current Research Focus
Machine learning methods for science and engineering have gained significant popularity in recent years under the banner of Scientific Machine Learning. My research takes the opposite direction: I draw inspiration from techniques in mechanics and numerical analysis to develop novel machine learning algorithms. Specifically, my PhD thesis addresses the problem of neural network architecture adaptation by leveraging concepts from topology optimization and adaptive mesh refinement strategies in the finite element method. This yields a principled and efficient alternative to conventional neural architecture search (NAS) approaches (more details available here).
Currently, I am interested in generative modeling for prior calibration in Bayesian inverse problems, as well as in developing derivative-free methods for black-box optimization.
news
| Date | News |
|---|---|
| Apr 25, 2026 | Our new work, "From an Elementary Proof of Error Representation for Hermite Quadrature to a Rediscovery of Legendre Polynomials and Rodrigues Formula," is available on arXiv. |
| Apr 1, 2026 | I successfully defended my PhD thesis, "Computational Mathematics Approaches to Architecture Design of Deep Neural Networks." My committee members were Dr. Tan Bui-Thanh, Dr. Thomas J. R. Hughes, Dr. Clint Dawson, and Dr. Qiang Liu. |
| Jan 1, 2026 | Our new paper, "A New Look at the Ensemble Kalman Filter for Inverse Problems: Duality, Non-Asymptotic Analysis and Convergence Acceleration," is available on arXiv. |
| Jun 25, 2025 | Our new paper, "Lilan: A linear latent network approach for real-time solutions of stiff, nonlinear, ordinary differential equations," is available on arXiv. |
| Mar 15, 2025 | Our new paper, "An adaptive and stability-promoting layerwise training approach for sparse deep neural network architecture," has been accepted for publication in Computer Methods in Applied Mechanics and Engineering. |