Fall 2025 Machine Learning Reading Seminar

We will meet 2:00-3:00pm on Fridays in Derrick 120, alternating weeks with NMDSE.

Schedule (Date: Speaker, Institution; talk title and abstract follow where available)

9/5:
9/12: NMDSE
9/19:
9/26: NMDSE
10/3: Aniruddha Bora (Texas State University)
Talk Title: Functional Richness Over Parameter Count: Neural Operators for Correction & Inversion
Abstract: Most real-world phenomena are governed by high-dimensional, nonlinear dynamical systems. This talk addresses two complementary tasks: (i) the forward problem (improving predictions when simulators exhibit structural bias or under-resolution) and (ii) the inverse problem (given temperature response fields, inferring the imposed spatial forcing). For both problems we have developed a novel framework whose core design principle is mathematical diversity, which boosts accuracy at a fixed parameter budget. For the forward task, the talk presents an online bias-correction operator that couples to the Energy Exascale Earth System Model (E3SM) under strict constraints (no temporal history). In this work we develop two novel UNet-based discrete neural operators: (i) the Inception-UNet (IUNet) and (ii) the Multi-scale & Multi-feature (M&M) architecture, in which each scale hosts diverse operators combined via a learnable mixer. Under parameter-parity comparisons, M&M consistently outperforms homogeneous baselines, indicating that the gain comes from functional richness. Despite using only two years of training data, these operators deliver substantial offline improvements and are deployed online for hybrid correction while preserving the native dynamics. For the inverse task, we use the same M&M model to learn the mapping from a given global temperature response to the spatial forcing that produced it. Across settings, M&M's mathematical diversity improves identifiability and reconstruction quality without increasing the parameter count. We quantify uncertainty in the inverse problem via SWAG for epistemic model uncertainty and by evaluating ensembles of responses for data-driven variability, yielding calibrated confidence estimates. Overall, the talk presents mathematical diversity as a unifying recipe for neural operators.
(A minimal illustrative sketch of the mixed-operator idea appears after the schedule.)
10/10: NMDSE
10/17: Xiaoxi Shen (Texas State University)
10/24: NMDSE
10/31: Xiaoxi Shen (Texas State University)
11/7: NMDSE
11/14:
11/21: NMDSE
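
A note on the 10/3 abstract: it describes blocks in which each scale hosts functionally diverse operators combined via a learnable mixer. The sketch below is a minimal, hypothetical rendering of that single idea in PyTorch; the branch choices (a plain convolution, a dilated convolution, and a truncated Fourier mixing layer) and the softmax-normalized mixing weights are illustrative assumptions, not the speaker's IUNet or M&M implementation.

```python
# Hypothetical sketch only: "several functionally different operators per scale,
# blended by learnable mixing weights." Not the speaker's architecture.
import torch
import torch.nn as nn


class SpectralMix(nn.Module):
    """Toy Fourier-space channel mixing on a truncated set of modes."""

    def __init__(self, channels, modes=8):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, modes, modes, dtype=torch.cfloat)
        )

    def forward(self, x):
        b, c, h, w = x.shape
        x_ft = torch.fft.rfft2(x)                     # (b, c, h, w//2 + 1), complex
        out_ft = torch.zeros_like(x_ft)
        m = min(self.modes, x_ft.shape[-2], x_ft.shape[-1])
        out_ft[:, :, :m, :m] = torch.einsum(
            "bixy,ioxy->boxy", x_ft[:, :, :m, :m], self.weight[:, :, :m, :m]
        )
        return torch.fft.irfft2(out_ft, s=(h, w))


class MixedOperatorBlock(nn.Module):
    """One scale: diverse operators combined via learnable mixing weights."""

    def __init__(self, channels):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),              # local stencil
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2),  # wider receptive field
            SpectralMix(channels),                                    # global / spectral action
        ])
        self.mix_logits = nn.Parameter(torch.zeros(len(self.branches)))

    def forward(self, x):
        w = torch.softmax(self.mix_logits, dim=0)
        return sum(wi * branch(x) for wi, branch in zip(w, self.branches))


if __name__ == "__main__":
    block = MixedOperatorBlock(channels=4)
    x = torch.randn(2, 4, 32, 32)
    print(block(x).shape)  # torch.Size([2, 4, 32, 32])
```

The learnable mixer lets the block decide how much to rely on each operator family at a given scale, which is one simple way to pursue functional richness at a fixed parameter budget.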