CSCDR seminar by Yajie Ji on "Transformed Generative Pre-Trained Physics-Informed Neural Networks"
Abstract:
This talk presents the Transformed Generative Pre-Trained Physics-Informed Neural Network (TGPT-PINN), a novel architecture for nonlinear model order reduction (MOR) of transport-dominated partial differential equations (PDEs) within an MOR-integrated PINN framework. Building on the GPT-PINN's network-of-networks design for snapshot-based model reduction, TGPT-PINN handles parameter-dependent discontinuities through two key innovations: (1) a shock-capturing loss component and (2) a parameter-adaptive transform layer, which together overcome the limitations of linear MOR in transport-dominated regimes. The method's efficacy is demonstrated on multiple non-trivial parametric PDE test cases. We further present the Entropy-enhanced TGPT-PINN (EGPT-PINN), which accurately captures viscosity solutions of the Burgers' and Euler equations with a minimal number of neurons, without relying on a priori knowledge of the governing equations or their initial conditions. By augmenting the loss function with a model-data mismatch term, EGPT-PINN is markedly more robust for inverse problems than both vanilla PINNs and existing entropy-enhanced variants. The talk will highlight these architectures' theoretical foundations, implementation strategies, and benchmark results across forward and inverse problems.
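
For readers unfamiliar with the loss augmentation mentioned above, the following is a minimal, self-contained sketch of a generic PINN whose training loss combines a PDE-residual term with a model-data mismatch term, in the spirit of the inverse-problem setup described in the abstract. It is not the speaker's TGPT-PINN/EGPT-PINN implementation: the network size, the viscous Burgers' residual, the synthetic observations, and the weight lambda_data are all illustrative assumptions.

import torch

# Illustrative sketch only: a PINN loss = PDE residual + data mismatch.
# All names and values below are placeholder assumptions.

torch.manual_seed(0)

# Small fully connected surrogate u_theta(x, t) -> u
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

# Unknown viscosity treated as a trainable parameter (the inverse problem)
log_nu = torch.nn.Parameter(torch.tensor(-3.0))

def burgers_residual(xt):
    # Residual of viscous Burgers' equation: u_t + u*u_x - nu*u_xx
    xt = xt.clone().requires_grad_(True)
    u = net(xt)
    du = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = du[:, :1], du[:, 1:]
    u_xx = torch.autograd.grad(u_x, xt, torch.ones_like(u_x), create_graph=True)[0][:, :1]
    return u_t + u * u_x - torch.exp(log_nu) * u_xx

xt_col = torch.rand(256, 2)                   # interior collocation points (x, t)
xt_obs = torch.rand(32, 2)                    # observation locations
u_obs = torch.sin(torch.pi * xt_obs[:, :1])   # stand-in "measured" data
lambda_data = 10.0                            # weight of the data-mismatch term

opt = torch.optim.Adam(list(net.parameters()) + [log_nu], lr=1e-3)
for step in range(200):
    opt.zero_grad()
    loss_pde = burgers_residual(xt_col).pow(2).mean()    # physics residual
    loss_data = (net(xt_obs) - u_obs).pow(2).mean()      # model-data mismatch
    (loss_pde + lambda_data * loss_data).backward()
    opt.step()

The data-mismatch term is what anchors the unknown parameter (here the viscosity) to observations; in the talk's setting this idea is combined with the pre-trained, transform-equipped reduced network rather than a single full PINN.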
Location: TXT 105