
All Times EDT

Abstract Details

Activity Number: 234 - New Challenges in Statistical Learning and Inference for Complex Data
Type: Topic Contributed
Date/Time: Tuesday, August 9, 2022, 8:30 AM to 10:20 AM
Sponsor: Section for Statistical Programmers and Analysts
Abstract #320827
Title: Calibrating Multi-Dimensional Complex ODE from Noisy Data via Deep Neural Networks
Author(s): Kexuan Li* and Fangfang Wang and Ruiqi Liu and Fan Yang and Zuofeng Shang
Companies: Worcester Polytechnic Institute and Worcester Polytechnic Institute and Texas Tech University and Eli Lilly and Company and New Jersey Institute of Technology
Keywords: Sparsely connected neural networks; Nonlinear ODE in multi-dimensions; ReLU activation function; Deep Learning; Curse of Dimensionality; Stochastic Gradient Descent
Abstract:

Ordinary differential equations (ODEs) are widely used to model complex dynamics arising in biology, chemistry, engineering, finance, physics, and other fields. Calibrating a complicated ODE system from noisy data is generally very difficult. In this work, we propose a two-stage nonparametric approach to this problem. We first extract the de-noised data and their higher-order derivatives using a boundary kernel method, and then feed them into a sparsely connected deep neural network with the ReLU activation function. Our method recovers the ODE system without being subject to the curse of dimensionality or a complicated ODE structure. When the ODE possesses a general modular structure, with each modular component involving only a few input variables, and the network architecture is properly chosen, our method is proven to be consistent. The theoretical properties are corroborated by an extensive simulation study that demonstrates the validity and effectiveness of the proposed method. Finally, we apply our method to simultaneously characterize the growth rate of COVID-19 infection cases across the 50 states of the USA.
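
The two-stage idea above can be sketched in a few lines of code. The following is a hypothetical minimal illustration, not the authors' implementation: the logistic test ODE, the `local_linear` smoother, and the network sizes are all assumptions, Stage 1 uses a plain local-linear kernel estimator (trimming the boundary rather than applying a boundary kernel), and Stage 2 fits a small dense ReLU network by full-batch gradient descent rather than a sparsely connected one trained by SGD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy observations of the logistic ODE  x' = x(1 - x),  x(0) = 0.1,
# whose closed-form solution is x(t) = 1 / (1 + 9 e^{-t}).
t = np.linspace(0.0, 5.0, 200)
x_true = 1.0 / (1.0 + 9.0 * np.exp(-t))
y = x_true + rng.normal(scale=0.02, size=t.size)

# Stage 1: local-linear kernel smoothing. At each point t0, a weighted
# degree-1 polynomial fit gives the intercept (estimate of x(t0)) and
# the slope (estimate of x'(t0)).
def local_linear(t_obs, y_obs, t0, h=0.25):
    w = np.exp(-0.5 * ((t_obs - t0) / h) ** 2)      # Gaussian kernel weights
    A = np.stack([np.ones_like(t_obs), t_obs - t0], axis=1)
    beta = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y_obs))
    return beta[0], beta[1]                          # (x_hat, dx_hat)

grid = np.linspace(0.3, 4.7, 80)                     # trim the boundary region
est = np.array([local_linear(t, y, t0) for t0 in grid])
x_hat, dx_hat = est[:, 0], est[:, 1]

# Stage 2: one-hidden-layer ReLU network trained so that f(x_hat) ~= dx_hat,
# i.e. the network learns the vector field f in x' = f(x).
W1 = rng.normal(scale=0.5, size=(16, 1)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(1, 16)); b2 = np.zeros(1)
X, Y = x_hat[:, None], dx_hat[:, None]
lr = 0.1
for _ in range(30000):
    H = np.maximum(X @ W1.T + b1, 0.0)               # ReLU hidden layer
    pred = H @ W2.T + b2
    err = pred - Y
    gW2 = err.T @ H / len(X); gb2 = err.mean(0)      # backprop, output layer
    dH = (err @ W2) * (H > 0)                        # backprop through ReLU
    gW1 = dH.T @ X / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Evaluate the learned field where the true one is known: f(0.5) = 0.25.
x_test = np.array([[0.5]])
f_hat = np.maximum(x_test @ W1.T + b1, 0.0) @ W2.T + b2
print("estimated f(0.5):", float(f_hat[0, 0]))
```

In this 1-D toy problem the smoother alone would suffice; the point of the neural-network stage in the paper is the multi-dimensional case, where a modular vector field depending on only a few inputs per component lets a sparse ReLU network escape the curse of dimensionality.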


Authors who are presenting talks have a * after their name.

Back to the full JSM 2022 program