Abstract

This dissertation develops a comprehensive framework for inference in time series, focusing on change-point detection and online learning. First, we introduce an $\ell^2$-based inference approach for multiple change-point detection in high-dimensional time series, which targets dense or spatially clustered signals via a novel Two-Way MOSUM (moving sum) test statistic. We derive the limiting distribution of this $\ell^2$-aggregated statistic by extending high-dimensional Gaussian approximation results to non-stationary spatial-temporal processes. Simulations show that our test performs well in detecting non-sparse weak signals, and an application to COVID-19 data demonstrates the real-world relevance of the method. Next, we revisit the stochastic gradient descent (SGD) algorithm from a nonlinear time series perspective. By bringing the functional dependence measure into the machine learning setting, we investigate the geometric-moment contraction property of SGD, which effectively addresses the challenges posed by the non-stationarity of the recursive iterates. Subsequently, we establish the asymptotic normality of averaged SGD and propose an online estimator of the long-run covariance matrix that appears in the limiting Gaussian distribution. Numerical experiments demonstrate that the resulting empirical confidence intervals attain asymptotically exact coverage probabilities.
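To convey the flavor of the moving-sum construction, the following is a minimal, hypothetical Python sketch of the temporal MOSUM step with $\ell^2$ aggregation across coordinates. The function name, window length, and simulated data are assumptions for illustration only; the full Two-Way statistic (which additionally aggregates over spatial neighborhoods) and the Gaussian-approximation-based calibration of its limiting distribution are not reproduced here.

import numpy as np

def l2_mosum_stats(X, b):
    # X: (n, p) array of observations; b: window length.
    # Returns the L2-aggregated moving-sum statistic at each admissible time t.
    n, p = X.shape
    stats = np.full(n, np.nan)
    for t in range(b, n - b):
        left = X[t - b:t].mean(axis=0)      # mean over the left window
        right = X[t:t + b].mean(axis=0)     # mean over the right window
        diff = np.sqrt(b / 2.0) * (right - left)
        stats[t] = np.sum(diff ** 2)        # L2 aggregation across the p coordinates
    return stats

# Toy example (assumed data): a mean shift in the first 20 of 100 coordinates at t = 250.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100))
X[250:, :20] += 0.5
stats = l2_mosum_stats(X, b=50)
t_hat = int(np.nanargmax(stats))            # crude single change-point estimate
print(t_hat)

A dense but individually weak shift of this kind is exactly the regime where an $\ell^2$ aggregation is expected to outperform max-type (sparse-signal) statistics.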

Committee Chair

Likai Chen

Committee Members

Todd Kuffner

Degree

Doctor of Philosophy (PhD)

Author's Department

Mathematics

Author's School

Graduate School of Arts and Sciences

Document Type

Dissertation

Date of Award

4-30-2024

Language

English (en)

Author's ORCID

0009-0003-0597-0970
