Tao Wang
Economics, University of Victoria
Title: Distributed Mode Learning
Date: Friday, April 11th, 2025
Time: 1:30PM (PDT)
Location: ASB 10900
Abstract:
We introduce a novel regression methodology based on parametric kernel-based mode estimation, designed to handle datasets that exhibit heavy-tailed distributions or contain significant outliers. To tackle the computational burden of large-scale data, our approach incorporates distributed statistical learning, markedly reducing memory demands while naturally accommodating dataset heterogeneity across distributed environments. By reformulating the local kernel-based objective into an approximate least squares framework, the method retains only compact summary statistics from each local computation unit, and these summaries yield accurate global estimation with negligible asymptotic loss. Furthermore, we examine shrinkage estimation using a local quadratic approximation scheme and show that, under an adaptive LASSO framework, the estimator achieves the oracle property. Simulation studies and applications to real-world data underscore the superior finite-sample performance and robustness of the proposed method.
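To illustrate the flavor of the approach (not the speaker's actual implementation), the sketch below assumes a Gaussian kernel with a fixed bandwidth h: each iteration turns the local kernel-based mode objective into a weighted least squares problem, so every machine only ships the p-by-p and p-by-1 summaries (X'WX, X'Wy) to the center, which aggregates them and solves for the global update. All function names and the toy data split are hypothetical.

```python
import numpy as np

def local_summaries(X, y, beta, h):
    """One machine: Gaussian-kernel weights at the current beta, then the
    compact least-squares summaries (X'WX, X'Wy) sent to the center."""
    r = y - X @ beta
    w = np.exp(-0.5 * (r / h) ** 2)          # kernel weights on residuals
    XtWX = X.T @ (w[:, None] * X)
    XtWy = X.T @ (w * y)
    return XtWX, XtWy

def distributed_modal_regression(chunks, h=1.0, n_iter=100, tol=1e-8):
    """Hypothetical sketch: iterate global weighted least squares updates,
    aggregating only the per-machine summary statistics each round."""
    p = chunks[0][0].shape[1]
    beta = np.zeros(p)
    for _ in range(n_iter):
        A, b = np.zeros((p, p)), np.zeros(p)
        for X_k, y_k in chunks:              # in practice, run in parallel
            XtWX, XtWy = local_summaries(X_k, y_k, beta, h)
            A += XtWX
            b += XtWy
        beta_new = np.linalg.solve(A, b)     # global approximate LS update
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta

# Toy usage: heavy-tailed noise, data split across 4 "machines"
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(4000), rng.normal(size=(4000, 2))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.standard_t(df=1.5, size=4000)
chunks = [(X[i::4], y[i::4]) for i in range(4)]
print(distributed_modal_regression(chunks, h=1.0))
```

Under these assumptions, the kernel weights down-weight observations far from the current fit, which is what gives the mode-based estimate its robustness to outliers relative to ordinary least squares.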