All Publications


  • Preintegration via Active Subspace. SIAM Journal on Numerical Analysis. Liu, S. N., Owen, A. 2023; 61 (2): 495-514

    DOI: 10.1137/22M1479129

    Web of Science ID: 000954803300004

  • Global and Individualized Community Detection in Inhomogeneous Multilayer Networks. Annals of Statistics. Chen, S., Liu, S., Ma, Z. 2022; 50 (5): 2664-2693

    DOI: 10.1214/22-AOS2202

    Web of Science ID: 000964342400009

  • Statistical Challenges in Tracking the Evolution of SARS-CoV-2. Statistical Science. Cappello, L., Kim, J., Liu, S., Palacios, J. A. 2022; 37 (2): 162-182

    DOI: 10.1214/22-STS853

    Web of Science ID: 000798149000003

  • How to Reduce Dimension with PCA and Random Projections? IEEE Transactions on Information Theory. Yang, F., Liu, S., Dobriban, E., Woodruff, D. P. 2021; 67 (12): 8154-8189

    Abstract

    In our "big data" age, the size and complexity of data are steadily increasing. Methods for dimension reduction are ever more popular and useful. Two distinct types of dimension reduction are "data-oblivious" methods such as random projections and sketching, and "data-aware" methods such as principal component analysis (PCA). Both have their strengths, such as speed for random projections, and data-adaptivity for PCA. In this work, we study how to combine them to get the best of both. We study "sketch and solve" methods that take a random projection (or sketch) first, and compute PCA after. We compute the performance of several popular sketching methods (random i.i.d. projections, random sampling, subsampled Hadamard transform, CountSketch, etc.) in a general "signal-plus-noise" (or spiked) data model. Compared to well-known works, our results (1) give asymptotically exact results, and (2) apply when the signal components are only slightly above the noise, but the projection dimension is non-negligible. We also study stronger signals allowing more general covariance structures. We find that (a) signal strength decreases under projection in a delicate way depending on the structure of the data and the sketching method, (b) orthogonal projections are slightly more accurate, (c) randomization does not hurt too much, due to concentration of measure, (d) CountSketch can be somewhat improved by a normalization method. Our results have implications for statistical learning and data analysis. We also illustrate that the results are highly accurate in simulations and in analyzing empirical data.

    DOI: 10.1109/tit.2021.3112821

    PubMed ID: 35695837

    PubMed Central ID: PMC9173709
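
    The "sketch and solve" idea described in the abstract can be illustrated with a minimal NumPy sketch. This is an illustrative toy, not the paper's setup: the dimensions, the signal strength `theta`, and the rank-one spiked model are all hypothetical, and only the i.i.d. Gaussian sketch (one of several sketching methods the paper analyzes) is shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative spiked ("signal-plus-noise") model: a rank-one signal
    # theta * u v^T plus i.i.d. standard Gaussian noise.
    n, p, k = 2000, 500, 200      # samples, features, sketch dimension (hypothetical)
    theta = 500.0                 # signal strength, chosen well above the noise level
    u = rng.standard_normal(n); u /= np.linalg.norm(u)
    v = rng.standard_normal(p); v /= np.linalg.norm(v)
    X = theta * np.outer(u, v) + rng.standard_normal((n, p))

    # "Sketch and solve": compress the n rows down to k with a random i.i.d.
    # Gaussian sketch, then run PCA (via SVD) on the much smaller k x p matrix.
    S = rng.standard_normal((k, n)) / np.sqrt(k)
    Xs = S @ X

    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    v_hat = Vt[0]                 # top right singular vector estimates v

    # Accuracy: squared cosine between the estimated and true signal directions.
    cos2 = float((v_hat @ v) ** 2)
    print(f"squared cosine: {cos2:.3f}")
    ```

    With a strong enough spike, the direction recovered from the sketched data stays close to the true signal direction; the paper quantifies exactly how much signal strength is lost under each sketching method.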

  • Quasi-Monte Carlo Quasi-Newton in Variational Bayes. Journal of Machine Learning Research. Liu, S., Owen, A. B. 2021; 22