All Publications

  • Spatiotemporal Clustering with Neyman-Scott Processes via Connections to Bayesian Nonparametric Mixture Models. Journal of the American Statistical Association. Wang, Y., Degleris, A., Williams, A., Linderman, S. W. 2024; 119 (547): 2382-2395

    Abstract

    Neyman-Scott processes (NSPs) are point process models that generate clusters of points in time or space. They are natural models for a wide range of phenomena, ranging from neural spike trains to document streams. The clustering property is achieved via a doubly stochastic formulation: first, a set of latent events is drawn from a Poisson process; then, each latent event generates a set of observed data points according to another Poisson process. This construction is similar to Bayesian nonparametric mixture models like the Dirichlet process mixture model (DPMM) in that the number of latent events (i.e. clusters) is a random variable, but the point process formulation makes the NSP especially well suited to modeling spatiotemporal data. While many specialized algorithms have been developed for DPMMs, comparatively fewer works have focused on inference in NSPs. Here, we present novel connections between NSPs and DPMMs, with the key link being a third class of Bayesian mixture models called mixture of finite mixture models (MFMMs). Leveraging this connection, we adapt the standard collapsed Gibbs sampling algorithm for DPMMs to enable scalable Bayesian inference on NSP models. We demonstrate the potential of Neyman-Scott processes on a variety of applications including sequence detection in neural spike trains and event detection in document streams.

    DOI: 10.1080/01621459.2023.2257896

    PubMedID: 39308788

    PubMedCentralID: PMC11412414
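
    The doubly stochastic construction described in the abstract above can be illustrated with a short simulation. The sketch below is a minimal, hypothetical example rather than code from the paper: it assumes a one-dimensional time interval, a homogeneous Poisson process for the latent events, and Gaussian jitter for the offspring points; the function name sample_nsp and all rate parameters are invented for illustration.

    ```python
    # Minimal sketch of the Neyman-Scott generative process on [0, T].
    # Kernel and rate choices are illustrative assumptions, not the paper's exact model.
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_nsp(T=100.0, latent_rate=0.05, mean_offspring=20.0, jitter=1.0):
        """Simulate a temporal Neyman-Scott process on [0, T]."""
        # Step 1: latent events drawn from a homogeneous Poisson process,
        # so the number of clusters is itself a random variable.
        n_latent = rng.poisson(latent_rate * T)
        latent_times = rng.uniform(0.0, T, size=n_latent)

        # Step 2: each latent event emits a Poisson number of observed points,
        # jittered around the latent time (Gaussian kernel assumed here;
        # boundary effects near 0 and T are ignored for simplicity).
        points, labels = [], []
        for k, t_k in enumerate(latent_times):
            n_obs = rng.poisson(mean_offspring)
            points.append(t_k + jitter * rng.standard_normal(n_obs))
            labels.append(np.full(n_obs, k))

        if points:
            points, labels = np.concatenate(points), np.concatenate(labels)
        else:
            points, labels = np.array([]), np.array([], dtype=int)
        order = np.argsort(points)
        return latent_times, points[order], labels[order]

    latent_times, points, labels = sample_nsp()
    print(f"{len(latent_times)} latent events generated {len(points)} observed points")
    ```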

  • Dynamic Locational Marginal Emissions via Implicit Differentiation. IEEE Transactions on Power Systems. Valenzuela, L., Degleris, A., El Gamal, A., Pavone, M., Rajagopal, R. 2024; 39 (1): 1138-1147
  • Point process models for sequence detection in high-dimensional neural spike trains. Advances in Neural Information Processing Systems. Williams, A. H., Degleris, A., Wang, Y., Linderman, S. W. 2020; 33: 14350-14361

    Abstract

    Sparse sequences of neural spikes are posited to underlie aspects of working memory [1], motor production [2], and learning [3, 4]. Discovering these sequences in an unsupervised manner is a longstanding problem in statistical neuroscience [5-7]. Promising recent work [4, 8] utilized a convolutive nonnegative matrix factorization model [9] to tackle this challenge. However, this model requires spike times to be discretized, utilizes a sub-optimal least-squares criterion, and does not provide uncertainty estimates for model predictions or estimated parameters. We address each of these shortcomings by developing a point process model that characterizes fine-scale sequences at the level of individual spikes and represents sequence occurrences as a small number of marked events in continuous time. This ultra-sparse representation of sequence events opens new possibilities for spike train modeling. For example, we introduce learnable time warping parameters to model sequences of varying duration, which have been experimentally observed in neural circuits [10]. We demonstrate these advantages on experimental recordings from songbird higher vocal center and rodent hippocampus.

    PubMedID: 35002191

    PubMedCentralID: PMC8734964
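
    The marked-event representation with learnable time warping described in the abstract above can be sketched generatively. The example below is a hypothetical illustration rather than the paper's model or code: each sequence occurrence is a triple of onset time, sequence type, and warp factor, and each neuron's type-specific firing offset is scaled by the warp; all offsets, amplitudes, and the Gaussian jitter are made-up parameters.

    ```python
    # Hypothetical sketch: sequence occurrences as marked events in continuous time,
    # each carrying a type and a time-warp factor that stretches the sequence.
    import numpy as np

    rng = np.random.default_rng(1)

    n_neurons, n_types = 30, 2
    # Invented per-neuron firing offsets (seconds) and expected spike counts per type.
    offsets = rng.uniform(0.0, 0.5, size=(n_types, n_neurons))
    amplitudes = rng.poisson(3.0, size=(n_types, n_neurons))

    # Each sequence occurrence is a marked event: (onset time, type, warp factor).
    sequence_events = [(5.0, 0, 1.0), (20.0, 1, 0.8), (42.0, 0, 1.3)]

    spike_times, spike_neurons = [], []
    for onset, typ, warp in sequence_events:
        for n in range(n_neurons):
            count = rng.poisson(amplitudes[typ, n])
            # Warping stretches or compresses each neuron's offset from the onset,
            # modeling sequences of varying duration.
            times = onset + warp * offsets[typ, n] + 0.01 * rng.standard_normal(count)
            spike_times.append(times)
            spike_neurons.append(np.full(count, n))

    spike_times = np.concatenate(spike_times)
    spike_neurons = np.concatenate(spike_neurons)
    print(f"{len(sequence_events)} sequence events produced {len(spike_times)} spikes")
    ```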

  • A Provably Correct and Robust Algorithm for Convolutive Nonnegative Matrix Factorization. IEEE Transactions on Signal Processing. Degleris, A., Gillis, N. 2020; 68: 2499-2512