Stanford Advisors


All Publications


  • Transfer of learning: Analysis of dose-response functions from a large-scale, online, cognitive training dataset. PLoS ONE. Osman, A. M., Jaffe, P. I., Ng, N. F., Kerlan, K. R., Schafer, R. J. 2023; 18 (5): e0281095

    Abstract

    Fundamental to the efficacy of cognitive training (CT) is its dose. Here we used the power and breadth afforded by a large dataset to measure precisely dose-response (D-R) functions for CT and to examine the generality of their magnitude and form. The present observational study involved 107,000 users of Lumosity, a commercial program comprising computer games designed to provide CT over the internet. In addition to training with Lumosity games, these users took an online battery of cognitive assessments (NeuroCognitive Performance Test, NCPT) on two or more occasions separated by at least 10 weeks. Changes in performance on the NCPT between the first and second assessments were examined as a function of the amount of intervening gameplay. The resulting D-R functions were obtained both for overall performance on the NCPT and performance on its eight subtests. Also examined were differences between D-R functions from demographic groups defined by age, gender, and education. Monotonically increasing D-R functions, well fit by an exponential approach to an asymptote, were found consistently for overall performance on the NCPT, performance on seven of the eight subtests, and at each level of age, education, and gender. By examining how individual parameters of the D-R functions varied across subtests and groups, it was possible to measure separately changes in the effects on NCPT performance of 1) transfer from CT and 2) direct practice due to repeated testing. The impact of both transfer and direct practice varied across subtests. In contrast, while the effects of direct practice diminished with age, those of transfer remained constant. Besides its implications for CT by older adults, this latter finding suggests that direct practice and transfer do not involve identical learning processes, with transfer being limited to learning processes that remain constant across the adult lifespan.

    DOI: 10.1371/journal.pone.0281095

    PubMedID: 37195927
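The dose-response functions above are described as "an exponential approach to an asymptote." A minimal sketch of what fitting such a curve looks like, using a generic form y = baseline + asymptote * (1 - exp(-rate * dose)) on synthetic data (the functional form is standard, but the parameter values and units here are illustrative assumptions, not the paper's estimates):

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_approach(dose, asymptote, rate, baseline):
    """Exponential approach to an asymptote: rises from baseline toward baseline + asymptote."""
    return baseline + asymptote * (1.0 - np.exp(-rate * dose))

# Synthetic dose-response data (hypothetical gameplay amounts vs. score change)
rng = np.random.default_rng(0)
dose = np.linspace(0, 500, 40)
truth = exp_approach(dose, asymptote=12.0, rate=0.01, baseline=1.0)
scores = truth + rng.normal(0, 0.5, size=dose.shape)

# Fit the three parameters; p0 gives the optimizer a reasonable starting point
params, _ = curve_fit(exp_approach, dose, scores, p0=(10.0, 0.005, 0.0))
asymptote, rate, baseline = params
```

Separating the fitted asymptote (eventual gain) from the rate (how quickly the gain accrues) is what lets one compare, say, transfer effects across subtests or age groups, as the abstract describes.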

  • Modelling human behaviour in cognitive tasks with latent dynamical systems. Nature Human Behaviour. Jaffe, P. I., Poldrack, R. A., Schafer, R. J., Bissett, P. G. 2023

    Abstract

    Response time data collected from cognitive tasks are a cornerstone of psychology and neuroscience research, yet existing models of these data either make strong assumptions about the data-generating process or are limited to modelling single trials. We introduce task-DyVA, a deep learning framework in which expressive dynamical systems are trained to reproduce sequences of response times observed in data from individual human subjects. Models fitted to a large task-switching dataset captured subject-specific behavioural differences with high temporal precision, including task-switching costs. Through perturbation experiments and analyses of the models' latent dynamics, we find support for a rational account of switch costs in terms of a stability-flexibility trade-off. Thus, our framework can be used to discover interpretable cognitive theories that explain how the brain dynamically gives rise to behaviour.

    DOI: 10.1038/s41562-022-01510-8

    PubMedID: 36658212
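Task-DyVA itself is a deep generative model, but the core idea — a latent state evolving over time, with behaviour (e.g., response times) read out from it — can be illustrated with a toy linear dynamical system. Everything below (matrices, noise levels, the 0.5 s baseline) is a hypothetical illustration, not the model from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy latent dynamical system: 2-D latent state with stable, slowly decaying dynamics.
A = np.array([[0.95, 0.05],
              [-0.05, 0.95]])  # state-transition matrix (spectral radius < 1)
C = np.array([0.3, 0.1])       # readout weights mapping latent state to a scalar "RT"

def simulate(n_trials=200, state_noise=0.05, obs_noise=0.02):
    """Generate a sequence of response-time-like observations from latent dynamics."""
    x = np.zeros(2)
    rts = np.empty(n_trials)
    for t in range(n_trials):
        x = A @ x + rng.normal(0, state_noise, size=2)   # latent state update
        rts[t] = 0.5 + C @ x + rng.normal(0, obs_noise)  # baseline RT plus readout
    return rts

rts = simulate()
```

Because successive latent states are correlated, the generated RTs carry trial-to-trial structure — the kind of temporal dependence that single-trial models discard and that the paper's framework is built to capture.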

  • A massive dataset of the NeuroCognitive Performance Test, a web-based cognitive assessment. Scientific Data. Jaffe, P. I., Kaluszka, A., Ng, N. F., Schafer, R. J. 2022; 9 (1): 758

    Abstract

    We present a dataset of approximately 5.5 million subtest scores from over 750,000 adults who completed the NeuroCognitive Performance Test (NCPT; Lumos Labs, Inc.), a validated, self-administered cognitive test accessed via web browser. The dataset includes assessment scores from eight test batteries consisting of 5-11 subtests that collectively span several cognitive domains including working memory, visual attention, and abstract reasoning. In addition to the raw scores and normative data from each subtest, the dataset includes basic demographic information from each participant (age, gender, and educational background). The scale and diversity of the dataset provides an unprecedented opportunity for researchers to investigate population-level variability in cognitive abilities and their relation to demographic factors. To facilitate reuse of this dataset by other researchers, we provide a Python module that supports several common preprocessing steps.

    DOI: 10.1038/s41597-022-01872-8

    PubMedID: 36481748
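The abstract mentions a Python module supporting common preprocessing steps. One such step for normative data is z-scoring subtest performance within demographic groups; the sketch below shows the idea in plain pandas on a toy frame (the column names and values are assumptions for illustration, not the actual schema of the published dataset or the API of its companion module):

```python
import pandas as pd

# Hypothetical toy frame mimicking the dataset's long format.
df = pd.DataFrame({
    "user_id":   [1, 1, 2, 2, 3, 3, 4, 4],
    "subtest":   ["digit_span", "trail_making"] * 4,
    "raw_score": [7.0, 41.0, 9.0, 35.0, 6.0, 50.0, 8.0, 38.0],
    "age_band":  ["18-29", "18-29", "30-44", "30-44",
                  "18-29", "18-29", "30-44", "30-44"],
})

# Z-score each subtest within an age band, a common normative preprocessing step.
grouped = df.groupby(["subtest", "age_band"])["raw_score"]
df["z_score"] = (df["raw_score"] - grouped.transform("mean")) / grouped.transform("std")
```

Normalizing within demographic strata like this is what makes population-level comparisons of cognitive abilities across age, gender, and education meaningful.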

  • Shared mechanisms of auditory and non-auditory vocal learning in the songbird brain. eLife. McGregor, J. N., Grassler, A. L., Jaffe, P., Jacob, A., Brainard, M. S., Sober, S. J. 2022; 11

    Abstract

    Songbirds and humans share the ability to adaptively modify their vocalizations based on sensory feedback. Prior studies have focused primarily on the role that auditory feedback plays in shaping vocal output throughout life. In contrast, it is unclear how non-auditory information drives vocal plasticity. Here, we first used a reinforcement learning paradigm to establish that somatosensory feedback (cutaneous electrical stimulation) can drive vocal learning in adult songbirds. We then assessed the role of a songbird basal ganglia thalamocortical pathway critical to auditory vocal learning in this novel form of vocal plasticity. We found that both this circuit and its dopaminergic inputs are necessary for non-auditory vocal learning, demonstrating that this pathway is critical for guiding adaptive vocal changes based on both auditory and somatosensory signals. The ability of this circuit to use both auditory and somatosensory information to guide vocal learning may reflect a general principle for the neural systems that support vocal plasticity across species.

    DOI: 10.7554/eLife.75691

    Web of Science ID: 000862735300001

    PubMedID: 36107757

    PubMedCentralID: PMC9522248