Bio


Alissa Ling graduated from Washington University in St. Louis with a double major in Applied Math and Physics. During the summers of her junior and senior years at WashU, Alissa interned at the Johns Hopkins Applied Physics Lab, where she researched computer vision techniques to improve the Argus II, a retinal prosthesis for a subset of blind people, and analyzed large disease data sets to determine trajectories and risk factors for both humans and animals.
She is currently pursuing her PhD in Electrical Engineering in Professor Paul Nuyujukian's Brain Interfacing Lab (BIL). Her research goal is to advance our understanding of motor cortical control of naturalistic behavior by developing a platform that combines wireless electrophysiology with a markerless motion capture system. She hopes that her research can inform clinical studies that improve the standard of care for patients with motor disabilities.
In her free time, Alissa enjoys swimming and outdoor activities such as surfing and hiking.

Honors & Awards


  • Neurotech training program trainee, Center for Mind, Brain, Computation and Technology at Stanford University (2020)
  • Nominated as a Rhodes Scholar candidate, Washington University in St. Louis (2017)
  • College Swimming and Diving Coaches Association of America Scholar All-American Honorable Mention, CSCAA (2015-2017)
  • University Athletic Association All-Academic distinction, University Athletic Association (2015-2017)

Education & Certifications


  • BA, Washington University in St. Louis, Physics (2018)
  • BA, Washington University in St. Louis, Applied Math (2018)

Current Research and Scholarly Interests


My research goal is to advance our understanding of motor cortical control of naturalistic behavior. Multichannel electrode recordings allow us to record hundreds of neurons simultaneously in an awake behavioral setting, and low-dimensional neural dynamics have demonstrated great potential to explain how the motor cortex controls behavior.
However, neural dynamics estimated from classically constrained tasks may not hold for ambulatory behavior. Some evidence suggests that the complexity and variability of neural recordings are constrained by the complexity of the task being performed, artificially and unintentionally limiting the observed neural data. Addressing this requires experiments with higher task complexity. I propose to conduct freely moving experiments that directly ask whether increasing task complexity yields greater neural variance, and how that extra neural variance correlates with various limb kinematics.
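As a hypothetical illustration (not from the original work), one common way to quantify whether a richer task engages more neural dimensions is the participation ratio, an effective-dimensionality measure computed from the eigenvalues of the neural covariance matrix. The sketch below, using NumPy and simulated firing rates, shows how activity confined to a small latent subspace (a stand-in for a constrained task) yields a lower participation ratio than activity spanning many latent dimensions (a stand-in for free behavior); the data, dimensions, and neuron counts are all invented for the demo.

```python
import numpy as np

def participation_ratio(rates):
    """Effective dimensionality of neural activity.

    rates: (n_timepoints, n_neurons) array of firing rates.
    Returns (sum of eigenvalues)^2 / (sum of squared eigenvalues)
    of the neural covariance; higher means variance is spread
    across more neural dimensions.
    """
    centered = rates - rates.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(centered, rowvar=False))
    eigvals = np.clip(eigvals, 0.0, None)  # guard tiny negative values
    return eigvals.sum() ** 2 / (eigvals ** 2).sum()

rng = np.random.default_rng(0)
n_t, n_neurons = 500, 100

# Simulated "constrained task": activity confined to a 3-D latent subspace
latents = rng.standard_normal((n_t, 3))
constrained = latents @ rng.standard_normal((3, n_neurons))

# Simulated "free behavior": activity spanning a 30-D latent subspace
latents = rng.standard_normal((n_t, 30))
free = latents @ rng.standard_normal((30, n_neurons))

print(participation_ratio(constrained))  # low (bounded by latent rank 3)
print(participation_ratio(free))         # substantially higher
```

In this framing, the experimental question becomes whether recordings during free behavior occupy measurably more dimensions than recordings during the constrained task, rather than the same low-dimensional subspace.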

To this end, I will conduct freely moving experiments, developing a novel markerless motion capture system to correlate full-body kinematics with neural data.
Few platforms currently exist for capturing markerless, freely moving behavior using multiple cameras and point cloud data. I am developing a pipeline that solves for the kinematics of a behavior from point cloud data using a geometric approach, collecting pose data with multiple Intel RealSense D435 stereo depth cameras.
The general flow of the pipeline converts the raw depth data into a point cloud, registers the multiple views into a single volumetric point cloud, creates a mesh from that point cloud, and then fits an articulated skeleton to the mesh to solve for joint angles and positions. I will then align the kinematic data with the neural data and apply dimensionality reduction techniques to the high-dimensional neural recordings.
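The multi-view registration step above can be sketched in miniature. As a hypothetical, simplified example (the real pipeline is not specified at this level of detail), the snippet below uses the Kabsch algorithm in NumPy to recover the rigid rotation and translation relating two simulated camera views of the same points, assuming known point correspondences; real depth-camera registration would also need correspondence estimation (e.g., ICP) and noise handling.

```python
import numpy as np

def rigid_register(source, target):
    """Kabsch algorithm: find rotation R and translation t that map
    source points onto target points, i.e. minimize ||R @ p + t - q||
    over corresponding rows of the (N, 3) arrays source and target."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Demo: recover a known transform between two simulated camera views
rng = np.random.default_rng(1)
cloud_a = rng.standard_normal((200, 3))         # points seen by camera A

angle = np.pi / 6                               # ground-truth rotation (z-axis)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
cloud_b = cloud_a @ R_true.T + t_true           # same points from camera B

R, t = rigid_register(cloud_a, cloud_b)
aligned = cloud_a @ R.T + t                     # camera A view mapped into B's frame
print(np.abs(aligned - cloud_b).max())          # residual is near zero
```

Once every camera's view is expressed in a common frame this way, the per-camera point clouds can simply be concatenated into the single volumetric point cloud that the meshing and skeleton-fitting stages consume.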

The techniques leveraged here, combining wireless electrophysiology with synchronized depth cameras, may serve as a model platform with which to capture rich, natural behavior in naturalistic settings. This work will advance our understanding of how the mammalian brain controls movement in free behavior, a critical question in systems neuroscience. If successful, this work could also inform translational research to improve the standard of care for patients with motor disabilities.

Projects


  • Neural Dynamics of Ambulatory Behavior Using Markerless Point Cloud Pose Estimation, Stanford University (August 1, 2019)

    Low-dimensional neural dynamics have demonstrated great potential to explain how the motor cortex controls behavior.
    However, neural dynamics from classically constrained tasks may not hold for ambulatory behavior.
    To address this, I will conduct freely moving experiments by developing a novel markerless motion capture system to correlate full body kinematics with neural data.
    I will compare neural dynamics between free and constrained tasks, correlate lower-limb kinematics with neural data, and then create a brain-computer interface that extracts features of naturalistic movement.
    This will advance our understanding of cortical control of naturalistic behavior, which could inform translational research to improve the standard of care for patients with motor disabilities.

    Location

    Palo Alto, California

Lab Affiliations


Work Experience


  • Paid Summer Intern, Intelligent Systems Center 2018, Johns Hopkins Applied Physics Lab (5/1/2018 - 9/1/2018)

    During the summer between graduating from WashU and starting graduate school, Alissa returned to Johns Hopkins APL, continuing her work on the Argus II project and with Dr. Howard Burkom.
    For the Argus II project, she designed a haptic feedback glove and integrated it into the framework to provide a multimodal way to communicate with the patient. The haptic feedback provided sensory cues prompting the user to turn his or her head toward objects or people in the environment so that they would come into the camera's field of view.
    She also analyzed Howard County General Hospital disease data with Dr. Burkom to determine trajectories and risk factors for Chronic Obstructive Pulmonary Disease.

    Location

    Columbia, Maryland

  • Paid Summer Intern, Intelligent Systems Center 2017, Johns Hopkins Applied Physics Lab (5/1/2017 - 8/1/2017)

    Alissa contributed to three different projects that summer. She developed a computer vision algorithm for the Argus II retinal prosthesis on the Second Sight project under manager Dr. Kapil Katyal. The algorithm used optical flow to detect the motion of objects around the wearer while he or she was moving, allowing the wearer to walk around the environment naturally and autonomously. This research project became her senior honors thesis at Washington University in St. Louis.

    She also developed and validated disease detection algorithms using bovine health surveillance data with manager Dr. Howard Burkom. Finally, she analyzed kinematic data from underbody blast simulations in OpenSim to study how to reduce injury from the impact of military flight landings.

    Location

    Columbia, Maryland

All Publications


  • A markerless platform for ambulatory systems neuroscience. Science Robotics. Silvernagel, M. P., Ling, A. S., Nuyujukian, P., Brain Interfacing Laboratory. 2021; 6 (58): eabj7045


    View details for DOI 10.1126/scirobotics.abj7045

    View details for PubMedID 34516749

  • The Promise of the Future: Assistive Technology, Transportation, and Emerging Technologies, Promoting Successful Integration Cooper, R., Chung, C., Coltellaro, J., LaCroix, C., Ling, A., Reinsfelder, A., Salatin, B., Goeran, F. Borden Institute, Fort Sam Houston. 2018: 41–81
  • Developing a Computer Vision Algorithm to Detect Movement in the Environment for the Argus II Retinal Prosthesis Ling, A. Washington University in St. Louis. 2018 ; Senior Honors Thesis