All Publications


  • Audio-tactile association improves pitch perception in listeners with and without cochlear implants. Brain Research. Hodges, A., Fitzgerald, M., May, L., Lee, R. T., Goldsworthy, R., Fujioka, T. 2026: 150182

    Abstract

    Auditory perception can improve when accompanied by somatosensory information, with beneficial effects for hard-of-hearing individuals. Further enhancement could occur by mapping discrete musical pitch information onto tactile spatial patterns across four fingertips. Unlike previous studies, we used tactile stimuli that marked only the sound onsets, delivered as light pressure from air-inflated plastic membranes. Pre- and post-learning pitch discrimination tests used vocoded-audio-only, vocoded-audio-with-tactile, and tactile-only conditions. The learning phase was a 10-minute melody-listening task with nursery songs presented in the audio-tactile condition. In Exp. 1, normal-hearing listeners heard melodies as the original audio; in Exp. 2, normal-hearing listeners heard melodies as vocoded audio; and CI users listened to the original audio. All groups performed best in the audio-tactile condition before the learning phase, and these immediate benefits were greatest at intermediate pitch intervals. Furthermore, CI users showed greater improvement in the audio-only condition after exposure, indicating a rapid transfer effect.

    DOI: 10.1016/j.brainres.2026.150182

    PubMed ID: 41605410

  • A model of vocal persona: context, perception, production. Frontiers in Computer Science. Noufi, C., May, L., Berger, J. 2025; 7
  • Comparison of Impulse Response Generation Methods for a Simple Shoebox-Shaped Room. Acoustics. May, L., Farzaneh, N., Das, O., Abel, J. S. 2025; 7 (3)
  • "Choices? That's the dream": challenges and opportunities in non-speech information closed-captioning. Frontiers in Computer Science. May, L., Clemens, M., Dang, K., Ohshiro, K., Sridhar, S., Wee, P., Fuentes, M., Lee, S., Cartwright, M. 2025; 7
  • Participant Recruitment in Accessibility Research. May, L., Hassan, S., Dang, K., Lee, S., Alonzo, O. Association for Computing Machinery, 2025
  • Tactile Emotions: Multimodal Affective Captioning with Haptics Improves Narrative Engagement for d/Deaf and Hard-of-Hearing Viewers. Pataca, C., Hassan, S., May, L., Olson, M. M., D'aurio, T., Peiris, R. L., Huenerfauth, M. Association for Computing Machinery, 2025
  • Designing audio processing strategies to enhance cochlear implant users' music enjoyment. Frontiers in Computer Science. May, L., Hodges, A., Park, S., Kaneshiro, B., Berger, J. 2024; 6
  • Unspoken Sound: Identifying Trends in Non-Speech Audio Captioning on YouTube. May, L., Ohshiro, K., Dang, K., Sridhar, S., Pai, J., Fuentes, M., Lee, S., Cartwright, M. Association for Computing Machinery, 2024
  • Towards a Rich Format for Closed-Captioning. May, L., Williams, A., Hassan, S., Cartwright, M., Lee, S. Association for Computing Machinery, 2024
  • WAM-Studio: A Web-Based Digital Audio Workstation to Empower Cochlear Implant Users. Buffa, M., Vidal-Mazuy, A., May, L., Winckler, M. Edited by Nocera, J. A., Larusdottir, M. K., Petrie, H., Piccinno, A., Winckler, M. Springer International Publishing, 2023: 101-110