All Publications


  • MINOTAUR: A Posit-Based 0.42-0.50-TOPS/W Edge Transformer Inference and Training Accelerator. IEEE Journal of Solid-State Circuits. Prabhu, K., Radway, R. M., Yu, J., Bartolone, K., Giordano, M., Peddinghaus, F., Urman, Y., Khwa, W., Chih, Y., Chang, M., Mitra, S., Raina, P. 2025; 60 (4): 1311-1323
  • 8-bit Transformer Inference and Fine-tuning for Edge Accelerators. Yu, J., Prabhu, K., Urman, Y., Radway, R. M., Han, E., Raina, P. ACM. 2024: 5-21
  • Ultra-Dense 3D Physical Design Unlocks New Architectural Design Points with Large Benefits. Srimani, T., Radway, R. M., Kim, J., Prabhu, K., Rich, D., Gilardi, C., Raina, P., Shulaker, M., Lim, S., Mitra, S. IEEE. 2023
  • CHIMERA: A 0.92-TOPS, 2.2-TOPS/W Edge AI Accelerator With 2-MByte On-Chip Foundry Resistive RAM for Efficient Training and Inference. IEEE Journal of Solid-State Circuits. Prabhu, K., Gural, A., Khan, Z. F., Radway, R. M., Giordano, M., Koul, K., Doshi, R., Kustin, J. W., Liu, T., Lopes, G. B., Turbiner, V., Khwa, W., Chih, Y., Chang, M., Lallement, G., Murmann, B., Mitra, S., Raina, P. 2022
  • A Full-Stack Search Technique for Domain Optimized Deep Learning Accelerators. Zhang, D., Huda, S., Songhori, E., Prabhu, K., Le, Q., Goldie, A., Mirhoseini, A. Edited by Falsafi, B., Ferdman, M., Lu, S., Wenisch, T. ACM. 2022: 27-42