Ph.D. Student in Computer Science | Purdue University
Human-Centered AI · Affective Computing · Embodied Interaction
I am a Ph.D. student in Computer Science at Purdue University, working in the IDEAS Lab under the supervision of Prof. Aniket Bera. My research lies at the intersection of interactive human motion graphics, affective computing, and multimodal generative AI. My current work explores how multimodal learning frameworks that combine motion, text, and music can enable better-contextualized and more controllable generation. This includes building datasets and models for interactive and reactive motion generation, as well as context-sensitive human–AI interaction in immersive environments such as VR. My ongoing projects explore motion editing and long-horizon generation for two-person interactions. Previously, I earned my B.Tech and M.Tech in Electronics and Electrical Communications from IIT Kharagpur.
Prerit Gupta, Shourya Verma, Ananth Grama, and Aniket Bera. Unified Multi-Modal Interactive and Reactive 3D Motion Generation via Rectified Flow. Accepted at International Conference on Learning Representations (ICLR), 2026. [link]
Prerit Gupta, Jason Alexander Fotso-Puepi, Zhengyuan Li, Jay Mehta, and Aniket Bera. MDD: A Dataset for Text-and-Music Conditioned Duet Dance Generation. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2025. [link]
Shatha J. Mufti, Shourya Verma, Jhon Martinez, Prerit Gupta, Ananth Grama, Riyi Shi, and Aniket Bera. Unravelling the Dynamics of Seizure-Like Activity in Neuronal Networks using Machine Learning. Journal of Neurophysiology, 134(6), 1969–1993. [link]