Perceptual Engineering Lab


Misha Sra

I am an Assistant Professor at UCSB in the CS department (since 2019) where I direct the Perceptual Engineering Lab (PercLab).

I am affiliated with the Media Arts & Technology (MAT) program, the Department of Electrical & Computer Engineering, the Center for Responsible Machine Learning, Mind & Machine Intelligence, and the Cognitive Science Program at UCSB. I received my PhD from the MIT Media Lab, where I was advised by Prof. Pattie Maes in the Fluid Interfaces group.

My primary research area is experience-focused HCI (eHCI). My lab creates new sensing and feedback technologies that aim to improve people's day-to-day lives, including devices that augment physiological and perceptual capabilities; experiences that put people back in touch with themselves and each other; and systems that leave people free, in the words of Clynes and Kline ("Cyborgs and Space," Sept. 1960), "to explore, to create, to think, and to feel."

We design and engineer AR/VR technologies (novel interactions, experiences, and tools for the Metaverse), haptic feedback devices, and human-centered AI systems that address high-impact real-world problems. These technologies hold great promise for advancing learning, creativity, gaming, therapy, accessibility, and more. By prioritizing real-world applications, we aim to adapt machine learning tools and engineer new systems that make a broad positive impact.


2021
Giuliana Barrios Dell'Olio and Misha Sra. FaraPy: An Augmented Reality Feedback System for Facial Paralysis using Action Unit Intensity Estimation. UIST 2021.
Mengyu Chen, Andrés Monroy-Hernández and Misha Sra. SceneAR: Scene-based Micro Narratives for Sharing and Remixing in Augmented Reality. ISMAR 2021. (To appear).
Mengyu Chen and Misha Sra. EntangleVR: A Visual Programming Interface for Virtual Reality Interactive Scene Generation. VRST 2021. (To appear).
Bowen Zhang and Misha Sra. PneuMod: A Modular Haptic Device with Localized Pressure and Thermal Feedback. VRST 2021. (To appear).
Pulkit Tandon, Shubham Chandak, Pat Pataranutaporn, Yimeng Liu, Anesu M Mapuranga, Pattie Maes, Tsachy Weissman and Misha Sra. Txt2Vid: Ultra-Low Bitrate Compression of Talking-Head Videos via Text. arXiv preprint.
Atieh Taheri, Ziv Weissman and Misha Sra. Exploratory Design of a Hands-free Video Game Controller for a Quadriplegic Individual. Augmented Humans 2021.
Atieh Taheri, Ziv Weissman and Misha Sra. Design and Evaluation of a Hands-free Video Game Controller for Individuals with Motor Impairments. Frontiers 2021.
Jason Orlosky, Misha Sra, Kenan Bektas, Huaishu Peng, Jeeeun Kim, Nataliya Kosmyna, Tobias Höllerer, Anthony Steed, Kiyoshi Kiyokawa, and Kaan Aksit. Telelife: The Future of Remote Living. Frontiers 2021.
Yimeng Liu and Misha Sra. Motion Improvisation: 3D Human Motion Synthesis with a Transformer. UIST 2021. (To appear).
Sherry Chen and Misha Sra. IntoTheVideos: Exploration of Dynamic 3D Space Reconstruction From Single Sports Videos. UIST 2021. (To appear).
Apurv Varshney, Justin Nilsen, Richa Wadaskar and Misha Sra. Flick Gesture Interaction in Augmented Reality: AR Carrom. UIST 2021. (To appear).
Lawrence Lim, Wei-Yee Goh, Mara Downing and Misha Sra. A Spatial Music Listening Experience in Augmented Reality. UIST 2021. (To appear).
Jungah Son and Misha Sra. Exploring Emotion Brushes for a Virtual Reality Painting Tool. VRST 2021. (To appear).
Suriya Dakshina Murthy, Tobias Höllerer and Misha Sra. IMAGEimate - An End-to-End Pipeline to Create Realistic Animatable 3D Avatars from a Single Image Using Neural Networks. VRST 2021. (To appear).
Andrew Zhang, Jennifer M. Jacobs, Misha Sra and Tobias Höllerer. Multi-View AR Streams for Interactive 3D Remote Teaching. VRST 2021. (To appear).


2020
Jake Guida and Misha Sra. Augmented Reality World Editor. VRST 2020. [AUDIENCE CHOICE AWARD].
Ehsan Sayyad, Misha Sra and Tobias Höllerer. Walking and Teleportation in Wide-Area Virtual Reality Experiences. ISMAR 2020.
Abhinandan Jain, Adam Haar Horowitz, Felix Schoeller, Sang-won Leigh, Pattie Maes, and Misha Sra. Designing Interactions Beyond Conscious Control: A New Model for Wearable Interfaces. IMWUT 2020.
Pat Pataranutaporn, Anjela Vujic, David S. Kong, Pattie Maes and Misha Sra. Living Bits: Opportunities and Challenges for Integrating Living Microorganisms in Human-Computer Interaction. Augmented Humans 2020. [BEST PAPER AWARD (top 4%)].


2019
Misha Sra, Abhinandan Jain and Pattie Maes. Adding Proprioceptive Feedback to Virtual Reality Experiences Using Galvanic Vestibular Stimulation. CHI 2019.


2018
Misha Sra, Xuhai Xu, Aske Mottelson and Pattie Maes. VMotion: Designing a Seamless Walking Experience in VR. DIS 2018.
Misha Sra, Aske Mottelson, and Pattie Maes. Your Place And Mine: Designing a Shared VR Experience for Remotely Located Users. DIS 2018.
Misha Sra, Xuhai Xu, and Pattie Maes. BreathVR: Leveraging breathing as a directly controlled interface for virtual reality games. CHI 2018.[BEST PAPER HONORABLE MENTION (top 5%)].
Nicolas Villar, Daniel Cletheroe, Greg Saul, Christian Holz, Tim Regan, Oscar Salandin, Misha Sra, Hui-Shyong Yeo, William Field, and Haiyan Zhang. Project Zanzibar: A Portable and Flexible Tangible Interaction Platform. CHI 2018. [BEST PAPER AWARD (top 1%)].


2017
Misha Sra, Sergio Garrido-Jurado, and Pattie Maes. Oasis: Procedurally generated social virtual spaces from 3D scanned real spaces. IEEE Transactions on Visualization and Computer Graphics, 2017.
Misha Sra, Prashanth Vijayaraghavan, Pattie Maes, and Deb Roy. Auris: Creating affective virtual spaces from music. In ACM VRST 2017, pages 26:1–26:11.
Misha Sra, Xuhai Xu, and Pattie Maes. Galvr: A novel collaboration interface using GVS. In ACM VRST 2017, pages 61:1–61:2.
Misha Sra, Prashanth Vijayaraghavan, Ognjen Rudovic, Pattie Maes, and Deb Roy. Deepspace: Mood-based image texture generation for virtual reality from music. In IEEE CVPRW 2017, pages 2289–2298.
Misha Sra. Steering locomotion by vestibular perturbation in room-scale VR. In IEEE VR 2017, pages 405–406.
Manisha Mohan, Misha Sra, and Chris Schmandt. Technological interventions to detect, communicate and deter sexual assault. In ACM ISWC 2017, pages 126–129.


2016
Misha Sra, Sergio Garrido-Jurado, Chris Schmandt, and Pattie Maes. Procedurally generated virtual reality from 3D reconstructed physical space. In ACM VRST 2016, pages 191–200. [BEST PAPER AWARD (top 1%)].
Dhruv Jain, Misha Sra, Jingru Guo, Rodrigo Marques, Raymond Wu, Justin Chiu, and Chris Schmandt. Immersive terrestrial scuba diving using virtual reality. In ACM CHI 2016, pages 1563–1569.
Misha Sra. Asymmetric design approach and collision avoidance techniques for room-scale multiplayer virtual reality. In ACM UIST 2016, pages 29–32.
Misha Sra. Resolving spatial variation and allowing spectator participation in multiplayer VR. In ACM UIST 2016, pages 221–222.
Dhruv Jain, Misha Sra, Jingru Guo, Rodrigo Marques, Raymond Wu, Justin Chiu, and Chris Schmandt. Immersive scuba diving simulator using virtual reality. In ACM UIST 2016, pages 729–739.
Misha Sra and Chris Schmandt. Bringing real objects, spaces, actions, and interactions into social VR. In IEEE 3DCVE 2016, pages 16–17.


2015
Misha Sra and Chris Schmandt. Expanding social mobile games beyond the device screen. Personal and Ubiquitous Computing, 19(3-4):495–508, 2015.
Weixuan Chen, Misha Sra, and Rosalind Picard. Improving sleep-wake schedule using sleep behavior visualization and a bedtime alarm. In ICST EAI Wireless Mobile Communication and Healthcare 2015, pages 241–244.
Misha Sra and Chris Schmandt. Metaspace: Full-body tracking for immersive multiperson virtual reality. In ACM UIST 2015, pages 47–48.

2014 and older

Misha Sra and Chris Schmandt. Spotz: A location-based approach to self-awareness. In Springer Persuasive Tech. 2013, pages 216–221.
Misha Sra, Austin Lee, Sheng-Ying Pao, Gonglue Jiang, and Hiroshi Ishii. Point and share: from paper to whiteboard. In ACM UIST 2012, pages 23–24.


Teaching

CS185 - Winter 2021 (on GauchoSpace)



People

Misha Sra
Assistant Professor of Computer Science


Sherry Chen
PhD student, CS


Yimeng Liu
PhD student, CS


Mengyu Chen
PhD student, MAT


Atieh Taheri
PhD student, ECE


Andrew Huard
PhD student, ECE


Jungah Son
PhD student, MAT


Carlos Gilberto
Visiting PhD student, MechE


Arthur Caetano
PhD student, CS


Avinash Nargund
PhD student, ECE



Alumni

Giuliana Barrios Dell'Olio
Visiting Master's Student, CS 2020-2021. Now at McKinsey & Company.


Jake Guida
Master's Student, CS 2020. Now at Adobe.


The Perceptual Engineering Lab is housed in the Computer Science Department at UCSB, in Phelps Hall 3515.

© Perceptual Engineering Lab