Perceptual Engineering Lab

Human Computer Interaction
Virtual and Augmented Reality
Wearable Devices

Time Travel MIT

An outdoor VR time travel experience that takes you to the MIT of 1916 as well as an envisioned future 100 years from now.


2020

Jake Guida and Misha Sra. Augmented Reality World Editor. VRST 2020. [AUDIENCE CHOICE AWARD].
Ehsan Sayyad, Misha Sra and Tobias Höllerer. Walking and Teleportation in Wide-Area Virtual Reality Experiences. ISMAR 2020.
Abhinandan Jain, Adam Haar Horowitz, Felix Schoeller, Sang-won Leigh, Pattie Maes, and Misha Sra. Designing Interactions Beyond Conscious Control: A New Model for Wearable Interfaces. IMWUT 2020.
Pat Pataranutaporn, Anjela Vujic, David S. Kong, Pattie Maes and Misha Sra. Living Bits: Opportunities and Challenges for Integrating Living Microorganisms in Human-Computer Interaction. Augmented Humans 2020. [BEST PAPER AWARD (top 4%)].

2019

Misha Sra, Abhinandan Jain and Pattie Maes. Adding proprioceptive feedback to virtual reality experiences using galvanic vestibular stimulation. CHI 2019.

2018

Misha Sra, Xuhai Xu, Aske Mottelson and Pattie Maes. VMotion: Designing a Seamless Walking Experience in VR. DIS 2018.
Misha Sra, Aske Mottelson, and Pattie Maes. Your Place And Mine: Designing a Shared VR Experience for Remotely Located Users. DIS 2018.
Misha Sra, Xuhai Xu, and Pattie Maes. BreathVR: Leveraging breathing as a directly controlled interface for virtual reality games. CHI 2018. [BEST PAPER HONORABLE MENTION (top 5%)].
Nicolas Villar, Daniel Cletheroe, Greg Saul, Christian Holz, Tim Regan, Oscar Salandin, Misha Sra, Hui-Shyong Yeo, William Field, and Haiyan Zhang. Project Zanzibar: A portable and flexible tangible interaction platform. CHI 2018. [BEST PAPER AWARD (top 1%)].

2017

Misha Sra, Sergio Garrido-Jurado, and Pattie Maes. Oasis: Procedurally generated social virtual spaces from 3d scanned real spaces. IEEE Transactions on Visualization and Computer Graphics, 2017.
Misha Sra, Prashanth Vijayaraghavan, Pattie Maes, and Deb Roy. Auris: Creating affective virtual spaces from music. In ACM VRST 2017, pages 26:1–26:11.
Misha Sra, Xuhai Xu, and Pattie Maes. GalVR: A novel collaboration interface using GVS. In ACM VRST 2017, pages 61:1–61:2.
Misha Sra, Prashanth Vijayaraghavan, Ognjen Rudovic, Pattie Maes, and Deb Roy. DeepSpace: Mood-based image texture generation for virtual reality from music. In IEEE CVPRW 2017, pages 2289–2298.
Misha Sra. Steering locomotion by vestibular perturbation in room-scale VR. In IEEE VR 2017, pages 405–406.
Manisha Mohan, Misha Sra, and Chris Schmandt. Technological interventions to detect, communicate and deter sexual assault. In ACM ISWC 2017, pages 126–129.

2016

Misha Sra, Sergio Garrido-Jurado, Chris Schmandt, and Pattie Maes. Procedurally generated virtual reality from 3D reconstructed physical space. In ACM VRST 2016, pages 191–200. [BEST PAPER AWARD (top 1%)].
Dhruv Jain, Misha Sra, Jingru Guo, Rodrigo Marques, Raymond Wu, Justin Chiu, and Chris Schmandt. Immersive terrestrial scuba diving using virtual reality. In ACM CHI 2016, pages 1563–1569.
Misha Sra. Asymmetric design approach and collision avoidance techniques for room-scale multiplayer virtual reality. In ACM UIST 2016, pages 29–32.
Misha Sra. Resolving spatial variation and allowing spectator participation in multiplayer VR. In ACM UIST 2016, pages 221–222.
Dhruv Jain, Misha Sra, Jingru Guo, Rodrigo Marques, Raymond Wu, Justin Chiu, and Chris Schmandt. Immersive scuba diving simulator using virtual reality. In ACM UIST 2016, pages 729–739.
Misha Sra and Chris Schmandt. Bringing real objects, spaces, actions, and interactions into social VR. In IEEE 3DCVE 2016, pages 16–17.

2015

Misha Sra and Chris Schmandt. Expanding social mobile games beyond the device screen. Personal and Ubiquitous Computing, 19(3-4):495–508, 2015.
Weixuan Chen, Misha Sra, and Rosalind Picard. Improving sleep-wake schedule using sleep behavior visualization and a bedtime alarm. In ICST EAI Wireless Mobile Communication and Healthcare 2015, pages 241–244.
Misha Sra and Chris Schmandt. Metaspace: Full-body tracking for immersive multiperson virtual reality. In ACM UIST 2015, pages 47–48.

2014 and older

Misha Sra and Chris Schmandt. Spotz: A location-based approach to self-awareness. In Springer Persuasive Tech. 2013, pages 216–221.
Misha Sra, Austin Lee, Sheng-Ying Pao, Gonglue Jiang, and Hiroshi Ishii. Point and share: from paper to whiteboard. In ACM UIST 2012, pages 23–24.


CS185 - Spring 2020


Misha Sra

Assistant Professor of Computer Science

Sherry Chen

PhD student, CS

Machine Learning, VR

Yimeng Liu

PhD student, CS

Machine Learning, VR

Bowen Zhang

PhD student, CS

Haptics, Machine Learning

Giuliana Barrios

Visiting Master's student, CS

Machine Learning, Health

Atieh Taheri

PhD student, ECE

Accessibility, Machine Learning

Andrew Huard

PhD student, ECE

VR, Hardware

Jungah Son

PhD student, MAT

VR, Sketching

Mengyu Chen

PhD student, MAT

AR, Machine Learning


PhD Students

I am looking for talented PhD students interested in software or hardware research to join PercLab. Candidates should be highly motivated, hands-on, hard-working, and proficient in programming. Successful applicants will be creative, independent individuals with a strong work ethic and computational or hardware skills, from almost any background, including engineering (electrical, bio, mechanical), neuroscience, or design.

Postdocs (2 positions)

The first position is for applicants interested in research at the intersection of AI and HCI, specifically augmented and virtual reality and gaming. You should have a computer science (or related) background and a solid publication record. I am looking for a highly motivated individual interested in applying machine learning to build novel immersive systems and technologies for a variety of applications. Given the current pandemic, timing and start dates are negotiable, but ideally we hope to identify candidates available by late 2020 or early 2021. If this is you, see How to Apply below and let's talk.

The second position is for applicants with a background in hardware design, in areas including (but not limited to) electrical, mechanical, or bio engineering, neuroscience, applied physics, HCI, or design. Applicants should have a PhD in one of these (or related) areas and a solid publication record. Specific experience in one or more of the following would be beneficial: haptics, ultrasound, EEG/EMG, or related areas of signal processing. Above all, I am looking for someone who is creative, hands-on, and highly motivated. We would like to identify candidates available for Fall 2021. If this is you, see How to Apply below and let's talk.

Visiting Students

Send me an email with your CV and let me know whether you have your own financial support. If I do not respond to your inquiry, it means that I don't have space in my lab for a new visitor at the moment.

Graduate Students (MS, BS/MS)

If you are a UCSB graduate student, please email me to schedule a time to meet. If you are not already at UCSB, you need to first apply to the Computer Science graduate program. I get a lot of emails from prospective grad students and I can't possibly respond to everyone. If you decide to contact me before applying to the program, please make sure that you have spent some time on this website and thought hard about why the PercLab is a good match for your research interests and skills.

Undergraduate Students (ECE, MAT, ME, CS, CCS)

If you are a UCSB undergraduate, please send me an email mentioning your area of interest, any prior experience, and your transcript (unofficial is fine). Undergraduates in the PercLab have so far worked on a volunteer basis, but working for credit is also possible.

HOW TO APPLY (Postdocs)

Applicants should send me a single email that includes the following:

  • A short written summary of your background, interests, and relevant experience
  • Your doctoral dissertation topic (and a link, if available)
  • Desired / available start date (preferably within the next six months)

As a PDF attachment, please include the following:

  • Your CV
  • List of relevant publications (a link to your Google Scholar page is fine)
  • A website link with your projects
  • Contact info for at least 3 references

Applications will be reviewed on an ongoing basis. Initial appointment will be for one year, with the possibility of renewal. The salary will be highly competitive.


The Perceptual Engineering Lab is housed in the Computer Science Department at UC Santa Barbara. Our lab is located in Building 935, next to Phelps Hall.

© Perceptual Engineering Lab