Perceptual Engineering Lab

Objective thought is unaware of the subject of perception. - Maurice Merleau-Ponty

Our research in Human-Computer Interaction currently spans four areas:

  • Perception - sensing and stimulation devices, biosensing and awareness
  • Interaction - body-based interfaces, wearable haptics, natural input techniques, physiological signals
  • Machine Learning - VR/AR content generation, neural style transfer, speech and gestures
  • Applications - learning, creativity, remote collaboration, multiplayer gaming


We are hiring!

RESEARCH



MoveU

A wearable electrical stimulation device that provides haptic feedback and reduces motion sickness in VR. CHI 2019.

Time Travel MIT

An outdoor VR time travel experience that takes you to the MIT of 1916 and to an envisioned future 100 years from now.

BreathVR

VR games that use breathing actions to augment controller input. CHI 2018. Best Paper Honorable Mention (top 5%).
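
The underlying input idea can be sketched in a few lines (an illustration only, not the BreathVR implementation): detect inhale and exhale events from a normalized breath amplitude signal using smoothing and hysteresis thresholds, then map those events to in-game actions alongside the controller. The signal source (chest strap or microphone envelope) and the thresholds are assumptions.

    import numpy as np

    def breath_events(signal, fs=50, high=0.6, low=0.4):
        """Return (sample_index, "inhale"/"exhale") events.

        signal: 1-D numpy array of breath amplitude normalized to [0, 1].
        fs: sampling rate in Hz; high/low are hysteresis thresholds.
        """
        win = max(1, fs // 2)  # smooth over ~0.5 s to suppress sensor noise
        smooth = np.convolve(signal, np.ones(win) / win, mode="same")
        events, inhaling = [], False
        for i, x in enumerate(smooth):
            if not inhaling and x > high:    # rose past the upper threshold
                events.append((i, "inhale"))
                inhaling = True
            elif inhaling and x < low:       # fell past the lower threshold
                events.append((i, "exhale"))
                inhaling = False
        return events

A game loop could then treat an "inhale" event as, say, a hover or slow-time trigger while the controller handles movement as usual.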

Zanzibar

A flexible, portable mat that can sense and track objects and support multi-touch and hover gestures. CHI 2018. Best Paper Award (top 1%).

VMotion

Methods for managing user perception and attention based on the cognitive illusion of Inattentional Blindness. DIS 2018.
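
As a toy illustration of the principle (not VMotion's actual method), a walking redirection loop might inject a small extra camera yaw only while the user's attention is captured by an in-scene event, keeping the gain at or below an assumed perceptual detection threshold; the numbers here are illustrative, not from the paper.

    DETECTION_THRESHOLD_DEG_S = 0.35  # assumed imperceptibility limit, deg/s

    def redirection_step(dt, attention_engaged, gain_deg_s=0.3):
        """Extra virtual yaw (degrees) to inject this frame.

        dt: frame time in seconds. The gain is clamped to the assumed
        detection threshold so the rotation should go unnoticed.
        """
        if not attention_engaged:
            return 0.0  # never manipulate the view while it may be noticed
        return min(gain_deg_s, DETECTION_THRESHOLD_DEG_S) * dt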

Your Place And Mine

Room-scale mapping techniques to support natural interaction in shared VR spaces for remotely located users. DIS 2018.

GVS

A collaborative interface that allows remote control of a person's walking trajectory through galvanic vestibular stimulation (GVS). VRST 2017.

Auris

A pipeline for the automatic generation of VR worlds from music. It uses a deep neural network for mood-based image generation. VRST 2017.
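
A toy sketch of the mood-mapping step follows (Auris itself uses a deep neural network; the features and thresholds below are crude, illustrative stand-ins): map simple audio statistics onto a valence/arousal quadrant that could then drive image generation.

    import numpy as np

    def mood_quadrant(samples, fs=22050):
        """Guess a mood quadrant from a 1-D numpy array of mono audio in [-1, 1]."""
        # RMS energy as a rough arousal proxy (louder == more energetic).
        arousal = np.sqrt(np.mean(samples ** 2))
        # Normalized spectral centroid as a rough valence proxy (brighter == happier).
        spectrum = np.abs(np.fft.rfft(samples))
        freqs = np.fft.rfftfreq(len(samples), 1 / fs)
        valence = np.sum(freqs * spectrum) / ((np.sum(spectrum) + 1e-9) * fs / 2)
        # Thresholds are arbitrary illustrative values.
        if arousal > 0.1:
            return "happy/excited" if valence > 0.2 else "angry/tense"
        return "calm/content" if valence > 0.2 else "sad/depressed"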

Oasis

Automatic generation of VR worlds from 3D reconstructions of real-world places. VRST 2016. TVCG 2017. Best Paper Award (top 0.02%).

Full-body Tracking VR

A mocap hack using Vive controllers before Vive trackers became available. IAP 2017.

Building VR Worlds

Creating immersive and interactive VR worlds using the real world as a template. New Context 2015.

SnowballVR

Asymmetric multiplayer VR with roles based on the size of a user's space. UIST 2016.

Amphibian

A multisensory scuba diving experience in VR. CHI 2016. UIST 2016.

Metaspace II

Room-scale social VR with object and full-body tracking, built using Kinects before the Vive became available. IEEE VR 2016.

Metaspace I

Room-scale social VR with full-body tracking, built before the Vive was announced. UIST 2015.

Lights Out

A sensor-based bedtime alarm and a wallpaper display that promote sleep awareness through data visualization. MobiHealth 2015.

Deep Learning

Handwritten Tamil character recognition with a Convolutional Neural Network (CNN). NEML 2014.
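
For the curious, here is a minimal sketch of a classifier in the spirit of that project (not the paper's exact architecture; the 64x64 grayscale input size and the 156-class output are assumptions):

    import torch
    import torch.nn as nn

    NUM_CLASSES = 156  # assumed; depends on the Tamil character set used

    model = nn.Sequential(
        nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),                           # 64x64 -> 32x32
        nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),                           # 32x32 -> 16x16
        nn.Flatten(),
        nn.Linear(64 * 16 * 16, 256), nn.ReLU(),
        nn.Linear(256, NUM_CLASSES),
    )

    # One training step on a dummy batch with cross-entropy loss:
    x = torch.randn(8, 1, 64, 64)                  # fake grayscale images
    y = torch.randint(0, NUM_CLASSES, (8,))        # fake labels
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()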

Hands Free

An in-air gesture recognition system.

OnTheGo

Gestural input for mobile and wearable devices.

Activ8

Three physical microgames for Google Glass (2014) that prompt short breaks from sitting.

Morph

A continuously morphing 4D geometric VR world, my first VR project, built for the Oculus Rift DK1 back in 2013.

Spellbound

A fast-paced, team-based outdoor Android game; my Master's thesis at the MIT Media Lab, completed in August 2013. PUC 2015.

Spotz

A location-based personality creator and visualizer. Persuasive 2013.

Point & Share

A digital pen that transmits writing on paper to a whiteboard. UIST 2012.

Health Score

An interactive tabletop application that displays the Health Score of a location, neighborhood, or selected area on a map.

PUBLICATIONS

2019

Misha Sra, Abhinandan Jain, and Pattie Maes. Adding proprioceptive feedback to virtual reality experiences using galvanic vestibular stimulation. CHI 2019.

2018

Misha Sra, Xuhai Xu, Aske Mottelson, and Pattie Maes. VMotion: Designing a Seamless Walking Experience in VR. DIS 2018.
Misha Sra, Aske Mottelson, and Pattie Maes. Your Place And Mine: Designing a Shared VR Experience for Remotely Located Users. DIS 2018.
Misha Sra, Xuhai Xu, and Pattie Maes. BreathVR: Leveraging breathing as a directly controlled interface for virtual reality games. CHI 2018. [BEST PAPER HONORABLE MENTION (top 5%)].
Nicolas Villar, Daniel Cletheroe, Greg Saul, Christian Holz, Tim Regan, Oscar Salandin, Misha Sra, Hui-Shyong Yeo, William Field, and Haiyan Zhang. Project Zanzibar: A portable and flexible tangible interaction platform. CHI 2018. [BEST PAPER AWARD (top 1%)].

2017

Misha Sra, Sergio Garrido-Jurado, and Pattie Maes. Oasis: Procedurally generated social virtual spaces from 3D scanned real spaces. IEEE Transactions on Visualization and Computer Graphics, 2017.
Misha Sra, Prashanth Vijayaraghavan, Pattie Maes, and Deb Roy. Auris: Creating affective virtual spaces from music. In ACM VRST 2017, pages 26:1–26:11.
Misha Sra, Xuhai Xu, and Pattie Maes. GalVR: A novel collaboration interface using GVS. In ACM VRST 2017, pages 61:1–61:2.
Misha Sra, Prashanth Vijayaraghavan, Ognjen Rudovic, Pattie Maes, and Deb Roy. DeepSpace: Mood-based image texture generation for virtual reality from music. In IEEE CVPRW 2017, pages 2289–2298.
Misha Sra. Steering locomotion by vestibular perturbation in room-scale VR. In IEEE VR 2017, pages 405–406.
Manisha Mohan, Misha Sra, and Chris Schmandt. Technological interventions to detect, communicate and deter sexual assault. In ACM ISWC 2017, pages 126–129.

2016

Misha Sra, Sergio Garrido-Jurado, Chris Schmandt, and Pattie Maes. Procedurally generated virtual reality from 3D reconstructed physical space. In ACM VRST 2016, pages 191–200. [BEST PAPER AWARD (top 0.02%)].
Dhruv Jain, Misha Sra, Jingru Guo, Rodrigo Marques, Raymond Wu, Justin Chiu, and Chris Schmandt. Immersive terrestrial scuba diving using virtual reality. In ACM CHI 2016, pages 1563–1569.
Misha Sra. Asymmetric design approach and collision avoidance techniques for room-scale multiplayer virtual reality. In ACM UIST 2016, pages 29–32.
Misha Sra. Resolving spatial variation and allowing spectator participation in multiplayer VR. In ACM UIST 2016, pages 221–222.
Dhruv Jain, Misha Sra, Jingru Guo, Rodrigo Marques, Raymond Wu, Justin Chiu, and Chris Schmandt. Immersive scuba diving simulator using virtual reality. In ACM UIST 2016, pages 729–739.
Misha Sra and Chris Schmandt. Bringing real objects, spaces, actions, and interactions into social VR. In IEEE 3DCVE 2016, pages 16–17.

2015

Misha Sra and Chris Schmandt. Expanding social mobile games beyond the device screen. Personal and Ubiquitous Computing, 19(3-4):495–508, 2015.
Weixuan Chen, Misha Sra, and Rosalind Picard. Improving sleep-wake schedule using sleep behavior visualization and a bedtime alarm. In ICST EAI Wireless Mobile Communication and Healthcare 2015, pages 241–244.
Misha Sra and Chris Schmandt. Metaspace: Full-body tracking for immersive multiperson virtual reality. In ACM UIST 2015, pages 47–48.

2014 and older

Misha Sra and Chris Schmandt. Spotz: A location-based approach to self-awareness. In Springer Persuasive Tech. 2013, pages 216–221.
Misha Sra, Austin Lee, Sheng-Ying Pao, Gonglue Jiang, and Hiroshi Ishii. Point and share: from paper to whiteboard. In ACM UIST 2012, pages 23–24.

TEACHING



CS185 - Spring 2020

THE TEAM



Misha Sra

Assistant Professor of Computer Science

Sherry Chen

Graduate student, CS

Physiological signals, machine learning

Lu Han

Undergraduate student, CE

Virtual reality

WE ARE HIRING


PhD Students

We are looking for a few talented PhD students to join our lab: students who are highly motivated, excited about research, hard-working, and proficient in at least one programming language (Python, C#, Java, etc.). We welcome creative, independent individuals with a strong work ethic and computational skills from almost any background, including engineering (electrical, biomedical, mechanical) and design.

Visiting Students

Send Misha an email with your CV, and mention whether you have your own financial support. If you do not receive a response, it means the lab does not have space for a new visitor at the moment.

Graduate Students

If you are a UCSB graduate student, please email Misha to set up a time to meet. If you are not already at UCSB, you need to first apply to the Computer Science graduate program. Misha receives many emails from prospective graduate students and cannot respond to everyone. If you decide to reach out before applying to the program, please make sure you have spent some time on our website and thought carefully about why our lab is a good match for your research interests and skills.

Undergraduate Students

If you are a UCSB undergraduate, please email Misha with your area of interest, any prior experience, and your transcript (unofficial is fine). Undergraduates in our lab have so far worked on a volunteer basis, but working for course credit is also possible.


CONTACT


The Perceptual Engineering Lab is housed in the Computer Science Department at UC Santa Barbara. Our lab is located in Building 935, next to Phelps Hall.



© Perceptual Engineering Lab