As AI decouples intelligence from consciousness, as XR transforms perception, and as matter becomes machine-manipulable, three questions about the future arise:
To answer these questions, our current work focuses on eXtended Reality (virtual, augmented, and mixed reality, or XR) and Artificial Intelligence (AI). In particular, we investigate the design, engineering, and study of adaptive and explainable interfaces, tools, and systems for augmenting human physical capability. We are currently working in the areas of: (1) motor skill acquisition and recovery (e.g., fitness training, sports training, physical rehab therapy), (2) real-time guidance and feedback for physical tasks (e.g., repair, maintenance, cooking), and (3) human-AI creative collaboration for physical skills (e.g., playing an instrument, choreography).
The future is a heterotopia of "worlds within worlds, mirroring and yet upsetting what is outside," connected within and without in one seamless continuum.
For Fall 2024, I am actively looking for technical, motivated students who can demonstrate expertise and creativity in prototyping XR, electrical, or mechanical systems, and who have significant prior programming experience, with or without AI. Please apply through the CS admissions process, pick HCI as your area of interest, and include a link to your portfolio of work. The deadline is December 2023. For the 2024 cycle, I am specifically interested in students who want to explore the design of task guidance systems from all perspectives.
In your email, please give me a sense of why our lab is right for you and how you would contribute. What novel ideas would you bring to the lab? What new projects would you want to work on (don't just pick a project from the ones on this page)? I'd like to see a portfolio that documents the different research projects you have worked on, specifies your role in each of them, and links to publications. I will not respond to your email if it's obvious you have never looked at our lab's website.
We are looking for applicants who:
Virtual Buddy: Redefining Conversational AI Interactions for Individuals with Hand Motor Disabilities. UIST 2023. Best Poster Award.
Sonic Storyteller: Augmenting Oral Storytelling with Spatial Sound Effects. UIST 2023.
ModBand: Design of a Modular Headband for Multimodal Data Collection and Inference. UIST 2023.
Design of an Emotion-Aware Painting Application With an Interactional Approach for Virtual Reality. ACII 2023.
Memento Player: Shared Multi-Perspective Playback of Volumetrically-Captured Moments in Augmented Reality. CHI 2023.
CardsVR: A Two-Person VR Experience with Passive Haptic Feedback from a Deck of Playing Cards. ISMAR 2022.
Situated VR: Toward a Congruent Hybrid Reality Without Experiential Artifacts. IEEE CG&A 2022.
SRI-EEG: State-Based Recurrent Imputation for EEG Artifact Correction. Frontiers in Neuroscience 2022.
AI-Generated Virtual Instructors Based on Liked or Admired People Can Improve Motivation and Foster Positive Emotions for Learning. FIE 2022.
AI-Generated Virtual Instructors Based on Liked or Admired People Can Improve Motivation and Foster Positive Emotions for Learning. Book Chapter 2022.
FaraPy: An Augmented Reality Feedback System for Facial Paralysis using Action Unit Intensity Estimation. UIST 2021.
SceneAR: Scene-based Micro Narratives for Sharing and Remixing in Augmented Reality. ISMAR 2021.
EntangleVR: A Visual Programming Interface for Virtual Reality Interactive Scene Generation. VRST 2021.
Exploratory Design of a Hands-free Video Game Controller for a Quadriplegic Individual. AHs 2021.
Motion Improvisation: 3D Human Motion Synthesis with a Transformer. UIST 2021.
PneuMod: A Modular Haptic Device with Localized Pressure and Thermal Feedback. VRST 2021.
Design and Evaluation of a Hands-free Video Game Controller for Individuals with Motor Impairments. Frontiers 2021.
Opportunities and Challenges for Integrating Living Microorganisms in Human-Computer Interaction. AHs 2020. Best Paper Award (top 4%).
Designing Interactions Beyond Conscious Control: A New Model for Wearable Interfaces. IMWUT 2020.
Electrical stimulation wearable device for providing haptic feedback and reducing motion sickness in VR. CHI 2019.
A flexible, portable mat that can sense and track objects, and support multi-touch and hover gestures. CHI 2018. Best Paper Award (top 1%).
An outdoor VR time travel experience that takes you to the MIT of 1916 as well as an envisioned future 100 years from now.
A collaborative interface that allows remote control of a person's walking trajectory through galvanic vestibular stimulation (GVS). VRST 2017.
Methods for managing user perception and attention based on the cognitive illusion of Inattentional Blindness. DIS 2018.
A mocap hack using Vive controllers before Vive trackers became available. IAP 2017.
A pipeline for the automatic generation of VR worlds from music. It uses a deep neural network for mood-based image generation. VRST 2017.
Creating immersive and interactive VR worlds using the real world as a template. New Context 2015.
A sensor-based bedtime alarm and a wallpaper display to promote awareness with sleep data visualization. MobiHealth 2015.
Room-scale social VR with object and full-body tracking built using Kinects before Vive became available. IEEE VR 2016.
Handwritten Tamil character recognition with a Convolutional Neural Network (CNN). NEML 2014.
A continuously morphing 4D geometrical VR world was my first VR project (DK1) back in 2013.
An outdoor, fast-paced, team-based Android game. This was my Master's Thesis at the Media Lab, completed in Aug 2013. PUC 2015.
Three physical microgames for Google Glass for taking short breaks from sitting. Preprint 2014.
Misha Sra Assistant Professor of Computer Science
Sherry Chen PhD student, CS
Yimeng Liu PhD candidate, CS
Atieh Taheri PhD candidate, ECE
Andrew Huard PhD student, ECE
Arthur Caetano PhD student, CS
Avinash Nargund PhD student, ECE
Zichen Chen PhD student, CS
Kojiro Takeyama Visiting Researcher, Toyota Research
Alejandro Aponte PhD student, MAT
Purav Bhardwaj Visiting student, NID, India
Carlos Gilberto Visiting PhD student, MechE, UNAM, Mexico City
Giuliana Barrios Dell'Olio Visiting Master's Student, CS 2020-2021. Now at McKinsey & Company
Jake Guida MS'20, CS (now at Adobe)
Mengyu Chen PhD'23, MAT (now at JP Morgan Chase)
Austin Mac BS'23, CS (now at Roblox)
Jungah Son PhD'23, MAT
© Human-AI Integration Lab