As AI decouples intelligence from consciousness, as XR technologies transform perception, and as matter becomes machine-manipulable, three key questions about the future arise: 1) what will newer AI-infused interfaces look like, where digital bits occupy space and algorithms have faces; 2) what can we learn about human behavior through human-AI interaction, and how can AI models learn from human behavior; and 3) how will a shift from data-centered to human-centered AI systems augment our capabilities and enrich our lives?
To answer these questions, our current work focuses on Human-Computer Interaction (HCI), eXtended Reality (XR), and Artificial Intelligence (AI). In particular, we investigate the design, engineering, and study of interfaces (software and hardware), interactions, tools, and systems for augmenting our physical capabilities. We are interested in skill acquisition and recovery (e.g., fitness training and rehabilitation therapy), physical task guidance (e.g., equipment repair or cooking), and creative collaboration (e.g., music and dance).
We are inspired by the writings of Douglas Engelbart and J. C. R. Licklider. In both Augmenting Human Intellect and Man-Computer Symbiosis we see a vision of a world where human capabilities are greatly enhanced through close interaction with machines, which we call Human-AI Integration.
3D Motion Dataset: We are looking for an undergraduate student to help build a dataset. Must have prior programming experience (any language), basic Unity skills/experience, or a strong willingness to learn Unity quickly.
3D Printing: We are looking for an undergraduate student who is interested in 3D printing, fabrication, and electronics prototyping (e.g., Arduino). No prior 3D printing experience is needed.
Web Interface: We are looking for an undergraduate or BS/MS student to help build a web frontend for a machine learning backend. Must have HTML/CSS/JavaScript experience.
AR App: We are looking for an undergraduate student to recreate an AR application using Lens Studio. JavaScript programming experience required.
Facial Dataset: We are looking for an undergraduate, BS/MS, or Master's student to help build a facial image dataset using hardware we have built. Prior programming experience, the ability to learn new hardware, comfort working with people, organization and attention to detail, and high motivation are required.
If you are a CS Master's student looking for a thesis project and are interested in augmented reality and computer vision, please email us.
An outdoor VR time travel experience that takes you to the MIT of 1916 as well as an envisioned future 100 years from now.
A continuously morphing 4D geometrical VR world was my first VR project (DK1) back in 2013.
Misha Sra Assistant Professor of Computer Science
Sherry Chen PhD student, CS
Yimeng Liu PhD student, CS
Atieh Taheri PhD candidate, ECE
Andrew Huard PhD student, ECE
Jungah Son PhD student, MAT
Arthur Caetano PhD student, CS
Avinash Nargund PhD student, ECE
Zichen Chen PhD student, CS
Kojiro Takeyama Visiting Researcher, Toyota Research
Purav Bhardwaj Visiting student, NID, India
Carlos Gilberto Visiting PhD student, MechE, UNAM, Mexico City
Giuliana Barrios Dell'Olio Visiting Master's Student, CS 2020-2021 (now at McKinsey & Company)
Jake Guida Master's Student, CS 2020 (now at Adobe)
Mengyu Chen PhD'23, MAT
We are always looking for exceptional students at the intersection of Human-Computer Interaction, Computer Science, Electrical and Computer Engineering/Electrical Engineering, Cognitive Science, and Mechanical Engineering.
If you are interested in applying to the Human-AI Integration Lab:
© Human-AI Integration Lab