Image inpainting fills masked areas of an image with synthesized content that is indistinguishable from its surroundings. We present a video inpainting pipeline that enables users to “erase” physical objects in their environment using a mobile device. The pipeline combines an augmented reality application with an on-device conditional adversarial model that generates the inpainted textures. Users can interactively remove clutter from their physical space in real time. Using the Google ARCore SDK, the pipeline preserves frame-to-frame coherence even under camera movement.
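The abstract describes compositing generated textures into the masked region of each camera frame. As a minimal illustration of that final step (not the authors' implementation; the function name, array conventions, and blending scheme are all assumptions), the generator's output can be alpha-blended into the frame wherever the erase mask is set:

```python
import numpy as np

def composite_inpainted(frame, generated, mask):
    """Blend generator output into the masked region of a camera frame.

    frame, generated: H x W x 3 float arrays in [0, 1]
    mask: H x W float array, 1.0 where the object was "erased"
    """
    m = mask[..., None]                      # broadcast mask over color channels
    return frame * (1.0 - m) + generated * m # keep frame outside mask, fill inside

# Toy example: a black frame, an all-white generator output,
# and a 2x2 erase mask in the center.
frame = np.zeros((4, 4, 3))
generated = np.ones((4, 4, 3))
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0
out = composite_inpainted(frame, generated, mask)
```

In a live AR pipeline this blend would run per frame, with the mask tracked across frames (e.g. via ARCore pose estimates) so the filled region stays anchored to the erased object.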
Augmented Reality World Editor.
augmented reality, video inpainting, computer vision, Android
© Perceptual Engineering Lab