Interactive Digital Multimedia

IGERT Summer Projects

 

Spheres of Influence — Geospatial and News Data Integration with a Camera-Tracking Interface

 

 

Students

Mike Quinn, Elec & Comp Engineering
August Black, Media Arts & Tech
Brian Springer, Art Studio
Thomas Kuo, Elec & Comp Engineering
Jason Wither, Computer Science

 

 

Faculty Advisors


Marko Peljhan, Media Arts & Tech
George Legrady, Media Arts & Tech
B.S. Manjunath, Elec & Comp Engineering


Abstract

Spheres of Influence is an interactive system that will be installed just inside the east entrance of UCSB's Davidson Library. The system will allow users to interact with and control global information in various forms, conveyed via four wall-mounted 50-inch LCD displays. User input will be visual: four to six FireWire digital video cameras will track the user or users in front of the displays, and features such as position, velocity, color, and perhaps gesture will be extracted. The extracted information will then be transmitted to the display system for processing.
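
As a rough illustration of the per-camera processing this implies, the sketch below uses OpenCV background subtraction to estimate a single user's position, frame-to-frame velocity, and mean color, then forwards them to the display system. The UDP endpoint, the JSON message format, and the threshold values are illustrative assumptions, not the project's actual protocol.

    # Minimal sketch of per-camera tracking: background subtraction, centroid
    # position, and frame-to-frame velocity. The UDP address/port and the JSON
    # message format are assumptions for illustration only.
    import json
    import socket
    import cv2

    DISPLAY_HOST = ("127.0.0.1", 9000)   # hypothetical display-system endpoint

    def track(camera_index=0):
        cap = cv2.VideoCapture(camera_index)
        subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        prev = None  # previous centroid, used to estimate velocity

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)
            mask = cv2.medianBlur(mask, 5)                 # suppress speckle noise
            moments = cv2.moments(mask, binaryImage=True)
            if moments["m00"] > 1e3:                       # enough foreground pixels
                cx = moments["m10"] / moments["m00"]
                cy = moments["m01"] / moments["m00"]
                vx, vy = (0.0, 0.0) if prev is None else (cx - prev[0], cy - prev[1])
                prev = (cx, cy)
                # mean color of the foreground region, e.g. to distinguish users
                color = cv2.mean(frame, mask=mask)[:3]
                msg = {"pos": [cx, cy], "vel": [vx, vy], "color": list(color)}
                sock.sendto(json.dumps(msg).encode(), DISPLAY_HOST)

    if __name__ == "__main__":
        track()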

The display component of the project will combine all of our data sources into a single interactive viewing experience. As the user moves through the project space, the vision system will update the map view: the user's motion pans and zooms the viewable area, allowing the user to move the map interactively. The multi-display environment will be implemented with Chromium, which spreads a single OpenGL application across the four displays and makes tiling them straightforward. As the user moves the map, videos of news streams will be displayed near the geographic origin of each broadcast, allowing the user to physically locate the source of news in different regions.
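
One possible way the tracked position could drive the map viewport is sketched below; the coordinate ranges, zoom gain, and use of distance from the displays are assumptions made only to illustrate the pan/zoom idea, not the project's actual mapping.

    # Sketch: map a tracked user position (normalized to 0..1 across the camera
    # view) to a pan/zoom viewport over the world map. Ranges and gains are
    # illustrative assumptions.
    from dataclasses import dataclass

    MAP_W, MAP_H = 360.0, 180.0   # world map extent in degrees (lon x lat)

    @dataclass
    class Viewport:
        center_lon: float = 0.0
        center_lat: float = 0.0
        zoom: float = 1.0          # 1.0 = whole map visible

    def update_viewport(vp: Viewport, user_x: float, user_dist: float) -> Viewport:
        """user_x: horizontal position in [0, 1]; user_dist: distance from the
        displays in [0, 1] (0 = close, 1 = far). Closer users zoom in further."""
        vp.center_lon = (user_x - 0.5) * MAP_W          # pan left/right with the user
        vp.zoom = 1.0 + 4.0 * (1.0 - user_dist)         # up to 5x zoom when close
        return vp

    def visible_extent(vp: Viewport):
        """Return (lon_min, lon_max, lat_min, lat_max) for the OpenGL projection."""
        half_w = MAP_W / (2.0 * vp.zoom)
        half_h = MAP_H / (2.0 * vp.zoom)
        return (vp.center_lon - half_w, vp.center_lon + half_w,
                vp.center_lat - half_h, vp.center_lat + half_h)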

The Spheres of Influence project will also display clips of international news from SCOLA and satellite feeds, selected according to various audio and video categorizations. Potential categories include handheld versus fixed camera, day versus night imagery, periods of silence, and the number of visible faces. The videos will first be segmented into shots based on color-histogram information; each shot will then be characterized by low-level color, texture, and motion descriptors and other algorithms available in the OpenCV library. Higher-level categories can be defined by combining these low-level descriptors to carry more visceral meaning. The juxtaposition of these clips will highlight their constant presence in the media.
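
A minimal sketch of the shot-segmentation step is given below, using the color-histogram comparison and OpenCV library named above; the histogram size and the cut threshold are assumptions chosen only for illustration.

    # Sketch of shot segmentation by color-histogram change, using OpenCV.
    # The histogram resolution and the cut threshold are assumptions.
    import cv2

    def detect_shot_boundaries(video_path, threshold=0.4):
        """Return frame indices where a new shot likely begins."""
        cap = cv2.VideoCapture(video_path)
        boundaries, prev_hist, index = [], None, 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
            cv2.normalize(hist, hist, 0, 1, cv2.NORM_MINMAX)
            if prev_hist is not None:
                # correlation near 1 means similar frames; a sharp drop suggests a cut
                similarity = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL)
                if similarity < 1.0 - threshold:
                    boundaries.append(index)
            prev_hist = hist
            index += 1
        cap.release()
        return boundaries

The resulting shots would then be passed to the descriptor-based categorization described above.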

 

 

 

 


Example of Spheres of Influence after installation in a library hallway.