Music Cognition and Computation
Research at USC
Dr. Elaine Chew
Integrated Media Systems Center
Epstein Department of Industrial and Systems Engineering
University of Southern California
Date: Friday, May 20, 2005
Place: Engineering Sciences Building, Room 2001
Time: 2:00 pm — 3:00 pm
Abstract:
This presentation will give an overview of selected music cognition and
computation projects in my research group, based at the Integrated Media
Systems Center (IMSC) at the USC Viterbi School of Engineering, and will
focus on three that involve different kinds of real-time interactivity.
MuSA.RT -- an interactive music analysis and visualization system that
tracks and displays the trajectory of the tonal content and context of
music in real time during a live performance. MuSA.RT stands for Music
on the Spiral Array. Real-Time. The system plays a dual role: it converts
musical performances into mathematically elegant graphics, and it provides
a means of visualizing the inner workings of tonal induction and tracking
algorithms. The analysis algorithms and visual metaphor are based on the
Spiral Array, a mathematical model for tonality (Chew, 2000).
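For the curious, the geometry behind the model is compact enough to sketch.
In the Spiral Array, pitches adjacent along the line of fifths sit a quarter
turn apart on a helix, and a passage's tonal center is summarized by a
duration-weighted centroid of its sounded pitches (the "center of effect").
The Python sketch below uses illustrative values for the helix radius and
rise; the calibrated parameters and the full key and chord constructions
are given in Chew (2000).

    import numpy as np

    def pitch_position(k, r=1.0, h=0.4):
        # Position of the pitch k steps up the line of fifths (C=0, G=1, ...).
        # Each fifth is a quarter turn around the helix plus a rise of h.
        # r and h are illustrative here, not the model's calibrated weights.
        return np.array([r * np.sin(k * np.pi / 2),
                         r * np.cos(k * np.pi / 2),
                         k * h])

    def center_of_effect(fifth_indices, durations):
        # Duration-weighted centroid of the sounded pitches.
        w = np.asarray(durations, dtype=float)
        w /= w.sum()
        return w @ np.array([pitch_position(k) for k in fifth_indices])

    # C major triad: C, G, E sit at line-of-fifths indices 0, 1, 4.
    ce = center_of_effect([0, 1, 4], durations=[1.0, 1.0, 1.0])

Tonal induction then reduces to a nearest-neighbor search: the key or chord
whose representation lies closest to the running center of effect is reported
as the current tonal context.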
ESP -- the Expression Synthesis Project aims to create a driving interface
for expression synthesis. Its premise is that driving serves as an effective
metaphor for expressive music performance: not everyone can play an
instrument, but almost anyone can drive a car. By building on this familiar
interface, ESP aims to make high-level expressive decisions accessible to
non-experts.
Both MuSA.RT and ESP are implemented using Alexandre François's Modular
Flow Scheduling Framework (MFSF, 2001).
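To make the driving metaphor concrete, one can imagine the accelerator pedal
setting a target tempo while the "vehicle" responds with inertia, so tempo
changes are gradual rather than instantaneous. The sketch below is a
hypothetical first-order mapping for illustration only; the names, ranges,
and constants are assumptions, not ESP's actual control law.

    def update_tempo(current_bpm, pedal, dt,
                     min_bpm=40.0, max_bpm=200.0, time_constant=2.0):
        # pedal in [0, 1] chooses a target tempo between min_bpm and max_bpm;
        # a first-order lag makes the tempo approach it gradually, loosely
        # mimicking a car's acceleration limits. All constants are illustrative.
        target = min_bpm + pedal * (max_bpm - min_bpm)
        alpha = min(1.0, dt / time_constant)
        return current_bpm + alpha * (target - current_bpm)

    # Flooring the pedal from a slow tempo nudges the beat rate upward a
    # little each tick rather than jumping straight to 200 bpm:
    bpm = 60.0
    for _ in range(10):
        bpm = update_tempo(bpm, pedal=1.0, dt=0.1)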
DIP -- the Distributed Immersive Performance project explores one of the
most challenging goals of networked media technology: creating a seamless
environment for remote, synchronous musical collaboration. A team of IMSC
faculty and students has created a comprehensive framework for the capture,
recording, and replay of high-resolution video, audio, and MIDI streams,
and is systematically studying the tolerance thresholds for musical
interaction over distance, so as to develop an environment that provides
a more satisfying and successful cooperative experience for the musicians.
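A useful back-of-the-envelope way to reason about such thresholds is to
translate one-way network latency into the acoustic distance that would
produce the same delay, since sound travels at roughly 343 m/s in air.
This heuristic is offered only as orientation, not as DIP's measured
results.

    SPEED_OF_SOUND_M_S = 343.0  # in air at about 20 degrees C

    def equivalent_distance_m(one_way_latency_ms):
        # Physical separation whose acoustic delay equals the network latency.
        return SPEED_OF_SOUND_M_S * one_way_latency_ms / 1000.0

    # A 30 ms one-way delay is acoustically like playing ~10 m apart,
    # a separation musicians already cope with on large stages.
    print(equivalent_distance_m(30.0))  # ~10.3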
Publications can be found at http://www-rcf.usc.edu/~echew/bibliography
ELAINE CHEW is Assistant Professor of Industrial and
Systems Engineering and Research Area Director in the Integrated Media
Systems Center at the University of Southern California Viterbi School
of Engineering. Her research interests center on computer modeling of
music cognition, analysis, and performance. She was awarded a 2004 NSF
CAREER Award for her proposal "Performer-Centered Approaches to Computer-Assisted
Music Making." As a computational scientist, she received her PhD and
SM in Operations Research from the Massachusetts Institute of Technology,
and a BAS in Mathematical and Computational Sciences, and Music from Stanford
University. Her PhD studies and research on mathematical modeling of tonality
were made possible by a Josephine de Karman dissertation fellowship and
an Office of Naval Research graduate fellowship. As a musician, she received
her FTCL and LTCL diplomas in piano performance, and was presented with
the Laya and Jerome B. Wiesner Award for her contribution to the Arts
at MIT. Her performance of Ivan Tcherepnin's Fêtes (Variations on Happy
Birthday) is available as streaming audio from WGBH's Art of the States
program.
Host: Curtis Roads,
Professor of Media Arts and Technology