This presentation will highlight research by the Music & Entertainment Technology Laboratory (MET-lab) at Drexel University exploring music, emotion, and creative expression under the common vision of making music more interactive and accessible for both musicians and non-musicians. These projects encompass emotion recognition, including a system for dynamic musical mood prediction and a collaborative web game for collecting emotional annotations, as well as interfaces for expressive performance, including a novel electromagnetic approach to shaping the sound of the acoustic piano and a user-friendly controller for remixing music in terms of emotion. These and other MET-lab efforts are closely coupled with educational initiatives promoting learning in Science, Technology, Engineering, and Mathematics (STEM), many of which have been deployed in K-12 outreach programs in the Philadelphia region.
Youngmoo Kim is an Assistant Professor of Electrical and Computer Engineering at Drexel University. His research group, the Music & Entertainment Technology Laboratory (MET-lab), focuses on the machine understanding of audio, particularly for music information retrieval. Other areas of active research at MET-lab include analysis-synthesis of sound, human-machine interfaces and robotics for expressive interaction, and K-12 outreach for engineering, science, and mathematics education. Youngmoo received his Ph.D. from the MIT Media Lab in 2003 and also holds Master's degrees in Electrical Engineering and Music (Vocal Performance Practice) from Stanford University. He served as a member of the MPEG standards committee, contributing to the MPEG-4 and MPEG-7 audio standards, and he co-chaired the 2008 International Conference on Music Information Retrieval (hosted at Drexel). His research is supported by the National Science Foundation and the NAMM Foundation, including an NSF CAREER award in 2007.