Since at least the 1960s, the computer has been capable of
digitally synthesizing any perceivable sound. However, in a live
context, human-computer interfaces limit the bandwidth and quality of
the interaction between the performer and the sound synthesizer. This
talk focuses on physical interfaces for music, which operate according
to the laws of physics to seamlessly and intimately connect musicians
with sound synthesizers. In particular, we study how to design
interfaces that assist a performer in making musical gestures. For
example, the haptic drum gives the performer's drumstick an extra push
every time the performer strikes the drum. This device
allows the performer to make gestures that would otherwise be very
difficult or impossible, such as arbitrarily long one-handed drum
rolls and arbitrarily complex rudiments. Next, we explain how to
assist a performer in accurately selecting pitches from a continuous
range on a Theremin-like haptic interface. We augment the interface
with detents that allow the performer to feel the locations of
equal-tempered pitches while still permitting arbitrary pitch
inflections such as glissandi, falls, and scoops. With the help of a
subject test, we demonstrate that traditional detents as well as novel
force-sensitive detents can provide accuracy improvements over simpler
types of haptic feedback, such as spring feedback. In closing, we
relate the work to research in robotics and mixed reality.
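As a rough sketch of the idea behind the haptic drum (the device itself is electromechanical; the control loop, sensor, actuator, threshold, and pulse parameters below are assumptions made for illustration, not details from the talk), a controller can watch for a stick strike and briefly drive an actuator under the drumhead to push the stick back up, which is what sustains an arbitrarily long one-handed roll:

```python
import time

# Hypothetical parameters -- chosen for illustration, not taken from the talk.
STRIKE_THRESHOLD = 0.5   # normalized contact-sensor reading that counts as a hit
PUSH_DURATION = 0.010    # seconds the actuator pushes back on the stick
PUSH_LEVEL = 1.0         # normalized actuator drive level
LOOP_PERIOD = 0.001      # 1 kHz control loop


def haptic_drum_loop(read_contact_sensor, drive_actuator):
    """Give the drumstick an extra push each time it strikes the drumhead.

    `read_contact_sensor` and `drive_actuator` are stand-ins for the
    device's sensor and actuator interfaces, which the abstract does not
    specify.
    """
    was_in_contact = False
    while True:
        in_contact = read_contact_sensor() > STRIKE_THRESHOLD
        if in_contact and not was_in_contact:
            # Rising edge = a new strike: push the stick back upward so the
            # performer can keep the roll going with one hand.
            drive_actuator(PUSH_LEVEL)
            time.sleep(PUSH_DURATION)
            drive_actuator(0.0)
        was_in_contact = in_contact
        time.sleep(LOOP_PERIOD)
```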
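Similarly, a rough model of the pitch detents (the stiffness values, detent width, and release-force threshold are hypothetical, and the controller presented in the talk may differ) pulls the performer's hand toward the nearest equal-tempered pitch, lets a force-sensitive detent yield when the performer presses hard enough, and includes the simple spring baseline for comparison:

```python
# Hypothetical parameters -- not taken from the talk.
DETENT_STIFFNESS = 2.0    # force per semitone of displacement inside a detent
DETENT_HALF_WIDTH = 0.25  # semitones over which a detent exerts force
SPRING_STIFFNESS = 0.5    # force per semitone for the simple spring baseline
RELEASE_FORCE = 1.5       # finger force that releases a force-sensitive detent


def detent_force(pitch):
    """Pull the hand toward the nearest equal-tempered pitch.

    `pitch` is a continuous position measured in semitones; the force is
    zero outside a small neighborhood of each detent, so glissandi, falls,
    and scoops between detents remain unimpeded.
    """
    nearest = round(pitch)
    error = pitch - nearest
    if abs(error) <= DETENT_HALF_WIDTH:
        return -DETENT_STIFFNESS * error
    return 0.0


def force_sensitive_detent_force(pitch, finger_force):
    """Like detent_force, but the detent yields when the performer presses
    harder than RELEASE_FORCE, allowing a deliberate slide off the pitch."""
    if finger_force > RELEASE_FORCE:
        return 0.0
    return detent_force(pitch)


def spring_force(pitch, target_pitch):
    """Baseline: a single spring pulling toward one pre-chosen pitch."""
    return -SPRING_STIFFNESS * (pitch - target_pitch)
```

The contrast this sketch is meant to show: spring feedback always pulls toward one pre-chosen pitch, whereas the detent force vanishes between pitches, so the performer keeps full continuous control away from the equal-tempered grid.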
Date and Time
Wednesday, February 24, 2010, 4:30pm - 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Host
Kenneth Steiglitz