Students in the class were tasked with designing, building, evaluating, and refining a prototype system that involves tangible, embedded, gestural, non-visual, or otherwise novel interactions that move computing beyond familiar desktop, web, and mobile paradigms. Each project was also required to address some real-world problem. This is what students came up with:
Projects to be presented in the morning session, 10:00-12:00:
- ServiceCenter, a serving system for restaurants that allows waiters to efficiently manage orders and tasks by displaying information about their tables (Grupo Naidy)
- A device to improve security and responsibility in the laundry room (The Backend Cleaning Inspectors)
- A new musical listening experience using a jacket that vibrates with the bass (Team VARPEX)
- A “Kinect Jukebox” that lets you control music using gestures (Team X)
- NavBelt, a system for navigating around unfamiliar places more safely and conveniently (Team “Don't worry about it”)
- A Kinect-based system that watches people lift weights and gives instructional feedback to help people lift more safely and effectively (Team “Do You Even Lift?”)
- Runway, a 3D modeling application that makes 3D manipulation more intuitive by bringing virtual objects into the real world, allowing natural 3D interaction with models using gestures (Team CAKE)
- An add-on device for a blind user's cane that integrates GPS functionality via Bluetooth and gives cardinal and route-guided directions via haptic feedback (Group 17)
- A minimally intrusive system to ensure that users remember to bring important items with them when they leave their residences; the system also helps users locate lost tagged items, either in their room or in the world at large (The Elite Four)
- Oz, a system that authenticates individuals into computer systems using sequences of basic hand gestures (Group 21)
In the afternoon session, 1:00-3:00:
- A hardware platform that receives and tracks data from sensors users attach to objects around them and sends users notifications, e.g. to build and reinforce habits (Team TFCS)
- The GaitKeeper, an insole pad that can be inserted into a shoe, and an associated device affixed to the user’s body, that together gather information about the user’s gait for diagnostic purposes (Team GARP)
- The PostureParrot, a system that helps users maintain good back posture while sitting (Team Colonial)
- A bowl, dog collar, and mobile app that help busy owners take care of their dog by collecting and analyzing data about the dog’s diet and fitness, and optionally sending the owner notifications when they should feed or exercise their dog (Team Chewbacca)
- A glove that allows users to control (simulated) phone actions by sensing various hand gestures (The Lifehackers)
- An interface through which controlling web cameras can be as intuitive as turning one’s head (Team Epple)
- A smart bookshelf system that keeps track of which books are in it (Team “%eiip”)
- An interactive and fun way for middle school students to learn the fundamentals of computer science without the need for expensive software and/or hardware (Team “Name Redacted”)
- A gesture bracelet for computer shortcuts (Team “Cereal Killers”)
- AcaKinect, voice recording software that uses a Kinect for gesture-based control, offering a more efficient and intuitive music recording interface for those less experienced with the technical side of music production (Team “Deep Thought”)
We hope to see you there! The demo session is open to anyone who would like to attend. For more information, please email me at fiebrink@princeton.edu or visit the course webpage at http://www.cs.princeton.edu/courses/archive/spring13/cos436/