11:30 AM - 1:00 PM
Robert J.K. Jacob
This interactivity demonstrates Tern, a tangible programming language for middle school and late elementary school students. Tern consists of a collection of wooden blocks shaped like jigsaw puzzle pieces. Children connect these blocks to form physical computer programs, which may include action commands, loops, branches, and subroutines. Tern is intended to let teachers conduct engaging programming activities in their classrooms, even when only one or two computers are available. In designing Tern, we focused on creating an inexpensive, durable, and practical system for classroom use.
This demonstration describes a novel audio and vibrotactile interface in which information is conveyed by exciting a physical model. It sets out a foundation for building compelling non-visual, handheld multimodal interfaces that incorporate complex inference.
A tilt-controlled photo-browsing method for small mobile devices is presented. The method is based on a simple physical model whose characteristics are shaped to enhance usability: we show how the dynamics of the model can be tuned so that the handling qualities of the mobile device fit the browsing task. The implementation uses continuous input from an accelerometer and a multimodal (visual, audio, and vibrotactile) display coupled to the states of the model. We implemented the proposed algorithm on a Samsung MITs PDA with a tri-axis accelerometer and a vibrotactile motor. In an experiment, seven novice users browsed a collection of 100 photos, and we compare the tilt-based interaction method with a button-based browser and an iPod wheel. We discuss usability performance and contrast it with the users' subjective experience. Despite its commercial popularity, the iPod wheel performed significantly worse than button pushing or tilt interaction.
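To illustrate what a "simple physical model shaped to fit the browsing task" might look like, here is a minimal, hypothetical sketch (not the authors' implementation): the browsing position is a mass driven by tilt, with a per-photo spring detent and viscous damping as the tunable handling qualities. All constants and names here are illustrative assumptions.

```python
class TiltBrowser:
    """Hypothetical sketch of a tilt-driven photo browser modeled as a
    damped mass with a spring detent at each photo index. Gain, stiffness,
    and damping are the assumed 'handling quality' parameters."""

    def __init__(self, n_photos, k=4.0, damping=3.0, gain=20.0):
        self.n = n_photos
        self.k = k              # detent stiffness toward the nearest photo
        self.damping = damping  # viscous damping on the mass
        self.gain = gain        # maps tilt angle (radians) to driving force
        self.pos = 0.0          # continuous position in "photo units"
        self.vel = 0.0

    def step(self, tilt_rad, dt=0.02):
        # A spring toward the nearest integer index gives a detent per photo;
        # a large enough tilt overcomes the detent and scrolls continuously.
        target = round(self.pos)
        force = (self.gain * tilt_rad
                 - self.k * (self.pos - target)
                 - self.damping * self.vel)
        self.vel += force * dt
        self.pos += self.vel * dt
        self.pos = max(0.0, min(self.n - 1, self.pos))  # clamp to collection
        return self.current()

    def current(self):
        return int(round(self.pos))

browser = TiltBrowser(n_photos=100)
for _ in range(200):            # hold a steady 0.3 rad tilt for ~4 seconds
    idx = browser.step(0.3)
```

Shaping the dynamics here means choosing `k`, `damping`, and `gain` so that small tilts click between neighboring photos while sustained tilts glide through the collection, which is the kind of tuning the abstract describes.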
Computer mice do not work in mid air. The reason is that a mouse is really only half an input device; the other half is the surface the mouse is operated on, such as a mouse pad. In this demo, we show how to combine a mouse and a mouse pad into soap, a device that can be operated in mid air with a single hand. We have used soap to control video games, interact with wall displays and Windows Media Center, and give slide presentations.
I/O Brush is our ongoing effort to empower people to create new expressions and meanings by painting with attributes of everyday objects and movements in their physical world. Using examples from our case studies with kindergarteners and artists, we discuss I/O Brush's most distinguishing features, its dynamic ink and history functions, and how they enable people to invent new expressions and meaning making with objects in their physical environment.
The GUIDe (Gaze-enhanced User Interface Design) project in the HCI Group at Stanford University explores how gaze information can be used effectively as an augmented input in addition to the keyboard and mouse. We present three practical applications of gaze as an augmented input: pointing and selection, application switching, and scrolling. Our gaze-based interaction techniques do not overload the visual channel and present a natural, universally accessible, and general-purpose use of gaze information to facilitate interaction with everyday computing devices.
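One common way gaze can augment rather than replace the mouse, sketched here purely as an illustration (this is not the GUIDe implementation, and the function name and threshold are assumptions): on an activation event, the cursor warps to the gaze point only when the two are far apart, leaving fine manual positioning to the mouse.

```python
import math

def warp_cursor(cursor, gaze, threshold=100.0):
    """Illustrative gaze-augmented pointing: jump the cursor to the gaze
    point only if it is more than `threshold` pixels away, so gaze handles
    the coarse move and the mouse keeps fine control."""
    dx, dy = gaze[0] - cursor[0], gaze[1] - cursor[1]
    if math.hypot(dx, dy) > threshold:
        return gaze
    return cursor

# Looking across the screen warps the cursor to the gaze point...
print(warp_cursor((10, 10), (800, 600)))   # -> (800, 600)
# ...but a nearby gaze point leaves fine mouse control untouched.
print(warp_cursor((10, 10), (60, 40)))     # -> (10, 10)
```

The design point this sketch captures is that gaze supplies coarse target acquisition without taking over the pointer, so the visual channel is not overloaded with explicit control.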