CHI 2007 Advance Program: Session Details


Adaptation & Augmentation (Interactivity)

Wednesday
11:30 AM - 1:00 PM

 

Interactive Exploration of City Maps with Auditory Torches

Wilko Heuten
Niels Henze
Susanne Boll

City maps are an important means of getting an impression of the structure of a city. They are visual abstractions of urban areas, showing geographic entities, their locations, and their spatial relations. However, this information is not sufficiently accessible to blind and visually impaired people today. To provide nonvisual access to map information, we developed an interactive auditory city map that uses 3D nonspeech sound to convey the position, shape, and type of geographic objects. For the interactive exploration of the auditory map, we designed a virtual walk-through, which allows the user to gain an overview of an area. To let users focus on certain regions of the map, we equip them with an auditory torch, with which they can change the number of displayed objects in a self-directed way. To further aid in forming a global idea of the displayed area, we additionally introduce a bird’s eye view of the auditory map. Our evaluation shows that these approaches enable users to gain an understanding of the explored environment.

 

BluetunA: let your neighbor know what music you like

Stephan Baumann
Arianna Bassoli
Björn Jung
Martin Wisniowski

BluetunA is an application running on Bluetooth-enabled mobile phones that allows users to share information about their favourite music. With BluetunA, people can select a list of favourite artists or songs and see who else in proximity shares their taste in music, or they can search for who nearby has selected specific artists and check out what other music preferences those people have. Moreover, BluetunA users can exchange messages with each other over Bluetooth, connect to the Internet to download their profile, and obtain music recommendations from the Last.fm website. To enrich this experience, people sitting in cafés can interact with each other through their mobile phones by accessing BluetunA hotspots and a wider range of music-sharing options.

 

Dreaming of Adaptive Interface Agents

Bill Tomlinson
Eric Baumer
Man Lok Yau
Paul Mac Alpine
Lorenzo Canales
Andrew Correa
Bryant Hornick
Anju Sharma

This interactive project uses the metaphor of human sleep and dreaming to present a novel paradigm that helps address problems in adaptive user interface design. Two significant problems in adaptive interfaces are interfaces that adapt when the user does not want them to, and interfaces where it is hard to understand how the interface changed during adaptation. In the project described here, the system adapts only when the user allows it to sleep long enough to have a dream. In addition, the dream itself is a visualization of the transformation of the interface, so that a person may see what changes have occurred. This project presents an interim stage of the system, in which an autonomous agent collects knowledge about its environment, falls asleep, dreams, and reconfigures its internal representation of the world while it dreams. People may alter the agent’s environment, prevent it from sleeping by making noise into a microphone, and observe the dream process that ensues when the agent is allowed to fall asleep. By drawing on the universal human experience of sleep and dreaming, this project seeks to make adaptive interfaces more effective and comprehensible.

 

imPulse

Gilad Lotan
Christian Croft

imPulse is a modular design object that senses pulse and allows users to wirelessly transmit their heartbeat rhythms to companion imPulse units. By synchronizing light and vibrations with users' personal heartbeats, these devices create intimacy across distance.

 

The Mixed Reality Book: A New Multimedia Reading Experience

Raphael Grasset
Mark Billinghurst
Andreas Dünser
Hartmut Seichter

We introduce a new type of digitally enhanced book that seamlessly and symbiotically merges different types of media. By keeping the traditional book (and its affordances) while enhancing it visually and aurally, we hope to provide a highly efficient combination of the physical and digital worlds. Our solution is based on recent developments in computer vision tracking, advanced GPU graphics, and spatial sound rendering. The demonstration will also show the collaborative possibilities of the system by allowing other users to be part of the story.