Accepted Videos
Electromagnetic Field Detector Bracelet
Cati Vaucelle, Hiroshi Ishii and Joe Paradiso
Massachusetts Institute of Technology, MIT Media Lab
In this paper we present the design of a cost-effective wearable sensor that detects and indicates the strength and other characteristics of the electric field emanating from a laptop display. Our bracelet provides immediate awareness of the electric fields radiated by frequently used objects, and our technology thus supports awareness of ambient background emanations beyond human perception. We discuss how detecting such radiation might help to “fingerprint” devices and aid applications that require determining indoor location.
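The abstract describes sensing a field and indicating its strength on a bracelet, but gives no implementation detail. As a minimal, purely illustrative sketch of such a mapping (the units, thresholds, and function names below are assumptions for illustration, not the authors' design):

```python
# Hypothetical sketch: map a sensed electric-field reading to a discrete
# indication level, as a wearable bracelet might drive its display.
# Units (V/m) and thresholds are illustrative assumptions, not from the paper.

def indication_level(field_strength_vm, thresholds=(1.0, 5.0, 20.0)):
    """Return 0-3: how strongly the bracelet should indicate the field."""
    level = 0
    for cut in thresholds:
        if field_strength_vm >= cut:
            level += 1
    return level

def smooth(readings, alpha=0.3):
    """Exponential moving average to steady a noisy sensor stream."""
    avg = readings[0]
    out = []
    for r in readings:
        avg = alpha * r + (1 - alpha) * avg
        out.append(avg)
    return out
```

A "fingerprinting" application, as hinted at in the abstract, would compare a richer feature vector (e.g. spectral content) rather than a single smoothed level.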
Expressive Gesture Controller for an Individual with Quadriplegia
Adam Boulanger
MIT Media Laboratory
We present a strategy for mapping and parameterizing expressive gesture for music control with an infrared head-mounted controller. The system allows performers to rely on movements they find natural, despite having no prior experience with expressive music performance. It additionally empowers users to identify and contextualize their control parameterization within the key events of a single composition. We discuss this highly specialized design strategy as it relates to our new work on adaptive systems that tailor themselves to users' movements in ubiquitous computing contexts. The technology is piloted with a user who has severe motor impairment resulting from quadriplegia. Implications for the field of gesture control, and for pervasive systems that integrate expressive input, are discussed.
Vibro-tactile Space Awareness
Alois Ferscha, Bernadette Emsenhuber, Andreas Riener, Clemens Holzmann,
Manfred Hechinger and Dominik Hochreiter
Institute for Pervasive Computing, Johannes Kepler University Linz, A-4040 Linz, Austria
Continuous visual and auditory stimulation has made human perception prone to "overseeing" and "overhearing". Moreover, in some situations visual and auditory stimuli lead to misleading or false perceptions. This is particularly the case with the perception of space, e.g. when estimating distances or determining direction. This paper addresses how humans perceive space, whether room space, work space, construction space, or outdoor space. We propose to amplify (or readjust) the perception of space beyond the primary senses of (i) vision and (ii) audition with (iii) taction. To enhance human space awareness, we opt for a vibro-tactile notification system for situations in which eyes, ears and hands are otherwise occupied. A body-worn, belt-like vibration system has been developed that delivers tactile notifications about distance and orientation to the user in a subtle, unobtrusive, yet attention-demanding style. A series of user tests has been conducted investigating the perception of distance to physical objects, such as walls or obstacles, in the user's vicinity. The results encourage a whole new class of space-awareness solutions. The video highlights one category of applications: the safety of workers in construction and manufacturing.
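The belt conveys distance and orientation through vibration. As a rough illustration of how such a mapping could work (the actuator count, sensing range, and function names are assumptions for this sketch, not the authors' implementation):

```python
# Illustrative sketch: map an obstacle's bearing and distance to one of N
# belt-mounted vibrators and a vibration intensity. All parameters here
# (8 actuators, 3 m range) are assumed values, not from the paper.

def select_actuator(bearing_deg, num_actuators=8):
    """Pick the vibrator facing the obstacle; 0 = front, increasing clockwise."""
    sector = 360.0 / num_actuators
    return int(((bearing_deg % 360) + sector / 2) // sector) % num_actuators

def vibration_intensity(distance_m, max_range_m=3.0):
    """Intensity in [0, 1], growing stronger as the obstacle gets closer."""
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m
```

For example, an obstacle 1.5 m away at a bearing of 90° would drive the actuator on the wearer's right at half intensity, letting the wearer localize it without looking.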
Ambient Assisted Storytelling
Davide Carboni, Alessandro Soro
CRS4 (Center for Advanced Studies, Research and Development in Sardinia) - Italy
In this work we describe the Ambient Assisted Storytelling project, aimed at assisting people in gathering multimedia data and sharing those data and experiences in the form of a tale. Computing technology offers a huge amount of resources in terms of memory, data collection and computing power that could be used to build a scenography for new forms of art, if only the effort required to drive these resources were reduced, leaving room for the creative process of telling a story. We show how data can be gathered unobtrusively by means of wearable and ambient sensors, processed automatically, and then retrieved during storytelling through speech- and gesture-activated commands that merge naturally into the storyteller's performance.
SMeet: Enabling Practical and Interactive Collaboration in Smart Meeting Space
Sangwoo Han (GIST), Sujin Ko (GIST), Namgon Kim (GIST), Changhyeok Bae (GIST), Vinay Ramachandra (GIST), Jun Park (Hongik University), Hyuk Lim (GIST), and JongWon Kim (GIST)
Gwangju Institute of Science and Technology (GIST), Hongik University
To support advanced collaboration among geographically distributed knowledge workers, there has been extensive research within the scope of ubiquitous computing environments. In particular, several prototypes have been developed to cope with known problems of traditional room-based collaboration environments, such as limited display resolution, cumbersome sharing of visuals and documents, and difficult operation of collaboration tools. To address these restrictions, we have developed a prototype collaboration environment, named SMeet (Smart Meeting Space), for practical and interactive collaboration with ultra-high-resolution interactive networked displays. SMeet enables network-based interactive sharing of HD-quality media and data, human-display interaction using multi-tracking pointer-style devices, and intuitive GUI-based configuration support for easy node operation. Finally, the feasibility of the SMeet prototype is verified by showcasing an interactive medical-domain collaboration session between two remote sites.
Locally organized by Sungkyunkwan Univ. and UCN
The Proceedings will be published by ACM