In this paper we examine some of the design issues that affect the success of multimodal displays combining acoustic and haptic modalities. First, issues affecting successful sonification design are explored, with suggestions about how the language of electroacoustic music can assist. Next, haptic interaction is introduced in the light of this discussion, focusing particularly on the roles of gesture and mimesis. Finally, observations are made regarding issues that arise when the haptic and acoustic modalities are combined in the interface. The paper examines domains in which auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discusses lessons that may be drawn from these domains and applied to multimodal human-computer interaction. The argument is made that combined haptic-auditory interaction schemes can be thought of as musical instruments, and some of the possible ramifications of this view are raised.
Title of host publication: Haptic and Audio Interaction Design: First International Workshop, HAID 2006, Glasgow, UK, August 31 - September 1, 2006. Proceedings
Published: 2006
Series: Lecture Notes in Computer Science