Roy Haex, Lilia Perez-Remero, Yu-fang Teh & Ferdy van Varik
Jacques Terken (TU/e)
A trend observable in latest-generation in-car digital systems (navigation and infotainment devices) is their integration with smartphones and smartphone applications. This integration multiplies the in-car systems’ functionalities, but also the potential distractions for the driver.
In this project we analyze possible approaches to designing multimodal interfaces for such applications. An experimental study compared two multimodal interfaces: one that conveyed response options only through the visual modality, and one that also included the auditory modality.
User performance was measured while participants interacted with these interfaces during a concurrent visual search task. The mainly auditory, speech-based mode of interaction was expected to intrude less on participants’ performance. However, no significant difference was found between the two conditions. Some additional observations were made that may be useful for future research and design guidelines.
© Roy Haex, 1987 - 2016