Multimodal Interaction for Mobile Devices

When?
Wednesday 31 October 2007, 14:00 to 15:00
Where?
39BB02
Open to:
Staff, Students

Professor Stephen Brewster, University of Glasgow

Abstract:

In this talk I will present some of the work going on at Glasgow on developing interfaces and interaction techniques for mobile computing devices (such as smart phones and PDAs) that support their use on the move. When people are on the move, their visual attention must be on the environment to avoid obstacles, navigate, and so on, so visual displays are hard to use in these circumstances. Using pens and styli is also difficult on the move, as objects on screen are small and can be hard to hit when both the user and the device are moving.

I will talk about some of the work we have been doing on novel interaction techniques for mobile and wearable devices. This has focused on 3D, non-speech sound for the output of information. 3D sound allows a larger display space than either stereo or the point-source sounds currently used, and makes it much easier for sound sources to be discriminated. I will present the results of work we have been doing to understand how to design sounds to work in such an interface. For input we are using gestures based on head, hand or arm movements. These have the advantage that they can easily be done 'eyes-free' and on the move. I will show examples of some applications based around these technologies and discuss the issues involved in evaluating novel, mobile user interfaces.

Notes:

Stephen Brewster is professor of human-computer interaction at the University of Glasgow and an EPSRC Advanced Research Fellow. He gained his PhD in Computer Science from the University of York in 1994. Following that he worked as a Marie Curie Fellow in Finland and Norway, moving to Glasgow in 1995.

His work is in the area of multimodal human-computer interaction (HCI): using multiple sensory modalities to make interaction more effective. Brewster's work has a strong experimental focus and applies results from low-level psychophysical research to practical situations. He has shown that using multimodality can improve usability in a wide range of interactions. His particular areas of focus are haptic (touch-based) interfaces using force feedback, tactile and gestural interaction, non-speech auditory interfaces using earcons (the auditory equivalent of icons), and smell-based interfaces. His areas of application are mobile and wearable devices, and interfaces for blind and partially sighted people and older users.

