Ultrasound Haptics and Levitation: the Future of Human-Computer Interaction?
Wednesday 6th April 2022
Abstract: Ultrasound provides some brand-new opportunities for interaction in user interfaces. In this talk, I will describe this new modality and what it offers to HCI. By using phased arrays of small ultrasound speakers, we can create sound fields that generate haptic feedback in mid-air, without the user having to hold or touch anything. We can control the position and texture of this feedback in real time. This 'mid-air' haptic feedback enables new interaction techniques around devices. I will give examples of how it can be used for virtual controls and how novel interactions can be designed.
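The talk will cover the details, but the core idea behind focusing a speaker array can be sketched briefly: delay the phase of each emitter so that all waves arrive in step at a chosen point, creating a pressure focus the hand can feel. The function below is a minimal illustration of that phase calculation (the array geometry, 40 kHz frequency, and function names are my own assumptions, not from the talk):

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly room temperature
FREQUENCY = 40_000.0     # Hz; a common choice for ultrasound haptics (assumed)

def focus_phases(transducers, focal_point):
    """Return a phase delay (radians) per transducer so that all emitted
    waves arrive in phase at focal_point, forming a pressure focus there.
    Illustrative sketch only, not any product's actual control algorithm."""
    wavelength = SPEED_OF_SOUND / FREQUENCY  # ~8.6 mm at 40 kHz
    phases = []
    for position in transducers:
        d = math.dist(position, focal_point)  # path length in metres
        # A wave travelling d metres accumulates 2*pi*d/wavelength of phase;
        # emitting with that phase (mod 2*pi) aligns the crests at the focus.
        phases.append((2 * math.pi * d / wavelength) % (2 * math.pi))
    return phases

# Four transducers in a 1 cm square, focused 10 cm above the array's centre.
array = [(0.0, 0.0, 0.0), (0.01, 0.0, 0.0), (0.0, 0.01, 0.0), (0.01, 0.01, 0.0)]
print(focus_phases(array, (0.005, 0.005, 0.10)))
```

Moving the focal point and modulating it over time is, in outline, how position and texture of the feedback can be controlled.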
Another exciting possibility is the use of ultrasound to levitate small particles, creating 'physical' pixels in the air in front of the user. These physical pixels can be precisely positioned to form 3D shapes and objects, which can be animated dynamically. This opens up many new opportunities for displaying 3D models and data. In the talk, I will describe how to make this highly novel form of display possible, along with some of the interesting problems that arise around selecting and manipulating the levitated objects.
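In its simplest form, acoustic levitation traps a particle at the pressure nodes of a standing wave, which sit half a wavelength apart between an emitter and a reflector. The snippet below computes those node heights under that simplified model (the 40 kHz frequency, 2 cm gap, and function name are illustrative assumptions; real levitation displays use more sophisticated phased-array traps):

```python
SPEED_OF_SOUND = 343.0  # m/s in air
FREQUENCY = 40_000.0    # Hz; assumed, typical for acoustic levitation demos

def node_positions(gap):
    """Heights (metres) of the pressure nodes of a standing wave between
    an emitter at height 0 and a reflector at height `gap`.
    Simplified model: nodes are evenly spaced half a wavelength apart;
    small particles can be trapped near these nodes."""
    half_wavelength = SPEED_OF_SOUND / FREQUENCY / 2  # ~4.3 mm at 40 kHz
    n_nodes = int(gap / half_wavelength)
    return [i * half_wavelength for i in range(1, n_nodes + 1)]

# Candidate trapping heights inside a 2 cm gap, spaced ~4.3 mm apart.
print(node_positions(0.02))
```

The roughly 4 mm node spacing at 40 kHz hints at why only small, light particles can serve as physical pixels, and why moving them smoothly between trap positions is one of the interesting control problems.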
Stephen Brewster is a Professor of Human-Computer Interaction in the School of Computing Science at the University of Glasgow. He received his PhD in auditory interface design from the University of York. At Glasgow, he leads the Multimodal Interaction Group, which is very active and has a strong international reputation in HCI (http://mig.dcs.gla.ac.uk). His research focuses on multimodal HCI: using multiple sensory modalities and control mechanisms (particularly audio, haptics and gesture) to create a rich, natural interaction between human and computer. His work has a strong experimental focus, applying perceptual research to practical situations. A long-term focus has been on mobile interaction and how we can design better user interfaces for users who are on the move. Other areas of interest include VR/AR, wearable devices and in-car interaction. He pioneered the study of non-speech audio and haptic interaction for mobile devices with work starting in the 1990s.