Levitate

The Levitate project is investigating a novel type of display where interactive content is composed of levitating objects. We use ultrasound to levitate and move “particles” in mid-air. These acoustically levitated particles are the basic display elements in a levitating object display.

Levitating object displays have some interesting properties:

  • Content is composed of physical objects, actuated by invisible sound waves with no mechanical constraints;
  • The display volume is open, so users can see through the content and view and interact with it from many angles;
  • Users can reach inside the display volume, using the space around and between the particles to manipulate content;
  • Non-interactive objects and materials can be combined with levitation, enabling new interactive experiences;
  • Mid-air haptics and directional audio can be presented using similar acoustic principles and hardware, for multimodal content in a single display.

Levitating object displays enable new interaction techniques and applications, allowing users to interact with content in new ways. One of the aims of the Levitate project is to explore new applications and develop novel techniques for interacting with levitating objects. Whilst we focus on acoustic levitation, these techniques are also relevant to other types of mid-air display (e.g., using magnetic levitation, drones, AR headsets).

What does levitation look like?

Eight polystyrene beads levitating inside an acoustic levitator, arranged with one bead at each corner of a cube. Standing sound waves hold the objects in mid-air, and the objects can be repositioned by manipulating the sound field.
Two small objects held in mid-air by ultrasound from arrays of small transducers within the device. The objects are trapped in the low-pressure regions of standing sound waves.
A levitating bead above a smartphone screen, held in mid-air by two small ultrasound arrays.
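The captions above mention that objects sit at the low-pressure nodes of a standing wave. As a rough illustration of the underlying physics, here is a minimal sketch that estimates where those traps lie for a levitator built from 40 kHz transducers; the frequency and the speed of sound are assumptions for illustration, not details of the Levitate hardware.

```python
import numpy as np

# Illustrative assumptions: 40 kHz transducers (a common choice for
# acoustic levitation) and the speed of sound in air at ~20 degrees C.
SPEED_OF_SOUND = 343.0   # m/s
FREQUENCY = 40_000.0     # Hz

wavelength = SPEED_OF_SOUND / FREQUENCY   # ~8.6 mm
node_spacing = wavelength / 2             # nodes repeat every half wavelength (~4.3 mm)

def trap_positions(gap_m: float) -> np.ndarray:
    """Approximate trap heights (in metres, measured from the bottom
    array) for a levitator whose opposed arrays are gap_m apart.
    A sketch only: real trap positions also depend on gravity and
    the exact shape of the sound field."""
    n_traps = int(gap_m / node_spacing)
    return node_spacing * np.arange(1, n_traps + 1)

print(trap_positions(0.10) * 1000)   # trap heights in mm for a 10 cm gap
```

The half-wavelength spacing also hints at why the levitated beads are small: a particle must fit well within a node, so bead diameters are a fraction of the roughly 8.6 mm wavelength.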

Levitating object displays

These are a novel type of display, where content is composed of levitating objects. With the current state of the art, up to a dozen objects can be levitated and moved independently. Advances in ultrasound field synthesis and improved hardware will enable these displays to support many more display elements, and advancing the state of the art in acoustics is one of the project's aims.
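For a concrete sense of what ultrasound field synthesis involves, the sketch below shows its most basic building block: choosing a phase delay for each transducer in a flat array so that all emissions arrive in phase at a target point, creating an acoustic focus there. The 16×16 array geometry, 10 mm pitch, and 40 kHz frequency are illustrative assumptions rather than the project's hardware; actual levitation traps are typically formed by adding a further phase signature (such as a twin-trap signature) on top of these focusing phases.

```python
import numpy as np

TWO_PI = 2 * np.pi
WAVELENGTH = 343.0 / 40_000.0   # ~8.6 mm at 40 kHz in air (assumed)
K = TWO_PI / WAVELENGTH         # wavenumber

def focus_phases(elements: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Phase (radians) for each array element so that its emission
    arrives in phase with all the others at `target`, focusing the
    field there. `elements` is an (N, 3) array of positions in metres."""
    distances = np.linalg.norm(elements - target, axis=1)
    return (-K * distances) % TWO_PI   # cancel the propagation phase k*d

# A hypothetical 16x16 flat array with 10 mm element pitch.
ix, iy = np.meshgrid(np.arange(16), np.arange(16))
elements = np.column_stack([ix.ravel() * 0.01,
                            iy.ravel() * 0.01,
                            np.zeros(ix.size)])

# Focus 10 cm above the centre of the array.
phases = focus_phases(elements, np.array([0.075, 0.075, 0.10]))
```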

Levitating objects could represent data through their positions in mid-air, creating physical 3D representations of data (i.e., physical visualisations). Acoustic levitation is dynamic, so object movement can be used in meaningful ways to create dynamic data displays; for example, levitating particles could show simulations of astronomical and physical phenomena.
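As a toy example of this position-as-data idea, the sketch below maps a short data series to bead coordinates inside a levitator's working volume, encoding each value as a bead's height. The working-volume bounds and the bar-chart-style layout are hypothetical, chosen only to illustrate the mapping.

```python
import numpy as np

def data_to_positions(values, volume_min, volume_max):
    """Map a 1-D data series to (N, 3) particle positions: beads are
    spaced evenly along x at mid-depth, with each value encoded as
    height (z). A sketch of one possible physical visualisation layout."""
    values = np.asarray(values, dtype=float)
    lo, hi = values.min(), values.max()
    norm = (values - lo) / (hi - lo) if hi > lo else np.zeros_like(values)
    x = np.linspace(volume_min[0], volume_max[0], len(values))
    y = np.full(len(values), (volume_min[1] + volume_max[1]) / 2)
    z = volume_min[2] + norm * (volume_max[2] - volume_min[2])
    return np.column_stack([x, y, z])

# Hypothetical 10 x 10 x 8 cm working volume.
positions = data_to_positions([3, 7, 5, 9, 2],
                              volume_min=(0.0, 0.0, 0.02),
                              volume_max=(0.10, 0.10, 0.10))
```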

As the Levitate project progresses, we’ll be creating new interactive demonstrators to showcase some novel applications enabled by this technology. See Levitating Particle Displays for a summary of our Pervasive Displays 2018 paper outlining the display capabilities enabled by levitation.

Levitation Around Physical Objects

My work has been exploring alternative ways of producing content using levitating particles. Instead of creating composite objects from several particles (like the cube shown above), we can also levitate particles in the space around other physical objects.

Two beads levitating above a physical model of a mountain, held by a single ultrasound array beneath it. The mountain is a tangible model made from an acoustically transparent material, so sound waves pass through it and levitation can be used around other physical objects.

This display concept offers new possibilities for content creation. Interactivity can be added to otherwise non-interactive objects, because the levitating particles can be actuated without having to modify or instrument the original object. There are no mechanical constraints, so the levitating particles can be used to create dynamic visual content in the space surrounding the objects. Despite their simple appearance, the particles can be used in expressive ways, because their context adds meaning to their position and behaviour.

Five levitating beads above a physical model of a volcano, made from acoustically transparent materials. The beads are positioned and animated to represent ash being ejected from the top of the volcano.
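To illustrate how animated content like the ash plume might be authored, here is a hypothetical waypoint generator for a single "ash" bead: it rises from the vent, drifts outwards, and falls back, tracing a slowed-down arc. All of the geometry and parameters are invented for illustration, and real paths would also need to respect how quickly the sound field can safely move a particle.

```python
import numpy as np

def ash_waypoints(vent, n_steps=40, height=0.05, spread=0.02, seed=0):
    """Waypoints (metres) for one 'ash' bead: a parabolic rise and
    fall above the vent, drifting outwards in a random direction.
    Illustrative only; the amplitudes and step count are made up."""
    rng = np.random.default_rng(seed)
    direction = rng.normal(size=2)
    direction /= np.linalg.norm(direction)   # random horizontal drift direction
    t = np.linspace(0.0, 1.0, n_steps)
    x = vent[0] + spread * direction[0] * t
    y = vent[1] + spread * direction[1] * t
    z = vent[2] + height * 4 * t * (1 - t)   # peaks mid-path, returns to vent height
    return np.column_stack([x, y, z])

waypoints = ash_waypoints(vent=(0.05, 0.05, 0.08))
```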

This display concept is discussed in a full paper accepted to Pervasive Displays 2019. See Enhancing Physical Objects with Actuated Levitating Particles for more about my work in this area.

Interacting with levitating objects

One of my aims on this project is to develop new interaction techniques and applications based on levitation. We started by focusing on object selection, since this is a fundamental interaction. Selection is necessary for other actions to take place (e.g., moving an object or querying its value).

We developed a technique called Point-and-Shake, which combines ray-cast pointing input with object shaking as a feedback mechanism: users point at the object they want to select and it shakes to give feedback. A paper describing this selection technique was published at CHI 2018, accompanied by a video demonstration of Point-and-Shake.
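The paper and video describe the full design; as a rough sketch of the idea (not the project's actual implementation), the code below casts a pointing ray, selects the first bead it hits (treating each bead as a small sphere), and generates side-to-side waypoints for the shaking feedback. The sphere radius and shake parameters are illustrative assumptions.

```python
import numpy as np

def pick_bead(ray_origin, ray_dir, centres, radius=0.01):
    """Return the index of the first bead hit by the pointing ray,
    or None. Each bead is treated as a sphere of `radius` metres."""
    ray_origin = np.asarray(ray_origin, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    best, best_t = None, np.inf
    for i, c in enumerate(np.asarray(centres, dtype=float)):
        oc = ray_origin - c
        b = np.dot(oc, ray_dir)
        disc = b * b - (np.dot(oc, oc) - radius * radius)
        if disc < 0:
            continue                      # ray misses this bead
        t = -b - np.sqrt(disc)            # distance to the nearest intersection
        if 0 <= t < best_t:
            best, best_t = i, t
    return best

def shake_offsets(amplitude=0.002, cycles=3, steps_per_cycle=20):
    """Side-to-side offsets (metres) to shake the selected bead as
    feedback; the amplitude and timing here are made-up values."""
    t = np.linspace(0.0, cycles * 2 * np.pi, cycles * steps_per_cycle)
    return amplitude * np.sin(t)

selected = pick_bead((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                     centres=[(0.0, 0.0, 0.10), (0.02, 0.0, 0.12)])
```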

Sound and haptics

Another aim of the project is to develop multimodal interactions using sound and haptics, to enhance the experience of interacting with the levitating objects. This is a multidisciplinary project involving experts in acoustics, HCI, and ultrasound haptics. The hardware and acoustic techniques used for acoustic levitation are similar to those used for ultrasound haptics and directional audio, so the project aims to combine these modalities to create new multimodal experiences.

Highlights

  • Apr 2019: Full paper accepted for IEEE World Haptics ’19.
  • Mar 2019: Full paper accepted for ACM Pervasive Displays ’19.
  • Feb 2019: Two demos accepted for CHI ’19.
  • Nov 2018: Demo at SICSA DemoFest ’18.
  • Jun 2018: Full paper presentation at Pervasive Displays ’18.
  • Apr 2018: Full paper presentation at CHI ’18.
  • Mar 2018: Full paper accepted for ACM Pervasive Displays ’18.
  • Dec 2017: Full paper accepted to CHI ’18.
  • Nov 2017: Demo at ACM ICMI ’17.
  • Oct 2017: Demos at SICSA DemoFest ’17 and ACM ISS ’17.
  • Jun 2017: Demo at Pervasive Displays ’17.
  • Jan 2017: Start of project.

Acknowledgements

This research has received funding from the 🇪🇺 European Union’s Horizon 2020 research and innovation programme under grant agreement #737087.

Publications

    Enhancing Physical Objects with Actuated Levitating Particles
    E. Freeman, A. Marzo, P. B. Kourtelos, J. R. Williamson, and S. Brewster.
    In Proceedings of the 8th ACM International Symposium on Pervasive Displays – PerDis ’19, Paper 24. 2019.

    Three-in-one: Levitation, Parametric Audio, and Mid-Air Haptic Feedback
    G. Shakeri, E. Freeman, W. Frier, M. Iodice, B. Long, O. Georgiou, and C. Andersson.
    In Proceedings of the 37th Annual ACM Conference on Human Factors in Computing Systems – CHI ’19 Interactivity, To Appear. 2019.

    Tangible Interactions with Acoustic Levitation
    A. Marzo, S. Kockaya, E. Freeman, and J. Williamson.
    In Proceedings of the 37th Annual ACM Conference on Human Factors in Computing Systems – CHI ’19 Interactivity, To Appear. 2019.

    Levitating Particle Displays with Interactive Voxels
    E. Freeman, J. Williamson, P. Kourtelos, and S. Brewster.
    In Proceedings of the 7th ACM International Symposium on Pervasive Displays – PerDis ’18, Article 15. 2018.

    Point-and-Shake: Selecting from Levitating Object Displays
    E. Freeman, J. Williamson, S. Subramanian, and S. Brewster.
    In Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems – CHI ’18, Paper 18. 2018.

    Textured Surfaces for Ultrasound Haptic Displays
    E. Freeman, R. Anderson, J. Williamson, G. Wilson, and S. Brewster.
    In Proceedings of the 19th ACM International Conference on Multimodal Interaction – ICMI ’17 Demos, pp. 491-492. 2017.

    Floating Widgets: Interaction with Acoustically-Levitated Widgets
    E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
    In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, pp. 417-420. 2017.

    Levitate: Interaction with Floating Particle Displays
    J. R. Williamson, E. Freeman, and S. Brewster.
    In Proceedings of the 6th ACM International Symposium on Pervasive Displays – PerDis ’17 Demos, Article 24. 2017.