Levitate

Levitate project logo: the word "levitate" with a levitating dot for the letter i.

The Levitate project is investigating a novel type of display where interactive content is composed of several levitating objects. We use ultrasound to levitate and move small ‘particles’ in mid-air. These acoustically-levitated particles are the basic display elements in a levitating object display.

Levitating object displays have some interesting properties:

  • Content is composed of physical objects held in place by an invisible medium (sound waves);
  • Users can see through the content and view it from many angles;
  • Users can interact inside the display volume, using the space around and between the particles to manipulate content;
  • Mid-air haptics and directional audio can be presented using similar acoustic principles and hardware, for multimodal content in a single display.

Levitating object displays enable new interaction techniques and applications, allowing users to interact with information in new ways. One of the aims of the Levitate project is to explore new applications and develop novel techniques for interacting with levitating objects. Whilst we focus on acoustic levitation, these techniques are also relevant to other types of mid-air display (e.g., using magnetic levitation, drones, or AR headsets).

What does levitation look like?

Two small objects held in mid-air by ultrasound, from arrays of small transducers within the device. The objects are trapped in the low-pressure nodes of standing sound waves.

Two small objects levitating above a single ultrasound array. The mountains are made from an acoustically transparent material that lets sound waves pass through, allowing levitation around physical objects.

A levitating bead above a smartphone screen, held in mid-air by two small ultrasound arrays.

A levitating bead above a single ultrasound emitter, with a visualisation of a Twin Trap acoustic element. The brighter area indicates higher acoustic pressure. A Twin Trap acts like a set of acoustic tweezers, holding an object in mid-air.
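
To make the acoustics in these captions concrete, here is a minimal sketch of the two ideas involved: the half-wavelength spacing of standing-wave nodes, and a Twin Trap phase signature (focusing phases plus a π offset on one half of the array). The array geometry and variable names are illustrative assumptions, not the project's hardware or code.

    # Minimal sketch (not project code) of the acoustics behind these images.
    # Assumptions: 40 kHz transducers, a flat array, and the Twin Trap phase
    # signature (focusing phases plus a pi offset on half the array).
    import numpy as np

    SPEED_OF_SOUND = 343.0                     # m/s in air at ~20 degrees C
    FREQUENCY = 40_000.0                       # Hz, typical levitation transducer
    WAVELENGTH = SPEED_OF_SOUND / FREQUENCY    # ~8.6 mm
    K = 2 * np.pi / WAVELENGTH                 # wavenumber

    # Between two facing arrays, particles sit at the low-pressure nodes of a
    # standing wave, which repeat every half wavelength (~4.3 mm at 40 kHz).
    print(f"Node spacing: {WAVELENGTH / 2 * 1000:.1f} mm")

    def twin_trap_phases(transducers, trap):
        """Phase per transducer: focus all waves at `trap`, then add pi to one
        half of the array so the focus splits into two lobes that pinch the
        particle like acoustic tweezers."""
        distances = np.linalg.norm(transducers - trap, axis=1)
        focus = (-K * distances) % (2 * np.pi)   # waves arrive in phase at trap
        left_half = transducers[:, 0] < trap[0]  # split the array about the trap
        return (focus + np.pi * left_half) % (2 * np.pi)

    # Example: a 16x16 grid of transducers at 1 cm pitch, trap 10 cm above centre.
    xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
    emitters = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
    emitters[:, :2] -= 0.075                     # centre the grid on the origin
    phases = twin_trap_phases(emitters, np.array([0.0, 0.0, 0.1]))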

Levitating object displays

These are a novel type of display, where content is composed of levitating objects. With the current state of the art, up to a dozen objects can be levitated and moved independently. Advances in ultrasound field synthesis and improved hardware will allow these displays to support many more display elements; our project is also advancing the state of the art in acoustics to enable this.

Levitating objects could represent data through their positions in mid-air and could be used to create physical 3D representations of data (i.e., physical visualisations). Acoustic levitation is dynamic, so object movement can be used in meaningful ways to create dynamic data displays; for example, levitating particles could show simulations of astronomical or physical data.
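
As a purely hypothetical sketch of what such a physical visualisation might involve, the snippet below maps a small data series onto target particle positions inside a levitation volume, like a floating bar chart. The function, volume dimensions, and layout scheme are assumptions for illustration, not the project's software.

    # Hypothetical sketch: lay out a data series as levitated particle targets.
    # The volume bounds and layout scheme are assumptions, not the project's API.
    import numpy as np

    def data_to_positions(values, width=0.1, height=0.12):
        """Spread data points evenly along x and encode each value as height (z),
        giving one (x, y, z) target per levitated particle, in metres."""
        values = np.asarray(values, dtype=float)
        x = np.linspace(-width / 2, width / 2, len(values))
        z = (values - values.min()) / np.ptp(values) * height  # fit to volume
        return np.column_stack([x, np.zeros(len(values)), z])

    # Five readings -> five particle targets, one row per particle.
    targets = data_to_positions([3.2, 5.1, 4.0, 7.3, 6.2])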

As the Levitate project progresses, we’ll be creating new interactive demonstrators to showcase some novel applications enabled by this technology. See Levitating Particle Displays for a summary of our Pervasive Displays 2018 paper outlining the display capabilities enabled by levitation.

Interacting with levitating objects

My role on the project is to develop new interaction techniques and applications based on levitation. We started by focusing on object selection, since this is a fundamental interaction. Selection is necessary for other actions to take place (e.g., moving an object or querying its value).

We developed a technique called Point-and-Shake, which combines ray-cast pointing with object shaking as a feedback mechanism: users point at the object they want to select and it shakes to confirm the selection. A paper describing this technique was published at CHI 2018, along with a video demonstration of Point-and-Shake.
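
The core idea can be sketched in a few lines: cast a ray from the pointing gesture, select the levitated object nearest that ray, then oscillate the selected object around its trap position as feedback. The names, thresholds, and amplitudes below are illustrative assumptions, not the published implementation.

    # Simplified sketch of Point-and-Shake: ray-cast selection plus shake feedback.
    # Thresholds, amplitudes, and names are assumptions, not the published system.
    import numpy as np

    def pick_object(ray_origin, ray_direction, positions, max_gap=0.01):
        """Return the index of the object nearest the pointing ray, or None if
        no object lies within `max_gap` metres of it."""
        d = ray_direction / np.linalg.norm(ray_direction)
        t = (positions - ray_origin) @ d       # projection along the ray
        t = np.maximum(t, 0.0)                 # ignore objects behind the pointer
        nearest = ray_origin + np.outer(t, d)  # closest point on the ray per object
        gaps = np.linalg.norm(positions - nearest, axis=1)
        best = int(np.argmin(gaps))
        return best if gaps[best] <= max_gap else None

    def shake_offset(time_s, amplitude=0.002, hz=5.0):
        """Small horizontal displacement added to the selected object's trap
        position, making it visibly shake without destabilising the trap."""
        return np.array([amplitude * np.sin(2 * np.pi * hz * time_s), 0.0, 0.0])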

Sound and haptics

One of the aims of the project is to develop multimodal interactions using sound and haptics, to enhance the experience of interacting with the levitating objects. This is a multidisciplinary project involving experts in acoustics, HCI, and ultrasound haptics. The hardware and acoustic techniques used for acoustic levitation are similar to those used for ultrasound haptics and directional audio, so the project hopes to combine these modalities to create new multimodal experiences.

Highlights

  • Nov 2018: Demo at SICSA DemoFest ’18.
  • Jun 2018: Full paper presentation at Pervasive Displays ’18.
  • Apr 2018: Full paper presentation at CHI ’18.
  • Mar 2018: Paper accepted to Pervasive Displays ’18.
  • Dec 2017: Paper accepted to CHI ’18.
  • Nov 2017: Demo at ACM ICMI ’17.
  • Oct 2017: Demo at SICSA DemoFest ’17 and ACM ISS ’17.
  • Jun 2017: Demo at Pervasive Displays ’17.
  • Jan 2017: Start of project.

Acknowledgements

This research has received funding from the 🇪🇺 European Union’s Horizon 2020 research and innovation programme under grant agreement #737087.

Publications

    Levitating Object Displays with Interactive Voxels
    E. Freeman, J. Williamson, P. Kourtelos, and S. Brewster.
    In Proceedings of the 7th ACM International Symposium on Pervasive Displays – PerDis ’18, Article 15. 2018.

    @inproceedings{PerDis2018,
        author = {Freeman, Euan and Williamson, Julie and Kourtelos, Praxitelis and Brewster, Stephen},
        booktitle = {{Proceedings of the 7th ACM International Symposium on Pervasive Displays - PerDis '18}},
        title = {{Levitating Object Displays with Interactive Voxels}},
        year = {2018},
        publisher = {ACM Press},
        pages = {Article 15},
        doi = {10.1145/3205873.3205878},
        url = {http://euanfreeman.co.uk/levitate/},
        pdf = {http://research.euanfreeman.co.uk/papers/PerDis_2018.pdf},
    }

    Point-and-Shake: Selecting from Levitating Object Displays
    E. Freeman, J. Williamson, S. Subramanian, and S. Brewster.
    In Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems – CHI ’18, Paper 18. 2018.

    @inproceedings{CHI2018,
        author = {Freeman, Euan and Williamson, Julie and Subramanian, Sriram and Brewster, Stephen},
        booktitle = {{Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems - CHI '18}},
        title = {{Point-and-Shake: Selecting from Levitating Object Displays}},
        year = {2018},
        publisher = {ACM Press},
        pages = {Paper 18},
        doi = {10.1145/3173574.3173592},
        url = {http://euanfreeman.co.uk/levitate/},
        video = {https://www.youtube.com/watch?v=j8foZ5gahvQ},
        pdf = {http://research.euanfreeman.co.uk/papers/CHI_2018.pdf},
    }

    Textured Surfaces for Ultrasound Haptic Displays
    E. Freeman, R. Anderson, J. Williamson, G. Wilson, and S. Brewster.
    In Proceedings of the 19th ACM International Conference on Multimodal Interaction – ICMI ’17 Demos, pp. 491-492. 2017.

    @inproceedings{ICMI2017Demo,
        author = {Freeman, Euan and Anderson, Ross and Williamson, Julie and Wilson, Graham and Brewster, Stephen},
        booktitle = {{Proceedings of 19th ACM International Conference on Multimodal Interaction - ICMI '17 Demos}},
        title = {{Textured Surfaces for Ultrasound Haptic Displays}},
        year = {2017},
        publisher = {ACM Press},
        pages = {491-492},
        doi = {10.1145/3136755.3143020},
        pdf = {http://research.euanfreeman.co.uk/papers/ICMI_2017_Demo.pdf},
    }

    Floating Widgets: Interaction with Acoustically-Levitated Widgets
    E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
    In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, pp. 417-420. 2017.

    @inproceedings{ISS2017Demo,
        author = {Freeman, Euan and Anderson, Ross and Andersson, Carl and Williamson, Julie and Brewster, Stephen},
        booktitle = {{Proceedings of ACM International Conference on Interactive Surfaces and Spaces - ISS '17 Demos}},
        title = {{Floating Widgets: Interaction with Acoustically-Levitated Widgets}},
        year = {2017},
        publisher = {ACM Press},
        pages = {417-420},
        doi = {10.1145/3132272.3132294},
        pdf = {http://research.euanfreeman.co.uk/papers/ISS_2017_Demo.pdf},
    }

    Levitate: Interaction with Floating Particle Displays
    J. R. Williamson, E. Freeman, and S. Brewster.
    In Proceedings of the 6th ACM International Symposium on Pervasive Displays – PerDis ’17 Demos, Article 24. 2017.

    @inproceedings{PerDis2017Demo,
        author = {Williamson, Julie R. and Freeman, Euan and Brewster, Stephen},
        booktitle = {{Proceedings of the 6th ACM International Symposium on Pervasive Displays - PerDis '17 Demos}},
        title = {{Levitate: Interaction with Floating Particle Displays}},
        year = {2017},
        publisher = {ACM Press},
        pages = {Article 24},
        doi = {10.1145/3078810.3084347},
        url = {http://dl.acm.org/citation.cfm?id=3084347},
        pdf = {http://research.euanfreeman.co.uk/papers/PerDis_2017.pdf},
    }