Levitate

Introduction

The Levitate project is investigating new types of display where content is composed of levitating objects. We use acoustic levitation to silently position and move small ‘particles’ in mid-air. These particles are the display primitives in a levitating object display. Levitating object displays have some interesting properties:

  • users can see through the content and view it from many angles;
  • they can interact inside the display volume, using the space around and between the particles in meaningful ways;
  • mid-air haptic and directional audio can be presented using similar acoustic principles and hardware, creating truly multimodal displays.

These can enable new interaction techniques and applications, allowing users to interact with information in new ways. One of the aims of Levitate is to explore new applications and develop techniques for interacting with levitating objects.

The following image shows an example of acoustic levitation: two small objects are held in mid-air by ultrasound, emitted from arrays of small transducers within the device. The objects are trapped in the low-pressure nodes of standing sound waves. By manipulating the acoustic field, the objects can be repositioned.
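
To give a sense of scale, the traps in such a standing wave sit half a wavelength apart. The sketch below assumes a 40 kHz transducer frequency (typical for this kind of hardware, though not stated above) and the speed of sound in air:

```python
# Sketch: spacing of trap positions in a standing ultrasound wave.
# The 40 kHz carrier frequency is an assumption (common in acoustic
# levitation hardware), not a figure taken from the project.

SPEED_OF_SOUND = 343.0   # m/s, air at ~20 degrees C
FREQUENCY = 40_000.0     # Hz, assumed transducer frequency

wavelength = SPEED_OF_SOUND / FREQUENCY   # ~8.6 mm
node_spacing = wavelength / 2             # traps sit half a wavelength apart

def trap_positions(n, spacing=node_spacing):
    """Heights (metres) of the first n pressure nodes above a reflector."""
    return [i * spacing for i in range(n)]

print(f"Node spacing: {node_spacing * 1000:.2f} mm")   # ~4.29 mm
```

This is why the particles in these displays are small and closely spaced: the trap geometry is fixed by the wavelength of the carrier.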

When scaled up to include more objects, these displays could be used in many ways. For example, levitating objects could represent data through their relative positions, creating physical 3D visualisations. Levitation can also be dynamic, so object movement can be meaningful: particles could be used to create simulations of astronomical and physical data, for example. As the project progresses, we’ll be creating interactive demonstrators to showcase some novel applications enabled by this technology.
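
As an illustration of position-as-data, a layout function might normalise a data series into coordinates within the display volume. The volume dimensions and bar-chart layout below are invented for the example; they are not taken from the Levitate hardware:

```python
# Sketch: mapping a data series onto particle positions, like a floating
# bar chart. Volume size (10 cm x 8 cm) is an illustrative assumption.

def layout_bar_chart(values, width=0.10, height=0.08):
    """Place one particle per value in a 2D slice of the display volume.

    Particles are spread evenly along x; each value is normalised so
    the largest sits at the top of the volume (coordinates in metres).
    """
    peak = max(values)
    n = len(values)
    return [
        (i * width / max(n - 1, 1), (v / peak) * height)
        for i, v in enumerate(values)
    ]

positions = layout_bar_chart([2.0, 4.0, 1.0])
```

A particle controller would then steer each levitated object to its computed position, and recompute the layout as the data changes.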

Interacting with Levitating Objects

My role on the project is to develop new interaction techniques and applications based on levitation. We started by focusing on object selection, since this is a fundamental interaction: selection is necessary before other actions can take place (e.g., moving an object or querying its value). We developed a technique called Point-and-Shake, which combines ray-cast pointing input with object shaking as a feedback mechanism: users point at the object they want to select, and it shakes to confirm the selection. A paper describing this technique was published at CHI 2018. The following video includes a demonstration of Point-and-Shake.
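
The core of the interaction can be sketched in two parts: a ray-cast test to decide which particle is being pointed at, and a small oscillating offset to shake the selected particle. The maths below (a ray–sphere proximity test and a sinusoidal shake) is an illustrative reconstruction, not the implementation from the paper:

```python
# Sketch of the Point-and-Shake idea: ray-cast selection plus a shake
# offset for feedback. Thresholds and amplitudes are assumed values.

import math

def ray_hits_particle(origin, direction, particle, radius):
    """True if a ray from `origin` along unit vector `direction`
    passes within `radius` of the particle's position (3D tuples)."""
    to_p = [p - o for p, o in zip(particle, origin)]
    t = sum(a * b for a, b in zip(to_p, direction))      # projection onto ray
    if t < 0:
        return False                                     # particle is behind the pointer
    closest = [o + t * d for o, d in zip(origin, direction)]
    dist2 = sum((c - p) ** 2 for c, p in zip(closest, particle))
    return dist2 <= radius ** 2

def shake_offset(time_s, amplitude=0.002, freq_hz=5.0):
    """Displacement (metres) to add to a selected particle over time."""
    return amplitude * math.sin(2 * math.pi * freq_hz * time_s)
```

Each frame, the pointing ray is tested against every particle; any particle the ray passes near gets the shake offset added to its target trap position, producing visible motion as feedback.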

Sound and Haptics

One of the aims of the project is to develop multimodal interactions using sound and haptics, to enhance the experience of interacting with the levitating objects. This is a multidisciplinary project involving experts in acoustics, HCI, and ultrasound haptics. The hardware and acoustic techniques used for acoustic levitation are similar to those used for ultrasound haptics and directional audio, so the project hopes to combine these modalities to create new multimodal experiences.
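
To illustrate how one acoustic principle can serve several modalities: ultrasound haptic devices typically amplitude-modulate the carrier at a low frequency the skin can feel, while modulating with an audio waveform instead yields audible, directional sound. A minimal sketch, using typical (assumed) frequencies rather than values from the project:

```python
# Sketch: one 40 kHz ultrasound carrier, amplitude-modulated.
# ~200 Hz modulation falls in the skin's sensitive range (haptics);
# substituting an audio waveform for the envelope gives directional audio.
# Both frequencies are illustrative assumptions.

import math

def modulated_sample(t, carrier_hz=40_000.0, mod_hz=200.0):
    """One sample (in [-1, 1]) of a modulated ultrasound carrier at time t."""
    envelope = 0.5 * (1 + math.sin(2 * math.pi * mod_hz * t))   # 0..1
    return envelope * math.sin(2 * math.pi * carrier_hz * t)
```

Because levitation, haptics, and directional audio all shape the same kind of ultrasound field, a single transducer array could in principle drive all three.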

Highlights

  • Apr 2018: Full paper presentation at CHI ’18.
  • Mar 2018: Paper accepted to Pervasive Displays ’18.
  • Dec 2017: Paper accepted to CHI ’18.
  • Nov 2017: Demo at ACM ICMI ’17.
  • Oct 2017: Demo at SICSA DemoFest ’17 and ACM ISS ’17.
  • Jun 2017: Demo at Pervasive Displays ’17.
  • Jan 2017: Start of project, kick-off meeting in Glasgow.

Acknowledgements

This research has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement #737087.

Publications

    Levitating Object Displays with Interactive Voxels
    E. Freeman, J. Williamson, P. Kourtelos, and S. Brewster.
    In Proceedings of the 7th ACM International Symposium on Pervasive Displays – PerDis ’18, to appear. 2018.


    @inproceedings{PerDis2018,
        author = {Freeman, Euan and Williamson, Julie and Kourtelos, Praxitelis and Brewster, Stephen},
        booktitle = {{Proceedings of the 7th ACM International Symposium on Pervasive Displays - PerDis '18}},
        title = {{Levitating Object Displays with Interactive Voxels}},
        year = {2018},
        publisher = {ACM Press},
        pages = {to appear},
        url = {http://euanfreeman.co.uk/levitate/},
    }

    Point-and-Shake: Selecting from Levitating Object Displays
    E. Freeman, J. Williamson, S. Subramanian, and S. Brewster.
    In Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems – CHI ’18, Paper 18. 2018.


    @inproceedings{CHI2018,
        author = {Freeman, Euan and Williamson, Julie and Subramanian, Sriram and Brewster, Stephen},
        booktitle = {{Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems - CHI '18}},
        title = {{Point-and-Shake: Selecting from Levitating Object Displays}},
        year = {2018},
        publisher = {ACM Press},
        pages = {Paper~18},
        doi = {10.1145/3173574.3173592},
        url = {http://euanfreeman.co.uk/levitate/},
        video = {{https://www.youtube.com/watch?v=j8foZ5gahvQ}},
        pdf = {http://research.euanfreeman.co.uk/papers/CHI_2018.pdf},
    }

    Textured Surfaces for Ultrasound Haptic Displays
    E. Freeman, R. Anderson, J. Williamson, G. Wilson, and S. Brewster.
    In Proceedings of the 19th ACM International Conference on Multimodal Interaction – ICMI ’17 Demos, pp. 491–492. 2017.


    @inproceedings{ICMI2017Demo,
        author = {Freeman, Euan and Anderson, Ross and Williamson, Julie and Wilson, Graham and Brewster, Stephen},
        booktitle = {{Proceedings of the 19th ACM International Conference on Multimodal Interaction - ICMI '17 Demos}},
        title = {{Textured Surfaces for Ultrasound Haptic Displays}},
        year = {2017},
        publisher = {ACM Press},
        pages = {491--492},
        doi = {10.1145/3136755.3143020},
        pdf = {http://research.euanfreeman.co.uk/papers/ICMI_2017_Demo.pdf},
    }

    Floating Widgets: Interaction with Acoustically-Levitated Widgets
    E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
    In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, pp. 417–420. 2017.


    @inproceedings{ISS2017Demo,
        author = {Freeman, Euan and Anderson, Ross and Andersson, Carl and Williamson, Julie and Brewster, Stephen},
        booktitle = {{Proceedings of the ACM International Conference on Interactive Surfaces and Spaces - ISS '17 Demos}},
        title = {{Floating Widgets: Interaction with Acoustically-Levitated Widgets}},
        year = {2017},
        publisher = {ACM Press},
        pages = {417--420},
        doi = {10.1145/3132272.3132294},
        pdf = {http://research.euanfreeman.co.uk/papers/ISS_2017_Demo.pdf},
    }

    Levitate: Interaction with Floating Particle Displays
    J. R. Williamson, E. Freeman, and S. Brewster.
    In Proceedings of the 6th ACM International Symposium on Pervasive Displays – PerDis ’17 Demos, Article 24. 2017.


    @inproceedings{PerDis2017Demo,
        author = {Williamson, Julie R. and Freeman, Euan and Brewster, Stephen},
        booktitle = {{Proceedings of the 6th ACM International Symposium on Pervasive Displays - PerDis '17 Demos}},
        title = {{Levitate: Interaction with Floating Particle Displays}},
        year = {2017},
        publisher = {ACM Press},
        pages = {Article 24},
        doi = {10.1145/3078810.3084347},
        url = {http://dl.acm.org/citation.cfm?id=3084347},
        pdf = {http://research.euanfreeman.co.uk/papers/PerDis_2017.pdf},
    }