Levitating Particle Displays

Introduction

Object levitation enables new types of display where the content is created from physical particles in air instead of pixels on a surface. Several technologies have been developed to enable levitation, including magnetic levitation and acoustic levitation. The Levitate project focuses on acoustic levitation, but my Pervasive Displays 2018 paper [1] considers all types of levitation and the novel display capabilities they allow. This synopsis outlines some of those capabilities.

Interacting with Levitating Particle Displays

Voxels in an invisible volume

Levitating objects are held in mid-air, e.g. using sound waves or magnetic forces. This means that unlike most shape-changing displays, the display elements exist within an invisible volume. An invisible volume can allow new interactions. Multiple users around a levitating particle display can view the content but also see each other, potentially improving collaborative interactions. Users can also see surfaces behind and beneath the levitating objects, allowing levitation to augment existing interactive displays.

Reaching into the display

Because of the invisible display volume, it is often possible for users to reach inside the display. With acoustic levitation, this may disrupt the sound waves, but users can still reach in to a certain extent. Other objects may also be placed within a levitating particle display, so long as they are transparent to the levitation forces. With acoustic levitation, this means the objects must allow sound waves to pass through them. Being able to reach into the display and manipulate the display elements, and being able to place objects into the display, could enable new interactions and applications that would not be possible with traditional screens.

Expressive voxels

Whilst levitating objects are typically quite simple (e.g., small polystyrene beads in acoustic levitation systems), they can be manipulated in order to allow greater expressivity. For example, in Point-and-Shake [2], we used expressive object movement as a simple but effective means of feedback about selection gestures. In the Pervasive Displays paper [1] we discuss more about the expressive potential of levitating particles.

Colocated non-visual feedback

Acoustic levitation is based on similar techniques to ultrasound haptics and parametric audio. These techniques could potentially be combined to allow colocated non-visual feedback. Levitating particle displays have the potential to be the first display type where audio and haptic feedback can be presented directly within the display volume.

References

[1] Levitating Object Displays with Interactive Voxels
E. Freeman, J. Williamson, P. Kourtelos, and S. Brewster.
In Proceedings of the 7th ACM International Symposium on Pervasive Displays – PerDis ’18, to appear. 2018.


@inproceedings{PerDis2018,
    author = {Freeman, Euan and Williamson, Julie and Kourtelos, Praxitelis and Brewster, Stephen},
    booktitle = {{Proceedings of the 7th ACM International Symposium on Pervasive Displays - PerDis '18}},
    title = {{Levitating Object Displays with Interactive Voxels}},
    year = {2018},
    publisher = {ACM Press},
    pages = {to appear},
    doi = {10.1145/3205873.3205878},
    url = {http://euanfreeman.co.uk/levitate/},
    pdf = {http://research.euanfreeman.co.uk/papers/PerDis_2018.pdf},
}

[2] Point-and-Shake: Selecting from Levitating Object Displays
E. Freeman, J. Williamson, S. Subramanian, and S. Brewster.
In Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems – CHI ’18, Paper~18. 2018.


@inproceedings{CHI2018,
    author = {Freeman, Euan and Williamson, Julie and Subramanian, Sriram and Brewster, Stephen},
    booktitle = {{Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems - CHI '18}},
    title = {{Point-and-Shake: Selecting from Levitating Object Displays}},
    year = {2018},
    publisher = {ACM Press},
    pages = {Paper~18},
    doi = {10.1145/3173574.3173592},
    url = {http://euanfreeman.co.uk/levitate/},
    video = {https://www.youtube.com/watch?v=j8foZ5gahvQ},
    pdf = {http://research.euanfreeman.co.uk/papers/CHI_2018.pdf},
}

Point-and-Shake: Selecting Levitating Objects

Introduction

Levitating object displays are a novel type of display whose content is composed of levitating objects. To find out more about these displays, read my introduction to the Levitate project. This page provides more information and some video demonstrations of the Point-and-Shake interaction technique, described in my CHI 2018 paper [1].

Point-and-Shake

Point-and-Shake is a technique for selecting levitating objects. Selection is an important interaction, because it needs to happen before users can manipulate or interact with content in a levitating particle display. Users cannot always directly touch objects (e.g., because it disrupts the acoustic levitation forces), so we developed a mid-air gesture technique. All feedback is given through the appearance and behaviour of the levitating objects.

We used ray-cast pointing, enabling users to select “that one there” by pointing an extended finger towards the target object. Feedback is important to help users understand how the system is interpreting their actions. The only visual elements in a levitating object display are the objects themselves, so we manipulate their appearance to give feedback: when the user targets an object, we shake it from side to side. Point-and-Shake is thus the combination of pointing gestures with object shaking as feedback. The following video demonstrates this.
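Ray-cast selection of this kind can be sketched as picking the object with the smallest perpendicular distance to the pointing ray. The following is a minimal illustration, not the paper's implementation; the function name and the 2 cm acquisition threshold are assumptions for the example:

```python
import math

def select_by_ray(origin, direction, objects, max_dist=0.02):
    """Return the index of the object nearest the pointing ray,
    or None if nothing lies within max_dist metres of it."""
    norm = math.sqrt(sum(c * c for c in direction))
    d = [c / norm for c in direction]                 # unit pointing direction
    best, best_dist = None, max_dist
    for i, obj in enumerate(objects):
        v = [o - p for o, p in zip(obj, origin)]      # finger-to-object vector
        t = sum(a * b for a, b in zip(v, d))          # depth along the ray
        if t < 0:                                     # object is behind the finger
            continue
        perp = math.dist(v, [t * c for c in d])       # distance from object to ray
        if perp < best_dist:
            best, best_dist = i, perp
    return best
```

With the finger at the origin pointing along +z, an object 5 mm off the ray would be selected while one 10 cm off it would not.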

Selecting Occluded Objects

A limitation of ray-cast pointing is that users may have trouble selecting occluded objects, i.e., an object hidden behind another: the user cannot point directly at the target without first pointing at the objects in front of it. We implemented two versions of the Lock Ray technique (Grossman et al.) to allow selection of occluded objects. This breaks selection into two stages: 1) aiming towards the intended target; then 2) disambiguating the selection. The following video demonstrates this.
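The two stages above can be sketched as follows. This is a rough illustration of the Lock Ray idea, not Grossman et al.'s implementation; the function names, threshold, and the stepping-based disambiguation input are assumptions:

```python
import math

def lock_ray(origin, direction, objects, max_perp=0.02):
    """Stage 1 (aim): lock the pointing ray and list every object
    within max_perp of it, ordered front-to-back by depth."""
    norm = math.sqrt(sum(c * c for c in direction))
    d = [c / norm for c in direction]                 # unit pointing direction
    hits = []
    for i, obj in enumerate(objects):
        v = [o - p for o, p in zip(obj, origin)]
        t = sum(a * b for a, b in zip(v, d))          # depth along the ray
        if t > 0 and math.dist(v, [t * c for c in d]) < max_perp:
            hits.append((t, i))
    return [i for _, i in sorted(hits)]

def disambiguate(candidates, step):
    """Stage 2: step through the locked candidates to reach an
    occluded object without re-aiming."""
    return candidates[step % len(candidates)] if candidates else None
```

Because the ray is locked after stage 1, the user can disambiguate among the intersected objects without disturbing the aim.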

Evaluation

We evaluated Point-and-Shake through two user studies, in which we asked users to select one of two objects using our technique. Object shaking was a successful way of giving feedback about selection: users completed 94-96% of tasks successfully within the given time limits, with mean selection times of 3-4 seconds. Detailed results are described in my CHI 2018 paper [1].

References

[1] Point-and-Shake: Selecting from Levitating Object Displays
E. Freeman, J. Williamson, S. Subramanian, and S. Brewster.
In Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems – CHI ’18, Paper~18. 2018.


@inproceedings{CHI2018,
    author = {Freeman, Euan and Williamson, Julie and Subramanian, Sriram and Brewster, Stephen},
    booktitle = {{Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems - CHI '18}},
    title = {{Point-and-Shake: Selecting from Levitating Object Displays}},
    year = {2018},
    publisher = {ACM Press},
    pages = {Paper~18},
    doi = {10.1145/3173574.3173592},
    url = {http://euanfreeman.co.uk/levitate/},
    video = {https://www.youtube.com/watch?v=j8foZ5gahvQ},
    pdf = {http://research.euanfreeman.co.uk/papers/CHI_2018.pdf},
}

CHI 2018

I’m going to be at CHI in Montreal next week to present my full paper, “Point-and-Shake: Selecting from Levitating Object Displays”. I’m giving the last talk in the Input: Targets and Selection session (Thursday 26th April, 9am, Room 517C). Come along to hear about interaction with levitating objects! To find out more, read about the Levitate project and Point-and-Shake.

I’m also participating in the Mid-Air Haptics for Control Interfaces workshop, run by Ultrahaptics. In the workshop, I’m co-chairing a session with Seokhee Jeon from Kyung Hee University, focusing on the perception of mid-air haptics.

Finally, I’m also going to be chairing the Typing & Touch 2 papers session (Thursday 26th April, 2pm, Room 514B), which has four interesting papers on touchscreen interaction and haptic feedback.

ICMI ’17 Paper & ISS ’17 Demo

I’ve had a paper accepted by ACM ICMI 2017 titled “Rhythmic Micro-Gestures: Discreet Interaction On-the-Go” [1]. The paper is about rhythmic micro-gestures, a new interaction technique for interacting with mobile devices. This technique combines rhythmic gestures, an input technique from my CHI 2016 paper, with the concept of micro-gestures, small hand movements that can be performed discreetly. I’ll be giving a talk about this paper at the conference in November, in Glasgow.

We’ve also had a demo accepted by ACM ISS 2017 from the Levitate project [2]. That demo gives attendees the chance to try interacting with mid-air objects, suspended in air by acoustic levitation.

[1] Rhythmic Micro-Gestures: Discreet Interaction On-the-Go
E. Freeman, G. Griffiths, and S. Brewster.
In Proceedings of the 19th ACM International Conference on Multimodal Interaction – ICMI ’17, to appear. 2017.


@inproceedings{ICMI2017,
    author = {Freeman, Euan and Griffiths, Gareth and Brewster, Stephen},
    booktitle = {{Proceedings of 19th ACM International Conference on Multimodal Interaction - ICMI '17}},
    title = {{Rhythmic Micro-Gestures: Discreet Interaction On-the-Go}},
    year = {2017},
    publisher = {ACM Press},
    pages = {to appear},
    doi = {10.1145/3136755.3136815},
    pdf = {http://research.euanfreeman.co.uk/papers/ICMI_2017.pdf},
}

[2] Floating Widgets: Interaction with Acoustically-Levitated Widgets
E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, pp. 417-420. 2017.


@inproceedings{ISS2017Demo,
    author = {Freeman, Euan and Anderson, Ross and Andersson, Carl and Williamson, Julie and Brewster, Stephen},
    booktitle = {{Proceedings of ACM International Conference on Interactive Surfaces and Spaces - ISS '17 Demos}},
    title = {{Floating Widgets: Interaction with Acoustically-Levitated Widgets}},
    year = {2017},
    publisher = {ACM Press},
    pages = {417-420},
    doi = {10.1145/3132272.3132294},
    pdf = {http://research.euanfreeman.co.uk/papers/ISS_2017_Demo.pdf},
}

Levitate

Introduction

The Levitate project is investigating new types of display where content is composed of levitating objects. We use acoustic levitation to silently position and move small ‘particles’ in mid-air. These particles are the display primitives in a levitating object display. Levitating object displays have some interesting properties:

  • users can see through the content and view it from many angles;
  • they can interact inside the display volume, using the space around and between the particles in meaningful ways;
  • mid-air haptics and directional audio can be presented using similar acoustic principles and hardware, creating truly multimodal displays.

These can enable new interaction techniques and applications, allowing users to interact with information in new ways. One of the aims of Levitate is to explore new applications and develop techniques for interacting with levitating objects.

The following image shows an example of acoustic levitation: two small objects are held in mid-air by ultrasound from arrays of small transducers within the device. The objects are trapped at the low-pressure nodes of standing sound waves. By manipulating the acoustic field, the objects can be repositioned (see the following video for an example).
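The trapping positions follow directly from the acoustics: adjacent nodes of a standing wave sit half a wavelength apart. A small sketch of this relationship, assuming the 40 kHz transducers commonly used in such systems and room-temperature air (speed of sound ≈ 343 m/s):

```python
def node_spacing_mm(frequency_hz, speed_of_sound_m_s=343.0):
    """Adjacent trapping nodes in a standing wave are half a wavelength apart."""
    wavelength_m = speed_of_sound_m_s / frequency_hz
    return 1000.0 * wavelength_m / 2.0

# At 40 kHz in room-temperature air, nodes are roughly 4.3 mm apart,
# which constrains how closely levitated particles can be packed.
spacing = node_spacing_mm(40_000)
```

Lower frequencies give wider node spacing but weaker traps for small particles, which is one reason 40 kHz hardware is a common choice.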

When scaled up to include more objects, these displays could be used in many ways. For example, levitating objects could represent data through their relative positions, or could be used to create physical 3D representations of data. Levitation can also be dynamic, so object movement can be meaningful; for example, particles could be used to create simulations of astronomical and physical data. As the project progresses, we’ll be creating interactive demonstrators to showcase some novel applications enabled by this technology. See Levitating Particle Displays for a summary of our Pervasive Displays 2018 paper outlining the display capabilities enabled by levitation.

Interacting with Levitating Objects

My role on the project is to develop new interaction techniques and applications based on levitation. We started by focusing on object selection, since this is a fundamental interaction. Selection is necessary for other actions to take place (e.g., moving an object or querying its value). We developed a technique called Point-and-Shake, which combines ray-cast pointing input with object shaking as a feedback mechanism. Basically, users point at the object they want to select and it shakes to give feedback. You can read more about this technique here. A paper describing our selection technique was published at CHI 2018. The following video includes a demonstration of Point-and-Shake:

Sound and Haptics

One of the aims of the project is to develop multimodal interactions using sound and haptics, to enhance the experience of interacting with the levitating objects. This is a multidisciplinary project involving experts in acoustics, HCI, and ultrasound haptics. The hardware and acoustic techniques used for acoustic levitation are similar to those used for ultrasound haptics and directional audio, so the project hopes to combine these modalities to create new multimodal experiences.

Highlights

  • Jun 2018: Full paper presentation at Pervasive Displays ’18.
  • Apr 2018: Full paper presentation at CHI ’18.
  • Mar 2018: Paper accepted to Pervasive Displays ’18.
  • Dec 2017: Paper accepted to CHI ’18.
  • Nov 2017: Demo at ACM ICMI ’17.
  • Oct 2017: Demo at SICSA DemoFest ’17 and ACM ISS ’17.
  • Jun 2017: Demo at Pervasive Displays ’17.
  • Jan 2017: Start of project, kick-off meeting in Glasgow.

Acknowledgements

This research has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement #737087.

Publications

    Levitating Object Displays with Interactive Voxels
    E. Freeman, J. Williamson, P. Kourtelos, and S. Brewster.
    In Proceedings of the 7th ACM International Symposium on Pervasive Displays – PerDis ’18, to appear. 2018.


    @inproceedings{PerDis2018,
        author = {Freeman, Euan and Williamson, Julie and Kourtelos, Praxitelis and Brewster, Stephen},
        booktitle = {{Proceedings of the 7th ACM International Symposium on Pervasive Displays - PerDis '18}},
        title = {{Levitating Object Displays with Interactive Voxels}},
        year = {2018},
        publisher = {ACM Press},
        pages = {to appear},
        doi = {10.1145/3205873.3205878},
        url = {http://euanfreeman.co.uk/levitate/},
        pdf = {http://research.euanfreeman.co.uk/papers/PerDis_2018.pdf},
    }

    Point-and-Shake: Selecting from Levitating Object Displays
    E. Freeman, J. Williamson, S. Subramanian, and S. Brewster.
    In Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems – CHI ’18, Paper~18. 2018.


    @inproceedings{CHI2018,
        author = {Freeman, Euan and Williamson, Julie and Subramanian, Sriram and Brewster, Stephen},
        booktitle = {{Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems - CHI '18}},
        title = {{Point-and-Shake: Selecting from Levitating Object Displays}},
        year = {2018},
        publisher = {ACM Press},
        pages = {Paper~18},
        doi = {10.1145/3173574.3173592},
        url = {http://euanfreeman.co.uk/levitate/},
        video = {https://www.youtube.com/watch?v=j8foZ5gahvQ},
        pdf = {http://research.euanfreeman.co.uk/papers/CHI_2018.pdf},
    }

    Textured Surfaces for Ultrasound Haptic Displays
    E. Freeman, R. Anderson, J. Williamson, G. Wilson, and S. Brewster.
    In Proceedings of the 19th ACM International Conference on Multimodal Interaction – ICMI ’17 Demos, pp. 491-492. 2017.


    @inproceedings{ICMI2017Demo,
        author = {Freeman, Euan and Anderson, Ross and Williamson, Julie and Wilson, Graham and Brewster, Stephen},
        booktitle = {{Proceedings of 19th ACM International Conference on Multimodal Interaction - ICMI '17 Demos}},
        title = {{Textured Surfaces for Ultrasound Haptic Displays}},
        year = {2017},
        publisher = {ACM Press},
        pages = {491-492},
        doi = {10.1145/3136755.3143020},
        pdf = {http://research.euanfreeman.co.uk/papers/ICMI_2017_Demo.pdf},
    }

    Floating Widgets: Interaction with Acoustically-Levitated Widgets
    E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
    In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, pp. 417-420. 2017.


    @inproceedings{ISS2017Demo,
        author = {Freeman, Euan and Anderson, Ross and Andersson, Carl and Williamson, Julie and Brewster, Stephen},
        booktitle = {{Proceedings of ACM International Conference on Interactive Surfaces and Spaces - ISS '17 Demos}},
        title = {{Floating Widgets: Interaction with Acoustically-Levitated Widgets}},
        year = {2017},
        publisher = {ACM Press},
        pages = {417-420},
        doi = {10.1145/3132272.3132294},
        pdf = {http://research.euanfreeman.co.uk/papers/ISS_2017_Demo.pdf},
    }

    Levitate: Interaction with Floating Particle Displays
    J. R. Williamson, E. Freeman, and S. Brewster.
    In Proceedings of the 6th ACM International Symposium on Pervasive Displays – PerDis ’17 Demos, Article 24. 2017.


    @inproceedings{PerDis2017Demo,
        author = {Williamson, Julie R. and Freeman, Euan and Brewster, Stephen},
        booktitle = {{Proceedings of the 6th ACM International Symposium on Pervasive Displays - PerDis '17 Demos}},
        title = {{Levitate: Interaction with Floating Particle Displays}},
        year = {2017},
        publisher = {ACM Press},
        pages = {Article 24},
        doi = {10.1145/3078810.3084347},
        url = {http://dl.acm.org/citation.cfm?id=3084347},
        pdf = {http://research.euanfreeman.co.uk/papers/PerDis_2017.pdf},
    }