SICSA DemoFest 2018

Earlier this week I was at SICSA DemoFest in Edinburgh, talking about acoustic levitation and some of the work we’ve been doing on the Levitate project.

For more information about Levitate and the awesome work we’ve been doing, follow us on Twitter (@LevitateProj), check out the project website, and see more photos and videos here.

Want to make your own acoustic levitator? You can build a simpler version of our device by following Asier Marzo’s Instructables guide.

Image used in my demo presentation. It illustrates what a standing wave looks like and briefly explains how acoustic levitation works: small objects can be levitated between the high-amplitude areas of a standing sound wave and moved in mid-air by moving the sound waves.
A close-up photo of our demo booth at the event. We demonstrated a levitation device with a polystyrene bead being moved in a variety of patterns in mid-air.

CHI 2018

CHI 2018 conference logo

I’m going to be at CHI in Montreal next week to present my full paper, titled “Point-and-Shake: Selecting from Levitating Object Displays”. I’m the last talk in the Input: Targets and Selection session (Thursday 26th April, 9am, Room 517C). Come along to hear about interaction with levitating objects! To find out more, see the Levitate project and Point-and-Shake pages.

I’m also participating in the Mid-Air Haptics for Control Interfaces workshop, run by Ultrahaptics. In the workshop, I’m co-chairing a session with Seokhee Jeon from Kyung Hee University, focusing on the perception of mid-air haptics.

Finally, I’m also going to be chairing the Typing & Touch 2 papers session (Thursday 26th April, 2pm, Room 514B), which has four interesting papers on touchscreen interaction and haptic feedback.

CHI 2018 Paper

I’ve had a paper accepted to the CHI 2018 conference, describing recent work on the Levitate project. The paper is about Point-and-Shake, a mid-air interaction technique for selecting levitating objects. I’m looking forward to presenting this work in Montreal in April! The following 30-second preview gives a super quick demo of our interaction technique.

For more information about the Levitate project, check us out on Twitter: @LevitateProj

ICMI ’17 Paper & ISS ’17 Demo

I’ve had a paper accepted by ACM ICMI 2017 titled “Rhythmic Micro-Gestures: Discreet Interaction On-the-Go” [1]. The paper is about rhythmic micro-gestures, a new interaction technique for interacting with mobile devices. This technique combines rhythmic gestures, an input technique from my CHI 2016 paper, with the concept of micro-gestures, small hand movements that can be performed discreetly. I’ll be giving a talk about this paper at the conference in November, in Glasgow.

We’ve also had a demo accepted by ACM ISS 2017 from the Levitate project [2]. That demo gives attendees the chance to try interacting with mid-air objects, suspended in air by acoustic levitation.

[1] Rhythmic Micro-Gestures: Discreet Interaction On-the-Go
E. Freeman, G. Griffiths, and S. Brewster.
In Proceedings of the 19th ACM International Conference on Multimodal Interaction – ICMI ’17, pp. 115-119. 2017.


@inproceedings{ICMI2017,
    author = {Freeman, Euan and Griffiths, Gareth and Brewster, Stephen},
    booktitle = {{Proceedings of the 19th ACM International Conference on Multimodal Interaction - ICMI '17}},
    title = {{Rhythmic Micro-Gestures: Discreet Interaction On-the-Go}},
    year = {2017},
    publisher = {ACM Press},
    pages = {115--119},
    doi = {10.1145/3136755.3136815},
    pdf = {http://research.euanfreeman.co.uk/papers/ICMI_2017.pdf},
}

[2] Floating Widgets: Interaction with Acoustically-Levitated Widgets
E. Freeman, R. Anderson, C. Andersson, J. Williamson, and S. Brewster.
In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces – ISS ’17 Demos, pp. 417-420. 2017.


@inproceedings{ISS2017Demo,
    author = {Freeman, Euan and Anderson, Ross and Andersson, Carl and Williamson, Julie and Brewster, Stephen},
    booktitle = {{Proceedings of the ACM International Conference on Interactive Surfaces and Spaces - ISS '17 Demos}},
    title = {{Floating Widgets: Interaction with Acoustically-Levitated Widgets}},
    year = {2017},
    publisher = {ACM Press},
    pages = {417--420},
    doi = {10.1145/3132272.3132294},
    pdf = {http://research.euanfreeman.co.uk/papers/ISS_2017_Demo.pdf},
}

CHI 2017 Paper + Videos

I’m happy to note that I’ve had a full paper [1] accepted to CHI 2017. The paper describes research from the ABBI project, about how sound from wearable and fixed sources can be used to help visually impaired children at school (for more, please see here). The videos in this post include a short description of the paper as well as a longer description of the research and our findings.

[1] Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently
E. Freeman, G. Wilson, S. Brewster, G. Baud-Bovy, C. Magnusson, and H. Caltenco.
In Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems – CHI ’17, pp. 4146-4157. 2017.


@inproceedings{CHI2017,
    author = {Freeman, Euan and Wilson, Graham and Brewster, Stephen and Baud-Bovy, Gabriel and Magnusson, Charlotte and Caltenco, Hector},
    booktitle = {{Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems - CHI '17}},
    title = {{Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently}},
    year = {2017},
    publisher = {ACM Press},
    pages = {4146--4157},
    doi = {10.1145/3025453.3025518},
    url = {http://euanfreeman.co.uk/research/#abbi},
    video = {https://www.youtube.com/watch?v=SGQmt1NeAGQ},
    pdf = {http://research.euanfreeman.co.uk/papers/CHI_2017.pdf},
}

Android 6.0 Multipart HTTP POST

This how-to shows you how to use a multipart HTTP POST request to upload a file and metadata to a web server. Android 6.0 removed support for the legacy Apache HTTP libraries, so a lot of the examples you’ll find online are outdated (or require adding the legacy libraries back). This solution uses the excellent OkHttp library from Square: rather than restoring legacy libraries for the old approach, add a modern library that’ll also save you a lot of work!

Step 1: Add OkHttp to your gradle build script

In Android Studio, open the build.gradle script for your main project module and add OkHttp to your dependencies.
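The dependency line looks something like this (the version number here is illustrative; check Square’s OkHttp page for the current release):

```groovy
dependencies {
    // OkHttp 3.x from Square; the version shown is just an example.
    compile 'com.squareup.okhttp3:okhttp:3.4.1'
}
```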

Step 2: Create and execute an HTTP request

This example shows how to upload the contents of a File object to a server, with a username and date string as metadata.
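Here’s a minimal sketch of how that looks with OkHttp 3.x. The upload URL and the form part names (“username”, “date”, “file”) are placeholders; your server will define its own. Remember that Android won’t let you perform network requests on the main thread, so run this from a background thread (e.g., an AsyncTask).

```java
import java.io.File;
import java.io.IOException;

import okhttp3.MediaType;
import okhttp3.MultipartBody;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;
import okhttp3.Response;

public class FileUploader {

    // Placeholder endpoint: replace with your server's upload URL.
    private static final String UPLOAD_URL = "https://example.com/upload";

    public static void upload(File file, String username, String date) throws IOException {
        OkHttpClient client = new OkHttpClient();

        // Build a multipart/form-data body: two text fields plus the file contents.
        RequestBody body = new MultipartBody.Builder()
                .setType(MultipartBody.FORM)
                .addFormDataPart("username", username)
                .addFormDataPart("date", date)
                .addFormDataPart("file", file.getName(),
                        RequestBody.create(MediaType.parse("application/octet-stream"), file))
                .build();

        Request request = new Request.Builder()
                .url(UPLOAD_URL)
                .post(body)
                .build();

        // Synchronous call: execute() blocks until the server responds.
        Response response = client.newCall(request).execute();
        try {
            if (!response.isSuccessful()) {
                throw new IOException("Upload failed: " + response);
            }
        } finally {
            response.close();
        }
    }
}
```

OkHttp also offers an asynchronous enqueue() method if you’d rather receive the response via a callback instead of blocking.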

Summary

OkHttp is awesome because it takes care of a lot of the heavy lifting needed to work with HTTP requests in Android. Construct your request content using Java objects and it’ll do the rest for you. If you’re looking for a replacement for the HTTP libraries removed in Android 6.0, I strongly recommend this one.

ABBI Demo at ICMI ’16

Earlier this month I was in Tokyo for the International Conference on Multimodal Interaction (ICMI). I was there to demo research from the ABBI project. We had two ABBI demos from the Multimodal Interaction Group at the conference: mine demonstrated how ABBI could be used to adapt the lighting at home for visually impaired children, and Graham’s showed how non-visual stimuli (e.g., thermal, vibration) can present affective cues in a more accessible way for visually impaired smartphone users.

The conference was good and it was held in an amazing city – Tokyo. Next year, ICMI visits another amazing city – Glasgow! Julie and Alessandro from the Glasgow Interactive Systems Group will be hosting the conference here at Glasgow Uni.

Viva and Other CHI ’16 Papers

Last week I passed my viva, subject to minor thesis corrections!

I’ve also had a Late-Breaking Work submission accepted to CHI, which discusses recent work I’ve been doing on the ABBI (Audio Bracelet for Blind Interaction) project. The paper, titled “Using Sound to Help Visually Impaired Children Play Independently”, describes initial requirements capture and prototyping for a system which uses iBeacons and a ‘smart’ bracelet to help blind and visually impaired children during play time at nursery and school.

Finally, we’ve also had a position paper accepted to the CHI ’16 workshop on mid-air haptics and displays. It outlines mid-air haptics research we have been doing at Glasgow and discusses how it can inform the creation of more usable mid-air widgets for in-air interfaces.

CHI 30-second Preview + ACing

Below is a (very!) short preview of my upcoming CHI paper. In recent years, CHI has asked authors to submit a 30-second preview video summarising each accepted paper, and this one is mine.

This year I’m an AC for Late-Breaking Work submissions at CHI. I’ve been reviewing papers since the start of my PhD, but this is my first time as an AC. It’s been interesting to see a conference from the “other” side.