CHI 2019


I was at CHI 2019 earlier this week. It was the biggest CHI so far (almost 3,900 attendees), so I’m extra proud to have been part of the organising committee – especially since it was in Glasgow! Aside from organisation, I was helping with the University of Glasgow’s exhibitor booth, had two Interactivity exhibits about acoustic levitation, and chaired a great session on Touch and Haptics. I didn’t get to see many of the technical sessions, but a few talks stuck in my mind.

There were a couple of really good papers in the first alt.chi session: first, an analysis of dichotomous inference in CHI papers, followed by a first look at trends and clichés in CHI paper writing. Both papers were well presented and were a chance to reflect on how we present our science as a community. I’m moving away from dichotomous statistics but am a bit apprehensive about how reviewers will respond to that style. Papers like these provide a bit more momentum for change, which we’ll all benefit from.

I liked Aakar Gupta’s talk on RotoSwype, which used an IMU embedded in a ring for swipe keyboard input in XR. The neat thing about that work was the focus on subtle, low-effort interaction, with hands by the side of the body instead of raised in front. Fatigue is a big barrier for mid-air interaction, especially for prolonged interactions like text entry, so it was nice to see attention paid to that.

There were good papers in the Touch and Haptics session I chaired, but the one that especially sticks in my mind was Philip Quinn’s work on touchscreen force sensing using a barometric pressure sensor. The core idea was that devices are sealed to prevent water and dust ingress, and also contain barometric pressure sensors for accurate altitude measurement; when someone presses the touchscreen, the air pressure inside the almost-completely-sealed device changes briefly, and that internal pressure change correlates reliably with the force applied to the screen. Our group in Glasgow did a lot of foundational work on pressure input for mobile devices, so it’s cool to see steps towards enabling this without needing dedicated sensors.
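Just to make the idea concrete – and this is purely my own rough sketch, not how the paper does it – you could imagine watching Android’s built-in barometer for brief spikes above a slowly-adapting baseline while a finger is on the screen:

// Assumes the usual android.hardware sensor imports and a valid Context
// (e.g. this code lives inside an Activity). The smoothing and threshold
// values are arbitrary guesses for illustration, not values from the paper.
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor barometer = sensorManager.getDefaultSensor(Sensor.TYPE_PRESSURE);

sensorManager.registerListener(new SensorEventListener() {
    private float baseline = Float.NaN;

    @Override
    public void onSensorChanged(SensorEvent event) {
        float pressure = event.values[0];                 // ambient pressure in hPa
        if (Float.isNaN(baseline)) baseline = pressure;
        baseline = 0.99f * baseline + 0.01f * pressure;   // slowly-adapting baseline
        if (pressure - baseline > 0.05f) {
            // Brief internal pressure spike: treat it as a force event if it
            // coincides with an active touch on the screen.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, barometer, SensorManager.SENSOR_DELAY_FASTEST);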

Pervasive Displays 2019 Paper

Pleased to announce I’ve had a full paper accepted by ACM Pervasive Displays 2019! The paper, titled “Enhancing Physical Objects with Actuated Levitating Particles”, is about using acoustic levitation to add actuated display elements to ordinary physical objects. This is a means of adding interactivity and dynamic output to otherwise static, non-interactive objects. See this page for more about this paper.

SICSA DemoFest 2018

Earlier this week I was at SICSA DemoFest in Edinburgh, talking about acoustic levitation and some of the work we’ve been doing on the Levitate project.

For more information about Levitate and the awesome work we’ve been doing, follow us on Twitter (@LevitateProj), check out the project website, and see more photos and videos here.

Want to make your own acoustic levitator? You can build a simpler version of our device by following Asier Marzo’s Instructables guide.

Image used in my demo presentation. The image illustrates what a standing wave looks like and gives a brief explanation of how acoustic levitation works: small objects can be levitated between high-amplitude areas of the standing sound wave, and objects can be moved in mid-air by moving the sound waves.
A close-up photo of our demo booth at the event. We demonstrated a levitation device with a polystyrene bead being moved in a variety of patterns in mid-air.

CHI 2018


I’m going to be at CHI in Montreal next week to present my full paper, titled “Point-and-Shake: Selecting from Levitating Object Displays”. I’m giving the last talk in the Input: Targets and Selection session (Thursday 26th April, 9am, Room 517C). Come along to hear about interaction with levitating objects! To find out more, read about the Levitate project and Point-and-Shake.

I’m also participating in the Mid-Air Haptics for Control Interfaces workshop, run by Ultrahaptics. In the workshop, I’m co-chairing a session with Seokhee Jeon from Kyung Hee University, focusing on the perception of mid-air haptics.

Finally, I’m also going to be chairing the Typing & Touch 2 papers session (Thursday 26th April, 2pm, Room 514B), which has four interesting papers on touchscreen interaction and haptic feedback.

CHI 2018 Paper

I’ve had a paper accepted to the CHI 2018 conference, describing recent work on the Levitate project. The paper is about Point-and-Shake, a mid-air interaction technique for selecting levitating objects. I’m looking forward to presenting this work in Montreal in April! The following 30-second preview gives a super quick demo of our interaction technique.

For more information about the Levitate project, check us out on Twitter: @LevitateProj

CHI 2017 Paper + Videos

I’m happy to note that I’ve had a full paper [1] accepted to CHI 2017. The paper describes research from the ABBI project, about how sound from wearable and fixed sources can be used to help visually impaired children at school (for more, please see here). The videos in this post include a short description of the paper as well as a longer description of the research and our findings.

[1] Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently
E. Freeman, G. Wilson, S. Brewster, G. Baud-Bovy, C. Magnusson, and H. Caltenco.
In Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems – CHI ’17, pp. 4146-4157. 2017.

Android 6.0 Multipart HTTP POST

This how-to shows you how to use a multipart HTTP POST request to upload a file and metadata to a web server. Android 6.0 removed support for the legacy Apache HTTP libraries, so a lot of the examples you’ll find online are outdated (or require adding the legacy libraries back in). This solution uses the excellent OkHttp library from Square – instead of restoring the legacy libraries for the old method, add a modern library that’ll also save you a lot of work!

Step 1: Add OkHttp to your gradle build script

In Android Studio, open the build.gradle script for your main project module and add OkHttp to your dependencies:
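Something along these lines should work – the version number below is only an example, so check Square’s OkHttp page for the current release:

dependencies {
    compile 'com.squareup.okhttp3:okhttp:3.4.1'
}

(On newer versions of the Android Gradle plugin, use implementation rather than compile.)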

Step 2: Create and execute an HTTP request

This example shows how to upload the contents of a File object to a server, with a username and date string as metadata.
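A minimal sketch using the OkHttp 3.x API is shown below. The upload URL, the form field names ("username", "date", "file") and the uploadFile helper name are placeholders I’ve chosen for illustration – change them to match whatever your server expects.

import java.io.File;
import java.io.IOException;

import okhttp3.MediaType;
import okhttp3.MultipartBody;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;
import okhttp3.Response;

public class Uploader {

    private final OkHttpClient client = new OkHttpClient();

    // Uploads the file plus two metadata fields as a multipart/form-data POST.
    public boolean uploadFile(File file, String username, String date) throws IOException {
        RequestBody body = new MultipartBody.Builder()
                .setType(MultipartBody.FORM)
                .addFormDataPart("username", username)
                .addFormDataPart("date", date)
                .addFormDataPart("file", file.getName(),
                        RequestBody.create(MediaType.parse("application/octet-stream"), file))
                .build();

        Request request = new Request.Builder()
                .url("https://example.com/upload")   // placeholder URL
                .post(body)
                .build();

        // Android won't let you make network calls on the main thread, so call
        // this from a background thread, or use client.newCall(request).enqueue(...)
        // with a callback instead of execute().
        Response response = client.newCall(request).execute();
        try {
            return response.isSuccessful();
        } finally {
            response.close();
        }
    }
}

If you’d rather not manage a background thread yourself, enqueue() hands the request to OkHttp’s own dispatcher and delivers the response to a Callback.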

Summary

OkHttp is awesome because it removes a lot of the heavy lifting necessary to work with HTTP requests in Android. Construct your request content using Java objects and it’ll do the rest for you. If you’re looking for a replacement for the HTTP libraries deprecated in Android 6.0, I strongly recommend this one.

ABBI Demo at ICMI ’16

Earlier this month I was in Tokyo for the International Conference on Multimodal Interaction (ICMI). I was there to demo research from the ABBI project. We had two ABBI demos from the Multimodal Interaction Group at the conference: mine demonstrated how ABBI could be used to adapt the lighting at home for visually impaired children, and Graham’s was about using non-visual stimuli (e.g., thermal, vibration) to present affective cues in a more accessible way for visually impaired smartphone users.

The conference was good and it was held in an amazing city – Tokyo. Next year, ICMI visits another amazing city – Glasgow! Julie and Alessandro from the Glasgow Interactive Systems Group will be hosting the conference here at Glasgow Uni.

Viva and Other CHI ’16 Papers

Last week I passed my viva, subject to minor thesis corrections!

I’ve also had a Late-Breaking Work submission accepted to CHI, which discusses recent work I’ve been doing on the ABBI (Audio Bracelet for Blind Interaction) project. The paper, titled “Using Sound to Help Visually Impaired Children Play Independently”, describes initial requirement capture and prototyping for a system which uses iBeacons and a ‘smart’ bracelet to help blind and visually impaired children during play time at nursery and school.

Finally, we’ve also had a position paper accepted to the CHI ’16 workshop on mid-air haptics and displays. It outlines mid-air haptics research we have been doing at Glasgow and discusses how it can inform the creation of more usable mid-air widgets for in-air interfaces.