This morning I presented my paper on audible beacons at CHI in Denver. The paper is available here. The slides from my presentation, with notes available in the presenter view, are available here. If you want to find out more about the project this research comes from, please visit the ABBI project website.
CHI 2017 Paper + Videos
I’m happy to note that I’ve had a full paper [1] accepted to CHI 2017. The paper describes research from the ABBI project, about how sound from wearable and fixed sources can be used to help visually impaired children at school (for more, please see here). The videos in this post include a short description of the paper as well as a longer description of the research and our findings.
[1] Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently
E. Freeman, G. Wilson, S. Brewster, G. Baud-Bovy, C. Magnusson, and H. Caltenco.
In Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems – CHI ’17, 4146-4157. 2017.
@inproceedings{CHI2017,
author = {Freeman, Euan and Wilson, Graham and Brewster, Stephen and Baud-Bovy, Gabriel and Magnusson, Charlotte and Caltenco, Hector},
booktitle = {{Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems - CHI '17}},
title = {{Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently}},
year = {2017},
publisher = {ACM},
pages = {4146--4157},
doi = {10.1145/3025453.3025518},
url = {http://euanfreeman.co.uk/research/#abbi},
video = {{https://www.youtube.com/watch?v=SGQmt1NeAGQ}},
pdf = {http://research.euanfreeman.co.uk/papers/CHI_2017.pdf},
}
Android 6.0 Multipart HTTP POST
This how-to shows you how to use a multipart HTTP POST request to upload a file and metadata to a web server. Android 6.0 removed support for the legacy Apache HTTP libraries, so a lot of the examples you’ll find online are outdated (or require adding the legacy libraries back). This solution uses the excellent OkHttp library from Square: rather than pulling the legacy libraries back in just to use the old approach, add a modern library that will also save you a lot of work!
Step 1: Add OkHttp to your gradle build script
In Android Studio, open the build.gradle script for your main project module and add OkHttp to your dependencies:
dependencies {
    compile 'com.squareup.okhttp3:okhttp:3.5.0'
}
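A quick note: compile was the standard dependency configuration when this was written, but newer versions of the Android Gradle plugin (3.0 and later) replace it with implementation, so your build script may need to look like this instead (a more recent OkHttp 3.x version should also be fine):

dependencies {
    implementation 'com.squareup.okhttp3:okhttp:3.5.0'
}

You’ll also need the INTERNET permission declared in your app manifest, otherwise the request in the next step will fail.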
Step 2: Create and execute an HTTP request
This example shows how to upload the contents of a File object to a server, with a username and date string as metadata.
String UPLOAD_URL = "http://yoururl.com/example.php";

// Example data
String username = "test_user_123";
String datetime = "2016-12-09 10:00:00";
File image = getImage();

// Create an HTTP client to execute the request
OkHttpClient client = new OkHttpClient();

// Create a multipart request body. Add metadata and files as 'data parts'.
RequestBody requestBody = new MultipartBody.Builder()
        .setType(MultipartBody.FORM)
        .addFormDataPart("username", username)
        .addFormDataPart("datetime", datetime)
        .addFormDataPart("image", image.getName(),
                RequestBody.create(MediaType.parse("image/jpeg"), image))
        .build();

// Create a POST request to send the data to UPLOAD_URL
Request request = new Request.Builder()
        .url(UPLOAD_URL)
        .post(requestBody)
        .build();

// Execute the request and get the response from the server
Response response = null;
try {
    response = client.newCall(request).execute();
} catch (IOException e) {
    e.printStackTrace();
}

// Check the response to see if the upload succeeded
if (response == null || !response.isSuccessful()) {
    Log.w("Example", "Unable to upload to server.");
} else {
    Log.v("Example", "Upload was successful.");
}
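One caveat: execute() makes a synchronous network call, which Android won’t allow on the main (UI) thread, so the example above assumes it’s running on a background thread. If you’d rather let OkHttp handle the threading, its enqueue() method runs the same request asynchronously. A minimal sketch, reusing the client and request objects from above:

client.newCall(request).enqueue(new Callback() {
    @Override
    public void onFailure(Call call, IOException e) {
        // The request could not be executed (e.g. no network connection)
        Log.w("Example", "Unable to upload to server.", e);
    }

    @Override
    public void onResponse(Call call, Response response) throws IOException {
        // Called on a background thread once the server has responded
        if (response.isSuccessful()) {
            Log.v("Example", "Upload was successful.");
        } else {
            Log.w("Example", "Upload failed with HTTP " + response.code());
        }
        response.close();
    }
});

The callbacks run on a background thread, so if you need to update the UI from them, post back to the main thread (e.g. with runOnUiThread).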
Summary
OkHttp is awesome because it takes care of a lot of the heavy lifting involved in making HTTP requests on Android. Construct your request content from Java objects and it’ll do the rest for you. If you’re looking for a replacement for the HTTP libraries removed in Android 6.0, I strongly recommend it.
ABBI Demo at ICMI ’16
Earlier this month I was in Tokyo for the International Conference on Multimodal Interaction (ICMI). I was there to demo research from the ABBI project. We had two ABBI demos from the Multimodal Interaction Group at the conference: mine demonstrated how ABBI could be used to adapt the lighting at home for visually impaired children, and Graham’s was about using non-visual stimuli (e.g., thermal, vibration) to present affective cues in a more accessible way for visually impaired smartphone users.
The conference was good and it was held in an amazing city – Tokyo. Next year, ICMI visits another amazing city – Glasgow! Julie and Alessandro from the Glasgow Interactive Systems Group will be hosting the conference here at Glasgow Uni.
Viva and Other CHI ’16 Papers
Last week I passed my viva, subject to minor thesis corrections!
I’ve also had a Late-Breaking Work submission accepted to CHI, which discusses recent work I’ve been doing on the ABBI (Audio Bracelet for Blind Interaction) project. The paper, titled “Using Sound to Help Visually Impaired Children Play Independently”, describes initial requirements capture and prototyping for a system which uses iBeacons and a ‘smart’ bracelet to help blind and visually impaired children during play time at nursery and school.
Finally, we’ve also had a position paper accepted to the CHI ’16 workshop on mid-air haptics and displays. It outlines mid-air haptics research we have been doing at Glasgow and discusses how it can inform the creation of more usable mid-air widgets for in-air interfaces.
CHI 30-second Preview + ACing
Below is a (very!) short preview of my upcoming CHI paper. In recent years, CHI has asked authors to submit a 30-second preview video summarising accepted papers, so that’s mine.
This year I’m an AC for Late-Breaking Work submissions at CHI. I’ve been reviewing papers since the start of my PhD, but this is my first time as an AC. It’s been interesting to see a conference from the “other” side.
CHI 2016, ABBI, and other things
My CHI 2016 submission, “Do That There: An Interaction Technique for Addressing In-Air Gesture Systems”, has been conditionally accepted! The paper covers the final three studies in my PhD, where I developed and evaluated a technique for addressing in-air gesture systems.
To address a gesture system is to direct input towards it: this involves finding where to perform gestures and how to specify which system you intend to interact with (so that other systems do not act upon your gestures). Do That There (a play on one of HCI’s most famous gesture papers, Put That There) supports both of these things: it shows you where to perform gestures, using multimodal feedback (there), and it shows you how to identify the system you want to gesture at (do that).
Three months ago I started working on the ABBI (Audio Bracelet for Blind Interaction) project as a post-doctoral researcher. The ABBI project is developing wearable technology for blind and visually impaired children. Our role at Glasgow is to investigate sound design and novel interactions which use the technology, focusing on helping visually impaired kids. Recently, we’ve presented our research and ideas at the RNIB TechShare conference and to members of SAVIE, an association focusing on the education of visually impaired children.
Finally, I submitted my PhD thesis in September, although I’m still waiting for my final examination. Unfortunately that won’t be happening in 2015, but I’m looking forward to getting it wrapped up soon.
Interactive Light Demo at Interact ’15
This week I’ve been in Bamberg, Germany, presenting a poster and an interactive demo at Interact 2015. If you’ve stumbled across this website via my poster, or if you tried my demo at the conference, then it was nice meeting you and I hope you had some fun with it! If you’re looking for more information about the research, I’ve written a little about it here: http://euanfreeman.co.uk/interactive-light-feedback/
For some earlier research, where we looked at using tactile feedback for in-air gestures, see: http://euanfreeman.co.uk/projects/above-device-tactile-feedback/
PhD Thesis and Interact 2015
I haven’t updated my website in months, mostly because I’ve been focusing on finishing my PhD research. In May I started writing my thesis and I’m now approaching the end: with 60,000 words written and my first draft almost complete, I’m nearly there. It’s been an exciting couple of months writing it up and I’m looking forward to finishing; not because I haven’t enjoyed it, but because it’s the most substantial piece of work I’ve ever undertaken and it’s exciting to see it all come together into a single piece of writing.
Not much else has happened in that time, although I did have two submissions accepted to Interact 2015: a poster [1] and a demo [2]. Both describe interactive light feedback, something which has featured a lot in my recent PhD research; I describe it in more detail here. Interact is in Bamberg, Germany this year, which I’m excited about visiting! Nearer the time (mid-September) I’ll share more photos and maybe a video of the demo I’ll be giving at the conference.
[1] Towards In-Air Gesture Control of Household Appliances with Limited Displays
E. Freeman, S. Brewster, and V. Lantz.
In Interact 2015 Posters. 2015.
@inproceedings{Interact2015Poster,
author = {Freeman, Euan and Brewster, Stephen and Lantz, Vuokko},
booktitle = {Interact 2015 Posters},
title = {{Towards In-Air Gesture Control of Household Appliances with Limited Displays}},
year = {2015},
publisher = {Springer},
doi = {10.1007/978-3-319-22723-8_73},
pdf = {http://research.euanfreeman.co.uk/papers/Interact_2015_Poster.pdf},
url = {http://link.springer.com/chapter/10.1007/978-3-319-22723-8_73},
}
[2] Interactive Light Feedback: Illuminating Above-Device Gesture Interfaces
E. Freeman, S. Brewster, and V. Lantz.
In Interact 2015 Demos. 2015.
@inproceedings{Interact2015Demo,
author = {Freeman, Euan and Brewster, Stephen and Lantz, Vuokko},
booktitle = {Interact 2015 Demos},
title = {{Interactive Light Feedback: Illuminating Above-Device Gesture Interfaces}},
year = {2015},
publisher = {Springer},
doi = {10.1007/978-3-319-22723-8_42},
pdf = {http://research.euanfreeman.co.uk/papers/Interact_2015_Demo.pdf},
url = {http://euanfreeman.co.uk/interactive-light-feedback/},
}
A New Smart-Watch Design Space?
Almost exactly a year ago I wrote about my first impressions of Pebble and concluded that “I have to wonder if smart-watches even need a display”. As a smart-watch, I found Pebble most useful for remotely controlling my phone (through its physical buttons) and for promoting awareness of notifications on my phone (through its vibration alerts); its “clunky and awkward user interface” was even detrimental to its other, more important, function as an ordinary watch.
With that in mind, I was excited by Yahoo! Labs’ recent paper at Tangible, Embedded, and Embodied Interaction (or TEI): Shimmering Smartwatches. In it, they present two prototype smart-watches that don’t have a screen, instead using less sophisticated (but just as expressive and informative) LEDs.
One of their prototypes, Circle, used a circular arrangement of twelve LEDs, each in place of an hour mark on the watch-face. By changing the brightness and hue of the LEDs, the watch was able to communicate information from smart-watch applications, like activity trackers and countdown timers. Their other prototype used four LEDs placed behind icons on the watch-face. Again, brightness and hue could be modulated to communicate more information about each of the icons.
I really like the ideas in this paper and its prototypes. High-resolution displays are more expensive than simple LED layouts, require more power, and are not necessarily more expressive. Hopefully someone builds on the new design space presented by Shimmering Smartwatches, which is certainly expressive yet lower cost. Also, everything is better with coloured LEDs.