“Feelable” touchscreens revisited

Tactus have attracted quite a lot of attention recently after demonstrating their new touchscreen technology. Their “Tactile Layer” technology raises bubbles on the touchscreen, creating, in essence, physical objects on the screen. I suppose I’ve taken quite an interest in this since it’s similar to something I wrote about six months ago: feelable touchscreens.

Here are two amazing and innovative technologies, each taking a different approach to creating tactile sensations from a touchscreen. Senseg use small electric charges to stimulate the skin, creating edges and feelings of texture, while Tactus actually create something physical.

To the best of my understanding, Tactus’ technology raises bubbles (I’m reluctant to call them buttons; who knows what else interaction designers could do with this!) in pre-determined locations that are configured during manufacture. Different configurations are apparently possible, but from what I’ve read it seems these are still fixed when the screen is made. Whilst this allows some fundamental improvements to the touchscreen experience (e.g. providing a configuration for a keyboard), it lacks flexibility, since manufacture determines where bubbles can appear.

Senseg’s tech, however, is more flexible and appears to be truly dynamic; application developers can control the precise locations where tactile sensations are felt, rather than this being decided during manufacture.
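To make that difference concrete, here is a rough sketch of what a dynamic tactile API could look like from an application developer’s point of view. Everything in it is hypothetical (TactileRegion and its fields are illustrative names of my own, not Senseg’s actual SDK); the only point is that the regions are created and positioned at runtime rather than fixed when the screen is manufactured.

```java
// Hypothetical sketch only: TactileRegion is NOT a real Senseg API.
// The point: with a dynamic technology, tactile areas are defined at runtime.
import java.util.ArrayList;
import java.util.List;

public class TactileKeyboardSketch {

    // A rectangular area of the screen that should "feel" different.
    static class TactileRegion {
        final int left, top, right, bottom;
        final String texture; // e.g. "edge", "rough", "smooth"

        TactileRegion(int left, int top, int right, int bottom, String texture) {
            this.left = left;
            this.top = top;
            this.right = right;
            this.bottom = bottom;
            this.texture = texture;
        }
    }

    public static void main(String[] args) {
        // A row of ten virtual keys, each wrapped in an "edge" feel. Because
        // the regions are ordinary runtime objects, the app could just as
        // easily move them when the layout or orientation changes, which a
        // fixed-at-manufacture approach cannot do.
        List<TactileRegion> regions = new ArrayList<>();
        int keyWidth = 100, keyHeight = 120, keyboardTop = 800;
        for (int i = 0; i < 10; i++) {
            regions.add(new TactileRegion(i * keyWidth, keyboardTop,
                    (i + 1) * keyWidth, keyboardTop + keyHeight, "edge"));
        }
        System.out.println("Defined " + regions.size() + " tactile regions at runtime");
    }
}
```

The Tactus equivalent, as I understand it, would amount to choosing one of a handful of bubble layouts baked into the panel, rather than computing regions like this on the fly.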

Having dabbled with Microsoft Surface over the past year, I’m pleased to see that both of these technologies apparently scale well to larger displays. Interactive tabletops suffer from the same loss of tactile feedback as touchscreen mobile devices, although this is perhaps less apparent on a large-scale device where widgets aren’t crammed into such a small space.

I don’t think it’s fair to ask which of these technologies is better, because they can’t fairly be compared. Although the flexibility of Senseg vs the physical tactility of Tactus is an interesting comparison, I feel a better question is: could these concepts somehow be combined? Imagine a touchscreen which offers complete configuration flexibility, a richer tactile experience like Senseg claim to offer (e.g. feeling texture, not just the presence of something), and the benefit of feeling something physical on the screen. Now that would be awesome.

Virtual keyboards and “feelable” touchscreens

Senseg made a splash recently when they revealed their touchscreen technology, which allows you to actually “feel” objects on-screen. By manipulating small electric charges, users can feel texture as they interact with a touchscreen. It’d be too easy to dismiss this as a gimmick; however, I think this type of technology has the potential to make a positive impact on mobile devices.

Touchscreens are becoming increasingly ubiquitous in mobile devices, leading to the demise of the hardware keyboard. A glance at HTC’s current line-up shows only two of seventeen phones with a hardware keyboard. Samsung, likewise, offer only two phones with a hardware keyboard. While touchscreens make it possible to eliminate hardware keyboards and other unsightly buttons for the sake of sleek aesthetics, they’ve so far failed (in my opinion) to provide a suitable replacement for hardware keys.

Yes, touchscreen keyboards are flexible and can offer a variety of layouts; however, they still don’t give sufficient physical feedback to allow fast touch typing. One reason we’re better at typing on physical keyboards is that we “know” where our fingers are. The edges of keys (and the raised bumps often found on some keys) provide reference points for other locations on the keyboard. Without looking at the keyboard, an experienced typist can type upwards of 100 words per minute. On a touchscreen, without proper physical feedback, you can expect just a small fraction of that speed.

One argument against that could be screen size; however, tablets suffer from the same problems. The 26 letter keys on my keyboard are of a comparable size to those on the virtual keyboard of my 10-inch tablet. A popular approach to providing feedback on mobile devices is to vibrate upon key press, but this conveys little information other than “you’ve pressed a key”. An alternative approach to making touchscreen keyboards easier to use has been patented by IBM: a virtual keyboard that adjusts itself to how users type on-screen. Auto-correct is another feature which has arisen to aid the use of virtual keyboards, yet it addresses the symptoms rather than the cause.
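For reference, the standard “vibrate on key press” behaviour takes only a couple of lines on Android. This is a minimal sketch, assuming a custom view that represents a single virtual key (the KeyView class name is mine, not from any particular keyboard app); note that the buzz is identical whichever key is hit, which is exactly why it conveys so little.

```java
import android.content.Context;
import android.view.HapticFeedbackConstants;
import android.view.MotionEvent;
import android.view.View;

// Minimal sketch of the usual "vibrate on key press" feedback on Android.
// KeyView is a hypothetical view standing in for a single virtual key.
public class KeyView extends View {

    public KeyView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            // One short, uniform buzz: the user learns that a key was pressed,
            // but not which key, nor where its edges are.
            performHapticFeedback(HapticFeedbackConstants.KEYBOARD_TAP);
        }
        return super.onTouchEvent(event);
    }
}
```

Contrast that with a keyboard whose key edges can actually be felt: the feedback would carry positional information before the key is even pressed, rather than a uniform confirmation afterwards.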

Enter touchscreens you can “feel”. Actually being able to feel (something which resembles) the edges of keys on a virtual keyboard is likely to make it much easier to type on touchscreen devices. If technology becomes available which allows effective representation of edges (which Senseg claim theirs can), touchscreen devices will be able to offer what is, in my opinion, a real improvement to virtual keyboards. I think this could be of particularly great benefit on tabletop computers, which, by nature, allow a more natural typing position than handheld devices. Or perhaps this is all just wishful thinking, because I go from 110 WPM at my desktop to around 5 WPM on my phone.