Wednesday, October 28, 2015

Tangible user interface: What comes after click?

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. An earlier name, Graspable User Interface, is no longer in common use. The purpose of TUI development is to empower collaboration, learning, and design by giving physical form to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.

For three decades, most of us have interacted with computers in exactly the same way: We point with a mouse (or a finger!), click, and watch the screen. In one way, it's the most outdated element of human-computer interaction around. But in another, it's the thing that's shaped every operating system and device designed since its invention. We're starting to leave it behind, though. Here's what's coming next.

Changing an interaction as deeply entrenched as clicking is, well, monumentally challenging. It's also extremely exciting. It asks us to rethink the way we interact with technology altogether. The catch-all name for this field is tangible (or graspable) user interface design, and we're hearing about it more and more often. Here's a simplified version of what our complex future has in store.

Giving Physical Objects Digital Meaning

For the past few years, the conversation around user interface design has been peppered with words like “invisible” and “disappearing.” The thinking goes that as interfaces develop, they’ll eventually disappear—we’ll just be gesturing in an empty room. Or thinking a command to a fleet of brain-embedded sensors.

It’s easy to see how, conceptually speaking, it’s not a far jump from “intuitive” to “invisible.” Yet they’re not the same thing, as Mark Wilson pointed out a few months back. An interface that’s easy to use isn’t synonymous with one you can’t see. Invisible UIs are confusing: we can’t tell whether they’re working or failing, and they’re hard to learn. As Berg’s Timo Arnall puts it, “literal invisibility can cause confusion, even fear, and they often increase unpredictability and failure.”

But what about an interface that’s woven into the fabric of everyday life? What if future interfaces aren't just visible, but feel-able? What if they’re linked to physical objects that control digital environments? That’s the basic foundation of tangible interface design.


Characteristics of tangible user interfaces

1. Physical representations are computationally coupled to underlying digital information.
2. Physical representations embody mechanisms for interactive control.
3. Physical representations are perceptually coupled to actively mediated digital representations.
4. The physical state of tangibles embodies key aspects of the digital state of a system.

Five basic defining properties of tangible user interfaces have been proposed; a short sketch after the list below illustrates how this kind of physical-to-digital coupling might look in code:

1. Space-multiplexing of both input and output;
2. Concurrent access and manipulation of interface components;
3. Strong specific devices;
4. Spatially aware computational devices;
5. Spatial re-configurability of devices.
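
To make these couplings concrete, here is a minimal, purely illustrative Python sketch. The class and parameter names ("volume", "tempo", the tracking source) are invented for this example and do not come from any real TUI toolkit; the point is only to show dedicated, space-multiplexed tokens whose tracked positions both drive and reflect parts of a digital model, and which can be manipulated concurrently.

from dataclasses import dataclass


@dataclass
class Token:
    """A physical object tracked on an interactive surface."""
    token_id: str   # e.g. read from an RFID tag or a fiducial marker (assumed)
    x: float        # tracked position on the table, normalised to 0.0-1.0
    y: float


class TangibleModel:
    """Digital state whose parameters are bound one-to-one to physical tokens."""

    def __init__(self):
        # Each parameter has its own dedicated token (space-multiplexing):
        # moving the 'volume' token never means anything but volume.
        self.bindings = {"volume": 0.0, "tempo": 0.0}

    def on_token_moved(self, token):
        # Physical manipulation is the interactive control: the token's
        # x position directly drives the parameter it is bound to.
        if token.token_id in self.bindings:
            self.bindings[token.token_id] = token.x

    def render(self):
        # Digital feedback (e.g. graphics projected around the tokens) stays
        # perceptually coupled to the tokens' physical arrangement.
        return ", ".join(f"{name}={value:.2f}" for name, value in self.bindings.items())


# Two tokens can be manipulated concurrently, e.g. with both hands or by two users:
model = TangibleModel()
model.on_token_moved(Token("volume", x=0.8, y=0.2))
model.on_token_moved(Token("tempo", x=0.3, y=0.7))
print(model.render())   # volume=0.80, tempo=0.30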

A simple example of a tangible UI is the computer mouse: dragging the mouse over a flat surface moves a pointer on the screen accordingly, so there is a very clear relationship between the physical movements of the mouse and the behaviour the system shows on screen.
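
As a rough illustration of that coupling, here is a toy Python sketch; the names, gain factor, and screen size are invented for illustration and are not how any particular driver works. Relative physical motion of the mouse is simply accumulated into an absolute pointer position.

# Illustrative only: relative physical motion (dx, dy) reported by the mouse
# is accumulated into an absolute pointer position and clamped to the screen.

SCREEN_W, SCREEN_H = 1920, 1080
GAIN = 2.0                      # "pointer speed" scaling of the raw counts
pointer_x, pointer_y = SCREEN_W / 2, SCREEN_H / 2

def on_mouse_moved(dx, dy):
    """Map a physical displacement of the mouse to pointer motion on screen."""
    global pointer_x, pointer_y
    pointer_x = min(max(pointer_x + GAIN * dx, 0), SCREEN_W - 1)
    pointer_y = min(max(pointer_y + GAIN * dy, 0), SCREEN_H - 1)
    return pointer_x, pointer_y

print(on_mouse_moved(10, -5))   # move the mouse right and away: the pointer follows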
                        
Another example of a tangible UI is the Marble Answering Machine, designed by Durrell Bishop in 1992. Each marble represents a single message left on the answering machine; dropping a marble into a dish plays back the associated message or calls the caller back.
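
That mapping is easy to sketch in code. The following toy Python model is an assumption-laden illustration of the idea, not Bishop's actual implementation: each marble identifier is bound to one recorded message, and the two dishes trigger playback or a return call.

# Sketch of the Marble Answering Machine idea; all names are invented.
class MarbleAnsweringMachine:
    def __init__(self):
        self._messages = {}   # marble_id -> (caller, recording)

    def record_message(self, marble_id, caller, recording):
        """A new call dispenses a marble and binds it to the recording."""
        self._messages[marble_id] = (caller, recording)

    def drop_in_playback_dish(self, marble_id):
        # Placing the marble in the playback dish plays its message.
        caller, recording = self._messages[marble_id]
        print(f"Playing message from {caller}: {recording}")

    def drop_in_dial_dish(self, marble_id):
        # Placing the marble in the dial dish returns the call.
        caller, _ = self._messages[marble_id]
        print(f"Dialling {caller} ...")


machine = MarbleAnsweringMachine()
machine.record_message("marble-1", "+44 20 7946 0000", "Call me back about Tuesday.")
machine.drop_in_playback_dish("marble-1")   # hear the message
machine.drop_in_dial_dish("marble-1")       # return the call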

How it started

Tangible Interaction has been influenced by work from different disciplines, in particular Computing, HCI, and Product/Industrial Design. For Computing and HCI, the notion of a ‘Tangible User Interface’ (as originally conceived in the mid-to-late 1990s) constituted an alternative vision for computer interfaces, one that brings computing back ‘into the real world’ (Wellner, Mackay, Gold 1993; Ishii, Ullmer 1997). A general dissatisfaction with traditional screen-based interfaces and with Virtual Reality, both seen as estranging people from ‘the real world’, motivated the first prototypes, while technological innovations such as RFID made it feasible to build them. The field of Industrial Design, in contrast, came to engage with Tangible Interaction out of necessity, as appliances increasingly contain electronic and digital components and become ‘intelligent’. For designers, this constituted new challenges as well as new opportunities (Djajadiningrat, Overbeeke, Wensveen 2000; Djajadiningrat et al. 2004).

An interesting point is that the challenges and established skills of these disciplines are complementary: where considerations of physical form factors, choice of materials and so on forced computer scientists and HCI researchers out of their comfort zone, industrial designers now had to focus on designing complex behaviour that is digitally controlled and has no inherent relationship to product form.

These practice and research fields had no common discussion forum and intersected only occasionally or through personal contacts; nonetheless, particular product ideas and sketches inspired the notion of a Tangible User Interface. The Marble Answering Machine, devised by Durrell Bishop while studying design at the Royal College of Art, is one such sketch: it used marbles to represent incoming messages. The marbles fall out of the machine and can be played back by placing them into a mould on the machine (Poynor 1995). Generalizing this design yielded the idea of representing data through physical objects and of manipulating the data by physically handling those objects: Ishii's Tangible Bits vision (Ishii, Ullmer 1997).
