Have You Tried Shaking It?: The Evolution of Intuition and Discovery in Digital Interfaces

There's a kind of quiet dialogue hidden in the design of products, both physical and digital, and when it's done well, we usually don't even realize it's happening. These conversations are constant and essential: the everyday objects in our lives can tell us a lot about themselves before we even pick them up, from basic questions like "what do you do?" and "how do you work?" to more complex things like error signaling and state feedback.

A Design Primer: Affordances and Signifiers

Design academic Don Norman popularized two concepts to describe how products engage in this silent dialogue, beginning with his now widely referenced 1988 book, "The Design of Everyday Things." Roughly: affordances are the things a product can do with respect to a user, and signifiers are the ways a product communicates the things it can do. A door, for example, generally has the affordance of being pushed or pulled open, just as its handle signifies that this affordance exists.

In the realm of physical objects, affordances and signifiers are often inextricably related. Oftentimes, the physical properties that enable an affordance simultaneously signify what that affordance is; in other words, the affordance itself becomes perceivable. The hinge on a door enables the door to open and, because we understand how hinges work and what they look like, also lets us discover that the thing attached to the hinges is a door that, indeed, opens. The silly traps that Wile E. Coyote lays for his road-running nemesis tell us exactly how they work just by looking at them, which is precisely why they can be subverted for laughs. From a design perspective, these perceivable affordances and signifiers are the conversational medium through which products communicate with users, and a tool for designers to facilitate that conversation.

In the digital space, things are different. The interfaces that people interact with aren't constrained by the physical properties that govern the real world; they aren't bound by material, manufacturing method, or physical forces, so some of the natural ways in which an object's affordances and its signifiers are related are lost. Building something in the real world that can be sat on, like a chair, often also ends up looking like something that can be sat on (flat, supportive, butt-sized), but telling a computer to make something functionally clickable in a digital interface has no inherent effect on how that thing is rendered, unless we also tell the computer to render it in some specific way. In some ways this gives designers of digital interfaces more freedom, because the ways they can communicate a product's functions are freed from any physical limitations of the functions themselves, but it also requires that designers be more intentional about building out these signifiers to direct users down desirable paths.
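To make that decoupling concrete, here's a minimal sketch (a hypothetical toy element, not any real UI framework) of how behavior and appearance live in separate channels in code: granting the affordance touches only the handler, and the signifier has to be added as a deliberate, separate step.

```python
# A toy sketch (hypothetical names, not a real UI framework) showing that
# an element's affordance and its signifiers are independent in code.

class Element:
    def __init__(self, text):
        self.text = text
        self.style = {}        # visual properties: the signifiers
        self._on_click = None  # behavior: the affordance

    def set_on_click(self, handler):
        # Granting the affordance changes behavior only, never appearance.
        self._on_click = handler

    def click(self):
        if self._on_click:
            self._on_click()

clicked = []
label = Element("Save")
label.set_on_click(lambda: clicked.append("saved"))

label.click()             # the element is now functionally clickable...
assert clicked == ["saved"]
assert label.style == {}  # ...but nothing about its rendering has changed

# Signaling clickability is a separate, deliberate design step:
label.style["cursor"] = "pointer"
```

Nothing forces the last line to happen; a physical chair can't help looking sittable, but a clickable element can look like anything at all.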

Metaphor, Intuition, and Expectation in Digital Interfaces

In the late 1980s, in the early days of consumer computing, there was a genuinely difficult problem: teaching people how to use computers at all. This was the first generation of real consumer machines, and because nothing had existed previously to ease the marketplace into such a massive technological shift, interface designers leaned heavily on conceptual and visual metaphors from the real world, drawing parallels between the physical objects we were used to dealing with and important concepts in the computer's workflow. This approach of explanation via metaphor, often called skeuomorphic design, essentially allowed designers to import the familiar affordances and signifiers of the real world into the digital one, and gave new users an intuitive way to orient themselves in the virtual environment. When early consumer interfaces incorporated skeuomorphic design as a way to communicate how they worked, it acted as a sort of introductory interface that was simple to use and easy to learn.

Since then, naturally, a lot has changed. But the interesting thing about the evolution of digital interfaces is the way collective context and user intuition have evolved with it. In the early days, the relationship between digital interfaces and the real-world metaphors that explained them was foundational to how users understood the interfaces themselves. Users understood that a button was clickable because it came with the same visual signifiers that accompanied real physical buttons: it had depth and a shadow, and probably text that said "click here for..." Over time, however, we developed a collective visual language and intuition for how digital interfaces worked, and the metaphors that had let us migrate from a purely physical world to a digital one were no longer strictly necessary. Users learned what a clickable button looked like and where to expect one, and eventually the visual design language evolved too. We no longer need to make digital buttons look like physical ones to show that an element is clickable, and buttons have mostly lost their 3D shadows and explicit instructional text.

The other important thing to realize is that our collective intuition about how things work and the interfaces that exist at any given time are inextricably related. The interfaces we interact with inform the expectations and intuitions we develop about them, and the expectations we hold allow future interfaces to adapt to our collective understanding. The early period of heavily skeuomorphic design made possible the flat design that came after, which in turn allowed for more and more abstract design styles as user expectation and intuition evolved alongside them.

Hidden Interactions and Discovery Today

These days, we have such a large vocabulary of digital interactions that some commonly used interfaces are built purely on intuition and expectation, in a way that can seem undiscoverable to the uninitiated. Physical gesture interactions like pull-to-refresh, shake-to-undo, or swipe-left-to-go-back are ubiquitous mainstays of mobile interfaces that carry no visual signifiers at all, and populations that don't already know to try them can find themselves left behind.
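Under the hood, these signifier-less gestures exist only as heuristics over sensor data; there is literally nothing on screen to discover. Here's a minimal sketch of how shake detection might work, with made-up accelerometer samples and an arbitrary threshold rather than any platform's real motion API:

```python
# A sketch of shake detection: count accelerometer samples whose magnitude
# deviates sharply from resting gravity (~1 g in these made-up units); enough
# peaks in a window counts as a shake. Thresholds here are illustrative only.
import math

def is_shake(samples, threshold=2.5, min_peaks=3):
    peaks = 0
    for (x, y, z) in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if abs(magnitude - 1.0) > threshold:  # far from resting gravity
            peaks += 1
    return peaks >= min_peaks

still = [(0.0, 0.0, 1.0)] * 10                      # phone resting on a table
vigorous = [(4.0, 0.0, 1.0), (-4.0, 0.0, 1.0)] * 3  # rapid side-to-side motion

assert not is_shake(still)
assert is_shake(vigorous)
```

The gesture is, in effect, a threshold test that the user is never told about; everything about its discoverability has to come from somewhere other than the interface itself.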

And as the underlying technology and the interfaces that sit on top of it continue to evolve, we sometimes find ourselves straddling user groups with different intuitions about how interfaces should work. People born only a handful of generations ago lived most of their lives before the advent of widespread commercial computing, while younger generations have never known a world without it. How can we build interfaces and digital experiences that feel effortless to both? How can we build interfaces that appeal to, and build upon, one population's sense of intuition and expectation, without alienating another and sacrificing discoverability?

And this isn't just a generational issue: localized groups with different expectations about how interfaces work exist in lots of places. A common interaction in Korean branded rewards apps involves shaking your phone to bring up a barcode to scan at an in-store kiosk to collect points. In some ways it's an incredibly intuitive interaction, because there's no fumbling through layers of in-app options for your rewards number while holding up the line at a busy Starbucks. But users from outside the country who haven't built up the same kind of digital intuition will probably never find it naturally, because it lacks the basic signifiers that would make it discoverable in the first place.

Building the Future

The point of all of this is that in order to design intuitive user experiences, we have to think carefully about the limitations and boundaries of our own collective understanding, and about the ways we learn how experiences are supposed to work. The increasingly rapid arrival of new technologies without standardized design languages or interface precedents means we're on the edge of new and exciting territory, but also that the incoming waves of interfaces for things like AI, AR, and VR will face that same difficult task of creating novel yet cohesive experiences. Now more than ever, we need to consider what it means to build experiences that are both intuitive and discoverable, and to ensure that the steady advances of technology remain accessible to the people who need them, whoever and wherever they are.