Essays & Articles


Design as Communication

(This essay is also available in Chinese.)

I was traveling, and once again I woke up in a strange hotel room in a strange city: Delft, the Netherlands, in this instance. I went to take a shower to prepare myself for my 8:30 AM pickup. As I looked over the bathtub and shower, wondering where to put the soap, I realized that the design was talking to me. “Put the soap here,” the metal dish on the side practically screamed at me. “Grab here,” said the handle at the far side of the tub. I looked up at the showerhead fastened to the wall, then down along the tub to a strange hook-shaped device just above the tub. “What is that for?” I wondered, as my eyes searched for something relevant. I looked back at the showerhead and realized it was fastened to the wall with the same hook-like device, with flexible tubing leading back to the faucet. I lifted the showerhead off its upper location and put it down below. Yes, it fit perfectly. “No,” I said to no one in particular, “I like my shower above my head.” I took one glance at the towel rack at the side, lined with towels, all appealing to me: “take me,” each appeared to say. And as I prepared to take my shower, I looked back at the soap dish, which was still imploring “put the soap here,” and firmly announced “no, I like my soap at the back end of the tub,” and I put my newly unwrapped bar there, on the ledge so conveniently provided.

Later, as I thought back about that morning shower, I realized I had been communicating with the designers. “Grab here,” the bar was telling me. “Put the soap on me,” the wire rack soap dish screamed. “Here are your towels,” said the horizontal bars at the rear wall, at the end of the tub, conveniently stocked with towels. “Thank you, yes, and no, not for the soap,” I was replying. I even spoke some of these comments aloud. To whom was I speaking? An observer would have thought me deranged, for there was no one in the room. Fortunately, I usually shower without an observer, but I was having a conversation with the designers, considering their suggestions, accepting some and rejecting others. The designers may not have been there to listen, but their statements clearly required an answer.

Note that sometimes design is done by an individual, sometimes by a group. Designers sometimes are trained, often with degrees and certificates, sometimes not. But everyone is a designer at times, and whoever decided which soap dish to purchase and where to place it was designing, whether or not they had the official title of one. In the case of the bathroom, multiple designers were clearly involved, for the racks, bars, trays, tub, and shower equipment were commercial products, designed remotely in some industrial location, probably by different people in different companies, independently of one another. Whoever picked those particular products and configured them within the room was also designing, even if each of the individual items was already manufactured. Design is a complex activity, and most of the objects, spaces, and even services that we interact with were designed by multiple people, sometimes in synchrony with one another, sometimes not.

Each placement of an object, the choice of materials, the addition of hooks, handles, knobs, and switches, is both for utility and for communication. The physical placement and the perceptual appearance, sound, and touch all talk to the users, suggesting actions to be taken. Sometimes this conversation is accidental, but in the hands of good designers, the communication is intentional. Design is a conversation between designer and user, one that can go both ways, even though the designer is no longer present once the user enters the scene. For this insight, I am grateful to Clarisse de Souza’s book Semiotic Engineering.


Semiotics has a reputation for being difficult, abstract, and obtuse. This is most unfortunate, for as Clarisse de Souza illustrates in this powerful book, semiotics offers a compelling, novel viewpoint. It is common to think of interaction between a person and technology as communicating with the technology. De Souza shows that the real communication is between designer and person, where the technology is the medium. Once designs are thought of as shared communication and technologies as media, the entire design philosophy changes radically, but in a positive and constructive way. This book has changed my thinking: read it and change yours.

Jacket blurb written for Clarisse Sieckenius de Souza, 2005, The Semiotic Engineering of Human Computer Interaction. Cambridge, MA: MIT Press. (Norman, 2004.)

When a designer announces “I put an affordance there,” what could possibly be meant? To the purist, affordances simply exist: they are the actions possible between an agent (usually a person) and the environment. But the concept was invented for the natural world, and when it comes to the physical, constructed world, it does make sense for a designer to have deliberately shaped and located the materials so as to afford action. Thus, a chair does indeed afford support, whether for standing, sitting, kneeling, or lying down, and this support has been deliberately designed in by the astute designer. An adroitly placed handle affords pulling, and so on. But when we talk about the virtual world of software and objects visible on display screens, the term becomes problematic, for what could it possibly mean for a designer to draw a circle on the screen, announcing, “This is an affordance for clicking”? To the purist, this is a meaningless statement, because the screen is no more or less clickable after the presence of the circle than before.

In an earlier discussion on this topic, I decided that although the screen designer was not using the term appropriately in its pure sense, there was no other term to describe what the designer had done, so why not appropriate “affordance” for this purpose? Affordance is indeed close, and this is how language grows over time, adding new concepts, letting words expand or contract in meaning to fit the circumstances. The one caution I added was that affordances need not be known to either designer or person: they need not be visible, perceptible, or easy to discover. The real role of the designer, I stated, was to make the appropriate set of affordances visible; hence the designer worked not in the world of possible affordances, but rather in the world of perceived affordances. The one obvious exception is where the designer deliberately hides the affordances, as in designing tricks for magicians, secret doors, or special entrances, in both physical and virtual spaces, reserved for the privileged and not meant to be discoverable by ordinary people.

Once we start to view design as a form of communication between designer and user, we see that perceived affordances become an important medium for that communication. Designed affordances play a very special role: the designer deliberately places signs and signals on the artifact to communicate with the user. The metal tray made of wires clearly affords support for solid objects but not for liquids. Hence, the visibility of both the positive affordance (support) and the negative one (porosity, or perhaps leakiness) tells the user “put something here that fits this space, that requires support, and that you do not wish to sit in a puddle of water.” Given the limited number of items one usually takes to the bath or shower, given the size constraint of the basket, and given the strong negative affordance of leakiness, what else could be meant except soap? So the wire support shouts out to the shower-taker, “put your soap here!”

A similar communication happens in the virtual world of screen design. The computer user is often at a loss as to what to do. By making certain regions of the screen take on perceptible, distinctive appearances, the designer is communicating the design intention. These are designed affordances, messages from designer to user, attracting attention to the set of desired possible actions.

Under this new view of design, designed affordances are communication devices, specifying the designer’s intentions to the audience. And now it makes perfectly good sense for a designer to say “I put an affordance here,” and, moreover, to continue in order to explain why: “I put an affordance here because … ”

Communication & Context: Design as Story Telling

Note the story of the soap dish screaming “put the soap here!” In fact, the wire tray is not unambiguously a dish for soap. Were it encountered at some random location, one might ponder its usage, perhaps coming up with numerous possibilities without ever thinking of soap and the bath or shower. The wire dish was unambiguously for soap only within the context of the bathroom because it was part of the story of how a shower is taken: water of suitable temperature falling from above, a place to catch and contain the water, a way of controlling the water temperature and volume, a place to hold the soap, a place to hold other ointments and washing towels, and a place for drying towels. In actuality, the shower often requires more, such as curtains or doors to contain the water, a method of forcing the water out the showerhead and not some other place (such as the bathtub spout), and so on. In all these cases, it is the conceptual model of taking a shower that provides cohesion to the activities. This is a story. The story provides the context that makes the roles of the various accoutrements clear and distinct. Without the story, without the context, their purpose might not be discovered.

Why Stories Are Superior to Logic

To put it simply, most digital systems fail when they fail to provide a story, when there is a poor conceptual model. The notorious BMW iDrive provides a perfect example. The iDrive is well thought out, very logical, and sensible. The problem is that logic is absolutely the wrong tool. The iDrive, for those who have not yet experienced it, is a single control knob plus display intended to control many of the non-driving functions of the automobile (over 700 functions, bragged BMW in its early literature about the product). It provided a well-thought-through hierarchy of menus rather than the disconnected jumble of dials, switches, and gauges that clutters the normal dashboard of the automobile.

By logic, the iDrive was a superior device. Alas, people function through stories, not logic. Moreover, people are spatial: we remember where things are in space, whereas the iDrive destroyed spatiality. And finally, the stories we remember and the conceptual models we prefer have to do with how a particular device functions: heating and cooling the automobile, changing the station on the radio, checking what distance remains in our trip. Each activity requires a separate story, a separate control, and a separate location for operation. Alas, the iDrive collapsed everything into one location.


Years ago, I championed “User-Centered System Design,” based upon the point that designers had to focus their attention upon the users of their systems. I diagrammed the interaction between designer and user as a triad:

The designer’s model, the system image, and the user’s model. For people to use a product successfully, they must have the same mental model (the user’s model) as that of the designer (the designer’s model). But the designer only talks to the user via the product itself, so the entire communication must take place through the “system image”: the information conveyed by the physical product itself. (Originally published in Norman & Draper’s User Centered System Design (1986), and reused frequently thereafter: The Design of Everyday Things (1988, 2003) and Emotional Design (2004).)

I have long maintained that the appropriate way to design a complex system is to develop a clear, coherent conceptual model (ideally the same as the designer’s conceptual model) and to design the system so that the user’s mental model would coincide. I had always assumed this would be done through the design of the “System Image”: the artifact plus any auxiliary material, such as manuals and help systems.

De Souza makes a major advance in our understanding of this communication model. If the designer explains the reasoning behind the model, the user will find the task of uncovering the conceptual model much easier. In other words, what we need to provide to people is reasons, not just methods.

Systems usually try to convey the actions that can be taken at any point in a sequence. A major driving force in the development of graphical user interfaces (GUIs) was to make all possible commands visible, so that at any point a person could discover what to do simply by looking through the set of possibilities. The technique of “graying out” menu items that are not applicable at the moment is a communicative tool: it indicates the existence of the command while simultaneously indicating that it does not apply in this situation. Although GUIs were a major step forward, they simply provided information about the set of possibilities, not about the reasoning. Thus, if a grayed-out command appears to be precisely what the person is searching for, the system simply communicates its non-availability: it does not suggest why the command is unavailable, nor how to make it available.
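The distinction between conveying availability and conveying reasons can be sketched in a few lines of code. This is a hypothetical illustration, not drawn from any real GUI toolkit: the class, its fields, and the reason strings are all invented here to show how a disabled command could carry the designer’s reasoning along with its grayed-out state.

```python
# Hypothetical sketch: a menu model in which each disabled command
# carries a reason, so the interface can explain itself rather than
# merely graying the item out. All names are invented for illustration.

class MenuItem:
    def __init__(self, label, enabled=True, reason_if_disabled=None):
        self.label = label
        self.enabled = enabled
        # The communicative addition: why the command does not apply now,
        # phrased so the user can see how to make it apply.
        self.reason_if_disabled = reason_if_disabled

    def render(self):
        if self.enabled:
            return self.label
        # A conventional GUI would only gray the label out; here the
        # designer's reasoning is surfaced, e.g. as a tooltip.
        return f"{self.label} (unavailable: {self.reason_if_disabled})"


menu = [
    MenuItem("Copy", enabled=False,
             reason_if_disabled="nothing is selected; select some text first"),
    MenuItem("Paste", enabled=True),
]

for item in menu:
    print(item.render())
```

The point of the sketch is the extra field: the grayed-out “Copy” no longer merely announces its non-availability, it tells the user what would make it available.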

Human beings are explanation machines. We are always trying to understand the world around us, and we make up stories to explain the occurrences we experience. If there is sufficient evidence or knowledge, these stories are reasonably accurate, but in the absence of such information, the story — in other words, the explanation — is apt to be wrong. In many cases, the person simply cannot construct an adequate explanation, not even for themselves, which means they remain puzzled and are apt to have the same difficulty every time they encounter the situation.

Conceptual Models as Stories

What are conceptual models? I think the easiest way to conceive of them is as stories, stories in context. A model is a story that puts the operation of the system into context: it weaves together all of the essential components, providing a framework, a context, and reasons for understanding. Without this story, without this conceptualization, the operation of something becomes a set of memorized actions, without reason or purpose except “this is how it is done.” Although we do operate many of the devices in our lives as learned operations, if there is any complexity to the device at all, then those operations are difficult to learn and prone to error. Worse, when things go wrong (as things invariably will) the lack of a story, of a context, and most importantly the lack of any explanation or reason why the operations are being performed, means that the poor person cannot cope: without understanding, new sequences are difficult to derive, and escape mechanisms, recovery strategies, and the all-important “work-arounds” are difficult if not impossible to discover.


After I had posted the essay above, Terry Winograd reminded me of the excellent chapter by John Rheinfrank and Shelley Evenson in his book “Bringing Design to Software.” This chapter does an excellent job of explaining just how a design language can aid in the development of design as communication, with some excellent examples from the work they did on Xerox copiers. Their chapter was published in 1996 and I was very much aware of it: I have long been an admirer of their work, and I have a chapter in that same book. My apologies for not remembering it in my first writing. So put the Rheinfrank & Evenson chapter together with the de Souza book and you have a very powerful argument, one worthy of exploration in more depth.


Rheinfrank, J., & Evenson, S. (1996). Design Languages. In T. Winograd (Ed.), Bringing design to software (pp. 63-80). New York: Addison-Wesley (ACM Press).

de Souza, C. S. (2005). The Semiotic Engineering of Human Computer Interaction. Cambridge, MA: MIT Press.