Experience design: a twisted history

Jim Lentz and Mark Marrara
Consilient Design
October 27, 2021

Experience design has been around for much longer than many designers realize [1]. One could argue that it began in prehistory, when a flint knapper first hafted a stone blade into a wooden handle to make it more comfortable to hold. Or maybe it began even earlier, when someone worked a stone chopper so that the part to be held was rounder and smoother. It would be fun to try to tackle the entire history, but that would be a tall order, so we will constrain our story to modern times. Experience design from the 20th century on has been an enterprise motivated by different objectives. Sometimes these favor the best interests of people, and sometimes they favor less noble goals.

It is necessary to clarify what we mean by “experience design”. People who intentionally design experiences with technology have gone by many names; “experience designer” and “user experience designer” are just recent ones. If we ignore the others, we will have a limited perspective.

We use “experience design” in a comprehensive sense. This specific term first came to be used primarily to mean the composition of digital applications and media, but in reality, it applies to much, much more. For us, it is the practice of intentionally defining the way in which people interact with some kind of constructed technology. This technology may be physically, digitally, or even socially constructed.

This definition covers a lot of areas and risks pulling in too much. We have to draw the line somewhere, so we won’t be talking about composing music, perfume, food, or dance, although they clearly affect human experience. We restrict our scope to experiences arising from interactions with things that might be construed as tools. Business processes, garden tools, toasters, scripted customer service verbal dialogs, aircraft control panels, ranking algorithms, and movie streaming service screens all fall under our definition.

The evolution of experience design

Modern experience design evolved out of a tangle of changing design disciplines and technological advances.

The concept of experience design as an occupation emerged in four stages. The first was the work early in the twentieth century of Frederick Taylor, Frank Gilbreth and Lillian Gilbreth. They developed the industrial engineering practice of “time and motion study”. Closely observing and timing actions and operations in a task made it possible to increase throughput and reduce errors. Improved productivity meant that more product would be delivered for the same wages paid. This satisfied the interests of industrialists seeking greater profits. Little consideration was given to how the work was perceived by the workers themselves. If the experience of a “scientifically managed” job was mind-numbing repetition and boredom, so be it. The only relevant goal was to increase profits.

The second stage was the creation of the field of human factors engineering in the United States during World War II. Human factors engineering also sought to improve efficiency and reduce errors. Its scientific base expanded to include applied experimental psychology, systems engineering, and physical anthropology, among other areas. There were multiple applications of human factors in these early years, but an illustrative one was what was then called “aviation psychology”. In war-time aviation psychology, error reduction meant not getting the pilot killed by inadvertently crashing a plane or being shot down by enemy fire. Productivity meant getting more of the enemy killed. It also meant faster, more effective training.

The third stage was the creation of the field of ergonomics in Europe. Ergonomics is the engineering science of understanding and improving work for the benefit of the worker. It was initially oriented more toward physical aspects such as tool anthropometry (the measurement branch of physical anthropology) and physical effort in industrial settings, but it soon came to encompass other areas such as cognitive and organizational design. It eventually became clear on both sides of the Atlantic that the ergonomics and human factors fields were attempting to solve similar problems using whatever scientific and engineering tools were applicable. Both the ergonomics and human factors professional organizations recognized their mutual interests. The Human Factors Society eventually rebranded itself as the Human Factors and Ergonomics Society. Similarly, ergonomics organizations formally acknowledged that their domain includes the field of human factors. They are essentially the same thing.

Finally, the fourth stage was industrial design, which provided a parallel focus on manufacturability, appearance, and functionality. In addition, it began to pay increasing attention to ergonomics and usability. By considering functionality in particular, industrial designers added depth to the notion of ease of use.

These movements began to coalesce in the 1960s into what would eventually be called usability engineering. They all shared a concern for not only productivity and error reduction but also the notion of “usability”. Usability differs from utility in that it implies “ease”: usage should be easily learned, it shouldn’t require a large number of operations or repetitions, nor should it require excessive physical or mental effort.

This notion was a win for both business and users. Businesses could save money on training costs and reap productivity gains through efficiency and error reduction. Furthermore, as consumer products became more usable, they would be more attractive to consumers. At work, users would find their jobs more satisfying because they would perform better and be less frustrated by complex tools. Employees could do more with less effort. At home, using technology would be more satisfying.

This period marked the beginning of a transition to a society that is deeply enmeshed in information technology. In the 1960s, the primary information technologies that people interacted with consisted of the printed page, telephone, television, and radio. Households had few telephones and televisions and they tended to be fixed in place. Their user interfaces changed very slowly if at all. Very little learning was required to operate them.

All this began to undergo radical change around 1980. Forward-thinking technology companies such as AT&T, Digital Equipment Corporation, Hewlett Packard, IBM, and Sun realized that increasing computerization would bring complexity beyond what the user community was prepared to handle. New technologies were esoteric. Totally new classes of hardware and software products appeared, including personal computers, word processing programs, and spreadsheets. Technology companies increasingly hired specialists in systems engineering, human factors, and industrial design. When these were in short supply, experimental and cognitive psychologists, sociologists, and anthropologists were hired to adapt these new products to less technically trained users.

A new research and applied domain called “Human-Computer Interaction” (HCI) was the result. HCI specialists spent a decade or so discovering and documenting the basic principles of the usability of computing devices. These specialists were hired in increasing numbers by product development organizations [3].

Corporations recognized the marketing appeal of emphasizing that products were designed with simplicity, ease, and comfort in mind. The phrase “user-friendly” became popular in the 1980s [2]. It was used widely, and sometimes indiscriminately, to counter concerns about growing technical complexity. Enterprise software makers began to market and differentiate their products based on ease of use.

Upper management and marketing teams in these companies even saw ease of use as such a marketable product differentiator that it could be part of their branding (e.g., “Intuit” and “Lotus 1-2-3”, an early spreadsheet). Famously, Apple Computer used simplicity very effectively in its competitive battles with the IBM PC.

HCI practitioners in this period often called themselves “usability specialists”. They employed processes called “user-centered design”. One well-known set of principles was branded as “The User’s Bill of Rights.” The slogan “user in control” was touted as a good usability strategy as well as an ethical value. They saw themselves as advocates for the user.

From the 1990s through the present, five developments have caused further upheaval in the socio-technological landscape. The first was the popularization of the Graphical User Interface (GUI) by Apple, IBM, and Microsoft on desktop personal computers. Desktop environments and applications were designed using usability principles discovered throughout the 1980s and early 1990s. Designs were tightly controlled by applying human interface guidelines and standards. These specified layouts, control features, interactions, and response feedback methods. Increasing numbers of people became familiar with this technology at work, if not also at home.

In addition to ease of use, productivity, and error reduction, experience designers increasingly considered a new factor — emotional responses to technology. It wasn’t enough for users to find a product easy to use; they should also feel good about using it, which in turn could lead them to promote it by evangelizing to friends and associates.

The second development was the emergence of the web and web-based software applications beginning in the mid 1990s. The web introduced sweeping changes to user experience ecosystems. Emotional design became far more important especially as e-commerce emerged as a major economic driver of digital technology. Now designers were required to consider goals like attention capture, “stickiness” and user engagement. Due to the highly visual nature of the web, visual and graphic design skills became increasingly important.

The third development was the mobile revolution. Mobile UIs are limited in both inputs and screen space, so the primary experience design challenge was to define short, focused engagements in small, simple applications. Just as competition for attention drove web experience design in new directions, mobile interfaces required design objectives beyond ease of use, productivity, and error reduction. These included attention capture, novelty, and aesthetic fashions.

The fourth development was the ability to easily update user interfaces. Ease of update is almost inherent in the design of web-served content; it can be as simple as editing a file. Other technologies enabled desktop applications, operating systems, and mobile apps to be updated automatically, or with nothing more than the granting of permission.

Easily and frequently updating user interfaces has had both positive and negative consequences. The positives are obvious. The time-consuming misery of complex update and migration tasks has largely been eliminated, at least for consumer software. Design error corrections and improvements come more frequently. Security and stability bugs are rapidly eliminated. Fresh new UIs are often exciting.

On the other hand, there are some negatives. Users often become extremely facile with UIs that have served them well for years. When these interactions are changed, users must relearn their apps (interactive tutorials that mitigate a change are a poor fix when the user doesn’t think the app is broken). Undesired or confusing changes have created the phenomenon of change resistance. Resistance to change is particularly acute among older and disabled consumers. Moreover, some enterprise software updates may incur millions of dollars in migration expenses for the customer. It is no wonder that some organizations resist upgrades until the vendor removes support for outdated versions.

The fifth development was the invention of automatic experience personalization. Diverse platforms exemplified by Google, YouTube, Facebook, Twitter, Amazon, and Netflix collect and analyze user profiles and behavior to filter and recommend content. Ostensibly, this is a great usability innovation because it helps users find what they are looking for while sparing them from laborious searches. Alerts and reminders provide situational awareness and personal convenience. Having my phone recognize when I typically leave work and proactively alert me to commute times avoids unanticipated hassles.

Automatic personalization has again extended the skills needed to craft user experiences. In social media, interaction design is barely relevant because the interfaces are so simple. Instead, data mining and algorithm design skills are employed to channel information to users. Furthermore, to mitigate the socially negative effects of this channeling, these platforms have found it necessary (or have been compelled) to fold tradeoffs among free speech, hate speech, and dangerous misinformation into their experience design.
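To make the kind of algorithmic channeling described above concrete, here is a minimal sketch of a content-based filter in Python. The catalog, tags, and scoring are entirely hypothetical stand-ins, not any platform’s actual method; real systems mine far richer behavioral signals and use far more sophisticated models.

```python
from collections import Counter

# Hypothetical catalog: item id -> descriptive tags (stand-ins for the much
# richer content and behavioral signals real platforms collect).
CATALOG = {
    "doc_a": {"politics", "economy"},
    "doc_b": {"sports", "local"},
    "doc_c": {"politics", "opinion"},
    "doc_d": {"cooking", "local"},
    "doc_e": {"politics", "scandal"},
}

def build_profile(viewed):
    """Tally how often each tag appears in the items a user has already consumed."""
    profile = Counter()
    for item in viewed:
        profile.update(CATALOG.get(item, set()))
    return profile

def recommend(profile, seen, k=2):
    """Rank unseen items by their overlap with the user's tag profile."""
    candidates = [item for item in CATALOG if item not in seen]
    candidates.sort(key=lambda item: sum(profile[t] for t in CATALOG[item]), reverse=True)
    return candidates[:k]

# A reader who has viewed two political items is steered toward yet more politics,
# while sports and cooking quietly fall out of view.
seen = ["doc_a", "doc_c"]
profile = build_profile(seen)
print(recommend(profile, set(seen)))  # -> ['doc_e', 'doc_b']
```

Even this toy version shows the narrowing effect the article describes: content similar to what the user has already consumed crowds out everything else.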

People love the convenience and/or entertainment value of these platforms. With their popularity comes power. With the possible exception of Netflix, they are essentially innovation-inhibiting monopolies. Furthermore, they all have second-order effects that run counter to the best interests of individuals and society at large. They all automatically personalize content on the user’s behalf. By restricting content to what users most want to see, they steer users away from information and experiences that could broaden their world view. Feedback effects have in turn exacerbated political polarization, groupthink, and the proliferation of misinformation. In addition, Facebook, TikTok, Twitter, Instagram, and other social platforms have exploited social reward systems, leading to various forms of social and emotional dysfunction including online bullying, engagement addiction, and body dysmorphic disorder.

For the most part, these are unintentional negative consequences produced by the personalization algorithms. However, business interests also come into play [4] and impact the user experience. Other, intentional unethical design practices have come to be called “dark UX”. Dark UX design patterns nudge users into making responses that serve profits by sacrificing user goals. A complete examination of dark UX is beyond the scope of this article; however, a couple of examples will be useful. The U.S. CAN-SPAM Act, enforced by the Federal Trade Commission, requires that commercial emails provide ways for recipients to opt out of future emails, but most spammers use the dark UX pattern of defaulting to opt-in while burying the opt-out link in fine print. Another example is Amazon defaulting to paid shipping even when a customer has already met the threshold for free shipping.
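As a purely illustrative sketch of the defaulting-to-opt-in pattern (hypothetical field names, not any real vendor’s code), the difference between a dark default and a respectful one can be as small as a single boolean:

```python
from dataclasses import dataclass

@dataclass
class EmailPreferences:
    # Dark-pattern default: the recipient is opted in unless they find the
    # fine-print link and untick the box themselves.
    marketing_emails: bool = True

@dataclass
class RespectfulEmailPreferences:
    # User-respecting default: no marketing unless it is explicitly requested.
    marketing_emails: bool = False
```

The interaction cost of escaping the first default is deliberately higher than the cost of accepting it, which is precisely what makes it a dark pattern.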

Different roles — different goals

Over the past 100 years an array of specialists have been designing how we experience technology. These have included industrial engineers, applied psychologists, human factors engineers, industrial designers, visual designers, data scientists, programmers and product managers to name just a few.

With different roles come different goals and values. Over time, experience goals have included efficiency, productivity, error prevention, safety, usability, design excellence and, of course, business value. Often (but not always) business value and the pro-social design value of putting users first have come into conflict. The goal of keeping the “user in control” now competes with many product goals. While designers have worked to refine the fit between people and technology, users have seen their work deskilled and routinized, consumer privacy has been compromised, and social cohesion has been disrupted.

Writ large, experience design has changed considerably over the past century. Current designers often see it as primarily a creative prosocial activity. However, user-centrism has been on the decline in recent years. Usability is now just one of several goals of a product team.

The fact that “human” or “user” is no longer part of a practitioner’s title is telling. The operative word is now “design”. Usability is still important but is merely one of many design goals. When a project has multiple objectives, trade-offs are inevitable.

Summing it all up

This is more than just a story about the evolution of a discipline. It includes lessons that could help designers avoid the exploitation or trivialization of experience design in the future.

  1. Experience design has sought to realize a multitude of behavioral goals including personal productivity, safety and error avoidance, ease of learning, social engagement, satisfaction, comfort, effort reduction, and attention management. Similarly, experience design has been exploited in service of multiple business goals such as product sales, cross-product sales, effective branding, vendor lock-in, cost savings, advertisement targeting precision, and so on. These goals are intertwined and are often inseparable. Designing effectively and consistently requires understanding how user and business goals should interact.
  2. Many players are involved in the realization of the user’s total experience. To realize one’s own design goals, it pays to be at least conversant in, if not master of, domains such as systems thinking, data analytics, aesthetics, learning theory, and so forth. Since business and experience goals are intertwined, knowledge of the business is important for influencing product direction.
  3. Technological advances have always expanded the pool of design languages. With each technological revolution, considerations are added. They are rarely removed. Careers vary in scope but many designers will find they need to continually expand their knowledge of interface technologies.
  4. We can no longer assume that the experiences we design are inherently good for the user. Experience design often helps users by simplifying their interactions with technology and providing positive emotional responses. But it can also produce unintended negative consequences that may not be immediately apparent. Finally, experience design can be used to advance business interests at the expense of users. If designers want to ensure that their work is a public good, they need to be aware of the entire set of experiences and outcomes that they are creating.

Some designers believe that their role and responsibilities end with creating surface-level interactions. The total experience consists of so much more. It includes the experience of the entire product life cycle: first-use learning, expert usage, and the transitions necessary to adopt updated products. It has safety and ethical dimensions for both individuals and society.

Automation and machine learning technologies increasingly shape user experiences, and we’re just beginning to learn some of their effects on society. This sixth major technology development will almost certainly have a greater effect on society than any of the previous advances. It offers both positive and negative consequences. At a minimum, we must do no harm. But we need to do much more. As designers, we need to be proactive. We need to recognize our ever-expanding responsibilities for the designs we create.

Notes:

[1] In an article in a notable magazine, the author claimed to be a pioneer of user experience circa 2000. By our reckoning, he was only off by about 100 years.

Jesse James Garrett (June 3, 2021). I helped pioneer UX design. What I see today horrifies me. Fast Company. https://www.fastcompany.com/90642462/i-helped-pioneer-ux-design-what-i-see-today-horrifies-me

[2] Donald Norman is sometimes erroneously given credit for coining the term “user friendly” in 1982 (e.g., https://www.nngroup.com/articles/100-years-ux/). However, the term dates to at least 1978: I. Kameny, J. Weiner, M. Crilley, J. Burger, R. Gates, and D. Brill. EUFID: The end user friendly interface to data management systems. VLDB ’78: Proceedings of the Fourth International Conference on Very Large Data Bases, Volume 4, September 1978, pp. 380–391. https://dl.acm.org/doi/abs/10.5555/1286643.1286693

[3] In the early days of HCI, the minimum requirement for most positions was a master’s degree in a relevant field. Ph.D.s were preferred.

[4] This is currently all over the headlines. For example: “People or profit? Facebook papers show deep conflict within.” Seattle Times. Retrieved October 25, 2021. https://www.seattletimes.com/business/people-or-profit-facebook-papers-show-deep-conflict-within/


Jim Lentz
Consilient Design

UX research and design psychologist with interests in the relationship between humans and society, decision making, creativity and philosophy.