The Four Fundamental Principles of Human-Centered Design and Application

Human-centered design has four major principles, summarized below and then expanded upon:

  1. Ensuring that we solve the core, root issues, not just the problem as presented to us (which is often the symptom, not the cause)
  2. Focusing on people
  3. Taking a systems point of view, realizing that most complications result from the interdependencies of the multiple parts
  4. Continually testing and refining our proposals, ensuring they truly meet the needs of the people for whom they are intended

The Four Principles of Human-Centered Design

1. Understand and Address the Core Problems.

Solve the fundamental, underlying issues, not the symptoms. We recommend starting with field studies and observations of actual practice (an applied ethnography). Ask “why?” at each issue. When the answer is “human error,” keep going: why did the error occur, and what could have prevented it? Core issues often include people’s lack of understanding of the complexity of the entire system, misaligned resources and reward structures, and the disruptions caused by the work environment: frequent interruptions, conflicting requirements, overly complex technology, and the need for multiple transitions among technologies, systems, and people, leading to continual interruptions as well as incomplete communication between elements.

2. Be People-Centered.

Many of today’s systems, procedures, and devices are technology-centered, designed around the capabilities of the technology, with people being asked to fill in the parts that the technology cannot do. People-centered means changing this, starting with the needs and abilities of people. It means considering all the people who are involved, taking account of the history, culture, beliefs, and environment of the community. The best way to do this is to let those who live in the community provide the answers.

People-centered means considering all the people who are involved. Using healthcare as an example, this means patients and their families, general practitioners, specialists, technicians, nurses, pharmacists, community supports, and the various staff who schedule and support the activity. There is a need for careful observation of individuals doing their routine work, including the transitions between clinics, laboratories, and other sites. As boundaries erode between clinical care, public health, and the community, there will be a need to observe the engagement with businesses supporting healthcare and policy makers regulating care. In domains outside of healthcare, similar principles apply.

3. Use an Activity-Centered Systems Approach.

Design must focus upon the entire activity under consideration, not just isolated components. Moreover, activities do not exist in isolation: They are components of complex sociotechnical systems. Fixing or improving a small, local issue is often beneficial, but local optimization can result in sub-optimal global results. Focusing upon support of the activities is more important than optimization of the individual components. Systems involve multiple complex feedback and feed-forward loops, some with time delays measured in days or months. There are often tensions, conflicts, and differing perspectives among the multiple participants. Potential solutions have to be developed with the assistance and buy-in from all parties. Experts can provide essential analyses and approaches, but unless those most affected by the issues play a major role in assuring that the suggestions are appropriate to the culture, the environment, and the capabilities and goals of the community, the results are apt to be unworkable and unsatisfactory.

4. Use Rapid Iterations of Prototyping and Testing.

Whatever the initial suggestions are for innovation or improvement, they probably are imperfect, incomplete, too difficult or expensive to implement, or unsuitable for the particular location. Implementation of changes requires patience and fortitude to try numerous trials, rethinking and repeating until the outcomes are good enough for deployment. We find that people accept repeated trials if they are active participants in their design and evaluation, where the trials are understood to be tests, not solutions, and where each is done quickly. Human-centered design starts with quick approximations, often having participants play-act the workings, providing rapid feedback. With each iteration the prototype becomes more refined and usable. Note that these tests must be conducted with the intended recipients. Administrators and those responsible for devising the ideas under test should be (unobtrusive) observers, not participants.

The Complexities of Application

When we apply the human-centered design principles to the large, complex needs of the world, several modifications must be made. Three areas are of special note:

A. Large, complex sociotechnical systems. When major political, economic, social, and cultural variables interact, it is best to proceed slowly, with incremental, opportunistic steps.

B. The need for understanding. Modern automated technology can provide powerful answers, but its operation is often impenetrable (opaque) to both experts and affected citizens. This reduces faith and trust in the results. We need systems that provide understandable explanations.

C. Cultural sensitivity. The results must be sensitive to history, culture, and belief structures. This means that technologists and domain specialists are not enough: the local communities have to be involved in determining the outcomes.

These are all difficult issues. We believe that they require a combination of bottom-up and top-down approaches to solve. Experts provide the tools for analysis and for understanding. It will be experts who modify the opaque automation to make it a better fit to human understanding. These are top-down approaches. But then, it will be the individual people and communities who understand their own cultures and environments who can best apply the knowledge of experts. This is the bottom-up component. Both are needed. Each supplements and complements the other.

A. Large, complex sociotechnical systems

Complex systems cannot be treated in the same way as traditional design projects, which tend to be small and self-contained (e.g., a device, an interface, a procedure). In 2014, concerned about how design could apply to the large, complex sociotechnical systems of the world, several of us wrote an opening manifesto (Friedman et al., 2014) and then convened a workshop in Shanghai at Tongji University’s College of Design and Innovation to discuss the issues. We named the design techniques required for these issues DesignX. Afterwards P.J. Stappers and I wrote a paper that summarized our findings (Norman and Stappers, 2016).

DesignX tackles the design challenges of complex sociotechnical systems such as healthcare, transportation, governmental policy, and environmental protection. We concluded that the major challenges did not come from a lack of understanding of the issues, but rather from the complexity of implementation, when political, economic, cultural, organizational, and structural problems overwhelm all else. We suggested that designers must play an active role in implementation, testing proposals through small, incremental steps, thereby minimizing the budget and resources required for each step and reducing political, social, and cultural disruptions. This approach requires tolerance for existing constraints and trade-offs, and a modularity that allows for measures that do not compromise the whole. These designs satisfice rather than optimize and are related to the technique of making progress by “muddling through,” a form of incrementalism championed by Lindblom (1959, 1979).

Today, I would modify the recommendations by adding the lessons discussed below in Section C, on the problem with experts.

B. Opaque, non-explanatory automation

For proposals to be acceptable, they must be understandable, both in what is being proposed and also as to the reasons for the proposal. Modern decision tools, especially those of statistical decision making, machine learning, and the deep-learning systems of neural networks, are unable to provide explanations of why and how they reached their proclamations. The resulting lack of understanding reduces trust and makes experts and everyday people alike wary of following the recommendations, for fear that the process might have ignored or overemphasized critical attributes.

Note that over-trust can be more dangerous than under-trust.

Our systems must be designed to produce appropriate communication with those impacted by the recommendations. Explanations must be readily available in a format easily understood by the target population. An important component of explanation is the existence of an easy-to-understand and easy-to-interpret conceptual model of the system, for without a good model, there is apt to be misunderstanding or incomprehension. (A misunderstanding can be worse than a lack of understanding, because in the first case, people do not recognize that their view is erroneous, whereas in the second case, they know that they do not know.)

Conceptual models do not have to be completely accurate. The models simply have to be “good enough” to guide appropriate behavior, even if oversimplified. In similar fashion, complex systems do not need a single, comprehensive model: there can be different conceptual models for different aspects of the system.

C. Expert advice that is divorced from history, culture, and belief structures

We need experts to ensure that the facts and critical attributes are addressed, but we should leave the methods to those who are immediately affected. Example: an expert is needed to tell us how long to boil equipment to sterilize it (the duration depends upon the altitude), but we should let the community that is impacted decide how to do the boiling.

Complex problems do not have simple solutions. Moreover, each community has a unique history, a mix of cultures and belief structures, and a distinct socio-economic status. These differences can lead to major conflicts within the community. When experts come to the community to provide assistance, unless they are particularly sensitive to these issues, their suggestions are apt to be rejected, or if applied, to fail to provide the desired result (Easterly, 2013).

Some of the conflicts are powerful and apparently unresolvable, having existed for decades or even centuries, resisting many attempts to settle them. Others, however, are less deeply embedded. Even so, experts are often insufficiently aware of the special needs, circumstances, and abilities of the people they are trying to assist. Decades of attempts by foreign experts to assist have yielded some successes but many more failures. There are thousands of scholarly articles, books, and white papers on the topic attesting to these difficulties.

We propose a different approach: bottom-up development, building upon the creativity and deep self-knowledge of the people who need assistance. Instead of relying on experts, we propose to encourage and facilitate the power and creativity of the individuals who live in the areas of concern.

Experts often are insensitive to the social, cultural, and historical attributes of the people, often believing that their expertise applies to the problems independently of the location, geography, or details of the local inhabitants. As a result, their advice often does not fit the thought patterns of people in the society. Moreover, it may require educational levels that the people do not possess, as well as critical infrastructure that may be limited: electricity, heating and air-conditioning, clean water, and adequate sewerage may not always be available. When specialized devices are required, if they break, the local population cannot repair them, either because they are not familiar with the technology or because spare parts are not available. When people design their own solutions, they can take account of all these variables.

If the people can decide, even if they come up with the same, identical solution as expert people or machines, they will trust the result, implement it, and abide by it.

When people determine their own fate, they are much more accepting of the results, even if they are actually identical to what experts had proposed. Sometimes citizens will propose workable solutions that experts never thought of. Democratizing design is a part of the general movement toward citizen science: building upon the creativity and intelligence of everyday people.

We propose an approach of facilitation, providing assistance when asked with tools, toolkits, and instructional material. Moreover, when local implementations prove useful, we will help spread the word, through open-source distribution of knowledge by whatever medium is most appropriate: workshops, books, videos, or internet-based open-source knowledge pods. Think of Wikipedia, Khan Academy, or other repositories, self-built, created by many individuals, with a minimum of overhead and supervisory controls.


These ideas have been developed with the aid of many people. Parts of this essay have been taken (in modified form) from Chapter 6 of The Design of Everyday Things (Norman, 2013) and from a talk given at the World Government Summit, Dubai, UAE (Norman & Spencer, 2019).


Easterly, W. (2013). The tyranny of experts: economists, dictators, and the forgotten rights of the poor. New York: Basic Books.  

Friedman, K., Lou, Y., Norman, D., Stappers, P. J., Voûte, E., & Whitney, P. (2014). DesignX: A future path for design.

Lindblom, C. E. (1959). The science of 'muddling through'. Public Administration Review, 19, 79–88.

Lindblom, C. E. (1979). Still muddling, not yet through. Public Administration Review, 39, 517–526.

Norman, D. (2013). The design of everyday things: Revised and expanded edition. New York; London: Basic Books; MIT Press (British Isles only).  

Norman, D., & Spencer, E. (2019). Community-based, Human-Centered Design. Paper presented at the 2019 World Government Summit, Dubai, United Arab Emirates.

Norman, D., & Stappers, P. J. (2016). DesignX: Complex sociotechnical systems. She Ji: The Journal of Design, Economics, and Innovation, 1, 83–106.