Whether we like it or not, software forms the cornerstone of our lives. It enables us to work, manage money, share photos, connect with friends, follow diets, and everything in between. And those of us who build software aren’t just building apps to do those things; we’re solving health, finance, and safety problems and building experiences around them. But far too often, and despite our best intentions, those experiences fall flat, excluding and harming those we aim to help—and often when the stakes are at their highest.
Understanding the problems we’re solving (and for whom) is critical, but that task is often more complicated than we might realize. Algorithms aren’t agnostic, wireframes aren’t objective, and thinking that they are is damaging and dangerous. It’s time to acknowledge that we are not neutral and neither is our work. Our biases work their way into our code, into our mockups, and into the software we create. These manifestations can mean we exclude, ostracize, or harm those who use that software, regardless of our intentions.
As much as we want to speed up and get products out the door, we also need to step back from time to time and consider how (and if) we should solve these problems. The way we do our work can sometimes be as important as the work we do. After all, it’s only by acknowledging our limitations that we might improve our processes and build the kinds of inclusive products that empower those who need them the most.
“Empathy”
No, this isn’t another think-piece on why ‘Empathy Is Important’ or on how you can ‘Create An Empathy Map’ for your project team.
Empathy is important. But requiring “empathy” from everyone does the concept a disservice. We can’t be expected to empathize with everyone, every day, all of the time. But we can learn to meaningfully identify how best to use our skills as designers, developers, and product managers in order to make software that is inclusive—without requiring that everyone on the team achieve peak empathy.
Empathy isn’t the solution: it’s just the beginning
Empathy is rather de rigueur at the moment. But it’s often used without definition or purpose. It’s seen as a magic-bullet solution and a method by which to build inclusive interfaces. But empathy isn’t a process or an approach; it’s an ability.
In fact, the concept is far more complex than the tech industry is making it out to be. It’s time to admit that true empathy isn’t something that can always be achieved. To assume that everyone can empathize with any particular problem is foolish. Caucasians will never know what it’s like to be a person of colour; men can’t understand what it’s like to go through menopause.
This should affect who makes critical product decisions at each stage of the development process. If we want to ensure that the software is inclusive, we need to make sure the right people are making the important decisions. We need to make sure that those we’re aiming to help are not excluded from the process and that we solve the problem as they experience it, not as we think they experience it.
Not everyone on the product team needs to empathize with the user, but everyone does need to:
- Understand the scope of the problem, and who should benefit most
- Recognize and understand the limits of their own empathy
- Identify those who can empathize, and then empower, trust, and listen to them
Once we do that, we can better leverage all our skills. It’s not a flaw to be unable to empathize with certain people and their problems; empathy is based on understanding and there are some things you or I will never be able to understand. The implication is simply that we need to wield our skills a little differently. We can still build software, but we need to better understand what empathy means and how we can adjust our processes to ensure a more inclusive user experience.
A definition
Empathy (n.): the ability to understand and share the feelings of another.
What does “exclusion” look like?
Simply put, software fails to be inclusive when it fails to accommodate those it aims to help; this is often a result of bias from a decision-maker in the product process or the result of several small, isolated oversights. A few examples:
- Apple’s Healthkit failing to allow women to track reproductive data in its initial release, then taking a year to add it.
- Airbnb’s product allowing hosts to (implicitly or otherwise) racially profile their visitors. Noirbnb and Innclusive were born in response.
- Nextdoor’s reporting interface facilitating the spread of paranoid racism.
- Domestic violence victims being silenced by Twitter.
- Snapchat’s release of racist filters.
- Facebook only allowing the selection of male or female gender.
This kind of exclusion can be annoying in some cases. In others, it can be harmful. No, exclusion wasn’t the intention, but that doesn’t make the experience any less painful for those affected.
Empathy’s role in the building of software
Inclusive design is a buzzword. Truly inclusive UX is elusive. In fact, it’s often not the presence of inclusivity, but the absence of exclusion. Empathy is critical in the design process because it creates the mindset that powers and informs the small decisions that, in the end, add up to more than the sum of their parts. A change in wording here, a subtle form arrangement there. Adding or removing certain features simply becomes self-evident when you understand the feelings of the user. And while we can’t hope to empathize with every user or audience in this way, each of us can create space for empathy when we’re designing our products.
Every piece of software aims to solve a problem, as it’s experienced by a particular person or people. It helps us track our workouts, hail a cab, communicate with friends and family. Whatever it aims to do, those of us building the product must thoroughly understand the problem to be solved, the behaviour to enable, the barrier to remove, or the task to complete—as well as the person for whom we are solving, enabling, or removing. We may not want to admit it, but we can’t create UX flows, map errors, build interfaces, or define success metrics for a feature or product when we don’t fundamentally understand the problem or the obstacle facing the user.
The limits of empathy
There are a variety of techniques available to us to “empathize” with a user—methods of research that allow the development team to (try to) stand in a user’s shoes: interviews, empathy maps, personas, experience maps, and so on. But just because the methods exist doesn’t mean that we will be able to understand the feelings of another human being if we just try hard enough. After all, we don’t arrive at each project with fresh eyes. As Fabio Chiusi so brilliantly puts it: “Even with no human intervention, there is human intervention. There always is.”
The ways in which our predispositions manifest in our work can have profound impacts on users. Solving a problem you don’t (or can’t) understand is like walking blindfolded through an ethical minefield. For example, we are all predisposed to seek out information that confirms our own worldview—this is called confirmation bias. When we’re presented with conflicting feedback, this bias makes us more likely to dismiss it. As a result, people often fail to include critical features or interactions without even realizing how critical the omission is.
The all-male development team of Apple’s Healthkit had no experience tracking the reproductive health of women (called blindspot bias) and relied simply upon personal experience and the availability heuristic to build a tool in which such a feature wasn’t required. Research or information to the contrary may have appeared during the development process, but if it did, either ego or confirmation bias ensured its exclusion.
There are countless examples like this one. And that’s why it’s necessary to admit that we won’t always be able to bridge that gap, rather than insisting bullishly that we can (to the detriment of our health apps). There are feelings and experiences that are more than the sum of their parts—things we simply will never be able to understand and, thus, people or groups we will never be able to empathize with. People who identify as male or female, for example, won’t understand the experience of trying to fill in forms that invalidate a non-binary gender identity. Most of us won’t be able to understand the experience of mixed-race people. Able-bodied designers won’t be able to understand the feelings of a person with disabilities trying to navigate stairs and touchscreens and hundreds of other things you or I might take for granted.
But that’s okay! We don’t have to empathize with everyone all the time. We simply have to recognize that we can’t and adjust accordingly. We aren’t able to change who we are, but we are very much able to change how we approach the work of design and development.
So what do we do, then?
We want to solve problems and avoid harm. And our project teams have the necessary skills—code, design, product management, etc. But before applying those skills, we need to understand where they fit within an inclusive product process, based on whether or not we can empathize with the user(s). Only then can we best leverage the knowledge and abilities of our team. Let’s go.
Understand the problem
Nothing new here. What pain is being alleviated? Who is affected? Enter personas, empathy mapping, journey mapping, etc. Everyone on the team with any hand whatsoever in the building of that product should know exactly the issue at its core, for whom, and why. No “empathizing” here yet. Just knowledge-gathering and understanding. Then, we can…
Acknowledge where we stand
A product doesn’t exist in a vacuum. It will be a reflection of its user and its creator. So, as creators, we need to ask: “Am I fundamentally affected by this problem? Can I share the feelings and pains of those who are? If not, why do I want to solve that problem? And will I truly be able to understand those who are affected, even when their experience is unknown to me?”
Honesty is imperative at this stage because answering “no” is valuable. “No, sometimes we aren’t affected by the problem and we can’t bridge the gap on our own. That means that the critical decisions should not be ours.” Without that acknowledgement, any decision (no matter how small) can manifest in something exclusionary, best intentions be damned. Whenever you answer no, the next step is simple:
Put decision-making power in the hands of those affected
Worry less about your own ability to empathize, and involve expert authorities (those who can empathize) right from the beginning. Don’t just bring them in for testing of a ready-made product. That assumes that a product should be built. (Perhaps in talking to users and/or experts early on, ideation results in a completely different solution—maybe a different product or no product at all.) We can still be the excellent designers, developers, and PMs that we are. But we can also create space for those closer to the problem and then use the operational skills of our disciplines to bring to life the decisions of the people that we’re trying to help.
These conversations can be a part of initial user research, they can be ongoing, whatever works for your team (the product process is hardly linear). But the point is to involve these users throughout the process, not just at one stage. If you know that you cannot truly empathize, find those who can, and put the decision-making power in their hands. For example, in the writing of one particular article, Val Head spoke with people affected by vestibular disorders, and asked them to identify triggering animations—she didn’t create personas and then try to identify those triggers herself.
Build intentionally (and at first, slowly)
The centring of these experts will slow you down, particularly if they’re not designers or developers already on the team. It should slow you down, and you should learn to build that time into your processes. Avoid the “move fast and break things” approach. This attitude is a damaging one, both for the people building the product and for the culture it creates. Move slowly and build intentionally: understand the importance of each decision and make each one thoughtfully and with consideration.
What is “default”?
When we’re building software, we always have a “default” user or persona in mind, far too often cis-white and able-bodied. Take time to think not just about how they’ll use your software, but how people of different gender identities, abilities, and contexts will use your software. Just because something is “default” or “best practice” doesn’t mean we should adopt it blindly in every context. And yes, moving away from defaults will be unfamiliar, but it is imperative to creating an inclusive UX. Accounting for exceptions doesn’t need to change the defaults—it just means you consider what “default” is and why (and what the implications of that default might be). Ensuring non-binary people can fill in the “gender” section of a form makes for an inclusive UX without affecting the “default” user. The default user probably won’t even notice the change, but the oft-excluded users certainly will.
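To make the gender-field example concrete, here’s a minimal sketch of what an inclusive form model might look like. The type and function names here are my own invention for illustration, not taken from any product mentioned in this article; the point is simply that the question is optional and open-ended rather than binary.

```typescript
// A sketch of a gender field that doesn't force a binary choice:
// offer common options, allow self-description, and treat skipping
// the question as a valid answer rather than an error.
type GenderResponse =
  | { kind: "option"; value: "woman" | "man" | "non-binary" }
  | { kind: "self-described"; value: string }
  | { kind: "undisclosed" };

// Normalize raw form input into a GenderResponse.
function parseGender(raw: string | null): GenderResponse {
  if (raw === null || raw.trim() === "") {
    // An empty or skipped answer is valid, not a validation failure.
    return { kind: "undisclosed" };
  }
  const v = raw.trim().toLowerCase();
  if (v === "woman" || v === "man" || v === "non-binary") {
    return { kind: "option", value: v };
  }
  // Anything else is kept verbatim as a self-description,
  // never rejected or coerced into a listed option.
  return { kind: "self-described", value: raw.trim() };
}
```

The design choice worth noticing is that no input path produces an error: the “default” user who picks a listed option is unaffected, while everyone else is accommodated instead of invalidated.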
Success is a diverse team
If you look around at your product team and find that everyone looks the same, be wary—and be careful about the decisions you make. Diverse teams build more inclusive products: research has confirmed this again and again. Non-diverse product teams result in the wrong people making the decisions (no matter how well-intentioned). Without internal diversity on the project team, products (such as Apple’s Healthkit, for example) can often lose sight of the problem they aim to solve; they become a reflection of their creators, not the users who are meant to benefit from them.
Take responsibility
We will make mistakes. It’s how we react that matters. Got some feedback that users are being excluded? Return to your experts and use the collective skills of your awesome team to fix the mistake! When Nextdoor realized that their product allowed racial profiling, they acknowledged the mistake and tried to fix it by adding friction to their posting process.
We’re only human. It’s hard to avoid being wrong from time to time when you’re working with the unknown. But as I said, empathy is an ability and like most things, it takes practice. So treat mistakes as opportunities to deepen your understanding of others.
One important note
Involving these experts will make the product better and more inclusive, but only if they are valued and their decisions are thoughtfully implemented—even when those decisions might seem counterintuitive. You must understand that the stakes are often high and that the decisions made in solving this problem are not to be made lightly.
You and you and you and you
This approach is not just for designers and developers, but everyone. PMs, managers, interns, technical writers—anyone involved in the creation of that product needs to understand the potential impact of the decisions they make.
Inclusive software starts with the individual
Inclusive software is about people—those who build it and those who benefit from it. It may seem counterintuitive, but we should strive to stop thinking about the products and, instead, think about the people behind them. Once we understand that our own capacity for empathy has its limits, we can empower those with an implicit and deeper understanding of the problem and all its nuances.
This implicit understanding can’t be achieved by everyone. In fact, it can be difficult to identify concrete examples of inclusive user experiences because we don’t notice them until we really need them. In many cases, success is inaction: not asking for gender or sex when you don’t need it, for example—or a subtle notification instead of a loud noise (or no notification at all).
The key to avoiding exclusion is to better understand the problem and acknowledge our place relative to it. When we cannot really empathize—when we can’t understand and share the feelings of those whose problem we want to solve—acknowledge that fact, and let those who can empathize make the critical decisions.
Make no mistake, this can be difficult, especially for people who pride themselves on their desire to help others by solving difficult problems or people who like “just getting things done.” But the heart of inclusive user experiences is a better understanding of ourselves and the limitations of our own knowledge, experience, and abilities.
By better understanding the unexpected nuances of a problem—and by acknowledging when we’re not affected by that problem—we can find and accept the limits of our own empathy. Then, we can refocus processes and responsibilities around the core desire to empower, rather than racing to be the first to solve a problem we don’t fully understand. We want to see technology as agnostic, empowering everyone—but this will only be possible if those creating it do the same.
Some inclusive interfaces
- Shauna Keating’s Mobility Map case study
- Umesh Pandya’s Validating Wayfindr case study
- Just a Brown Hand from Diógenes Brito
- Carl-Gustaf Lundholm and the design of an inclusive tissue dispenser