This video is the second in our Content Strategy Advent Calendar series.
Today, Sara Wachter-Boettcher talks about bringing the concept of kindness into our content and the importance of identifying and minimising bias, which will allow us to design for real life.
Hello, my name is Sara Wachter-Boettcher. And you may know me as a content strategy consultant. But like many of you, I’m really struggling to make sense of a really difficult year — a year where sexism, racism, violence, xenophobia, and more seemed to be normalised all over the place, all around me. Now, I’m not here to get political, exactly, but this year it almost feels like it’s impossible to avoid. Because what I’m seeing around me here in the United States, and what I’ve heard about in some other countries as well, it just flies in the face of the things I’ve been working on, professionally, for the past few years. So that’s what I’d like to tell you about today, and about where I think we go from here as a profession.
A couple of years ago, I started noticing just how much power our work has over people, and how they feel—whether they feel included, or excluded. It started when I realised just how thoroughly our lives have been changed by technology—and how deeply that technology touches us. According to Pew Research earlier in 2016, most Americans now go online every day, and 20 percent say they’re online “almost constantly.” We use connected technology for almost everything: we use it to find products, we use it to get directions, we use it to settle arguments, we use it to stay informed, and file taxes, and chat with friends. Literally everything you can think of, there’s now an online service designed to help you with it. But despite all these advancements in technology, I noticed something: our work hasn’t necessarily been getting better at serving the people who need it most.
Too often, our design and our content choices fail people—whether that’s Facebook’s On This Day feature sending peppy messages about someone who’s recently died, or the many race and ethnicity drop-down selection menus I’ve seen where people cannot identify as more than one race, cannot represent themselves as they do in real life, or things like period tracking apps that assume everyone who menstruates is at risk for pregnancy (never mind that they might be gay, or infertile, or a hundred other things).
These issues might seem small enough on their own, but they can cause real harm. They leave people out. They make them feel like they don’t fit, like they don’t belong. And that alone I think is a reason to do better: a reason for us content professionals to be more empathetic, and more aware of how our choices about language, and the features and the flows that we create, can affect people.
But I want to call these issues out right now not just because I think they might cause a user pain or stress, but also because I think they reveal a larger problem for people like us, and for our industry as a whole. And that problem is bias. In our work, with our pretty personas and our cleverly crafted messages, it’s very easy for us to think that we know our users, that we can picture them. And you know there are lots of good reasons that we want to do this, but doing so also makes it easy for us to establish a really narrow vision of who is normal—and who isn’t. And that really can result in products that don’t work for people who don’t fit our narrow conceptions. Oftentimes those people are the people who are already marginalised: people who are poor, people of colour, people who are gay or trans.
When we fail to identify and then minimise these biases, we end up leaving people out. And that’s becoming more and more dangerous, the more we start talking about things like machine learning and artificial intelligence—stuff like chatbots, which lots of people in our community are very excited about—because if people like us—people who create content and design experiences—build bias into the front end of the system, the bias worms its way into the systems we’ve designed, where it becomes invisible. Where it becomes hard to detect, and even harder to fix.
And so if you, like me, are trying to make sense of this year, and you’re trying to figure out how to add more meaning to your work and combat injustice that you see around you, I’d suggest that we start here: that we start with bringing the concept of kindness into our content, and into everything we touch, and we learn to identify and minimise bias in the experiences that we design and that we’re part of.
Happy holidays. I hope this is helpful to you, and I look forward to hearing how you apply it in your work. Thank you.
Sara Wachter-Boettcher runs a content strategy consultancy based in Philadelphia. She is the co-author, with Eric Meyer, of Design for Real Life, a book about creating products and interfaces that are more inclusive and compassionate. She is also the author of Content Everywhere, a book about creating flexible, mobile-ready content. Sara works with clients like Trek Bicycles, The Associated Press, The Home Depot, and Harvard, and speaks at web conferences worldwide. Find her on Twitter @sara_ann_marie or at sarawb.com.