Design against bias: questions and resources to help you check yours

4 minute read
Lizzie Bruce

Freelance Content Consultant

Earlier this year, the Guardian reported that a Twitter photo-cropping algorithm was cropping Black people out of images. Twitter apologised and said it had not gone far enough in testing the service for bias.

In 2017 Chukwuemeka Afigbo, a Facebook employee in Nigeria, posted a tweet with a video of a soap dispenser that works only for white skin tones. The tweet read:

"If you have ever had a problem grasping the importance of diversity in tech and its impact on society watch this video." - @nke_ise on Twitter.

The tweet was retweeted 170,000 times and is referenced in many publications, including Gizmodo. The same failure had been encountered at a Marriott Hotel in Atlanta in 2015, as reported by Mic.com. Richard Whitney, quoted in the Mic article, explains how the design failed: "In order to compensate for variations in skin color the gain, [or] sensor equivalent to ISO and exposure in cameras, would have to be increased."

More examples of digital products not designed for everyone include a name form field validator that did not recognise surnames containing spaces or apostrophes, and the NHS Track and Trace app, which requires Bluetooth and a recent operating system update: this makes assumptions about device and data affordability, and about digital skill.
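To make the name-validation failure concrete, here is a minimal sketch (hypothetical code, not taken from any of the products mentioned) contrasting an overly strict pattern with a more inclusive check:

```python
import re

# Overly strict: only unaccented Latin letters, so it rejects
# "O'Brien", "van der Berg" and "Ngũgĩ" outright.
STRICT_NAME = re.compile(r"^[A-Za-z]+$")

def is_plausible_name(name: str) -> bool:
    """More inclusive check: require only that the user typed
    something non-blank, and leave spelling to the user."""
    return bool(name.strip())
```

Real services may still need some validation, such as length limits, but the safest default is to accept what people say their name is.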

What to ask yourself, and your team

Nothing is “normal”

First, remember that nothing is “normal”. Your normal is different to the next person’s. We all bring different cognitive biases from our backgrounds, upbringings, school life, work life, relationships with people and every other experience each of us has had as an individual. Find out more about cognitive bias before you start designing.

Questions to ask

1. Am I designing for “me”? Am I or is my team making assumptions about any of these?

  • Race
  • Culture
  • Wealth
  • Literacy
  • Cognitive ability
  • Digital literacy
  • Mode or medium of access
  • Broadband speed 
  • Data allowance
  • Age
  • Gender identification
  • Sexual orientation

Would what I am creating work for, and be understood as intended by, a broad range of people?

2. Am I designing for “me in the past”? Our past experiences may differ from present-day users’ – better, worse or just different in other ways.

3. Am I designing for “people I know”? Anecdotal references are exactly that.

4. Am I designing for “my peers as I imagine them to be”? If you’re ‘writing about writing’ or creating a resource for other designers, ask: am I designing for my own subconscious idea of them? Your projection of who this user is may not reflect you, but unless it’s based on research it is still a bias!

A quick way to find out what biases you might be bringing to what you’re creating is to visualise your first instinct of who you see using your product. Draw or write down a description. What does that tell you?

The only robust way to check that your product design is not biased is to research and test at every stage of the design process of your digital product, service or platform, with real users from a broad variety of cultural backgrounds and digital ability levels, using different devices, browsers and access technology.

Read ‘Questions designers should be asking’ by Garrett Kroll 

When to ask

In Discovery: to remove bias from, inform and steer your desktop research, user research participant recruitment and user interview scripts.

In Alpha: to take bias out of your prototype design. This could mean removing references that not all cultures would understand, not making gendered titles a required form field, and not making assumptions about digital capabilities, such as being able to upload a photo or use Google Maps.
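One way to act on the form-field point above is to make the honorific optional from the start. A minimal Python sketch (the names and structure here are illustrative, not from the article):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContactDetails:
    # One free-text field avoids assuming a first-name/surname split.
    full_name: str
    # A gendered title is optional, never a required field.
    title: Optional[str] = None

# Submitting without a title is valid by default.
person = ContactDetails(full_name="Chukwuemeka Afigbo")
```

Defaulting optional fields to None at the data-model level means no downstream form or validator can quietly make them mandatory.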

In Beta: to recruit a wide range of user research participants and to make sure the prototype will work for people using a range of devices, browsers and access technology.

At Launch: to make sure any launch events are accessible to all, and to run user acceptance tests across a range of devices, browsers and access technology.

After Live: to maintain content that is fit for all, gather feedback from a range of users and apply new insight.

Explore this topic further



About the author

Lizzie Bruce

Lizzie is the author of 'Task-based intranet content, a step by step guide to user-centred design'. She led Content Design London's collaborative Readability Guidelines project, and provides content services through Cake Consultancy Ltd. With 17 years’ content usability experience across private, public and charity sectors, Lizzie is keen to share her learnings.
