EFF members at the Mall of Africa. Picture: SUNDAY TIMES/THAPELO MOREBUDI

Last week was a rough one for online retail. The mistake Clicks made, the hurt it caused and the severe consequences the company now faces have left the local marketing and platform world rattled. While public reactions ranged from “How can you not see how wrong this is?” to “What’s the big deal?”, it has certainly been cause for introspection.

We have seen many brands put out irresponsible messaging and campaigns over the past year. But what is becoming clear is that our audiences and users won’t stand for it, and rightly so. Every customer has the power to spread their message and feelings to thousands of people through social media. As content and experience creators, it’s time we recognised our responsibility and acknowledged that we can get things wrong … very wrong.

In an open letter, Clicks group CEO Vikesh Ramsunder committed to reviewing the company’s own and third-party content for implicit and explicit bias going forward. And it is the implicit biases, those beliefs that sit outside our awareness and surface in our unconscious actions, that are the culprits most of the time. It’s easy to believe that, as experience creators in SA, we understand the need for inclusivity and accessibility. But do we really, really understand how the messages we put out there can be interpreted, can exclude people, are open to abuse, and can do real harm?

In 2011 an app called Nextdoor was launched in the US. In its simplest form, Nextdoor was meant to be a connection hub for neighbourhoods, with feel-good uses like sharing your excess tomatoes with your neighbours or selling furniture when you move. Unfortunately, one of the categories within the app – for crime and safety – created a perfect storm of discriminatory content in the form of paranoid racism.

Users in affluent neighbourhoods regularly posted racist messages about “suspicious” people on the channel, and things escalated quickly. The app was efficient and easy to use, and the user experience design was good. Users could post with ease, but moderation was so blind to the nature of those posts that the app quickly became stuck with a reputation for racism. To this day Nextdoor has not really managed to rectify those perceptions, either functionally or reputationally. Couple this ease of use with machine learning, where trending keywords are suggested as you post, and you have created biased AI.

I’ve yet to meet an evil user experience designer. I’m not saying they don’t exist. But I like to believe that none of us creates experiences with the intent of spreading hurt and hate. So how do we make sure this doesn’t happen? Here are a few thoughts.

1. A diverse team: This seems obvious. It’s in line with what SA needs to do. But the challenge here is to recognise that diversity is more than race or ethnicity. Ensure that you have a team that represents a variety of perspectives on topics such as religion, culture, economic background, ability and gender. And involve as many points of view as you can in your projects. Host cross-product design collaborations and reviews. The aim is to interrogate your designs from all angles. Create a space that allows even the most junior team members to have a voice because, as we know, there’s still room for improvement on diversity at management level.

2. Call yourself out on your bias: There are various tools available to assess what implicit biases you have, such as Project Implicit from Harvard. This can be hard. You may think you are a saint. But you will have blind spots, and that does not make you a bad person. Once you know about your bias, you can educate yourself and actively practise reviewing your work from different points of view: Can this imagery be seen as offensive? Does this navigation label leave room for misinterpretation? Does it exclude people?

3. When designing or ideating, do a Black Mirror (TV series) brainstorm: This is a great exercise for new product or feature development. Create an episode in which the plot revolves around the misuse of your product. I first came across this concept in Joshua Mauldin’s blog posts, which include a how-to.

4. Include as many user perspectives as possible in your project process: Dan Brown has written a lot about designing for abusability, but this concept of varied perspectives also relates to accessibility. In his words: “Experience endows us with perspectives, and design thrives on engaging perspectives to inject new ideas and evaluate solutions.” Do you include enough user perspectives in your work? Do you know how your experience will work for people with limited motor skills or poor eyesight? Do you know how it will work in the hands of users with extreme bias or bad intent? Creating personas for these perspectives is a great way to assess your product realistically and see how it will fare in the real world, warts and all.

5. Continually assess your content before it is pushed live: More often than not, content population is a mad rush. Supplier images arrive at the last minute, stock photos are sourced in a hurry and approval deadlines are pressing. Look at the final context of what the user or audience will see. Sense-check your pages in their totality, with images, captions and copy. Does anything read differently now? Is there room for misinterpretation?

It’s vital that bias awareness and ethics become an inherent part of our process, whether we are creating social content, marketing material, content strategies or product designs. Good intentions and talk alone are not enough in the long run. We owe it to our clients and users to do better.

  • Germari Steenkamp is head of customer experience design at VMLY&R South Africa.

