Nothing personal: why personalisation is not for me

3 minute read

Lizzie Bruce

Freelance Content Consultant

I first considered the negatives of a personalisation strategy in early 2015, for a digital agency that wanted to understand the risks of this brave new world. But many organisations may not question it as an approach. It’s been a “thing to do” for online retail and services for years. It sounds effective and straightforward: “Let’s find out about this person and send them ads for things they’ll like!”

However, it’s very hard to predict what individuals will like, no matter how much you’ve tracked them. With multi-user devices, or simply with anyone who is part of a family or community, it’s also difficult to know what information or products are being searched for.

In my opinion, personalisation generally does not give a good return on investment, and should not be assumed to improve the user experience. Here are some reasons why.

Too broad: actually, I don’t like that

A company’s categorisation of who likes what is likely to be primitive. A clothing company’s algorithm has determined you’re female: what if you’d actually prefer a “male” jumper or scarf to the ones they’ve commissioned to be designed for women? The company may lose a sale, as they’ve only emailed you highlights from the women’s collection.

Taste is an extremely personal thing. Sometimes even people very close to you can get it wrong. I can’t be the only one to have received a gift and thought, yes I can really see why you’d think that’s “me”, but it’s somehow just… not.

Too slow: I’ve already purchased that

It’s a source of at least mild irritation to have ads for, say, turquoise wellies popping up when you’ve just bought a pair of turquoise wellies.

But perhaps this does work with people who have a hobby that links with shopping: for example hi-fi equipment aficionados. I once met someone whose lounge was overflowing with speakers, and yet they still searched for more.

Why else might marketers like this déjà vu tactic? Perhaps you’ll prefer what’s advertised to you post-purchase, and return what you’ve just bought? More likely, the algorithm does not know, or is slow to find out, that you’re no longer looking for that particular thing.

Generally, if you’re trying to get someone to buy something they have literally just bought, there needs to be some subtlety and marked differentiation. A different book by the same author could be helpful. But they may have already read it. Perhaps ads about new writing from that author, when it comes out, would work.

Not for me this time

People are not islands. We have relatives, friends, colleagues, of a variety of ages. Those people all have birthdays and life event celebrations. You might suddenly need to find the perfect leaving present for someone with a wildly different profile and style to yourself.

Or you may be looking for information on someone else’s behalf. Some user types who might regularly search on behalf of profiles that aren’t their own:

  • doctor or care worker
  • housing officer
  • legal representative
  • teacher, tutor or supervisor
  • virtual personal assistant
  • people with large extended families
  • community group worker
  • buyer for in-house supplies

Not for them any more

Some companies might cotton on that you have specific other people in your life and send you targeted ads around them. Not great after, for example, you’ve broken up with your partner or someone dies. At that point, such personalisation can cause people distress.

GDPR consent conflict

Whatever companies might say in their privacy statement, the General Data Protection Regulation in Europe clearly states that only the minimum amount of personal data required to provide a service should be collected, and that this data should not be stored indefinitely. Other similar privacy regulations will apply elsewhere. Lena Roland, a managing editor at WARC, expands on this in her recent article on the collision of data privacy and marketing.
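To make the two GDPR principles mentioned above concrete, here is a minimal sketch of data minimisation and storage limitation in practice. The required fields and the retention window are invented for illustration and are not legal guidance: what counts as "minimum data" depends on the actual service.

```python
from datetime import datetime, timedelta

# Assumed minimum fields this hypothetical shop needs to fulfil an order.
REQUIRED_FIELDS = {"email", "delivery_address"}
# Assumed retention window; a real policy would be set with legal advice.
RETENTION = timedelta(days=365)

def minimise(raw_profile: dict) -> dict:
    """Keep only the fields the service strictly needs (data minimisation)."""
    return {k: v for k, v in raw_profile.items() if k in REQUIRED_FIELDS}

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Drop records older than the retention window (storage limitation)."""
    return [r for r in records if now - r["stored_at"] < RETENTION]

# Browsing history and inferred tastes never make it into storage:
profile = minimise({"email": "a@example.com", "delivery_address": "1 High St",
                    "browsing_history": ["wellies", "scarves"]})
```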

This is getting creepy

Worse than being haunted by turquoise wellies is when companies seem to actually be stalking you. They pop up at breakfast, while you’re reading emails – even when you’re sure you’ve unsubscribed, or never subscribed in the first place. They turn up uninvited on the social media platforms you use. On any blog with ad space, there they are.

The user reaction? “Ooh, lovely reminders, I must visit your website.” Er, no. “Go away ads.” Yes, that’s much more like it.

In conclusion

There are a lot of strong usability reasons against personalisation. Some may argue it can be time-saving, and increase accessibility, through shorter journeys to desired content. But, remember, that only works if the algorithm gets it right.

Can we achieve a personal experience of digital content another way? Some suggest AI is the answer, but how will it mesh with privacy regulations?

Customisation, where users set their own preferences, opt in to and out of categories and are able to update this quickly and easily when they need or want to, could be a great alternative. Or is that just adding another chore for users, on top of accepting cookies, hurdling paywalls and minimising pop-ups? 
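The customisation model described above can be sketched very simply: the user explicitly opts in to and out of categories, nothing is inferred, and changes take effect immediately. The category names and class shape here are invented for illustration.

```python
class Preferences:
    """User-controlled preferences: explicit opt-in, easy opt-out, no inference."""

    def __init__(self) -> None:
        self.categories: set[str] = set()  # nothing selected by default

    def opt_in(self, category: str) -> None:
        self.categories.add(category)

    def opt_out(self, category: str) -> None:
        # discard() is safe even if the user never opted in
        self.categories.discard(category)

    def wants(self, category: str) -> bool:
        return category in self.categories

# The user, not an algorithm, decides what they see – and can change it any time:
prefs = Preferences()
prefs.opt_in("menswear")
prefs.opt_out("menswear")
```

The design point is that the empty default set does the hedging for you: until the user acts, the company knows, and shows, nothing.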

Perhaps personalisation is best kept offline? A jar of Marmite with your name on it, that’s quite easy to get right. And whoever’s buying it for you should, hopefully, know if you love it or hate it.


About the author

Lizzie Bruce

Lizzie provides content consultancy through Cake Consultancy Ltd. Motivated by creating user-focused, inclusive content, she leads on Content Design London's collaborative Readability Guidelines project, and helps with content research, training and reports. She's also a freelance content designer at Scope, and writes for Prototypr and Digital Drum. With 17 years’ cross-sector content experience, including GDS, John Lewis and RNIB, Lizzie is keen to share her learnings, and is currently creating a user-friendly intranet content resource.
