What the Facebook emotional contagion experiment means for A/B testing

Facebook has been in the news this week over its experiment to determine whether the emotional tone of a user's news feed can influence their own emotional state. And it's pretty clear to everyone that the way Facebook ran their experiment was a bit ethically iffy.

But what I find interesting is the potential implications of Facebook's actions for the rest of the digital marketing industry -- I think it's important we see this as a warning sign for anybody who runs hidden experiments on their website users, which is most of us.

Emotional contagion

Here's the story behind the moral outrage: it's been revealed that Facebook ran a secret experiment on around 700,000 people, who unknowingly had their news feeds edited. Some people saw an overall more positive feed, and others a more negative feed. Simply put, Facebook wanted to find out if seeing more negative posts from your friends makes you write more negative posts yourself. (Answer: yes, it does -- Facebook calls this 'emotional contagion'.)

Why was this a problem? Because Facebook didn't explicitly ask permission first (apart from in their terms and conditions -- and, of course, we've all read those...). This breaks a golden rule of research. They experimented with people's emotions without any kind of informed consent from those people. You or I could have been part of this experiment without any knowledge of it and without understanding the effect it had on us.

And what if I had a history of depression? What if the amended feed I saw, or the subsequent posts I made, affected a relationship with a friend? Facebook didn't ensure participants were fully aware of the risks and potential outcomes of the experiment; and they didn't even let people know they were taking part. They clearly crossed an ethical line in their research.


Crossing the line

But this opens up a difficult question for the rest of us in the digital marketing world: what about our own 'secret experiments'? How do we know when we cross the line?

We run A/B tests for our clients all the time. The participants aren't informed about what we're doing -- even though every one of these experiments is intended to influence their behaviour in some way, often using emotional triggers.
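To make the "hidden" part concrete, here is a minimal sketch of how a typical A/B assignment might work behind the scenes. The names (assignVariant, visitorId, the variant labels) are illustrative only, not any particular testing tool's API; the point is that a hash of the visitor's ID silently decides which experience they see, with no opt-in anywhere.

```typescript
import { createHash } from "crypto";

type Variant = "control" | "emotive_copy";

// Deterministic bucketing: the same visitor always lands in the same bucket,
// and the visitor is never told an experiment is running.
function assignVariant(visitorId: string, experiment: string): Variant {
  const digest = createHash("sha256")
    .update(`${experiment}:${visitorId}`)
    .digest();
  // Use the first byte of the hash to split traffic roughly 50/50.
  return digest[0] < 128 ? "control" : "emotive_copy";
}

// The visitor just sees a page; which one they see is decided here.
const variant = assignVariant("visitor-123", "donation-imagery");
console.log(variant); // "control" or "emotive_copy"
```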

But we know this is different from Facebook's experiment. They used (and abused) personal data, 'private' communications between groups of individuals. That's not what we do with A/B testing, is it?

Well, yes and no.

When we test, for example, how different imagery affects charity donations in randomised sets of visitors, we all agree this is ethical. How about taking this to the next level by using personalisation to test with specific groups of users? Still ethical, right?

How about using advanced personalisation to change emotional messaging for individual users based on their actions, their history, their interactions with brands and with other individual website users? Are we in an ethical grey area yet? And how about in five years' time, when personalisation is giving us research opportunities that we haven't even considered yet?

What's next?

How will we know when one of our experiments crosses an ethical line? Right now, the answer is that we use our common sense -- and that feels like a risk. If we're not careful, somebody with less common sense than us, outraged by the actions of companies like Facebook, will draw the line for us. And that could spell disaster.

This has happened before -- look at cookies. Because of a minority of unethical hucksters in our industry, politicians legislated on cookies despite not understanding what they are or how they are used. The EU cookie compliance debacle has solved nothing, damaged customers' experiences of websites, and wasted valuable client budgets on intrusive and pointless cookie opt-in solutions. It's been an expensive mess.

If a moral outrage over Facebook leads to the same thing happening for A/B testing and other onsite research, then ultimately nobody wins. So let's be careful we don't sleepwalk into having to ask for informed consent just to test the colour of a button.

We should condemn Facebook for running unethical research, we should protect the testing we are doing now, and we should be aware of the implications of the testing we run in the future.
