Monday, June 30, 2014

Facebook and informed consent

By now, many people will probably know that Facebook conducted an experiment on around 700,000 users, manipulating their news feeds to see whether it affected their mood. The issue is not the size of the experiment, nor its merits, nor even its results. The real issue is informed consent: whether users were aware of such experimentation and agreed to it.

It would have been relatively trivial for Facebook to have sent a message to a million-odd users saying:

'Facebook would like to conduct an experiment on how the content of the newsfeed impacts emotions. In order to do this, Facebook is looking for volunteers for such a study. At some point in the next year we will manipulate the feeds of those volunteers either positively, negatively or not at all (the baseline) for a period of one week to observe any change in behaviour. It should be noted that there exists the possibility that some volunteers' moods will be altered adversely during that time. The purpose of the research is to determine ways of improving newsfeed content to create a more positive experience and mood. The results of the work will be published and user details will be anonymous. Volunteers may request to see how their feed was altered and any impact on them after the study. Do you agree to volunteer for this study?'

Now, the above I wrote quickly in a few minutes, and it could be vastly improved. However, its purpose would be to gain consent and agreement to an experiment designed to change your mood in a given timeframe for a specific purpose. This is what we call informed consent: you know what is happening and you agree to it.

In the case of Facebook, it didn't inform people specifically, because it argues that when you sign up to the T&Cs you agree to such experimentation. There are two problems with this argument.

1) I doubt that if you asked 100 people to read the Facebook T&Cs, any would agree with the statement 'By agreeing to the T&Cs, you're agreeing to Facebook conducting psychological experiments on you'. Most would probably react with shock and ask 'where does it say that?'

The single line in the Data Use Policy states:

we may use the information we receive about you:
for internal operations, including troubleshooting, data analysis, testing, research and service improvement.

This is unlikely to qualify as informed consent, as the average user would more likely expect this to mean functional changes, speed, performance or reliability of the service rather than psychological experiments.

2) You have no option but to agree to the T&Cs, or you cannot use the service to communicate with friends. In fact, looking back at earlier T&Cs, I've yet to find when the above line first appeared. If this 'research' line appeared only in later T&Cs, then given the widespread use of Facebook to communicate with friends, this is not consent to experimentation but coercion to experimentation, i.e. let us experiment on you if you want to continue talking with friends on Facebook.

Whichever way I look at this, I can't see how it can be considered informed consent. At best, it's uninformed consent; at worst, it's uninformed coercion. Neither is ethical.

But why would you do this? Well, either it's complete arrogance and disregard for users or, worse, it's that basest of all reasons: money. As anyone who has conducted such experiments before will probably tell you, with the exception of high-minded individuals, most people ask 'What do I get for this?'

Yes, if you're going to experiment on people, they normally expect to be paid. However, avoiding cost is no better an excuse for avoiding informed consent than pointing to a line in the T&Cs or arguing that the results don't mean much. The only partial excuse I can think of is 'we were stupid'.

The experiment conducted is, in my opinion, truly awful, not because of the results but because of the lack of informed consent. I've read a lot of apologists for Silicon Valley saying this is OK. Well, fortunately it's not for them to decide.

I'm glad Jim Sheridan has asked the Commons Media Select Committee to investigate.

-- Update 1st July 2014

When I was looking for when the T&Cs had changed to include the 'research' line, my concern was that this experiment was not just uninformed consent but uninformed coercion, i.e. users had no real choice but to allow Facebook to experiment on them on threat of losing the service. There was no opt-out. However, little did I imagine that Facebook (or is that FacePalm?) would apparently change the T&Cs after they had conducted the experiment.

So, all those apologists for Facebook claiming it was in the T&Cs had better start thinking up another excuse. Not only was this not informed consent by any stretch of the imagination, it wasn't even uninformed coercion. It seems to be a new low: Facebook doing whatever it likes without any concern for what it had agreed with users, or informed them of.

Yes, the effects were minor (if you consider tens of thousands of people affected as minor), and yes, the furore might mean Facebook won't publish further research. That is almost certainly why we will need regulation to stop experimentation on the sly.

No, this isn't simple A/B testing, as most users would reasonably expect sites to try different functional pages in order to improve user experience, interaction or ease of use. What users don't expect is for companies to manipulate their personal newsfeed in order to initiate a change in their mental state.

The argument that Facebook was only trying to change your mental state in order to improve your experience with the site is the most tenuous bit of arse covering I've read in a long time. Given that the generalised, flimsy and buried line in the T&Cs which supposedly permitted this didn't even exist at the time of the experiment, I hope the Commons media select committee hauls Facebook over the coals for this.

Apparently a Facebook spokesperson told Forbes: "When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer." Well, that's true of many services, but I can't see how any reasonable person would interpret this as "When someone signs up for Facebook, they give us permission to run psychological experiments on them, including manipulating their newsfeed with the intention of negatively altering their mood".

Oh, and finally, people keep pointing out that advertisers do this all the time. Well, yes they do, but those are adverts: we know that advertisers manipulate, and we have legislation that covers even this. Few of us trust what advertisers say. But it wasn't adverts that Facebook manipulated; it was our newsfeed. It was the collection of messages that we receive from our social network, something we value, something we trust and something which we don't expect to be manipulated in this manner.

We certainly don't expect our newsfeed to be manipulated in a manner designed to affect our mood as part of a psychological experiment that we weren't even asked whether we minded being part of, and then to be told we had given 'informed consent' because of an obscure line in the T&Cs that no reasonable person would interpret as meaning such consent, and which didn't even exist when the experiment took place.
