Everything you need to know about Facebook’s psychological experiment

On January 11, 2012, a team of scientists including Adam D.I. Kramer, Jamie E. Guillory and Jeffrey T. Hancock started a weeklong experiment aimed at testing whether emotions could be “spread” through social networks like Facebook, where communication is mostly text-based. Nearly 700,000 users participated in the study without being aware of it.

Two years later, on March 25, 2014, the study was published in an American journal. Months later, journalists brought it to the public's attention, sparking a huge scandal. But what happened, exactly?

Facebook manipulated the visibility of some publications according to their words

Facebook never shows everything that your contacts publish. Instead, it selects the things it considers most relevant using a formula called EdgeRank, which takes into account factors like the type of content and your relationship with the person who posted it, among others.
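The real EdgeRank formula is proprietary, but it is commonly described as the product of your affinity with the author, a weight for the content type, and a time-decay factor. The sketch below is only an illustration of that description; the field names and weights are assumptions, not Facebook's actual values.

```python
# Toy sketch of an EdgeRank-style score: affinity x content weight x time decay.
# The real formula is proprietary; all names and numbers here are illustrative.
from dataclasses import dataclass
import time

@dataclass
class Post:
    author_affinity: float  # how often you interact with the author (assumption)
    content_weight: float   # e.g. a photo might outrank a plain status (assumption)
    created_at: float       # Unix timestamp of the post

def edgerank_score(post: Post, now: float) -> float:
    """Higher score = shown higher in the feed; older posts decay."""
    age_hours = max((now - post.created_at) / 3600, 1.0)
    return post.author_affinity * post.content_weight * (1.0 / age_hours)

now = time.time()
posts = [
    Post(author_affinity=0.9, content_weight=1.5, created_at=now - 3600),  # close friend, photo
    Post(author_affinity=0.2, content_weight=1.0, created_at=now - 600),   # acquaintance, status
]
feed = sorted(posts, key=lambda p: edgerank_score(p, now), reverse=True)
```

Filtering a feed this way is exactly the lever the experiment pulled: tweak the score of posts matching some criterion and their visibility changes without touching the posts themselves.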

The scientists participating in this study, one of whom works at Facebook, chose 689,003 English-speaking users and modified the formulas applied to their News Feeds so that their friends' posts with emotional content of one kind or another were more or less likely to be visible to them.


The 689,003 participants were divided into four groups of equal size (source)

So, for example, participants had a much higher or lower chance than normal of seeing negative or positive publications from their friends. The researchers then measured whether the participants themselves went on to publish more positive or negative updates.

The publications were analyzed using LIWC (Linguistic Inquiry and Word Count), software that scans the text of every post involved, searching for words that express positive or negative emotions (you can try it yourself here).
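At its core, LIWC-style analysis is dictionary matching: tokenize the text and count how many words fall into each emotion category. The actual LIWC dictionaries are proprietary, so the tiny word lists below are stand-ins; this is a minimal sketch of the idea, not the tool itself.

```python
# Minimal sketch of LIWC-style word counting. The real LIWC dictionaries
# are proprietary; these tiny word lists are illustrative stand-ins.
import re

POSITIVE = {"happy", "love", "nice", "great", "sweet"}
NEGATIVE = {"sad", "hurt", "ugly", "nasty", "awful"}

def emotion_counts(text: str) -> dict:
    """Count total words and how many match each emotion word list."""
    words = re.findall(r"[a-z']+", text.lower())
    return {
        "words": len(words),
        "positive": sum(w in POSITIVE for w in words),
        "negative": sum(w in NEGATIVE for w in words),
    }

print(emotion_counts("What a great day, I love this!"))
# 7 words in total, of which 2 positive ("great", "love") and 0 negative
```

In the study, a post counted as "positive" or "negative" if it contained at least one word from the corresponding category, which is why such a crude method scales to millions of posts.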


You can try the same LIWC program used by Facebook scientists

Emotions can “spread” on Facebook

When the experiment was over, the researchers analyzed more than three million publications totaling over 122 million words (4 million positive, 1.8 million negative). That's when the scientists started to statistically analyze the data. The hypothesis was simple: if it's true that emotions can spread across Facebook, then people who see fewer positive updates will publish fewer positive updates, and vice versa.
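The comparison itself boils down to rates: for each condition, compute the share of positive and negative words, then check the direction of the difference against the control group. The sketch below shows that comparison with made-up numbers; they are not the study's actual figures.

```python
# Hedged sketch of the group comparison described above.
# All counts are invented for illustration, not the study's data.
def word_rates(pos_words: int, neg_words: int, total_words: int) -> tuple:
    """Return (positive rate, negative rate) as fractions of all words."""
    return pos_words / total_words, neg_words / total_words

control = word_rates(pos_words=5200, neg_words=1700, total_words=100_000)
reduced_positive = word_rates(pos_words=4900, neg_words=1850, total_words=100_000)

# Under the contagion hypothesis, the group shown fewer positive posts
# should produce a smaller share of positive words and a larger share
# of negative words than the control group.
pos_drop = control[0] - reduced_positive[0]
neg_rise = reduced_positive[1] - control[1]
```

The published effect sizes were tiny (fractions of a percentage point), which only registered as statistically significant because the sample was so enormous.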

What the scientists found is that when the amount of positive news dropped, people published more negative and fewer positive updates, and vice versa, leading them to suggest that emotional contagion is possible even on social networks. Another effect is that when the number of emotional posts of any kind was reduced, people published fewer emotional posts themselves, perhaps indicating that they were less "invested".

The kind of updates that were considered "sad" or "negative" in the experiment

In short, when your friends post sad news, you become sad. When they publish happy news, you become happier. When your friends talk about how they’re feeling, you become more emotional in your updates. It should be noted, however, that the results of this study show that the effect is very subtle.

Privacy was preserved, but Facebook didn’t ask for permission

The researchers used techniques that hid certain text from users, but they assumed that there was no need to ask the users for permission because when you sign up for Facebook, you have to accept the data use policy, which gives Facebook permission to use the data for “analysis and service improvement”.

The sentence in Facebook’s data policy that “allowed” the experiment

Informed consent, which Facebook didn’t ask for, is nothing more than a form that explains the objective of the study and asks the person involved for their consent. Its use is mandatory in clinical trials and human experiments of any kind and is a standard ethical requirement in the social sciences.

In psychology, however, asking for consent can itself influence the results: the feeling of being observed makes people behave differently, a phenomenon known as the Hawthorne Effect. The scientists say they sought the advice of an ethics committee before starting the experiment, although they haven't given any details, and there are doubts about whether it was ever ethically validated.

It’s a very common experiment in social science

Emotional contagion isn’t a virus or a psychological warfare technique, but a phenomenon that has been studied for decades, describing those moments when the emotions of two individuals converge. A frequent question is whether it can also happen online without seeing the faces or even gestures of your conversational partners. Hence, the experiment was carried out to see if the theory was true or not.

An example of happiness contagion on Facebook. It seems that I made Iván happy

The type of experiment carried out by Facebook is typical in the social sciences, where people are randomly assigned to groups in which certain aspects of the environment or stimuli are manipulated. In this sense, Facebook did not directly manipulate any human being, and the measured effect on emotions was very small.

This experiment differs from the norm in psychology in that it used a social network like Facebook to run the tests and collect the data. The number of subjects (nearly 700,000) is also far larger than in most psychological experiments.

What can you do if you don’t want to be an Internet guinea pig

When using apps, you should be aware that anything you post may be used to improve the service or to carry out experiments similar to Facebook's. The simple act of gathering visitor statistics and modifying a page accordingly is very similar, in spirit, to the experiment described above.

In this sense, all apps want to collect data to improve their service, but not all stop to ask whether you'll allow it; it's usually an agreement you accept or reject within the app. Many of the improvements you see on Facebook, Twitter, and your favorite Google apps are the result of studies that analyze what you "like" and what you share with your contacts.

You decide if you want to carry on using a service that can use your data at will

As a general recommendation, check the data and privacy policies of your favorite apps to see whether you're authorizing the use of your data for research or analysis. In some cases you can choose not to share your details, but that option is only available in some applications.

On the other hand, stay informed. Data policies change every day, and although they may seem like a waste of time, they are agreements you enter into which may have significant consequences on your privacy. Remember to research, read and, if necessary, decide whether to stay or leave.

If you want to leave Facebook, read the 7 essential tips for leaving Facebook.
