
Furor Erupts Over Facebook's Experiment on Users

Almost 700,000 Unwitting Subjects Had Their Feeds Altered to Gauge Effect on
Emotion

By Reed Albergotti
June 29, 2014

A social-network furor has erupted over news that Facebook Inc., in 2012, conducted a
massive psychological experiment on nearly 700,000 unwitting users.

To determine whether it could alter the emotional state of its users and prompt them to
post either more positive or more negative content, the site's data scientists enabled an
algorithm, for one week, to automatically omit content that contained words associated
with either positive or negative emotions from the central news feeds of 689,003 users.
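The mechanism described above can be pictured as simple word-list filtering of a user's feed. The short Python sketch below is only an illustration of that general idea; the word lists, function names, and feed format are assumptions made for this example, not anything drawn from Facebook's systems.

```python
# A minimal, illustrative sketch of word-list-based feed filtering -- not
# Facebook's actual code. The word lists, function names, and feed format
# are placeholders chosen purely to show the general idea.

POSITIVE_WORDS = {"happy", "great", "love"}   # placeholder emotion lexicon
NEGATIVE_WORDS = {"sad", "awful", "hate"}     # placeholder emotion lexicon

def contains_words(post_text, lexicon):
    """Return True if the post contains any word from the given lexicon."""
    return any(word in lexicon for word in post_text.lower().split())

def filter_feed(posts, suppress):
    """Drop posts carrying the targeted emotion ('positive' or 'negative')."""
    lexicon = POSITIVE_WORDS if suppress == "positive" else NEGATIVE_WORDS
    return [p for p in posts if not contains_words(p, lexicon)]

# Example: a user assigned to the reduced-positivity condition.
feed = ["I love this sunny day", "Traffic was awful today", "Meeting moved to 3pm"]
print(filter_feed(feed, suppress="positive"))
# ['Traffic was awful today', 'Meeting moved to 3pm']
```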

The research, published in the March issue of the Proceedings of the National Academy
of Sciences, sparked a different emotion, outrage, among some people who say
Facebook toyed with its users' emotions and used members as guinea pigs.

"What many of us feared is already a reality: Facebook is using us as lab rats, and not
just to figure out which ads we'll respond to but actually change our emotions," wrote
Animalnewyork.com, a blog post that drew attention to the study Friday morning.

Facebook has long run social experiments. Its Data Science Team is tasked with turning
the reams of information created by the more than 800 million people who log on every
day into usable scientific research.

On Sunday, the Facebook data scientist who led the study in question, Adam Kramer,
said he was having second thoughts about this particular project. "In hindsight, the
research benefits of the paper may not have justified all of this anxiety," he wrote on his
Facebook page.

"While we've always considered what research we do carefully," he wrote, Facebook's
internal review process has improved since the 2012 study was conducted. "We have
come a long way since then."

The impetus for the study was an age-old complaint of some Facebook users: That going
on Facebook and seeing all the great and wonderful things other people are doing makes
people feel bad about their own lives.

The study, Mr. Kramer wrote, was an attempt to either confirm or debunk that notion.
Mr. Kramer said it was debunked.

According to an abstract of the study, "for people who had positive content reduced in
their News Feed, a larger percentage of words in people's status updates were negative
and a smaller percentage were positive. When negativity was reduced, the opposite
pattern occurred."

The controversy over the project highlights the delicate line in the social-media industry
between the privacy of users and the ambitions, both business and intellectual, of the
corporations that control their data.

Companies like Facebook, Google Inc. and Twitter Inc. rely almost solely on data-driven
advertising dollars. As a result, the companies collect and store massive amounts of
personal information. Not all of that information can be used for advertising, at least
not yet. In the case of Facebook, there is an abundance of information practically
overflowing from its servers. What Facebook does with all its extra personal
information, the data that isn't currently allocated to the advertising product, is largely
unknown to the public.

Facebook's Data Science Team occasionally uses the information to highlight current
events. Recently, it used the data to determine how many people were visiting Brazil for
the World Cup. In February, The Wall Street Journal published a story on the best
places to be single in the U.S., based on data gathered by the company's Data Science
Team.

Those studies have raised few eyebrows. The attempt to manipulate users' emotions,
however, struck a nerve.

"It's completely unacceptable for the terms of service to force everybody on Facebook to
participate in experiments," said Kate Crawford, visiting professor at MIT's Center for
Civic Media and principal researcher at Microsoft Research.

Ms. Crawford said it points to a broader problem in the data science industry. Ethics are
not "a major part of the education of data scientists and it clearly needs to be," she said.

Asked a Forbes.com blogger: "Is it okay for Facebook to play mind games with us for
science? It's a cool finding, but manipulating unknowing users' emotional states to get
there puts Facebook's big toe on that creepy line."

Slate.com called the experiment "unethical" and said "Facebook intentionally made
thousands upon thousands of people sad."

Mr. Kramer defended the ethics of the project. He apologized for wording in the
published study that he said might have made the experiment seem sinister. "And at the
end of the day, the actual impact on people in the experiment was the minimal amount
to statistically detect it," he wrote on Facebook.

Facebook also said the study was conducted anonymously, so researchers could not
learn the names of the research subjects.

Mr. Kramer said that the content, both positive and negative, that was removed from
some users' news feeds might have reappeared later.

The emotional changes in the research subjects were small. For instance, people who saw
fewer positive posts reduced the number of their own positive posts by only a tenth of a
percent.

Comments from Facebook users poured in Sunday evening on Mr. Kramer's Facebook
page. The comments were wide-ranging, from people who had no problem with the
content, to those who thought Facebook should respond by donating money to help
people who struggle with mental health issues.

"I appreciate the statement," one user wrote. "But emotional manipulation is emotional
manipulation, no matter how small of a sample it affected."

Facebook users agree to terms of service that give the company wide leeway in how it
can treat them.
