A study detailing how Facebook secretly manipulated the news feeds of some 700,000 users to study "emotional contagion" has prompted anger on social media.
For one week in 2012, Facebook tampered with the algorithm used to place posts into users' news feeds in order to study how this affected their mood.
The study, conducted by researchers affiliated with Facebook, Cornell University, and the University of California, San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.
The researchers wanted to see whether the number of positive or negative words in the messages users read affected whether those users then posted positive or negative content in their own status updates.
It did: after the exposure, the manipulated users began using more negative or more positive words in their updates, depending on which kind of content they had been shown.
News of the study spread when the online magazine Slate and The Atlantic's website wrote about it on Saturday.
"Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," the study authors wrote.
"These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."
While other studies have mined user data to track trends, this one appears to be unique in that it actively manipulated what users saw in order to measure their reaction.
The experiment was permitted under Facebook's own rules, but was it ethical?
"#Facebook MANIPULATED USER FEEDS FOR MASSIVE PSYCH EXPERIMENT... Yeah, time to close FB acct!" read one Twitter posting.
Other tweets used words like "super disturbing," "creepy" and "evil," as well as angry expletives, to describe the experiment.
Susan Fiske, a Princeton University professor who edited the report for publication, told The Atlantic that she was concerned about the research and contacted the authors.
They in turn said that their institutional review boards approved the research "on the grounds that Facebook apparently manipulates people's News Feeds all the time".
Fiske admitted to being "a little creeped out" by the study.
Facebook told The Atlantic that it does "carefully consider" its research and has "a strong internal review process".
Facebook, the world's biggest social network, says it has more than one billion active users.