by Kristie LeVangie
The big news today comes from Facebook, and some people are outraged. The rest of us realize shit like this happens ALL the time.
News broke this weekend that the social media giant manipulated almost 700,000 users’ news feeds in 2012 in an attempt to study whether emotion can be influenced by social media.
The catch is no one was told about it until results were released this weekend.
While the company was within its legal rights to do so, some users feel their privacy was violated. By accepting the social media site's Terms & Conditions upon enrollment, you signed over your right to be forewarned.
So what exactly did Facebook do? “The experiment involved reducing the number of positive news feeds for some and reducing the number of negative news feeds for others. The study found that the more positive the news feeds a user received, the more positive their postings became, and vice versa,” says Yahoo’s The Daily Ticker.
Needless to say, Facebookers tend to get a bit raw when their Facebook is messed with.
The study’s lead researcher, Adam Kramer, took to his own Facebook page to apologize:
“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
This paragraph was embedded in a post offering some additional explanation of the study:
For the study details, click here.
For the rest of us social psychologists, non-consensual experimentation is nothing new. Wikipedia keeps a laundry list of non-consensual experiments, including the famous Milgram Experiment of 1961.
The advertising world is a consistent arena for non-consensual social psychology experimentation. Every time advertisers change a package, a message, a color, a situation…be warned: they are measuring your reaction to that change through brand-awareness or cost-volume metrics.
The easiest thing to do if you don’t like it? Shut down your social profile. Then, and only then, will Facebook likely take the hint.
For now, Facebook apologizes for its communication about the experiment, but not for the experiment itself. The company says it will use what it learned from this study to adjust how it communicates with users in the future.
Do you feel violated? What’s your take?