Was Facebook's Mood Study Too Creepy?

Tuesday, July 01, 2014


According to Adrienne LaFrance, senior associate editor at The Atlantic, Facebook manipulating what you see in your news feed is nothing new, but users were outraged when a published study revealed that Facebook had manipulated what kinds of words they saw on their feeds - happy or sad - in an attempt to find out whether it affected their moods later in the week. Was it legal? Ethical? Did it cross a line?
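The mechanism the study describes - skewing a feed toward happy or sad words - can be sketched in a few lines. This is a hypothetical illustration, not Facebook's code: the published study scored posts with LIWC word counts, and the tiny word lists and function names below (`POSITIVE`, `NEGATIVE`, `post_tone`, `filter_feed`) are illustrative stand-ins.

```python
# Hypothetical sketch of word-list feed filtering: score each post as
# positive/negative/neutral by counting emotional words, then withhold
# a fraction of the posts carrying the suppressed tone.
import random

POSITIVE = {"happy", "great", "love", "wonderful"}   # illustrative stand-in
NEGATIVE = {"sad", "awful", "hate", "terrible"}      # illustrative stand-in

def post_tone(text):
    """Classify a post by simple positive/negative word counts."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress="negative", rate=0.9, rng=random.random):
    """Drop each post of the suppressed tone with probability `rate`,
    leaving a feed skewed toward the opposite emotion."""
    return [p for p in posts if post_tone(p) != suppress or rng() >= rate]
```

For example, `filter_feed(feed, suppress="negative")` yields a mostly upbeat feed; the study's question was whether users shown such a feed then post in a measurably different tone themselves.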


Ms. Adrienne LaFrance

Comments [39]

Subliminal Advertising has been controversial since the 50's.
We seem to seek manipulation for pleasure; we object to anything labeled manipulation when otherwise.

Jul. 05 2014 10:04 AM
Donald J. Sepanek from Bayonne, NJ

Jul. 02 2014 06:32 PM


933 days 16 hours and 21 minutes

.... until this anti-semite leaves the White House

Jul. 01 2014 08:39 PM



I think the philosophy professor, as well as the numerous comments stating that we're manipulated by everything in the media, shows how far the moral compass has drifted. Yes, all media use forms of propaganda and manipulation, but stating these facts should not make them acceptable, and that seems to be the agenda of the professor and many people commenting. One may cynically hold FB users in contempt, but you forget that many of its users are ESL, the poorly educated and impressionable teens. Would it be alright to manipulate them for the sake of "science"? Look how many children can identify brands before they can read, thanks to the "idiot box" - that has become an accepted norm too, thanks to arguments like theirs. People are more and more commonly using Social Darwinian rationales to forgo defending those least equipped to defend themselves. They kid themselves that they are somehow exempt from all this. So much for the tired huddled masses: they're just legitimate marks - guinea pigs and cannon fodder. What does that say about you and your definitions of humanity? Should your great grandparents coming through Ellis Island have been judged by those criteria? How far we haven't come.

We have a terrible education system and the professor is a prime example of why it is broken. American society is now at a precipice thanks to people like him, who will find an "objective" way to legitimize any wrong doing done at the behest of scientific "truth". I'm sure he makes a great argument for eugenics too. I mean, those guys surely have a sincere well-meaning interest in perfecting the species, don't they?? Nothing immoral there. No inherent problems foreseeable in those ideas.

When morality is diminished to a game of semantics, we have tipped the scales of justice upside down. Consequence becomes a quaint notion in an era where torture and murder are acceptable in society and in law once again. Welcome to 21st century America, it's looking a lot like the Old World. Btw, when are we pulling down Lady Liberty, because she's just cluttering up the harbor. At least turn off that torch, because we've all obviously developed beacon intolerance.

Jul. 01 2014 07:22 PM
Donald J. Sepanek from Bayonne, NJ

In my opinion, it's unethical to speculate about someone's "hidden" motives when their actions produce a positive result. In fact, it's a step away from gossip. My first reaction when I heard the results of this study was to try and share more positive items with my Facebook friends, yet your guest's first reaction was to become more negative - boggles the mind!

Jul. 01 2014 06:17 PM
Reid Blackman from Brooklyn, NY

Hi everyone - I'm the philosophy professor some of you are commenting on.

I agree that informed consent is an important issue (and if I had more time on air, I would have loved to discuss it!). But I'm not so sure whether it's at play here, or to what extent it is. As a number of other commenters have noted, people have volunteered to use Facebook, agreeing to various terms and conditions. It is also well known (though perhaps not well known enough) that Facebook and most any other website collects data from its users and displays information to its users in part based on that data (google ads, ads on other websites, Netflix and Amazon recommendations, etc.). So when you volunteer to use FB, you are volunteering to use a service that you (should) know will make use of your data in various ways. That's why informed consent is not obviously at issue: FB arguably already has it.

There is, of course, a difference between knowing and it being the case that they should know. If they should know but don't know as a result of their negligence (say, agreeing to terms and conditions without reading them), that's not FB's fault.

This is a difficult issue, of course, and I appreciate everyone's constructive criticism - thanks!

-Reid Blackman

Jul. 01 2014 04:00 PM
Ugh from NY, NY

Wow. Rayna's rant was distractingly annoying. Had to comment.

Jul. 01 2014 01:35 PM
tom LI

Sarah - it's a complete misunderstanding by MOST, if not nearly all, of the people who use any online services.

I'm consistently amazed at how many people are surprised by the lack of privacy online, and now out in the public square.

9-11 and the overreaction by the public and the Govt, in our thirst for immediate revenge and quest for absolute safety, mixed with the unregulated technological explosion and the public's unthinking consumption of all of it (thank you Boomers!), is what allowed for all the unfettered surveillance.

American apathy is what feeds it.

Jul. 01 2014 01:13 PM
tom Li

Amazing that anyone is upset, least of all the emotion and mood public vomiting people who populate Facebook.

Hey folks, we're being manipulated by marketers all the time. Whether it's obvious, like TV ads, or in the public square (supermarkets, shopping malls, movie theaters, etc.), they are always playing with our emotions, and in some cases recording it remotely!

Wake up! This is the United States where real privacy - thanks to the Boomers - is now a thing of the past, along with disclosure. And the vast majority of the general public is apathetic to all of it! Cameras everywhere, and no one knows half of what they're recording, and who's viewing it. Data collected at every transaction - online or at a register, or service desk, etc - and most people shrug it off. "Oh well...the price of living in 2014." Police and Google, and who knows who else driving around searching parking lots, and neighborhoods, etc and taking pictures...and no one says a thing.

Next up, drones peering in your windows, following you around, tracking you/us in traffic, because some undertrained LEO thinks you're suspicious. Or for their own attention ladies for the next stage of stalking from your Exes!

America is fast becoming an over-watched nation.

Jul. 01 2014 01:03 PM

I agree with Ben from Westchester about the Brooklyn "current caller." Specific informed consent opens up another feedback loop to verify the data.

The caller from Brooklyn who could find no harm done neglects to mention that not having informed consent makes the reporting of harm impossible. If the subject doesn't know the experiment is going on, you can't report harm. Another variation is the NRA/GOP prevention of constant data gathering by NIH & CDC on gunshot wounds/deaths & other consequences. If the data can't be gathered, no harm can be proven. Notably, the same tactic was also used when introducing GMO seeds, etc. which meant that the medical profession could not match symptom changes with new food products which may be genetically the "same," but which could not be followed properly for post-introduction changes.

Facebook just reused an old NRA/ALEC/USCOC tactic, preventing data gathering equals effective immunity from liability.

Jul. 01 2014 12:03 PM
Seth from UWS

Nothing that little creep Zuckerberg does is ethical.

Jul. 01 2014 10:58 AM
John A

Headline: Is Facebook Making Us Crazy?
Nextline: Callers prove: Yes!

Jul. 01 2014 10:55 AM
Andrea from Philadelphia

I was surprised by the philosophy professor's take. In academia, it is standard that you cannot use human subjects for research without their consent. And this is not just true of the sciences. I'm a historian, and we are not supposed to interview people for the purposes of oral history without getting the interviewee's consent, even if we are not planning to quote them or use their real names. If people go to Facebook assuming that they are seeing what is going on in their friends' lives, then getting a skewed, mostly positive or negative feed, whatever else it does, impacts those relationships.

Jul. 01 2014 10:50 AM
henry from md

Condemning Facebook while exonerating the commercial advertising industry that lurks at every corner and permeates every cranny seems to me specious.
Facebook's experiment might have this good effect: alerting the public to be more aware of, and discriminating about, the manipulative tricks that advertising (including political commercials) uses.

Jul. 01 2014 10:49 AM
mark from NH from NH

Rayna: that was awesome!

Jul. 01 2014 10:49 AM
The Truth from Becky

Ohhhh for the love of.....delete the account already! You should have read the fine print.

Jul. 01 2014 10:49 AM
Barbara Hoffmann from PA

Tiresome, and really, did you need to give that woman almost 8 minutes of airtime?

Jul. 01 2014 10:49 AM
Sarah from Brooklyn

At the core of all this is a fundamental misunderstanding between Facebook and its users regarding what Facebook IS. Users seem to believe that Facebook is a personal social tool for intimate communications; Facebook knows that it is a business, and its business is data collection and sales. It strikes me as naive to assume that every piece of information a user gives to Facebook is NOT going to be used for Facebook's corporate and business goals: making money off of its users' data.

Jul. 01 2014 10:46 AM

Aren't these the same issues raised in "Nudge"?

" . . . They also argue that this human quality, which some would call irrationality, can be predicted and — this is the controversial part — that if the social environment can be changed, people might be nudged into more rational behavior. . . . "

Jul. 01 2014 10:46 AM
Inquisigal from Brooklyn

Ok Brian, shut this woman caller down!

Jul. 01 2014 10:46 AM
Terri from Brooklyn

Correction: the CFR regs were first published in 1974, then revised and expanded in the following years.

In any case, the philosophy professor is incorrect: fed regs and, more generally, discussions about clinical research ethics are not about the intent of the researchers or quantifiable harm ex post facto, but about the dignity of the research subjects and the necessity of informed consent so as to protect that dignity.

The primary issue is not risk but informed consent: A low-risk experiment with no informed consent is always unethical, whereas a higher-risk experiment with informed consent can be perfectly ethical.

Jul. 01 2014 10:46 AM
Nick from UWS

WAHHHHH WAHHHHH Facebook MANIPULATED me! It is rare that a story so clearly points out the need for Facebook users to look in the mirror and ask themselves why the hell they willingly participate in all this crap in the first place instead of this schoolyard finger pointing at something everybody knows is happening anyway. This is one of the most absurd segments I have ever heard on this show.

Jul. 01 2014 10:41 AM

This is an unethical experiment. What about people who were unwittingly part of this experiment who struggle with depression or anxiety or mood disorders? Some people are sensitive to various things that they see or read, and therefore such an experiment, which could have a negative effect on someone's moods, needs to be disclosed. What affects one person in a small way could affect another in a more significant way depending on the other things that are going on outside of Facebook.

Jul. 01 2014 10:41 AM
Ben from Westchester

The current caller who says "I can't see what harm has been done" by FB's manipulation of the FB news feed really has no idea about the "behind the scenes" issues of how their management of this private telecommunications network affects us globally.

There is a great deal of discussion online about MANY groups who see wrongs in what FB has done to News Feeds. It's not hard to find.

As but one example -- FB previously asked non-profit organizations to accrue "likes" on their pages. Then, after going public, FB began charging non-profit organizations for the ability to "message" information to their own customers.

This is all legal, as FB has the legal right to adjust your news feed however it sees fit. But to "bait and switch" non-profits and then charge them to reach their own followers and customers on the network shows a lack of interest in the public good of non-profit entities like WNYC that are not cash rich and are just trying to reach people who have told FB that they do indeed want to hear non-profit news.

So this is one of many groups who suffer from FB manipulation of news feeds. It is not neutral. It has real effects on organizations, not just major companies like Coca Cola.

Jul. 01 2014 10:41 AM
Jessie Henshaw from Way Uptown

WHAT IS QUITE CERTAIN IS EVER INCREASING MANIPULATION of our lives DOES HARM, and that is clearly the financial business model we are dealing with (up to the point we pay for our services...).

Jul. 01 2014 10:39 AM
The Truth from Becky

Besides, from Coke, Pepsi, McDonald's, Honda, Mercedes, etc., we receive subliminal advertising every day! Does it make you hungry? Make you thirsty? Perhaps... this is a non-issue, a waste of air time.

Jul. 01 2014 10:37 AM
Mar from nyc/fl

another paid research study on the peer-pressure effect
I knew there was a reason I dropped Facebook for a while

Jul. 01 2014 10:36 AM
john from office

I find this constant need to be on line, on Facebook or twitter or whatever is the newest thing CREEPY.

My nephew takes a picture of his dinner before eating and posts it online!!!! WHAT IS THAT ALL ABOUT?

Jul. 01 2014 10:34 AM
Andrew Crocker from Brooklyn, NY

As a student of anthropology at Queens College, I believe it's totally fair. When someone goes on Facebook they find what they're looking for, manipulated or not. Smart ads from Google, the NSA on our phones; there are ways that technology can be manipulated by those that own it on those that use it. I'm not worried, and I think if there's an opportunity for betterment, then it's more than an experiment, it's useful.

Jul. 01 2014 10:34 AM
The Truth from Becky

I can't be answer is my answer, period.

Jul. 01 2014 10:34 AM
Terri from Brooklyn

If the researchers did not obtain informed consent from each and every subject, the study was unethical, and I wonder about PNAS's decision to publish the study.

Facebook might be able to bury ethical concerns in its boilerplate language when one agrees, but this does not absolve the responsibility of the researchers to comply with the US code of regs 45 CFR 46. These regs were first promulgated in 1977, so the researchers have zero excuse for noncompliance.

Jul. 01 2014 10:31 AM
jgarbuz from Queens

Facebook has some minor usefulness to me. It reminds me of birthdays of old friends and acquaintances. It's another medium of communication. But only idiots put real information on it. But then, that is what advertisers are looking for: idiots. They hunt for idiots to push their wares upon them.

Jul. 01 2014 10:29 AM
Nick from UWS

Manipulated by Facebook. Give me a break. The populace of the world is being emotionally manipulated by advertising and politics every single second of every single day in every direction and in every media and everyone accepts it without a word. What is the outrage here? Another non-conversation.

Jul. 01 2014 10:29 AM

Hey no free lunches!!

U play on the facebook they own that data

Jul. 01 2014 10:21 AM

Next the soft drink manufacturers will be using subtle messaging to increase our appetite for salty popcorn as a way to increase demand for their product.

Fortunately, there are "public guardians" looking out for us:

Jul. 01 2014 10:16 AM

In for a penny in for a pound.

Reminds me of the "Meme Wars" described in some of John Barnes's science fiction.

Jul. 01 2014 10:07 AM

Forget advertising dollars, LOL – this stuff offers the keys to POWER and the Kingdom.

The elephant in the room is how the LEFT (and Facebook, Google, Techie Billionaire execs are all rich guilty Lefties) will use this to NEVER LOSE another national election.

(Example extrapolated from this recent "experiment") - Republican candidates ---remove all positive words or descriptors, make negative associations, make subtle tangential references without labeling outright, use the most unflattering photographs, darken the backgrounds. Include unpleasant ads along the sides of these pages with negative associations. Place among unsettling news stories.

Create an unconscious dislike for someone voters weren’t even familiar with a week earlier.

This is George Lakoff mixed with George Orwell and Saul Alinsky.

Jul. 01 2014 08:51 AM
Paul from Boston, MA

Whether it's legal or not, it's certainly unethical.

Jul. 01 2014 07:35 AM
