National Geographic Daily News
A photo illustration of a woman looking at the Facebook website.

A woman looks at the Facebook website in Munich, Germany, on February 2, 2012. Whether her "news feed" contains more positive or negative posts may affect her mood, according to a controversial study.

Photograph by Michael Dalder, Reuters

Dan Vergano

National Geographic

Published July 1, 2014

Revelations about a secret mood manipulation experiment that Facebook conducted on nearly 700,000 social media users created a lot of bad feelings this week.

The June 17 Proceedings of the National Academy of Sciences study, led by Facebook data scientist Adam Kramer, described altering users' exposure to positive or negative comments from friends for a week in 2012.

The results suggested that positive and negative emotions are contagious, in a small way, across social media. But negative feelings were the ones spreading widely with news of the study, as ethicists like Art Caplan of New York University's Langone Medical Center raised questions about its honesty.

The University of Wisconsin, Madison's Dietram Scheufele knows a thing or two about the power of emotions in digital media. The professor of science communication has reported on a psychological "nasty effect" of reader comments on online news stories. His team showed that negative comments make people dislike the subject of otherwise neutral-toned news reports, while positive ones skew them the other way.

Scheufele tells National Geographic in an email interview that the recently disclosed Facebook study points to a need for the public to know more about the world of corporate experiments and raises questions about study ethics in the era of social media.

Is there a larger meaning in the uproar over the Facebook experiment?

We've talked for a long time now about technology moving faster than regulation, but typically, we're talking about this with regards to the natural or biological sciences.

I think this [Facebook] study shows nicely that the problem might be even more urgent for the social sciences, where the realities of what is possible with big data collections and analysis have so far outpaced any legislative mandate to protect human subjects.

The latter will always play catch-up with the former. The only question is how far we're willing to fall behind. The Facebook study shows that we have clearly fallen behind way too far.

So what should the public be asking now?

What should corporations be allowed to do internally with the data they collect, and what should be acceptable in terms of published research? Realistically, there will always be a lot more latitude in terms of what corporations can do than what is acceptable in academic research.

We as consumers have gotten so used to having access to online tools and services that we don't realize anymore that things like email, messenger, photo storage, music storage, and other social networking and cloud services shouldn't be free. They're expensive to provide and—technically—they should all come with a price tag attached.

The fact that we as consumers do not see the price tag doesn't mean it doesn't exist, which means that we end up being the product that's [making money] through the terabytes and terabytes of data we provide every day by having our emails, photo tags, messages, and clicks analyzed by corporations. All that data will be collected and analyzed to increase profits and build better products. And that won't change until we're willing to pay.

So what does that mean for "manipulating" the emotional content of someone's Facebook feed?

If we see this as an issue of corporate data collection, it raises two questions: A, do the benefits of the study outweigh the consumer backlash in terms of Facebook's bottom line? And if they do for corporations like Facebook, we may see more of these studies.

And B, should corporations be allowed to experimentally assign people to conditions that might do psychological harm?

What if a severely depressed person ends up committing suicide because they ended up in the negative emotion condition? The parallel would be GM trying out different qualities of airbags in different cars that they sell to consumers to see what the fatality rates, lawsuits, etc., are in each condition.

What are the differences between academic and corporate research?

For published research, the standards should be completely different, of course. We're seeing a widening gap between academics—who often don't have access to these data or to manipulations like the ones administered in the Facebook study—and corporations, who can create these data themselves, as the Facebook study shows.

This puts academics at an inherent disadvantage, one that is exacerbated by the fact that academic [review panels] would never approve this study in the way it seems to have been conducted: without debriefing, various other safeguards, and so on.

A notable exception is Twitter, which recently had a request for proposals for academics to apply for access to their complete data. In other words, they have been pretty open in making at least raw tweets available to academic researchers.

What's the solution?

The solution, in my opinion, is public-private partnerships. If Facebook, Twitter, or other firms partnered with academic scholars in communication, psychology, political science, sociology, and so on, they would immediately capitalize on the infrastructures that are in place in terms of ethics boards and other oversight.

They would also "buy" valuable expertise in terms of answering complex and important research questions without raising ethical concerns. As Susan Fiske, the editor who oversaw submission of the Facebook study to the journal, noted, this is an important topic. [Fiske said she was "creeped out" by the study in comments to The Atlantic.]

I completely agree. Given what is at stake for society, I hope we can build partnerships that not only help us do the research necessary to understand a rapidly changing technology landscape, but also move forward in a responsible fashion.

Follow Dan Vergano on Twitter.

Cj Mendoza

Social media is not for the weak.

Guy Zaczek

Bill, I like your use of the words "seductive and narcotic." But the rest of the article talks about left-brain ideas like education and training. Some pretty smart people I know are still stupid when it comes to seduction, and all the training in the world will not change that.

Bill Conder

All of our tools and technologies are seductive and narcotic. When we use them they turn around our mind and we become their servant. It starts with literacy training, which lays down the matrix, and from that everything else is assumed to be ok. Our educational programs need to include instruction in both desirable and undesirable effects of human technology and artifice. Otherwise we're at the mercy of those entities, private or state, who want to get and control our attention.

James Lucier

How much 'experimentation' do you think health insurance companies have been doing on patients behind our backs? After half a lifetime with Kaiser, I'm convinced Kaiser is just a big experimentation lab controlled by doctors who see themselves as scientists and use select patients as their private lab rats.

Glad to be out of that trap - feeling healthy after 22+ years of anguish.

Charlotte S.

Mind manipulation is not new; the history of advertising will tell you that. People's input online cannot be taken as fact unless reliable sources are provided. We all need to be wise and discerning. There are those who feel very powerful and who use that power to their own ends. Shades of George Orwell's 1984 and "doublespeak."

Judith Brooke

I wish I understood what you were writing about.  I don't know of any studies done that include what I do on the internet.  I don't know how my photos can be of any interest to any corporation.  I know the emails are scanned and probably certain words are noted by the corporate computers, but other than posting an ad on my email page, I don't know what they do with it.  And the same with my FB postings.  If I delete certain ads and don't read certain postings, what does that say to the corporations?  I have 'unfriended' a number of FB sites because they bombard me with several postings a day, even when the messages are positive.  It's just overwhelming.  I've also 'unfriended' the groups that have negative messages, even if they are of value because there are too many. And any profanity is deleted.  FB struggles to satisfy me by introducing a site that may be of interest to me, but again, the postings are too many.  One a day of a positive message is enough.

Lisa Foster

Unbelievable, try it on your feed. Is it fixed so there's no g+ share? Check it out. Nat Geo and FB issue? Yuck, and huh?! @Lisa Foster

Lisa Foster

Would like to share your story, sourced as you have it now, into the g+ feed. I first saw your Nat Geo story on FB, but your link won't let me, even though it has a "share to g+" button on the side. There are only 3 people on the counter for g+ but a million+ on the counter for FB. Hm, so why? @Lisa Foster

Lisa Foster

Would like to share your story into the g+ feed, but your link won't let me, even though it has a share-to-g+ button. Only 3 people on the counter for that, but millions for FB. Hmmm.

pamela letstalkaboutcorsica

Once again, we're not aware of half of what's going on. Worrying times of manipulation, suggestion, orientation, propaganda, etc. It's not easy for users to find their way through the web and remain whole. I agree.

Dwayne LaGrou

Human beings ARE NOT GUINEA PIGS!!!

This type of manipulation is completely unethical. If something like this were done in the medical field they would all be guilty of malpractice. Why should this be any different?!

Chrissie Raffensperger

I work in the social sciences; if I had taken this study to an institutional review board, it would probably have been tossed out as highly unethical.

Kira Lemke

@Bill Conder My school is training us to be careful. I agree with most of your comment, but I don't think tech is to blame. These tactics have been used throughout history.

M. Aret

@Dwayne LaGrou It has, and they weren't: watch "Dying for Drugs." Curiously, Pfizer is still a thriving company with a full complement of medical "researchers" on staff :\

