Welcome!

Welcome to my blog, a place to explore and learn about the experience of running a psychiatric practice. I post about things that I find useful to know or think about. So, enjoy, and let me know what you think.


Sunday, June 29, 2014

What's Your Take?

You've probably seen the tweets, etc., about the paper, Experimental evidence of massive-scale emotional contagion through social networks. This was a study conducted by researchers from the:


Core Data Science Team, Facebook, Inc., Menlo Park, CA 94025;
Center for Tobacco Control Research and Education, University of California, San Francisco, CA 94143;
and Departments of Communication and Information Science, Cornell University, Ithaca, NY 14853

From January 11 to 18, 2012, Facebook adjusted its news feed algorithm so that 689,003 users saw either less positive emotional content or less negative emotional content from friends. Then they looked at what those users posted in response.

They found that less positive feed content yielded fewer positive posts, and less negative content yielded fewer negative posts. If that sounds confusing, think of it this way: they tried to make people either sadder or happier. The ones they tried to make sadder, by giving them less happy stuff to read, got sadder. And the ones they tried to make happier, by giving them less sad stuff to read, got happier.

They used linguistic software to determine the happiness or sadness of an individual post.
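(For the technically curious: the paper says a post counted as positive or negative if it contained at least one positive or negative word, per word-count software of the LIWC type. Here's a minimal sketch of that style of classification. The word lists are tiny and invented for illustration; the real tool uses dictionaries of thousands of words.)

```python
# Minimal sketch of word-count sentiment labeling, in the spirit of
# LIWC-style analysis. Word lists here are invented for illustration.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "lonely"}

def classify_post(text: str) -> str:
    """Label a post by whether it contains any positive or negative word.
    (Per the paper, a post counted as emotional if it contained at least
    one emotional word.)"""
    words = {w.strip(".,!?").lower() for w in text.split()}
    has_pos = bool(words & POSITIVE_WORDS)
    has_neg = bool(words & NEGATIVE_WORDS)
    if has_pos and not has_neg:
        return "positive"
    if has_neg and not has_pos:
        return "negative"
    if has_pos and has_neg:
        return "mixed"
    return "neutral"

print(classify_post("What a wonderful, fun day!"))  # positive
print(classify_post("I feel so sad and lonely."))   # negative
```

Crude as it sounds, this is roughly the level of analysis involved: no deep reading of any post, just word counting at enormous scale.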

The paper was reviewed and edited by Susan T. Fiske, a professor of Psychology and Public Affairs at Princeton. It was published in the Proceedings of the National Academy of Sciences of the United States of America.

Facebook always filters the news feed; they claim there would just be too much stuff to read in the feed otherwise. In the case of the study, they simply adapted the filtering system to randomize users to happy or sad arms, and they claim that this is "...consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research."
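(Again for the curious, here's a sketch of how an existing feed filter could be adapted into an experiment like this. This is my guess at the general shape, not Facebook's actual code: the paper says emotional posts had between a 10% and 90% chance of being omitted, tied to the user's ID, so the hash-based arm assignment and the flat 50% omission rate below are assumptions for illustration only.)

```python
import hashlib
import random

# Assumed flat omission rate for illustration; the paper describes
# per-user rates between 10% and 90%, tied to user ID.
OMISSION_RATE = 0.5

def assign_arm(user_id: str) -> str:
    """Deterministically assign a user to an experimental arm by hashing
    their ID -- one common way to randomize without storing assignments."""
    h = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return "reduce_positive" if h % 2 == 0 else "reduce_negative"

def filter_feed(user_id: str, posts: list[tuple[str, str]]) -> list[str]:
    """posts is a list of (text, sentiment) pairs, where sentiment comes
    from a word-count classifier like the sketch above. Posts matching the
    arm's targeted emotion are dropped with probability OMISSION_RATE."""
    arm = assign_arm(user_id)
    target = "positive" if arm == "reduce_positive" else "negative"
    return [text for text, sentiment in posts
            if not (sentiment == target and random.random() < OMISSION_RATE)]
```

The point of a setup like this is that the "experiment" is just a small tweak to machinery that already exists, which is exactly what makes the consent question so slippery.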

Their goals seem to have included demonstrating that online interactions can have significant emotional impact, and debunking the theory that people get sad when they see how happy all their friends are on Facebook.

Well, I guess this is a powerful use of big data, but I just wish they had asked first. They claim they did, that informed consent was implicit in the Facebook user agreement. I think that's pushing it.

I happen to know a depressed adolescent (not my patient), a big user of Facebook, who was hospitalized a couple of weeks after this experiment. Am I claiming this is why he was hospitalized? No. Could it have influenced his need for hospitalization? Maybe. Does an adolescent need a parent to sign informed consent? Could be.

It's a complicated issue. Every advertiser on the planet tries to manipulate people's emotions, so how is this different? Because Facebook wasn't trying to sell anything, it was just manipulating emotions to see what would happen? I don't know if that makes it better or worse.

An inquiry was made to Dr. Fiske about IRB approval. Her response:



Would it have adversely affected their data if they had first done a mass posting along these lines?


We interrupt this waste of time to bring you an important news bulletin! You may be randomized to participate in a study of your emotions. If you experience prolonged and unremitting sadness for more than 4 hours, please seek emergency medical assistance. If you wish to opt out of this experiment, click HERE.


I don't know. What do you think?




2 comments:

  1. Reading the full text of the article, the effect documented here is fairly weak considering the mountain of posts that they perused. In their lexical analysis, the positive posts outnumbered the negative by 2:1. It seems like contagion is too strong a word for what they may be demonstrating. It seems more like a combination of empathy and resonance with real life experience. I would be surprised if it resonated very long with anyone who has learned how to modulate their affect. When I think of contagion, I am more likely to think of mass hysteria scenarios, like all of the children in a school believing they were poisoned and displaying various symptoms, only to find out that there was no toxin exposure.

    It may be an interesting approach for trying to detect people who are affected by violence, since most experimental paradigms have concluded that there are no good correlations, but data mining may be a better approach.

    I agree with you on the informed consent issue, but this is completely consistent with my theory that there are many forces at play to make privacy non-existent and make it seem like that is the new norm. I can't imagine that a manipulation like this is good for Facebook's core business.

  2. One of the blogs I follow, written by a mathematician interested in Big Data, has this commentary and links to several other articles on the experiment. I thought her comment that Google and Facebook do these kinds of experiments frequently was interesting:

    http://mathbabe.org/2014/06/30/thanks-for-a-great-case-study-facebook/
