Update: July 1, 2014
Facebook has conducted a secret, massive psychology experiment on its users to find out how they respond to positive and negative messages - without telling the participants
Over 600,000 Facebook users have taken part in a psychological experiment organised by the social media company, without their knowledge.
Facebook altered the tone of users' news feeds to highlight either positive or negative posts from their friends.
It then monitored the users' responses, to see whether their friends' attitudes had an impact on their own.
"The results show emotional contagion," wrote a team of Facebook scientists, in a paper published by the PNAS journal - Proceedings of the National Academy of Scientists of the United States.
"When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."
Facebook was able to carry out the experiment because all users have to tick a box agreeing to its terms and conditions. These permit the use of data for "internal operations, including troubleshooting, data analysis, testing, research and service improvement."
In the study, the authors point out that they stayed within the remit of the agreement by using a machine to pick out positive and negative posts, meaning no user data containing personal information was actually viewed by human researchers.
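For the technically curious, that "machine" classification amounts to word-list sentiment tagging: the paper says a post was counted as positive or negative if it contained at least one positive or negative word, as defined by the LIWC word-count software. The Python sketch below illustrates the idea only; the word lists are invented here for demonstration, since the real LIWC dictionaries are proprietary and contain thousands of entries.

    # Minimal sketch of LIWC-style word-count sentiment tagging.
    # The word lists below are invented for illustration; the real LIWC
    # dictionaries are proprietary and far larger.
    POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
    NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "lonely"}

    def classify_post(text):
        """Label a post by the emotion words it contains. The paper counted
        a post as positive/negative if it held at least one such word; a
        post can be both, which we report as 'mixed' here."""
        words = {w.strip(".,!?").lower() for w in text.split()}
        has_pos = bool(words & POSITIVE_WORDS)
        has_neg = bool(words & NEGATIVE_WORDS)
        if has_pos and has_neg:
            return "mixed"
        if has_pos:
            return "positive"
        if has_neg:
            return "negative"
        return "neutral"

    print(classify_post("So happy today, I love this!"))  # -> positive
    print(classify_post("Feeling sad and lonely."))       # -> negative

Nothing in this procedure requires a human to read the post, which is the basis of the researchers' claim that no personal data was viewed.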
The study, conducted over a week in January 2012, was carried out for scientific purposes. But evidence that social media can have such a strong impact on people's mental state will certainly be of interest to advertisers.
The lead scientist, Adam Kramer, said in an interview when he joined Facebook in March 2012 that he took the job because "Facebook data constitutes the largest field study in the history of the world."
Comments
FB was created by, is monitored by, and benefits, the intelligence agencies and the ultra rich. By putting things there, everything is recorded and can be used against you by them. This includes a psychological profile and other classifications that can be used in future to detain or kill you. FB benefits Zuckerberg far more than the devotees there, who become his unpaid workers, in essence. It's all about controlling and spying on you, so that the elites can play God in this lifetime, and you pay the bill for it.
In this case, FB has been caught playing games with your mind and life. Will you take this and still keep a FB page??
"Outrage is growing regarding Facebook’s (FB) study on user emotions on the social network.
The study, published in the Proceedings of the National Academy of Sciences, looked to determine how happy or sad status updates affected other Facebook “friends.” Researchers from Facebook, University of California, San Francisco and Cornell University manipulated the News Feeds of nearly 700,000 users to control the amount of positive or negative emotional content that appeared in the feed.
According to the researchers, many believe that happy posts on Facebook may be depressing to online friends – an effect they call “alone together.” But Facebook’s study found that when positive posts were reduced, friends were less likely to post happy posts themselves and more likely to post negative posts. And when negative posts were limited, others were more likely to express positive emotions on Facebook.
While the study was in line with Facebook’s data use policy, privacy experts and even the study’s editor have criticized Facebook’s actions. Susan Fiske, the Princeton University professor of psychology who edited the study for PNAS, told The Atlantic that she was “a little creeped out” by the research experiment.
And many turned to Twitter to express their anger at the social network, including privacy expert Lauren Weinstein. Weinstein tweeted a cartoon featuring a stick figure holding a gun to its head with the text, “Facebook secretly experiments on users to try [sic] make them sad. What could go wrong?”
When asked about the study, Facebook directed FOXBusiness.com to a public post on the social network written by Adam Kramer, one of the study’s co-authors and a member of the Facebook Core Data Science Team. In the post, Kramer acknowledged the outrage over the study – but also justified the thinking behind Facebook’s manipulative actions.
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook,” wrote Kramer.
He also, however, seemed to downplay the potential negative effects of the study, pointing out that only 0.04% of Facebook users had their News Feeds manipulated.
“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety,” wrote Kramer.
He added that Facebook would be updating internal review practices based on the backlash to this study, and that the social network’s practices had already changed dramatically since the study was conducted two years ago."
Facebook’s Experiment and its CIA Roots
By Pam Martens and Russ Martens: July 3, 2014
But wait. It gets worse.
Facebook’s secret human lab rat study on a self-described “massive” 689,003 of its users was published just last month in the Proceedings of the U.S. National Academy of Sciences under the title: “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks.” The study said the significant finding was that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”
According to Facebook, this is what it did to manipulate the behavior of its unpaid and involuntary human lab rats:
“In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and non-verbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.”
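In concrete terms, the manipulation described above amounts to probabilistically withholding emotion-tagged posts from a user's feed. The sketch below is a plausible reading of that procedure, not Facebook's actual code; the omission rate is an arbitrary placeholder (the paper reports per-user omission rates between 10% and 90%).

    import random

    # Illustrative sketch of the feed filtering the paper describes: in the
    # positivity-reduced condition, each post tagged "positive" has some
    # chance of being withheld from the treated user's News Feed. The
    # omit_rate here is an arbitrary placeholder.
    def filter_feed(posts, condition, omit_rate=0.5):
        """posts: list of (text, label) pairs, label as from classify_post;
        condition: 'reduce_positive' or 'reduce_negative'."""
        target = "positive" if condition == "reduce_positive" else "negative"
        return [(text, label) for text, label in posts
                if not (label == target and random.random() < omit_rate)]

    feed = [("So happy today!", "positive"),
            ("Feeling awful.", "negative"),
            ("Lunch at noon.", "neutral")]
    print(filter_feed(feed, "reduce_positive", omit_rate=1.0))
    # -> [('Feeling awful.', 'negative'), ('Lunch at noon.', 'neutral')]

The treated users' own subsequent posts were then classified the same way, and the shift in the share of positive and negative words they produced was the measured effect.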
Facebook has observed, along with the rest of America, the fallout from the revelations of secret government surveillance of its citizens. Somehow the public outcry over secret surveillance did not send a “non-verbal cue” to Facebook that there might be a similar outcry over revelations that it was using an algorithm to manipulate the emotional moods of its users without their knowledge or informed consent.
What if some of these users were under psychiatric care for depression? What if they had just lost their job, or their marriage, or their home, or experienced the death of a loved one? How outrageously irresponsible is it to secretly attempt to manipulate the mood of an already depressed person to a more negative state?
But wait. It gets worse.
In 1994, the CIA declassified a secret paper outlining other attempts to manipulate a person’s behavior without their knowledge. The document, “The Operational Potential of Subliminal Perception” by Richard Gafford, notes the following:
“Usually the purpose is to produce behavior of which the individual is unaware. The use of subliminal perception, on the other hand, is a device to keep him unaware of the source of his stimulation. The desire here is not to keep him unaware of what he is doing, but rather to keep him unaware of why he is doing it, by masking the external cue or message with subliminal presentation and so stimulating an unrecognized motive.”
We’re also informed by the CIA that “The operational potential of other techniques for stimulating a person to take a specific controlled action without his being aware of the stimulus, or the source of stimulation, has in the past caught the attention of imaginative intelligence officers.”
And the CIA offers some other helpful tips that Facebook may want to consider in its next human lab rat study:
“In order to develop the subliminal perception process for use as a reliable operational technique, it would be necessary a) to define the composition of a subliminal cue or message which will trigger an appropriate preexisting motive, b) to determine the limits of intensity between which this stimulus is effective but not consciously perceived, c) to determine what preexisting motive will produce the desired abnormal action and under what conditions it is operative, and d) to overcome the defenses aroused by consciousness of the action itself.”
But wait. It gets worse.
The jury is still out on whether this study had a military connection. The original press release issued by Cornell University, which was involved in the research study, indicated that the U.S. Army Research Office was one of the funders of the study. After there was a public uproar about the study itself, this correction appeared at the bottom of the press release:
“Correction: An earlier version of this story reported that the study was funded in part by the James S. McDonnell Foundation and the Army Research Office. In fact, the study received no external funding.”
While questions continue to swirl around this dubious study, one thing is not in doubt: Facebook has a unique talent for brand suicide.