
Remember when your Facebook news feed just contained your friends’ status updates? Those were such innocent days. Then came ads dispersed here and there. And over time, those ads have looked more and more like actual status updates, luring you into thinking they’re from a trusted source. Now, however, we have Facebook itself on the loose.
It’s true advertisers have worked hard to create news feed ads that contain truly valuable content and information. Some folks even enjoy the occasional ad in their feeds, as they are extremely well targeted in many cases.

Facebook, however, just doesn’t know when to quit.

Recently, much ado has been made over an academic study released by the social media giant in conjunction with Cornell University and UC San Francisco. Facebook engaged these researchers in 2012 to conduct a targeted study of how people reacted to posts with either positive or negative connotations. In other words, they were emotionally manipulating people via their news feeds for the sake of “research”. Sound moral or ethical? That’s a definite gray area.

Details on the Facebook Academic Research

Facebook’s mission: To find out whether, how, and why users responded to positive and negative streams of commentary placed within their news feeds. Over roughly a week in 2012, the researchers purposefully manipulated about 700,000 news feeds, without any warning to those being tested. How could they get away with unsolicited research? It’s in Facebook’s terms and conditions: simply by using the site, we all agree that our data can be used for research purposes. Now you know.

Here’s how it worked: For each user in the experiment, researchers filtered the news feed to reduce either the amount of positive content or the amount of negative content it showed. Extensive metrics were then amassed to see whether those users’ own status updates and related posts reflected the tone they had experienced within their feeds.

In other words, did photos of kittens and babies with inspirational quotes inspire users to post similar content? How affected are we by the content of posts in our feeds?
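For the curious, the measurement side of this boils down to word counting. Here’s a minimal sketch in Python – with tiny, hypothetical word lists standing in for the sentiment dictionaries the researchers reportedly used – of how one might score the emotional tone of a batch of status updates:

# Minimal sketch, NOT Facebook's actual code: estimate the emotional tone of
# status updates by tallying positive vs. negative words. The word lists are
# tiny, hypothetical stand-ins for real sentiment dictionaries.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "awful", "hate", "terrible"}

def tone_score(status_updates):
    """Return (positive fraction, negative fraction) of words across updates."""
    words = [w.strip(".,!?").lower()
             for update in status_updates for w in update.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return pos / len(words), neg / len(words)

# Compare a user's tone before and after a week of a filtered feed.
print(tone_score(["Feeling great about the weekend!", "Love this weather."]))
print(tone_score(["Awful day.", "So sad about the news, I hate Mondays."]))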

The Results of the Research

If you assumed negativity breeds negativity, and positivity more of the same, the researchers would staunchly agree. To quote the related press release, “People who had positive content experimentally reduced on their Facebook news feed, for one week, used more negative words in their status updates. When news feed negativity was reduced, the opposite pattern occurred: Significantly more positive words were used in peoples’ status updates.”

At least in the land of Facebook, it seems human nature mirrors whatever tone is emphasized before it. So now you know why that friend you inundated with negative political posts has turned so gloomy in their own status updates.

The Hows and Whys of the Backlash

Although the experiment may sound innocent enough, emotional manipulation on any level rides a very iffy ethical line. Adam Kramer, one of the key researchers who carried out the experiment, even admitted the entire thing was a bit “creepy.” He strongly defended the goal of the study, but there’s no denying the whole thing feels icky.

Now there are suggestions that Facebook’s company policies have shifted since the study, a change many theorize is meant to lessen the blow by making it seem as though Facebook won’t do it again. Yet few, if any, believe that.

The problem is simple: Facebook gains almost all of its revenue from advertising. Data manipulation like this is therefore a goldmine, and the company is not likely to shy away from it in the future. After all, although there’s a bit of a backlash now, it’s highly unlikely any of this will affect Facebook’s bottom line – at least not in any negative fashion. Perhaps the media’s negative exposure of Facebook’s policies even breeds more positive revenue. They might be on to something.

The Trend Continues

To drive another nail into the coffin of Facebook’s trustworthiness, let’s not forget that last year thousands of small business owners cried foul for similar reasons. The accusation then was that Facebook had significantly diminished their news feed reach, without warning. This meant that your favorite businesses – i.e., the ones you actually followed, and therefore invited into your feed – had a much harder time actually reaching your news feed. Why? Because Facebook wanted to charge them to reach you. It’s crystal clear.

As a result, businesses you followed were greatly handicapped in reaching your news feed, while businesses that paid to do so, whether or not you had invited them in, were all over your homepage. So as you can see, Facebook is no stranger to manipulating feeds for profit.

The big problem here is not any single one of these infractions by Facebook; it’s the bigger picture. If the company is willing to change your feed to suit its research and/or revenue needs, there’s little to no trust left. If a witness lies on the stand once, they can no longer be seen as honest. Facebook has changed the game and shifted things to its own benefit so many times now that we would be fools to think it won’t do it again – or, even more to the point, isn’t doing it right now.

What is your current opinion of Facebook? Do you feel this kind of research is justified, or does it feel more like a betrayal?
