
Facebook manipulation controversy offers ethics and privacy lessons

News roundup: Facebook's manipulation of users' news feeds has reignited the data privacy debate regarding how enterprises should manage user data.

The recent revelation that Facebook Inc. conducted a psychological study on its users has provoked a wave of fury across the Internet, leaving users and companies alike wondering whether the social network's actions violated its own privacy policies and ethics rules.

But the question being asked in the incident's wake is to what extent enterprises manipulate data provided by users and customers, and whether anyone is considering the ethical implications.

For one week in January 2012, Facebook's Data Science team -- which few knew about until recently despite being in place since 2007 -- changed the news feeds of approximately 700,000 users to study whether an increase in positive or negative posts was emotionally contagious enough to affect the moods of Facebook users.

Since news of the study surfaced several days ago, more reports have emerged about "secret" Facebook tests conducted by its mysterious Data Science team. One test reportedly studied whether Facebook could increase voter turnout in the 2010 congressional election. Another investigated the causes of loneliness. These are just two of the hundreds of tests reportedly performed.

Former Data Science team member Andrew Ledvina told the Wall Street Journal that, regarding the tests, "there's no review process, per se" and that "anyone on that team could run a test. ... They're always trying to alter people's behaviors." In his blog, Ledvina wrote that Facebook altered users' moods and behaviors to "make you like stories more, to click on more ads, to spend more time on the site." Ledvina also admitted that experiments were often run without the knowledge of the rest of the company, or even of others on the team, and that some tests were run on users with "really no formal review process."

While legal commentators have said it's unlikely that Facebook broke any laws given that its user agreement grants it broad authority to make use of data provided by its users, many have asked whether it violated its own privacy policies. Perhaps more importantly, were its actions unethical?

Data suggests that many think so. According to results of a Civic Science poll of more than 3,200 people conducted earlier this week, 53% believe Facebook acted unethically in manipulating its users' feeds to intentionally affect users' emotional states.

In the wake of the incident, Facebook COO Sheryl Sandberg apologized, stating the company "communicated very badly on the emotions study." The Electronic Privacy Information Center has filed a formal complaint with the Federal Trade Commission, and the UK's Information Commissioner's Office and Ireland's data protection commissioner have opened investigations into the matter.

The experiment also prompted multiple news outlets to report what Facebook can do with user data, ponder whether Facebook is taking a lesson from advertising companies and manipulating user emotions to improve its business, and even predict that the growing power of data analysis -- such as Facebook's -- will help firms hire better employees.

The bigger question debated among end users and privacy experts alike is whether information from users -- be they clients, members or employees -- can, or should, be used without those users' explicit consent.

The response to the Facebook drama has been overwhelming; many social media users have indicated it was "creepy" and unexpected, though some suggest this use of user data has become fairly routine.

In other news:

  • All of the nearly two dozen domains that Microsoft seized from Vitalwerks (operator of the No-IP dynamic DNS service) during its most recent botnet takedown have finally been returned to the provider and are being restored. According to the provider, it has reached a settlement with Microsoft, agreeing to shut down the domains responsible for propagating the malware discovered by Microsoft.
  • Is there someone else following in the infamous footsteps of NSA leaker Edward Snowden and revealing more top-secret information about the clandestine intelligence-gathering agency? Glenn Greenwald, a reporter who often works with Snowden, claims there is, and industry expert Bruce Schneier agrees.
  • History is repeating itself, at least as far as macro viruses go. This malware, a troublesome self-propagating problem until it fell off the map in the early 2000s, has resurfaced to once again take over application functions in the Microsoft Office suite, most commonly Word or Excel.

Next Steps

Take a deeper look at the very public issue of data privacy

Does ethical hacking do more harm than good?

View the "Ten Commandments" of computer ethics


Join the conversation




Did Facebook's manipulation of users' feeds for a psychological experiment go too far? What questions does it raise about how enterprises utilize user data?
Of course FarceBook went too far! What is so different about what they did vs. the medical experiments done on prison inmates in the 1960s?
I'm not supporting what Facebook did, but I'm not that bent out of shape about it, either. I already know that what I see in the News Feed is manipulated in some way, meaning that I don't see everything that's posted (and to those who equate Facebook with real life - I think there's an overabundance of 'positive' posts in general that skew our perceptions of others' lives). I think as a user/consumer of mass media, you have to assume that you might not be getting the full picture. 

As noted by Jonathan Zittrain and others, companies are running tests all the time to see how consumers respond. This was just handled particularly poorly.