The New York Times published a blockbuster story about Facebook that exposed how the company used so-called “smear merchants” to attack organizations critical of the platform. The story was shocking on a number of levels, revealing that Facebook’s hired guns stooped to dog-whistling, anti-Semitic attacks aimed at George Soros [1] and wrote stories blasting Facebook’s competitors on a news site they managed. As Techdirt points out, however, while the particulars are different, the basic slimy tactics are familiar. Any organization that runs public campaigns in opposition to large, moneyed corporate interests has seen some version of this “slime your enemies” playbook.

What is different here is that Facebook, the company seeking to undermine its critics, has a powerful role in shaping whether and how news and information are presented to billions of people around the world. Facebook controls the hidden algorithms and other systems that decide what appears in users’ Facebook and Instagram experiences. And it does so in a way that is almost completely beyond our view, much less our control.

This fact—that Facebook can secretly influence what we see (like, perhaps, criticism against it), both through what it promotes and what it allows to be posted by others—is deeply disturbing. Users deserve some answers from Facebook to these basic questions: 

  • We know that Facebook had staffers embedded in the Donald Trump presidential campaign to help the organization best craft and target its messages. What did Facebook do regarding the attacks on its critics detailed in the Times story? Did it use its power to ensure wide or specifically targeted distribution of this smear campaign? Did it use its control over how the platform works to aim the smears at people who would be most receptive to them? At key policymakers or their staff? If so, how?
  • Did Facebook help develop different versions of the smear campaign to appeal to different audiences, as the Russians have done? If so, we should see all of them.
  • What is the boundary between what Facebook’s policy teams wish to tell the public and how users experience Facebook and Instagram? Is there a firewall, and how is it policed?

The over 2.6 billion people using Facebook globally, whom Facebook unironically calls its “community,” should demand much more than self-serving responses or empty apologies delivered on the Friday before a holiday weekend. Facebook employees, who also have tremendous power to pressure their employer, should join users in demanding that Facebook come clean.

The ongoing hidden nature of Facebook’s algorithmic decision-making, however, plus the fact that it took a major newspaper exposé to bring this to light, means Facebook probably can’t be trusted to provide the answers users require. Facebook must grant neutral, third-party investigators the access needed to determine whether it is misusing its position as our information purveyor to wage its own ugly propaganda war.

Going forward, we must also demand openness from Facebook about how it uses its power to buttress its financial and policy positions. Facebook can have policy positions, and it can even use its own platform to promote them. But it should only do so if it is up front with users about those practices and makes it crystal clear when it uses its power to put a finger on the scales to influence what they see. Only then can users make an informed decision about whether the platform is where they want to be.

Most importantly, this incident confirms that we should double down on pressure on Facebook and Instagram to provide users with more control over their experience on the platforms. We must support and develop competition, including concrete steps to promote data portability and interoperability. Congress can help by removing the legal blocks to a healthier Internet it created through the overbroad Computer Fraud and Abuse Act (CFAA) and Digital Millennium Copyright Act (DMCA). We must also ensure that click-wrap contracts and API restrictions can’t be used to block competing and interoperable online services. As we’ve said before:

If it were more feasible for users to take their data and move elsewhere, Facebook would need to compete on the strength of its product rather than on the difficulty of starting over. And if the platform were more interoperable, smaller companies could work with the infrastructure Facebook has already created to build innovative new experiences and open up new markets. Users are trapped in a stagnant, sick system. Freeing their data and giving them control are the first steps towards a cure.

Facebook’s smear campaign should spur policymakers and the rest of us to ask serious questions about Facebook’s power as our information supplier. And once those questions are answered, we should take the steps necessary to restore a healthy information ecosystem online.

[1] EFF receives funding from Open Society Foundation, which was the organization at the center of the Facebook smear campaign. Facebook has also been a corporate sponsor for several EFF events.
