Algorithms and software entice with their efficiency, but there can be a price to pay.

Staff Writer

Facebook just announced that it may have helped Russian groups influence U.S. elections through ads. In a review of ad purchases after the election, the company found about $100,000 in spending on 3,000 ads purchased by accounts and Pages that “were affiliated with one another and likely operated out of Russia.”

There have long been questions about the effectiveness of Facebook ads and about how businesses could get themselves into trouble by using them badly. There has even been uncertainty over how much of the site’s traffic comes from bots.

Now some of those issues have cross-pollinated. The automated ad placement systems that Facebook uses may have allowed content the company would not otherwise have permitted. Automation and algorithms can deliver enormous benefits to companies, but they can also leave those companies vulnerable in various ways.

Here are the points Facebook made in a press release titled “An Update On Information Operations On Facebook,” authored by Chief Security Officer Alex Stamos:

  • The vast majority of ads run by these accounts didn’t specifically reference the US presidential election, voting or a particular candidate.
  • Rather, the ads and accounts appeared to focus on amplifying divisive social and political messages across the ideological spectrum — touching on topics from LGBT matters to race issues to immigration to gun rights.
  • About one-quarter of these ads were geographically targeted, and of those, more ran in 2015 than 2016.
  • The behavior displayed by these accounts to amplify divisive messages was consistent with the techniques mentioned in the white paper we released in April about information operations.

“In this latest review, we also looked for ads that might have originated in Russia — even those with very weak signals of a connection and not associated with any known organized effort,” Stamos wrote. “This was a broad search, including, for instance, ads bought from accounts with US IP addresses but with the language set to Russian — even though they didn’t necessarily violate any policy or law.” He also said that the accounts in question had been deactivated.

Legal or not, though, this becomes a mess for Facebook. Influence on the 2016 election is a highly charged topic and a place where many businesses would rather not find themselves. And, as my colleague Minda Zetlin notes, the episode also reinforces concerns about Facebook’s role as a media filter, with significant consequences for society.

The contrast between online and traditional media shows how automation can change the way companies do business, and whether they face unintended consequences as a result. The more a company uses software to direct its actions, the more it depends on that technology to make decisions the way it would hope humans would.

Facebook isn’t the only company whose automation does things that might not seem wise. In January, it became clear that a number of high-profile sites had lost their Google ad revenue because Google’s systems, even with human review after the fact, decided there was something wrong with their content.

Increasingly, companies are turning over jobs, and even decision making, to software systems that don’t ask for raises, vacation, promotions, or a fancy office. What’s not to like? But noticing when things go wrong can be difficult, and the impact on customers and other businesses, particularly entrepreneurs, can be immense.