Why Algorithmic Accountability Is Needed Now

On August 20, Dick Costolo, Twitter's CEO, announced that his company would suspend all accounts that shared the gruesome footage of the beheading of journalist James Foley. The internet cried censorship, while news organizations argued that Twitter's CEO had exercised standard editorial rights. The Guardian stated that for "the first time, Twitter acknowledged it was a platform that exercises editorial judgment."

Twitter, fundamentally, is a company that monetizes the distribution of content. Its model is the same as that of Facebook, Google, The Guardian and the New York Times. But as of August 20, Twitter has more in common with The Guardian and the New York Times than it does with Facebook or Google, because it now discloses its editorial judgments.

Facebook and Google, on the other hand, operate in black boxes, shrouded by advanced algorithms that nudge us to consume ever more information and generate higher ad revenues. These companies disguise their lack of accountability by arguing that they know us better than we know ourselves. They can predict the news we'd like to see better than we could pick it ourselves, or so they say.

This approach might work for now, but these companies should practice what they preach in their PR materials: openness and accountability. Doing so is easy and would do no material harm to their businesses. If anything, it would ignite the same debate that Dick Costolo's announcement did and force us to think harder about the types of information we consume and the methods by which we receive it.

Ethics in the Algorithm Age

There are three ways that Facebook, Google, and other black-box, algorithm-powered content distribution services can deliver that openness and accountability.

First, they can explain the black-box process. Even the most complicated filtering techniques rest on reasoning about why one approach is superior to another. That explanation can live in the help section, or perhaps next to the privacy settings. As long as it exists somewhere easily findable and readable (which means not in the Terms of Service), people will have little ground to complain about a lack of transparency.
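As a rough sketch of what such an explanation could look like in practice, the hypothetical TypeScript below attaches a plain-language rationale to each ranked item. None of this reflects any real Facebook or Google API; the type names, signals, and weights are invented purely for illustration.

```typescript
// Hypothetical sketch only: these types and signals are invented,
// not any platform's actual ranking API.
interface RankingRationale {
  signal: string;      // e.g. "friend interaction", "topic affinity"
  weight: number;      // relative contribution, 0..1
  explanation: string; // plain-language reason shown to the user
}

interface RankedItem {
  itemId: string;
  score: number;
  rationale: RankingRationale[]; // the "why am I seeing this?" record
}

// Render a readable summary a user could find in the help section
// or beside the privacy settings.
function explainRanking(item: RankedItem): string {
  return [...item.rationale]
    .sort((a, b) => b.weight - a.weight)
    .map((r) => `- ${r.explanation} (weight ${r.weight.toFixed(2)})`)
    .join("\n");
}
```

The point is not the specific shape of the data but that a rationale exists in machine-readable form at all, so it can be surfaced anywhere in the product rather than buried in legal text.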

Second, and perhaps more radical, they can give users the ability to toggle between a Stream and a Feed in a central place. This could be a button in the top right corner of the Facebook app, for example. Users would then be able to switch between two different forms of information: one purely unfiltered (Stream) and another highly curated (Feed).
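A minimal sketch of what that toggle implies on the back end, assuming a hypothetical post type and a platform-supplied relevance score (nothing here is Facebook's actual code):

```typescript
// Hypothetical sketch: "stream" is pure reverse-chronology,
// "feed" is whatever opaque score the platform assigns.
type ViewMode = "stream" | "feed";

interface Post {
  id: string;
  createdAt: Date;
  rankingScore: number; // the platform's opaque relevance score
}

function buildTimeline(posts: Post[], mode: ViewMode): Post[] {
  if (mode === "stream") {
    // Unfiltered: newest first, nothing hidden, nothing reordered.
    return [...posts].sort(
      (a, b) => b.createdAt.getTime() - a.createdAt.getTime()
    );
  }
  // Curated: the algorithmically ranked feed users see today.
  return [...posts].sort((a, b) => b.rankingScore - a.rankingScore);
}
```

The engineering cost of offering both views is trivial; the real change is the product decision to let users see the unfiltered version at all.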

Third, companies should call out their own censorship and publish their reasons for hiding certain pieces of information. Again, this does not need to be buried in the Terms of Service; it can live on the company blog. Facebook, Twitter, Google and others have a meaningful impact on the types of content we consume, and they should be held accountable for their editorial decisions. If the New York Times or The Guardian decided not to publish an article out of concern for how readers would perceive their brand, the editor would be expected to explain the decision in an op-ed. Other news outlets would pick it up and ignite the requisite debate.
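To make that third point concrete, a published takedown notice could be as simple as a structured public record. The shape below is an invented example, not any company's real format:

```typescript
// Hypothetical sketch: a public record for each editorial removal.
interface TakedownNotice {
  contentId: string;
  removedAt: Date;
  policyViolated: string; // e.g. "graphic violence"
  reasoning: string;      // plain-language justification, published openly
  appealUrl: string;      // where the author can contest the decision
}

// An illustrative entry; the IDs and URL are placeholders.
const example: TakedownNotice = {
  contentId: "post-12345",
  removedAt: new Date("2014-08-20"),
  policyViolated: "graphic violence",
  reasoning:
    "Removed imagery of a deceased person; see the accompanying blog post.",
  appealUrl: "https://example.com/appeals/post-12345",
};
```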

Media companies like Facebook, LinkedIn, Twitter and Google should be held to the same rigorous standard to which we hold our news organizations. After all, they have become our news sources. To meet it, they can explain their black-box processes, give users an explicit way to opt out, and justify their controversial decisions, just like a traditional news organization. They should not be allowed to hide behind the language of technology, using algorithms as an excuse for a lack of transparency. The internet is better than that. We are better than that.
