Among its other disruptive influences, the rise of the web has caused journalism to become detached from the physical objects it used to be embedded in, whether that was a newspaper, magazine or book. Information flows over us like a river now, instead of being chopped up and frozen in time. And that means more than just an aesthetic change in how we consume the news — it means that the apps and devices and platforms we use play an increasingly large role in how we get our information, and therefore so does the design of those services.
Researchers Mike Ananny and Kate Crawford recently published a study looking at this phenomenon, and they spent some time interviewing designers and developers of news and content-curation apps such as Storify, Zite, Google News and Scoopinion. As the two described in a Nieman Lab post about their research, journalists definitely have an obligation or a duty to choose and tell stories ethically, but they are no longer the only ones who have that responsibility:
Today, press ethics are intertwined with platform design ethics, and press freedom is shared with software designers. The people at Facebook, Twitter, Flipboard, Pulse and elsewhere have a new and significant role in how news circulates and what we see on our screens. We’re only just beginning to understand how these companies’ algorithms work and why they matter to the editorial calculations shaping today’s news.
Algorithms are the new editors
As Ananny and Crawford point out, one of the players at the center of this debate is Facebook, since the massive social platform is a source of news for a large number of users — and therefore the algorithms it uses, and the design choices it makes, have a powerful influence on what news users either see or don’t see. The contrast between a filtered and an unfiltered view of the world was brought home during the recent civil unrest in Ferguson, Mo., when Twitter users got a real-time flow of news that many users of Facebook missed out on completely.
Is that Facebook’s fault? Does it have some duty or obligation to deliver the news in an ethical or responsible way, like the newspapers it has said it wants to emulate? The Ananny-Crawford study doesn’t really address those questions, unfortunately, and in fact the researchers didn’t interview anyone at Facebook about its handling of the news — but to me, that is one of the biggest questions implied by the phenomenon they describe.
In many ways, as information scientist Tarleton Gillespie pointed out recently, algorithms have taken the place of editors — or are challenging their role in an increasing number of ways. Instead of unnamed editors deciding what stories to feature on the front page, or who deserves a follow-up piece, algorithms at Facebook (and possibly at Twitter as well) are highlighting the events they see as relevant to users. As sociologist Zeynep Tufekci noted in a post at Medium about Ferguson, this has very real implications for society.
Keep in mind, Ferguson is also a net neutrality issue. It’s also an algorithmic filtering issue. How the internet is run, governed and filtered is a human rights issue. And despite a lot of dismal developments, this fight is far from over, and its enemy is cynicism and dismissal of this reality.
The hard part is that — as many defenders of Twitter’s potential filtering argued — there is so much information flowing around and over us from so many different sources that it is almost impossible for human beings to keep up, let alone focus on all of the things that are important. Algorithms are theoretically the only way to do this effectively, but that is all the more reason why we should be paying close attention to what those algorithms are highlighting and what they are missing.
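To see why those weighting choices matter, consider a deliberately simplified sketch of a relevance filter. This is hypothetical — not any real platform’s algorithm, and the post titles and scores are invented for illustration — but it shows how a single design decision, the weight given to engagement versus editorial importance, determines which stories surface and which effectively vanish:

```python
# A hypothetical, minimal feed-ranking sketch -- NOT any real platform's
# algorithm. It illustrates how one weighting choice decides what users see.

def rank_feed(posts, engagement_weight=0.8, importance_weight=0.2, top_n=2):
    """Score each post as a weighted sum and return the top_n titles."""
    scored = [
        (engagement_weight * p["engagement"] + importance_weight * p["importance"], p)
        for p in posts
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [p["title"] for _, p in scored[:top_n]]

# Invented example data: a viral meme, celebrity news, and a hard-news story.
posts = [
    {"title": "Ice Bucket Challenge video", "engagement": 0.9, "importance": 0.1},
    {"title": "Celebrity wedding photos",   "engagement": 0.8, "importance": 0.2},
    {"title": "Civil unrest in Ferguson",   "engagement": 0.3, "importance": 0.9},
]

# Weighted toward engagement, the hard-news story never makes the feed.
print(rank_feed(posts))
# Flip the weights toward importance and the feed changes entirely.
print(rank_feed(posts, engagement_weight=0.2, importance_weight=0.8))
```

The point is not the arithmetic but the design choice hidden inside it: whoever picks the weights is, in effect, making an editorial judgment.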
Questioning the assumptions behind filtering
Crawford and Ananny talk about how apps like Zite and services like Google News are what they call a “liminal press” — in other words, people and systems that exist outside the traditional media industry, but help to create the conditions that either allow news to circulate or prevent it from doing so. In interviews with senior players at Storify, Zite and other services, they found that most of them see their role as finding the signal amidst the noise:
A senior leader at Storify described the company’s goal as essentially information organization — helping people pull out things they think are important and worth preserving ‘from the noise of all this social media.’
A Zite co-founder said that one of the biggest benefits of an app that learns from users is that it can help show editors what news “should be” instead of what they might think it is. But even this choice has implications — editors at media outlets used to think that part of their job was to tell people things they might not already be interested in, because they were important or necessary. Can an algorithm do that, or will it just avoid those difficult topics?
One AOL designer told the researchers that he and his colleagues don’t really think of what they are doing as being journalism. As he put it: “I’m building an entertainment product. I don’t even consider all the things that you guys are talking about.” If nothing else, maybe the research that Crawford and Ananny and others are doing will help focus more attention on the assumptions that underlie those design decisions, and how (or if) we can change them.