At the top of my summer reading list is The Filter Bubble, Eli Pariser’s new book that argues that the filters we rely on to make sense of the online world can do us as much harm as good.
While the book relies on familiar notions about the perils of the echo chamber, it uses those ideas as a starting point rather than an end point, focusing on the algorithmic implications of all the echoing. One of the most intriguing aspects of Pariser’s argument is his exploration of the automation of preference — through the increasing influence of the Like button, through Google’s desire to make its results “relevant,” through various news orgs’ recommendation engines, and, ultimately, through media companies’ economic mandate to please, rather than provoke, their audiences.
That last one isn’t new, of course; news organizations have always navigated the tension between the need to know and the want to know in the information they serve to their readers. What is new, though, is that audiences’ wants now have data to back them up; they can be analyzed, tailored, and otherwise catered to with ever-increasing precision. Audiences’ needs, on the other hand, are as nebulous as they’ve ever been. But they are no less urgent.
...And I think we need people who are smart about journalism to be thinking about how we import a lot of the implicit things that a front page does, or that a well-edited newspaper does — how do we import that into these algorithms that are going to affect how a lot of people experience the world?