[Review] The Filter Bubble

The Filter Bubble by Eli Pariser
My rating: 4 of 5 stars

Though our angles are slightly different, Pariser and I are worried about similar things: how platforms like Google and Facebook are compiling our personal data to create a “theory of identity for each user.” Pariser’s worried about what personalization means for human curiosity, serendipity, and democracy when personalization filters show us more of what we already like. I’m more concerned with how personal identity is profiled and collected on these platforms, what control we have over these profiles, and what sense of ownership (both legally and in terms of a mental model) we have over this data.

Pariser’s aim is noble: he follows up on a personal curiosity to explain to the average user how Facebook and Google’s roles as intermediaries filter our experience of the web. He successfully brings to light features and biases inherent in the algorithms that come between us and the information we seek, features that are not always obvious to the non-tech-savvy user. This is a goal I identify with; it’s something I aim to accomplish in my own research and writing.

But where Pariser falls short is in letting the work of others speak for him. The book reads like a Who’s Who of the non-fiction technology and behavioral psychology bestsellers of the last decade (Lessig, Kelly, Ariely, and he even makes the amateur faux pas of invoking McLuhan). Granted, these guys have contributed a lot that’s worth referencing. But aside from Pariser’s basic premise, I’m afraid he’s doing more summarizing than adding to the discourse. The result is that the book reads as his own filter bubble: the experts’ anecdotes and observations, presented in support of his largely political argument.

Pariser also fails to see his suggestions for improvement through. Though engineers could feasibly develop algorithms that introduce serendipity, for example, he doesn’t go on to speculate how such algorithmic tweaks might be profitable or otherwise beneficial to the platforms, which leaves little chance of their being implemented in practice. While some users might prefer a search platform with such features, it’s difficult to see how user demand alone would create a market that bears these tweaks. Pariser doesn’t reconcile algorithmic biases with the business models they optimize and support.
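To make the critique concrete, here is a minimal sketch, entirely my own and not from the book, of what “introducing serendipity” into a ranking pipeline could look like: with some small probability, a slot in the personalized feed goes to an item outside the user’s inferred interest profile. Every name and parameter here (rerank_with_serendipity, epsilon, the item lists) is hypothetical.

```python
import random

def rerank_with_serendipity(ranked_items, off_profile_items, epsilon=0.1, seed=None):
    """Occasionally swap in items the personalization model would not surface.

    ranked_items: items ordered by the usual personalized relevance score.
    off_profile_items: candidate items outside the user's inferred interests.
    epsilon: probability that any given slot is replaced by an off-profile item.
    """
    rng = random.Random(seed)
    candidates = list(off_profile_items)  # copy, so the caller's list is untouched
    results = []
    for item in ranked_items:
        if candidates and rng.random() < epsilon:
            # Serendipity slot: surface something the filter would normally hide.
            results.append(candidates.pop(rng.randrange(len(candidates))))
        else:
            results.append(item)
    return results

# Example: ten personalized results, three off-profile candidates.
print(rerank_with_serendipity(
    ranked_items=[f"liked_topic_{i}" for i in range(10)],
    off_profile_items=["local_news", "opposing_viewpoint", "new_hobby"],
    epsilon=0.2,
    seed=42,
))
```

Even this toy version makes Pariser’s missing step visible: nothing in it says who pays for the clicks those serendipity slots give up, which is exactly the business-model question he leaves unanswered.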

Pariser also gets into trouble by grounding his arguments in privacy debates. In paragraphs like the following, he starts from a privacy argument and extends it to his own concerns about information consumption:

"Marc Rotenberg, executive director of the Electronic Privacy Information Center, says, ‘We shouldn’t have to accept as a starting point that we can’t have free services on the Internet without major privacy violations.’ And this isn’t just about privacy. It’s also about how our data shapes the content and opportunities we see and don’t see. And it’s about being able to track and manage this constellation of data that represents our lives with the same ease that companies like Acxiom and Facebook already do."

Though these algorithms run on the same data that raises privacy concerns, I fear that these kinds of argumentative extensions conflate concerns in a way that is unproductive and confusing for the average user. He also makes sloppy errors, referring at times to both personal data and private data, two distinctly different categories with, in my mind, different concerns and implications. I struggle with these challenges myself, and too often subtlety is lost when everything gets lumped into the privacy debate.

Ultimately Pariser succeeds at his aim: a call for awareness, advocating for greater public filter literacy. We all need to question and acknowledge personalization’s effects on our information consumption. He does a good job of reaching a broad audience with concise sentences that bring his point home: “If we want to know what the world really looks like, we have to understand how filters shape and skew our view of it.” Now that we’re all more aware of what’s going on, I’m still left with the question: is personalization inherently bad or good? I think Pariser believes it’s not so binary (as indicated by his Kranzberg citation), but for his target audience, this is perhaps too subtle a conclusion.

On a funnier note: I have to thank Pariser for pointing out this glaringly obvious observation I had previously missed: “Page had come up with a novel approach, and with a geeky predilection for puns, he called it PageRank.”

View all my reviews