The Art of Tech Criticism

I have followed the internet, and Virginia Heffernan’s career documenting it, for more than a decade. And I have anxiously awaited this book since her column first hinted at magic and loss in 2011, and since she previewed her personal journey with the internet at length in her 2012 Berkman Center talk, “The Digital Dialectic.” When I graduated in 2007 with a joint degree in English Literature and Film Studies, I looked to Heffernan’s column as confirmation that the shifts in digital culture I cared about mattered, and that the humanities had something to bring to the conversation as we figured out, together, what the internet meant. Like Heffernan, I was drawn to the Berkman Center for Internet & Society, co-founded by Jonathan Zittrain, where people were thinking about what this internet thing meant to the world. Full disclosure: I’ve always kind of envied her job. So here’s my review of Heffernan’s Magic and Loss, and why it matters to expand our notion of what technology criticism is and what it can do for us.

As we begin to take the internet for granted, it’s more important than ever to recognize the need for robust and diverse technology criticism. We grapple with which metrics we should use to judge Facebook’s integrity in serving us, as a social platform or as a journalistic entity. Wealthy Silicon Valley VCs with a grudge can ruin entire publications by throwing their weight behind lawsuits. Publishers tiptoe around criticizing tech companies because social platforms and newsfeeds control access to their audiences. We need to stop seeing technology criticism as destructive; rather, it gives us the opportunity to shape the future of technology in our everyday lives. Heffernan’s nuanced example in Magic and Loss expands the notion of what technology criticism can and should be.

Read more at Columbia Journalism Review.

Annihilating Time and Space: Reading River of Shadows

It has been a crazy couple of weeks. I was running on full steam wrapping up my thesis through July 22, and then went straight into cleaning-packing-moving mode moments after my return from the Exam Schools. And even after we got nearly all the unpacking done at the end of last weekend (save for the boxes of artwork), I still felt a little brain dead this past week. It was starting to get frustrating, because I wanted desperately to get into the swing of things, to get caught up on thesis follow-ups and the news I had missed. And more than anything, I was eager to get started in earnest on the book. But I just couldn’t get my head back in the game. I found myself wandering from coffeeshop to public library trying to find a comfortable place to reengage my brain. My sense of space and time was all out of whack. 

On Wednesday night I picked up Rebecca Solnit’s River of Shadows: Eadweard Muybridge and the Technological Wild West, and it was just the cure for this transition/adjustment malaise. There were so many things about this book that made it just the right thing to read at this very moment, and for that I’m thankful. I had first come across it when I saw that Tech Book Club had read it, and again when I bought my mom a copy of Wanderlust: A History of Walking, but I was most recently reminded of it by an essay of Solnit’s I’d saved to Instapaper from a tweet.

In the most basic sense, I had been interested in the book because it covers the life and historical context of Eadweard Muybridge, who essentially developed the means to take photographs at rapid shutter speeds and thus paved the way for modern cinema to capture moving images. His motion studies of horses and human bodies revealed novel detail that had previously been inaccessible to the naked eye: at a gallop, all four of a horse’s hooves do, in fact, leave the ground. Given my background in film studies, I enjoyed digging into the founding story of this period of photography and early cinema. We have a lovely print of Muybridge’s dancing couple, picked up from 20x200, that we displayed at our wedding (vintage hipster gag, I know). I had a soft spot for the man and his work already, and was eager to learn more.


What I wasn’t expecting to find in this book were all the connections to my recent work on the Quantified Self. In many ways, Muybridge was dissecting the motion of human bodies, revealing objective, abstracted detail about the body’s movement in much the same way that sensors now enable us to measure activity and movement in even greater detail. Muybridge froze time to show the patterns in a walker’s gait. Now, instead of freezing time, we’re collecting data all the time, with sensors that track our gait throughout the day and monitor our movement while we sleep. Data now does what celluloid did then, parsing information into smaller and smaller knowable units. But it’s also sometimes uncanny: “Those gestures—a gymnast turning a somersault in mid air, a nude pouring water—were unfamiliar and eerie stopped because they showed what had always been present but never seen.” As Solnit puts it: “With the motion studies that resulted it was as though he were returning bodies themselves to those who craved them—not bodies as they might daily be experienced, bodies as sensations of gravity, fatigue, strength, pleasure, but bodies become weightless images, bodies dissected and reconstructed by light and machine and fantasy.” I relished the opportunity to connect those dots in my own intellectual history, from film history to internet studies, in a new way in reading this book.


Solnit’s book is about a man, an innovator, but it is also about a place in time. Solnit writes a lot about landscape, the West, San Francisco. I’ve been thinking a lot about the impact that the ideology and the lifestyle of the makers of technology have on its design and adoption in broader contexts, especially now as it enters the intimate realm of our bodies and our minds. I loved the rich descriptions and sweeping connections Solnit makes from the early mining days of San Francisco directly to the emergence of the silicon industries, all happening on the same soil. It sharpened a hankering I’ve been having to spend more time in the Valley, if only to get something of an ethnographic understanding of the contexts and circumstances in which our technologies are built. She describes the period in which Muybridge was working with such energy and drama that it made me want to be that much closer to the history that’s happening now.

Solnit has a knack for drawing out these sweeping connections. She does it with such finesse that you don’t want to try to poke holes. She’s connecting a lot of dots, and doing a fine job of telling you precisely why a technological innovation has turned out to be really important. I like the way her mind works, pulling threads together across time and space. I try to do that in my own work. It’s the work of an interdisciplinarian: the railroad and the cinema and Silicon Valley all begin to make sense together if you are looking at the right pieces.

Reading this got me thinking about what we’re trying to do in if/then. Solnit weaves a story about the rippling effects of converging technologies on the way we see and experience the world. Our book will weave a story about the rippling effects of current technologies, drawing out these connections and pointing to where these moments of change are happening around us right now. The only difference is the clarity and confidence that hindsight affords. I want to write about the near future in the way that Solnit writes about the past.

Solnit writes about how the Victorians worried about losing a sense of place and time, of embodiment: “It is as though the Victorians were striving to recover the sense of place they had lost when their lives accelerated, when they became disembodied. They craved landscape and nature with an anxious intensity no one has had before or since.” We’re still worried about what is lost and what is gained when technology changes our perceptive abilities. The Victorians grappled with the dichotomy between the natural and the technologically mediated worlds. Seeing that dichotomy spelled out so clearly in this book made it all the more clear to me that we’re getting closer and closer to the collapse of these binaries.

[Review] The Intention Economy: When Customers Take Charge

The Intention Economy: When Customers Take Charge by Doc Searls
My rating: 3 of 5 stars

I was really excited to read The Intention Economy because it is one of the first efforts I’ve seen to extend popular privacy concerns into the realm of the economics of personal data and user empowerment. I’ve been pursuing these ideas myself, talking about the nature of our transactional relationships with internet services and companies in terms of “paying with our data,” and we’ve seen early signs of this thinking in the World Economic Forum’s discussion of personal data as a new asset class.

Searls’s strongest explanation comes in describing the underlying problem in the existing internet economy that gives companies power over how our data is used: it’s all based on those boilerplate, impossible-to-parse terms of service that operate as “contracts of adhesion.” The contract of adhesion is an old idea, introduced by Friedrich Kessler to describe agreements in which bargaining power sits heavily with the company. These contracts exist to allow for mass production and consumption at scale, but they get in the way of the principle of “freedom of contract.” So how do we put users on a more level [contractual] playing field with companies?

Searls proposes that control over the terms of data use needs to lie with the user, and that fourth-party data brokers (also called data lockers) could negotiate and manage our terms and our data on our behalf, sharing them with the services that need access. Terms might include things like “If we cease our relationship, you can keep my data but not associate any PII with that data.” While customer relationship management (CRM) platforms have been around for more than ten years to help companies manage interactions with and data about customers, Searls suggests that we’ve been missing the mirrored pair on the customer side of the relationship, which he’s dubbed vendor relationship management (VRM).
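To make that mechanism a little more concrete, here is a minimal sketch of my own (the names and structure are hypothetical, not taken from Searls’s book or any actual VRM project) of how a user-asserted term might be expressed in machine-readable form and checked by a broker against a vendor’s stated policy:

```python
# Purely illustrative: a user-asserted data-use term and a broker-side check.
# Nothing here comes from Searls's book or a real VRM implementation.
from dataclasses import dataclass


@dataclass
class DataUseTerm:
    purpose: str                 # what the vendor may use the data for
    retention_after_exit: bool   # may the vendor keep the data if we part ways?
    allow_pii_linkage: bool      # may retained data stay linked to my identity?


# "If we cease our relationship, you can keep my data
#  but not associate any PII with that data."
my_term = DataUseTerm(purpose="order fulfillment",
                      retention_after_exit=True,
                      allow_pii_linkage=False)


def vendor_accepts(term: DataUseTerm, vendor_policy: dict) -> bool:
    """Does the vendor's stated policy satisfy the user's term?"""
    return (vendor_policy.get("retains_data", False) <= term.retention_after_exit
            and vendor_policy.get("links_pii", False) <= term.allow_pii_linkage)


print(vendor_accepts(my_term, {"retains_data": True, "links_pii": False}))  # True
print(vendor_accepts(my_term, {"retains_data": True, "links_pii": True}))   # False
```

The point of the sketch isn’t the code; it’s that customer-side terms would have to be this legible and this checkable before a fourth party could negotiate them at scale.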

While I’m excited about shifting the dynamic of predominant economic models as they relate to data, Searls’s book falls short in a few key ways:

  • It’s too optimistic. Searls doesn’t address or even acknowledge the limitations and challenges (most obviously, inertia) that would keep The Intention Economy from being realized. It’s too easy to punch holes.
  • Searls puts too much focus on, and faith in, the tools. He says, “We didn’t need a car, a copier, a radio, or a smartphone until we saw one and said to ourselves, ‘I need that.’ … So, then, customer liberation requires necessity-mothering inventions.” I tend to disagree. While developing tools to enable these markets is important, the bigger and prior challenge is making customers aware of the problem and of an alternative model. One audience member at Searls’s book talk asked: “Customers are lazy. What’s going to make them want to do anything about this?” Sure, having tools that make this as easy and seamless as possible will make adoption go a lot more smoothly, but there won’t be a market unless enough consumers see this as a clear problem. The popular media’s coverage of privacy concerns over the last year has certainly raised the level of discourse, but it’s mostly been focused on fear-mongering. We haven’t seen much in the way of alternative models for moving forward, and that’s what conscientious consumers should pick up from this book.
  • Searls doesn’t address the elephant in the room: why should we trust fourth parties with our data any more than we trust Facebook or Target? Might the fourth party managing our data for us just end up being one of the companies that already holds a lot of it? Will Facebook become our data broker, just as it’s already our identity manager (given all the startups using Facebook for signup and login)? That possibility seems more likely to me, given the concentration of users already captured on these platforms. It’s harder to imagine one of the current data-locker startups rising to the top to come between us and Facebook (unless, of course, it got bought by Facebook). But even for Facebook to become our data broker, the dominant economy of the internet would have to shift away from advertising to create something that looks a little like Searls’s Intention Economy. He doesn’t make this argument, though, since he’s so invested in those VRM startups.


Aside from these more glaring problems, Searls’s writing is full of awkward, imperfect metaphors, like calf-cow relationships (“The World Wide Web has become a World Wide Ranch, where we serve as calves to Web sites’ cows, which feed us milk and cookies.”) and Chinese walls between advertisers and consumers (“If advertisers would peek over on our side of the Chinese wall, they would see two icebergs toward which TV’s Titanic is headed, and both promise less tolerance for advertising.”).

All that being said, The Intention Economy is bold, and it’s hard to cover all the bases when your arguments are so radical. But it’s also hard to convince others to rally around the cause if there are too many questions left unanswered.

View all my reviews

[Review] Rule 34

Rule 34 by Charles Stross
My rating: 4 of 5 stars

Stross imaginatively takes present-day technology and brings it to its natural manifestation in the near-future world of the book: behavioral targeting, maker culture, big-data analysis, ubiquitous wifi and its effects on privacy, mechanical-turk-style crowdsourced police work. All of it’s totally plausible, and all of it feels insidiously close. Circumstances that make for an exciting crime narrative end up giving present-day readers pause; Stross is dealing in the socio-technical implications of these technologies.

Stross also manages to draw out these implications by inflicting these technologies on the common man. His police detective isn’t an early-adopter, tech-savvy gadget freak; she’s using the tools that have made their way into her chosen line of work. Watching the effects of technology on the common man gives weight to his extrapolations.

He also successfully plays with the failures of over-hyped technologies (AI, the persistence of spam, video conferencing): “Working teleconferencing is right around the corner, just like food pills, the flying car, and energy too cheap to meter.”

At first I struggled with the second-person narrative voice. It felt gimmicky, and hovered weirdly between feeling privy to a first-person internal monologue and an omniscient narrator hopping between characters’ consciousnesses across chapters. But Stross’s experimental voice reveals its purpose with hints of agency and ownership in passages like this:

(If you’re one of the piece-workers in a mechanical turk—or one of the rewrite rules inside Searle’s Chinese room—the overall pattern of the job may be indiscernible, lost in an opaque blur of seemingly random subtasks. And if you’re one of the detectives on a murder case, your immediate job—determining who last repaired a defective vacuum cleaner—may seem equally inexplicable. But there’s method in my motion, as you’ll learn for yourself.)

The effect was strangely satisfying once its purpose was revealed (but I won’t go so far as to give that away).

I walked away from this book with a reinforced sense that I’m asking the right questions about technology and data and their impacts on society. Stross makes vivid some of the worst-case uses of contemporary tech, and while somewhat alarmist, it’s helpful to work through these near-future extrapolations to get on the right side of things today.

View all my reviews

[Review] The Filter Bubble

The Filter Bubble by Eli Pariser
My rating: 4 of 5 stars

Though our angles are slightly different, Pariser and I are worried about similar things: how platforms like Google and Facebook are compiling our personal data to create a “theory of identity for each user.” Pariser’s worried about what personalization means for human curiosity, serendipity, and democracy when personalization filters show us more of what we already like. I’m more concerned with how personal identity is profiled and collected on these platforms, what control we have over these profiles, and what sense of ownership (both legally and in terms of a mental model) we have over this data.

Pariser’s aim is noble: he follows up on a personal curiosity to explain to the average user how Facebook’s and Google’s roles as intermediaries filter our experience of the web. He successfully brings to light some of the features and biases, not always obvious to the non-tech-savvy user, inherent in the algorithms that come between us and the information we seek. This is a goal I identify with; it’s something I aim to accomplish in my own research and writing.

But where Pariser falls short is in letting the work of others speak for him. The book reads like a Who’s Who of the non-fiction technology and behavioral-psychology best sellers of the last decade (Lessig, Kelly, Ariely, and he even makes the amateur faux pas of invoking McLuhan). Granted, these guys have contributed a lot that’s worth referencing. But aside from Pariser’s basic premise, I’m afraid he’s doing more summarizing than adding to the discourse. The result is that the book reads as his own filter bubble: the experts’ anecdotes and observations, presented in support of his largely political argument.

Pariser also fails to see his suggestions for improvement through. Though engineers could feasibly develop algorithms that introduce serendipity, for example, he doesn’t go on to speculate how such algorithmic tweaks might be profitable or otherwise beneficial to the platforms, which leaves little chance of their being practically implemented. While some users might prefer a search platform with such features, it’s difficult to see how user demand alone would push the market to bear these tweaks. Pariser doesn’t reconcile algorithmic biases with the business models they optimize and support.
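The tweak itself isn’t the hard part. Here is a minimal, purely illustrative sketch of my own (not anything Pariser or the platforms describe) of how a personalized ranking could reserve a few slots for items a user’s profile would otherwise bury:

```python
import random


def rank_with_serendipity(candidates, relevance, serendipity_rate=0.2, seed=None):
    """Sort items by personalized relevance, then interleave a few 'wildcards'
    drawn from the low-relevance half of the pool so they actually get seen.
    Hypothetical example code, not any real platform's ranking function."""
    rng = random.Random(seed)
    by_score = sorted(candidates, key=lambda item: relevance.get(item, 0.0), reverse=True)

    n_wild = max(1, int(len(by_score) * serendipity_rate))
    tail = by_score[len(by_score) // 2:]           # items personalization would bury
    wildcards = rng.sample(tail, min(n_wild, len(tail)))

    head = [item for item in by_score if item not in wildcards]
    results = []
    for i, item in enumerate(head):
        results.append(item)
        if wildcards and i % 4 == 3:               # every fourth slot, surface a wildcard
            results.append(wildcards.pop(0))
    return results + wildcards


scores = {"a": 0.9, "b": 0.8, "c": 0.7, "d": 0.3, "e": 0.1, "f": 0.05}
print(rank_with_serendipity(list(scores), scores, seed=1))
```

What’s missing from the book is the step after this one: why a platform whose revenue depends on engagement would ever turn the serendipity knob up.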

Pariser also gets into trouble by basing his arguments in privacy debates. In paragraphs like the following, he starts from a privacy argument and extends it to his own information consumption concern:

"Marc Rotenberg, executive director of the Electronic Privacy Information Center, says, ‘We shouldn’t have to accept as a starting point that we can’t have free services on the Internet without major privacy violations.’ And this isn’t just about privacy. It’s also about how our data shapes the content and opportunities we see and don’t see. And it’s about being able to track and manage this constellation of data that represents our lives with the same ease that companies like Acxiom and Facebook already do."

Though these algorithms run on the same data that is as much a concern for privacy as it is for personalization, I fear that these kinds of argumentative extensions conflate concerns in an unproductive and confusing way for the average user. He also makes sloppy errors, referring at times to personal data and at times to private data, two distinctly different categories with different concerns and implications in my mind. I struggle with these challenges myself, and too often subtlety can be lost when you get lumped into the privacy debate.

Ultimately Pariser succeeds at his aim: a call for awareness, advocating for greater public filter literacy. We all need to question and acknowledge personalization’s effects on our information consumption. He does a good job of reaching a broad audience with concise sentences that bring his point home: “If we want to know what the world really looks like, we have to understand how filters shape and skew our view of it.” Now that we’re all more aware of what’s going on, I’m still left with the question: is personalization inherently bad or good? I think Pariser believes it’s not so binary (as indicated by his Kranzberg citation), but for his target audience, this is perhaps too subtle a conclusion.

On a funnier note: I have to thank Pariser for pointing out this glaringly obvious observation I had previously missed: “Page had come up with a novel approach, and with a geeky predilection for puns, he called it PageRank.”

View all my reviews