Published! Tow Center Report on Constructive Technology Criticism

I’ve landed back in Singapore after a whirlwind trip filled with family [BABY!], friends [WEDDING!], and some serious business [CONFERENCES AND TALKS!]. Most significantly, last week I released my report on Constructive Technology Criticism with the Tow Center for Digital Journalism, where I’ve been a remote research fellow for the last year or so.

The complete report is online at CJR, or downloadable in ebook formats at GitBook.
The Style Guide for Writing About Technology and Annotated Syllabus are also available on Medium.
I welcome your comments, additions, and further suggestions.

Being Stateside meant I got to pull together some of the people who inspired and motivated the research for a panel discussion in New York. It was a great chance to highlight some of their contributions and insights about how tech coverage and criticism are changing. Here’s the video, featuring all-stars Virginia Heffernan, John Herrman, and Rose Eveleth.

It’s just barely been a week since we published and I’m eagerly watching the response to what turned out to be a pretty lengthy project (30K words, including appendices!). Working largely on my own from Singapore, it was easy to lose touch with the energy that motivated the project. There's nothing like the thrill of pressing 'publish' and getting feedback to motivate further work. On Twitter people are screenshotting a surprising range of quotes and insights from the report and even posting links to it alongside commentary in other languages. I love seeing what resonates, and I’m pleased to see folks are actually reading that deep into the report.

I’m especially excited because friends and colleagues have shared with me how this work connects to their own. I did not expect, for example, that it would speak to my friends’ recent thinking on the middle school English literature canon and pedagogy! I intended to corral a bunch of different threads and ideas together in one place so we could start having a conversation around them. It’s gratifying to see hints that it is already delivering on that potential.

It's all the more personally gratifying because the seeds of this project began as an exercise in soul searching: “What should I call myself? ‘Tech Writer’ doesn’t cut it.” Surrounded by lawyers, academics, documentary filmmakers, and journalists, I struggled to pin down how to introduce myself in those heady September days when I started as a fellow at Berkman in 2013. The soul searching continued the following year as a writing collaboration fell apart, for reasons that seemed to highlight the differences in our approaches. Where there are struggles and uncertainty, there's usually something interesting worth digging into, and my personal struggle led me to explore bigger tensions in the way we write and talk about technology and society at large.

I’m using the next couple of days for jetlag-fueled musings and for lining up my next steps to figure out how this work continues and evolves, in practice or in theory. Absentee ballots have been sent. Back in steamy Singapore I'm already missing fall, but I'm grateful for the crisp taste of it and the burst of energy I got on this trip.

The Art of Tech Criticism

I have followed the internet, and Virginia Heffernan’s career documenting it, for more than a decade. I have anxiously awaited this book since her column first hinted at magic and loss in 2011, and since she previewed her personal journey with the internet at length in her 2012 Berkman Center talk, “The Digital Dialectic.” When I graduated in 2007 with a joint degree in English Literature and Film Studies, I looked to Heffernan’s column as confirmation that the shifts in digital culture I cared about mattered, and that the humanities had something to bring to the conversation as we figured out, together, what the internet meant. Like Heffernan, I was drawn to the Berkman Center for Internet & Society, co-founded by Jonathan Zittrain, where people were thinking about what this internet thing meant to the world. Full disclosure: I've always kind of envied her job. So here's my review of Heffernan's Magic and Loss, and why it matters to expand our notion of what technology criticism is and what it can do for us.

As we begin to take the internet for granted, it’s more important than ever to recognize the need for robust and diverse technology criticism. We grapple with which metrics we should use to judge Facebook’s integrity in serving us, as a social platform or as a journalistic entity. Wealthy Silicon Valley VCs with a grudge can ruin entire publications by throwing their weight behind lawsuits. Publishers tiptoe around criticizing tech companies because social platforms and newsfeeds control access to their audiences. We need to stop seeing technology criticism as destructive; rather, it gives us the opportunity to shape the future of technology in our everyday lives. Heffernan’s nuanced example in Magic and Loss expands the notion of what technology criticism can and should be.

Read more at Columbia Journalism Review.

Data and Algorithms IRL

This past week was a whirlwind of speaking and conference excitement. On five out of seven days I was speaking on, sitting on, or moderating panels in Cambridge, DC, and New York. I figured it was worth a recap with links and videos.

First, I had the honor of sharing the stage at Berkman with Bruce Schneier, Joe Nye, Melissa Hathaway, and Yochai Benkler, moderated by Jonathan Zittrain. We discussed Bruce’s New York Times best-selling book, Data and Goliath. One idea that came out of the discussion was that we need a data broker whistleblower, akin to Snowden, to reveal more about the industry’s practices and expose the more egregious uses of our data.

Next, I was in DC talking about mobile health data at the New America Foundation’s Our Data, Our Health event, apparently inspired by an article I wrote for Slate’s Future Tense a while ago.

Friday in New York, I was invited to participate in an Algorithmic Transparency in the Media workshop at the Tow Center for Digital Journalism. The day focused on how algorithms are used in the newsroom—to generate articles, to recommend content to readers and enhance curation, and to advocate for change using predictive modeling to tell stories.

My favorite, and most out-of-the-ordinary, event was speaking at the New Museum during the 2015 Triennial: Surround Audience, using DIS's The Island (Ken) installation as my stage. I discussed my article about data metaphors, first published in DIS Magazine’s data issue.

And yeah, I also made stock while I was speaking at the fully functional kitchen island. I'm not sure I completely understand my role in contemporary art... Someone in attendance read my piece and turned it into this awesome visualization of the argument:

Map visualization: “Data is the New ____”

Back in Cambridge on Monday night, I had the pleasure of moderating a panel on The Political Startup for Harvard Ventures, a student-run group. We had a wonderful set of panelists, all working to make government data more accessible and usable.

The Political Startup, Harvard Ventures

Reading Dada Data and the Internet of Paternalistic Things on Radio Berkman

Dan Jones, audio producer extraordinaire, pulled together interviews with authors who contributed to this year’s Berkman Center Internet Monitor report, including me. I got a chance to read my speculative fiction piece about the internet of paternalistic things, and I had a great conversation with Dan about some of the inspiration behind the story. Give it a listen—my section starts around 33:00, but the whole podcast is really worth listening to.

“But Ferguson was Trending in my Feed”

This essay appeared in the Berkman Center's Youth and Media essay collection, “Youth and Online News: Reflections and Perspectives,” which is available for download through SSRN. My contribution is among a great set of pieces offering insightful, thought-provoking, and out-of-the-box reflections at the intersection of news, digital media, and youth.

I was at a journalism conference recently where the topic of algorithmic curation came up. One of the speakers cited the comparison between Ferguson trending on Twitter while the Ice Bucket Challenge was all the rage on Facebook. It was held up as an example of how platforms shape the news and the sharing behaviors of their users.

One student in the audience raised her hand and contested the premise that Ferguson hadn’t trended on Facebook. She was originally from St. Louis, and all her friends from home had been talking about it: about race, about police violence, about protests. Ferguson was all over her Facebook newsfeed.

The discrepancy provided an illustrative moment. On the one hand, opinion and data had made claims about how platforms’ algorithmic filtering practices affect access to news on Facebook. On the other hand, a personal experience of the same news event had differed drastically from the larger collective narrative about how news spreads online and how politically sensitive topics are discussed within youth peer networks on Facebook.

That one student, away from home at school in Milwaukee, hadn’t felt distant from the events in Ferguson. She was deep in it in her feed. The news was blowing up within her situated sphere of influence. This is how she experienced Ferguson.

Still, she had a hard time conceiving how Ferguson hadn’t made it into the feeds of others on Facebook. She contested the speaker’s claim with her own situated, personal experience of algorithmic curation.

Digital Literacy in Context

The greatest challenge we face in addressing the technical platforms that shape our information experiences is demonstrating the relationship between inputs and outputs in these systems. Just as news literacy aims to develop skills to “understand a source’s agendas, motivations and backgrounds,” digital literacy needs to do the same for the platforms: their business models and their motivations for providing value to consumers. We need tools that not only build diversity and counteract homophily, but also introduce us to the underlying editorial structures of these novel information platforms.

Digital news literacy ought to be taught by example and in context. Youth need to understand how algorithms affect their unique experience, not just how they influence everyone’s experience abstractly and in principle. We need more tools that allow youth to interact with the algorithm and see the micro effects of subtle changes to various inputs, like who they follow, which posts they comment on or re-share, and what they like and click through.
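
To make that concrete, here is a minimal toy sketch in Python of how a handful of personal inputs could change what rises to the top of a feed. The signals, weights, and account names are invented for illustration; this is not any platform’s actual ranking algorithm, just a way to see inputs and outputs side by side.

```python
# Toy feed ranking driven by two personal inputs: who you follow and which
# topics you have engaged with. All values here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str

def score(post: Post, follows: set, liked_topics: dict) -> float:
    """Higher score = shown earlier. Combines a follow bonus and an engagement bonus."""
    follow_bonus = 2.0 if post.author in follows else 0.0
    engagement_bonus = 0.5 * liked_topics.get(post.topic, 0)
    return follow_bonus + engagement_bonus

def rank(posts, follows, liked_topics):
    return sorted(posts, key=lambda p: score(p, follows, liked_topics), reverse=True)

posts = [Post("stl_friend", "ferguson"), Post("brand_page", "ice bucket challenge")]

# Two readers see the same posts but in different orders, because their inputs differ.
st_louis_reader = rank(posts, follows={"stl_friend"}, liked_topics={"ferguson": 4})
other_reader = rank(posts, follows={"brand_page"}, liked_topics={"ice bucket challenge": 3})
print([p.topic for p in st_louis_reader])  # ['ferguson', 'ice bucket challenge']
print([p.topic for p in other_reader])     # ['ice bucket challenge', 'ferguson']
```

A tool built in this spirit would let a young reader flip one input at a time and watch the ordering change, which is exactly the kind of micro effect that stays invisible in everyday use.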

Tools like Floodwatch’s ad tracking database allow us to compare our personal experience to that of others with a shared demographic profile. We could still use more technical interventions to help show variation in personalization.

What can youth learn about the way technical platforms work by comparing and contrasting the trending topics they see on Facebook and Twitter with peers in their network, and with others outside their network? What will they learn about what newsworthiness is in these personalized contexts?
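
As a hypothetical version of that compare-and-contrast exercise, here is a small sketch of what juxtaposing trending lists might look like; the topic lists are made up for the example and stand in for whatever students actually see.

```python
# Hypothetical trending lists from three vantage points, for a classroom exercise.
my_trends = {"ferguson", "ice bucket challenge", "fall fashion"}
peer_trends = {"ferguson", "police reform", "ice bucket challenge"}
outside_network_trends = {"celebrity breakup", "fall fashion", "sports trade"}

print("Shared with a peer in my network:", my_trends & peer_trends)
print("Missing from my feed entirely:", outside_network_trends - my_trends)
```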

If we take into account the personal, contextual experience of youth in teaching news literacy, we can help them to understand their place in a larger civic discourse around news and access to information by making it grounded, personal and real in the contexts where they get information today.

Ethnography in Youth and Media Research

News literacy goes beyond the sources youth get information from and how social media shapes their filter bubbles. It’s also about developing algorithmic literacy: understanding the curatorial and editorial role of the platforms they interact with in their media environments.

Ethnographic interview work has vastly expanded our understanding of youth media practices by meeting them where they are and elevating their voices and concerns. Youth news experiences are inherently personalized now, and research methods for understanding those technical experiences must be as well.

Ethnography in Technology Journalism

Ethnographic approaches to knowledge and experience of algorithms should also extend to the media outlets covering our evolving relationship with technology. Journalists can play a role in developing their audiences’ digital literacy around access to information by paying attention to and covering grounded, individual interactions with these systems.

That has been my methodological approach to “Living with Data,” the series I developed for Al Jazeera America. In it I examine encounters that illustrate our personal, situated experience of these tools, following reader submissions about how we expect these systems to work, how they should work, and what is actually happening technically. The series aims to teach critical digital literacy through examples.

In part, the series was designed to refute the common “I have nothing to hide” argument and the notion that privacy concerns are too abstract for people to grasp their effects. My aim is to illustrate through real experiences how autonomy and privacy are influenced by the sociotechnical systems that govern our access to information. A mission to develop critical digital literacies becomes especially important for a generation that takes Facebook and other social media platforms for granted.

This grounded approach makes the harms, or the surprises, of data more personal and more relatable. So while your experience may be very different from mine, I can begin to understand the inner workings of these algorithmic curatorial decisions because I can grasp the effects at a personal scale. I can compare my experience of Ferguson on Facebook against everyone else’s experience of the Ice Bucket Challenge.

Grounding coverage of these technical stories not only makes technical subjects more accessible, but also helps make the individual stakes more present and clear.