Metaphors are the new...

A couple of weeks ago, I got to talk with the fellows and gathered friends of Data & Society about some of the work I've been doing on the metaphors we use to talk about data. It's an interest that grew out of my thesis research on how the Quantified Self community talks about its relationship to personal data, and that continues in a piece I wrote for DIS Magazine's Data Issue this year.

Here's the video on YouTube. Stay for the discussion with the audience in the second half (but just listen to the audio!). We ended up focusing on how it's not just data that we need metaphors for, but also the processes and systems that use and make sense of the data. We have just as much, if not more, trouble grasping the concept of algorithms and how they work.

Huge thanks to Tim Hwang for inviting me to come down to New York and to Data & Society for providing the space and minds for a great conversation. I'm hoping that this work will continue to shape the public discourse we're using to talk about data issues going forward, especially as those framings shape policy and law, and more importantly, the way we all think about our stakes in our data.

In Good Company

I got pretty excited this week when people I admire and respect cited my recent articles about data science, Facebook, and the uncanny. Beyond the not-so-humble brag, I'm more excited by the growing chorus of voices calling for accountability and ethical approaches to our data and its uses. And I was even more excited to overhear older family members at a friend's wedding this past weekend discussing the Facebook study over breakfast. I think we're starting to get somewhere.

Om Malik, who has supported some of the most extensive industry coverage of data at GigaOM, wrote this week about Silicon Valley's collective responsibility to use its power wisely:

While many of the technologies will indeed make it easier for us to live in the future, but what about the side effects and the impacts of these technologies on our society, its fabric and the economy at large. It is rather irresponsible that we are not pushing back by asking tougher questions from companies that are likely to dominate our future, because if we don’t, we will fail to have a proper public discourse, and will deserve the bleak future we fear the most... Silicon Valley and the companies that control the future need to step back and become self-accountable, and develop a moral imperative. My good friend and a Stanford D.School professor Reilly Brennan points out that it is all about consumer trust. Read more.

And danah boyd, who is starting up the Data & Society research institute, summed up what we've learned from the Facebook emotional contagion study, echoing my point that it's not just about the Facebook study, it's about the data practices:

This paper provided ammunition for people’s anger because it’s so hard to talk about harm in the abstract... I’m glad this study has prompted an intense debate among scholars and the public, but I fear it’s turned into a simplistic attack on Facebook over this particular study, rather than a nuanced debate over how we create meaningful ethical oversight in research and practice. The lines between research and practice are always blurred and information companies like Facebook make this increasingly salient. No one benefits by drawing lines in the sand. We need to address the problem more holistically. And, in the meantime, we need to hold companies accountable for how they manipulate people across the board, regardless of whether or not it’s couched as research. If we focus too much on this study, we’ll lose track of the broader issues at stake. Read more.

Both are great reads, and align with a lot of the things I've been exploring in my own work. I'm honored to be in such good company.