On Hip Surgery and Personal Data

Friends and family may know that I had hip surgery at the beginning of the month. It was relatively minor surgery, 1.5 hours of arthroscopic poking around. I've healed up pretty quickly, and I'm off crutches after three weeks. But during that time I did a lot of thinking about how I ended up needing the surgery for my Femoroacetabular Impingement (FAI) and torn labrum, and the defaults of the fitness trackers I've been experimenting with as part of my quantified self research. The Atlantic put out a call for stories about technologies of addition and subtraction, so I wrote up my cyborg hip experience:

I have a record of all the ways I wore away at my soft tissue, in those 18 hours and 32 minutes of yoga I did in the month leading up to my wedding, in those averaged 8-minute-mile jogs that started out with characteristically inconsistent 7:31-minute first mile splits. I see those record-breaking days wandering Paris and Venice. I see where I tried and failed to train for a half-marathon. I see where I injured myself, stopped running, and started physical therapy. These moments are marked by step counts and workouts, but the narrative that explains the numbers is overlaid like a personal journal.
— Stepping Down: Rethinking the Fitness Tracker

Read more.

I also wanted to share that I'm really grateful to the friends and family who helped me out during my recovery, visiting and sending flowers and love. Especially Nick, for waiting on me hand and foot while I was laid up on the couch for the first week, even while furiously writing his dissertation. And also to Dr. David, my own personal physical therapist brother, who first looked at my hip years ago and walked me through every step of the process. He even adjusted my crutches so they finally felt right. I'm pretty lucky to have both of these wonderful people watching out for me.

In Good Company

I got pretty excited when people I admire and respect cited my recent articles about data science, Facebook, and the uncanny this week. Beyond the not-so-humble brag, I'm more excited by the growing chorus of voices calling for accountability and ethical approaches to our data and its uses. And I was even more excited to overhear older family members at a friend's wedding this past weekend discussing the Facebook study over breakfast. I think we're starting to get somewhere.

Om Malik, who has supported some of the most extensive industry coverage of data on GigaOM, wrote this week about Silicon Valley's collective responsibility to use its power wisely:

While many of the technologies will indeed make it easier for us to live in the future, but what about the side effects and the impacts of these technologies on our society, its fabric and the economy at large. It is rather irresponsible that we are not pushing back by asking tougher questions from companies that are likely to dominate our future, because if we don’t, we will fail to have a proper public discourse, and will deserve the bleak future we fear the most...Silicon Valley and the companies that control the future need to step back and become self accountable, and develop a moral imperative. My good friend and a Stanford D.School professor Reilly Brennan points out that it is all about consumer trust.

Read more.

And danah boyd, who is starting up the Data & Society research institute, summed up what we've learned from the Facebook emotional contagion study, echoing my point that it's not just about the Facebook study, it's about the data practices:

This paper provided ammunition for people’s anger because it’s so hard to talk about harm in the abstract...I’m glad this study has prompted an intense debate among scholars and the public, but I fear it’s turned into a simplistic attack on Facebook over this particular study, rather than a nuanced debate over how we create meaningful ethical oversight in research and practice. The lines between research and practice are always blurred and information companies like Facebook make this increasingly salient. No one benefits by drawing lines in the sand. We need to address the problem more holistically. And, in the meantime, we need to hold companies accountable for how they manipulate people across the board, regardless of whether or not it’s couched as research. If we focus too much on this study, we’ll lose track of the broader issues at stake.

Read more.

Both are great reads, and align with a lot of the things I've been exploring in my own work. I'm honored to be in such good company.

Data Doppelgängers and the Uncanny Valley of Personalization

When we talk about ads being creepy, it goes beyond a sense of being watched. It’s an encounter with the uncanny. I’ve been developing this idea for a while, and I got to write it up for The Atlantic.

Read more.