A picture of me standing at a lectern, working on a laptop computer, on the stage of the FWD50 digital government conference

Hi! I’m Alistair. I write surprisingly useful books, run unexpectedly interesting events, & build things humans need for the future.

You’ll be tagged

Let me introduce you to a friend of mine.

Hello-stranger

Okay, not a friend. I don’t know who he is. But he accidentally appeared alongside me in a photo taken in Tokyo this week. He’s down here, above my left shoulder. Let’s call him “他人” (stranger).

accoffeejpn
I was tired, and drinking hot coffee from a can. Why on earth don’t we have this in North America?

I didn’t even notice him. But Facebook did. When I uploaded the picture, Facebook asked me his name:

scanpic

When he decided to stroll through Harajuku yesterday afternoon, 他人 probably had a reasonable expectation of privacy. His expectation came from three basic assumptions:

  • He did not expect to be photographed
  • He did not think a stranger would know who he was
  • He did not expect any sightings of himself to be stored in a shared public place

In other words, he thought that most of the people who saw him would have no context to recognize him, and that any sightings would be ephemeral and soon forgotten.

But those assumptions are rapidly crumbling. Everyone takes pictures now, and as we get prosthetic brains—in the form of Google Glass—we’ll record everything we glimpse. It doesn’t matter whether we know what’s in those glimpses, or even if we paid attention. They’re recorded, and someone, somewhere, has context.

Crowdsourcing facial recognition

If I upload many pictures of the same person (say, thirty photographs of my daughter), Facebook groups them into people it recognizes.

facialrecog

In the example above, Facebook has found several “people” (all of whom are in fact the same person—my daughter) and asked me to name them. By telling Facebook they are all the same person, I have taught Facebook’s algorithms how to recognize her better. This happens whether I share the pictures with anyone, or simply keep them to myself.
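Facebook hasn't published how this grouping works, but the step can be sketched as a hypothetical greedy clustering over face embeddings (the numeric vectors a recognition model produces for each detected face). The function name and threshold here are illustrative assumptions, not Facebook's actual code; the point is that one supplied name then labels an entire cluster at once:

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def group_faces(embeddings, threshold=0.8):
    """Greedily group face embeddings: each new face joins the first
    existing group whose representative it resembles closely enough,
    otherwise it starts a group of its own."""
    groups = []  # each group is a list of embeddings for one "person"
    for emb in embeddings:
        for group in groups:
            if cosine(emb, group[0]) >= threshold:
                group.append(emb)
                break
        else:
            groups.append([emb])
    return groups
```

Every confirmed tag effectively moves the decision boundary: the next, harder photo of the same face now has a labeled cluster to land in.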

In an entirely separate life from my own, 他人 probably has friends who tag him on Facebook. When the facial recognition algorithm doesn’t guess who he is (because of the lighting, or the position of his head, for example) his friends helpfully supply his name. His friends are training the system to better recognize him, just as I am training it to better recognize my daughter.

I have a picture of 他人; his friends have the context to recognize him; and his image is stored forever on Facebook’s servers. The intersection of these three facts is a place of little privacy:

privacyvenn

More and more metadata

I gave Facebook a considerable amount of additional information when I uploaded the picture: where and when I took it (courtesy of my phone’s GPS and clock), the kind of phone I had, and so on.

Picture metadata
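That location data rides along inside the photo file's EXIF block, where GPS coordinates are stored as degrees, minutes, and seconds plus a hemisphere reference. A minimal sketch of the decoding (the coordinates below are illustrative, not taken from the actual photo):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS degrees/minutes/seconds to decimal degrees.
    ref is 'N', 'S', 'E', or 'W'; south and west become negative."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

# Illustrative values in the format a phone writes into every photo
lat = dms_to_decimal(35, 40, 9.0, "N")    # roughly Harajuku's latitude
lon = dms_to_decimal(139, 42, 10.0, "E")
```

Any service that receives the file can read these fields back out; no analysis of the image itself is needed to know where it was taken.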

Smartphone picture-taking is further helped by tools like Google Goggles, which can find text, logos, and clues in images. Such a tool can infer things about the weather (from the exposure levels or cloud cover) and my surroundings (from the signage on stores around me).

Here, for example, is a screenshot of Goggles from a trip I took to Prague in the Czech Republic last year:

ggoggles

Without assistance, using only the edge of a building, Goggles has figured out what the landmark in front of me is.

These are free tools. This is the kind of technology that Three-Letter Agencies could only dream of a decade ago; now, such software is commonplace and often runs in the background, by default.

The result of all this is that, once the training is good enough, every time 他人 shows up in a photo, Facebook will have the necessary information to know where he is and have an idea about what he’s doing.

To be clear: Facebook doesn’t actually know any of this today.

  • There are 127M people in Japan, and only 13.5M of them (10.5%) use Facebook.
  • Globally, only 1.06 billion of the planet’s more than 7 billion people (15%) use the social network.
  • Facebook has better things to do than violate privacy agreements.
  • Most importantly, analyzing every upload against every picture of every one of Facebook’s users is costly and time-consuming.
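A back-of-envelope calculation makes the last point concrete. The daily-upload and faces-per-photo figures below are round assumptions for illustration; the user count is the 1.06 billion cited above:

```python
uploads_per_day = 350_000_000    # assumed: photos uploaded to Facebook daily
faces_per_photo = 2              # assumed average faces per photo
users = 1_060_000_000            # the 1.06 billion users cited above

# Compare every face in every daily upload against every user's face model...
full_sweep = uploads_per_day * faces_per_photo * users

# ...versus comparing against a 1,000-name list of "persons of interest"
watchlist = uploads_per_day * faces_per_photo * 1_000

ratio = full_sweep // watchlist  # over a million times more comparisons
```

Under these assumptions the global sweep costs about a million times more than the watchlist, which is exactly why the latter is the plausible first use.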

Let me repeat: Facebook isn’t currently doing this. But all of the systems are in place to make it trivially easy to do so, and while it might be prohibitively costly to do it for the whole world, it’s downright easy to do it for “persons of interest”.

Can a photograph steal your soul?

Much of the concern over tools like Google Glass has centred around the ubiquity of recording. To me, it’s not the pictures that matter—it’s that we’re training the machine to find context within them in shared environments. The rapid convergence of anonymous data collection, crowd-sourced tagging, and central sharing changes the game dramatically.

A world in which every public sighting is part of a searchable record of our activity is a strange one indeed. Maybe Native Americans were right.

At first, many Native Americans were wary of having their photographs taken, and often refused: they believed the process could steal a person’s soul and disrespect the spiritual world.

Consider the implications of this for law enforcement. There’s no warrant for surveillance, because there doesn’t need to be. Insurance adjusters might one day subpoena such records to see whether a claimant is actually ill. Spurned spouses could use this information in divorce proceedings.

Maybe you aren’t a criminal. Maybe you have nothing to fear. But as Monty Python reminds us, who hasn’t broken the law?

What kinds of legal protections need to be put in place for this? What protections could be put in place? Can a social platform be compelled to release evidence that could absolve or convict someone? And at what point can a government apply a sort of digital “eminent domain” and access private information to find someone?



10 responses to “You’ll be tagged”

  1. Ray

    Maybe what we need is a worldwide “mis-tag” your friends day. That should set back the training algo quite a bit.

  2. Robin

    Alistair, this is brilliant. I’ve been growing increasingly concerned about privacy implications from all the ways in which we are tracked…and give up information voluntarily. Increasingly making me wonder: what’s privacy worth to me?

  3. Dean

    Nicely written and beautifully presented post, Alistair. However I think the arguments are flawed at a fundamental level.

    “When he decided to stroll through Harajuku yesterday afternoon, 他人 probably had a reasonable expectation of privacy”

    Why? Both subjectively and objectively a public, open street could not be considered private. 他人 may have *assumed* privacy but he cannot *expect* privacy. The fact he can now be identified, contextualised and published doesn’t change this fundamental fact. To extrapolate backwards, the following story could have happened any time in the last 30 years:

    http://gizmodo.com/5619930/accidental-photobomb-leads-to-bag-thiefs-capture

    The thief was caught because someone took an ordinary photo with an ordinary camera in a public place and people were able to provide context to go along with it. The fact that this analysis is nowhere near such a manual process any more doesn’t change the privacy expectation of the situation, just the prevalence of analysis.

    1. Alistair

      Dean,

      I’ll disagree with you here. Privacy is a spectrum. The expectation we have of a public space includes a degree of ephemerality (our presence will be forgotten) and anonymity (our identity won’t be recognized). Both of those are eroded by technology: the first vanishes because of cloud storage, and the second because of facial recognition.

      As Chris Taylor pointed out in a piece that expanded on what I wrote here, even the anonymity of the past is eroded, since we can scan old photos. But whereas we used to have a 24-exposure roll of carefully staged film for an entire holiday, now people take that many pictures, carelessly, of just their dinner.

      You say “The fact that this analysis is nowhere near such a manual process any more doesn’t change the privacy expectation of the situation, just the prevalence of analysis.” I heartily protest. Once the friction of a process goes to zero, as is the case with software, the equations change completely.

  4. […] follow Alistair Croll of Solve for Interesting, you should. In a piece published last week, You’ll Be Tagged, Croll makes the point that common photo tagging technology like Facebook allows for a remarkable […]

  5. Brian

    Outstanding post. I was just at the Quantified Self conference this weekend and this was the major theme. And I think you voice a great perspective on it.

    For me the major takeaway is that some dramatic changes are coming soon that may also change our culture deeply.

  6. Toby

    Very fascinating piece! That’s why it’s so important that we integrate new technologies, like Glass, into society in a way that doesn’t further infringe on our privacy. Sure, that’s going to be tough, but I think it’s the Glasser’s responsibility to consider those around them when filming or snapping a shot and then making it public online. What about asking permission beforehand, like photographers are expected to do? It should be an unspoken etiquette in this day & age. I know I’m overreaching and putting too much faith in mankind, but maybe we can, just maybe we can learn to be considerate of others’ privacy in this way.

  7. MarcB

    That’s a very good article, Alistair. I have been thinking about a lot of the same things myself these days, particularly about Google, which has a) all the information in the world, b) enough computing power to crunch a lot of it, and c) the big brains required to add context to the data.

    What is saving us right now is that they are a “good” company and generally do not need to do anything “evil” to make money. But in 20 years? 50 years? The next generation running things? Will the current owners feel the same as they do now when they are 55?

    You ask “What protections could be put in place?”. That is a damn good question. I haven’t got a clue. We don’t want them running amok, because the sheer power they could get from properly mining and contextualizing that data would be able to destabilize economies or societies. But do we want to stifle innovation in this area? That power could also help everyone, as it does now with those free tools you were talking about.

    That’s a head scratcher for sure…

  8. Alex Bowyer

    This is a good read! It follows on nicely from the previous Human 2.0 piece “One Step Away from Lost Privacy?” ( http://human20.com/one-step-away-from-lost-privacy/ ) and goes deeper into it than I did. It’s definitely something we are all going to have to be aware of.

  9. […] which there are individuals. We have already got identity tagging in Fb, and large cash goes towards advancing facial recognition. I additionally discovered Real […]