Microsoft Research paper: courtesy protocol for wearable cameras

Interesting approach by Jaeyeon Jung & Matthai Philipose from Microsoft Research. The basic idea is to turn off wearable cameras like the Autographer when people are detected by a low-res far-infrared imager, unless those people have expressed consent.


Small and always-on, wearable video cameras disrupt social norms that have been established for traditional hand-held video cameras, which explicitly signal when and which subjects are being recorded to people around the camera-holder. We first discuss privacy-related social cues that people employ when recording other people (as a camera-holder) or when being recorded by others (as a bystander or a subject). We then discuss how low-fidelity sensors such as far-infrared imagers can be used to capture these social cues and to control video cameras accordingly in order to respect the privacy of others. We present a few initial steps toward implementing a fully functioning wearable camera that recognizes social cues related to video privacy and generates signals that can be used by others to adjust their privacy expectations.
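The control loop implied by the abstract can be sketched roughly as follows. This is our own illustrative sketch, not the paper's implementation: the function names, the temperature threshold, and the simple "consented zone" flag are all assumptions made for clarity.

```python
# Hypothetical sketch of the courtesy protocol: disable recording when
# a far-infrared imager detects people who have not expressed consent.
# All names and thresholds here are illustrative, not the paper's API.

def bystanders_present(ir_frame, threshold=30.0):
    """Treat any pixel above a temperature threshold (°C) as a person."""
    return any(pixel > threshold for row in ir_frame for pixel in row)

def camera_allowed(ir_frame, consented_zone=False):
    """Record only when no non-consenting bystanders are detected."""
    return consented_zone or not bystanders_present(ir_frame)

# A cold (empty) scene permits recording...
empty_scene = [[20.0, 21.0], [20.5, 19.8]]
assert camera_allowed(empty_scene) is True

# ...while a warm blob (a detected person) disables the camera,
# unless the people present have expressed consent.
person_scene = [[20.0, 36.5], [20.5, 19.8]]
assert camera_allowed(person_scene) is False
assert camera_allowed(person_scene, consented_zone=True) is True
```

The point of using a low-fidelity thermal imager rather than the camera itself is that the privacy check never needs to capture an identifiable image of the bystander.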


Read the full paper here


Civil Rights, Big Data, and Our Algorithmic Future

The key decisions that shape people’s lives—decisions about jobs, healthcare, housing, education, criminal justice and other key areas—are, more and more often, being made automatically by computers. As a result, a growing number of important conversations about civil rights, which focus on how these decisions are made, are also becoming discussions about how computer systems work.

The September 2014 report on social justice and technology begins to answer the question: “How and where, exactly, does big data become a civil rights issue?”


Read the report here


The report is generally very good and provides real, concrete examples. However, they claim, on the basis of sparse anecdotal evidence, that ‘Body-worn cameras are poised to help boost accountability for law enforcement and citizens’. We beg to differ – life, as always, is more complicated than that.

There is no such thing as ‘technology.’

Anyone who views critics of particular technologies as ‘luddites’ fundamentally misunderstands what technology is. There is no such thing as ‘technology.’ Rather, there are specific technologies, produced by specific economic and political actors, and deployed in specific economic and social contexts. You can be anti-nukes without being anti-antibiotics. You can be pro-surveillance of powerful institutions without being pro-surveillance of individual people. You can work on machine vision for medical applications while campaigning against the use of the same technology for automatically identifying and tracking people. How? Because you take a moral view of the likely consequences of a technology in a particular context.



Brookings Institution report on cyborg law and policy.


In June 2014, the Supreme Court handed down its decision in Riley v. California, in which the justices unanimously ruled that police officers may not, without a warrant, search the data on a cell phone seized during an arrest. Writing for eight justices, Chief Justice John Roberts declared that “modern cell phones . . . are now such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy.”1

This may be the first time the Supreme Court has explicitly contemplated the cyborg in case law—admittedly as a kind of metaphor. But the idea that the law will have to accommodate the integration of technology into the human being has actually been kicking around for a while.

Speaking at the Brookings Institution in 2011 at an event on the future of the Constitution in the face of technological change, Columbia Law Professor Tim Wu mused that “we’re talking about something different than we realize.” Because our cell phones are not attached to us, not embedded in us, Wu argued, we are missing the magnitude of the questions we contemplate as we make law and policy regulating human interactions with these ubiquitous machines that mediate so much of our lives. We are, in fact, he argued, reaching “the very beginnings of [a] sort of understanding [of] cyborg law, that is to say the law of augmented humans.”


The report is interesting and thoughtful. It asks exactly the kinds of questions we need to consider as a society.

Read the full report here


Since we are cited as an example twice, we need to briefly clarify our views:

(1) The report states

How exactly we will mediate between the rights of cyborgs and the rights of anti-cyborgs remains to be seen—but we are already seeing some basic principles emerge. For example, the proposition that individuals should have special rights with respect to the use of therapeutic or restorative technologies appears to be so accepted that it has prompted a kind of intuitive carve-out for those who otherwise oppose wearable and similar technologies. Such is the case with Stop the Cyborgs, an organization that emerged directly in response to the public adoption of “wearable” technologies such as Google Glass. On its website, the group promotes “Google Glass ban signs” for owners of restaurants, bars and cafes to download and encourages the creation of “surveillance-free” zones.76 Yet the site also expressly requests that those who choose to ban Google Glass and “similar devices” from their property to also respect the rights of those who rely on assistive devices.77

This is true (it refers to our section on ‘Disability rights & assistive devices’), but our stance is a little more nuanced than the report implies. The core issues are agency and coercion, rather than some normative conception of what a human should be.

If the cyborg’s extended body includes components that they do not fully control, such as:

  • Remotely controlled devices
  • Closed source devices
  • Cloud services or data storage
  • Hackable or remotely updateable networked devices

Then the cyborg does not have control over their own extended body and is in a vulnerable position. They are potentially subject to external surveillance, coercion and control. Further, because they are carriers of external forces, they may subject those around them to external surveillance, coercion and risk. Because their extended body comprises networked technical systems, they cannot reassure people that they are not going to do X, because they do not control their own extended body. Thus cyborgisation forces us to replace behavioural requests (“please turn your camera off and leave it outside”) with the exclusion of particular extended bodies (“you cannot come into the Tibetan dissidents’ meeting because your body is a camera which automatically syncs with Baidu”).

Cyborgisation threatens the idea of individual agency and responsibility.

Depending on the situation, it may be the cyborgs themselves or those around them who suffer most. Further, the degree of choice the cyborg has about using the device may differ. In the case of assistive devices, the user may have little choice: all available devices may subject the wearer and those around them to external monitoring, but because the consequences of not being able to see, hear, move or otherwise function are huge, they have little choice but to accept. Similarly, some people may be coerced by their insurers or employers into wearing, or being implanted with, a device. Finally, we have people like glassholes or lifeloggers who have freely chosen to wear a device.

Where the cyborg is subject to coercion, our sympathies are with them. If a technical part makes up your extended body then you, not some corporation, should control it – but unfortunately the majority of medical and assistive devices are closed, proprietary systems. Further, no one should be coerced by people, corporations, or indeed wider economic or social forces, into wearing or being implanted with any device. However, it is clear that many people unfortunately are.

In the case of glassholes and lifeloggers, our views are clear. The loss to these people of removing their device is minimal, and even if they have embedded a camera in their head, no one and no circumstance forced them to do it.

Cyborg Unplug – Plug to Unplug


Cyborg Unplug is a wireless anti-surveillance system for the home and workplace. ‘Plug to Unplug’: it detects and kicks devices known to pose a risk to personal privacy off your local wireless network, breaking uploads and streams. Detected devices currently include: Google Glass, Dropcam, small drones/copters, wireless ‘spy’ microphones and various other network-dependent surveillance devices.
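One plausible way to detect such devices, sketched below, is to match the vendor prefix (OUI) of a Wi-Fi MAC address against a blocklist. This is purely illustrative: the OUI values and product labels in the table are our own assumptions, not Cyborg Unplug's actual detection list or mechanism.

```python
# Illustrative sketch of device detection by MAC vendor prefix (OUI).
# The OUI-to-product mapping below is an assumption made for the example,
# not Cyborg Unplug's real blocklist.

SURVEILLANCE_OUIS = {
    "F8:8F:CA": "Google Glass (assumed OUI)",
    "30:8C:FB": "Dropcam (assumed OUI)",
}

def classify_device(mac_address):
    """Return a device label if the MAC's first three octets are blocklisted."""
    oui = mac_address.upper()[:8]  # e.g. "F8:8F:CA"
    return SURVEILLANCE_OUIS.get(oui)  # None for unknown vendors

assert classify_device("f8:8f:ca:12:34:56") == "Google Glass (assumed OUI)"
assert classify_device("00:11:22:33:44:55") is None
```

A flagged device could then be disconnected, for instance by dropping it from the access point's client list. Note that OUI matching identifies the radio's vendor, not the product, so any real system would need additional fingerprinting to avoid false positives.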

Cyborg Unplug will be available for pre-order September 30, 2014. Subscribe to ensure you are kept up-to-date with the launch and to receive other low-volume information about the project.


Great blog post by Mark Carrigan.

He starts off with his personal experience of using a tracking device:

Earlier this week I finally bought the Jawbone Up24 after weeks of deliberation. I’d got bored with the Nike Fuel Band, losing interest in the opaque ‘fuel points’ measurement and increasingly finding it to be an unwelcome presence on my wrist. I’d also been ever more aware of how weird my sleep patterns have become in the past couple of years, cycling between rising early and staying up late, with little discernible rhyme or reason. The idea of tracking my sleep in a reasonably accurate fashion, using degree of bodily movement as a cypher for the depth of sleep, appealed to me on a reflexive level.

This experience of being nudged by wearable tech makes him consider how intrusive wearable tech could be if it were made mandatory and used to enforce behaviour:

I set the ‘idle alert’. I did so because I found it an appealing idea. It was an expression of my own agency. But it left me with a sense of quite how intrusive and aggressive this technology could be if it were ever mandated.
How hard is it to imagine a situation where Amazon factory workers are expected to wear similar bands, programmed to issue a vibrating warning after 15 minutes of idleness and to alert the supervisor if the worker is still idle a few minutes later? Is it at all challenging to imagine a comparable band with an RFID chip being used to track and sanction a call centre operator who spends too long in a toilet?
The most interesting point is on conditionality of welfare as a method of diffusion for these control techniques. Governments use tags to track offenders, and sobriety tags are being trialled in London to enforce abstinence on people banned from drinking. With tight budgets, a fondness for technological solutions, and political rhetoric which divides recipients of welfare into the deserving and undeserving, how long before these techniques move from ‘offenders’ to the ‘dependent’, and welfare payments and healthcare are made conditional on ‘good behaviour’ – enforced by a wearable monitoring system?
How hard is it to imagine a situation where a Conservative government, eager to separate ‘strivers’ from ‘skivers’ demands that welfare recipients submit to monitoring of their alcohol and nicotine intake?
How hard is it to imagine a situation where recipients of weight related interventions on the NHS are made to wear activity tracking bands with the threat of withdrawn rights to healthcare in the case of unhealthy eating or sedentary lifestyles?
Consumer stuff like Fitbit & Glass is just the first wave. It normalises wearable tech and introduces us gently to the idea of being monitored and nudged. It’s fun, it’s cool, it makes us ‘better’. The next wave is being coerced or forced by employers, insurers, carers or government into wearing devices that enforce ‘correct’ behaviour.
You can read the whole blog post here