The data you unconsciously produce by going about your day is being stored up over time by one or several entities. And now it could be used against you in court…
Inspired by the recent case of Fitbit history being used in a personal injury claim, the excellent Kate Crawford discusses the rise of the algorithmic expert witness in The Atlantic. The important point is not just that wearable tech records may become evidence in a court of law, but also that data – or rather the conclusions drawn by analytics companies – becomes a new kind of witness.
The decisions about what is “normal” and “healthy” that these companies come to depend on which research they’re using. Who is defining what constitutes the “average” healthy person? This contextual information isn’t generally visible. Analytics companies aren’t required to reveal which data sets they are using or how they are being analyzed.
The current lawsuit is an example of Fitbit data being used to support a plaintiff in an injury case, but wearables data could just as easily be used by insurers to deny disability claims, or by prosecutors seeking a rich source of self-incriminating evidence.
We should therefore not only be aware that wearable sensors are surveillance devices and that the records they produce can be subpoenaed, but also resist the idea that the data is a source of objective truth, seeing it instead as the opinion of the experts or companies interpreting it.
Ultimately, the Fitbit case may be just one step in a much bigger shift toward a data-driven regime of “truth.” Prioritizing data – irregular, unreliable data – over human reporting means putting power in the hands of an algorithm. These systems are imperfect – just as human judgments can be – and it will be increasingly important for people to be able to see behind the curtain rather than accept device data as irrefutable courtroom evidence. In the meantime, users should think of wearables as partial witnesses, ones that carry their own affordances and biases.
Read the article here
In this ‘lab test’ we look at a very popular covert surveillance device, cunningly disguised as that most innocuous of devices, a smoke detector. Unlike wireless cameras disguised as wall clocks or iPod docks, spy devices in this form are far less likely to be tampered with, let alone discovered. Not only are they just out of reach, they’re considered part of the local emergency infrastructure, meaning it’s far less likely someone will take them down for inspection. All the while, from their ceiling-mounted position, they’re ideal for monitoring the activities within a room.
[More details] [Pre-order]
Michael Keller & Josh Neufeld have turned Unraveling Privacy: The Personal Prospectus & the Threat of a Full Disclosure Future – along with other pieces such as Paul Ohm’s Databases of Ruin, an interview with danah boyd, and an essay on why big data is a civil rights issue – into a graphic novella called ‘Terms of Service‘ for Al Jazeera America. It is a great read that explains what is really at stake.
Read or download the whole thing here
Cyborg Unplug can now be pre-ordered at https://plugunplug.net
Cyborg Unplug is an anti wireless-surveillance system for the home and workplace. It detects and kicks selected devices known to pose a risk to personal privacy from your wireless network, breaking uploads and streams. Detected wireless devices currently include: wearable ‘spy’ cameras and microphones, Google Glass and Dropcam, small drones/copters and a variety of popular spy devices disguised as familiar objects.
€52.- (ca. $66.-)
Ars reports that the chief executive officer of a mobile spyware maker was arrested over the weekend, charged with illegally marketing an app that monitors calls, texts, videos, and other communications on mobile phones “without detection.”
Can we look forward to ‘Sandy’ Pentland, Zuckerberg, Jeff Bezos, Riccardo (Candy Crush) Zacconi, and whoever the CEO of PassTime is, being arrested next? After all, they collect even more personal and location data.
Interesting approach by Jaeyeon Jung & Matthai Philipose from Microsoft Research. The basic idea is to turn off wearable cameras like the Autographer when people are detected by a low-resolution far-infrared imager, unless those people have expressed consent.
Small and always-on, wearable video cameras disrupt social norms that have been established for traditional hand-held video cameras, which explicitly signal when and which subjects are being recorded to people around the camera-holder. We first discuss privacy-related social cues that people employ when recording other people (as a camera-holder) or when being recorded by others (as a bystander or a subject). We then discuss how low-fidelity sensors such as far-infrared imagers can be used to capture these social cues and to control video cameras accordingly in order to respect the privacy of others. We present a few initial steps toward implementing a fully functioning wearable camera that recognizes social cues related to video privacy and generates signals that can be used by others to adjust their privacy expectations.
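The gating logic described above can be sketched roughly as follows. This is a minimal illustration of the idea, not the paper’s implementation: the function names, the temperature threshold, and the consent flag are all hypothetical.

```python
# Sketch of a consent-gated recording loop: a low-fidelity far-infrared
# imager detects nearby people, and the wearable camera records only when
# no non-consenting person is in view. All names here are illustrative.

def person_detected(ir_frame, threshold=30.0):
    """Treat any pixel warmer than the threshold (degrees C) as a person."""
    return any(temp > threshold for row in ir_frame for temp in row)

def should_record(ir_frame, bystander_consent):
    """Record if no one is in view, or if everyone in view has consented."""
    if not person_detected(ir_frame):
        return True
    return bystander_consent

# Example: tiny 2x2 far-infrared frames, one with a warm (person-like) pixel
frame_with_person = [[22.0, 36.5], [21.0, 22.5]]
empty_frame = [[21.0, 22.0], [20.5, 21.5]]

print(should_record(empty_frame, bystander_consent=False))        # True
print(should_record(frame_with_person, bystander_consent=False))  # False
print(should_record(frame_with_person, bystander_consent=True))   # True
```

In practice the hard part is the detection step, not the gate: the paper’s point is that even a coarse thermal sensor can supply the social cue (someone is present) without itself capturing identifiable video.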
Read the full paper here