“The greatest fear that I have regarding the outcome for America of these disclosures is that nothing will change.”
Listen to the explosive interview with Edward Snowden, the whistleblower behind the NSA surveillance revelations.
Edward Snowden has risked everything to make the PRISM program public. The least we can do is act:
- Campaign to impeach all those responsible for PRISM at the NSA and in government.
- Support efforts to review and repeal the Patriot Act.
- Ditch the cloud and encrypt everything you can because giant US tech companies are no longer a trustworthy place to put your data
- Lead and initiate efforts to fight for freedom and privacy
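The "encrypt everything" advice above can be sketched in a few lines. This is a minimal illustration, not a complete security setup; it assumes the third-party Python `cryptography` package, but any audited client-side encryption tool serves the same purpose. The point is that data is encrypted locally, so a cloud provider only ever stores ciphertext.

```python
# Minimal sketch: encrypt a document locally before it is uploaded to any
# cloud service, so the provider stores only ciphertext.
# Assumes the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

# Generate a key once and keep it OFF the cloud (e.g. in a local password manager).
key = Fernet.generate_key()
f = Fernet(key)

document = b"private notes: meeting with source at 14:00"
ciphertext = f.encrypt(document)  # this, not the plaintext, is what you upload

# Only someone holding the key can recover the plaintext.
plaintext = f.decrypt(ciphertext)
```

The crucial design point is where the key lives: if the provider holds it (as with most default cloud "encryption"), they can read your data; if only you hold it, they cannot.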
The Guardian: PRISM live updates
We are waiting for more details to become clear before commenting on PRISM. However, you can follow developments live on the Guardian website here.
Image: some rights reserved by Kheel Center, Cornell University (Creative Commons Attribution).
One of the risks associated with wearable tech that we have been warning about is the intrusion of work and insurance into home life. As with many of our dystopian predictions, this one seems to be coming true sooner than we expected. Indeed, as part of a segment on wearable tech, the BBC reports that:
The cloud technology firm Appirio has issued many of its staff with the UP wristband, tracking everything from their food intake to their sleep patterns. It is a voluntary scheme, and Lori Williams who runs the European division of the American business says it’s already proving valuable for employees and the firm. “We’ve had about a hundred employees that have lost a stone or more in the last several months. Last month alone, we collectively walked about 17,000km (10,563 miles). So it’s making us not just better employees but I think better people. And I think that’s the benefit.” The company has also managed to cut its health insurance costs in the United States by showing its insurer the impact of this life-logging plan.
A key principle of health and safety law is that safety in the workplace cannot be overruled just because employees agree to perform dangerous tasks. The reason for this is simple: power relations are unbalanced, and employees can be coerced by employers and management through threats or incentives. Similarly, where an organization is encouraging its staff to wear monitoring devices, it is unclear whether employees can freely choose not to be monitored without their refusal potentially impacting their careers.
The staff at Appirio are skilled developers with a degree of bargaining power; however, the situation may be worse for others. For example, UK supermarket chain Tesco has been accused of ‘using electronic armbands to monitor its staff‘, with an ex-worker claiming that ‘the supermarket grades employees on efficiency and can reprimand them for breaks’, according to an article in The Independent.
The risks are obvious and simple: you may be coerced by your employer into wearing a tracking device at work or at home. This information will be used to grade and rank you, which means that your employer will effectively control your home life.
“We notice that you have not been getting to sleep early enough and are not exercising in the morning. Please change this before the next performance review.”
Wearable tech is merging with health and insurance, so in the near future you may also be coerced by your insurer into wearing the device, because refusing will result in increased insurance premiums. Wearing it would also mean agreeing to whatever terms and conditions the device maker chooses to impose; it is likely that these would allow medical companies and other paying parties to look at your data. Essentially, unless you are very rich, you will be tracked, because refusing will become unaffordable.
ORGCon2013 digital rights conference
Saturday 8 June
10:00am – 5:30pm
IET (Institution of Engineering and Technology) in central London.
On Friday 31st May, Google released a statement on their Project Glass Google+ channel.
Glass and Facial Recognition
When we started the Explorer Program nearly a year ago our goal was simple: we wanted to make people active participants in shaping the future of this technology ahead of a broader consumer launch. We’ve been listening closely to you, and many have expressed both interest and concern around the possibilities of facial recognition in Glass. As Google has said for several years, we won’t add facial recognition features to our products without having strong privacy protections in place. With that in mind, we won’t be approving any facial recognition Glassware at this time.
We’ve learned a lot from you in just a few weeks and we’ll continue to learn more as we update the software and evolve our policies in the weeks and months ahead.
The updated developer policy now includes the following restrictions:
- Don’t use the camera or microphone to cross-reference and immediately present personal information identifying anyone other than the user, including use cases such as facial recognition and voice print. Applications that do this will not be approved at this time.
- Don’t disable or turn off the display when using the camera. The display must become active when taking a picture and stay active during a video recording as part of your application.
We strongly welcome Google clarifying their stance. However, we are not popping the champagne corks just yet. The phrase ‘at this time‘ is somewhat concerning, since it indicates a likely intention to add the capability to identify people in the future. Further, it is unclear what ‘strong privacy protections‘ might mean, and in practice it will be difficult to stop side-loaded apps.
A few good papers worth reading if you have the time:
(1) ‘What Privacy Is For‘ by Julie E. Cohen
Privacy has an image problem. Over and over again, regardless of the forum in which it is debated, it is cast as old-fashioned at best and downright harmful at worst — antiprogressive, overly costly, and inimical to the welfare of the body politic. Privacy advocates resist this framing but seem unable either to displace it or to articulate a comparably urgent description of privacy’s importance. No single meme or formulation of privacy’s purpose has emerged around which privacy advocacy might coalesce. Pleas to “balance” the harms of privacy invasion against the asserted gains lack visceral force.
Privacy shelters dynamic, emergent subjectivity from the efforts of commercial and government actors to render individuals and communities fixed, transparent, and predictable. It protects the situated practices of boundary management through which the capacity for self-determination develops.
(2) ‘The Dangers of Surveillance‘ by Neil M. Richards
From the Fourth Amendment to George Orwell’s Nineteen Eighty-Four, and from the Electronic Communications Privacy Act to films like Minority Report and The Lives of Others, our law and culture are full of warnings about state scrutiny of our lives. These warnings are commonplace, but they are rarely very specific. Other than the vague threat of an Orwellian dystopia, as a society we don’t really know why surveillance is bad and why we should be wary of it. To the extent that the answer has something to do with “privacy,” we lack an understanding of what “privacy” means in this context and why it matters. We’ve been able to live with this state of affairs largely because the threat of constant surveillance has been relegated to the realms of science fiction and failed totalitarian states.
(3) ‘Toward a Positive Theory of Privacy Law‘ by Lior Jacob Strahilevitz
Privacy protections create winners and losers. So does the absence of privacy protections. The distributive implications of governmental decisions regarding privacy are often very significant, but they can be subtle too. Policy and academic debates over privacy rules tend not to emphasize the distributive dimensions of those rules, and many privacy advocates mistakenly believe that all consumers and voters win when privacy is enhanced. At the same time, privacy skeptics who do discuss privacy in distributive terms sometimes score cheap rhetorical points by suggesting that only those with shameful secrets to hide benefit from privacy protections. Neither approach is appealing, and privacy scholars ought to do better.