Sunday, April 24th
5:53pm

Response to “GOOGLE, APPLE: UNABASHEDLY REASSEMBLING YOUR LIFE”

I originally wrote this as a response to a post Ben Feist made on the TAXI company blog, which I thought was great. My response veered off topic enough that it became a different post in itself, but I wanted to share it anyway.

Ben writes about the technologies and dangers of the kind of profile and aggregate information being collected by companies like Apple and Google, and goes into some of the methods and technologies involved.

There is a massive amount of data out there, and huge implications for how it could affect our future opportunities and behaviour, but I’d argue that what we’re actually seeing is less a negative privacy infringement than a paradigm shift in how we perceive privacy and personal informatics.

Consider projects like MIT’s Gaydar, which uses the Facebook social graph and various public information to statistically guess the likelihood of one’s sexual orientation, or PleaseRobMe.com, which used Foursquare and similar locational data to indicate the probability of a home being empty. Both projects are designed as awareness-raising tools, and I think that is exactly what we need.

We are moving very rapidly towards a world where the availability of physical sensor data for _everything_ is the norm, and I’d argue that this is a very positive thing. In a talk I gave recently, I referred to these as ambient interactions: essentially a many-to-one relationship between human interactions and computational output. By analyzing our aggregate patterns and trends, we’re able to use this data to create a loose feedback loop for informing our own behaviour. In the same way that we self-correct in a video game or while driving a car or bike, we’re now able to self-correct our behaviour across extended periods.
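
(As a rough illustration of what I mean by a loose feedback loop over aggregate data, here’s a toy sketch in Python. The cycling-speed samples, the window size, and the target threshold are all made up purely for the example, not any real personal-informatics tool.)

    # Toy sketch: average recent daily cycling speeds (km/h, hypothetical samples)
    # and nudge the rider when the recent trend drifts above a self-set target.
    daily_speeds = [22.1, 24.3, 23.8, 26.0, 27.5, 28.2, 29.0]

    def rolling_average(samples, window=3):
        # Average of the most recent `window` samples.
        recent = samples[-window:]
        return sum(recent) / len(recent)

    TARGET = 25.0  # self-chosen comfort threshold, km/h

    trend = rolling_average(daily_speeds)
    if trend > TARGET:
        print(f"Recent average {trend:.1f} km/h is above your target of {TARGET} km/h.")
    else:
        print(f"Recent average {trend:.1f} km/h is within your target.")

The point isn’t the arithmetic; it’s that aggregating many small observations into a trend gives you something you can react to, the same way a speedometer does.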

Now, this assumes that we have control over our own data and that the information is ours to analyze. A world of physical sensors is also potentially a negative thing: we see things like the Lower Manhattan Security Initiative (building on the model of London’s Ring of Steel) using neural networks and massive sensor arrays to create spaces that react to traffic and criminal activity. When the built environment becomes able to alter itself (in this case, shifting road blocks or caltrops) in relation to algorithmically perceived threats, we’re _all_ going to develop a very intense fear of false positives.

Likewise, as you mentioned, what happens when this sensor data is used to determine opportunities and decisions which affect us personally? How will my locational trends affect my likelihood of getting a loan? Will the fact that I bike quickly along main streets vs. slowly along side streets impact an employer’s perception of me as a risk taker, or a banker’s perception, for that matter?

There are good and bad sides, and I’m very much looking forward to both living in and designing for a world where this is possible. This said, I think the broader question is one of public policy and shifting social norms around these technologies. 

Already we’re seeing a subtle shift in the way people apply and understand wayfinding methods as a consequence of technologies like Google Maps and Street View, and I think we will see a similar cultural shift as we align ourselves further with ubiquitous sensor networks and our existing personal wireless sensor devices: our mobile phones. Figures like Ann Cavoukian, Ontario’s privacy commissioner, have been incredibly diligent in deflecting some of the more damaging political mechanisms leading to the idea of the “big brother” state (e.g. national ID cards), but we ARE going to live in a world of ubiquitous sensor networks, whose masters use this data for whatever agenda they might have.

The question is how we will educate ourselves to defend against these kinds of technologies. Will we see a shift towards personal techniques like CV Dazzle, to baffle Haar-based computer vision systems? Or towards biometric masking techniques like those seen in Gattaca or Minority Report? Will such techniques be the purview of the criminal element, or an indication of social awareness or strongly held opinion: the modern punks? Will we shed the feature-rich mobile computer that is the contemporary smartphone in favour of voice/text-only radios that use sophisticated encryption keys and decentralized mesh networks to communicate away from the monolithic telecoms?

Either way, this technology is already here and widespread adoption is around the corner: the unknown is the emergent social trends that respond to this new “sentient” world. And it’s that unknown that matters more than the technology.
