Amber Case: Ambient Location and the Future of the Interface
Amber Case (@CaseOrganic) was talking about Ambient Location and the Future of the Interface at dConstruct; these are my notes from her talk.
We are all cyborgs, but you don't need an implant: when you interact with technology outside yourself you are a cyborg. Humans attach appendages to ourselves to be in places we shouldn't be - like space. Technology needs us to reproduce. Fist -> hammer. Tooth -> knife. How our tools look has been stable for millions of years; computers have changed rapidly - buttons have evaporated. Devices are larger on the inside than on the outside - an extension of the mental self.
Traditional anthropology looks at the other; what's usually ignored is that we are now often the other. Devices need feeding (power) and grow up (updates).
1981 - Steve Mann with a wearable computing apparatus. Everything went over radio but it needed a signal - he turned himself into an antenna. At age 10 he walked into a TV repair shop and asked for a job because he wanted to build a TV small enough to wear. He wanted to create diminished reality - a kind of visual ad blocker. Created virtual post-it notes with image processing: a "remember the milk" message over every supermarket sign. Used it to remember people's names via face recognition and show historical data. Started as 40lb of equipment - by 1998 it fit in sunglasses. Why can't computers conform to us? Input is an issue. The mouse was designed as a temporary fix for interacting with data but it became a persistent architecture. Chorded keyboards and a HUD freed people from the computer.
Thad Starner worked on smaller HUDs, which got more usable and evolved into Google Glass, which is still very limited.
Mark Weiser: Calm Technology. When you get angry with tech it's because it's not designed for you. It should be there when you need it and not when you don't. Memoto persistently records things - it takes pictures. It's not on your face, so it's not scary like Google Glass. Small camera lens that clips onto clothing. Ambient input - data from sensors such as location and time.
Making the invisible visible. Aaron Parecki was tracking his location at 5-second intervals. What if the next button is in the air? Draw a boundary on the map, for example, and something triggers when you get inside that area (there's a sketch after the list below). Battery drain and privacy are issues. Your phone will become a remote control for reality. Geonotes - location-based messages; you get the message when you arrive in an area. E.g. the history of a bridge as you cross it.
- Home automation: turn on lights when you get near your house and turn them off when you leave.
- Real time hyperlocal weather - it's going to start raining in 10 min and last for 20 min.
- Transit notification - when's the next bus?
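The "button in the air" idea and the home-automation bullet above both boil down to a geofence: a boundary drawn on a map plus an enter/exit check on each location update. Here's a minimal sketch in Python - the coordinates are made up and the print() stands in for whatever the real trigger would be:

```python
from typing import Callable, List, Tuple

Point = Tuple[float, float]  # (latitude, longitude)

def point_in_polygon(point: Point, polygon: List[Point]) -> bool:
    """Ray-casting test: is the point inside the polygon?"""
    lat, lon = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lon_i = polygon[i]
        lat_j, lon_j = polygon[j]
        # Does the edge (i, j) straddle the point's longitude?
        if (lon_i > lon) != (lon_j > lon):
            # Latitude at which the edge crosses that longitude
            crossing = (lat_j - lat_i) * (lon - lon_i) / (lon_j - lon_i) + lat_i
            if lat < crossing:
                inside = not inside
        j = i
    return inside

class Geofence:
    """Fires a callback once each time a location update crosses into the boundary."""
    def __init__(self, polygon: List[Point], on_enter: Callable[[Point], None]):
        self.polygon = polygon
        self.on_enter = on_enter
        self.was_inside = False

    def update(self, location: Point) -> None:
        inside = point_in_polygon(location, self.polygon)
        if inside and not self.was_inside:
            self.on_enter(location)
        self.was_inside = inside

# Illustrative only: a rough rectangle around "home", lights as the action.
home = Geofence(
    polygon=[(45.5231, -122.6765), (45.5231, -122.6755),
             (45.5224, -122.6755), (45.5224, -122.6765)],
    on_enter=lambda loc: print("Entered home area - turn the lights on"),
)
home.update((45.5300, -122.6800))  # outside: nothing happens
home.update((45.5228, -122.6760))  # inside: callback fires once
```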
To get test data they created a game that got people to go around collecting territory, which gathered data from a whole range of devices. mapattack.org - a realtime shared-state geolocation game. Kids' games are really simple, and this provides that kind of stimulation for adults.
Bringing static content to life. Data is stuck on the web, not associated with your location. Put information about pinball machines onto a map and walked around until the phone dinged, then went into the bar and played pinball. Did a similar thing with geotagged information from Wikipedia. Serendipitous information. "Don't eat that" used inspection scores for local restaurants.
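The pinball "ding" is essentially a proximity check against a list of geotagged points. A rough Python sketch, assuming illustrative coordinates and an arbitrary 100 m radius:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, in metres."""
    r = 6371000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Made-up geotagged points of interest (name, lat, lon).
pinball_spots = [
    ("Barcade downtown", 45.5250, -122.6760),
    ("Pub with two machines", 45.5125, -122.6560),
]

def check_nearby(lat: float, lon: float, radius_m: float = 100) -> list:
    """Return the names of geotagged spots within radius_m of the location."""
    return [name for name, plat, plon in pinball_spots
            if haversine_m(lat, lon, plat, plon) <= radius_m]

# Run this on each new location fix and "ding" if anything is close by.
hits = check_nearby(45.5251, -122.6761)
if hits:
    print("Ding! Nearby:", ", ".join(hits))
```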
With social media, the person I know least about is myself. A Tower of Babel of different datasets. Make what's invisible visible. Used The Sims to simulate an apartment to figure out what might improve real life. Filling in a survey revealed I didn't realise I was unhappy with my job. Use a cybernetic feedback loop to improve your life.
The best technology is invisible and lightweight.