That smartphone you carry around is a box full of sensors.
Most of the time, those sensors sit idle or perform mundane tasks like giving you turn-by-turn directions or turning off your screen when you're yakking on the phone.
But what if you could use phone sensors to their full potential?
Tiny instruments inside your phone can tell where you are on the surface of the planet. Sensors can also detect and identify nearby Wi-Fi routers and Bluetooth devices. They can measure light and sound, as well as motion, direction and "attitude" (the orientation of the phone -- which way it's tilted and facing). They can tell if something is near the phone or not. Some phones can detect ambient temperature and other environmental facts.
Smart software with access to all the data gathered by these sensors, combined with an Internet full of information, could figure out all kinds of things about you.
Everybody knows that big companies like Google, Apple and Facebook want to harvest cellphone-generated data and use it to serve up virtual personal assistants with a side order of contextual advertising.
This vision of the future puts your phone's sensor data in the hands of megacorporations. But what if you were in control of it?
What is the quantified life?
I've written in this space about the "quantified self," and also "lifelogging."
The "quantified self" movement uses technology to measure your biology, for the most part -- your heart rate, sleeping patterns, physical and mental performance and other data.
The problem with the "quantified self" is that many people don't want to think of themselves as, or treat themselves like, machines.
The "lifelogging" movement is about capturing everything that happens to you, mostly with a camera, for a literally photographic memory, as well as capturing your communication with others.
The problem with "lifelogging" is that in order to capture your life, you have to photograph and index conversations with everyone you encounter, which feels like an invasion of their privacy.
The quantified life is similar to both the "quantified self" notion and "lifelogging" in that it involves the use of technology to gather highly individual data in order to improve your life.
But instead of hacking your body and monitoring your vital signs, sleep patterns and brain waves as those in the quantified self movement advocate, and instead of taking pictures of everything you look at, as some lifelogging fans do, the quantified life approach simply involves the passive gathering of cellphone data in the background and then using algorithms to make sense of it all.
A free new app lets you quantify your life
A Silicon Valley startup called Alohar Mobile offers a free iOS and Android app called Placeme that's designed to gather available phone sensor data and upload it to the company's cloud-based servers.
Instead of using the data it gathers for advertising purposes, Placeme just makes that information available to you -- and nobody else.
Placeme is what Foursquare would be if you checked in at every location where you stayed for more than four minutes and annotated those check-ins with other data. Except with Placeme you don't check in. You don't have to do anything. (You can optionally specify or change the name of locations, and add notes.)
Later, you can review a list of all the places you've been, or see them pinned on a Google Map. A search button lets you search your history.
The app is fun, but it's not the most interesting thing about Alohar. The Placeme app is really a proof-of-concept demonstration of the real product, which is an "ambient location platform" for app developers.
Alohar's Mobile Behavior Analytics Engine squeezes meaning out of user sensor data and offers developers libraries for analyzing it.
For example, based on the speed at which you're moving and the pattern of your phone's motion sensor, Alohar's software can tell if you're walking, skateboarding, biking, driving or flying. The GPS can tell where this is taking place. The clock tells when.
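The idea can be sketched in a few lines. This is a minimal, hypothetical heuristic, not Alohar's actual algorithm: it assumes you already have an average speed in meters per second, and the thresholds are illustrative guesses (a real engine would also look at accelerometer patterns to separate, say, skateboarding from walking).

```python
def classify_activity(avg_speed_mps: float) -> str:
    """Roughly guess an activity from average speed in m/s.

    Thresholds are illustrative only; a production system would
    combine speed with motion-sensor patterns over time.
    """
    if avg_speed_mps < 2.5:    # typical walking pace is ~1.4 m/s
        return "walking"
    elif avg_speed_mps < 8:    # casual cycling is roughly 4-7 m/s
        return "biking"
    elif avg_speed_mps < 70:   # highway driving tops out well under 70 m/s
        return "driving"
    else:                      # sustained speeds above that suggest a plane
        return "flying"
```

Feed it a GPS-derived speed and you get a coarse label; the GPS fix supplies the "where," and the clock supplies the "when."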
It can also auto-categorize locations as "home," "work," "restaurant," "gym" and other groupings, and provide you with statistics about, for example, how many hours you spend at work each month.
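Once visits are labeled with a category, the statistics part is simple aggregation. Here's a hedged sketch with made-up visit records (the data shape is an assumption, not Placeme's actual format):

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical visit log: (category, arrival, departure)
visits = [
    ("work", datetime(2012, 6, 1, 9, 0), datetime(2012, 6, 1, 17, 0)),
    ("work", datetime(2012, 6, 4, 9, 0), datetime(2012, 6, 4, 18, 0)),
    ("gym",  datetime(2012, 6, 4, 19, 0), datetime(2012, 6, 4, 20, 0)),
]

# Sum the hours spent at each category of place.
hours_by_category = defaultdict(float)
for category, start, end in visits:
    hours_by_category[category] += (end - start).total_seconds() / 3600

print(hours_by_category["work"])  # 17.0
```

Run the same sum per calendar month and you get exactly the "hours at work each month" statistic described above.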
By combining GPS, light sensor and temperature data, the software can tell if you're inside or outside.
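A simple way to combine those signals is majority voting. This sketch is purely illustrative -- the thresholds, the voting scheme, and the assumption that you know the outdoor temperature (say, from a weather service) are all mine, not Alohar's:

```python
def probably_outdoors(gps_accuracy_m: float, lux: float,
                      ambient_temp_c: float, outdoor_temp_c: float) -> bool:
    """Heuristic indoor/outdoor guess from three weak signals.

    All thresholds are illustrative assumptions.
    """
    good_fix = gps_accuracy_m < 10                      # GPS locks better with open sky
    bright = lux > 1000                                 # daylight dwarfs indoor lighting
    temp_match = abs(ambient_temp_c - outdoor_temp_c) < 3  # phone temp tracks outside air

    # Any two of the three signals agreeing is good enough for a guess.
    return sum([good_fix, bright, temp_match]) >= 2
```

No single sensor is reliable on its own -- GPS drops out under tree cover, light sensors sit in pockets -- which is why combining several weak signals beats trusting one.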
The technology has a practical emergency application: It can tell if you've been in a car accident. If your deceleration is sudden enough, you've almost certainly been in a collision. It would be theoretically possible to develop an app that calls 911 when you've been in a crash and relays information about your location and perhaps some details of the incident.
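The crash check itself is just a deceleration threshold. A back-of-the-envelope sketch, with an illustrative threshold (the 4 g cutoff is my assumption; a real app would tune it carefully to avoid false 911 calls from hard braking or a dropped phone):

```python
G = 9.81  # standard gravity, m/s^2

def likely_collision(speed_before_mps: float, speed_after_mps: float,
                     interval_s: float, threshold_g: float = 4.0) -> bool:
    """Flag a probable crash when deceleration exceeds a g-force threshold.

    The 4 g default is an illustrative assumption, not a vetted value.
    """
    decel = (speed_before_mps - speed_after_mps) / interval_s  # m/s^2
    return decel / G >= threshold_g
```

Going from highway speed to a stop in a fifth of a second works out to more than 12 g -- far beyond anything braking can produce -- while ordinary hard braking stays well under 1 g.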
The system can also out-Siri Siri -- theoretically. Virtual personal assistants in the age of Siri derive value from understanding regular human speech. But tomorrow's assistants will earn their keep by understanding actual human behavior.
Note that every data point gathered can be connected to a universe of useful information. For example, by knowing where you drive and also where you tend to buy gas, an app could suggest cheaper gas at stations near the routes you drive on a regular basis.
There's another "ambient awareness" app called Highlight that tells you when someone you know -- or someone who knows someone you know -- is nearby.
A combination of a system like Alohar's and an app like Highlight could tell you how you're connected to the strangers around you (those who also use the same app). Here's a guy who goes to the same gym you do. There's a woman whose kids go to the same school as your kids. One of your neighbors works in a building next to your workplace -- maybe you should carpool.
It's impossible to predict what creative developers could do with this service.
The combination of ambient sensing, powerful analytics, strong encryption and good privacy policies could usher in a world where our smartphones can be more than just handy gadgets. They could be life-transforming tools that enable us to use our own data for our own purposes.
Are you ready to quantify your life?