Designing for Sensors

June 4, 2009

I should have written this post a long time ago. But after listing the sensors available in Apple's iPhone in my Digital Product Platforms write-up, it seemed like the right time to revisit the topic.

In his WebVisions 2007 keynote, David Pescovitz discussed what may happen when sensing applications become more ubiquitous:

  • Three waves of technology: computing (increasing productivity, processing data), communicating (email, IM, etc.), and sensing (sensors embedded across the world).
  • In the next decade these three curves will come together. Most objects will have communication, computing, and sensing capabilities.
  • Sensor networks will serve as the eyes and ears of the geo-world and provide information relevant to wherever you are. This means the end of cyberspace as a place you go; instead, it becomes a seamless overlay on the world.
  • Zillionics: zillions of unrelenting rivers of sensory data coming at you day and night.
  • The fourth wave (after computing, communications, and sensing) is sense-making: helping us to deal with an overload of info.

The quantity of sensors embedded in Apple's iPhone makes it pretty clear we are entering the "sensing" wave of technology (a short code sketch of reading two of these follows the list):

  • Location sensor: precise location coordinates from GPS
  • User orientation sensor: directional heading from a digital compass
  • Touch sensors: multi-touch input from one or more simultaneous gestures
  • Light/dark sensor: ambient light detection
  • Device orientation & motion sensor: orientation and movement from the built-in accelerometer
  • Proximity sensor: device closeness to other objects or people
  • Audio sensor: input from a microphone
  • Image & video sensors: capture/input from a camera (all kinds of signals can be found in real-time visual information)
  • Device sensor: detection of nearby devices via Bluetooth
  • Audio broadcast sensor: FM transmitter (rumored on iPhone)
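
To make the list concrete, here is a minimal sketch of reading two of these sensors on an iPhone: location via Core Location and accelerometer data via Core Motion. It's written in modern Swift rather than the Objective-C of 2009, and the `SensorReader` class name and the 0.1-second update interval are illustrative assumptions, not an official pattern.

```swift
import CoreLocation
import CoreMotion

// A minimal sketch of reading two iPhone sensors:
// location (GPS) via Core Location, motion via Core Motion.
final class SensorReader: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()

    func start() {
        // Location sensor: ask for permission, then stream coordinates.
        // (A real app also needs an NSLocationWhenInUseUsageDescription
        // entry in Info.plist.)
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()

        // Device orientation & motion sensor: poll the accelerometer.
        if motionManager.isAccelerometerAvailable {
            motionManager.accelerometerUpdateInterval = 0.1 // seconds (assumed)
            motionManager.startAccelerometerUpdates(to: .main) { data, _ in
                guard let a = data?.acceleration else { return }
                print("acceleration x:\(a.x) y:\(a.y) z:\(a.z)")
            }
        }
    }

    // Core Location delegate callback delivering fresh GPS fixes.
    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let loc = locations.last else { return }
        print("lat:\(loc.coordinate.latitude) lon:\(loc.coordinate.longitude)")
    }
}
```

A production app would also check the authorization status before trusting those callbacks, but the point here is how little code stands between a design idea and a live sensor stream.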

Soon, instead of designing for computing (productivity and data management) or communication (email, social networks), we'll be designing for sensors.