MIX10: Developing Natural User Interfaces

March 18, 2010

Joshua Blake's Developing Natural User Interfaces with Microsoft Silverlight and WPF 4 Touch session at Microsoft's MIX10 conference provided an overview of Natural User Interfaces (NUI) and their core building blocks.

  • NUI in terms of input: multi-touch, voice, motion-tracking, and stylus. Using these input types alone does not make an application a NUI.
  • “NUI exploits skills that we have acquired through a lifetime of living in the World” – Bill Buxton. These are skills learned in the real world by interacting with people, objects, and our environment, and many of them people have simply because they are human. How can we reuse these skills?
  • Skills can be composite or simple. Composite skills are complex, take a long time to learn, and are difficult to carry over to new contexts. Simple skills can be learned through basic observation and are easy to reuse across contexts.
  • NUI is designed to reuse existing skills for directly interacting with content.
  • Direct interaction means physical proximity, temporal proximity, and parallel actions. Direct interactions are high frequency (each interaction is smaller in scope, so interactions can happen more often) and contextual (you do not see everything you can do at all times, only what you need).
  • What is the NUI equivalent of WIMP (windows, icons, menus, pointer)? OCGM: objects, containers, gestures, manipulations.
  • Objects are the content in an interface. Containers make relationships between content explicit. Gestures are body motions that have meaning; they are discrete, meaning a gesture must be completed before it can be interpreted, so feedback does not come until the gesture is finished. Manipulations are direct and give continuous feedback (see the sketch after this list).
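
In the WPF 4 touch API the session is built around, this distinction maps naturally onto the manipulation events: ManipulationDelta fires repeatedly while the fingers are still moving, so the content can follow them immediately, whereas a gesture is only interpreted once the motion has ended. The following is a minimal, hypothetical sketch of the continuous-feedback side; it assumes a window whose XAML declares a Rectangle named card.

    using System.Windows;
    using System.Windows.Input;
    using System.Windows.Media;

    // Hypothetical code-behind; "card" is assumed to be a Rectangle named in the window's XAML.
    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();

            // Opt the element into WPF 4 manipulation processing.
            card.IsManipulationEnabled = true;
            card.RenderTransform = new MatrixTransform();

            // Use the window as the coordinate space for the manipulation,
            // so transforming the element itself does not disturb the reported deltas.
            card.ManipulationStarting += (s, e) => e.ManipulationContainer = this;

            // ManipulationDelta fires repeatedly while fingers move: continuous feedback.
            card.ManipulationDelta += OnManipulationDelta;
        }

        private void OnManipulationDelta(object sender, ManipulationDeltaEventArgs e)
        {
            var transform = (MatrixTransform)card.RenderTransform;
            Matrix matrix = transform.Matrix;

            // Apply the incremental rotation, scale, and translation reported since
            // the last event so the content stays under the user's fingers.
            matrix.RotateAt(e.DeltaManipulation.Rotation,
                            e.ManipulationOrigin.X, e.ManipulationOrigin.Y);
            matrix.ScaleAt(e.DeltaManipulation.Scale.X, e.DeltaManipulation.Scale.Y,
                           e.ManipulationOrigin.X, e.ManipulationOrigin.Y);
            matrix.Translate(e.DeltaManipulation.Translation.X,
                             e.DeltaManipulation.Translation.Y);

            transform.Matrix = matrix;
        }
    }

A discrete gesture handler, by contrast, would act only after the full stroke has finished (for example in ManipulationCompleted), which is exactly why gestures cannot give feedback mid-motion.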