In his presentation at User Interface 19 in Boston, MA in 2014, Stephen Anderson talked about the onset of ubiquitous information sources and their impact on user experience design practices. Here are my notes from his talk: The Architecture of Understanding.
- With all the information & complexity in the world... how do we make sense of information and turn it into understanding?
- Information is cheap, understanding is expensive. Examples of problems that require understanding: 401(k)s, reward programs, selecting wines, cheese, electronics, health insurance bills, and much more.
- What can help create understanding? Visual design/visualizations and interactions. Our brains understand patterns that interactive elements can clarify through change/motion. Cause & effect can be made visible with visuals/interactions.
- When, where and how does understanding occur? Through interactions directly with information that help reveal patterns.
- Even simple interaction improves performance & reduces mistakes. Example: using your hands to manipulate a pile of money allows you to count it faster than if you can only view the pile.
- As children, we learn from interactions with the world around us. We are always thinking through doing. Playful interactions reveal previously unseen patterns.
- Embodied interactions: everything your brain knows, it knows because of your body (your senses).
- Pragmatic actions (bring us closer to our goals) vs. epistemic actions (use the world to improve cognition).
- Timeless principles like annotating, animating, filtering, chunking, collecting, etc. will last longer than any specific technology. Build your skills in these core areas and apply them to current & upcoming tech to aid understanding.
- External representations: things that can extend our thinking/abilities.
- How does emerging technology aid in understanding? Example: prescription bottles that glow when you need to take a dose.
- The line between digital and physical is blurring. Sensors are now showing up everywhere in sports equipment, wearables, and more. They allow our environment to be information rich.
- The world is becoming our information environment. We're moving from "Human-Computer Interaction" to "Human-Information Interaction."
- Examples: examining the physical properties of objects with a device, umbrellas that glow with the changing weather forecast, and many more! Adafruit is a great source for learning about sensors.
- Think sensors, not devices. Peel devices back and understand how they do what they do. Sensors can provide a set of building blocks for hardware (see the sketch after these notes).
- Don't be afraid to play. The cost of entry is really, really low.
- Internal representations: cognition in the brain takes reality (the world around us), filters it through our senses, and creates a simulation (the world we perceive). This is how we experience things.
- Our brains will always try to make sense of juxtapositions. Perception is not a process of passive absorption; it's a process of active construction based on our prior experiences and memories.
- We match patterns based on what we've seen before. Visuals, words, and more can change our perception through priming.
- Change the way you think about doing UX work. Before technology, we always had representations and interactions with the world around us. We're moving past connected devices into information environments.
- We need to move from a product focus to an experience focus. Use service design to determine how everything fits together. Tools like journey maps and user lifecycles can help illustrate the big picture. Display these publicly to bring different domains and disciplines together.
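
To make the "think sensors, not devices" idea concrete, here's a minimal sketch of the glowing-umbrella example: take one information source (a rain forecast) and map it onto one ambient output (an LED). The `read_rain_probability` and `set_led_brightness` functions are hypothetical placeholders I've added for illustration, not anything Anderson showed; in a real build they would wrap a weather API call and a microcontroller's PWM pin, the kind of building blocks Adafruit's tutorials cover.

```python
import random
import time


def read_rain_probability() -> float:
    """Hypothetical stand-in for a forecast lookup; returns a value between 0 and 1."""
    return random.random()


def set_led_brightness(level: float) -> None:
    """Hypothetical stand-in for driving an LED; here we just print the level."""
    print(f"LED brightness: {level:.0%}")


def run_ambient_display(poll_seconds: int = 60) -> None:
    """Poll the forecast and turn it into a glanceable, ambient signal."""
    while True:
        chance_of_rain = read_rain_probability()
        # The higher the chance of rain, the brighter the umbrella handle glows.
        set_led_brightness(chance_of_rain)
        time.sleep(poll_seconds)


if __name__ == "__main__":
    run_ambient_display(poll_seconds=5)
```

The point of the sketch is the shape, not the parts: one sensor or data feed, one simple mapping, one physical output that turns raw information into something you can understand at a glance.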