Last night’s BAYCHI panel, “User Research Strategies: What Works, What Does Not Work,” featured Sheryl Ehrlich from Adobe, Christian Rohrer from eBay, Klaus Kaasgard from Yahoo!, Kaaren Hanson from Intuit, and Maria Stone from Google.
The overarching consensus among the panelists on the positioning and applications of user research brought to light the fact that user research has become a mature discipline. The same methodologies and philosophies were present at Adobe, Intuit, eBay, and Yahoo!, and at an earlier stage at Google. Each of the panelists described how they were moving beyond tactical validation of product concepts (i.e., usability testing alone) toward more strategic contributions that directly impact corporate objectives.
Sheryl from Adobe outlined the phases of growth her user research group has been through: initial usability testing of products to “get your foot in the door”; early research to help set the direction for products; and exploratory research to identify new opportunities. This cycle closely matched the experience of the other panelists. Most seemed to be established partners in the early research phase and were becoming more involved with exploratory research (traditionally the domain of corporate strategy and market research groups). Google was a notable exception, as the group there was still making inroads through tactical research. They seemed to face a unique challenge in Google’s corporate culture, where user research is not often leveraged to generate the next big idea.
Kaaren from Intuit discussed the strategies that have worked in her research group: aligning with business strategies; setting concrete user goals to establish a shared vision up front; tracking design iterations against those goals; and savoring surprises (unexpected data that sometimes leads to big breakthroughs).
Some of the challenges the panel outlined included compiling customer research data from various groups; managing short-term (often tactical) and long-term (often strategic) research simultaneously; and documenting the impact of great design on key business outcomes.
Several interesting questions raised during the Q&A did not get full answers. Steve Portigal pointed out that many of the processes described by the panel were essentially design methodologies and asked whether we were approaching a point where design and research might meld into a unified discipline. Though the panel didn’t believe that was the case, the prevalence of methodologies where cross-disciplinary teams “hit the streets” for insights that fuel their design concepts certainly supports Steve’s point.
Related to Steve’s question was a topic I raised: given that user research groups conduct thousands of usability tests (2,000 per year at Intuit) and hundreds of field studies, surveys, and more, how do they communicate their findings to stakeholders in a clear and concise manner? The panel’s answer was “by bringing the stakeholders in during the data gathering and analysis stages.”
My personal experience, however, shows that it is often hard to get key stakeholders to participate in the user research phases of a project. They are often too busy or don’t find direct participation compelling enough to warrant their involvement. As a result, I’m often pressed to develop artifacts that communicate the insights that informed my product designs. These include video clips, diagrams, narratives, and yes, PowerPoint decks. I even taught a class at eBay on design artifacts that focused on getting buy-in from stakeholders before developing screen designs. The artifacts I highlighted summarized the implications of user research and outlined concrete directions for product design.
Perhaps the intersection point between user research and design that Steve was alluding to is a design communicator role: someone responsible for communicating the insights and implications of user research via storytelling and information design principles. Having played that role on nearly every interface design project in my career, I was confounded that none of the panelists had a methodology in place for it.