“We are HUGE advocates of the top task testing methodology” (Jeanne Quinn, Cisco)

At Cisco, we’ve wholeheartedly adopted the “top task management” approach in most of what we do for web and mobile. It has made a tremendous difference in everything from our support experiences to how we design navigation across our sites for tasks related to buying, product evaluation, consumption, and more.

The most useful thing about a “top tasks” methodology for us is that it gives us a way to measure and manage progress on the usability and effectiveness of key experiences over time. It lets us focus continually on improving the interactions our customers, partners, and other key audiences have with us.
Martin Hardee, Director for Cisco.com


We’ve been monitoring the performance of the top tasks on Cisco’s support website for more than four years. The Task Performance Indicator (TPI) is becoming a key metric that our executives track because it is an actionable measure of the customer experience.

Scores correlate much more closely with our efforts to reform the site than other metrics do (such as the Customer Satisfaction score, which is typically a lagging indicator of customer experience). We’ve been able to show improvement in the score, largely because the study reveals not only the impact of impediments to task completion but also the nature of those impediments, along with comments from the users. These details lead to specific improvements, and our repeated testing cycles drive those improvements in an iterative process. It’s not the only tool in our toolbox, but it’s a very important and effective one.
Bill Skeet, Senior Manager for Digital Support Experience


We are HUGE advocates of the top task testing methodology at our company and evangelize it to everyone we talk with – both internally and externally. It has really helped us fight off the ‘bright shiny object’ disease and the tendency for everyone to have an opinion about what we put on our webpages (and where, and how) – because we have data to back it up! Our customers and partners do this (and not that); when we organize content this way, they struggle, and when we organize it that way, they succeed! Clear and simple. Not easy to do, mind you, but it’s a fact-based approach that wins over colleagues and executives alike every time and clears the way for us to make significant changes (and to prove whether or not they are working for the most important folks – our users).
Jeanne Quinn, Sr. Mgr, Digital & Social Experience, Global Partner Marketing, Cisco


As part of the team at Cisco involved with user research and the Top Task Analysis Framework, I have seen the effectiveness of this approach. In my discussions with stakeholders and other researchers across Cisco, I emphasize what I feel are the two core questions for anyone working online:

1. What are your users trying to do? (NOT the same as what they are actually doing)
2. How well are they able to do what they are trying to do?

It is vitally important to make the distinction between “what users do (i.e. click on)” and “what they are trying to do.” When I talk to stakeholders, they often treat web metrics data as synonymous with “what users are trying to do” (if they have any data at all). That is a dangerous position, since it assumes that all users are accomplishing what they set out to do in the first place.

The beauty of the Top Task Analysis Framework is that it measures what users are ACTUALLY trying to accomplish and whether they are able to succeed at those tasks, rather than simply observing what users currently do as they struggle through a potentially confusing interface. That’s why I promote the Top Task Analysis Framework as a “problem discovery and measurement” technique. It allows us to determine the Big Three:

1. SUCCESS/FAILURE: Are users successful in doing what they are trying to do?
2. SPEED: Are they efficient? (Can they do it quickly?)
3. CONFIDENCE: Are they confident that they were successful, or are they unsure of the result?

When you combine those numerical scores with the visceral experience of a stakeholder watching user after user struggle with key tasks, you have a research methodology that can really promote change within an organization – as it has done at Cisco.
Jeffrey Davis, Ph.D., User Research Lead, Cisco Systems
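

For readers who want to see how the “Big Three” translate into data, the following is a minimal sketch (in Python) of one way per-participant observations of success, speed, and confidence might be recorded and aggregated for a single task. The TaskAttempt structure, its field names, the sample numbers, and the “confident failure” aggregate are illustrative assumptions for this sketch; they are not Cisco’s data or the actual Task Performance Indicator formula.

```python
from dataclasses import dataclass
from statistics import median

@dataclass
class TaskAttempt:
    completed: bool   # did the participant reach the correct result?
    seconds: float    # time taken on the task
    confident: bool   # did the participant believe they had succeeded?

def summarize(attempts: list[TaskAttempt]) -> dict:
    """Aggregate the 'Big Three' for one task: success, speed, confidence."""
    n = len(attempts)
    failures = [a for a in attempts if not a.completed]
    return {
        "success_rate": sum(a.completed for a in attempts) / n,
        "median_seconds": median(a.seconds for a in attempts),
        # Failures where the participant still felt confident are arguably
        # the worst outcome: the user walks away misinformed.
        "confident_failure_rate": sum(a.confident for a in failures) / n,
    }

# Hypothetical observations of five participants attempting one top task.
observations = [
    TaskAttempt(completed=True,  seconds=95,  confident=True),
    TaskAttempt(completed=True,  seconds=140, confident=True),
    TaskAttempt(completed=False, seconds=300, confident=True),   # confident failure
    TaskAttempt(completed=True,  seconds=80,  confident=False),  # unsure success
    TaskAttempt(completed=False, seconds=210, confident=False),
]
print(summarize(observations))
# {'success_rate': 0.6, 'median_seconds': 140, 'confident_failure_rate': 0.2}
```

Tracking these three numbers for the same set of top tasks over repeated test rounds is what makes the results comparable over time, in the spirit of the measurement cycles described above.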