It's time for the world to be usable. People are ready.
Users are no longer passively frustrated when things don't work. They regularly suggest improvements. They use words like "usable" and "citizen-friendly" and even "task flow." They don't just crib about time lost to inefficient products anymore. They do the math. But not on their cell phones.
Today we know that consumers evaluate and select both products and services based on the user-friendliness of an interface.
But it gets even better. Executives have discovered the value of usability. You hear the word "usability" in elevators all the time.
It's clear from those overheard conversations that executives who understand that usability can be a strategic differentiator don't always grok the practical details of what is involved. But that's not important. All the gurus agree (!) that the first step in making usability routine is getting the support of an executive champion.
If the executives back it – blink! – usability matters.
User-centered design is being systematically integrated into the Web, application and product development process. It's the tipping point usability specialists have been waiting for. But are we ready? Does the field have the tools and resources – or for that matter the people – to keep up with the need?
To keep up with the need, usability needs to do two things.
First, usability needs to transition away from the can-you-believe-it? high-cost boutiquey market that defines the industry today. If organizations are really going to adopt and embed usability in their day-to-day processes, it can't be guru-expensive.
There aren't really enough gurus to go around anyway. So to pick up the slack, the field needs to evolve industry standards with common practices*, tools and resources that support scalability. Sure, we probably should keep our gurus, but we also need to create a legion of practitioners who can do the work. Usability needs to become a practice, not just an art.
This means that the industry needs to agree upon both what it is we are doing when we "do usability" and how we should go about doing it. User-centered methods should guide practitioners in collecting and analyzing user data to support informed design decisions. The methodologies need to be robust and replicable. Applying the same method in the same environment should yield a similar (though not necessarily identical) result.
Let's make it concrete. If usability is to scale, our understanding of what usability IS and how to do it has to be consistent enough so that different organizations asked to evaluate the same application will return roughly the same list of challenges and recommendations. Usability is at least that evolved, right? There are variations on the methodological theme, but does the output vary that much?
You may be surprised. Even a task as (seemingly) transparent as usability testing Microsoft's hotmail.com elicited different data based on different approaches to usability testing.
Molich, Ede, Kaasgaard and Karyukin (2004) report on the findings of the Comparative Usability Evaluation Study (CUE-2). This meta-analysis describes the usability testing approaches and results across nine independent usability groups asked to conduct a "standard" usability test of hotmail.com. The teams included six industry labs, two university-based teams with commercial activities and two student teams. Each team was provided the same project background information and access to a "Marketing Liaison" for further clarification or feedback on their proposed methods.
Molich and colleagues compared and contrasted the usability testing approach, usability problems discovered, and reporting of findings across the teams. Their finding is jarring: "…our simple assumption that we are all doing the same and getting the same results in a usability test is plainly wrong" (p. 65).
The details – particularly if you think of usability testing as a process-driven task – are equally jarring:
Usability teams ranged from one to seven members in size, and spent from 16 to 218 hours conducting the test.
Selection of Method
Eight out of the nine participating teams used some variation on think-aloud testing to conduct the usability review. The commonalities largely end here.
The various teams tested 6.6 participants on average, with a range from 4 to 50 across the teams. (The team testing 50 participants used a semi-structured exploration/questionnaire approach with no direct observation of users completing tasks.)
Interacting with the "client"
Only two of the nine teams solicited client input beyond the initial briefing during the usability testing project.
Developing the testing protocol
The project briefing provided to each team outlined 18 features that the Hotmail team indicated could be enhanced through user input. Five were listed as top priority.
Despite this client-based direction, the overlap in tested tasks was limited: 51 different tasks appeared on the testing protocols. Only two usability testing tasks were common across all of the teams (Register, send someone e-mail). 25 (49%) of the tasks tested were proposed by only one team.
Leading the witness...
Eight of the nine testing teams used leading questions on their testing protocols. Leading questions are questions that contain hidden instructions or cues, such as "Create a personal signature" in a context where the user needs to click on a link with the word "signature" in it. Leading questions test participants' ability to recognize keywords rather than their ability to complete the task.
Usability problems uncovered
The usability teams reported from 10 to 149 problems. No single usability problem was reported by all nine testing teams. One problem was reported by seven of the nine teams.
For the two tasks that were tested by all teams, 232 unique problems were reported. 75% of those problems were reported by only one of the teams.
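The "reported by only one team" figure is plain set arithmetic over the teams' problem lists. Here is a minimal sketch of that kind of overlap analysis in Python – the team names and problem IDs are invented placeholders, not the study's data:

```python
# Cross-team overlap analysis, in the spirit of CUE-2's comparison.
# All data below is made up for illustration.
from itertools import chain
from collections import Counter

team_reports = {
    "team_a": {"p1", "p2", "p3"},
    "team_b": {"p2", "p4"},
    "team_c": {"p2", "p5", "p6"},
}

# Count how many teams reported each distinct problem.
counts = Counter(chain.from_iterable(team_reports.values()))

unique_problems = len(counts)  # total distinct problems across all reports
singletons = sum(1 for c in counts.values() if c == 1)  # found by one team only

print(unique_problems)  # 6 distinct problems
print(singletons)       # 5 of them reported by a single team
```

With nine real reports instead of three toy ones, the same two numbers give the study's headline statistic: the fraction of all discovered problems that only a single team found.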
Reporting the findings
The reports also contained many violations of best practices in usability testing reports (see Dumas and Redish, 1999).
Quality of findings
Two interesting findings emerge. First, student reports are not easy to distinguish from professional reports.
Second, the results of the one indirect testing team differed from those of the direct testing teams. The indirect testing team reported far fewer problems than the direct observation teams. This group also failed to observe the one serious problem that was identified by seven of the eight remaining teams. Molich observes, "Unattended testing didn't lead to any more (in fact, quite a bit less) reported problems and didn't provide insights that other methods [missed]" (p. 73).
Just as in interface design, consistency is critical to the success of usability as a field. If the task of usability testing is this inconsistent, what can that mean for user-centered analysis and design projects?
Molich and colleagues' findings suggest there is significant variability in execution and findings across the task of usability testing. These were (mostly) professional-level groups. They proactively volunteered to be evaluated. One can only assume they set out to present their best work in this very public venue.
The world may be ready for usability. Molich's study indicates that there is still a lot of art in the science of usability testing. But if usability is an art, can that art be made routine?
*Note that people like Mary Theofanos at organizations like NIST are working on this.
Dumas, J.S. & Redish, J.C. (1999). A practical guide to usability testing (revised edition). Bristol, UK: Intellect.
Molich, R., Ede, M.R., Kaasgaard, K. & Karyukin, B. (2004). Comparative usability evaluation. Behaviour & Information Technology, 23(1), 65-74.