Usability testing and user experience research typically take place in a controlled lab with small groups. While this type of testing is essential to user experience design, more companies are also looking to test large sample sizes so they can compare data across specific user populations and see how experiences differ between user groups. But few usability professionals have experience in setting up these studies, analyzing the data, and presenting it in effective ways. Online usability testing offers a solution by allowing testers to elicit feedback simultaneously from thousands of users. Beyond the Usability Lab offers tried and tested methodologies for conducting online usability studies. It gives practitioners the guidance they need to collect a wealth of data through cost-effective, efficient, and reliable practices. The reader will develop a solid understanding of the capabilities of online usability testing, learn when it is and is not appropriate to use, and become familiar with the various types of online usability testing techniques.
* The first guide for conducting large-scale user experience research using the internet
* Presents how to conduct online tests with thousands of participants, from start to finish
* Outlines essential tips for online studies to ensure cost-efficient and reliable results
Author(s): William Albert, Thomas Tullis, Donna Tedesco
Edition: 1
Publisher: Morgan Kaufmann
Year: 2010
Language: English
Pages: 328
Cover Page......Page 1
Front matter......Page 2
Copyright......Page 4
Preface......Page 5
Acknowledgments......Page 7
Dedication......Page 9
Tom Tullis......Page 10
Donna Tedesco......Page 11
Introduction......Page 12
What Is an Online Usability Study?......Page 13
Strengths and Limitations of Online Usability Testing......Page 16
Measuring the user experience......Page 17
Focusing design improvements......Page 18
Insight into users’ real experience......Page 19
Limitations of online usability testing......Page 20
Combining Online Usability Studies with Other User Research Methods......Page 21
Usability lab (or remote) testing......Page 22
Focus groups......Page 23
Web traffic analysis......Page 24
Organization of the Book......Page 25
Target Users......Page 27
What are the users’ primary goals in using the product?......Page 28
Comprehensive usability or user experience study......Page 29
Live site vs. prototype comparison......Page 30
Feature- or function-specific test......Page 31
Between-Subjects versus Within-Subjects......Page 32
Task effects......Page 33
Task times (also known as “time on task”)......Page 34
Clickstream data......Page 35
Comments or verbatims......Page 36
Technology costs......Page 37
People time......Page 38
Study A......Page 40
Study B......Page 41
Study C......Page 42
Study D......Page 43
True intent intercept......Page 44
How they work......Page 45
Integrated services......Page 46
Quality of panelists......Page 47
Emailing......Page 49
Friends, family, and co-workers......Page 51
Number of participants......Page 52
Sampling techniques......Page 53
Participant Incentives......Page 55
Summary......Page 56
Introducing the Study......Page 58
Purpose, sponsor information, motivation, and incentive......Page 59
Time estimate......Page 60
Technical requirements......Page 61
Legal information and consent......Page 62
Instructions......Page 64
Types of screening questions......Page 65
Misrepresentation checks......Page 66
Starter Questions......Page 67
Product, computer, and Web experience......Page 68
Expectations......Page 69
Reducing bias later in the study......Page 70
Making the task easy to understand......Page 71
Writing tasks with task completion rates in mind......Page 73
Anticipating various paths to an answer......Page 76
Multiple-choice answers......Page 78
Including a “none of the above” option......Page 80
Including a “don’t know” or “give up” option......Page 81
Randomizing task order and answer choices......Page 82
Self-generated and self-selected tasks......Page 83
Self-reported task completion......Page 86
Self-reported data......Page 89
Open-ended responses......Page 91
Overall rating scales......Page 92
Overall assessment tools......Page 94
Open-ended questions......Page 95
Demographic questions......Page 96
Special Topics......Page 97
Progress indicators......Page 98
Speed traps......Page 99
Summary......Page 100
Pilot Data......Page 102
Technical checks......Page 103
Usability checks......Page 106
Full pilot with data checks......Page 107
Preview of results......Page 110
Finding the right time to launch......Page 111
Singular and phased launches......Page 113
Monitoring Results......Page 114
Summary......Page 115
Downloading/Exporting Data......Page 116
Data Quality Checks......Page 117
Removing Participants......Page 118
Mental cheaters......Page 119
Tips on removing participants......Page 120
Outliers......Page 121
Removing a task for all participants......Page 122
Success data......Page 123
Time variables......Page 124
Self-reported variables......Page 125
Clickstream data......Page 126
Summary......Page 128
Data Analysis and Presentation......Page 129
Binary task success......Page 130
Breakdown of task completion status......Page 132
Calculating task success rates......Page 135
Confidence intervals......Page 136
Task times......Page 139
All task times or only successful times?......Page 140
Mean, median, or geometric mean?......Page 142
Confidence intervals......Page 143
Number of tasks correct per minute......Page 144
Rating scales......Page 147
Top-2-box scores......Page 149
Task-based comments......Page 154
Open-ended questions at the end of the study......Page 156
Overall assessment tools......Page 160
Clickstream Data......Page 162
Correlations and Combinations......Page 167
Correlations......Page 168
Combinations (or deriving an overall usability score)......Page 171
Segmenting by participants......Page 174
Segmenting by tasks......Page 176
Analysis of errors......Page 177
Analysis of comments......Page 178
Comparing alternative designs......Page 179
Use confidence intervals and t tests......Page 180
Set the stage appropriately......Page 181
Tell a story......Page 182
Make details available without boring your audience......Page 183
Summary......Page 184
Building Your Online Study Using Commercial Tools......Page 186
Creating a study......Page 188
From the participant’s perspective......Page 189
Data analysis......Page 190
Summary of strengths and limitations......Page 191
RelevantView......Page 194
Creating a study......Page 195
From the participant’s perspective......Page 197
Data analysis......Page 198
UserZoom......Page 201
Creating a study......Page 202
From the participant’s perspective......Page 205
Data analysis......Page 206
Summary of strengths and limitations......Page 207
Creating a study......Page 208
From the participant’s perspective......Page 211
Data analysis......Page 217
Summary of strengths and limitations......Page 218
Checklist of Questions......Page 219
Summary......Page 222
The Basic Approach......Page 223
Measuring Task Success......Page 225
Ratings for Each Task......Page 228
Conditional Logic for a Comment or Explanation......Page 229
Task Timing......Page 231
Randomizing Task Order......Page 232
Positioning of Windows......Page 234
Random Assignment of Participants to Conditions......Page 240
Summary......Page 242
Case Studies......Page 244
Access Task Survey tool......Page 245
Methodology......Page 246
Results......Page 248
Discussion and conclusions......Page 254
Using Self-Guided Usability Tests During the Redesign of IBM Lotus Notes......Page 255
Tasks......Page 256
Participants......Page 257
Self-guided usability testing: Discussion and conclusions......Page 259
Limitations......Page 260
Lessons learned......Page 261
Biographies......Page 262
Why a longitudinal study design......Page 263
Respondent recruiting and incentives......Page 264
Results and discussion......Page 265
Nomenclature analysis findings......Page 266
Content, features, and functions......Page 267
Interactive quality findings......Page 268
Conclusion......Page 269
References......Page 270
An Automated Study of the UCSF Web Site......Page 271
Methodology......Page 272
Results and discussion......Page 275
Conclusions......Page 276
Online Usability Testing of Tax Preparation Software......Page 277
Results and discussion......Page 278
Advantages and challenges......Page 281
Biographies......Page 282
Why online usability testing?......Page 283
Recruiting......Page 284
Study mechanics......Page 286
Limitations......Page 287
Metrics and data......Page 288
Getting results heard and integrated......Page 289
Using Online Usability Testing Early in Application Development: Building Usability in From the Start......Page 290
Project background......Page 291
Creating the usability study environment......Page 292
Methodology......Page 293
Results and discussion......Page 295
Study limitations and lessons learned......Page 299
Biography......Page 300
What is your goal?......Page 301
Think Outside of the (Web) Box......Page 302
Compare Alternatives......Page 303
Consider the Entire User Experience......Page 304
Explore Data......Page 305
Sell Your Results......Page 306
Trust Data (Within Limits)......Page 307
You Don’t Have to be an Expert—Just Dive in!......Page 308
References......Page 309
E......Page 311
O......Page 312
S......Page 313
W......Page 314