Any software developer who cares about making great software should be testing their interfaces on their users (or people similar to their users). It's easy to convince yourself you've built a great UI if you only ever watch people on the team use it. Teammates have intimate knowledge of how the system is supposed to work, so they nearly always succeed.
The real test comes when you watch someone who has never seen your system before attempt a few loosely defined tasks you've set out for them. These usability tests not only reveal where your system fails to match what typical users expect; watching users struggle with something you wrote is also a strong motivator to improve it. For that reason, it's important to run usability tests as often as possible, and to let as many people watch as possible.
At Lanit, we’ve attempted to do usability tests a number of ways:
We've done full-scale tests that record mouse tracking and video in addition to audio. These take a ton of time to set up, to edit and summarize, and to share. And the mouse tracking and video didn't seem to add much beyond what the user was saying (assuming the moderator prompted for their thoughts often). That time investment meant we ran them at most once or twice a year.
We've tried simpler screencast-and-voice recordings at a local university. This was an improvement, but it still took time to schedule a room, and we still had to bring the videos back to the office to share, or spend time summarizing and editing them. It also ate up half a day for two people to find participants and set up the tests. And it was a bit awkward to approach people and ask to record them testing your software. We usually managed to do these every few months.
We've also tried bringing users into our office, so we could share the screen and voice live and have an immediate discussion afterward (similar to what Steve Krug suggests here: http://network.businessofsoftware.org/video/steve-krug-on-the-least-you). This was an improvement too, but it still required a time investment to find willing users to come to us. We only tried it once, but given the effort of advertising and recruiting, it felt like we would still run at most one test a month.
Finally, we decided to try one of the new services that have recently popped up. Many of these services do some form of usability testing for you, but we narrowed it down to usertesting.com and trymyui.com because they seemed to best emulate what we were doing live (the only real difference being that the testers are screened and trained by the site to say what they are thinking, instead of being prompted by a live moderator). I chose trymyui mainly because I liked the results of my first (free) test, and it was slightly cheaper ($25/user/test instead of $39). All we had to do was write the script of tasks for them to accomplish (maybe 10 or 15 minutes of work) and request testers; usually within an hour or two we had a great recording of the user's screen and voice. I had one experience where a video didn't arrive for about two days, but their support was very helpful and explained that they pre-screen the results to ensure the testers attempted the tasks (a bonus, since you're basically guaranteed a good video for your $25).
We were hoping that going the online-service route would save us a bunch of time, and it did, but an unexpected benefit was that the short turnaround allowed us to do a kind of A/B testing. Before, when we ran live tests, we'd come away from a session of 3-5 tests with 10 or so items we wanted to improve, and we'd put them on a list to address some time in the future. Then we'd have to wait until the next session to see whether we had made a significant improvement and what the next set of problems was.
With trymyui, we could often develop a change to improve the experience and test that change the same day. This made building a great UI even more motivating, because you could often see the next user do better because of the last round of fixes. In the end, we made several improvements in a week that previously would have taken us months. And it was so effortless to set up, and so easy to see the benefits, that I know we'll keep using the site to test our UIs on a regular basis.