Over the years, we’ve done a lot of usability testing. We’ve used in-person, remote, and automated methods. We’ve tested on websites, intranets, and applications. But we’ve rarely tested specifically with mobile devices.
Since we have some upcoming projects where analytics point to a mobile-first design and test strategy, we wanted to give ourselves some first-hand practice before doing the real thing (we like to prototype everything!). With that in mind, we recently held an internal mobile usability testing workshop.
Of course, we informed our workshop with some initial research into known methods. There are many variations, but we decided to test-drive a mobile sled, MailChimp’s “Hug-a-Laptop” method, and a freestyle, hold-the-camera-as-you-talk method. It was both fun and instructive to work together to figure out the pros and cons of each approach. After only a couple of hours, we had a much better idea of when we might try each method, what to watch out for, and how we might adapt things to fit certain circumstances.
The Mobile-Usability-Testing Sled
If you search for mobile usability testing info, you’re bound to see a lot of posts on the mobile sled. They range from crude and sloppy to highly sophisticated. There’s even a commercial product called Mr. Tappy. Because we were experimenting, we went with a DIY tape and spatula rig.
We initially tried having the participant place her device on the tabletop, but that created a few problems. First, we had some trouble with the lighting. To get a decent picture we had to turn the phone brightness down so low that it could be hard for some participants to see the screen comfortably. We’re not sure how much this was due to camera quality or to the lack of light adjustments in the software we used. Most importantly, using a mobile device as it sits on a table is an entirely unnatural interaction.
We accidentally discovered that when the participant held her device (rather than placing it on the table) the lighting trouble went away. The screen was crisp and easily readable on the video. We still had to control the overall lighting, but the picture quality was much better when the device was held. Even better, it was a more natural way to interact with the device. The downside, of course, is that the participant may move her device out of the camera’s field of view.
We’ll also get a more… em… professional setup before running real sessions.
The Hug-a-Laptop Method
One of the difficulties with mobile testing is that there seem to be so few remote options. But way back in 2011, MailChimp blogged about a method they’d tried that is simple and doesn’t require any special equipment. It is, however, a little weird to explain over the phone. You simply connect with a participant via video call (we used GoToMeeting), have her turn her laptop around so the camera points away from her, and ask her to reach around the laptop to use her mobile device.
As we suspected, giving an un-primed participant instructions over the phone proved to be challenging. We contacted a staff member (who had no idea what we were doing) via GoToMeeting and tried to explain how to turn the laptop and where to hold the phone. The initial setup was easy, but explaining how to adjust the laptop lid, or how far to hold the device from the camera, was a little painful.
Our participant clearly understood what we were trying to do and, wanting to be helpful, tried holding the phone quite close to the camera. That tended to cut off the bottom half of the device on the video feed, and made it hard for her to see the phone over the laptop screen. Maybe with practice, we’d get good at prompting the necessary adjustments, but we found the constant requests to adjust the screen or the position of the phone rather distracting.
It’s perhaps obvious but worth stressing that you’ll need to screen participants carefully for this method. They’ll need a laptop on a stable surface, and a hands-free way to connect for audio.
We’d love to have a reliable remote method in our toolbox, so we may work at this one a little more. We think sending out some simple instructions, along with pictures, would help people better understand what’s expected. Since the video feed is the only way for you to see what participants are doing, it’s critical to get the setup right from the outset.
The Freestyle, Hold-the-Camera Method
We’re planning to do some intercept testing on the street, so we decided to experiment with an over-the-shoulder interview and video method. We simply had the facilitator stand near the participant, camera in hand. The goal was to try something entirely informal.
To begin, we had the facilitator positioned a little behind and to the side of the participant. We quickly discovered that the participant’s hand preference determined which side the facilitator should stand on. If the participant is left-handed, you’ll want to be on her right side.
But we also found that there’s no way to share what you’re doing on a mobile device without getting close. Maybe TOO close. Personal space is always a consideration in testing, and small mobile screens just amplify the problem. If you try to sit with someone and have them show you how they use their device, be prepared for some of them to be uncomfortable and resistant.
To solve the personal-space issue, we had an interesting idea. We used a phone as a camera and imagined it on a tripod beside the participant. By mirroring the phone’s screen to a laptop running Reflector (a screen-mirroring app), we turned the phone into a broadcast camera. This setup allowed us to sit across from the participant and have a much more comfortable conversation. Of course, the same thing could be achieved with a Bluetooth camera; we just happened to have phones in our pockets. This is MacGyver doing Agile, after all.
In effect, the experiment led us to a variation on the sled setup, but with the camera in a different position.
Three Takeaways
ONE: It’s great to experiment with new things before you have to do them for real. You can learn a lot from a brief but realistic test. Of course, that’s a cornerstone of our business, so it’s not a surprise. Just some practical confirmation that prototyping works for everything!
TWO: Some of the complications in these methods were specifically related to capturing video. It’s worth considering whether having video justifies the bother. Unless you need evidence to convince people that changes are necessary, you may be able to skip video for a more lightweight, observational approach.
THREE: In the end, we decided to invest in a simple sled-like approach for the future. We’ve ordered a Hue HD camera, which offers a good picture on a flexible arm for a very reasonable price. We’re also going to rig up a little testing cubicle using a trifold presentation board. We’ll add a little top to make a voting-booth-like space that controls the light and limits the participant’s movements.
If you have mobile usability testing advice, questions, or experience to share, drop me a line. There’s always more to learn.