29 Jul 2013 by Lisa Farlow
6 Lessons We Learned While Doing BYODevice Usability Testing

A couple of months ago, we tried out a BYOD (bring your own device) quick-and-dirty usability testing blitz at the office. The idea appealed to us and our client as a way to get a wide range of feedback quickly. The client team came in for the evening and we interviewed 11 participants, taking notes on the fly. We held a workshop the very next morning, and then the project was done!

We did some things right, and some things we’ll have to change the next time we run a night like this. Here are six lessons we learned:

1. Make stage directions

Long before testing day, we brainstormed a list of what needed to be done on test night and created a set of stage directions to plan out the order and time in which everything needed to happen. We all memorized our own tasks and were familiar enough with others’ tasks to step in if needed. Knowing exactly what needed to be done and where everything was located kept the mood from ever becoming frenetic.

Next time: Make a detailed list of instructions. Do some walkthroughs with somebody in the office pretending to be a participant. Try to think of everything that could happen. What if somebody is late or early? What if somebody forgets his or her device? What if somebody brings a kid? What if somebody is an especially loud talker and can be heard outside the testing room? Plan for everything.

2. The more variety, the better

We had a real hodgepodge of laptops, Android phones, iPhones and iPads, and one off-brand tablet none of us had ever seen before. This wide variety of devices provided interesting insights that may not have surfaced had we tested on only one device.

Next time: Set some minimums (e.g., let’s recruit at least four tablet users) and some maximums (e.g., no more than two users with iPhone 5s) when recruiting.

3. BYOD… heavy on the YO

The best part of having people test on a device they are comfortable with is that it eliminates the guesswork about whether their struggles were due to low familiarity with their device or poor UX on our test site. Or at least, in theory…

In our recruiting, we asked participants to bring a device that they own, but we failed to specify that we wanted them to bring a device that they own and use regularly. This led to a few participants bringing in devices with which they weren’t totally familiar. One participant brought in a tablet that he only uses while on vacation, and he hadn’t vacationed in quite a while. Another brought in a laptop that he had purchased for his child. While technically “his”, he had never used it and was unfamiliar with many of the settings and customizations.

Next time: Ask users to bring in a device that they use at least two or three times per week.

4. BYO–fully charged–D

One participant brought a device that was almost out of battery, and she didn’t have her charger with her, so she had to perform the test on one of our office laptops instead.

Next time: Ask participants to fully charge their device beforehand and to bring the charger with them as well.

5. Make a few different batter’s boxes

We had too many client observers for everybody to watch the sessions directly, so we streamed the sessions to clients in another room. We taped off a box area on the desk, pointed cameras down at that area, and asked users to hold their devices within the box. But we learned:

  • The slightest glare made the screens hard to view even when devices were positioned at a perfect angle to the camera,
  • A “perfect angle” was really only possible to maintain with a laptop, as some tablet and all phone users moved their devices around quite a bit, and
  • Zooming in and out to accommodate different device sizes is tricky on-the-fly.

Next time: Set up three different batter’s boxes, one for each type of device, so we’re ready for different screen-holding angles and levels of zoom. Check sunlight and glare the evening before so we can set up our stations facing away from the worst light sources.

6. Leave time between interviews

We ran consecutive sessions, each planned to be half an hour long and scheduled 45 minutes apart. Most sessions ended up running a little long, but the buffer gave us time to reset our stations and add to our notes without ever falling behind schedule for the next participant.

Next time: Leave 15 or 20 minutes between interviews.

In summary, we think that BYOD testing is a great method for fast, diverse testing and we’ll definitely do it again in the future.

