HFI Connect

User Experience News, Blogs, and Videos

5 Lessons Learned Running a Mobile UT

By Steve Fleming, CUA, CXA, UX Strategist @HFI

Deeply understanding the user experience on mobile devices is challenging enough; using that understanding to deliver excellent designs is harder still. We are all mobile users, but we don’t want to fall into the trap of thinking, “well, this is how I use my smartphone, so that is probably what the users do.” Nor can we always assume that the context of use is the same – that our users will do the same things on their mobile device that they would do on their laptop or desktop computer.

Equally tempting, but dangerous, is watching all those people we pass every day – on the street, on the bus, at the wheel of their car, in our meetings – tapping away at their mobile device. Like analyzing a competitor’s design, we can get the general gist of the experience, but don’t know all the motivations, thoughts, and decisions going into the actual result – we don’t know what business trade-offs motivated our competitor’s design decisions and we don’t know what mental model is influencing how each person is using their mobile device.

Tackling the entire spectrum of design for mobile devices isn’t what I want to bite off here. Instead, I’ll share a much more manageable piece of user experience work in the mobile design space – some practical lessons learned from running usability tests on mobile devices.

 

Lesson 1: Not everyone will agree on what the “mobile tasks” are.

 

Are they the same tasks your users would perform on the “full” version of your site? Do you provide a responsive design, assuming the tasks are the same regardless of device? Or do you provide an “m-dot” or “t-dot” version of the design with specific services and therefore test those? Do you keep your UT focused on the Website delivered on a mobile device or let participants use an app you provide? These are the same kinds of decisions that have to be made for any UT, but be sure to dig into some of the assumptions that team members might have about the most important parts of the design to evaluate on a mobile device. And if you already have good insights into the overall design, focus your mobile UT on tasks where interaction is key, as that is definitely different between a laptop and a touch-screen mobile device.

 

Lesson 2: Figuring out devices of interest is challenging and not entirely within your control.

 

The first obstacle is getting everyone to agree on what a “mobile device” is – team members are likely envisioning phones and tablets, but it is important to be sure. Is a Nook an acceptable mobile device for your study? Probably, as long as it is the version with Internet access! One good approach is to screen for a few specific devices along with your other screening criteria – whatever devices your team has designed for or your data says your users are using. Even when you recruit for specific devices, some participants will bring several (their Kindle and their iPhone, for example). Alternatively, you could skip screening for devices and simply see what participants bring, but be sure your team won’t be bugged if no Android phones (for example) show up. An approach to avoid is having your own devices there for participants to use. Yes, you do get to control the technology, but you don’t want to waste half your UT session getting participants familiar with an interaction paradigm or teaching them how to use an Android tablet.

 

Lesson 3: You need special recording equipment (for now).

 

Eventually, there will be easy-to-use recording software, like Morae or Ovo, for mobile devices, but today, the way to get recordings of the screen is with external cameras. Make sure you have cameras that can zoom in on the screen of the mobile device. Most labs will have (or can contract for) motorized video cameras allowing an operator in the observation room to pan and zoom. Even with these cameras, lighting glare and the participant moving the device around will make it hard to get perfectly crisp recordings, but some recording is better than none. If you want to have more realistic participant movement (like Barnard, Yi, Jacko, and Sears’ 2005 study with participants using a treadmill or walking a controlled path) you will have to re-think recording.

 

Lesson 4: Internet access is an important consideration.

 

This seems easy, as we need an Internet connection for most of our UT sessions. Your testing locale likely has WiFi, but you want to confirm that (and that participants can access it!). Make sure the signal is strong where you are conducting the session. Also, some participants may not be very familiar with joining WiFi networks; therefore you might need to guide them through that on their device. Some participants might use their data plan because their device doesn’t automatically pick up the WiFi network, so decide if you want to let them do that. If you allow participants to use their own data plan, remind them that the session will count against their data total (will you know the amount of data the UT will require?).
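If you do allow participants to use their own data plans, a quick back-of-the-envelope estimate answers that last question. The sketch below shows the arithmetic; all of the figures (page weight, page loads per task, task count) are hypothetical placeholders — substitute measurements from your own site and test script.

```python
# Rough estimate of cellular data one mobile UT session might consume.
# All figures below are hypothetical placeholders; replace them with
# measurements from your own site (e.g., page weight from browser dev tools).

AVG_PAGE_MB = 1.5      # average page weight in megabytes (assumed)
PAGES_PER_TASK = 6     # page loads per task, including wrong turns (assumed)
TASKS_PER_SESSION = 5  # number of tasks in the UT script (assumed)

def session_data_mb(avg_page_mb=AVG_PAGE_MB,
                    pages_per_task=PAGES_PER_TASK,
                    tasks=TASKS_PER_SESSION):
    """Return the estimated megabytes of data one session will use."""
    return avg_page_mb * pages_per_task * tasks

print(f"Estimated data per session: {session_data_mb():.0f} MB")
# 1.5 MB/page * 6 pages/task * 5 tasks = 45 MB
```

With an estimate like this in hand, you can tell participants up front roughly how much of their data allowance the session will use, and decide whether that is acceptable for your study.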

 

Lesson 5: Moderators will face special challenges.

 

If you have a great design, participants may still get frustrated because of the need to zoom or pan on their screen – or because they accidentally select the wrong link from a list of links. If your design has challenges, those will be exacerbated by the fact that your participants have to zoom or pan. All this is great feedback on your design, but the moderator may have to deal with frustrated participants during the session. I have found that participants in mobile UTs more quickly announce, “I’d just call”. They are holding a phone in their hand and a call is a link away! It is great to know if that is their actual behavior, and it is great user research data. But if that means you aren’t getting the usability testing data you are looking for, you have to determine how much you will push the participant to accomplish the task – they may be less likely to “keep trying” on that mobile device.

So, these are just a few bits I have picked up from running some mobile UT sessions. By planning for these issues early in the process, you can get better data and create better designs. Happy testing!

 

To learn more about HFI's Usability Testing services please visit: http://www.humanfactors.com/

 


Steve Fleming has over 15 years’ experience in project management, usability, and design in both corporate and government sectors. He is proficient in helping organizations effectively synthesize their business goals with end-user goals.

© 2017 Human Factors International