Remote audience testing at McGill: Challenges and lessons learned

Over the past year we've seen some big shifts in the way we do things, including how we conduct audience testing. All of our user experience exercises now happen online, with participants and facilitators joining in from their homes.

These days, most of our projects involve some form of user experience research, such as gathering data in audience workshops or evaluating how user-friendly a site is by conducting usability tests.

Taking these steps ensures we're designing sites that meet the needs of our site visitors. But what happens when audience testing can only happen online? Can we glean nuanced insights from remote participants?

We initially expected that conducting these exercises remotely would have a negative impact on participant numbers and the quality of the data we collect. Happily, in most cases, things proved to be quite the opposite. You can read about our takeaways and lessons learned from three of our experiences below.

Undergrad admissions website

  • Purpose: To get input from prospective undergraduate students to inform decisions about our new program finder tool
  • Method: A card sorting exercise with prospective students (conducted in April 2020)

When we started planning this exercise, we had pretty low expectations about the outcome. The full impact of the campus closure related to the COVID-19 outbreak had just settled in, and we were certain we would have difficulty recruiting participants. We briefly considered skipping UX research altogether but quickly dismissed the thought: the project was too important for us to make decisions without data to back up our choices. Rather than holding an in-person workshop, we decided to use OptimalSort to set up a card sorting exercise that participants could complete independently online. Once setup was complete, we somewhat reluctantly reached out to our recruitment contacts on campus and in the broader community, asking them to post our exercise link on their message boards.

Result: Within 24 hours, we had received over 150 responses from prospective students, many of which included very detailed feedback.

Lesson learned: There's no time to test like the present
  • Remote usability testing works well with our prospective student audience
  • Early spring is a great time for us to conduct usability testing with prospective students
  • Students are currently very interested in providing feedback to help us improve our websites and services

Coronavirus communications

McGill community members' journey during the campus closure and return to campus

  • Purpose: To gain a better understanding of McGill community members' concerns and challenges as they transitioned back to campus
  • Method: A user journey mapping workshop, survey, and usability testing with members of the McGill community (conducted in June 2020)

Prior to the campus closure, all of our user journey mapping exercises had happened in person. In these workshops, participants would brainstorm to produce large diagrams illustrating the journey of website audience members. When considering how to run one of these workshops in a virtual environment, we had some concerns about what tools to use. We needed a design collaboration tool sophisticated enough to let participants express their thoughts in creative ways, yet simple enough for new users to grasp quickly and easily. After a bit of online research, we decided to try InVision, a tool our design team was already using for their projects.

Result: Our initial attempts were a little uneven. At the start of one workshop, a participant undid, in a matter of seconds, hours of work we had put into creating example diagrams. But we quickly learned that a short demo at the start of the workshop was all it took to show participants the ropes. People generally seemed happy to dive into the exercise once the demo wrapped up. Some of us even felt it was easier to get these exercises rolling in an online environment.

Lessons learned: Virtual post-it notes save trees; everyone loves a good demo

Procurement Services website

  • Purpose: To get input from Procurement Services' clients to inform decisions for a revised site menu
  • Method: Usability (tree) testing with members of the McGill community (conducted in July 2020)

When the project to redesign McGill's Procurement Services website started up a little while back, we felt there was a lot of room for improvement: audience feedback suggested the site had significant usability issues. After some initial discussions with the project team, we decided to design and test a completely new menu structure. When planning these tests, we felt that involving team members from across the department would be a good way to ensure the many facets of the project got the attention they deserved.

Result: We were easily able to involve various members of the Procurement Services team in facilitating and observing our remote tree tests. This gave team members a better understanding of how their clients use the site and understand their services, and enabled them to take part in discussions about takeaways and action items. Scheduling the tests with Procurement Services' clients was fairly easy as well.

Lesson learned: Remote testing facilitates increased team participation

Find out more about user research in web projects at McGill
