So You Think You Know Your User: How UX Discovery Goes Beyond Intuition
Know your audience. It’s a statement that’s tossed around a lot in the digital marketing industry. But from a user experience perspective, defining your audience through the discovery process goes far deeper than it might first appear.
There are many ways to collect feedback and insights into how and why users interact with your site the way they do. It's important to take that practice seriously: it ensures you are building and designing content and flows that create the ideal user experience.
Asking questions, whether directly to individuals or through more automated methods, is a vital part of any change process. Below are just some of the ways to collect feedback through the design process. Which methods you use, and in which order, will vary, and a single method may come into play at multiple points: stakeholder interviews often kick off a project, for instance, while smaller client feedback reviews are used to continually refine and tweak designs.
During the discovery process, feedback in its various forms helps guide the ultimate creative and technical approaches to improving the user experience.
Stakeholder Interviews: An Inside-Out Approach
When creating or designing a website, always engage with your stakeholders to help determine what to look for and the areas in which to focus. You want to get to know the company and how the important players within it view the challenge from the inside out.
For this reason, it’s crucial to start with stakeholder interviews before moving to users. You’ll first want to measure how well the intended impression or action resonates with those who know the business the best. Start by focusing inward with how employees and leadership teams view the company, and then match that to the external feedback you receive when conducting user testing.
Stakeholder interviews usually involve candid discussions with employees, leadership, board members, and shareholders to gather gut feelings and opinions about the current experience. Think of this as one piece of a big feedback “pizza” with multiple requests for toppings: you collect them all before figuring out which changes to prioritize into a cohesive, enticing offering.
Pros of Stakeholder Interviews
No one knows what makes your business tick like its internal stakeholders, which makes it easier to get at the why. Viewpoints can be gathered at various levels within the organization. And regardless of what your users say, these are the people who will ultimately need to be convinced that your solution is the right one.
Cons of Stakeholder Interviews
Stakeholders are heavily invested and might not be prepared to hear critical feedback from within their own organization. Think of past internal reviews you’ve been part of: stakeholders may be stuck in “the way we used to do it,” or find it harder to think outside the box.
Tools for Stakeholder Interviews
Stakeholder interviews are best conducted by listening closely in video chats and in-person meetings. The fuller context, including non-verbal cues, yields a clearer plan of action than a numeric preference scale or a written survey response. Record interviews if you can, so you can play them back later and confirm your interpretations.
Comparative Review
You know who your competition is. Your stakeholders know who the competition is. That means you know of similar companies’ websites that may be trying to reach the same goals. If you’re aware of a user experience problem on your website, it can be helpful to compare how your competition solves that same problem. Review your site alongside your competitors’ to spot the differences and see what you might be missing. There may be opportunities to adjust or augment parts of the site that directly affect your business’s bottom line, like moving the Contact Us form higher up the page.
We recently used comparative analysis to look at sites using a search feature to discover whether they used global search or section-specific search. This is another piece of that bigger feedback pizza pie — sure, our competition might do one thing over another, but we still want to check in with stakeholders, experienced users, and new users.
Comparative reviews are a helpful part of the discovery process because they can provide context into alternate solutions when problem-solving the issues uncovered during user testing. You may carry over some findings into the design process. But we’re not quite done discovering just yet!
Pros of Comparative Review
With this method, you can explore real-life examples, see what does and doesn’t work on your competitors’ sites, and decide what to improve or abandon on your own. These findings can also become variations or examples to show other testers and contributors later in the process.
Cons of Comparative Review
Every company is different, and users interact with brands in different ways. Just because something does or doesn’t work for a competitor doesn’t mean it will or won’t work for you.
Tools for Comparative Review
Even a simple spreadsheet can be useful here. Create a matrix of challenges you've identified or items you hope to address with your new experience, then use this to grade competitors' experiences.
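If you want something a bit more structured than freeform notes, a tiny script can formalize the same matrix. This is only a sketch of the shape of the exercise; the challenges, competitor names, and scores below are invented for illustration.

```typescript
// A sketch of a competitor-grading matrix, mirroring what you'd keep in a
// spreadsheet. Challenges and scores are invented examples.
const challenges = ['Findable contact form', 'Clear pricing', 'Site-wide search'] as const;

// Score each site from 1 (poor) to 5 (excellent) on each challenge.
const scores: Record<string, Record<(typeof challenges)[number], number>> = {
  'Competitor A': { 'Findable contact form': 4, 'Clear pricing': 2, 'Site-wide search': 5 },
  'Competitor B': { 'Findable contact form': 3, 'Clear pricing': 5, 'Site-wide search': 2 },
  'Our site':     { 'Findable contact form': 2, 'Clear pricing': 3, 'Site-wide search': 3 },
};

// Print one row per site with its average grade, highest first.
Object.entries(scores)
  .map(([site, row]) => ({
    site,
    average: challenges.reduce((sum, c) => sum + row[c], 0) / challenges.length,
  }))
  .sort((a, b) => b.average - a.average)
  .forEach(({ site, average }) => console.log(`${site}: ${average.toFixed(1)}`));
```

However you store it, the point is the same: grade every site against the same list of challenges so the gaps become visible at a glance.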
Examine the Analytics
Most people are using some form of analytics tracking on their website, whether that's Google Analytics, Adobe Analytics, or a custom solution they've developed. Out-of-the-box analytics data can give great insight into what is and isn't working on a website: which landing pages perform better than others, which pages have the most engagement, where people leave the site, and so on.
With a well-thought-out implementation, you may also be tracking valuable information about interactions occurring on your site: which navigation links are being used, and how people interact with various elements on your pages. If there's enough time, get this tracking into your analytics platform as early as possible so you can begin collecting valuable information and establish baselines. It's possible to track clicks, hovers, interactions, scrolls, you name it. Ask yourself: what data will help you solve the problem and/or prove the success of your solution?
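As a rough illustration, here's a minimal sketch of custom interaction tracking, assuming a site that already loads Google Analytics via gtag.js; the event names and parameters are placeholders you'd adapt to your own measurement plan.

```typescript
// Minimal sketch: sending custom interaction events to Google Analytics,
// assuming gtag.js is already loaded on the page. Event and parameter
// names are placeholders, not a prescribed schema.
declare function gtag(command: 'event', eventName: string, params?: Record<string, unknown>): void;

// Track clicks on primary navigation links.
document.querySelectorAll<HTMLAnchorElement>('nav a').forEach((link) => {
  link.addEventListener('click', () => {
    gtag('event', 'nav_click', {
      link_text: link.textContent?.trim(),
      link_url: link.href,
    });
  });
});

// Track how far users scroll, in 25% increments, firing each threshold once.
const reported = new Set<number>();
window.addEventListener('scroll', () => {
  const depth = (window.scrollY + window.innerHeight) / document.documentElement.scrollHeight;
  [25, 50, 75, 100].forEach((threshold) => {
    if (depth * 100 >= threshold && !reported.has(threshold)) {
      reported.add(threshold);
      gtag('event', 'scroll_depth', { percent: threshold });
    }
  });
});
```

The specifics will differ by platform, but the idea holds: instrument the interactions you care about early so the baseline exists before you change anything.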
While many wait until the end of a website build or a new feature implementation to consider analytics, it's important to include this early in your process. What problem are you trying to solve, and is it being properly recorded? If possible, tie the user experience problem back to a measurable number, for instance, "we think that improving the checkout flow will result in more purchases." Then make sure you know exactly where to find that number both before and after your changes.
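For example, a checkout-flow baseline might look like the sketch below; every number here is made up purely to show the calculation.

```typescript
// Hypothetical baseline for "improving the checkout flow will result in more purchases".
// All figures are invented for illustration.
const checkoutStarts = 4_200; // sessions that reached step 1 of checkout last month
const purchases = 630;        // completed orders in the same period

const conversionRate = purchases / checkoutStarts; // 0.15, i.e. 15%
console.log(`Baseline checkout conversion: ${(conversionRate * 100).toFixed(1)}%`);
// After launch, recompute the same number over a comparable period and compare.
```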
Pros of Website Analytics
Analytics tracking is largely anonymous, behind the scenes, and minimally disruptive. Consider it the digital version of an observation test: you're collecting data to evaluate later. With the right time and tech in place, you can have large amounts of data at your fingertips to help guide your design, influence your questions, and debunk stakeholder myths.
Cons of Website Analytics
If you don't have proper tagging in place, it may take coordination with other teams, plus time, to set up, collect, and report on the interactions that are most valuable to you. And if your old and new experiences are too dissimilar, the data you collect may not be all that useful for comparison.
Tools for Website Analytics
The most common website analytics platforms are Google Analytics, which comes in both free and enterprise versions, and Adobe Analytics, a paid part of the Adobe Experience Cloud. Other tools, such as heat mapping and click tracking products, can supplement your analytics with a different set of information that enhances your existing data.
Unmoderated Testing vs Moderated Testing
Next, you should try to uncover how brand new users walk through the existing experience. There are two ways to conduct this kind of testing: moderated and unmoderated.
So, which kind of testing is right for your project, and at what point?
Unmoderated Testing
For those unfamiliar, unmoderated testing is completed by test participants in their own environment without a facilitator present. These test users are typically not familiar with the brand or website they are testing.
Unmoderated testing is best used when you’re testing an existing design and want unfiltered thoughts from users who haven’t seen or interacted with it before. It can help you zero in on users’ pain points: why they give up, and the exact point on the site where they abandoned the desired action. This gives you insight into what needs to be fixed, and sometimes users even suggest what they think would fix it.
Pros of Unmoderated Testing
Through unmoderated testing, you get access to brand new users, fresh eyes, and quick results. Unmoderated users also tend to be more candid, meaning you’ll get more honest results.
Cons of Unmoderated Testing
Because you can’t follow up with participants after they complete the test, you aren't able to probe the why behind their answers. You can prompt people to think aloud through your questions, but that doesn’t guarantee they will. Without a two-way conversation, you will sometimes be leaning on assumptions rather than hard facts.
Tools for Unmoderated Testing
While many tools exist for this purpose, we've found success with both Validately and UserInterviews.com.
Moderated Testing
Moderated testing uses a test facilitator and involves active participation from both the user and the moderator. These are typically performed in a lab or corporate setting. The moderator is in charge of administering tasks, guiding the user as they walk through, recording their behaviors and comments, answering questions, and replying to feedback — all as it happens.
Moderated testing should be conducted when you’re testing a new design through wireframes or prototypes. Because this method puts the moderator in control, they can smooth over any kinks that arise for the user while working with prototypes, and you still get valuable insight into whether the revised designs solved the issues you found. You’re also better positioned to walk users through different scenarios, ask follow-up questions as they work through tasks, and gain further perspective on where your users got stuck and why.
Pros for Moderated Testing
Since you have an administrator moderating the test, you’re able to drill down into the user's comments and ask follow-up questions. The moderator is present in real time and can record every reaction and comment the user makes during testing.
Cons for Moderated Testing
This type of testing is not scalable and is fairly time consuming, and it can be hard to get people to commit to and follow through on a scheduled test date. Be careful with how you phrase your questions: by implying there's a problem or forcing people to give feedback, you may lead them to look harder and hand you minor nitpicks that need to be weighed carefully.
Tools for Moderated Testing
Typically, someone on your team will schedule and administer the testing protocol. There are companies that will run this for you as focus groups and similar engagements, but often you can accomplish it with a practiced interviewer and an online video platform.
User Insight is Your North Star
We all have instincts about why something is going wrong on our site, app, or digital experience, but the most foolproof way to find out whether your intuition is right is to get insight from the users themselves. You can try to predict what the issue is when you see it, and you can hypothesize why it’s happening, but you have to keep one guiding principle as your North Star: we are not the user.
Once you have feedback from about half a dozen users, synthesize your findings and uncover the common experiences your users identified. Use this information to determine what issues are occurring and start problem-solving in the wireframe and prototype design phases.
But I’m getting ahead of myself. What if we want a little bit of information from a large subset of our actual user base who are visiting the site in real time? That’s where surveys can come in handy.
Surveys
Most people are familiar with surveys. Marketing teams, event planners, hoteliers, and even professional services like doctor’s offices have been using surveys for years to gather info in a quick and precise format. Surveys are a great tool to use for getting answers about who your users are, what they think of a product or experience, and what they think could be improved.
Using a survey to conduct usability testing – whether moderated or unmoderated – allows you to get specific with your questions. With the analytics you have available, you can identify where people dropped off or got stuck and ask the users specific questions about those moments. This method is great if you know where your issues and pain points are, but can’t pinpoint what’s causing them.
Surveys can be delivered as popups on your website, emailed to your users, or delivered anonymously off-site. Most anonymous survey tools include demographic and interest targeting to help make sure you're showing surveys to the right type of user, even if they've never been to your site. With advanced tools like Google Surveys, you can even use remarketing lists to deliver a survey off your website to anonymous members of a decently sized audience.
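As an illustration of the on-site popup approach, here's a minimal, tool-agnostic sketch that waits for meaningful engagement before showing a survey prompt. The thresholds and the showSurveyPrompt function are assumptions for this example; a real tool like Hotjar or Qualaroo would supply its own embed snippet and targeting rules.

```typescript
// Generic sketch of an on-site survey trigger: wait until the visitor has
// spent some time on the page and scrolled at least halfway before prompting.
// The thresholds and showSurveyPrompt() are placeholders.
const MIN_TIME_ON_PAGE_MS = 30_000;
const MIN_SCROLL_RATIO = 0.5;

let timeElapsed = false;
let scrolledEnough = false;
let prompted = false;

function showSurveyPrompt(): void {
  // Placeholder: render your survey modal or inject the vendor's widget here.
  console.log('Show survey prompt');
}

function maybePrompt(): void {
  if (!prompted && timeElapsed && scrolledEnough) {
    prompted = true;
    showSurveyPrompt();
  }
}

setTimeout(() => { timeElapsed = true; maybePrompt(); }, MIN_TIME_ON_PAGE_MS);

window.addEventListener('scroll', () => {
  const ratio = (window.scrollY + window.innerHeight) / document.documentElement.scrollHeight;
  if (ratio >= MIN_SCROLL_RATIO) {
    scrolledEnough = true;
    maybePrompt();
  }
});
```

Gating the prompt on engagement like this helps you survey people who have actually experienced the page, rather than interrupting visitors the moment they arrive.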
Survey design has its own art and science — be mindful to ask questions carefully, in a way that will give your team clear direction and that avoids any bias or influence in the questioning.
Pros for Using Surveys
Surveys can provide a lot of feedback quickly and get you answers to specific questions. They’re great for large-scale efforts and helpful for "this or that" questions.
Cons for Using Surveys
The survey method doesn’t lend itself well to follow-up. And while you can get specific questions answered, it’s harder to get feedback about a specific item or to learn why a question was answered the way it was. It may also be difficult to embed visuals within the survey, and designing a well-written survey can be tricky.
Tools for Administering Surveys
Tools we’ve used for this include Google Surveys, Hotjar, Qualaroo, and SurveyMonkey.
Conclusion
The more you speak to people and conduct tests, the more you uncover about your core problem. This gets your wheels turning, and you’ll start to feel like you have a solution to the issue you set out to fix: why isn’t the current experience leading to our intended outcome? Now you’re free to move on to focusing on how to drive the experience. You’re no longer thinking you know your user and making assumptions based on that. Because of research and feedback, you now truly know your audience and have a first look into what they’re doing, why, and how they feel. You can have your pizza and eat it too.