Qualitative UX Research Methods
Learn essential qualitative UX research methods to gain valuable insights into user behavior and preferences, shaping impactful design decisions.
There is a common misconception that research, in general, is all about numbers. While numbers are an integral part of research, they are not the only one. In UX research, for instance, many questions cannot be answered by numerical data alone.
For example, why do your users behave the way that they do? How do they perceive your product? What are their motivations and pain points? What are the core thoughts, fears, and attitudes that shape their decisions and actions? Why do they love some parts of your product and ignore others? These questions can be answered by qualitative research methods, which collect quotes, anecdotes, observations, and narrative descriptions from users.
A competitor analysis helps you understand where your product stands relative to others in the market, and where you have room to differentiate. Rather than a broad audit of everything your competitors do, it works best when it's scoped to a specific goal.
For a fast-casual pizzeria, that might mean studying how Domino's handles online ordering, how Mod Pizza communicates its "build your own" model, or how a popular local competitor prices its menu. Each of these reveals something different about the market landscape. In practice, you'd typically look at user demographics, product features, content tone and language, and the visual design of their digital touchpoints.
What a competitor analysis can reveal depends on what you're looking for. It commonly maps the market landscape, including the user types you serve and potential users not yet reached. It can expose gaps in the market, clarify your product's unique selling proposition, and highlight strengths and weaknesses in your branding.
Pro Tip: Make sure you include both direct and indirect competitors. Brands in adjacent categories, like fast-casual burger or bowl chains, often reveal experience patterns your users have come to expect, even from a pizza brand.
A content audit is a systematic review and evaluation of your product's existing content, assessing how well it serves both users and business goals.
The process starts with a content inventory. You compile a list of the product pages you want to audit, their URLs, page types, and any relevant notes. Once you have that foundation, you evaluate your content against your objectives. Those might include whether your content is readable, findable, accessible, or easy to understand.
The audit becomes most valuable when content falls short of your standards, because real usage data points you toward actionable solutions. For example, if users are uninstalling your product in large numbers after an update, a content audit can trace where the drop-offs happen and surface the specific language or structure causing friction. For more targeted insights, you can pair the audit with complementary methods such as usability testing or user interviews.
Pro Tip: Don't wait for a crisis to run a content audit. Treating it as a routine checkpoint, rather than a reactive fix, makes improvements easier to scope and prioritize.
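The inventory step can be sketched as a small script. This is a minimal illustration, not a prescribed tool; the pages, page types, and notes below are hypothetical placeholders:

```python
import csv

# A minimal sketch of a content inventory: one record per page, with
# its URL, page type, and notes against the audit objectives.
# All entries here are hypothetical examples.
inventory = [
    {"url": "/menu", "page_type": "product", "notes": "Readable; images missing alt text"},
    {"url": "/order", "page_type": "transactional", "notes": "Jargon in step 2 copy"},
    {"url": "/about", "page_type": "marketing", "notes": "No issues found"},
]

# Write the inventory to a spreadsheet-friendly CSV for the audit team.
with open("content_inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["url", "page_type", "notes"])
    writer.writeheader()
    writer.writerows(inventory)
```

Keeping the inventory in a plain, shareable format like CSV makes it easy for the whole team to annotate and prioritize pages during the evaluation phase.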
Card sorting helps you understand how users naturally group information, which is the foundation of a well-structured product. When users can find what they're looking for without thinking too hard, it's usually because the information architecture reflects their mental model, not the team's assumptions.
The process is straightforward: participants receive labeled cards and sort them into groups that make sense to them. For a fashion retailer, this might reveal whether users expect shorts under Clothing or Sportswear, two valid options that would each lead to a different navigation structure. Studies work best with 30-60 cards and 15-20 participants.
There are 3 variations to choose from, depending on your goals:
- Open. Participants create their own category names and sort freely. Use this when you're building something new and want to understand how users think from scratch.
- Closed. You provide the categories, and participants sort into them. Use this when evaluating or refining an existing structure.
- Hybrid. Combines both approaches, giving participants predefined categories while letting them create new ones when nothing fits.
Card sorting can be run online or in person, with or without a moderator. Whichever format you choose, briefing participants clearly on the purpose of the study makes a significant difference in the quality of what you get back.[3]
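One common way to analyze open card sort results is a co-occurrence count: how often each pair of cards lands in the same group across participants. Pairs that co-occur frequently belong together in the navigation. A minimal sketch, using made-up results from three hypothetical participants:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical open card sort results: one dict per participant, mapping
# each participant-created group name to the cards placed in it.
results = [
    {"Bottoms": ["Shorts", "Jeans"], "Active": ["Leggings"]},
    {"Sportswear": ["Shorts", "Leggings"], "Clothing": ["Jeans"]},
    {"Clothing": ["Shorts", "Jeans", "Leggings"]},
]

# Count how often each pair of cards ends up in the same group.
co_occurrence = defaultdict(int)
for participant in results:
    for group in participant.values():
        for a, b in combinations(sorted(group), 2):
            co_occurrence[(a, b)] += 1

# Strongest pairs first: these are candidates for shared categories.
for pair, count in sorted(co_occurrence.items(), key=lambda kv: -kv[1]):
    print(pair, count)
```

Dedicated card sorting tools compute similar similarity matrices automatically; the point here is only to show what the raw data looks like and how the grouping signal is extracted from it.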
Ethnographic research involves observing users in their natural environment, often over an extended period, to understand what they actually do rather than what they say they do.
The value of this depth shows up in what you uncover. A researcher shadowing restaurant servers during a dinner shift might notice they frequently input orders while carrying plates. This is a behavior that no interview would surface because servers have simply accepted it as part of the job.
Data collection can take several forms: field notes, photography, video recording, and artifact analysis. The researcher's role can be passive, observing without interfering, or active, participating alongside users to build rapport and access more candid behavior.
Pro Tip: Ethnographic research is expensive and time-consuming. Use it when your team's assumptions about users are weak or untested and when getting the design direction wrong carries real consequences.
Contextual inquiry is a semi-structured research method that combines interviewing with observation, conducted while users go about their work in their own environment.
The method is guided by 4 principles:
- Context means conducting the session where the user naturally works, whether that's their home, office, or elsewhere.
- Partnership means treating the user as the expert and the researcher as the learner, letting both parties steer the conversation.
- Interpretation means checking your understanding in real time by sharing observations with the user and asking them to confirm or correct.
- Focus means keeping the session anchored to your research goals, even as the conversation flows naturally.
A session typically runs around 2 hours and follows a loose structure.
Contextual inquiry works especially well for understanding complex workflows and uncovering habitual behaviors users can't easily describe in a standard interview.
Pro Tip: The most valuable moments are often the workarounds. When users do something unexpected to get a task done, that's a signal your product isn't supporting them the way it should.
A diary study is a longitudinal research method in which participants record their experiences, thoughts, and activities in their own words over days or weeks.
For example, a design agency studying why users make repeat purchases from brands sent participants a diary kit with questions touching on relationships, routines, and expectations. Over time, those entries revealed patterns that a single interview could never capture.
Like ethnographic research, diary studies capture behavior in context rather than in a lab, but they rely on participants' own reporting instead of a researcher's presence.
After the logging period, researchers typically follow up with participants in interviews to fill gaps, clarify ambiguous entries, and probe deeper into patterns that emerged.[5]
Pro Tip: Diary studies work best when logging feels easy for participants. The lower the friction of recording an entry, the more honest and consistent the data.
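To keep that friction low, an entry can be as simple as one timestamped line appended to a log, whether captured by a companion app or a form. A sketch of this idea, with a hypothetical prompt and response:

```python
import json
from datetime import datetime, timezone

# Minimal sketch of a low-friction diary log: each entry is one JSON
# line appended to a file, so recording never requires editing or
# restructuring earlier entries. The prompt and response are made up.
def log_entry(path, participant, prompt, response):
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "participant": participant,
        "prompt": prompt,
        "response": response,
    }
    with open(path, "a") as f:
        f.write(json.dumps(entry) + "\n")

log_entry(
    "diary.jsonl", "P1",
    "What did you buy today, and why?",
    "Reordered the same coffee beans; didn't want to risk a new brand.",
)
```

An append-only, timestamped format also makes the follow-up interviews easier: researchers can sort entries by participant and day to spot gaps and ambiguous moments worth probing.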
A heuristic evaluation is a usability inspection method where expert evaluators assess an interface against a set of established usability principles, known as heuristics. The most widely used are Jakob Nielsen and Rolf Molich's 10 usability heuristics, though teams can adapt or supplement these depending on the product type.
Unlike usability testing, which involves real users, heuristic evaluation relies on expert judgment alone. That makes it faster and cheaper to run, though it can miss problems that only real users would encounter.
The process follows these steps:
- Define the scope and choose the heuristics you'll evaluate against
- Select and brief your evaluators independently, so they don't influence each other
- Have each evaluator examine the interface individually and document usability problems
- Bring evaluators together to compare findings and rate each problem by severity
- Prioritize issues and work with your team to implement solutions[6]
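The comparison and prioritization steps above can be sketched as a small aggregation script. The findings below are hypothetical; the 0-4 severity scale is Nielsen's standard rating, from 0 (not a problem) to 4 (usability catastrophe):

```python
from collections import defaultdict

# Hypothetical findings from three independent evaluators (A, B, C),
# each rated on Nielsen's 0-4 severity scale.
findings = [
    {"issue": "No visible system status during checkout", "evaluator": "A", "severity": 3},
    {"issue": "No visible system status during checkout", "evaluator": "B", "severity": 4},
    {"issue": "Jargon in error messages", "evaluator": "A", "severity": 2},
    {"issue": "Jargon in error messages", "evaluator": "C", "severity": 2},
    {"issue": "Hidden undo action", "evaluator": "B", "severity": 1},
]

# Merge duplicate reports of the same issue and average the ratings.
by_issue = defaultdict(list)
for f in findings:
    by_issue[f["issue"]].append(f["severity"])

# Rank issues by average severity, highest first.
prioritized = sorted(
    ((issue, sum(s) / len(s), len(s)) for issue, s in by_issue.items()),
    key=lambda row: -row[1],
)
for issue, avg, reports in prioritized:
    print(f"{avg:.1f}  ({reports} evaluators)  {issue}")
```

Averaging across evaluators matters because individual severity ratings are subjective; issues flagged independently by several evaluators are usually the safest ones to fix first.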
Pro Tip: Heuristic evaluation gets easier with practice. Over time, evaluators develop instincts for spotting common usability problems without needing to refer to the heuristics as frequently.
Participatory design is a way of involving users in the design process itself, so they contribute ideas and solutions directly rather than simply being observed.
Your choice of exercise will depend on the exact nature of the information you are looking for from your users. Some examples of participatory design exercises include:
- Asking users to make visual empathy collages to map out their perceived connection and interaction with your product
- Asking users to visually draw out the hierarchy of their goals and needs
- Getting users to role-play and act out their problems and potential solutions
- Brainstorming and improving on ideas and solutions together in groups[7]
You can shape these exercises however suits your study. Just remember that the purpose is to elicit solutions and answers from users naturally, leading to more user-centric designs.
A user interview is a one-on-one conversation in which a researcher asks a user questions about their experiences, needs, and motivations.
Depending on your goals, a user interview can take one of three formats:
- Structured: follows a fixed set of questions in a set order, useful when you need consistent, comparable responses across participants
- Unstructured: has minimal predetermined questions and lets the user guide the conversation freely, making it most useful in early discovery when you don't yet know what you're looking for
- Semi-structured: the most common format in UX research, where you prepare a guide with key questions but stay flexible enough to follow up on responses that open up interesting directions
Use user interviews when you want to gather feedback on a product or feature launch, build user personas, or explore users' needs, motivations, and pain points.
Pro Tip: Prioritize open-ended questions over closed ones. Questions that invite users to describe, explain, or walk you through an experience will almost always yield richer insights than questions that can be answered with yes or no.
Usability testing is a method in which participants attempt realistic tasks with a product while researchers observe where they succeed and where they struggle. Tasks in a usability test mirror what users would actually do with the product, like making a purchase or placing an order.
Usability testing can be run at any stage of the design process, from early prototypes to fully launched products.
Sessions can be run in several formats:
- Moderated tests involve a facilitator guiding the participant through tasks in real time, either in a lab or remotely.
- Unmoderated tests have participants complete tasks on their own using a testing platform, with no facilitator present.
- Guerrilla testing is a lightweight, in-person variation where participants are approached in public spaces for quick, informal sessions.[1]
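Whichever format you choose, sessions produce a log of task attempts that is worth summarizing alongside your qualitative notes. A minimal sketch, with hypothetical session data:

```python
from collections import defaultdict

# Hypothetical attempt log from a small moderated study: one record per
# participant-task attempt, with completion status and time on task.
sessions = [
    {"participant": "P1", "task": "Place an order", "completed": True, "seconds": 95},
    {"participant": "P2", "task": "Place an order", "completed": False, "seconds": 210},
    {"participant": "P3", "task": "Place an order", "completed": True, "seconds": 120},
    {"participant": "P1", "task": "Find store hours", "completed": True, "seconds": 30},
    {"participant": "P2", "task": "Find store hours", "completed": True, "seconds": 45},
]

# Group attempts by task and report the completion rate for each.
by_task = defaultdict(list)
for s in sessions:
    by_task[s["task"]].append(s)

for task, attempts in by_task.items():
    rate = sum(a["completed"] for a in attempts) / len(attempts)
    print(f"{task}: {rate:.0%} completion across {len(attempts)} attempts")
```

A summary like this tells you which tasks need attention; the session recordings and notes then tell you why participants struggled.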
Pro Tip: Avoid giving participants too much guidance during a session. Watching where users struggle without stepping in is often where the most valuable insights come from.
References
- Competitive Analysis | Medium
- Card Sorting: Uncover Users' Mental Models | Nielsen Norman Group
- Contextual Inquiry: Inspire Design by Observing and Interviewing Users in Their Context | Nielsen Norman Group
- Diary Studies: Understanding Long-Term User Behavior and Experiences | Nielsen Norman Group
- Heuristic Evaluation: How-To: Article by Jakob Nielsen | Nielsen Norman Group
- Participatory Design in Practice | UX Magazine
- User Interviews 101 | Nielsen Norman Group