Module 6: Fundamentals of Quantitative Research

Numbers that speak: complementing the qualitative story

Estimated time: 2–2.5 hours


Table of Contents

  1. Introduction: Why do numbers matter in UX Research?
  2. Surveys vs. Interviews: When to use each method
  3. Designing unbiased questions
  4. Fundamentals of A/B Testing and Analytics
  5. Mixed Methods: Combining Qual + Quant
  6. Practical exercise
  7. References and resources

Learning Objectives

At the end of this module, you will be able to:

  • Distinguish when to use surveys versus interviews according to research objectives.
  • Design survey questions minimizing cognitive and cultural biases.
  • Understand the fundamentals of A/B testing, heatmaps, and conversion funnels.
  • Apply mixed methodologies (Mixed Methods) to obtain more robust insights.

1. Introduction: Why do numbers matter?

By training and personal preference, I have a bias toward qualitative methods: moderated testing, interviews, direct observation.

But over time (and several projects where I had to defend my findings to stakeholders), I learned something crucial: quantitative data doesn't replace qualitative data, it complements it. And when you combine them well, you have a much more powerful story.

"Conversations with users tell you the WHY. Quantitative data tells you the HOW MUCH and HOW OFTEN. Together, they tell the complete story."

As Adaptive Path says in their Experience Mapping guide: quantitative data can help validate what you learn in qualitative studies, prioritize the focus of your interviews, and make stakeholders feel more comfortable with a larger sample size.

The key? Knowing when to use each approach and how to combine them strategically.


2. Surveys vs. Interviews: When to use each method

This is probably the question I get most from junior researchers: Do I do interviews or a survey? The short answer is: it depends. The long answer... here we go.

2.1 Interviews: The power of "why"

Interviews are your tool when you need depth. When you want to understand motivations, emotions, context, frustrations... all that internal world of the user that doesn't appear in metrics.

Use interviews when:

  • You're exploring a new problem and don't know what questions to ask
  • You need to understand the "why" behind a behavior
  • The topic is sensitive and requires rapport (e.g., personal finances, health)
  • You want to capture user stories and narratives
  • Your population is small or hard to reach (e.g., B2B, experts)

2.2 Surveys: The power of "how many"

Surveys shine when you need scale. When you already have clear hypotheses and want to validate them with a larger sample, or when you need data you can generalize.

Use surveys when:

  • You already understand the problem and want to quantify its magnitude
  • You need representative data from a large population
  • You want to prioritize features or pain points
  • Stakeholders need numbers to make decisions
  • You have limited budget or time for research

2.3 Comparison table

Aspect | Interviews | Surveys
Objective | Explore, understand in depth | Validate, quantify, generalize
Sample size | 5-15 users (per segment) | 30+ for basic statistics; 100+ for robust analysis
Type of data | Qualitative (narratives, quotes) | Quantitative (numbers, %)
Flexibility | High (you can explore new topics) | Low (predefined questions)
Cost/time | High per participant | Low per participant
Analysis | Thematic coding, synthesis | Descriptive and inferential statistics
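The "30+/100+" rules of thumb in the table can be grounded with the standard margin-of-error formula for estimating a proportion. A minimal sketch (the 95% confidence level and ±5% margin are illustrative choices, not values from this module):

```python
import math
from statistics import NormalDist

def survey_sample_size(margin_of_error, confidence=0.95, p=0.5):
    """Minimum n to estimate a proportion within +/- margin_of_error.

    p=0.5 is the worst case (maximum variance), so it is the safe
    default when you don't know the true proportion in advance."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # e.g. 1.96 for 95%
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

# A ±5% margin at 95% confidence needs roughly 385 respondents
print(survey_sample_size(0.05))
# A tighter ±3% margin needs considerably more
print(survey_sample_size(0.03))
```

Note how quickly the required n grows as you tighten the margin; this is why "representative" surveys are rarely small.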

"In my experience, the most common mistake is jumping straight to surveys without having done interviews first. You end up asking the wrong questions to many people."


3. Designing Unbiased Questions

This is where many projects fall apart. A survey with poorly designed questions gives you worthless data, and the worst part is that you don't realize it until it's too late.

3.1 The respondent's cognitive process

Before writing questions, you need to understand how the brain processes a survey. According to Tourangeau, Rips, and Rasinski, there are five stages the respondent goes through:

  1. Perception: "What is this?" - They see or hear the stimulus
  2. Comprehension: "What are they asking me?" - They interpret the question
  3. Retrieval: "What do I know about this?" - They search their memory
  4. Judgment: "What is my answer?" - They formulate an opinion
  5. Response: "How do I report it?" - They translate to the survey format

Important: The order of questions activates information in memory that affects subsequent responses. Priming is real!

3.2 Closed vs. open questions

Closed questions (with predefined options):

  • Easier to analyze
  • Lower cognitive effort for the respondent
  • Risk: you may omit important options or force responses

Open questions (free response):

  • Capture perspectives you didn't anticipate
  • More difficult to analyze (require coding)
  • Lower response rate (more effort)

3.3 Common biases and how to avoid them

Acquiescence bias

The tendency to agree with everything. Solution: alternate the direction of questions.

Bad: "Do you agree that our product is easy to use?"

Better: "How easy or difficult is it for you to use our product?" (balanced scale)

Social desirability bias

Responding with what they believe is socially "correct." Common in sensitive topics.

Bad: "How often do you exercise?" (everyone will exaggerate)

Better: "In a typical week, how many days do you do at least 30 minutes of physical activity?"

Double-barreled questions

Asking two things in one. The respondent doesn't know what to answer.

Bad: "How satisfied are you with the price and quality of the product?"

Better: Separate into two distinct questions.

Primacy/recency bias

In long lists, options at the beginning and end are selected more. Solution: rotate options or use visual scales.
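Rotating options is easy to implement in any survey tool that supports logic, or in your own instrument. A minimal sketch (the function name and the idea of seeding by respondent ID are my own assumed conventions, not from a specific tool):

```python
import random

def rotated_options(options, respondent_id, anchored=("Other",)):
    """Return answer options in a per-respondent random order.

    Options listed in `anchored` (e.g. "Other", "None of the above")
    stay at the end, where respondents expect to find them.
    Seeding with the respondent ID makes the order deterministic,
    so the same person always sees the same order."""
    rotating = [o for o in options if o not in anchored]
    fixed = [o for o in options if o in anchored]
    rng = random.Random(respondent_id)
    rng.shuffle(rotating)
    return rotating + fixed

print(rotated_options(["Price", "Quality", "Speed", "Support", "Other"], "resp-42"))
```

Across many respondents, each substantive option spends roughly equal time at the top and bottom of the list, which averages out the primacy/recency effect.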

3.4 Cultural considerations for Latin America

This topic is crucial if you work in our region. Research is not simply translating materials from English.

  • Tú vs. Usted: Affects the tone and comfort of the participant
  • Questions about income: May be considered intrusive in Mexico and other countries
  • Alternative for socioeconomic level: Ask number of light bulbs/lights in home or access to services
  • Idioms: Spanish from Chile ≠ Argentina ≠ Mexico

"A perfectly calculated sample is useless if data quality is poor due to lack of cultural sensitivity."


4. Fundamentals of A/B Testing and Analytics

OK, now let's enter the world of behavioral data. This is the part where many UX Researchers feel out of their comfort zone, but it's more accessible than you think. You don't need to be a data scientist to understand these concepts :)

4.1 What is A/B Testing?

A/B testing (or split testing) is an optimization methodology that compares two or more versions of a digital element to determine which performs better according to predefined metrics.

Classic example:

  • Version A (control): Blue button that says "Sign up"
  • Version B (variant): Green button that says "Start free"
  • Metric: Conversion rate (% of users who click)
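Interpreting a result like the classic example above usually comes down to a two-proportion z-test. A minimal sketch with made-up counts (the 10,000-visitor figures are hypothetical, not from this module):

```python
import math
from statistics import NormalDist

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0: no difference
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical: A converts 300/10,000 (3.0%), B converts 345/10,000 (3.45%)
z, p = two_proportion_ztest(300, 10_000, 345, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p is about 0.07: not significant at 0.05
```

Even with 10,000 visitors per variant, a +15% relative lift on a 3% baseline fails to reach the usual 0.05 threshold here, which previews the traffic limitation discussed in 4.3.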

Key concepts:

  • Baseline Conversion Rate: Your current metric (e.g., 3% of visitors sign up)
  • MDE (Minimum Detectable Effect): The minimum change you want to detect (e.g., +10% relative)
  • Statistical Power: Probability of detecting a real effect (typically 80%)
  • Confidence Level: How sure you are of the result (typically 95%)
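The four concepts above combine into one practical question: how many users per variant do you need? A sketch using the standard normal approximation for comparing two proportions (the 3% baseline and +10% relative MDE reuse the example numbers above; the 80%/95% defaults are the typical values mentioned):

```python
import math
from statistics import NormalDist

def ab_sample_size(baseline, relative_mde, power=0.80, alpha=0.05):
    """Approximate sample size PER VARIANT for a two-sided test."""
    p1 = baseline
    p2 = baseline * (1 + relative_mde)              # rate you hope to detect
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Detecting a +10% relative lift on a 3% baseline takes ~53,000 users per variant
print(ab_sample_size(0.03, 0.10))
```

Small baselines and small MDEs inflate the required sample dramatically, which is why low-traffic products often cannot A/B test at all.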

4.2 Analytics Tools

Heatmaps

Visualizations that show where users click and how they interact with the page. Warm colors (red, orange) indicate higher activity.

Types of heatmaps:

  • Click maps: Where they click
  • Scroll maps: How far down the page they scroll
  • Move maps: Where they move the cursor (visual attention)

Tools: Microsoft Clarity (free), Hotjar, Crazy Egg

Clickstreams

The sequential journey users take through your site. Shows you the most common paths and where they abandon.

Conversion funnels

Visualizations that show the percentage of users who complete each step of a process. They identify the leak points where users are lost.

E-commerce funnel example:

Stage | Users | Rate
Visit product page | 10,000 | 100%
Add to cart | 2,000 | 20%
Start checkout | 800 | 8%
Complete purchase | 300 | 3%

In this example, the biggest leak point is between "Visit product" and "Add to cart." Why? That's where you need qualitative research to understand the "why."
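The funnel arithmetic above is worth automating: step-to-step rates reveal the leak, while cumulative rates can hide it. A sketch using the example's own numbers:

```python
# Funnel from the e-commerce example: (stage, users reaching it)
funnel = [
    ("Visit product page", 10_000),
    ("Add to cart", 2_000),
    ("Start checkout", 800),
    ("Complete purchase", 300),
]

top = funnel[0][1]
worst_step, worst_rate = None, 1.0
for (prev_name, _prev_n), (name, n) in zip(funnel, funnel[1:]):
    step_rate = n / _prev_n           # survival of this step only
    overall = n / top                 # cumulative rate shown in the table
    print(f"{prev_name} -> {name}: {step_rate:.0%} step (overall {overall:.0%})")
    if step_rate < worst_rate:
        worst_step, worst_rate = f"{prev_name} -> {name}", step_rate

print(f"Biggest leak: {worst_step} ({1 - worst_rate:.0%} of users lost)")
```

Running this confirms the observation above: only 20% survive the jump from product page to cart (80% lost), versus 40% and 37.5% for the later steps.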

4.3 Limitations of A/B Testing

Before you run off to A/B test everything, an important disclaimer:

  • You need sufficient traffic: Without volume, there's no statistical significance
  • It tells you WHAT works, not WHY it works
  • It only optimizes within the current paradigm (it won't find radically new solutions)
  • The statistical winner may not be the experience winner

5. Mixed Methods: Combining Qual + Quant

We've arrived at what I consider the highest level of UX Research: knowing how to strategically combine qualitative and quantitative methodologies.

5.1 Why mix methods?

As I mentioned at the beginning, qualitative and quantitative methods answer different questions:

  • Qualitative: Why? How? What is the experience?
  • Quantitative: How many? How often? How big?

When you combine them, you get triangulation: multiple viewpoints on the same phenomenon, which increases the validity of your findings.

5.2 Mixed Methods design patterns

Pattern 1: Exploration → Validation (Sequential)

Flow: Interviews → Survey

First you explore the problem with interviews (discover themes, hypotheses, user language). Then you design a survey to validate and quantify those findings.

Example: "In interviews we discovered that 6/8 users mentioned frustration with the payment process. We designed a survey to validate with n=500 and confirm that 72% have this problem."
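A survey estimate like the 72% in this example should be reported with its uncertainty. A sketch of the usual normal-approximation (Wald) confidence interval, using the example's own numbers:

```python
import math
from statistics import NormalDist

def proportion_ci(p_hat, n, confidence=0.95):
    """Normal-approximation (Wald) confidence interval for a proportion."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

low, high = proportion_ci(0.72, 500)
print(f"72% of n=500 -> 95% CI [{low:.1%}, {high:.1%}]")  # roughly [68%, 76%]
```

Reporting "72% (95% CI: 68-76%)" to stakeholders is both more honest and more persuasive than the bare point estimate.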

Pattern 2: Quantification → Deepening (Sequential)

Flow: Analytics/Survey → Interviews

First you identify patterns in data (e.g., 80% abandon at step 3 of the funnel). Then you do interviews to understand why.

Example: "Data shows that young users convert 3x more than older users. We do interviews with both groups to understand the difference."

Pattern 3: Convergent (Parallel)

Flow: Interviews + Survey simultaneously

You collect both types of data at the same time and compare/contrast results at the end.

5.3 Practical example: Experience Map project

Following the Adaptive Path guide, this is what a real Experience Mapping project with Mixed Methods would look like:

  1. Desk Research: Review existing analytics, previous research, NPS data
  2. Exploratory survey: Identify segments and prioritize touchpoints
  3. In-depth interviews: 8-12 users per segment, using directed storytelling
  4. Synthesis: Create journey map with quali + quanti data
  5. Validation: Survey to confirm main pain points

"Customer conversations and observations are your primary tool to learn, identify patterns, and capture the richness of human experience. But quantitative data validates and prioritizes." - Adaptive Path


6. Practical Exercise

Situation: Your team is redesigning the onboarding flow of a personal finance app. Analytics data shows that 65% of users abandon before completing registration.

Part 1: Plan your Mixed Methods

  1. What quantitative data do you need to review first? (hint: funnels, heatmaps)
  2. What hypotheses would you generate from that data?
  3. How would you design the interviews to deepen?
  4. What follow-up survey would you propose?

Part 2: Design 5 survey questions

Write 5 questions for a post-abandonment survey. Include at least:

  • 2 closed questions (scale or multiple choice)
  • 1 open question
  • Avoid the biases discussed in section 3

Part 3: Propose an A/B test

Based on a hypothesis about abandonment, design an A/B test. Define:

  • Variable to test (what you change)
  • Success metric (what you measure)
  • How you would interpret the results

7. References and Resources



You can also explore the Sample Calculator we've developed to help you determine sample sizes for surveys and A/B testing in the Latin American context.

See you in the next module! :)