The User Research Strategist

Have You Met TED(W)?

Master unbiased and open-ended conversations. Every time.

Nikki Anderson
Sep 25, 2024 ∙ Paid

πŸ‘‹πŸ»Hi, this is Nikki with aΒ πŸ”’subscriber-only πŸ”’ article from User Research Academy. In every article, I cover in-depth topics on how to conduct user research, grow in your career, and fall in love with the craft of user research again.


Hi there, you amazing, curious person!

The most common answer in user research, product, tech, and what feels like the entire world is:

“It depends.”

There are very few times I will say, “Do X, and you will be better at Y.”

Meet one of those few exceptions:

Yes. Ted gets its own big image, and also two fun gif references. Okay, I’m done with gifs. On to the gold.

There are very few copy-and-paste formulas that lead to better outcomes, and even fewer genuine quick wins. But I’m happy to say this is one of them: a quick-win formula to help you ask better unbiased, non-leading, open-ended questions that lead to depth, rich insights, and data gold.

I promise that this will be a short and to-the-point article because it doesn’t need to be long, and you don’t have to hear about my many (I mean, MANY) failures at asking terrible questions.

Flip the Script

User research isn't just about asking questions. In fact, telling a user researcher they are simply asking questions is a surefire way to piss us off. Sorry, it’s the truth!

We aren’t just simply spewing off questions from a list.

User research—whoever does it—is about asking the right questions at the right time and letting the participant speak 90% of the time.

I struggled with this in my own career in two distinct areas:

  1. When I started as a user researcher

  2. When I tried to democratize research

In both cases, I asked (and watched others ask) horrible questions that led to dead-end conversations, shallow data, and non-actionable insights.

Horrible Questions

My research plans and scripts used to be filled with horrible questions.

What makes for a horrible question?

  • Priming questions—which will force the user to answer in a particular way

    • Example: “How much do you like being able to order takeaway online?”

  • Leading questions—which may prohibit the user from exploring a different avenue

    • Example: “What makes this product helpful?”

  • Asking about future behavior—instead of focusing on the past/present

    • Example: “Would you use this feature?”

  • Double-barreled questions—asking two questions in one sentence

    • Example: “How confident are you in our product and how much value does it bring to you?”

  • Yes/no questions—which will end the conversation. Instead, we focus on open-ended questions

    • Example: “Did you find what you were looking for?”

  • Preference-based questions—asking about preference instead of usability, unmet needs, or pain points

    • Example: “Do you prefer to explore exotic destinations or relax on a beach vacation?”

  • What-based questions—asking questions that lead to a list of answers or behaviors that are better suited for surveys.

    • Example: “What do you come to the library for?”

  • Asking participants to design for you—questions that force participants to share how they might fix or design something (which they aren’t qualified to do).

    • Example: “What would you change to improve this feature/design?”

  • Quantifying questions—asking a small sample size within a qualitative data project to quantify something through a metric.

    • Example: “How confident did you feel while using this feature?”

Wowzers, that is a long list of horrible questions.

The saddest part is that I took some of these from free, open online resources that promise you a huge bank of amazing questions for your next research project.

We can’t really blame our stakeholders for asking crappy questions when the resources they find online claim to offer amazing questions to ask users. And new researchers are learning from these same channels, so it’s no wonder some discussion guides just aren’t set up for success.

And now, for fun, I will put some more questions here for you to decide which category of horribleness they fit intoβ€”all of them are from free online UXR resources.

Leave a comment with your thoughts. Who knows, you might win a prize.

  • To what extent do you feel this design was made for you?

  • What are your primary business goals?

  • Will you continue to use this feature?

  • What would you expect to see from the website?

  • Did the experience meet your expectations?

  • How likely are you to use this feature?

  • How do you prefer to be trained on new software?

  • How would you rate your overall experience with this product?

  • Did this article answer your question?

  • How much would you pay for this product?

  • Which product/feature/image/design do you prefer?

  • How likely is it that you would purchase this product?

  • What would happen next?

  • How successful were you at completing the task?

  • What about the feature is most exciting?

I could go on for ages, but I’m going to stop there. Have fun.

Why These Questions Suck

I’m not going to go into detail as to why each of these questions sucks because, well, we’d be here forever. Ugh, but I can’t resist.

Instead, Learn the TEDW Framework

Keep reading with a 7-day free trial

Subscribe to The User Research Strategist to keep reading this post and get 7 days of free access to the full post archives.
