👋🏻Hi, this is Nikki with a 🔒subscriber-only 🔒 article from User Research Academy. In every article, I cover in-depth topics on how to conduct user research, grow in your career, and fall in love with the craft of user research again.
Hi there, you amazing, curious person!
The most common answer in user research, product, tech, and what feels like the entire world is:
“It depends.”
There are very few times I will say, “Do X, and you will be better at Y.”
Meet one of those few exceptions:
Yes. Ted gets its own big image, plus a couple of fun gif references.
Okay. I’m done with gifs. On to the gold.
There are very few copy-and-paste formulas in user research, and even fewer genuine quick wins. Happily, this is one: a quick-win formula to help you ask better questions (unbiased, non-leading, open-ended) that lead to depth, rich insights, and data gold.
I promise that this will be a short and to-the-point article because it doesn’t need to be long, and you don’t have to hear about my many (I mean, MANY) failures at asking terrible questions.
Flip the Script
User research isn't just about asking questions. In fact, telling a user researcher they are simply asking questions is a surefire way to piss us off. Sorry, it’s the truth!
We aren’t simply spewing questions from a list.
User research—whoever does it—is about asking the right questions at the right time and letting the participant speak 90% of the time.
I struggled with this in my own career in two distinct areas:
When I started as a user researcher
When I tried to democratize research
In both cases, I asked (and watched others ask) horrible questions that led to dead-end conversations, shallow data, and non-actionable insights.
Horrible Questions
My research plans and scripts used to be filled with horrible questions.
What makes for a horrible question?
Priming questions: push the user to answer in a particular way.
Example: “How much do you like being able to order takeaway online?”
Leading questions: may prevent the user from exploring a different avenue.
Example: "What makes this product helpful?"
Questions about future behavior: ask about hypotheticals instead of focusing on the past or present.
Example: "Would you use this feature?"
Double-barreled questions: ask two questions in one sentence.
Example: “How confident are you in our product, and how much value does it bring to you?”
Yes/no questions: end the conversation. Instead, we focus on open-ended questions.
Example: “Did you find what you were looking for?”
Preference-based questions: ask about preference instead of usability, unmet needs, or pain points.
Example: "Do you prefer to explore exotic destinations or relax on a beach vacation?"
What-based questions: lead to a list of answers or behaviors that are better suited to surveys.
Example: “What do you come to the library for?”
Asking participants to design for you: force participants to share how they might fix or design something (which they aren’t qualified to do).
Example: “What would you change to improve this feature/design?”
Quantifying questions: ask a small sample within a qualitative project to quantify something through a metric.
Example: “How confident did you feel while using this feature?”
Wowzers, that is a long list of horrible questions.
The saddest part is that I took some of these from free, open online resources that promise you a huge bank of amazing questions for your next research project.
We can’t really blame our stakeholders for asking crappy questions when they find these resources online claiming to have amazing questions to ask users. And new researchers are learning via the same channels, so it’s no wonder some discussion guides just aren’t set up for success.
And now, for fun, I will put some more questions here for you to decide which category of horribleness they fit into—all of them are from free online UXR resources.
Leave a comment with your thoughts. Who knows, you might win a prize.
To what extent do you feel this design was made for you?
What are your primary business goals?
Will you continue to use this feature?
What would you expect to see from the website?
Did the experience meet your expectations?
How likely are you to use this feature?
How do you prefer to be trained on new software?
How would you rate your overall experience with this product?
Did this article answer your question?
How much would you pay for this product?
Which product/feature/image/design do you prefer?
How likely is it that you would purchase this product?
What would happen next?
How successful were you at completing the task?
What about the feature is most exciting?
I could go on for ages, but I’m going to stop there. Have fun.
Why These Questions Suck
I’m not going to go into detail as to why each of these questions sucks because, well, we’d be here forever. Ugh, but I can’t resist.
Instead, Learn the TEDW Framework