Hi, this is Nikki with a subscriber-only article from User Research Academy. In every article, I cover in-depth topics on how to conduct user research, grow in your career, and fall in love with the craft of user research again.
Hi there, you amazing, curious person!
The most common answer in user research, product, tech, and what feels like the entire world is:
"It depends."
There are very few times I will say, "Do X, and you will be better at Y."
Meet one of those few exceptions:
Yes. Ted gets its own big image, plus two fun gif references. Okay, I'm done with gifs. On to the gold.
There are very few copy-and-paste formulas that lead to better outcomes, and even fewer genuine quick wins. But I'm happy to say this is one: a quick-win formula to help you ask better, unbiased, non-leading, open-ended questions that lead to depth, rich insights, and data gold.
I promise that this will be a short and to-the-point article because it doesn't need to be long, and you don't have to hear about my many (I mean, MANY) failures at asking terrible questions.
Flip the Script
User research isn't just about asking questions. In fact, telling a user researcher they are simply asking questions is a surefire way to piss us off. Sorry, it's the truth!
We aren't simply spewing off questions from a list.
User research, whoever does it, is about asking the right questions at the right time and letting the participant speak 90% of the time.
I struggled with this in my own career in two distinct areas:
When I started as a user researcher
When I tried to democratize research
In both cases, I asked (and watched others ask) horrible questions that led to dead-end conversations, shallow data, and non-actionable insights.
Horrible Questions
My research plans and scripts used to be filled with horrible questions.
What makes for a horrible question?
Priming questions: these force the user to answer in a particular way.
Example: "How much do you like being able to order takeaway online?"
Leading questions: these may keep the user from exploring a different avenue.
Example: "What makes this product helpful?"
Asking about future behavior: instead of focusing on the past or present.
Example: "Would you use this feature?"
Double-barreled questions: asking two questions in one sentence.
Example: "How confident are you in our product, and how much value does it bring to you?"
Yes/no questions: these end the conversation; focus on open-ended questions instead.
Example: "Did you find what you were looking for?"
Preference-based questions: asking about preference instead of usability, unmet needs, or pain points.
Example: "Do you prefer to explore exotic destinations or relax on a beach vacation?"
What-based questions: questions that lead to a list of answers or behaviors better suited to surveys.
Example: "What do you come to the library for?"
Asking participants to design for you: questions that push participants to share how they might fix or design something (which they aren't qualified to do).
Example: "What would you change to improve this feature/design?"
Quantifying questions: asking a small sample of participants in a qualitative project to quantify something through a metric.
Example: "How confident did you feel while using this feature?"
Wowzers, that is a long list of horrible questions.
The saddest part is that I took some of these from free, open online resources that promise you a huge bank of amazing questions for your next research project.
We can't really blame our stakeholders for asking crappy questions when they find resources online that claim to have amazing questions to ask users. And new researchers are learning via these channels, so it's no wonder some discussion guides just aren't set up for success.
And now, for fun, I will put some more questions here for you to decide which category of horribleness they fit into; all of them are from free online UXR resources.
Leave a comment with your thoughts. Who knows, you might win a prize.
To what extent do you feel this design was made for you?
What are your primary business goals?
Will you continue to use this feature?
What would you expect to see from the website?
Did the experience meet your expectations?
How likely are you to use this feature?
How do you prefer to be trained on new software?
How would you rate your overall experience with this product?
Did this article answer your question?
How much would you pay for this product?
Which product/feature/image/design do you prefer?
How likely is it that you would purchase this product?
What would happen next?
How successful were you at completing the task?
What about the feature is most exciting?
I could go on for ages, but Iβm going to stop there. Have fun.
Why These Questions Suck
I'm not going to go into detail as to why each of these questions sucks because, well, we'd be here forever. Ugh, but I can't resist.
Instead, Learn the TEDW Framework