<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The User Research Strategist: Prove ROI yesterday]]></title><description><![CDATA[Show research value in PM/leadership language. 21 ROI formulas, OKR alignment, business cases they actually read. Paid members: calculators + business case templates in the Hub.]]></description><link>https://www.userresearchstrategist.com/s/strategy</link><image><url>https://substackcdn.com/image/fetch/$s_!W1Bq!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce28c8b-42a9-4b75-ad65-f05ffc0df182_500x500.png</url><title>The User Research Strategist: Prove ROI yesterday</title><link>https://www.userresearchstrategist.com/s/strategy</link></image><generator>Substack</generator><lastBuildDate>Wed, 13 May 2026 19:05:53 GMT</lastBuildDate><atom:link href="https://www.userresearchstrategist.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Nikki Anderson]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[nikk@userresearchacademy.com]]></webMaster><itunes:owner><itunes:email><![CDATA[nikk@userresearchacademy.com]]></itunes:email><itunes:name><![CDATA[Nikki Anderson]]></itunes:name></itunes:owner><itunes:author><![CDATA[Nikki Anderson]]></itunes:author><googleplay:owner><![CDATA[nikk@userresearchacademy.com]]></googleplay:owner><googleplay:email><![CDATA[nikk@userresearchacademy.com]]></googleplay:email><googleplay:author><![CDATA[Nikki Anderson]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Bonus video: Behind-the-scenes of using AI agents ]]></title><description><![CDATA[I quickly walk through the 
difference between using an AI agent versus generic LLM chat]]></description><link>https://www.userresearchstrategist.com/p/bonus-video-behind-the-scenes-of</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/bonus-video-behind-the-scenes-of</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 03 Mar 2026 12:46:30 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/189758851/099f88832d92cb64a00cbaaad1bbd3dc.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p><em>&#128075; Hey, I&#8217;m Nikki. Each week I write about UX research strategy, communicating impact, and using AI to do your best work. For more: <a href="https://claudeskills.uxrstrategist.com/">Claude Skills Bundle</a> | <a href="https://www.uxrstrategist.com/uxr-ai-prompt-library">AI Prompt Library</a> | <a href="https://ai.uxrstrategist.com/">Team Training</a> | <a href="https://maven.com/user-research-strategist">Live Courses</a></em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><p><em>P.S. Paid subscribers get access to full archive, all content, a private Slack community, Substack lives, and a hub of templates, scripts, and mini-courses</em></p><div><hr></div><p><strong>Note:</strong> With the recent updates to OpenAI&#8217;s privacy policy, I want to highlight that you do not have to use ChatGPT to achieve this.
I have showcased ChatGPT because that&#8217;s where all my agents currently live, but I will be migrating them to other LLMs.</p><p>I&#8217;ve received a ton of questions recently on how UXRs can use AI more in their day-to-day workflow, so I recorded this quick video to showcase how creating and using specific AI agents can help with:</p><ol><li><p>Offloading tedious/mundane tasks or starting with a draft instead of a blank page</p></li><li><p>Getting an additional perspective on something</p></li><li><p>Training others with more guardrails</p></li></ol><p>If this looks like something that would benefit your team, <strong><a href="https://calendly.com/nikkianderson/drop-in-uxr">happy to have a chat</a></strong> to see if I can help through a training workshop!</p><p>Let me know if you have any thoughts or questions below.</p><p>PS - If you need some help with prompt engineering, check out my <strong><a href="https://userresearchstrategist.squarespace.com/uxr-ai-prompt-library">AI Prompt Library for UXRs</a></strong>!</p><p>Happy researching!</p>]]></content:encoded></item><item><title><![CDATA[Use AI Across Your User Research Process]]></title><description><![CDATA[&#128075; Hey, I&#8217;m Nikki.]]></description><link>https://www.userresearchstrategist.com/p/use-ai-across-your-user-research</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/use-ai-across-your-user-research</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Thu, 05 Feb 2026 09:01:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!OtUq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>&#128075; Hey, I&#8217;m Nikki. Each week I write about UX research strategy, communicating impact, and using AI to do your best work.
For more: <a href="https://claudeskills.uxrstrategist.com/">Claude Skills Bundle</a> | <a href="https://www.uxrstrategist.com/uxr-ai-prompt-library">AI Prompt Library</a> | <a href="https://www.dropinresearch.com/">Team Training</a></em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><p><em>P.S. Paid subscribers get access to full archive, all content, a private Slack community, Substack lives, and a hub of templates, scripts, and mini-courses</em></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!OtUq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!OtUq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png 424w, https://substackcdn.com/image/fetch/$s_!OtUq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png 848w, https://substackcdn.com/image/fetch/$s_!OtUq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png 1272w, 
https://substackcdn.com/image/fetch/$s_!OtUq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!OtUq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png" width="396" height="396" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/76223cdd-a749-4139-8362-ac606768164c_4000x4000.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1456,&quot;width&quot;:1456,&quot;resizeWidth&quot;:396,&quot;bytes&quot;:373468,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.userresearchstrategist.com/i/179251928?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!OtUq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png 424w, https://substackcdn.com/image/fetch/$s_!OtUq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png 848w, 
https://substackcdn.com/image/fetch/$s_!OtUq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png 1272w, https://substackcdn.com/image/fetch/$s_!OtUq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76223cdd-a749-4139-8362-ac606768164c_4000x4000.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><a
href="https://unsplash.com/illustrations/a-woman-sitting-at-a-desk-with-a-robot-next-to-her-KkF96nn73CQ">Image from Unsplash</a> </figcaption></figure></div><p>Most researchers I know have the same story. The same raised eyebrow. The same deep sigh before they open another LLM convo and brace for disappointment.</p><p>The first time I tried to use AI for something &#8220;simple,&#8221; I remember asking it to help me dissect a stakeholder brief. I pasted the vague, chaotic message, something along the lines of &#8220;We need to test the new dashboard design before Friday; can you put something together?&#8221; and waited for help.</p><p>What I got back looked like someone had skimmed a UX blog from 2012 and stitched together a few polite sentences. It read like a student trying to impress their professor without actually doing the assignment. No context. No understanding of the politics behind the request. No reading between the lines. Definitely no sense of the real decision hiding underneath the pretty words.</p><p>The problem is that we&#8217;ve been throwing AI at UXR tasks with the same energy we bring to reheating leftover lunch. Fast. Distracted. Half-formed prompts that barely capture what we actually need. 
Then we blame the AI when it hands back something flat, vague, or straight-up wrong.</p><p>Most UXRs are stuck in this loop:</p><ul><li><p>You give AI a tiny prompt.</p></li><li><p>It gives you a tiny answer.</p></li><li><p>You rewrite everything yourself.</p></li><li><p>You decide AI isn&#8217;t ready.</p></li><li><p>Then you go back to doing everything the slow way.</p></li></ul><p>But, at the same time, you really have been burned.</p><p>You&#8217;ve tried using AI to:</p><ul><li><p>Clean up messy notes</p></li><li><p>Summarize a long research plan</p></li><li><p>Rephrase an insight for an exec</p></li><li><p>Draft a kickoff email</p></li><li><p>Clarify a brief written by a PM who sprinted through it between meetings</p></li></ul><p>And the output felt like it came from someone who wasn&#8217;t in the room with you.</p><p>I&#8217;ve spoken to senior UXRs in fintech, SaaS, marketplaces, health tech, people who run teams, shape roadmaps, and handle cross-functional chaos every day, and every single one of them says something like:</p><p>&#8220;I can see the potential&#8230;but I don&#8217;t trust it.&#8221;</p><p>Not because AI is bad.</p><p>But because nobody taught UXRs how to use AI in a way that respects the complexity of our work. We didn&#8217;t get training.</p><p>We&#8217;re self-teaching in the middle of deadlines. We&#8217;re experimenting with prompts in between interviews. We&#8217;re trying to make sense of output that feels helpful one minute and deeply misguided the next.</p><p>AI becomes incredibly powerful for researchers once you give it the kind of direction your craft already relies on: precision, context, constraints, intention, and the decision you&#8217;re supporting.</p><p>The magic doesn&#8217;t come from the model. The magic comes from your brain, paired with a structure that helps the AI act like a competent partner instead of an overeager intern.</p><p>Most researchers give up after a few half-hearted prompts.
You ask something generic. It spits out something shallow. You move on. It&#8217;s not that AI can&#8217;t help you think better, it can, but only if you know how to push it.</p><p>Now we&#8217;re going to walk through the entire research process, from messy stakeholder kickoff to crisp, confident insights, and turn AI into the kind of co-pilot you&#8217;ve wished for since your first week as a researcher.</p><h1><strong>Why Pancake Prompts Fall Flat</strong></h1><p>If you&#8217;ve ever asked AI for help and felt mildly offended by the output, you&#8217;re not alone. Most researchers start with tiny prompts, get tiny answers, and assume the model just isn&#8217;t good enough. It&#8217;s the same energy as handing someone a sticky note that says &#8220;write the whole report for Monday?&#8221; and expecting them to read your mind, decode your org politics, and magically land on something useful.</p><p>The problem isn&#8217;t the AI. The problem is the prompt.</p><p>I know that sounds like the kind of patronizing advice thrown around LinkedIn, but stay with me. I spent months testing how UXRs actually prompt AI across dozens of real projects, interviews, surveys, strategy sessions, prototype tests, you name it, and most of the prompts UXRs write fall into the same patterns:</p><h4><strong>1. The &#8220;do everything for me&#8221; prompt</strong></h4><p>Example: <em>&#8220;Write a usability test.&#8221;</em></p><p>What the AI hears: <em>&#8220;Guess wildly.&#8221;</em></p><h4><strong>2. The &#8220;here&#8217;s a crumb, bake a cake&#8221; prompt</strong></h4><p>Example: <em>&#8220;Help me write a kickoff doc.&#8221;</em></p><p>What the AI hears: <em>&#8220;Please hallucinate intentions for me.&#8221;</em></p><h4><strong>3.
The &#8220;I&#8217;ll tell you the task but not the stakes&#8221; prompt</strong></h4><p>Example: <em>&#8220;Suggest tasks for a survey.&#8221;</em></p><p>What the AI hears: <em>&#8220;Throw generic content at the wall.&#8221;</em></p><p>When you feed AI a prompt that thin, you get output that reads like the UX equivalent of a cookbook written by someone who has never eaten food. Lacking any awareness of real-world messiness.</p><p><strong>AI has no idea what you actually care about unless you tell it.</strong></p><p>And UX research is built on a whole lot of context:</p><ul><li><p>Why the team wants this research</p></li><li><p>What decision sits behind the request</p></li><li><p>Who&#8217;s pushing for speed</p></li><li><p>What&#8217;s riding on the outcome</p></li><li><p>What happened last time someone skipped research</p></li><li><p>Who will use the insights</p></li><li><p>What the constraints look like</p></li><li><p>Which trade-offs matter</p></li><li><p>What business metric is at stake</p></li><li><p>How much is already known</p></li><li><p>What&#8217;s being assumed without evidence</p></li></ul><p>When your prompt doesn&#8217;t include these pieces, you&#8217;re asking an AI model to work blindfolded. I started experimenting with a completely different approach: stop treating AI like a vending machine, and start treating it like a very fast, very literal junior researcher who needs a real brief.</p><p>This is where the FAST model came from. 
A simple four-part structure that upgrades almost any prompt instantly.</p><div><hr></div><p><strong>Below, I walk you through the exact system that turns AI from &#8220;overeager intern&#8221; into a reliable research co-pilot:</strong></p><ul><li><p><strong>The FAST model (the 4-part prompt structure that fixes pancake prompts instantly)</strong></p></li><li><p><strong>Before/after examples that show what &#8220;good&#8221; looks like</strong></p></li><li><p><strong>Copy-paste prompts for kickoff, decision-mapping, risk surfacing, and assumption-breaking</strong></p></li><li><p><strong>Mid-study checkpoint prompts to stop projects drifting off a cliff</strong></p></li><li><p><strong>Synthesis guardrails so you get support without handing over judgment or raw data</strong></p></li></ul><p><em><strong>Exclusively for paid subscribers.</strong></em></p>
      <p>
          <a href="https://www.userresearchstrategist.com/p/use-ai-across-your-user-research">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[Your Deck Isn’t the Deliverable]]></title><description><![CDATA[Use decision-first research to turn evidence into moves]]></description><link>https://www.userresearchstrategist.com/p/your-deck-isnt-the-deliverable</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/your-deck-isnt-the-deliverable</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 16 Dec 2025 09:00:25 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ppkY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e037c8b-2ec4-435b-b514-285bbe1e6572_4000x2759.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hi, I&#8217;m Nikki. I run Drop In Research, where I help teams stop launching &#8220;meh&#8221; and start shipping what customers really need. I write about the conversations that change a roadmap, the questions that shake loose real insight, and the moves that get leadership leaning in. <a href="https://www.dropinresearch.com/">Bring me to your team.</a></p><p>Paid subscribers get the power tools: the UXR Tools Bundle wi&#8230;</p>
      <p>
          <a href="https://www.userresearchstrategist.com/p/your-deck-isnt-the-deliverable">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[A guide to AI agents for user researchers]]></title><description><![CDATA[Turn AI into a research thought partner]]></description><link>https://www.userresearchstrategist.com/p/a-guide-to-ai-agents-for-user-researchers</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/a-guide-to-ai-agents-for-user-researchers</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 23 Sep 2025 08:00:27 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Kryp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3d3e46c-b8cc-461a-80e7-895c0d3078eb_1800x406.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>&#128075; Hey, I&#8217;m Nikki. Each week I write about UX research strategy, communicating impact, and using AI to do your best work. For more: <a href="https://claudeskills.uxrstrategist.com/">Claude Skills Bundle</a> | <a href="https://www.uxrstrategist.com/uxr-ai-prompt-library">AI Prompt Library</a> | <a href="https://www.dropinresearch.com/">Team Training</a></em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><p><em>P.S. 
Paid subscribers get access to full archive, all content, a private Slack community, Substack lives, and a hub of templates, scripts, and mini-courses</em></p><div><hr></div><p>You know the drill: you&#8217;re managing five research projects, you just got pinged to &#8220;take a quick look&#8221; at a half-baked survey, and you&#8217;ve got feedback from 3,000 beta users, most of it vague, repetitive, or weirdly contradictory.</p><p>And yet, you&#8217;re still expected to be strategic, decisive, and insightful.</p><p>This guide is not about AI replacing user researchers, but about how to build a custom GPT agent that acts like a sharp, humble, unflappable thought partner. One that:</p><ul><li><p>Helps you clarify your thinking when your brain is fried</p></li><li><p>Challenges you with smart questions (not just passive answers)</p></li><li><p>Gets you out of blank-page paralysis</p></li><li><p>Spots potential blind spots or contradictions before a stakeholder does</p></li><li><p>Drafts starting points for tedious but necessary docs</p></li></ul><p>Think of it as building your own research copilot. Not to do the job for you, but to help you do it with more clarity, speed, and sanity.</p><p>This article is focused on creating a GPT through ChatGPT, because that is what I am most familiar with, but the same approach can be used with other LLMs as well.</p><div><hr></div><p>Most researchers give up after a few half-hearted prompts. You ask something generic. It spits out something shallow. You move on. It&#8217;s not that AI can&#8217;t help you think better, it can, but only if you know how to push it.</p><p>I built an AI Prompt Library for user researchers who are tired of wasting time on useless outputs.
These are the exact prompts I use when I want to:</p><ul><li><p>Pressure test my research questions</p></li><li><p>Catch blind spots before they derail a study</p></li><li><p>Frame insights so they actually land with leadership</p></li><li><p>Prep for tough stakeholder conversations</p></li><li><p>Kickstart deeper thinking when I&#8217;m stuck</p></li></ul><p>It&#8217;s a working toolkit I&#8217;ve refined in real research projects with teams under pressure to deliver fast, smart, credible work.</p><p>If you want to stop messing around with AI and actually <em>use it to get better at research</em>, you can purchase the library below (starting at &#163;297 for over 60 detailed prompts):</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://userresearchstrategist.squarespace.com/uxr-ai-prompt-library&quot;,&quot;text&quot;:&quot;Get the UXR AI Prompt Library&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://userresearchstrategist.squarespace.com/uxr-ai-prompt-library"><span>Get the UXR AI Prompt Library</span></a></p><div><hr></div><h1>What you&#8217;re actually building</h1><p>You&#8217;re creating a custom GPT that works like an intern. It helps you move faster without cutting corners. 
It doesn&#8217;t replace your judgment or intuition, but it gives you a solid first draft, a second brain, a structured way to think through messy tasks, or an outside perspective you may have missed.</p><p>It&#8217;s not here to &#8220;do research.&#8221; It supports the parts of your job that slow you down, such as writing from scratch, cleaning up notes, explaining something for the third time, or organizing chaos into something usable.</p><p>Your GPT should:</p><ul><li><p>Understand how research fits into product work</p></li><li><p>Ask thoughtful questions when it doesn&#8217;t have enough info</p></li><li><p>Stick to your preferred format (Docs, Markdown, Notion, etc.)</p></li><li><p>Stay grounded in evidence</p></li></ul><p></p><h1>Step 1: Get Clear on What Your GPT Is Actually For</h1><p>Before you start building anything, pause.</p><p>Most people skip this step and jump straight to writing prompts. Then they get frustrated when their GPT gives weird, useless responses.</p><p>That&#8217;s like hiring a new team member and just telling them, &#8220;Help with research.&#8221; You&#8217;d never do that.</p><p>This step is where you define the job.
If your GPT is going to help you, it needs clear direction, just like an intern would.</p><p>You&#8217;re designing a GPT-powered agent that works like a trusted assistant:</p><ul><li><p>Helps you organize your thinking</p></li><li><p>Reflects things back when you&#8217;re stuck</p></li><li><p>Drafts first versions of the stuff you&#8217;d otherwise procrastinate on</p></li><li><p>Challenges you when your plan is too big, vague, or stakeholder-pleasing</p></li></ul><p>Let&#8217;s walk through how to define its job.</p><h3><strong>1.1 Pick a Specific Role</strong></h3><p>Trying to build a GPT that &#8220;helps with research&#8221; is like hiring someone and saying &#8220;your job is everything.&#8221; It won&#8217;t work.</p><p>Instead, think about a recent project where you said, &#8220;I wish I didn&#8217;t have to figure this out alone.&#8221;</p><p>That&#8217;s your first role.</p><p>Here are 6 realistic, high-leverage roles to choose from. These are built for busy UXRs dealing with limited time, pushy teams, and messy briefs.</p><div><hr></div><h4><strong>Role 1: Study Goals &amp; Scope Coach</strong></h4><p><strong>Job:</strong> Help turn a vague or bloated request into a tight, meaningful set of research goals and define what&#8217;s out of scope.</p><p><strong>Use this when:</strong> A PM says &#8220;We just need to understand what people want,&#8221; and you need to create something feasible without turning it into a 3-month epic.</p><p><strong>What it might help you with:</strong></p><ul><li><p>Turning broad goals into sharp questions</p></li><li><p>Spotting red flags in scope creep</p></li><li><p>Suggesting ways to align goals with actual product decisions</p></li><li><p>Asking you clarifying questions when you&#8217;re stuck</p></li></ul><div><hr></div><h4><strong>Role 2: Method Matchmaker</strong></h4><p><strong>Job:</strong> Recommend a suitable method (or hybrid) based on your study goals, team constraints, and timeline.
Includes both rigorous and lean options.</p><p><strong>Use this when:</strong> You&#8217;re toggling between three options&#8212;card sort? 1x1 interviews? diary study?&#8212;and need a sounding board to land on something defensible.</p><p><strong>What it might help you with:</strong></p><ul><li><p>Suggesting methods based on your real constraints</p></li><li><p>Highlighting trade-offs you&#8217;re not considering</p></li><li><p>Providing backup reasoning for your method when stakeholders push back</p></li><li><p>Recommending ways to combine methods into a phased or scrappy approach</p></li></ul><div><hr></div><h4><strong>Role 3: Success Metrics Assistant</strong></h4><p><strong>Job:</strong> Translate vague product or business goals into measurable indicators of research impact (or success). Not just &#8220;insightful findings&#8221; but change that matters.</p><p><strong>Use this when:</strong> Your team says &#8220;We&#8217;ll know the research worked if people get it,&#8221; and you&#8217;re left guessing what &#8220;get it&#8221; means.</p><p><strong>What it might help you with:</strong></p><ul><li><p>Linking research goals to team or company metrics</p></li><li><p>Coming up with proxy indicators (when hard metrics aren&#8217;t possible)</p></li><li><p>Helping define what a &#8220;useful&#8221; outcome looks like ahead of the study</p></li><li><p>Stress-testing your impact assumptions</p></li></ul><div><hr></div><h4><strong>Role 4: Stakeholder Pushback Coach</strong></h4><p><strong>Job:</strong> Help you prepare responses to difficult stakeholder feedback: &#8220;Why are we doing this?&#8221;, &#8220;We already know this,&#8221; &#8220;Can we skip the research?&#8221;</p><p><strong>Use this when:</strong> You&#8217;re emotionally exhausted from defending the value of your work and want help crafting a calm, credible, non-defensive reply.</p><p><strong>What it might help you with:</strong></p><ul><li><p>Drafting responses to common 
objections</p></li><li><p>Helping you clarify the real resistance (budget? timing? ego?)</p></li><li><p>Reframing research as a support tool, not a blocker</p></li><li><p>Giving you talking points in plain language, not theory</p></li></ul><div><hr></div><h4><strong>Role 5: Communication Drafting Assistant</strong></h4><p><strong>Job:</strong> Write first drafts of time-consuming or lower-stakes copy: intro emails to participants, internal study updates, screener survey logic, etc.</p><p><strong>Use this when:</strong> You&#8217;ve got a dozen tabs open, a pile of notes, and zero time to sound polished.</p><p><strong>What it might help you with:</strong></p><ul><li><p>Writing readable stakeholder updates with different tones (PM, exec, designer)</p></li><li><p>Turning bullet points into a screener survey</p></li><li><p>Drafting opt-in emails or session invites</p></li><li><p>Creating follow-up messages when sessions change or people ghost</p></li></ul><div><hr></div><h4><strong>Role 6: Business Alignment Sparring Partner</strong></h4><p><strong>Job:</strong> Help you map research questions to business priorities, identify where the value is likely to show up, and anticipate what stakeholders care about.</p><p><strong>Use this when:</strong> You&#8217;re asked to &#8220;just explore,&#8221; but you know the team will want something that ties back to growth, retention, or efficiency.</p><p><strong>What it might help you with:</strong></p><ul><li><p>Translating &#8220;user friction&#8221; into &#8220;potential revenue loss&#8221;</p></li><li><p>Connecting product discovery to business OKRs</p></li><li><p>Helping reframe user pain into decision-ready language</p></li><li><p>Pressure-testing how your research supports real-world tradeoffs</p></li></ul><div><hr></div><p>Please don&#8217;t choose all of them. Start with one. Think about where you currently lose the most time or feel most unsure. 
That&#8217;s your GPT&#8217;s starting role.</p><p></p><h3><strong>1.2 Define Its Behavior: What This GPT Should Be Like</strong></h3><p>You&#8217;re now writing the personality and guardrails for your assistant. This is called the &#8220;system message.&#8221; It&#8217;s what runs under the hood, every time you use the GPT. This is where you <em>train it to act like your ideal assistant.</em></p><p>Here&#8217;s a basic fill-in-the-blank version:</p><p><strong>System Message Template</strong></p><blockquote><p>You are a [tone or seniority level] research thought partner who helps with [specific task].</p><p>You are [3 traits: sharp, structured, not too verbose].</p><p>You always:</p><ul><li><p>Ask clarifying questions before guessing</p></li><li><p>Speak plainly and directly</p></li><li><p>Offer options, not just single answers</p></li><li><p>Reflect what I&#8217;ve said to check understanding</p></li></ul><p>You never:</p><ul><li><p>Make assumptions without asking</p></li><li><p>Speak like a marketer</p></li><li><p>Try to sound clever</p></li><li><p>Generalize without specific reasoning</p></li></ul></blockquote><p></p><p>Let&#8217;s look at a real example:</p><h4><strong>Example: Method Matchmaker System Message</strong></h4><blockquote><p>You are a senior research advisor who helps select the right method for a study.</p><p>You are practical, thoughtful, and straight-talking.</p><p>You always:</p><ul><li><p>Ask about the project goals, timeline, and constraints</p></li><li><p>Offer 2-3 possible methods with pros/cons</p></li><li><p>Include a scrappy version and a gold-standard version</p></li></ul><p>You never:</p><ul><li><p>Default to user interviews without justification</p></li><li><p>Suggest things we can&#8217;t realistically run</p></li><li><p>Use phrases like &#8220;delight&#8221; or &#8220;unlock&#8221;</p></li></ul></blockquote><p>You&#8217;ll paste this into the GPT builder when we get to Step 2. 
This is the blueprint for how your agent behaves.</p><div><hr></div><h3><strong>1.3 Provide Context Every Time You Use It</strong></h3><p>This is the biggest mistake most people make. They drop in a vague prompt like, &#8220;What&#8217;s the best method for this?&#8221;</p><p>&#8230;and then wonder why the answer is useless.</p><p>GPTs aren&#8217;t mind readers. They need context to act like a partner. Here&#8217;s what good context includes:</p><ul><li><p>What the project is about (1-2 sentences)</p></li><li><p>Who the team is</p></li><li><p>What constraints you&#8217;re facing (time, people, budget)</p></li><li><p>What stage you&#8217;re in (planning, revising, defending)</p></li><li><p>What you want from the assistant (options? feedback? draft?)</p><p></p></li></ul><p></p><h4><strong>Example: Good Setup Prompt</strong></h4>
      <p>
          <a href="https://www.userresearchstrategist.com/p/a-guide-to-ai-agents-for-user-researchers">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[Lessons from the Only Researcher in the Room | Alaine Burns Laycock (Dext)]]></title><description><![CDATA[Alaine shares how she went from solo researcher to team builder, why she embraced democratization out of necessity, and how she aligned research with business goals.]]></description><link>https://www.userresearchstrategist.com/p/lessons-from-the-only-researcher</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/lessons-from-the-only-researcher</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Thu, 18 Sep 2025 08:01:12 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/165080687/92a461c54e289a78d21de1e3ebd1cb41.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p><strong>Listen now on <a href="https://podcasts.apple.com/us/podcast/the-user-research-strategist-uxr-impact-career/id1644716740">Apple</a>, <a href="https://open.spotify.com/show/53eOVirTLtGydqOvicHDyD">Spotify</a>, and <a href="https://www.youtube.com/@userresearchstrategist">YouTube</a>.</strong></p><p>&#8212;</p><p>Alaine is an empathetic UX leader with a proven track record of driving user-centric strategies, adeptly managing teams, and delivering impactful research that drives business growth and strategic alignment.</p><p>In her role as a Research Manager at Dext, she oversees research delivery, strategy, and operational functions. 
With 12 years of extensive experience in user research and design, she is deeply passionate about leading and inspiring individuals while ensuring that customer insights remain at the core of product development.</p><h2><strong>In our conversation, we discuss:</strong></h2><ul><li><p>What happens when a six-person research team shrinks to one and how Alaine rebuilt from there.</p></li><li><p>Why doing &#8220;more research&#8221; wasn&#8217;t the answer, and stepping back was the most strategic move.</p></li><li><p>How to spot the right people to advocate for research, even if it&#8217;s not your direct manager.</p></li><li><p>The messy reality of research democratization and what finally made it work.</p></li><li><p>What it really means to operate as a business partner rather than a user advocate.</p></li></ul><h2><strong>Some takeaways:</strong></h2><ol><li><p>You can&#8217;t research your way out of a broken system. Alaine tried to do everything alone and quickly realized she was burning out while diminishing trust in research. The turning point came when she stepped back to understand what the business <em>actually</em> needed, then rebuilt with that in mind.</p></li><li><p>Get closer to product, not just users. Alaine shifted her focus toward building relationships with senior product leaders and learning how they think. Reporting into a VP of Product helped position research as part of the product org, not a separate function trying to fight for attention.</p></li><li><p>Strategic visibility creates pull, not push. Her first hires were two contractors focused on mission-critical work. Once stakeholders saw the quality and impact, demand for research grew naturally. From there, product teams started asking for more researchers, Alaine didn&#8217;t need to campaign again.</p></li><li><p>Democratization isn&#8217;t plug-and-play, it&#8217;s infrastructure. 
Initial attempts failed until she simplified tools, rewrote bloated processes, and treated it as a real change management project. Now, 30+ people can confidently run research with a central handbook and lightweight systems built around their needs.</p></li><li><p>Start with honesty, not idealism. The first step in rebuilding wasn&#8217;t a grand vision. It was getting brutally honest about how research was perceived across the company. Anonymous feedback and reflection helped clarify what role the business <em>wanted</em> research to play and that insight shaped every step forward.</p></li></ol><h2><strong>Where to find Alaine:</strong></h2><ul><li><p><a href="http://www.linkedin.com/in/alaine-burns-laycock">LinkedIn</a></p></li></ul><div><hr></div><h2><strong>Interested in sponsoring the podcast?</strong></h2><p>Interested in sponsoring or advertising on this podcast? I&#8217;m always looking to partner with brands and businesses that align with my audience. <a href="https://calendly.com/nikkianderson/sponsorship-discovery-call">Book a call</a> or email me at nikki@userresearchacademy.com to learn more about sponsorship opportunities!</p><div><hr></div><p>The views and opinions expressed by the guests on this podcast are their own and do not necessarily reflect the views, positions, or policies of the host, the podcast, or any affiliated organizations or sponsors.</p>]]></content:encoded></item><item><title><![CDATA[10 (newer + bolder) ways to get stakeholder buy-in]]></title><description><![CDATA[If you're tired of being ignored and want to have some fun]]></description><link>https://www.userresearchstrategist.com/p/10-newer-bolder-ways-to-get-stakeholder</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/10-newer-bolder-ways-to-get-stakeholder</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 19 Aug 2025 07:01:26 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!0cIP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0cIP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!0cIP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg 424w, https://substackcdn.com/image/fetch/$s_!0cIP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg 848w, https://substackcdn.com/image/fetch/$s_!0cIP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!0cIP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!0cIP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg" width="586" height="346.1263736263736" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:860,&quot;width&quot;:1456,&quot;resizeWidth&quot;:586,&quot;bytes&quot;:396253,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://userresearchacademy.substack.com/i/163459167?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!0cIP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg 424w, https://substackcdn.com/image/fetch/$s_!0cIP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg 848w, https://substackcdn.com/image/fetch/$s_!0cIP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!0cIP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55dc558e-5459-4b79-a59e-266071773284_4000x2363.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><a href="https://unsplash.com/illustrations/a-group-of-people-sitting-around-a-table-with-a-laptop-BR7v8SZZ5vE">Unsplash</a></figcaption></figure></div><p>Stakeholder buy-in is a tale&#8230;well, a nightmare, as old as time. You run the research. You catch the cracks. You pull together quotes, clips, friction points, and five glaring moments of user confusion.</p><p>And then?</p><p>The meeting ends. Everyone goes back to business as usual. The roadmap stays exactly the same. And your work goes straight into the void&#8230;</p>
      <p>
          <a href="https://www.userresearchstrategist.com/p/10-newer-bolder-ways-to-get-stakeholder">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[The User Research Democratization Playbook: Part Four]]></title><description><![CDATA[Part 4: Responding to UXR Democratization Issues]]></description><link>https://www.userresearchstrategist.com/p/the-user-research-democratization-51f</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/the-user-research-democratization-51f</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 15 Jul 2025 08:00:26 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!W1Bq!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce28c8b-42a9-4b75-ad65-f05ffc0df182_500x500.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#128075;&#127995; Hi, this is Nikki with a <strong>free article</strong> from the User Research Strategist. I share content that helps you move toward a more strategic role as a researcher, measuring your ROI, and delivering impactful insights that move business decisions.</p><p>If you want to see everything I post, subscribe below!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p><em>This is a series on user research democratization &#8212; since this is a tough topic, there was way too much for one article. 
I will be writing this series and posting it over the next weeks and will edit this as I add to the series so you can easily navigate the different parts.</em></p><ul><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 1: The Complex Landscape of Research Democratization</a> (Free)</p></li><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization-d5f?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 2: A Framework for Responsible Research Democratization</a> (Paid)</p></li><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization-03c?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 3: Scaling research without sacrificing rigor</a> (Paid)</p></li></ul><div><hr></div><p><strong>Stop piecing it together. Start leading the work.</strong></p><p>The Everything UXR Bundle is for researchers who are tired of duct-taping free templates and second-guessing what good looks like.</p><p>You get my complete set of toolkits, templates, and strategy guides. 
They&#8217;re used by teams at Google, Spotify, and other companies to run credible research, influence decisions, and actually grow in the role.</p><p>It&#8217;s built to save you time, raise your game, and make you the person people turn to.</p><p>&#8594; Save 140+ hours a year with ready-to-use templates and frameworks</p><p>&#8594; Boost productivity by 40% with tools that cut admin and sharpen your focus</p><p>&#8594; Increase research adoption by 50% through clearer, faster, more strategic delivery</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://userresearchstrategist.squarespace.com/everything-uxr-bundle&quot;,&quot;text&quot;:&quot;Grab the Everything UXR Bundle&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://userresearchstrategist.squarespace.com/everything-uxr-bundle"><span>Grab the Everything UXR Bundle</span></a></p><div><hr></div><p>You&#8217;ve made the leap and you&#8217;ve started democratizing user research. Stakeholders are getting involved, training programs are up and running, and suddenly you have more breathing room to focus on strategic work. Great, right?</p><p>Well, yes&#8230;and no.</p><p>If you&#8217;re anything like me, your journey probably started out promising. You saw stakeholders get excited about conducting research. They started running their own usability tests, sending out surveys, and occasionally producing some pretty solid insights. But somewhere along the way, you probably also encountered a situation where you had to bite your tongue and think:</p><p><em>&#8220;Wait, how did this insight even happen? That&#8217;s not what participants said at all.&#8221;</em></p><p>Sound familiar?</p><p>Maybe you noticed research being conducted without oversight or stakeholders accidentally twisting findings to fit their own narrative. 
Perhaps you&#8217;ve found yourself becoming more of a service desk, fielding endless requests to review interview guides, recruitment strategies, or analysis documents. (And, annoyingly, that&#8217;s exactly what you were trying to avoid in the first place.)</p><p>Democratizing user research is messy.</p><p>But that&#8217;s okay. You&#8217;re not alone, and this isn&#8217;t a sign of failure, just part of the process. After all, user research democratization is relatively new territory for most organizations. There isn&#8217;t a single team out there who hasn&#8217;t run into issues along the way.</p><p>By now, you already know the value democratization can provide, but you&#8217;re probably also realizing it comes with a host of challenges. My goal is to help you respond effectively to these inevitable bumps and frustrations.</p><p>In this article, I&#8217;ll share concrete strategies, realistic examples, and actionable frameworks for responding to the most common issues you&#8217;ll face while democratizing user research. 
I&#8217;ll walk you through how to proactively identify potential pitfalls, tackle them when they arise, and adjust your processes so they don&#8217;t become recurring headaches.</p><p>Together, we&#8217;ll cover how to:</p><ul><li><p>Spot early warning signs of democratization going sideways (before it&#8217;s too late).</p></li><li><p>Establish clear, realistic guidelines that stakeholders can actually follow.</p></li><li><p>Handle common pitfalls like misinterpreted insights, stakeholder overconfidence, and lack of oversight.</p></li><li><p>Build a straightforward system to respond to ethical concerns and compliance risks swiftly.</p></li><li><p>Navigate pushback from stakeholders or leadership who might question the value and role of user research altogether.</p></li></ul><p>Democratization isn&#8217;t about handing over your expertise, it&#8217;s about helping your organization scale research responsibly.</p><h1><strong>Identifying Common Issues in UXR Democratization</strong></h1><p>When I first began democratizing user research, I felt like I&#8217;d found the solution to all my problems. Stakeholders running their own usability tests? Great! Product teams collecting their own data? Even better! I finally had more bandwidth to tackle strategic initiatives.</p><p>But it wasn&#8217;t long before the cracks started showing. Teams got excited, but enthusiasm quickly turned into confusion, misinterpretation, and occasionally, chaos.</p><p>Democratization brings enormous potential, but it also introduces specific pitfalls, some that are obvious, and others you&#8217;ll only discover the hard way. Let&#8217;s break down the most frequent challenges into four clear categories. I&#8217;ll share exactly what these problems look like (so you can spot them early), along with real examples from my experience.</p><h2><strong>Quality Issues</strong></h2><p>Let&#8217;s start here, because quality is often the first place democratization breaks down. 
Remember, not everyone conducting research is trained or experienced. When research quality slips, insights become unreliable, and stakeholders may lose faith in the value of research altogether.</p><h3><strong>Poorly constructed research (bias, flawed methodologies)</strong></h3><p>Stakeholders designing surveys or interviews full of leading questions. For example, a product manager once created a survey where literally every question began with: <em>&#8220;How excited would you be&#8230;?&#8221;</em> Naturally, all answers were positive, but completely useless for decision-making.</p><p><strong>Early warning signs to look out for:</strong></p><ul><li><p>Stakeholders sending research scripts or surveys to participants without your review.</p></li><li><p>Constant use of leading, closed-ended, or ambiguous questions.</p></li><li><p>Overconfidence in their approach despite lack of formal research training.</p></li></ul><h3><strong>Misinterpretation or overgeneralization of findings</strong></h3><p>A single positive comment from one usability participant suddenly becomes proof that &#8220;users love this feature.&#8221; Once, I saw an entire roadmap change direction based on a single, misinterpreted piece of feedback from a friend of the product manager.</p><p><strong>Early warning signs to look out for:</strong></p><ul><li><p>Reports or presentations where quotes are cherry-picked and findings feel suspiciously aligned to stakeholders&#8217; initial beliefs.</p></li><li><p>Sweeping conclusions based on small sample sizes like &#8220;Users universally prefer&#8230;&#8221;</p></li></ul><p></p><h2><strong>Operational Issues</strong></h2><p>Even if the research itself is decent, operational issues can still cause headaches. 
When different teams run their own studies without clear documentation or coordination, chaos ensues, resources get wasted, and valuable insights disappear into black holes.</p><h3><strong>Inconsistent or incomplete documentation</strong></h3><p>Stakeholders conducting studies but never logging their insights anywhere. I&#8217;ve had moments of deja vu when two teams ran essentially identical research simply because no one documented the first team&#8217;s findings.</p><p><strong>Early warning signs to look out for:</strong></p><ul><li><p>Multiple teams are unaware of research others have already completed.</p></li><li><p>Missing context when stakeholders share findings (&#8220;Where&#8217;s the original data for this claim?&#8221;).</p></li></ul><h3><strong>Fragmented repositories or duplicated efforts</strong></h3><p>Research scattered across Slack threads, personal Notion pages, random Google Drive folders, or worse, buried in personal email inboxes. At one company, we discovered four separate research repositories existing simultaneously (all containing different research!).</p><p><strong>Early warning signs to look out for:</strong></p><ul><li><p>Stakeholders constantly asking, &#8220;Where can I find research on X?&#8221;</p></li><li><p>Duplicate requests for similar research studies from different teams.</p></li></ul><h2><strong>Ethical and Compliance Issues</strong></h2><p>These are the scariest because they can have serious legal and ethical consequences. Non-researchers often lack the training to understand the nuances of consent, data protection, or privacy regulations.</p><h3><strong>Mishandling of sensitive data or consent processes</strong></h3><p>Stakeholders recording video sessions without participants&#8217; explicit consent, or worse, sharing sensitive participant data openly across Slack or email. 
I&#8217;ve personally had to step in and remind teams that recording without clear consent isn&#8217;t just unethical, it&#8217;s illegal.</p><p><strong>Early warning signs to look out for:</strong></p><ul><li><p>Stakeholders unsure how to phrase consent forms or handle participant questions about data usage.</p></li><li><p>Unexpected use of unapproved recording or recruitment tools.</p></li></ul><h3><strong>Privacy concerns and non-compliance with regulations</strong></h3><p>Teams unintentionally violating GDPR or other privacy regulations by storing identifiable data improperly or failing to anonymize sensitive information. I&#8217;ve found participants&#8217; personal details casually pasted into public team channels. Yikes!</p><p><strong>Early warning signs to look out for:</strong></p><ul><li><p>Stakeholders asking basic questions about participant data storage or privacy (&#8220;Wait, how long should we keep this data?&#8221;).</p></li><li><p>No centralized guidance or documentation around compliance and privacy.</p></li></ul><h2><strong>Cultural and Organizational Issues</strong></h2><p>Finally, democratization issues aren&#8217;t always technical&#8212;they&#8217;re often about people and culture. Resistance and misunderstanding about the role and value of research can derail even the best-laid plans.</p><h3><strong>Resistance from stakeholders or teams</strong></h3><p>Teams who either dismiss democratized research entirely or, worse, actively undermine it. 
I&#8217;ve encountered stakeholders who insisted, <em>&#8220;We&#8217;ve always made decisions without research, why bother now?&#8221;</em></p><p><strong>Early warning signs to look out for:</strong></p><ul><li><p>Teams consistently questioning the validity of democratized research findings.</p></li><li><p>Minimal engagement with or outright avoidance of your training efforts.</p></li></ul><h3><strong>Devaluing professional research roles</strong></h3><p>Stakeholders assuming anyone can do research (thus dismissing your expertise). Once, an executive confidently proclaimed, &#8220;Why do we need researchers at all if product managers can do interviews?&#8221;</p><p><strong>Early warning signs to look out for:</strong></p><ul><li><p>Reduced hiring budgets or stalled plans for growing the research team, justified by &#8220;democratization is handling it.&#8221;</p></li><li><p>Researchers are asked fewer strategic questions and are more often expected to simply &#8220;check work.&#8221;</p></li></ul><p>Reading through these issues, you might be nodding vigorously because you&#8217;re already experiencing one (or all) of them. I promise you&#8217;re not alone. Democratization, like most things in user research, isn&#8217;t an all-or-nothing game. It&#8217;s a careful balancing act.</p><p>Now that we&#8217;ve identified the most common pitfalls, the next step is learning how to proactively monitor and respond to them quickly and effectively, minimizing disruption and maximizing value. We&#8217;ll dive deeply into this in the next sections, covering concrete strategies and frameworks you can implement immediately.</p><p>But first, take a moment. Reflect on your organization. Which of these issues resonate with you the most? Which can you already see emerging? 
Awareness is the first step towards effective action.</p><p></p><h1><strong>Establishing a Proactive Monitoring System</strong></h1><p>Democratizing research isn&#8217;t &#8220;set it and forget it.&#8221; I learned this lesson early on, mostly by ignoring it until small problems became big headaches. If you don&#8217;t regularly check on your democratized research model, quality can slip, small errors will grow, and teams might stop trusting research altogether.</p><p>Think about it like a garden. You wouldn&#8217;t plant seeds, walk away, and expect flowers to bloom perfectly months later, right? A good garden needs consistent attention, watering, pruning, checking for weeds. Similarly, democratized research needs constant care and monitoring.</p><p>Let&#8217;s dig deeper into practical, actionable ways you can set up an effective monitoring system that catches issues early before they spiral out of control.</p><h2><strong>Implement Regular Quality Audits</strong></h2><p>Regular quality audits are your first line of defense. They sound boring, I know, but trust me: you&#8217;ll be amazed (and maybe alarmed) at what you uncover.</p><h3><strong>Quarterly reviews of randomly selected democratized projects</strong></h3><p>You can&#8217;t audit everything, but periodic spot-checks help you see reality clearly&#8212;without rose-colored glasses. Doing this regularly means you&#8217;ll quickly spot patterns or repeated issues and can jump on them before they spread.</p><p><strong>How I do this:</strong></p><p>Once a quarter, randomly choose a handful of studies conducted by non-researchers. I pick studies of different types, surveys, usability tests, quick interviews, to get a full picture of what&#8217;s happening.</p><p><strong>Questions I ask when auditing:</strong></p><ul><li><p>Were research goals and hypotheses clearly defined?</p></li><li><p>Was the participant recruitment unbiased and appropriate?</p></li><li><p>Did stakeholders ask leading or biased questions? 
(Spoiler: they often do.)</p></li><li><p>Were conclusions properly drawn from data, or were insights exaggerated and cherry-picked?</p></li></ul><p>During one audit, I discovered a marketing team was using highly biased questions in their surveys, questions like, &#8220;How much better is this feature?&#8221; instead of neutral language. Catching this early allowed us to quickly retrain the team before it became a bigger issue.</p><h3><strong>Define clear quality metrics and review standards</strong></h3><p>You can&#8217;t measure quality without standards. Clearly defined metrics help stakeholders know exactly what&#8217;s expected and give you a fair way to judge quality.</p><p><strong>Metrics I typically use:</strong></p><ul><li><p><strong>Participant quality:</strong> Are participants representative of our actual users, or just conveniently available friends and colleagues?</p></li><li><p><strong>Question quality:</strong> Are questions unbiased and open-ended, or are they designed to confirm pre-existing beliefs?</p></li><li><p><strong>Insight quality:</strong> Are insights supported by clear evidence, or are they vague conclusions without data to back them?</p></li></ul><p>For example, I created a simple, transparent scorecard stakeholders could use to self-assess before submitting their findings. It forced stakeholders to be thoughtful about their approach, and audits became faster since basic quality improved dramatically.</p><h2><strong>Set Up Stakeholder Feedback Loops</strong></h2><p>Research democratization relies heavily on stakeholders&#8217; willingness and ability to do good work. But stakeholders won&#8217;t always volunteer when they&#8217;re struggling&#8212;sometimes due to pride, confusion, or even embarrassment. 
So, it&#8217;s critical to proactively reach out and give them a safe, easy way to provide feedback.</p><h3><strong>Regular surveys and interviews to understand stakeholder challenges</strong></h3><p>Regular check-ins help surface problems stakeholders might never mention unprompted. You need clear visibility into their frustrations, struggles, and successes.</p><p><strong>How I do this:</strong></p><ul><li><p>Quick quarterly surveys asking about pain points, confidence levels, and the types of research they&#8217;re struggling with most.</p></li><li><p>Short interviews or casual coffee chats to dive deeper into survey findings, clarifying ambiguous feedback.</p></li></ul><p>I once discovered through a quick stakeholder survey that teams avoided our research repository because they found the tagging system confusing. This simple insight led to a clearer system that increased adoption and reduced duplicated work.</p><h3><strong>Implement an anonymous feedback channel</strong></h3><p>Not everyone feels comfortable openly sharing their struggles, especially if it feels critical of you or your team. An anonymous feedback option ensures honest, candid responses.</p><p><strong>How I do this:</strong></p><p>I use a simple Google Form, clearly labeled as anonymous, sent out monthly. I ask stakeholders questions like:</p><ul><li><p>&#8220;What part of the research process feels most challenging or unclear?&#8221;</p></li><li><p>&#8220;Are there barriers preventing you from using the research repository effectively?&#8221;</p></li></ul><p>For example, anonymous feedback once revealed stakeholders were hesitant to ask for help, fearing they&#8217;d seem incompetent. 
That led me to set up casual &#8220;office hours&#8221; to normalize asking for support, quickly solving that issue.</p><h2><strong>Use Data to Track Common Pitfalls</strong></h2><p>Tracking common pitfalls systematically helps you catch trends early and tackle root causes proactively rather than continuously putting out fires.</p><h3><strong>Patterns in methodology mistakes</strong></h3><p>Repeated mistakes indicate a systemic issue, usually either training gaps or unclear resources.</p><p>During quarterly audits, I categorize common methodology issues. If the same mistakes pop up repeatedly, like consistently biased questions, I know stakeholders need refresher training.</p><p>For example, I noticed stakeholders repeatedly misunderstood when to use open-ended vs. closed-ended questions. A simple, targeted training module completely turned this around, improving question quality across all future studies.</p><h3><strong>Frequent ethical oversights or repository usage issues</strong></h3><p>Ethical issues (like improper consent forms or privacy mistakes) aren&#8217;t just embarrassing, they&#8217;re serious risks. Catching these trends early is critical. Similarly, repository issues can massively undermine the value of your democratization program.</p><p><strong>How I track this:</strong></p><ul><li><p>Logging ethical oversights found during audits or stakeholder feedback sessions.</p></li><li><p>Tracking repository issues: duplicated studies, untagged reports, or documents saved in personal drives instead of central repositories.</p></li></ul><p>After noticing repeated confusion around participant consent forms, we created a simple, required training video specifically on consent and privacy. Issues dropped significantly after stakeholders had clearer guidance.</p><p>Establishing a proactive monitoring system takes work. But believe me, the payoff is huge. 
You&#8217;ll quickly move from firefighting mode, always scrambling, to proactive mode, where you anticipate problems before stakeholders even realize they&#8217;re having them.</p><p>Your stakeholders will appreciate clear guidance, support, and continuous improvements, and you&#8217;ll sleep better knowing your democratization model isn&#8217;t secretly falling apart behind the scenes.</p><p>Remember: democratized research is a powerful tool, but only if you&#8217;re consistently looking after its health. Do your future self (and your stakeholders) a favor by setting up a proactive monitoring system today.</p><h1><strong>Responding to Quality Issues</strong></h1><p>Let&#8217;s talk about something uncomfortable but inevitable. At some point, stakeholders conducting research will produce low-quality work. And it will make your researcher heart sink when you spot biased surveys or usability tests that lack even basic structure. Trust me, I&#8217;ve been there, probably more often than I care to admit.</p><p>While it&#8217;s tempting to panic or start pulling your hair out, what&#8217;s more effective (and sanity-saving) is having clear, actionable strategies ready to respond quickly and productively.</p><p>Here&#8217;s exactly how I tackle these challenges, complete with real-life strategies you can steal immediately.</p><h2><strong>Issue: Stakeholders Producing Low-Quality Research</strong></h2><p>Low-quality research isn&#8217;t just frustrating, it actively undermines the credibility and value of user research in your organization. 
Once stakeholders (or worse, leadership) start questioning the accuracy or value of insights, rebuilding that trust is painfully slow.</p><p>Here&#8217;s exactly what I recommend doing to prevent, and quickly respond to, quality issues:</p><h3><strong>Response Strategy #1: Implement Mandatory Review by Trained Researchers</strong></h3><p>A mandatory review acts as a clear gatekeeper, preventing poorly constructed research from ever reaching stakeholders or decision-makers. It gives your research team a chance to catch and correct mistakes before any damage is done. Here&#8217;s how to do it:</p><ul><li><p>Clearly communicate expectations. Let stakeholders know upfront that every survey, usability test plan, or research guide needs to be reviewed by a trained researcher before it goes live.</p></li><li><p>Build an easy submission process. Create a simple, low-friction submission workflow (Google Form, Notion page, or Slack channel) to submit research for review.</p></li><li><p>Define a realistic review timeline. Provide a transparent turnaround time (mine is usually around 2&#8211;3 days). Stakeholders know exactly when they&#8217;ll get feedback and plan accordingly.</p></li></ul><p>We once had a marketing team write a customer satisfaction survey full of leading questions (&#8220;How much do you love this feature?&#8221;). Thankfully, our mandatory review caught it in time. Instead of panicking, we scheduled a quick 20-minute call, rewrote the questions together, and ended up with meaningful insights rather than biased fluff.</p><h3><strong>Response Strategy #2: Develop a &#8220;Research Quality Checklist&#8221; for Stakeholders</strong></h3><p>When stakeholders have clear criteria to measure their research against, quality dramatically improves, even before it hits your desk for review. 
It helps stakeholders internalize best practices and self-correct earlier in the process.</p><p>My &#8220;Research Quality Checklist&#8221; typically includes points like:</p><ul><li><p>Objective clarity: Are research goals clearly defined and focused?</p></li><li><p>Bias check: Do questions avoid leading language, assumptions, or confirmation bias?</p></li><li><p>Participant recruitment: Is the sample diverse, representative, and unbiased?</p></li><li><p>Insight integrity: Are insights backed by direct evidence and clearly linked to the original research question?</p></li></ul><p>I hand stakeholders this checklist upfront (and repeatedly remind them to use it!).</p><p>A design team I worked with was notorious for usability tests with ambiguous tasks. After introducing the checklist, they started explicitly defining clear objectives and scenarios for testing. Sessions became consistently productive, and I spent far less time rewriting their scripts.</p><h3><strong>Response Strategy #3: Offer Focused, Targeted Training Sessions Addressing Skills Gaps</strong></h3><p>When you see repeated quality issues, it&#8217;s usually because stakeholders simply don&#8217;t know better (or forgot your previous training). Addressing specific, targeted skill gaps in short, practical training sessions can completely transform research quality. 
You can do this by:</p><ul><li><p>Identifying skill gaps: During reviews, audits, or feedback, note exactly which mistakes appear repeatedly.</p></li><li><p>Scheduling short, targeted sessions: Run focused, bite-sized workshops on topics like &#8220;Unbiased Survey Writing&#8221; or &#8220;Structuring Effective Usability Tests.&#8221;</p></li><li><p>Providing clear, actionable templates: Always pair your training with ready-to-use templates or examples to reinforce what they&#8217;ve learned.</p></li></ul><p>We recently noticed a high rate of biased questions in stakeholder surveys (think: &#8220;Why is this feature so great?&#8221;). To fix this, we held a 90-minute workshop specifically on unbiased question-writing, complete with hands-on exercises, clear templates, and concrete before-and-after examples. The next set of surveys we reviewed showed immediate improvement, with questions that produced genuinely insightful data.</p><p>Stakeholders doing research will always come with some degree of quality risk. It&#8217;s inevitable. But instead of despairing, you can prepare proactively with these clear, actionable response strategies:</p><ol><li><p>Mandatory researcher reviews catch errors before they do damage.</p></li><li><p>Research quality checklists empower stakeholders to self-correct before issues arise.</p></li><li><p>Focused, targeted training sessions tackle recurring problems at the source.</p></li></ol><p>In my experience, clearly addressing quality issues head-on, and calmly guiding stakeholders toward improvement, does wonders for building trust, respect, and long-term buy-in for research across your organization.</p><h1><strong>Responding to Operational and Documentation Issues</strong></h1><p>If you&#8217;ve democratized research even slightly, you&#8217;ve probably encountered this headache: fragmented documentation, inconsistent reporting practices, and duplicated insights scattered across every possible tool your company uses. 
You know the drill: one team has their findings in a Slack thread, another in Notion, and a third in a random Google Doc no one can find. Suddenly, your organization&#8217;s insights resemble a digital scavenger hunt rather than a reliable repository of knowledge.</p><p>I&#8217;ve been there (and if I&#8217;m honest, I might even have caused it once or twice). The good news is, it&#8217;s fixable if you take clear, actionable steps early.</p><h2><strong>Issue: Insights scattered across multiple systems, duplicated efforts, repository inefficiencies</strong></h2><p>When documentation gets fragmented, insights become unreliable&#8212;or worse, forgotten. Stakeholders waste precious time chasing down the same insights repeatedly, duplicating studies, or making decisions without the benefit of existing research. This doesn&#8217;t just hurt your credibility, it also wastes everyone&#8217;s time.</p><h3><strong>Response Strategy #1: Clearly define and communicate documentation requirements</strong></h3><p>Documentation chaos usually starts because people aren&#8217;t clear on exactly what&#8217;s expected of them. Having crystal-clear documentation requirements removes ambiguity and creates consistent practices across teams:</p><ul><li><p>Create a standardized reporting template. Provide a structured template everyone uses. Include clear sections for research objectives, methods, key findings, supporting evidence, and next steps. (I usually put mine in Notion or Airtable.)</p></li><li><p>Document exactly where insights must live. Pick one central tool (<a href="https://lnk.condens.io/z3P">Condens</a>, Airtable, or even a dedicated Notion workspace) and explicitly mandate that all final insights must go there, no exceptions.</p></li><li><p>Communicate repeatedly (and kindly). Share your documentation guidelines multiple times: in training, Slack announcements, team meetings, and onboarding sessions. 
Don&#8217;t assume stakeholders remember after seeing it once.</p></li></ul><p>When my last organization faced insight chaos, we standardized all reports into an Airtable template with clear sections. After just a few weeks (and gentle reminders), everyone started reliably documenting insights in the same place, dramatically reducing confusion and duplication.</p><h3><strong>Response Strategy #2: Regularly audit repository use, emphasizing accountability</strong></h3><p>When stakeholders know documentation is being actively monitored, compliance skyrockets. Regular audits reveal who&#8217;s using the repository correctly, who needs extra help, and who might just need a polite reminder. Here&#8217;s how to run effective repository audits:</p><ol><li><p>Schedule quarterly repository audits. Pick random samples of research documentation across teams and evaluate them for completeness, clarity, and proper storage.</p></li><li><p>Create a simple scoring rubric. Develop straightforward evaluation criteria (correctly documented objectives, clearly defined next steps, insights properly tagged).</p></li><li><p>Share audit results transparently. Present audit results openly and clearly, recognizing teams with excellent documentation, and offering practical, supportive feedback to those lagging.</p></li></ol><p>We introduced quarterly repository audits at one company after finding insights scattered across Slack, Notion, and Google Drive. At first, it felt awkward, but after sharing the first audit results transparently (with plenty of positive shoutouts), stakeholders got competitive (in a good way) about improving their documentation. Within two quarters, compliance improved by over 70%.</p><h3><strong>Response Strategy #3: Consolidate insights in a centralized, accessible, and easy-to-use system</strong></h3><p>A centralized, easy-to-use repository removes friction, making it effortless for teams to store, find, and use insights. 
If using your documentation system feels like pulling teeth, stakeholders simply won&#8217;t do it. Make it easy, and they&#8217;ll flock to it. Here are some steps to consolidate effectively:</p><ol><li><p>Pick one intuitive, flexible tool: Choose something that stakeholders genuinely like using. I really recommend tools like <a href="https://lnk.condens.io/z3P">Condens</a> for a repository.</p></li><li><p>Clearly structure and tag insights. Use a simple tagging system (by project, team, method, or persona) to make insights instantly discoverable.</p></li><li><p>Provide training and onboarding support. Hold quick, interactive onboarding sessions showing stakeholders exactly how to use your chosen repository tool (trust me&#8212;this 30-minute training pays dividends!).</p></li></ol><p>I&#8217;ve successfully used <a href="https://lnk.condens.io/z3P">Condens</a> to centralize fragmented research. We ran quick onboarding sessions, developed a consistent tagging system (by project, research method, and audience), and stakeholders loved the ease of searching and sharing insights. The system quickly became indispensable, solving our fragmentation issue practically overnight.</p><h3><strong>Response Strategy #4: Appoint a dedicated research operations owner or coordinator</strong></h3><p>Having someone specifically accountable for research operations ensures that documentation and repositories don&#8217;t slip through the cracks as everyone&#8217;s &#8220;second job.&#8221; A dedicated owner actively manages the system, provides support, runs audits, and makes continuous improvements.</p><ol><li><p>Clearly define responsibilities. Make sure the operations owner knows exactly what&#8217;s expected: managing documentation guidelines, conducting audits, onboarding new stakeholders, and troubleshooting problems.</p></li><li><p>Empower them to drive accountability. 
This person should regularly check in with stakeholders, offer support proactively, and gently hold teams accountable when they slip.</p></li><li><p>Set measurable KPIs. Metrics might include repository compliance rates, reduced duplication, or stakeholder satisfaction with documentation processes.</p></li></ol><p>We assigned one senior researcher as our dedicated research ops coordinator for documentation. She actively supported stakeholders, ran the audits, and even held open &#8220;office hours&#8221; for documentation questions. Our compliance jumped almost immediately and stakeholders genuinely appreciated the dedicated support.</p><p>Scattered documentation and inefficient repositories don&#8217;t fix themselves. But with clear, supportive, and structured interventions, you can turn chaos into clarity:</p><ul><li><p>Define and communicate clear documentation requirements (think easy templates, simple instructions).</p></li><li><p>Regularly audit repository usage to drive accountability (friendly reminders work wonders!).</p></li><li><p>Consolidate your insights in a centralized, user-friendly system (the right tool makes compliance painless).</p></li><li><p>Appoint someone specifically accountable for research operations (this isn&#8217;t a side hustle, give it dedicated attention).</p></li></ul><p>By tackling these operational issues directly (but kindly), you&#8217;ll improve compliance, reduce duplication, and regain trust in the research process all while saving your own sanity.</p><h1><strong>Responding to Ethical and Compliance Issues</strong></h1><p>Ethics and compliance can feel like the least exciting part of user research, but if stakeholders mess it up, things can go downhill fast. 
Mishandled consent or poorly managed sensitive data isn&#8217;t just inconvenient; it can trigger severe regulatory issues, damage customer trust, and genuinely harm your organization&#8217;s reputation.</p><p>I&#8217;ve seen stakeholders accidentally skip consent because they didn&#8217;t fully understand the implications, or casually store sensitive participant data in random documents. </p><p>In fact, I&#8217;ve messed up myself: I was rushing once and ran straight into a minor compliance issue when recruiting participants. </p><p>It happens more often than you&#8217;d think.</p><h2><strong>Issue: Stakeholders mishandling consent or sensitive user data</strong></h2><p>When consent and data handling slip through the cracks, your organization can quickly find itself facing legal issues or, at best, seriously damaged trust with participants. Ethical missteps can even lead leadership to question whether democratization was a good idea in the first place. We want to avoid this scenario completely.</p><p>Here&#8217;s exactly how you can make sure ethical standards stay airtight and actionable:</p><h3><strong>Response Strategy #1: Require explicit ethical training for anyone conducting research</strong></h3><p>Most ethical mistakes happen simply because stakeholders don&#8217;t know what they don&#8217;t know. Clear, explicit ethical training ensures they fully understand their responsibilities, without making ethics feel scary or bureaucratic. Here is how to make ethical training actionable and practical:</p><ul><li><p>Make it short, clear, and mandatory. Run a straightforward 1&#8211;2 hour training that covers participant consent, data protection basics, and the key dos and don&#8217;ts of ethical research.</p></li><li><p>Incorporate real-life scenarios. Use concrete examples and scenarios (ideally from your own organization or similar ones) to highlight common mistakes and exactly how to avoid them.</p></li><li><p>Include an interactive quiz or knowledge check. 
Ensure stakeholders genuinely grasp the content with a quick quiz at the end. No stress, but it reinforces the essentials.</p></li></ul><p>At my previous company, we rolled out a simple mandatory ethics training module with practical scenarios (&#8220;You recorded a usability test&#8212;where can you store the recording?&#8221;). Stakeholders quickly understood exactly what they could and couldn&#8217;t do, and ethical mistakes dropped sharply.</p><h3><strong>Response Strategy #2: Establish a dedicated compliance checkpoint in research workflow</strong></h3><p>Embedding a compliance checkpoint directly into your research workflow ensures ethical considerations aren&#8217;t forgotten in the rush to launch research. It puts ethics front and center at exactly the right moment. Here are some steps you can follow:</p><ol><li><p>Build it into your research process. Clearly mark a compliance checkpoint before any research goes live, this could be a checkbox in your research approval form or a step in your research tool&#8217;s workflow.</p></li><li><p>Provide a simple compliance checklist. Include essential items like: consent form completed, sensitive data storage identified, participant anonymity measures in place.</p></li><li><p>Have clear escalation paths. Clearly communicate who stakeholders should reach out to if they&#8217;re unsure about compliance (typically you or your designated compliance owner).</p></li></ol><p>For example, I introduced a mandatory &#8220;compliance check&#8221; step in our research approval workflow. Stakeholders simply couldn&#8217;t launch studies without confirming that they&#8217;d reviewed the compliance checklist. It only took an extra few minutes, but compliance rates soared immediately.</p><h3><strong>Response Strategy #3: Assign ethics reviews for all research involving sensitive data</strong></h3><p>Sensitive data, like health records, financial information, or private conversations, requires special handling. 
An explicit ethics review ensures risks are identified early, avoiding serious missteps. Here&#8217;s how to run one:</p><ol><li><p><strong>Clearly define what counts as sensitive data. </strong>Explicitly state categories (health data, financial information, personally identifiable information, etc.) that trigger an ethics review.</p></li><li><p><strong>Create a quick ethics-review template. </strong>Include straightforward questions stakeholders must answer, like: &#8220;How is data anonymized?&#8221; &#8220;Where will data be stored?&#8221; &#8220;Who can access it?&#8221;</p></li><li><p><strong>Appoint an ethics review point-person. </strong>Designate someone trained in ethical practices who stakeholders can approach for quick ethics checks. This could be you or another senior researcher.</p></li></ol><h3><strong>Response Strategy #4: Clearly document consent guidelines in easily accessible formats</strong></h3><p>If your consent guidelines are buried in dense documents or scattered in multiple locations, stakeholders simply won&#8217;t use them. Accessible, clearly documented guidelines make ethical compliance effortless. You can do this by:</p><ul><li><p>Creating simple consent form templates. Provide easy-to-use, clearly worded templates stakeholders can quickly adapt for their studies.</p></li><li><p>Documenting exactly how and where to store consent. Give clear instructions about where stakeholders should keep consent forms.</p></li><li><p>Making guidelines easily findable. Store ethical documentation prominently in your centralized research repository (Notion, Airtable, <a href="https://lnk.condens.io/z3P">Condens</a>), and pin it in Slack or internal knowledge tools.</p></li></ul><p>At one org, we noticed incomplete consent forms happening frequently. We created simple consent-form templates stakeholders could easily customize, clearly communicated exactly where to store these forms (in a secured folder), and pinned these instructions in Slack. 
Compliance issues dropped drastically.</p><p>Ethical research isn&#8217;t just about rules, it&#8217;s about trust. By proactively embedding these clear, actionable steps into your democratized research workflow, you protect your organization, empower your stakeholders, and elevate the overall credibility of user research.</p><h1><strong>Responding to Cultural and Organizational Issues</strong></h1><p>User research democratization can quickly become contentious if stakeholders feel uneasy, resistant, or worried about the value of professional researchers diminishing. You may find yourself fielding awkward comments like, <em>&#8220;So, anyone can do user research now?&#8221;</em> or even dealing with stakeholders quietly conducting rogue studies out of misunderstanding or mistrust.</p><p>I&#8217;ve personally seen team members and researchers who were genuinely anxious that democratizing research meant devaluing their hard-earned expertise. It&#8217;s a valid concern. But these cultural challenges are solvable if you&#8217;re proactive, transparent, and empathetic from the start.</p><h2><strong>Issue: Resistance from teams or concerns over the devaluation of UX researchers</strong></h2><p>Resistance isn&#8217;t just annoying, it can seriously derail your democratization efforts. If your teams don&#8217;t fully understand or support the democratization model, you&#8217;re setting yourself up for friction, confusion, and mistrust. People may become protective over their roles, defensive about expertise, or feel threatened. (Not a fun scenario.)</p><p>Let&#8217;s dive into clear, actionable strategies to prevent (or quickly fix) these cultural challenges.</p><h3><strong>Response Strategy #1: Clearly communicate the role and value of professional researchers</strong></h3><p>The root of resistance often comes down to misunderstanding or fear about roles changing. 
Clearly defining the ongoing critical role of professional researchers ensures everyone feels secure, respected, and confident about their place in the new model. Here are some ways to do that:</p><ul><li><p>Schedule an &#8220;All-hands&#8221; or team-wide meeting. Clearly explain the democratization strategy, emphasizing the unique skills, deep expertise, and value professional researchers bring (generative research, complex studies, research synthesis).</p></li><li><p>Create a simple visual or diagram (use a slide, Miro, or Notion). Illustrate clearly what stays researcher-led vs. what stakeholders can lead&#8212;make this easily shareable.</p></li><li><p>Send regular updates (monthly or quarterly emails/Slack posts). Reinforce your researchers&#8217; critical role through clear examples, celebrating their deeper research contributions and impact.</p></li></ul><p>When I first introduced democratization in my team, some researchers expressed concern about their roles. I immediately scheduled a short session clearly articulating researchers&#8217; continuing responsibilities, highlighting strategic research, synthesis, and generative studies. Resistance immediately softened.</p><h3><strong>Response Strategy #2: Reinforce the complementary nature of democratized and dedicated research efforts</strong></h3><p>Stakeholders often worry democratization means professional research is less valuable. Clarifying how democratized research complements (rather than replaces) dedicated researchers ensures everyone sees democratization as collaboration, not competition. Some steps on how to do this include:</p><ul><li><p>Running interactive training sessions. Pair researchers and stakeholders together to illustrate how professional researchers add value by mentoring, reviewing, and guiding research quality.</p></li><li><p>Highlighting concrete examples of collaboration. 
Regularly showcase successful case studies where democratized research fed into, and was improved by, professional researcher insights.</p></li><li><p>Implementing regular pairing or mentorship sessions. Create structured opportunities for researchers and stakeholders to collaborate regularly (office hours, research pairing), reinforcing the complementary relationship.</p></li></ul><p>At one org, to reduce friction, we set up weekly &#8220;Research Office Hours,&#8221; explicitly pairing researchers with stakeholders. Stakeholders quickly saw how professional researchers helped deepen insights, and researchers felt valued for their expertise.</p><h3><strong>Response Strategy #3: Showcase successful democratization examples internally</strong></h3><p>Resistance often stems from skepticism or uncertainty. Showing successful democratized research examples within your organization provides proof democratization works and can win skeptical teams over quickly:</p><ol><li><p>Start a monthly democratization showcase. Briefly highlight one or two successful democratized studies in a short monthly update (email, Slack, newsletter, or <a href="https://lnk.condens.io/z3P">Condens</a>).</p></li><li><p>Ask stakeholders to share their own experiences. Invite stakeholders who&#8217;ve successfully run studies to speak briefly at team meetings or research gatherings about their positive experiences.</p></li><li><p>Create a democratization &#8220;wins&#8221; page. Use Notion, <a href="https://lnk.condens.io/z3P">Condens</a>, or another internal tool to collect examples of successful democratized studies, clearly summarizing outcomes and stakeholder testimonials.</p></li></ol><p>We created a monthly Slack thread called &#8220;Democratization Wins&#8221; where stakeholders shared their successful usability tests and how professional researchers improved their work. 
Skeptical teams quickly became more open-minded.</p><h3><strong>Response Strategy #4: Set clear boundaries around what remains researcher-led and what is democratized</strong></h3><p>Ambiguity creates anxiety. Clearly defined boundaries reassure researchers that their expertise isn&#8217;t being replaced and help stakeholders feel clear about exactly where their responsibilities begin and end. Here&#8217;s how to set those boundaries:</p><ul><li><p>Create and distribute a simple one-pager or decision tree. Explicitly outline exactly which studies stakeholders can lead (usability tests, surveys) and those strictly researcher-led (generative, strategic studies).</p></li><li><p>Communicate boundaries clearly and repeatedly. Reinforce these boundaries in your trainings, during office hours, and via regular communications.</p></li><li><p>Include clear escalation paths. Clearly state who stakeholders should approach if they are unsure whether a study should be researcher-led.</p></li></ul><p>The key to addressing cultural issues is proactive transparency. With clear communication, consistent reinforcement, and tangible examples, you&#8217;ll quickly move your stakeholders from resistant to supportive, strengthening your democratization efforts and your organization&#8217;s trust in user research.</p><h1><strong>Creating an Issue Escalation and Resolution Framework</strong></h1><p>At some point during user research democratization, things will inevitably go wrong. Maybe stakeholders release a biased survey, sensitive data gets mishandled, or critical findings get misinterpreted, causing confusion across teams.</p><p>When these issues arise, your response matters&#8212;a lot. It not only influences the immediate problem but also sets a precedent for how seriously your organization takes user research and its credibility.</p><p>Having an issue escalation and resolution framework in place might sound overly corporate or bureaucratic. 
But this structure will save you from confusion, anxiety, and constant firefighting down the line.</p><h2><strong>Step 1: Establish Clear Escalation Paths</strong></h2><p>When something goes wrong, stakeholders should know exactly who to contact, who will handle it, and how quickly they&#8217;ll hear back. Without this clarity, issues get lost, ignored, or handled inconsistently. Here&#8217;s how to set this up:</p><ol><li><p>Define severity clearly and simply. Start by categorizing issues into three easy-to-remember groups:</p><ol><li><p>Minor: Small mistakes with limited impact (like a single biased question on a usability test).</p></li><li><p>Major: Issues happening repeatedly or those seriously affecting decisions (stakeholders repeatedly misunderstanding findings).</p></li><li><p>Critical: Any ethical or compliance violations (like missing consent forms or privacy breaches).</p></li></ol></li><li><p>Identify clear ownership. Clearly assign who&#8217;s responsible at each level, for example:</p><ol><li><p>Minor issues &#8594; Research operations coordinator (resolved in 1&#8211;2 days)</p></li><li><p>Major issues &#8594; Senior or Lead UX Researcher (response within one day, thorough follow-up within a week)</p></li><li><p>Critical issues &#8594; Head of Research or Data Compliance Lead (immediate resolution within hours)</p></li></ol></li><li><p>Communicate this widely (and repeatedly). Send regular reminders via Slack, email, or wherever stakeholders engage most. Outline the severity levels, who&#8217;s responsible, and how quickly issues will be addressed.</p></li></ol><h2><strong>Step 2: Set Clear Criteria for Escalating Issues</strong></h2><p>Your stakeholders aren&#8217;t mind-readers. If they aren&#8217;t clear on when to escalate an issue, important problems may go unnoticed. Avoid ambiguity by clearly defining when something must be escalated:</p><ol><li><p>Clearly document the specific triggers that require escalation. 
Here are a few examples of how you could define triggers:</p><ol><li><p>Ethical or consent violations: Immediately escalate (critical)</p></li><li><p>Repeated research quality issues (e.g., consistently poor survey design after multiple trainings): Major escalation</p></li><li><p>One-time, minor methodological mistakes (e.g., single instance of a poorly phrased question): Minor escalation</p></li></ol></li><li><p>Give concrete examples to stakeholders. Say something clear and relatable, such as:</p><ol><li><p>&#8220;If you find that participant consent forms were not completed correctly, escalate immediately as a critical issue. If you spot biased questions showing up repeatedly in surveys after multiple training sessions, escalate as a major issue.&#8221;</p></li></ol></li><li><p>Reinforce escalation criteria during training. Explicitly discuss escalation processes during onboarding and refresher sessions so stakeholders know exactly when to act.</p></li></ol><h2><strong>Step 3: Maintain Transparency Throughout the Escalation Process</strong></h2><p>People get anxious when they don&#8217;t know what&#8217;s happening. Transparency about what issues have come up, how you&#8217;re responding, and what you&#8217;re doing to prevent them is essential to building trust in the research process. Here&#8217;s how to put this into practice:</p><ol><li><p>Set up a simple, transparent tracking method. This could be a Google Sheet, Airtable, or Notion page where everyone sees:</p><ol><li><p>A brief description of each issue</p></li><li><p>Severity level and who&#8217;s handling it</p></li><li><p>Current status and how it was resolved</p></li><li><p>Preventative actions taken</p></li></ol></li><li><p>Regularly communicate back to stakeholders. Monthly (or bi-monthly), share brief, plain-language summaries. For example:</p><ol><li><p>&#8220;Last month, we encountered two critical ethical issues around participant consent. 
We quickly resolved this by requiring mandatory ethics training for anyone running research. We also saw recurring issues with survey biases, so we&#8217;ve scheduled refresher training sessions.&#8221;</p></li></ol></li><li><p>Discuss openly during quarterly research meetings. Use these sessions as opportunities to talk candidly about challenges and lessons learned. This reinforces accountability and a healthy research culture.</p></li></ol><p>Your escalation framework doesn&#8217;t need to be complex; it just needs clarity. By clearly defining severity levels, assigning clear ownership, setting explicit escalation criteria, and maintaining full transparency, you&#8217;ll handle democratization issues proactively, calmly, and effectively, keeping user research valuable, credible, and respected across your organization.</p><h1><strong>Communicating Issues and Responses Internally</strong></h1><p>Communicating clearly about issues that crop up is vital. However, no one enjoys receiving negative news, especially when it could reflect badly on their team or their work. How you communicate these issues matters a lot. Poorly handled communication can create resistance or tension; great communication turns these moments into learning opportunities and builds trust.</p><p>Here&#8217;s how to communicate issues internally clearly, constructively, and actionably (without hurting anyone&#8217;s feelings or wasting their time).</p><h2><strong>Always Frame Issues Constructively (Opportunities vs. Failures)</strong></h2><p>No one likes hearing their project has issues, and calling out mistakes can easily make people defensive or demoralized. Instead, present challenges as opportunities for improvement or learning. This shifts the conversation from blame to growth. 
Here are some tips I use:</p><ul><li><p>Avoid negative language.</p><ul><li><p>Instead of: &#8220;This survey was biased and unusable,&#8221;</p></li><li><p>Try: &#8220;We spotted an opportunity to make our surveys clearer and more neutral to ensure high-quality insights.&#8221;</p></li></ul></li><li><p>Always include the solution alongside the issue.</p><ul><li><p>Instead of: &#8220;Participants weren&#8217;t properly consented, this is unacceptable,&#8221;</p></li><li><p>Try: &#8220;We noticed a gap in consent processes. Let&#8217;s use this as an opportunity to clarify our guidelines, implement quick training refreshers, and avoid future issues.&#8221;</p></li></ul></li></ul><h2><strong>Highlight Examples Where Issues Were Successfully Addressed</strong></h2><p>People love stories. Rather than only pointing out where things have gone wrong, include clear, concrete examples where your teams successfully resolved an issue. This builds confidence and reinforces positive behaviors internally. Here are some ways to do this:</p><ul><li><p>Weekly or monthly success stories. Briefly share stories in meetings, Slack, or newsletters:</p><ul><li><p>&#8220;Last month, the product team noticed repeated bias in surveys. After a quick training session, they wrote an unbiased survey that gave clear, actionable insights, directly leading to improved user experience. Great job!&#8221;</p></li></ul></li><li><p>Personalized shoutouts. Recognize individuals publicly (always check first if they&#8217;re comfortable):</p><ul><li><p>&#8220;Huge thanks to Sarah: after attending the refresher on survey design, her latest survey provided some of the clearest data we&#8217;ve seen yet!&#8221;</p></li></ul></li></ul><p>This balances the communication about challenges with recognition, keeping people motivated rather than discouraged.</p><h2><strong>Use Clear, Consistent Language Across All Communications</strong></h2><p>Consistency builds trust and clarity. 
If your communication style or language is all over the place, people get confused. Keep things clear, consistent, and easy to understand so everyone knows exactly what you&#8217;re talking about each time.</p><ol><li><p>Create a simple glossary or communication guide. Outline terms clearly, such as &#8220;biased questions,&#8221; &#8220;ethical escalation,&#8221; and &#8220;critical issues,&#8221; and always use these consistently in emails, Slack, or meetings.</p></li><li><p>Use structured communication templates. For example, a short, clear message structure for issues might look like this:</p><ol><li><p>Issue Identified: (Brief description in neutral, factual language)</p></li><li><p>Opportunity: (Positive framing of issue as an improvement opportunity)</p></li><li><p>Immediate Actions: (Exactly what&#8217;s being done right away)</p></li><li><p>Next Steps: (Any follow-up training, check-ins, or audits planned)</p></li></ol></li></ol><p><strong>Example of a short internal communication:</strong></p><blockquote><p><strong>Issue Identified:</strong></p><p>Consent forms were incomplete on three recent user interviews.</p><p><strong>Opportunity:</strong></p><p>Great chance to refresh our team&#8217;s awareness on consent guidelines to improve data compliance.</p><p><strong>Immediate Actions:</strong></p><p>We&#8217;ve scheduled a brief, focused training for next week on consent processes.</p><p><strong>Next Steps:</strong></p><p>Compliance checks will be reinforced to prevent recurrence. Any questions&#8212;reach out directly!</p></blockquote><h2><strong>Keep Regular Updates Short and Actionable</strong></h2><p>Your stakeholders are busy. Long, drawn-out emails or Slack messages won&#8217;t get read thoroughly. Short, actionable messages are far more effective. </p><ul><li><p><strong>Use bullet points instead of paragraphs. 
</strong>People skim, so bullet points grab attention: Instead of &#8220;We noticed multiple issues with biased questions in recent surveys, and it&#8217;s essential that we address these issues quickly to ensure our data remains valid and trustworthy&#8230;&#8221;</p></li><li><p>Try:</p><ul><li><p>Issue: Recent surveys have biased questions.</p></li><li><p>Solution: Immediate refresher training this Friday at 11am.</p></li><li><p>Action: RSVP here (link) and attend live or watch the recording by end-of-day Monday.</p></li></ul></li><li><p>End each message with a clear action or call to action. For example, &#8220;Action required: Attend the training session or watch the recording by next week.&#8221;</p></li></ul><p>This approach ensures your messages drive immediate, helpful actions, rather than being ignored or postponed.</p><p>Here&#8217;s how you might combine all of the above into one clear, positive, actionable message:</p><blockquote><p><strong>Quick Update: Improving Survey Quality</strong></p><p><strong>Issue:</strong></p><p>We recently spotted biased questions in some stakeholder surveys, which limits the accuracy of our findings.</p><p><strong>Opportunity:</strong></p><p>This is a great chance for everyone to brush up on survey best practices and improve data quality together!</p><p><strong>What&#8217;s happening next:</strong></p><ul><li><p>A quick, practical survey design workshop is scheduled this Thursday at 3 pm (RSVP here).</p></li><li><p>We&#8217;ve added clearer templates to our documentation (available here).</p></li><li><p>Shoutout to Alex&#8217;s team: after attending this session last quarter, their recent surveys have been excellent!</p></li></ul><p><strong>Action:</strong></p><p>Please RSVP and attend the session or watch the recording by the end of the week. Any questions? Slack me!</p></blockquote><p></p><p>Communication around democratization issues should never be scary or anxiety-inducing. 
When you frame issues constructively, celebrate successes, keep your language clear, and communicate actionably and briefly, your team will see research democratization as a continuous improvement process, one they&#8217;re excited to be a part of, rather than afraid of getting wrong.</p><p>Start small, stay consistent, and keep the tone positive. It really makes all the difference.</p><h1><strong>Democratization Issues Are Normal</strong></h1><p>If you&#8217;re feeling overwhelmed by all these potential democratization issues, take a deep breath. Encountering problems when democratizing user research isn&#8217;t just normal; it&#8217;s expected. Even the most thoughtfully built frameworks hit snags along the way. The difference between successful democratization and a messy situation is proactively managing these bumps rather than letting them spiral.</p><h2><strong>Expect Issues</strong></h2><p>First off, normalize the idea that democratization won&#8217;t be perfect from day one. Stakeholders will inevitably write biased surveys, repositories might get messy, and ethical slip-ups could occur. These are not failures&#8212;just signals you need clearer guidance, training, or oversight.</p><ul><li><p>Remind yourself (and your stakeholders!) frequently that issues are learning opportunities, not disasters.</p></li><li><p>Have your escalation and response frameworks clearly documented and ready to go, so you&#8217;re never caught off-guard.</p></li></ul><h2><strong>Use Clear Governance and Proactive Monitoring</strong></h2><p>A clear governance structure is like the scaffolding around your democratization efforts. It holds everything steady. 
Regularly checking in on your democratized research through audits, feedback loops, and clear checkpoints ensures your framework stays healthy and credible.</p><ul><li><p>Set up quarterly quality audits and stakeholder feedback loops immediately.</p></li><li><p>Publish your governance framework widely so everyone knows exactly how things work and how they can quickly flag or resolve issues.</p></li></ul><h2><strong>Targeted Responses Solve Problems</strong></h2><p>When issues arise, don&#8217;t just react randomly&#8212;be intentional. Having specific strategies to address different types of problems (quality, operational, ethical, cultural) makes your responses faster, clearer, and more effective.</p><ul><li><p>Create easy-to-follow response plans for each type of issue we discussed:</p><ul><li><p>Quality: Quick checklists, targeted training refreshers, mandatory reviews.</p></li><li><p>Operational: Centralized, clearly documented repositories, dedicated research ops oversight.</p></li><li><p>Ethical: Simple templates, explicit consent guidelines, mandatory ethical training, and compliance checkpoints.</p></li><li><p>Cultural: Clear boundaries between democratized and researcher-led studies, regular celebration of successes, reinforcing the value of professional research roles.</p></li></ul></li></ul><h2><strong>Continuous Improvement Is Not Optional</strong></h2><p>Democratization is never &#8220;done.&#8221; It&#8217;s a constantly evolving process. 
Regularly revisiting your approach and adjusting your strategy keeps your organization sharp, credible, and effective.</p><ul><li><p>Schedule regular review checkpoints at least quarterly to reassess how your democratization model is performing.</p></li><li><p>Create an easy way for stakeholders to give ongoing feedback&#8212;anonymous surveys or open Slack channels&#8212;so you know exactly where to focus improvement efforts.</p></li></ul><p>Responding to democratization issues doesn&#8217;t mean something&#8217;s gone wrong; it means you&#8217;re doing democratization right. Every organization faces these challenges, but the ones that thrive are proactive, clear, and structured.</p><p>Democratization issues can be like weeds in a garden: inevitable, but manageable if you consistently check, prune, and nurture. By clearly communicating your plans, proactively monitoring your processes, and positively addressing challenges, you&#8217;ll keep democratization growing healthy and strong.</p><p>Now, go tackle democratization confidently. You&#8217;ve got this.</p><div><hr></div><h1><strong>Stop piecing it together. Start leading the work.</strong></h1><p>The Everything UXR Bundle is for researchers who are tired of duct-taping free templates and second-guessing what good looks like.</p><p>You get my complete set of toolkits, templates, and strategy guides. 
Used by teams at companies like Google and Spotify, it helps you run credible research, influence decisions, and actually grow in your role.</p><p>It&#8217;s built to save you time, raise your game, and make you the person people turn to.</p><p>&#8594; Save 140+ hours a year with ready-to-use templates and frameworks</p><p>&#8594; Boost productivity by 40% with tools that cut admin and sharpen your focus</p><p>&#8594; Increase research adoption by 50% through clearer, faster, more strategic delivery</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://userresearchstrategist.squarespace.com/everything-uxr-bundle&quot;,&quot;text&quot;:&quot;Grab the Everything UXR Bundle&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://userresearchstrategist.squarespace.com/everything-uxr-bundle"><span>Grab the Everything UXR Bundle</span></a></p><div><hr></div><p>Stay curious,</p><p>Nikki</p>]]></content:encoded></item><item><title><![CDATA[The User Research Democratization Playbook: Part Three]]></title><description><![CDATA[Part 3: Scaling research without sacrificing rigor]]></description><link>https://www.userresearchstrategist.com/p/the-user-research-democratization-03c</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/the-user-research-democratization-03c</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 08 Jul 2025 08:00:32 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!IMnN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#128075;&#127995; Hi, this is Nikki with a <strong>paid article</strong> from the User Research Strategist. 
I share content that helps you move toward a more strategic role as a researcher, measure your ROI, and deliver impactful insights that drive business decisions.</p><p>If you want to see everything I post, subscribe below!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p><em>This is a series on user research democratization &#8212; since this is a tough topic, there was way too much for one article. I will be writing this series and posting it over the next weeks and will edit this as I add to the series so you can easily navigate the different parts.</em></p><ul><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 1: The Complex Landscape of Research Democratization</a> (Free)</p></li><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization-d5f?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 2: A Framework for Responsible Research Democratization</a> (Paid)</p></li><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization-51f?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 4: Responding to UXR Democratization Issues</a> (Free)</p></li></ul><div><hr></div><p><strong>Stop piecing it together. 
Start leading the work.</strong></p><p>The Everything UXR Bundle is for researchers who are tired of duct-taping free templates and second-guessing what good looks like.</p><p>You get my complete set of toolkits, templates, and strategy guides. Used by teams at companies like Google and Spotify, it helps you run credible research, influence decisions, and actually grow in your role.</p><p>It&#8217;s built to save you time, raise your game, and make you the person people turn to.</p><p>&#8594; Save 140+ hours a year with ready-to-use templates and frameworks</p><p>&#8594; Boost productivity by 40% with tools that cut admin and sharpen your focus</p><p>&#8594; Increase research adoption by 50% through clearer, faster, more strategic delivery</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://userresearchstrategist.squarespace.com/everything-uxr-bundle&quot;,&quot;text&quot;:&quot;Grab the Everything UXR Bundle&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://userresearchstrategist.squarespace.com/everything-uxr-bundle"><span>Grab the Everything UXR Bundle</span></a></p><div><hr></div><h1>Scaling Research Without Sacrificing Rigor</h1><p>Research democratization is not a one-size-fits-all solution. What works for one company might fail in another due to differences in team size, research maturity, leadership buy-in, and decision-making culture. Instead of adopting a democratization model blindly, your first step should be to research your own organization.</p><p>This chapter has outlined multiple approaches to research democratization, from fully decentralized models to research-led approaches with controlled access. But choosing the right approach requires an honest evaluation of your company&#8217;s unique needs, challenges, and research maturity.</p><p>This is where your own research skills come into play. 
Before deciding how to scale research within your organization, take time to assess your current environment, define your goals, and determine the level of structure and oversight required to ensure success.</p><h2>Assess the Current State of Research in Your Organization</h2><p>To scale research without sacrificing rigor, you first need a comprehensive understanding of how research currently functions within your organization. Without a clear baseline, democratization efforts risk becoming chaotic, misaligned, or ineffective. Don&#8217;t skip this step! We do this: </p><ul><li><p>To identify exactly who is conducting research.</p></li><li><p>To understand the types of research being performed.</p></li><li><p>To find bottlenecks preventing effective research.</p></li><li><p>To gauge how leadership values research.</p></li><li><p>To document how research insights are shared and consumed.</p></li></ul><p>Let&#8217;s dive into how to assess the current state.</p><h3><strong>1. Map Out Current Research Roles and Responsibilities</strong></h3><p><strong>Objective: </strong>Clarify exactly who is conducting research, formally or informally.</p><p><strong>Steps:</strong></p><ol><li><p>List all individuals conducting research regularly, including:</p><ol><li><p>Trained researchers (User Researchers, UX Researchers, etc.)</p></li><li><p>Designers</p></li><li><p>Product Managers</p></li><li><p>Marketers</p></li><li><p>Engineers</p></li><li><p>Customer Support or Success Teams</p></li></ol></li><li><p>Determine the frequency with which each role performs research:</p><ol><li><p>Is research part of their job description, or are they doing it informally?</p></li><li><p>How frequently do non-researchers independently initiate studies?</p></li></ol></li></ol><p></p><h3><strong>2. 
Inventory Types of Research Currently Conducted</strong></h3><p><strong>Objective: </strong>Understand what research methods are being used and by whom.</p><p><strong>Steps:</strong></p><ol><li><p>Catalog recent research projects over the last 3-6 months.</p></li><li><p>Categorize these by research type:</p><ol><li><p>Usability testing (quick tests, prototype evaluations)</p></li><li><p>Surveys (customer satisfaction, feedback)</p></li><li><p>Interviews (generative or evaluative)</p></li><li><p>Analytics reviews (product usage analysis)</p></li><li><p>Generative or strategic studies (exploratory, opportunity-focused research)</p></li></ol></li><li><p>Highlight gaps between desired and actual types of research performed.</p></li></ol><p></p><h3><strong>3. Identify Research Bottlenecks and Pain Points</strong></h3><p><strong>Objective: </strong>Determine the obstacles preventing effective and timely research.</p><p><strong>Steps:</strong></p><ol><li><p>Conduct stakeholder interviews or surveys asking:</p><ol><li><p>&#8220;How often are you delayed by waiting for research results?&#8221;</p></li><li><p>&#8220;Have you ever skipped research due to lack of availability?&#8221;</p></li><li><p>&#8220;How often do you have to conduct research on your own without support?&#8221;</p></li></ol></li><li><p>Quantify these pain points if possible (&#8220;70% of Product Managers skip research due to long wait times&#8221;).</p></li></ol><p><strong>Example survey question (with rating scale):</strong></p><blockquote><p>&#8220;On a scale of 1-5, how often do you find research availability a blocker for timely decisions?&#8221;</p><p>(1 = Never, 5 = Always)</p></blockquote><p><strong>Example finding:</strong></p><p>&#8220;80% of Product Managers rated research availability as 4 or higher, indicating significant delays.&#8221;</p><p></p><h3><strong>4. 
Evaluate Leadership&#8217;s Attitude Toward Research</strong></h3><p><strong>Objective: </strong>Assess how research is valued by leaders and decision-makers.</p><p><strong>Steps:</strong></p><ol><li><p>Conduct targeted leadership interviews or distribute a leadership-focused survey. Ask clear, pointed questions such as:</p><ol><li><p>&#8220;Do you consider research essential, helpful, or optional for decision-making?&#8221;</p></li><li><p>&#8220;Can you provide examples of recent decisions influenced by research?&#8221;</p></li><li><p>&#8220;How much are you willing to invest in research resources and training?&#8221;</p></li></ol></li><li><p>Analyze responses to determine if leadership views research as:</p><ol><li><p>A critical component</p></li><li><p>An occasional input</p></li><li><p>A luxury or nice-to-have</p></li></ol></li></ol><p><strong>Example question:</strong></p><blockquote><p>&#8220;Describe a recent instance where research directly impacted your decision-making.&#8221;</p></blockquote><p><strong>Example finding:</strong></p><p>&#8220;We delayed launching the new pricing model until the UX team conducted surveys&#8212;research is crucial for big decisions like pricing.&#8221;</p><p></p><h3><strong>5. 
Assess How Research Insights Are Currently Shared and Stored</strong></h3><p><strong>Objective: </strong>Understand how research insights are documented and made accessible.</p><p><strong>Steps:</strong></p><ol><li><p>Identify all locations where research insights currently live:</p><ol><li><p>Centralized (Dovetail, Airtable, Confluence)</p></li><li><p>Decentralized (Google Drive folders, Slack, emails)</p></li></ol></li><li><p>Check how consistently insights are documented:</p><ol><li><p>Do insights consistently include the research question, methods, results, and actionable recommendations?</p></li></ol></li><li><p>Evaluate accessibility and discoverability:</p><ol><li><p>Are insights easy to find by people across the organization?</p></li><li><p>How often do stakeholders complain about not finding past research?</p></li></ol></li></ol><p></p><h3><strong>6. Summarize Your Findings Into a Clear Research Landscape Report</strong></h3><p><strong>Objective: </strong>Create a succinct, actionable summary highlighting gaps, strengths, and weaknesses.</p><p><strong>Suggested structure:</strong></p><ul><li><p>Current research roles:</p><ul><li><p>Who&#8217;s conducting research (trained vs. informal)?</p></li></ul></li><li><p>Research types in use:</p><ul><li><p>Methods commonly and rarely used.</p></li></ul></li><li><p>Identified bottlenecks:</p><ul><li><p>Delays in conducting or accessing research.</p></li></ul></li><li><p>Leadership alignment:</p><ul><li><p>How critically does leadership view research?</p></li></ul></li><li><p>Research documentation &amp; sharing:</p><ul><li><p>Current status of knowledge management and accessibility.</p></li></ul></li></ul><p>Example:</p><p>&#8220;Research is primarily done by one full-time UX researcher, supported informally by designers and PMs. Usability testing is frequent, but generative research is nonexistent. Teams often skip research due to delays, and leadership sees it as important but secondary. 
Documentation is decentralized, causing frequent duplication and wasted efforts.&#8221;</p><p></p><h2>Define Your Organization&#8217;s Research Needs and Risks</h2><p>Not all organizations require the same level of research rigor, nor can they accept the same level of risk. Clearly understanding your organization&#8217;s specific needs and risk tolerance is essential to creating a democratization model that is both effective and safe. This step is necessary to:</p><ul><li><p>Ensure the democratization model aligns with your organization&#8217;s specific risk and rigor requirements.</p></li><li><p>Prevent costly errors resulting from inappropriate levels of oversight.</p></li><li><p>Leverage existing skills within your organization efficiently.</p></li></ul><h3><strong>1. Determine the Level of Research Rigor Required</strong></h3><p>Research rigor refers to the quality standards and methodological thoroughness expected within your organization.</p><p><strong>Steps:</strong></p><ol><li><p>Classify decision types and their consequences. List recent or upcoming decisions influenced by research. 
Categorize these by the impact and risks involved.</p></li><li><p>Categorize the decisions into tiers of required rigor:</p><ol><li><p>High-Rigor: Decisions have significant financial, legal, or safety implications.</p></li><li><p>Medium-Rigor: Decisions impact user satisfaction, retention, or moderate financial outcomes.</p></li><li><p>Low-Rigor: Decisions are incremental, reversible, or experimental.</p></li></ol></li></ol><p><strong>Prompting questions:</strong></p><ul><li><p>What kinds of decisions does your organization regularly face?</p></li><li><p>What is the worst-case scenario if research for these decisions is inaccurate or incomplete?</p></li><li><p>Can you group your decisions into categories based on the potential risk or consequence?</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!IqSz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!IqSz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png 424w, https://substackcdn.com/image/fetch/$s_!IqSz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png 848w, https://substackcdn.com/image/fetch/$s_!IqSz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png 1272w, 
https://substackcdn.com/image/fetch/$s_!IqSz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!IqSz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png" width="1456" height="196" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:196,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:105495,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://userresearchacademy.substack.com/i/159748769?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!IqSz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png 424w, https://substackcdn.com/image/fetch/$s_!IqSz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png 848w, 
https://substackcdn.com/image/fetch/$s_!IqSz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png 1272w, https://substackcdn.com/image/fetch/$s_!IqSz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F867931e9-fb40-48df-8d8c-3cccf964314a_2230x300.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p></p><h3><strong>2. Clarify Your Organization&#8217;s Risk Tolerance</strong></h3><p>Risk tolerance defines how much uncertainty or potential harm your organization is willing to accept as it expands research responsibilities.</p><p><strong>Steps:</strong></p><ol><li><p>Conduct internal interviews or workshops to gauge comfort with risk:</p><ol><li><p>Ask stakeholders to rate their tolerance for potential research errors (low, medium, high).</p></li><li><p>Discuss potential consequences openly and document responses.</p></li></ol></li><li><p>Create a risk assessment matrix to visualize tolerance clearly.</p></li></ol><p><strong>Prompting questions:</strong></p><ul><li><p>How comfortable is leadership with research findings from non-researchers driving key decisions?</p></li><li><p>What types of errors or biases can your organization afford, and what is completely unacceptable?</p></li><li><p>Which areas (finance, regulatory, health) require the strictest oversight, and which can afford some flexibility?</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zuvn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!zuvn!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png 424w, https://substackcdn.com/image/fetch/$s_!zuvn!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png 848w, https://substackcdn.com/image/fetch/$s_!zuvn!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png 1272w, https://substackcdn.com/image/fetch/$s_!zuvn!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!zuvn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png" width="1456" height="250" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:250,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:81830,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://userresearchacademy.substack.com/i/159748769?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zuvn!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png 424w, https://substackcdn.com/image/fetch/$s_!zuvn!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png 848w, https://substackcdn.com/image/fetch/$s_!zuvn!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png 1272w, https://substackcdn.com/image/fetch/$s_!zuvn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70b1651b-9874-4a17-858a-caa0d6971c95_1746x300.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p></p><h3><strong>3. 
Evaluate Existing Research Skills in Your Organization</strong></h3><p>Understanding the research skills and experience within your organization is essential for deciding how much oversight or training you&#8217;ll need to implement.</p><p><strong>Steps:</strong></p><ol><li><p>Create an inventory of current research skills among non-research stakeholders.</p></li><li><p>Use surveys or interviews to capture their research experience.</p></li><li><p>Categorize teams by experience (High, Medium, Low).</p></li><li><p>Assess the gap between current skills and desired research rigor levels.</p></li></ol><p><strong>Prompting questions:</strong></p><ul><li><p>Do your product or design teams have formal research training?</p></li><li><p>Are there team members regularly conducting interviews, usability tests, or surveys without oversight?</p></li><li><p>Which teams consistently produce reliable insights, and which require significant researcher intervention?</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!g6BE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!g6BE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png 424w, https://substackcdn.com/image/fetch/$s_!g6BE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png 848w, 
https://substackcdn.com/image/fetch/$s_!g6BE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png 1272w, https://substackcdn.com/image/fetch/$s_!g6BE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!g6BE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png" width="1456" height="179" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:179,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:90193,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://userresearchacademy.substack.com/i/159748769?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!g6BE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png 424w, 
https://substackcdn.com/image/fetch/$s_!g6BE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png 848w, https://substackcdn.com/image/fetch/$s_!g6BE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png 1272w, https://substackcdn.com/image/fetch/$s_!g6BE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa14a3690-9a42-45e0-9336-a0e17487ce9f_2406x296.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p></p><h2>Identify the Right Democratization Model for Your Context</h2><p>Selecting the appropriate democratization model is critical. The right model enables your organization to effectively scale research without sacrificing quality, credibility, or reliability. 
Using insights gathered from assessing your current state (Step 1) and defining your organization&#8217;s research needs and risks (Step 2), follow the guide below to pinpoint exactly which model aligns best with your organization&#8217;s unique circumstances.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!IMnN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!IMnN!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png 424w, https://substackcdn.com/image/fetch/$s_!IMnN!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png 848w, https://substackcdn.com/image/fetch/$s_!IMnN!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png 1272w, https://substackcdn.com/image/fetch/$s_!IMnN!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!IMnN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png" width="1456" height="320" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:320,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:167854,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://userresearchacademy.substack.com/i/159748769?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!IMnN!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png 424w, https://substackcdn.com/image/fetch/$s_!IMnN!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png 848w, https://substackcdn.com/image/fetch/$s_!IMnN!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png 1272w, https://substackcdn.com/image/fetch/$s_!IMnN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F77eaf2fd-3715-47bb-8a25-1b60f99b45c7_2184x480.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h3><strong>If your organization has a small research team supporting many product teams:</strong></h3>
      <p>
          <a href="https://www.userresearchstrategist.com/p/the-user-research-democratization-03c">
              Read more
          </a>
      </p>
]]></content:encoded></item><item><title><![CDATA[Breaking the Research Bubble | Zack Stewart (Anthropologica)]]></title><description><![CDATA[Zack Stewart shares how UX researchers can push beyond product teams to shape strategy, connect across departments, and drive action through creativity.]]></description><link>https://www.userresearchstrategist.com/p/breaking-the-research-bubble-zack</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/breaking-the-research-bubble-zack</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Thu, 03 Jul 2025 08:00:51 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/165079075/076ebb21eafe607011229f2bfa46b1ac.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p><strong>Listen now on <a href="https://podcasts.apple.com/us/podcast/the-user-research-strategist-uxr-impact-career/id1644716740">Apple</a>, <a href="https://open.spotify.com/show/53eOVirTLtGydqOvicHDyD">Spotify</a>, and <a href="https://www.youtube.com/@userresearchstrategist">YouTube</a>.</strong></p><p>&#8212;</p><p>Zack is a mixed-methods researcher who transitioned from managing a restaurant pre-pandemic to becoming a UX researcher. Being homeschooled from an early age sparked a deep curiosity for human behavior and unconventional problem-solving. With a unique background, including a bachelor&#8217;s degree in film production and several associate degrees, he brings creative techniques and innovative methodologies to his work, delivering actionable insights that empower stakeholders to make confident decisions.</p><p>Zack led generative research and ideation that resulted in real estate's first AI-driven search experience at Flyhomes, addressing pain points that plagued traditional search methods for over a decade. Passionate about collaboration, he believes it&#8217;s the key to creating exceptional products and services. 
As a long-form improviser, he applies a &#8220;yes, and&#8221; mindset to facilitate ideation sessions with diverse stakeholders, ensuring research leads to meaningful outcomes.</p><h2><strong>In our conversation, we discuss:</strong></h2><ul><li><p>Zack&#8217;s shift from running a restaurant to leading UX research, and how that background shaped his curiosity.</p></li><li><p>The difference between hitting deadlines and creating meaningful outcomes across departments.</p></li><li><p>What it takes to keep asking hard questions during a study, not just at the beginning.</p></li><li><p>Step-by-step ways to connect with teams like sales or marketing, even when they&#8217;ve never heard of UX research.</p></li><li><p>How to use short videos, emotional storytelling, and collaborative workshops to get research noticed and used.</p></li></ul><h2><strong>Some takeaways:</strong></h2><ol><li><p>Research isn&#8217;t just for product and design.<strong> </strong>Zack treats UX like a dartboard with many targets. Impact grows when researchers map who else touches the user journey and pull those people in from the start. Sales, support, marketing, leadership; each group brings new context and different blind spots.</p></li><li><p>Ask tough questions even when the project is underway. When Zack senses something is off, he pauses. He isn&#8217;t afraid to bring up pivots, gaps, or flawed assumptions mid-way through a study. That level of care helps avoid wasted effort later, and teams usually respect the honesty when it&#8217;s framed around shared goals.</p></li><li><p>Start with a small ask when reaching out. Zack sends a quick message before booking meetings. His script focuses on what the other person is working toward and how research might support that. Once a relationship forms, it&#8217;s easier to loop them in during planning or synthesis.</p></li><li><p>Workshops land better with the right mix of people. 
Instead of relying on leadership alone, Zack invites people from across levels and functions to join ideation. Groups merge ideas step-by-step until everyone has shaped the outcome. That approach often leads to ideas that reflect both the big picture and day-to-day realities.</p></li><li><p>Make insights feel real. Zack uses breakup letter prompts and short interview videos to get decision-makers to stop, feel, and act. A one-minute clip where ten customers echo the same pain point can trigger more change than a research report ever could.</p></li></ol><h2><strong>Where to find Zack:</strong></h2><ul><li><p><a href="https://www.linkedin.com/in/zack-t-stewart/">LinkedIn</a></p></li></ul><div><hr></div><h2><strong>Stop piecing it together. Start leading the work.</strong></h2><p>The Everything UXR Bundle is for researchers who are tired of duct-taping free templates and second-guessing what good looks like.</p><p>You get my complete set of toolkits, templates, and strategy guides, used by teams across Google and Spotify to run credible research, influence decisions, and actually grow in your role.</p><p>It&#8217;s built to save you time, raise your game, and make you the person people turn to.</p><p>&#8594; Save 140+ hours a year with ready-to-use templates and frameworks</p><p>&#8594; Boost productivity by 40% with tools that cut admin and sharpen your focus</p><p>&#8594; Increase research adoption by 50% through clearer, faster, more strategic delivery</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://userresearchstrategist.squarespace.com/everything-uxr-bundle&quot;,&quot;text&quot;:&quot;Grab the Everything UXR Bundle&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://userresearchstrategist.squarespace.com/everything-uxr-bundle"><span>Grab the Everything UXR Bundle</span></a></p><div><hr></div><h2><strong>Interested in sponsoring the 
podcast?</strong></h2><p>Interested in sponsoring or advertising on this podcast? I&#8217;m always looking to partner with brands and businesses that align with my audience. <a href="https://calendly.com/nikkianderson/sponsorship-discovery-call">Book a call</a> or email me at nikki@userresearchacademy.com to learn more about sponsorship opportunities!</p><div><hr></div><p>The views and opinions expressed by the guests on this podcast are their own and do not necessarily reflect the views, positions, or policies of the host, the podcast, or any affiliated organizations or sponsors.</p>]]></content:encoded></item><item><title><![CDATA[The User Research Democratization Playbook: Part Two]]></title><description><![CDATA[Part 2: A framework for responsible research democratization]]></description><link>https://www.userresearchstrategist.com/p/the-user-research-democratization-d5f</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/the-user-research-democratization-d5f</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 17 Jun 2025 08:00:20 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!GhR7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#128075;&#127995; Hi, this is Nikki with a <strong>paid article</strong> from the User Research Strategist. 
I share content that helps you move toward a more strategic role as a researcher, measure your ROI, and deliver impactful insights that drive business decisions.</p><p>If you want to see everything I post, subscribe below!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p><em>This is a series on user research democratization &#8212; since this is a tough topic, there was way too much for one article. I will be writing this series and posting it over the next weeks and will edit this as I add to the series so you can easily navigate the different parts.</em></p><ul><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 1: The Complex Landscape of Research Democratization</a> (Free)</p></li><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization-03c?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 3: Scaling Research Without Sacrificing Rigor</a> (Paid)</p></li><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization-51f?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 4: Responding to UXR Democratization Issues</a> (Free)</p></li></ul><div><hr></div><p><strong>Stop piecing it together. 
Start leading the work.</strong></p><p>The Everything UXR Bundle is for researchers who are tired of duct-taping free templates and second-guessing what good looks like.</p><p>You get my complete set of toolkits, templates, and strategy guides, used by teams across Google and Spotify to run credible research, influence decisions, and actually grow in your role.</p><p>It&#8217;s built to save you time, raise your game, and make you the person people turn to&#8212;not around.</p><p>&#8594; Save 140+ hours a year with ready-to-use templates and frameworks</p><p>&#8594; Boost productivity by 40% with tools that cut admin and sharpen your focus</p><p>&#8594; Increase research adoption by 50% through clearer, faster, more strategic delivery</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://userresearchstrategist.squarespace.com/everything-uxr-bundle&quot;,&quot;text&quot;:&quot;Grab the Everything UXR Bundle&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://userresearchstrategist.squarespace.com/everything-uxr-bundle"><span>Grab the Everything UXR Bundle</span></a></p><div><hr></div><h1>The Framework for Responsible Research Democratization</h1><p>Scaling research while maintaining rigor is not a simple process. Without structure, democratization can result in misleading insights, ethical missteps, and wasted effort. 
However, when implemented correctly, it empowers teams to make user-centered decisions while ensuring that research retains its credibility and influence.</p><p>This framework is designed to help research leaders establish a structured, effective, and scalable democratization model, one that enables non-researchers to contribute to research without compromising quality.</p><h3>Step 1: Define What Can and Cannot Be Democratized</h3><p>One of the biggest mistakes in research democratization is assuming that all research methods can (or should) be conducted by non-researchers. That is not the case.</p><p>A successful framework begins with clear definitions of what research activities can be democratized and what must remain with trained researchers. This prevents low-quality research from being used in high-stakes decision-making and ensures that non-researchers are only conducting studies that fit their skill set.</p><h4>Create a Categorization System for Research Methods</h4><p>Break research activities into four tiers:</p><ol><li><p>Fully Democratized &#8211; Can be run by non-researchers with minimal oversight.</p></li><li><p>Democratized with Oversight &#8211; Can be conducted by non-researchers, but requires a trained researcher&#8217;s review.</p></li><li><p>Guided Research &#8211; Non-researchers can be involved, but a trained researcher must lead the study.</p></li><li><p>Restricted to Researchers &#8211; Must be conducted exclusively by trained researchers.</p></li></ol><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!GhR7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!GhR7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png 424w, https://substackcdn.com/image/fetch/$s_!GhR7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png 848w, https://substackcdn.com/image/fetch/$s_!GhR7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png 1272w, https://substackcdn.com/image/fetch/$s_!GhR7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!GhR7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png" width="1456" height="930" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:930,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:220938,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://userresearchacademy.substack.com/i/159002989?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!GhR7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png 424w, https://substackcdn.com/image/fetch/$s_!GhR7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png 848w, https://substackcdn.com/image/fetch/$s_!GhR7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png 1272w, https://substackcdn.com/image/fetch/$s_!GhR7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa60314b4-c0b8-4913-9aa5-afdcfcf367b7_1618x1034.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><h4>Create a Decision Framework for Stakeholders</h4><p>Once research categories are defined, build a decision tree that helps stakeholders determine:</p><ul><li><p>When they can conduct research independently.</p></li><li><p>When they need to partner with a researcher.</p></li><li><p>When they need to escalate to a research team.</p></li></ul><p>For example:</p><ul><li><p>Does this research involve sensitive user data or compliance risks? &#8594; If yes, it must be conducted by a trained researcher.</p></li><li><p>Is this a usability test for a minor UI update? &#8594; If yes, a trained non-researcher can conduct it following a structured template.</p></li><li><p>Is this a strategic or generative study exploring unmet needs? &#8594; If yes, it must be conducted by a research professional.</p></li></ul><p>Having a clear framework removes ambiguity and prevents research from being misused or diluted.</p><h3>Step 2: Create Clear Guidelines and Guardrails</h3><p>Without clear guidelines, research democratization can quickly spiral into inconsistent methods, poor-quality data, and confusion across teams.</p><h4>1. Develop Standardized Research Protocols</h4><p>To ensure consistency, create a set of research protocols that all teams must follow. 
These should include:</p><ol><li><p>Usability Testing Guides &#8211; Pre-written usability testing scripts, rubrics for evaluating responses, and success metrics.</p></li><li><p>Survey Templates &#8211; Guidelines on how to write unbiased survey questions and analyze responses correctly.</p></li><li><p>Participant Recruitment Best Practices &#8211; Prevents teams from relying on biased, unrepresentative samples (only interviewing internal employees).</p></li><li><p>Data Storage and Handling Policies &#8211; Ensures that participant privacy and legal compliance are followed.</p></li></ol><h4>2. Implement Pre-Approved Research Templates</h4><p>Rather than letting teams design research studies from scratch, provide pre-approved templates that ensure structured, repeatable processes. For example, a usability test template might include:</p><ul><li><p>Scripted introduction to ensure consistency in test facilitation.</p></li><li><p>Pre-written, unbiased tasks for participants.</p></li><li><p>A standardized results sheet to ensure that findings are logged consistently.</p></li></ul><p>Templates help prevent ad-hoc, low-quality research from creeping into the organization.</p><h4>3. Build Research Governance into the Process</h4><ul><li><p>Require all studies to be logged in a central repository before they are conducted.</p></li><li><p>Assign a researcher to review methodologies for any study run by non-researchers.</p></li><li><p>Create a decision tree for ethical considerations, ensuring that sensitive studies are escalated appropriately.</p></li></ul><p>By creating strong governance from the start, research democratization remains structured, not chaotic.</p><h3>Step 3: Provide Training and Ongoing Support</h3><p>Training is not optional in research democratization. 
It is the foundation that determines whether non-researchers will conduct studies that actually improve decision-making or introduce flawed, misleading insights into the organization.</p><p>Without training, democratization does not scale research; it scales bad research. Untrained stakeholders may run usability tests with leading questions, interpret survey data incorrectly, or unknowingly introduce bias into their studies. Worse, they may present their findings with false confidence, leading to major business or product decisions being made on inaccurate or incomplete data.</p><p>Many organizations make the mistake of treating research training as a one-time event&#8212;a workshop, a few documentation pages, or an online course. But research is a skill that requires reinforcement, practice, and feedback. Even experienced researchers continually refine their craft.</p><p>For democratization to be effective, organizations need to establish an ongoing, structured training system that aligns with the level of research responsibilities stakeholders will have.</p><h4>1. Build a Tiered Training Program</h4><p>Not every stakeholder needs the same level of research expertise. The goal of training is not to turn product managers or designers into full-fledged researchers; it&#8217;s to ensure that they have enough knowledge to conduct certain types of research effectively while knowing when to escalate more complex studies. A tiered approach ensures that training is scalable, relevant, and structured.</p><h4>Level 1: Research Awareness (Mandatory for All Stakeholders Involved in Research)</h4><p>This is the foundational level for anyone conducting or relying on research insights. It is designed to ensure that all stakeholders understand the purpose of research, when it is appropriate for them to conduct studies, and when they need to escalate to a trained researcher. 
Topics covered:</p><ul><li><p>Bias Awareness and Mitigation &#8211; How to recognize and reduce bias in research questions, participant selection, and interpretation of results.</p></li><li><p>When to Conduct Research vs. When to Escalate &#8211; A clear decision framework for determining whether a study should be owned by a researcher or if it can be conducted by a non-researcher.</p></li><li><p>Ethical Considerations in Research &#8211; Understanding participant consent, privacy requirements, and ethical issues related to data collection and storage.</p></li><li><p>How Research Fits Into the Organization &#8211; The role of research in decision-making and how democratized research should feed into the broader research ecosystem.</p></li></ul><p>This training should be a required baseline for anyone involved in research&#8212;no exceptions.</p><h4>Level 2: Basic Research Training (For Those Conducting Tactical Studies)</h4><p>This level is for stakeholders who will be running their own research studies, such as product managers, designers, or marketers conducting usability tests or small-scale surveys. Topics covered:</p><ul><li><p>How to Conduct Usability Testing &#8211; Structuring usability tests, avoiding leading questions, and synthesizing findings.</p></li><li><p>Survey Design Best Practices &#8211; Writing unbiased questions, selecting appropriate response formats, and analyzing survey data responsibly.</p></li><li><p>Basic Interviewing Skills &#8211; When and how to ask open-ended vs. 
closed-ended questions, active listening techniques, and how to probe deeper without leading.</p></li><li><p>How to Synthesize Findings Responsibly &#8211; Avoiding cherry-picking data, recognizing patterns, and presenting insights objectively.</p></li></ul><p>At this level, stakeholders should still have oversight from researchers, but they can conduct certain studies independently with structured templates and review processes in place.</p><h4>Level 3: Advanced Research Training (Optional for Stakeholders Seeking Deeper Expertise)</h4><p>This level is not required for most democratized research participants but can be beneficial for stakeholders who want to develop more advanced research skills and greater autonomy. Examples of who might pursue this level:</p><ul><li><p>Senior designers who frequently run complex usability studies.</p></li><li><p>Product leaders who want to deeply integrate research into their strategy.</p></li><li><p>Marketers conducting ongoing customer insights research.</p></li></ul><p>Topics covered:</p><ul><li><p>Advanced Interviewing Techniques &#8211; Learning how to facilitate in-depth qualitative research, including Jobs-to-Be-Done (JTBD) interviews.</p></li><li><p>Behavioral Data Analysis &#8211; How to connect qualitative insights with analytics data for a more comprehensive understanding of user behavior.</p></li><li><p>Longitudinal and Diary Studies &#8211; How to structure longer-term research that tracks user behavior over time.</p></li><li><p>How to Lead Research Synthesis and Workshops &#8211; Training on how to facilitate research readouts and stakeholder engagement sessions.</p></li></ul><p>Stakeholders at this level may require less oversight for certain research types, but they should still have their work peer-reviewed by professional researchers.</p><h3>2. Set Up Ongoing Research Mentorship and Coaching</h3><p>Training is only effective if it is reinforced through practice, feedback, and ongoing support. 
Organizations that simply provide training sessions but fail to offer continuous coaching often find that:</p><ul><li><p>Stakeholders forget key research principles over time.</p></li><li><p>Poor research practices start creeping back in.</p></li><li><p>Teams still struggle with synthesizing and interpreting insights correctly.</p></li></ul><p>A structured mentorship and support system ensures that research remains high quality over time and that non-researchers have access to expert guidance when needed.</p><h4>1. Office Hours with Researchers</h4><p>Setting up weekly or bi-weekly office hours allows non-researchers to:</p><ul><li><p>Get feedback on their research plans before launching a study.</p></li><li><p>Ask questions about synthesis and reporting.</p></li><li><p>Discuss challenges or uncertainties they&#8217;re facing in their research.</p></li></ul><p>This system creates a structured yet flexible way for researchers to provide ongoing guidance without needing to hand-hold every study.</p><h4>2. Research Coaching Programs</h4><p>Some organizations may benefit from a formal coaching program, where trained researchers mentor non-researchers through their first few studies. 
A structured coaching model might look like this:</p><ol><li><p>Observation Phase &#8211; The non-researcher shadows a researcher conducting a study, taking notes on best practices.</p></li><li><p>Co-Facilitation Phase &#8211; The non-researcher conducts part of a study under the guidance of a researcher.</p></li><li><p>Supervised Execution &#8211; The non-researcher conducts a full study independently, with a researcher reviewing their work and providing feedback.</p></li><li><p>Independent Research with Oversight &#8211; The non-researcher is approved to conduct select studies on their own but still submits research plans and synthesis for review.</p></li></ol><p>This gradual introduction to conducting research ensures that stakeholders build real skills rather than diving in with little preparation.</p><h4>3. Quality Review Check-Ins</h4><p>To maintain consistency, all democratized research should be subject to regular quality reviews. This includes:</p><ul><li><p>Pre-study reviews &#8211; A researcher approves study designs before they begin.</p></li><li><p>Post-study reviews &#8211; Researchers check that findings are synthesized correctly and insights are actionable.</p></li><li><p>Quarterly audits &#8211; Reviewing all democratized research to identify trends, common mistakes, and areas for additional training.</p></li></ul><p>These regular check-ins act as a safety net, ensuring that non-researchers remain aligned with best practices and that the research function maintains credibility.</p><h3>Step 4: Establish a Centralized Research Repository</h3><p></p>
      <p>
          <a href="https://www.userresearchstrategist.com/p/the-user-research-democratization-d5f">
              Read more
          </a>
      </p>
]]></content:encoded></item><item><title><![CDATA[The User Research Democratization Playbook: Part One]]></title><description><![CDATA[Part 1: The Complex Landscape of Research Democratization]]></description><link>https://www.userresearchstrategist.com/p/the-user-research-democratization</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/the-user-research-democratization</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 03 Jun 2025 08:00:14 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!bMXk!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#128075;&#127995; Hi, this is Nikki with a <strong>free article</strong> from the User Research Strategist. I share content that helps you move toward a more strategic role as a researcher, measure your ROI, and deliver impactful insights that move business decisions.</p><p>If you want to see everything I post, subscribe below!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p><em>This is a series on user research democratization &#8212; since this is a tough topic, there was way too much for one article. 
I will be writing this series and posting it over the next weeks and will edit this as I add to the series so you can easily navigate the different parts.</em></p><ul><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization-d5f?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 2: A Framework for Responsible Research Democratization</a> (Paid)</p></li><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization-03c?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 3: Scaling Research Without Sacrificing Rigor</a> (Paid)</p></li><li><p><a href="https://open.substack.com/pub/userresearchacademy/p/the-user-research-democratization-51f?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">Part 4: Responding to UXR Democratization Issues</a> (Free)</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!bMXk!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!bMXk!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png 424w, https://substackcdn.com/image/fetch/$s_!bMXk!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png 848w, 
https://substackcdn.com/image/fetch/$s_!bMXk!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png 1272w, https://substackcdn.com/image/fetch/$s_!bMXk!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!bMXk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png" width="1456" height="717" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:717,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:495531,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://userresearchacademy.substack.com/i/157949398?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!bMXk!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png 424w, 
https://substackcdn.com/image/fetch/$s_!bMXk!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png 848w, https://substackcdn.com/image/fetch/$s_!bMXk!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png 1272w, https://substackcdn.com/image/fetch/$s_!bMXk!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff49e3b6d-c8c9-4177-af94-d7fad2913289_4000x1969.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><a href="https://unsplash.com/illustrations/community-of-business-people-building-teamwork-and-cooperation-cartoon-corporate-tiny-characters-connect-and-match-puzzle-parts-together-make-achievement-flat-vector-illustration-challenge-concept-dly5qyR3N5w">Unsplash</a></figcaption></figure></div><div><hr></div><h2><strong>Stop piecing it together. Start leading the work.</strong></h2><p>The Everything UXR Bundle is for researchers who are tired of duct-taping free templates and second-guessing what good looks like.</p><p>You get my complete set of toolkits, templates, and strategy guides, used by teams across Google, Spotify, and beyond, to run credible research, influence decisions, and actually grow in your role.</p><p>It&#8217;s built to save you time, raise your game, and make you the person people turn to.</p><p>&#8594; Save 140+ hours a year with ready-to-use templates and frameworks</p><p>&#8594; Boost productivity by 40% with tools that cut admin and sharpen your focus</p><p>&#8594; Increase research adoption by 50% through clearer, faster, more strategic delivery</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://userresearchstrategist.squarespace.com/everything-uxr-bundle&quot;,&quot;text&quot;:&quot;Grab the Everything UXR Bundle&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://userresearchstrategist.squarespace.com/everything-uxr-bundle"><span>Grab the Everything UXR Bundle</span></a></p><div><hr></div><h1>The Complex Landscape of Research Democratization</h1><p>User research is at an inflection point. Demand for research insights is growing exponentially, but research teams remain small. 
This imbalance forces organizations to explore democratization, enabling non-researchers to conduct research.</p><p>At its best, democratization scales insights, increases research buy-in, and enhances customer-centricity across an organization. At its worst, it leads to poor-quality research, biased data, and diluted research rigor. For researchers, democratization can feel like a double-edged sword: it helps meet demand, but it can also erode the depth and expertise of the field if done poorly.</p><h2>The Growing Demand for Research Across Organizations</h2><p>User research has expanded beyond the traditional UX and product development lifecycle. Today, research is needed for:</p><ul><li><p>Product strategy &#8211; Understanding user needs before features are even conceptualized.</p></li><li><p>Marketing validation &#8211; Ensuring messaging aligns with real customer pain points.</p></li><li><p>Customer support optimization &#8211; Identifying friction points that lead to high support ticket volumes.</p></li><li><p>Business decision-making &#8211; Using research to inform investment, expansion, and prioritization.</p></li></ul><p>As research extends into these diverse functions, the traditional researcher-to-team ratio has become unsustainable. Many researchers find themselves spread too thin, juggling too many projects with too few resources. In some cases, researchers must reject critical research requests simply because they don&#8217;t have the bandwidth. 
This leads to frustration among stakeholders who feel they lack access to the insights they need.</p><p>Additionally, many researchers face the dilemma of being a solo UXR, or part of a small team, trying to take care of multiple teams (I was once working across 10 teams), foundational processes, ops, and the maintenance of a research framework.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!nDZf!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!nDZf!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!nDZf!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!nDZf!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!nDZf!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!nDZf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:255883,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://userresearchacademy.substack.com/i/157949398?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!nDZf!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg 424w, https://substackcdn.com/image/fetch/$s_!nDZf!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg 848w, https://substackcdn.com/image/fetch/$s_!nDZf!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!nDZf!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F571b4e23-c12d-4682-a213-2df97c0926dd_1920x1080.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>However, research teams haven&#8217;t scaled at the same rate as demand. According to <a href="https://maze.co/resources/continuous-research-report/">Maze&#8217;s 2023 Continuous Research Report</a>, 64% of companies now have a democratized research culture to cope with increasing research requests. 
Yet, research bandwidth remains a major challenge:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!k9Qr!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!k9Qr!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png 424w, https://substackcdn.com/image/fetch/$s_!k9Qr!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png 848w, https://substackcdn.com/image/fetch/$s_!k9Qr!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png 1272w, https://substackcdn.com/image/fetch/$s_!k9Qr!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!k9Qr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png" width="642" height="374.5" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:700,&quot;width&quot;:1200,&quot;resizeWidth&quot;:642,&quot;bytes&quot;:32171,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://userresearchacademy.substack.com/i/157949398?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!k9Qr!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png 424w, https://substackcdn.com/image/fetch/$s_!k9Qr!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png 848w, https://substackcdn.com/image/fetch/$s_!k9Qr!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png 1272w, https://substackcdn.com/image/fetch/$s_!k9Qr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F457dc004-0834-483f-ad36-6bfa6ff33340_1200x700.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>This imbalance is what drives the need for democratization but, without structure, it can create chaos rather than efficiency.</p><h2>The Growing Desire from Non-Researchers to Engage in Research</h2><p>It&#8217;s not just research teams feeling the pressure; stakeholders themselves are becoming more eager to engage with user insights. Product managers, designers, and even marketing teams want direct access to users. They see the value in speaking with customers, testing hypotheses, and gathering feedback.</p><p>This shift is largely positive. When more people across an organization have a user-centric mindset, better decisions are made. 
However, without structure, this enthusiasm can lead to problems:</p><ul><li><p>Biased research &#8211; Without training, non-researchers may unconsciously lead participants toward desired answers.</p></li><li><p>Unethical practices &#8211; Mishandling of participant consent, privacy, and data security.</p></li><li><p>Poor research methodologies &#8211; Relying on convenience sampling, asking leading questions, or misinterpreting results.</p></li></ul><p><a href="https://www.optimalworkshop.com/blog/democratizing-ux-research-empowering-cross-functional-teams">A 2023 industry survey</a> found that 73% of UX researchers reported spending significant time correcting or guiding poorly conducted research by non-researchers. This often happens because non-researchers unknowingly introduce confirmation bias, leading questions, and flawed synthesis.</p><h1>The Risks and Rewards of Democratizing Research</h1><p>Like many trends in UX and product development, democratization is neither inherently good nor bad; it depends entirely on how it is implemented.</p><h2>Potential Benefits of Democratization:</h2><ol><li><p>Scalability &#8211; More research gets done without overburdening researchers.</p></li><li><p>Stakeholder Buy-in &#8211; Teams feel more ownership of insights, increasing the likelihood of acting on findings.</p></li><li><p>Faster Decision-Making &#8211; Teams don&#8217;t have to wait weeks or months for research results.</p></li><li><p>User-Centric Culture &#8211; More teams engaging with research can help embed user thinking into company culture.</p></li></ol><h2>Potential Risks of Democratization:</h2><ol><li><p>Compromised Research Quality &#8211; Without structure, teams may conduct poorly designed studies, leading to misleading conclusions.</p></li><li><p>Research Fragmentation &#8211; Multiple teams running isolated studies without alignment, leading to duplicated work and inconsistent data.</p></li><li><p>Undermining the Value of Research &#8211; If anyone 
can &#8220;do research,&#8221; the expertise of trained researchers may be devalued.</p></li><li><p>Ethical Concerns &#8211; Improper consent collection, storing sensitive data insecurely, or asking questions that could cause harm.</p></li></ol><p>In a <a href="https://www.userinterviews.com/state-of-user-research-2023-report">survey by User Interviews</a>, participants rated their feelings about democratization on a scale from 1 (very concerned/dissatisfied) to 5 (very excited/satisfied). On average, participants rated their feelings a 2.95 out of 5&#8212;just below neutral. Sentiment toward democratization was lowest among UXRs, who gave an average rating of 2.84.</p><p>The key to democratization is finding the right balance by enabling access to research while ensuring rigor and ethical responsibility.</p><h2>A Personal Reflection</h2><p>I&#8217;ll never forget the first time I experienced the impact of ungoverned research democratization. At one company, a well-intentioned product manager decided to run their own customer interviews. They were frustrated that the research team couldn&#8217;t prioritize their request, so they took matters into their own hands.</p><p>On the surface, it seemed great: my stakeholders were taking initiative. But when they presented their findings, I quickly realized the data was riddled with issues:</p><ul><li><p>They had only interviewed three participants, all of whom were personal contacts.</p></li><li><p>The questions were leading, designed to confirm the PM&#8217;s assumptions rather than uncover real insights.</p></li><li><p>They had misinterpreted responses, turning neutral feedback into positive validation.</p></li></ul><p>As a result, they advocated for a product change that was completely misaligned with actual user needs. It wasn&#8217;t until months later, after the launch failed, that the team realized their mistake.</p><p>This experience solidified my belief that democratization without structure is dangerous. 
However, the opposite extreme&#8212;gatekeeping research&#8212;also isn&#8217;t the answer. If research teams hoard insights, they risk becoming bottlenecks and alienating stakeholders.</p><h1>Defining Democratization in a Research Context</h1><p>At its simplest, research democratization is the process of making research more accessible beyond the UX research team. It allows product managers, designers, marketers, customer support teams, and even engineers to engage with research, conduct studies, and apply insights. But accessibility doesn&#8217;t mean a free-for-all; it requires structure, training, and well-defined boundaries.</p><blockquote><p>Nielsen Norman Group has called this &#8216;Democratization 2.0,&#8217; where research is distributed but carefully guided through training, templates, governance, and tiered research access.</p></blockquote><p>For democratization to work, it must be intentional. It&#8217;s not about handing research tools to anyone who wants them. It&#8217;s about equipping the right people with the right skills to conduct the right kinds of research under the right conditions.</p><h2>What Democratization Looks Like in Practice</h2><p>A well-structured approach to research democratization includes:</p><ul><li><p>Training and education &#8211; Non-researchers need to be taught not just how to conduct research but how to recognize its limitations, avoid bias, and synthesize insights properly.</p></li><li><p>Clear guidelines on who can conduct what research &#8211; Not all research should be democratized. Simple usability tests? Yes. Complex generative studies? 
No.</p></li><li><p>Templates and frameworks &#8211; Providing standardized interview guides, usability testing scripts, and survey templates reduces the likelihood of poorly designed studies.</p></li><li><p>A review and oversight process &#8211; Researchers should act as coaches and advisors, ensuring studies are structured correctly and that findings are interpreted responsibly.</p></li><li><p>A centralized research repository &#8211; Without a system for documenting and sharing insights, research efforts become fragmented, leading to duplication and inconsistencies.</p></li></ul><p>At one company, we introduced a tiered democratization system to balance access with quality control:</p><ul><li><p>Product managers and designers were trained to run usability tests using a structured process. They had to submit a research plan before running any sessions, and a researcher reviewed their findings before they were shared.</p></li><li><p>Marketing and customer success teams were given access to pre-approved survey templates but needed a researcher&#8217;s sign-off before launching a survey.</p></li><li><p>All generative and exploratory research remained the responsibility of trained researchers, ensuring foundational insights came from those with the expertise to conduct such studies properly.</p></li></ul><p>This system allowed the research team to focus on high-impact projects while enabling stakeholders to conduct low-risk research on their own. It wasn&#8217;t about giving away research, it was about scaling it responsibly.</p><h2>Common Misconceptions About Democratization</h2><p>Much of the resistance to research democratization comes from misunderstanding what it actually entails. 
Let&#8217;s break down the biggest misconceptions.</p><h3>Misconception #1: &#8220;Democratization means replacing researchers.&#8221;</h3><p>One of the most common fears among researchers is that democratization is a thinly veiled cost-cutting measure, a way for companies to avoid hiring or retaining research talent. The reality is, if democratization is implemented as a replacement for researchers, it will fail. Research quality will drop, teams will make decisions based on incomplete or biased findings, and the organization will ultimately feel the consequences in lost revenue, increased churn, or misguided product investments.</p><p>Democratization should extend the impact of research, not eliminate the need for researchers. When stakeholders conduct basic research, it frees up researchers to focus on deeper, more complex studies&#8212;the kind that require a trained researcher&#8217;s skill set.</p><p><strong>A responsible approach to democratization:</strong> A product team struggling with usability issues trained designers to run usability tests using a pre-approved script. However, researchers still guided the study setup, reviewed findings, and ensured insights were properly synthesized. This allowed research to happen faster while maintaining quality.</p><p><strong>A dangerous approach to democratization:</strong> An organization decided researchers weren&#8217;t needed because product managers could &#8220;just talk to users.&#8221; Without training or structure, these conversations were riddled with leading questions, incorrect assumptions, and cherry-picked data that confirmed pre-existing biases. The result? A product launch based on faulty insights, leading to poor adoption and wasted development time.</p><h3>Misconception #2: &#8220;Anyone can do research well.&#8221;</h3><p>It&#8217;s tempting to think that research is just about asking people questions. 
After all, everyone talks to users in some capacity, so doesn&#8217;t that mean anyone can conduct research? Not exactly. Good research requires more than just talking to customers. It involves:</p><ul><li><p>Knowing how to frame a study to uncover real insights, not just confirm assumptions.</p></li><li><p>Asking the right kinds of questions&#8212;ones that don&#8217;t lead or bias participants.</p></li><li><p>Understanding how to analyze responses in a way that reflects true patterns, not just individual anecdotes.</p></li><li><p>Recognizing the limits of what a given method can tell you.</p></li></ul><p>I once worked with a marketing team that wanted to &#8220;validate&#8221; a new pricing strategy by running a customer survey. When I reviewed their draft, I found that nearly every question was leading: &#8220;Would you be excited to see this new lower price?&#8221; &#8220;How much better is this compared to what we had before?&#8221; The survey was structured in a way that guaranteed positive responses, and they nearly made a major pricing change based on biased data.</p><p>The solution isn&#8217;t to ban stakeholders from doing research, it&#8217;s to train them on how to do it properly and put safeguards in place.</p><p><strong>A balanced approach:</strong> Educate non-researchers about common biases, provide pre-approved templates, and have researchers review research plans before launch.</p><p><strong>A risky approach:</strong> Assume that because someone understands their product, they automatically understand how to conduct valid research.</p><h2>The Spectrum of Research Democratization</h2><p>Not all organizations take the same approach to democratization. 
There&#8217;s a spectrum, ranging from tightly controlled research to fully open, self-directed studies.</p><ol><li><p>No Democratization:</p><ol><li><p>Research is conducted solely by dedicated UX researchers.</p></li><li><p>Insights are centralized but often bottlenecked by research bandwidth.</p></li><li><p>Teams rely entirely on the research team for user insights.</p></li></ol></li><li><p>Partial Democratization:</p><ol><li><p>Non-researchers conduct some research but within a structured framework.</p></li><li><p>Usability tests, surveys, and small-scale studies can be run by trained stakeholders.</p></li><li><p>Researchers maintain oversight and provide guidance.</p></li></ol></li><li><p>Full Democratization:</p><ol><li><p>Research is open to everyone with minimal oversight.</p></li><li><p>Teams run studies independently, without researcher involvement.</p></li><li><p>Insights are often fragmented, with no centralized knowledge base.</p></li><li><p>Without strong governance, this approach usually leads to unreliable data and misalignment across teams.</p></li></ol></li></ol><p>Most organizations benefit from structured, partial democratization. It allows research to scale while maintaining rigor.</p><h1>The Case for Democratizing Research</h1><p>Democratizing research isn&#8217;t just about efficiency, it&#8217;s about survival. The landscape of product development moves at an unforgiving pace. Decisions are being made constantly, whether research has informed them or not. Research teams, however, rarely scale at the same speed as their organizations. While demand for research has skyrocketed, the number of dedicated researchers often remains stagnant.</p><p>This gap has forced research leaders to rethink traditional models. If a research team can&#8217;t directly support every initiative, how can research still be embedded in decision-making across an organization? 
The answer, for many, has been some form of research democratization&#8212;empowering non-researchers with the tools, training, and guardrails to conduct research on their own.</p><p>Done correctly, this creates a system where research isn&#8217;t just something that happens within the confines of a UX research team. It becomes part of how the organization operates. It allows for more user-centered decisions at scale, better alignment across teams, and an overall stronger connection to customer needs.</p><p>But before getting into how to democratize research effectively, it&#8217;s important to understand why this shift is happening and what&#8217;s at stake.</p><h2>The Need for Scale</h2><p>Most research teams face an impossible task: meeting an increasing demand for research with limited resources.</p><ul><li><p>A single researcher might be responsible for supporting five to ten product teams, each with multiple ongoing projects.</p></li><li><p>Companies are making huge strategic bets on product roadmaps, go-to-market strategies, and design decisions, but often without the research capacity to inform them properly.</p></li><li><p>Many research teams spend their time prioritizing projects rather than conducting research, meaning valuable but lower-priority questions go unanswered.</p></li></ul><p>I&#8217;ve seen this firsthand. In one company, I was the only researcher supporting seven product teams. There were simply not enough hours in the day to conduct research on every feature, design iteration, or customer pain point that needed attention. No matter how well we prioritized, important questions were left unanswered.</p><p>At the same time, my colleagues&#8212;product managers, designers, and marketers&#8212;were desperate for insights. 
They wanted to understand users, but without access to a researcher, they were forced to rely on assumptions or whatever anecdotal evidence they could gather on their own.</p><p>This is the core problem democratization attempts to solve. If every research question has to go through an overburdened research team, insights become a bottleneck. But if teams are enabled to conduct certain types of research themselves, more questions get answered, and research scales alongside the company.</p><p>The key is to ensure this doesn&#8217;t lead to a free-for-all where bad research does more harm than good.</p><h2>The Benefit of Increased Empathy</h2><p>One of the most overlooked benefits of democratization is how it transforms the way teams think about users.</p><p>Research isn&#8217;t just about gathering insights; it&#8217;s about changing perspectives. When product managers, designers, or marketers engage directly with users, it fundamentally shifts how they approach their work. They start making decisions based on what they&#8217;ve heard and seen, not just what they assume.</p><ul><li><p>A designer who watches users struggle through an onboarding flow will never unsee those frustrations. Instead of relying on second-hand reports, they will instinctively advocate for a better experience.</p></li><li><p>A product manager who sits in on user interviews stops thinking about features in isolation and starts seeing the bigger picture&#8212;the messy, real-world contexts in which customers actually interact with their product.</p></li><li><p>A marketing team that tests messaging directly with users will refine their approach based on evidence, not just gut feeling.</p></li></ul><p>I once worked with a product manager who, before engaging in research, had a firm belief that customers wanted more customization options. He pushed hard for this, confident that flexibility was the key to retention. 
But after sitting in on just three customer interviews, he completely changed his mind. Customers didn&#8217;t want more customization; they were overwhelmed by the complexity of the product and wanted simpler, more guided experiences.</p><p>That shift in thinking didn&#8217;t come from a research report, it came from direct engagement with users. That&#8217;s the power of democratization.</p><p>When more people in an organization interact with customers, it builds a culture of customer empathy, where decisions are made with a deeper understanding of real user needs.</p><p>However, this benefit only materializes when teams are engaging with research in the right way. Without structure, direct engagement can just as easily reinforce biases rather than challenge them. That&#8217;s why a thoughtful approach to democratization is critical.</p><h2>Faster Decision-Making</h2><p>Speed matters. In fast-moving companies, decisions are made quickly, often without research, simply because waiting weeks for insights isn&#8217;t an option.</p><ul><li><p>Product teams are pressured to ship. They can&#8217;t always afford to wait for a dedicated researcher to become available.</p></li><li><p>Executives expect quick answers. Delays in research can sometimes mean the difference between launching a feature and missing a market opportunity.</p></li><li><p>Customer expectations are evolving constantly. The faster a company can learn, the faster it can adapt.</p></li></ul><p>Democratization, when done well, allows teams to validate assumptions quickly, reducing the risk of making costly missteps. 
For example:</p><ul><li><p>A design team that has been trained to conduct usability testing can validate whether a new checkout flow is intuitive in days, not weeks.</p></li><li><p>A product team with access to survey tools can gather user sentiment data before launching a feature, ensuring they&#8217;re not blindsided by poor reception.</p></li><li><p>A marketing team can test messaging with real users before committing to a campaign, avoiding misalignment with customer expectations.</p></li></ul><p>By giving teams the ability to get user feedback quickly, democratization reduces reliance on guesswork. However, the key here is ensuring that teams know when to move fast and when to slow down. Not all research can or should be done quickly. Usability tests and surveys? These can often be done efficiently. Generative research or behavioral studies? These require more time and expertise.</p><p>Without a clear framework for what research should be democratized and what should remain within a dedicated research team, organizations risk prioritizing speed over accuracy&#8212;which can lead to even bigger problems down the road.</p><h1>The Risks and Challenges of Research Democratization</h1><p>Research democratization, if structured well, can expand an organization&#8217;s ability to incorporate user insights into decision-making. But when done poorly&#8212;or without enough oversight&#8212;it can introduce serious risks that undermine the credibility of research altogether.</p><p>Scaling research across non-researchers means reducing barriers to participation, but without guardrails, it also increases the likelihood of biased findings, ethical missteps, and fragmented efforts that don&#8217;t drive meaningful change.</p><h2>Quality Control Issues</h2><p>One of the most immediate risks of democratization is a decline in research quality. 
When individuals without formal training in research conduct studies, common methodological mistakes can creep in, sometimes with significant consequences for product and business decisions.</p><p>In a study of research democratization practices, 73% of UX researchers reported spending significant time correcting or guiding poorly conducted research by non-researchers. This often happens because non-researchers unknowingly introduce confirmation bias, leading questions, and flawed synthesis. One research leader shared that democratization without guardrails turned research into a &#8216;game of telephone&#8217;&#8212;insights became increasingly distorted as non-researchers misinterpreted findings, leading to misguided product decisions.</p><h3>1. Risks of Biased Research, Leading Questions, and Poor Methodology</h3><p>Non-researchers often approach research with the best of intentions, but without training, they can unintentionally introduce bias at every stage of the process:</p><ul><li><p>Confirmation bias &#8211; Asking questions designed to validate existing assumptions rather than uncovering new insights.</p></li><li><p>Leading questions &#8211; Steering users toward certain responses rather than letting them express their true thoughts.</p></li><li><p>Poor sampling &#8211; Interviewing a narrow or unrepresentative set of users, leading to skewed conclusions.</p></li><li><p>Flawed synthesis &#8211; Cherry-picking insights that align with stakeholder preferences rather than accurately reflecting patterns in the data.</p></li></ul><p>At one company, a product team wanted to &#8220;validate&#8221; a new feature idea. Since the research team was at capacity, a product manager ran a quick round of interviews. 
But instead of an open-ended discovery study, they asked leading questions like, &#8220;Wouldn&#8217;t you find this feature helpful?&#8221; Predictably, most users responded positively.</p><p>The team took this as a green light to move forward, investing months of development resources. After launch, usage was almost nonexistent&#8212;because while users agreed in interviews, their actual behavior told a different story. The issue wasn&#8217;t the idea itself, but the flawed research approach that had given them false confidence.</p><h3>2. The Danger of Superficial or Cherry-Picked Insights</h3><p>Without proper synthesis, research findings can become over-simplified, misinterpreted, or cherry-picked to support pre-existing ideas. Teams conducting their own research may:</p><ul><li><p>Over-rely on a few strong opinions, mistaking them for broad trends.</p></li><li><p>Ignore conflicting feedback that doesn&#8217;t align with their preferred narrative.</p></li><li><p>Mistake usability issues for lack of user interest, discarding features too quickly.</p></li></ul><p>A marketing team wanted to refine its messaging and conducted a quick user survey. Most responses were positive, leading them to assume their messaging was strong. But when a researcher later reviewed the data, they found that negative feedback had been dismissed as &#8220;outliers.&#8221;</p><p>In reality, those &#8220;outliers&#8221; represented a critical segment of potential customers who found the messaging unclear. Because this nuance had been ignored, the company missed an opportunity to improve conversion rates.</p><h2>Ethical Concerns</h2><p>User research involves handling people&#8217;s personal data, stories, and behaviors. When research is democratized, it&#8217;s critical to ensure that ethical best practices aren&#8217;t compromised.</p><h3>1. 
Consent, Privacy, and Proper Handling of Sensitive Data</h3><p>When non-researchers conduct studies, they may not fully understand data protection laws or consent protocols. Common mistakes include:</p><ul><li><p>Failing to obtain proper consent before recording or storing user data.</p></li><li><p>Not anonymizing sensitive information, increasing the risk of privacy violations.</p></li><li><p>Misusing user data beyond the scope of consent, which can lead to legal repercussions.</p></li></ul><p>A designer I worked with ran an unmoderated usability test using a third-party tool but forgot to include a consent disclaimer. Participants were unaware their sessions were being recorded, violating privacy policies. This led to a compliance issue that required removing all collected data, wasting weeks of work.</p><h3>2. The Risk of Manipulating Research to Confirm Pre-Existing Biases</h3><p>Research can be weaponized. When stakeholders conduct their own studies, there&#8217;s a risk that they will design the research to confirm what they already want to believe. This can lead to:</p><ul><li><p>Over-reliance on supportive data while ignoring conflicting insights.</p></li><li><p>Framing research questions in ways that guarantee a preferred outcome.</p></li><li><p>Misrepresenting insights to push a particular agenda.</p></li></ul><p>An executive wanted to push a new subscription model and asked for research to support the decision. Instead of conducting an unbiased study, they only surveyed customers who had previously expressed interest in subscriptions. The results? 
A misleadingly high approval rate that didn&#8217;t reflect the broader customer base.</p><p>When the new model launched, churn increased dramatically because most customers had never been represented in the research.</p><h2>Undermining the Value of Professional Research</h2><p>One of the most contentious risks of research democratization is the fear that it diminishes the role and expertise of trained researchers. As more non-researchers take on research tasks, there&#8217;s a real concern that leadership will begin to deprioritize the need for dedicated research professionals altogether.</p><p>When organizations assume that &#8220;anyone can do research,&#8221; they often fail to recognize the depth of expertise required to conduct meaningful, unbiased, and methodologically sound studies. This can lead to fewer dedicated research hires, underfunded research teams, and a loss of credibility for research as a discipline.</p><p>But this problem doesn&#8217;t emerge overnight. It often starts subtly, with responsibilities shifting away from researchers until research becomes a distributed, secondary task rather than a core business function. If this shift goes unchallenged, researchers can quickly find themselves fighting for relevance rather than driving strategic impact.</p><h3>How Democratization Can Devalue Research Roles</h3><p>Democratization, when unchecked, can lead to a misunderstanding of research as a profession. Instead of being seen as a specialized discipline requiring training, rigor, and experience, research is sometimes reduced to a simple task that anyone with access to a survey tool or a scheduling link can handle. There are a few ways this devaluation takes shape:</p><h3>1. The Erosion of Research Credibility</h3><p>When non-researchers conduct studies without proper training, they often produce flawed, biased, or misleading insights. 
These insights, if used to inform decisions, can lead to failed product launches, wasted development resources, or misaligned marketing strategies.</p><p>However, when these failures happen, the blame doesn&#8217;t always fall where it should. Instead of acknowledging that the methodology was flawed, teams may conclude that research itself isn&#8217;t valuable or that it doesn&#8217;t lead to actionable insights.</p><p>Over time, this weakens the perception of research within an organization. Instead of being seen as a critical function, research becomes an optional, nice-to-have activity that doesn&#8217;t always justify investment.</p><p>At one company, product managers were given the freedom to conduct their own research. Over time, they ran dozens of studies, but because they lacked training, their insights were inconsistent, biased, and often contradicted each other.</p><p>Executives began questioning the value of research altogether. &#8220;Why are we spending time on this if every study seems to say something different?&#8221; they asked. Instead of realizing that the issue was the lack of research rigor, they assumed that research itself wasn&#8217;t producing useful outcomes.</p><h3>2. The Shift from Research as a Discipline to Research as an Admin Task</h3><p>When democratization isn&#8217;t structured properly, research risks being reduced to a tactical, administrative function rather than a strategic discipline.</p><p>Instead of being valued for their critical thinking, synthesis, and ability to uncover deep insights, researchers may find themselves relegated to checking survey drafts, reviewing discussion guides, or approving stakeholder-run studies. 
This shift has serious long-term consequences:</p><ul><li><p>Researchers lose their influence in shaping business and product strategy.</p></li><li><p>The organization stops seeing research as a driver of innovation and only values it for usability testing and validation.</p></li><li><p>Research teams become service providers rather than thought leaders.</p></li></ul><p>A UX research team at a large company started a democratization initiative that allowed product managers and designers to run usability tests. Over time, stakeholders became accustomed to doing their own research and started relying less on the research team.</p><p>Eventually, leadership began questioning the need for a dedicated research function at all. &#8220;If product teams can do their own research, why do we need a full research team?&#8221;</p><p>Instead of scaling research, democratization led to the gradual defunding of the research department, reducing it to a small oversight function rather than a core driver of decision-making.</p><h3>3. The Budget and Hiring Freeze Effect</h3><p>One of the most polarizing debates in research democratization is whether it threatens the job security of UX researchers. <a href="https://www.userinterviews.com/state-of-user-research-2023-report">In a 2023 survey</a>, 7% of researchers explicitly linked democratization to layoffs or role reductions. Some companies, after implementing democratization, froze research hiring or shifted research into hybrid roles rather than dedicated teams. However, research leaders argue that structured democratization should enhance, not replace, UX researchers. The key distinction is ensuring researchers own complex studies while enabling non-researchers to contribute within predefined boundaries.</p><p>When leadership perceives that research is happening without dedicated researchers, they may start questioning the need to invest in research at all. 
This can result in:</p><ul><li><p>Reduced budgets for research tools, participant recruitment, and training.</p></li><li><p>Hiring freezes for research roles, even when the demand for insights remains high.</p></li><li><p>Reallocation of research responsibilities to non-researchers, leading to burnout and ineffective studies.</p></li></ul><p>This often happens gradually. At first, democratization is seen as a way to scale research&#8212;but without careful structuring, it can quickly lead to justification for cost-cutting.</p><p>At one startup, researchers trained designers to conduct usability tests. Initially, this helped the research team focus on generative studies. However, when budget season rolled around, leadership pointed to the success of democratization as a reason not to hire additional researchers.</p><p>Within a year, the research team was stretched even thinner, and designers&#8212;who were supposed to be running only tactical usability tests&#8212;were now expected to handle all product research. The result? A research culture built on speed, not depth, with major gaps in insight quality.</p><h2>How to Protect the Value of Research While Scaling Access</h2><p>If democratization is necessary, researchers must take an active role in shaping its implementation rather than passively accepting it. 
Here&#8217;s how to do that:</p><ol><li><p>Define the boundaries of democratized research</p><ol><li><p>Be clear about what types of research can and cannot be democratized.</p></li><li><p>Ensure that high-risk, high-impact research remains with trained researchers.</p></li></ol></li><li><p>Establish research standards and oversight</p><ol><li><p>Create a research framework with clear guidelines for methodology, synthesis, and reporting.</p></li><li><p>Require peer reviews and quality checks before insights are shared.</p></li></ol></li><li><p>Position research as a strategic partner, not just a service</p><ol><li><p>Proactively contribute to decision-making conversations, not just research execution.</p></li><li><p>Show how research can drive innovation, reduce business risk, and uncover opportunities that teams hadn&#8217;t considered.</p></li></ol></li><li><p>Continuously advocate for research expertise</p><ol><li><p>Educate leadership on the depth and complexity of research.</p></li><li><p>Track and report the impact of research on business outcomes, so it&#8217;s clear why dedicated researchers are still essential.</p></li></ol></li></ol><h1>When and How to Democratize Research Responsibly</h1><p>Democratizing research isn&#8217;t an all-or-nothing decision. Done well, it can scale research efforts, integrate user insights across an organization, and build a stronger culture of customer empathy. Done poorly, it can introduce bias, lead to poor decision-making, and undermine the credibility of research as a function.</p><p>The key to responsible democratization isn&#8217;t just who conducts research, but how, when, and under what conditions. 
This section breaks down the circumstances in which democratization makes sense, when it doesn&#8217;t, and how to ensure that research remains rigorous even as it becomes more widely distributed.</p><h2>When Should Research Be Democratized?</h2><p>Democratization works best when it&#8217;s filling a gap, not replacing expertise. In the right contexts, it can help teams make better, faster, and more user-centered decisions while allowing researchers to focus on high-value work. Here are the conditions that make democratization beneficial:</p><h3>1. When Research Demand Exceeds Researcher Capacity</h3><p>Research teams&#8212;especially in growing organizations&#8212;are often stretched thin. The number of product teams, marketing initiatives, and business strategies that could benefit from research far outweighs the available researcher capacity. In these situations, democratization allows research to scale beyond the limitations of a small team.</p><p>However, this does not mean handing over all research responsibilities to non-researchers. Instead, it means creating a system where smaller, tactical studies can be owned by trained stakeholders, freeing researchers to focus on more complex, high-impact work. Without this structure, research functions become bottlenecks, delaying projects and forcing teams to make decisions based on assumptions rather than data.</p><p>To manage this well, researchers should define the types of studies stakeholders can conduct and establish guidelines for when their involvement is required. This ensures that research demand is met without compromising quality or overwhelming the research team.</p><h3>2. When Teams Need Quick, Low-Risk Insights</h3><p>There are times when teams need immediate feedback on relatively small decisions&#8212;such as refining copy on a landing page, testing a minor UI change, or gauging user reactions to a new feature layout. 
These types of research questions are not deeply exploratory and do not require advanced methodologies, making them ideal for democratization.</p><p>But even these quick-turnaround studies need guardrails to ensure findings are still meaningful. Without structure, teams may conduct rushed, poorly designed research that introduces more noise than clarity. For democratization to work in these cases, organizations need:</p><ul><li><p>Pre-approved templates and research guides to ensure consistency.</p></li><li><p>Baseline training on research bias and question framing to avoid leading questions or faulty assumptions.</p></li><li><p>Access to a repository of past research so that teams don&#8217;t conduct unnecessary studies when existing data already holds the answer.</p></li></ul><p>By putting these supports in place, teams can run research without reinventing the wheel or making common mistakes that undermine their findings.</p><h3>3. When Non-Researchers Are Trained and Supported</h3><p>One of the biggest mistakes organizations make with democratization is assuming that anyone can do research effectively without training. In reality, even seemingly simple methods&#8212;such as usability testing or surveys&#8212;can introduce bias or misinterpretation when not handled correctly.</p><p>For democratization to succeed, non-researchers must be properly trained in research fundamentals. This doesn&#8217;t mean turning them into full-fledged researchers, but rather ensuring they have enough knowledge to avoid common mistakes and recognize when they need additional support. 
Training should include:</p><ul><li><p>How to ask unbiased questions and avoid leading participants.</p></li><li><p>How to recruit representative samples rather than relying on convenience sampling.</p></li><li><p>How to synthesize findings in a way that reflects patterns rather than isolated opinions.</p></li><li><p>How to understand ethical considerations, such as participant consent and data handling.</p></li></ul><p>Beyond training, ongoing support is necessary. Research should not be a one-time training session that leaves stakeholders on their own. Researchers should act as mentors, reviewing research plans, helping synthesize findings, and ensuring that non-researchers have the support they need to conduct meaningful studies.</p><h3>4. When Research Rigor is Maintained Through Structured Oversight</h3><p>Democratization should never mean unstructured or uncontrolled research. While it allows for more people to participate in research, it should still operate within a defined system that ensures research quality remains high. This requires clear oversight mechanisms, including:</p><ul><li><p>A standardized research review process where trained researchers sign off on study designs before they are executed.</p></li><li><p>A centralized research repository where all findings are logged and cross-referenced to prevent duplication and inconsistencies.</p></li><li><p>Regular research audits to evaluate the quality of democratized studies and refine processes over time.</p></li></ul><p>Without these structures, research can become fragmented, inconsistent, and difficult to trust&#8212;leading to decisions being made based on unreliable data.</p><h2>When Should Research NOT Be Democratized?</h2><p>Just as there are times when democratization is beneficial, there are also clear situations where research should remain exclusively within the domain of trained researchers. These tend to be higher-risk studies where poor execution can have serious consequences.</p><h3>1. 
When the Study Requires Advanced Methodologies</h3><p>Some research methods are simply too complex to be handled by non-researchers. These include:</p><ul><li><p>Generative research that explores unmet needs and uncovers new opportunities.</p></li><li><p>Behavioral research that requires deep observation over time.</p></li><li><p>Mixed-method studies that involve advanced synthesis across qualitative and quantitative data.</p></li></ul><p>These methods require expertise in study design, recruitment, analysis, and synthesis to ensure findings are valid, reliable, and actionable. Handing them over to non-researchers can lead to inaccurate conclusions that derail business strategies.</p><h3>2. When Biases Could Significantly Distort Findings</h3><p>All research contains some level of bias, but certain situations make it especially difficult to remove. If the person conducting the research has a vested interest in the outcome, there is a high risk of unintentional&#8212;or even deliberate&#8212;bias shaping the results.</p><p>For example, if a product manager is testing their own feature, they may subconsciously lead users toward positive feedback or ignore negative comments that challenge their assumptions.</p><p>In cases like these, research should be handled by an independent researcher who can approach the study with neutrality and objectivity.</p><h3>3. When Ethical or Privacy Concerns Exist</h3><p>Studies that involve sensitive topics, vulnerable populations, or legally protected data require a high level of ethical oversight. 
If non-researchers are not trained in research ethics, they may unknowingly:</p><ul><li><p>Fail to obtain proper consent before recording user data.</p></li><li><p>Collect and store sensitive data in ways that violate privacy regulations.</p></li><li><p>Ask questions that unintentionally cause harm or distress to participants.</p></li></ul><p>For any study that involves healthcare, finance, children, or legally protected information, research should be conducted only by trained professionals who understand compliance, consent, and ethical risk mitigation.</p><h1><strong>Scaling Research Without Losing Rigor</strong></h1><p>Research democratization is a reality for many organizations, and it&#8217;s clear that it is not inherently good or bad&#8212;it depends entirely on how it is executed. When structured effectively, democratization enables faster, more user-informed decision-making without sacrificing research integrity. However, without proper governance, it risks lowering research quality, fragmenting insights, and reducing the perceived value of dedicated research teams.</p><p>The key takeaway from this first part of the series is this: Democratization is not an all-or-nothing approach. Organizations that find success with it strike a balance between empowerment and oversight&#8212;providing stakeholders with the tools and training they need while ensuring researchers maintain quality control.</p><p>In the next part of this series, we&#8217;ll explore specific frameworks and methodologies that enable responsible democratization, including:</p><ol><li><p>A framework for responsible research democratization</p></li><li><p>Scaling research without losing rigor</p></li><li><p>Responding to democratization issues</p></li></ol><p>Stay tuned, and if you want to ensure you don&#8217;t miss the next part of the series, subscribe for updates!</p><p class="button-wrapper" 
data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><h2><strong>Stop piecing it together. Start leading the work.</strong></h2><p>The Everything UXR Bundle is for researchers who are tired of duct-taping free templates and second-guessing what good looks like.</p><p>You get my complete set of toolkits, templates, and strategy guides. used by teams across Google, Spotify, , to run credible research, influence decisions, and actually grow in your role.</p><p>It&#8217;s built to save you time, raise your game, and make you the person people turn to.</p><p>&#8594; Save 140+ hours a year with ready-to-use templates and frameworks</p><p>&#8594; Boost productivity by 40% with tools that cut admin and sharpen your focus</p><p>&#8594; Increase research adoption by 50% through clearer, faster, more strategic delivery</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://userresearchstrategist.squarespace.com/everything-uxr-bundle&quot;,&quot;text&quot;:&quot;Grab the Everything UXR Bundle&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://userresearchstrategist.squarespace.com/everything-uxr-bundle"><span>Grab the Everything UXR Bundle</span></a></p><div><hr></div><p>Stay curious,</p><p>Nikki</p>]]></content:encoded></item><item><title><![CDATA[Why Strategy and Business Acumen Matter in User Research]]></title><description><![CDATA[And how to leverage it for career growth]]></description><link>https://www.userresearchstrategist.com/p/why-strategy-and-business-acumen</link><guid 
isPermaLink="false">https://www.userresearchstrategist.com/p/why-strategy-and-business-acumen</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 25 Mar 2025 09:01:14 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!kuCS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#128075;&#127995; Hi, this is Nikki with a <strong>free, bonus article</strong> from the User Research Strategist. I share content that helps you move toward a more strategic role as a researcher, measure your ROI, and deliver impactful insights that drive business decisions.</p><p>If you want to see everything I post, subscribe below!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kuCS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kuCS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!kuCS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg 848w, https://substackcdn.com/image/fetch/$s_!kuCS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!kuCS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kuCS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg" width="501" height="501" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:501,&quot;bytes&quot;:235760,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kuCS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!kuCS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg 848w, https://substackcdn.com/image/fetch/$s_!kuCS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!kuCS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f5ed450-cb6e-43bb-b5a6-81d7ac1139c6_1024x1024.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Image via Midjourney</figcaption></figure></div><h2>Hello curious human</h2><p>User research is often viewed as a practice that uncovers insights to enhance user experiences. While this is true, the role of a user researcher goes far beyond usability testing and interview synthesis. Strategic thinking and business acumen are critical to ensuring research has a tangible impact on an organization&#8217;s goals.</p><p>Without aligning research efforts with business objectives, insights risk becoming unused documents that never drive action. To make an impact, user researchers need to bridge the gap between research and business strategy, effectively positioning themselves as catalysts for innovation and growth.</p><p>If you&#8217;ve ever struggled with proving the value of your work or gaining stakeholder buy-in, understanding why and how research supports business growth is the key to unlocking new opportunities.</p><p>Strategic user researchers move beyond traditional roles of collecting insights and instead become active contributors to product development, customer success, and revenue growth.</p><p>This is exactly why I created the <strong><a href="https://www.userresearchstrategist.com/">Impact Membership</a></strong>&#8212;to help user researchers like you align your efforts with business goals and position yourself as a strategic leader. Through the membership, you&#8217;ll gain the skills, mentorship, and tools needed to amplify your impact.</p><p>Let&#8217;s break down why developing a business-first mindset is critical for user researchers.</p><h1>Influencing product and business decisions</h1><p>User research is not just about uncovering insights&#8212;it&#8217;s about driving meaningful change that aligns with business goals and product strategy. 
If you&#8217;re struggling to get your research recognized, the key lies in strategic influence.</p><p>Influencing business and product decisions means being proactive, speaking the language of stakeholders, and positioning research as a critical component of business success. But how do you do that effectively?</p><p>Many researchers feel frustrated when their insights are ignored or when they see a lack of follow-through from stakeholders. The truth is, product teams and executives don&#8217;t dismiss research because they don&#8217;t value it&#8212;they dismiss it when they can&#8217;t see its direct connection to business goals.</p><p>Your research should answer questions like:</p><ul><li><p>How will this impact revenue?</p></li><li><p>Will this improve user retention?</p></li><li><p>Does this support our strategic business objectives?</p></li></ul><p>If your insights don&#8217;t align with these priorities, they risk being perceived as &#8220;nice-to-have&#8221; rather than essential.</p><p>For example:</p><p>A user researcher at an e-commerce company conducted a study on mobile usability. 
Initially, their insights were dismissed as &#8220;too minor.&#8221; However, when they reframed their findings in terms of potential cart abandonment reduction and increased average order value (AOV), leadership took notice, and the proposed improvements were prioritized.</p><p>The way you present your research findings can determine whether they are acted upon or ignored.</p><h2>Key challenges in influencing business and product decisions</h2><p>Before we discuss how to influence decisions, it&#8217;s important to understand the challenges that researchers often face:</p><p><strong>Lack of alignment with business goals:</strong></p><p>Research sometimes focuses too heavily on user experience without considering business objectives like growth, retention, or revenue.</p><p><strong>Communication gaps:</strong></p><p>Researchers often communicate in UX terms (e.g., usability scores, personas) that don&#8217;t resonate with business stakeholders.</p><p><strong>Timing issues:</strong></p><p>Research insights often come too late in the product development cycle, making it difficult to incorporate recommendations.</p><p><strong>Perceived lack of ROI:</strong></p><p>If research isn&#8217;t tied to tangible outcomes, stakeholders may deprioritize it in favor of other initiatives.</p><h2>How to influence product and business decisions effectively</h2><h3>Step 1: Speak the language of business</h3><p>To gain traction with stakeholders, you need to frame your research findings in terms of business impact, revenue, and growth.</p><p>Instead of saying: &#8220;Users struggle to find the checkout button.&#8221;</p><p>Say: &#8220;Improving checkout discoverability could increase conversion rates by 18%, potentially adding $500,000 in monthly revenue.&#8221;</p><p><strong>Steps:</strong></p><ol><li><p>Get familiar with key business metrics such as:</p><ol><li><p>Customer Acquisition Cost (CAC)</p></li><li><p>Customer Lifetime Value (CLV)</p></li><li><p>Churn 
Rate</p></li><li><p>Retention Metrics</p></li></ol></li><li><p>Align your research insights with these metrics whenever possible.</p></li><li><p>Use frameworks like OKRs (Objectives and Key Results) to link research outcomes to strategic goals.</p></li></ol><h3>Step 2: Involve stakeholders early in the research process</h3><p>Stakeholders are more likely to act on research when they feel invested in the process. Bringing them in early fosters buy-in and collaboration.</p><ul><li><p>Schedule kickoff meetings with product managers, marketers, and executives to align on research goals.</p></li><li><p>Ask questions like:</p><ul><li><p>&#8220;What business challenges are we facing right now?&#8221;</p></li><li><p>&#8220;What user behaviors do we need to understand to drive growth?&#8221;</p></li><li><p>&#8220;What success metrics matter most for this initiative?&#8221;</p></li></ul></li></ul><p><strong>Steps:</strong></p><ol><li><p>Create stakeholder &#8220;co-design&#8221; sessions where they help shape research questions.</p></li><li><p>Share research goals early and align them with product roadmaps.</p></li><li><p>Conduct stakeholder interviews to understand their pain points and align insights accordingly.</p></li></ol><h3>Step 3: Tie research to key business decisions</h3><p>Your research should not only support product design but also influence business strategy.</p><ul><li><p>Identify upcoming business decisions and ensure research addresses them directly. 
Anticipate business questions such as:</p><ul><li><p>&#8220;Should we enter a new market?&#8221;</p></li><li><p>&#8220;Which feature should we prioritize?&#8221;</p></li><li><p>&#8220;Why are customers dropping off at a certain stage?&#8221;</p></li></ul></li></ul><p><strong>Steps:</strong></p><ol><li><p>Align your research reports with business objectives.</p></li><li><p>Use storytelling techniques to make findings relatable.</p></li><li><p>Provide executive summaries highlighting business impact.</p></li></ol><p>Example:</p><p>Instead of delivering a lengthy report, distill insights into a one-page dashboard with key findings like:</p><ul><li><p>Current user pain point: 65% of users drop off due to a confusing checkout process.</p></li><li><p>Business impact: Reducing drop-offs by 20% could increase revenue by $2 million annually.</p></li><li><p>Proposed action: Simplify checkout steps and add trust signals.</p></li></ul><h3>Step 4: Demonstrate the ROI of research</h3><p>Leadership cares about numbers. When you can prove the financial or operational impact of your research, your influence grows.</p><ul><li><p>Track pre- and post-research impact using A/B tests, analytics, and customer satisfaction scores. 
Quantify improvements in terms of:</p><ul><li><p>Increased conversion rates</p></li><li><p>Reduction in support tickets</p></li><li><p>Faster time-to-market</p></li></ul></li></ul><p><strong>Steps:</strong></p><ol><li><p>Work with data analysts to measure the impact of your recommendations.</p></li><li><p>Develop case studies showcasing how past research influenced business success.</p></li><li><p>Present findings through quarterly impact reports.</p></li></ol><h3>Step 5: Become a strategic partner, not just a researcher</h3><p>Product teams will start viewing you as a strategic partner when you actively contribute beyond research&#8212;helping prioritize initiatives and offering business-aligned recommendations.</p><ul><li><p>Regularly attend product strategy meetings.</p></li><li><p>Offer insights proactively, even outside formal research projects.</p></li><li><p>Educate cross-functional teams on the value of research.</p></li></ul><p><strong>Steps:</strong></p><ol><li><p>Stay informed on industry trends and market shifts.</p></li><li><p>Position yourself as an internal consultant, not just a service provider.</p></li><li><p>Work with leadership to build a long-term research roadmap aligned with business strategy.</p></li></ol><p>Want to overcome these challenges? The <strong><a href="https://www.userresearchstrategist.com/">Impact Membership</a></strong> offers expert-led workshops on how to align research with strategic business objectives, making your research invaluable to leadership.</p><h1>Prioritization of research efforts for maximum impact</h1><p>There&#8217;s never a shortage of questions to answer, stakeholders to satisfy, and initiatives to support. It&#8217;s easy to feel overwhelmed by competing demands and endless requests for insights.</p><p>The key to success lies in prioritization, focusing on the research efforts that will have the highest impact on business goals and user needs. 
Without it, research can become scattered, reactive, and ultimately, undervalued.</p><p>Effective prioritization allows you to:</p><p><strong>Maximize your influence:</strong></p><p>Working on the right projects ensures your research contributes to critical business decisions and product development milestones.</p><p><strong>Ensure efficient use of resources:</strong></p><p>Research teams often have limited bandwidth, and focusing on high-impact initiatives allows you to deliver more value with fewer resources.</p><p><strong>Strengthen your position as a strategic partner:</strong></p><p>When your work is aligned with business priorities, stakeholders see you as an integral part of the decision-making process rather than a reactive service provider.</p><p><strong>Avoid wasted effort:</strong></p><p>Prioritizing correctly helps you avoid spending time on low-impact research that may not drive action or align with company goals.</p><h2>Challenges in research prioritization</h2><p>Before diving into how to prioritize effectively, it&#8217;s important to recognize the most common challenges user researchers face:</p><p><strong>Saying &#8220;Yes&#8221; to everything:</strong></p><p>Stakeholders often bombard researchers with urgent requests, and without clear prioritization, it&#8217;s easy to fall into a reactive mode that drains resources and focus.</p><p><strong>Lack of strategic alignment:</strong></p><p>Conducting research without understanding broader business objectives can lead to misaligned efforts that don&#8217;t contribute to product or revenue goals.</p><p><strong>Trying to do too much at once:</strong></p><p>Attempting to juggle multiple research projects simultaneously can dilute the quality and depth of insights.</p><p><strong>Focusing on short-term fixes:</strong></p><p>Addressing only immediate usability concerns without considering strategic opportunities to shape the product&#8217;s future can limit your impact.</p><h2>How to prioritize research 
efforts effectively</h2><h3>1. Align research with business and product goals</h3><h4>Step 1: Understand the company&#8217;s objectives</h4><p>Start by gaining clarity on your organization&#8217;s strategic goals. Ask yourself:</p><ul><li><p>What are the company&#8217;s top priorities for this quarter/year?</p></li><li><p>Are we focusing on growth, retention, cost reduction, or product expansion?</p></li><li><p>What key metrics (KPIs) are driving business success?</p></li></ul><h4>Step 2: Partner with stakeholders</h4><p>Meet regularly with product managers, executives, and other stakeholders to align research initiatives with business priorities.</p><h4>Step 3: Map research needs to business impact</h4><p>Consider how each potential research project contributes to the following areas:</p><ul><li><p>Revenue growth: Does this research have the potential to improve conversion rates, upsell opportunities, or retention?</p></li><li><p>Operational efficiency: Will this research reduce support tickets, decrease churn, or optimize workflows?</p></li><li><p>Customer satisfaction: Does this research align with improving CSAT metrics?</p></li></ul><p>Create a simple research roadmap that visually maps upcoming research projects against business goals. This ensures transparency and alignment across teams.</p><h3>2. Use a research prioritization framework</h3><p>A structured prioritization framework helps in objectively evaluating research opportunities. Some of the most effective frameworks include:</p><p><strong>The RICE scoring model</strong></p><p>RICE stands for Reach, Impact, Confidence, and Effort. 
Assigning numerical values to each factor helps determine which research efforts should take precedence.</p><ul><li><p>Reach: How many users/customers will be affected by the research findings?</p></li><li><p>Impact: What is the potential business impact? Score it on a simple scale, e.g., 3 = massive, 2 = high, 1 = medium, 0.5 = low.</p></li><li><p>Confidence: How confident are you that the research will produce actionable insights (expressed as a percentage)?</p></li><li><p>Effort: How much time and resources will it take to complete the research (e.g., in person-days)?</p></li></ul><p>The RICE formula is:</p><p>(Reach x Impact x Confidence) / Effort = Priority Score</p><p>For example:</p><ul><li><p>Research Study A: (500 users x 2 (high impact) x 80% confidence) / 20 person-days = Priority Score: 40</p></li><li><p>Research Study B: (100 users x 1 (medium impact) x 90% confidence) / 30 person-days = Priority Score: 3</p></li></ul><p>The project with the higher priority score should take precedence.</p><p><strong>The MoSCoW method</strong></p><p>This method categorizes research efforts into four distinct categories:</p><ul><li><p>Must-Have: Critical to the success of the business or product.</p></li><li><p>Should-Have: Important but not essential for immediate impact.</p></li><li><p>Could-Have: Nice to have but non-essential.</p></li><li><p>Won&#8217;t-Have (for now): Can be deprioritized for future consideration.</p></li></ul><p>For example:</p><p>If a company is planning an international expansion, research related to market feasibility would fall under &#8220;Must-Have,&#8221; while a UI tweak might be classified as &#8220;Could-Have.&#8221;</p><p><strong>Impact vs. 
Effort Matrix</strong></p><p>This simple yet effective tool helps categorize research tasks based on their potential impact and the effort required.</p><ul><li><p>High Impact, Low Effort: Prioritize immediately.</p></li><li><p>High Impact, High Effort: Plan for strategic investment.</p></li><li><p>Low Impact, Low Effort: Consider if resources allow.</p></li><li><p>Low Impact, High Effort: Deprioritize.</p></li></ul><p>For example:</p><p>Researching onboarding improvements (High Impact, Low Effort) should be prioritized over an exploratory study with unclear business value.</p><h3>3. Prioritize based on risk mitigation and uncertainty</h3><p>Prioritize research efforts that help reduce uncertainty in critical business decisions. Consider the level of risk associated with moving forward without insights.</p><p>Questions to ask:</p><ul><li><p>What decisions are currently being made without enough data?</p></li><li><p>Which research gaps could lead to costly mistakes if left unaddressed?</p></li><li><p>What are the riskiest assumptions within the current product strategy?</p></li></ul><p>For example:</p><p>If a company is about to launch a new feature, conducting usability testing to uncover major usability blockers should take precedence over general exploratory research.</p><h3>4. Consider the research&#8217;s longevity and reusability</h3><p>Prioritize research that can serve multiple purposes across teams and functions. Look for research opportunities that can generate insights applicable to:</p><ul><li><p>Future feature development</p></li><li><p>Marketing messaging</p></li><li><p>Customer support improvements</p></li><li><p>Product roadmap planning</p></li></ul><p>A well-conducted persona study can inform product design, marketing campaigns, and sales strategies, making it a valuable long-term investment.</p><h3>5. 
Be transparent and communicate prioritization decisions</h3><p>Once priorities are established, ensure they are clearly communicated across the organization. Transparency helps set expectations and gain stakeholder support.</p><p><strong>Steps:</strong></p><ol><li><p>Host a quarterly research prioritization meeting with stakeholders.</p></li><li><p>Share a research backlog with visibility into priorities and timelines.</p></li><li><p>Regularly update stakeholders on progress and any changes.</p></li></ol><p>Struggling to prioritize? Join the <strong><a href="https://www.userresearchstrategist.com/">Impact Membership</a></strong> to start prioritizing with confidence and clarity.</p><h1>Bridging the gap between UX and business goals</h1><p>User research and business objectives often seem like separate worlds, one focused on usability, empathy, and human-centered design, and the other on revenue, market share, and profitability. However, the most successful organizations understand that great user experiences drive business success, and it&#8217;s the responsibility of user researchers to help bridge the gap between the two.</p><p>When research insights are directly tied to business metrics, the value of research becomes undeniable. 
This alignment ensures that both user needs and business goals are met, creating a win-win scenario for the company and its customers.</p><p>When user experience efforts aren&#8217;t aligned with business goals, organizations face several challenges:</p><ul><li><p>Research that doesn&#8217;t tie into strategic business initiatives may go unused, wasting valuable resources.</p></li><li><p>If UX research doesn&#8217;t demonstrate business impact, decision-makers may deprioritize it.</p></li><li><p>Teams may focus on usability improvements with minimal impact instead of addressing core business drivers like retention and growth.</p></li><li><p>If research isn&#8217;t aligned with business strategy, products might solve user needs but fail to address market realities or revenue goals.</p></li></ul><p>For example:</p><p>A SaaS company wanted to improve their onboarding experience. While their UX team proposed usability improvements, leadership was focused on reducing churn. By reframing research findings in terms of churn reduction, the UX team gained buy-in and saw their recommendations implemented.</p><p>Bridging UXR and business goals isn&#8217;t about sacrificing user needs, but about aligning research efforts with strategic objectives to ensure actionable impact.</p><h2>Key challenges in bridging UXR and business goals</h2><p>Before we explore solutions, it&#8217;s important to understand the common roadblocks that prevent user research from influencing business decisions:</p><ol><li><p>Different priorities between UXR and business teams</p><ol><li><p>UX teams prioritize usability, accessibility, and engagement.</p></li><li><p>Business teams prioritize revenue, growth, and scalability.</p></li><li><p>The challenge is finding common ground where both perspectives align.</p></li></ol></li><li><p>Lack of a shared language</p><ol><li><p>UX professionals often communicate findings using design terminology, while business leaders think in terms of KPIs, revenue, and 
growth metrics.</p></li></ol></li><li><p>Measuring UX impact is difficult</p><ol><li><p>Unlike sales or marketing, UX research outcomes aren&#8217;t always directly quantifiable, making it harder to tie efforts to ROI.</p></li></ol></li><li><p>Siloed organizational structures</p><ol><li><p>UX, product, and business teams often work in silos, leading to misaligned goals and miscommunication.</p></li></ol></li></ol><h2>How to effectively bridge the gap</h2><h3>1. Align UX goals with business objectives early on</h3><p>The earlier research efforts align with business goals, the more effective they will be. UX researchers should proactively collaborate with stakeholders at the planning stage to understand business objectives and identify ways UX can support them.</p><p><strong>Steps:</strong></p><ul><li><p>Attend business planning meetings to gain insights into company goals and priorities. Ask strategic questions, such as:</p><ul><li><p>What are our revenue goals this quarter?</p></li><li><p>What challenges are preventing growth?</p></li><li><p>How can UX improvements help achieve business objectives?</p></li></ul></li><li><p>Develop UXR objectives that map directly to business KPIs, such as:</p><ul><li><p>Reducing churn rates</p></li><li><p>Increasing conversion rates</p></li><li><p>Lowering customer acquisition costs (CAC)</p></li></ul></li></ul><p>For example:</p><p>If a company&#8217;s goal is to increase annual recurring revenue (ARR), UX research should focus on identifying friction points in the user journey that prevent renewals or upsells.</p><h3>2. Speak the language of business</h3><p>Stakeholders respond better to data that reflects their priorities. 
To influence business decisions, UX researchers must translate findings into metrics that executives care about, such as:</p><ul><li><p>Customer Lifetime Value (CLV): How UX improvements impact long-term profitability.</p></li><li><p>Retention Rates: Demonstrating how better UX leads to increased customer loyalty.</p></li><li><p>Cost Savings: Showing how improved UX reduces support costs and churn.</p></li><li><p>Convert UX metrics into business metrics.</p><ul><li><p>Instead of saying &#8220;Users struggle with navigation,&#8221; say &#8220;Reducing navigation confusion can increase conversions by 20%.&#8221;</p></li></ul></li><li><p>Use business-oriented storytelling, focusing on the financial impact and risk mitigation.</p></li></ul><p>For example:</p><p>If research reveals that users struggle to find key features, instead of reporting it as a usability issue, position it as an opportunity to improve feature adoption and increase upsell potential.</p><h3>3. Quantify the impact of UX improvements</h3><p>Executives want numbers. One of the best ways to bridge the gap is to show tangible, quantifiable results from UX improvements.</p><p><strong>Steps:</strong></p><ol><li><p>Leverage A/B testing and analytics to measure the business impact of UX changes.</p></li><li><p>Track metrics before and after improvements, such as:</p><ol><li><p>Task completion rates vs. conversion rates</p></li><li><p>Reduced support inquiries post-design changes</p></li><li><p>Impact on Customer Satisfaction (CSAT)</p></li></ol></li><li><p>Use qualitative and quantitative data to create compelling reports.</p></li></ol><p>For example:</p><p>If optimizing an onboarding flow leads to a 25% increase in activation rates, presenting this result in revenue terms makes the impact clearer to executives.</p><h3>4. 
Establish cross-functional collaboration</h3><p>Breaking down silos between UX, product, marketing, and business teams ensures research findings are considered in broader strategic conversations.</p><p><strong>Steps:</strong></p><ol><li><p>Create cross-functional task forces that include UX, product managers, marketers, and executives.</p></li><li><p>Conduct collaborative research planning sessions to align on priorities and share insights.</p></li><li><p>Offer UX training sessions for business teams to help them understand the value of research.</p></li></ol><h3>5. Prioritize research that drives strategic decisions</h3><p>Not all research projects hold the same value. Prioritizing research that aligns with high-impact business goals helps secure stakeholder buy-in.</p><p><strong>Steps:</strong></p><ol><li><p>Use prioritization frameworks like the RICE (Reach, Impact, Confidence, Effort) model to select research efforts with the highest business impact.</p></li><li><p>Focus on projects that answer critical business questions, such as market expansion, customer retention, or pricing strategy.</p></li><li><p>Develop a long-term research roadmap that aligns with business growth strategies.</p></li></ol><p>For example:</p><p>Instead of focusing on minor UI changes, prioritize research that helps understand why customers churn after onboarding and what changes can improve long-term engagement.</p><h3>6. 
Communicate UX impact regularly</h3><p>Regular communication of research insights ensures they remain top-of-mind for business stakeholders and demonstrates ongoing value.</p><p><strong>Steps:</strong></p><ol><li><p>Provide executive-friendly reports with key findings, potential business impact, and actionable recommendations.</p></li><li><p>Use storytelling techniques to illustrate how UX improvements have driven positive business outcomes.</p></li><li><p>Create UX dashboards that show real-time impact on key business metrics.</p></li></ol><p>For example:</p><p>A monthly UX impact report could showcase how reducing friction in checkout flows resulted in an X% increase in completed purchases.</p><p>Inside the <a href="https://www.userresearchstrategist.com/">Impact Membership</a>, you&#8217;ll learn how to effectively connect UX insights with business goals, making research a strategic asset that drives long-term value.</p><h1>Securing stakeholder buy-in and advocacy</h1><p>As a user researcher, one of the most crucial aspects of your role isn&#8217;t just conducting research, but ensuring that your insights influence decisions and drive action. 
However, many researchers struggle with getting stakeholders to truly engage with research findings, resulting in insights being ignored or deprioritized.</p><p>Without stakeholder buy-in and advocacy, even the most valuable research can end up collecting dust in a slide deck rather than informing strategic product and business decisions.</p><p>Securing stakeholder buy-in means:</p><ul><li><p>Your research gets implemented, not overlooked.</p></li><li><p>Research becomes a strategic asset, not an afterthought.</p></li><li><p>You build long-term advocacy and trust with leadership.</p></li><li><p>You gain access to greater resources and influence within the organization.</p></li></ul><p>Getting stakeholders invested in your work is a game-changer for both the success of your research and your career as a researcher.</p><h2>How to secure stakeholder buy-in</h2><h3>1. Understand stakeholder motivations</h3><p>To gain buy-in, you must first understand what drives your stakeholders. Different teams have different priorities, and your ability to align research with their goals will determine your success.</p><ul><li><p>Identify key stakeholders across departments:</p><ul><li><p>Product managers care about feature success and roadmap alignment.</p></li><li><p>Executives focus on business growth, profitability, and risk reduction.</p></li><li><p>Marketing teams look for insights to drive customer engagement and acquisition.</p></li><li><p>Customer support wants solutions to reduce complaints and friction points.</p></li></ul></li></ul><p>Example approach:</p><p>When presenting to executives, focus on how research can drive revenue growth or reduce customer churn, rather than usability improvements alone.</p><p>Questions to ask stakeholders:</p><ol><li><p>What are your current priorities and pain points?</p></li><li><p>What success metrics are most important to you?</p></li><li><p>How do you currently make product and business decisions?</p></li></ol><p>By understanding what 
stakeholders value, you can position your research as a solution to their most pressing needs.</p><h3>2. Involve stakeholders early and often</h3><p>Stakeholders are more likely to support research when they feel like they are part of the process rather than being handed insights they weren&#8217;t involved in.</p><ol><li><p>Hold kickoff meetings with stakeholders to discuss research objectives and align on goals.</p></li><li><p>Invite stakeholders to observe research sessions to create emotional connections with user pain points.</p></li><li><p>Establish a continuous feedback loop, ensuring stakeholders feel heard and involved.</p></li><li><p>Send weekly progress updates to keep stakeholders engaged and informed about ongoing research efforts.</p></li></ol><h3>3. Deliver actionable, concise insights</h3><p>Stakeholders are busy, and they don&#8217;t have time to sift through lengthy research reports. To capture their attention, provide clear, concise, and actionable takeaways.</p><ul><li><p>Summarize insights into one-page executive summaries with key findings and recommendations.</p></li><li><p>Use data visualization tools to highlight important trends.</p></li><li><p>Offer a &#8220;next steps&#8221; section to guide immediate actions based on research.</p></li></ul><p>Example report structure:</p><ol><li><p>Key Insight: Users abandon carts due to complicated checkout forms.</p></li><li><p>Business Impact: Potential revenue loss of $500K per month.</p></li><li><p> Recommendation: Simplify form fields and introduce guest checkout.</p></li><li><p>Next Steps: A/B test the new checkout flow within the next sprint.</p></li></ol><h3>4. Build credibility and trust </h3><p>Trust isn&#8217;t built overnight; it&#8217;s cultivated through consistent, valuable contributions. 
The more you demonstrate the reliability and impact of your research, the more stakeholders will advocate for it.</p><ol><li><p>Start with small, high-impact wins to prove the value of research.</p></li><li><p>Regularly showcase how research has driven positive business outcomes.</p></li><li><p>Be transparent about research limitations and collaborate on solutions.</p></li></ol><p>For example:</p><p>In a quarterly review, show how past research led to a successful feature launch that increased engagement by 30%.</p><h3>5. Foster advocacy by educating stakeholders</h3><p>Many stakeholders have limited exposure to user research. Taking the time to educate them on its value can turn them into strong advocates.</p><ul><li><p>Conduct UX workshops and training sessions to demonstrate how research informs better decision-making.</p></li><li><p>Share industry best practices and case studies showing research&#8217;s impact on business success.</p></li><li><p>Encourage stakeholders to champion research by including their perspectives in case studies and reports.</p></li><li><p>Create a &#8220;Research 101&#8221; guide for non-researchers to help them better understand your role and the value you bring.</p></li></ul><h1>Elevate your research impact </h1><p>Bridging the gap between research and business goals, prioritizing the right research efforts, influencing product decisions, and securing stakeholder buy-in aren&#8217;t just skills, they&#8217;re essential for driving meaningful change and proving the value of user research within any organization.</p><p>But mastering these skills doesn&#8217;t happen overnight. It requires strategic guidance, proven frameworks, and a supportive community to help you navigate the complexities of user research with confidence. That&#8217;s where the Impact Membership comes in.</p><p>With the Impact Membership, you&#8217;ll gain access to the tools, insights, and expert mentorship you need to take your research career to the next level. 
Whether you&#8217;re looking to:</p><ul><li><p>Position yourself as a strategic leader by aligning research with business objectives</p></li><li><p>Maximize your influence and ensure your insights drive actionable change</p></li><li><p>Develop business acumen that helps you speak the language of stakeholders</p></li><li><p>Collaborate effectively across teams to prioritize research for maximum impact</p></li><li><p>Secure stakeholder advocacy and establish research as a critical function in your organization</p></li></ul><p>&#8230;the Impact Membership provides everything you need to level up your skills and increase your influence.</p><p>What you get:</p><ul><li><p>Learn from industry leaders on how to align research with business goals and influence high-level decisions</p></li><li><p>Actionable tools that help you prioritize research, communicate insights effectively, and gain stakeholder buy-in</p></li><li><p>Connect with other ambitious researchers, share experiences, and gain support from peers facing similar challenges</p></li><li><p>Personalized guidance to help you navigate the toughest challenges in your research career</p></li><li><p>See how top researchers successfully apply strategic thinking to create business impact</p></li></ul><p>The next level of your career starts here.</p><p>If you&#8217;re ready to go beyond traditional research practices and step into a role where your insights drive real business outcomes, the Impact Membership is for you.</p><p><strong><a href="https://www.userresearchstrategist.com/">Join the Impact Membership today</a></strong> and start transforming the way you approach user research.</p><p>Make your research matter. Make your impact count.</p><div><hr></div><p>If you&#8217;re finding this newsletter valuable, share it with a friend, and consider subscribing if you haven&#8217;t already. 
There are group discounts, gift options, and referral bonuses available.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://userresearchacademy.substack.com/p/playing-games-in-user-research?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&amp;token=eyJ1c2VyX2lkIjoxNTMxNjc5MTcsInBvc3RfaWQiOjE0OTY1Mjg0MSwiaWF0IjoxNzM4MDYxMjE4LCJleHAiOjE3NDA2NTMyMTgsImlzcyI6InB1Yi0xNzQ4MDc2Iiwic3ViIjoicG9zdC1yZWFjdGlvbiJ9.N_eH9WYqU1CZpRP19pgtHVWLYoWnOPPn7NPjqyptKzQ&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:&quot;button-wrapper&quot;}" data-component-name="ButtonCreateButton"><a class="button primary button-wrapper" href="https://userresearchacademy.substack.com/p/playing-games-in-user-research?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&amp;token=eyJ1c2VyX2lkIjoxNTMxNjc5MTcsInBvc3RfaWQiOjE0OTY1Mjg0MSwiaWF0IjoxNzM4MDYxMjE4LCJleHAiOjE3NDA2NTMyMTgsImlzcyI6InB1Yi0xNzQ4MDc2Iiwic3ViIjoicG9zdC1yZWFjdGlvbiJ9.N_eH9WYqU1CZpRP19pgtHVWLYoWnOPPn7NPjqyptKzQ"><span>Share</span></a></p><p>Stay curious,</p><p>Nikki</p>]]></content:encoded></item><item><title><![CDATA[Nailing your UXR OKRs - A framework for maximizing research impact]]></title><description><![CDATA[The ultimate guide to creating meaningful UXR goals that drive real business impact]]></description><link>https://www.userresearchstrategist.com/p/nailing-your-uxr-okrs-a-framework</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/nailing-your-uxr-okrs-a-framework</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 11 Mar 2025 09:01:18 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!uCCH!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Hi, I&#8217;m Nikki. 
I run Drop In Research, where I help teams stop launching &#8220;meh&#8221; and start shipping what customers really need. I write about the conversations that change a roadmap, the questions that shake loose real insight, and the moves that get leadership leaning in. <a href="https://www.dropinresearch.com/">Bring me to your team.</a></em></p><p><em>Paid subscribers get the power tools: the UXR Tools Bundle with a full year of four top platforms free, plus all my Substack content, and a bangin&#8217; Slack community where you can ask questions 24/7. Subscribe if you want your work to create change people can feel.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uCCH!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!uCCH!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png 424w, https://substackcdn.com/image/fetch/$s_!uCCH!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png 848w, 
https://substackcdn.com/image/fetch/$s_!uCCH!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png 1272w, https://substackcdn.com/image/fetch/$s_!uCCH!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uCCH!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png" width="500" height="333.1043956043956" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:970,&quot;width&quot;:1456,&quot;resizeWidth&quot;:500,&quot;bytes&quot;:212705,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!uCCH!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png 424w, https://substackcdn.com/image/fetch/$s_!uCCH!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png 848w, 
https://substackcdn.com/image/fetch/$s_!uCCH!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png 1272w, https://substackcdn.com/image/fetch/$s_!uCCH!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa485252d-8fd3-4c67-ae33-a4412b150f9d_4000x2666.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><a 
href="https://unsplash.com/illustrations/leadership-business-women-think-figure-out-how-to-get-to-the-top-of-the-chart-to-achieve-business-goals-symbol-of-business-aims-mission-opportunity-and-challeng-concept-illustration-vector-eps10-AObyUOA6YKc">Unsplash</a></figcaption></figure></div><p>If you&#8217;re a user researcher trying to create meaningful OKRs (objectives and key results), you&#8217;re not alone if you feel overwhelmed or unsure about where to start. Maybe your company&#8217;s goals are unclear, or you&#8217;re working in a team that doesn&#8217;t directly connect to product development, like marketing or customer support. Or perhaps you&#8217;re focusing on improving internal research processes and you&#8217;re unsure how to measure your impact effectively.</p><p>User research can sometimes feel intangible, and writing OKRs that clearly demonstrate its value can be challenging. But with the right approach, you can create OKRs that are both meaningful and measurable, ensuring your research aligns with broader business objectives, even when those goals aren&#8217;t explicitly defined.</p><p>This guide will walk you through the exact steps to create OKRs that are actionable, tailored for your role as a user researcher, and adaptable to different organizational structures and team functions. Whether you&#8217;re supporting product teams, marketing, customer success, or focusing on improving internal research efficiency, you&#8217;ll find everything you need to write OKRs that demonstrate impact and align with key business goals.</p><h2><strong>What are OKRs and why do they matter?</strong></h2><p>OKRs (objectives and key results) are a goal-setting framework that helps teams set ambitious yet achievable goals and measure progress. They consist of:</p><ul><li><p><strong>Objectives (O):</strong> What you want to achieve. 
These should be clear, ambitious, and inspiring.</p></li><li><p><strong>Key results (KR):</strong> How you will measure progress toward the objective. They should be specific, measurable, and time-bound.</p></li></ul><p>The purpose of OKRs is to help align individual and team efforts with broader organizational goals, ensuring that everyone is working toward a common vision.</p><p>User research often operates in the background, influencing decisions indirectly. OKRs can help by:</p><ul><li><p>Prioritizing what matters most and avoiding spreading efforts too thin</p></li><li><p>Making it easier to communicate the value of research to stakeholders</p></li><li><p>Aligning with and contributing to business goals, even if company goals are vague</p></li><li><p>Providing a clear framework for tracking progress and iterating on research efforts</p></li></ul><div><hr></div><p><strong>This guide walks through a practical way to build OKRs from the ground up, starting with a broad goal you can defend, turning it into a clear objective, then choosing key results that track real progress instead of busywork. 
Inside the full guide, paid subscribers get:</strong></p><ul><li><p><strong>A step-by-step process for turning your research focus into OKRs (broad goal &#8594; objective &#8594; key results &#8594; evaluation)</strong></p></li><li><p><strong>Stakeholder questions to help you find a research goal that fits your org, even when priorities are muddy</strong></p></li><li><p><strong>Four goal-braindumping techniques (pain point mapping, reverse engineering success, the &#8220;so that&#8221; method, stakeholder alignment sessions)</strong></p></li><li><p><strong>Simple formulas for writing objectives and key results, plus examples you can copy</strong></p></li><li><p><strong>A breakdown of UXR-friendly metrics (engagement, performance, quality, operational) and how to avoid &#8220;output-only&#8221; KRs</strong></p></li><li><p><strong>A practical checklist for evaluating and refining OKRs (SMART test, stakeholder review, peer clarity test, tracking plan, confidence scoring)</strong></p></li><li><p><strong>Common OKR scenarios with example OKRs: unclear company goals, B2B vs B2C, and non-product teams like marketing and customer support</strong></p></li></ul><p><strong>If OKRs have been the thing you avoid until the last possible second, this will give you a clean path to follow.</strong></p><p><em><strong>Exclusively for paid subscribers</strong></em></p>
      <p>
          <a href="https://www.userresearchstrategist.com/p/nailing-your-uxr-okrs-a-framework">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[Use Contextual Ecosystem Mapping to Spot Opportunities and Lead]]></title><description><![CDATA[The Secret to Spotting Gaps and Leading Your Market]]></description><link>https://www.userresearchstrategist.com/p/use-contextual-ecosystem-mapping</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/use-contextual-ecosystem-mapping</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 11 Feb 2025 09:31:42 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!NHhH!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F692e8686-1e93-49f1-ad6f-c95d9d9834ad_4000x3200.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#128075;&#127995; <em>Hi, this is Nikki with a subscriber-only article from the User Research Strategist. I share content that helps you move toward a more strategic role as a researcher, measuring your ROI, and delivering impactful insights that move business decisions.</em></p><p><em>If you want to see everything I post, subscribe below!</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p>
      <p>
          <a href="https://www.userresearchstrategist.com/p/use-contextual-ecosystem-mapping">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[The SEA Framework: Breaking down user research impact]]></title><description><![CDATA[Part one of my four part series on defining user research impact]]></description><link>https://www.userresearchstrategist.com/p/the-sea-framework-breaking-down-user</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/the-sea-framework-breaking-down-user</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Tue, 26 Nov 2024 14:22:53 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!7G4D!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>This is a special edition of my newsletter &#8212; the first in a series of posts sharing my new framework for defining user research strategy. Regularly scheduled content will resume after the holidays!</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!7G4D!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!7G4D!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png 424w, https://substackcdn.com/image/fetch/$s_!7G4D!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png 848w, 
https://substackcdn.com/image/fetch/$s_!7G4D!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png 1272w, https://substackcdn.com/image/fetch/$s_!7G4D!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!7G4D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png" width="498" height="498" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1080,&quot;width&quot;:1080,&quot;resizeWidth&quot;:498,&quot;bytes&quot;:83230,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!7G4D!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png 424w, https://substackcdn.com/image/fetch/$s_!7G4D!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png 848w, 
https://substackcdn.com/image/fetch/$s_!7G4D!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png 1272w, https://substackcdn.com/image/fetch/$s_!7G4D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F688f6f67-4422-4102-9b51-c6aa7175cf14_1080x1080.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>You&#8217;ve just presented a beautifully crafted research report to your team. 
The insights are spot-on, the data solid, and you&#8217;re confident that real change is c&#8230;</p>
      <p>
          <a href="https://www.userresearchstrategist.com/p/the-sea-framework-breaking-down-user">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[Work With Me: Building a Research Plan for a Consultancy Client]]></title><description><![CDATA[Watch as I build a research plan in real time]]></description><link>https://www.userresearchstrategist.com/p/work-with-me-building-a-research</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/work-with-me-building-a-research</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Wed, 04 Sep 2024 08:14:46 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/147229741/63a12ec01674d3ec902cf0517dddb979.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p>&#128075;<em> Hey,&nbsp;Nikki&nbsp;here!&nbsp;Welcome to this month&#8217;s&nbsp;</em>&#128274;<em>subscriber-only </em>&#128274;<em> walk-through. In these walk-throughs, I review my previous work and comment on what worked, what didn&#8217;t, and how I would improve it or walk you through work I am currently doing to share my thought process.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p>
      <p>
          <a href="https://www.userresearchstrategist.com/p/work-with-me-building-a-research">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[Write Kick@ss User Research Goals]]></title><description><![CDATA[Learn the Recipe for Goal-Setting Throughout Your Research Process]]></description><link>https://www.userresearchstrategist.com/p/write-kickss-user-research-goals</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/write-kickss-user-research-goals</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Wed, 13 Mar 2024 09:23:09 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!W1Bq!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcce28c8b-42a9-4b75-ad65-f05ffc0df182_500x500.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#128075;<em> Hey,&nbsp;Nikki&nbsp;here!&nbsp;Welcome to this month&#8217;s&nbsp;</em>&#10024;<em>&nbsp;<strong>free article&nbsp;</strong></em>&#10024;<em> of User Research Academy. Three times a month, I share an article with super concrete tips and examples on user research methods, approaches, careers, or situations.</em></p><p><em>If you want to see everything I post, subscribe below!</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p>One of my favorite things to talk about is user research goals because I truly believe they are the foundation of successful user research projects. </p><p>When I first started as a user researcher, I had no idea how important goals would be to my research projects. Over time, I realized and recognized that they are (to me, at least) one of the most important indicators of success in a research project. 
</p><p>However, I also quickly learned that writing effective user research goals can be difficult. It took me a lot of practice to feel comfortable writing research goals in a way that made sense to me, the team, and the overall study. </p><h1>Why are Goals Important?</h1><p>So many times, I have sat down to determine what should be in a research report, survey, dashboard, share out or&#8230;</p><p>And this is also a constant question <strong><a href="https://www.userresearchacademy.com/uxrmembership">in my membership</a></strong>. </p><p>I often find that whenever I am stuck, I haven&#8217;t fully defined the goals of whatever I&#8217;m trying to do. I also see this happen so much with other researchers I have worked with. </p><p>It can be easy to get so deep into a project that it becomes difficult to see the forest for the trees. You get stuck trying to find an answer, and it can be almost impossible to see that you need to take a step back and look at what you are trying to accomplish. One of my members, Harmony, and I talk about this very conundrum in a <strong><a href="https://open.substack.com/pub/userresearchacademy/p/episode-66-membership-spotlight-harmony?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">podcast episode about tooling</a></strong>.</p><p>I faced this very situation when I was trying to find research tools for my organization. I spent hours upon hours Googling different tools, comparing features, and writing up summaries of each platform. At the end of this process, I felt overwhelmed and dissatisfied. I hadn&#8217;t come up with some magic answer through my research and, instead, created more questions for myself.</p><p>In the end, I realized I had dived straight into researching the tools and hadn&#8217;t sat down to really think about the goals of this initiative. What was I trying to accomplish? What was the expected outcome? 
Once I asked those questions, not only was I able to narrow the scope of the initiative, but I focused on what I needed to make more informed decisions.</p><p>Goals give our projects and initiatives forward momentum and help us whenever we feel stuck at a certain point. When I started using a goal-first approach, I recognized a huge shift in my work:</p><ul><li><p>My studies were more aligned with what stakeholders needed and everyone knew what the outcome of the study would be (fewer surprises or disappointments)</p></li><li><p>Stakeholders came to my presentations excited to learn because they knew what to expect</p></li><li><p>My research sessions were significantly more focused because I knew the information I needed to get from participants (while still leaving room for unknown unknowns, of course)</p></li><li><p>My workshops were extremely successful, and colleagues took action after them</p></li><li><p>Colleagues <a href="https://dscout.com/people-nerds/writing-user-insights">used my insights</a> because they were directly related to a decision they were trying to make</p></li><li><p>I no longer had paralyzing doubt when trying to decide on the direction I needed to go on certain projects or deliverables </p></li><li><p>There was no longer this dread I could feel during my research projects when I was unsure if the outcome was what the stakeholders needed</p></li></ul><p>When I truly consider and think about my goals, there is so much less head-banging and spinning my wheels. Goals make everything clearer and give you a solid direction for moving forward.</p><h1>How to Include Research Goals in Your Process</h1><p>I quickly learned (and was thrilled to understand) that defining goals goes beyond just user research studies. You can apply this goal-first mindset to almost every single part of your user research process, and it will help you ensure you are doing or including whatever you need to accomplish in an efficient and effective way. 
</p><p>As I mentioned above, I used this goal-first approach to help me with my user research tool initiative, and it didn&#8217;t stop there. I&#8217;ve become rather obsessed with always starting with the goals of something (even in my business!) and always reminding people that whatever they do, they can focus and get answers by defining goals.</p><p>Since I discovered the power of user research goals, I have used them in many different circumstances to help me feel confident about my decisions. In this article, I&#8217;ll walk you through a few different ways to incorporate goals into your process so you can benefit from this wonderful approach!</p><h2>Research Studies</h2><p>My research studies were the first and most obvious place I started thinking about goals. In the past, I conducted several projects in a row that led to relatively disappointing results &#8212; either the methodology didn&#8217;t get us the information we needed, or the research outcome wasn&#8217;t effective in helping teams make better decisions.</p><p>I thought this was just a painful part of the research process for a while, but something felt off. It didn&#8217;t seem like getting the right information should be a toss-up, and I was tired of disappointing my colleagues. I had done so much work trying to prove the value of research, and I didn&#8217;t want anything to undo that.</p><p>So, I investigated what might be causing such a mismatch. After some whiteboarding and stakeholder conversations, I discovered that my lack of goals (or poorly written ones) hindered our research projects. Because this step in the process wasn&#8217;t well-planned, we struggled to get the exact information we needed from the study.</p><p>As soon as I realized this, I started to think much more deeply about defining research goals in my studies. 
Not only did this lead to better alignment on the project, but I was also able to more confidently choose participants and proper methodologies.</p><p><em>P.S. Want to learn more about writing user research plans? Check out <strong><a href="https://userresearchacademy.substack.com/p/how-to-create-an-impactful-user-research">this super-detailed article</a></strong>.</em></p><h2>Presentations or Reports</h2><p>There were so many times when I asked myself what type of information I should include in a presentation or report; it is also a question I get asked often by my <a href="https://www.userresearchacademy.com/uxrmembership">members</a> and <a href="https://www.userresearchacademy.com/mentorship">mentees</a>.</p><p>Before I started to think about the goals of my reports and presentations, I would spend hours going around in circles, trying to guess which information to include and what to leave out. </p><p>I would get stuck in that cycle, endlessly contemplating what to include. In the end, I guessed, and I often felt nervous walking into those meetings. Did I strike the right balance? Did I include what I needed to? Or did I include too much? Or maybe I missed something? </p><p>And my biggest question: how did I get people to care about the things I was talking about in these presentations?</p><p>Sometimes, I nailed it, but sometimes, my reports fell flat. It was the inconsistency that really frustrated me. So, similar to what I did with my research studies, I tried to figure out how to improve my reports and came to the same conclusion: defining the goals and outcomes. </p><p>Once I started thinking about the goals and outcomes of my reports and presentations, I felt more confident about what I presented and saw the difference in engagement. People actually cared more about what I spoke about, and I had more active discussions that led to action. 
</p><h2>Deliverables</h2><p>There are many deliverables, from <strong><a href="https://userresearchacademy.substack.com/p/building-a-b2c-persona">personas</a></strong> to <strong><a href="https://userresearchacademy.substack.com/p/building-a-b2b-customer-journey-map">journey maps</a></strong> to dashboards to flows. It can become exhausting trying to understand which deliverable to choose and, on top of that, what information to include in said deliverable. </p><p>I used to employ the &#8220;go-with-the-flow&#8221; approach, where I simply picked a deliverable or created the deliverable stakeholders asked for. The problem with this was that there was very little (or no) intention behind the deliverable. That ultimately meant the end result fell flat and was rarely used in people&#8217;s work.</p><p>This happened to me several times before I realized I was facing the same problem as before. I would spend so much time and effort creating personas or journey maps only for colleagues to look at them once or twice and then leave them by the wayside. Something was missing from them, but I had no idea how to get people to use these deliverables.</p><p>Since I couldn&#8217;t avoid creating them forever (I tried, trust me), I had to fix this problem before I continued to lose confidence in my skills. Instead of moving forward with a certain deliverable without thinking about it, I started to define why I was creating it and what I wanted to accomplish with it. Then, I started asking my stakeholders the same thing. </p><p>With this approach, I gained insight into the goals behind the deliverables and the outcomes we all expected from them. 
Armed with this information, I could create deliverables that aligned with stakeholders&#8217; needs that they actually <em>wanted</em> to use.</p><h2>Workshops</h2><p>Workshops are fundamental to <strong><a href="https://userresearchacademy.substack.com/p/activating-your-insights">activating your insights</a></strong> and helping your teams go from the problem to the solution space. I didn&#8217;t think about my workshop goals for a long time. I just ran them because I was <em>supposed</em> <em>to</em> run them.</p><p>But, as you might imagine, this was disappointing for everyone. When people came to my workshops, they weren&#8217;t clear on the goals or the outcomes of the workshop, and my sessions lacked focus. In the end, there wasn&#8217;t a clear resolution. </p><p>People stopped coming to my workshops because they felt unstructured, and I got feedback that they were a waste of time. I was devastated. Workshops are a key soft skill for researchers, and I was terrified this would hold me back from leveling up in my career.</p><p>I went back to the drawing board and took the time to plan <em>why</em> I was running my next workshop. At first, I blanked, unsure why I had ever run workshops &#8212; I wasn&#8217;t used to thinking about this. However, I started to ask myself what I was trying to accomplish, what I wanted participants to get out of it, and what I wished the outcome would be. These helped me define my why and create more effective and impactful workshop sessions.</p><h1>How to Write Research Goals</h1><p>Writing goals takes practice and patience, especially if you haven&#8217;t deeply thought of them in the past. I remember trying to develop goals for different parts of my process, and I felt overwhelmed. I started to question everything, including my skills. Why did I do <em>anything</em>? Why this method? Why that report structure? 
Why, why, why?</p><p>Eventually, I took a deep breath and started asking myself different questions, and on top of that, I asked <em>other</em> people these same questions. With this mindset shift, I could better define goals without completely giving up on my user research career and becoming a bookstore owner (something I <em>still</em> want to do, though!). </p><p>I don&#8217;t want you to feel the same overwhelm (and, quite frankly, despair) I went through, so I will walk you through how I write goals for the different parts of my research process.</p><h2>Avoid the Guessing Game</h2><p>It&#8217;s really funny because, as researchers, we value asking questions so incredibly much. However, I frequently forgot to ask the most important people questions. And those were my stakeholders. I see this happening to other researchers as well. We tend to forget that our <strong><a href="https://userresearchacademy.substack.com/p/treat-stakeholders-like-users">stakeholders are our users as well</a></strong>. </p><p>Instead, we try to guess and read minds, asking <em>ourselves</em> what <em>we believe</em> our stakeholders might want or need. I had to laugh out loud when I realized I was doing this. I&#8217;d spent years telling my stakeholders not to assume or guess what users were thinking only to understand I was doing the same exact thing!</p><p>So, from there on out, I started to include my stakeholders in this process, asking them questions that would help me define and align (what a great rhyming mechanism!). Depending on what I was doing, I tailored my questions for each part of the process. These questions include:</p><p><strong>Information-gathering</strong></p><ul><li><p>What information do you need at the end of the project? Why?</p></li><li><p>What information do you need in the presentation/deliverable/report? Why?</p></li><li><p>What are the top three questions you need to be answered? 
Why are those your top three questions?</p></li><li><p>What is your number one gap in knowledge?</p></li></ul><p><strong>Decision-making</strong></p><ul><li><p>What decisions do you want to be able to make by the end of the project?</p></li><li><p>What decisions do you want to be able to make based on the deliverable?</p></li><li><p>Who will use this presentation/deliverable/report?</p></li></ul><p><strong>Expected outcomes</strong></p><ul><li><p>What is the ideal outcome of this project?</p></li><li><p>What is the ideal outcome of this presentation/deliverable/report?</p></li><li><p>What is the number one thing you will do with the information?</p></li></ul><p><strong>Tracking success</strong></p><ul><li><p>What is your definition of success for this project?</p></li><li><p>How does this project relate to any organizational/business goals?</p></li><li><p>What metrics are you using to define the success of this project?</p></li><li><p>How would you describe the success of this deliverable/report/presentation?</p></li></ul><p>After asking these questions, I then ask them to fill out this mad-lib exercise:</p><p><em>I need [information] to understand [proposed research goal] to make [decision] that will impact [team/organizational goal]. Then, by the end of the study/workshop, I need [ideal outcome].</em> </p><p>By gathering all this information up-front, you are ready to create goals that get everyone what they need and give clear focus to your study.</p><h2>For Research Studies</h2><p>When it comes to writing goals for research studies, it is especially important to include your stakeholders whenever possible by asking them the above questions. 
Including them ensures you get the outcome they need and also helps you immensely with choosing the correct participants and methodology.</p><p>For research studies, I break up the goal-writing process into two parts:</p><ol><li><p>A research statement</p></li><li><p>Research goals</p></li></ol><p>A <strong>research statement</strong> is what you are trying to learn about users at a high level. Here is a model you can use:</p><p><em>We want to better understand how users [think about/make decisions on/interact with] [subject of research/ product] to [create/improve] [product/website/app/service].</em></p><p>This statement gives one or two sentences that describe what the overarching project is about. It is important to solidify this statement because it helps you to create your research goals.</p><p><strong>Research goals</strong> directly relate to your research statement because they are the more in-depth areas you want to explore in your research statement that will help you answer what you are trying to learn. Your research goals should address what you want to learn and how you will study the research statement.</p><p>These goals are the things you want to be able to gather information about by the end of the study. They aren&#8217;t posed as questions, but you want to be able to &#8220;answer&#8221; them in the sense of getting enough data to feel comfortable making decisions. 
Below are some models you can use for creating research goals.</p><p>Common generative research goals:</p><ul><li><p>Discover people&#8217;s current processes/decision-making about [research subject], and how they feel about the overall experience.</p></li><li><p>Learn about people&#8217;s current pain points, frustrations, and barriers with [current process/current tools] and how they would improve them.</p></li><li><p>Understand what [research subject] means to people (how they define it) and why it is important to them.</p></li></ul><p>Common evaluative research goals:</p><ul><li><p>Evaluate how people are using a [product/website/app/service].</p></li><li><p>Evaluate how people are currently interacting with a [product/website/app/service].</p></li><li><p>Uncover the current tools people are using to [achieve goal], and their experience with those tools. Uncover how they would improve those tools.</p></li></ul><p>These definitely aren&#8217;t all the goals you could have, but they can give you a structure and a jumping-off point for writing your research goals. 
If you&#8217;re having a hard time creating research study goals, you can ask yourself and your stakeholders these questions:</p><ul><li><p>What do we want to learn about [research topic]?</p></li><li><p>What type of experiences do we want to learn about?</p></li><li><p>What information do we want at the end of the study?</p></li><li><p>What decisions are we trying to make by the end of the study, and what can help us make those decisions more confidently?</p></li></ul><h3>Research Study Goal Examples</h3><p>Here are two examples of research study goals:</p><p><strong>Example one:</strong></p><p>Research statement: We want to better understand how users currently find content through the Merchant Portal in order to improve the experience for them (through a search functionality or otherwise).</p><p>Goals:</p><ul><li><p>Discover how users currently search through content within the Merchant Portal</p></li><li><p>Uncover users&#8217; pain points when it comes to finding content within the Merchant Portal</p></li><li><p>Identify how users think about search and their unmet needs when it comes to finding content within the Merchant Portal</p></li></ul><div><hr></div><p><strong>Example two:</strong></p><p>Research statement: We want to better understand how users interact with sustainable choices within our platform to create an experience that aligns with their needs and mental models.</p><p>Research goals:</p><ul><li><p>Evaluate how participants interact with the sustainability prototype</p></li><li><p>Uncover any pain points or confusion participants encounter when interacting with the prototype</p></li><li><p>Uncover other current tools participants use to understand their sustainability when choosing travel options</p></li></ul><p>I recommend, for each study, having no more than three goals. 
I&#8217;ve found that going over three goals increases the scope and makes it hard to get in-depth information on each goal.</p><h2>For Presentations or Reports</h2><p>It used to take me hours just to figure out what to put in my user research reports or presentations. I used to want to scream, staring at a blank page for what felt like <em>ever</em>, constantly asking myself what I could do to get people to care about my report. </p><p>After some time, I decided to take a step back and assess my research reports, like a retrospective on how I put them together. I mainly did this because I felt my reports had become dull and dry, causing people to yawn or disengage. This was the last thing I wanted. I needed colleagues to be engaged and get excited about my research findings, not take a snooze during my presentation.</p><p>I returned to the basics and asked myself, &#8220;Why should they care about this presentation/report?&#8221; </p><p>My simple answer was that it should help them make decisions. </p><p>That was an ah-ha moment for me. I used to organize my reports and presentations by themes (which I still occasionally do), but I decided to try a slightly different approach. I went straight to answering the research goals we aligned on at the beginning of the project. 
The incredible impact of this structure was that it directly answered what the stakeholders needed to know.&nbsp;&nbsp;</p><p>Let&#8217;s say the research goals were to:</p><ol><li><p>Understand people&#8217;s current mental models around deciding on where to travel next&nbsp;</p></li><li><p>Discover pain points behind deciding on where to travel&nbsp;</p></li><li><p>Identify the tools people currently use when getting inspired and deciding on where to travel to next</p></li></ol><p>So, instead of grouping by themes, I would structure it like:</p><ul><li><p>Research goal one title</p><ul><li><p>Finding one directly related to the research goal</p><ul><li><p>Evidence of finding one</p></li></ul></li><li><p>Finding two directly related to the research goal</p><ul><li><p>Evidence of finding two</p></li></ul></li><li><p>Finding three directly related to the research goal</p><ul><li><p>Evidence of finding three</p></li></ul></li></ul></li></ul><p>Whenever I use this structure to present my reports, people tend to be more engaged because the information in the report is directly related to what they care about and what can help them make better decisions.</p><p>If you aren&#8217;t necessarily presenting to a specific team or presenting a research report, you can ask yourself (and your audience) the following questions to help you define your presentation goals:</p><ul><li><p>Who is the audience?</p></li><li><p>What do they care about? </p></li><li><p>What decisions are they trying to make? What are the top three pieces of information to help them move forward in that decision?</p></li><li><p>What is the number one thing you want them to take away from the presentation?</p></li><li><p>What is the number one thing you want them to do after the presentation?</p></li><li><p>What is your ideal outcome of the presentation? 
What is <em>their</em> ideal outcome?</p></li><li><p>Why would they care about coming to this presentation?</p></li></ul><p>If I were presenting to high-level executives, some of my goals might include:</p><ul><li><p>Communicate the top three findings that impact the most relevant business metrics they care about</p></li><li><p>Get an idea of prioritization based on the research findings from a business perspective</p></li><li><p>Get buy-in for additional research studies and budget by presenting research&#8217;s impact on teams and the organization</p></li></ul><p>Always think about these goals because they will give you a structure for what you need to include and what you are asking participants to act on during or at the end of the presentation, making it clearer and more inspiring.</p><p><em>P.S.: If you are struggling to write insights, look at this <strong><a href="https://userresearchacademy.substack.com/p/write-impactful-user-research-insights">step-by-step guide on how to write impactful UXR insights</a></strong>.</em></p><h2>For Deliverables</h2><p>How many times has a stakeholder come to you asking for a particular deliverable? I can&#8217;t even count the number of times I&#8217;ve been asked to create personas, journey maps, mental models, or Jobs to be Done. And one of the worst things I&#8217;ve done is agree and move forward without getting context. It always resulted in the deliverables dying with a dusty cover. Or colleagues looking at the deliverables with confused expressions.</p><p>Although I did my best to avoid creating most deliverables for a while (<em>that</em> was interesting and funny), I couldn&#8217;t avoid them forever, and eventually, I had to create another set of personas. 
I didn&#8217;t want it to end up the same as before, so I knew I had to approach it differently.</p><p>The next time a stakeholder came to me with an idea for a deliverable, I asked:</p><ul><li><p>What decisions are you trying to make with the deliverable?</p></li><li><p>What type of information are you looking for?</p></li><li><p>How does the team best digest information?</p></li><li><p>What&#8217;s the ideal outcome of this deliverable?</p></li><li><p>What kind of visuals/deliverables have been helpful in the past? Why have they been helpful?</p></li></ul><p>Using this combination of information, I created goals that led me to a completely different deliverable than the one the stakeholder had requested. Originally, the stakeholder asked me to create a persona, but when I probed for information, I found they needed:</p><ul><li><p>An understanding of how people thought about a certain process</p></li><li><p>The areas/gaps where the product wasn&#8217;t supporting users in their tasks/needs</p></li><li><p>The features that we should de-prioritize because they weren&#8217;t helping users</p></li></ul><p>With this, I knew a persona inherently wouldn&#8217;t help the team with what they needed. Personas are much more geared toward building a deeper understanding of people&#8217;s needs, goals, and pain points, and can help us prioritize roadmaps and future work.</p><p>Initially, I was stuck. So I took their needs and turned them into goals:</p><ul><li><p>Communicate the process people go through when ordering meal kits, including how they feel about the process</p></li><li><p>Visualize where our product isn&#8217;t helping users and the tasks users are doing without our support</p></li><li><p>Highlight the features our users need and those that aren&#8217;t being used in people&#8217;s process of ordering meal kits</p></li></ul><p>With this, I quickly realized that a mental model diagram would be the perfect fit for the needs of this project. 
I then reverse-engineered the deliverable goals into study goals and aligned with the team to make sure this all made sense. It did, and guess what? They loved the mental model diagram.</p><p>So, when you are trying to choose a deliverable, always talk to your stakeholders to understand what they need and then use that to create your deliverable goals &#8212; it will help save you a huge headache!</p><h2>For Workshops</h2><p>Whatever the topic, I always create goals for my workshops and share them with my audience. Whenever I have been super transparent and up-front in my workshops about what people can expect, I&#8217;ve gotten much more out of the session. If people know what you need, they are better equipped to participate in a helpful way. </p><p>Whenever I am creating goals for my workshop, I start by asking myself:</p><ul><li><p>What do I want people to get out of this workshop?</p></li><li><p>What do I want to achieve by the end of the workshop?</p></li><li><p>What action do I want taken during or at the end of the workshop?</p></li><li><p>What is the ideal outcome of the workshop?</p></li></ul><p>For me, workshops typically fall into one of the following themes:</p><ol><li><p>Create or innovate on new ideas based on problem statements from research</p></li><li><p>Assess current ideas to identify problems before launch</p></li><li><p>React to research findings or post-launch learnings</p></li></ol><p>Each of these themes has goals that also correspond with different activities. </p><h3>Create or Innovate on New Ideas</h3><p>A workshop within this theme would aim to:</p><ul><li><p>Generate a specific number of ideas to test</p></li><li><p>Come together to solve complex user problems</p></li><li><p>Find the unknown unknowns</p></li><li><p>Gain a deeper understanding of pain points and how to solve them</p></li></ul><p>With this in mind, you would pick activities that would help you achieve these goals. 
Some activity examples would be:</p><ul><li><p>Empathy mapping</p></li><li><p>How Might We statements</p></li><li><p>Crazy 8s</p></li></ul><h3>Assess Existing Ideas</h3><p>You can use this theme after the team has created a prototype or concept. The goals for this theme are:</p><ul><li><p>Identify and solve problems in ideas before they launch</p></li><li><p>Get critical and early feedback from users</p></li><li><p>Come together to create a better solution</p></li><li><p>Create a specific number of new ideas to test</p></li></ul><p>When trying to get feedback on the prototype or concept within a workshop, you can try the following activities:</p><ul><li><p>Method 6-3-5</p></li><li><p>Usability speed testing</p></li><li><p>Do, undo, redo</p></li></ul><h3>React to Research or Post-Launch Findings</h3><p>Sometimes, you need a workshop to give your research that extra attention. At least, I know I do. Often, a report is insufficient, so holding a workshop to unite people is a great way to utilize your insights.</p><p>Some of the goals for these workshops include:</p><ul><li><p>Maintain launched products (ex: usability issues, customer support)</p></li><li><p>Gain a deeper understanding of your user</p></li><li><p>Create deliverables together</p></li></ul><p>For these workshops, there are several different activities you could use based on the goal:</p><ul><li><p>Maintain launched products &#8594; Gathering issues across departments and prioritizing them via the <a href="https://www.intercom.com/blog/rice-simple-prioritization-for-product-managers/">RICE model</a> (rather than dot voting) OR reviewing an <a href="https://userresearchacademy.substack.com/p/prioritize-qualitative-research-insights">opportunity gap survey</a></p></li><li><p>Gain a deeper understanding of your user &#8594; Reviewing research plus How Might We statements</p></li><li><p>Create deliverables together &#8594; Persona generation, journey mapping</p></li></ul><p><em>P.S.: Check out <strong><a 
href="https://userresearchacademy.substack.com/p/activating-your-insights">this article</a></strong> for more in-depth ways to activate your research insights!</em></p><h2>Join my membership!</h2><p>If you&#8217;re looking for even more content, a space to call home (a private community), and live sessions with me to answer all your deepest questions, <strong><a href="https://www.userresearchacademy.com/uxrmembership">check out my membership</a></strong> (you get all this content for free within the membership), as it might be a good fit for you!</p>]]></content:encoded></item><item><title><![CDATA[Creating and maintaining an outcome-based user research roadmap]]></title><description><![CDATA[How to organize your research so that it's aligned with the business]]></description><link>https://www.userresearchstrategist.com/p/creating-and-maintaining-an-outcome</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/creating-and-maintaining-an-outcome</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Wed, 29 Nov 2023 08:24:24 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!coge!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5453bad-49c7-4455-9469-274aa45164c3_2944x768.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#128075;&#127995;<em>Hi, this is Nikki with a&nbsp;</em>&#128274;<em>subscriber-only </em>&#128274;<em> article from User Research Academy. 
In every article, I cover in-depth topics on how to conduct user research, grow in your career, and fall in love with the craft of user research again.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p>Demonstrating impact is critical to our jobs as user researchers, especially recently. Everywhere you look, it&#8217;s all about impact. Showing impact, proving impact, sharing impact. </p><p>Often, the impact gets tied to the <em>output</em> of user research. Were the insights impactful? Did they spur action? What about the product changed? How did the stakeholders feel about the report?</p><p>I played the dangerous game of waiting until the insights came in to see how impactful the project was. This approach left me <em>hoping</em> my research would impact teams and the organization rather than knowing I was conducting the most influential research possible. </p><p>At one point, I ran several studies in a row that had landed on my desk without really thinking about how they tied back to the larger goals of the team and the organization. When it came to performance review time, I had <em>done</em> a lot of stuff, but there wasn&#8217;t much to show. It was disappointing for me, but a fantastic lesson.</p><p>A few months before, I had started a user research roadmap and backlog to better organize my upcoming projects and share them with my stakeholders. I&#8217;d &#8220;grown up&#8221; with product roadmaps being important, so I simply applied the same concept to my research projects. 
</p><p>However, over time, I saw my research roadmaps fall into the same trap many product roadmaps do &#8212; they became &#8220;feature factories&#8221; filled with projects focused on outputs rather than outcomes. I simply threw research projects on the roadmap without thinking much about how each project tied back to an outcome or the business. </p><p>While this worked for a while, it ended up not serving me. I was frustrated churning out projects as teams churned out features. The work felt disjointed from the &#8220;bigger picture,&#8221; but I didn&#8217;t want to let go of my roadmap. Although it wasn&#8217;t ideal, it gave me a great place to plan from and helped me stay transparent about what I was working on. </p><h2>What&#8217;s an outcome-based research roadmap?</h2><p>An outcome-based user research roadmap is a living, ever-evolving document that shows the work a user researcher (or user research team) is focused on and how it specifically relates to a larger business objective. It mainly includes projects the team will work on in the next quarter, half-year, or year. </p><p>The biggest difference between a general roadmap and an outcome-based roadmap is that, instead of feeling like a feature factory, all of the projects on the roadmap are tied to a specific outcome. </p><p>The roadmap demonstrates the different outcomes the team will work on and how they will achieve them.</p><h3>What do you mean by outcome?</h3><p>Outcomes can be difficult to measure in user research, but it is essential to consider how our work ties to larger team or company-based objectives/goals. </p><p>When I simply put projects on my roadmap without thinking about the larger goal, I spun my wheels doing the same work repeatedly and always failed to answer the question, &#8220;How does user research relate to the business?&#8221;</p><p>When it comes to outcomes, you have to look to your colleagues, teams, and organizations to help you. 
User research is a support system for decision-making and risk mitigation, so the outcomes of your research should support teams in the decisions they have to make and the risks they might be taking.</p><p>This concept can be tough to understand, so let&#8217;s look at some examples:</p><h4>Example one:</h4><p>Imagine we are working with the <strong><a href="https://open.substack.com/pub/userresearchacademy/p/how-to-conduct-retention-research?r=2j6x4d&amp;utm_campaign=post&amp;utm_medium=web">retention team</a> </strong>at Pokemon TCG Live (add me if you play - my username is nicolerothier), and the team&#8217;s outcome is to &#8220;improve our day-7 app retention rate by 10%.&#8221;</p><p>Essentially, when people sign up for the live Pokemon Trading Card Game, we want them to return to our app within seven days because, once they pass that seven-day threshold, they are more likely to play for longer and purchase more cards from the store.</p><p>We could likely brainstorm a million ways to do this. That means there are a million <em>risks</em> the team could take to try and move this metric. That&#8217;s where user research comes in and where we can start to tie our research back to an outcome.</p><p>Our research project becomes about mitigating the risks the team is taking when they work to improve this metric. It gives them more of a path or guidance toward making better decisions that resonate with users. 
So, within this project, our outcome would be: </p><p>&#8220;Reduce the risk of wasted time/energy when creating solutions to improve our day-7 app retention rate by 10%.&#8221;</p><p>You could also just tie the project directly to the team outcome, but I always like mentioning risk mitigation, help with decision-making, or reducing choice because it really is the essence of user research.</p><h4>Example two:</h4><p>For this example, I will demonstrate the difference between a more feature-based research project and an outcome-based research project.</p><p>Imagine we are working with a conversion team (focused on increasing conversions in our product) at a company called Spooky World, where we sell year-round Halloween decorations (anyone??? Let me know if you want to go in on this <em>fantastic</em> idea). The team&#8217;s questions are:</p><ul><li><p>&#8220;Should we add a quick buy button?&#8221;</p></li><li><p>&#8220;Should we add product reviews and photos?&#8221;</p></li><li><p>&#8220;Should we include tips on how to decorate? Or maybe create a blog?&#8221;</p></li><li><p>&#8220;Should we let users connect with each other and talk about tips? Should we create a Spooky Community?&#8221;</p></li></ul><p>These questions focus on features rather than an outcome or users. They can lead to many usability tests that, ultimately, don&#8217;t add much value to an organization or a product. When you go into a performance review, these types of projects can sometimes feel low-impact while still being a good amount of work.</p><p>Instead, you can reframe these questions around a larger outcome. Let&#8217;s take the quick buy button and the product reviews and photos feature. 
Instead of just talking about features, we could look at this as: &#8220;Improve conversion rate by 5% by simplifying the purchase process.&#8221;</p><p>Or, if we take the blog and community idea, we could tie it to a much larger objective of &#8220;Increasing customer basket size by 10% by helping customers understand how to use decorations together.&#8221;</p><p>Either way, we aren&#8217;t focused on &#8220;adding a button&#8221; or &#8220;creating a community,&#8221; but rather, we are mitigating risk by helping users and moving business metrics. </p><h4>Example three:</h4><p>The final example I want to share is for when you run into research projects that don&#8217;t feel particularly tied to an outcome, such as generative research. A lot of the time, generative research can be lofty and abstract or could have many potential outcomes across multiple teams.</p><p>For this example, let&#8217;s imagine we are working at Lego, and we are trying to understand our customers better by doing generative research to uncover their pain points, goals, and needs. We&#8217;re focusing primarily on parents because&#8230;money &#128513;</p><p>Within this context, what is our outcome? There are a few things we could tie this to:</p><ol><li><p>Break down exactly how this project could impact the current outcomes/goals your teams are working on. Generative research helps you gather a lot of useful information that could help all the major metrics (<a href="https://userresearchacademy.substack.com/p/how-user-research-impacts-the-aarrr">AARRR metrics</a>, for instance). For example, generative research could absolutely help improve retention rates by uncovering unmet needs.</p></li><li><p>Look beyond team-based metrics toward higher-level organization metrics. Generative research helps identify avenues for growth and innovation, which tend to be larger company goals.</p></li><li><p>Take into consideration internal outcomes for your team/yourself. 
You could have a goal of conducting more generative research (e.g., balancing evaluative and generative research better), so that could be an outcome you include in the study.</p></li></ol><p>Try not to tie it to a deliverable, because that&#8217;s an <em>output</em> rather than an outcome. If you are looking to create an output, ask yourself: what is the <em>outcome</em> that output will achieve? For example, the output of a project is a <strong><a href="https://userresearchacademy.substack.com/p/building-a-b2c-persona">persona</a></strong>. The outcome of the project is what that persona helps the team do. If we created a persona at Lego, we would want to tie it to an outcome such as creating user-centric product roadmaps or increasing retention rates by understanding and addressing unmet needs.</p><h2>What&#8217;s in an outcome-based research roadmap?</h2>
      <p>
          <a href="https://www.userresearchstrategist.com/p/creating-and-maintaining-an-outcome">
              Read more
          </a>
      </p>
]]></content:encoded></item><item><title><![CDATA[Treat stakeholders like users]]></title><description><![CDATA[The mindset shift that completely changed my career]]></description><link>https://www.userresearchstrategist.com/p/treat-stakeholders-like-users</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/treat-stakeholders-like-users</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Wed, 26 Jul 2023 13:33:11 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!CDfA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>&#128075;<em> Hey,&nbsp;Nikki&nbsp;here!&nbsp;Welcome to this month&#8217;s&nbsp;</em>&#10024;<em>&nbsp;<strong>free article&nbsp;</strong></em>&#10024;<em> of User Research Academy. Three times a month, I share an article with super concrete tips and examples on user research methods, approaches, careers, or situations.</em></p><p><em>If you want to see everything I post, subscribe below!</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><h1>What does &#8220;treat stakeholders like users&#8221; mean?</h1><p>I spent years trying to convince stakeholders of the importance and value of user research. It felt like I was constantly begging or compromising to feel heard and seen by stakeholders. Sometimes they didn&#8217;t value user research, couldn&#8217;t find time for it, or ignored the findings in favor of assumptions or opinions. 
As a result, I felt frustrated and helpless when working with stakeholders.</p><p>One day, I got sick of it. I didn&#8217;t want to feel like I was begging colleagues anymore. I was burnt out and tired of banging my head against the wall. But I wasn&#8217;t sure what to do instead.</p><p>I was in a meeting later that day, and one of the stakeholders was talking about how he felt like he was trying to convince users to use a certain feature, and how frustrating it was. I opened my mouth to say, for the millionth time, &#8220;We should do user research to understand our users.&#8221;</p><p>And then it clicked.</p><p>We constantly tell our teams to focus on our users. Whenever they aren't sure about something, we tell them to ask users. Whenever they want to make decisions, we beg them to keep users in mind. We ask them to empathize with users.</p><p>Yet, do we do this with stakeholders?</p><p>I would spend hours on reports or deliverables, and no one seemed to care. At one point, when I was working with a low budget, I created my own repository from scratch. I got a few developer friends to help, making it <em>perfect.</em> My stakeholders could not ignore research now because I had set them up for success. They didn't use the repository.</p><p>I spun in circles until I realized what I was doing. I created reports, personas, repositories, and even research projects, but these outputs were <em>mine.</em> I made them based on the rules and best practices I had learned over the years.</p><p>And that was the key for me. I'd committed the acts we begged teams not to do. I had ignored my users. I didn't think about their needs, goals, pain points, or motivations. Instead, I thought about what I was taught to do. And I made so many assumptions about what I learned was the right way of doing things.</p><p>Our stakeholders are our users, and it is crucial to think about them in this way. 
If we don't, we risk missing the mark on our deliverables and potentially our entire process. It&#8217;s the equivalent of asking a designer to create something without any user feedback.</p><p>This guide will help you shift your mindset and put in place strategies that help you align and collaborate much better with your stakeholders.</p><h2><strong>How to conduct stakeholder interviews</strong></h2><p>We need to empathize with our stakeholders because they are ultimately the users of our information. They use our products, such as reports, repositories, and deliverables. We ask them to be a part of our process.</p><p>If we don't understand our stakeholders, how do we create an experience for them that aligns with their needs and goals and that alleviates their pain points?</p><p>Stakeholder interviews are the perfect way to begin this empathetic process. You can even offer an incentive like coffee or lunch, or some sort of raffle for prizes. When I started doing stakeholder interviews, I bought lots of coffee and even gift certificates to favorite lunch spots.</p><h3><strong>Prepare for the interviews</strong></h3><p>Whenever I go into stakeholder interviews, I make sure to have a clear goal, outcomes, and a script. 
I treat this similarly to a research project.</p><p>Of course, goals vary depending on what information you need from stakeholders, but my most common goals for these stakeholder interviews are:</p><ul><li><p>Understand stakeholder needs when it comes to user research (as well as their learning needs and how they digest information)</p></li><li><p>Identify the pain points they have in their team, as well as with user research</p></li><li><p>Discover the goals they have for themselves and their teams</p></li><li><p>Uncover their previous experiences (and anxieties) with user research</p></li><li><p>Understand their mental model of user research and how they currently interact with user research and deliverables</p></li></ul><p>My outcomes include:</p><ul><li><p>Have a better understanding of what stakeholders are working on so that I can best align my work to support them (and show them the value of user research)</p></li><li><p>Understand their pain points about user research so that I can ensure the projects I pitch help ease those pain points and anxieties</p></li><li><p>Create and share a user research process that makes sense to me as well as to my stakeholders so that we are aligned and reduce any barriers to entry</p></li><li><p>Empathize with them in understanding how they interact with user research to make it the best possible experience</p></li></ul><p>Whenever I reach out to stakeholders to book this session, I typically invite them for a 60-minute 1x1 where I share:</p><p>&#8220;I would love to take this hour to understand more about your role, your goals for yourself and your team, as well as your previous experience with user research and your current experience with our research process. Think of this as a user interview, except this time you are the user!&#8221;</p><h3><strong>Conduct the interviews</strong></h3><p>There is something slightly easier about user interviews because, theoretically, you are likely never going to see the participant again. 
Or, at least for the most part, you aren&#8217;t working with them day-to-day (unless you are conducting a lot of internal/organizational research).</p><p>With stakeholder interviews, it can be tricky because sometimes they won&#8217;t want to share information that might hurt your feelings, so you have to do your best to remain as objective as possible and remind them of this. My introduction calls this out and reassures them that I am really there to learn and improve with their feedback and can&#8217;t do it without their honest thoughts!</p><p>Here is an example of my intro:</p><p>&#8220;I&#8217;d love to chat today about your current goals for yourself and your team, as well as any needs you have for user research that come to mind. Additionally, I want to hear about any pain points you have with user research, based on your previous experience as well as our current process and approach. I know this might be tough to share, but I really appreciate your honest feedback because that is the only way we can improve and make this as seamless an experience for you as possible! My outcome is to truly understand your experiences and thoughts, so please feel free to share - you won&#8217;t hurt my feelings!&#8221;</p><p>In terms of the interview questions, I generally follow the <strong><a href="https://www.youtube.com/watch?v=WCq31cZOF5w">TEDW principle</a></strong>, but, with stakeholder interviews, we can be a bit more leading in our questions and even ask some future-based questions. However, whenever you can, try to follow open-ended question techniques!</p><p>I organize my questions via my goals, so my script could look something like:</p><p><strong>Goals, needs, and pain point questions</strong></p><p>Within the first section, I dive into the following questions:</p><ul><li><p>What are your main goals day-to-day? For this quarter? Beyond?</p></li><li><p>What would you like to achieve yourself? 
With your team?</p></li><li><p>What are some metrics/OKRs that you would like to improve? Why?</p></li><li><p>What would the ideal outcome be by the end of this quarter? The end of the year?</p></li><li><p>What areas are you struggling with when it comes to achieving these goals?</p></li><li><p>What are some of your ideas to achieve these goals/OKRs/metrics?</p></li><li><p>What are generally some areas in the process that you struggle with? How would you improve this?</p></li></ul><p><strong>Previous experience with UX research questions</strong></p><p>To understand more about their previous (and future) experience with user research, I ask:</p><ul><li><p>If I asked you to define user research for me, how would you explain it?</p></li><li><p>Have you ever worked with a user researcher before? If yes, tell me about the experience.</p></li><li><p>How do you feel about user research?</p></li><li><p>Tell me what happened the last time you did user research.</p></li><li><p>Tell me about a time when research went poorly. What happened? How would you improve it?</p></li><li><p>What are the most significant barriers you feel to conducting or including user research?</p></li><li><p>How could we improve our relationship?</p></li><li><p>How could you imagine user research helping you with (the goals, needs, and pain points mentioned above)?</p></li></ul><p>In my membership, one of the templates is a huge interview script filled with stakeholder alignment questions to make sure you aren&#8217;t missing anything!</p><h3><strong>Analyze the interviews</strong></h3><p>Just like with any other user research project, analyzing the information makes it actionable! 
I create an affinity diagram (you can use<a href="https://miro.com/miroverse/user-research-debrief-and-synthesis/?social=copy-link"> </a><strong><a href="https://miro.com/miroverse/user-research-debrief-and-synthesis/?social=copy-link">this board</a></strong> as a jumping off point) for each stakeholder, categorizing each space with:</p><ul><li><p>Goals</p></li><li><p>Needs</p></li><li><p>Pain points</p></li><li><p>Previous experiences</p></li><li><p>Process feedback</p></li></ul><p>As an example, I'm working with a product manager on the retention team at a meal kit subscription company.</p><p>The first thing I do is my stakeholder interview and affinity diagram:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!CDfA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!CDfA!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png 424w, https://substackcdn.com/image/fetch/$s_!CDfA!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png 848w, https://substackcdn.com/image/fetch/$s_!CDfA!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png 1272w, 
https://substackcdn.com/image/fetch/$s_!CDfA!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!CDfA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png" width="1016" height="1040" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/be1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1040,&quot;width&quot;:1016,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!CDfA!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png 424w, https://substackcdn.com/image/fetch/$s_!CDfA!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png 848w, https://substackcdn.com/image/fetch/$s_!CDfA!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png 1272w, 
https://substackcdn.com/image/fetch/$s_!CDfA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe1a4f8b-57a8-4015-a00b-c724e67a50af_1016x1040.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>I then identify any themes across stakeholders to highlight things like (we will dive deeper into this in the next section):</p><ul><li><p>Where I can make improvements in the user research process</p></li><li><p>Pain points that stakeholders are having that I can address</p></li><li><p>Research needs or goals that span different teams, which could become 
really holistic and impactful research projects</p></li></ul><p>This is the first step in beginning to work with and align better with your stakeholders, and it leads perfectly into the next step. Please try not to skip these interviews, as they give us the context we need to make the rest successful!</p><p>Whether you have been at your organization for a week, six months, or two years, it is never too late to conduct stakeholder interviews and go through this process!</p><h1>Align your work with stakeholders&#8217; needs and goals</h1><p>Now that we have all this information from stakeholders, we can get to work on aligning with them so that we are doing our most impactful and valuable work. Let&#8217;s go through the steps of using the information we found to collaborate with our stakeholders.</p><h3><strong>Form potential research ideas based on your affinity diagram</strong></h3><p>Now that you have all this amazing data, it&#8217;s time to take some ideas and make them concrete. Through this affinity diagram, there is information on stakeholders&#8217; needs and goals for their teams. 
The most impactful research we can do (and the easiest way to show value to teams) is if we conduct research that immediately and positively impacts their needs and goals.</p><p>Let&#8217;s take a look at the example above:</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Q1xo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Q1xo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png 424w, https://substackcdn.com/image/fetch/$s_!Q1xo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png 848w, https://substackcdn.com/image/fetch/$s_!Q1xo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png 1272w, https://substackcdn.com/image/fetch/$s_!Q1xo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Q1xo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png" width="880" height="226" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:226,&quot;width&quot;:880,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Q1xo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png 424w, https://substackcdn.com/image/fetch/$s_!Q1xo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png 848w, https://substackcdn.com/image/fetch/$s_!Q1xo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png 1272w, https://substackcdn.com/image/fetch/$s_!Q1xo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd61ec6a-ec8c-491d-ad1a-e0a205f3bf06_880x226.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>From this information, my stakeholder needs to understand:</p><ul><li><p>Why people are canceling their boxes</p></li><li><p>What pain points or unmet needs are coming up</p></li><li><p>How to alleviate those unmet needs or pain points</p></li><li><p>What is missing or confusing about the experience</p></li></ul><p>This gives me some potential ideas already for user research that 
I can jot down. If you want to go one step further at this stage, you can already start to prioritize the potential project ideas. Whenever I prioritize, I use an impact and effort matrix, which looks at how much impact the research project would have on the team/organization and how much effort the end-to-end project would take.</p><p>Additionally, you can look across stakeholders to identify similar themes/patterns in needs to think through some potential cross-functional research that would positively impact multiple teams!</p><h3><strong>Brainstorm ways to address their pain points or anxieties</strong></h3><p>Before running straight into a research plan, it&#8217;s essential to look into what their pain points or anxieties surrounding research are. If we tackle these up-front, we are more likely to ease stakeholders&#8217; minds and get buy-in for our research. I always say that transparency and empathy are key, so addressing pain points head-on is super effective.</p><p>Going back to our example:</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cgy5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!cgy5!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png 424w, https://substackcdn.com/image/fetch/$s_!cgy5!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png 848w, 
https://substackcdn.com/image/fetch/$s_!cgy5!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png 1272w, https://substackcdn.com/image/fetch/$s_!cgy5!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!cgy5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png" width="708" height="238" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:238,&quot;width&quot;:708,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!cgy5!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png 424w, https://substackcdn.com/image/fetch/$s_!cgy5!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png 848w, 
https://substackcdn.com/image/fetch/$s_!cgy5!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png 1272w, https://substackcdn.com/image/fetch/$s_!cgy5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe267fc50-122e-47f6-9bbe-130b2e2d14b4_708x238.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>To address these anxieties, I would:</p><ul><li><p>Ensure the goals of the research address the business problem and metrics</p></li><li><p>Make sure we are getting information that can help make decisions to move the metrics further</p></li><li><p>Use a mixed-methods approach to help with small sample size anxiety</p></li><li><p>Find an approach that delivers value quickly (even if it's only part of the study)</p></li></ul><h3><strong>Understand their overarching needs &amp; learning style to choose a deliverable</strong></h3><p>Now on to the next step, which involves diving a bit deeper into a particular project. At this stage, I float a research idea they mentioned that I think would be a really impactful project. If you did your prioritization exercise above, then you can use the most prioritized projects to get a bit more information. 
If you haven&#8217;t already prioritized your ideas, now is a great time to do so.</p><p>Once you pick 1-2 projects, you need to get a slightly deeper understanding of what type of information the stakeholder needs and the decisions they are trying to make.</p><p>For this step, I simply send them the following prompt to fill out:</p><p>I need (information needed) to answer (questions they have) by (x timeline) in order to make (the decisions they need to make).</p><p>If you want to do this in a more interview-style format, you can use the following questions:</p><ul><li><p>What type of information do you need at the end of the project?</p></li><li><p>What decisions do you want to be able to make?</p></li><li><p>What are the top three questions you need answered?</p></li><li><p>In which ways do you best digest information?</p></li><li><p>How could you imagine seeing these results?</p></li></ul><p>Let&#8217;s look at a quick example of answers to these questions and potential outcomes:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!A2LU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!A2LU!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png 424w, https://substackcdn.com/image/fetch/$s_!A2LU!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png 848w, 
https://substackcdn.com/image/fetch/$s_!A2LU!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png 1272w, https://substackcdn.com/image/fetch/$s_!A2LU!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!A2LU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png" width="1456" height="592" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:592,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!A2LU!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png 424w, https://substackcdn.com/image/fetch/$s_!A2LU!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png 848w, 
https://substackcdn.com/image/fetch/$s_!A2LU!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png 1272w, https://substackcdn.com/image/fetch/$s_!A2LU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78b8badd-5fd5-4da2-baab-474adb10fb65_1600x650.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>This information will really help you with going deeper into the goals of a project, the timeline, and the type of deliverable that 
might be best!</p><h3><strong>Brainstorm ways to improve the current process</strong></h3><p>Before creating the research plan or roadmap, we want to brainstorm ideas and ways to improve the current process. So, taking in the feedback, what are some ways to optimize or make the user research process smoother?</p><p>Some ideas could include (but these should be based on feedback!):</p><ul><li><p>Including an<a href="https://dscout.com/people-nerds/user-research-request"> </a><strong><a href="https://dscout.com/people-nerds/user-research-request">intake form</a></strong> to get research requests easily in one place and reduce meetings</p></li><li><p>Creating transparency by having a clear<a href="https://dscout.com/people-nerds/prioritizing-user-research-projects"> </a><strong><a href="https://dscout.com/people-nerds/prioritizing-user-research-projects">prioritization process</a></strong></p></li><li><p>Reducing barriers by including a<a href="https://docs.google.com/spreadsheets/d/1nPS1ndjaUxHQctOymkht64twMnpw1FKP/edit?usp=sharing&amp;ouid=110218421796148629396&amp;rtpof=true&amp;sd=true"> </a><strong><a href="https://docs.google.com/spreadsheets/d/1nPS1ndjaUxHQctOymkht64twMnpw1FKP/edit?usp=sharing&amp;ouid=110218421796148629396&amp;rtpof=true&amp;sd=true">sign-up sheet</a></strong> for each project</p></li><li><p>Including<a href="https://dscout.com/people-nerds/how-to-write-a-user-research-plan-that-sets-your-project-up-for-success"> </a><strong><a href="https://dscout.com/people-nerds/how-to-write-a-user-research-plan-that-sets-your-project-up-for-success">research plans</a></strong> in your project as an easy way to align</p></li></ul><h3><strong>Pitch your research plan (or roadmap) based on their goals and needs</strong></h3><p>Once you have all the above information, it&#8217;s time to share a research plan or a roadmap based on the goals and needs you identified. 
This research plan or roadmap can be for an individual stakeholder or for a group of stakeholders if you&#8217;ve identified cross-functional research projects.</p><p>Let&#8217;s look at an example from the project above:</p><p><strong>Research plan</strong></p><ul><li><p>Goals:</p><ul><li><p>Understand the top reasons people cancel their box after discount period</p></li><li><p>Uncover the decision-making process of box cancelation after discount period</p></li><li><p>Identify the major pain points with the box before cancelation during discount period</p></li></ul></li><li><p>Success metrics: Fewer cancelations after the discount period &amp; more boxes/customer</p></li><li><p>Methods: Survey + 1x1 interviews (start fast)</p></li><li><p>Recruitment:</p><ul><li><p>Survey: 150 people who recently canceled after a discount period ended + 150 who are currently in the discount period</p></li><li><p>1x1 interviews: 15 people who recently canceled after discount period</p></li></ul></li><li><p>Deliverable: Report with visuals + recommendations to make changes based on pain points &amp; needs</p></li><li><p>Next steps: Ideation workshop to solutionize the pain points</p><ul><li><p>Outcome: Prototypes + ideas to test with users</p></li></ul></li><li><p>Timeline: 4-5 weeks (survey results ready in 1.5 weeks)</p></li></ul><p>This research plan addresses:</p><ul><li><p>Slow research anxiety by including a quantitative component that gets us some quick intermediate insights</p></li><li><p>Small sample size worry by including a survey</p></li><li><p>Business and team goals directly through the research goals and the ideation session to immediately create solutions</p></li><li><p>The way people digest the research by including a few different types of visualization in the report</p></li></ul><p>Taking the information you learned and using that to curate your research plan will get you much more buy-in from stakeholders because you will have addressed their anxieties as well 
as highlighted the direct impact research can have on their goals and needs!</p><p>Instead of a research plan, you can put together a roadmap of various projects.<a href="https://dscout.com/people-nerds/research-roadmap"> </a><strong><a href="https://dscout.com/people-nerds/research-roadmap">Check out this article</a></strong> to learn more about creating a roadmap and backlog (+ a template to get you started)!</p><h3><strong>Put constant feedback in motion</strong></h3><p>Getting feedback from your stakeholders doesn&#8217;t have to (and shouldn&#8217;t!) be a one-time project. The most success I&#8217;ve had in getting buy-in and engagement from stakeholders has come when stakeholders had the space to give me consistent feedback. Some really great ways to get continuous feedback from stakeholders are:</p><ul><li><p>Holding<a href="https://dscout.com/people-nerds/user-research-retrospective"> </a><strong><a href="https://dscout.com/people-nerds/user-research-retrospective">retrospectives</a></strong> after each (or every few) research projects</p></li><li><p>Implementing a stakeholder satisfaction survey</p></li><li><p>If you have a team, holding<a href="https://dscout.com/people-nerds/user-research-reviews"> </a><strong><a href="https://dscout.com/people-nerds/user-research-reviews">regular research reviews</a></strong></p></li><li><p>Using a<a href="https://miro.com/miroverse/failure-journal/?social=copy-link"> </a><strong><a href="https://miro.com/miroverse/failure-journal/?social=copy-link">"Failure" journal</a></strong> to help track your projects, improvements, and progress</p></li></ul><h1>How to work with stakeholders who &#8220;don&#8217;t care&#8221;</h1><p>As researchers, we got into user research because we love it. Talking to people, trying to understand them, bringing data together, and sharing information with others is exciting. However, it may not be everyone's cup of tea (or we'd have way too many researchers). 
We are specialists, and not every stakeholder will love our specialty. The best we can do is show them how we can help. Because even if they don't love user research, they can appreciate it.</p><p>It took me up to eight months to shift some relationships with stakeholders. Expect it to take three to six months, sometimes longer, to change these relationships and gain trust. Keep at it and be consistent - building trust can take a lot of time!</p><p>If your stakeholders <em><strong>still</strong></em> don&#8217;t care or won&#8217;t talk to you, here are some things you can still do to be effective in your role WHILE building your case studies:</p><ul><li><p>Do product-team agnostic research (such as personas, journey maps)</p></li><li><p>Look into data analytics, past research, or customer support to identify research projects</p></li><li><p>Do internal research (if applicable) if you can't get access to users</p></li><li><p>Work with any champions you can</p></li><li><p>Do user research for other departments (ex: marketing)</p></li></ul><p>And, once you have the space to apply and interview at other companies, do so!! 
You deserve to be valued!</p><h1>Your next steps:</h1><ul><li><p>Set up meetings with stakeholders to conduct your stakeholder interviews</p></li><li><p>Remind stakeholders you are there to support them with their work</p></li><li><p>Gather the data and highlight their goals, pain points, and needs</p></li><li><p>See if there are any similarities for larger projects across teams (huge impact!!)</p></li><li><p>Brainstorm the projects that will immediately help the team</p></li><li><p>Do impactful research!</p></li><li><p>(Make sure you are continuing to learn)</p></li><li><p>Rinse &amp; repeat</p></li></ul><p>If you loved this and want to dive deeper into these topics while having a community of support to help you with these issues, you would enjoy my<a href="https://www.userresearchacademy.com/uxrmembership"> </a><strong><a href="https://www.userresearchacademy.com/uxrmembership">user research membership</a></strong>!</p>]]></content:encoded></item><item><title><![CDATA[How user research impacts the AARRR metrics]]></title><description><![CDATA[Let&#8217;s channel some pirates]]></description><link>https://www.userresearchstrategist.com/p/how-user-research-impacts-the-aarrr</link><guid isPermaLink="false">https://www.userresearchstrategist.com/p/how-user-research-impacts-the-aarrr</guid><dc:creator><![CDATA[Nikki Anderson]]></dc:creator><pubDate>Wed, 19 Jul 2023 09:42:55 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!LPon!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>&#128075; Hey, I&#8217;m Nikki. Each week I write about UX research strategy, communicating impact, and using AI to do your best work. 
For more: <a href="https://claudeskills.uxrstrategist.com/">Claude Skills Bundle</a> | <a href="https://www.uxrstrategist.com/uxr-ai-prompt-library">AI Prompt Library</a> | <a href="https://www.dropinresearch.com/">Team Training</a></em></p><div><hr></div><p><em>P.S. Paid subscribers get access to the full archive, all content, a private Slack community, Substack lives, and a hub of templates, scripts, and mini-courses</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.userresearchstrategist.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.userresearchstrategist.com/subscribe?"><span>Subscribe now</span></a></p><p>Tying the impact of research to metrics can be challenging. I struggled with it for a large part of my career. How did I take a 1x1 interview and show how it impacted our revenue? Often, it felt like a lofty goal that I would never accomplish, and for a while, I left it by the wayside. Metrics were for other roles, not user research.</p><p>While that attitude served me for a little while, there was a bumpier road ahead that I couldn&#8217;t see. After a round of layoffs and struggling to find a new role, I finally accepted a job offer.</p><p>However, I quickly realized how much of an uphill battle my role would be because no one, besides the person who hired me, believed in user research. I didn&#8217;t know people could self-select into believing or not believing in a literal craft, but here we were. I was terrified of losing another job and being unable to pay rent, so I tried to embrace the challenge.</p><p>The number one question I continuously got bombarded with was, &#8220;What is the impact of user research?&#8221;</p><p>I had the jaws of a fish as I opened and closed my mouth, unable to create a concrete answer. 
Eventually, I defaulted to the only things I knew to say:</p><ul><li><p>&#8220;We understand our customers better so we can make customer-centric decisions&#8221;</p></li><li><p>&#8220;If we understand people&#8217;s needs or pain points, we create relevant products for them.&#8221;</p></li><li><p>&#8220;We can reduce time guessing or basing ideas off assumptions that might fail.&#8221;</p></li></ul><p>Although I had witnessed user research do many amazing things, I couldn&#8217;t fully articulate the impact in a way my colleagues understood or cared about. This plagued me. I constantly felt defeated and unvalued because I didn&#8217;t know how to tie something like qualitative research back to what people cared about: money.</p><p>And for a long time, I &#8220;stood my ground,&#8221; which really meant that I was hugely antagonistic and stubborn when it came to incorporating business into user research. I used to say that I, as a user researcher, was not there for the business and didn&#8217;t care about the business. I was there for the user. And usually, in my mind, the business and the user were pitted against each other as villain and victim. </p><p>To get to the point, this <strong>did not</strong> work out for me. </p><p>By creating a mindset and environment where it was me + users versus the business, I was stuck in the middle of a complete mess. I lacked trust with my colleagues, got into fights (literally, I yelled at people &#128584;), and spun on the hamster wheel of trying to prove the value of user research without business.</p><p>Fast forward a few months to when I sat in a performance review. My craft (interviewing, usability testing, synthesizing) was spot-on. I was good, if not great, at conducting user research. But I had a huge glaring gap in skills like stakeholder management, tying research to the business, and workshop facilitation. 
</p><p>I was super bummed about that performance review and, unfortunately, didn&#8217;t have much guidance on how to make it better. Was I going to lose my job over it? Likely not. However, I quickly saw there was no way I was going to advance in my career if I didn&#8217;t figure something out. And I knew exactly what I needed to work on.</p><h1>Formulating a plan</h1><p>It took me a few weeks to do some research and formulate a plan for how I was going to tackle this issue.</p><p>The first thing I did was research &#8220;business.&#8221; It was tough because it was such a broad area and scope. I wasn&#8217;t super familiar with how businesses operate, what goals they had, or what metrics were important to them, so it took a lot of Googling and some embarrassing question-asking. </p><p>I learned how important revenue was to a company &#8212; it shouldn&#8217;t have been such an &#8220;ah-ha&#8221; moment, but in my need to be so user-centric, I lost complete vision of the holistic picture.</p><p>A company (usually) needs to make money to create a product/service. Customers want a product/service that helps them achieve their goals or alleviate a pain point. When they find that experience, they give the company money. </p><p>A product/service aligned with customers&#8217; needs = more money for a company.</p><p>User research could help determine the experiences, needs, and pain points of users to increase the amount of money a company was making.</p><p>Finally, I started to wrap my head around this concept, but I still wasn't sure how to apply it because that sounded like one of the fishy and vague answers I gave about user research impact. I wanted something more concrete.</p><h2>Stakeholder interviews</h2><p>I was in the green with some of my stakeholders, and definitely not a fan favorite with others (remember those fights I spoke about?), so this part was very challenging for me at first. 
But I knew I needed to talk to my stakeholders to learn more about their goals and the business. Without this, how was I going to draw concrete ties between user research and impact?</p><p>Biting my tongue, I bought a lot of &#8220;I&#8217;m sorry, please talk to me&#8221; lunches and coffees. I started each conversation by apologizing for any disagreements and explaining why I had acted that way. Then I spoke about what and how I was trying to change. Most of my stakeholders were super kind and understanding, and they also apologized back. For a few, we were never really able to repair the relationship, but, hey, 80/20 rule, right?</p><p>As I spoke to these stakeholders, I learned about goals and started to see patterns and trends evolve in what they were talking about. A lot of the concerns and goals centered on the same terminology. With this, I went back to Google to further investigate.</p><h1>Uncovering the pirate metrics</h1><p>Through my research, I stumbled on something called the pirate metrics, aptly named for the acronym: AARRR.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!LPon!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!LPon!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png 424w, https://substackcdn.com/image/fetch/$s_!LPon!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png 848w, 
https://substackcdn.com/image/fetch/$s_!LPon!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png 1272w, https://substackcdn.com/image/fetch/$s_!LPon!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!LPon!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png" width="1413" height="1204" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1204,&quot;width&quot;:1413,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:75730,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!LPon!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png 424w, https://substackcdn.com/image/fetch/$s_!LPon!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png 848w, 
https://substackcdn.com/image/fetch/$s_!LPon!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png 1272w, https://substackcdn.com/image/fetch/$s_!LPon!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff1a6aa6a-1c01-4c54-b9eb-a756ad34e105_1413x1204.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>This model, coined by Dave McClure* in 2007, highlights the five most critical metrics for businesses to track for success. 
Not only did these make sense to me based on my desk research, but they were also commonly referred to in my stakeholder interviews. </p><p>I wanted to understand how to concretely tie user research to these hugely important metrics. With that, I would no longer feel as much like a fish out of water. Instead, I could start to answer the questions about my impact more confidently and know that I was helping the business <em>and</em> users. </p><p><em>*Dave McClure is not a stunning human, having been accused of sexual harassment. I&#8217;m not a fan of him but wanted to cite the original source of the pirate metrics. While he sucks, hopefully, we can leverage this model for the greater good.</em></p><h2>Tying user research to the metrics</h2><p>Off I went on this adventure that would change my career forever. I dedicated as much time as possible to understanding these pirate metrics and figuring out how I could tie user research projects to each of them.</p><p>This is everything I learned and still practice to this day for each of the pirate metrics. I&#8217;ll be using a specific example from a travel company I used to work for.</p><h3>Acquisition - how people find and are introduced to your product/service</h3><p>Acquisition is all about getting new customers into your product/service so that they, essentially, know it exists. There are many ways a company can do this, such as:</p><ul><li><p>SEO</p></li><li><p>Marketing (including email and social media)</p></li><li><p>Sales</p></li><li><p>Paid advertising</p></li></ul><p>When I was working at the travel company, acquisition was hugely important to us because we didn&#8217;t have complete market share and, instead, shared the space with quite a few competitors. Getting customers without a ridiculously high customer acquisition cost (CAC) was incredibly important to our revenue. 
</p><p>We struggled a lot with finding new customers because of the sheer amount of competition out there and also because of the trust (or lack thereof) that comes with using a third-party ticketing product.</p><p>However, I knew how important it was to think about helping improve our acquisition, so I met with the acquisition product manager and we brainstormed the most important metrics within the acquisition space:</p><ul><li><p>Increasing traffic to our main page</p></li><li><p>Reducing bounce rate from our main page (without any clicks)</p></li><li><p>Increasing time spent on page</p></li><li><p>Understanding the breakdown of where our traffic is coming from</p></li></ul><p>These are all relatively high-level, top-of-funnel (TOFU) metrics and, to me, they seemed quite broad and generic, but it was the best we could do. <em>(PS - these things all take time to learn and I&#8217;m still learning more about metrics/business, so always take the time to experiment.)</em></p><p>With that, I started to identify some research projects I could do to help these TOFU metrics (I love calling something in product Tofu). I came up with the following projects:</p><ol><li><p>Content testing through a <strong>highlighter test</strong>. The reason I decided on a highlighter test was that it is a great method to help determine value proposition and what information is necessary to help users achieve their goals. It can also reduce the amount of text on a page to focus on what is essential to the user. 
During this test, I copied and pasted the text from our homepage into a Google Doc and asked participants to use three colors:</p><ol><li><p>Green = text that was helpful to them</p></li><li><p>Orange = text that was confusing</p></li><li><p>Red = text that was unnecessary </p></li></ol></li></ol><p>After they highlighted, we went through the content and I asked them follow-up questions on why they highlighted certain things in the specific colors, how they might reword confusing content, and whether there was content missing that might be helpful. </p><p>We then went on to test A/B versions of copy and content to see what was most effective. This was to help <strong>reduce bounce rate</strong> and <strong>increase time spent on page</strong>.</p><ol start="2"><li><p><strong>Five-second tests</strong>. Once we updated our copy, I wanted to use five-second tests to understand what message we were communicating to our users. Could people understand we were a ticketing platform? Did they understand what we were able to give them or what needs we were trying to meet? We used this to continue to refine our message and clarify how we could help users achieve their goals from the moment they laid eyes on our platform. This was also to <strong>reduce bounce rate</strong> and <strong>increase time spent on page</strong>.</p></li><li><p><strong>Closed word choice survey. </strong>Finally, we wanted to understand the types of words people associated with us, namely around the &#8220;trust family,&#8221; since trust was a huge concern given we were a third-party ticket platform. I set up a survey with several images of our website and sent it out to participants asking them to select all the words they would use to describe our platform. 
I used words like:</p><ol><li><p>Empowering</p></li><li><p>Approachable</p></li><li><p>Disconnected</p></li><li><p>Friendly</p></li><li><p>Irrelevant</p></li><li><p>Patronizing</p></li><li><p>Untrustworthy</p></li><li><p>Trustworthy</p></li><li><p>Skeptical</p></li><li><p>Easy</p></li><li><p>Relevant</p></li><li><p>Simple</p></li></ol></li></ol><p>I also followed up with some open-ended questions, such as &#8220;please describe why you chose those words,&#8221; to try to get further insight. We ended up contacting some respondents for quick interviews to get better insight into why they chose certain words. This was also to <strong>reduce bounce rate</strong>.</p><ol start="4"><li><p>Lastly, I did some work with the <strong>four forces diagram</strong> in Jobs to be Done. This looks at why people stay with or switch between products when it comes to their habits, anxieties, pushes, and pulls. I conducted about 15 interviews with non-users of our product to understand a bit about why they used several different competitors and why they switched between them. It was super interesting to learn about people&#8217;s anxieties and habits when it came to travel, and this helped us create messaging that fostered trust and positioned us as a reliable platform. This helped a lot with <strong>reducing bounce rate</strong>.</p></li></ol><h3>Activation - how people begin to use your product/service</h3><p>It&#8217;s all good if people find you, but that first interaction is <em>key</em>. After running my business for almost two years full-time now, I know how important it is to activate users and get them to take that first step with you.</p><p>There are many ways to activate users, and it hugely depends on your product/service/organization, but when you think about activation, think about the primary conversion metrics that determine the success of certain channels and campaigns, rather than high-level or micro-conversions. 
This could look like a funnel:</p><ol><li><p>Someone comes to your website</p></li><li><p>They see value in your work</p></li><li><p>They try a free trial, book a demo, or sign up for a newsletter</p></li></ol><p>Activation is the beginning of your relationship with the customer. Until this point, they&#8217;re anonymously researching your business and competitors, without taking any specific action that allows you to begin directly engaging with them.</p><p>When it came to the travel company I was working at, we actually didn&#8217;t have too many activation channels besides &#8220;booking a ticket,&#8221; which is a conversion, though there were many steps prior to it. So, I sat down with the product manager and we thought through some activation metrics:</p><ul><li><p>Increase number of newsletter signups </p></li><li><p>Increase number of trip searches</p></li><li><p>Increase first-time ticket purchases </p></li></ul><p>It was tough for me to apply user research here because I wasn&#8217;t sure exactly how to impact activation without going into full-blown conversion rate mode. I also wanted to test some of the other metrics outside of purchasing a first ticket, like searching or signing up for our newsletter.</p><p>So I came up with a few project ideas to try to help move these metrics:</p><ol><li><p><strong>Walk-the-store interviews.</strong> I spoke to about 15 users who hadn&#8217;t yet purchased with us to understand how they felt when they landed on our page and when they were searching for a trip. In these interviews, they shared their screen and took me through their reactions and perceptions of what they saw and what they were doing. It was very much qualitative and gave us some great feedback on confusing elements and components, as well as some glaring mistakes in the experience. This was to help <strong>increase trip searches.</strong></p></li><li><p><strong>1x1 interviews with first-time purchasers. 
</strong>I then wanted to dive into the first-time purchasing experience with users, so I screened for people who had recently purchased their first ticket with us. The reason behind this (and me <em>not</em> doing a usability test at this phase) was to understand the qualitative side of their experience. How had it been for them? What had been confusing? What had been missing? Another 15 people walked us step-by-step through their first experience, which surfaced even more pain points and improvements we could make. This study hugely helped increase the number of people who <strong>purchased their first ticket</strong> since we could streamline the experience for them.</p></li><li><p>I struggled quite a lot with newsletter sign-ups and ended up continuously surveying our audience to understand why they signed up for our newsletter. I knew it would be tough to understand why people <em>didn&#8217;t</em>, so I decided instead to tap into what we had and could easily find out. Through understanding why people signed up and the value they got from it, we were better able to articulate that in our content. We also had an unsubscribe survey to understand why people unsubscribed and find improvements from that side. There were fewer responses to the unsubscribe survey, but we were still able to take some actions from it. This hugely helped with <strong>increasing newsletter sign-ups.</strong></p></li></ol><h3>Retention - how people come back to and continue to use your product/service</h3><p>Ah, the bread and butter. We get people, but nothing is better than <em>keeping people </em>(as serial killer-ish as that sounds). Retention is king/queen/royalty because when people purchase multiple times from you, their value to you as an organization skyrockets. 
</p><p>There are many ways to measure retention, like:</p><ul><li><p>Customer lifetime value</p></li><li><p>Churn rate</p></li><li><p>Returning to your website</p></li><li><p>Opening emails repeatedly </p></li><li><p>Checking your product repeatedly in a given timeframe</p></li></ul><p>Now, I&#8217;m just going to say it: churn is a tricky subject, which I will cover in its own full article because churn research gives me the heebie-jeebies &#128561;.</p><p>Retention was another big hitter for us since our CAC was so high. We were getting customers, but we weren&#8217;t retaining them, which was not helping grow our revenue. So we really focused on the following metrics:</p><ul><li><p>Increasing repeat order rate and customer lifetime value</p></li><li><p>Increasing repeat searches </p></li><li><p>Increasing the number of accounts made </p></li></ul><p>Retention research is the most fun for me, so I really leaned into these projects and ran:</p><ol><li><p><strong>1x1 generative research interviews. </strong>Retention is all about creating a product that helps people achieve their goals more effectively and efficiently than competitors while alleviating any pain points along the way. To me, generative research interviews are a slam dunk for getting that information. I took these interviews away from the product and into the complexities of planning a trip from end to end, including users&#8217; needs, goals, and pain points. 
I used the TEDW framework to keep questions open-ended:</p><ol><li><p>&#8220;<strong>T</strong>ell me&#8230;&#8221; or &#8220;<strong>T</strong>alk me through&#8230;&#8221;</p></li><li><p>&#8220;<strong>E</strong>xplain&#8230;&#8221;</p></li><li><p>&#8220;<strong>D</strong>escribe&#8230;&#8221;</p></li><li><p>&#8220;<strong>W</strong>alk me through&#8230;&#8221;</p></li></ol></li></ol><p>This study gave us a clear understanding of what our customers needed and where we were failing to meet those needs and alleviate those pain points. We were able to pivot and change the product in ways that better aligned with users, through things like easier price comparison, eco-friendliness of trips, and sharing trips with friends. This led to <strong>increasing repeat order rate </strong>and <strong>customer lifetime value.</strong></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pujq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pujq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png 424w, https://substackcdn.com/image/fetch/$s_!pujq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png 848w, https://substackcdn.com/image/fetch/$s_!pujq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png 1272w, 
https://substackcdn.com/image/fetch/$s_!pujq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pujq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png" width="1456" height="817" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:817,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:410304,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!pujq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png 424w, https://substackcdn.com/image/fetch/$s_!pujq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png 848w, https://substackcdn.com/image/fetch/$s_!pujq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png 1272w, 
https://substackcdn.com/image/fetch/$s_!pujq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe00b72ad-e8b7-4a22-a347-d3f495cb3bfc_2648x1486.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Goals + TEDW question examples</figcaption></figure></div><ol start="2"><li><p><strong>Usability testing (quantitative and qualitative). </strong>I first started with some qualitative usability testing, going through the process of booking a trip and asking people about their experience as they went through it. 
On its own, this study surfaced huge learnings on how we could improve the product experience: clunky filters, no way to link to trips, no favoriting, etc. Once the qualitative side was done and we made improvements, I went on to benchmark the current experience using a task-based usability test, measuring time on task and task success, as well as the Single Ease Question and System Usability Scale to gather more data on usability and satisfaction. We saw people struggle with basic tasks and were able to make critical fixes that helped increase the usability of our platform, which directly contributed to <strong>increasing repeat searches as well as increasing repeat order rate </strong>and <strong>customer lifetime value.</strong></p></li><li><p>Accounts were really difficult to understand because, well, to be honest, our accounts didn&#8217;t really offer any value. You didn&#8217;t need one for anything other than storing your data to use more easily again (e.g., storing credit card information). I deprioritized this and, unfortunately, wasn&#8217;t able to get to it before I left the company. If I could have, I would have probably run some surveys to understand the current value of the account (if there was one), and maybe run interviews with people using competitor accounts to understand the value they were getting from those.</p></li></ol><p></p><h3>Referral - how people share your product/service with others (positive and negative sentiments)</h3><p>Referral is all about people spreading the word about your product/service to other people - either with a positive or negative sentiment. Referrals can be great because, if positive, they can almost feel like free customers. 
</p><p>Referral metrics can look like this:</p><ul><li><p>Engaging with referral campaigns or emails</p></li><li><p>Using a referral bonus or program within your product/service</p></li><li><p>Leaving reviews</p></li></ul><p>When we sat down to talk about referrals, we had some metrics we wanted to start tracking and thinking about:</p><ul><li><p>Increasing sign-ups through a bonus referral program (that we hadn&#8217;t yet created)</p></li><li><p>Increasing our review rating</p></li><li><p>Increasing shared trips</p></li></ul><p>Referrals were a very interesting area for me and one that I wish I had invested more time into, but we had a lot of other priorities at the time. I didn&#8217;t have the bandwidth to dive in and learn how to conduct referral-related research as much as I wanted. However, I was able to sneak in some research when it came to referrals:</p><ol><li><p><strong>Data triangulation. </strong>We had quite a few reviews on our apps, and I decided it might be interesting (and fun, in a sadistic way) to go through these reviews and sort the data. I did this in a very manual way, combing through the qualitative reviews and creating a Miro board with categories and tags. I wish I had a photo for you, but I didn&#8217;t manage to snag one. The categories consisted of:</p><ol><li><p><strong>Pain points/problems</strong> people were coming up against and complaining about</p></li><li><p><strong>Feature requests</strong>, which I needed to dig into more deeply in future research because they were quite shallow</p></li><li><p><strong>Positives</strong> of what we were doing well that we could use more of</p></li></ol><p>Within these major categories, I found patterns and trends and used the number of reviews/mentions to help me weigh them for prioritization. For instance, a huge pain point people encountered in our system was a delayed email confirmation that left people worried they had bought a ticket from a bad third-party app. 
We were able to fix this relatively quickly and easily. This project directly impacted our metric of <strong>increasing our review rating.</strong></p></li><li><p><strong>1x1 interviews and concept testing.</strong> Since one of our main metrics was about a referral program we hadn&#8217;t yet created, I had to start from scratch on this project, which was quite exciting. I held about 15 1x1 interviews about referral programs where I asked really broad questions on people&#8217;s previous experiences with referral programs both inside and outside of the travel sphere (previous experiences with products, even if they are outside of your industry, are much more reliable to ask about than future-based questions). We were able to understand some key pain points and the needs of our customers. With this, we built a concept of a referral program, which I then used to run a concept test with 12 more people. We evolved the concept with this feedback and eventually shipped the feature, which finally let us apply our metric of <strong>increasing sign-ups through referrals</strong>.</p></li><li><p><strong>Usability testing.</strong> We had heard from previous research done in one of our retention projects that sharing trips easily with others was an important feature for our users. We thought that increasing trip shares (especially with people who didn&#8217;t have the app or an account) would be an interesting metric to track because it might boost people&#8217;s motivation to use us. We conducted a usability test on the sharing trips functionality and then monitored its usage. We found, interestingly, that when users shared trips with others, they also shared their referral code, which helped us <strong>increase shared trips</strong> and <strong>sign-ups through referrals </strong>simultaneously.</p><p></p></li></ol><h3>Revenue - how people generate revenue (against costs) for your product/service</h3><p>Now, all of the above feed into and impact revenue. 
If you aren&#8217;t getting new customers and retaining them, you likely aren&#8217;t making money. Revenue can be broken down in so many different ways, such as:</p><ul><li><p>Revenue that exceeds CAC</p></li><li><p>Monthly recurring revenue</p></li><li><p>Minimum revenue</p></li><li><p>Breakeven revenue</p></li></ul><p>There was no one project that I could tie directly to revenue; rather, it was the accumulation of the multiple projects I did <em>with this new intention of directly impacting business as much as I could</em>.</p><p>Because I was able to impact the metrics we determined above, I could indirectly link my work back to any revenue shifts that we saw in the business.</p><h1>Try it!</h1><p>Through this experience, the entire trajectory of my career shifted in such a positive way. I started thinking about projects not just with the user&#8217;s goals in mind but with the business&#8217;s too. It hugely accelerated my career to tie these threads together and show the value of user research through this lens, which stakeholders greatly appreciated. </p><p>I recommend starting with one of these letters/topics and going to a trusted colleague to talk through metrics and potential user research projects that could impact them. This is largely a brainstorming session at first, especially if you are new to metrics + UXR, so a trusted colleague who is open to talking through potential ideas and experimenting is key.</p><p>If you don&#8217;t have any colleagues open to this discussion, try to brainstorm on your own (this is something I did for a while before I found trusted colleagues) and talk to other peers or join a community - <a href="https://www.userresearchacademy.com/uxrmembership">you can check out my user research membership</a> - to get feedback and ideas. 
</p><p>But the first thing is to start with one of the letters and try it - you might not get it right the first time (I certainly didn&#8217;t), but it is a great skill to practice and hone over time!</p><p></p>]]></content:encoded></item></channel></rss>