Continuous Discovery: From Theory to Practice with Framework Practitioners
Continuous Discovery is more than a buzzword. It’s a fundamental shift in how modern product teams stay informed so they can build products that are customer-centric and drive real business value.
Join us for this 60-minute webinar to explore both the theory and real-world application of Continuous Discovery, featuring special guests who have successfully implemented the framework within their organizations.
When: Thursday, April 10
Time: 11:30 AM EDT / 5:30 PM CEST
Why Continuous Discovery?
In modern product development, keeping a pulse on customer needs is no longer optional; it’s essential. This webinar will go beyond theory to showcase how real-world practitioners are embedding Continuous Discovery into their product development processes.
What You’ll Take Away:
By the end of this webinar, you’ll walk away with practical insights for integrating Continuous Discovery into your product development process, including:
- A general understanding of the Continuous Discovery framework – Learn the core concepts.
- Real-world implementation strategies – Discover how leading companies integrate Continuous Discovery into their product cycles without disrupting development.
- Challenges and opportunities in its adoption – Hear from practitioners about common roadblocks, how they overcame them, and what they’d do differently.
- How Continuous Discovery improves decision-making – Learn how the framework has led to better, data-driven product decisions.
- And much more!
Transcript:
[Nate Brown]
All right, well, let’s go ahead and get started. Again, this will be recorded, with the slides available for everyone who attended or those who registered and missed it. We’ve got Sarita already sharing her screen, so we’ll go ahead and jump in here and get started.
All right, now this webinar today is all about continuous discovery, a really hot research method in the UX industry. And we got some experts on continuous discovery here, some of our guest speakers that have themselves implemented this research strategy. So we’re really excited to break it down, share with you some best practices on how to employ this research methodology in your own research practice.
So let’s go ahead and introduce our guest speakers for today. Susan, our first guest speaker, we’re so excited to have her on. Susan is a senior UX research manager at SAGE, heading up the user research team within the performance marketing function. Susan has been at SAGE for about four years now and has since tripled the size of her research team, so she’s had a lot of success over there. The team delivers actionable and tactful insights to the business, influencing strategy and fueling tactical initiatives across the global SAGE team. Prior to SAGE, Susan was the head of design research on a multi-disciplinary design team at Virgin Bunny. Susan, thank you so much for giving us your time today.
[Susan Liu]
Very happy to be here, thanks Nate.
[Nate Brown]
Amazing. We also have my good friend, Dr. Kyle Gibson, who is an anthropologist and freelance principal UX researcher, formerly at both Meta and 1-800-Flowers, where he conducted generative and evaluative research using mixed methods. He focuses on helping people strengthen their relationships through gifting, altruism, and exchange. And I can personally say he has great recommendations for New York restaurants. Kyle, thanks so much for joining as well.
[Dr. Kyle Gibson]
Thanks, Nate. Good to see you.
[Nate Brown]
Awesome. And then lastly, we have Userlytics’ own Sarita. Sarita is a principal UX research consultant here at Userlytics with a background in psychology, socio-cultural studies, and market research. She speaks a few different languages, Spanish, English, and French, which helps her understand different user experiences and different cultures. At Userlytics, Sarita uses a variety of methods, and a key part of her role is sharing her UX research knowledge through conference talks, consulting calls, and webinars just like this one. Sarita, thank you as well for your time today.
[Sarita Saffon]
Thanks, Nate.
[Nate Brown]
Awesome. And lastly, there’s me, Head of Channel Development here at Userlytics and your host with the most on today’s webinar. But let’s go ahead and kick things off.
All right, now we wanna do a quick poll first just to get everyone interacting and comfortable. So we’ll go ahead and get the first poll going here. All of the attendees today, go ahead; we wanna know how often you personally talk to your customers. Do you maybe never get the chance to? Is it pretty infrequent, only a couple of times per year? Or maybe you’re a daily talker.
Looks like we have some results rolling in and we’ll be sharing those shortly. We wanna give everyone a chance to answer those first. But again, as we’ll learn today, Continuous Discovery is all about a frequent, consistent cadence of speaking with customers. And again, we’ll break all that down, but let’s give it just a few more seconds for everyone to add their answer in.
And I will go ahead and end it three, two, one, and let’s see how the results fared. Looks like a little bit of a mixed bag, but most popular would be about every quarter. Kyle, Susan, I’m actually interested to hear yours. I know that you didn’t have a chance to answer in the poll, but also interested in kind of how often you guys are talking to your clients.
[Susan Liu]
Yeah, I guess I’ll go first. So I have a team of five user researchers and typically we aim to deliver at least one study per month. So if you multiply that by five, that’s five studies a month. And within those studies we’ll be speaking to anywhere from six to maybe 50 people. So we do engage with users quite often as part of our work.
[Dr. Kyle Gibson]
And then I’ll be talking about two different channels of continuous discovery that I take part in. One is my own work, where I’m talking with people hopefully weekly, usually a couple of people a week, I would hope. Those can be internal users, external, wherever. And then also leading what I call a listening lab, which is a small-d democratization of research at the company, where I have product owners talk with people, which is sometimes terrifying, but terrifying for them, and we’ve got some processes to make it easier. That happens every two weeks. They’ll talk with one person, and I’ll talk later about how we generalize that further.
[Nate Brown]
Yeah, amazing. Very excited to hear about that, Kyle. I want to make a quick note before we jump in here. So again, we have the chat, so feel free to connect with each other in the chat during the webinar. We also have a Q&A. If you have questions, we’re gonna be doing Q&A at the end, so you’ll be able to ask your questions to our guest speakers. And I love the Q&A; I think that’s a really interesting part, hearing from them on the fly with the different questions you have.
And then Sarita, also interested to kind of hear from you on what you would have put in the poll. And then I’ll kind of pass the slides over to you.
[Sarita Saffon]
Definitely. Well, for me, I would say weekly, because this is what we’re going to be talking about in this webinar, but the more the merrier. That’s how I would summarize it. The more you know your customer, their pain points, their desires, their needs, the more you’re going to give them the experience that they really, really want, and that’s what will eventually engage them the most with your brand.
Great. So thank you, Nate, for that very exciting poll. And thank you, Susan and Kyle, for joining me in this very exciting webinar.
To start, even though I would not consider myself a continuous discovery expert, I did want to give you a brief overview of the continuous discovery framework based on Teresa Torres’s book Continuous Discovery Habits, so that those in the audience who have not yet read the book or who are not familiar with this framework have context for what we are going to be talking about today and why precisely we asked about the frequency of your conversations with your customers.
But to understand continuous discovery, I think it is important to go back a little and define what discovery is. This matters because most of our audience today probably work in a product team, so you may already be aware of the definition, but some of you may be researchers or hold many other roles.
So it is important to understand what product discovery is as a part of the product development life cycle, because this is where the continuous discovery framework should be applied. Tim Herbick, a product management coach, author, and speaker, defines discovery as the process of reducing uncertainty around a problem or idea to make sure that the right product gets built for the right audience.
And this definition is really interesting, but it leaves us with a lot of questions. How do we reduce that uncertainty? How do we make sure that we are building the right product? What is the right product? And this is where the Continuous Discovery framework comes in.
So the book Continuous Discovery Habits proposes this structure that we’re seeing on the screen, called the Opportunity Tree, which is like a step-by-step guide to help product teams organize their discovery time and efforts, answering those questions that I just mentioned and many more, and specifically to avoid making wrong decisions and wasting time and resources.
So yes, this framework talks about research, but it also proposes other activities that the product team should do. And the research done in this framework comes well before the usability testing that you’re accustomed to doing on the Userlytics platform. This research is done to inform strategic decisions before deciding, building, or testing anything. The framework indicates the need to engage with the customer throughout the discovery phase, from the beginning, where you’re tackling the initial question of what to build, or whether to build something at all, all the way to trying out different solutions before creating any kind of prototype.
The first step in this process, as we can see here, is defining your desired outcome. The outcome should be established by the product trio, which the book explains is the combination of the product manager, the product designer, and the developer. The three of them should work together to define this desired outcome, and it should be focused not only on what the team needs to do and deliver, but on the impact and value their actions will have on the company and their customers, on the change it will create in both business metrics and user behavior. So instead of telling product teams what to build, the output, outcome-driven companies empower their product teams with autonomy, trusting them to find their own direction. Rather than assigning a fixed list of features to complete by an exact date or having them write code without any purpose, these teams are challenged to solve real customer issues or meet meaningful business goals.
Because of this, a desired outcome has to be framed in a customer-centric way. So rather than giving the team an output of “release a referral program,” the outcome should be “drive user acquisition through word of mouth.” Additionally, it is important to tackle one outcome at a time, because as you saw on the opportunity tree, working on an outcome requires several activities and this needs focus. So you need to work on one at a time.
The first activity is finding the opportunities within that outcome you decided upon. This is actually the most important space for research itself, because this is where you need to explore the customer needs, desires, and pain points, and it is divided into different parts. First of all, it is necessary to map out everything you know, or think you know, about your customers. Then you can refine that map and change things as you learn more about them.
But how do we do this? Precisely by interviewing customers. The framework indicates that it is the product trio that should talk to customers at least once a week, hence my answer earlier. This is because the opportunity tree should be fed constantly, so that you can keep modifying it and see which opportunities are really emerging along the way. These interviews should not be long or heavily structured. They are not like a moderated test that we run for you, for example. They are more informal conversations that a product team member has with a customer or potential customer, based on one or two questions they need to answer or explore that exact week about a specific need or point they are trying to explore or validate. This should then become automatic, part of the weekly team routine.
However, as we all know in research, the most important part, but also the hardest part, is actually recruiting the users. This is where Userlytics comes in, offering recruitment from our proprietary panel and scheduling of those interviews, so that you always have users waiting to talk to you. Therefore, as the book says, it becomes easier to interview than not to interview. But we’ll talk about that in more detail at the end of the webinar.
Some of the tips that the book gives about conducting these interviews include differentiating your research questions from your interview questions. Do not ask customers what their pain points with your product are or what they want the product to offer, because this is actually going to give you the wrong idea. Ask them about their last experience using a specific feature of your product, or purchasing your product or a similar one if they are still potential customers. Neuropsychology studies suggest that when we’re asked a hypothetical question, our left brain comes in and answers with how we think we should do something, or what we imagine we would or should do, instead of how or what we would actually do. So it is recommended to ask customers about an experience rather than how they would do something in the future, so that you can learn what they would in fact do and not what they might rationalize that they would ideally do. Then you can start digging deeper from there: what happened before and after that, who they were with, what challenges they faced, and what they did to resolve them.
Additionally, you must not get ahead of the opportunity tree steps. Ask them about their needs and pain points, not yet about the solutions that I know everyone already has in mind for addressing those needs and pain points. If they do propose a solution, turn it around and ask them: why would this be helpful for you? What would that do for you? That way you can discover the underlying need, the underlying pain point, and the actual opportunity that you’re looking for.
This framework also offers an interview snapshot, which we can see on our screens, suggesting that the product trio member summarize and synthesize the most important information gathered throughout the conversation, present it to the rest of the team, and update their opportunity tree.
However, not all of the opportunities should go on the tree. Only those that really represent a customer need, desire, or pain point, that come up frequently in customer conversations, and that really drive the desired outcome that you have set. Also, and most importantly, they should not be solutions.
This can be distinguished by asking, for example: is there more than one way to address this opportunity? If there is, it is actually an opportunity. If there is only one way, then it is actually a solution. And this matters precisely because the next step is committing to one of those opportunities and looking for the multiple solutions it could have.
For this, it is necessary to ideate individually first, focusing on creating as many ideas, as many solutions as possible. And then as a team, you can evaluate those solutions and choose three of them. You can use, for example, voting, and the important thing is that at the end, you get three, even if it requires multiple rounds.
Then the team should identify the hidden assumptions of that action, of that solution. And this can be done, for example, by doing a story map, conducting a pre-mortem, or actually walking backwards on your opportunity tree to see if, for example, customers really want this solution, or if there is any potential harm in carrying this solution out.
We then come again to research in the experimental space, where the assumptions are mapped on the grid that we can see here, based on how important each assumption is to the success of the idea and how much evidence you have to confirm or refute it. You should then test the assumptions that are critical for the solution but that you know little about.
This should be done with quick overnight tests or surveys so that you can collect evidence very fast. This is not to decide whether you go with the idea or not; it is to evaluate different ideas, and then you can decide.
This is actually the last step of the framework. As I mentioned, this is a quick overview, just to give you the main concepts that we’re going to be discussing with Susan and Kyle and an idea of the process that the framework proposes.
Also, it is important to say that this is not a linear process nor a simple process because you can then explore different solutions, different opportunities, and different outcomes. So as the name suggests, it is a continuous process.
I’d like to finish precisely with the definition that the book gives of continuous discovery, to conclude with the overall message that the framework leaves, at least for us: throughout the different opportunity tree spaces or steps, constant contact with your customers and small research activities by the team are vital in order to reach your outcome and to make those strategic decisions.
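For readers who think in code, here is a minimal, illustrative sketch (not taken from the book or the webinar) of the hierarchy Sarita walks through: an outcome at the root, opportunities beneath it, candidate solutions under each opportunity, and assumption tests under each solution. All names and content in it are hypothetical.

```python
# Illustrative sketch only: a toy representation of an opportunity tree
# (outcome -> opportunities -> solutions -> assumption tests).
# All content here is hypothetical, not from the book or the webinar.
from dataclasses import dataclass, field

@dataclass
class Solution:
    idea: str
    assumption_tests: list[str] = field(default_factory=list)

@dataclass
class Opportunity:
    need: str                      # a customer need, desire, or pain point
    solutions: list[Solution] = field(default_factory=list)

@dataclass
class OpportunityTree:
    desired_outcome: str           # framed around customer/business impact
    opportunities: list[Opportunity] = field(default_factory=list)

tree = OpportunityTree(
    desired_outcome="Drive user acquisition through word of mouth",
    opportunities=[
        Opportunity(
            need="I want to recommend the product without sounding like an ad",
            solutions=[
                Solution(
                    idea="Shareable results page",
                    assumption_tests=["Users actually share results with friends"],
                ),
            ],
        ),
    ],
)
```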
So the next part of our webinar is precisely talking to our clients who are actually implementing this framework, so that they can share their experience, their challenges, and the different learnings they have from implementing it in their teams. But I do think, Nate, that you have another poll coming up.
[Nate Brown]
Yeah, wonderful. Let’s go ahead and kick that one off. I’ll go ahead and launch it now. Just want to say a huge thank you to Sarita. I found that super valuable. I love the insights you have and kind of just breaking it down for us.
But we want to hear from everyone on this next poll. What is your biggest challenge in product development right now, right? So kind of understanding what are some of the challenges that you and your team might be facing as we kind of digest the learnings that we just got from Sarita.
So I’m super excited to hear from both Susan and Kyle next, but we’ll give everyone a few more seconds to fill out their poll here and then we’ll hear kind of what are some of the challenges that people are facing.
As we’re waiting, I also want to mention that we do have the Q&A open, and the Q&A will happen at the end of the webinar. So if you have a question about maybe something Sarita covered, or you want to ask Susan and Kyle a direct question about how they’re implementing this practice, please, please throw that in the Q&A and we will definitely get to those towards the end.
I see a good question there already.
All right, now let me go ahead and end the poll here, 3, 2, 1, and let’s see what everyone said. So it looks like again, a good mixed bag. It looks like at the top balancing speed with quality. And so I actually love that answer because I think that is a challenge that continuous discovery addresses itself.
Sarita, would you agree?
[Sarita Saffon]
Yeah, definitely. I think that precisely the fact that you are doing this continuously is going to take off the pressure of going fast. And yet you are going fast, because you’re gathering information as you go, every single day.
[Nate Brown]
Hey Susan, Kyle, wondering if any of these stick out to you or maybe if there’s a different challenge that your team faces in kind of the product development stage?
[Dr. Kyle Gibson]
I think balancing speed with quality is always going to be on the list, you know, it’s real-world research, so that comes with it. But Sarita, you’re absolutely right. You’re going to be doing this so often that you develop a huge backlog of qualitative information that you can pull from at any time. And so you can say, yeah, I remember when we were interviewing this person six months ago, they mentioned this. And if you’ve got 30 or 40 interviews that you can go back to, you have real data that you can always pull from and insights that are much better than, you know, staring into the abyss.
[Susan Liu]
Yeah, for me, definitely, it’s all of the above, but balancing speed with quality is definitely one of the ones that stands out to me as well. And also how we use user research insights to help prioritize features that marry up with the business goals as well. So that’s always quite an interesting one.
[Nate Brown]
All right. Well, I appreciate that. Let me go ahead and give this back to you, Sarita.
[Sarita Saffon]
Great. Thank you, Nate. Well, these results are very interesting. They actually give us a very good starting point to talk to our guests today. Once again, Susan and Kyle, thank you so much for joining us and sharing your experiences with all of our audience today.
So to follow up on our poll subject, what would you say were the biggest challenges you faced during the implementation of the Continuous Discovery framework in your team, and how did you overcome them? Let’s start with you, Susan.
[Susan Liu]
Thanks, Sarita, and thank you so much for that introduction to Continuous Discovery. I feel like I’ve also learned so much from it already as well, so that’s great. So the biggest challenge I think we had to overcome is just overthinking it, being apprehensive about how to fit it in, not knowing what you don’t know. In the end, I think the apprehension actually helped us. We carved out time for it and we executed against that plan, so we stayed committed to it. To help us create the plan, we first identified our goals. We discussed as a team what the process could look like with a view on the desired outcomes. So it was important to have an idea of what success looks like as that helped us keep on track.
[Sarita Saffon]
Interesting. Thank you for that. How about you, Kyle? What are those challenges that you faced implementing this framework?
[Dr. Kyle Gibson]
The tactical challenges come from recruiting, as you mentioned earlier. So that’s why it’s really nice to have a platform like Userlytics. And we use it to select people in and out of studies based on screeners and things like that. It’s awesome for that.
And other challenges are, I alluded to this before, but getting people comfortable with the process of interviewing people themselves has been a challenge. And it’s not because they don’t want to—because when they’re talking about their own product, potentially at the company or specifically at the company, they’re super curious about it and they’re invested in it. But again, if you say, hey, you’re going to be talking with a stranger for 30 minutes and there are going to be 25 people observing and who are your colleagues, that’s daunting, right? For anybody, if you haven’t done it before, particularly for junior PMs and things. So I’ve instituted for that program a light version where we’ll get somebody, a Userlytics panelist, but it’ll just be them and me and the person who’s doing the interviewing. And that makes it a lot easier because you just need to break the ice and realize that it’s just a conversation.
And the way that these continuous discovery interviews go: the framework says one to two questions, but I always write down four to six, I would say, and we can follow that. But it’s really about what the PM is interested in. What do you want to get out of this conversation? That’s the conversation we had beforehand. And so in 30 minutes, you can meander and talk. You’re not in a huge rush. And they realize that it’s fine. You’re just having a conversation.
[Sarita Saffon]
I love hearing how you’ve been adapting a little as you’ve been doing these continuous interviews. Has there been any other unexpected learning or insight uncovered during that process?
[Dr. Kyle Gibson]
Yeah, I’ll just keep following on from that, I guess. So when we do this live and we do have observers, I have everybody in a FigJam and they’re taking notes. And one of the places in the FigJam that I have set aside is, “Did anything violate your intuitions?” And I think that, as a researcher, you know, they say that not all great discoveries are announced with the word “Eureka.” It’s more like, “Huh,” you know, or “That’s interesting.” And that’s what that is. It’s: what surprised you? That’s where the real insight comes from. And particularly, again, when you’ve got somebody who’s a PO or a PM, and they are the expert in their subject and something surprises them in an interview, that’s the lightning that we’re trying to capture there.
[Sarita Saffon]
Great, thank you. Susan, you explained to us a little of the process you took to implement the framework in your team. If you were starting over, what would you do differently, if anything?
[Susan Liu]
I think if we were starting over, we would try to be a bit more clear about the outcomes that we want to achieve to better focus our interviewing and note-taking; to focus more on the wider behaviors and outcomes we want to influence. I feel like we lost sight of that over the course of time as we had a habit of being so laser-focused on the point of conversion that sometimes it was difficult for us to stop trying to drive the conversation on how to make them convert more and just take a step back and have a wider conversation about the individual’s journeys.
In an ideal world, I think we would plan in reviews and revisit them periodically to remind ourselves of why we are adopting the continuous discovery approach, in order to assess whether it’s still appropriate to proceed with. And as part of the analysis process, we highlighted key opportunities in the context of an overall customer journey, because we thought that this was a good way to help us frame ideas and surface opportunities at a journey level. I think we would want to explore this a bit more as part of the analysis process. I believe it’s a very compelling format for contextualizing and framing opportunities alongside the Opportunity Tree.
[Sarita Saffon]
Could you share maybe an example of how your team implemented continuous discovery in a real-world scenario?
[Susan Liu]
Yeah, definitely. We decided to adopt the continuous discovery methodology to inform our understanding of the human needs that drive small business accounting and payroll prospects—because we sell finance software on our website—and what happens during their consideration stages of purchase decision-making.
So we decided to run bi-weekly interview sessions, two sessions every two weeks basically, which entailed one-to-one depth interviews with representatives of the target audience. These typically lasted between 45 and 60 minutes. Initially we planned for 30-minute sessions, but I think once we got going we found that 30 minutes definitely wasn’t long enough for a deep and purposeful conversation.
Just like Kyle, we assigned roles and responsibilities to everyone involved in the process. Everyone who was observing had a designated role to play. So we had note-takers. We had people responsible for recruitment scheduling, setting up the sessions on Userlytics, getting in touch with Userlytics to ensure the recruitment would take place smoothly. We even had someone in the team who was like a continuous discovery principal who would provide any feedback along the way about the methodology because we wanted to ensure that we were tracking back to the methodology as intended.
After each research session, we organized group-level debrief sessions to consolidate learnings and to create those one-page snapshots, which were super handy for socializing and talking about and drawing back on the learnings from each session. They were very, very easy to share with our stakeholders and they provided a synopsis of what we learned in each of the sessions.
After a period of time, we would get together as a project team to review the process, brainstorm ideas for improvement and actions, and create action plans for taking insights through to actions. And these would then be fed into our normal BU delivery pipelines for implementation or further discussions.
[Sarita Saffon]
Nice, it seems like you have everything set up. Are there any challenges that you still face or things that you think that you still need to improve?
[Susan Liu]
Definitely, there are constant improvements along the way. To be honest, we’re still quite early on in this methodology and I think fitting it in is still a challenge for us and just adapting our mindset to this idea of doing continuous interviewing I think is still a challenge for us.
[Sarita Saffon]
Okay, great. So Kyle, can you also share with us maybe how your team implemented the continuous discovery in a real-world scenario?
[Dr. Kyle Gibson]
Sure, yeah. And good point, Susan. Forty-five minutes is the sweet spot, I think, on these interviews. You don’t always have to go that long, but it’s terrible to find yourself having a great conversation and then being at time, you know? So I’m with you.
Yeah, recently—so here’s a great example of continuous discovery at work. I was working with an external team to develop an app. And I don’t know who here does agency work or has worked with agencies, but it’s a little bit of a different experience than doing things all in-house and having people right there.
We were developing an app and we hadn’t seen even screens of it for a while and we finally got a prototype—and it was an App Clip actually, so in the Apple Store. It wasn’t even a fully built out app, it was a semi-functional prototype, and we knew that we wanted to move really fast on this, so this is the time to start implementing continuous discovery work and doing things really iteratively.
So what we did was start interviewing people about what they saw in those app screens, basically the app shots, and then eventually the App Store screenshots, and how they felt about it. Was the information right? Did it seem like a good hook? Was the value prop there? All of those things. And that helped us to build better messaging, essentially, and better positioning leading into the actual app release.
And so the engineers in the outside firm are building, building, building away, and we’re collecting all this information on things that we’re going to change before this thing launches, even though they’re doing that. Then when the app actually does release, we continue on with continuous discovery, asking people about the onboarding process—does this make sense to you?—getting it in people’s hands as a release, but still something that’s not released to the public.
One of the neat things you can do on the App Store is actually gate things with a password so that people can download the app and play around but they can’t actually access the features until they enter the password in. So you can hold things off so you don’t get a bunch of bad reviews if you’re not fully baked yet.
Anyway, you get that in people’s hands, they start playing. We realized that the onboarding process is actually really overly complex. There are too many onboarding screens. It’s over-indexing on getting the value prop in front of people instead of just letting them use this. And then there were also some issues with importing contacts and some permissions issues that came along with iOS—which is not a bad thing—but some usability and security often don’t go hand in hand, right?
So what we realized was that we needed a wholesale redesign of the onboarding process and needed to make it super quick, super streamlined, really easy to get in there and get using it. Because what was happening is that people were downloading it, they’d play around with it for a second, they would get a bit frustrated and then go, “Meh.” And if that happens, people are never coming back. You’ve lost them forever.
So we completely redesigned the onboarding process and put it in a new version, got that out, and that is now what’s in the wild. And then we’re continuing to do discovery on that at a weekly release rate. They release every Friday. We try to get something, talk to some people, get some feedback, get that back to them by the next Wednesday so it can go into the next build or the build after—something like that. And it uncovers big showstoppers too, that you’re like, this is a P0 on your priority list of things because it really breaks the experience.
[Sarita Saffon]
So I think that I can conclude that you find this framework really useful. But overall, why do you think continuous discovery is essential in a modern product development team?
[Dr. Kyle Gibson]
Because you have all this context, because you’re talking with people all the time. The information isn’t coming out of nowhere. You’re seeing how the people you’re interviewing and talking with respond differently based on the product you have in front of them, which is changing all the time.
It makes it very clear that the people we talked with last week or two weeks ago didn’t understand what was going on, and the people we talk with now do. And if you can show that, or if you have a PM or a PO in the room, that’s good.
And I also say—I don’t remember who came up with this thought—but it’s that PMs particularly need to see what “no” looks like and engineers need to hear what “no” sounds like. So if they’re in the room, they think that they’ve got the greatest idea ever made— a lot of us do, we put a lot of time into it—and they think, oh, everybody’s going to like this. And you start putting it in front of people and they’re like, “I don’t get it.” If I tell you that as a researcher, or a colleague tells you that, they’re like, “Nah, you just don’t get it.” But when a random person tells you that, it means something much different. And it’s important for people to hear what “no” actually sounds like.
[Sarita Saffon]
Susan, why do you think this is essential? Or how would you convince someone else that this is actually something that they need to apply in their team?
[Susan Liu]
I think the idea behind continuous discovery is an important concept for any user research to explore, as it provides a way for us to have continuous access and conversations with the customer. It’s a methodology that is particularly effective for getting to the human need behind the experience that they see on the website.
It appropriately uncovers the emotional experience and the human side to what is driving the individual who is in the process of making purchase decisions. Quite often when we’re crafting online experiences it can be quite difficult to put yourself into the emotional experience of that interaction. And we felt like continuous discovery—because it’s participant-led and it’s interview-based—you’re really trying to learn about those human aspects of their experience.
Importantly, it helps us uncover the customers’ stories so that we get to know them better as people. And I think it’s about staying in touch with our customers so that we remain in tune with their habits and so that we are identifying behaviors and attitudes and trends over time as opposed to trying to understand them at a particular point in time when we want to and when we need to.
[Sarita Saffon]
So what would you say has to happen in a team in order for the need for continuous discovery to be triggered?
[Susan Liu]
I think one of the reasons why I love being a user researcher is you often find gold when you have meaningful conversations— in particular the types of discovery-led conversations. We’ve had a user research function in place for four years and we carry out a wealth of user research. We wanted to theme our research in a way that told the story of the users from their perspective as another way to inject that empathy into the solution creation process to make sure that we were building the right thing but also building it correctly.
And I think speaking to the user needs is particularly powerful when we are thinking about crafting copy, content, and messaging that resonates with the end user. So together as a team, we agreed that the continuous discovery framework was the most suitable methodology to inform and validate our assumptions and to use it as a means to populate our user needs log.
[Sarita Saffon]
Yeah, obviously, I agree as a researcher. I do think that keeping the customer first is always very important. But we know that when working in a team where business comes first, it’s sometimes hard to get across the idea that activities with customers should be prioritized. So I was wondering how continuous discovery helps the team balance that customer-centric focus while also driving business value. What do you think, Kyle?
[Dr. Kyle Gibson]
Again, that’s a classic scenario that’s always going to exist—the business needs versus the research needs. I came from academia and that’s where we could spend, you know, two years doing a project that doesn’t apply in industry, right?
I think one of the ways to really get this across is—depends on your company, depends on who they are—but if you’re working closely with marketing or if you’re part of a marketing org or even the business org too, what you can do is generalize some of these findings by taking what you hear over the course of, say, a week or a month, and turning it into a survey and putting some numbers around it.
You can say, we heard these things, XYZ violated our intuitions, we hypothesized these things; survey 300 people, put some confidence intervals around things, maybe find some relationships, do some T-tests—whatever you need to do. That really helps to generalize. Because again, one interview is not research, right? It’s sort of research, but kind of. To generalize, you’re going to need at least 10 of these in a pretty short run about the same thing. For a lot of people, the numbers really work well. So I encourage mixed methods wherever possible, of course—but I’m very biased.
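For anyone who wants a concrete picture of the quick quantification step Kyle describes, here is a minimal sketch, assuming survey responses are coded as 1-to-5 agreement scores and using SciPy for the confidence interval and t-test. The data, segment names, and sample sizes are purely hypothetical.

```python
# Minimal, illustrative sketch of putting numbers around interview findings
# with a follow-up survey. All data below is hypothetical.
import numpy as np
from scipy import stats

# Hypothetical Likert-style (1-5) agreement scores from a follow-up survey,
# split into two segments you might want to compare.
segment_a = np.array([4, 5, 3, 4, 4, 5, 2, 4, 3, 5] * 15)  # e.g. existing customers
segment_b = np.array([3, 2, 4, 3, 2, 3, 3, 4, 2, 3] * 15)  # e.g. prospects

# 95% confidence interval around the mean agreement for segment A.
ci_low, ci_high = stats.t.interval(
    0.95, df=len(segment_a) - 1, loc=segment_a.mean(), scale=stats.sem(segment_a)
)
print(f"Segment A mean: {segment_a.mean():.2f}  95% CI: ({ci_low:.2f}, {ci_high:.2f})")

# Welch's t-test: do the two segments actually differ on this question?
t_stat, p_value = stats.ttest_ind(segment_a, segment_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```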
[Sarita Saffon]
What do you think, Susan? What shifts in the business—roles or mindset—are needed in order to adopt this continuous discovery approach?
[Susan Liu]
I think one of the biggest shifts in mindset is moving from a project-based, time-bound, one-off research model to an ongoing, iterative approach where user feedback is consistently integrated into decision-making. This shift highlights the importance of being flexible and responsive to new insights rather than waiting for large formal research reports to be delivered at the end of a traditional user research study.
I think it helped us to be more lean in our execution and delivery of user research, which kind of helped mitigate against that speed-versus-quality consideration. And I think this type of approach encourages greater collaboration through multidisciplinary involvement in the research process, where designers, engineers, product marketers, or developers can join in the interviewing process.
I think this helps spark a better shared understanding, which lends itself to a faster and more accurate translation of insights.
[Sarita Saffon]
It’s interesting that we’re going back to that poll, the battle between speed and quality. So what advice would you give teams that are struggling to balance the discovery work they’re already applying with the pressure of delivery deadlines?
[Susan Liu]
I would say don’t treat discovery as a phase that only happens at the beginning or right before the delivery project is kicked off. Try to treat it as a recurring part of the team’s workflow. So it’s a shift in mindset.
Choose an interview cadence that is sympathetic to your team’s availability, schedules, and capacity. The way my team got started last year was we decided, as a team of user researchers, to set aside a small amount of dedicated time—couple of hours a week—for learning activities and new techniques. That’s how we got started.
From the output of that work, we worked hard on socializing and evangelizing the outputs from the research. We were able to track the research insights right through to action. And we were actually able to report on the revenue that was generated as part of the opportunities that came out of our research.
[Sarita Saffon]
And how about you, Kyle? What would you say are the keys and recommendations to integrating continuous discovery into the product cycle without disrupting ongoing development?
[Dr. Kyle Gibson]
Someone has to own it. That’s the big thing. You’ve got to have somebody dedicated—or a team dedicated—but it’s got to be on people’s calendars. They have to be getting recognition for it. You can’t just say, “It would be nice if this happened. We hope you do it,” because it will just fall by the wayside.
At 1-800-Flowers, they’d been wanting to do something like this for years. The VP of UX and Design and the SVP of Product—the SVP had been a startup founder, went through a lot of acquisitions, and was used to talking with hundreds of people a year. He was a big fan of this methodology. While he was at Flowers, that wasn’t happening, and he was a real advocate and champion for it.
They had a junior UX/marketing person whose job was to find out what people were doing before big holidays were coming up—what they were feeling, things like that. It kind of happened quarterly, but it became clear that wasn’t enough.
When I was there, it was: we need to start doing this every week or every two weeks or whatever you figure out. And I said, okay, glad to figure it out—just need to put that on my roadmap, that I’m managing it.
Once you start getting people involved, particularly if it’s a public thing where they can observe and participate, it really becomes a community event. People are excited about it. They want to show up and talk about it. They buzz about it for the rest of the day and into the next week.
I would make it my role to monitor it—everyone’s on a chat together, they’re all chatting about what they’re hearing, taking notes. It’s really neat. And then I would gather everything people had said and the notes, synthesize, and report out: boom, boom, boom—this is what we heard, this is where we’re going next, this is why this matters, and so on.
[Sarita Saffon]
What would you say are the minimum skills or knowledge that a product team needs to start implementing this framework?
[Dr. Kyle Gibson]
You have to know how to craft decent interview questions and basic interviewing skills. Questions that aren’t leading—that sort of thing. Again, you don’t need to write a lot of questions for this, but you do need to understand: why do we want to talk with anybody at all? What’s the goal here? Write some questions and then, for any interviewer who’s just starting out, being quiet is the hardest thing.
The silent probe, as we call it—everyone wants to fill the space. But if you do, the person you’re interviewing will not. Sometimes you almost make them nervous so they fill the space talking with you—not in a bad way, of course—but it takes practice. You just have to get people in the room and doing it, and they build confidence. It’s just like anything else.
[Sarita Saffon]
Susan, thinking back on the habits the framework requires, which ones have been most critical for supporting your ongoing discovery within your team?
[Susan Liu]
I definitely agree with Kyle there. Just having that interview approach and not being led by a test script or an interview script is quite a challenge. Establishing that habit where you’re genuinely just curious about the person in front of you—being led by their responses to your initial question—rather than asking them question after question and simply ticking off a list.
For me and my team, the most critical habit was establishing a regular, ongoing cadence of touchpoints with customers from the perspective of continuous learning. That consistent engagement forms the heartbeat of the continuous discovery process. It’s incredibly powerful for uncovering opportunities for innovation and building the confidence we need to pursue bigger, bolder ideas.
[Sarita Saffon]
The book has a section about metrics. What type of metrics do you use in your team to measure continuous discovery success?
[Susan Liu]
We lean on the digital experimentation process to help us quantify research-led opportunities and outcomes, so we feed into a rigorous and robust A/B testing, optimisation, and personalisation process. The metrics we leverage quite often: we define our value, outcomes, and return on investment through conversion and revenue generation, as well as engagement metrics such as click rates and bounce rates, exposure rates and attractiveness rates, which we gather through Contentsquare and Google Analytics. Additionally, we track improved digital satisfaction and increased quality-of-experience scores over time.
[Sarita Saffon]
Kyle, I see you nodding to some of these metrics. Do you use these as well, or do you use others?
[Dr. Kyle Gibson]
Yeah, all of those. In addition, we always try to root every PRD—every product requirement document—in a user problem. It says it right up front. At Meta they called these “people problems,” which I love. Every PRD starts out with, “What problem are you solving for people?”—which is different from a problem statement and not always 100% applicable, but we try.
We try to develop metrics that get at that, which is tricky because some are derived. You use bounce rate as a proxy and conversion to get conversion rates and then a billion other metrics—but you can easily get analysis paralysis.
It’s often best—if you can—to partner with DS and derive a metric that makes sense, and iterate on that over time. It’s an art and a skill for them to come up with this stuff. But if you can measure the behavior on site, you can often derive a metric.
[Sarita Saffon]
How would you say continuous discovery has improved your team’s decision-making in the product development process?
[Dr. Kyle Gibson]
I’ll go back to something I said earlier. If I’m working in a continuous discovery environment and we learn that something isn’t going to work—or something works better in a different way—it’s not me saying it. This is empirical evidence. This is what I have.
You have to have thick skin as a researcher sometimes because people can get annoyed with you for bringing these things up. But you just have to remind them to back up and go through the experience themselves again. Try to have fresh eyes, like the people we interviewed did. More often than not, they come back and say, “Oh yeah, we have a problem.”
[Sarita Saffon]
I wanted to finish by asking you both: how has Userlytics helped you implement this continuous discovery framework?
[Susan Liu]
For us, we leaned on Userlytics—our trusted partner—as the platform for carrying out the continuous discovery sessions. We also relied on Userlytics for recruitment. We knew from experience we could count on Userlytics to fulfill quotas quickly and on a prolonged, regular basis. You help us so much with recruitment. The Special Ops recruitment service is invaluable to us; without it, I don’t think we would have as many smooth user research sessions as we do. Userlytics has been an invaluable operational partner throughout for us. Thank you so much to everyone we’ve worked with at Userlytics, Sarita.
[Sarita Saffon]
Amazing, thank you, Susan. How about you Kyle? What are the benefits of using Userlytics for implementing this continuous discovery framework?
[Dr. Kyle Gibson]
For me, recruitment—again—it’s amazing. I’ve never not found the right person with Userlytics, which is awesome. One of the things I like about Userlytics participants is it’s not their first rodeo doing user research. They’re not coming in completely cold. You can ask them anything within your research scope.
Something we do live all the time is ask them to share their screen and go to Google and search for whatever your product is—search for flowers, search for steel beams, whatever—and you can watch their entire flow live, which is really cool. It’s all recorded too.
And then, particularly for the larger process I do with PMs, there are blind observers. Participants know they may be observed, but it’s not like a Zoom call where they see 25 people in the background, which would be incredibly intimidating and would change your results. The blind observer link is huge.
[Sarita Saffon]
Precisely, we wanted to share with the audience how Userlytics offers different services to help apply this framework in your teams. Even though our core is the platform, we also have— as Susan mentioned—our operations team and our UXC team that can help with implementation.
We want to show a little bit of what we can do to help this transition go smoother, regarding the challenges Susan and Kyle mentioned. Because we go beyond usability, we want our clients to see us as a partner, not only in the usability testing phase, but way before that.
Because of this, we can help with continuous interviewing—those weekly conversations with your customers—throughout all the opportunity-tree spaces. You can count on us for the services you might need. In the platform, you can conduct those unstructured weekly interviews and gather all the necessary information to build and refine your opportunity tree constantly.
As the book suggests, automate the recruitment process—precisely with the Userlytics panel, as Susan and Kyle mentioned. Also, speaking for myself: the framework notes that the product team should do the continuous interviewing. But if there’s any week you can’t carry out a session, don’t lose that user or that slot. Our UX consulting team, like myself, can help moderate those sessions so the participant is still interviewed and the information keeps flowing continuously.
We can also provide the interview snapshot to gather with your other information, capturing the most important data from the interview we carry out for you. And you’ll also have the recording in your Userlytics account to check later.
So, going on to pricing, I’ll leave Nate to give you all the details.
[Nate Brown]
Yeah, thanks so much, Sarita—and Susan and Kyle as well. So much good information. Maybe you’re sitting in the webinar now and you’re like, I love what Susan and Kyle have implemented. I want to do this on my team. But maybe you’re not sure where to start. As Sarita mentioned, Userlytics is here to help—not only on recruitment, but also to help you get started with the process.
Sarita and her consulting team can basically start this process for you. You can sit in as observers and see how they do it, connect with the researchers—why did you ask this or that question? It’s a great learning, potentially even a training experience, to help you get this off the ground. Or maybe you just want to outsource it totally.
We have some numbers here. Again, we’ll provide these slides afterwards, so I don’t need to go through each specific number—but an eight-week process or a sixteen-week process, where we can handhold as much as you want through building that process and moderating those sessions for you.
And then, on the next slide, we do have an offer since we’re so thankful you joined us today. Specifically on recruitment: if you’re interested in getting things started—the most important step is the first one—we have an easy way to do that: a two-week free trial, where we’ll provide two participants per week to get you off the ground, at no cost. That’s specifically for recruitment, but that is our offer to you.
We’ll be reaching out afterwards to everyone who attended or registered with the slides, the recording, and of course this offer if you want to get started.
But I know the moment everyone’s been waiting for—a little live Q&A. We have about 30 minutes left. Let’s dig into the questions. I’ll read one out and toss it to one of our speakers, or we’ll do a round robin.
One of the first ones was: how do you balance the need to explore a wide range of ideas with the pressure to move fast and deliver quickly? Susan, I thought you might be a good one to answer this.
[Susan Liu]
I definitely think it’s about adopting that team-sport mentality—getting everyone involved, making sure everyone’s taking part in sessions, doing their bit, capturing notes. Encourage everyone to sit in and observe the sessions and take notes first hand.
After sessions, we follow up with a 30-minute debrief. That debrief is super powerful for aligning everyone on the key takeaways and next-step actions. Turning around a quick one-page snapshot is super powerful as well.
Also, remind yourself that even though you’re generating this one-pager as your key deliverable, the data is not lost because it’s all captured in your canvas—whether it’s a notepad, Miro, or FigJam. And leveraging Userlytics: nothing is lost. Even if you don’t have time to write a 40-page report, you can always go back and watch insights on the platform.
All the sessions are recorded, transcripts are available; you can search transcripts; you even have sentiment analysis incorporated into the transcripts as part of Userlytics, and AI-generated summaries as well. Leveraging the platform supports working in lean ways and not being afraid to create snapshot outputs as part of your user research.
[Nate Brown]
Follow-up, Susan: on those debrief sessions, are they led by someone on the team—maybe the principal—or is it more of an open-table discussion?
[Susan Liu]
Typically it’s led by the user researcher. We invite everyone to share top-of-head insights that stood out to them. That helps us understand what really stood out to people and helps indicate priorities.
As part of that, a user researcher can guide discussions—really honing in on underlying needs without jumping into solutions. The point about aligning on insights is to help the team understand there’s a variety of solutions that can come from one insight.
So yes, the researcher leads the debrief to guide discussion and keep everyone on track so it’s not taken off in one direction without addressing the full view of insights from the sessions.
[Nate Brown]
Kyle, should you recruit different users each time or repeatedly interview the same small group?
[Dr. Kyle Gibson]
It really depends on the product. There are times when long-term recurring studies—where you’re growing with the person or the group—are really insightful because you see change.
I’m more of a snapshot person in general. I like to get in front of people for the first time, craft questions, and do the research in a way that’s meaningful in that instance. That could be a focus group or a group of people. I like snapshots because there’s a diversity of opinion that’s very valuable.
You’ll get two people in the same room with diametrically opposed use cases. That’s always the best because it’s really insightful.
[Nate Brown]
Sarita, how do cross-functional teams—product, design, engineering—collaborate across the opportunity tree without jumping straight to the solution?
[Sarita Saffon]
One helpful question is: does this “opportunity” have multiple possible solutions? An opportunity, as the book describes, is the actual need, desire, or pain point of the user—not something defined in your product or business. If it only has one “solution,” you’re talking about a solution, not an opportunity.
It’s natural to jump to solutions, especially when we’re working toward delivery. But we need to go back to people. Discuss their needs in a general way—well before they purchase your software or use your site. What challenges do they have as people?
When you focus on those human needs, you can identify opportunities—not just jump to solutions.
[Nate Brown]
Good question from Marcus here. He’s looking for inspiration on using continuous discovery for developing the business itself, so not specific to the product, which I know we’ve focused on a lot so far. Does anyone have any experiences or tips on using continuous discovery for developing the business?
[Sarita Saffon]
Continuous discovery is always about talking to your “customers.” If you’re building culture, your “customers” are your employees. If you’re building the business, talk to potential customers and audiences you’ll need to address. This is discovery before product—it applies to business development, too.
[Susan Liu]
Focus on the outcome. Know what you want to achieve by adopting continuous discovery. Then start talking to people and learning.
[Dr. Kyle Gibson]
I’ve done a lot of internal research, and it’s not much different. Whether you’re building dashboards or improving processes, UX researchers often work on process. Talk to the people doing the work on the ground; ops managers can connect you.
Get down to the warehouse; then to the distributor; then get a bird’s-eye view. Engage at every level. Employees are a rich source of emerging trends: they deal with your customers every day and are effectively doing research for you. Incorporate what they’re saying in a meaningful way, not only to build better products but to make employees feel heard.
[Nate Brown]
How do you manage leadership decision-makers who observe a session and want to take action when it might not be representative of the broader user base?
[Susan Liu]
I challenge the recommendation and take it back to why. Why that solution? Why do they think it’s the best action based on the insight? Focus back on the insights. Ask whether the proposed solution actually answers the user problem identified in sessions.
[Dr. Kyle Gibson]
This happens. It’s well-intentioned. That’s why I have processes for generalizability. One PM attends one interview and hears X; another attends a different one and hears Y. They walk away thinking “people think X/Y.”
Draw a line between research and capital-R Research. Yes, technically it’s research, but we’re not making huge decisions based on a single session. Do a survey or a rapid follow-up with a panel. Interview 10 people. If 10/10 say the same thing, you’ve probably hit saturation. Until then, be careful.
[Sarita Saffon]
In the experimental space, set clear criteria for testing assumptions. Also, evangelize the research context—sample size, method, cadence—so observers know a single session isn’t the whole picture.
[Susan Liu]
And if we can’t get them to back down, we can materialize the idea and bring it back to customers in the next interview cycle. Continuous discovery isn’t just talking; it includes concept testing and experimenting. Validate, then use data/insights to push back if needed.
[Nate Brown]
How many researchers do you have, and what’s the balance of time on continuous discovery vs. other research?
[Susan Liu]
I have a team of five user researchers. We carve out dedicated weekly time—typically Friday afternoon—for learning and new techniques. We used that time to implement continuous discovery without overly impacting BAU.
[Dr. Kyle Gibson]
At 1-800-Flowers, it’s highly matrixed—many people do research-adjacent things. I led a weekly R&D meeting with 8–10 attendees. For continuous discovery, I did interviews every two weeks with PMs, plus a 30-minute pre-brief and 30-minute debrief. PMs did it once or twice a year each, more if they wanted.
At Meta, I was part of a team of 90 (and thousands across Meta). Continuous discovery was more event-based—for example, around elections.
[Nate Brown]
Sarita, how would you address stakeholder skepticism on research and ideation before solution?
[Sarita Saffon]
Show results. Numbers help sometimes, as Kyle said. Show the impact on ROI and on experience. If you can run a small trial, show how people care about brands that care about their needs and pain points. Show how the experience improves over time with continuous discovery—and how internally it leads to better, faster strategic decisions. That’s how you change skeptical minds.
[Nate Brown]
Susan, how do we get continuous discovery to drive decision-making instead of stakeholders making decisions and using research just to validate them?
[Susan Liu]
Our user-needs repository. We consolidated findings and made them accessible to stakeholders so they can self-serve. Continuous discovery is how we gather insights; the repository is how we communicate and operationalize them. Now, there’s no reason to base ideas on anything but actual user needs. We implemented this to move away from research as validation and toward research that inspires outcomes and experiences.
[Nate Brown]
Kyle, what methods can we use to research customers and get non-biased data?
[Dr. Kyle Gibson]
There’s always going to be bias. Mitigate it with larger sample sizes or interviewing enough people to hit saturation (15–20 for a narrow topic). Write proper, non-leading questions.
On Userlytics, I never tell participants what company I work for—people want to please, and they’ll tell you what they think you want to hear.
Triangulate: use several methods. If they all point the same direction, you’re probably right. And remember: in industry, you often need directional information to make decisions, not journal-ready causality.
[Nate Brown]
As we wrap up, one last tip for someone beginning to build the continuous discovery framework into their team. Susan?
[Susan Liu]
Just do it. If you believe in it and it will help you achieve your outcomes, get started. Reach out to Userlytics for support with anything you need to get going. It becomes a habit.
Kyle?
[Dr. Kyle Gibson]
I agree 100%: habitualize it. Start doing it every Friday afternoon or whenever works. It’s a nice way to end the week. Do it.
[Nate Brown]
Sarita?
[Sarita Saffon]
If you’re going to do it, really do it, and count on us to help you with it.
[Nate Brown]
And from me: reach out to Userlytics and get your free two-week trial so you don’t have to pay for the first couple of interviews.
Thank you, everyone, for joining. I hope this was an insightful webinar for you. We’ll be in touch with slides and the recording. Have a great rest of your week.
Meet Our Experts:
We are excited to have distinguished speakers joining us to share their expertise:
Sarita Saffon

Sarita Saffon is a UX Researcher with extensive experience conducting end-to-end market, brand, consumer, and design research using quantitative, qualitative, and mixed methods. She has been working in the research field for over 7 years, on both the client and consulting sides, across industries ranging from technology to consumer goods. Her goal is to be the voice of users inside businesses so that their viewpoint is taken into account; she achieves this by transforming user opinions and behaviors into actionable insights that help companies make user-centered, data-driven decisions.
Susan Liu

Susan is a Senior User Research Manager at Sage, heading up the user research team within Sage’s Performance Marketing function. Susan has been at Sage for 3.5 years and has grown the team from 2 to 6 members. The team is a trusted and valued partner, delivering actionable and impactful insights to the business, influencing strategy and fuelling tactical initiatives across Sage.com globally. Prior to joining Sage, Susan was Head of Design Research in a multidisciplinary design team at Virgin Money.
Dr. Kyle Gibson

Dr. Kyle Gibson is a UX Research Manager and Anthropologist, recognized for his innovative work in user experience and design thinking. His prior research at Meta, identifying trust and safety patterns, helped the company scale its identification of election integrity risks globally. His academic work has been cited by the Supreme Court of the United States and Psychology Today, among others. Kyle is a board member of Pawlytics, a B Corp specializing in software for animal rescue operations. He holds a Ph.D. in Anthropology/Human Evolutionary Ecology from the University of Utah.
Nate Brown

Nate is an accomplished account manager serving many large enterprise-level companies across North America. With years of experience collaborating with research teams to maximize their research on the Userlytics platform, Nate has key insights into why some research projects lack substance while others produce valuable insights. His favorite part about working at Userlytics is building lasting relationships with his clients, even in a remote setting.
Learn more about the Continuous Discovery framework here.