#833: Qualtrics’ Ali Henriques on accelerating the speed to insights with synthetic research


The Agile Brand with Greg Kihlström® | Listen on: Apple | Spotify | YouTube 

Help others find the show by leaving us a review


What if the biggest risk to your next global campaign isn’t the market, but the months you’ll spend waiting for research to tell you what the market wants?

Agility requires not just moving quickly, but making high-confidence decisions at the speed of the market. It demands that our ability to learn and validate is no longer the primary bottleneck to our ability to act.

We are in Seattle at the Qualtrics X4 Summit, and today we’re going to talk about how to overcome the speed to insights bottleneck. We’ll explore a fundamental shift in market research, moving away from slow, traditional cycles and toward a world where synthetic data and AI can give us near-instantaneous insights, allowing us to simulate customer behavior and de-risk major decisions before they even launch.

To help me discuss this topic, I’d like to welcome back to the show, Ali Henriques, Executive Director of Market Research at Qualtrics.

About Ali Henriques

Ali, a market research practitioner, leads research innovation for Qualtrics Edge, which comprises AI-powered tools and solutions wrapped in human-powered services. With nearly two decades of market research experience, Ali spearheads thought leadership for Edge, guiding the innovation pipeline for transformative research tools and supporting the legacy services business in delivering 10,000 projects per year.

Ali Henriques on LinkedIn: https://www.linkedin.com/in/ali-henriques-2581683/

Resources

Qualtrics: https://www.qualtrics.com

The Agile Brand podcast is brought to you by TEKsystems. Learn more here: https://aglbrnd.co/r/2868abd8085a9703

Drive your customers to new horizons at the premier retail event of the year for Retail and Brand marketers. Learn more at CRMC 2026, June 1-3. https://aglbrnd.co/r/d15ec37a537c0d74

Enjoyed the show? Tell us more and give us a rating so others can find the show: https://aglbrnd.co/r/faaed112fc9887f3

Connect with Greg on LinkedIn: https://www.linkedin.com/in/gregkihlstrom

Don’t miss a thing: get the latest episodes, sign up for our newsletter and more: https://aglbrnd.co/r/35ded3ccfb6716ba

Check out The Agile Brand Guide website with articles, insights, and Martechipedia, the wiki for marketing technology: https://www.agilebrandguide.com

The Agile Brand is produced by Missing Link, a Latina-owned, strategy-driven, creatively fueled production co-op. From ideation to creation, they craft human connections through intelligent, engaging, and informative content. https://www.missinglink.company

Transcript

Greg Kihlstrom: (0:49) What if the biggest risk to your next global campaign isn’t the market, but the months you’ll spend waiting for research to tell you what the market wants? We’re here in Seattle at Qualtrics X4, and today we’re going to talk about how to overcome the speed to insights bottleneck, moving away from slow, traditional market research cycles and toward a world where synthetic data and AI can give us near-instantaneous insights, allowing us to simulate customer behavior and de-risk major decisions before they even launch. To help me discuss this topic, I’d like to welcome back to the show Ali Henriques, Executive Director of Market Research at Qualtrics. Ali, welcome back to the show.

Ali Henriques: (1:25) Thank you. Great to see you.

Greg Kihlstrom: (1:26) Yeah, yeah. I was just saying, I think this is episode number three for you, so welcome, welcome back.

Ali Henriques: (1:32) Oh, yeah, I was in Mexico City for one. Yeah, we’ve had a whole host of global interactions.

Greg Kihlstrom: (1:39) Yeah, yeah, love it. Love it. So, before we dive in though, why don’t you give a little background on yourself and your role at Qualtrics?

Ali Henriques: (1:45) Yeah, happy to. This is year number eight for me at Qualtrics, which is fantastic. I’ve been a Qualtrics user my whole life, always in a market research capacity. Rewinding eight years, I started in a capacity supporting clients in designing and executing their research programs, pretty traditionally, right? And for the past two to three years, about as long as we’ve been chatting, I’ve taken on strategy for our market research division. So it’s a very close partnership with product, taking the market signal and turning that into recommendations on where we should make product investments. And this is how Synthetic was born, and a lot of our features and functions are designed specifically to solve the speed to insight challenge that you mentioned at the outset.

Greg Kihlstrom: (2:35) Yeah, yeah. And maybe a little bit more: I know you and I have talked about Qualtrics Edge a little before on the show, but for those less familiar, why don’t you give us a high-level overview?

Ali Henriques: (2:44) Yeah, absolutely. We actually announced Edge at X4 three years ago. It was really meant to be disruptive and to signal to the attendees, and of course the market, that we are taking AI very seriously, and that this is how we will incubate and label our innovation for market research in particular, because each of our different product divisions has a different approach to its own incubation. And so Edge started with a couple of different solutions to help connect researchers, as well as product and marketing teams, to data sources, insights, and intelligence that are always on. That’s really the ambition and the idea, because speed to insight is critical. And so Edge right now is effectively the label, the SKU, that we reference for certain AI-powered solutions. So Synthetic is Edge Audiences, and we have Edge Instant Insights, and we’ll keep going with that as a way to, again, label and delineate some of our newer, more innovative solutions for the market. So happy to speak more about Synthetic if that’d be helpful, but I’m sure we’ll get into it in different ways.

Greg Kihlstrom: (3:57) Yeah, yeah, definitely. We’ll dive in here. Why don’t we start by looking at research from that strategic level, starting with diagnosis? Traditional research, which many people listening to this have been involved with for years, is great in so many ways, and yet it often presents a bottleneck to taking action. Where do you see this friction causing the most damage in a large marketing organization that needs to move quickly, with all the pressure that entails? Where are some of those friction points?

Ali Henriques: (4:31) You know, what immediately popped to mind, Greg, is actually the cost of doing nothing, right? And I really do think that these solutions should be considered more for that than for trying to compete with or replace traditional ways of doing things. Think about all the decisions that are made based on gut, or, you know, that meeting with that team where we just feel like this is the right thing to do. To me, there’s incredible opportunity there, because it’s just not really been cared for, because research is a bottleneck, right? The teams are small and nimble and doing their best to service their stakeholders. Now, the scenario we all want to avoid as researchers is: well, we had to make a decision, so we chose to ship this feature, we chose to put this on the box, and it’s already in production, even though you’ve just gotten back to me six weeks later with the research. It’s a horrible position for us all to be in, right? The researcher, the marketing team, the product team. But it happens because research has been conducted the same way for decades, and unfortunately each step of the workflow and life cycle takes time. And we have to accept that these AI-powered solutions are not at all degrading the rigor and the science of how we’ve done things. And so we’ve really started to separate the types of projects, the times the researcher gets involved in marketing, product, and ops research, into strategic and quick-turn, right? And we’re starting to convince our stakeholders that it’s okay for X, Y, and Z use cases to turn this way and be very AI-led and AI-powered. And there are still times when maybe the weight of the decision warrants a more involved timeline and human-powered research, right? A multi-phase, multi-modal type of investment.

Greg Kihlstrom: (6:37) Yeah. Well, and maybe for those a little less familiar with synthetic data and synthetic research in general, can you break it down in practical terms? How are we moving from those traditional research methods that most are probably familiar with to AI-generated results? What does that look like in practice?

Ali Henriques: (6:57) Yeah. We chose to solve for that first. Think about a typical research workflow and life cycle: roughly two weeks in the design and survey instrumentation phase, two to four weeks in the data collection phase, and another two to four weeks in the analysis and reporting phase. The team that I started on eight years ago was called Research Services, and we existed to connect our clients to third-party panel audiences that they don’t have in their database. And so it was a natural place for us to solve for this challenge of speed to insight first, by reducing that data collection from weeks down to minutes. I ran Synthetic for Jeff’s keynote this morning: 15 minutes.

Greg Kihlstrom: (7:46) Oh, wow!

Ali Henriques: (7:47) Study design, I wrote the questions, ran it. These are facts. I can show the... (laughter).

Greg Kihlstrom: (7:53) Yeah, that’s good.

Ali Henriques: (7:53) We have time stamps on the responses. And so it’s just wild, right? So we’re very intentional about using the word “synthetic.” Synthetic can mean so many things, and, you know, we’ve talked about this for years now. I will speak only to synthetic responses. Our model is trained and tuned on survey data. That’s what we have, and that is what we’ve built our existence on as this MR pillar of Qualtrics, helping clients with. And the survey data has something to do with advancing a product or a service: it’s pricing, it’s features, it’s introducing a new concept, something to do with competition. Why are my customers going to that brand for that purpose, and how do I get them back? Or an audience, right? Can we learn more about millennial moms and their shopping behaviors? That’s what’s feeding our model, and those are the types of consumer attitudes and behaviors that we can very well represent with it. Right now it’s quantitative, right? Because that’s the nature of our product. But what we’re demoing downstairs is qualitative. It’s that same model, but now I’m interacting with it in natural language. I’m having a conversation with that millennial mom instead of forcing my questions into a survey instrument to get record-level data back. And as the researcher, I wouldn’t have trusted qual from any model,

Greg Kihlstrom: (9:13) Right.

Ali Henriques: (9:14) last year, two years ago, right? But now that we’ve got the model commercially available, and it is meeting all of our accuracy measures and expectations, I trust the qual. I know that I’m having a conversation with something that’s really rooted in robust, objective, quantitative data.

Greg Kihlstrom: (9:35) Yeah. Well, and maybe just for context there, because technically, not to pick on any of the LLMs, but you could ask ChatGPT a question, right? Pretend you’re this, that, whatever. How should people think of this differently?

Ali Henriques: (9:49) Such a great question. It’s true, right? And I’ve got ChatGPT. We’ve got Gemini, we’re a Gemini shop, right? But we’re not using it for this purpose. And I’ve heard clients this week say, you know, we took all of our past 10 research projects, threw them into Copilot, and that’s what the product team is using. I’m like, again?

Greg Kihlstrom: (10:07) Yeah.

Ali Henriques: (10:08) That’s great. Nothing could go wrong here. So, rewinding: the models have gotten very good out of the box, right? And there is a very thin layer of our Edge Audiences product that is a publicly available LLM, Llama, Claude, you know, OpenAI. But 95% of what our model is considering is our survey data. And so we actually ran an experiment. There was a Google study that was published, academic research, right? We replicated it in partnership with them. It was about Google Search, and there were eight KPIs. We tested out-of-the-box Gemini and ChatGPT against new human responses we ran, and of course Edge Audiences. And the distributions are fascinating. The out-of-the-box models answered; each gave us, I can’t remember exactly, 500 responses. We were intentional about showing the variability of the data, because humans are quite irrational, right? We have very different perspectives and opinions, and you see that in the human data. So you’ve got a mean score, a top-two-box score, but then you see the spread, the distribution of the data. We don’t see that with out-of-the-box models. Those models tend to be a lot more similar in pattern and behavior, so you don’t see the variability. You see them cluster around “I love it,” “I hate it,” “I’m agreeing.” And they moved together: ChatGPT and Gemini tended to move together across these eight different KPIs. Whereas our model, which again has been studying human behavior and how we react and respond to survey questions, almost perfectly matched the human responses.

Greg Kihlstrom: (11:52) Yeah, yeah.

[Music plays]

Greg Kihlstrom: (11:57) So you did a presentation with Booking.com. I’d love to talk a little bit about that. Maybe walk us through the before and after in that example, because I think it’s a great way to put this stuff in context.

Ali Henriques: (12:12) Yeah. I love partnering with Elena. She’s been such a curious researcher on this journey. We presented together one year ago, and both times it’s been on her birthday. So what a delightful human, to spend her birthday with us.

Greg Kihlstrom: (12:26) Amazing.

Ali Henriques: (12:27) I did bring a cake on stage and made everybody sing to her today. No, sorry, on Tuesday. So our journey’s been interesting. We approached our experiment last year with the model not at peak performance, right? And so it was one of these eyes-wide-open things: we are going to take some questions from your brand tracker, you’ve already collected human responses to them, let’s collect some synthetic responses. And for some of those she even had operational data. So it was a very different type of experiment, but you know what was crazy? Her learnings held up, right? I know this sounds obvious, but the human is still very, very important in terms of structuring the questions and making sure that we’re applying our expertise to even how we interpret the data. She actually used the same slide for the three things that she learned from this. This time we chose to do a psychographic segmentation, which is wild. I had never done one of those, so I was super excited. And I found myself in one of the segments, right? I’m a human connecting with synthetically generated segments; that’s just nuts. That one was so me, and that one was so her. So it was really fun to show, again, the variability of the data: this one was an always-on trendsetter and that one was an independent traditionalist. And you see all the same behaviors that you’d expect and see from human-based segmentation. But we threw a probably 30-minute questionnaire at these synthetic respondents. You can’t do that with humans. With segmentation, you want to explore so many different territories. Her hobbies-and-interests question would have been rejected by Ali if we were running this with human respondents; it was like 75 things on that list.
But Synthetic takes the time to go through those and select all of the hobbies that that respondent engages in. So it was fascinating. And it was kind of a reveal to the crowd: she walked through the segmentation as just a readout and then said, and all of this was synthetically generated, right? And I talked a bit about the model and how it worked. So it was really, really cool. They’d never run a psychographic segmentation at Booking before; I think they’ve got more behaviorally rooted segmentations. So this was a great way for them to experiment internally with something they’d never tried before. And they’re now taking some of the learnings about social media behaviors, and they’re going to pilot some YouTube Reels based on what we found with our synthetic segmentation. So, very cool.

Greg Kihlstrom: (15:08) Yeah, that’s amazing. Well, and I think it goes back a little bit to what you were saying earlier: there are a lot of gut-decision ideas out there. We hear about the good ones, right? The ones that turned into something successful. But to me, part of the value of the human, of us being creative and thinking strategically, is that not all our ideas are going to be great in execution. What if there was a way to quickly test those, run them, and validate the ones that are? Because then you get fewer of these gut decisions that go nowhere, right? Yeah.

Ali Henriques: (15:47) Yeah, that’s exactly it. And the other thing that I’ve been trying to encourage all of us, Qualtrics and clients, to think about this week is that we have to start to shed ourselves of traditional research constructs. And what I mean is this: you know, survey design and it being perfectly crafted. Because with agentic, we let go of all of that. And sure, it sits behind the scenes, right? Some type of structured way of organizing our questions and curiosities into research. But it becomes less important. And I love Synthetic too for exactly what you said, but just to build on it: maybe we’ve just commissioned some really expensive human-powered research, and there were three to five things we wish we would have asked, right? Or we’re in the room and our stakeholders are asking us, and we’re caught a little flat-footed, like, well, I didn’t ask about that.

Greg Kihlstrom: (16:42) I might have done that before.

Ali Henriques: (16:44) Oh, totally. Yeah, we’ve all been there, right? And you’re confined; you cannot possibly ask everything that you want. And so go and run those three to five questions with Synthetic. You’re probably not rooting the decision on them, and that’s okay. But, God, it’s better than nothing, right? And you can’t go back to the humans that you just ran that research with.

Greg Kihlstrom: (17:07) Yeah, yeah. So how do we look at, I mean, I think we’ve touched on several things, but how do you look at the ROI of this? It’s certainly speed, and we’ve talked about that. But what are maybe some of the other metrics here?

Ali Henriques: (17:22) Yeah, you know, it’s such a great question. You’re very familiar with the MR trends data. We’ve started to explore the value of market research traditionally, right? So that we’ve got a baseline for what new value this creates. In simplest form, it is more throughput, more output. Research is constrained: it’s time-constrained, it’s resource-constrained, and I mean that in both the human and the dollars-and-cents sense. And so what does this allow us to do? Certainly a lot more of those endless curiosities get addressed and answered. There’s something to this throughput that we haven’t quite put our finger on, but it’s what we started with, right? All of the curiosities that go untested, and decisions that are made based on gut. And so, yes, there’s obviously a cost savings too. Constrained traditional research costs more money and takes more time; we’re able to do it at half the cost, if not less, with Synthetic. And what happens when our researchers are more efficient in their day-to-day? What else can they do, and where do they invest that time? I’ll reference a client conversation I had just yesterday. He said, I’ve got PhDs sitting here, right? Building charts in PowerPoint. That is kind of ridiculous, you know? They should be driving strategy and product investment decisions, not producing charts in PowerPoint.

Greg Kihlstrom: (19:01) Yeah, yeah. So, looking out a few years, how does this change the structure of a marketing or an insights team?

Ali Henriques: (19:10) Oh, that is such a great question. I don’t know what three years out looks like. But I’ve had this conversation. We’re actually running some qual right now with research and insights professionals to understand how they are preparing their current ICs and teams for these technological advances, and what skills they’re trying to develop. The closest I can come, when pressed even by my own team, is this: I was hired to do research, and we’re doing a little bit less of that each year. What will I be doing in three years? I think it’s what we were just talking about. I should be really driving strategy and bridging the gap, what Jason spoke about in the first keynote, between understanding and acting. That’s where research often falls flat, right? It’s a lot of information about something. It’s rarely telling you exactly what you need to go and do next, what button to click and what investment to make. And we wish we had more time to spend on that. So I think we’ll become a bit more like strategic advisors, as a research population and as folks who have been around data and insights forever: how do we accelerate the decisioning that needs to happen on the back of this? And it’s so tough, right? Each industry is so different; I can see so many different variants of that. I think that’s the simplest way to think about elevating the role, and I mean that: away from the menial cross-tabs and charts and all of that, toward all the things we wish we had more time to spend on, the recommendations, and also stitching together all these different data sources. Building a fabric, if you will, of intelligence that we’re so constrained today and unable to build.

Greg Kihlstrom: (21:14) Yeah, love it. Well, Ali, always great to talk with you. A couple of last questions as we wrap up here. What’s been a highlight of X4 for you so far?

Ali Henriques: (21:23) You know, I was not thrilled about it being in Seattle. I live a mile from the convention center in Salt Lake. But this venue has been really cool. It is beautiful. We’ve flipped a couple of different things in the agenda, but I think it’s working really, really well. So I’m loving the venue, and I’m sure it’s attracted new attendees that just hadn’t come in the past, right? It’s been quite fun meeting new people in a new place.

Greg Kihlstrom: (21:55) Yeah, love it. And last question, I know I’ve asked you this before, but I’ll ask it again. What do you do to stay agile in your role and how do you find a way to do it consistently?

Ali Henriques: (22:04) Yeah, okay, I’ve got to think of something new now, Greg. I’m not sure if I’ve said this before, and I’m answering both questions in one, but part of what I love about X4, period, is the opportunity to talk and connect with so many people who are somehow rooted in insights and intelligence, right? Hearing all of their perspectives, what challenges they’re facing, and what cool new things they’re experimenting with helps me to expand. I don’t get to interact with clients too regularly. And so, having it all consolidated here in one place, I’d say more connection with buyers and insights professionals, just to learn from them.

The Agile Brand with Greg Kihlström podcast

The top-ranked enterprise marketing podcast | 8 years, 800+ episodes, millions of downloads.



