#654: Translating research into cross-functional strategic change with Adam Hagerman, Indeed

We are recording live at Qualtrics X4 in Salt Lake City and seeing and hearing all about how to create and enable amazing customer experiences.

It’s important to collect customer experience data, but if it’s not driving change across your organization, is it really helping your business? Today we’re going to talk about making meaningful cross-functional change, using CX research and data as a guide.

I’m joined by Adam Hagerman, Director of UX Research for Employer Products at Indeed. Adam has led transformative efforts at Indeed to turn customer experience research into cross-functional strategic change, driving real improvements in both user satisfaction and product success.

Resources

Indeed: https://www.indeed.com

Qualtrics: https://www.qualtrics.com

Connect with Greg on LinkedIn: https://www.linkedin.com/in/gregkihlstrom

Don’t Miss MAICON 2025, October 14-16 in Cleveland – the event bringing together the brightest minds and leading voices in AI. Use Code AGILE150 for $150 off registration. Go here to register: https://bit.ly/agile150

Don’t miss a thing: get the latest episodes, sign up for our newsletter and more: https://www.theagilebrand.show

Check out The Agile Brand Guide website with articles, insights, and Martechipedia, the wiki for marketing technology: https://www.agilebrandguide.com

The Agile Brand podcast is brought to you by TEKsystems. Learn more here: https://www.teksystems.com/versionnextnow

The Agile Brand is produced by Missing Link—a Latina-owned, strategy-driven, creatively fueled production co-op. From ideation to creation, they craft human connections through intelligent, engaging, and informative content. https://www.missinglink.company

Transcript

Greg Kihlstrom:
We are recording live at Qualtrics X4 in Salt Lake City and seeing and hearing all about how to create and enable amazing customer and employee experiences. It’s important to collect customer experience data, but if it’s not driving change across your organization, is it really helping your business? Today we’re going to talk about making meaningful cross-functional change using CX research and data as a guide. I’m joined by Adam Hagerman, Director of UX Research for Employer Products at Indeed. Adam has led transformative efforts at Indeed to turn customer experience research into cross-functional strategic change, driving real improvements in both user satisfaction and product success. Adam, welcome to the show.

Adam Hagerman: Thanks for having me.

Greg Kihlstrom: Yeah, looking forward to diving in here. Before we do, though, why don’t you give us a little background on yourself and your role at Indeed?

Adam Hagerman: Sure. I lead a team of UX researchers. We look over the employer products. We’re trying to make sure that what we end up shipping for people to consume is solving relevant needs and helping them do what they need to do better, faster, cheaper, easier.

Greg Kihlstrom: Wonderful, great. So yeah, let’s dive in here. And so we’re going to talk about a few things here. But I want to start by talking about transforming satisfaction measurement into strategic decision making. So you and your team at Indeed have transformed your approach to measuring user satisfaction. What led to the shift?

Adam Hagerman: We needed to. Satisfaction measurement is not new. We’ve been doing it since, like, phone surveys from the olden days that you would get at dinnertime. And the tool we were using was the same one, the Net Promoter Score. It’s evolved. It’s iterated over time. It’s had improvements here and there. But at the end of the day, it’s a brand measurement, and we have a product we need to work on. NPS is well known, and my stakeholders were very excited. They’re not anti-user sentiment. It’s just the tool they were using wasn’t as helpful as it could have been. We asked the question, can we make this better? What can we do? Here are the shortcomings. Here’s how it’s preventing us from helping people do what they need to do better, faster, cheaper, easier. It’s not giving us the insight we need, so let’s find a new way to do it. And the intention of finding the new way to do it is to actually help our stakeholders build better, faster, cheaper, easier products.

Greg Kihlstrom: Yeah, yeah. So along those lines, I mean, probably, you know, many people listening out there are using NPS, CSAT, you know, you name it. What were some of the telltale signs that it wasn’t giving you everything that you needed?

Adam Hagerman: If it tells us to push a lever, and we push the lever, but nothing happens, it’s not actually telling us what lever to push.

Greg Kihlstrom: Makes sense.

Adam Hagerman: I guess that’s the answer.

Greg Kihlstrom: Yeah, yeah. Hey, so you mentioned that data was harnessed not just to inform, but to quantify impact and to guide strategy. How did you approach turning research into something measurable and actionable for the business?

Adam Hagerman: Research is the process of collecting information. The reason we collect information is because we need to make a decision. The product stakeholders need to make a decision. Do we do it this way? Do we do it that way? They receive information from lots of sources. They get feedback from their go-to-market team. They get feedback from the engineering team. They get feedback from a random person on the street. And they have to take all of that information and make a decision. What we bring to the table is kind of the collective baseline for what our users want. Our job is to advocate for users among that entire ecosystem of information floating around. Data collection is a deliberate act. Just because something’s been collected doesn’t mean it’s what you should be collecting. And we ask that question: are we collecting information that actually helps us advocate for users? Once we were able to do that and demonstrate, here’s what we’re doing and here’s what it means for you, here’s your return on investment, it was an easier case to make. Does that answer your question?

Greg Kihlstrom: Yeah, I mean, I think so. I mean, so is it, because there’s lots of signals, right? So, I mean, again, there’s some go-to measurements that a lot of people use, like NPS and others. And again, to your point, nothing wrong with that, but if it’s the sole measurement, there’s-

Adam Hagerman: Just ask some questions.

Greg Kihlstrom: Right, so yeah, so I guess how do you determine, is it an incremental, like in the advertising world, it’s like media mix modeling or something like that. Is it a similar approach?

Adam Hagerman: I actually used a very similar approach to media mix modeling. In media mix modeling, what you’re really trying to do is, of all the money that’s floating around in brand advertisements, activation, whatever, you’re trying to see where does an incremental dollar give me more benefit? So if I have $1 left to spend, am I going to put it in this thing or that thing? And what my measurement was trying to do was say, of the experiences available, where we could put our investments into improving user experiences, where should you put your dollar? And we had to do a mathematical exercise, create an empirical argument for why that’s the right place to put the dollar. And then we had to hope that it worked, that we weren’t lying. So we created systems of accountability for ourselves. In order for us to advocate for this new type of measurement, it needs to meet these criteria. And we set out a protocol for how we were going to check ourselves. And before we were ready to really roll full scale and say, this is truth, listen to us, we wanted to make sure that we were actually representing the lived experience of our people, the customers that we have. So we took that media mix modeling approach: how can we model where return on effort into fixing user experiences would give us that outcome?
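
To make the incremental-dollar framing concrete, here is a minimal, purely illustrative sketch of the kind of allocation logic Adam describes. The experience names, the diminishing-returns curves, and every number below are hypothetical assumptions for illustration only; they are not Indeed’s actual model or data.

```python
import math

# Hypothetical experiences and response-curve scales (illustrative only, not Indeed's data).
# Each experience is assumed to follow a diminishing-returns curve:
# satisfaction lift = scale * log(1 + investment).
EXPERIENCES = {
    "job posting flow": 4.0,
    "candidate review": 2.5,
    "billing and invoices": 1.5,
}

def marginal_lift(scale: float, current_spend: float, step: float = 1.0) -> float:
    """Extra lift from putting one more unit of investment into this experience."""
    return scale * (math.log1p(current_spend + step) - math.log1p(current_spend))

def allocate(budget_units: int) -> dict:
    """Greedily give each incremental unit of budget to the experience with the
    highest marginal return: the 'where should the next dollar go?' question."""
    spend = {name: 0.0 for name in EXPERIENCES}
    for _ in range(budget_units):
        best = max(spend, key=lambda name: marginal_lift(EXPERIENCES[name], spend[name]))
        spend[best] += 1.0
    return spend

if __name__ == "__main__":
    # With these made-up curves, early units concentrate on the steepest curve,
    # then spread out as its marginal return diminishes.
    print(allocate(10))
```

As Adam notes, the real work is in the accountability step: checking that the modeled ranking actually reflects customers’ lived experience before treating it as truth.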

Greg Kihlstrom: So what was that process like then?

Adam Hagerman: Lots of math.

Greg Kihlstrom: Yeah, I would imagine, right? What’s the process then of convincing people to listen? You know, again, people get really stuck in their ways. This is a change management thing as much as it is a measurement thing, right?

Adam Hagerman: You hit the nail on the head. It’s a change management thing. People are coming in with their own set of expectations, biases, baggage. This is what it means for me, either in good or bad terms. And the approach I like to take is just being brutally honest. Direct: this is what’s going on, here’s what happens if you move forward with this, here’s what you can expect as a consequence. I’m not going to tell you what to do. But if you’re going to do this, here’s your consequence. I’m giving you other options. And by the way, I have data to back it up.

Greg Kihlstrom: Yeah, I mean, that’s the key thing, right? It’s instead of a feeling or a hunch, or I did this at this other place, you’ve got the incremental improvements, right?

Adam Hagerman: And that’s the media mix modeling that I talked about. Yes, it’s working in a theoretical and hypothetical space, but we adhere to all the rules of statistics that have come to us through the current philosophy of science. And that’s how we construct our argument.

Greg Kihlstrom: Yeah, yeah. So how did that go? I guess at first, like, is that-

Adam Hagerman: Asymmetric, asymmetric. There were some people who saw what we were trying to do and they were like, yeah, let’s go. There were other people who were more, they had been burned in the past, I guess is the best way to say it. We work with very smart, intelligent, experienced people, and Indeed was not their first job. So they have the baggage from wherever they were and whatever research team did that thing. So when they hear Adam saying, this is what we’re going to do now, or don’t do that, or whatever, I have to acknowledge that they’re also a human being with their own set of experiences, baggage, whatever. And the connection I make with them, or I try to make, is we’re both here to do the same thing. We both want the same good outcomes. What questions do you have? If there are things that I can do to make you feel more comfortable, I’d like to know what it is. It may just be a matter of I didn’t say it on that slide. So it’s having frank conversations.

Greg Kihlstrom: Well, and this is where it comes down to, you know, there’s lots of talk about data-driven decision-making, but this is the culture shift part of that, right? It’s, again, I did this thing at this other place and it worked really well, so it must, you know, work again at this new place in different circumstances. Our context is different, and sometimes that’s part of the argument.

Adam Hagerman: Yeah. When I’m convincing people, it’s like, yeah, over there it worked, and I can see why it would, or I can also see why it would fail miserably. It’s that idea that no context is exactly the same, and success depends upon your context, and bringing that out, having empirical arguments.

Greg Kihlstrom: So, how do you look at making arguments, or not even arguments, let’s not make it arguments, let’s make it friendly, the quick win versus the sustainable growth? A lot of this is there can be quick wins when you make any change sometimes, or any good change, but how do you look at balancing between, let’s get some of those quick wins under our belt versus, okay, this is going to achieve sustainable growth, which is hard to project.

Adam Hagerman: I had the long-term vision in my head. I knew what I wanted to do. I knew where I needed more information, like here’s where it might fail. But I had to start with the quick wins. It’s just like any other product that you push out. Your first users are going to be your best sources of feedback. They’re going to say, okay, Adam, here’s how you and your team maybe thought about this differently. Here’s feedback I would have, because here’s how my context will change how you want to approach something. So the quick wins are necessary, but you have to know where you’re going. The corollary to your question is people who do a bunch of quick wins but don’t know where they’re going. And that’s how you get cruft. Like that’s what it is. So thinking about your research, I try to think about my research programs as accretive. Whatever we’re doing today is building on what we did yesterday, and we’ll build on it tomorrow.

Greg Kihlstrom: So, yeah, so the quick wins are, and I’m a huge fan of that approach, but to your point, as long as there’s a strategy. But it’s almost as much about, I mean, there’s benefit to the business. Ideally, there’s benefit to the customer as well with those, but it’s also about kind of winning hearts and minds, right? That’s a big part of it.

Adam Hagerman: You have to show that it works.

Greg Kihlstrom: Yeah.

Adam Hagerman: That if somebody’s going to say, hey, Adam, I want your stuff on my surface because I want to use your tool to make sure that our user experience is good. I want to deliver on that. I want to actually say, yes, I helped you do that.

Greg Kihlstrom: Yeah. So what would your advice be to leaders? Let’s say they’re not sitting in the research or the data component of the organization, but again, they read the same things I read about data-driven decision-making. They know somewhere in their head that there’s a lot of value here, but they’re having a hard time kind of getting past, whether it’s biases, whether it’s other loud voices in the organization. What’s your advice to them to kind of just make the first step?

Adam Hagerman: Well, that’s a deep question. I’m sure it depends, too. Kierkegaard would say, take your leap of faith, stare into the abyss and go. And I do take kind of a similar approach. I talk about the series of experiences, the bag of lessons learned, that helps us decide how we should move forward. We may not have all of the information we need to make a decision, but we have a lot of information that can help us know whether we’re making a bad decision. Check yourself before you wreck yourself: your internal system of accountability. And then at some point, you just need to go.

Greg Kihlstrom: Yeah, yeah. I remember when I was at the media briefing the other day, you had mentioned, you know, Indeed is different from some organizations in that, being a technology company, access to data and things like that is a little more prevalent than in some, maybe more legacy, companies. How much do you credit having that access to data? Is it always a good thing, or are there good and bad things about it?

Adam Hagerman: I think it’s good to have access to it. There’s a barrier to entry. You have to know how to type in the right query. You have to know what are the signals that maybe you didn’t do the query correctly. There’s a lot of self-awareness that goes into working with that. The people who primarily work in that space, this is what they do all day long. So they have those tips and tricks to be conversant in that kind of work. I viewed, or I continue to view, democratization of information as inherently a good thing. My concern comes with, do we have the appropriate guardrails in place? Have we checked ourselves before we wreck ourselves?

Greg Kihlstrom: Well, a lot of, yeah, and I’m a huge fan of it as well, but also data literacy and all the things that go with it, as well as the ethical components, of course. How do you make sure that there’s kind of a unified version of the truth then with all that? Because again, it’s a great thing to have access, but everybody’s asking their own questions, so on and so forth.

Adam Hagerman: I’m not the only person participating in this conversation. Yeah. And there are times where I know my data is not the data somebody needs at that point. I should respect that, because the data I have to offer is not the data they need at that moment. I try to think instead, how do I participate in that conversation? How do I add an additional layer, let people know that this layer exists, that they can use it to ask questions of their own? One of the important aspects of the program I talked about was that it didn’t just live in Qualtrics. It was actually married with all of our other data. So people could do their own investigations, and they could use our data source as evidence in, I know you don’t want to use the word argument, but in whatever empirical argument they want to make.

Greg Kihlstrom: You can use it, yeah.

Adam Hagerman: I feel like I’m not answering your question.

Greg Kihlstrom: Well, I mean, it’s kind of a broad question. You know, I think it is, and I think this is where platforms and, kind of, we do need some kind of unified source of truth. At the end of the day, this is about happier customers and long-time customers. And so, you know, at the end of the day, that’s the goal, right? So if there are many different answers to that, maybe there could be some challenges there.

Adam Hagerman: I guess where I’m getting at is I don’t want to preclude somebody from having access to this. But I do want to make sure that when people are accessing it, they understand what they’re accessing. We do that through documentation. We say, this is what you’ll find in this database, in this data frame. We have lots of information about what this program is and how you can use this information. Oh, do you have questions about how we came up with it? Look over here. We try to make sure people are empowered to use the data in the correct way, or in the intended way. Correct makes it sound like there is a right and wrong. In the intended way.

Greg Kihlstrom: Well, and I would imagine in empowering them as well, you’re also getting some amazing ideas that one person with access centrally would not have.

Adam Hagerman: This goes back to that idea of accretive understanding. We don’t have to be the only source of truth. But what we’re doing is we’re adding to our knowledge.

Greg Kihlstrom: Absolutely, I love it. Well, as we wrap up here, just a couple questions for you. I know we’re almost wrapping up the event here, but I wanted to ask what’s been a highlight so far for you at Qualtrics X4?

Adam Hagerman: I usually enjoy the day two keynotes. Day one is very product forward, like, look at the cool stuff we’re doing. Day two is more, how should you think about doing your work? And I always enjoy those reframings because when we’re just going about our day, we do what we do. And sometimes we need to stop and listen to how do other people do it? How are they approaching this existential question? And I always enjoy those.

Greg Kihlstrom: Love it, yeah. So one last question for you. I like to ask everybody, how do you stay agile in your role and how do you find a way to do it consistently?

Adam Hagerman: Change is inevitable. On my team, I don’t have a lot of process. I do not have a lot of, you must do this, then this, then this, then this. Because I know as soon as you build the process, something’s gonna change and it all just goes out the window. For myself and my team, I try to make sure that we’re all oriented around what problem we’re trying to solve, and we understand that the way we do that will be flexible.

The Agile Brand with Greg Kihlström