#735: Creating a winning personalization and testing approach with Muqtadaa Miandara, Upbound Group


The Agile Brand with Greg Kihlström® | Listen on: Apple | Spotify | YouTube 

When it’s a struggle for marketers to even keep up with the latest trends, are brands truly keeping pace with their customers, or are they stuck playing catch-up with outdated strategies and technologies?

Agility requires a willingness to experiment, learn, and adapt quickly. It also demands a deep understanding of your customer and the ability to anticipate their evolving needs.

We are here today in New York City at Opticon25 and seeing and hearing some amazing things about the future of martech and how AI will shape the role of marketing in the months and years to come.

Today, we’re going to talk about building an agile brand in the age of digital transformation, focusing on the power of experimentation, personalization, and continuous improvement. To help me discuss this topic, I’d like to welcome Muqtadaa Miandara, Principal, Digital Growth at Upbound Group.

About Muqtadaa Miandara

Muqtadaa Miandara on LinkedIn: https://www.linkedin.com/in/muqtadaa/

Resources

Upbound Group: https://www.upbound.com/

The Agile Brand podcast is brought to you by TEKsystems. Learn more here: https://www.teksystems.com/versionnextnow

Don’t Miss MAICON 2025, October 14-16 in Cleveland – the event bringing together the brightest minds and leading voices in AI. Use Code AGILE150 for $150 off registration. Go here to register: https://bit.ly/agile150

Connect with Greg on LinkedIn: https://www.linkedin.com/in/gregkihlstrom
Don’t miss a thing: get the latest episodes, sign up for our newsletter and more: https://www.theagilebrand.show

Check out The Agile Brand Guide website with articles, insights, and Martechipedia, the wiki for marketing technology: https://www.agilebrandguide.com

The Agile Brand is produced by Missing Link—a Latina-owned, strategy-driven, creatively fueled production co-op. From ideation to creation, they craft human connections through intelligent, engaging and informative content. https://www.missinglink.company

Transcript

Greg Kihlstrom (00:00)
When it’s a struggle for marketers to even keep up with the latest trends, are brands truly keeping pace with their customers or are they stuck playing catch up with outdated strategies and technologies? Agility requires a willingness to experiment, learn, and adapt quickly. It also demands a deep understanding of your customer and the ability to anticipate their evolving needs. We’re here in New York City at Opticon 25 and are going to talk about building an agile brand in the age of continual transformation, focusing on the power of experimentation, personalization, and continuous improvement. To help me discuss this topic, I’d like to welcome Muqtadaa Miandara, Principal, Digital Growth at Upbound Group. Muqtadaa, welcome to the… Yeah, looking forward to having this conversation, and then great to be in person here at events like this. I love it. I know, I know. Before we dive in though, why don’t you give a little background on yourself and your role at Upbound Group.

Muqtadaa Miandara (00:38)
Thank you so much for having me.

New York City.

Yeah, absolutely. So I’m the principal of digital growth at the Upbound Group. I oversee a team dedicated to testing and CRO practice, essentially. Overseeing our two main lines of business, Rent-A-Center and Acima, serving customers in subprime and near-prime credit score areas to help them get financing for the things that they need. So…

Within that, we have some very specific funnels that we have to optimize and personalize, and a lot of regulatory and compliance-based specifics that we have to be mindful of at all times. So it’s a unique challenge having to sort of manage a team that’s doing that every single day.

Greg Kihlstrom (01:29)
Yeah. And I know you mentioned a couple of brands that are part of Upbound Group, but maybe give us a little background on the brands like Rent-A-Center and others that you work with, as well as the overarching mission that drives the work.

Muqtadaa Miandara (01:42)
Yeah, absolutely. So Upbound’s mission statement is creating financial opportunity for all. So our goal always is to just create access for people to get the things that they need when they need them in a way that doesn’t break the bank for them. At lower income levels, cash flow is more important than net savings, unfortunately, very often. And it’s not great, but it’s also kind of just what it is. So we offer opportunities to get the products that everybody needs: furniture, appliances, large-scale electronics for kids going to school, like laptops or iPads or whatever. And we do so in a way that’s like an affordable regular payment, as opposed to having to try to get approved for a credit card you’re almost certainly either going to be denied for or you’re going to get a minimal approval amount on. And so we provide access to thousands of dollars worth of durable goods by comparison.

Greg Kihlstrom (02:33)
Yeah, great. So we’re going to talk about a few things here. And I want to start with the testing and personalization journey that Upbound has been on. And certainly Upbound has been recognized for its innovative use of Optimizely for testing and personalization in general. Can you maybe take us back to the early days of testing and personalization? What were some of the initial challenges as well as some of the early wins?

Muqtadaa Miandara (03:00)
So I came out of college December of 2020, mid-COVID, and I applied to be a UX designer at Rent-A-Center. And I did not get the job. And I got a call back instead saying, hey, you have like front end dev skills. Would you like to interview for this job called testing and tagging specialist? And I was like, I have no idea what that is. Sure. Sure. What do I need? And she was like, OK, I can get you an interview with the hiring manager tomorrow. Do you know what Optimizely and Google Tag Manager are?

And I said, no, but I will on Monday. And now we’re here. So when I joined, we had just signed on with Optimizely. We were literally in month one of onboarding. I think they had signed 45 days before I joined the company. We had switched over from, I want to say, SiteSpect. I never had the opportunity to work with that tool, but I don’t think I can anymore, right? So it was definitely a bit of a challenge where I’m coming in very green to corporate space in general and, on top of that, the specific business model and workflow and process. I had none of that context. All I knew was I was a decent front-end developer, a decent UX designer, and a passable UX researcher. And that’s really being generous to younger me. And the greatest challenge was we were an incredibly small team. We were our UX manager, who is now our director of digital experience, my boss Aaron; a designer, who’s…

I get why she got the job over me. She’s infinitely better at it than I am. To this day, I look at how she designs and I’m just like, that’s what I was supposed to be doing, got it. And then myself. And we were three people, and we had to start scaling testing across what was originally just the Rent-A-Center brand, because we had just acquired Acima and so we weren’t yet in a position where our orgs were interacting in a way where we could just step in and start telling people we’ve got ideas for you.

And there was a lot of relationship building that had to happen, a lot of communication and education on what it was we were even trying to accomplish. So I think some of the missteps early on were assuming that everyone would be excited about some numbers that we had to present and saying, but this is good for the business, and then being like, can you prove that, really? And having to learn how to prove that, and then how to make it other people’s excitement, driving it as much as our own. So that sort of rapport and that sort of process was

probably the hardest part of the early building of the program. Scaling with like additional resources or tools or impressions or technical expertise, that all kind of, that naturally evolves over time, but like embeddedness within the organization, that’s a challenge. And building that and finding the right way to do it to where people are not only supporting testing and like test and learn processes as part and parcel of the way we work, but actively advocating for it themselves or asking for things to be tested.

That was an evolution that we had to go through.

Greg Kihlstrom (05:35)
So first, I love the initiative on your part of just like, I’ll learn it by Monday or whatever. I mean, I’ve been in that position too, back in the day. Sometimes you just have to do that. And I mean, with things evolving so quickly, it’s hard to, nobody can know everything anyway, right?

Muqtadaa Miandara (05:53)
No, I mean, of course. Growing up, I was always the kind of person to, as soon as I found a piece of software or a tool that I thought was interesting, or even, growing up, our vacuum started making a weird noise, I took it apart, fixed it, and put it back together. Granted, they’re meant to be taken apart. There are things I’ve taken apart I could not put back together. But it was the same thing. As soon as I learned what the inspector tool was on a browser at the age of 12, I was in there learning what HTML was, over time making little websites, and then I was freelancing, and it all just kind of like coalesced into where I am now. It’s a very strange sort of, looking back on it, the story makes sense, but at the time it was real chaotic. Yeah.

Greg Kihlstrom (06:31)
So the other thing that you brought up in kind of telling about the early days was not only education on, you’re fairly fresh out of school and everything like that, and so there’s some education on marketing and things, but also having to understand how the business really speaks and responds. And again, we can get some great, whether those are like vanity metrics or whatever they are, like we can get some things that are really good for a marketer, but to translate and to understand how to continually translate that to the business. I mean, what role does having like the right tool and the right platform, the right analytics and stuff help with that?

Muqtadaa Miandara (07:06)
Yeah, absolutely. So part of it is definitely aligning on the right outcomes, right? Like the business has strategic goals annually, quarterly, however you want to break it down. And if you’re just testing in spite of that, cool, not what we were looking for kind of vibes, right? Whereas if the business says the top of funnel is bleeding, we have to fix it. These are our key like inflection points and we need to raise these numbers by X percentage.

All of a sudden you’ve got a plan and you can say, well, now I know a project. Everything on a product page, we have to bump start order clicks by 12% relative, 2% uplift. Okay, I know tactics for that, and I can build a series of tests, and we can look at the cumulative effect over time. So all of a sudden you’re speaking a language that is a lot more reassuring to people who are looking at those numbers from different perspectives, where it is more translatable. It really comes down to, like, there’s a UX principle of storytelling, right?

It’s about creating a narrative, not a dishonest one, but one that translates what you understand is exciting into a way that it can be exciting for other stakeholders. And part of that has been working with strategic partners within Optimizely’s strategy team, or with the friends we’ve made through coming to these events, at road shows and various customer pop-ups and things like that. Getting to know other people in the space and learning how they’ve overcome similar challenges has been a massive benefit for us in formatting how we speak to people internally, and where we bring in experts to say, like, hey, they think what we’re doing is really smart, so you should also, and look at them, they’re a, you know, however-many-billion-dollar company, so you should also agree with them, kind of thing. You know, like borrowing credibility almost, so.

Greg Kihlstrom (08:44)
Yeah.

You know, another part of this too is, you test and experiment to improve the thing that you’re testing and measuring, but there’s also kind of the, I don’t know if it’s the work about the work or whatever you would call it, but there’s also that continuous improvement of testing and learning and personalization in and of itself. How do you manage to do that and continually improve the way that you’re doing these things?

Muqtadaa Miandara (09:07)
Yeah, I think, like, there’s a very tactical improvement you can achieve: better technical expertise, more hands, better process at the tactical level. But if you want to really improve a program, you have to work strategically, zoom out, and start thinking again about aligning on the business priorities and the strategic business opportunities, building project plans that span a series of tests. If you say you have four key business priorities for the year, all of a sudden you can start building prioritization frameworks, you know, in combination with something like a RICE framework, because I think that works really well for experimentation because of the reach aspect of it. Because if you don’t have good reach for an experiment, there’s really a lot less value to running it. Not to say that you shouldn’t, just that its priority is inherently going to be lowered

because the total impact will always be lower. So yeah, using prioritization frameworks alongside something like strategic opportunities or like OKRs or something like that, that’s a great way to say like the business wants these three things. So if something you’re requesting as a stakeholder falls in that bucket, amazing, it probably gets bumped up. But if you’re asking for something outside of it, I’ll still put it in the backlog. I’ll still prioritize it. But you have to understand it might be a quarter before we get to it because A, resource constraints and B, it’s just not as important right now and that’s okay. Doesn’t mean it’s not valuable, it’s just not as valuable.
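
To make the prioritization approach concrete, here is a minimal, illustrative sketch of a RICE-style score (reach, impact, confidence, effort) for an experiment backlog, with an optional boost for ideas that map to a stated business priority or OKR. The class, the weights, and the example figures are hypothetical assumptions for illustration, not Upbound’s actual framework.

```python
from dataclasses import dataclass

@dataclass
class ExperimentIdea:
    name: str
    reach: int                 # visitors exposed per month
    impact: float              # expected effect size (0.25 = minimal, 3 = massive)
    confidence: float          # 0.0-1.0 confidence in the estimates
    effort: float              # person-weeks to build, run, and analyze
    aligned_to_priority: bool  # supports a current strategic goal / OKR

def rice_score(idea: ExperimentIdea, priority_boost: float = 1.5) -> float:
    """Classic RICE: (reach * impact * confidence) / effort,
    boosted when the idea supports a stated business priority."""
    base = (idea.reach * idea.impact * idea.confidence) / idea.effort
    return base * priority_boost if idea.aligned_to_priority else base

# Hypothetical backlog entries, for illustration only
backlog = [
    ExperimentIdea("Simplify start-order CTA", 120_000, 1.0, 0.8, 2.0, True),
    ExperimentIdea("Redesign FAQ layout", 8_000, 2.0, 0.5, 3.0, False),
]

for idea in sorted(backlog, key=rice_score, reverse=True):
    print(f"{idea.name}: {rice_score(idea):,.0f}")
```

Because reach multiplies directly into the score, a low-traffic idea naturally drops down the backlog, which is the point made above: it can still be run, but its priority is inherently lower.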

Greg Kihlstrom (10:30)
But in that context, everyone is at least, there’s visibility on that.

Muqtadaa Miandara (10:34)
There’s an accountability aspect to it and a transparency aspect to it that is more reassuring than just, like, ignoring a request or doing a poor job of it if you were to just rush it through anyways. So that’s part of it. And I think moving more strategically like that also helps you to, you know, lean into the statistical rigor a little bit, because you’ve given yourself the breathing room to say, well, now let’s go calculate MDE. Let’s go look at our expected sample size. Let’s go look at the time we need to run this test, and let’s take time after the test has run to do a deeper post-hoc analysis in our data warehouse and so on and so forth. You suddenly have given yourself the structure to do those things, as opposed to having some amorphous timeline in which you might or might not have results. And you might not have complete results in that scenario where you only tracked certain things and didn’t remember to track others, and so in the Optimizely summary, it’s very hard to see what the movement was

and then you have to go put in a request to one of the analytics teams, and they have a six-week turnaround, and whoops, your two-week promise is dead in the water. So building trust by delivering on time, having results when you say you’ll have them, or, like, educating people on the time it takes to have outcomes, because good analysis takes time. And so if you don’t give it time, you’re gonna get bad analysis, and nothing is gonna harm your credibility more than if you come out with one set of results and then a week later your tune has changed and you’re saying, well, actually, now that we’ve had time to dig into it, it’s the opposite of what I first said. So yeah, I think just the more strategically you can move, the more value you can output. In 2024, our team successfully managed to have one of the number one win rates in Optimizely globally, which was an insane thing to find out after the fact. And we didn’t intend to do it. We were down a head. I had one less testing resource.

We had been through a massive reorg and so we were just like, okay, if we’re going to work on anything, it has to be the most important thing. We can’t be working on anything spurious. So somehow that sort of conscious constraint of the macro situation led us to the best outcomes we had as a program ever. So.
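
As a rough illustration of what “calculate MDE and expected sample size” can look like in practice, here is a small sketch using the standard two-proportion power calculation. The baseline conversion rate, relative MDE, and traffic figures are hypothetical, and a real program would typically lean on its testing platform’s stats engine rather than a hand-rolled formula.

```python
from statistics import NormalDist

def sample_size_per_arm(baseline: float, mde_relative: float,
                        alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate visitors needed per variation for a two-sided
    two-proportion z-test at the given significance level and power."""
    p1 = baseline
    p2 = baseline * (1 + mde_relative)
    p_bar = (p1 + p2) / 2
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1

# Hypothetical figures: 4% baseline conversion, +10% relative MDE,
# 30,000 visitors per arm per week
n = sample_size_per_arm(baseline=0.04, mde_relative=0.10)
print(f"~{n:,} visitors per arm, about {n / 30_000:.1f} weeks of traffic")
```

Running numbers like these up front is what turns “we’ll have results in two weeks” into a promise that can actually be kept.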

Greg Kihlstrom (12:37)
Yeah, I mean it’s amazing what you can do within constraints, right?

So speaking of Optimizely, we’re here at Opticon. Curious, just where does a platform like Optimizely fit into this strategy as well as the workflow?

Muqtadaa Miandara (12:52)
Yeah, absolutely. I mean, so there are so many tools, obviously. Some of them are getting cannibalized and gobbled up by others. But Optimizely has been kind of less just a tool or vendor and more of a partner, kind of from the beginning. And that has been the defining aspect of the relationship we’ve built over the last four and a half, five years. And so if it weren’t for that, maybe by now we would have optioned another tool, right? Because, like, yes, you guys are ahead of the curve on the tech and ahead of the curve on the product offering. Although I do know for a fact our SMO department sometimes is just like, that’s a big line item. Do you guys need that still? Are you sure? The sourcing team will just, every now and then, check in and be like, so what do we feel about Optimizely? And we’re like, absolutely, gotta keep it, leave us alone. So yeah, Optimizely’s role in the growth of our team and the growth of our business has been profound. That partnership, the strategic partnership in particular, to help us get access to the resources who can help us learn how to do what we do better, smarter, or validate that what we’re doing makes sense and is the smart thing to do. Like, that’s been very, very helpful. You know, testing is a niche space. You can work on it for years without meeting someone else who does it, and just in a vacuum think, God, I don’t know anything. I don’t know what’s going on. I don’t know how to make it happen.

The first time I came to Opticon or went to, like, an Optimizely customer pop-up and I spoke to other people in the space, I was like, I’m good at this. The people liked what I was saying because they thought that was smart, and that means I’m good at this. And I did not have that validation before, because nobody in my company knows exactly what it is we’re doing besides us. So it can be a very validating thing. And so that partnership, that validation, and then the continued innovation, it’s what keeps positioning Optimizely as a continued partner for us.

Greg Kihlstrom (14:34)
Yeah, to your point, it’s beyond the technology. There’s not only the partnership with Optimizely, but also the internal, I mean, you said you’ve been through reorgs. You started at least with a small team and stuff like that. What are some of the organizational factors that you feel have been crucial to success?

Muqtadaa Miandara (14:55)
Initially, the C level that we rolled up into was very passionate about the idea of being able to prove outcomes. And so without her support, 100% we would not have gotten as far as we did. So she’s no longer with the company, but she was absolutely vital to, like, saying, don’t do that, let’s test it first, right? And then, because she was over all of marketing and all of our creative services and stuff.

With our copywriting teams and everything, she was very open to us kind of poking our nose in other people’s corners. Like, I reached out to our email team and I was like, hey, I know you guys do A/B testing on email subject lines. Could you also tell me, like, what other A/B testing you’re doing, so I can line up the content on the site and personalize those journeys a little bit? What do you think? Would that be cool? And they’re like, what does that even mean? Like, we only really care about the click-through rate and opens. And I was like, no, no. You understand the web is where the value is, right?

You understand the dollar value happens on the web, so if you’re not, like, you can get them there, and if they don’t stay, it doesn’t matter. So can we make them stay now? That’s the question. And, you know, that’s part of that education and rapport piece I was kind of talking about earlier. Similar with, like, our paid media teams or our SEO teams, being like, you guys are doing a lot of work to get people to click links. I can get them to stay. That’s important. Should we focus on that?

But she helped, kind of, that C-level support helped pave that path, because, you know, some of these teams are 50 years old. They’re very embedded in the ways that they do things, even if they’ve transformed in the last 20 years for the web. It’s an old business. It’s slow to change in that way. Organizationally, the operating model is not quick to evolve. So having someone so high up just say, no, we’re changing the way we do things to make a little bit more value out of it.

Greg Kihlstrom (16:34)
Well, and that’s, I mean, to me and what I’ve seen, you know, that’s so critical, that there is that top-down support, because I mean, there’s a lot of things. I mean, you mentioned the silos, you know, a classic challenge for a lot of organizations, but also, you know, we’re talking about experimentation, and by nature, not every experiment achieves the exact thing that you wish that it would, right? So in other words, you know, some people call that failure, I call it learning, right? So how does

Muqtadaa Miandara (17:02)
Learners, not losers, yeah.

Greg Kihlstrom (17:03)
Yeah, yeah, but like, you know, how does that, I mean, it sounds like a lot of it is, you know, top-down leadership by example and things like that. You know, how does Upbound Group as an organization foster that culture of continuous improvement?

Muqtadaa Miandara (17:17)
We’re very oriented on the customer. And whenever we start seeing that there’s a discrepancy between what we intended and what the outcome is, questions start to get asked. And so one of the easy ways to answer that question is to say, well, let’s do some research. Let’s run some analytics. Let’s look at some heat maps. Let’s run a couple tests. It’s become part of the language, right? And it didn’t always used to be that way. But even without now this C level that we used to roll up into,

Part of the process, the next evolution of getting people on board from the top down has been getting back to that sort of education piece, executive education almost, to be like, here’s what we’re doing. Here’s the incremental revenue impact. Here is why this is important. This is what happens if you annualize that data and here’s the curve, the drop off or whatever. Getting into the math of it all and saying, hey, I know you’re a CFO. Let me tell you why this is good for you.

Like, I can pad your bottom line, basically, and help account for some stuff that you weren’t planning to have in the bank at the end of the year. That’s great for the EBITDA. That’s great for our margins. That’s great for our EPS. Like, you know, creating that rapport and helping people feel more comfortable with the risks that you’re taking by saying, look, one in five tests is going to win. So that means 80% of the time I’m striking out, but I’m also helping the business decide what not to do. And that’s huge. Because imagine if we committed a million dollars of developer hours and, like, AI credits to this project, and at the end of it, we lost money. If I could run, like, a dummy version of it and prove no one’s going to click that thing to enter this new flow, maybe we don’t do it in the first place. So there’s, I think there’s a few different avenues to, like, getting that kind of support and to creating that culture. One of the things we do a lot is also running a lot of workshops. We do, like, ideation

brainstorming workshops with stakeholders very often, where we bring in, like, a product team, and their designers as well, and whoever else, and we say, hey, this is your flow. You know it better than we do. You know, your product analysts know it better than others in the company do. Where do you see the problems? We’ve laid the flow out on a whiteboard here. Let’s just start marking things we could try based on the notes you have about what’s not working. And that’s always been very successful for filling out a backlog of test ideas. And we don’t even have to do that work ourselves anymore. And then we just go through and align it to strategic goals and fill out priority based on reach and impact and move forward. And it’s great.
