In today’s digital ecosystem, data compliance and privacy are more than just legal requirements; they’re critical components of customer trust and brand reputation, especially for B2B marketers. To navigate these waters and understand the intersection of compliance, technology, and customer experience, we’re joined by Jordan Sher, Vice President of Brand and Communications at Drata.
Jordan Sher
With over 20 years of experience in advertising, marketing, online marketing, content marketing, and brand management, I’ve got a core competency in making enterprise B2B SaaS brands famous. Whether it’s building a marketing practice from scratch, crafting a scalable brand story, or executing a product-led growth strategy, I have the skills, creativity, and vision to make it happen, and I’ve got a successful exit to prove it.
Currently, I am the acting Vice President of Brand and Communications at Drata, a cloud-based security and compliance automation platform that helps SaaS companies earn and keep the trust of their customers and stakeholders. In this role, I am responsible for developing and executing the go-to-market strategy, brand identity, messaging, and awareness for the company.
Resources
The B2B Agility podcast website: https://www.b2bagility.com
This episode is sponsored by BetterHelp. Give online therapy a try at betterhelp.com/AGILITY and get on your way to being your best self.
Sign up for The Agile Brand newsletter here: https://www.gregkihlstrom.com
Don’t miss the Mid-Atlantic MarCom Summit, the region’s largest marketing communications conference. Register with the code “Agile” and get 15% off.
Get the latest news and updates on LinkedIn here: https://www.linkedin.com/showcase/b2b-agility/
Check out The Agile Brand Guide website with articles, insights, and Martechipedia, the wiki for marketing technology: https://www.agilebrandguide.com
B2B Agility with Greg Kihlström is produced by Missing Link—a Latina-owned strategy-driven, creatively fueled production co-op. From ideation to creation, they craft human connections through intelligent, engaging and informative content. https://www.missinglink.company
Transcript
Note: This was AI-generated and only lightly edited.
Greg Kihlstrom:
Data compliance and privacy are more than just legal requirements. They’re critical components of customer trust and brand reputation, especially for B2B marketers. To navigate this and understand how compliance, technology, and customer experience connect, we’re joined by Jordan Sher, Vice President of Brand and Communications at Drata. So why don’t we get started with you giving a little background on your experience in brand and communications and, you know, a little bit about Drata as well.
Jordan Sher: For sure. So I come to Drata with about 20 years of experience in brand and communications and about 15 years of experience in the startup world, working with a variety of different startups, primarily on the infrastructure and data side of the equation. At Drata, I run the overall brand management team, and that includes content, design, social, but also corporate communications. So we evangelize the Drata name into the market with press. We stand for compliance. We talk a lot about privacy. And we are really brand advocates for building trust both in the startup marketplace and in the enterprise marketplace at large. We really want to equate the Drata brand with the notion of trust and integrity and everything that that means to these companies who are challenged by audits all the time.
Greg Kihlstrom: Yeah, yeah. So we’re going to talk about a few things here today, but I want to start with this concept of privacy by design. And I just love this concept right off the bat, and I want you to explain a little bit more, but essentially it’s brands integrating privacy into their products and services from the ground up. So with that, could you explain a little bit: what exactly does that mean? When we say privacy by design, how does adopting this philosophy help brands in compliance efforts and, as importantly, sometimes in communicating their commitment to compliance and handling customer data responsibly?
Jordan Sher: Yeah, you know, it’s funny, if you had asked me this 10 years ago, I would not have been able to foresee the brand value of privacy in the technology marketplace, right? Like privacy has just accelerated exponentially in terms of its overall brand value, which is surprising. And it’s really created this intersection between the definition of privacy by design, which I’m glad you brought up, and the notion of both brand safety and brand advocacy as well. So let me start by talking a little bit about privacy by design. Privacy by design, or in some cases it’s also conflated a little bit with zero trust, is this notion that when you engineer anything — a software product, a social media platform, anything from the ground up — you can build with privacy, trust, safety, and integrity in mind. And some of the most successful brands really start with a privacy by design approach. And so that means being proactive about identifying places where data integrity must be maintained, being incredibly transparent about the principles that guide your product decisions, or being incredibly transparent about data protection or data value or things that you’re doing with customer data. Or it’s as simple as adhering to GDPR regulations, and we can get into that as well, where you’re giving your customer the opportunity to have more control over how their data is used. And if you start at the very beginning of the product development lifecycle with privacy in mind, then you are executing on the principles of privacy by design. And so then we get to this question of ultimately how that impacts brand advocacy. And I would say that the most successful brands that have their customer in mind and really consider customer experience and make that a part of their brand should amplify the story of how they engineer with privacy by design.
So it’s always very interesting to me how product management and product development can intersect with brand and brand experience and user experience. And the product is one way and, you know, engineering for privacy now is another way. And it’s just become very popular over time.
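As a rough illustration of the "privacy built in from the ground up" idea Jordan describes, here is a minimal Python sketch under my own assumptions (the field names are hypothetical, not any company's actual schema): every data-sharing setting on a new user record starts off, so consent has to be an explicit, recorded action rather than a pre-checked box.

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    """Toy privacy-by-design default: the most private configuration is
    what a new user gets automatically; sharing is strictly opt-in."""
    user_id: str
    share_usage_data: bool = False   # off unless the user turns it on
    marketing_emails: bool = False
    tracking_consent: bool = False

    def grant(self, setting: str) -> None:
        # Consent is an explicit action the user takes, never a default.
        if not hasattr(self, setting):
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)

user = UserProfile("u42")
assert not user.share_usage_data   # private by default
user.grant("marketing_emails")     # explicit opt-in
assert user.marketing_emails
```

The design choice being illustrated: the product, not the privacy policy, is where the default lives, which is the distinction Jordan returns to later in the conversation.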
Greg Kihlstrom: Yeah. Yeah, definitely. I know it’s definitely been a change, and certainly I think efforts like GDPR and some other things in the States and other countries have brought visibility, as well as some guidelines, to this. But yeah, that change in privacy as a brand value and sort of a brand asset is definitely different. And along those lines, Drata works with a number of brands in this space. I wonder if you could explain a little bit: how exactly do you support brands in implementing what you just described as privacy by design? And how does that actually work in practice?
Jordan Sher: Yeah, so I can talk a little bit about it from the Drata perspective. I mean, as you can imagine, there are a lot of different ways to implement privacy by design: following the core principles, sure, or following some of these very specific compliance frameworks. ISO, for instance, actually has controls, and you will ultimately need to provide evidence of those controls, that you’re adhering to privacy by design principles, for certain compliance frameworks. That’s one way, but the other way that Drata really focuses on is helping companies to evangelize how they adhere to some of the most important frameworks and regulations that exist in the world, with auditable evidence that they are operating in a compliant way. We call this presenting a trust center. And so a lot of our customers will make sure that they maintain some of these core principles of these different frameworks — and I can give you some tangible examples here in a moment — and then they want to promote that. And so with Drata, you can put a trust center on your website. Anyone can navigate to a website that has the trust center badge or the trust center URL, and you can actually see in real time how compliant these companies are with some of these privacy frameworks or customer data frameworks or any of these compliance frameworks that really matter in a regulatory environment. And they can see in real time that these companies are very trustworthy. So one of the core examples that we have is SOC 2. SOC 2 is a very popular compliance framework that most SaaS companies need to adhere to. And when you are SOC 2 compliant, you are collecting evidence about your employees, that they are taking the necessary trainings and adhering to certain security practices. And you need to collect evidence from a lot of your different security postures and security policies.
And you present that to an auditor, and an auditor checks a bunch of boxes at given intervals and says, yeah, you pass the audit, and you have your SOC 2 attestation. And when you have that SOC 2 attestation, it’s in your best interest to promote it, and you promote it via a trust center. That’s just one example; we have over 20 compliance frameworks. In that way, we really seek to become the currency of trust between our customers and their customers. So their customers can see, all right, privacy has been engineered into my experience of this software product, and I can really trust these brands. So, you know, it’s funny, in some ways it’s a very technical requirement that these companies need to adhere to. But in other ways, it’s a branding exercise, right? Like, I am now branding myself as trustworthy, I am now branding myself as a company that is on the side of my customers. I feel like there’s a lot of emotional value in that, particularly in a world where we don’t know what’s happening with our data, and we don’t know whether we’re having private interactions online or not.
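The trust-center idea — rolling per-control audit evidence up into a status anyone can check — can be pictured with a short sketch. This is a toy illustration with hypothetical control IDs and statuses, not Drata's actual data model:

```python
from dataclasses import dataclass

@dataclass
class ControlCheck:
    """One piece of auditable evidence, e.g. 'employees completed
    security training' collected against a framework control."""
    control_id: str
    framework: str   # e.g. "SOC 2", "ISO 27001"
    passing: bool

def trust_center_summary(checks: list[ControlCheck]) -> dict[str, str]:
    """Roll per-control evidence up into a per-framework status that
    a public trust center page could display in (near) real time."""
    frameworks: dict[str, list[bool]] = {}
    for check in checks:
        frameworks.setdefault(check.framework, []).append(check.passing)
    return {
        fw: "compliant" if all(results) else "attention needed"
        for fw, results in frameworks.items()
    }

checks = [
    ControlCheck("CC1.1", "SOC 2", True),
    ControlCheck("CC6.1", "SOC 2", True),
    ControlCheck("A.5.1", "ISO 27001", False),
]
print(trust_center_summary(checks))
# → {'SOC 2': 'compliant', 'ISO 27001': 'attention needed'}
```

The point of the sketch is the direction of flow Jordan describes: evidence is collected continuously against controls, and the public-facing status is derived from it, rather than being a self-declared badge.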
Greg Kihlstrom: Yeah, absolutely. I think it’s important on all of those fronts. And that’s great to understand a little bit better about how you work as well. For any business to thrive, effective communication is key, not just for success, but for maintaining a healthy, respectful work environment. That’s where Service Skills comes in, a proven e-learning platform that’s transforming organizations through micro-learning modules. Think about it: are your customer service interactions up to par? How inclusive and collaborative is your workplace? Service Skills offers hundreds of courses on everything from customer satisfaction and team building to management and respectful workplace practices. Validated by millions, these courses empower your staff to excel and communicate effectively, enhancing both personal and company-wide success. And the best part? It’s all available at your fingertips with affordable pricing and flexible options that fit your organization’s needs. Elevate your team’s skills, boost workplace respect, and foster a culture of diversity and inclusion. Join the ranks of satisfied clients who’ve seen tangible results in performance and communication. Don’t let a lack of staff resources be your downfall. Level up your people skills and build your dream team today. Visit serviceskills.com/podcast, and let them know The Innovation Economy sent you. That’s serviceskills.com/podcast, America’s premier soft skills training. So I want to move on to another topic. We certainly talk about AI a lot on the show, and we’ve talked about it in a number of different ways, but since we’re talking about privacy and compliance, I wanted to look at how AI can help here with compliance, automation of compliance, and other things like that. Could you give us a little detail on what the role of AI is in compliance and automation for brands?
And what are some of the things that you’re seeing, whether emerging trends or other things that companies should be mindful of in this area?
Jordan Sher: Yeah, obviously, everybody has to have a perspective on AI today. And I will say, from what I’ve seen in the world, there are really two ways of developing with AI and having compliance in mind. There is a fast way that gives you a competitive advantage of speed in the market, and you can get to the market first, and you can capitalize on the buzzy quality of AI. Or, in our mind at Drata, there’s the right way. And the right way really proactively builds in an ethical perspective on building with AI. And to me, there’s a stark intersection between building with AI and privacy by design. So the way that we think about it here at Drata is that there are a few different swim lanes in developing with AI that we need to pay attention to. First, number one, there’s the data: the data that we ingest to build a large language model. And there are many different ways to ingest data. We do it in a very particular way that segments our customer data. And so our customers know, when we are providing an AI feature with their data, that their data is protected and their data is built on their own personal tenant, for example. So we’re not mixing anybody’s data with anybody else’s, number one. I think that’s very important. So you know that you not only have a good understanding about what the AI can do, but the data that the AI can build from. Number two, the architecture of building an AI feature using a very controlled data set is really important. And so we architect with the most ethical uses of AI in mind. And so when we present a feature — if we were to present a feature in the future — it’s just critical that our AI feature set is architected with the principles of privacy by design in mind, number two. And then number three, the way that we deploy AI also has to be incredibly ethical and incredibly controlled.
And so we’re not here to apply AI techniques or AI engineering or AI development to every feature that we have within the Drata platform, but rather we’re picking and choosing the right features to think about the use of AI. So our customers can, number one, understand what they’re getting out of it and have a level of dominance and control over the end product, but also understand that AI is not here to willy-nilly solve all compliance problems, but rather just a core, fundamental set of problems that, again, are the right level of problems that AI can solve at this time. And I think that’s in stark contrast to some of these other AI wrappers that you see out in the world. Maybe they’re using publicly available data. Maybe they’re not putting data on an individual tenant. Maybe they’re pulling just general large language models and applying them. It’s just a different approach for us. And I think that our customers demand that level of scrutiny. And also, we just feel like it’s the right thing to do. If we want to stand for trust and integrity, we have to execute on all fronts with that in mind.
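The per-tenant data segmentation Jordan describes can be sketched in a few lines. This is a toy illustration under my own assumptions, not Drata's architecture: each tenant's records live in a separate partition, so anything feeding an AI feature for one tenant structurally cannot read another tenant's data.

```python
class TenantIsolatedStore:
    """Toy key-value store that hard-partitions data by tenant, so an
    AI feature fed from one tenant can never see another's records."""

    def __init__(self) -> None:
        self._tenants: dict[str, dict[str, str]] = {}

    def write(self, tenant_id: str, key: str, value: str) -> None:
        self._tenants.setdefault(tenant_id, {})[key] = value

    def read_all(self, tenant_id: str) -> dict[str, str]:
        # Return a copy of this tenant's partition only; other tenants'
        # partitions are unreachable through this call.
        return dict(self._tenants.get(tenant_id, {}))

store = TenantIsolatedStore()
store.write("acme", "policy", "SOC 2 evidence")
store.write("globex", "policy", "ISO evidence")
assert store.read_all("acme") == {"policy": "SOC 2 evidence"}
```

The design point is that isolation is enforced by the shape of the API (every read is scoped to a tenant ID), not by a filter applied after the fact.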
Greg Kihlstrom: Yeah, yeah. Well, of course, for anyone paying attention to this space, let’s call it what it is: there’s a lot of change, a lot of things that get added. I mean, as you mentioned, and going to your site and everything, you work with a lot of existing frameworks: CCPA in California, HIPAA for healthcare, a lot of those different existing frameworks. But as we know, the data privacy space is pretty fast moving, and even here in the States there’s a lot going on. There are especially a lot of conversations going on around AI, and there’s going to be more regulation soon in that area. So how do you recommend that brands try to stay one step ahead of these, obviously, until something is regulated? Like I remember, back in the day with GDPR, we were all just trying to figure out how to interpret what that meant when the draft came out. But as these things are coming out and shifting and everything like that, how does a brand try to stay one step ahead?
Jordan Sher: Yeah, it’s a tremendous question. And I will offer a couple of things that I consider when thinking about brand evangelism and privacy. Number one, it’s only going to get more critical. So when we think about brands and customer experience, the conversation about privacy, about data quality, about data integrity is only going to get more impactful and more important, especially as we try to reconcile how AI is going to be used in the future and how people are showing up online. There are these conversations that exist in the world now more than ever about the use of data about kids and online experiences. I think that conversations like that, as just one example, are only going to accelerate. I think the velocity of regulatory pressure is going to increase the need for companies to evangelize that they are operating within the confines of the regulatory environment. So when you think about your brand and you think about customer experience, privacy and data integrity need to be factored in. Number one. Number two, when you build any sort of online interaction, the more you can be transparent about how you’re using that data, where that data goes, and how you’re storing it, the more it helps your brand. So transparency about data in the future is going to be key. If you can write about it, if you can produce white papers about it, if you can redact some of that data and show some insights that you’re pulling about customer interactions, I think that is going to also be very impactful. Number three, if you can align yourself with technology and other brands that emphasize your value set, particularly when it comes to privacy and trust and integrity, I think that will be very important as well.
So, for example, if Drata is a brand that’s known for trust in the marketplace and you can present a trust center on your site, that just helps in communicating the customer experience and how much emphasis and investment you put on privacy. If you look at some of these brands now that are dealing with trust and safety violations, I think about the big ones. I think about X. I think about Facebook. I think about these social media networks. They are not putting enough of an emphasis, in the right ways, on trust and safety, and it has long-term brand value impact. So there is a story there. And I think every company that is working with data today needs to tell it, and they need to remember their why. And if it is a customer-experience-focused business, then privacy has to be part of that customer experience.
Greg Kihlstrom: Yeah, and building on that, I totally agree. And it reminds me of how, in the sustainability or environmental world, they call it greenwashing when there’s a facade: like, hey, a social network throws up a privacy page, but obviously there’s not enough behind it, we’ll just put it that way. But when an organization does have a commitment to data compliance and these things that we’re talking about, how do you recommend they think about the balance? Obviously you want to say, we’re the most secure, we treat everything as well as possible, but you’ve got to balance that with realism. Consumers are way more educated about this stuff now, and B2B audiences are probably a little more educated than B2C in general. So how do you do this in a way that finds the right balance of, yes, this is something we value, but it doesn’t feel like the equivalent of greenwashing?
Jordan Sher: Okay, this is a fantastic question. There is a difference between saying, here’s what we are doing with your data in a privacy policy, versus really putting yourself in the position of a user and asking yourself, what does the user want out of an experience of my product that will make them feel like their data is protected? I think in the product roadmap, in the product development lifecycle, that is an important question that you need to ask when you’re identifying features. So if you zoom out, the best example of this is: OK, we are building a social network. And I understand that as a user, I may have concerns that my kids are logging on to the social media network, and they can get an account and, you know, in 30 seconds, they can start posting on the social network. And I care about their privacy. And so I am going to engineer features that put up guardrails, that allow the users ultimately to be able to control who gets online, at what age, what the culpability is, those kinds of controls. And so really, there is a difference between a brand saying that we care about privacy, and we have a 10-page-long privacy policy, and we have a trust center, and we have the whole thing, versus: these are the features that we are engineering into the product that keep your data safe. These are the proof points so that you know your data is going to be safe. When you log on, your data exists in its own tenant, and here’s the proof of that. You’re going to get the login credentials to use your own data. You’re going to be able to control, at all times, where the data goes, who is using it, what happens to it. You’re going to be able to control who gets an account. User access reviews are a great feature that allows you to have some control over who can log into your platform and what they can do.
And if we can take that as an example of a feature that can be evangelized across all of these different software platforms and really put tight controls over user access reviews, then all of a sudden I as a user kind of understand, yeah, this company cares. It’s not about growth as much as it is about my experience and privacy.
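A user access review like the one described above is essentially a periodic audit over account records. As a minimal sketch, under my own assumptions (the account fields and thresholds here are hypothetical, not any vendor's schema), a review pass might flag accounts that have gone idle or hold admin rights without a recorded justification:

```python
from datetime import date

def flag_for_review(accounts: list[dict], today: date,
                    max_idle_days: int = 90) -> list[str]:
    """Return user IDs an access review should look at: accounts idle
    too long, or holding admin rights without a recorded justification."""
    flagged = []
    for acct in accounts:
        idle_days = (today - acct["last_login"]).days
        if idle_days > max_idle_days or (
            acct["is_admin"] and not acct.get("admin_justification")
        ):
            flagged.append(acct["user_id"])
    return flagged

accounts = [
    {"user_id": "u1", "last_login": date(2024, 1, 2), "is_admin": False},
    {"user_id": "u2", "last_login": date(2024, 5, 30), "is_admin": True},
]
print(flag_for_review(accounts, today=date(2024, 6, 1)))  # → ['u1', 'u2']
```

The point of the sketch is the "show, don't tell" idea from the conversation: the control is an automated, repeatable check over real account data, not a statement in a policy document.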
Greg Kihlstrom: Yeah, I love that. Yeah, it’s showing, not telling, right?
Jordan Sher: I do think, I will say one more thing. I think that there is a tension that exists between the tech world and the regulatory environment right now. The regulatory environment is there to put up guardrails or roadblocks or speed bumps in accelerated development life cycles. And I would say that if you truly want to be a brand that stands for privacy and trust, the more you embrace the regulatory environment, the more you proactively engineer privacy by design, and the more you accept that these are the frameworks we need to abide by, so we’re going to get ahead of it. We are going to provide evidence and attestations of these compliance frameworks, even though maybe we’re not required to at this time, because we think it’s the right thing to do. I think that would go a long way in providing evidence of a brand’s emphasis on privacy.
Greg Kihlstrom: Yeah, I love it. Well, Jordan, thanks so much for joining the show. Before we wrap up, just one last question, and maybe even kind of a recap question here. What do you see in the future? Obviously, as we talked about, there’s a lot going on right now, but there’s also a lot in development and coming down the road. So what future developments do you see in this realm of data compliance and privacy, and how should B2B marketers prepare to adapt to these changes, so that they not only ensure compliance but also ongoing consumer trust?
Jordan Sher: Yeah, I would definitely say that when we are out there building brand stories and thinking about the customer, we should engineer our brand stories long term to talk more about trust and integrity and trust and safety. So in any brand-building exercise, that’s got to be part of the experience. And then in the future, I do see that compliance is going to be pulled earlier into the conversation at any software company. And so at its core, when we are talking about core principles and values, which is a critical part of the brand, that is also a place to start evangelizing our perspective on compliance and data and trust. So just bring it out into the sunlight and tell that story as soon as you can.