Andre Durand [00:00:00]:
All the interactions where there is some level of trust embedded in our way of operating. If you really dissect all the places that there’s a level of trust for us to do business, it’s pretty extreme. It’s like every interaction has some element of trust. So to embed verification into every transaction comes at an extreme, both infrastructure cost, technology cost, as well as possible friction to end users. And so we live in the balance of the tension between how much risk do we wanna assume, how much friction are we willing, or can we afford to introduce to our end users. All I’m suggesting is that the bar is rising on all of those dimensions.
Karissa Breen [00:00:59]:
Joining me today is Andre Durand, CEO and founder of Ping Identity. And today, we’re discussing Verify More, Trust Less. Andre, thanks for joining, and welcome.
Andre Durand [00:01:08]:
It’s my pleasure. Thank you.
Karissa Breen [00:01:10]:
So let’s start right there. Talk to me more about what you mean by verify more and trust less.
Andre Durand [00:01:16]:
I would say, really, through the evolution of human history, we’ve gone from small, communal societies, where reputation and trust and what I’ll call the network of trust was kind of implicitly woven into how communities worked, to now. The world is flat, and all eight billion people can essentially click a button and communicate and interact. And in this new world that is made up of both good actors and bad actors, the possibility for essentially theft and fraud and malicious intent has, you know, essentially gone exponential in this digital world. And so there’s the historical perspective on the boundary between what is safe and what is not safe, what is known versus what is unknown, and what is presumed either trusted or untrusted. In the digital world, we’ve historically defined those boundaries as our network. So around our network and our people and the computers and systems and data, we would essentially both physically and virtually create a barrier, you know, that we kind of refer to as the firewall, for example, or the door that gets you into the building as a physical manifestation of that. And so we had these trusted areas, and inside of these trusted areas there was a lot of trust if you were through the barrier, if you were in the building or through the firewall, so to speak.
Andre Durand [00:02:55]:
But we live in a world now where the speed and the intricacy of the interconnectedness of everything to everyone is one in which that paradigm essentially doesn’t work for our current world. I mean, people are working from home on personal devices. They’re changing locations a lot. All the applications and data obviously have left our building and are now spread throughout multiple clouds. And so in that world, we’re now in a situation where trusting, and what it means to trust people or their actions, is being called into question. And there is a security, I don’t want to call it necessarily a philosophy, but, you know, maybe a goal, in which we presume that we can’t trust interactions by default versus trusting them by default. And we now need to verify always, as just a safer paradigm from which to transact in our new world.
Andre Durand [00:03:58]:
So don’t trust the identity, verify the identity. Don’t trust the device that the user is using at the moment in time that they’re interacting with you. Let’s verify the identity of the device. A lot of malicious intent happens because actors spoof an identity, but on a different device. And the entire notion of a secure network, let’s not trust that the network is secure. Let’s presume that bad actors are already on the network. So it’s a shift in thinking from old-world kind of safe zones, trusted zones, and untrusted zones. All those lines have now been blurred or blown away, and we are taking on a new view of how to secure a highly distributed world.
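To make the “verify everything” idea concrete, here is a minimal sketch of a zero-trust style access decision that checks identity, device, and network signals on every request instead of trusting anything by default. The field names, thresholds, and policy outcomes are illustrative assumptions, not Ping Identity’s implementation.

```python
from dataclasses import dataclass

@dataclass
class RequestContext:
    """Signals gathered for a single interaction (illustrative fields only)."""
    identity_verified: bool    # e.g. recent strong (MFA) authentication
    device_trusted: bool       # device previously enrolled/bound to this user
    network_reputation: float  # 0.0 (hostile) .. 1.0 (clean), from external feeds

def authorize(ctx: RequestContext) -> str:
    """Zero-trust style decision: nothing is trusted by default.

    Each dimension mentioned above, the identity, the device it is coming
    from, and the network, must be positively verified before access is
    granted; anything ambiguous triggers step-up verification.
    """
    if not ctx.identity_verified:
        return "deny: re-authenticate the user"
    if not ctx.device_trusted:
        return "step-up: verify and enroll this device before proceeding"
    if ctx.network_reputation < 0.5:
        return "step-up: additional verification, network looks risky"
    return "allow"

# Example: a known user on an unrecognized device is not waved through.
print(authorize(RequestContext(identity_verified=True,
                               device_trusted=False,
                               network_reputation=0.9)))
```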
Karissa Breen [00:04:40]:
There’s a couple of things in there that I want to just press on a little bit more. You said shift in thinking. Now, we are optimistic in cybersecurity, or we try to be. Would you say, from your experience and where you’re sort of sitting, we have arrived at that sort of shift in thinking?
Andre Durand [00:04:55]:
Oh, we’re a long way from it. I think that there’s a general appreciation for the fact that the topology that we’re looking to secure has now changed, and the methods, the tools, and the techniques that we historically leveraged to create security are now less relevant in the topology of the new environments that we find ourselves in. And so as a result of that, I think there is a general recognition that we need to shift from something to something: from, you know, call it secure-boundary thinking, or things in my control that I can secure, to things that are outside of my control but that I still have to secure. But the journey to get there is sophisticated, if not complex. It involves a lot of new tools and mental models and thinking. The environments that we’re looking to essentially evolve to this new way of thinking really have to navigate a multi-generational IT landscape, much of which is not, let’s just say, inherently friendly to all the new things that we’re trying to do. So it makes it a big challenge. It doesn’t change the fact that it’s reality and we have to deal with it, but it’s gonna be a journey, and it’s gonna be measured as a multiyear, if not decade-long, journey.
Karissa Breen [00:06:18]:
So you said before, we have to shift from something to something. So what would be the to something?
Andre Durand [00:06:24]:
So the to something is where the presumption is that anything or anyone that you’re interacting with needs to be verified as best as we can verify. For example, say I send you an email with a link that says, follow this link to enroll in our MFA program, and we will have you download some software and install it on your phone, and the software and the device will now be leveraged as a factor for authenticating you. We no longer should trust that the person that clicks on the link and enrolls their phone is actually the person that we think it is. We might wanna verify that it’s actually that person enrolling their phone into your strong authentication. Does that make sense?
Karissa Breen [00:07:11]:
Absolutely. So is that the part where you would find, with customers, that people are finding it difficult maybe to understand, or there’s more friction, or where’s the part? Because you mentioned before that we’re not sort of there in terms of the shift of our thinking.
Andre Durand [00:07:26]:
Zero trust thinking expands far beyond the example that I gave you. So I think if you give any one concrete example, I don’t think it’s difficult for people to comprehend or appreciate some of the challenges that they’ve been living with, the risk associated with what I’ve described. The risk has been rising in recent years, just given the focus that is now happening with digital fraud. I mean, it just really is starting to reach a level of criticality that I think everyone is feeling. So again, it’s not that they don’t see or appreciate it, but the examples that I gave are broad: all the interactions where there is some level of trust embedded in our way of operating. If you really dissect all the places that there’s a level of trust for us to do business, it’s pretty extreme. It’s every interaction has some element of trust. So to embed verification into every transaction comes at an extreme cost, both infrastructure cost and technology cost, as well as possible friction to end users.
Andre Durand [00:08:31]:
And so we live in the balance of the tension between how much risk do we wanna assume and how much friction are we willing, or can we afford, to introduce to our end users. All I’m suggesting is that the bar is rising on all of those dimensions.
Karissa Breen [00:08:46]:
Yeah. And I know it’s not an easy thing to sort of answer, so I know that we’re speaking in more generalized terms. So I had a discussion yesterday with the head of security from a large retailer here in Australia. And we were talking, going back to your point before, around friction. And it’s like, you know, some of these organizations, let me focus on the retailer for a moment, they don’t wanna introduce extra friction because it could result in people not checking out and not, you know, making them money. How do companies sort of then balance that? But then, as a result, this organization was in the news recently this year because of, you know, credential stuffing, etcetera. How do you find that balance and that equilibrium?
Andre Durand [00:09:22]:
It’s a great question and a great example, and we’re actually involved with many companies that are, you know, tackling both the security and the user convenience impact to the bottom and top line. And I’ll share generically, you know, what some are doing here. So one of our retail customers, a ways back, built a business case around the notion of logging in and abandoned carts, meaning someone might’ve responded to an email or an ad, put something in a cart, and kind of got cold feet and left. And when they come back, they have to register again. And so the thesis was, if we could figure out a way to not force users to register all the time, maybe we could figure out a way for them to register every 6 months, for example, or maybe once a year, not register, but authenticate, log in. And so that was the business case. That was the hypothesis: that some amount of top line was being lost to the friction of login and abandoned carts. Lo and behold, they implemented some of our technology that, in essence, leverages a growing number of signals, risk signals, that might indicate that an account or a session has been hijacked or an account has been taken over.
Andre Durand [00:10:35]:
And as a result of implementing this behind-the-scenes technology, they were able to achieve what was really the desired goal of not having users log in every single time they clicked on anything and came back to the website. They were just logged in, and they saw a massive increase. I can’t share the numbers, but a massive increase to the top line. So the presumption was actually accurate. Friction was robbing them of top-line business. Furthermore, they actually ended up seeing a reduction in fraud. So they ended up seeing what I kinda refer to as the trifecta for a retailer in the online world, which is they saw a better top line, they saw reduced fraud, and they delivered a better user experience, which speaks to NPS and loyalty.
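As a rough sketch of the risk-signal approach described in this retail example, the idea is to combine behind-the-scenes signals into a score and only introduce login friction when that score crosses a threshold. The signal names, weights, and thresholds below are invented for illustration and are not Ping Identity’s model.

```python
def session_risk(signals: dict) -> float:
    """Combine illustrative risk signals into a single score in [0, 1].
    The signal names and weights are made up for this sketch."""
    weights = {
        "new_device": 0.35,              # device never seen for this account
        "impossible_travel": 0.30,       # location jump since the last session
        "credential_stuffing_ip": 0.25,  # IP seen in known stuffing campaigns
        "unusual_hours": 0.10,           # activity far outside the normal pattern
    }
    return sum(weights.get(name, 0.0) for name, present in signals.items() if present)

def checkout_policy(signals: dict) -> str:
    """Low risk: keep the returning shopper signed in (no friction).
    Elevated risk: a light step-up. High risk: force a full login."""
    score = session_risk(signals)
    if score < 0.3:
        return "continue session silently"
    if score < 0.6:
        return "step-up: one-tap MFA prompt"
    return "re-authenticate: full login required"

# A returning customer on a known device at an odd hour stays signed in.
print(checkout_policy({"unusual_hours": True}))
```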
Andre Durand [00:11:16]:
Ultimately, retailers online measure distance and competition in really seconds and keystrokes, versus minutes and miles in the physical world. So one bad user experience, and people have too many choices where they can get an exceptional experience. And so loyalty is very fickle to the friction in the retail experience. Anyway, so there’s an example. We’ve since had many retailers implement what I’m describing, with similar top-line, bottom-line, and user experience impacts. It’s just a great example of where we’re progressing the technology to find a better balance between security and a broad, frictionless end-user experience.
Karissa Breen [00:12:00]:
And I think it’s a great example because that’s one that I’m probably a little bit more familiar with, and one that came up in my conversation yesterday. And I agree with you in terms of loyalty being fickle nowadays. I just wanna follow this example up a little bit more. Now, if we go back to retailers, would you say, using this Australian retailer that was in the media recently, they were sort of gambling with, well, we don’t wanna introduce more friction because, I don’t know, 10% of customers are not gonna check out, and we’re gonna lose 10% of revenue or whatever it may be. Then, because there was an issue, it actually meant a disruption to their business. They had to pay customers back, etcetera. So do you think they had calculated that risk? Okay, if we had maybe just implemented the friction at the start and front-loaded that risk to the revenue, perhaps we wouldn’t have had the long-tail impact. Now, I don’t know whether it’s the same or whether one outweighs the other, but do you have any thoughts on that, with your experience?
Andre Durand [00:12:59]:
I do. It’s such an astute question when you’re focused on the movie and not a frame of the movie. So everything relates to what you are willing to spend today based upon the probability of a future event that could cost you. And in all of the risk management math, I guess at the end of the day, there are statistics that say, for example, the average cost of an insider breach caught early, not late, averages $16,200,000 to an organization. Okay. Well, what is the probability of an insider event, from which I could derive my willingness to invest today to keep a possible event from occurring and a much more exorbitant fee being essentially extracted from the organization? It’s a tough argument. I’m just gonna say that.
Andre Durand [00:13:54]:
The reality is, with security and risk, while it is getting board-level attention and people do recognize that there’s a material immediate financial cost, plus an unknown but very large reputational cost, I think companies recognize it, but I wouldn’t say that is necessarily fully reflected in the investment or spend. So there does seem to be a disconnect, and, you know, maybe the industry needs to work harder to connect the probability of those events and the cost to the actual investments which are being made. I do think that there probably is work there to be done and maybe a little bit more science to be applied. You know, insurance, and some of the exorbitant fees through ransomware and other things as insurance companies try to recover from some really big, very, very expensive breaches, has certainly created a financial awareness that many companies can no longer avoid. They can’t afford to be without some level of cyber insurance and D&O insurance. And there’s regulation that is putting even more focus on the responsibility of boards and management teams to protect customer data and essentially behave in a cyber-responsible manner. So I do think that things are improving, but I think there’s a pretty big disconnect.
Karissa Breen [00:15:13]:
Yeah. I think that’s a great point, because when you sit around board levels, I used to write all the reports for the CISO of one of the largest banks here in Australia. So I sort of had an appreciation for the communication side of it. One of the things, just going back to the example again, is I had this discussion with this head of security. Do you think we’re just not doing a great job at explaining, hey, if you invest an arbitrary number, 100k, here, yes, we’re probably gonna, you know, spend a bit of money making an investment, whether it’s with, you know, Ping Identity or whoever. But then we potentially may not lose, I don’t know, a $1,000,000 worth of sales because, I don’t know, something happens down the line, or, you know, it’s a disruption to our business and we’re not making the revenue that we were.
Karissa Breen [00:15:55]:
Do you think that perhaps security practitioners are not conveying that to a CFO? And I ask that because I have spoken to CFOs, and really, if you zoom out, what they ultimately care about, or their function, is how much money the business is earning and how much money the business is sort of burning.
Andre Durand [00:16:11]:
And they have benchmarks, frankly, in IT, which includes security, that they also are monitoring, which are oblivious to the risk or cost of deferring risk. So, yes, I would say we are not doing a good job as an industry of connecting the dots. Now, one of the things that we do at Ping, both on the cost avoidance side, meaning cyber incident avoidance, as well as on the productivity or top line, say in the retail example, is we have spent a fair amount of time quantifying what we can and providing essentially value assessments or value calculators that are used by the champions and others who do leverage our technology, who, like everyone, seek budget from a finite pool of budget to invest proactively in the integrity of their security stack. So we have done a fair amount there. You know, that tool, and the effort that we apply to it in concert with some of our customers, does not extend to all of our customers by any means, by any stretch. And I would say with tools, whether it’s ours or others like ours, to, in essence, put a number to what would be a responsible spend and what the outcome and lowered risk would be for the CFO, we just have a long, long way to go.
Karissa Breen [00:17:29]:
So in terms of connecting the dots, with your experience and your background and the tenure you’ve been doing this, where do you think it’s sort of going wrong? Because this is an executive podcast. So this podcast is not just for CISOs. It is for CFOs and friends. I’m really curious then to hear your thoughts, because I think that this is still a problem that has been around for so long. Oh, we can’t communicate. Oh, it’s too technical. And really coming back to, well, what’s important to a CFO is what I mentioned before. It’s not about the technical elements of a product.
Karissa Breen [00:17:59]:
Can this make me money, and how much does the thing cost, really, when you zoom out?
Andre Durand [00:18:03]:
Well, I mean, if you net it out, the role of management is in the allocation of resources, people and investments in tech and processes, both capital and operating, and everyone is starved ultimately for what they would wanna feed into every part of their business. And some parts of their business are, you know, the cost of operations, and other investments really drive the top line. So when push comes to shove, you wanna feed the side of the organization which drives the top line in revenue, and you’re trying to cost-contain, at some level, what could be an endless amount of investment in the back office, so to speak. I do think that it just really comes back to, can we drive harder and more defensible return-on-investment analysis of this infrastructure? I think that’s probably the area, and that normalizes it, because everyone’s talking about a dollar spent here and what the ROI is, and a dollar spent there, what’s the ROI? And I just think we can and should do a better job on that front. And the risks are rising rapidly. I mean, we’re not living in the same threat landscape or digital infrastructure that we were a decade ago. It is materially more active and more sophisticated, as evidenced by the number of breaches and the amount of money that has been extracted from companies.
Andre Durand [00:19:27]:
It has been steadily rising, and really at an accelerated pace, I’d say, over the course of the last 5 years. So, whereas you might have been able to, you know, not focus on it because the probability of risk was x, now the probability is y. I think we’re entering a zone where it becomes somewhat unavoidable. We just need to be paying more attention to it.
Karissa Breen [00:19:49]:
Would you say as well that people are willing to, in air quotes, gamble, run the risk, until there is some type of incident? Because no one’s gonna willingly wanna pay more money for stuff like cyber, you know? And I know this because my brother-in-law is a CFO, and he talks to me a lot about these sorts of things. So is that a very true sort of, you know, reality, that people are willing to maybe gamble? Not all companies, I’m just speaking generally of the average sort of organization that is out there, until something happens, and then maybe they’d better do something.
Andre Durand [00:20:21]:
Well, I think the larger the company, the more risk-aware they are, and the more resources they actually do have. And you also have a series of regulated industries that have forced a level of access control assurance and other controls to ensure companies are, you know, not subverting the responsibility of protecting their systems. So in regulated industries and large companies with more resources, ones that have more to lose in their brand reputation, their entire business, say, for example, financial services, rests upon a certain level of trust from their customers that they are good stewards of their money, which effectively, you know, boils down to a field in a database somewhere. You think about that? All our value is stored as a field in a database, not as dollars or gold bars in a safe. It would be extremely damaging. So there are industries where I think they’re much more risk-aware and investing appropriately as a result. But as a general note, yeah, I would say that there are certainly a lot who, you know, are more worried about the top line, and this line item on the spreadsheet is not in the top 10.
Karissa Breen [00:21:34]:
I agree. I absolutely agree. Because I’m at the coalface of this industry. I’m interviewing people like yourself, you know, people that are outside the space. They’re saying, well, this stuff costs a lot of money. Cybersecurity is not cheap. It is a massive line item. Can’t see it.
Karissa Breen [00:21:48]:
Well, you know, we didn’t get hacked or breached this year. Clearly, it’s done its job. But, again, going back to the discourse, the way in which maybe a CFO sees things is fundamentally different from the way in which maybe you or I would see something.
Andre Durand [00:22:01]:
Well, and it could be endless. I mean, how much defense in depth is enough? And, again, to your point, if you haven’t been breached last year, it’s a decent argument. And, you know, human nature is, if it didn’t kill me last year, it probably won’t kill me this year, right up until it does. And so, I mean, this is an inherently challenging conversation to have.
Karissa Breen [00:22:21]:
No. I, a hundred percent, really do get that. And going back to your point around, you know, financial services, you know, I worked in a bank, worked across that. Absolutely. You know, you’ve got government bodies breathing down your neck. You have to be compliant in certain areas. So it’s a little bit different for, I don’t know, retailers and hospitality companies. You know, they don’t have that sort of overarching cloud above them forcing them to do things.
Karissa Breen [00:22:44]:
So, I mean, look, it’s not about having all the answers. It’s more just having a chat and trying to, you know, get insights from people like yourself to share with the wider community. So on that note, I wanna shift gears now and maybe talk a little bit more about, from your experience, what you’re seeing. Are you seeing that shift to companies leaning more into verification? I know we spoke a little bit more about, you know, the cost and/or having that mindset shift, but are you seeing that happening more now?
Andre Durand [00:23:14]:
Well, Ping is seeing it, probably because we have a solution here. It’s been around for a long time, but I would suggest that verification, you know, is expensive. Typically, it’s not a great user experience. In-person, in many cases, is a proxy for the verification. So for remote digital verification, you know, we’ve had different techniques for a while. Q and A, right? Question and answer for things that only you should know, like a form of secret, if you will, through the credit bureaus, you know, was, by and large, what the industry did to verify your identity remotely for a period of time, until all of those data sources essentially got hacked and the secrets were out. So all of a sudden you can’t rely upon some aspect of your financial information to be a secret that only you know. And so it has now shifted, at least in the self-service world, in the ability to verify a user.
Andre Durand [00:24:07]:
It’s shifted now to verifying, in some way, your physically issued credentials, think driver’s license, Real ID, passport, and combining that with your biometrics, liveness tests of you, and comparing it to the biometric on the issued physical document. I think that’s an interim state, because there’s a lot of room for mistakes in interpreting a falsified document, combined with deep fakes hitting right about now. But it’s the best we can do. I mean, trust in society, for the most part, does rest upon, when truly asked, let me see your driver’s license. Honestly, all kinds of legal mechanisms rest upon the government doing a job of vetting you, issuing you some form of identification, typically physical, and we ask to look at it as if we would know a fraudulent one from a real one. I mean, maybe if you’re at a bar and you’re used to recognizing fraudulent IDs, but for most of us, looking at an ID, I wouldn’t be able to tell a fraudulent one from a real one. And none of us recognize all the different formats of these IDs. In the States, it’s 50 different states and 50 different IDs, and even Colorado has changed the look and feel of the ID through the years. So I might recognize one and not recognize another.
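As a rough outline of the remote verification flow described here, the sketch below checks the document, the person’s liveness, and the face match in sequence. The helper functions are trivial stand-ins for what would be a document-authentication and biometric service, and the 0.9 match threshold is an assumption for illustration, not a vendor SDK.

```python
from dataclasses import dataclass

@dataclass
class DocumentCheck:
    looks_genuine: bool   # security features / template checks passed
    portrait: bytes       # face image extracted from the document

# Trivial stand-ins so the sketch runs; a real deployment would call a
# document-authentication and biometric service here instead.
def read_and_authenticate_document(image: bytes) -> DocumentCheck:
    return DocumentCheck(looks_genuine=bool(image), portrait=image)

def passes_liveness_check(selfie: bytes) -> bool:
    return bool(selfie)   # real liveness detection guards against replays and deepfakes

def face_match_score(live_face: bytes, doc_portrait: bytes) -> float:
    return 1.0 if live_face == doc_portrait else 0.0

def verify_identity(document_image: bytes, selfie: bytes) -> bool:
    """Document authenticity, then liveness, then face match: all must pass."""
    doc = read_and_authenticate_document(document_image)
    if not doc.looks_genuine:
        return False
    if not passes_liveness_check(selfie):
        return False
    return face_match_score(selfie, doc.portrait) >= 0.9  # illustrative threshold

print(verify_identity(b"license-scan", b"license-scan"))  # True with these toy stubs
```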
Andre Durand [00:25:25]:
But look, it is state of the art at the moment. There is so much fraud going on now that the willingness to introduce the friction of verification is higher than it was before, just flat out the need to secure transactions and reduce fraud. We had one of our financial institutions that, in the first week of implementing our verification upon loan origination, cut $300,000 of essentially fraudulent loan origination. In the first week. I would say there is a very heightened awareness now. People are looking at every step in the identity journey, from verification through authentication through authorization, and there just is a heightened awareness that this is what’s being abused, that this is what’s leaking money to fraud in many cases, or compromising accounts in a way that ultimately leads to some form of fraud and lost money. So we’re having to fortify every step in our identity perimeter against these attacks, because they’re so focused here now.
Karissa Breen [00:26:33]:
Would you say that’s what the big push is to get people to, you know, lean into verification? Because it’s like, well, 300k in one week, that’s a lot. I myself used to report on the numbers the bank used to lose per month, and it was a lot. And this was going back 10 years ago, so I wouldn’t wanna know the numbers now. Do you think it’s just more so financial? It just makes sense?
Andre Durand [00:26:54]:
It just makes sense. It just makes sense. And, you know, there are things that were hard to solve, where people were willing to assume the risk for the greater good of avoiding customer friction. I mean, think about, for example, the credit card industry, which operates on roughly 3%. Right? If fraud grows to a certain amount, the business model breaks. It doesn’t work. It can’t be profitable. And I would say most of our businesses presume some fraud, but at some point the fraud overtakes our business model. So it is an imperative that we stay ahead in the techniques that allow us to continue to conduct good business and keep fraud at bay.
Andre Durand [00:27:35]:
Now, all forms of value have always been and will forever be attacked. So we’re in a forever war on this front, unless everyone in society becomes a good actor, which I’m unfortunately not too hopeful about. So yeah. So I think zero trust, verify always. And it’s not just verification, as I described it, of a user’s ID. It’s verification of the device that the user is using. It’s verification of their user behavior. It’s verification of all the other signals which could indicate the authenticity of the user we think we’re interacting with, and it’s just gonna take everything we’ve got to ensure that integrity in our digital channels. There’s one other piece. I mentioned that state of the art at the moment is just verifying the physical credentials.
Andre Durand [00:28:21]:
There’s one step beyond this that I’m super excited about, and it’s right around the corner. And it’s when trusted parties issue digital credentials, something called decentralized identity. And the notion is your phone becomes a wallet for digital credentials, very much the way you would store an airline ticket or a movie ticket in your iOS wallet or the wallet on Android. But think much broader in terms of what that digital ticket, you know, or card could do. So we refer to them as digital credentials, and it’s all backed by, you know, some very, very good cryptography. But the notion is an individual can carry around certain proofs about themselves, digital proofs that they’re an employee of a company, or a loyalty member of this airline, or a customer of this bank, or a citizen with a Real ID, you know, issued by the government. We, today, as individuals, cannot carry digital proofs around with us to prove something about us when asked. So the act of verifying today is very onerous, and every company has to do it over and over and over again.
Andre Durand [00:29:29]:
We have to do that because we have no reusable proofs that individuals can carry around with them. That’s about to change, and that is massive for zero trust. We live in a world where, let’s presume zero trust. Okay. I wanna transact with you, and I don’t trust you. What am I willing to do? Well, not a lot. I’m not willing to risk a lot. It’s the reason why, if you get a new credit card for the first time, I’ll give you $500 of credit.
Andre Durand [00:29:56]:
I won’t give you $50,000, because I don’t trust you. There’s no history of interaction with you to extend that level of trust. So trust is built over time as our say-do ratio stays at one to one: we say it, we do it. When we follow through over a period of time, trust grows, and we, in relationships, are willing to extend higher and higher levels of risk. But that’s a process, and that process from low to high trust takes a lot of time. And in the gap, there’s a lot of money that’s lost. We could do a lot more if we could go from low to high trust fast. How do you go from low trust to high trust fast in new relationships? Well, one of the things we do is we borrow from your reputation with other entities.
Andre Durand [00:30:42]:
It’s what the credit score actually is. So if you’ve borrowed money from a whole bunch of other people and you’ve paid it back on time, your reputation or trustworthiness to loan money to and get it paid back is based upon your credit score. A credit score is reputational trust in a digital form. With these digital credentials, it is now possible that our reputation can be carried on our phone, in a wallet, with all these digital proofs, and that in new relationships we could go from low trust to high trust, meaning verifiable trust or verified trust, in a single click, in a millisecond. What would it mean for business if you can go from low to high trust with a click? It’s a really big deal, and we’re about to unleash that. It’s right around the corner.
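As a toy illustration of the reusable-proof idea, the sketch below has an issuer sign a set of claims once, so that any verifier who trusts the issuer’s key can check them in a single step, millisecond-fast. It assumes the Python cryptography package, and it is a simplification of the concept rather than the W3C Verifiable Credentials wire format or Ping Identity’s product; the issuer and claim names are invented.

```python
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer side: sign the claims once and hand the credential to the holder's wallet.
issuer_key = Ed25519PrivateKey.generate()
claims = {"issuer": "example-bank", "subject": "customer-123",
          "customer_since": "2016", "kyc_verified": True}
payload = json.dumps(claims, sort_keys=True).encode()
credential = {"claims": claims, "signature": issuer_key.sign(payload)}

# Verifier side: a single signature check takes a new relationship from
# "low trust" to "verified trust" without repeating the whole onboarding.
def verify_credential(cred: dict, issuer_public_key) -> bool:
    data = json.dumps(cred["claims"], sort_keys=True).encode()
    try:
        issuer_public_key.verify(cred["signature"], data)
        return True
    except InvalidSignature:
        return False

print(verify_credential(credential, issuer_key.public_key()))  # True
```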
Karissa Breen [00:31:27]:
Okay. So there’s a couple of things in there I wanna get your thoughts on. Just to go back one moment: digital identities. Now, in Australia, they’re pushing pretty hard for that, as you’re probably aware. So I don’t carry a physical ID on me anymore. I just use the app, and I show people if I need to. But in the comment section on LinkedIn, when the, you know, minister was announcing this, people just seemed rattled by it.
Karissa Breen [00:31:51]:
Why do you think people are so rattled by this?
Andre Durand [00:31:53]:
Well, there are always the unknown prospects of anything digital creating a digital trail that would somehow violate privacy in ways that people don’t know or appreciate or can’t see. And that’s very valid, honestly. It is very valid. So are these systems designed in such a way that we can do zero-knowledge proofs, where I can prove something about myself without giving away everything? You don’t need to know my name to verify my age. You don’t need to know all of this to verify that I live at this address. And so, you know, we talk about security by design, meaning it’s not an afterthought. There’s also privacy by design. And I do think that there is a well-founded fear of any new thing and what the unintended consequences will be at some future point in time.
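As a minimal sketch of the selective-disclosure idea behind “prove my age without my name” (in the spirit of SD-JWT-style credentials, greatly simplified), the issuer commits to salted hashes of individual claims so the holder can later reveal just the over-18 attribute without exposing name or address. This is not a full zero-knowledge proof and not any specific product’s format; the claim names are invented, and the issuer’s signature over the digests is omitted for brevity.

```python
import hashlib, json, os

def claim_digest(salt: bytes, name: str, value) -> str:
    """Salted hash of a single claim, so it can be revealed independently."""
    return hashlib.sha256(salt + json.dumps([name, value]).encode()).hexdigest()

# Issuer: hash each claim separately; only the digests would be signed
# into the credential (the signature step itself is omitted in this sketch).
claims = {"name": "Alex Example", "address": "1 Example St", "over_18": True}
salts = {name: os.urandom(16) for name in claims}
signed_digests = sorted(claim_digest(salts[n], n, v) for n, v in claims.items())

# Holder: disclose a single claim plus its salt, and nothing else.
disclosure = ("over_18", True, salts["over_18"])

# Verifier: recompute the digest and confirm it is among the issuer's digests.
name, value, salt = disclosure
assert claim_digest(salt, name, value) in signed_digests
print(f"Verified {name} = {value} without learning the name or address")
```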
Andre Durand [00:32:45]:
And there is a fear of increasing encroachment upon our privacy with every move that somehow attempts to verify identity. Attacking fraud in our society is positive, but there’s some unintended or potential future abuse of that system, by governments or otherwise. And again, I think the fears are well founded. Throughout history, there have been good intentions, but the downstream unintended consequences have been very, very significant. So I think part of it is education. Part of it is actually ensuring that the systems and techniques we use do have both a security-by-design and a privacy-by-design mindset. It’s not something that we wanna just do quickly because it solves one problem but introduces 5 problems that are more significant 10 years from today.
Karissa Breen [00:33:37]:
But would you also say that there are companies out there that are prematurely moving quickly without having heavy considerations around security by design and privacy by design? And then, potentially as a result, there’s a breach that happens. And then, therefore, what happens after that? People lose more trust. Oh, we shouldn’t have done that. And that becomes sort of a little bit, you know, cyclical.
Andre Durand [00:33:56]:
Yeah. I would say, if history is an indicator of the future, what you described is the human condition. So, yes. But, you know, progress is still made, to be clear. And progress is made on the skeletons of a lot of mistakes, many times. Maybe we could have gotten there faster with better foresight and better architecture and planning, but there comes a moment in time where you can’t foresee everything in the future and you just have to make a move. So, you know, I do subscribe philosophically, as an entrepreneur, to: look.
Andre Durand [00:34:26]:
There are moments in time where you step beyond the point of no return. You let the details sort themselves out. Otherwise, we would make no movement, stuck in analysis paralysis. So there’s a healthy tension between forward progress, knowing that we will learn and iterate but nothing’s perfect, and sitting back and contemplating all the risks before we’re willing to even make the first step. I think the tension is healthy. The balance is healthy. You could be reckless on either end of that spectrum.
Karissa Breen [00:34:55]:
I wanna touch on something as well with the deep fakes. So I had an interview yesterday around deep fakes, and the guy was based in the United States, where, obviously, they’ve got the election coming up. It was just based around, you know, what’s real, what’s perceived as real, all of that type of stuff. So what are your sort of thoughts then on deep fakes and verification? How’s that gonna look now, moving forward?
Andre Durand [00:35:15]:
Well, they’re very intertwined. So, you know, humans are biologically programmed not to authenticate, but to recognize. We use all of our senses to recognize people and things that we’ve seen before. And we are biologically programmed to remember good experiences associated with people or things. And two of our primary senses are our eyes and our ears, more so than our taste, our smell, or our fingers. Now, if you’re an animal, your smell is pretty damn good. Ours, as humans, not so great. So eyes and ears are our primary senses that are biologically programmed to recognize.
Andre Durand [00:36:01]:
And the challenge with deep fakes is that two of the most trusted forms of recognition of authenticity have now effectively been compromised. When AI can replicate what I look like in motion and what I sound like in voice, and do it in real time through a video chat, we’re in trouble. So how to reintroduce knowing versus trusting, zero trust, through all digital interactions is now going to be a challenge for the security industry and the identity industry. And we do have some answers, but they have not rolled out en masse. And given the speed with which abuse will occur, in the absence of education about how good these deep fakes can be when perpetrated along with social engineering, unfortunately a lot of money lost in scams aided by deep fakes is right around the corner. And we knew it was gonna happen, and it’s starting to happen right about now. And it is going to become a major issue.
Karissa Breen [00:37:04]:
Yeah. Most definitely. And sort of the consensus I got yesterday is basically, like, nothing, really. Like, just be mindful and just really watch what you’re doing.
Andre Durand [00:37:13]:
You know, without going into a technical conversation, there are ways for us to verify in video calls, in voice calls. There are ways, if we’re not sure or if the request seems odd. There are, right around the corner, very, very good techniques to allow us to reintroduce a verification step, peer to peer, between two humans that have a potentially compromised digital channel between them. But we’re gonna go from zero, because the tech exists but it’s not rolled out, and we have to go from zero to everyone and everything in a fairly short amount of time. That’s gonna be pretty daunting. A few years back, you really could kind of tell something that was rendered, take, like, a photograph of a house.
Andre Durand [00:38:02]:
In the last 12 months, I no longer can tell. And what I’ve noticed is that in my feed, the perfection, like utopia-like perfection, of everything has gone up somewhat exponentially. It’s like everything is perfect now. And literally, if it’s counting pixels, so to speak, and looking at the clarity, is that 4K or is that rough? Like, you can no longer tell. Authenticity is under attack in all digital forms. That is pictures, that is video, and it is now voice. And you’re seeing it encroach. It’s like, all of a sudden, the only way to recognize that something is real anymore is that it’s not perfect.
Andre Durand [00:38:46]:
And trust me, they’ll be onto that one as well. So how do I manufacture imperfections, since that now becomes the way of recognizing whether something is a deep fake or not? We are upon a real challenge, where everything that we presumed or trusted was real is gonna switch to the complete opposite end of the spectrum, where everything we see and or hear, we’re gonna presume is fake. It’s, unfortunately, a sad reality. Our kids are gonna grow up in a world of not believing anything outright and needing to verify everything. It is a sad commentary, unfortunately, that is an unintended consequence of the speed, convenience, and productivity that technology and digital has given us.
Karissa Breen [00:39:32]:
That feels exhausting, though.
Andre Durand [00:39:33]:
Well, and the challenge for the industry is to not make it exhausting. And, you know, there will be answers there. So don’t...
Karissa Breen [00:39:39]:
we’re looking at you, Andre. We need the answers.
Andre Durand [00:39:41]:
Yeah. And look, we’re working on it. So look, I’m optimistic that there is an equal and opposite use for every technology. Generally speaking, humans as a species have moved forward. You know, if you look at nuclear, you’ve got nuclear power and you’ve got nuclear weapons. Pretty much every technology that we’ve created, there is a positive and negative use. We try to regulate the negative use out of the system so we can enjoy the positive, so that we can, as a species, continue to move forward. So look, I’m optimistic that we’ll continue moving forward.
Andre Durand [00:40:16]:
However, will there be a lot of people who pay the price for our progress? Unfortunately, the answer is yes. There will be a great cost to a lot of unsuspecting or unprepared people for the progress which we’re making.
Karissa Breen [00:40:43]:
Thanks for tuning in. For more industry-leading news and thought-provoking articles, visit kbi.media to get access today.