Episode 187 Deep Dive: Brian Grant | Securing Your Data: A Social Responsibility

In this episode, Brian Grant shares his insights on data security and why organisations should prioritise securing their data before anything else. He emphasises that effective data security starts with a fundamental change in how organisations approach the issue: as a social responsibility, not just a business concern. The discussion also delves into the consequences of data breaches, including the potential for life-threatening outcomes, and the need for everyone to prioritise safety in the digital age. Additionally, Grant discusses the critical role of education and consulting in putting cybersecurity controls and investments in context, and how raising awareness can lead to a tipping point where organisations and individuals treat data security as a foundational value.

Brian Grant is the Thales Regional Director with responsibility for the Data Protection business in Australia and New Zealand. He started his career in technology when he enlisted in the Royal Australian Air Force, working on critical air traffic control systems. He has extensive hands-on and leadership experience in delivering innovative and unique solutions for organisations, with a particular emphasis on networking and cybersecurity. He has held regional leadership roles for a number of technology start-ups and is passionately committed to helping clients embed data security to better deliver confidentiality, integrity and availability in our data-dependent world.

Episode Transcription

These transcriptions are automatically generated. Please excuse any errors in the text.

Karissa Breen [00:00:30]:

Joining me today is Brian Grant, Regional Director ANZ at Thales Cloud Security. And today we're discussing common misunderstandings around data security. So Brian, thanks for joining. It's wonderful to have you on the show today. Now, you've made a comment that everyone is talking about cybersecurity today without necessarily knowing what they're talking about. So what do you mean by this statement specifically? And what is it that people aren't really getting?

Brian Grant [00:01:17]:

Yeah, really great opening question. So I suppose maybe I should start by unpacking the term cyber security, because cyber security is this all-encompassing term that pretty much covers any security around digital or computing environments. In actual fact, I would say the old definition of cyber security as being about information systems is really incorrect now, because data and digital systems completely dominate our world as it stands today, whether it be autonomous systems or IoT. Now, what we're getting into there is that cybersecurity is being seen as probably the toughest thing to do in IT today. It's really, really tough. And we sort of encompass that as doing everything for everyone. Data security itself is slightly different. It's a subset of cyber security, and it's to do with the data itself. And the best way I have of sharing that is: data security is to cyber security what seat belts are to vehicle safety. In the event of an accident with your car, your seat belt is what saves you from serious harm or worse. In cybersecurity, data security is what saves your data in the event of a cybersecurity incident. So the terms get used interchangeably, so to speak, but really data security is this subset that prevents data from coming to harm. And to be honest, in our data-dependent world, everything is driven by data now.

Karissa Breen [00:02:55]:

Yeah, I 100% understand what you're saying in terms of people using those terms interchangeably. And then I guess on that note, and I probably should have asked this before, people are probably more familiar with Thales Group than with Thales Cloud Security. So maybe you can just explain a little bit more of the demarcation.

Brian Grant [00:03:12]:

My analogy around seatbelts and cars is probably a good way of pivoting to that, because Thales Group builds complex things. As my old CEO here at Thales Australia, Chris Jenkins, used to say, we build complicated things. Thales Group does everything, from putting lasers on Mars. I don't know if you know, but the Mars rovers that are wandering around Mars use lasers that were actually produced by Thales. And it's a really cool thing, but that's not who I work for. I work for the specific group that works in data security, and we get up every day trying to make the world a safer place when it comes to the data we all depend on today. So we're sort of like the seatbelts in this digital sense. We're the seatbelt guys. We're trying to stop people's data from being seriously harmed. Thales Group builds complicated things like cars and trucks and satellites, and that's the bigger group. So we're a specific cyber security vendor inside that group.

Karissa Breen [00:04:11]:

No, absolutely. Most definitely. And I wanted to ask that because people may get the wrong interpretation of it, so I appreciate you making it a little bit clearer. Now, you mentioned before the seatbelt guys. Would you say that people in general think of themselves as the seatbelt guys or more the no-belt guys?

Brian Grant [00:04:37]:

I anticipate that today, most cybersecurity or information security organisations are the no-seatbelt guys, to be brutally honest, Karissa. That's the reality of the world we live in. Now, what is going to happen, what is happening, and you're seeing it today, you would have noticed the change to the notifiable data breach legislation to make it more onerous: if you have a data breach, it's now not a slap on the wrist, it could actually be a serious fine. And when I say serious fine, we're talking tens of millions of dollars. It's like seatbelts back in the late 60s and early 70s. Seatbelts were an optional thing; in actual fact, you didn't even have to buy a car with a seatbelt because it wasn't mandated. What then happened was government stepped in, because people were having accidents and basically our road toll was going through the roof as more cars came on the road. In a digital sense, that's where we stand today: not everyone's wearing the data security seatbelt. And as a consequence, we're continually having massive data breaches and successful ransomware attacks. What that points to is that people aren't wearing data security seatbelts. So what's now happening is government is stepping in and regulating or legislating to say, hey, you need to start doing something about this. And this is literally what happened in the vehicle industry back in the late 60s and early 70s. The government stepped in and made it legally required to wear a seatbelt, and that changed the world. I anticipate that this is where we're going from a data security perspective. The legislative and regulatory bodies will step in and say: if you keep failing to secure people's data, critical data and critical infrastructure data, we will make it a legal responsibility for you to do it better. And in actual fact, I think we're seeing that come about today.

Karissa Breen [00:06:43]:

So where do you think people's mindset stands around not wearing data security seatbelts? Or do you think their intention was to have that seatbelt? I mean, these things aren't necessarily easy to accomplish a lot of the time. It's quite complex and there's a lot going on. So I guess people's intentions may be right, but maybe execution's poor. But would you say that people do have the right intention and maybe they fall down on the execution front? Or do you think that some people just haven't really thought about it?

Brian Grant [00:07:04]:

I think everyone has the right intention. I'm a positive thinker around people and what they're doing and why they're doing it. I'm more of an optimist in that regard than a pessimist. Put it from this perspective: we all know that probably 50% of cyber incidents or cyber attacks are down to human error. It's a research fact. If you look at IBM's research, or our own, we find that probably about half, maybe even more, of all cyber incidents are due to human error. And in some of the recent massive breaches, the underlying cause was a human error. Now, because it's a human error, the immediate response is: well, if it's human error, if I can just train people or coach people to be better at cybersecurity, then consequently I'll stop the consequences, those being data breaches and successful ransomware attacks. The problem is that's a little bit of a fallacy. That's sort of like saying, if I make everyone get a driver's license, we'll never have a car accident. Even the best drivers in the world will make mistakes from time to time. And I do use the statement: to err is human. We are all human. Well, not everyone now, ChatGPT makes it hard to tell who's human anymore, but we're all human and we will all make mistakes. So what we've got to do is accommodate the fact that while people making mistakes is the cause, we've got to stop the consequences of those mistakes. We can mitigate it, we can reduce the risk, but until we secure what's at risk, being the data, then we're never going to successfully stop these massive data breaches and data incidents. So I think that's the problem here: it's a perspective of saying, hey, people are making a mistake, let's train them and then we'll save ourselves. Yep, that's a good idea, but it doesn't save you completely.
Networks are being used to attack our data. Hey, if we just secure the network, then the data won't be breached either. And again, networks are complex, and connectivity is so broad-ranging now. There's a saying in cybersecurity that security teams have to defend everything, yet it only takes one mistake or one exposure or one zero-day exploit that no one knows about, and suddenly you're compromised at that edge environment. And that's my story about cybersecurity being the toughest job in IT today. It is the toughest job, because guess what success looks like? Success looks like nothing. Success is nothing bad happening. So when you go to your board of directors trying to get funding, going, hey, I'd like to spend more money on cyber security or data security, and by the way, the outcome will be nothing, as in no bad incidents, no cyber attacks, no data breaches, no successful ransomware, no outages of critical infrastructure and no compromises of people's data or privacy information, that's a pretty hard sell, because the measure of success is nothing. It's not because people are deliberately not trying to do this. It's just because we're looking at it from the wrong perspective, if that makes sense.

Karissa Breen [00:10:39]:

Totally get it. And yes, of course, you're right: nothing happens. And that's why it's harder to prove, like, why am I paying for all this stuff? Well, nothing happened. Okay, so I've got a different question for you. I was on a panel discussion this week, and going back to the fines, obviously they're increasing to millions or tens of millions of dollars. Now, one of the guys beside me said, oh, for certain companies, tens of millions of dollars is nothing. And I think what he said was some of these guys could just literally pull it out in cash from the back of their Maserati. I mean, I've worked in a bank before in earlier days, and tens of millions to a company that makes $9 billion in profit per annum is not that much money for these guys. Do you think that for certain companies the fines don't really matter?

Brian Grant [00:11:34]:

Really good point. I think you're right: for a certain size of organisation, even tens of millions could be considered the cost of doing business. And this is where, in actual fact, it's funny you talk about the panel, because I get up in front of audiences every day for the exact same thing. And what I lead with is that this is actually not about business. It's not about return on investment. It's not about fines. At the end of the day, this is actually a social responsibility to keep data secure. And what I mean by that is that when data breaches occur, or ransomware attacks occur, or critical infrastructure is compromised and power doesn't work, or whatever it might be, the reality is it hurts real people. It hurts your friends, your family, your colleagues, your loved ones. So until we start getting it right, the consequence of this is not financial, it's actually social, it's human. We've got to appreciate that we hurt real people when we don't do this, and do this well. And that's really when it's going to change fundamentally within organisations: when the CEO and the board of directors sit there and go, we're not doing this for financial reasons. We're doing it because it's the right thing to do for humanity, for people. And as we become more data dependent, as we lean into more autonomous systems, as we start doing things like driverless vehicles and transportation systems, the responsibility for keeping data safe will become about more than just compromising people's privacy. Potentially, in the future, a data compromise could actually be something that kills someone. And at that point in time, we've gone one step too far. We've got to take responsibility from a social perspective for doing this. The fines actually allow you as a board to justify it, but the reality is we have to socially embrace the fact that data needs to be better secured. And once we do that, then we're really home.
Then we're actually making a change to the world where everybody understands that everything really is now data. I have a little story about Gucci handbags and if you have time later on, I might share my Gucci handbag story about why Gucci handbags rely on data.

Karissa Breen [00:14:16]:

Yes, please, because I own a Gucci handbag, so I'm definitely interested. So, Brian, okay, I've got something coming to mind as you were speaking about the social responsibility. Totally get it, totally agree. Now, as a media person, a journalist, a former practitioner, I got into this because, one, there's a lot of work we need to do in this space, and of course, interviewing people like yourself and sharing that out to my network so people can learn. But you are right. Now, a couple of things I want to approach you about. When you look at the consumer, when Medibank and all these other breaches happened, I actually got on the front line and did some interviews with people around how it impacted them as consumers. Some of the stories are pretty full on. Then I look at it from a business perspective. If old mate is sitting in his ivory tower just looking at P&Ls, he doesn't actually know who his customer is, and there's no face behind that customer. Do you think it'd be worthwhile, and I don't know whether it's me or someone else out there who does it, to actually say: hey, here's a story from one of your customers about what this data breach actually created, due to your inadequacy in not following through with what you said you were going to do, or flaws in your security controls, or whatever it is. This is someone on the ground, and how they got impacted. Now, I ask this because on that show Undercover Boss, when you see the CEO actually out there on the ground hearing people's stories, they change. It always breaks them a little bit, because they are at the coalface and they're experiencing it. So do you think there's a disconnect, that all these high-flying executives with their big salaries and fancy cars aren't actually seeing the consumer on the ground being impacted by their, I don't know, lack of awareness about cyber security? Because maybe they didn't give the CISO the budget they needed to do this stuff, and therefore there was a breach.

Brian Grant [00:16:05]:

My experience with this is that it's a lack of understanding and awareness more than a disconnect from the consequences. I'm pretty sure that 99.9% of people out there are trying to do the right thing. They're just approaching it from the wrong direction, the traditional way of trying to achieve that security and that outcome. Nothing against some of the standards we use, but I'll give an example. We often hear, and you probably hear this as well, a lot of organisations going: okay, we want to improve our cybersecurity posture, so we're going to implement the Essential 8, and we're going to get to maturity level 2 or 3 of the Essential 8. Now, if you look at the Essential 8, it has some really good recommendations in it. The interesting thing about the Essential 8 when it comes to data security is that the eighth point of the Essential 8 is about data. It says that, as part of the Essential 8, you should back up your data and have good backup policies. From a data security perspective, if I use the Maserati analogy again, what we're actually saying to organisations with that, hey, make copies of your data as your data security strategy, is like this. Say you go out, Karissa, and buy a Maserati, okay? And I say to you: hey, Karissa, to keep your Maserati safe and prevent you from harm, here's what I'd like you to do. I'd like you to buy another Maserati and put it in your garage as a backup. And by the way, to be really, really safe, can you buy a third Maserati and put it in someone else's garage? It's almost assuming that there's going to be a data compromise or an incident, and our way of recommending that clients and organisations prevent that is to go buy another Maserati.
Don't secure the Maserati, drive safely, put your seatbelt on, have insurance, all that sort of thing around keeping the original Maserati safe. No, our principle is: hey, go buy another Maserati, put it in the garage, and by the way, best practice is to buy a third one and put it in someone else's garage. Literally, that's what it says. Now, nothing against that; it is exactly what you should be doing. You should be making copies of your data so that you can recover. But shouldn't we be saying, hey, why don't we try to prevent bad things from happening in the first place? And that is what data security is. Backing up data is data resilience: being able to restore data in the event of an incident. Shouldn't we be trying to stop the incident in the first place, by securing the data and explaining to organisations and leaders how to secure data effectively? And the great thing about it, I'll be honest, is that data security is easy. You just need to do it. And that's the great thing about being in the data security business. When we do it well, people don't get compromised. It's great.

Karissa Breen [00:19:29]:

So going back to your comments before around the CEOs and executives: you said it's a lack of understanding and awareness. But what is it specifically? Is it that they don't know how to do anything at all? Is it that they're not sure about some of the compliance mechanisms if they're in a regulated space? We always say in this space, at any conference, oh, it's a lack of awareness. Yeah, but what is it specifically? We've been saying this stuff for 20 years. So when are we going to get to the stage where people get the awareness? Either we're doing a bad job as cybersecurity practitioners, or we've got really bad people internally who can't communicate very well, or there's something seriously wrong here.

Brian Grant [00:20:15]:

The fundamental question I ask every CEO I walk into and every executive I talk to is: are you data dependent? Do you use data for making decisions and delivering services and delivering change, or for disrupting traditional markets? And that normally gets them to say, well, what do you mean by data? My Gucci handbag is an example there. What that means is you've got to redirect their attention away from what the risk is to what the target is. And this gets to Sun Tzu's Art of War type principle: you can't defend everything, you've got to choose your ground. And if data is what is driving your organisation, and most organisations are data dependent today, then the CEO should be told: hey, we need to secure the data itself. And this is the concept of data security: it's the security of the data itself. It's not network security. It's not user awareness. It's not edge devices. It's not laptop security. It is the actual data itself. It drives me crazy, Karissa. I've been advocating this for years and years and years, and we're still having the same thing happen over and over again. And it's through people like yourself getting out there and saying, well, let's take a different look at this problem, because whatever we're doing today is not working. What we're doing is repeating it over and over again: we're buying the next shiny new thing and putting it on our edge devices or our network, anticipating that if I put the next technology on, that's going to stop it. But I think strategically we're doing this the wrong way. Our execution is reversed. If data is our most valuable asset and the thing most at risk, then that is where we should start. And I'll be honest, every cybersecurity organisation I go into is talking about how they're securing their edge and their users. No one's coming to me and saying: before we do anything else, Brian, we're going to secure our data. Data is actually the thing they do last, whereas it should probably be the thing they do first.
Because if data is what's really at risk, and what causes the most damage if exposed, compromised or unavailable, then shouldn't we do that first? And I don't know why. It's just something that makes complete sense to me and doesn't seem to make sense to anyone else. So anyway, that's my perspective of the world.

Karissa Breen [00:23:03]:

So you mentioned before that you can't defend everything, which I totally get, or else we as human beings would just never leave our homes and be wrapped up in cotton wool. There's always a risk that I could get hit by a bus, or something falls out of the sky and hits me on the head and then I die. Gosh, I'm touching wood when I'm speaking like this. But you understand the theory, so I get that. So would you say then that people aren't asking the right questions around what we should be leading with? You said before that people are putting data last rather than first. So maybe they're just not asking the right questions. For example, you go to a doctor with a sore arm, but you say your foot's sore. You're not getting to the root cause of the problem because you're not even asking the right question in the first place.

Brian Grant [00:23:55]:

Let me give you a practical story. I met with the CEO of a manufacturing organisation here in Australia. We were having an informal conversation in his office, and he said to me: okay, Brian, what should I be asking my cybersecurity team in order to keep my organisation safe? And I said, well, the problem, John (let's call him John Smith), is that you're biased. And he looked at me and went, well, what do you mean? I went, you have a perspective bias on this. You have a personal bias, but you're not aware of it. Let me walk you through this and I'll explain why. John goes, well, okay. I said, do you have a laptop or a computer that you use? He goes, well, yeah. I'm a modern 21st-century guy. I use my laptop for everything. I go, great. Okay. So let's pretend you came into work, opened up your laptop and found that the data on your laptop, the data that you use to work every day, is unavailable. Your laptop's down, you can't do anything. What do you do? He says, well, I call IT. No problem at all. Makes sense. So let's pretend you're down for the day. You can't do anything for a day on your laptop. How do you feel? He goes, oh, I'd be pretty annoyed. Great. Makes sense. Okay. Let's pretend your laptop isn't working or available for a week. No, no, no, let's make it a month. How would you feel? And he looked at me and said, well, someone's losing their job. I said, fair enough. I understand completely. Let me ask you a follow-up question. Your manufacturing systems run on digital platforms now. They use data. They're highly automated. You've got some great technology that you're using to run your business. Let's pretend the data that runs your manufacturing plant was unavailable for a day. What would it cost your organisation? And without blinking, he said, half a million dollars. He knew the number straight away. And I said, so let's pretend the data that's driving your manufacturing business, and the digital platform that works with it, was unavailable for a week. No, no, let's make it a month. What would happen?
And he stopped and paused for a second, and he looked at me and went, we're all losing our jobs. I said, that's the problem. You see IT and information systems from a personal perspective, which is basically your laptop on your desk. The reality is you should be focusing on that whole critical part of your organisation that is material to the entire organisation, and try to shift your bias away from your personal view of it to a stepped-back view of what really is important versus what's immaterial. And that allowed him to go back to his cybersecurity team and go: okay, yep, I know laptops are important. I know mobile phones are important. I know the network's important. What are we doing to secure the business itself, being the data and digital system that runs our manufacturing plant? And that's when he got to the essence of what he needed to do as a critical priority. And to this day, a couple of years on from that conversation, he still comes back to me and says: best thing we ever did was have that conversation, because it just changed my perspective of what's important to me, the organisation, my team, my staff, my family.

Karissa Breen [00:27:34]:

Wow, that's excellent. I think that's a great story to share, and like you said, it's a real live example. And it leads me back to my original point around asking the right question. Is the person in the room aware of how to get that perspective, like you got from the CEO, to get them to understand? Because you used analogies that were relevant to him. You used the right language. You understood from his point of view what would resonate with him. You weren't giving arbitrary, made-up scenarios that people can't relate to. I would say in this field that's probably one of the biggest problems I see, because sometimes people rattle off a whole bunch of things and I'm like, whoa, I don't even get what you're going on about, and I'm from this space. You've got to think about people who are not from this space, who aren't thinking about this every day the way we are obsessing about it. So I really appreciate you sharing that story. But you mentioned something as well around people having this obsession with trying to defend every single thing. Where does that stem from? Is it because security people just naturally want to defend, so for any risk, anything at all, they'll just go, I need to shut it down? Why do people have that mentality?

Brian Grant [00:28:44]:

I have my own bias, because I see the world as a set of data that we're continually consuming in all manner of ways. My view of the world is sort of Matrix-like: everything is data when it comes down to it in the modern digital world we live in. And so my perspective is, hey, if everything is data, then keeping the world safe starts with data. So I don't fully understand it, and I'll be honest, I just don't understand why. But I do understand that a lot of the cybersecurity controls don't necessarily prioritise certain things over others, or if they do, they probably prioritise the wrong things. For example, if you look at some of the standards we use, like the Information Security Manual and NIST and these sorts of standards, they are basically a checklist. It's a massive checklist of controls. It doesn't take a step back and say: well, okay, if you're this type of organisation, or if you have this type of social responsibility and you're collecting this type of information, this is what you should do first. It literally just gives you a to-do list. And maybe that's a contributing factor, because we don't put context around cybersecurity very often. We basically say: here are all the best practices, from user training to edge security to network micro-segmentation to firewall design and configuration. We list them all out, and again, it gets back to the question: we don't ask what's important first and then work from the important outwards. And everything in those cybersecurity controls is valid, don't get me wrong. Everyone should be trying to achieve the best cybersecurity they can. Think of what we do in data security as like the safe in your home where you put your jewellery and your most valuable things. Does that mean you leave your front door open? Of course not.
You do secure your front door, you make sure it's safe, you make sure you've got the right basic security in place. But if you've got a million dollars worth of jewelry, you're not going to leave it on the shelf and hope that the front door and the windows are going to keep it safe. You do something more in keeping with the value of it. And I think as the value of data has escalated or exponentially escalated, you look at some of the things we're doing now around the use of data. We didn't do it 10 years ago. Data didn't have this role in our life and in the world that it does today. And now we've got to take action to prioritize our investments around cybersecurity. And from my perspective, prioritizing that investment should be, if you're a data dependent, data critical organization should be on the data, not necessarily on the edge of a laptop. And actual facts, I've sat down with organizations and said, if someone's laptop gets compromised, what does it cost you? And it's probably time and effort to recover it, nothing else. But yet, if you've got 10, 000 users or 10, 000 laptops or 10, 000 devices, and you spend $10 a device a month to secure it, you add that up, and suddenly you're spending $100, 000 a month on that. And that's $1.2 million a year just to secure that device that if it was compromised in itself is not catastrophic. So why would I spend $1.2 million to secure something that doesn't catastrophically impact my business? And that's where the mistakes made. We're sort of coming from the wrong angle, but that's my perspective and probably there's 90% of the cybersecurity industry will probably disagree with me. You mentioned before around context.

Karissa Breen [00:33:06]:

I totally agree with you and I understand those standards. I mean, I've spoken to people at length about this on the show around so many things are overwhelming. That's why people don't do them because they're like, oh my gosh, there's like 3, 000 pages of stuff I've got to sift through and 70% of them aren't relevant to me. So how do I know where to start? Oh, what do people do when they feel overwhelmed? They don't do it. They procrastinate. I'll do it tomorrow, but tomorrow never comes, so I understand that. So Why isn't there context or is it just a big blanket sort of thing like here's all the things that we would advise but you've got to go ahead and trawl through it yourself to make the best sort of course of action? I think

Brian Grant [00:33:42]:

the consulting industry makes a great living out of trying to put things in context, and quite often they do a good job of it, to be perfectly frank. They do get below the surface and ask, what's important to you? Again, I think the reality is, if you look at where we are today, we've got to shift, nudge the needle in a different direction. I don't think any one person or any one thing will do it. You look at the incidents recently, and they continue to happen. I think we're getting to the point where, sadly, we're almost starting to accept it as an outcome that's always going to happen. That's really wrong. We shouldn't be accepting of bad outcomes. So, I don't know, you asked the question, but I'm passionate about trying to make sure that people are aware of this, to the point where we come into organizations and we go, look, we don't care if you use our capability. We only care that you do it. And I think the more people that come out and say, hey, just do it, just make sure that you're doing it, make sure that you wear that seatbelt and don't put your family and friends and colleagues in harm's way, the more we will materially change the world. And we just have to keep nudging it. You know, the old nudge economics principle, Karissa, where we just have to keep nudging people in the right direction, and keep nudging and nudging. And eventually we'll get to a tipping point where people will realize, and organizations will realize, and business leaders will realize, and executives will realize, and security leaders will realize, that this is a foundational building block of a safe world.

Karissa Breen [00:35:27]:

Yeah, most definitely. And I listened to a guy the other day, I think Ed Mylett, and he used this analogy which I liked. It was like a pinata at a kid's party, and he goes, you hit it once, nothing comes out. Hit it again, maybe nothing. He goes, you keep hitting it, and eventually it will come out. Now, obviously that's only like a 30-minute sort of thing to get the candy out of the pinata, but obviously this is a bit different in the context we're talking about. I think the same sort of theory applies to just trying to move the needle, have these conversations, get people on the show like yourself to explain your lens on the industry. So in terms of trying to conclude our interview today, Brian, I'm really curious to hear about the Gucci bag story.

Brian Grant [00:36:16]:

This gets to the principle that the world is built on data. I was trying to explain to my 18-year-old daughter that data is in everything we do, everything we love, and everything that relates to the world today. And she, like most teenagers, has a love of beautiful things, and she does love beautiful handbags. I was explaining to her that the Gucci handbag is, at its essence, just data. They sat down and started designing a handbag, and it was designed in a digital sense. That data was translated into a set of manufacturing specifications. Then they would have had to take those handbags and ship them out all around the world, and that shipping and logistics is all just data. And finally, the marketing email they sent you to say, come along and see the new handbag, the one that got you into the store to pay your ridiculously large amount of money to buy it, underneath it all that was simply data. Now, let me explain what happens if data isn't secure. Let's pretend the data was compromised at the design phase. Someone stole it, or manipulated it, or tampered with it. A cheap copy might have turned up on the market before the original handbag even came out, so what's the point? Our brand's been compromised, you would never have known about it, and therefore you would never have bought it. Your medical history, your family's medical history, your health and wellbeing, when it comes down to it, it's all just sitting as a set of data that, if not secured and kept safe, could bring harm to a person. I can tell you now, medical data, for example, is one of those things that is so intimate and so personal that if it's compromised and used against us in any way, shape or form, it's horrendous and terrifying.

Karissa Breen [00:38:23]:

I'll be honest, the world's built on data. Well, I'll be remembering that next time I go into a Gucci store and think about the interview I had with you. They might be down one customer now, but I love that story. I love the analogies that you use, because I'm a big analogy person, so I really appreciated that. So Brian, I've absolutely enjoyed today's conversation and your honest and real responses about the space. Thank you so much for sharing your thoughts and your insights and coming on the show today.
