The Voice of Cyber®

KBKAST
Episode 281 Deep Dive: Yuri Miloslavsky | Risks of Digital Footprint in Information Sharing
First Aired: October 30, 2024

In this episode, we sit down with Yuri Miloslavsky, CEO of SharePass, as he discusses the risks associated with digital footprints in information sharing. Yuri delves into the importance of awareness about the vulnerabilities introduced by our online activities and the challenges of balancing security with user convenience. We explore the cybersecurity industry’s need to simplify security processes like multi-factor authentication (MFA) to enhance user adoption, and the pivotal role of education in improving security practices. Yuri also addresses corporate privacy policies, the trade-off between convenience and security, and the necessity for tighter compliance and regulation to ensure transparent data management.

Yuri Miloslavsky is an IT professional and entrepreneur with over 15 years of experience in the industry. As the co-founder and CEO of SharePass, a cutting-edge digital footprint management and privacy protection platform, Yuri is at the forefront of developing secure solutions for the modern digital communication landscape. His expertise extends to building and running a successful Managed Service Provider (MSP) specializing in IT consulting and cloud services.

Help Us Improve

Please take two minutes to write a quick and honest review on your perception of KBKast, and what value it brings to you professionally. The button below will open a new tab, and allow you to add your thoughts to either (or both!) of the two podcast review aggregators, Apple Podcasts or Podchaser.

Episode Transcription

These transcriptions are automatically generated. Please excuse any errors in the text.

Yuri Miloslavsky [00:00:00]:
Privacy is a problematic area, but if we think enough and we kind of do our own risk assessment and act on it, we can at least reduce the risk. We can at least be aware of it and do something about it.

Karissa Breen [00:00:28]:
Joining me today is Yuri Miloslavsky, CEO of SharePass. And today, we’re discussing the risks of digital footprints in information sharing. So, Yuri, thanks for joining and welcome.

Yuri Miloslavsky [00:00:44]:
Thank you. I’m glad to be here.

Karissa Breen [00:00:46]:
Okay. So, Yuri, you know, you and I have spoken a lot over the time I’ve known you, and I’m curious to start with a little bit more around digital footprint, for people who aren’t familiar. I know we’ve touched on it in the past in our chats, but tell me, let’s start right there. What is it?

Yuri Miloslavsky [00:01:07]:
So it’s quite a wide term, but it generally means the trail of information and data that you leave behind when you use the internet, whether it’s browsing, sharing information, searching, or really anything, any information you put online. So your digital footprint is what’s left, the trails you leave behind, which can be identifiable information, and it stays there after you’ve moved on.

Karissa Breen [00:01:31]:
Yeah. Of course. So I’m gonna ask this: do you think people, like, don’t really care about that, though? Like, just with everything that’s happening at the moment, like, I get it, but do you think people care?

Yuri Miloslavsky [00:01:47]:
People don’t care because they’re not necessarily aware of it. So, you know, usually, if there are certain risks and you’re not aware that there is a risk, then obviously you kind of care less. So people tend to prioritize accessibility rather than their own privacy and security because they’re not always aware of the risk. Yeah.

Karissa Breen [00:02:05]:
But I also think that perhaps, you know, even speaking to people of different generations, some people say they just don’t care even if they are aware of the risks. I was speaking to someone on the weekend about, you know, these are some of the risks, and people just go, oh, I don’t really care. Like, all these other breaches have happened, and all these other things are happening. I don’t know. Do you think that perhaps it’s like, well, there’s so much going on in the world, it’s just another thing I’ve gotta think about and worry about? And until it happens to them, as we know, do you think people are just not gonna really think about it or care?

Yuri Miloslavsky [00:02:38]:
I think the awareness is slowly going up and people are a little bit more aware of the risks. So, yes, there will always be people that kind of care less about those things. There are also people who don’t care and still, for example, use very, very simple passwords or, like, no MFA. And you will always find those types of users. But the majority of people are becoming more and more aware. And as technology progresses and the new generation becomes more technological, I would say people understand, in my opinion, more and more that their online presence is actually important and critical, and it can really put, like, pretty much everything at risk, starting from the financial and on the way up to, you know, the private and so forth. We do everything online today.

Yuri Miloslavsky [00:03:23]:
So I think awareness is going up. It’s still not quite there, but I believe people care more about it.

Karissa Breen [00:03:29]:
Okay. So you said before online presence, you know, when people are aware, they care more about it. When you’re speaking to people, what do they care most about, would you say? If you had to guess.

Yuri Miloslavsky [00:03:39]:
Well, people, most of all, care about just accessing data and that being convenient and user friendly. So people love when it’s easy. And in the pursuit of easy, sometimes they give up their personal information. They give up a very important aspect of privacy.

Karissa Breen [00:03:54]:
And, I mean, that sort of leads me to my next point: people are willing to trade, I mean, we’ve heard this term, security or privacy for convenience, which I understand. Right? So work out that conundrum for me. Like, I get it. Like, if I put my general person hat on, I understand it. But then, of course, from a security point of view, it doesn’t work. But then sometimes, the way in which cybersecurity practitioners think, people just can’t operate like that. We can’t have all of these things and all these barriers or else things would never get done. Right? So where do we find the equilibrium?

Yuri Miloslavsky [00:04:31]:
What happens normally is that today there is quite a wide gap between the industry and the end user. And as you said, convenience is quite a major factor for the typical end user because they just wanna get things done, and they want it done simply and easily. And this is why I believe there is a gap, because the industry tends to offer a lot of solutions for a lot of problems, but quite a lot of these solutions have quite a steep learning curve. And some of them tend to be more complex. And a lot of them tend to also complicate things to the point where the user says, well, you know what? I will accept the risk, because it’s just easier and I cannot be bothered with all those tools all the time. So the trick is to find and to produce the right tools that are simple enough for users to use, for example, like in the click of a button. Right? So if I’m doing something and it only requires, like, another tap from me, then in a way it’s acceptable, at least for me. If suddenly you need to change the whole process and you need to now onboard a new tool and you need to use it in a different way, and there’s a lot of those pop ups and annoying things, then the user will be like, yeah, well, I don’t have time for this.

Yuri Miloslavsky [00:05:43]:
So it’s a really kind of, I would say, sensitive balance that we need to find as the providers, as the industry to allow the users to still carry on their work while keeping it simple.

Karissa Breen [00:05:54]:
So can I ask, how did we get to this point of overcomplicating things for people, where people are saying, this is so complicated and convoluted, and as a result of that, I’m choosing to just go with the option which probably isn’t the best option, but it’s just easier? Have people not paid attention to how users operate, or what?

Yuri Miloslavsky [00:06:15]:
We have two sides here. One is basically, you know, the world of hackers and whoever does the online crime. They’re getting more sophisticated. And because they’re getting more sophisticated, the industry, you know, the cybersecurity industry, has to become more sophisticated. And part of that is becoming more technical. Now, it was always a challenge to take technical concepts and explain them to everyday end users because, well, they don’t do it on a daily basis. They don’t necessarily understand it. I can take an example.

Yuri Miloslavsky [00:06:46]:
Have a look at MFA. It’s obviously recommended and urged to be deployed in all organizations, but there are still many companies, and I’m even talking about companies that haven’t implemented it because of the complexity users have just using it. Just saying, oh, I have the SMS, then it arrives, and suddenly, what do I do with it? A lot of people get really confused even in the onboarding process. Now you can see the percentages going up, where you see more and more users. So it’s a matter of deploying those solutions and kind of getting used to them. And it just takes time. Not everything becomes immediate, but once we create enough awareness and we provide simple enough tools, it will improve, because more and more users will become more aware.

Yuri Miloslavsky [00:07:30]:
More users will try, more users will learn the process, and, you know, as a result, the tools will become effective. But it’s a learning curve, and it takes time.
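
For readers curious about what is actually happening when one of those authenticator or SMS-style codes arrives, the sketch below shows one common MFA mechanism, a time-based one-time password (TOTP, RFC 6238), using only the Python standard library. It is a generic illustration of the technique, not the method of any vendor mentioned in this episode, and the secret shown is a placeholder.

```python
# Minimal TOTP (RFC 6238) sketch using only the Python standard library.
# It illustrates what an authenticator-app code is: a short number derived
# from a shared secret and the current time window.
from __future__ import annotations

import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, at: float | None = None, digits: int = 6, step: int = 30) -> str:
    """Derive the one-time code for a given moment from a base32 shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    number = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(number % (10 ** digits)).zfill(digits)


def verify(secret_b32: str, submitted: str, window: int = 1, step: int = 30) -> bool:
    """Accept the current code, or one step either side to allow for clock drift."""
    now = time.time()
    return any(
        hmac.compare_digest(totp(secret_b32, now + drift * step, step=step), submitted)
        for drift in range(-window, window + 1)
    )


if __name__ == "__main__":
    demo_secret = "JBSWY3DPEHPK3PXP"   # example base32 secret, not a real credential
    code = totp(demo_secret)
    print("current code:", code)
    print("verifies:", verify(demo_secret, code))
```

The point for the usability discussion is that all of this sits on the server and inside the authenticator app; the user’s entire job is to approve a prompt or copy six digits.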

Karissa Breen [00:07:39]:
What do you mean by awareness though? Now, I know it sounds like a very basic question, but people say we need more awareness. I think people are aware though, and I think they choose not to do it. I’ll give you an example. I was at an event. It was in an office, a sophisticated office. The guy had his password literally written on a Post-it note. I was rattled. He’s like, yeah, I probably shouldn’t do it, but it’s just easier.

Karissa Breen [00:08:01]:
This is a real example. This was recent, like, a couple of months ago. So people say smoking’s bad, but yet people still do it. There’s awareness. They’ve tried to crack down on it, tried to tax you more. People are still doing it. So, dude, how do we get past that point, though?

Yuri Miloslavsky [00:08:16]:
Yeah. Well, it kind of reminds me of dealing with children. Right? Sometimes, until they get hurt, they will keep on doing the same thing no matter how many times you warn them. And, unfortunately, it’s just the simple reality of humanity. We all assume it will not happen to us. We’re all kind of like, yeah, yeah, I understand the risk.

Yuri Miloslavsky [00:08:31]:
I understand all that, but it’s not really relevant to me. Who am I? I’m just a regular person. And everybody assumes that, and the next thing that happens is they get hacked, and then suddenly they realize what sort of a hassle it is, because now they need to reset passwords, call the bank, call the insurance company, call this, call that. And the problem is that once your online identity gets breached, it just keeps on rolling. Like, you don’t know where it stops, because everything runs online. So once your account has been compromised, your bank details, for example, get compromised, and your credit cards. And from there, the road is very long. So, unfortunately, some people have to get burnt to learn the lesson, and I don’t think there is a way around it, to be honest.

Yuri Miloslavsky [00:09:12]:
But then again, that’s our responsibility, as the industry, as the providers, as the vendors, to educate as many people as possible so at least they understand the risks. And if people are willing to take that risk, then, you know, at the end of the day, it’s a free country. It’s a democracy. And if that’s what they want to do and they choose to do it, then they just need to be aware there is a risk. But the problem is that sometimes it’s not their risk to take. And I think that’s where the problem begins, because if a user risks his own online identity, his own information, it’s one thing. But if he does those practices in an organization, it’s quite a different issue, because they risk somebody else’s assets. And then they can be liable with their job.

Yuri Miloslavsky [00:10:01]:
They can be liable, you know, for that in general. So it really depends on the case.

Karissa Breen [00:10:07]:
Okay. So there’s a couple of things in there which are interesting. So you said before, like, you know, there’s that broad awareness that people have to understand. Do you think perhaps vendors end up educating on the tech rather than the users? Now, I’m looking at content from companies across the world, large, medium, small vendors, all the time. Do you think people are too focused on that? Because really, at the end of the day, like, it does need to be focused on the users in terms of, like, what do they get out of it? Because, really, no one wants to do more things than they have to do. People have already got enough on their to-do list, and if they can try to perhaps get back 30 seconds by not doing the extra, you know, authentication with MFA, they probably will.

Karissa Breen [00:10:49]:
But do you think that perhaps there’s just been a lack of understanding of people at sort of a fundamental level, and living in a perfect world where people are gonna do all these checks and balances?

Yuri Miloslavsky [00:10:59]:
Again, I think it’s a matter of progress across the timeline. If you can remember where we started with computers, quite a lot of systems didn’t have passwords at all. They would be just open. And that, at some point, was considered to be fine. And sometimes those systems were quite sensitive systems. But as the risk increased, solutions emerged, and today it’s like an obvious thing. When you access something online, it’s pretty much already obvious that you have a password, and nobody challenges that anymore. Well, people do challenge passwords.

Yuri Miloslavsky [00:11:29]:
They want other methods, I understand, for convenience, but nobody challenges the fact that you need to have it segregated and you need to have it locked. And the same will happen with the majority of the technologies we see today. Antivirus is another example. No computer used to have it. It came much later. Initially, there was no antivirus because there weren’t a lot of viruses. So, as I said, as the dark side of online activity emerges, so do the methods to protect against it. And the implementation of it, yes, it just takes time across the timeline.

Yuri Miloslavsky [00:12:06]:
And that’s the only thing I see. So in 50 years, whether it’s a fingerprint or face recognition or voice recognition or anything else, it will still remain. It will just be simplified. And I guess that’s what the industry is trying to do. We just want to simplify our solutions so the users do not consider them a major burden.

Karissa Breen [00:12:25]:
And I think that’s the way it probably does need to go. Because, again, like, in a perfect world, we’d love to do all of these things, but that’s not the reality of how people are and what they care about. These things feel like friction. So then on that note, and I’ve used this example before, if you’re looking at an ecommerce company, your job effectively is to make money. And making money means getting people to buy things. If there are so many things that create friction, like so many MFA steps, the average person is like, oh, I don’t know this. I don’t know that. I can’t be bothered.

Karissa Breen [00:12:53]:
Therefore, I’m going to leave the platform because it’s all too hard for me to procure something through there. So would you say companies out there know that and are willingly gambling with that fact of, well, I need Yuri to buy from my platform because that’s what we’re here to do? At the end of the day, businesses are here to make money. Security is a very big part of that, of course. But I would say, from conversations I’ve had with certain companies out there, that they do gamble on that. So it’s like, well, we don’t increase friction, because then we’re not gonna collect the money from a bunch of people, because they’re gonna get frustrated because they don’t know their password, they don’t know how to reset it, they can’t be bothered, and then they’ll leave.

Karissa Breen [00:13:32]:
Do you think, from your perspective, companies are acutely aware of that and are, in effect, saying, well, we’re just gonna overlook that? And if something bad happens, then they’ll say, oh, well, we didn’t know, or we tried our best.

Yuri Miloslavsky [00:13:46]:
Oh, 100%. 100%. You can see it, and you can prove it by the fact that we have cybersecurity insurance. That’s what cybersecurity insurance is for. So companies are well aware of the risk. They’re well aware that there will be breaches, there will be problems with security, and they, you know, basically put policies around them to protect themselves, cyber insurance being a part of it. So 100% yes.

Karissa Breen [00:14:13]:
But how do we get people past that? Because at the end of the day, like, you need customers to trust you to buy from you. And the thing that we’ll probably start seeing, and I don’t know if you agree on this, would be, well, people are like, it costs a lot of money to implement all of these things. And as people say, it’s not tangible. We can’t see the thing that we bought with all these millions of dollars and all these tools and all these people and all of this. But do you think that companies will consistently gamble that risk with their customers because they wanna collect the revenue? They wanna keep going past Go and just hope that they don’t end up in jail on a proverbial, you know, Monopoly board.

Yuri Miloslavsky [00:14:50]:
Companies need to, first of all, always be, you know, following certain regulations and compliance. And you can see compliance is increasing. So there is obviously a move to force companies to think more about security. And this is why, for example, even during the development phase, there is security by design, where security is being implemented from the ground up. So it’s not like you design a piece of software and you’re like, oh yeah, but we don’t have security, let’s put it in now. It’s something that you need to consider as a vendor from day one. And software that by design is not secure enough will be, you know, replaced with other software which is.

Yuri Miloslavsky [00:15:26]:
And again, it’s just a matter of time. So companies are definitely addressing that. They’re also aware that there is still a risk. They have to be, because, like you said, profitability is probably the number one priority for all businesses, even for the ones that claim it’s not, because that’s the whole idea of a business. So does profitability come as a priority on top of security? Probably. But, again, if your security is not in place, your profitability will suffer, whether it’s fines or whether a breach of your data would, you know, result in millions of dollars if it’s a large company. And even for small companies, you can’t measure the damages, because they’re damages to reputation.

Yuri Miloslavsky [00:16:09]:
So for companies that get breached, usually there is a big hit on their reputation, and reputation is everything for them. So it’s a direct hit on their profitability. So if they think about revenue and revenue only, well, cybersecurity is something they definitely have to consider.

Karissa Breen [00:16:25]:
So, Yuri, you also talk about current practices on information sharing. So perhaps give us some examples.

Yuri Miloslavsky [00:16:31]:
We’ve been in the industry for quite a while. I run an MSP. And what we see a lot is users sharing data over any tool. Now, I’m not singling out one or the other. People will share data everywhere, whether it’s social media or chatting tools like WhatsApp, Microsoft Teams, email, even SMS. And it’s convenient, because people love convenience. So I want to share something with you and I want to do it over SMS, or I’ll do it over WhatsApp, because those are programs I already have on my mobile and they’re easy to use. Email is another example.

Yuri Miloslavsky [00:17:02]:
Everything’s being shared over email. And as I always say, those platforms are fine for sharing information, but the problem is that the information will remain on those platforms after you’ve shared it. And as users, we are not aware of it. So I will send you something. It could be an address. It could be details of a client. It could even be a password. And once I send it, I’ve sort of achieved my mission, because I’m like, yeah, I needed to get that information to you.

Yuri Miloslavsky [00:17:31]:
That’s my line of thought. And after the information is with you and you’ve achieved what you needed to do, we’ve both forgotten about it. It’s no longer relevant, but the digital footprint is there, which means that information is on my mobile and that information is on your mobile. So there are two points of weakness, and it’s just out there in clear text. So your phone could get compromised, or mine. In my case, I can vouch for my own security, but in your case, I sent you that data and I have no idea what you’re doing with that phone. Maybe you’re leaving it unlocked in the middle of the street. I have no way of knowing.

Yuri Miloslavsky [00:18:05]:
And that’s where the risk begins.
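
The persistence problem Yuri describes, a secret sitting in clear text in two chat histories, is what one-time secret links are meant to address. The sketch below is a generic, hypothetical illustration of that pattern, not a description of SharePass’s actual product or API: the sender shares a throwaway token instead of the secret itself, and the first retrieval destroys the stored value.

```python
# A minimal sketch of the "one-time secret link" pattern: instead of pasting a
# password into a chat, you share a short-lived token; the first retrieval
# returns the secret and deletes it, so nothing useful lingers in either chat
# history. Generic illustration only.
from __future__ import annotations

import secrets
import time
from dataclasses import dataclass


@dataclass
class StoredSecret:
    value: str
    expires_at: float


class OneTimeSecretStore:
    def __init__(self) -> None:
        self._store: dict[str, StoredSecret] = {}

    def put(self, value: str, ttl_seconds: int = 3600) -> str:
        """Store a secret and return an unguessable token to share instead of it."""
        token = secrets.token_urlsafe(32)
        self._store[token] = StoredSecret(value, time.time() + ttl_seconds)
        return token

    def take(self, token: str) -> str | None:
        """Return the secret once; afterwards (or after expiry) the token is useless."""
        entry = self._store.pop(token, None)
        if entry is None or time.time() > entry.expires_at:
            return None
        return entry.value


if __name__ == "__main__":
    store = OneTimeSecretStore()
    link_token = store.put("db-password-example", ttl_seconds=600)
    # The token (for example, embedded in a URL) is what goes into the chat or email.
    print("first read :", store.take(link_token))   # the secret
    print("second read:", store.take(link_token))   # None, already consumed
```

The design choice that matters here is that what ends up in both chat histories is a token that stops working after one use, rather than the secret itself.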

Karissa Breen [00:18:06]:
And, again, do you think people are aware of these risks? But, again, it’s like, oh, I couldn’t be bothered trying to figure it out. I just need to text Yuri because I need to fly out of the country soon, so therefore it won’t matter. A lot of those sorts of theories are going on in people’s minds. Like, maybe they kind of are aware, but they’re just sort of banking on the fact that, again, it doesn’t happen to them, or, you know, I’m just one person. Those are the thoughts that are going through people’s minds.

Yuri Miloslavsky [00:18:30]:
Some people. People who are, for example, more aware of privacy. For example, I’m a very, very private person. I look after my own privacy, and while the majority of people, for example, don’t read privacy notices and things like that, I do. Not all of them, maybe, but most of them. So some people are not aware, of course, and some people don’t even care, but this is where we come in. And our job is to explain and to keep highlighting that it is a risk.

Yuri Miloslavsky [00:18:55]:
And, you know, we can all implement all sorts of tools, and companies can implement tools that cost thousands and tens of thousands of dollars, just for the regular end user to eventually go and share database passwords over Teams. And then if that Teams account gets compromised, that’s it. From there, you can leverage your way into the system and do a lot of damage.

Karissa Breen [00:19:19]:
Yeah. And I understand that. And I think that, you know, at the end of the day, I put two hats on: my journalistic hat, you know, asking people like yourself in the industry these types of questions, and then my, like, general person hat. It’s just always interesting to look at things from both sides. So you talk a lot about the price you pay for accessibility and convenience, as we’ve discussed, but I really wanna follow this a little bit more, because, you know, at the end of the day, companies care about making money, and I guess your agenda is different. But to your point, people go into business to make money. So how does this then make sense? Do you think that companies care up to a point, until they get pinged? And do you think that, again, they’re trading and they’re gambling on these things? Because we’ve seen it time and time again. And companies will then put something up on their site to say, we care about your privacy, and all of that.

Karissa Breen [00:20:19]:
But it’s like, hey, well, you didn’t invest in x company or you didn’t invest in better policies or practices to avoid this issue. So I feel like now people have become a bit more skeptical towards companies, and, you know, are they genuine in saying that? Because anyone can say that, but as you know, it has to be followed up with action. So what are your thoughts on this?

Yuri Miloslavsky [00:20:38]:
100%. A lot of companies say that because they know they have to. But the good part of it is, as I said previously, we have more and more compliance being introduced, which, again, has its benefits and its drawbacks. But for companies, it’s no longer really a choice in many cases. A lot of it is being enforced today. And it’s good in a way, because that enforcement is a benefit to the end user. So how can I, as an end user, know for a fact what security is implemented at a certain company? I can read the website, and, obviously, they will make it look good. But behind the scenes, maybe it’s not their priority.

Yuri Miloslavsky [00:21:18]:
And it can happen. But as you can see, you know, compliance is advancing quite fast. There’s ISO 27001, you have SOC, you have a lot of types of compliance, and some of them are being enforced. Like, you have to have it in order to provide certain services. So if you provide payment gateways, then you need to have PCI DSS compliance, for example. And a lot of it is ticking the boxes, but by ticking the boxes, you resolve a lot of those issues, not all of them. But I think awareness in businesses around cybersecurity has increased tenfold. Year on year, you can see more and more companies take it more seriously, because the number of breaches out there is increasing constantly, and, you know, that drives improvement. So I’m quite optimistic about it, to be honest.

Yuri Miloslavsky [00:22:04]:
I think security in general is improving. It’s not degrading.

Karissa Breen [00:22:07]:
Okay. So PCI DSS, for example. Now, this is a really interesting one. I get that, but there are a lot of people out there not really governing or auditing what these people are doing. I’ve even called up Mastercard for an interview about that. The response was no, we don’t wanna do an interview. So again, you’re gonna think about how much time this all takes, and I mean, I worked in a bank, I worked in a governance role. So it’s like, do you think that, again, it’s like, well, how much money does all of this cost? That may outweigh the potential money we may lose on, you know, the fraudulent side of things.

Karissa Breen [00:22:41]:
Do you think businesses really are looking at it like that?

Yuri Miloslavsky [00:22:45]:
Oh, it is. So, for example, a bank or a credit card company, like, think of a certain risk. Right? You’re saying, okay, we have that risk. Alright. As a result of that risk, if that risk becomes real, we lose 5 million. Okay? Let’s estimate it at 5 million. Alright.

Yuri Miloslavsky [00:23:04]:
Fixing that risk, dealing with the problem, will cost us 50 million. Okay. You know what? We’re happy to accept the risk. And it happens. It happens as part of the profitability and the budget estimation and planning of each company. Certain things they can get away with and take the decision. Other things are being enforced, and they cannot. And now you can see CISOs who made the wrong decision suddenly, you know, being prosecuted, and you can see extremely high fines.

Yuri Miloslavsky [00:23:35]:
And it happens. It’s complex. Again, when we’re talking about those mega large companies, do they have certain things going on where they’re not solving problems but rather kind of hiding them? Possibly. But I don’t think that’s the general idea. I don’t think that’s how it works for the majority of businesses.

Karissa Breen [00:23:57]:
Well, I mean, I speak to a lot of people, and I’m hearing a bit of this coming through. It’s like you said: if it’s potentially $5 million of money we could lose, or it’s $50 million to fix the problem, people are gonna go with the $5 million. They absolutely will.

Yuri Miloslavsky [00:24:10]:
The interesting part is they have an insurance policy that covers it. So take, for example, a bank. Right? If you had an unauthorized transaction and that unauthorized transaction came from a known vulnerability, okay, for the bank, they’re saying, well, those unauthorized transactions in a year cost us, I don’t know, a few million dollars. But we are aware that to fix that, it’s an overhaul of the whole system. It’s downtime. It’s this, it’s that. Like, we cannot afford it, because if we do that, our competitors obviously will use that and basically, you know, take advantage, and we will lose our competitiveness as a result. So, obviously, there are business considerations here, and the company needs to make the right decision for itself, but they also need to maintain a high level of security.

Yuri Miloslavsky [00:25:01]:
So I guess it’s a complex process, but I’m sure that’s how it works.
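
The 5 million versus 50 million reasoning Yuri walks through is essentially the annualized loss expectancy comparison used in standard risk management. The short sketch below works that arithmetic with the hypothetical figures from the conversation; the numbers and the simple accept-or-mitigate rule are illustrative only.

```python
# A tiny worked version of the trade-off described above: estimate what a risk
# is expected to cost per year and compare it with what mitigation costs.
# All figures are hypothetical, borrowed from the numbers used in the conversation.

def annualised_loss(impact_per_event: float, events_per_year: float) -> float:
    """Classic ALE: single-loss expectancy times annual rate of occurrence."""
    return impact_per_event * events_per_year


def decide(impact_per_event: float, events_per_year: float, mitigation_cost: float) -> str:
    """Naive rule: mitigate only when mitigation is cheaper than the expected loss."""
    ale = annualised_loss(impact_per_event, events_per_year)
    return "mitigate" if mitigation_cost < ale else "accept (or transfer via insurance)"


if __name__ == "__main__":
    # "If that risk becomes real, we lose 5 million... fixing it will cost us 50 million."
    print(decide(impact_per_event=5_000_000, events_per_year=1.0,
                 mitigation_cost=50_000_000))        # accept (or transfer via insurance)

    # The calculus flips once fines and reputational damage push the expected impact up
    # or the mitigation turns out to be cheaper than feared.
    print(decide(impact_per_event=35_000_000, events_per_year=1.0,
                 mitigation_cost=5_000_000))         # mitigate
```

Cyber insurance fits the same arithmetic: it does not reduce the expected loss, it transfers part of it to someone else for a premium.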

Karissa Breen [00:25:06]:
Okay. I wanna talk now about online identity and how your digital footprint can compromise this.

Yuri Miloslavsky [00:25:14]:
Yes. So online identity, I always define it as a collection of a lot of information. It’s like a puzzle. Right? So let’s say I have your name. That on its own, not critical. It’s public information. I have your email. Not critical.

Yuri Miloslavsky [00:25:30]:
Public information, easy to find through scrapers, through social media. Not a problem. Then I have your date of birth. Also public. Easy to find. Not such a big drama. Your address, a little bit more sensitive, but again, probably possible. But if you take all of them together, suddenly it becomes more real.

Yuri Miloslavsky [00:25:52]:
Suddenly, it’s easier to steal your identity. Because if I know your name, your date of birth, your address, your phone, could I identify myself as you online, to, let’s say, an insurance company or a medical company, just by those details? Even, like, the transport office. I called them, not recently, it was a few years ago, and the identification was, what’s your name? What’s your address? What’s your date of birth? Oh, okay, it’s you. And I was in shock.

Yuri Miloslavsky [00:26:25]:
I was like, wow, that was so easy. Like, anyone can do that. So when you have a set of those details, that basically creates your online identity. Now, if enough personal information, also defined as PII, is compromised, then I can sort of conclude the rest. Right? So if I have enough information about you, I can conclude the rest and fill in the gaps, or maybe even guess some of the other gaps. And with that, I can gain access to a lot of your online activity.
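
Yuri’s transport-office anecdote is an example of knowledge-based verification, where a caller is treated as identified because they can recite facts that may already be public. The sketch below uses an invented customer record and invented scraped data to show why individually harmless details become dangerous once combined.

```python
# A sketch of why "non-critical" details become dangerous together: a naive
# knowledge-based verification check like the one in the anecdote can be
# satisfied entirely from information that is public or easily scraped.
# The record and the "scraped" data below are invented for illustration.
from __future__ import annotations

CUSTOMER_RECORD = {
    "name": "Jane Citizen",
    "date_of_birth": "1990-04-12",
    "address": "12 Example St, Sampletown",
}


def naive_identity_check(claimed: dict[str, str]) -> bool:
    """'Verify' a caller by matching name, DOB and address, the weak pattern
    described above. Nothing here proves the caller is actually Jane."""
    return all(claimed.get(field) == value for field, value in CUSTOMER_RECORD.items())


if __name__ == "__main__":
    scraped_from_social_media_and_leaks = {
        "name": "Jane Citizen",                   # public profile
        "date_of_birth": "1990-04-12",            # birthday post
        "address": "12 Example St, Sampletown",   # old breach or public register
    }
    print("caller accepted:", naive_identity_check(scraped_from_social_media_and_leaks))
    # True: three individually harmless facts add up to a usable stolen identity.
```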

Karissa Breen [00:26:57]:
I get all that. But I wanna sort of just zoom out for a second. The majority of people, especially, you know, the new generation, etcetera, are operating on the internet. So it’s like there is always going to be that risk of this happening. And if I need to go and get health insurance from a company, I am then putting my trust in these businesses to do the right thing, and they don’t always do the right thing, as we know. They make mistakes, which happens. So we cannot completely be exempt from this. And I mean, I’ve spoken to people, you know, like parents, friends, and stuff like that.

Karissa Breen [00:27:35]:
They’re obviously very concerned, asking, are you worried about this? And I’m like, yeah, but you are still operating online. You still do Internet banking. You still do all of these things. Like, you can’t even go into a branch anymore to withdraw money. They’ve closed them. So it’s like we’re almost forced to be in this position whether people like it or not. It’s just the reality of how it is.

Yuri Miloslavsky [00:27:56]:
Yeah, we have to accept that position, that everything becomes more and more online because it’s more efficient, it’s more convenient. And eventually it’s also more productive, because who wants to drive to the bank to do a basic transaction? I would rather do it online. So will I give up part of my privacy as a result? In the majority of cases, yes. Because, again, we cannot keep on doing things the way they were done 20, 30, 40 years ago. That’s called progress. But at the same time, we also need to be aware of what information we are giving, because, you know, people feel very comfortable giving too much information everywhere. So, for example, when I go onto a new website, I go straight away to the security settings.

Yuri Miloslavsky [00:28:38]:
I wanna see if there are any restrictions I can apply. Take LinkedIn. Right? There is a bunch of privacy settings and security settings you can apply and control. Now, will that resolve all the problems? No. But it will reduce the risk. That’s what we essentially wanna do. We don’t wanna eliminate the risk. We can’t eliminate the risk.

Yuri Miloslavsky [00:28:56]:
No matter how much we try, no matter how many tools we buy, the only way for you to be 100% secure is to just throw away your phone, throw away your computer, your laptop, disconnect from technology, and go and live in the woods. Then you’ll probably be secure.

Karissa Breen [00:29:12]:
Yeah. But then you’ll be miserable at the same time.

Yuri Miloslavsky [00:29:15]:
Yes. Yeah. Exactly. So we’re always making the choice of, you know, convenience versus security and privacy. And in many, many cases, the convenience will win. The productivity will win. The profitability will win. But it doesn’t mean that we can now say, okay, because it will win, we don’t need to care about security. We still need to care about security.

Yuri Miloslavsky [00:29:39]:
We still need to do the best we can to protect ourselves, to protect our family. And as I said, I’m gonna keep on going back to this: as awareness grows, people start to take things into their own hands. More and more people apply MFA. More and more people think twice before they share certain information. People will sometimes ask, when somebody is asking for certain information, why do they need that information? I ask it quite a lot. If somebody asks you for information, it doesn’t mean you have to give it.

Yuri Miloslavsky [00:30:08]:
So it’s more about trying to be suspicious, trying to kind of think, okay, what can happen? What is my risk at the moment? And as I said, you cannot eliminate it. You can only reduce it. And even if you reduce your risk by 10%, good. That’s 10% less risk of being compromised online.

Karissa Breen [00:30:28]:
Okay. So, like, where’s the line? So, okay, you’re obviously more, let’s say, on the conservative side, but the average person probably just isn’t. Right? They’re not aware. They probably don’t care. They’re like, well, I’ve been in the Medibank and Latitude and Optus breaches and anyone else’s after that as well. So I’ve even had people say, yeah, but Chris, I’ve been in all the breaches. Why should I care? And the other thing is this.

Karissa Breen [00:30:48]:
This is the other question I have. Even if, as you said, all these companies have put up these privacy policies and all, and then you actually ask them a question. You know what their response is? I don’t know. And then you just never hear back from these people again. They don’t know the answer. They’ve just put it on there with the intent that no one asks, but when someone does, I’m often finding there’s just no response, no real understanding. So do you think people are forced, though? It’s like, I get your point, but I feel like companies are forcing us to forego our security and our privacy now, whether we like it or not.

Yuri Miloslavsky [00:31:23]:
Well, at the end of the day, when you approach a company and you want a service from them, you need to give some level of information. And by giving it, you can say, I give away a little bit of my privacy. Well, yeah, if you don’t wanna give away any of your privacy, don’t engage them. At the end of the day, if I’m coming to you to get a service from you, and you request me to sign a contract and you request me to fill in my details, again, it is my choice. It might just be a choice that I don’t wanna make. Right? So if I do wanna get your services but I don’t wanna sign, then I will not get your services. So I wouldn’t say forcing.

Yuri Miloslavsky [00:31:59]:
It’s part of the trade. It’s something that we as users have to accept, but we can question it. We have the right to question it, especially with the larger companies. With smaller companies, it’s still not implemented to that level, it’s still not being enforced, and they don’t necessarily have the right levels of compliance. But with the larger ones, again, they’ve been audited for a lot of this stuff already. Is it perfect? No. But again, I believe the process itself is improving.

Karissa Breen [00:32:28]:
Do you think generally people’s hands are a little bit tied? And what I mean by that is, like, everyone needs a bank. Everyone needs to put their money somewhere. So it’s like, well, which bank is going to be the best? Who’s gonna maintain my privacy and, you know, all of these things? People need to have health insurance. So it’s like, okay, I need to give away some of my privacy in order to get the health insurance. They’re not gonna just go, oh, okay, well, you’re a customer, without you giving them those details. So do you think that, again, people’s hands are being forced?

Karissa Breen [00:32:54]:
Like we need these services to operate in our lives. So therefore we do have to give that piece away. But where is that line, would you say?

Yuri Miloslavsky [00:33:02]:
Well, it’s quite individual. So that line is different for each person. Each person draws their own line. And again, in many cases, you don’t really have an option. Like, yeah, when I did my health insurance, I gave a lot of information. Now, if I had a choice of not giving that, would I take that choice? Yeah, I would prefer not to give that information, but at the same time, I do want that service. So I can choose not to do health insurance, but that’s probably not a good idea when you have a family.

Yuri Miloslavsky [00:33:32]:
You can call it being forced. I just call it the trade. I’m getting the service, I’m paying for that service, and I need to provide the information at the end of the day. Again, the health insurance company needs that information to do its own math and estimates. It makes sense. Otherwise, they will go bankrupt. Again, that’s where, you know, profitability and the numbers come in.

Karissa Breen [00:33:55]:
Okay. So let me give you another example. Going back to the branches. So just say someone’s like, Chris, I don’t wanna do online banking. I don’t trust it. Okay? I have heard that from people. Fine. But they are now being forced.

Karissa Breen [00:34:05]:
There are no bank branches available. Where are they? They’re all gone. Like, there’s maybe one in every x amount of suburbs now. Back in the day, not even that long ago, there were lots of them. So again, that’s why people’s hands are being forced. It’s like, well, we don’t have a choice now. So how do people go back and claw back that privacy? I don’t know. I just don’t know if people are gonna really have that choice.

Karissa Breen [00:34:27]:
If they wanna operate on the Internet and do things and be in this, you know, day and age, this is the risk and the trade that you will have to make.

Yuri Miloslavsky [00:34:36]:
That’s how it already works. Yeah. It’s true. Because, again, the majority of services today go online, and, yeah, you will have to forego some of your privacy, and we’re already doing it. Right? Like, social media is tracking us everywhere, and it’s known. But, for example, if you look at Australia, right, they are now applying certain restrictions on social media up until a certain age. That’s a good move. And that’s an example of something being done. Now, again, will it solve all the problems? No, but at least we can see that there is progress to try and kind of mitigate some of the risks, at least for young people. So, yeah, we do give up a certain amount of information about ourselves.

Yuri Miloslavsky [00:35:17]:
We do give away a certain amount of privacy, but it depends who you give it to. Now, above it all, there is a concept called trust, and this world cannot work without trust. You have to trust certain people, you have to trust certain companies, and you have to trust organizations. Now, does it mean they will not, kind of, disappoint you? No, probably they will. That’s just life. I have no other way of defining that, but that’s just life.

Karissa Breen [00:35:45]:
So with that being said, like, where do we go from now? As we’ve established, you know, more companies are moving online, and we need these services to operate our daily lives. Like you said, there’s still a choice on whether I’m gonna go to a big bank over, you know, Karissa Breen’s opened up her dingy little bank and perhaps it’s not that secure. Right? So what sort of happens now? Like, this is the reality of the world. The majority of people are operating on the internet. Okay. And as the generations come up, they are, you know, already so much more involved in this than, you know, you or I were growing up as well. So where does this leave us now, from your perspective?

Yuri Miloslavsky [00:36:23]:
From our perspective, the way I see it is I just wanna have control over my information. Even if I work with a company, say it’s an insurance company, a bank, doesn’t matter who, a medical company, I wanna know what is being done with my information. When those organizations are accessing my information, I wanna know about it. Some people don’t care. I do. And I think more and more people will. So there need to be levels of control for the end user so they’re aware of what’s going on with their data. And if that means one of the employees of the bank that I’m working with is checking my information, well, I wanna know about it.

Yuri Miloslavsky [00:37:02]:
Not necessarily in terms of a notification; I want it to be an option. But if I go into a log or something in the back end and I wanna see that, I want it to be available to me. I think it’s a fair request, because when we give them that information, we basically give them permission to use that information. And this is my main problem with privacy today, because the assumption is that they now own that information, which I think is the wrong way to think about it. And I think that’s the thing that causes a lot of problems, because they treat it as their information. They treat it like they’re the owners, but it shouldn’t be that way. We are the owners of the information.

Yuri Miloslavsky [00:37:42]:
And if they wanna use that information, I would much prefer to be aware of it, or I’d much prefer that to be recorded somewhere.
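
The kind of visibility Yuri is asking for, a record of who inside an organization looked at his data that he can inspect himself, is essentially a customer-facing audit log. The sketch below is a minimal, hypothetical illustration of that idea; a real system would persist the log immutably and expose it through an authenticated portal.

```python
# A minimal sketch of a customer-visible access log: every time an employee or
# system reads a customer's record, an entry is appended, and the customer can
# later query the entries that are about them. Field names are illustrative only.
from __future__ import annotations

import time
from dataclasses import dataclass, field


@dataclass
class AccessEvent:
    customer_id: str
    accessed_by: str        # employee or service identifier
    purpose: str            # e.g. "claims assessment", "support call"
    timestamp: float = field(default_factory=time.time)


class AccessLog:
    def __init__(self) -> None:
        self._events: list[AccessEvent] = []

    def record(self, customer_id: str, accessed_by: str, purpose: str) -> None:
        """Append-only: events are added, never edited or removed."""
        self._events.append(AccessEvent(customer_id, accessed_by, purpose))

    def for_customer(self, customer_id: str) -> list[AccessEvent]:
        """What the customer would see in a 'who looked at my data' view."""
        return [e for e in self._events if e.customer_id == customer_id]


if __name__ == "__main__":
    log = AccessLog()
    log.record("cust-001", "employee-42", "home loan review")
    log.record("cust-001", "fraud-engine", "transaction screening")
    log.record("cust-002", "employee-17", "address update")
    for event in log.for_customer("cust-001"):
        print(event.accessed_by, "-", event.purpose)
```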

Karissa Breen [00:37:50]:
That’s not really happening now. So how are you gonna get companies to that point? Is it just gonna be a compliance thing? Is it gonna be, we’re regulated, and this is what you need to do, you need to inform users when you’re doing that?

Yuri Miloslavsky [00:38:00]:
This is gonna be compliance. Hands down, it’s gonna be compliance, because there is no other way. At the end of the day, right, if I go into a business and I’m like, oh, look at that product, it’s cool, and you need to implement it and it will do this and this and that, the company will be like, okay, what am I getting out of it? What? No money out of it? Well, that’s alright, we don’t need that. Versus, hey, this is a tool that you need to implement to comply with the current compliance framework, this one or another, and if not, it’s a $30 million fine. Then they can look at it: okay, implementing it costs 100k, the fine is 30 mil. Okay. Let’s implement it. I think it has to be this way.

Yuri Miloslavsky [00:38:37]:
It has to be through compliance. It has to be through regulation. And again, it depends on the business; you’re limited in what you can enforce on businesses, because certain businesses simply don’t have the budgets. And again, that’s another problem to deal with. And this is why Australia introduced the maturity level model, so, you know, each business can kind of comply to a certain level.

Karissa Breen [00:38:58]:
So, Yuri, do you have any sort of closing comments or final thoughts you’d like to leave our audience with today?

Yuri Miloslavsky [00:39:03]:
Take care of your privacy. Always think about the risks. And, you know, throughout the whole discussion we talked about whether people care or don’t care. I generally think people should care. I don’t think all is lost. Privacy is a problematic area, but if we think enough and we kind of do our own risk assessment and act on it, we can at least reduce the risk. We can at least be aware of it and do something about it.
