THEOS Cybernova

Mick Moran: CSAM as the Insider Threat Missing from Your Playbook

Theos CyberNova Season 2 Episode 7

Disclaimer: This episode discusses child sexual abuse material (CSAM) and includes references that some listeners may find distressing.

For Mick Moran, child sexual abuse material (CSAM) is not just a law enforcement issue; it is a cybersecurity blind spot. As CEO of the Irish Internet Hotline and a former INTERPOL Assistant Director, he argues that every CISO needs to know how to detect CSAM, what to do if it surfaces, and how to protect both staff and reputation.

Through wargames at the Council of Europe, Mick shows how easily organisations falter without a CSAM policy: HR rushing to dismiss, legal silenced by uncertainty, and security teams exposing staff to trauma. He connects these lessons to Asia Pacific, where remote abuse and sextortion networks highlight the urgent need for corporate readiness.

This is not a topic widely discussed in cybersecurity circles, but it is one every CISO must factor into their playbook. Detection, wargaming, reporting, and welfare cannot be ignored.

Production Credits:

Presented by: Paul Jackson
Studio Engineer & Editor: Roy D'Monte
Executive Producers: Paul Jackson and Ian Carless
Co-produced by: Theos Cyber and W4 Podcast Studio

Speaker 1:

This week on the Theos Cybernova podcast.

Speaker 2:

Okay, so CSAM is an acronym that stands for Child Sexual Abuse Material. Now that's also known as child pornography, but there is some material out there that is actively shared online that involves the rape of babies. Most CISOs don't have a light on their dashboard for CSAM. They've never heard of it, they're not interested in it, they think it's just some sort of a child welfare thing. AI-generated CSAM, which is still illegal, by the way. It's still illegal to possess, to produce, to disseminate, to distribute. My biggest fear is that it starts to normalize it.

Speaker 1:

The Theos Cybernova Podcast, hosted by Paul Jackson. Welcome to another episode of the Theos Cybernova Podcast with me, your host, Paul Jackson. I'm proud today to have the legendary Mick Moran with me for another episode in season two of the podcast. Mick, thanks for joining us today on the show.

Speaker 2:

Thanks very much, Paul. Legendary? I think I'm legendary in my own lunchtime, a bit like yourself, if the truth be told. I think we're two self-proclaimed legends here in this recording box.

Speaker 1:

Yes, we are, and indeed we are in this little recording box in the fabulous venue that is the Council of Europe in...

Speaker 2:

Strasbourg, that's right. We're in the Council of Europe, and the Council of Europe, to be fair to them, were one of the first outfits really to pay attention to cybercrime. They have a convention that goes way back, I think to about the year 2000 or something.

Speaker 1:

Something like that.

Speaker 2:

Yeah, yeah, 1999, I think, and all through the late 90s they were doing a lot of work on cybercrime. And then they have their convention, the Budapest Convention. From my perspective, my expertise is in the whole online child exploitation area, and my interest in the Budapest Convention is that it's the only convention that really calls out CSAM, or child pornography, as a cybercrime. Other people tend to think that, you know, you can't have something that's content labelled a cybercrime, whereas I would say no, it's not the content that's the problem, it's the transferring of it on networks and the moving of it.

Speaker 1:

Indeed, and we are going to drill into that. But first I'm going to kick off with a very startling statistic, because you gave an excellent presentation yesterday, an excellent workshop, and one of the metrics that you gave there shows the connection between the private sector, if you like, and the exploitation of children, because most listeners are probably thinking, well, what's it got to do with cybersecurity? We are going to cover that. But this metric is pretty staggering, and it comes from an organisation called NetClean: they state that one in 500 business machines contains CSAM, as we call it. Now, this is pretty staggering, but let's start by explaining: what is CSAM?

Speaker 2:

Okay, so CSAM is an acronym that stands for Child Sexual Abuse Material. Now that's also known as child pornography and in most of the laws and in most of the conventions, like the aforementioned Budapest Convention or in the EU Directive on child abuse online, they refer to it as child pornography. And in fact, in laws even in Japan and in China it's referred to as child pornography. So I have no problem. I mean, I think Thailand recently made child pornography illegal. I know that they've been working hard in the UAE, for example, to address the issue there. So it's child pornography.

Speaker 2:

And because you can't have child pornography without having child abuse, we started pushing a few years ago to use the acronym CSAM instead of child pornography, because pornography gives you the impression that it is the same type of pornography that you would see on, you know, redtube or any of the commercial pornography websites, or in OnlyFans or somewhere like that, whereas this is not pornography in that sense, because it involves the actual abuse of children.

Speaker 2:

It's a recording of the actual abuse of children, and when I say children, I'm not talking about young people, I'm talking about children. The vast majority of CSAM that's out there involves, in fact, prepubescent children. The age distribution is a classic bell curve: it starts at zero, rises, peaks out at around 12 years of age, and falls away between 12 and 18, and of course it's not a child anymore once they're over 18. The point being, and it's an important point to make, that the vast majority of it peaks at 12. But there's an awful amount of stuff out there involving children of seven years of age, six years of age, and indeed a lot of movies and images involving pre-speech children, so under three years of age, under two years of age. There's an awful amount of stuff like that out there.

Speaker 1:

That's incredibly shocking. I'm sure many of our listeners will be shocked by these metrics, and I should have perhaps warned them at the beginning that they may find some of this a little bit upsetting. But we have to face reality. And the kind of work you do, I mean, I am in awe of the work that you do, and also NCMEC, the National Center for Missing & Exploited Children, our friend Gigi over there, who does great work, but it's not highlighted very often in the context of cybersecurity.

Speaker 1:

Now, here at Theos, right, we do loads of cybersecurity drills, you know, tabletop exercises for ransomware, for other kinds of breaches, or DDoS attacks. But nobody ever does them for incidents where CSAM is found on their systems: how it got there, who might be responsible, what the impact is in terms of how many people within the organisation might be involved. It's actually potentially extremely damaging, reputationally and obviously morally.

Speaker 2:

Yeah, no, there's a moral aspect. But I think, more importantly, one of the things that I highlight when I do that exercise is that unless you have policy in place, you're running around panicking, right? So we had the HR group, we had the legal and compliance group and we had the CISO group, and we saw very clearly that none of them was talking to each other. They all had the same problem, and they were all off thinking about it in their independent silos. So straight away you could see that that was a problem. Now, another place where CSAM differs from ordinary pornography is that most companies will have a pornography clause in their acceptable use policy, or they'll make it very clear to their end users, whether they're getting company devices or bringing their own, that one of the main things is you don't use them for that. Look, if you're working for me, you're working between nine and five; you shouldn't be browsing pornography between nine and five, right? And if we find pornography on your system, it's a disciplinary offence, it's an HR matter. It's certainly not illegal.

Speaker 2:

The problem with CSAM from a cybersecurity perspective is that it is out there. You rightly quoted that figure, one in 500 machines. It's out there and it attracts special attention, and the reason it attracts special attention is that it's illegal in most countries. Okay, that's one reason it attracts attention. So now you have an immediate legal compliance problem. If you find this on your system, you have, in effect, a serious criminal operating in your company. Somebody has put it there; who put it there? And you rightly pulled up the risk of reputational damage. But here's another one that companies don't listen to and don't think about: you are exposing your staff to very traumatic images or videos.

Speaker 1:

Yes, right.

Speaker 2:

Your staff are getting exposed to that because you didn't care about it, because you didn't deal with it as a threat, because you didn't have signatures in your SIEM that will flag this up as an indicator of compromise when you're dealing with these videos.

Speaker 2:

Now let me just give you an example of a video, right, and again a warning to listeners that this could be upsetting and it could be triggering to some of your audience, so please skip forward just 10 seconds if you're of a sensitive nature. But there is some material out there that is actively shared online that involves the rape of babies, right, including the screaming, including the soundtrack that goes with it. Now, if you haven't warned people, and your staff member working on your help desk, or the secretary of an executive who has this on his machine, gets exposed to this, and then in the post-incident review you have half a dozen people in a room watching it on a big screen, you are exposing them to material that can be traumatising to people, and there's another big risk involved. So part of our action plan in relation to dealing with it is the complete forensic quarantining of the material, only exposing those for whom it's absolutely necessary, and even then only after they've been trained properly in relation to it. All right.

Speaker 1:

So before we go on to the detection of this, I want to keep touching on the trauma side, because you actually raise an important point, and we're both experienced former police officers and, sadly, we've both had to deal with this kind of material and we've seen how horrific it can be and the impact it has. The audience may be quite curious: psychologically, how do police officers deal with this? And, from a leadership point of view, how do you help, or try to identify when officers may be suffering from trauma through repeated viewing?

Speaker 2:

So, like, I'm the CEO of Hotline.ie, the Irish Internet Hotline, and we manage CSAM reports from members of the public. We're a member of the INHOPE network, and so we manage notice-and-takedown for sites and URLs all around the world. And I have analysts there who are trained primarily as internet analysts and then secondarily as CSAM analysts, and they understand fully that one of the key parts of their work is the whole welfare package that we build in around them. Now, when I was an assistant director at Interpol, I was the assistant director of the Vulnerable Communities section, and there I was dealing with trafficking in human beings, I was dealing with people smuggling and I was dealing with online child exploitation. And, as you know, Interpol houses the International Child Sexual Exploitation database, ICSE. That is basically a registry, a kind of a record, of all of the CSAM that's floating around out there on the internet, and that has many, many uses. And one of the key things you have to build around a team like this is welfare. Now, when I started this game back in 1997, there was none. Nobody thought about welfare, nobody thought about anything else yet.

Speaker 2:

Often people say to me, how can you do this? And I just say, look, as a police officer you deal with fatal traffic accidents, you deal with suicides, you deal with really the dark side of life in general. You never know, from one end of the day to the other, what you're going to be facing when you go out there. At least when you're working in this area, you know you're dealing with CSAM all the time, and therefore you can prepare yourself in advance. You can step away from the computer and switch it off if you're feeling a little bit overwhelmed, and of course then you'll have welfare. So my team now, at the minute, at the Irish Internet Hotline, they have obligatory visits to a supervising psychologist who has a chat with them once every quarter, and they're free to go to her any time they want to, actually.

Speaker 2:

So they call it vicarious stress or vicarious trauma. In other words, an analyst can start to take on board the trauma that the child is experiencing in the image or movie, and once that starts to happen, the analyst can very quickly become unwell if they're not careful. And you see, when you add that vicarious trauma, another word for it might be stress, to just the regular stress of life in general, right? So if you've had a bad day, maybe your significant other has been giving you some shit that morning, and then you're sitting in the traffic on the way into work, and then you get into work and now you're sitting down with all that stress on your shoulders, and then you sit down in front of a screen full of stress. Yeah, not going to be a good day.

Speaker 1:

Not going to be a good day, yeah. So I'm also going to ask quite a sensitive question, a difficult question, which is: as a police officer, you're obviously going to decide who does those kinds of jobs, because categorizing CSAM is a laborious job. Many cases may have thousands of images, right, and each one needs to be categorized. How do you prevent somebody who actually thinks, well, I want to view these images, so the best way of viewing them without getting arrested is to become a police officer and view these images?

Speaker 2:

Yeah, that's something that we manage as a risk within this environment. We manage it as a risk through a selection process, obviously, as an addition to the recruitment process, where we try to spot that. Not easy, not easy, but we're in the world of risk. That's who we are, you know: cybersecurity, working online, our everyday life is risk. So we identify it as a risk and flag it within the recruitment process. And then, as I said, there's a supervising psychologist there, and the supervising psychologist's job is behavior, so you'd like to think that they might spot it. And ultimately, that's it. You can know or you can't know.

Speaker 1:

You know, and that's just the way it is. Okay, so let's go back to our CISO, because that's primarily who our audience is here. How do they implement a program whereby they can detect, or perhaps be on the lookout for, this kind of material?

Speaker 2:

Okay. So, first and foremost, I think the CISO is the perfect person, and that's why I'm really thankful to you, Anthony, for allowing this podcast to be made, and I'm very thankful to the Underground Economy conference and to Team Cymru, who run it here. I'm very, very happy to talk to CISOs and to explain to them that, first and foremost, it's a nice segue when we talk about risk, because that's a CISO's world, isn't it? A CISO's world is all about risk, and if he looks at his blinking lights of stress, he just wants to know which one is blinking brightest.

Speaker 2:

Right, and the problem with CSAM is that most CISOs don't have a light on their dashboard for CSAM. They don't. They've never heard of it, they're not interested in it, they think it's just some sort of child welfare thing that ladies who lunch worry about: it's nothing to do with me. Well, unfortunately, as that statistic from NetClean says, one in 500 machines, it has something to do with you. And because it involves children, it's, as the French say, enchaîné: it's connected to, linked to, many, many other compliance issues. So when, for example, you talk about GDPR, right, in GDPR there are special categories of data, special categories of PII, and, let's face it, one of the most dangerous, shall we say one of the most mined, areas of data is data relating to children. Right, of course, yeah.

Speaker 2:

And so that's another reason it's differentiated from normal pornography. It's another reason why your in-house policy in relation to pornography is not good enough, because there are other factors that link into it. If anyone's interested in knowing more about the hotlines, they'll find it at hotline.ie. But, like, in Ireland we have two pieces of law that are related to CSAM.

Speaker 2:

One of them is the Child Trafficking and Pornography Act, where it is defined and where the Gardaí, our police service, draw their powers to have warrants in relation to it, powers of arrest, all that sort of thing. And then down at the bottom of the Act there's a section, I think it's Section 9 of the Act, which basically makes the body corporate responsible: if they have CSAM on their system and they know that they have CSAM on their system, or, here's the interesting line, are reckless as to whether they have CSAM on their system, they can be held responsible for it under the Act, and it's a no-fault thing. So, in other words, the CISO mightn't know about it, but he might be standing in front of a court facing a criminal charge for having CSAM on his system.

Speaker 2:

Yes, "reckless", as we all know, is an interesting one. It's an interesting angle, isn't it?

Speaker 1:

And obviously we won't go down that legal rabbit hole. No, neither of us are lawyers, but we know from experience.

Speaker 2:

Yeah, and here's the second piece of law that attracts it, right: the Criminal Justice Act relating to the disclosure of sexual abuse of children.

Speaker 2:

So, in other words, if you are a company, you're a CISO in a company, you find some of this and you treat it as pornography, then you're likely just to delete it and pass it over to HR to make it an HR problem. Well, here's why you need a CSAM policy. If you delete this material and you don't inform the authorities, and don't forget, at their base these are pictures of a rape scene, basically, child sexual abuse, okay? So if it is a picture of the rape of a child, then that is a serious crime happening to that child. You don't know who that child is. That child might be from Canada, that child might be from the UK, but it might equally be your employee's daughter. So it's very important to remember that you are obliged under the Act in Ireland to report this to the authorities, and I think that's the case in many jurisdictions. Probably is.

Speaker 1:

Obviously, we have an Asia focus here, and I'll come to that in a bit because obviously I spend a lot of time in the Philippines, where a lot of our employees in Theos are based.

Speaker 2:

And yeah, let's come to that later. We can talk about the Asia angle, because there are angles there in Asia, yeah, 100%.

Speaker 1:

Yeah, 100%. So, the exercise you ran yesterday here at the Council of Europe, at the amazing Underground Economy Conference, which I do hope more of our listeners who are in the private sector in Asia-Pacific will consider attending next year. It's absolutely priceless in terms of, A, the content and, B, who you meet, and you get to meet Mick, of course.

Speaker 1:

If I'm invited back next year, who knows? But in all seriousness, the exercise threw up some interesting points for me. Let me recap the scenario, right. So you had a help desk, a computer that wasn't functioning properly, so the help desk or IT support were fixing it, and whilst they were fixing it, they happened to notice these images of CSAM. Right? Simple scenario, could happen to any company.

Speaker 1:

But then, of course, it was split into groups, and immediately people think, well, what should I do about that employee who owned it? And it's kind of guilty until proven innocent rather than innocent until proven guilty, because there are all sorts of circumstances that could have led to those images being on the machine, and there's a lot to think about. How do you act in regard to that employee? It's a serious allegation. What do you do? Where do you take this? And it threw up so much confusion with the audience, and I could tell that this has never been done as a tabletop exercise in any of the organisations that were present yesterday.

Speaker 2:

Yeah, no, and that's it, okay. So basically, as you say, the scenario, the help desk scenario, and then I split the people up into groups, and I did the old one, two, three. So I counted one, two and three, which split up buddies, which was very useful. One was the legal and compliance team, two was the HR team and three was the CISO team, and then they all went into different corners and they talked about it, and they only got a few minutes, about seven or eight minutes, because the last line in the scenario was: in 10 minutes.

Speaker 2:

You have a meeting with the dragon CEO and she's not happy about this, she's going to have her poker face on, and you're going to have to explain to her what the scenario is, what the situation is, what you know about it and, more importantly, what you are going to do about it. So, I mean, HR wanted to sack him immediately. The CISO team was panicking about not exposing too many of its staff, and that was useful, I thought that was very useful. And one of the people there who was a CISO, she was very interesting, because she just kept bringing it home that this guy is innocent. We don't know how this stuff got there, because most people start with: it's there, it's his. But that's not the reality, and it absolutely cannot be the reality. And that's why I say you need policy, you need to wargame it, everyone needs to understand their role in it, because you cannot say how it got there. And I thought that she was being very clever, because every time they came to a solution, she'd go, yes, but. And you see, because none of them were thinking, and this, I think, should have been the CISO's thought on it: the CISO shouldn't have even talked to HR or Legal and Compliance until he or she knew what they were actually dealing with. Agree, right, agree.

Speaker 2:

Everybody went into a tizzy based on what the guy in the help desk, who may not be the brightest sandwich in the picnic, right, or chip off the block, saw. They're all basing everything on that. So what are you doing? Are you triggering this whole internal panic based on what the help desk guy saw? Has anyone confirmed what he saw? Is anyone satisfied that this is CSAM? Or has he seen some nudity that he thinks looks young? Or is this actually confirmed CSAM? Agree?

Speaker 1:

Yeah, I've seen this time and again, where people run off on the assumption that the junior guy is right. Yes, and wow, mistakes can easily be made.

Speaker 2:

So look, the CISOs all got into: let's take an image of the machine, let's find out what else the machine is connected to, let's run down some of the leads that you can get. What else have we got around the image? So, if it's an image or a movie, okay, it's one file. What else have we got? Have we got thousands of these files on our system? Are they on our network? Are they on a share? Is it in the cloud? All of those are very important aspects, not just from a legal perspective, but also for the reaction and the next steps that the company will take. And so, again, wargaming, policy and training are so important for people to understand how to deal with the incident, and to map it out properly before you pull the lever that says CSAM policy, right, get HR involved, get whoever involved. At least have some knowledge of what you're dealing with, right?

Speaker 1:

So let me put your Interpol hat back on. I know you spent many happy years in Lyon at Interpol, where I met you many times at various conferences, but we won't go into that. But obviously, in our part of the world, and this podcast, Theos, etc., we're focused on the Asia-Pacific region, sadly, a lot of this content does originate from our part of the world. A lot of the abuse does originate there. Are you shaking your head? No?

Speaker 2:

No, no, no, there is some. Okay, if you were to talk about online: the umbrella phrase we use is online child sexual exploitation and abuse. Some people say "Oxy", but I just say it's O-C-S-E-A, so online child sexual exploitation and abuse.

Speaker 2:

If you're going to talk to me about the Asia-Pacific region, CSAM is not the first thing that comes to mind, one of the main reasons being that a lot of the countries over there haven't yet got law that makes it illegal. So that's one reason CSAM is not the primary concern over there. The primary concern from certain countries in that region is sexual extortion of kids in the West and sexual exploitation of kids in real life. That's actually involved with traveling sex offenders, and linked to the traveling sex offenders angle is people who travel from the West to the East to abuse children, and unfortunately they get access to children in certain environments, certain countries, in the East. There's also remote sexual abuse. Remote sexual abuse is an interesting one, and it's one that unfortunately happens in the Philippines a lot, and there have been newspaper articles and everything else, so it's not just me saying it. I want to make that clear: I'm not dissing the Philippines in any way or Filipino people in any way, but there is a certain type of exploitation that's facilitated by the internet, which is called remote child sexual abuse. Basically, a Western person, a sex offender or a person with a sexual interest in children, makes contact with a mama-san, with a madame, if you like, in a poorer region of the world, and often this is the Philippines, and the child is basically abused to order on webcam.

Speaker 2:

Well, that's horrific. Yeah, it sounds like it, it sounds pretty horrific, but it's a very common practice in the Philippines and it's an unfortunate one, and law enforcement do their best. I know the French do a lot of work in this area, the English do a lot of work in this area, the Americans do a lot of work in this area, because it's their citizens who are the clients, the consumers of it, and they're paying like 30 euros or 30 dollars using, I suppose, PayPal or wiring the money in some way, Western Union or whatever. And, to be fair to Western Union and PayPal, I know that they... And then there's the sextortion side, where they are grooming kids and adults online until they engage in some sort of sexual activity on the webcam, and then they threaten them that they're going to...

Speaker 1:

Yeah, well, we heard an excellent presentation. Yes, I'm not going to name the company, obviously we're under NDA here, but it was reassuring to hear that the social media companies are taking this kind of sexploitation seriously. You're shaking your head again.

Speaker 2:

I'm not shaking my head, I'm kind of nodding, I'm kind of doing the yeah, all right, they're taking it seriously.

Speaker 2:

They could do more, right, but they are doing loads, that's true. And when you talked earlier on about the welfare of police officers who are dealing with CSAM, it's very important to remember my staff in hotlines all around the world, like there are 54 hotlines under the INHOPE umbrella, 54 hotlines around the world, but also the trust and safety people in all these big companies. The trust and safety people, they are some genuinely good people, and they're doing really hard work constantly to deal with bad behavior on their platforms, and they don't get the credit for it.

Speaker 2:

You know, we'll go after the head of Meta and he gets his head slapped in some congressional hearing in the United States of America, but nobody talks about his trust and safety people. And, you know, trust and safety itself has come in for some significant flak recently with the change of attitude in the States and the, shall we call it, change of direction in relation to certain politically linked aspects. But don't forget that behind that flak you're firing, there are trust and safety professionals doing great work, indeed, in keeping your platforms, the ones you're using every day, spotlessly clean of the types of stuff like CSAM, for example.

Speaker 1:

Yeah, no, it's good to hear. And, you know, even though this is obviously Underground Economy, it's focused on financial crime, it's organized crime who are financially motivated, and that's the primary reason we are here, let's face it: the world you live in, the murky world, I have to say, that you live in, is also financially motivated. Of course, a lot of these images, you know, they're to order, they're for financial profit.

Speaker 2:

Well, I'm going to shake my head again. Okay, the financial crime and organized crime angle of this is absolutely the sexual extortion that we've just talked about, the extortion of money from people using their sexual activity as a threat. Southeast Asia is good at that. Right, yep.

Speaker 2:

Africa is very good at that, so that's certainly one aspect of the financial side of it. There is CSAM available on the dark net for payment, so you can pay Bitcoin or a bit of a Bitcoin and you can get access to some stuff. Sometimes people who are producing it will charge for it, but the vast majority of CSAM that's circulating around there is circulating like for like, so it's aficionados sharing, you know, like baseball cards.

Speaker 1:

That sounds awful.

Speaker 2:

Yeah, it is awful. It is awful, it's wrong, and people collect it, and they are always after the next images or movies in a series. Series are constructed generally around a victim, and so you can have scenarios like... Certainly, during my 10 or 11 years in Interpol, I actually watched certain children growing up in abuse videos.

Speaker 1:

Oh, that's horrible.

Speaker 2:

Yeah, do you know what I mean? Like, I'd see her, and now she's seven, now she's... You know, when I saw her first she was four, and now she's 14, and when I'm leaving Interpol, we still haven't found her. We don't know who she is and she's still being abused every day. Yeah, so that's... yeah.

Speaker 1:

Yeah, that's sad.

Speaker 2:

But look, I'm not here to shock, I'm not here to moralize. I'm here to make the point that this is a cybersecurity threat. Indeed, it's an insider threat, and it actually maps very well onto any of the threat models that are out there, the Gartner model, the NIST threat model. It absolutely maps onto them very, very well.

Speaker 2:

And so, if you want to talk about preparation for an incident: having your policy, having wargamed it a little bit like we did yesterday, and then the detection, right, how are you going to detect it? And there are ways. There are commercial ways, like NetClean have it. Like Cloudflare, for example, on your website, have you got it switched on on Cloudflare? It's a setting on Cloudflare. And if your hosting company is not providing this type of scanning, especially if people are uploading files to your network for whatever reason, you absolutely should have it switched on. And then how else do you detect it within your own network? Well, I mean, you need to have the signatures in your SIEM, the hash values.

Speaker 2:

Yeah, they've got to be in your SIEM, and you've got to be getting a red flag when one of these triggers. And once it triggers, then you're putting your policy in place. And how have you wargamed this? How does it map onto your NIST model, how is your information security officer going to respond, what are the steps and the next steps to take? Have you got somebody in-house to do forensics? All of these are questions. And have you got an advisor on the end of a phone? That's the thing. In Ireland, we're advisors, we are consultants to companies, especially member companies, companies who are members of the Irish Internet Hotline. They get consultancy from us in relation to this, and we can help them to set it all up.
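[Editor's note: to make the hash-signature idea concrete, here is a minimal illustrative sketch, not taken from the episode. It assumes you already hold a list of known-bad SHA-256 values supplied by a hotline or similar authority, and it simply compares file hashes against that list. The file names, paths and alerting step are hypothetical placeholders; real deployments typically rely on vetted vendor tools and perceptual hashing such as PhotoDNA rather than exact hashes alone.]

import hashlib
from pathlib import Path

# Hypothetical input: one lowercase SHA-256 hex digest per line,
# e.g. a hash list supplied by a hotline or law-enforcement partner.
KNOWN_HASHES_FILE = "known_csam_sha256.txt"

def load_known_hashes(path):
    """Load known-bad digests into a set for O(1) membership checks."""
    with open(path) as fh:
        return {line.strip().lower() for line in fh if line.strip()}

def sha256_of(file_path):
    """Hash a file in 1 MiB chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with open(file_path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_tree(root, known):
    """Return files under root whose hash matches the known-bad list."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.is_file() and sha256_of(path) in known:
            hits.append(path)
    return hits

if __name__ == "__main__":
    known = load_known_hashes(KNOWN_HASHES_FILE)
    for hit in scan_tree("/srv/fileshare", known):
        # In practice this would raise a SIEM alert and trigger the CSAM
        # policy (forensic quarantine, limited exposure), not print to stdout.
        print(f"ALERT: known hash matched: {hit}")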

Speaker 1:

We can do the... Well, this is exactly what, you know, a little plug for our own company, Theos: the incident response retainers that we have aren't just for the ransomwares, they are for the insider threat, the internal investigations. We have tons of experience dealing with employee investigations.

Speaker 2:

Okay, and so I would say to you that you need to bring it into Theos too, and I'll be very happy to consult. I'm sure you will. All right.

Speaker 1:

Before we hit the last question, which is always a music question, I've got to talk about AI. Yes, because nowadays, I'm sure a lot of this CSAM material can be generated via AI. Are you seeing this?

Speaker 2:

Yes, we're seeing it. We're seeing it happening already. To be fair, most of the big LLMs and the big generative AI engines won't allow it to happen. I mean, even I have problems when I'm asking questions based on CSAM. It goes: I'm sorry, I'm only an LLM model.

Speaker 2:

I can't help you, you know. And then I have to come at the question again and explain that I'm a professional, I'm an academic, whatever, you can find out about me here, I'm an academic who studies this, so can you please... But again, if I'm getting around it, so too can offenders get around it. So it is a big question. There are a number of things. People talk about child protection and the role that LLMs are having there, and recently ChatGPT brought in rules in relation to parental controls and things like that, because there was a very sad case of a young man who committed suicide because the LLM kept, if you like, validating his opinions, and eventually the kid killed himself.

Speaker 2:

So that's an unfortunate one. It's an example of how LLMs can be dangerous, especially around children, and of how vulnerable children can be. We see huge regulation coming into platforms and things like that now, especially in the European Union, and so LLMs are going to have to go the same way. But the one we're more concerned about is the CSAM, because if you have a very large collection of CSAM and you run it in as source data into a generative AI, and then you put in another large collection of non-CSAM, and now you ask it to produce new CSAM...

Speaker 2:

Well, we're going to see a huge increase in this, and that increase will come at a time when law enforcement, hotlines and CISOs are already snowed under with work, and now here's another problem coming down the line for them, and they're already snowed under. I mean, forensic teams in law enforcement all over the world will tell you that somewhere between 60% and 80% of their work is CSAM. That's shocking, isn't it?

Speaker 1:

Yeah, it is.

Speaker 2:

But if that's what they're dealing with now, what will it be like when this AI-generated CSAM comes along? Which is still illegal, by the way, that's an important point to make: it's still illegal to possess, to produce, to disseminate, to distribute. What's going to happen to those teams who are already snowed under, as I said? But also, it will form part of the AI slop that's out there, and my biggest fear, my biggest fear, is that it starts to normalize it. Yes, do you know what I mean?

Speaker 2:

I do. If people are starting to see it, okay, they're shocked initially, and then they get over the shock, you know. And then what?

Speaker 1:

I truly hope it'll never be normalized. Look, we're coming towards the end of the podcast now. I'm sure this has been an eye-opener, because it's not a subject that's actively discussed enough, and I hope it opens a few people's eyes. I'm sorry if it's shocked a few people, but it's the reality. But anyway, your job is stressful, more stressful than most when you deal with these things. But yeah, I always rely on music to unwind a little bit.

Speaker 1:

I like music, and so I'm going to ask you: what do you listen to? What's on your turntable at the moment? I'm going to guess Val Doonican.

Speaker 2:

Oh, yeah, yeah, Val Doonican. Well, it has to be said that I love music and I absolutely adore Spotify. I'd be a bit of a vinyl man and I've got quite a big vinyl collection, and I have gigs and gigs of MP3s and everything that I haven't looked at in years. With Spotify, what I love about it is I'd have a song in my head, I wake up sometimes with a song in my head, I put the song on on Spotify and then I let the algorithm do the rest, right, and through that I pick up some really, really cool bands and some good bands.

Speaker 2:

And BBC Radio 6 Music, fantastic alternative, kind of just a little bit off the beaten track, and you discover some really, really good stuff on there. 8Radio is another big one; 8Radio is on in our office all the time. I don't know who put it on, and they didn't put it on for me, but it's one of my favorite radio stations, ticking away there in the background, and 8Radio is also a great source of alternative stuff. But what am I

Speaker 2:

listening to right now? Yep, I suppose I'm listening to a band called Kingfishr, who've just released an album called Halcyon. Kingfishr are an Irish group and they fuse rock and, I don't know, think Dermot Kennedy, Hozier and a bit of the Wolfe Tones, you know, kind of Irish music. If you take the likes of Kneecap, they're also doing that right now. There's another great band out there at the minute called Amber, they're doing the same. CMAT is doing a little bit of that as well, kind of rock fusion, rock-country fusion. Whereas Kingfishr have just released Halcyon, as I said, and that's the one that's been on repeat for the last while. So I totally recommend that people pick that up.

Speaker 1:

This is why I asked the question, because I get great recommendations, and I will be checking that out. Look, Mick, I can't thank you enough.

Speaker 1:

You really are a legend, the work you do is... Thanks for listening, and hopefully you'll join us again for future episodes of the Theos Cybernova podcast. Mick, thank you so much. Much obliged, thanks, Paul. Theos Cybernova was presented by myself, Paul Jackson. The studio engineer and editor was Roy D'Monte, the executive producers were myself and Ian Carless, and this podcast is a co-production between Theos Cyber and W4 Podcast Studio.
