
Podcast: Why are we still separating cyber-risk management from 'risk' management?

Is it time to end the silo-mentality around cyber-risk? Isn't it all just risk? Our guest, Guy Warren of enterprise IT and operational resilience providers ITRS Group, offers some useful insight around cyber-risk and explains why it's time for all of us, whether we're in operational, risk management or even risk underwriting roles, to fully understand and take ownership of digital risks and their very real-world outcomes. Listen in...

If you would prefer to read rather than listen to this episode, please see our transcript below.

Join in our discussion around risk engineering and management on LinkedIn here.

Discover more about RiskSTOP.


Johnny Thomson  00:01

Hello everyone and welcome to the RiskACUMEN podcast, which offers thoughtful insight around risk engineering and management. Now, despite the fact that the digital revolution started midway through the previous century, and the World Wide Web has been with us for more than 30 years, we still tend to separate online from offline, digital from non-digital, cyber from physical, as if they were parts of entirely separate worlds. This kind of silo mentality, especially when it comes to risk and risk management, is not really how we should be seeing things, according to my guest Guy Warren, who is CEO of ITRS Group, a global organisation that offers IT expertise and helps with operational resilience. Hi Guy, thanks for joining me today.


Guy Warren  00:48

Good to be here. 


Johnny Thomson  00:49

Now, before we get on to the main topic of discussion and this apparent disconnect between cyber and so-called 'real world' risk management, tell me a little bit about you, Guy, and your organisation.


Guy Warren  01:00

Yeah, so I'm Chief Executive of ITRS Group. We provide monitoring and analytics tools to enterprise customers, a lot of them in financial services, but also telcos, media, large service providers, technology service providers, MSPs, that's managed service providers, and organisations around the world. Generally organisations that require a very high level of availability, 99.99% uptime, and guaranteed performance, where even being slightly slow can be a major impact on their business. You can imagine if you're a bank or a telco, or your online streaming media, your Netflix, is running slow, that's a real problem. If you're a gaming site and your shared online game is performing badly, people will stop playing. So, highly demanding IT environments, which often also need high security. They need to be well locked down, and often have valuable assets inside the data centres, which bad actors will target.


Johnny Thomson  02:03

And as I mentioned in my introduction, in your role there and the services you offer, you're particularly concerned, aren't you, about this silo-mentality around things like cybersecurity and operational resilience within organisations. So what's the problem? And why do you feel it needs to be addressed, Guy?


Guy Warren  02:20

Yeah, I think the industry has grown up requiring operational teams to run the data centre, and to do it, you know, with high availability, high performance and operational disciplines. And tooling has grown very strongly around that. Cybersecurity probably came in a lot later. And the obvious thing to do, as it was back in the day with physical buildings, is to try and secure the perimeter. You have, you know, access gates into a building, you have big fortress gates on a city. And so things like firewalls were invented, and it was assumed that if you could secure the perimeter of your IT, you were probably safe: people couldn't get in, and the people who were inside were trusted and weren't going to do anything wrong. And therefore, that was the approach that was taken. So most organisations would set up an IT security team, maybe not even reporting into the Head of Production. Often there was a Chief Information Security Officer. And that's probably pretty much how it is for most organisations today: there'll be a CISO, Chief Information Security Officer, and a CIO, Chief Information Officer, and they may not be the same person and may not even report into each other. All of the skills, knowledge, disciplines, tooling and processes are embedded in the security team. And the operations team get very little training and tooling that lets them try and improve the security of the environment. Their job is to keep it up and running and highly performant. Adding the extra responsibility for it to be secure as well hasn't generally been done. It's a bit like trying to bolt quality on at the end of a poor production process. If you're not putting a car together properly, you can have the best QA in the world, but you will miss a lot of defects that are just impossible to find until the car is in the real world. The same is true with trying to add security after the event to an IT estate.
You will find vulnerabilities and you will be able to shut them down, but you will miss some, which will eventually be exploited by bad actors who are trying to get inside your estate.


Johnny Thomson  04:31

And I guess the lines have just become increasingly blurred, haven't they, between the two areas. And with the pandemic, where we've seen even further accelerated digital transformation within organisations and the world, this just makes it even more of an issue now, doesn't it?


Guy Warren  04:47

It does. We are seeing a slight change, in that we've had something called DevOps for a while now. It was brought in by the big internet giants, your Googles, your Facebooks, your Amazons and so on, where they were running, you know, very large and complex software in shared data centres. And they wanted to be able to bring new features through incredibly quickly. So they would develop them and put them live. Someone like Google will be making 2,000 or 3,000 changes to the software that we use: the internet search engine, the email, the maps, all those products that Google have available are being updated and changed all the time. And the focus would still be around performance and availability. But what we're now seeing is something called DevSecOps, where the 'Sec' is the security side of it, which says the development and operating of software ought to have security embedded in it. And knowing that it is being deployed and managed securely, and even designed to be secure, is actually becoming more and more important. Not too many organisations are yet doing full DevSecOps, but it does show the direction the world's going in, and that's a good direction. Once you start moving what would have been in your data centre, which you think you've locked down, into the cloud, you need to make sure that the cloud has itself been secured properly. Have all the default passwords been changed? Have all the other security settings that would give someone access, if they knew AWS inside out, been changed? There is now software which you can point at your cloud tenancy, and it will tell you all the vulnerabilities you may have left open and not shut off, through which someone could access what is effectively your data centre's equipment, except it's obviously running in AWS or Azure or whatever.
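The kind of cloud-tenancy scan Guy describes can be sketched in a few lines. This is an illustrative toy, not any real AWS or Azure API: the setting names and the "insecure default" values below are invented for the example.

```python
# Toy sketch of a cloud-configuration audit: flag any tenancy setting
# that is still at a known-insecure default. All names here are invented.

INSECURE_DEFAULTS = {
    "admin_password": "changeme",      # default credential never rotated
    "ssh_open_to_world": True,         # port 22 exposed to 0.0.0.0/0
    "storage_public_access": True,     # storage readable by anyone
}

def audit_tenancy(settings: dict) -> list[str]:
    """Return one finding per setting left at an insecure default."""
    findings = []
    for key, bad_value in INSECURE_DEFAULTS.items():
        if settings.get(key) == bad_value:
            findings.append(f"{key} is still at its insecure default ({bad_value!r})")
    return findings

tenancy = {"admin_password": "changeme",
           "ssh_open_to_world": False,
           "storage_public_access": True}
for finding in audit_tenancy(tenancy):
    print(finding)
```

Real tools in this space work the same way in spirit: a catalogue of known-bad configurations, checked mechanically against your live estate.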


Johnny Thomson  06:38

Yeah, and on this point about blurred lines, Guy, can you give me some examples, perhaps, of how, let's say, cyber risk is really just part of everyday real-world risk? There was the recent Colonial Pipeline attack, for example, wasn't there?


Guy Warren  06:52

Yeah, yeah. So the risks are many and various. They can be everything from trying to stop you trading, what would be called a denial of service attack, a DoS attack, sometimes a distributed denial of service attack (DDoS), where a bad actor wants your website off the internet, and they will attack it with hundreds of robots that are trying to bring your web service down. There are people who want to come inside and steal data, and they might be straight criminals who want financial data of one form or another. There might be, and this is very common at the moment, ransomware. So again, they get inside your data centre and encrypt key pieces of your data and software, and won't decrypt it unless you pay a ransom. And those ransoms can be tens of millions or even hundreds of millions of dollars to get your own data back. And sometimes the decryption is never given to you. So there's a whole industry growing. And there are those who steal IP. A lot of American companies have claimed that their intellectual property has been stolen by nation state bad actors, where a country has deliberately picked up their IP, not to blackmail them, not to trash it, but to copy it. And that can be software, that can be a design patent, it can be, you know, manufacturing processes, and they're taken to another country, copied, and they then flood your market with replicas of your product, which has taken you years or tens of years to develop. So the motives can be many and various, and they can be, you know, political, whatever. There were a lot of accusations of tampering after the 2016 election, and again after the 2020 election, through the voting machines, which had multiple vulnerabilities in them. So it depends what your objectives are as to, you know, why people are coming in. I think the other thing is that this assumption that by having firewalls you've protected it is easily beaten.
We've seen a number of attacks recently where the bad actors have embedded their code inside software which was then deployed in the data centre, willingly. It could be monitoring tools, which is what my business sells, that actually have bad code in them, which will then go and do whatever damage they're built to do. You willingly installed it as what you thought was a way to monitor your estate, and lo and behold, it's actually now doing damage to your estate, without your knowledge.
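The denial-of-service idea mentioned above can be made concrete with a toy rate limiter: each client gets a token bucket, and requests beyond the refill rate are rejected. This is only a sketch of one application-level defence under assumed parameters; real DDoS mitigation happens upstream, at CDNs and traffic-scrubbing services.

```python
# Toy per-client token bucket: a burst of automated requests quickly
# exhausts its tokens and gets 429 responses, while the refill rate
# still lets a normal client through. Parameters are illustrative.

import time

class TokenBucket:
    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

buckets: dict[str, TokenBucket] = {}

def handle_request(client_ip: str) -> str:
    bucket = buckets.setdefault(client_ip,
                                TokenBucket(capacity=5, refill_per_sec=1.0))
    return "200 OK" if bucket.allow() else "429 Too Many Requests"

# A burst of 20 instant requests from one "bot" gets mostly rejected.
results = [handle_request("203.0.113.7") for _ in range(20)]
print(results.count("429 Too Many Requests"))
```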


Johnny Thomson  09:22

And these digital and code-based threats, they have very real-world outcomes.


For example, if somebody was to tamper with, you know, something IoT, the Internet of Things, that can disrupt the supply chain. Access control systems likewise could physically open up your business to somebody who wants to get in. Fire suppression systems can be tampered with, which can lead to massive physical damage. And that's the whole point, isn't it?


Guy Warren  09:55

Yeah. You know, with Colonial Pipeline, it took them four days to recover from that, and there was no gas going through to that part of North America. There's been penetration of physical devices, they're called programmable logic controllers, PLCs, which were controlling power stations, and Iran claimed that their PLCs, these controllers which control production lines and cooling systems and so on, had been hacked by the Israelis, and were shutting off key systems in nuclear power stations. So yes, your IT, you know, varies from a full computer all the way through to an embedded device like an electronic valve. Most of them have some electronics which can be tampered with. There's a company in San Francisco who have been investigated because the chips which they manufactured had a backdoor built into them. And those chips are used in thousands and tens of thousands of computers, and the computers with those chips are open for someone to come in unbeknownst to you, and your actual computer chip is attacking you. So yeah, it doesn't have to be full computers in a data centre. It can be a fire suppression system, it can be access control gates for entry to a building, it can be a digital passport, it can be, you know, you name it, it can be hacked.


Johnny Thomson  11:14

I guess it can even be something as simple as email, where somebody uses social engineering techniques to fool somebody and gain access to something that then allows them to breach security systems and so on.


Guy Warren  11:29

There's a well-reported case in Formula One. In the 90s, two drivers were racing for McLaren, and one of them was told on the radio to come into the pits, and he duly came into the pits. And when he got there, the team were completely shocked as to why he'd come in, and he said, I got a message to come in and I acknowledged it. And they said, we didn't give you the message. So someone had managed to hack his radio, telling him to come into the pits. So, you know, the goals are nefarious and the techniques varied. They now have secure comms, I should add, I happened to find that out. But, you know, it depends what goal you have. That might be worth a lot of money, if you can win a Formula One championship because a leading driver pulls in thinking he's been summoned to the pits.


Johnny Thomson  12:15

It's a good example. It makes me think of smart vehicles as well, and there are potential problems even there, aren't there, with people's safety: that somebody could literally hack into your car and create some damage that would put you in danger.


Guy Warren  12:27

And how would you prove it? The car manufacturer would say, my software was good, it's fully tested and checked and has been certified by a regulatory body. But someone who wants you to crash gets access to your software, and your car doesn't do what it's supposed to do: it accelerates when it should brake, or drives into an oncoming vehicle. How can you prove that wasn't just an error in the software, or a genuine mistake, or whatever? It is unlimited risk with technology now, and often very physical risk, like, you know, the death of someone in a car in the case of self-driving cars. Most cars today have some degree of smart features, whether it's a lane-change indicator or dynamic cruise control. If that's overridden, and you think you're safe, suddenly you've put yourself at risk.


Johnny Thomson  13:14

Yeah. So we can see that digital and physical risks are really merging in many ways. So what do you feel is needed, Guy? What needs to be done?


Guy Warren  13:25

I think cybersecurity needs to go right back to the beginning and be embedded from the start of the software engineering process. There is a technique called zero trust networking. It basically says: assume that any other piece of software that's trying to talk to you is a baddie, and make them prove that they are a trusted counterparty that you should be talking with. Most software is actually a combination of pieces of software working in concert, but the default is to respond to any API that's called. If you're told to give it some data, you give it some data, because generally you're operating inside the firewall, this concept of a secure perimeter. So you have to design your software, both the way the software works and the way data is managed, on the wire and at rest, such that it is not easily got at, even if someone is inside your data centre, or particularly if someone's inside your data centre. And then as you come up through the operational processes, the people whose normal job is operational resilience need to take on operational security and resilience, the DevSecOps paradigm we talked about earlier. Train them on what CVEs are. CVE, Common Vulnerabilities and Exposures, is a term used by security guys, and I wonder how many operations guys will know what a CVE is or how to close it down. It can be anything from how you've configured Linux through to, you know, an unknown bad actor or a known bad piece of software. But we do not train software engineers or operations staff on security properly; we expect to try and add it on afterwards. And that's not going to work. It's too late. There are too many places you can have a hole.
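The deny-by-default stance described above can be sketched roughly as follows. The HMAC token scheme, the shared secret and the service names are assumptions made for this illustration; real zero-trust deployments typically use mutual TLS or signed workload identities rather than a single shared key.

```python
# Sketch of a zero-trust API handler: every caller must present a
# verifiable credential before any data is returned, even "inside"
# the perimeter. Scheme and names are illustrative only.

import hashlib
import hmac

SHARED_SECRET = b"demo-secret"  # placeholder; real systems use per-service keys

def sign(service_name: str) -> str:
    """Issue an HMAC token binding the caller's claimed identity to the key."""
    return hmac.new(SHARED_SECRET, service_name.encode(), hashlib.sha256).hexdigest()

def handle_api_call(service_name: str, token: str) -> str:
    # Deny by default: prove identity first, answer second.
    expected = sign(service_name)
    if not hmac.compare_digest(expected, token):
        return "403 Forbidden"
    return f"data for {service_name}"

print(handle_api_call("billing-service", sign("billing-service")))  # authorised
print(handle_api_call("billing-service", "forged-token"))           # rejected
```

The contrast with the "respond to any API that's called" default is the point: the unauthenticated path returns nothing useful, wherever the caller sits on the network.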


Johnny Thomson  15:09

Yeah. So each needs to have a better understanding of the other's discipline, in essence. So...


Guy Warren  15:14

Starting for all of them right at the beginning, you're not allowed to write code unless you understand security.


Johnny Thomson  15:20

Yeah. And if you're a traditional kind of security or physical risk manager, or you're responsible for that within the operational chain, then you need to know a lot more about the digital side of things as well, because it will have real-world threats?


Guy Warren  15:34

Yeah, totally. And that can be a storm model, or a hurricane model. We had the famous 1987 'no, there's no hurricane coming' and then all the trees got blown down, as we all remember. But what if someone deliberately tampered with a weather reporting system, and shipping and a bunch of other things took decisions based on it? I'm afraid digital crime is very broad now. And we rely far too... well, not too much, but we rely heavily on technology, and on it being reliable and not being tampered with. It's only a matter of time before we have very significant physical risks triggered by technology failures or tampering.


Johnny Thomson  16:16

And as a consequence, that spills off into connected fields that are tasked with risk management and risk transfer, such as insurance. 


Yeah, risk engineers, advisors and underwriters also need to expand their skills around cybersecurity and digital transformation and all of that, because it is ultimately going to have real-world consequences that impact that area as well.


Guy Warren  16:41

I completely agree. Look, my customers, who are themselves heavily regulated and very risk averse, as they should be, are now sending very lengthy and detailed questionnaires to companies like mine who provide tools that run in their data centre, to say, 'can you prove the provenance of your codebase?' They call it supply chain security. So do I know every line of code and where it came from, and that none of the code written and embedded in my software has been put there by a bad actor? Can you prove that you're in control of all the libraries you buy in, and all the processing in it? Plus, if you're running a SaaS service, and this would have been quite common anyway, is your data centre itself secure, such that we're not at risk? But the supply chain thing is a new thing. It came off the back of the SolarWinds breach, where a monitoring tool had been hacked by a nation state bad actor, and a large number of organisations, including the Department of Homeland Security and a few others, were notably hacked through it. So it's good. People are having to go back upstream and say, I can be as secure as I can be, but I'm relying on those programmable logic controllers, I'm relying on my fire suppression unit, I'm relying on my monitoring tool to also be secure. I'd better go and check.
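One concrete form of the supply-chain check Guy describes is verifying every shipped artifact against a pinned cryptographic hash, so tampered code, as in the SolarWinds case, fails the check before it is deployed. A minimal sketch, with invented file names and contents:

```python
# Toy supply-chain integrity check: pin a SHA-256 hash for each
# dependency, then reject any artifact whose bytes no longer match.
# File names and contents are invented for illustration.

import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Pinned manifest: what each dependency's bytes *should* hash to.
artifacts = {"libmonitor.py": b"def collect(): return 'metrics'\n"}
manifest = {name: sha256_of(data) for name, data in artifacts.items()}

def verify(name: str, data: bytes) -> bool:
    """True only if the artifact matches its pinned hash exactly."""
    return manifest.get(name) == sha256_of(data)

print(verify("libmonitor.py", artifacts["libmonitor.py"]))  # genuine copy
tampered = artifacts["libmonitor.py"] + b"import backdoor\n"
print(verify("libmonitor.py", tampered))                    # injected code
```

A single injected line changes the hash entirely, which is why package managers and build pipelines lean on exactly this kind of pinning.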


Johnny Thomson  17:56

So if you're one of these operational people, or a more traditional risk manager, shall we say, what kind of resources are out there, Guy, that will help these people extend their skills and their knowledge, so that they're more aware and more au fait with all of this digital security area, and so on?


Guy Warren  18:18

As I say, it used to be a specialist domain all of its own, and there's plenty of information on the internet aimed at a CISO about how to be secure and how to lock down their environments. I think risk managers now need to start reading. They need to be aware that there's a risk here, and start reading about what can be done in an organisation, both to lock down its own processes and tooling and everything, and also to reach back into its supply chain and make sure that's secure. There's no shortage of information, that's not the issue. It's the awareness and the focus on it that's been the limiting factor, I think.


Johnny Thomson  18:58

Yeah. And to sum up, it's just time to stop separating cyber risk from physical risk and start seeing it all as just 'risk'.


Guy Warren  19:07

Correct. Correct. We've given five or six examples, they overlap very, very much. So people used to just worry about, you know, a GDPR data breach or someone stealing some data and contacting your customers. That's now a trivial risk compared to some of the other risks we're talking about now.


Johnny Thomson  19:24

Yeah, absolutely. So yeah, link it all together, rather than keeping things apart.


Brilliant, Guy. Well, thanks for the insight today, it's been a real pleasure chatting with you.


Guy Warren  19:37

Great, thank you Johnny, yeah.


Johnny Thomson  19:38

And yeah, it's fascinating, because I think we're all sometimes a little bit afraid of digital terminology and so on, and that kind of almost puts you off. You might feel an expert in risk, but be completely put off by this separate field where you feel like you almost have to start again.


Guy Warren  19:59

Yes, there's plenty of information out there. You can start with a dummies guide and build it up. Yeah, it's, it's just knowing that it's a problem and you need to deal with it.


Johnny Thomson  20:07

Yeah. Fantastic. Thanks, Guy. And well, that's all for the latest episode of the RiskACUMEN podcast. If you have any questions or comments around the topic we've been discussing today, or any of our other risk-related content, please just head to our LinkedIn page. You can find a link above. Thanks again, Guy. And until next time, everyone, goodbye for now.


