‘We collectively need to do the right thing here’. Accenture’s Jacky Fox on the current state of cybersecurity.
In January, Palo Alto Networks hosted its Ignite on Tour Event in Ireland’s capital, featuring a number of prominent cybersecurity experts and leaders discussing the most pressing trends of today’s threat landscape.
One of the most anticipated keynote presentations of the day was delivered by Jacky Fox, global security strategy practice lead at Accenture, who showcased and discussed the World Economic Forum’s (WEF) latest Global Cybersecurity Outlook report, which was written in collaboration with Accenture.
Fox delved into the bones of the report in relation to the Irish cybersecurity scene, highlighting the effects of artificial intelligence (AI) and regulation, as well as the growth of ‘cyber inequity’ in the Irish business landscape.
Hoping to learn more about Fox’s views on Irish cyber trends, SiliconRepublic.com recently sat down with one of Ireland’s top cybersecurity experts to find out about the present and future of Ireland’s cybersecurity industry.
A helping hand
One of the first topics we broached in our discussion was cyber inequity, a particularly thought-provoking issue described by Fox at the Ignite event. Cyber inequity refers to the disparity between larger and smaller organisations in relation to cyber resilience, brought on by resource limitations such as finances and internal support.
Fox describes the situation as “larger organisations becoming more mature and smaller organisations becoming less mature”.
“As larger organisations are looking at their risk management through a lens of their third parties, they’re looking at some of these smaller organisations and saying ‘Well, here’s a questionnaire, fill it out, and if you don’t pass, we’re not going to do business with you’.”
Fox believes that this will result in a much smaller pool of third parties doing business with larger organisations, which might alienate smaller and younger companies and prevent them from innovating in their field.
“If we end up with a smaller number of third parties with specific services, then by the nature of doing that, you’re going to stifle innovation, because innovation happens in young companies. Innovation happens when you’ve got room to breathe,” she explains. “And it’s not about cyber innovation. It’s about innovation in whatever service they’re supplying, because people always want to differentiate.
“If we get rid of that differentiation, and have a very small number of monopolistic kind of suppliers, it’s not a good thing, and it’s not a thing that cybersecurity wants to drive.”
While this is a global issue, cyber inequity is especially undesirable for a small country such as Ireland, says Fox. “If you think about it from a resilience perspective, if you look at the whole of Ireland, and if everybody is relying on two or three suppliers for a particular service and one of them gets knocked out, it could knock out a third of Ireland if it’s something that’s quite critical.”
The key to preventing this stifling and monopolisation, according to Fox, lies with the larger organisations. Larger organisations, instead of “auditing the small organisations to death”, need to help the smaller businesses mature their cyber resilience and serve the market better.
“Ultimately, if the third parties aren’t secure, the larger companies aren’t either, because they’re part of their chain.
“We collectively need to do the right thing here.”
To shoot, or not to shoot
When talking about the major cybersecurity trends of today, it is inevitable that AI will make an appearance at some point.
AI’s involvement in the cybersecurity world has been frequently discussed over the past few years, including its potential for easing burnout among cyber professionals and its growing role in both attack and defence functions.
Agentic AI in particular is increasingly being talked about, especially for its potential impact on this attacker-defender conflict.
Fox says that while organisations could absolutely have their defences set up with agentic AI at the moment, there is a nervousness about fully committing the technology to managing cybersecurity processes.
“People are often choosing not to have that end bit, as in the response, fully automated as part of that workflow, and there’s a nervousness around it, because, like in military terms, will you shoot or not shoot?
“In defensive terms, it might be, will I cut off that workstation or not? You know, if it’s the CEO’s workstation or if it’s somebody who’s out in the middle of giving a presentation, and something happens that you cut them off, it can be quite impactful,” she says. “And I think people are a little nervous to go to that end state of saying, ‘Trigger it, make it happen’.”
However, Fox advises companies to start getting more comfortable with this decision, as attackers using agentic AI will be launching sophisticated attacks – from reconnaissance for vulnerabilities to infiltrating systems – at high speed.
“If you don’t have a response that’s going to happen at the same speed as those attacks, then that’s not really a comfortable place to find your organisation in.”
“So with a lot of things in cybersecurity, when we’re setting up tools where we think, or use cases where we think that a response might be catastrophic, or where it could be very impactful, we often set things up in what we call ‘pass through mode’, so we’re triggering that this response should happen,” she says. “We’re saying this is what I would do, but it’s either allowing it to go through for now and reporting on it, or it’s asking you to verify before you do it.
“So I would be suggesting to organisations that they need to get a lot more pushy about getting to that either pass through mode or where you have a human that actually verifies something at the end.”
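The ‘pass through mode’ Fox describes amounts to a human-in-the-loop gate at the end of an automated response playbook: the tooling decides what it would do, then either reports it, waits for analyst approval or executes it. The sketch below is a minimal illustration only, not Accenture’s or any vendor’s tooling; the Alert, Mode and respond names are invented for the example.

```python
# Minimal sketch of a response workflow with a configurable end gate.
# All names here are hypothetical, for illustration only.
from dataclasses import dataclass
from enum import Enum


class Mode(Enum):
    PASS_THROUGH = "report only"        # log the recommended action, take no action
    HUMAN_VERIFY = "ask before acting"  # require an analyst's approval first
    AUTOMATED = "act immediately"       # fully automated response


@dataclass
class Alert:
    host: str
    severity: str


def recommended_action(alert: Alert) -> str:
    # Placeholder decision logic: a real playbook would weigh context,
    # e.g. whether this is the CEO's workstation mid-presentation.
    return f"isolate workstation {alert.host}"


def respond(alert: Alert, mode: Mode, approved_by_analyst: bool = False) -> None:
    action = recommended_action(alert)
    if mode is Mode.PASS_THROUGH:
        print(f"[report] would have taken action: {action}")
    elif mode is Mode.HUMAN_VERIFY and not approved_by_analyst:
        print(f"[pending] awaiting analyst approval for: {action}")
    else:
        print(f"[execute] {action}")  # a real isolation call would go here


respond(Alert(host="ceo-laptop-01", severity="high"), Mode.PASS_THROUGH)
```

The point of structuring it this way is that moving from reporting to full automation later only means changing the mode, not rebuilding the workflow.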
She says that people need to really start thinking about this and planning so that if one day they actually do have an automated response in place, at least they have most of the process sorted so that it’s just a matter of saying “Okay, you can do it now”, as opposed to starting from scratch.
Disentanglement difficulty
But while organisations should be planning for the use of AI in their defences, Fox says that attention needs to be paid to having appropriate guardrails and policies in place at the same time.
Without proper governance and policies, organisations can fall victim to significant AI-related issues. For example, Fox says that she has seen examples of people inadvertently loading data up into public models, with that data popping up elsewhere.
One such example that she refers to, which she also highlighted at Ignite, was when stock image company Getty Images began seeing bits of their watermark appearing in AI-generated images.
“Even though it’s copyrighted, it’s getting absorbed into models and it shouldn’t. And I think the thing about that is, how do you disentangle something once it’s gone out?”
This highlights a major concern of ungoverned AI use, where data that gets uploaded into public models proliferates so intensely that it’s impossible to get back.
“In a traditional processing manner, the outcome is deterministic, like one plus one equals two. Whereas with AI, it is artificial intelligence therefore it’s non-deterministic. And if you ask it the same question on three different days, you may well get three different answers. Or if you ask for an image to be generated, you can get three different images by asking it in a row,” she explains.
“So you can’t go back and say, I can determine that when that question was asked at that time, that’s the answer that you got. And that piece went on to process something else. So therefore, if you lose data into an environment like that, you can’t get it back. It’s gone. It’s got sucked into something else. It’s just impossible.”
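Fox’s contrast between deterministic and non-deterministic processing can be shown with a toy sketch; it is purely illustrative and stands in for no particular AI system.

```python
import random


def traditional_process(a: int, b: int) -> int:
    # Deterministic: the same inputs always produce the same output
    return a + b


def generative_answer(prompt: str) -> str:
    # Toy stand-in for a generative model: because the output is sampled,
    # the same prompt can yield a different answer on each run, so you
    # cannot later reconstruct which output a given input produced
    return random.choice(["answer A", "answer B", "answer C"])


print(traditional_process(1, 1))           # always 2
print(generative_answer("same question"))  # may differ from run to run
```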
Celebrating commitment
While the cybersecurity landscape of today often seems like it’s full of concern, and understandably so, Fox has some particularly positive views on the industry and its future.
She cites the greater awareness of cybersecurity, particularly among the upper management echelons of organisations, and the greater proactivity towards defences as promising signs for the future.
The noble drive of cybersecurity professionals is also something Fox celebrates.
“I think the people who work in cybersecurity have a very public service gene about them. They’re not doing it for the greater glory. There’s real meaning to the work that they do. Like they get it,” she says.
“You know, you’re actually making a difference in the world if you work in this sphere, and it’s challenging, it can sometimes have very long hours, but it’s very rewarding work, when you either give somebody advice that you know is going to be really impactful and meaningful for them, or when somebody’s having their worst day and you go in and help them.
“It’s a really satisfying job if you’re prepared to put in the yards and keep up to date on things.”
Regulation is also part of this positive outlook; she believes it plays a big part in the increased responsibility and care shown by organisations.
“I personally am a big fan of regulation, because I think that’s actually what’s driving a lot of the good behaviour. Regulation is only a baseline, like it’s not actually what people need to do to be secure, but it is absolutely better than where we were 10 years ago,” she says. “So I think we’re doing all the right things.”
And while the back and forth between attackers and defenders continues to intensify, Fox believes the defenders are putting up a good fight.
“I don’t know who’s going to win in the end, but I think there’s a lot of people who are very committed to trying, which I find very positive.
“I can’t say that we’re winning, but we’re certainly not giving up.”