Fortunately, there’s a cyber army out there fighting on our behalf – and Cardiff University’s Airbus Centre of Excellence in Cyber Security Analytics is training the next generation of troops. It will be their job to stay one step ahead of the criminals and safeguard the systems we all rely on.
It’s work that is becoming ever more important as systems get more complex and automated. Pete Burnap, Professor of Data Science and Cyber Security and director of the £5 million Centre, says: "Cyber attacks can be very complex and difficult to detect. They can go on without you realising for quite some time, until all of a sudden, everything comes crashing down around you."
As an example, he points to the Stuxnet cyber attacks on nuclear facilities in Iran. "The attackers kept speeding up and slowing down the centrifuges in the uranium enrichment plant, little by little. If the attack hadn’t been detected, they would have fallen apart and the whole thing would have collapsed."
Places where universities and companies can come together to turn ideas into real-world solutions are a front line in the war against hackers. For Cardiff University and Airbus, forming a partnership made perfect sense. The company employs around 900 people at its Defence and Space site in Newport, South Wales, and is a global player in securing critical systems – including the Westminster Parliament and 90 per cent of Ministry of Defence networks.
Dr Kevin Jones, Head of Cyber Security Architecture, Innovation and Scouting at Airbus, says: "For cyber security monitoring and attack detection, we need an increasing level of automation and analytics. We knew that Cardiff University had a proven track record of excellence in the development of machine learning and artificial intelligence."
The Centre has already chalked up many successes. It was the first institution to be recognised by the National Cyber Security Centre (the public-facing arm of the Government intelligence service GCHQ) as an Academic Centre of Excellence in Cyber Security Research. Its academics have published numerous papers in high-impact journals, and it secured £4 million to fund its research activities between 2017 and 2021.
"We’re focused on solving real-world problems and grand challenges," says Professor Burnap. "We’re doing a lot around machine learning and artificial intelligence (AI) for protection against cyber attacks. AI is very useful in seeking out cyber threats on IT systems, and even automatically blocking them or fixing the situation."
PhD student Matilda Rhode is working on fighting ransomware, where hackers lock up a system and demand money to restore it. It’s not just a theoretical threat: a massive global ransomware attack infected computers across 150 countries in May 2017, including parts of Britain’s health infrastructure. Her work focuses on finding ways to tell whether a piece of software is dangerous – even when its author has come up with ways to avoid being detected.
It’s a fight against a highly motivated enemy. Hackers who write malicious software know that the financial rewards can be vast, even if only a small percentage of ransoms are paid. "Cyber crime is driven by humans, which makes it a very interesting problem," she says. "I definitely want to stay in cyber security research. It’s a fast-moving area with a lot going on!"
However, survey after survey suggests that employers are finding it difficult to recruit people with the right digital skills, and that there’s a long-standing global shortage of cyber security experts. The Centre is helping to plug this skills gap, and major expansion is planned over the coming years.
It means that students like Matilda – and the centres of expertise to train them – are needed more than ever. Professor Burnap says: "The more we become dependent on AI, the more we’ll need people who can understand the security implications. We want people who can not only build new technologies, but build them so they’re secure and safe.
"We mustn’t end up with AI that’s full of holes, like a lot of current technology. It’s far too important for that."