Stories of #TeamScience
Scientific research is rarely performed by an individual scientist; it usually involves the work of an entire team of people. Moreover, some topics and challenges are so complex and multifaceted that they cut across faculties and call for multidisciplinary approaches. At TU Delft, a team of researchers and support staff have joined forces to tackle the issue of improving cybersecurity and detecting cybercrime. How do we keep the internet, and therefore Dutch and European society, safe?
In this longread, we talk to our cybersecurity professors Georgios Smaragdakis and Michel van Eeten.
The challenge
We need to ensure that we can protect the digital infrastructure that just about all our services and products run on – not only the electricity grid, the water network, our hospitals, national defence, the operations of our public authorities and banks, but also stock markets and entire production chains. Georgios Smaragdakis explains: “What the work I do on cybersecurity basically boils down to is: how can we make our world, and in particular the digital transformation of our society, as secure as possible? How do we secure it against cyber attacks? Everything, and I mean everything without exception, happens online these days – so our digital infrastructure is vulnerable."
We need to make sure the bad guys don't gain access to data they shouldn't have. "And because all the institutions and services we just mentioned are interconnected, we also need to know how to isolate them if they do get digitally attacked, to prevent a domino effect and ensure that no further damage occurs."
Georgios Smaragdakis
Georgios Smaragdakis is professor of cybersecurity at the Faculty of Electrical Engineering, Mathematics and Computer Science. His field of study covers the technological side as well as the geopolitical play around cybersecurity. Pasted to the door of his office in the Echo building is a poster for a hacking competition – which must be the perfect way to get students interested in the subject! Before joining TU Delft, Smaragdakis worked at TU Berlin and the Massachusetts Institute of Technology (MIT).
Michel van Eeten
Michel van Eeten is professor of cybersecurity at the Faculty of Technology, Policy and Management. His field of study covers corporate and government security, attacker behaviour, and the impact of policies. The press regularly seeks his views when there are data breaches or other problems on the worldwide web. “When it comes to tackling cybercrime, we really are only just beginning.”
An enormous subject
For many people, this topic seems somehow remote from their own lives. Georgios Smaragdakis: "Many people think: it's not my problem, or the subject is too big, too complicated. People often don't even fully understand what it is, or just how big it is." And no, it’s not as though the professor expects the average resident of the Netherlands to understand the full complexity of the digital network and how to secure it. "But we certainly do have a shared responsibility in this regard. Since literally everything is online these days, we are reaching the point where we all need to be aware of the dangers and of our own responsibility." He sees it as a fundamental responsibility for businesses and public authorities to secure themselves properly, and for citizens to be aware to some extent of the dangers looming. “Being careless online not only harms you, but might also give criminals access to other users such as your colleagues or family members. Do you agree, Michel?”
Michel van Eeten: “Yes I do, but I’ve also noticed a trend in the opposite direction. Around the year 2000, when people first encountered viruses and malware, these were entirely your own problem. You had to fix them yourself: ISPs did nothing, the government did nothing, and even software companies like Microsoft did nothing – until the latter realised it was damaging to their brand, at which point they started investing in security. And I do think users have dutifully done the things they were told to do over the years; most people install antivirus programs and run updates regularly. Users bear responsibility to a certain extent, but an increasing part of the problem is not in the hands of the user."
As a good example, he cites the outages last July caused by a faulty software update at the IT company CrowdStrike, which affected airports and hospitals and rendered them temporarily unable to function. Van Eeten: “End users couldn’t do anything about this. You can alert people to a WhatsApp message that says: 'Hi, I have a new phone number and I'm stuck, can you transfer a thousand euros quickly?' – sure, at that level we all have responsibility. But citizens could not have prevented enormous outages like the one that took place in July."
Disruptions like these clearly illustrate how big the subject is: cybersecurity affects hospitals, airports, banks, defence, production chains. Just about everything in our digital society is connected to everything else online, and our private data is contained in dozens of profiles and databases.
Phishing emails
Van Eeten: “There’s a lot that users simply need to assume: that Apple checks the App Store somehow, for example, and that there is no malicious intent behind the apps you download from it. And these assumptions are not wrong: without a certain amount of trust we wouldn't be able to do anything online."
And another thing: a certain percentage of your organisation will fall for that phishing email; that is simply inevitable, the professor argues. “The Maastricht University hack started with a researcher opening the wrong email. As a society, you can do all kinds of things to reduce these kinds of mistakes, but the percentage of people who are duped will never be reduced to zero. Even well-informed people make mistakes, for instance if they’re pressed for time, tired or distracted." Security should therefore never be based on the assumption that users will do their jobs flawlessly. "If you build a motorway, you also build crash barriers, right? An organisation that is hacked because a single user clicked the wrong link effectively has no security. As cybersecurity professionals, therefore, we need to make sure that security not only works and protects people, but also protects them from their own mistakes."
As cybersecurity professionals we need to make sure that security not only works and protects people, but also protects them from their own mistakes.
Michel van Eeten, professor of cybersecurity
Smaragdakis: "My point is that we need to raise awareness about cybersecurity, among users but also within organisations, in public authorities, in academia. But I totally agree with Michel that when building the systems, you have to assume that something will break down at some point. You may assume that someone will break in, because this is bound to happen sooner or later, and then you have to have the mechanisms and tools to ensure that this intruder can do as little harm as possible, and that the systems such as the power grids or the system of an airport can remain operational, and that, for example, access is not gained to the backup."
To measure is to manage
Scientific knowledge is important too. Michel van Eeten: “One of the important contributions TU Delft can make in the field of cybersecurity is the fact that we measure it. Many people simply claim things to be true, and there are many assumptions out there that aren’t entirely correct. The fact is, you only really know once you’ve studied and measured things. That is how we find out where the real risks are and how big they are. As a university, we are an independent institution, and therefore we play a very important role in this regard."
Smaragdakis: "We can investigate: how big is the problem? Sometimes it is a matter of things having gone wrong in an organisation, sometimes of things having gone wrong in the implementation of a system, sometimes people are unaware of certain steps or protocols, sometimes economies of scale cause security problems. Additionally, we also develop security systems and algorithms ourselves. But as is always the case with innovation: you might come up with 100 things, of which 99 don't work. The one that does work is the one what makes us get up in the morning.”
Raising awareness
A key question is how to raise awareness about cybersecurity. Smaragdakis thinks education can play an important role here: "We need to make cybersecurity an important topic at universities, in our education, and that is what we are doing here at TU Delft. After graduation, TU Delft alumni will then use the knowledge they’ve acquired here in their jobs in government, at water networks or airports, to make the systems there safer. In addition, this subject needs to become more important in public debate. And perhaps cybersecurity should be addressed even earlier in the education process, as early as secondary school."
Van Eeten adds: "We definitely need to train more people, because everywhere in society, at all levels, people are now developing digital products and services. This is wonderful, but these products and services also need to be made safe. For more and more people, it is a professional responsibility to learn about this. You could also say: if you fail to make safe products, you have to pay a fine if something goes wrong. That way, you encourage companies to invest more in making their products safer. Europe is currently exerting great regulatory pressure on this issue."
Non-events
The problem with cybersecurity is that you don't notice it until something goes wrong. Van Eeten concurs: “Someone at Microsoft wrote an interesting paper on this. He says cybersecurity is a non-event, something that doesn’t happen. You can't see it, and it may even seem unimportant." He explains that the very fact that it is not top of mind for many people actually means that it works well enough. "There's not that much going wrong."
The CrowdStrike outage generated discussion, because it made clear how vulnerable some systems and key infrastructure are. The Prime Minister's official response was: 'This is bound to happen more often in the future, get used to it.' Michel van Eeten: “You could see this as a very frank response. However, you could also say: no, that's not good enough, because we rely on these systems. We need to keep figuring out how to improve the situation. In the 1950s, we didn't say ‘A plane is bound to fall from the sky, yes, get used to it' either. Instead, we started looking at how to make aviation safer."
However, he does not think that cyber attacks will become more frequent. “There are constantly new kinds of risks, though. Every new technology brings new sources of disruption. When you add the new risks to the existing ones, it might seem like we’re going to have more and more problems, like we’re becoming more and more vulnerable. Objectively, this is not true: the number of disruptions is actually decreasing in relative terms, and has been for years. Fortunately! This is because very many people are working extremely hard to make systems safer and to detect errors. That is also why the research and teaching we do here at the university is so important."
“And, in reality, globally there aren’t that many people who are skilled enough to hack really complex systems. There are a few groups who can do it, but worldwide their number can be counted on two hands. They’re very good and they can do damage; these are groups that are pursued by the FBI and other security agencies."
Improvement
Anything you invest time and money in gets better. A good example is how video call technology improved during the Covid period. Van Eeten: "In a very short time, everyone switched to video conferencing. In the beginning, there were all kinds of glitches, people breaking into other people's conversations and so on. These problems were all sorted out very quickly. Now Teams and Zoom work very well. You have to look at it this way: the security benefits of keeping an economy running with video calls far outweigh the security risks. We need to realise that we are dependent on something, and that there are vulnerabilities; then more money goes into making this better. The reverse can work to the detriment of certain sectors or services. An example is the defence network: because it has never failed, there is no perceived dependency. This may very well mean it is suffering from under-investment."
Does it take war or a crisis to get systems secured? Van Eeten: “You need trial and error, and you need to be dependent to a certain degree to know what you need to invest in. The best way to set priorities is to figure out where it hurts. If a security risk leads to economic damage, we often start there. The reason Zoom and Teams became so good in such a short time was because Silicon Valley pumped billions into it during the Covid crisis to improve it, because there was money to be made."
Smaragdakis adds: "You do sometimes need shocks to make progress, yes. But we should not wait for the next crisis before raising awareness about cybersecurity – among everyone. We are in the midst of a digital transition: more and more services are taking place online, more and more data is on the internet. Schoolchildren and students need to learn about securing data, we need to train far more professionals, more research money deserves to go into it, it should be put higher on the political agenda. It is set to become more and more important."
We should not wait for the next crisis before raising awareness about cybersecurity – among everyone