
Cybersecurity is a public health crisis, so why don’t we treat it that way?

April 29, 2021 by Susan Morrow

The past year of dealing with the coronavirus pandemic has made one thing clear: institutions have a variety of processes and tools that use data to measure and inform public health across the globe. Cybersecurity, like public health, has a similar worldwide scope, so why is our cyber public health so lacking in guidance?

That’s a question posed by Adam Shostack in a recent public lecture.

“We have [health] guidance for the public,” Shostack wrote in his lecture abstract. “We have few equivalents in the world of cybersecurity. We do not know how many computers have malware on them. We do not know what the equivalent of deaths are: is it systems lost to ransomware? … Security experts rarely give advice on the level of ‘wash your hands.’ Their advice is rarely consistent with other experts, or the public. People are naturally confused and give up. These are all things that public health statistics could help us define and measure.”

It’s an intriguing idea from Shostack, who has vast experience in security engineering and threat modeling. Can this public health approach help define the application of effective cybersecurity measures?


The discipline of cyber public health

Public health studies look at how to identify emerging problems such as communicable diseases (viruses, for example) and non-communicable diseases (such as diabetes). Public health also factors in other variables, such as the environmental and behavioral aspects of health.

Shostack argues a discipline of cyber public health could create a better approach to the discipline of cybersecurity.

For example, one of Shostack’s concerns is the lack of consistent and rich data in the field of software engineering. As he states, “If we don't know what protects or improves the security of people or societies, that puts strong limits on the effectiveness of our efforts to improve it.”

Product decisions impact security

Decisions about how a product is designed, developed and deployed are often made without thinking about the person who will use the software.

Shostack used Microsoft’s Autorun feature as an example of how insecure defaults allow insecurities to permeate day-to-day tasks, such as opening a PDF. This feature, introduced in Windows 95, allowed software manufacturers to set a program to install automatically when a CD was inserted. Long story short, this opened an opportunity for malware to write copies of itself to USB drives. He went on to point out just how difficult it has been to quantify the impact of this single design decision on the proliferation of malware.
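The insecure default in question was driven by a tiny configuration file: any removable media carrying an autorun.inf at its root could name a program for Windows to launch. A minimal sketch (the filenames here are illustrative) looked like this, which is all a piece of malware needed to run itself on insertion:

```ini
; autorun.inf at the root of a CD or USB drive
[autorun]
open=setup.exe     ; program Windows would launch automatically
icon=setup.ico     ; icon shown for the drive in Explorer
```

Microsoft later disabled Autorun for USB media precisely because the default behavior, not any bug, was doing the attacker's work.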

The clue is in the data

Data is an issue in many cybersecurity scenarios, which makes it difficult to draw an analogy with public health and to enact public health-like cybersecurity structures. Getting to the heart of high-quality, informative data is complicated by the often-interwoven relationships between computers, people and applications. These relationships can be difficult to tease out and quantify, as Shostack states:

“…but I can't tell you how many security problems have resulted from a well-intentioned person enabling an attack at a crucial moment. I also cannot tell you how many have an ill-intentioned but authorized person behind them. …we spend dramatically more effort managing those vulnerabilities than the issues from interface bound attackers.”

Programmers and security engineering

How can these issues be related to public health? In terms of sheer numbers, vulnerabilities far outweigh design flaws. However, the numbers do not necessarily represent the scope of the problem. As Shostack says, “there are only seven coronaviruses that make people sick.” The programming tools themselves can cause flaws: expecting programmers to write perfect syntax effectively sets them up for a fall that could result in a security vulnerability.
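To make the "perfect syntax" point concrete, here is a minimal Python sketch (the table, rows and input string are invented for illustration) of how one small slip, building a query with string formatting instead of parameters, becomes a textbook vulnerability:

```python
import sqlite3

# In-memory database with a made-up users table for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [("alice", 0), ("bob", 1)])

# The slip: composing SQL with an f-string instead of a parameter.
user_input = "alice' OR '1'='1"
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()
print(len(unsafe))  # prints 2: the injected OR clause matches every row

# The parameterized form treats the input as data, not as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()
print(len(safe))  # prints 0: no user literally has that name
```

Nothing here requires malice or deep skill to get wrong; the vulnerable line and the safe line differ by a few characters, which is exactly the kind of unforgiving detail Shostack is pointing at.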

All of these factors must inform how we quantify cybersecurity issues.

Relying on developers to understand all the nuances of secure programming may be too much of an expectation. Training is a key part of reducing the number of vulnerabilities in software code.

Without public health-style data, decisions lose value and accuracy.

People, applications and cybersecurity

People have limited choices in the technology they use: the design and remit of the technology are already set. Shostack gave the example of an electric car: you cannot buy one without it being connected to the cloud. The public's use of technology is ubiquitous, and that use is sometimes exploited by cyberattackers.

However, returning to the public health analogy, Shostack says that cybersecurity experts rarely advise the public at the level of “wash your hands.” At best, the advice given on security hygiene is inconsistent, and it rarely answers the basic public health question: what is the threat?

What elements of cyber public health exist now?

Shostack pointed out that there is a nascent set of elements that can form a basis for a cyber public health initiative:

CERTs and information sharing

CERTs focus on information about vulnerabilities and indicators of compromise (IoCs). Data on intrusions are augmented by companies that focus on attacker groups and analyze data about them. However, publication and peer review are not consistent.

Government agencies

Shostack makes an important observation: scientific data gathering and dissemination are not listed as part of the mission of agencies such as CISA and ENISA. ENISA's focus, he notes, is on expertise, policy and capacity. By comparison, the CDC focuses on health security, putting science into action to help medical care, fight disease and nurture public health.


Shostack gives the example of the Cambridge Cybercrime Centre, an organization that collects in-depth data on cybercrime over time. He also mentions that the École Polytechnique de Montréal uses real-world observation to collect cybersecurity data.

The problem with “what you see is all there is”

The notion that the data you see gives a complete picture of a problem is not always true. This is an issue in cybersecurity, where information sharing assumes that the information gathered and shared fills in the picture; often, however, the information is only a view of certain isolated indicators and does not reflect underlying problems or mechanisms. And importantly, this data rarely informs new research. “What you see is all there is” (WYSIATI) is not a truism in cybersecurity or cyber life.

The example given is Google’s VirusTotal. This platform lets a user upload suspicious files and URLs, which it then analyzes, sharing the resulting data with the security community. This is analogous to the Council of State and Territorial Epidemiologists (CSTE) but goes further by focusing on a precise sample.
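As a sketch of what consuming this shared data looks like in practice, the snippet below builds a request for VirusTotal's public v3 file-report endpoint using only the Python standard library. The file hash and API key are placeholders; fetching a real report requires a (free) VirusTotal account.

```python
import urllib.request

VT_BASE = "https://www.virustotal.com/api/v3"

def file_report_request(sha256: str, api_key: str) -> urllib.request.Request:
    """Build a GET request for a file report; the v3 API authenticates
    with the x-apikey header rather than URL parameters."""
    return urllib.request.Request(
        f"{VT_BASE}/files/{sha256}",
        headers={"x-apikey": api_key},
    )

# Placeholder hash and key; substitute real values before calling urlopen().
req = file_report_request("0" * 64, "YOUR_API_KEY")
print(req.full_url)
# With a real key, the per-engine verdicts appear under
#   json.load(urllib.request.urlopen(req))["data"]["attributes"]["last_analysis_stats"]
```

Those per-engine verdicts are exactly where the discrepancies Shostack describes show up: different engines return different results for the same sample.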

However, there are discrepancies in the VirusTotal data. This misalignment of results makes it difficult to apply the scientific method to them. One thing that exacerbates this is that the details of how a detection algorithm works are often confidential and cannot be used to inform the data or any experiment required.

The building blocks of cyber public health

Shostack concludes that a mapping exercise is needed to create a discipline for cyber public health that complements information security in the same way that public health complements medicine:

  • Find the disease equivalent
  • Include both communicable and non-communicable equivalents
  • Add in environment and lifestyle homologs
  • Use a framework that sets out what is required for people to live healthy cyber lives

This can help develop a broader understanding of the types of problems people face, rather than focusing solely on how computers are compromised. This is a novel view of cybersecurity threat mitigation, and one that modern cybersecurity professionals can benefit from understanding.

How computers are compromised is understudied, according to Shostack.

Shostack also draws an analogy with the open-source community in software engineering, which he says derives research benefits from a more cohesive, community-oriented approach to development.


A call to action for a cyber public health framework

Shostack leaves with a call to action: academics, software professionals and technology policymakers can all benefit from, and feed into, cyber public health, which will be an interdisciplinary area. The area needs further research, and new funding vehicles are needed to make this happen. But before making any progress, the industry needs to admit that there is a problem in secure software engineering, one made more difficult by a lack of threat data.

There is an urgent need to understand how the public health effects of technological systems interplay with other elements of regulation. One complicating issue is that many systems are being used exactly as designed at the point that they are compromised.

Shostack finishes with an important point:

“The ways in which technology amplifies disinformation and distrust — and so inhibits our responses to the pandemic — should not be ignored. Those internet systems which carry that disinformation are working as intended. Their confidentiality, integrity and availability are not inhibited. The need for a discipline of cyber public health is not merely a need for our engineering work, but a need for our societies.”

As cybersecurity moves on from its adolescence into an established industry, perhaps it will catch up to where public health is today, and this cyber public health framework will mature.



Sources

  • We Need A Discipline of Cyber Public Health, Adam Shostack, CASA
  • Public health services, World Health Organization
  • VirusTotal
  • Council of State and Territorial Epidemiologists (CSTE)

Susan Morrow

Susan Morrow is a cybersecurity and digital identity expert with over 20 years of experience. Before moving into the tech sector, she was an analytical chemist working in environmental and pharmaceutical analysis. Currently, Susan is Head of R&D at UK-based Avoco Secure.

Susan’s expertise includes usability, accessibility and data privacy within a consumer digital transaction context. She was named a 2020 Most Influential Women in UK Tech by Computer Weekly and shortlisted by WeAreTechWomen as a Top 100 Women in Tech. Susan is on the advisory board of Surfshark and Think Digital Partners, and regularly writes on identity and security for CSO Online and Infosec Resources. Her mantra is to ensure human beings control technology, not the other way around.