
That company's big hack probably wasn't an employee's fault

Companies that blame an employee or human error following a cyberattack often neglect to account for broader security failings.

If you have read as many data breach notifications and security incident disclosures over the years as I have, you'll notice a familiar phrase appearing with alarming frequency: "human error."

An employee clicks on a malicious link and now an entire company has ransomware. Someone's password was stolen and used to access banks of customer data. A staffer unwittingly picked up a hacker's phone call and was tricked into allowing them access to the network. You get the idea.

In fact, several companies of late have used similar language to describe their security lapses, firmly placing the blame on the actions of their employees.

A U.K. transportation business collapsed after hackers were accused of guessing an employee's weak password. Retail giant Marks & Spencer blamed an employee at its outsourced IT provider for allowing a cyberattack and data breach that caused disruption for weeks. Healthcare provider Ascension said a contractor made an "honest mistake" when they installed malware that allowed hackers to siphon millions of people's health records. Even AI giant Anthropic recently blamed human error for the inadvertent release of its proprietary source code to the Internet.

Someone, surely, should do something about this epidemic of *checks notes* ...humans! The reality is that it's rarely — if ever — an employee's fault for causing a security incident. Technical guardrails exist to prevent humans from accidentally screwing up (we do, often) because humans are simply a cog in the wider machinery that can be exploited during a cyberattack.

As security consultant Brian Honan recently wrote:

"The real issue around cybersecurity is not human error, it is the failure of the technology and the system designs and architecture to support real human behavior.

...Even if a person did get phished or fell victim to a malicious email this should not prompt another round of finger-pointing. Instead, it should raise urgent questions about why so many of our systems still leave people so vulnerable."

I've covered a lot of data breaches during my career as a cybersecurity journalist, and it's important to recognize that things can and will go wrong. When they do, it's critical to learn something from the incident so that it doesn't happen again. (The way that news outlets cover and frame the reporting of a breach also matters, though not every media outlet or publication can parse the complexities of a security incident in the immediate aftermath. I'm sure I have been guilty of this in the past, too, but I try to live and learn.)

But while companies often blame their employees for a breach, it's the companies themselves that shoulder the ultimate responsibility.

In the case of that U.K. transportation company, its executives underinvested in basic cybersecurity measures, which allowed an employee to set an easily guessable password and let hackers break in and irreversibly scramble its systems.

Marks & Spencer years ago outsourced a key part of its IT security to a third-party corporate conglomerate with a history of security lapses, saving the retailer gobs of money by not having to pay its own in-house IT team — but evidently not saving it from a cyberattack.

The ransomware attack on the Ascension health network was not caused by a contractor accidentally installing malware on their computer, but by years of technical debt and other long-known security failings that allowed hackers to obtain patients' data with ease.

And you would think a tech titan of Anthropic's size would be able to keep its internal code secure, but alas, the company failed on not one but two separate occasions.

Even in the rare edge cases where an employee acts maliciously from within a company, the damage done by insider attacks — or by intrusions that rely on hijacking an employee's legitimate access — can be lessened by properly segmenting and compartmentalizing employees' access to sensitive data and systems.

We can (and should!) learn from cyberattacks, in the same way that we would want to learn from any mistake to prevent a repeat incident. Attributing cyberattacks solely to human error is scapegoating — an excuse for corporate types to avoid accountability.

It's only by reflecting on the whole picture of how and why an attack was allowed to succeed — including how we use (or fail to use) technology — that we can prevent the next one. And if any one employee is ever to blame, it's the leader at the very top of the corporate ladder who bears the burden when things go wrong.

~ ~

Thank you so much for reading ~this week in security~. Please consider a paying subscription. Feel free to reach out with any feedback, questions, or comments about this article: this@weekinsecurity.com.