Threat Research Blog

November 25, 2021

Errare humanum est

Nick Ellsmore is an Australian cybersecurity professional whose thoughts on the future of cybersecurity are always insightful.

Having deep respect for Nick, I really enjoyed listening to his latest podcast appearance, "Episode #79: Making the cyber sector redundant with Nick Ellsmore".

As Nick opened the door to debate on "all the mildly controversial views" he put forward in the podcast, I decided to take a stab at a couple of his points. For some mysterious reason, they struck a nerve with me.

So, here we go.

Nick: The cybersecurity industry, we spent so long trying to get people to listen to us and take the issue seriously, you know, we're now getting that, you know.

Are the businesses really responding because we were trying to get people to listen to us?

Let me rephrase this question.

Are the businesses really spending more on cybersecurity because we were trying to get people to listen to us?

The "cynical me" tells me No. Businesses are spending more on cybersecurity because they are losing more due to cyber incidents. It's not the number of incidents; it's their impact that is increasingly becoming devastating.

Over the last ten years, there have been plenty of front-page headliners that shattered even seemingly unshakable businesses and government bodies. Think of the Target attack in 2013, the Bangladesh Bank heist in 2016, the Equifax breach in 2017, the SolarWinds hack in 2020 ... the list goes on.

We all know how Uber tried to bribe attackers to sweep the stolen customer data under the rug. But how many companies have succeeded in doing so without being caught? How many cyber incidents have never been disclosed?

These headliners don't stop. Each one means another reputational blow, a battered stock price, rolled heads, stressed-out PR teams trying to play down the issue, knee-jerk acquisitions of snake-oil-selling startups, and so on. We're not even talking about skewed election results (a topic for another discussion).

Each one comes at a considerable cost. So no wonder many geniuses now realise that spending on cybersecurity can actually mitigate those risks.

It's not our perseverance that finally started paying off. It's their pockets that started hurting.

Nick: I think it's important that we don't lose sight of the fact that this is actually a bad thing to have to spend money on. Like, the reason that we're doing this is not healthy. .. no one gets up in the morning and says, wow, I can't wait to, you know, put better locks on my doors.

It's not locks we sell. We sell gym memberships. We want people to do something now to stop bad things from happening in the future. It's the concept of hygiene, insurance, prevention, health checks. People are free to skip these steps and run their business the way they always have ... until they get hacked, end up on the front page, wondering first "Why me?" and then appointing a scapegoat.

Nick: And so I think we need to remember that, in a sense, our job is to create the entire redundancy of this sector. Like, if we actually do our job, well, then we all have to go and do something else, because security is no longer an issue.

That won't happen, for two main reasons.

  • Émile Durkheim imagined a "society of saints", and even that is a utopia: crime would persist there too. Greed, hunger, jealousy and poverty are the constant companions of the human race, and they will never stop fuelling crime. Some are induced by wars, some by corrupt regimes, some by sanctions, some by imperfect laws. But in the end, there will always be Haves and Have Nots, and therefore fundamental inequality. And inequality feeds crime.
  • "Errare humanum est" (Seneca): to err is human.

Because of human errors, there will always be vulnerabilities in code. Because of human nature (and as its derivative, geopolitical or religious tension, domination, competition, nationalism, fight for resources), there will always be people willing to and capable of exploiting those vulnerabilities.

Mix those two ingredients — and you get a perfect recipe for cybercrime.

Multiply that with never-ending computerisation, automation, digital transformation, and you get a constantly growing attack surface.

No matter how well we do our job, we can only control cybercrime and keep a lid on it; we can't eradicate it. Thinking we could would be utopian.

Another important consideration here is budget constraints. Building proper security is never fun — it's a tedious process that burns cash but produces no tangible outcome.

Imagine a project with an allocated budget B to build a product P with a feature set F in a timeframe T. Quite often, such a project will be underfinanced, leading to a poor choice of coders, overcommitted promises and unrealistic expectations.

Eventually leading to this (oldie, but goldie):

Add cybersecurity to this picture, and you'll get an extra step that seemingly complicates everything even further:

The project investors will undoubtedly question why that extra step is needed. Is there a new feature that no one else has? A unique solution to an old problem? None of that? Then what justifies such over-complication?

Planning for cybersecurity to be built in is often perceived as FUD. If it's not tangible, why do we need it? Customers won't see it. No one will see it.

Scary stories in the press? Nah, that'll never happen to us.

In some way, extra budgeting for cybersecurity is anti-capitalistic in nature. It increases the product cost and, therefore, its price, making it less competitive. It defeats the purpose of outsourcing product development, often making outsourcing impossible.

From the business point of view, putting "Sec" into "DevOps" does not make sense.

That's Ok. No need.

... until it all gloriously hits the fan, and we go back to step 1. Then, maybe, just maybe, the customer will say, "If we had budgeted for that extra step, maybe we would have been better off."