Friday, September 08, 2023

Cybersecurity

When deciding on the level of cybersecurity rules to enforce, there is always a tradeoff between security and cost (money and, more importantly, workplace efficiency). Since 100% cybersecurity is not possible, risk tolerance becomes the central question.

The first step of cybersecurity is identifying the threat: Are you trying to prevent middle school kids from gaining root access to your servers, or do you have national security agencies in mind? Are you trying to prevent data leakage or data corruption? If you want to minimize leakage, is your data really worth the protection cost? People confuse value with secrecy: being valuable does not automatically make something secret. Many things in life are valuable but not secret, and in my experience the real secrets are far fewer than commonly claimed. Classifying threats and security levels requires individuals who are both expert in security and aware of its costs. Good luck finding them. Your typical security advisor has a vested interest in scaring you enough to sell you expensive products.

Risk-averse managers lean toward more and stricter rules. Managers become risk averse when the expected value of taking risks is low for them personally, i.e., when they are punished if a risk materializes but are not rewarded enough when the benefits of risk-taking are reaped. In other words, when loyalty is valued over competence.

What is worse, stricter rules may create the illusion of security (security theater) while actually decreasing it. For example, if you force employees to change their passwords every week, they will choose more predictable passwords and even write them on Post-its stuck to the side of their screens!

In such an environment, the cost of lower efficiency almost never comes up in management meetings; it is simply passed down to the lower-level managers and engineers. If you are a software developer, your build times increase and builds sometimes fail outright. You spend a week discovering that the security software was blocking your build, then ask the IT department to add an exception for it. Since software is an ever-changing field, you face such problems regularly. If there is a security breach, individuals are held responsible and the system itself is never questioned. In the long run, this causes longer shipping times, lower quality, low morale, and loss of personnel. In short: less security, and less value left to secure...

You have to make peace with the fact that a security breach, like an earthquake, will eventually happen. What I propose is to use a minimal set of malware scanners and to focus on preventing data corruption by keeping good backup systems in place. Regularly verify that you can rebuild your system from those backups, so that a corruption has minimal impact.
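As a minimal sketch of what "regularly verify" could look like in practice, here is a hypothetical Python script that restores the latest backup into a throwaway directory and checks file hashes against a stored manifest. The backup paths and the manifest format are assumptions for illustration, not a prescription.

    # restore_check.py -- sketch of an automated restore test.
    # Assumes backups are tar.gz archives, each shipped with a manifest
    # of "sha256 relative/path" lines; all paths here are hypothetical.
    import hashlib
    import tarfile
    import tempfile
    from pathlib import Path

    BACKUP = Path("/backups/latest.tar.gz")      # hypothetical location
    MANIFEST = Path("/backups/latest.manifest")  # hypothetical location

    def sha256(path: Path) -> str:
        """Hash a file in chunks so large files do not exhaust memory."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def main() -> None:
        with tempfile.TemporaryDirectory() as scratch:
            # Restore into a scratch directory, never over live data.
            with tarfile.open(BACKUP) as tar:
                tar.extractall(scratch)
            failures = []
            for line in MANIFEST.read_text().splitlines():
                digest, rel = line.split(maxsplit=1)
                restored = Path(scratch) / rel
                if not restored.is_file() or sha256(restored) != digest:
                    failures.append(rel)
            if failures:
                raise SystemExit(
                    f"Restore check FAILED for {len(failures)} file(s), "
                    f"e.g. {failures[0]}")
            print("Restore check passed.")

    if __name__ == "__main__":
        main()

Run something like this from a scheduled job after every backup: a backup you have never restored from is a hope, not a backup.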

For more, see my previous posts.

1 comment:

Adnan said...


The "cobra effect" originated from an incident during British rule in India. The government offered rewards for dead cobras, which led to more snakes being killed. People began breeding cobras for profit, and when the government stopped the rewards, breeders released the snakes, causing the wild cobra population to increase.