What causes bugs in software?
A flaw or failure in a software program can arise for several reasons:
- Programming errors introduced while coding the application, such as logic errors, syntax errors, and semantic errors (a short illustration follows this list).
- Insufficient testing, whether because of limited time or a lack of skilled testers to examine the application thoroughly for defects.
- Frequently changing requirements and miscommunication among clients, business analysts, developers, and testers.
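To make the first item concrete, here is a hypothetical sketch (not drawn from any real application) contrasting a logic error, a semantic error, and a syntax error:

```python
def average_with_logic_error(values):
    # Logic error: the code runs, but divides by a hard-coded 2
    # instead of the number of values, so the result is silently wrong.
    return sum(values) / 2

def average(values):
    # Corrected version: divide by the actual number of values.
    return sum(values) / len(values)

def semantic_error_example():
    # Semantic error: valid syntax, but mixing a string and an integer
    # is meaningless and raises a TypeError if this is ever called.
    return "total: " + 5

# A syntax error, by contrast, stops the program from running at all;
# for example, `def average(values:` would be rejected by the interpreter.

if __name__ == "__main__":
    data = [10, 20, 30, 40]
    print(average_with_logic_error(data))  # 50.0 -- wrong, but no crash
    print(average(data))                   # 25.0 -- correct
```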
Y2K was both a software and hardware problem. Software refers to the programs that tell a computer what to do; hardware is the machinery of the computer itself. Software and hardware companies raced to fix the bug and provided "Y2K compliant" programs to help. The simplest solution turned out to be the best: the date was simply expanded to a four-digit year. Governments, especially in the United States and the United Kingdom, worked to address the problem.
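To see why the two-digit shortcut failed and why widening the field works, here is a minimal sketch (the function names are illustrative, not taken from any Y2K-era system):

```python
def years_elapsed_two_digit(start_yy: int, end_yy: int) -> int:
    # Pre-Y2K style: only the last two digits of each year are stored.
    return end_yy - start_yy

def years_elapsed_four_digit(start_yyyy: int, end_yyyy: int) -> int:
    # The simple fix described above: store the full four-digit year.
    return end_yyyy - start_yyyy

# An account opened in 1985, evaluated on 1 January 2000:
print(years_elapsed_two_digit(85, 0))        # -85 -- nonsense at the rollover
print(years_elapsed_four_digit(1985, 2000))  # 15  -- correct
```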
In the end, there were very few problems. A nuclear energy facility in Ishikawa, Japan, had some of its radiation equipment fail, but backup facilities ensured there was no threat to the public. The U.S. detected missile launches in Russia and initially attributed them to the Y2K bug, but the launches had been planned ahead of time as part of Russia's conflict in its republic of Chechnya; there was no computer malfunction. Countries such as Italy, Russia, and South Korea had done little to prepare for Y2K, yet they had no more technological problems than countries, like the U.S., that spent millions of dollars to combat the problem.
Because so few problems materialized, many people dismissed the Y2K bug as a hoax.
What Cybersecurity Lessons Can We Learn From Y2K?
The Y2K event was unique in human history and can provide rare insights into how computer systems and microprocessor-based devices function under unusual and unpredictable stress. And that should be instructive for cybersecurity professionals.
- Fixing a vulnerability may create new vulnerabilities. Many of the Y2K problems that did occur came from the patches and fixes for the bug, not from the bug itself. While testing for Y2K problems was thorough, testing of the fixes was sometimes less so. Always test the fixes thoroughly (a concrete sketch follows this list).
- Fixing your own vulnerabilities also improves cybersecurity for connected systems. During Y2K, the patches applied in the United States to systems underpinning global financial networks, for example, also protected countries that had done far less to prepare. Likewise, the cybersecurity fixes applied by a supplier may help protect you, and vice versa. Take a big-tent approach to cybersecurity and make sure everybody is doing their part.
- Don’t expect everyone to give you credit for averting disaster. Cybersecurity people are in an unenviable position, and it’s just part of the job. If you fail to avert disaster, many will blame you for the failure. But if you succeed, they may accuse you of being alarmist, of spending too much time and money on the problem, and of misrepresenting the threat. The best you can do is communicate clearly about the risks, the remedies, and, after the fact, the benefits of the crises you averted.
- The biggest risks come not from one point of failure or vulnerability, but from several. It’s easy to develop tunnel vision about a single vulnerability, but most major cybersecurity failures result from multiple points of failure, such as a lack of employee training combined with inadequate tools. Think holistically.
- Testing is everything. During Y2K, regulations mandating testing enabled the fixes that prevented the most serious problems. Red-team exercises and their many variants are valuable for figuring out in advance where the vulnerabilities lie. Be obsessive about testing.
- Investment to prevent catastrophe is expensive but often money-saving in the long run. Most of the damage caused by cyberattacks is, in the end, expressed in financial terms, but preventing or minimizing cyberattacks can also be costly. Make sure the cost-benefit analysis of cybersecurity investment is clearly stated in dollars and cents (as well as in other terms, such as reputational damage). While cybersecurity tools, programs, and staff cost money, breaches and attacks can cost far more.
- Old systems can create new problems. Legacy systems and programming languages that had fallen out of vogue meant that the people charged with fixing the Y2K bug often did not understand how they worked. (Companies brought programmers out of retirement to help.) While it’s easy to overlook legacy systems that have been churning away for many years, always consider how they might contribute to new problems in the future.
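To make the first lesson above concrete: one widely used Y2K remediation, date "windowing", inferred the century from a pivot year instead of widening the stored field. The hedged sketch below (the pivot value and function names are illustrative, not from any specific system) shows why such a patch fixed the rollover yet carried its own postponed defect, and why the fix itself needs testing.

```python
import unittest

PIVOT = 30  # Hypothetical pivot: 00-29 -> 2000s, 30-99 -> 1900s.

def expand_two_digit_year(yy: int) -> int:
    # "Windowing" fix: guess the century from a pivot year
    # instead of widening the stored two-digit field.
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return 2000 + yy if yy < PIVOT else 1900 + yy

class TestWindowingFix(unittest.TestCase):
    def test_rollover_is_handled(self):
        # The original Y2K failure mode is fixed:
        self.assertEqual(expand_two_digit_year(0), 2000)
        self.assertEqual(expand_two_digit_year(99), 1999)

    def test_fix_has_its_own_expiry(self):
        # But the patch introduces a new defect: a genuine 1925 date
        # would now be read as 2025, and in 2030 the year "30" will
        # still be read as 1930. The bug has only been postponed,
        # which is why the fixes themselves need thorough testing.
        self.assertEqual(expand_two_digit_year(25), 2025)
        self.assertEqual(expand_two_digit_year(30), 1930)

if __name__ == "__main__":
    unittest.main()
```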