Bellovin pointed out several years ago that most network penetrations are made possible by coding errors. Of those coding errors, about 50 percent are buffer overflows.
An attacker interested in penetrating your system knows that the weakest link includes not only people, but also the code they have written. The buffer overflow is a common technique because of the amount of C code in use and because the C programming language does not require array bounds checking. This gives an attacker an easy way to cause an error in a system in a controlled manner, which in turn allows access to an organization’s networks, applications, and information.
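A minimal sketch of the pattern an attacker looks for, using a hypothetical handler named `store_name` (the function and its 16-byte buffer are illustrative, not from any real system): `strcpy` performs no bounds check, so input longer than the buffer writes past its end.

```c
#include <string.h>

/* Hypothetical vulnerable handler: C performs no bounds checking,
 * so when input is longer than 15 characters strcpy writes past the
 * end of name. An attacker who controls input controls the bytes
 * that land beyond the buffer -- classically, the return address. */
size_t store_name(const char *input)
{
    char name[16];
    strcpy(name, input);   /* no length check: the buffer overflow */
    return strlen(name);   /* works only while input happens to fit */
}
```

Short inputs behave normally, which is exactly why the bug can sit unnoticed until an attacker supplies an over-long one.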
In fact, Robert Morris’ Internet worm (the Morris Worm) of the late 1980s exploited known vulnerabilities in Unix sendmail, finger, and rsh/rexec, including a buffer overflow in the finger daemon. Another example is the “Ping of Death,” which causes a buffer overflow when a fragmented ping packet reassembles to more than 65,535 bytes, the maximum legal IP packet size.
According to Bellovin, there are several things that organizations can do to mitigate the risk of buggy software. These include:
- Use any programming language other than unprotected C
- Program in Java
- Program in C++ using class String
- If an organization must use C, create a safe string library that handles string values properly
- If C must be used, evaluate the various library functions and use the ones that are known to handle string values safely
- Programmers should always check buffer lengths
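The last three points above can be sketched in C. The helper below is a hypothetical example of the kind of routine a safe string library might provide (the name `safe_copy` is an assumption for illustration): it checks the buffer length before copying and rejects input that will not fit, rather than silently overflowing.

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical safe-library routine: copy src into dst, which holds
 * dst_size bytes. Returns 0 on success, -1 if src plus its NUL
 * terminator would not fit -- it never writes past the buffer. */
int safe_copy(char *dst, size_t dst_size, const char *src)
{
    size_t len = strlen(src);

    if (dst_size == 0 || len >= dst_size)  /* no room for len chars + NUL */
        return -1;                         /* reject instead of overflow */

    memcpy(dst, src, len + 1);             /* copy including the NUL */
    return 0;
}
```

With an 8-byte buffer, copying `"short"` succeeds, while copying `"much too long"` returns -1 and leaves the buffer untouched.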
This is useful advice not only for an organization writing code, but also for testing and validation activities. For example, one test of a new or changed system should always be how it handles strings that fall outside expected values.
Bellovin also indicates that part of the reason buggy code exists is that requirements are usually insufficient. For example, a developer is commonly given a requirement such as the following:
“Field x should be 256 bytes long.”
which is significantly different from:
“Field x should be 256 bytes long, any input longer than this must be rejected.”
The second form of the requirement is clearly more complete than the first. A developer uses the requirements they receive to code the system; the more complete those requirements are, the more error-free the system will be. A software tester will also use these requirements to validate the functioning of the system.
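The second form of the requirement translates directly into code. The sketch below is a hypothetical validator for field x (the name `accept_field_x` is an assumption; the 256-byte limit comes from the example requirement): it measures the input first and rejects anything too long before any copy takes place.

```c
#include <stddef.h>
#include <string.h>

#define FIELD_X_SIZE 256   /* from the requirement: field x is 256 bytes */

/* Hypothetical validator for the completed requirement: input longer
 * than field x can hold is rejected up front (-1); otherwise it is
 * copied and 0 is returned. The length check makes the copy safe. */
int accept_field_x(char field[FIELD_X_SIZE], const char *input)
{
    if (strlen(input) >= FIELD_X_SIZE)  /* leaves no room for the NUL */
        return -1;                      /* reject, as the requirement says */

    strcpy(field, input);               /* safe: length already verified */
    return 0;
}
```

A tester can exercise the same requirement from the outside: a 255-character value must be accepted, and a 256-character value must be rejected.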
Unfortunately, firewalls don’t address every security risk organizations face. As leaders in the field of information security indicate, getting requirements right, coding and testing using those requirements, and using the correct tools are all aspects of a holistic approach to security.