“Good engineering involves thinking about how things can be made to work; the security mindset involves thinking about how things can be made to fail. It involves thinking like an attacker, an adversary or a criminal. You don’t have to exploit the vulnerabilities you find, but if you don’t see the world that way, you’ll never notice most security problems.”
–Bruce Schneier, cryptographer, computer security and privacy specialist, writer
Computer security literature and documentation are filled with recommendations for information security best practices. Much of this guidance is based on historical precedent, on the pain experienced during security breaches and on the lessons of incident response. Security professionals need to be aware of these practices, especially when they conflict with social factors such as user needs. On servers, networks and even personal systems, systems engineers practice (or try to practice) these policies continually, because the headlines are filled with news of computer security violations.
When applying best practices, systems administrators and network professionals draw on guidance from vendors, governmental organizations, formal training classes and the available literature in books and on the Internet. Such guidance includes:
- Physically secure systems.
- Patch early and patch often.
- Limit administrator user rights to those who actually need them, have all others run as “standard user” and enforce role separation.
- Implement Defense in Depth.
- Implement security baselines.
- Apply strong password and file access rules.
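The password rule in the list above can be made concrete. The following is a minimal sketch, assuming a hypothetical policy of a 12-character minimum plus three of four character classes; real policies vary by organization, and current NIST guidance (SP 800-63B) actually favors length over composition rules:

```python
import re

# Hypothetical policy values for illustration only; an actual
# organization's rules will differ.
MIN_LENGTH = 12

def meets_password_policy(password: str) -> bool:
    """Return True if the password satisfies this sample policy:
    minimum length plus at least three of four character classes."""
    if len(password) < MIN_LENGTH:
        return False
    classes = [
        re.search(r"[a-z]", password),         # lowercase letter
        re.search(r"[A-Z]", password),         # uppercase letter
        re.search(r"[0-9]", password),         # digit
        re.search(r"[^a-zA-Z0-9]", password),  # symbol
    ]
    return sum(1 for c in classes if c) >= 3
```

The point of codifying the rule is that it can then be enforced uniformly at account creation and password change, rather than left to each user's judgment.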
Best practices are often codified in tools and standards. For example, Microsoft’s Best Practices Analyzer (BPA) is built into the Server Manager for Windows Server 2008 through 2012 R2. BPA provides automatic guidance and remediation for configuration and security problems. For operating systems, SQL Server, Exchange Server and a wealth of other applications, these tools will guide administrators and security professionals through automated paths to harden their systems.
Other guidance comes from the U.S. National Institute of Standards and Technology (NIST) Computer Security Resource Center and its Special Publication 800 series (NIST SP 800). Most applicable to the U.S. federal government, particularly the executive branch and the Department of Defense, the documents in NIST SP 800 provide detailed instructions and recommendations. I recommend starting with SP 800-12 (An Introduction to Computer Security), SP 800-39 (Managing Information Security Risk) and SP 800-53 (Recommended Security Controls).
As part of President Obama’s Comprehensive National Cybersecurity Initiative (CNCI), the National Security Agency (NSA) offers detailed guidance as well. One of the twelve initiatives in the CNCI is to advance cybersecurity education in the general community, including businesses and schools, K–12 through college. For example, October has been designated National Cybersecurity Awareness Month.
More recently, NIST has released the NIST Cybersecurity Framework. Based on an executive order signed in February 2013, this program applies best practices and lessons learned over the last five years of hardening and strengthening the executive branch’s infrastructure to protect it from hackers and cybercriminals.
If you look at books and publications on cybersecurity, starting with Microsoft’s original Windows 2000 Security class, it’s clear that we continue to emphasize common information security best practices: Keep up with patches, disable unnecessary services, have users work with limited privilege, follow system hardening principles, and maintain an ongoing program of user education.
Depending on the industry, there are guidelines, standards or even federal regulations that drive cybersecurity. These include the Payment Card Industry Data Security Standard (PCI/DSS), the Health Insurance Portability and Accountability Act (HIPAA) and industry-specific guidance. Unfortunately, as the breaches at Heartland Payment Systems and Target show, these standards are often treated as comprehensive security solutions rather than as the baseline, minimal requirements they actually are.
In cybersecurity, defenders must remain vigilant at all times, but the bad guys only have to win once. Staying effective requires continuing education, awareness of current threats and attacks, and an understanding of the hacker mindset. As cryptographer Bruce Schneier points out, this may not be a natural way of thinking for the typical system or network administrator. Courses such as the Certified Ethical Hacker (CEH) program from EC-Council train defenders in the methodologies they’re up against.
In addition to applying best practices for hardening and securing systems and networks, administrators must also test their environments for vulnerabilities and misconfigurations. This includes using both open-source and commercial vulnerability scanners and then applying remediation plans to the results.
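To illustrate the first, most basic step that such scanners automate, here is a toy TCP port check in Python. The host, port list and timeout below are placeholders; a real assessment would rely on a dedicated scanner (Nmap, OpenVAS, Nessus and the like) for service fingerprinting and vulnerability matching, not a sketch like this:

```python
import socket

def scan_ports(host: str, ports, timeout: float = 0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`.

    A toy illustration of what vulnerability scanners automate at far
    greater depth (banner grabbing, CVE matching, misconfiguration checks).
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

A run such as `scan_ports("127.0.0.1", [22, 80, 443])` reports which of those services are listening; any open port not on the approved-services list is a candidate for remediation. Only scan hosts you are authorized to test.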
It is said of motorcycle riders that there are only two kinds: those who have dropped the bike and those who are going to drop the bike. The computer security analogue is that there are only two kinds of networks: those that have been hacked and those that will be. An organization must be prepared when hacks and other attacks occur. The incident response process, then, must be part of ongoing operations and a component of the Business Continuity and Disaster Recovery plan. Finally, after a hacking attack or data loss, an organization needs to look in the mirror: a post-incident review integrates the lessons learned into the organization’s security framework and identifies how to prevent the problem in the future.
One of the problems with compliance with regulations and security frameworks such as HIPAA or PCI/DSS is that it can create a false sense of security. Organizations view these sets of rules as complete requirements when they are actually minimum standards, baselines if you will. That an organization passes HIPAA or PCI/DSS compliance tests doesn’t mean it is secure in the larger sense, only that it has met the minimum requirements dictated by the certifying organization.
As a result, we see what I call Security through Shame. A hacking incident or data breach may be painful enough that an organization hardens its infrastructure, but only after the loss has occurred. Or leaders of companies in similar industries see an incident at a competitor or partner and decide to take action before it happens to them. In my CEH v8 classes, I call such an incident a Resume Generating Event, or RGE.
Similarly, for healthcare organizations, there is the U.S. Department of Health and Human Services “Wall of Shame” that any medical consumer can query. DataLossDB.org and DataBreaches.net maintain records of significant exposures of Personally Identifiable Information (PII), and Zone-H.org maintains a database of defaced websites. Companies and other organizations whose names and incidents appear on these sites face consequences, such as a loss of consumer trust, that may push potential customers toward a vendor that hasn’t had a data breach.
As a final note on implementing best practices, one of the most effective techniques is to raise an organization’s overall level of preparedness. The first step is to learn about and implement best practices for both infrastructure and security. The next step is to measure the implementation, with an eye on the security return on investment (SROI).
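One common way to put a number on that return uses the classic annualized loss expectancy (ALE) model from quantitative risk analysis. The sketch below is illustrative, not prescribed by the text above, and every dollar figure in the example is hypothetical:

```python
def annualized_loss_expectancy(single_loss: float, annual_rate: float) -> float:
    """ALE = SLE x ARO: single loss expectancy times the expected
    number of occurrences per year."""
    return single_loss * annual_rate

def security_roi(ale_before: float, ale_after: float, control_cost: float) -> float:
    """Return on a security investment, as a multiple of its cost:
    (loss avoided - cost of the control) / cost of the control."""
    loss_avoided = ale_before - ale_after
    return (loss_avoided - control_cost) / control_cost

# Hypothetical example: a breach costing $250,000 expected once every
# five years (ARO = 0.2) gives an ALE of $50,000. A $10,000 control
# that cuts the ARO to 0.05 reduces the ALE to $12,500.
before = annualized_loss_expectancy(250_000, 0.2)    # 50000.0
after = annualized_loss_expectancy(250_000, 0.05)    # 12500.0
roi = security_roi(before, after, 10_000)            # 2.75
```

In this example the control returns $2.75 in avoided loss for every dollar spent, which is the kind of figure that helps justify security budgets to management.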