To protect your assets, you must first know what they are, where they are, and how they are tracked and managed. Are they secured? Who has access to them? Who tracks and manages them? Do you have functional procedures in place to respond to and recover from a security breach quickly? Do you have a process improvement cycle to prevent recurrence?
These are all important issues related to assets. It’s important to remember what an asset is — it’s anything used in a business task. Generally, asset protection involves identification of assets, assessment of an asset’s value, and a determination of the technologies needed to provide sufficient security for that asset. There are many facets to the job of asset security including:
- Cloud Computing
- Secure Coding
- Identity Management
- Information Assurance
- Public Key Infrastructure
The cloud offers computing services as a commodity. This involves a wide range of capabilities including online storage and backup, virtual/remote desktop, collaboration services, software as a service, platform as a service, and infrastructure as a service. Popular services include online office productivity (such as Google Docs or Office 365), computing services for custom applications (such as Engine Yard or Windows Azure), or complete back-end scalable datacenters (such as GoGrid or Rackspace). While cloud computing can greatly benefit an organization, it also introduces new and unique security concerns.
Cloud services are at odds with some regulations and security standards. Each organization is responsible for its own compliance with requirements such as prohibitions on commingling certain data types, hardware types, or data locations. Traffic flow must also be understood. Is your sensitive and critical data encrypted in transit and while stored or processed in the cloud? Who has access to the encryption keys? What procedures are in place to manage ease of access, recovery options, downtime concerns, backup, privacy protections, and speed of interaction and throughput? Cloud computing is transforming how organizations deploy technology, so the benefits and drawbacks need to be weighed carefully before shifting aspects of your infrastructure into the cloud.
Virtualization is the creation and/or support of a simulated copy of a real machine or environment. Virtualization can be used to provide virtual hardware platforms, operating systems/platforms, storage capacity, network resources, and applications. It can also be used to host applications on a different OS than the one they were originally designed for, or to allow a single set of server hardware to host several server operating systems in memory simultaneously. Virtualization offers lower hardware costs, reduced operating costs, efficient backup and restoration, high availability, portability of services, faster deployment, scalability, and more. Virtualization adds security to the computing environment by permitting servers to be logically separated from each other. However, virtualization can complicate licensing, patch management, and regulatory compliance, and it may introduce slower service performance, a greater potential for a single point of failure, and security concerns arising from hardware re-use or sharing.
Secure coding practices are essential to reducing the threat caused by the exploitation of processes, bad/poor coding, and flaws in design. Secure coding includes the consideration of appropriate controls at the onset of development, proper consideration given to design, robust code and error routines, minimizing verbose error messages, eliminating programmer back doors, bounds checking, input validation, separation of duties, and comprehensive change management. Failure to use secure coding practices leads to software that is susceptible to buffer overflow attacks, DoS attacks, and malicious code injection attacks. Non-robust code can also provide a path for database and command injection attacks.
Secure coding practices can include many aspects of secure design integration and attack prevention. For example, software can be designed to authenticate all resource requests and processing actions before allowing a task to operate. Additionally, limiting and sanitizing input to prevent scripting, meta-character, and command injection attacks is an essential part of secure coding. Secure coding is more than just a few extra lines of code; it is an entire process and architecture of software development.
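As a sketch of the input-sanitization idea, the allowlist pattern below accepts only characters that are explicitly expected rather than trying to strip every dangerous meta-character. The function name and pattern are hypothetical illustrations, not from any particular product.

```python
import re

# Hypothetical allowlist: accept only letters, digits, and a few safe
# punctuation characters, up to 32 characters long.
USERNAME_PATTERN = re.compile(r"[A-Za-z0-9_.-]{1,32}")

def is_valid_username(value: str) -> bool:
    """Return True only if the entire input matches the strict allowlist."""
    return bool(USERNAME_PATTERN.fullmatch(value))

print(is_valid_username("alice_01"))         # expected input passes
print(is_valid_username("alice; rm -rf /"))  # shell meta-characters rejected
```

Rejecting anything outside the allowlist (rather than blocklisting known-bad characters) is the safer default, because attackers routinely find meta-characters the blocklist missed.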
Secure coding is an essential security practice not just for vendors that sell or release products to the worldwide market, but also for internal software developers whose code is used exclusively by internal users or is exposed to the world via an Internet service. One of the biggest mistakes companies make in relation to the Internet is assuming their Internet servers are secure and cannot be compromised, and that if they were ever compromised it would not lead to serious consequences or a breach of their private network. This is usually a poor assumption. With the growing popularity of fuzzing tools to find coding errors, the proliferation and distribution of buffer overflow exploit code, and the several variants of code injection attacks (including SQL, command, XML, LDAP, SIP, etc.), no Internet service can ever be assumed to be immune from breach.
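The SQL injection variant mentioned above is commonly defeated with parameterized queries. The following minimal sketch uses an in-memory SQLite database (the table and values are illustrative only) to show how a placeholder keeps attacker-supplied text as data rather than executable SQL.

```python
import sqlite3

# In-memory database standing in for a real backend (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user(name: str):
    # The ? placeholder binds the value as data, so it can never be
    # interpreted as SQL syntax, no matter what the caller supplies.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user("alice"))        # [('alice', 'admin')]
print(find_user("' OR '1'='1"))  # [] -- the classic injection string finds nothing
```

Had the query been built by string concatenation, the second call would have returned every row; with binding, the injection string is just an unmatched username.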
Companies collect a lot of customer and employee data. Identity management involves the protection of all personally identifiable information (PII). This protection includes proper classification of information, delineation of the lines of communication, and strict policies and procedures for access control. Accountability is a key requirement to hold all information requestors (‘subjects’, both internal users and outside attackers) liable for their actions.
Credentials are a popular form of PII subject to attack. All repositories of personal information, access channels to those repositories, and exchanges of information with those repositories need to be protected with strong authentication and encryption. Today’s sharing of information, transient locations of data repositories, and society’s acceptance of weak authentication set the stage for transitive attacks. Transitive attacks occur when you extend trust to a party without realizing that this trust implicitly includes other trust relationships you were unaware of, any of which can defeat your security.
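One concrete piece of protecting credential repositories is never storing passwords directly: store a salted, deliberately slow hash instead. A minimal sketch using Python's standard library (the iteration count and function names here are illustrative; match current guidance in practice):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest) using a salted, slow key-derivation function."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt per credential
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, stored):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, stored)  # constant-time comparison

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("guess", salt, stored))                         # False
```

The per-credential salt defeats precomputed lookup tables, and the constant-time comparison avoids leaking information through timing differences.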
Information assurance satisfies management’s desire for a given security profile, indicating that all data is properly protected and able to be accepted as accurate and readily available. The set of processes needed to support this assurance requires the establishment of a reliable means to lock down assets and track their usage. Specifically, information assurance is focused on the security of data or information typically stored in files. It is important to properly manage the risk of using, processing, transmitting, and storing these data files. Secure data management addresses not just electronic or digital issues, but physical storage media (especially portable media) as well.
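One way to "lock down assets and track their usage" at the file level is an integrity baseline: record a cryptographic digest of each data file, then re-hash later to detect tampering or corruption. A hedged sketch (file names and directory layout are invented for illustration):

```python
import hashlib
import tempfile
from pathlib import Path

def build_manifest(directory):
    """Map each file name in the directory to its SHA-256 digest."""
    return {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(Path(directory).iterdir()) if p.is_file()}

with tempfile.TemporaryDirectory() as tmp:
    data_dir = Path(tmp)
    (data_dir / "payroll.csv").write_text("id,amount\n1,1000\n")
    baseline = build_manifest(data_dir)

    # Simulate an unauthorized modification, then detect it.
    (data_dir / "payroll.csv").write_text("id,amount\n1,9999\n")
    changed = [name for name, digest in build_manifest(data_dir).items()
               if baseline.get(name) != digest]
    print(changed)  # ['payroll.csv']
```

In practice the baseline itself must be stored and transmitted securely (e.g., signed), since an attacker who can rewrite the manifest can hide the change.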
Public Key Infrastructure (PKI) is a security framework generally comprised of four main components: symmetric encryption, asymmetric encryption (public key cryptography), hashing, and a reliable method of authentication. Symmetric encryption is used for bulk encryption for storage or transmission of information. Asymmetric encryption is used for digital signatures and digital envelopes (i.e., secure exchange of symmetric keys). Hashing is used to check and verify integrity.
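The hashing role can be illustrated in a few lines: a digest of the message travels with it, and any alteration to the message changes the digest. The message contents below are invented for illustration.

```python
import hashlib

# A digest computed over the original message (illustrative content).
message = b"transfer $100 to account 42"
digest = hashlib.sha256(message).hexdigest()

# An attacker alters one byte; the recomputed digest no longer matches.
tampered = b"transfer $900 to account 42"
print(hashlib.sha256(message).hexdigest() == digest)   # True: unmodified
print(hashlib.sha256(tampered).hexdigest() == digest)  # False: integrity check fails
```

Note that a bare hash only detects accidental or unauthorized change; in a PKI the digest is additionally signed with a private key, which is what binds the integrity check to an authenticated identity.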
How will you assure that reliable authentication is used so that only valid entities participate in the PKI environment, and that keys are delivered, used, and revoked securely? Customers’ belief in the credibility of your certificates, and therefore in the security of transactions with your website, depends on the reputation and reliability of the CA. Following high-profile compromises of certificate authorities, blind trust in digital certificates has been called into question. As with any protection measure, companies need to understand what protection PKI technology affords, as well as the technology’s limitations and vulnerabilities.