“To catch a thief and to prevent theft, think like a thief.”

The idea is that if, as a security professional, you do not know what threats you face from crackers and hackers, you will never be able to build an effective security system. In many movies you will have seen that the police are always one step behind the thieves but unable to catch them; the reason is that they cannot think like a thief. If you want to catch a smart hacker, you have to be equally aware of how hackers operate. As an organization you must have the right mixture of both security and hacking skills. The image above illustrates what happens if you are not able to think like a hacker. You will note that this book explains both sides of the coin.


Confidentiality, Integrity and Availability (CIA) is a widely used benchmark for evaluating information systems security, focusing on the three core goals of confidentiality, integrity and availability of information.

  • Confidentiality

Confidentiality refers to limiting information access and disclosure to authorized users — “the right people” — and preventing access by or disclosure to unauthorized ones — “the wrong people.” Authentication methods such as user IDs and passwords, which uniquely identify a system’s users and control access to its resources, underpin the goal of confidentiality.
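As a concrete illustration of how user IDs and passwords can enforce confidentiality, the sketch below shows salted password hashing with Python’s standard library. The function names, iteration count, and sample password are illustrative assumptions, not a prescribed implementation:

```python
import hashlib
import hmac
import os

# Minimal sketch: the system stores a random salt and a slow hash of the
# password, never the password itself, so a stolen record alone does not
# reveal the credential.

def register(password: str) -> tuple[bytes, bytes]:
    """Create a salted PBKDF2 hash suitable for storage."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def authenticate(password: str, salt: bytes, stored: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)

salt, stored = register("s3cret")
print(authenticate("s3cret", salt, stored))  # correct password -> True
print(authenticate("guess", salt, stored))   # wrong password -> False
```

Note the constant-time comparison, which avoids leaking information through timing differences — a small example of thinking about how an attacker might probe the system.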

  • Integrity

Integrity refers to the trustworthiness of information resources. It includes the concept of “data integrity” — namely, that data have not been changed inappropriately, whether by accident or by deliberately malign activity. It also includes “origin” or “source integrity” — that is, that the data actually came from the person or entity you think it did, rather than an imposter. Integrity can even include the notion that the person or entity in question entered the right information — that is, that the information reflects the actual circumstances and that, under the same circumstances, the same data would be generated.
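Both flavors of integrity described above can be sketched with a keyed hash (HMAC): any change to the data invalidates the tag (data integrity), and only holders of the shared key can produce a valid tag (origin integrity). The key and messages below are invented examples:

```python
import hashlib
import hmac

# Illustrative shared secret and message; in practice the key would be
# established securely between the communicating parties.
key = b"shared-secret-key"
message = b"transfer $100 to account 42"

# Sender computes a tag over the message with the shared key.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))                         # untampered -> True
print(verify(key, b"transfer $900 to account 42", tag))  # tampered  -> False
```

A plain checksum would catch accidental corruption, but only a keyed construction like this also ties the data to a source that knows the key.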

  • Availability

Availability refers, unsurprisingly, to the availability of information resources. An information system that is not available when you need it is almost as bad as none at all, and it may be much worse, depending on how reliant the organization has become on a functioning computer and communications infrastructure. Availability, like other aspects of security, may be affected by purely technical issues (e.g., a malfunctioning part of a computer or communications device), natural phenomena (e.g., wind or water), or human causes (accidental or deliberate).

Information systems are decomposed into three main portions — hardware, software and communications — in order to identify and apply information security industry standards, as mechanisms of protection and prevention, at three levels or layers: physical, personal and organizational.

For each of the three categories (confidentiality, integrity, and availability), it is necessary to determine whether the protection requirement is:
High: A critical concern of the organization.
Medium: An important concern, but not necessarily paramount in the organization's priorities.
Low: Some minimal level of security is required, but not to the same degree as the previous two categories.
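The High/Medium/Low classification above can be recorded per asset and per CIA category. The sketch below is one possible way to model it; the asset names and ratings are invented examples, not recommendations:

```python
# Ordered from most to least stringent.
RATINGS = ("High", "Medium", "Low")

def rate(confidentiality: str, integrity: str, availability: str) -> dict:
    """Build a CIA protection-requirement profile, validating each rating."""
    profile = {"confidentiality": confidentiality,
               "integrity": integrity,
               "availability": availability}
    for category, level in profile.items():
        if level not in RATINGS:
            raise ValueError(f"{category}: unknown rating {level!r}")
    return profile

# Hypothetical assets with per-category requirements.
assets = {
    "payroll-database": rate("High", "High", "Medium"),
    "public-website":   rate("Low", "Medium", "High"),
}

# The most stringent rating across categories can drive which controls apply.
overall = min(assets["payroll-database"].values(), key=RATINGS.index)
print(overall)  # -> High
```

Summarizing by the most stringent category is a common conservative choice: an asset with even one High requirement is treated as High overall.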


Threat: An action or event that might compromise security. A threat is a potential violation of security.

Vulnerability: The existence of a weakness, or a design or implementation error, that can lead to an unexpected and undesirable event compromising the security of the system.

Target of Evaluation: An IT system, product, or component that is identified as requiring security evaluation.

Attack: An assault on system security that derives from an intelligent threat. An attack is any action that violates security.

Exploit: A defined way to breach the security of an IT system through a vulnerability.