Basics

  • InfoSec/Assurance is a nascent field that has been evolving "organically." Formal educational/training programs are just getting bootstrapped, and most of the high-level professionals, as well as the prominent minds in the field, have come to it by varied paths and backgrounds, having self-taught most of what they know about the subject. Formal methods and practices in infoSec are also still developing, and the field is considered largely "an art" more than a science. Most of what is known has arisen in reaction to attacks and breakdowns rather than from some pure, underlying scientific structure, so the body of accumulated knowledge is all over the place, and it's hard for any one expert to know everything about the area, especially because it's evolving so quickly in concert with the technologies on which it feeds. Good luck!
  • CIA: Confidentiality, Integrity, Availability. This is the classic InfoSec triangle that you will read and hear about frequently if you're studying the subject. We now know there are more "points to the triangle," but "dodecahedron" doesn't have the nice ring of "triangle," so it's kind of "triangle (+)" for now. And the shape (kind of) captures the tradeoffs between the desirable goals, eg. greater confidentiality makes availability tougher, etc.
  • Part of the evolution in the field comes from the evolving threat landscape. In recent decades, we've seen a fundamental shift from hacking for pride or malice to hacking for profit. Hacking has gone from asocial teenagers in the basement to a sophisticated black-market industry. In fact, one suite of hacking tools from a vendor in the former Soviet Union was advertised as coming with a year of free technical support. This has been bad news for infoSec - we're now up against well-trained and well-equipped professionals, and there is now a shadow infrastructure for cashing in on compromised information and resources, leading to the kind of eternal escalation, or "arms race," we saw in the Cold War with the USSR.
  • Key terms: Vulnerabilities are weaknesses that could be attacked. (Consider the analogy of humans - we have soft tissue that can be catastrophically damaged by hard metal objects, for example, like swords.) If we're worried about being attacked, we might try to minimize our vulnerabilities by wearing armor, but there will be spaces between the helmet and the chest protector that could be exposed when we move certain ways. The same is true for organizations. Exploits are the tools used to attack the vulnerabilities, eg. the swords. If no exploits exist for a vulnerability, then there is ostensibly no threat but these things have a way of popping up unexpectedly, so it's probably not wise to rely on that. Threats are the potential attacks but that potential may be low if no one knows about your vulnerability or there are no known exploits or few people are motivated to hurt you (exactly unlike Microsoft, for example). Risk is the threat discounted by the probability of it actually being realized: a meteorite could kill you which would be super bad but the probability is...well, astronomical so the risk is quite low, despite the consequence being catastrophic if it did occur.
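    The risk idea above is basically arithmetic: impact discounted by likelihood. Here's a minimal sketch of that discounting (my own illustration with invented numbers, not a formula from the notes or any standard):

    ```python
    # Hypothetical illustration of risk = impact discounted by probability.
    # All impacts (in dollars) and probabilities below are made up.

    def risk(impact, probability):
        """Expected loss: how bad it would be, weighted by how likely it is."""
        return impact * probability

    # Catastrophic consequence, astronomically low likelihood -> tiny risk.
    meteorite = risk(impact=10_000_000, probability=0.0000001)

    # Modest consequence, quite likely -> much bigger risk.
    phishing = risk(impact=50_000, probability=0.25)  # 12,500.0

    print(f"meteorite risk: {meteorite}, phishing risk: {phishing}")
    ```

    The point the sketch makes: the scarier-sounding event is the smaller risk once you factor in probability, which is why risk management ranks asset/vulnerability pairs rather than just reacting to the most dramatic threats.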
  • Planning is an imperative for assurance, which is the broader challenge that encompasses infoSec. Assurance is making sure the organization and its staff have safe access to the information they need to do their jobs effectively. Part of that is having plans in place so that when bad things happen to good organizations, they can still function and aren't debilitated or destroyed. An Incident Response Plan (IRP) guides the immediate response to an attack so nobody panics and the right steps are followed quickly and in the right order. A Business Continuity Plan (BCP) is executed after the incident has ended (if needed) and ensures that the organization has a way to keep critical business functions running and minimize disruption until things are back to normal. A Disaster Recovery Plan (DRP) provides the steps for getting back to normal after a major incident.
  • Controls are the things we can put in place, as managers, to address the risks facing the organization - the things we can do to manage the risk to acceptable levels. Controls come in the form of 1) policy, 2) education/training, 3) technology, or some combination of two or all three. Of course, none of them will work without good managerial practice, eg. we could reduce the risk of password compromise by setting a policy requiring users to create a new 50-special-character password every 2 days, but there would be severe backlash and users would derail any benefits by writing the passwords on Post-its, etc. In organizations, infoSec is really more managerial than technical.
  • Remember our goal is to manage risk to the level deemed to be in accordance with the organization's strategic goals. (That's super hard to do accurately, but we do the best we can.) Part of risk management is determining which asset/vulnerability pairs need to be addressed with controls and then identifying the control options and assessing the feasibility of implementing them.
  • Feasibility is multi-faceted, with issues ranging from "Do we have the technical infrastructure (staff as well as the technology itself) to support it?" to "Is there political support within the organization to make it work?" (Think in terms of controls that depend on new policy...and compliance.) But for controls that involve expenditure of dollars for services or equipment, financial feasibility must also be considered, so a Cost/Benefit Analysis (CBA) is required. This is tricky in infoSec because the benefit of a control is really the value of avoiding a cost, eg. from a data breach. So the benefit is "not incurring a $500,000 expense," for example. Anyway, the point is to ensure that the total cost of a control is exceeded by the expected value of the benefit. (See the tutorial movie on how to compute CBA found on the Tutorials page in this blog site.)
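    As a rough sketch of the CBA arithmetic - all of the figures and the expected-loss style of estimate below are my own invented assumptions for illustration, not numbers from the tutorial:

    ```python
    # Hypothetical CBA sketch. The "benefit" of a control is the expected
    # loss it lets you avoid; the cost must include everything, not just
    # the sticker price. All figures are invented for this example.

    expected_breach_loss = 500_000   # cost of the breach if it happens
    prob_without_control = 0.10      # assumed annual likelihood, no control
    prob_with_control = 0.02         # assumed annual likelihood with control

    # Benefit = reduction in expected annual loss from having the control.
    benefit = expected_breach_loss * (prob_without_control - prob_with_control)

    # Total cost of the control: purchase price is only one piece.
    purchase = 15_000
    training = 5_000
    annual_maintenance = 4_000
    total_cost = purchase + training + annual_maintenance

    print(f"benefit: ${benefit:,.0f}, total cost: ${total_cost:,.0f}")
    print("passes CBA" if benefit > total_cost else "fails CBA")
    ```

    Note how the benefit is the awkward "not incurring an expense" kind: it only exists as a difference between two estimated probabilities, which is exactly why these estimates are hard to defend to upper management.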
  • Three critical points regarding CBA for controls: 
    1. You need to be super-conscientious to make sure the total cost includes all the costs, not just the purchase price but also the training of your staff, the annual maintenance fees, etc., 
    2. You'll be fighting an uphill battle to get funding from upper management because infoSec represents a "cost center" (it doesn't generate revenue - it only avoids further costs, and only "possibly"), and management may not understand the technology or the threats, so make your case clear and compelling.
    3. Just because benefit exceeds cost doesn't mean you have passed the go-forward threshold. All the other forms of feasibility (managerial, technical, operational...) must also have been met, but in addition, the Return on Investment (ROI) for the control must also be favorable. In other words, the control's benefit might exceed its cost by only a small amount over time relative to the revenue that same investment might have generated if applied to a new marketing campaign or the development of a new product instead. To choose the control, its CBA must show an ROI that beats the other uses of that money within the organization.
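    A hedged sketch of that last comparison, with hypothetical numbers - the "marketing" figure just stands in for whatever else the organization could do with the same money:

    ```python
    # Hypothetical ROI comparison. A control can pass its CBA (benefit >
    # cost) and still lose to a competing use of the same funds. All
    # numbers are invented for illustration.

    def roi(benefit, cost):
        """Return on investment, as a fraction of the money spent."""
        return (benefit - cost) / cost

    control = roi(benefit=40_000, cost=24_000)    # positive, passes CBA
    marketing = roi(benefit=60_000, cost=24_000)  # same spend, bigger return

    # Benefit exceeds cost for the control, but the same dollars would
    # earn more elsewhere, so the control still loses this round.
    print(f"control ROI: {control:.0%}, marketing ROI: {marketing:.0%}")
    ```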
  • The legal system impacts infoSec in two ways, one represents a burden, the other a tool: 
    1. Burden: It imposes legal requirements that the organization must comply with to limit criminal and civil liability. This burden can be significant in terms of time and actual cost because it means the organization must keep up with the latest laws and regulations (federal, state, local) and implement new policies and processes for compliance. And the laws around infoSec spring up and evolve quickly, trying to keep pace with the threats, which keep pace with the technology.
    2. Tool: It provides both deterrence to prevent crime and a system for pursuing prosecution and restitution (the victim getting money back from the perpetrator). Thus, the good guys have a weapon which ostensibly should make criminals think twice about attacking and should get them punished when they do. It should even make them pay back the victim organization for damages. Unfortunately, it's really hard to catch these guys, especially if they're overseas. And when they are caught and prosecuted, they may escape conviction. If they are convicted, they may get a light punishment, especially if it's a first offense. If the organization sues them for damages, it may lose, or if it does win, it may be difficult or impossible to collect the funds - the offender may not even have any money to collect. And all of this takes money, which the victim organization has to invest to go forward with the prosecution and/or civil suit. Lawyer fees are high, as is the cost of gathering the evidence (forensics) to discover and show exactly what happened, when, and why. Bottom line: the legal system is a marginally helpful tool and needs to do better to help the good guys win this war.
  • An important legal concept for InfoSec is that of due care/due diligence. In general, the legal system reflects our society's expectation that people are reasonably careful when they do anything that could potentially have an adverse effect on others. Due care is the level of "carefulness" society expects, and it varies, of course, depending on what you're doing - mowing your lawn, driving on the freeway, performing surgery - these all carry different expectations about how careful you should be. If you accidentally hurt someone but you were demonstrating due care, then you (ostensibly) are free of liability, but if you were not exercising due care, you could be sued for any damages that accrue. The same is true for organizations. If there is a data breach and customers' identities are stolen as a result, then the organization had better be able to show that it was exercising due care in the stewardship of that data. If not, it's in big trouble, and the customers could sue for damages, probably successfully. Part of the infoSec professional's duty to the organization, then, is to ensure that due care is being applied and, if not, that corrective measures are put into place to get there before a problem occurs.
  • Due diligence is often confused with due care, but they are not the same. Due diligence is the performance of the actions that society would expect of you if you were meeting the expected level of carefulness known as due care. So if you were driving on the freeway and meeting our expectation of due care, you would be doing your due diligence by checking your mirrors and blind spot before changing lanes. In terms of the organization and infoSec, you would be checking the network firewall logs on a daily basis, for example, to discover any breaches and inform potentially victimized customers immediately.