This effort looks cool. Providing a codified overview of What We Should Know about cyber security could be very helpful, especially for teaching. I just read the Software Security Knowledge Area and it wasn't a bad read, though it felt a little "listy" without a good cognitive map. In particular, there's the cop-out on the completeness of their taxonomy of faults. I don't blame them for not claiming to be comprehensive, but just from reading this I can't tell whether they cover the bulk of the faults or even the important ones. I should come away with a better sense of what I don't know!
Then there was this thing that bugged me:
• A detection technique is sound for a given category of vulnerabilities if it can correctly conclude that a given program has no vulnerabilities of that category. An unsound detection technique on the other hand may have false negatives, i.e., actual vulnerabilities that the detection technique fails to find.
• A detection technique is complete for a given category of vulnerabilities, if any vulnerability it finds is an actual vulnerability. An incomplete detection technique on the other hand may have false positives, i.e. it may detect issues that do not turn out to be actual vulnerabilities.
Oy! This reverses the ordinary (i.e., mathematical logic) notions of soundness and completeness…sorta. They didn't quite flip the meanings; instead they picked a weird entailment class. Take soundness. The entailment they pick is "Program P, for vulnerability class V, has no Vs." It's that "no" that messes it up. In order to conclude that there are no Vs, it has to be the case that if there were a V, the technique would find it. I.e., the technique has to be complete with respect to "P has a V". And the last sentence of their definition makes it clear that they are thinking at the "P has a V" level. Their bogus complements (false negatives for "sound", false positives for "complete") likewise focus on the "P has a V" level, so they just screwed up. Sigh.
It would be much more straightforward to define a detection technique for V as a procedure that takes P as input and returns a list of "P has a V" statements (with specific Vs). Then the technique is sound if it produces no false positives and complete if it produces no false negatives. A sound and complete technique that returns the empty list lets us conclude that the software is secure with respect to V.
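To make that concrete, here's a minimal sketch of the framing I have in mind (all names are mine, not the KA's; the "ground truth" set is of course unknowable in practice, which is exactly why soundness and completeness are properties one proves about a technique rather than checks per run):

```python
# Model a detection technique's output as a set of findings, each a
# specific "P has this V" claim, and compare against a (hypothetical)
# ground-truth set of actual vulnerabilities in P.

def is_sound(findings, actual_vulns):
    """Sound: every finding is an actual vulnerability (no false positives)."""
    return findings <= actual_vulns

def is_complete(findings, actual_vulns):
    """Complete: every actual vulnerability is reported (no false negatives)."""
    return actual_vulns <= findings

def empty_result_proves_secure(findings, actual_vulns):
    """Only a sound AND complete technique that returns no findings
    licenses the conclusion "P has no Vs"."""
    return (is_sound(findings, actual_vulns)
            and is_complete(findings, actual_vulns)
            and not findings)

# Toy ground truth: P actually contains one vulnerability of class V.
actual = {"buffer overflow at parse()"}

# An incomplete technique misses it and returns nothing; its empty
# result does NOT establish that P is secure with respect to V.
assert not empty_result_proves_secure(set(), actual)

# Whereas on a genuinely clean program, the empty result from a sound
# and complete technique does establish security with respect to V.
assert empty_result_proves_secure(set(), set())
```

Note how the "no Vs" conclusion in the first assertion fails precisely because of a false negative, i.e., because the technique is incomplete for "P has a V"; that's the asymmetry the KA's definitions tangle up.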
Then there’s this:
It is important to note, however, that some detection techniques are heuristic in nature, and hence the notions of soundness and completeness are not precisely defined for them. For instance, heuristic techniques that detect violations of secure coding practices as described in 2.3 are checking compliance with informally defined rules and recommendations, and it is not always possible to unambiguously define what false positives or false negatives are. Moreover, these approaches might highlight ’vulnerabilities’ that are maybe not exploitable at this point in time, but should be fixed nonetheless because they are ’near misses’, i.e., might become easily exploitable by future maintenance mistakes.
Sigh. Detection techniques that are "heuristic" are generally unsound or incomplete (or both); that doesn't make the notions undefined for them. What they actually seem to be talking about is a problem (or maybe just an infelicity) with the definition of some category of vulnerabilities: if the category itself is only informally specified, there's no crisp ground truth against which to count false positives or false negatives.
Still! It’s in development and even as such, I’d point a student at it. These things aren’t supposed to substitute for textbooks, but they can be helpful as a quick orientation and sanity check.