Possible Content and Content Criteria
This page lists possible content and possible content criteria for
software facts. That is,
these are the facts themselves and some criteria for deciding which facts to
include. Jeff Williams considered four groups of facts:
the application or software itself
(both design and code), people, process, and tools used.
Other background material covers:
- links to and descriptions of similar
programs to learn from or model after, and
- possible scopes, including audiences,
product classes, goals, criteria, and terminology.
Possible Criteria for Content
It would be helpful if we had clear criteria for why one measure or metric is
included in the set of software facts and another isn't. Such criteria would
also help us develop new metrics or refine existing ones. Some possible
criteria for content are:
- facts should be verifiable (by the user? by a knowledgeable third
party? with access to source code?)
- based on science, that is, demonstrated influence on security
- accurate and consistent, in other words, objective and repeatable
- absolutely simple for vendors to produce or extract. This suggests these
criteria:
  - facts generated automatically, e.g., lines of code (see the
    line-counting sketch after this list)
  - tools to generate or find the facts are available to everyone
This criterion is repeated from above. Conceivably each fact might
be simple to produce while the ensemble is hard, or vice versa.
- not obvious or useless or just a disclaimer
- "Caution: Hot beverages are hot" - on a coffee cup
- "Not for human consumption" - on a package of dice
- "Caution: Remove infant before folding for storage" - on a portable
stroller
- "Do not attempt to stop the blade with your hand" - in the
manual for a Swedish chain saw
- unambiguous - it is clear what it means or measures
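As a rough illustration of the "generated automatically" criterion above, the
sketch below counts non-blank, non-comment lines of code in a directory tree.
It is written in Python purely for illustration; the file extension, comment
convention, and directory layout are assumptions of the example, not part of
any proposed standard.

    # Sketch: automatically extract one candidate fact -- lines of code.
    # Assumes Python sources with "#" comments; other languages would
    # need their own comment rules.
    import os

    def count_loc(root: str, extension: str = ".py") -> int:
        """Count non-blank, non-comment lines under root."""
        total = 0
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if not name.endswith(extension):
                    continue
                path = os.path.join(dirpath, name)
                with open(path, encoding="utf-8", errors="replace") as f:
                    for line in f:
                        stripped = line.strip()
                        if stripped and not stripped.startswith("#"):
                            total += 1
        return total

    if __name__ == "__main__":
        print("Lines of code:", count_loc("."))

A vendor could run something like this at release time and copy the number
into the label; anyone with access to the source could rerun it to verify
the fact.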
Possible Content
Jeff Williams organized facts into four groups: the application or
software itself (both design and code), people, process, and tools
used. We add an "other" category for business aspects. There is
overlap and synergy among items in the different groups, but this
separation should help us be more thorough.
Obviously many of these need standard units, e.g.,
function points and pages of documentation. Such standardization cannot be
perfect (see Rice's theorem), but at least the terms will be codified and
open to improvement.
The Software Itself
Maybe facts about design and architecture should be separate from
those about the artifact itself. Many of the results (e.g., no severe
weaknesses found) must refer to the process used,
either explicitly (in the label or a web site) or implicitly (meets some
performance standard).
- Do security controls exist? List of security controls
- Encodings supported
- Backend connections (email, Internet [as client or server])
- Resources or access: disk, screen, key logging, ...
- All communication over SSL?
- encryption used
- what are single points of failure?
- architecture signed off by app sec certified software engineer?
- Server (what does this mean? PEB)
- are there known vulnerabilities? percentage by type (CWE)
- percent of SQL queries that use prepared statements (see the
prepared-statement sketch after this list)
- uses OWASP Enterprise
Security API (ESAPI)?
- No banned APIs used? (banned by whom? PEB)
- what (open source) components are included?
- % LoC from open source
- size of code (LoC, function points, number of modules)
- coding language(s): object-oriented vs. procedural vs. functional
- code (or module) signatures (hashes); see the hashing sketch after this list
- What libraries are used?
- pedigree/provenance - what is the origin of the modules? internally
developed, contract/custom, commercial/closed, free/open
- tests and test results (some overlap with testing in the Process section)
- results of automated static analysis
- summary (e.g. no more than 7 high severity weaknesses found) or
details (2 XSS found, 17 buffer overflows, 2 hard-coded
passwords)
- mention tool & version used or some normalized measure ("we used a
tool that satisfies SwAC Level W")
- % of the application analyzed
- Limits (e.g., "no more than 32 767 crew changes per month")
- What are the configuration files? (registry entries, ...)
- Steve Christey names "Unforgivable Vulnerabilities": vulnerabilities so
easy to exploit and so well known that their presence "is highly
suggestive of the developer's lack of security awareness combined with
a lack of security testing." He describes an elaboration of David
Litchfield's Vulnerability Assessment Assurance Levels (VAAL) and
shows that unforgivable vulnerabilities are at the lowest
level. Christey suggests that VAALs could be elements in a software
label. His paper is available at
http://cwe.mitre.org/about/documents.html
(accessed 5 October 2007).
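To make the prepared-statement item above concrete, here is a minimal
sketch, using Python and its standard sqlite3 module only for illustration,
that contrasts a query built by string concatenation with a parameterized
(prepared) query; the fact would report what fraction of an application's
queries take the second form. The table and data are invented for the
example.

    # Sketch: the difference the "percent of SQL queries that use
    # prepared statements" fact is meant to capture.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

    user_input = "alice"  # imagine this comes from an untrusted source

    # Risky: the query text is assembled from the input, so hostile
    # input can change the query's structure (SQL injection).
    rows = conn.execute(
        "SELECT role FROM users WHERE name = '" + user_input + "'"
    ).fetchall()

    # Preferred: a parameterized (prepared) statement keeps the data out
    # of the query text; this is the form the fact would count.
    rows = conn.execute(
        "SELECT role FROM users WHERE name = ?", (user_input,)
    ).fetchall()
    print(rows)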
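Similarly, the code (or module) signature item could be as simple as
publishing a cryptographic digest of each delivered module so users can
check that what they received matches the label. The sketch below, again in
Python only for illustration, prints a SHA-256 digest for each file named on
the command line.

    # Sketch: compute a SHA-256 digest for each delivered module so the
    # label (or an accompanying file) can list verifiable signatures.
    import hashlib
    import sys

    def module_digest(path: str) -> str:
        """Return the SHA-256 hex digest of the file at path."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        # Pass the delivered module files on the command line.
        for module in sys.argv[1:]:
            print(module, module_digest(module))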
People
These are the software architects, developers, testers, and so forth.
Process
This is information about how the software was developed.
- development process (e.g., XP, spiral)
- development environment (e.g., Eclipse, Visual Studio)
- was there a threat model?
- are there requirements?
- testing methodology
- black box testing
- unit security testing?
- integration security testing?
- penetration testing? red teaming (by whom?)
- tested on what platforms?
- code review? (by developers? other internal? third party?)
- test/review for malware, esp. code from outside
- static analysis
- how well were the secure coding practices carried out?
- how is it maintained?
- pedigree/provenance - protection of master copy from unauthorized
manipulation, code protection in supply chain
Tools
These are pertinent tools used.
- static analysis?
- dynamic analysis?
- build?
- test case generator?
- content management?
- version control?
- security issue management?
- compilers and linkers
Other
- Cost of license
- Is the source available? No / Yes (where) / Escrowed (where and access
rules)
- Is the default installation secure?
- Certificates (e.g., "No Severe or Moderate weaknesses found by
CodeChecker ver. 3.2")
- vulnerability disclosure policy: Full Disclosure / Responsible Disclosure /
Non-disclosure
- is there patch management? automated?
- Google's Software Principles have several points which could be stated in
the label, for instance, Upfront Disclosure and Clear Behavior. Indeed, the
label might be the location of something like a "Follows Google Software
Principles" seal.
- Andrew Jaquith's Security Metrics: Replacing Fear, Uncertainty, and
Doubt has interesting measures on pages 80 and 83 that might be included.
Created Fri Aug 8 15:41:38 2008 by Paul E. Black ([email protected])
Updated Thu Feb 28 15:30:03 2013 by Paul E. Black ([email protected])