Name of Your Organization:

Center for Internet Security

Web Site:

Adopting Capability:

Center for Internet Security Configuration Assessment Tool (CIS-CAT)

Capability home page:

General Capability Questions

Adoption Capabilities

If the functionality is available now, indicate "Yes." If it has been implemented but not released, indicate "Beta." If it is planned but not currently available, indicate "Planned." If there are no plans for a specific category, that section is not included in the questionnaire below.

OVAL Definition Evaluator - Yes
OVAL Systems Characteristics Producer - Yes

Product Accessibility <AR_1.9>

Provide a short description of how and where your capability is made available to your customers and the public.

CIS-CAT is available only to CIS Security Benchmarks members. Members can download CIS-CAT, as a bundled .zip file, from the CIS Members Web Site.

Language Version Indication <AR_1.10>

Describe how and where the capability indicates the version of the OVAL Language used to validate, create, or update its content.

Bundled in the .zip distribution, the CIS-CAT User's Guide contains an instructional section entitled "Using CIS-CAT with SCAP Content". Subsections within this section describe the various standards implemented in CIS-CAT. An "OVAL Implementation" subsection describes the OVAL Language, as well as the OVAL versions supported by CIS-CAT.

Capability Correctness Questions

Error Reporting <AR_2.1>

Indicate how a user who discovers an error in the capability’s use of OVAL can report the error.

All CIS Security Benchmarks Members receive free support for all resources included in their membership, CIS-CAT included. Members can initiate a support request via e-mail.

Responding to Error Reports <AR_2.2>

Describe the approach to responding to the above error reports and how applicable fixes will be applied.

All CIS-CAT-related support requests are logged in our Customer Relationship Management system and assigned to the appropriate technical resources. Those resources research each issue, communicating with the member as needed. Once a cause has been identified, the appropriate corrective actions are taken, such as logging tickets in our issue tracking system, updating documentation, or updating automation content. If an issue with CIS-CAT itself is discovered, the appropriate code fixes are applied, unit tested, staged, and regression tested prior to being checked into our root source control repository. Any code changes committed to source control are logged and included in the CHANGELOG of subsequent CIS-CAT releases. New versions of CIS-CAT are released to members on a monthly basis.

Documentation Questions

Adoption Documentation <AR_3.1>

Provide a copy, or directions to the location, of where the documentation describes OVAL and OVAL Adoption for any customers.

The following information is included in the CIS-CAT User's Guide, bundled with each new version of CIS-CAT, and made available to all CIS Security Benchmarks members:

OVAL is used to identify vulnerabilities and issues. Common examples of the use of OVAL files are:

  • the checking language referenced from a separate XCCDF file,
  • the checking language referenced from a checklist component of a SCAP source data stream, and
  • the checking language referenced from a CPE dictionary component of SCAP source data stream.
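For illustration, an XCCDF rule typically points to an OVAL definition with a check element such as the following (the file name and definition identifier here are hypothetical):

```xml
<!-- Hypothetical XCCDF check referencing an OVAL definition -->
<check system="http://oval.mitre.org/XMLSchema/oval-definitions-5">
  <check-content-ref href="example-oval-definitions.xml"
                     name="oval:org.example:def:1"/>
</check>
```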

The OVAL component contains the definitions, the tests, and the states a target system is expected to exhibit. When CIS-CAT encounters a reference to an OVAL definition, it parses the specified OVAL components/files and uses the referenced definition identifiers to look up the appropriate tests to execute. Each OVAL definition may comprise one to many OVAL tests, the results of which may be logically combined into an overall definition result. The CIS-CAT evaluation engine is the controller for parsing the required tests, collecting the appropriate system characteristics, evaluating the collected information against the expected state, and recording the success, failure, or any error conditions of a given test. CIS-CAT supports components specified using versions 5.3, 5.8, and 5.10.1 of the OVAL Language.
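The logical combination of test results into a definition result can be sketched as follows. This is a simplified illustration, not CIS-CAT's actual implementation; the operator names follow the OVAL specification's criteria operators, but real OVAL evaluation also handles "error", "unknown", "not evaluated", and "not applicable" results, which this sketch omits.

```python
# Simplified sketch: combining OVAL test results under a criteria operator.
# Only boolean (true/false) results are modeled here.

def combine(operator, results):
    """Combine child test results under an OVAL criteria operator."""
    if operator == "AND":
        return all(results)          # every child must be true
    if operator == "OR":
        return any(results)          # at least one child must be true
    if operator == "ONE":
        return sum(results) == 1     # exactly one child is true
    if operator == "XOR":
        return sum(results) % 2 == 1 # an odd number of children are true
    raise ValueError(f"unsupported operator: {operator}")

# A definition whose criteria require both of its tests to pass:
definition_result = combine("AND", [True, False])  # False
```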

Language Support <AR_3.2>

List each supported component schema and specific OVAL Tests in those component schemas that are supported. (AR_3.2)


Independent:

  • family_test
  • filehash_test
  • filehash58_test
  • environmentvariable_test
  • environmentvariable58_test
  • sql57_test
  • textfilecontent_test
  • textfilecontent54_test
  • unknown_test
  • variable_test
  • xmlfilecontent_test


UNIX:

  • file_test
  • inetd_test
  • password_test
  • process58_test
  • runlevel_test
  • shadow_test
  • uname_test
  • xinetd_test


Linux:

  • partition_test
  • rpminfo_test
  • selinuxboolean_test


Windows:

  • accesstoken_test
  • auditeventpolicy_test
  • auditeventpolicysubcategories_test
  • cmdlet_test
  • file_test
  • fileauditedpermissions_test
  • fileauditedpermissions53_test
  • fileeffectiverights_test
  • fileeffectiverights53_test
  • group_test
  • lockoutpolicy_test
  • passwordpolicy_test
  • process58_test
  • registry_test
  • regkeyeffectiverights_test
  • regkeyeffectiverights53_test
  • service_test
  • sid_test
  • sid_sid_test
  • user_test
  • user_sid_test
  • user_sid55_test
  • wmi_test
  • wmi57_test
  • wuaupdatesearcher_test

List any core constructs defined in the OVAL Language that are not supported. (AR_3.2)

CIS-CAT currently does not support the comparison of items using the "ios_version" value of the SimpleDatatypeEnumeration, or the "mask" attribute of the EntityAttributeGroup.

OVAL Assessment Method <AR_3.3>

List each supported assessment method if applicable. (AR_3.3)

Assessment of state by a host-based sensor.

OVAL Content Error Reporting <AR_3.4>

Provide a copy, or directions to the location, of where the documentation describes the procedure by which errors in OVAL content may be reported for any OVAL content that is produced by the product.

Any OVAL content provided in the context of a CIS benchmark passes through internal QA procedures prior to release to members.

Benchmark content issues may be presented to CIS in a number of ways:

  1. All CIS Security Benchmarks Members receive free support for all resources included in their membership, CIS-CAT included. Members can initiate a support request via e-mail.
  2. Members can post questions and/or comments to a particular CIS benchmark community discussion forum. These discussion forums require participation in the respective Benchmark's consensus community.
  3. Members with a valid login to the CIS Security Benchmarks community website may have access to the members-only CIS-CAT Discussion Group, on which they may post CIS-CAT-related issues and questions. These questions may include issues with the functionality of CIS-CAT, as well as issues/comments regarding the automation content bundled with CIS-CAT, including OVAL content.

Content Validity Questions

Syntax Error Detection and Reporting <AR_4.1> <AR_4.2> <AR_4.3> <AR_4.4>

Indicate how the product or repository detects and reports syntax errors in any OVAL content that is consumed by the product or repository.

CIS-CAT can be executed to assess against an OVAL Definitions file, an XCCDF benchmark referencing OVAL content, or a SCAP-compliant source data stream collection. When consuming any of these sources, CIS-CAT validates the content by routing it through W3C XML Schema validation as well as the appropriate Schematron validation for the version of OVAL specified in the source content.
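The version-aware step described above — reading the OVAL version declared in the source content so the matching schemas can be selected — can be sketched as follows. This is a stdlib-only illustration, not CIS-CAT's implementation; CIS-CAT performs full W3C XML Schema and Schematron validation, which this sketch does not attempt.

```python
import xml.etree.ElementTree as ET

# Namespace of the OVAL common model, where <schema_version> lives.
OVAL_COMMON = "http://oval.mitre.org/XMLSchema/oval-common-5"

def oval_schema_version(xml_text):
    """Extract the declared OVAL schema version from a definitions document,
    so the matching XSD/Schematron rules can be selected for validation."""
    root = ET.fromstring(xml_text)  # raises ParseError if not well-formed
    version = root.find(f".//{{{OVAL_COMMON}}}schema_version")
    if version is None:
        raise ValueError("no <schema_version> element found in <generator>")
    return version.text

sample = """<oval_definitions
    xmlns="http://oval.mitre.org/XMLSchema/oval-definitions-5"
    xmlns:oval="http://oval.mitre.org/XMLSchema/oval-common-5">
  <generator>
    <oval:schema_version>5.10.1</oval:schema_version>
  </generator>
</oval_definitions>"""

print(oval_schema_version(sample))  # 5.10.1
```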

Depending on the user interface mode selected for assessment (CIS-CAT supports both a Graphical User Interface and a Command-Line User Interface), any XML Schema or Schematron error messages are displayed to the user at the point of validation failure.

Definition Evaluator Capability Questions

Content Transparency <AR_8.1> <AR_8.2>

Indicate how the product allows users to determine which OVAL Definitions are being evaluated and examine the details of those definitions.

CIS-CAT allows users to specify or interactively select the source content to be assessed. CIS-produced benchmarks are contained in the downloadable bundle released to members each month. All CIS benchmarks, including those utilizing OVAL, are bundled as digitally signed plain XML files, which users may open and examine using any of the myriad XML authoring tools available.

Content Import Process Explanation <AR_8.3>

If the capability does not support consuming OVAL content at runtime, explain the documented process by which users can submit OVAL content for interpretation by the capability, including how quickly submitted content is made available to the capability.

N/A – CIS-CAT consumes OVAL content at runtime.

Content Evaluation <AR_8.4> <AR_8.5> <AR_8.6> <AR_8.7>

Indicate how users can review the detailed results of evaluating an OVAL Definition on a target system.

CIS-CAT results may be produced in numerous formats depending on the input source data stream. XCCDF benchmarks that reference OVAL content may produce result reports in HTML, XML, plain text, CSV, and OVAL Results formats. When produced, these results are saved to a location configurable by CIS-CAT users, for later examination.

Full OVAL Results <AR_8.8>

Indicate how users can review the full OVAL Results of the evaluation of an OVAL Definition on a target system.

Once CIS-CAT has completed its assessment of a target endpoint based on an OVAL Definitions file, an XCCDF benchmark referencing OVAL content, or OVAL content contained within a SCAP source data stream collection, users can choose to produce an OVAL Results report in XML format, allowing them to review assessment results using the XML authoring tool of their choice.
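Because the OVAL Results report is plain XML, it can be processed with ordinary XML tooling. A minimal sketch, assuming a results document shaped like the (abbreviated, hypothetical) sample below, tallies definition results by outcome:

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Namespace of OVAL Results documents.
RES = "http://oval.mitre.org/XMLSchema/oval-results-5"

def summarize_results(xml_text):
    """Tally definition outcomes (true/false/error/...) in an OVAL Results doc."""
    root = ET.fromstring(xml_text)
    return Counter(
        d.get("result") for d in root.iter(f"{{{RES}}}definition")
    )

# Abbreviated, hypothetical OVAL Results document for illustration.
sample = """<oval_results xmlns="http://oval.mitre.org/XMLSchema/oval-results-5">
  <results>
    <system>
      <definitions>
        <definition definition_id="oval:org.example:def:1" result="true"/>
        <definition definition_id="oval:org.example:def:2" result="false"/>
      </definitions>
    </system>
  </results>
</oval_results>"""

print(dict(summarize_results(sample)))
```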

Systems Characteristics Producer Capability Questions

Collecting System Data <AR_5.2> <AR_5.3>

Explain the criteria used to collect system data that is included in an OVAL System Characteristics document.

When CIS-CAT is executed against a set of OVAL Definitions, the tool will probe the target system to collect the appropriate set of system characteristics according to those Object components defined by the OVAL Definitions.

Content Export <AR_5.2> <AR_5.3>

Indicate how the product allows users to export OVAL System Characteristics documents.

Once CIS-CAT has performed its assessment of a set of OVAL Definitions, the OVAL Results produced by the tool will contain the full set of system characteristics collected during the assessment.

Adoption Signature

Questions for Signature

Statement of Adoption <AR_1.2>

"As an authorized representative of my organization I agree that we will abide by all of the mandatory adoption requirements as well as all of the additional mandatory adoption requirements that are appropriate for our specific type of capability."

NAME: William Munyan
TITLE: Security Software Developer

Statement of Accuracy <AR_1.2>

"As an authorized representative of my organization and to the best of my knowledge, there are no errors in the correctness of our capability’s use of the OVAL Language and the interpretation of the logic."

NAME: William Munyan
TITLE: Security Software Developer

Statement on Follow-On Correctness Testing Support <AR_1.7>

"As an authorized representative of my organization, we agree to support the Review Authority in follow-on correctness testing activities, where appropriate types of OVAL documents might need to be exchanged with other organizations attempting to prove the correctness of their capabilities."

NAME: William Munyan
TITLE: Security Software Developer

Page Last Updated: February 28, 2014