Four ways to handle waivers with OpenSCAP

This tutorial describes four approaches to handling a system that, for whatever reason, needs to be kept less than fully compliant.

After installing the scap-security-guide package, you will find a couple of DataStream files on the system.

    # yum install -y scap-security-guide
    # rpm -ql scap-security-guide | grep .ds.xml


I can use any of them to scan a system; for example, on Red Hat Enterprise Linux 7:

    # oscap xccdf eval --profile xccdf_org.ssgproject.content_profile_common --report x.html /usr/share/xml/scap/ssg/content/ssg-rhel7-ds.xml

I have noticed that when I run such a scan on all of my systems, some of them report findings I cannot remediate. The systems are technically non-compliant with the profile, but I need to keep them in that state to avoid breaking the running services. For instance, on one of my clusters, I have a couple of messaging brokers. On each broker system, I need to keep the qpid daemon running, although scap-security-guide advises me to turn it off. I can easily tell a broker from any other system: a broker has the python-acme-broker package installed.
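A check as simple as the following is all it takes to tell a broker from any other machine:

    # rpm -q python-acme-broker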

So how do I proceed? Basically there are four options.

  1. Prepare nothing. Keep the reports coming in red. Print them out. Sign off the failures. There are highly secured deployments that need to do exactly this. For everybody else, however, this is not an affordable solution.

  2. This one is rather a footnote: you can advocate for waiver support in the OpenSCAP tooling. Waivers are part of the XCCDF standard and we have a preliminary implementation in OpenSCAP (example usage). The problem is that we do not have a nice user interface for adding waivers yet; the presentation of waivers in the HTML report is already done.

  3. Customize the policy for each specific system. scap-workbench is a great tool for customization; it produces XCCDF Tailoring files. The process is very quick, however you will need to distribute these Tailoring files along with the existing DataStream files, and you also need to make sure nobody tampers with them once they are distributed to the target systems (see the example after this list).

  4. The last option is to include the waivers directly in the policy. This allows you to keep a single policy/profile for the whole infrastructure. In my specific example, I could amend the existing DataStream file and make the "Disable qpid" rule apply only to systems that do not have the python-acme-broker package installed. This approach, however, needs to be applied with security-engineering rigor. For example, I run the risk that someone installs the python-acme-broker package on another system so that the qpid daemon can run there unnoticed. In this case, I am fine accepting that risk.
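To illustrate the third option: a tailoring file produced by scap-workbench is passed to oscap with the --tailoring-file switch. The tailoring file name and the customized profile ID below are made up for this example; scap-workbench tells you the real ID of the profile it created:

    # oscap xccdf eval \
        --tailoring-file ssg-rhel7-ds-tailoring.xml \
        --profile xccdf_org.ssgproject.content_profile_common_customized \
        --report x.html \
        /usr/share/xml/scap/ssg/content/ssg-rhel7-ds.xml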

In a follow-up blog post, I would like to shed more light on how to implement the fourth option.

Solution: Build DataStream from Red Hat CVE stream to automate vulnerability scan with Satellite 6.1

This blog post describes how to prepare Red Hat OVAL (RHSA/CVE) content for automated vulnerability audits using Foreman or Red Hat Satellite 6.1.

Using the RHSA stream on a single machine

Red Hat publishes OVAL content for assessing vulnerabilities at https://www.redhat.com/security/data/metrics/ . The easiest way to audit Red Hat Enterprise Linux for unpatched vulnerabilities is to run the following commands:

$ wget https://www.redhat.com/security/data/oval/com.redhat.rhsa-all.xml.bz2
# oscap oval eval com.redhat.rhsa-all.xml.bz2


Note: Older oscap may not support .xml.bz2 natively. Please unzip the file first by running `bunzip2 com.redhat.rhsa-all.xml.bz2`.
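If you also want machine-readable results and a human-readable report out of the scan, oscap can produce both in one run (the output file names below are arbitrary):

# oscap oval eval --results rhsa-results.xml --report rhsa-report.html com.redhat.rhsa-all.xml.bz2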

Create RHSA DataStream

Download RHSA stream
wget http://www.redhat.com/security/data/metrics/com.redhat.rhsa-all.xccdf.xml
wget https://www.redhat.com/security/data/oval/com.redhat.rhsa-all.xml.bz2
bunzip2 com.redhat.rhsa-all.xml.bz2


Port XCCDF 1.1 to XCCDF 1.2
xsltproc --stringparam reverse_DNS com.redhat.rhsa \
    /usr/share/openscap/xsl/xccdf_1.1_to_1.2.xsl \
    com.redhat.rhsa-all.xccdf.xml \
    > com.redhat.rhsa-all.xccdf12.xml


Ensure the newly created file is valid
oscap xccdf validate com.redhat.rhsa-all.xccdf12.xml

Create a DataStream from the downloaded OVAL and the transformed XCCDF
oscap ds sds-compose com.redhat.rhsa-all.xccdf12.xml NEW-RHSA-DS.XML
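Before handing the file over, it does not hurt to sanity-check it; oscap ds sds-validate validates the DataStream and oscap info prints its structure:

oscap ds sds-validate NEW-RHSA-DS.XML
oscap info NEW-RHSA-DS.XML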

And the DataStream is ready for distribution using Satellite 6.1. Learn more about upcoming Satellite 6.1 features at https://www.youtube.com/watch?v=XI1QcQL4qQ8

Foreman & OpenSCAP :: First UI Mock-ups

Let's see what the user interface of the Foreman & OpenSCAP integration may look like.

We will heavily use the word "Compliance" within the UI; it seems to be more comprehensible than just "OpenSCAP". The foreman_openscap plug-in brings in two main concepts:

  • the definition of a compliance policy, and

  • the compliance report of a particular asset.

Each of those gets its own link in the main Hosts menu:

Similarly, foreman_openscap will introduce two new roles for Foreman users: can edit compliance policies and can view compliance reports.

Compliance policies can be listed:

and created or edited by the user:


Once the reports are collected from the infrastructure according to the policy definitions, they can be listed and searched.

For each report, very verbose details are available.

Foreman & OpenSCAP :: Architecture

This blog post is a reminiscence of my very first blog post here. It is also a follow-up on the recent introduction of project SCAPtimony. The first application of SCAPtimony will be its integration with Foreman.

Foreman is a complex, web-based systems management tool. It provides life-cycle management as well as configuration management using the popular Puppet project.

The integration of Foreman and OpenSCAP will enable administrators to set up compliance policies for their infrastructure and to audit it on a continuous basis. The integration effort includes multiple sub-projects. Each of the projects has a restricted set of responsibilities, as follows:

  • OpenSCAP is an open-source implementation of the SCAP line of standards. Foreman will leverage the broad API of libopenscap.so for manipulating SCAP files (parsing, modification, creation of HTML reports, etc.).

  • ruby-openscap provides Ruby bindings to OpenSCAP. More accurately, ruby-openscap exposes certain OpenSCAP functions to the Ruby language using FFI. Since the API of libopenscap.so is quite low-level, ruby-openscap encapsulates data and functions into classes that correspond to the objects in the SCAP standards. ruby-openscap is still quite light-weight and includes only a subset of the functionality available in the base OpenSCAP package.

  • puppet-openscap exposes the SCAP scan operation to the Puppet language. With puppet-openscap, users can write Puppet modules that create OpenSCAP scan reports on a periodic basis. puppet-openscap also includes a Puppet class that uploads the SCAP results immediately to foreman-proxy_openscap.

  • foreman-proxy_openscap is a tiny plug-in for Foreman's smart-proxy. The plug-in exposes a single API function for client systems: (PUT /openscap/arf/arf/:policy_name/:date). This API is used by puppet-openscap when uploading a SCAP report (see the curl sketch after this list). Client uploads are authenticated using Puppet certificates. The collected SCAP reports are then forwarded to foreman_openscap.

  • foreman_openscap is a Foreman plug-in (a Rails engine) that binds Foreman and SCAPtimony together. It provides the back-end API for foreman-proxy_openscap to upload SCAP reports. foreman_openscap also provides the user interface. A front-end API for user automation is planned as well.

  • The SCAPtimony project has been discussed in a recent blog post. SCAPtimony is a mountable Rails engine; it is independent of Foreman, but it can be mounted into Foreman. SCAPtimony will enable advanced SCAP operations that cannot be implemented in the OpenSCAP package, as they need multiple inputs.
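To give a feel for what the foreman-proxy_openscap upload looks like on the wire, here is a rough curl equivalent of what puppet-openscap does. The proxy host name, the port, and the certificate paths are assumptions made for this sketch; in a real deployment puppet-openscap handles all of this for you:

    curl --request PUT \
         --cert /var/lib/puppet/ssl/certs/$(hostname -f).pem \
         --key /var/lib/puppet/ssl/private_keys/$(hostname -f).pem \
         --cacert /var/lib/puppet/ssl/certs/ca.pem \
         --data-binary @results.rds.xml \
         https://proxy.example.com:8443/openscap/arf/arf/my_policy/$(date +%F)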

Introducing project SCAPtimony

How do I archive all the SCAP results coming from my infrastructure? For how long? What kind of SCAP result post-processing would help me retain control over the environment? How do I ensure that every node in a given perimeter has been audited by a given policy in the last week? What are the good practices for operating SCAP audits of multiple nodes? It is a heck of a lot of XML files and there has to be a better way!

These are common questions amongst operations people, and there needs to be a piece of software to help answer them. Let me introduce project SCAPtimony, its motives, and its mission statement.

The SCAPtimony project gives full testimony about the compliance of your infrastructure. SCAPtimony is an open source compliance center built on top of SCAP, the U.S. Government standard. SCAPtimony is a collection (database) of auditable assets, SCAP policies, audit schedules, SCAP results, and waivers. SCAPtimony is a modern, RESTful, highly efficient, robust, and cloud-class scalable solution to the common problem of SCAP document storage. Going forward, SCAPtimony pushes the envelope by leveraging OpenSCAP to empower administrators in a sustainable way! ... Bingo!

Planned Features
+ Define security/compliance policies
    + Archive distinct versions of the policy
    + Upload SCAP content and assign it with the policy
    + Set-up a periodical schedule of audits for the policy
    + Organization defined targeting (Assign a set of nodes with the policy)
    + Define known-issues and waivers (Assign waivers with a set of nodes and the policy)
    + Set-up rules for automated deletion of SCAP results
+ Archive SCAP audit results from your infrastructure
    + Provide API for tools to upload collected SCAP results
+ Result post-processing
    + Search SCAP results
    + Search for non-compliant systems
    + Search for not audited systems
    + Comparison of audit results
    + Waive known issues

Let me know if your feature is missing. In the meantime, the source code is brewing at https://github.com/OpenSCAP/scaptimony.

And by the way, project SCAPtimony would never have been possible without the oscap_source redesign in OpenSCAP. That redesign significantly improved the post-processing capabilities of OpenSCAP, needed especially for SCAPtimony's waivers.

The oscap_source API Redesign

This blog post concerns an API redesign in libopenscap, its motives, and its implications. It is a rather significant change coming in OpenSCAP 1.2.0, though it is expected to go unnoticed by OpenSCAP users. Let me start with some historical background.

Prior to the introduction of the DataStream file format in SCAP 1.2, there was a jungle of distinct file formats in the SCAP world. OpenSCAP implemented most of them, however a common bond between them was utterly missing.

Hence, we ended up with a dozen independent file parsers. Each parser carried its own structures and its own way of approaching things. Whatever generic routines existed took a path to a file. Processing SCAP content therefore meant opening files repeatedly and initializing DOM or xmlReader parsers multiple times.

When the DataStream file format was introduced to the standard, OpenSCAP followed the easiest path to implement it. That means we created a very thin layer that simply decomposed a DataStream file into multiple other SCAP files. We spun up the DataStream support in OpenSCAP quickly and the easiest use-cases were covered. Remember that nobody was using DataStreams at that time, so we did not know whether DataStreams would be adopted, nor could we anticipate the more advanced use-cases. OpenSCAP split the DataStream internally into a temporary directory and then used the old file-opening parsers to parse the contents of the DataStream. Later, we grew unhappy with this approach: it was hard to add new functionality, especially the more advanced functions for DataStream handling.

This September we reworked the whole file handling. We introduced the oscap_source structure as an abstract handle to any SCAP content. The oscap_source abstracts all the parsers from the medium, be it a file, an HTTP response, or a part of another SCAP document (a DataStream).

The oscap_source also abstracts the common operations one may want to call on the content: it can tell the document type and schema version, or validate the document. Overall the code is cleaner, more efficient, and more flexible.

We have also introduced a source DataStream session and a result DataStream session. These structures hold internal information about the opened DataStream and can return parts of the DataStream in the form of another oscap_source.

The change involved more than 300 commits. During the work, we modified each parser to bind well with oscap_source, and the old routines were deprecated. While OpenSCAP may keep the deprecated routines working for a while, library users are advised to move to the new API. Each deprecated function points to its new, preferred counterpart.

OpenSCAP is now able to work with DataStreams natively and efficiently, without creating temporary files. As a bonus, we have introduced native support for bzipped files: if the filename matches *.xml.bz2, OpenSCAP will recognize it and process it as plain XML. That is pretty cool, because it will allow us to build an efficient SCAP results storage, the project SCAPtimony.
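For example, with a new enough oscap build, the compressed Red Hat OVAL feed mentioned in an earlier post can be inspected (or evaluated) directly, without unpacking it first:

    $ oscap info com.redhat.rhsa-all.xml.bz2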

How does OpenSCAP work?

This is an introduction to OpenSCAP internals. After reading it, you will understand the high-level architecture of OpenSCAP.

Structure of codebase

The OpenSCAP project consists of three notable parts: the command-line tool, the shared library, and the OVAL probes. Let's first take a closer look at the command-line tool. The oscap tool is just a small binary that makes the functionality of the shared library available to the user. We are trying to keep the codebase of the oscap command as small as possible.

Everything oscap does is backed by the shared library, libopenscap.so. The library is logically divided into parts (XCCDF, OVAL, DataStream, CPE, ...), which match the components of the SCAP standard. Each part includes parser and exporter code as well as the implementation of the algorithms defined by the standard.

The heart of the library is the implementation of the OVAL language. That is the part that includes a lot of low-level code to query system characteristics. The library uses special executables, the OpenSCAP probes, to examine the system; none of the checks is implemented by the library itself.

When the library evaluates an OVAL object, it finds the appropriate probe, executes it, and sends the query to the probe's stdin. The probe queries the system and returns a result on its stdout. A very compact protocol (SEXP) is used for the communication. Each probe holds a cache to minimize system load and avoid duplicate queries.
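The probes are standalone executables installed alongside the library, named after the OVAL objects they implement. The exact location may differ between distributions, but on Fedora and Red Hat Enterprise Linux they usually live under /usr/libexec/openscap (the listing below is shortened):

    $ ls /usr/libexec/openscap/
    probe_family  probe_file  probe_rpminfo  probe_runlevel  probe_textfilecontent54  ...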

Introduction of High-level API

In the past, we focused on building a low-level API. That means that users of the libopenscap.so library are able to build tools on top of it that can do pretty much everything. For instance, a SCAP editor may be built on top of libopenscap.so.

On the other hand, the low-level API required a profound intellectual exercise to write tools on top of it. For example, writing a simple scanner in C required several hundred lines of code, all just to load a DataStream, initialize various structures, start the evaluation, and export the results correctly. There were simply too many options available with the low-level API.

Then we introduced the xccdf_session interface. The xccdf_session interface is the recommended starting point for any developer. It allows you to write a very flexible XCCDF scanner while being abstracted from all the low-level decisions.

Writing simple scanner

The Ruby bindings for the library take this to the next level: a scanner tool can be written in 8 easy lines:

    1:     require 'openscap'
    2:     s = OpenSCAP::Xccdf::Session.new("/usr/share/xml/scap/ssg/fedora/ssg-fedora-ds.xml")
    3:     s.load
    4:     s.profile = "xccdf_org.ssgproject.content_profile_common"
    5:     s.evaluate
    6:     s.remediate
    7:     s.export_results(:rds_file => "results.rds.xml")
    8:     s.destroy


All the magic happens on the fifth line of code. The XCCDF module builds a structure called XCCDF_POLICY that holds the evaluation context: the object model of the XCCDF file, the selected profile, the resolved values, and handles to the OVAL files referenced from the XCCDF.

The XCCDF Policy then goes through the tree of XCCDF Rules and asks the OVAL module to evaluate particular definitions. For each query, the OVAL module asks the appropriate probe and caches the result in the form of a collected object; the module then computes the test result and the definition result. The final value is returned to the XCCDF module and forms the xccdf:rule-result.
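If you would like to try the snippet yourself: the bindings are shipped as the openscap gem (matching the require on the first line). Save the code, for instance as scanner.rb (a name chosen just for this example), and run it as root so that the probes can inspect the system:

    # gem install openscap
    # ruby scanner.rb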

Spacewalk & OpenSCAP :: Deletion of SCAP Results

This blog post describes a new feature of Spacewalk 2.1 (not yet released) which allows SCAP results stored on the server to be deleted.

Previously, it was impossible to delete a SCAP scan from Spacewalk, but it turned out that some SCAP results lose importance over time and users want to delete them. In general, deletion of auditing results is tricky: the audit may serve as evidence or may be subject to retention policies. Thus, Spacewalk should not allow SCAP result deletion unless certain requirements are met.

Enabling SCAP Result Deletion

By default, Spacewalk 2.1 does not allow SCAP result deletion until the deletion is configured in the Organization Configuration dialog.



In the Organization Configuration dialog (shown in the figure above), there are two configuration items related to SCAP deletion. The first, Allow Deletion of SCAP results, enables deletion for the given organization. This option defaults to off; once enabled, it allows system administrators to delete any SCAP result they can see which has already passed its retention period. The second configuration item, Allow Deletion After, defines the retention period (in days) for SCAP results. If the retention period is set to zero, SCAP results can be deleted immediately after the scan finishes.

Once configured, SCAP scan results may be deleted in the web user interface or through the API (using the system.scap.deleteXccdfScan() function).

Spacewalk & OpenSCAP :: Detailed SCAP Results

This blog post describes a new feature of Spacewalk 2.0 related to OpenSCAP integration.

In previous versions, Spacewalk was able to schedule OpenSCAP scans on the managed systems and gather SCAP results into its database. Only a brief subset of the SCAP results was stored in the Spacewalk database: the evaluation result for each xccdf:Rule, the rule's ID, and the CVE or CCE identifiers assigned to it. This kind of information was essential to spot misconfigurations early, however it was often not sufficient to investigate the causes of a failure. More comprehensive information was available to a user only after a manual scan re-run.

Since users were requesting that Spacewalk include more SCAP information, I decided to extend the amount of aggregated data. Spacewalk 2.0 is now able to grab and store the SCAP result files produced by the OpenSCAP scanner. The figure below shows how the new data is presented.

The scan's details page now includes a Detailed Results row which enables the user to download SCAP result files. There are three types of SCAP result files:


  • xccdf-result.xml -- The XCCDF result file as described by the SCAP standard.

  • *.result.xml -- The OVAL result files as described by the SCAP standard. There will be one such file for each OVAL Definition file on input.

  • xccdf-report.html -- The human readable report for the scan. It includes summary information from XCCDF and OVAL files.

The figure above shows the details of an SSG (SCAP Security Guide) evaluation. There is the XCCDF result file, the HTML report, and ssg-rhel6-oval.xml.result.xml representing the OVAL results. All the files can be downloaded using a web browser.

Setting This Up

By default, Spacewalk does not aggregate detailed SCAP results. Please follow these two steps to set up the detailed SCAP results add-on.


  1. On the client side: Make sure you have the spacewalk-oscap package at version 0.0.15 or later.

  2. On the server side: Make sure you have enabled SCAP Detailed Results in the organization configuration dialog.

    The organization configuration dialog (figure below) contains two rows related to this feature. The first, Enable Upload Of Detailed SCAP Files, turns the feature on. The second, SCAP File Upload Size Limit, defines the largest acceptable file size in bytes. The size limit defaults to 2 MiB.


How to convert USGCB to DataStream with OpenSCAP

USGCB stands for United States Government Configuration Baseline; according to its web page, it is an initiative to create security configuration baselines for technologies deployed across federal agencies. On 2011-09-30, USGCB released an official baseline for Red Hat Enterprise Linux 5 Desktop. This baseline remained the only official guidance for Linux systems for a long time.

The USGCB for RHEL 5 Desktop was released in the form of SCAP 1.1. In this blog post, I will show you how to use OpenSCAP to convert this baseline to a newer version of SCAP, SCAP 1.2. Note that SCAP 1.2 defines a new file format (called DataStream) which allows us to bundle multiple legacy files into one huge XML file.

For this example, we will need the newest OpenSCAP library, which was released yesterday as version 0.9.6. First of all, we need to download the baseline from usgcb.nist.gov and unpack it.

        $ mkdir USGCB; cd USGCB
        $ wget http://usgcb.nist.gov/usgcb/content/scap/USGCB-rhel5desktop-1.0.5.0.zip
        $ unzip USGCB-rhel5desktop-1.0.5.0.zip
        Archive: USGCB-rhel5desktop-1.0.5.0.zip
           inflating: usgcb-rhel5desktop-cpe-dictionary.xml
           inflating: usgcb-rhel5desktop-cpe-oval.xml
           inflating: usgcb-rhel5desktop-oval.xml
           inflating: usgcb-rhel5desktop-xccdf.xml
        $ mv usgcb-rhel5desktop-xccdf{,11}.xml

Now we have the content prepared and we can start with the conversion to DataStream. The XCCDF document we have uses XCCDF version 1.1, but within the DataStream file format only XCCDF 1.2 is allowed. We need to convert the XCCDF document to the newest version of the standard. We will use an XSL transformation which comes with the newest OpenSCAP package for this task.

Unfortunately, it's not that simple. The USGCB baseline contains 4 buggy <xccdf:sub> elements which refer to nonexistent values. These 4 elements go unnoticed as long as the content is in XCCDF 1.1. However, the XSD validation schema for XCCDF 1.2 is stricter and will pick up on them. Let's first use an XSLT to remove the bogus sub elements:

        $ xsltproc /usr/share/openscap/xsl/xccdf_1.1_remove_dangling_sub.xsl \
           usgcb-rhel5desktop-xccdf11.xml \
           > usgcb-rhel5desktop-xccdf11fixed.xml

Once there are no dangling <xccdf:sub> elements, we can proceed with the document conversion to XCCDF 1.2:

        $ xsltproc --stringparam reverse_DNS gov.nist.usgcb \
           /usr/share/openscap/xsl/xccdf_1.1_to_1.2.xsl \
           usgcb-rhel5desktop-xccdf11fixed.xml \
           > usgcb-rhel5desktop-xccdf.xml

You can verify that the resulting document is compliant with XCCDF 1.2:

        $ oscap xccdf validate usgcb-rhel5desktop-xccdf.xml

Next, we will let oscap create DataStream from this new XCCDF.

        $ oscap ds sds-compose usgcb-rhel5desktop-xccdf.xml usgcb-rhel5desktop-ds.xml
        OpenSCAP Error: Could not found file http://www.redhat.com/security/data/oval/com.redhat.rhsa-all.xml: No such file or directory.

The error message produced by OpenSCAP is expected and can be safely ignored. It means that OpenSCAP found a reference to OVAL content which was not available locally and therefore could not be included in the DataStream. The OVAL file at that location changes over time and the USGCB authors wanted it to be downloaded at scan time. At any step, the oscap info module can be used to inspect the resulting document:

        $ oscap info usgcb-rhel5desktop-ds.xml
        Document type: Source Data Stream
        Imported: 2013-04-21T22:30:31

        Stream: scap_org.open-scap_datastream_from_xccdf_usgcb-rhel5desktop-xccdf.xml
        Generated: (null)
        Version: 1.2
        Checklists:
           Ref-Id: scap_org.open-scap_cref_usgcb-rhel5desktop-xccdf.xml
               Profile: xccdf_gov.nist.usgcb_profile_united_states_government_configuration_baseline
        Checks:
           Ref-Id: scap_org.open-scap_cref_usgcb-rhel5desktop-oval.xml
        No dictionaries.

In the DataStream there are two components: the first is the XCCDF and the second is its OVAL dependency. We can use their identifiers to evaluate the USGCB.

        $ oscap xccdf eval \
           --datastream-id scap_org.open-scap_datastream_from_xccdf_usgcb-rhel5desktop-xccdf.xml \
           --xccdf-id scap_org.open-scap_cref_usgcb-rhel5desktop-xccdf.xml \
           --profile xccdf_gov.nist.usgcb_profile_united_states_government_configuration_baseline \
           --fetch-remote-resources \
           usgcb-rhel5desktop-ds.xml

However, in this case, since there is only one XCCDF within the DataStream, we can skip these arguments and let OpenSCAP find it automatically.

        $ oscap xccdf eval \
           --profile xccdf_gov.nist.usgcb_profile_united_states_government_configuration_baseline \
           --fetch-remote-resources \
           usgcb-rhel5desktop-ds.xml

Attentive readers will have noticed that the original archive contained 4 files, but the DataStream has only 2 components. The CPE dictionary was not added to the DataStream during the sds-compose operation and needs to be added separately.

        $ oscap ds sds-add usgcb-rhel5desktop-cpe-dictionary.xml usgcb-rhel5desktop-ds.xml
        $ oscap info usgcb-rhel5desktop-ds.xml
        Document type: Source Data Stream
        Imported: 2013-04-22T17:17:16

        Stream: scap_org.open-scap_datastream_from_xccdf_usgcb-rhel5desktop-xccdf.xml
        Generated: (null)
        Version: 1.2
        Checklists:
           Ref-Id: scap_org.open-scap_cref_usgcb-rhel5desktop-xccdf.xml
              Profile: xccdf_gov.nist.usgcb_profile_united_states_government_configuration_baseline
        Checks:
           Ref-Id: scap_org.open-scap_cref_usgcb-rhel5desktop-oval.xml
           Ref-Id: scap_org.open-scap_cref_usgcb-rhel5desktop-cpe-oval.xml
        Dictionaries:
           Ref-Id: scap_org.open-scap_cref_usgcb-rhel5desktop-cpe-dictionary.xml

We are done. The USGCB baseline is now available in the DataStream file format. The very same procedure may be used to convert the SCAP Security Guide into a DataStream (see the sketch below). However, there might be problems with broken identifiers which appear from time to time during the development of SSG.
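As a rough sketch of that SSG conversion, the steps mirror the USGCB ones above. The file names below depend on the SSG version you have, so treat them as placeholders, and the matching ssg-*-oval.xml files must sit in the same directory for sds-compose to pick them up:

        $ xsltproc --stringparam reverse_DNS org.ssgproject.content \
           /usr/share/openscap/xsl/xccdf_1.1_to_1.2.xsl \
           ssg-rhel6-xccdf.xml \
           > ssg-rhel6-xccdf12.xml
        $ oscap xccdf validate ssg-rhel6-xccdf12.xml
        $ oscap ds sds-compose ssg-rhel6-xccdf12.xml ssg-rhel6-ds.xml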