Thursday, November 5, 2009

NIST releases Security Content Automation Protocol for FISMA

| Brett D. Arion |

Automated tools take sweat out of security compliance

When it comes to complying with federal security mandates, chief information security officers contend with a set of arduous tasks that could rival the 12 labors of Hercules.

Under the Federal Information Security Management Act, agencies must file annual reports to Congress that outline their compliance with more than a dozen categories of security controls that span technology, management and operations.

In addition, the Federal Desktop Core Configuration (FDCC), which seeks to secure desktop and laptop PCs that run Microsoft Windows, has an extensive list of required security settings that agencies must track and report on for every computer they operate.

Such reports relay configuration data on hundreds or thousands of devices and can take months to compile.

So it’s no wonder that federal CISOs who responded to an (ISC)² survey earlier this year identified “meeting compliance objectives” as among their top three priorities. One software vendor’s research has found that agencies’ security managers spend anywhere from a quarter to almost half their time on compliance issues.

However, agencies can get some help. The National Institute of Standards and Technology, which writes the FISMA standards, has created the Security Content Automation Protocol to deal with some of the problems of compliance. SCAP targets the security posture of individual devices and can be used to verify patch installation and check a machine’s security configuration settings.

A number of vendors offer SCAP-enabled security monitoring products that can automate and reduce the painstaking effort involved in achieving several aspects of compliance mandates.

However, those products can only go so far. Industry executives say the tools focus mostly on asset-level security and don’t, by themselves, provide a way to tackle the big picture of FISMA compliance. Still, NIST and security vendors are working to make SCAP and related tools more broadly applicable.

SCAP makes inroads

The lack of standardized, automated methods for securing software makes the jobs of patching and securely configuring systems labor intensive and prone to error, according to NIST. In addition, vendors have different ways to identify vulnerabilities and platforms, so organizations that use tools from multiple vendors often generate inconsistent reports that can bog down a security assessment.

SCAP is intended to perform those chores in a consistent way. The protocol has found its greatest traction so far in the realm of FDCC compliance testing. For FDCC, the Office of Management and Budget requires agencies to adopt a standard configuration involving about 300 security settings. The idea is to shrink the avenues through which intruders could compromise a government computer. OMB requires agencies to use SCAP tools to verify that their PCs adhere to FDCC settings.

The National Science Foundation has been using SCAP-enabled scanning tools to continuously verify security configurations for more than a year.

“The main benefit of SCAP is that it allows us to determine the level of compliance to FDCC settings with a high degree of accuracy,” said Bill Marsh, an information technology security officer at NSF’s Division of Information Systems.

“FDCC is the most common use case…across the federal government,” said Matthew Scholl, group manager for security management and assurance at NIST’s Computer Security Division.

Organizations with thousands of devices and the need to assess daily changes in their security posture would find compliance reporting difficult without a SCAP tool, said Matt Mosher, senior vice president for the Americas at Lumension Security, which offers SCAP-validated tools.

“If I have no automated tool to assess and report on [FDCC configurations] on a fairly regular basis, there’s no way I can comply,” Mosher said.

In NIST’s SCAP validation program, independent laboratories validate vendors’ products against several SCAP capabilities, one of which focuses on FDCC.

In the case of FDCC scanning, vendors usually offer SCAP features as part of their vulnerability management or governance, risk, and compliance software. About 20 vendors offer validated FDCC scanning capabilities.

SCAP is based on Extensible Markup Language (XML) and enables the creation of machine-readable security configuration checklists. The validated tools process SCAP checklists that map to FDCC guidelines and compare the checklists against a given machine’s configuration.
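
Real SCAP content follows the XCCDF and OVAL specifications, but the core idea can be illustrated with a toy checklist. The sketch below uses a hypothetical XML format and made-up setting names, not the actual SCAP schemas, to show a checklist being parsed and compared against a machine’s reported settings:

```python
import xml.etree.ElementTree as ET

# Toy checklist in the spirit of an SCAP/XCCDF document (not the real schema).
CHECKLIST = """
<checklist>
  <rule id="password-min-length" expected="12"/>
  <rule id="screensaver-lock" expected="enabled"/>
</checklist>
"""

def check_compliance(checklist_xml, machine_settings):
    """Compare each rule's expected value against the machine's actual setting."""
    results = {}
    for rule in ET.fromstring(checklist_xml).findall("rule"):
        rule_id = rule.get("id")
        results[rule_id] = machine_settings.get(rule_id) == rule.get("expected")
    return results

settings = {"password-min-length": "12", "screensaver-lock": "disabled"}
print(check_compliance(CHECKLIST, settings))
# → {'password-min-length': True, 'screensaver-lock': False}
```

Because both the checklist and the results are machine-readable, the same content can be run unchanged by any conforming tool, which is what lets agencies mix products from different vendors without producing inconsistent reports.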

Such tools generally fall into two categories: agent-based and agentless. The first group deploys software agents on the devices to be checked. The agents send reports on the security status of the various machines to server-based software for analysis.

The agentless tools operate in much the same way as a network vulnerability scanner: They seek out devices on a network and check for gaps in their security posture.
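
The agentless approach can be sketched with a minimal TCP probe. This is only an illustration of checking from the network side, using a hypothetical `scan_host` helper; a real vulnerability scanner would also need credentials or remote APIs to read actual configuration settings:

```python
import socket

def scan_host(host, ports, timeout=0.5):
    """Probe each TCP port from the network side, as an agentless scanner would."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                open_ports.append(port)
    return open_ports
```

An unexpectedly open port is itself a gap in security posture, which is why even this crude network-side view is useful before any deeper configuration check.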

Products may work in one mode or the other or support both. Each approach has its pros and cons, said John Bordwine, public-sector chief technology officer at Symantec, which offers the Control Compliance Suite Federal Toolkit as its SCAP tool. The Symantec product offers both modes.

“An agent-based system provides much more detail around the endpoint configuration since it is resident to the device,” said Bordwine. “Network-based approaches can provide a fairly deep level of detail as well, but this requires some level of administrative access that has to be granted.”

SCAP tools are typically licensed based on the number of assets an organization seeks to evaluate. Prices range from $5 per device to more than $50, depending on the capabilities provided, among other factors, said David Wilson, vice president of product management for Telos’ Xacta IA Manager.

SCAP's limitations

Although SCAP can help agencies track FDCC settings, it is far from a comprehensive compliance strategy.

For example, SCAP-based tools check for compliance at a specified level for a particular setting, but they stumble when a device is set at a higher level than the FDCC requirement.

“If the FDCC setting is ‘medium’ and the NSF setting exceeds that as ‘high,’ the SCAP tool would mark the setting as ‘noncompliant,’ which is not accurate,” Marsh said.
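
The behavior Marsh describes comes down to strict equality versus ordering-aware comparison. The sketch below uses hypothetical level names and functions, not the logic of any particular SCAP tool:

```python
LEVELS = ["low", "medium", "high"]  # hypothetical ordered severity scale

def exact_match(required, actual):
    """How a naive checklist comparison behaves: only equality passes."""
    return required == actual

def meets_or_exceeds(required, actual):
    """An ordering-aware check that accepts stricter-than-required settings."""
    return LEVELS.index(actual) >= LEVELS.index(required)

# FDCC requires "medium"; the agency has set "high".
print(exact_match("medium", "high"))       # → False (flagged noncompliant)
print(meets_or_exceeds("medium", "high"))  # → True (recognized as compliant)
```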

The bigger issue for most agencies, however, is how much SCAP can actually do for FISMA compliance as a whole.

“FISMA is an overarching set of policies and controls that is really covering all the security aspects of the organization,” Mosher said. “The SCAP tools are one subsection of FISMA.”

Wilson pointed to a gap between SCAP's objectives and FISMA’s objectives. Much of FISMA reporting occurs at the systems level, while SCAP focuses on “the configuration of a single asset, one at a time,” he said.

NIST Special Publication 800-53, which provides recommended security control guidance for complying with FISMA, discusses protection in terms of “information systems” — sets of resources rather than individual devices.

Although SCAP doesn’t address FISMA in its entirety, it can be applied to the recommended technical security measures that support the directive.

“No SCAP content can make you FISMA compliant,” Scholl said. “We do have SCAP content that directly reflects the technical security controls found in SP 800-53.”

For example, NIST has developed checklists that map Windows XP security configuration settings to the high-level security controls in SP 800-53, according to NIST’s guide for adopting SCAP.

In addition, Scholl said organizations can and often do create their own SCAP content that directly reflects their security policies.

Meanwhile, SCAP vendors say they are offering tools to help close the SCAP gap. Wilson said Telos’ Xacta IA Manager can aggregate asset-based configuration data so agencies can draw security conclusions at the systems level.
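
That kind of roll-up can be sketched as grouping per-device check results by the system they belong to. The data shape and function below are hypothetical, not Xacta’s actual implementation:

```python
from collections import defaultdict

def summarize_by_system(device_results):
    """Roll per-device pass/fail counts up to a per-system compliance rate."""
    totals = defaultdict(lambda: [0, 0])  # system -> [passed, checked]
    for device in device_results:
        totals[device["system"]][0] += device["passed"]
        totals[device["system"]][1] += device["checked"]
    return {name: passed / checked
            for name, (passed, checked) in totals.items()}

devices = [
    {"system": "payroll", "passed": 290, "checked": 300},
    {"system": "payroll", "passed": 250, "checked": 300},
    {"system": "grants",  "passed": 300, "checked": 300},
]
print(summarize_by_system(devices))  # → {'payroll': 0.9, 'grants': 1.0}
```

A systems-level percentage like this maps more naturally onto the “information system” language of SP 800-53 than a pile of single-asset scan reports does.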

Bordwine said industry and NIST are also seeking to extend SCAP to operational checks. For example, SP 800-53 calls for security awareness training for information systems’ users. Vendors are exploring ways to search records in a training database and pull out relevant data for compliance reporting, he said.

What's the bottom line for SCAP? Tools that support the protocol can bolster certain aspects of compliance reporting, particularly those related to FDCC. However, the ability to automate broader FISMA duties is still a work in progress.

Zero-day flaw disclosed in TLS and SSL, the protocols commonly used to encrypt web traffic

| Brett D. Arion |

Security researchers Marsh Ray and Steve Dispensa unveiled the TLS (Transport Layer Security) flaw on Wednesday, following the disclosure of separate but similar findings. TLS and its predecessor, SSL (Secure Sockets Layer), are typically used by online retailers and banks to provide security for web transactions.

Ray, who along with Dispensa works for two-factor authentication company PhoneFactor, explained in a blog post on Thursday that he had initially discovered the flaw in August, and demonstrated a working exploit to Dispensa at the beginning of September.

The flaw in the TLS authentication process allows an outsider to hijack a legitimate user's browser session and successfully impersonate the user, the researchers said in a technical paper.

The fault lies in an "authentication gap" in TLS, Ray and Dispensa said. During the cryptographic authentication process, in which a series of electronic handshakes take place between the client and server, there is a loss of continuity in the authentication of the server to the client. This gives an attacker an opening to hijack the data stream, they said.
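
The effect of the gap can be illustrated with plain byte strings. This is a simplified, hypothetical HTTP example; the real attack involves splicing a victim’s TLS session onto an attacker’s existing connection during renegotiation:

```python
# The server's application layer sees one continuous byte stream; it cannot
# tell where the attacker's pre-renegotiation bytes end and the victim's
# post-renegotiation bytes begin. Paths and header names here are made up.

attacker_prefix = (
    b"GET /account/transfer?to=attacker HTTP/1.1\r\n"
    b"X-Ignore: "  # unfinished header that will swallow the victim's request line
)
victim_request = (
    b"GET /account/balance HTTP/1.1\r\n"
    b"Cookie: session=victim-secret\r\n\r\n"
)

# What the web server parses after the splice:
stream = attacker_prefix + victim_request
first_line = stream.split(b"\r\n", 1)[0]
print(first_line)  # → b'GET /account/transfer?to=attacker HTTP/1.1'
```

The attacker never learns the victim’s cookie, but the server executes the attacker’s request line with the victim’s authenticated credentials attached, which is what makes the man-in-the-middle attack practical.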

In addition, the flaw allows practical man-in-the-middle attacks against Hypertext Transfer Protocol Secure (HTTPS) servers, the researchers said. HTTPS is the secure combination of HTTP and TLS used in most online financial transactions.

The flaw will prove a problem for a long time to come, security researcher Chris Paget wrote in a blog post, as it also affects SSL.

“How about the thousands of different software update mechanisms out there that depend on SSL being secure in order to function?” wrote Paget. “This is a protocol-level breach; one that requires a modification to the way that SSL and TLS function in order to repair.”

After they found the flaw, Ray and Dispensa disclosed their findings to the Industry Consortium for the Advancement of Security on the Internet (ICASI), a tech association that consists of Cisco, IBM, Intel, Juniper Networks, Microsoft and Nokia. The researchers also alerted the Internet Engineering Task Force (IETF) and a number of open-source SSL implementation projects.

On September 29, the various groups involved met and decided to set up a project, called Project Mogul, to handle remediation efforts. It will first concentrate on creating a protocol extension as a preliminary solution. Ray said in his blog that he expected to see announcements from the multi-vendor collaboration "shortly", including an internet draft proposal for the fix.

At the September meeting, Ray and Dispensa were informed about research being done by the IETF TLS Channel Bindings working group, which was following a similar line of inquiry into the TLS protocol.

On Wednesday, Martin Rex, a member of the IETF TLS Channel Bindings working group and a researcher at SAP, published details of a man-in-the-middle TLS renegotiation flaw affecting Microsoft IIS. The flaw, which is essentially the same as the one discovered by Ray, was publicized on Twitter by security researcher HD Moore.

Ray and Dispensa concluded on Wednesday that the flaw was already in the public domain, and so opted for full disclosure of their work.
