The UCF Mapper approach to compliance mapping

Using Crosswalking to map Citations in a grid is too time-consuming and expensive. Not everyone wants to spend the money on consultants or a Super Computer. Instead of mapping tens of thousands of Citations in a matrix, why not define Common Controls to map against?

When mapping a tagged Citation against Common Controls, each Citation is examined once. It is either found to match an existing Common Control, or a new Common Control is created. Instead of mapping each Citation to every other Citation, each Citation is mapped to a Common Control. As a result, matching Citations are found to match the same Common Control. This is shown in the diagram below.

[Diagram: UCF-ThreeModels-01.png]

The math is one mapping task per Citation: it either matches an existing Common Control or it doesn’t, and if it doesn’t, a new Common Control is created. You can recognize this approach because the mapping will show the Common Control that links the Citations together.
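To make that match-or-create step concrete, here is a minimal Python sketch. It is not the UCF Mapper’s implementation; the matches() test and the example Citation text are hypothetical stand-ins for whatever comparison the real mapping engine performs.

```python
from dataclasses import dataclass, field


@dataclass
class CommonControl:
    """A single Common Control and the Citations mapped to it."""
    text: str
    citations: list = field(default_factory=list)


def matches(citation: str, control: CommonControl) -> bool:
    """Hypothetical stand-in for the comparison a real mapping engine performs."""
    return (citation.lower() in control.text.lower()
            or control.text.lower() in citation.lower())


def map_citation(citation: str, controls: list) -> CommonControl:
    """Examine a Citation once: link it to a matching Common Control,
    or create a new Common Control if none matches."""
    for control in controls:
        if matches(citation, control):
            control.citations.append(citation)
            return control
    new_control = CommonControl(text=citation, citations=[citation])
    controls.append(new_control)
    return new_control


# One mapping task per Citation: matching Citations land on the same Common Control.
controls: list = []
for citation in ["Maintain an asset inventory.",
                 "maintain an asset inventory",
                 "Encrypt data at rest."]:
    map_citation(citation, controls)

for control in controls:
    print(control.text, "->", control.citations)
```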

Let’s go back to our example of comparing the NIST CyberSecurity document (the source of ID.AM-1) with both the ISO 27001 and NIST 800-53 documents. There are a total of 1,060 Citations across the three documents. With the Common Controls approach, the number of mapping tasks is simply the number of Citations being mapped. Comparing the NIST CyberSecurity document to the ISO document takes only 325 tasks (94 + 231). Comparing the NIST CyberSecurity document to the 800-53 document takes only 829 tasks (94 + 735). And comparing everything is just as simple: the task count equals the total number of Citations in all three documents, 1,060.

DOCUMENT                        CITATIONS      COMPARISON                   MAPPING TASKS
NIST CyberSecurity Citations    94             CyberSecurity to ISO         325
ISO 27001 Citations             231            CyberSecurity to 800-53      829
NIST 800-53 Citations           735            All three to each other      1,060
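As a quick check on the arithmetic above, the short Python snippet below contrasts the two approaches using the Citation counts from the table. The crosswalking figure uses the pairwise formula n × (n − 1) ÷ 2 (the same formula behind the 1,225-question example later in this section), while the Common Controls figure is simply the sum of the Citations being mapped.

```python
# Citation counts from the table above.
citations = {
    "NIST CyberSecurity": 94,
    "ISO 27001": 231,
    "NIST 800-53": 735,
}


def crosswalk_tasks(counts):
    """Crosswalking compares every Citation to every other Citation: n * (n - 1) / 2 pairings."""
    n = sum(counts)
    return n * (n - 1) // 2


def common_control_tasks(counts):
    """Mapping to Common Controls is one task per Citation: just add the counts."""
    return sum(counts)


pairs = [
    ("NIST CyberSecurity", "ISO 27001"),
    ("NIST CyberSecurity", "NIST 800-53"),
]

for a, b in pairs:
    counts = [citations[a], citations[b]]
    print(f"{a} to {b}: crosswalk = {crosswalk_tasks(counts):,}, "
          f"common controls = {common_control_tasks(counts):,}")

all_counts = list(citations.values())
print(f"All three: crosswalk = {crosswalk_tasks(all_counts):,}, "
      f"common controls = {common_control_tasks(all_counts):,}")
```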

Mapping to Common Controls eliminates the problem that Crosswalking creates – so you can work smarter, not harder

Leveraging Natural Language Processing

What the Super Computer approach attempts with Natural Language Processing and the tagging of text is brilliant. How it actually handles the tagging and recognition of that text – not so much. The UCF Mapper uses the same type of NLP engine that the Super Computer uses, and we get the same results, as shown below.

The UCF Mapper has a built-in dictionary (and a process for finding and adding new terms) that no Super Computer has. Because of that, the UCF Mapper allows a trained user to add new terms, such as “scope of the BCMS”, as shown below.

Adding those terms to the dictionary then allows the sentence to be tagged properly, as shown below.
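As a rough illustration of dictionary-driven tagging (a minimal sketch with a made-up phrase list, not the UCF Mapper’s actual engine), note how adding “scope of the BCMS” to the dictionary changes what gets tagged:

```python
import re


def tag_sentence(sentence: str, dictionary: set) -> list:
    """Return the dictionary phrases found in the sentence, longest phrases first."""
    found = []
    for phrase in sorted(dictionary, key=len, reverse=True):
        if re.search(re.escape(phrase), sentence, flags=re.IGNORECASE):
            found.append(phrase)
    return found


dictionary = {"information security", "asset inventory"}
sentence = "The organization shall determine the scope of the BCMS."

print(tag_sentence(sentence, dictionary))   # [] - the phrase is unknown, so nothing is tagged

dictionary.add("scope of the BCMS")         # a trained user adds the new term
print(tag_sentence(sentence, dictionary))   # ['scope of the BCMS']
```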

The UCF’s Compliance Dictionary (ComplianceDictionary.com) holds more than a quarter of a million words, phrases, and relationship connections that neither the Consulting Armies nor any Super Computer can leverage.

The UCF Mapper gives you a huge dictionary and the same NLP power the Super Computers have!

In addition to these improvements, which remove the need for either a Consulting Army or a Super Computer, the UCF Approach adds capabilities that ensure any Used Car Salesmen trying to pass off guesswork are buried. The UCF Mapper provides both a mapping accuracy calculation and standardized audit question formats.

Report Mapping Accuracy

Using the same examples, the UCF Mapper approach can identify the terms being tagged in each Citation and Common Control, and demonstrate how each term is semantically linked to the Citation or Common Control it is mapped to. The diagram below shows the Unified Compliance Framework® version of that linkage.

Not only should a Unified Compliance method show the semantic relationships, it should also show the Erdős distance between each pair of terms selected for mapping. It’s one thing to say that two phrases match each other because of a semantic relationship. It’s something completely different to say that they match with 80% accuracy versus 20% accuracy. If a “Unified Compliance” method provides no accuracy reporting, you won’t have enough information to decide whether the mapping has been performed adequately.
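To show how a distance over a term graph could be turned into a match-accuracy percentage, here is a deliberately simplified sketch. The term graph, the breadth-first distance, and the 100 / (1 + distance) scoring rule are all hypothetical illustrations, not the UCF’s actual Erdős-distance calculation.

```python
from collections import deque
from typing import Optional

# Hypothetical semantic graph: each term points to its directly related terms.
TERM_GRAPH = {
    "asset inventory": {"asset register", "hardware list"},
    "asset register": {"asset inventory", "asset catalog"},
    "asset catalog": {"asset register"},
    "hardware list": {"asset inventory"},
    "encryption": {"cryptography"},
    "cryptography": {"encryption"},
}


def semantic_distance(a: str, b: str) -> Optional[int]:
    """Shortest number of relationship hops between two terms (None if unconnected)."""
    if a == b:
        return 0
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        term, dist = queue.popleft()
        for neighbor in TERM_GRAPH.get(term, set()):
            if neighbor == b:
                return dist + 1
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return None


def match_accuracy(a: str, b: str) -> float:
    """Convert distance to a percentage: closer terms score higher, unconnected terms score 0."""
    dist = semantic_distance(a, b)
    return 0.0 if dist is None else 100.0 / (1 + dist)


print(match_accuracy("asset inventory", "asset register"))   # 50.0   - one hop apart
print(match_accuracy("asset inventory", "asset catalog"))    # ~33.3  - two hops apart
print(match_accuracy("asset inventory", "encryption"))        # 0.0    - no relationship
```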

With Mapping Accuracy reports, your work is instantly defensible!

Standardize Audits

Over 90% of Authority Documents lack audit questions. The only way to achieve auditability for Unified Compliance is to establish and maintain a methodology to create standardized Audit Questions.

If you think about it, Citations reference people (either individually, in groups, or corporately), records (paper or electronic), assets (whether buildings or computers), and processes (whether formal or informal). People can be interviewed. Records can be examined. Assets can be tested for compliance. Processes can be observed.

These four audit methods (interview, examine, test, observe) can be added to every Common Control that exists. And once created for a Common Control, the audit methods can be applied to each Citation that maps to that Common Control.
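Here is a minimal sketch of that inheritance, with made-up audit questions and a hypothetical data model (not the UCF Mapper’s): the four audit methods are recorded once on a Common Control, and every Citation mapped to that Common Control picks them up.

```python
# The four audit methods named above.
AUDIT_METHODS = ("interview", "examine", "test", "observe")


class CommonControl:
    def __init__(self, name: str, audit_questions: dict):
        # One question per audit method, written once for the Common Control.
        self.name = name
        self.audit_questions = audit_questions
        self.citations = []

    def audit_plan(self):
        """Apply the Common Control's audit questions to every mapped Citation."""
        return [
            (citation, method, question)
            for citation in self.citations
            for method, question in self.audit_questions.items()
        ]


control = CommonControl(
    name="Maintain an asset inventory",
    audit_questions={
        "interview": "Who is responsible for maintaining the asset inventory?",
        "examine": "Does the asset inventory record exist, and is it current?",
        "test": "Do sampled assets appear in the inventory?",
        "observe": "Is the inventory update process being followed?",
    },
)

# Citations from different Authority Documents mapped to the same Common Control.
control.citations += ["NIST CSF ID.AM-1", "ISO 27001 A.8.1.1"]

for citation, method, question in control.audit_plan():
    print(f"{citation} [{method}]: {question}")
```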

For two documents of 25 Citations each (50 Citations in total), crosswalking without Common Controls creates 1,225 potential Citation-to-Citation audit questions (50 × 49 ÷ 2). Using the Common Controls method, only 50 potential audit questions are created: one per Citation. It’s obvious that the Common Controls approach creates a much more realistic and manageable audit process.

The UCF Mapper takes the guesswork out of creating audit questions!

Sign Up!