Everybody seems to show how this or that document compares to a bunch of other documents. More than half of it is malarkey. Until recently, compliance mapping was a huge and very costly endeavor. Organizations that didn't have the manpower or couldn't absorb the cost either didn't really do the work, or they just guessed at the answer.
There are four scenarios we encounter when looking at a set of Citations mapped to any other set of Citations. We’ve broken these down into the Consulting Army approach, the Super Computer approach, the Used Car Salesman approach, and the UCF Mapper approach.
If the provider can't answer at least six of these questions with a "yes," they do not have a modern approach or a Unified Compliance offering. And if they check the box for Citation-to-Citation mapping, make sure they can prove they are really checking each and every Citation-to-Citation pairing possible. We've never seen that to be a workable solution, but then again, we've never seen a unicorn either – maybe they exist.
| Question | The Consulting Army Approach | The Super Computer Approach | The Used Car Salesman Approach | The UCF Mapper Approach |
|---|---|---|---|---|
| Do they have a documented Citation extraction methodology? | | | | |
| Can they demonstrate their Mandate tagging methodology? | | | | |
| Do they match Citations to Citations? | | | | |
| Do they match Citations to Common Controls? | | | | |
| Do they leverage NLP engines for tagging? | | | | |
| Do they leverage a formal Compliance Dictionary? | | | | |
| Can they demonstrate their matching process? | | | | |
| Do they track the % of semantic relationship accuracy of each mapping? | | | | |
| Do they have a standardized Audit Question methodology? | | | | |
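Why is checking "each and every Citation-to-Citation pairing possible" so unlikely to be workable? The arithmetic alone makes the case: direct pairwise mapping grows quadratically with the number of Citations, while mapping every Citation once to a shared set of Common Controls grows linearly. Here is a minimal illustrative sketch of that comparison (the function names and counts are ours, not any provider's implementation):

```python
def pairwise_mappings(num_citations: int) -> int:
    """Distinct Citation-to-Citation pairs that would each need checking."""
    return num_citations * (num_citations - 1) // 2

def hub_mappings(num_citations: int) -> int:
    """Mappings needed when each Citation maps once to Common Controls."""
    return num_citations

# Compare the workload as the Citation set grows.
for n in (100, 1_000, 10_000):
    print(f"{n:>6} citations: {pairwise_mappings(n):>12,} pairs "
          f"vs {hub_mappings(n):>6,} hub mappings")
```

At 10,000 Citations, exhaustive pairing means nearly 50 million comparisons, versus 10,000 mappings through a Common Controls hub. That gap is the reason to press a provider who claims full Citation-to-Citation coverage for proof.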