Cybersecurity failed because cybersecurity training failed. Full stop.
Relying on self-paced videos followed by multiple-choice questions, and calling these activities scenario-based awareness and training, has harmed our National Security.
So as we work with CyberDI and LTP/LPP on the Cybersecurity Maturity Model Certification (CMMC), we want to help address this crisis.
Considering how to teach and assess the seventeen Domains of CMMC weighs heavily on us. See, it’s not just 17 Domains. You need to think practices, processes, and assessment objectives. In the end we have to ensure assessors have the ability to apply the yet-to-be-released CMMC Assessment Process, using the yet-to-be-released scoping guidance, to check compliance on 705 assessment objectives.
So we turn to you, the audience, and ask, “How would you teach the Domains?” It always begins with the audience.
I have such an amazing Instructional Design team:
- Leighton Johnson
- Vincent Scott
- Paul Netopski
- Brian Rogolaski
- Dana Mantilla
- Richard Dawson
- Lauren Tucker
- Lisa Lancor
Compliance Recipe
In the end, how you assess all 705 objectives is like making a breakfast smoothie. Everyone will tell you the best order to add the fruit, but in the end the blender blades treat them all the same. CMMC does the same with assessment objectives. In the end you must have compliance on all 705 assessment objectives. In the wild, assessment strategies have coalesced around four main plans:
- You break objectives down into People, Processes, and Technology
- You organize the Domains into Groups and deploy assessors likewise
- You organize your Domains into technical systems and deploy assessors likewise
- You organize the 705 assessment objectives into one gigantic spreadsheet
The approach the assessor takes does not matter. In the end you chop up 17 Domains into 700-and-something assessment objectives. Still, I want to cover the common text structures deployed by experts in the field.
Determining an Objectives Scheme
I can’t share the CMMC-AB objectives, but we have to cover ALL the Domains, ALL Practices, and ALL Processes in CMMC. That means 705 objectives.
Now, we can’t give you a quiz with 705 items on it, so we had to immediately think of ways to cast an ontological net over the CMMC-AB objectives without threatening the validity of any pre- and posttest.
You see, objectives take multiple items to measure. Given we have 17 Domains, if we wanted to use forced-response items (multiple choice) we would find our learners in a pickle.
Technically you should have two items, really three, per objective. So if we assessed every assessment objective we would need a posttest of 2,115 items. So writing objectives at the assessment-objective level was out.
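The arithmetic behind that number is worth making explicit. A quick back-of-the-envelope sketch, using the two-items (really three) per-objective rule of thumb from the text:

```python
# Item counts for a posttest covering every CMMC assessment
# objective, at two to three multiple-choice items per objective.
objectives = 705
items_minimum = objectives * 2      # bare minimum
items_recommended = objectives * 3  # really what you want

print(items_minimum)     # 1410
print(items_recommended) # 2115
```

At 2,115 items, no learner could sit the posttest in one go, which is why the assessment-objective level was off the table.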
Next, I turned to thinking about the rigor of the scenario problems we had to craft. I do not want learners spending hours in class looking for random page numbers and CMMC Practice and Process numbering systems.
I don’t care if learners can count (unless talking IADM).
So, I turned to Webb’s Depth of Knowledge:
- Level 1. Recall and Reproduction
- Level 2. Skills and Concepts
- Level 3. Strategic Thinking
- Level 4. Extended Thinking
This got me to:
- identify impacted domains and practices
- list common strategies deployed by the OSC to meet compliance with those practices
- compare alternate strategies for meeting compliance on these practices
- create advice for the OSC to include on a POA&M
Still, this left me with 68 objectives. Still too many. I would need 130-210ish forced-response items. Plus, you usually must start with a bank of ten items for each objective to get down to three good multiple-choice items.
Can’t happen. Not without threatening validity.
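The item-writing burden behind those numbers is easy to make concrete. A quick sketch, using the 68 objectives and the ten-item draft bank mentioned above:

```python
# Item counts for 68 objectives: two to three forced-response
# items per objective on the final test, plus a draft bank of
# ten items per objective to winnow down from.
objectives = 68
needed_min = objectives * 2   # 136 final items at minimum
needed_max = objectives * 3   # 204 final items at most
draft_bank = objectives * 10  # 680 draft items to write first

print(needed_min, needed_max, draft_bank)  # 136 204 680
```

Writing and piloting 680 draft items to keep roughly 200 is where the validity threat lives: each item needs its own review and field testing.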
Multidimensional Scenario Problems
So forced-response items went out the window. Instead, Dr. Tucker and I began to think on a multidimensional scenario-based problem so we could create one template to pilot and test on one domain, do some content-validity work with our SMEs, and then draft the rest of the domains to pilot.
Our CCP Domain Scenario Template
Given a scenario, identify impacted domains and practices; list common strategies deployed by the OSC to meet compliance with those practices; compare alternate strategies for meeting compliance with these practices; and create advice for the OSC to include on a POA&M in the (area) out of compliance.
| Domains & Practices | Common Strategies for Compliance | Compare Alternate Strategies for Compliance | Advice for OSC on a POA&M |
| --- | --- | --- | --- |
| 1 point | 1 point | 1 point | 1 point |
| All Domains and practices were appropriately identified. | All common strategies for compliance deployed by the OSC are identified. | Provides a minimum of one alternate strategy for compliance, identified and explained alongside the OSC approach. | Provides at least two pieces of advice to the OSC to include on a POA&M in the specified area out of compliance. |
| (If incorrect, no additional points awarded. Revise and resubmit.) | (If incorrect, no additional points awarded. Revise the next sections to resubmit.) | (If incorrect, no additional points awarded. Revise the next sections to resubmit.) | (If incorrect, no additional points awarded. Revise the next sections to resubmit.) |
Domains & Practices: If the learner scores a 0 on domains and practices, the remaining columns need to be redone; no remaining points can be awarded.
Common Strategies for Compliance: If the learner scores a 0 on common strategies for compliance, the remaining columns need to be redone; no remaining points can be awarded.
Alternate Strategies: If the learner scores a 0 on alternate strategies, the remaining column needs to be redone; no remaining points can be awarded.
Each scenario is worth a total of four points, but the points accumulate in order: if you cannot identify the correct domains and practices, you cannot earn credit on the remaining assessment objectives.
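That gated, cumulative scoring can be sketched in a few lines. A minimal illustration (the function and its argument names are mine, not part of any CMMC material):

```python
def score_scenario(column_points):
    """Score one scenario under gated (cumulative) scoring.

    column_points: points (0 or 1) for the four rubric columns,
    in order: domains & practices, common strategies, alternate
    strategies, POA&M advice. Once a column scores 0, the gate
    closes and no later column can earn points.
    """
    total = 0
    for points in column_points:
        if points == 0:
            break  # must revise and resubmit before earning more
        total += points
    return total

print(score_scenario([1, 1, 1, 1]))  # 4: full credit
print(score_scenario([0, 1, 1, 1]))  # 0: missed domains & practices
print(score_scenario([1, 1, 0, 1]))  # 2: gated at alternate strategies
```

The gate is the whole point of the design: a learner who misidentifies the domains and practices is working the wrong problem, so nothing downstream can count.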
Content Validity Steps
Now that Dr. Tucker and I have a template to play with, we have provided it to our two content-validity experts, Leighton Johnson and Vincent Scott. Each of us will create a Domain-specific scenario. We will then get together with our three exemplar scenarios and resolve any disagreements until we reach 100% agreement.
Then we will divide up the Domains and finish writing scenarios. Then comes the hard part: running cognitive labs with students, doing inter-rater reliability checks, and writing scoring guides.
Having so much fun.