Assessment Scoring Model
The CYC Assess scoring model is derived from and compatible with the Microsoft Azure Review Checklists (ARC) spreadsheet dashboard. Enterprise clients familiar with the ARC spreadsheet will recognise the scoring dimensions and chart types in the CYC report.
CYC extends the model in three ways not present in the ARC spreadsheet:
- WAF pillar scores computed per design area in addition to overall
- Remediation effort classified per finding (quick win / medium / complex)
- Drift comparison available for Tier 3 recurring assessments
Item Status Values
Every assessed checklist item is assigned one of five status values, directly mapping to the ARC spreadsheet taxonomy:
| CYC Status | ARC Equivalent | Meaning |
|---|---|---|
| compliant | Fulfilled | Recommendation followed. No action required. |
| non_compliant | Open | Recommendation not followed. Action item exists. |
| not_applicable | N/A | Does not apply to this environment — resource not deployed or organisational context makes it irrelevant. |
| not_required | Not required | Understood but intentionally not adopted. Confirmed via intake questionnaire. |
| data_unavailable | Not verified | Insufficient data to assess. Typically a missing permission or collection error. Excluded from scoring denominator. |
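The five status values and their ARC mapping can be sketched as a simple enum plus lookup table. The names below mirror the table above; the class and dictionary names themselves are illustrative, not CYC's actual API.

```python
from enum import Enum

class ItemStatus(str, Enum):
    """Status values assigned to assessed checklist items (per the table above)."""
    COMPLIANT = "compliant"
    NON_COMPLIANT = "non_compliant"
    NOT_APPLICABLE = "not_applicable"
    NOT_REQUIRED = "not_required"
    DATA_UNAVAILABLE = "data_unavailable"

# Direct mapping to the ARC spreadsheet taxonomy.
ARC_EQUIVALENT = {
    ItemStatus.COMPLIANT: "Fulfilled",
    ItemStatus.NON_COMPLIANT: "Open",
    ItemStatus.NOT_APPLICABLE: "N/A",
    ItemStatus.NOT_REQUIRED: "Not required",
    ItemStatus.DATA_UNAVAILABLE: "Not verified",
}
```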
Scoring Dimensions
The Assessment Engine computes scores across four dimensions:
Top-Level Metrics
| Metric | Formula | Meaning |
|---|---|---|
| Overall compliance % | (compliant + not_required) / (total − data_unavailable) | Percentage of applicable items that are compliant |
| Open items % | non_compliant / (total − data_unavailable) | Percentage of applicable items requiring remediation |
| Assessment coverage % | (total − data_unavailable) / total | Proportion of items with sufficient data. Equivalent to the collection quality score. |
| WAF pillar score | (compliant + non_compliant − min) / (max − min) | Normalised score per WAF pillar. Identical to ARC spreadsheet formula. |
| Progress per area | (compliant + not_required) / total per area | Closure rate per design area. Feeds the radar chart. |
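The first three formulas above can be sketched as a single function over a list of item statuses. The function name and return keys are illustrative; only the arithmetic comes from the table.

```python
from collections import Counter

def top_level_metrics(statuses):
    """Compute the top-level percentages from a list of item status strings.

    data_unavailable items are excluded from the scoring denominator,
    as specified in the status table.
    """
    counts = Counter(statuses)
    total = len(statuses)
    applicable = total - counts["data_unavailable"]  # scoring denominator
    return {
        "overall_compliance_pct": 100 * (counts["compliant"] + counts["not_required"]) / applicable,
        "open_items_pct": 100 * counts["non_compliant"] / applicable,
        "assessment_coverage_pct": 100 * applicable / total,
    }

metrics = top_level_metrics(
    ["compliant"] * 6 + ["not_required"] * 2
    + ["non_compliant"] * 1 + ["data_unavailable"] * 1
)
# 10 items, 9 applicable: compliance 8/9 ≈ 88.9%, open 1/9 ≈ 11.1%, coverage 9/10 = 90%
```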
Report Visualisations
All chart types are directly comparable to the ARC spreadsheet dashboard:
| Chart | Type | Data source |
|---|---|---|
| Overall status | Pie | All items: compliant / non_compliant / not_applicable / data_unavailable |
| High severity status | Pie | High severity items only |
| Medium severity status | Pie | Medium severity items only |
| Low severity status | Pie | Low severity items only |
| Status per design area | Stacked bar | Status counts per design area |
| Item distribution by area | Pie | Total item count per design area |
| Item distribution by severity | Pie | Total item count per severity level |
| Design area coverage | Radar | Progress % per design area — primary executive visual |
| WAF pillar scores | Radar | Normalised score per WAF pillar (CYC addition) |
| WAF × area matrix | Heatmap table | Pillar score per design area combination (CYC addition) |
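The data behind the per-area charts is a straightforward group-and-count over assessed items. A minimal sketch, assuming items are available as (design_area, status) pairs (the actual item shape in CYC may differ):

```python
from collections import Counter, defaultdict

def status_per_design_area(items):
    """Count statuses per design area — the data series behind the
    'Status per design area' stacked-bar chart."""
    buckets = defaultdict(Counter)
    for area, status in items:
        buckets[area][status] += 1
    return {area: dict(c) for area, c in buckets.items()}

counts = status_per_design_area([
    ("Identity", "compliant"),
    ("Identity", "non_compliant"),
    ("Networking", "compliant"),
])
# counts["Identity"] == {"compliant": 1, "non_compliant": 1}
```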
Drift Detection (Tier 3)
Tier 3 recurring assessments include a delta comparison against the most recent prior assessment for the same tenant:
- Items moved from non_compliant → compliant — remediation confirmed
- Items moved from compliant → non_compliant — regression detected
- New checklist items added since last assessment
- Items that changed severity classification upstream
Drift comparison uses CYC IDs as the stable join key across assessment cycles. This is why fixed-sequence ID generation was chosen — assessment-scoped IDs would make drift tracking impossible.
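The status-based drift categories above reduce to a join of two assessments on CYC ID. A minimal sketch, assuming each assessment is available as a mapping from CYC ID to status (the IDs shown are placeholders, and severity-change tracking is omitted here since it needs the upstream checklist, not just statuses):

```python
def compute_drift(previous, current):
    """Compare two assessment cycles keyed by stable CYC ID."""
    remediated = [i for i, s in current.items()
                  if s == "compliant" and previous.get(i) == "non_compliant"]
    regressed = [i for i, s in current.items()
                 if s == "non_compliant" and previous.get(i) == "compliant"]
    new_items = [i for i in current if i not in previous]
    return {"remediated": remediated, "regressed": regressed, "new_items": new_items}

drift = compute_drift(
    previous={"A01.01": "non_compliant", "A01.02": "compliant"},
    current={"A01.01": "compliant", "A01.02": "non_compliant", "A01.03": "compliant"},
)
# drift["remediated"] == ["A01.01"]  (remediation confirmed)
# drift["regressed"] == ["A01.02"]   (regression detected)
# drift["new_items"] == ["A01.03"]   (added since last assessment)
```

The join only works because the IDs are stable: an assessment-scoped key would change every cycle and make the previous/current lookup meaningless.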