PCI DSS Scoping Engine

Not every PCI DSS requirement applies to every merchant. The Scoping Engine automatically adjusts the assessment to reflect the merchant’s actual environment by hiding requirements that are not relevant and marking the corresponding fields as Not Applicable.

How Scoping Works

The scoping engine evaluates a set of scoping rules against the assessor’s answers to scoping questions. Each rule consists of:
  1. A condition — a field path, an operator, and an expected value.
  2. An action — what to do when the condition is met (hide_requirement, show_requirement, or set_na).
When an assessor answers a scoping question (e.g., “Does the entity use wireless technologies?”), the engine evaluates all rules whose conditions reference that field. Requirements that fall out of scope are removed from the workbench view, and their fields are automatically set to “Not Applicable — Hidden by scoping rules.”
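The rule structure described above can be sketched as a data type. This is an illustrative shape only; the field names (`fieldPath`, `target`) are assumptions, not Kliper's actual schema.

```typescript
// Hypothetical sketch of a scoping rule; field names are illustrative.
type Operator =
  | "equals" | "not_equals"
  | "contains" | "not_contains"
  | "exists" | "not_exists";

interface ScopingRule {
  condition: {
    fieldPath: string;   // e.g. "scoping.usesWireless" (assumed path)
    operator: Operator;
    expected?: string;   // omitted for exists / not_exists
  };
  action: "hide_requirement" | "show_requirement" | "set_na";
  target: string;        // requirement or field the action applies to
}

// Example: hide a wireless requirement when wireless is not in use.
const rule: ScopingRule = {
  condition: {
    fieldPath: "scoping.usesWireless",
    operator: "equals",
    expected: "No",
  },
  action: "hide_requirement",
  target: "11.2.1",
};
```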

Supported Condition Operators

| Operator | Behavior |
|---|---|
| equals | Field value exactly matches the expected value |
| not_equals | Field value does not match |
| contains | Field value (string) contains the expected substring |
| not_contains | Field value does not contain the substring |
| exists | Field has a non-empty value |
| not_exists | Field is empty, null, or undefined |
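The six operators could be evaluated with a function along these lines. This is a sketch of the documented behavior, not Kliper's actual implementation.

```typescript
type Operator =
  | "equals" | "not_equals"
  | "contains" | "not_contains"
  | "exists" | "not_exists";

// Illustrative evaluation of the documented operator semantics.
function evaluate(
  op: Operator,
  value: string | null | undefined,
  expected = "",
): boolean {
  // "exists" / "not_exists" treat empty, null, and undefined as absent.
  const empty = value === null || value === undefined || value === "";
  switch (op) {
    case "equals":       return value === expected;
    case "not_equals":   return value !== expected;
    case "contains":     return typeof value === "string" && value.includes(expected);
    case "not_contains": return !(typeof value === "string" && value.includes(expected));
    case "exists":       return !empty;
    case "not_exists":   return empty;
  }
}
```

For example, `evaluate("exists", "")` is false while `evaluate("not_exists", null)` is true, matching the table above.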

Built-In Scoping Rules

Kliper ships with scoping rules derived from the official PCI DSS v4.0.1 ROC template. These rules cover the most common scoping scenarios:
Scoping question: Does the entity use wireless technologies?
When answered No, the following requirements are automatically hidden:

| Hidden Requirement | Description |
|---|---|
| 1.2.3 | Wireless access points configuration |
| 2.1.1 | Wireless vendor defaults changed |
| 4.1.1 | Wireless transmission encryption |
| 11.1 | Wireless access point testing |
| 11.2.1 | Wireless scanning processes |
| 11.2.2 | Wireless IDS/IPS deployment |
Additional sub-rules for wireless scanning method (11.1.c) and automated monitoring (11.1.d) are evaluated independently based on whether those specific techniques are in use.
Scoping question: Does the entity transmit cardholder data via end-user messaging technologies?
When answered No, Requirement 4.2.2 (securing end-user messaging technologies) is hidden.

Scoping question: Is the assessed entity a service provider?
When answered No, Requirement 12.9 (service provider acknowledgment of responsibilities) is hidden.

Scoping question: Does the entity use a PCI-listed P2PE solution?
When answered Yes, Requirement 3.4 (rendering PAN unreadable) is hidden — the P2PE solution addresses this requirement.

Scoping question: Does the entity store cardholder data?
When answered No, Requirement 3.1 (cardholder data retention policies) is hidden.

Scoping question: Does the entity use network segmentation to reduce PCI DSS scope?
When answered No, Requirement 6.1 (segmentation testing) is hidden.
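The built-in rules above can be pictured as a small data set. The shape and question keys below are illustrative, not Kliper's actual rule format.

```typescript
// Sketch of the built-in rules as data; question keys are assumed names.
interface BuiltInRule {
  question: string;       // hypothetical scoping-question key
  answer: "Yes" | "No";   // the answer that triggers hiding
  hides: string[];        // requirements removed from scope
}

const builtInRules: BuiltInRule[] = [
  { question: "usesWireless",          answer: "No",  hides: ["1.2.3", "2.1.1", "4.1.1", "11.1", "11.2.1", "11.2.2"] },
  { question: "usesEndUserMessaging",  answer: "No",  hides: ["4.2.2"] },
  { question: "isServiceProvider",     answer: "No",  hides: ["12.9"] },
  { question: "usesP2PE",              answer: "Yes", hides: ["3.4"] },
  { question: "storesCardholderData",  answer: "No",  hides: ["3.1"] },
  { question: "usesSegmentation",      answer: "No",  hides: ["6.1"] },
];
```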

Scoping Evaluation Flow

Assessor answers scoping question


Scoping Engine loads all rules


For each rule:
  ├── Evaluate condition against answers
  │     (field path → operator → expected value)
  │
  └── If condition is TRUE → apply action:
        ├── hide_requirement → remove from visible set
        ├── show_requirement → add to visible set
        └── set_na → mark specific field as N/A


Return:
  • visibleRequirements[] — shown in workbench
  • hiddenRequirements[] — removed from view
  • naFields{} — fields auto-set to N/A with reason
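The flow above can be sketched as a single evaluation pass. This is a minimal sketch assuming a simplified rule shape with only `equals`/`not_equals`; the function and field names are illustrative, not Kliper's API. The N/A reason string is the one quoted earlier in this page.

```typescript
interface Rule {
  fieldPath: string;
  operator: "equals" | "not_equals";
  expected: string;
  action: "hide_requirement" | "show_requirement" | "set_na";
  target: string;
}

interface ScopingResult {
  visibleRequirements: string[];
  hiddenRequirements: string[];
  naFields: Record<string, string>; // field → reason
}

// One pass over all rules against the current scoping answers.
function runScoping(
  allRequirements: string[],
  rules: Rule[],
  answers: Record<string, string>,
): ScopingResult {
  const visible = new Set(allRequirements);
  const naFields: Record<string, string> = {};

  for (const rule of rules) {
    const value = answers[rule.fieldPath];
    const matched =
      rule.operator === "equals" ? value === rule.expected : value !== rule.expected;
    if (!matched) continue; // condition not met → no action

    switch (rule.action) {
      case "hide_requirement": visible.delete(rule.target); break;
      case "show_requirement": visible.add(rule.target); break;
      case "set_na":
        naFields[rule.target] = "Not Applicable — Hidden by scoping rules";
        break;
    }
  }

  return {
    visibleRequirements: [...visible],
    hiddenRequirements: allRequirements.filter((r) => !visible.has(r)),
    naFields,
  };
}
```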

Re-Scoping

Scoping is not permanent. If the assessor changes a scoping answer (e.g., updates “Uses wireless?” from No to Yes), the engine re-evaluates all rules immediately. Previously hidden requirements reappear in the workbench, and their N/A markers are cleared. No assessment data is lost during re-scoping — answers that were previously entered for a now-hidden requirement are preserved and restored if the requirement becomes visible again.
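The preservation guarantee suggests that hiding only toggles visibility and never deletes stored answers. The data model below is purely illustrative, not Kliper's actual storage.

```typescript
// Illustrative model: visibility is a flag; answers survive re-scoping.
interface RequirementState {
  visible: boolean;
  answers: Record<string, string>;
}

const state: Record<string, RequirementState> = {
  "11.2.1": { visible: true, answers: { finding: "In Place" } },
};

function setVisibility(reqId: string, visible: boolean): void {
  state[reqId].visible = visible; // the answers object is never cleared
}

setVisibility("11.2.1", false); // "Uses wireless?" changed to No
setVisibility("11.2.1", true);  // changed back to Yes — answers restored
```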

Assessment Workbench

The Assessment Workbench is the primary interface where assessors conduct their evaluation. It is designed for extended, focused work sessions on individual requirements.

Layout

The workbench uses a three-panel layout:
| Panel | Position | Purpose |
|---|---|---|
| Section Tree | Left | Hierarchical navigation of all PCI DSS sections and sub-requirements. Shows completion status per section. |
| Question Panel | Center | The active requirement’s testing procedures, reporting instructions, answer fields, and finding selection. |
| Context Panels | Right (collapsible) | Stacked, collapsible panels for Cortex AI, Attachments, Comments, Collaborators, Gap Assessment, and Audit Trail. |

Section Tree (Left Panel)

The section tree displays all 12 PCI DSS principal requirements and their sub-sections in a collapsible hierarchy. Each node shows:
  • Requirement number (e.g., 3.4.1)
  • Completion indicator — visual status showing whether the requirement has been answered
  • Scoping visibility — requirements hidden by the scoping engine do not appear in the tree
Clicking a node loads that requirement into the center panel.

Question Panel (Center)

For each requirement, the center panel presents:
1. Requirement Header

The requirement number, title, and the full PCI DSS requirement text.
2. Testing Procedures

Each testing procedure defined in the ROC template for this requirement. Testing procedures specify what the assessor must examine, interview, or observe. Each procedure has a structured response field.
3. Reporting Instructions

The ROC template’s reporting instructions — structured guidance on what the assessor must document. These instructions describe which documents to review, which personnel to interview, which configurations to inspect, and what to report.
4. Validation Steps (Structured Prefix)

Pickable list fields for documenting:
  • Documentation Reviewed — link to uploaded evidence files
  • Samples Taken — sampling methodology and selections
  • Personnel Interviewed — names and roles
  • Assessor — lead QSA or associate
  • Critical Technologies — systems and components examined
  • Settings Reviewed — configuration parameters inspected
  • Methods — testing procedures and approaches used
  • Software — PCI SSC validated products or other applications
5. Assessment Finding

A selection for the requirement’s finding status:
  • In Place — requirement is fully met
  • Not Applicable — requirement does not apply to the assessed environment
  • Not Tested — requirement was not evaluated
  • Not in Place — requirement is not met
With optional method flags:
  • Compensating Control — Appendix C applies
  • Customized Approach — Appendix E applies
6. Findings Description

A free-text field for the assessor’s written findings. This is the narrative that appears in the final ROC. Cortex AI can auto-generate a draft for this field based on the testing procedures, uploaded evidence, and assessor responses.

Context Panels (Right Side)

The right side of the workbench contains collapsible panels that provide contextual information without leaving the current requirement:
Cortex AI
A chat interface for interacting with Cortex. The assessor can ask questions about the current requirement, request PCI DSS guidance, or trigger auto-fill for the findings description. Cortex responses are contextualized to the specific requirement being worked on. See the Cortex AI guide for details.
Attachments
Lists all evidence files uploaded for the current assessment, optionally filtered by section. Each file shows:
  • File name and type
  • Upload date and uploader
  • Malware scan status (clean, pending, quarantined) with per-engine results
  • AI validation status (Pending, Complete, Partial)
  • Tags (requirement associations, document tags)
Files can be uploaded, downloaded, previewed, and deleted from this panel.
Comments
Threaded, requirement-scoped comments. Assessors can:
  • Post comments on specific requirements
  • @mention team members (triggers notifications)
  • Mark comment threads as resolved
  • View comment history and timestamps
Collaborators
Shows team members assigned to the assessment and their roles (Editor, Reviewer, Viewer). Displays live presence — which team members are currently viewing the assessment.
Audit Trail
A chronological log of every change made to the current requirement — who changed what, when, and the before/after values. Useful for QA review and responding to PCI Council inquiries.

Answer Status Progression

Each assessment answer progresses through a defined status lifecycle:
Pending  ──▶  Reviewed  ──▶  Approved
   │              │
   └──────────────┘
     (can return to Pending if changes are needed)
  • Pending — initial state. The assessor is still working on the requirement.
  • Reviewed — the answer has been reviewed by a peer or supervisor.
  • Approved — the answer is finalized and locked for inclusion in the ROC report.
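The allowed transitions in the diagram above can be captured as a small table. The transition set is inferred from the diagram (Reviewed can return to Pending; Approved is locked), and the function name is illustrative.

```typescript
type Status = "Pending" | "Reviewed" | "Approved";

// Allowed next states, as inferred from the lifecycle diagram.
const transitions: Record<Status, Status[]> = {
  Pending: ["Reviewed"],
  Reviewed: ["Approved", "Pending"], // returns to Pending if changes are needed
  Approved: [],                      // finalized and locked for the ROC report
};

function canTransition(from: Status, to: Status): boolean {
  return transitions[from].includes(to);
}
```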

Document Evidence Sync

When files are uploaded to an assessment, Kliper automatically syncs them into the Section 6.4 Documentation Evidence table. This happens transparently:
  1. File is uploaded with optional tags (e.g., doctag-DOCFW for a firewall documentation tag).
  2. The platform creates or updates a row in the 6.4 answer’s docEvidence array.
  3. Each row contains: file ID, document reference tag, file name, AI-generated purpose summary, and upload date.
  4. Manual rows (entered directly by the assessor) are preserved alongside auto-generated rows.
A “Resync All” action is available to force re-synchronization of all files into the 6.4 table.
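The sync steps above can be sketched as follows: each uploaded file becomes one auto-generated row in the 6.4 answer's `docEvidence` array, updated in place on re-upload, while manual rows are left untouched. Apart from `docEvidence` itself, the field and function names here are assumptions.

```typescript
// Illustrative sketch of document-evidence sync; not Kliper's actual code.
interface DocEvidenceRow {
  fileId: string;
  docTag?: string;    // e.g. "doctag-DOCFW"
  fileName: string;
  purpose: string;    // AI-generated purpose summary
  uploadDate: string;
  manual: boolean;    // true for rows entered directly by the assessor
}

function syncFile(
  docEvidence: DocEvidenceRow[],
  file: { id: string; name: string; tag?: string; purpose: string; uploadedAt: string },
): DocEvidenceRow[] {
  const row: DocEvidenceRow = {
    fileId: file.id,
    docTag: file.tag,
    fileName: file.name,
    purpose: file.purpose,
    uploadDate: file.uploadedAt,
    manual: false,
  };
  // Update the existing auto-generated row for this file, or append one.
  // Manual rows are preserved alongside auto-generated rows.
  const i = docEvidence.findIndex((r) => !r.manual && r.fileId === file.id);
  if (i >= 0) docEvidence[i] = row;
  else docEvidence.push(row);
  return docEvidence;
}
```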