PCI DSS Scoping Engine
Not every PCI DSS requirement applies to every merchant. The Scoping Engine automatically adjusts the assessment to reflect the merchant’s actual environment by hiding requirements that are not relevant and marking the corresponding fields as Not Applicable.

How Scoping Works
The scoping engine evaluates a set of scoping rules against the assessor’s answers to scoping questions. Each rule consists of:
- A condition — a field path, an operator, and an expected value.
- An action — what to do when the condition is met (`hide_requirement`, `show_requirement`, or `set_na`).
Supported Condition Operators
| Operator | Behavior |
|---|---|
| `equals` | Field value exactly matches the expected value |
| `not_equals` | Field value does not match the expected value |
| `contains` | Field value (string) contains the expected substring |
| `not_contains` | Field value does not contain the expected substring |
| `exists` | Field has a non-empty value |
| `not_exists` | Field is empty, null, or undefined |
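As a minimal sketch of these operator semantics (the function name and calling convention are illustrative assumptions, not Kliper's actual API):

```python
# Hedged sketch of the six condition operators. `actual` is the field's
# current value; `expected` is the rule's expected value (unused by the
# exists/not_exists operators). Names are illustrative assumptions.
def evaluate(operator, actual, expected=None):
    if operator == "equals":
        return actual == expected
    if operator == "not_equals":
        return actual != expected
    if operator == "contains":
        return isinstance(actual, str) and expected in actual
    if operator == "not_contains":
        return not (isinstance(actual, str) and expected in actual)
    if operator == "exists":
        return actual not in (None, "")
    if operator == "not_exists":
        return actual in (None, "")
    raise ValueError(f"unknown operator: {operator}")
```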
Built-In Scoping Rules
Kliper ships with scoping rules derived from the official PCI DSS v4.0.1 ROC template. These rules cover the most common scoping scenarios:

Wireless Technology
Scoping question: Does the entity use wireless technologies?

When answered No, the following requirements are automatically hidden. Additional sub-rules for the wireless scanning method (11.1.c) and automated monitoring (11.1.d) are evaluated independently, based on whether those specific techniques are in use.
| Hidden Requirement | Description |
|---|---|
| 1.2.3 | Wireless access points configuration |
| 2.1.1 | Wireless vendor defaults changed |
| 4.1.1 | Wireless transmission encryption |
| 11.1 | Wireless access point testing |
| 11.2.1 | Wireless scanning processes |
| 11.2.2 | Wireless IDS/IPS deployment |
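For illustration, one of these built-in rules could be represented as follows. The field path, key names, and overall schema here are assumptions made for the sketch, not Kliper's actual rule format:

```python
# Hypothetical representation of a built-in wireless scoping rule.
# The field path and schema keys are assumed, not Kliper's real schema.
wireless_hidden_rule = {
    "condition": {
        "field": "scoping.uses_wireless",  # assumed scoping-question field path
        "operator": "equals",
        "value": "No",
    },
    "action": {
        "type": "hide_requirement",
        "requirement": "11.1",  # wireless access point testing
    },
}
```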
End-User Messaging
Scoping question: Does the entity transmit cardholder data via end-user messaging technologies?

When answered No, Requirement 4.2.2 (securing end-user messaging technologies) is hidden.
Service Provider Status
Scoping question: Is the assessed entity a service provider?

When answered No, Requirement 12.9 (service provider acknowledgment of responsibilities) is hidden.
P2PE Solution
Scoping question: Does the entity use a PCI-listed P2PE solution?

When answered Yes, Requirement 3.4 (PAN rendering unreadable) is hidden — the P2PE solution addresses this requirement.
Cardholder Data Storage
Scoping question: Does the entity store cardholder data?

When answered No, Requirement 3.1 (cardholder data retention policies) is hidden.
Network Segmentation
Scoping question: Does the entity use network segmentation to reduce PCI DSS scope?

When answered No, Requirement 6.1 (segmentation testing) is hidden.
Scoping Evaluation Flow
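When a scoping answer is saved or the assessment loads, the engine walks every rule, tests its condition against the current answers, and applies the rule's action when the condition holds. A minimal sketch of that pass follows; the rule and answer shapes, function names, and field paths are illustrative assumptions, not Kliper's actual implementation:

```python
# Illustrative evaluation pass over scoping rules. Data shapes and names
# are assumptions for this sketch, not Kliper's real internals.
def condition_holds(cond, answers):
    """Evaluate one condition (a subset of the operators is shown)."""
    actual = answers.get(cond["field"])
    op, expected = cond["operator"], cond.get("value")
    if op == "equals":
        return actual == expected
    if op == "not_equals":
        return actual != expected
    if op == "exists":
        return actual not in (None, "")
    if op == "not_exists":
        return actual in (None, "")
    raise ValueError(f"operator not handled in this sketch: {op}")

def run_scoping(rules, answers):
    """Return the sets of hidden and N/A requirement numbers."""
    hidden, not_applicable = set(), set()
    for rule in rules:
        if not condition_holds(rule["condition"], answers):
            continue
        action = rule["action"]
        if action["type"] == "hide_requirement":
            hidden.add(action["requirement"])
        elif action["type"] == "show_requirement":
            hidden.discard(action["requirement"])
        elif action["type"] == "set_na":
            not_applicable.add(action["requirement"])
    return hidden, not_applicable
```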
Re-Scoping
Scoping is not permanent. If the assessor changes a scoping answer (e.g., updates “Uses wireless?” from No to Yes), the engine re-evaluates all rules immediately. Previously hidden requirements reappear in the workbench, and their N/A markers are cleared. No assessment data is lost during re-scoping — answers that were previously entered for a now-hidden requirement are preserved and restored if the requirement becomes visible again.

Assessment Workbench
The Assessment Workbench is the primary interface where assessors conduct their evaluation. It is designed for extended, focused work sessions on individual requirements.

Layout
The workbench uses a three-panel layout:

| Panel | Position | Purpose |
|---|---|---|
| Section Tree | Left | Hierarchical navigation of all PCI DSS sections and sub-requirements. Shows completion status per section. |
| Question Panel | Center | The active requirement’s testing procedures, reporting instructions, answer fields, and finding selection. |
| Context Panels | Right (collapsible) | Stacked, collapsible panels for Cortex AI, Attachments, Comments, Collaborators, Gap Assessment, and Audit Trail. |
Section Tree (Left Panel)
The section tree displays all 12 PCI DSS principal requirements and their sub-sections in a collapsible hierarchy. Each node shows:
- Requirement number (e.g., 3.4.1)
- Completion indicator — visual status showing whether the requirement has been answered
- Scoping visibility — requirements hidden by the scoping engine do not appear in the tree
Question Panel (Center)
For each requirement, the center panel presents:

Testing Procedures
Each testing procedure defined in the ROC template for this requirement. Testing procedures specify what the assessor must examine, interview, or observe. Each procedure has a structured response field.
Reporting Instructions
The ROC template’s reporting instructions — structured guidance on what the assessor must document. These instructions describe which documents to review, which personnel to interview, which configurations to inspect, and what to report.
Validation Steps (Structured Prefix)
Pickable list fields for documenting:
- Documentation Reviewed — link to uploaded evidence files
- Samples Taken — sampling methodology and selections
- Personnel Interviewed — names and roles
- Assessor — lead QSA or associate
- Critical Technologies — systems and components examined
- Settings Reviewed — configuration parameters inspected
- Methods — testing procedures and approaches used
- Software — PCI SSC validated products or other applications
Assessment Finding
A selection for the requirement’s finding status:
- In Place — requirement is fully met
- Not Applicable — requirement does not apply to the assessed environment
- Not Tested — requirement was not evaluated
- Not in Place — requirement is not met
- Compensating Control — Appendix C applies
- Customized Approach — Appendix E applies
Context Panels (Right Side)
The right side of the workbench contains collapsible panels that provide contextual information without leaving the current requirement:

Cortex AI Panel
Cortex AI Panel
A chat interface for interacting with Cortex. The assessor can ask questions about the current requirement, request PCI DSS guidance, or trigger auto-fill for the findings description. Cortex responses are contextualized to the specific requirement being worked on. See the Cortex AI guide for details.
Attachments Panel
Lists all evidence files uploaded for the current assessment, optionally filtered by section. Each file shows:
- File name and type
- Upload date and uploader
- Malware scan status (clean, pending, quarantined) with per-engine results
- AI validation status (Pending, Complete, Partial)
- Tags (requirement associations, document tags)
Comments Panel
Threaded, requirement-scoped comments. Assessors can:
- Post comments on specific requirements
- @mention team members (triggers notifications)
- Mark comment threads as resolved
- View comment history and timestamps
Collaborators Panel
Shows team members assigned to the assessment and their roles (Editor, Reviewer, Viewer). Displays live presence — which team members are currently viewing the assessment.
Audit Trail Panel
A chronological log of every change made to the current requirement — who changed what, when, and the before/after values. Useful for QA review and responding to PCI Council inquiries.
Answer Status Progression
Each assessment answer progresses through a defined status lifecycle:
- Pending — initial state. The assessor is still working on the requirement.
- Reviewed — the answer has been reviewed by a peer or supervisor.
- Approved — the answer is finalized and locked for inclusion in the ROC report.
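The lifecycle can be read as a simple transition map. A guard might look like the following sketch; the map itself is an assumption (in particular, whether a Reviewed answer can revert to Pending is not specified here):

```python
# Assumed transition map for the answer status lifecycle. Reverting from
# Reviewed to Pending is a guess; Approved is treated as locked, per the
# description above.
ALLOWED_TRANSITIONS = {
    "Pending": {"Reviewed"},
    "Reviewed": {"Approved", "Pending"},  # revert-to-Pending is assumed
    "Approved": set(),                    # finalized and locked
}

def can_transition(current, target):
    """Return True when moving an answer from `current` to `target` is allowed."""
    return target in ALLOWED_TRANSITIONS.get(current, set())
```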
Document Evidence Sync
When files are uploaded to an assessment, Kliper automatically syncs them into the Section 6.4 Documentation Evidence table. This happens transparently:
- File is uploaded with optional tags (e.g., `doctag-DOCFW` for a firewall documentation tag).
- The platform creates or updates a row in the 6.4 answer’s `docEvidence` array.
- Each row contains: file ID, document reference tag, file name, AI-generated purpose summary, and upload date.
- Manual rows (entered directly by the assessor) are preserved alongside auto-generated rows.
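The upsert step above can be sketched as follows, keying auto-generated rows by file ID so manual rows (which have no file ID) are left alone. The function and field names are illustrative assumptions, not Kliper's actual schema:

```python
# Hypothetical sketch of the Section 6.4 docEvidence sync: upsert a row
# keyed by file ID, preserving manually entered rows. Names are assumed.
def sync_doc_evidence(doc_evidence, file):
    row = {
        "fileId": file["id"],
        "referenceTag": file.get("tag"),
        "fileName": file["name"],
        "purpose": file.get("ai_summary", ""),   # AI-generated purpose summary
        "uploadDate": file["uploaded_at"],
    }
    for i, existing in enumerate(doc_evidence):
        if existing.get("fileId") == file["id"]:
            doc_evidence[i] = row                # re-upload: update in place
            return doc_evidence
    doc_evidence.append(row)                     # new file: add a row
    return doc_evidence
```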