Kliper generates a fully formatted Report on Compliance (ROC) document in DOCX format directly from the assessment data. The output is a Word document based on the official PCI DSS v4.0.1 ROC template (Revision 3), with all assessment answers, contact information, findings, evidence references, and summary tables populated into the correct locations.

Prerequisites

Before generating the ROC, ensure the following are complete:
  1. Contact Information (Section 1.1): Complete all contact fields, including assessed entity details (company name, DBA, address, website, primary contact), QSA company information, lead QSA credentials, QA reviewer details, and any associate QSAs, other assessors, or ISAs.
  2. Assessment Dates (Section 1.2): Fill in the report date, assessment start date, assessment end date, and onsite dates.
  3. Business Description (Section 2.1): Describe the nature of the business, how cardholder data is stored/processed/transmitted, and any other relevant details.
  4. Scope & Segmentation (Section 3): Document scope validation results, segmentation details, and validated product usage.
  5. Testing Procedure Responses (Section 7): Complete the assessor responses for each testing procedure across all applicable requirements. Each response populates a Response-X.Y.Z tag in the final document.
  6. Findings & Statuses: Select a finding status (In Place, Not Applicable, Not Tested, Not in Place) for every requirement, and write or auto-generate a findings description for each.
  7. Evidence & Sampling (Section 6): Upload evidence files, document the sampling methodology, and complete the evidence retention attestation.

You can generate a ROC at any point during the assessment to preview progress. Incomplete fields will appear as empty in the document. The platform does not block export for incomplete assessments.

Generating the Report

  1. Navigate to your assessment in the Assessment Workbench or the Engagement Hub.
  2. Select the Export ROC action.
  3. The platform generates the DOCX file and triggers a download.
The output filename follows the pattern {client-slug}-roc-{YYYY-MM-DD}.docx, where the slug is derived from the assessed entity's company name.
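The naming rule can be sketched as a small helper. The function name `roc_filename` and the exact slug rules (lowercase, non-alphanumeric runs collapsed to hyphens) are assumptions for illustration:

```python
import datetime
import re

def roc_filename(client_name, when=None):
    """Build an export filename like acme-corp-roc-2024-05-01.docx.

    Hypothetical helper: the slugification rule is an assumption, not
    the platform's documented behavior.
    """
    # Lowercase, replace runs of non-alphanumerics with hyphens, trim edges.
    slug = re.sub(r"[^a-z0-9]+", "-", client_name.lower()).strip("-")
    date = (when or datetime.date.today()).isoformat()  # YYYY-MM-DD
    return f"{slug}-roc-{date}.docx"
```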

How the Export Works

The DOCX export engine processes the ROC template through a multi-step pipeline:

Step 1 — Template Loading

The engine loads a Word document template from the templates/ directory. The default template is report.docx, based on the PCI SSC’s official ROC template (Revision 3). Custom templates can be used by specifying a template name. The template is a standard .docx file (ZIP archive containing XML). Tags are placed inline in the Word document as {{Tag-Name}} placeholders.
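Because a .docx file is just a ZIP archive, step 1 amounts to reading the main document part. A minimal sketch using Python's standard library (the function name is illustrative):

```python
import zipfile

def load_document_xml(template_path):
    """Read the main body part from a .docx template.

    A .docx is a ZIP archive of XML parts; the document body lives in
    word/document.xml. Sketch only: the real engine also reads other
    parts (styles, relationships, etc.).
    """
    with zipfile.ZipFile(template_path) as zf:
        return zf.read("word/document.xml").decode("utf-8")
```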

Step 2 — XML Run Merging

Word frequently splits text across multiple XML <w:r> (run) elements due to spell-check markers, formatting changes, and language attributes. This means a tag like {{Client-Name}} may be split across 3 or more XML elements. The engine merges adjacent runs within each paragraph so that {{ ... }} tags appear in a single <w:t> element. This ensures reliable tag detection and replacement.
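The run-merging step can be sketched as follows. This simplified version collapses all of a paragraph's runs into one whenever the combined text contains a tag opener, which drops per-run formatting that the real engine preserves:

```python
import xml.etree.ElementTree as ET

# WordprocessingML namespace, used by all <w:*> elements.
W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def merge_runs(paragraph):
    """Collapse a paragraph's <w:r> runs so {{tags}} land in one <w:t>.

    Simplified sketch: it merges every run into the first one, losing
    per-run formatting that a production engine would keep.
    """
    runs = paragraph.findall(f"{W}r")
    if len(runs) < 2:
        return
    # Concatenate the text of all runs in document order.
    text = "".join(t.text or "" for r in runs for t in r.findall(f"{W}t"))
    if "{{" not in text:
        return  # nothing tag-like; leave the runs alone
    first = runs[0]
    for t in first.findall(f"{W}t"):
        first.remove(t)
    ET.SubElement(first, f"{W}t").text = text
    for r in runs[1:]:
        paragraph.remove(r)
```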

Step 3 — Loop Processing (Table Row Cloning)

For repeating data (tables with multiple rows), the template uses loop tags:
{{#LoopName}}
  {{.fieldA}}  |  {{.fieldB}}  |  {{.fieldC}}
{{/LoopName}}
The engine:
  1. Identifies the <w:tr> (table row) elements that contain the loop open and close tags.
  2. Treats the enclosed row(s) as a template.
  3. Clones the template row once per item in the data array.
  4. Replaces {{.fieldName}} with each item’s field values.
  5. Renders boolean fields as checkbox symbols: true → ☒, false → ☐.
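The cloning logic above can be sketched in a few lines. The function name and the plain-string row template are illustrative; the real engine clones <w:tr> XML elements rather than strings:

```python
import re

def render_rows(template_row, items):
    """Clone a loop's template row once per data item (sketch of step 3).

    Booleans render as checkbox glyphs (true -> U+2612, false -> U+2610);
    everything else is stringified. Illustrative only.
    """
    out = []
    for item in items:
        def sub(m, item=item):
            value = item.get(m.group(1), "")
            if isinstance(value, bool):
                return "\u2612" if value else "\u2610"
            return str(value)
        # {{.fieldName}} placeholders are replaced per item.
        out.append(re.sub(r"\{\{\.(\w+)\}\}", sub, template_row))
    return "".join(out)
```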
Supported loop data sources:
| Loop Name | Data Source | Description |
| --- | --- | --- |
| AssociateQSAs | Section 1.1 | Associate QSA names and mentor assignments |
| OtherAssessors | Section 1.1 | Additional assessors with certificate numbers |
| ISAs | Section 1.1 | Internal Security Assessors |
| ValidatedProducts | Section 3.3 | PCI SSC validated product listing references |
| DataFlows | Section 4.2 | Account data flow descriptions |
| SADStorage | Section 4.3 | Sensitive authentication data storage locations |
| TPSPs | Section 4.4 | Third-party service providers |
| InScopeNetworks | Section 4.5 | In-scope network segments (account data) |
| NonADNetworks | Section 4.5 | In-scope network segments (non-account data) |
| Locations | Section 4.6 | Assessment locations and facilities |
| Components | Section 4.7 | In-scope system component types |
| ExtScans | Section 5.1 | External vulnerability scan results |
| IntScans | Section 5.3 | Internal vulnerability scan results |
| SampleSets | Section 6.3 | Sampling sets and methodology |
| DocEvidence | Section 6.4 | Documentation evidence table |
| InterviewEvidence | Section 6.5 | Interview evidence records |
| OtherEvidence | Section 6.6 | Other evidence (observation, configuration review) |

Step 4 — Diagram Image Insertion

For Sections 4.1 (Network Diagrams) and 4.2 (Account Data Flow Diagrams), the engine:
  1. Queries uploaded image files tagged to these sections.
  2. Embeds the images directly into the Word document at the {{diagrams-4.1}} and {{diagrams-4.2}} tag locations.
  3. Creates the necessary Word XML relationships for inline image rendering.
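Step 3 above can be illustrated with the relationship entry the engine must add to word/_rels/document.xml.rels. This is a sketch of one sub-step only: the real engine also emits the <w:drawing> markup that references the new relationship ID via r:embed:

```python
# The official relationship type URI for images in Office Open XML.
IMAGE_REL_TYPE = ("http://schemas.openxmlformats.org/officeDocument/2006/"
                  "relationships/image")

def add_image_relationship(rels_xml, rel_id, target):
    """Append one image relationship entry to a .rels part (sketch).

    String splicing is used for brevity; a production engine would edit
    the relationships XML with a real parser.
    """
    entry = (f'<Relationship Id="{rel_id}" Type="{IMAGE_REL_TYPE}" '
             f'Target="{target}"/>')
    return rels_xml.replace("</Relationships>", entry + "</Relationships>")
```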

Step 5 — Tag Replacement

All remaining {{Tag-Name}} placeholders are replaced with assessment data. The engine maintains a comprehensive tag map covering every section of the ROC:
| Tag | Source |
| --- | --- |
| {{Client-Name}} | Assessed entity company name |
| {{Client-DBA}} | Doing Business As |
| {{Client-Address}} | Company address |
| {{Client-URL}} | Company website |
| {{Client-Contact-Name}} | Primary contact name |
| {{Client-Contact-Phone}} | Primary contact phone |
| {{Client-Contact-Email}} | Primary contact email |
| {{QSAC-Name}} | QSA company name |
| {{QSAC-Address}} | QSA company address |
| {{QSAC-URL}} | QSA company website |
| {{QSA-Name}} | Lead assessor name |
| {{QSA-Phone}} | Lead assessor phone |
| {{QSA-Email}} | Lead assessor email |
| {{QSA-Credentials}} | Lead assessor certificate number |
| {{QA-Name}} | QA reviewer name |
| {{QA-Phone}} | QA reviewer phone |
| {{QA-Email}} | QA reviewer email |
| {{QA-Credentials}} | QA reviewer credentials |
| {{Report-Date}} | Date of report |
| {{Date-of-Kick-off}} | Assessment start date |
| {{Assessment-End-Date}} | Assessment end date |
| {{Onsite-Dates}} | Onsite assessment dates |

| Tag | Source |
| --- | --- |
| {{Biz-Desc}} | Nature of business |
| {{How}} | How cardholder data is stored/processed/transmitted |
| {{Why}} | How services impact security |
| {{Other}} | Other relevant details |

| Tag | Source |
| --- | --- |
| {{ScopeVal-*}} | Scope validation fields (results, assessor, methods, documentation) |
| {{Segmentation-*}} | Segmentation details (used, implementation, out-of-scope environments) |
| {{Validated-Products-*}} | PCI SSC validated product usage attestation |

| Tag | Source |
| --- | --- |
| {{Is-Initial-External}} | Whether this is the initial external scan |
| {{Ext-Scan-Doc}} | External scan documentation |
| {{Ext-Scan-Comments}} | External scan comments |
| {{ASV-Attestation}} | ASV attestation completion status |
| {{Is-Initial-Internal}} | Whether this is the initial internal scan |
| {{Int-Scan-Doc}} | Internal scan documentation |
| {{Int-Scan-Comments}} | Internal scan comments |

| Tag | Source |
| --- | --- |
| {{Ev-Repos-Desc}} | Evidence repository description |
| {{Ev-Controller}} | Evidence repository controller |
| {{Ev-Retention-Ack}} | Evidence retention acknowledgment |
| {{Ev-Assessor-Name}} | Evidence assessor name |
| {{Sampling-*}} | Sampling methodology fields (used, rationale, representative, standardized) |
Per-requirement findings are populated dynamically:
  • {{Findings-X.Y.Z-N}} — the justification text for requirement X.Y.Z, finding index N.
  • {{Response-X.Y.Z.a-N}} — assessor response for testing procedure letter a, row index N.
  • {{Response-X.Y.Z}} — all assessor responses for requirement X.Y.Z, concatenated.
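The dynamic tag shapes can be expressed as regular expressions. These exact patterns are illustrative, not the engine's actual code:

```python
import re

# Hypothetical patterns matching the dynamic tag shapes described above.
# {{Findings-X.Y.Z-N}}: requirement number, then a finding index.
FINDINGS_TAG = re.compile(r"\{\{Findings-(\d+(?:\.\d+)*)-(\d+)\}\}")

# {{Response-X.Y.Z}}, {{Response-X.Y.Z.a-N}}: the testing-procedure
# letter and row index are both optional.
RESPONSE_TAG = re.compile(
    r"\{\{Response-(\d+(?:\.\d+)*)(?:\.([a-z]))?(?:-(\d+))?\}\}")
```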
Summary counts per requirement group are populated as:
  • {{rs-OK-N}} — count of “In Place” findings for group N
  • {{rs-NA-N}} — count of “Not Applicable” findings
  • {{rs-NT-N}} — count of “Not Tested” findings
  • {{rs-KO-N}} — count of “Not in Place” findings
  • {{rs-CC-N}} — count of compensating controls
  • {{rs-CA-N}} — count of customized approach findings
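The status counting above is straightforward to sketch. The CC and CA counts, which come from per-finding flags rather than statuses, are omitted here, and the data shapes are assumptions:

```python
from collections import Counter

# Assumed mapping from finding status to the rs-* tag code.
STATUS_CODE = {"In Place": "OK", "Not Applicable": "NA",
               "Not Tested": "NT", "Not in Place": "KO"}

def summary_tags(findings, group):
    """Compute the {{rs-XX-N}} status counts for one requirement group.

    Sketch: `findings` is assumed to be a list of dicts with "group" and
    "status" keys; compensating-control and customized-approach counts
    are not covered.
    """
    counts = Counter(STATUS_CODE[f["status"]]
                     for f in findings if f["group"] == group)
    return {f"rs-{code}-{group}": str(counts.get(code, 0))
            for code in ("OK", "NA", "NT", "KO")}
```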
Auto-computed lists of requirements by status, with manual override from Section 1.8 notes:
  • {{list-NA}} — requirements marked Not Applicable
  • {{list-NT}} — requirements marked Not Tested
  • {{list-KO-Legal}} — requirements Not in Place with legal exception
  • {{list-KO-NotLegal}} — requirements Not in Place without legal exception
  • {{list-CC}} — requirements using compensating controls
  • {{list-CA}} — requirements using customized approach

Step 6 — Checkbox Resolution

The template uses checkbox tags for selection fields (radio buttons in the assessment UI):
{{check-onsite}}      → ☒ (if selected) or ☐ (if not)
{{check-combination}} → ☒ or ☐
{{check-remote}}      → ☒ or ☐
Checkbox tags cover: remote testing method, QSA consultation, subcontractor usage, assessment completion type, and overall compliance result.
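Resolution for one selection group can be sketched as follows. The function is illustrative; the real engine resolves each group from the stored radio answer:

```python
import re

def resolve_checkboxes(xml_text, selected):
    """Render each {{check-*}} tag in a selection group (sketch).

    The option whose name matches `selected` becomes a checked box
    (U+2612); every other option becomes an empty box (U+2610).
    """
    def sub(m):
        return "\u2612" if m.group(1) == selected else "\u2610"
    return re.sub(r"\{\{check-([\w-]+)\}\}", sub, xml_text)
```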

Step 7 — Output Generation

The processed document is compressed using DEFLATE and returned as a buffer. The file is named using the pattern {client-slug}-roc-{date}.docx and streamed to the user’s browser for download.
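The repacking step can be sketched with Python's standard zipfile module. The `parts` mapping (archive path to processed content) is an assumed shape:

```python
import io
import zipfile

def repack_docx(parts):
    """Write processed parts back into a DOCX buffer (sketch of step 7).

    A DOCX is a ZIP archive; ZIP_DEFLATED applies the DEFLATE
    compression mentioned above. `parts` maps archive paths (e.g.
    "word/document.xml") to their content.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in parts.items():
            zf.writestr(name, data)
    return buf.getvalue()
```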

Custom Templates

Organizations can use custom Word templates by placing .docx files in the templates/ directory. Custom templates must use the same {{Tag-Name}} convention. If a requested template is not found, the engine falls back to the default report.docx.
Custom templates must be based on the PCI SSC’s official ROC template structure. Tags that do not match the expected naming convention will be left as-is in the output document.
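The fallback behavior can be sketched as a small helper (illustrative function and signature):

```python
from pathlib import Path

def resolve_template(templates_dir, name="report.docx"):
    """Pick the requested template, falling back to report.docx (sketch).

    Mirrors the documented behavior: a missing custom template falls
    back to the default shipped in the templates/ directory.
    """
    candidate = Path(templates_dir) / name
    default = Path(templates_dir) / "report.docx"
    return candidate if candidate.exists() else default
```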

Troubleshooting

| Issue | Cause | Resolution |
| --- | --- | --- |
| Tags appear as {{Tag-Name}} in output | Tag name does not match the expected convention, or the field was not filled in | Verify the tag name matches the tag map; ensure the corresponding field is populated in the assessment |
| Table rows are empty | Loop data source returned no items | Populate the corresponding assessment section (e.g., fill in the TPSPs table for the {{#TPSPs}} loop) |
| Diagrams missing | No image files tagged to Section 4.1 or 4.2 | Upload network diagram or data flow diagram images and tag them to the correct section |
| Checkboxes show ☐ for all options | Selection field not answered | Select the appropriate radio option in the assessment form |
| Split or garbled text | Word XML run splitting not resolved | This is handled automatically by the run-merging engine; report the issue if it persists |