Chapter 10: Document Control Audit Checklist: 25 Questions Every Internal Auditor Should Ask

An effective internal audit of document control covers 25 questions, five in each of five domains: identification, revision control, distribution and accessibility, external document management, and records and retention. Internal auditors who use this 25-question checklist before a Stage 2 audit catch an average of 4-7 nonconformities that would otherwise surface during the certification audit. PinnacleQMS clients run this audit 30 to 60 days before every Stage 2 — first-attempt pass rate stays at 98% across 250+ certifications.
The checklist below is built for a single internal audit pass that covers the document control requirements of ISO 9001 clause 7.5, ISO 13485 clause 4.2.4, IATF 16949 clause 7.5.3, AS9100 clause 7.5.3, FSSC 22000 clause 7.5, and ISO 17025 clause 8.3 — one audit, every standard. Findings are scored, ranked, and packaged for management review before the certification body arrives.
Domain 1 — Document identification (Questions 1-5)
Question 1: Does every controlled document carry a unique identifier, title, and revision number on every page? What to ask: Pull 15 documents at random from three different processes. What evidence to collect: photograph or screenshot the header and footer of each. What a finding looks like: a procedure missing revision number on page 4 of 7, or two procedures sharing the same identifier — major finding under clause 7.5.2 of any ISO management system standard.
Question 2: Is the document identification scheme documented and consistently applied? What to ask: produce the numbering convention (e.g., QP-PUR-001 = Quality Procedure, Purchasing, Sequence 001). What evidence to collect: the master document register and three sample documents. What a finding looks like: a register that allows free-form numbering — minor finding, but a leading indicator of bigger issues.
Question 3: Are document types classified (procedure, work instruction, form, record, external document) and is each type subject to a defined control level? What to ask: produce the classification matrix. What evidence to collect: matrix plus 5 examples of each type. What a finding looks like: forms that are managed casually as "templates" rather than controlled documents — minor finding that turns major when it touches a medical device or aerospace process.
Question 4: Does each document show its owner, approver, effective date, and next review date? What to ask: pull the metadata page or header from 10 documents. What evidence to collect: screenshots showing owner, approver (different person), effective date, review date. What a finding looks like: same person as owner and approver — automatic major in ISO 13485 contexts.
Question 5: Is there a master list of controlled documents that matches what is in use on the floor? What to ask: print the master list, then walk three workstations and verify document numbers and revisions against it. What evidence to collect: master list timestamp + photos of in-use documents. What a finding looks like: a workstation using QP-OPS-014 Rev 3 while the master list shows Rev 5 active — major finding, immediate corrective action required.
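Where the master list is available as a spreadsheet export, the reconciliation behind Question 5 can be sketched as a simple revision comparison. The dictionaries and field layout below are hypothetical, not a feature of any particular QMS platform:

```python
# Hypothetical reconciliation of the master list against documents observed in use.
# Keys are document identifiers; values are revision numbers.
master_list = {"QP-OPS-014": 5, "QP-PUR-001": 3, "WI-ASM-007": 2}
floor_observed = {"QP-OPS-014": 3, "QP-PUR-001": 3, "WI-ASM-007": 2}

# Any document whose in-use revision differs from the master list is a finding.
mismatches = {
    doc: {"floor": floor_rev, "master": master_list.get(doc)}
    for doc, floor_rev in floor_observed.items()
    if master_list.get(doc) != floor_rev
}
# QP-OPS-014 at Rev 3 on the floor while the master list shows Rev 5 active
# is exactly the major finding described in Question 5.
```

The same comparison also catches documents in use that appear on no master list at all, since `master_list.get(doc)` returns `None` for them.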
Domain 2 — Revision control (Questions 6-10)
Question 6: Does a documented procedure govern how revisions are proposed, reviewed, approved, and released? What to ask: produce the change-control procedure and walk a recent revision through it. What evidence to collect: change request, impact assessment, approval signatures, release notification. What a finding looks like: a revision released with no impact assessment — major finding under clause 7.5.3.2.
Question 7: Is there a revision history embedded in or linked to every controlled document? What to ask: open the revision history on 10 documents. What evidence to collect: screenshots showing version, date, change description, approver. What a finding looks like: revision history that shows "Rev 4" but no description of what changed — minor in low-risk processes, major in safety-critical ones.
Question 8: Are obsolete revisions removed from points of use within a defined timeframe? What to ask: how long does the organization allow between release and removal of obsolete copies? What evidence to collect: the procedure-defined timeframe (typically 24-48 hours) plus three examples of removal logs. What a finding looks like: a laminated obsolete revision still hanging next to a CNC machine — major finding, full stop.
Question 9: When a revision is released, are affected personnel re-trained or re-acknowledged before the new revision becomes effective? What to ask: pull three recent revisions and verify training records. What evidence to collect: training matrix, sign-off records, effective date alignment. What a finding looks like: revision effective March 1, training completed March 15 — major finding because untrained operators were running the new procedure for two weeks.
Question 10: Is there a documented method for emergency or expedited revisions (e.g., recall response, customer corrective action) that still preserves approval integrity? What to ask: produce the expedited revision path and one example of its use. What evidence to collect: the expedited record, including post-release review. What a finding looks like: no defined path, leading to ad-hoc revisions — minor finding under clause 7.5.3, but a high-risk gap. Auditors should reference Chapter 4 of this series for a deeper revision-control walkthrough.
Domain 3 — Distribution and accessibility (Questions 11-15)
Question 11: Are controlled documents available at every point of use without requiring a search? What to ask: walk three operators and ask each to produce the procedure governing their current task within 60 seconds. What evidence to collect: time-stamped observations. What a finding looks like: an operator who cannot produce the procedure — major finding under clause 7.5.3.1.
Question 12: Is access to documents role-based, with read, edit, and approve permissions controlled by the document management system? What to ask: produce the permission matrix and demonstrate a denied-access attempt. What evidence to collect: screenshot of permission grid plus a denial event. What a finding looks like: shared drives with no permission control — major finding in ISO 27001 contexts and a minor finding elsewhere that frequently escalates.
Question 13: Is there a backup and disaster-recovery method for the document repository, tested at defined intervals? What to ask: produce the last DR test record. What evidence to collect: test plan, test execution date, RTO/RPO measurements. What a finding looks like: never tested, or tested more than 12 months ago — major finding under clause 7.1.3 supporting infrastructure.
Question 14: Are documents available in languages and formats appropriate to the user population? What to ask: identify the operator language profile and verify documents match. What evidence to collect: HR language data plus document language inventory. What a finding looks like: an English-only work instruction at a station with French-speaking operators — major finding under clause 7.2 competence.
Question 15: Can the system demonstrate, on demand, who accessed which document on which date? What to ask: pull the access log for one safety-critical procedure for the last 30 days. What evidence to collect: log export with user, document, timestamp. What a finding looks like: no access log capability — minor finding under most standards, major under ISO 13485 and FDA-aligned contexts.
Domain 4 — External document management (Questions 16-20)
Question 16: Is there a register of external documents (standards, customer specifications, regulations, supplier drawings) used to plan and operate the QMS? What to ask: produce the external document register. What evidence to collect: register listing source, version, owner, review frequency. What a finding looks like: no register, or one limited to ISO standards while customer specs sit in email — major finding under clause 7.5.3.2(b).
Question 17: Are external documents identified and their distribution controlled? What to ask: trace a customer drawing from intake to point of use. What evidence to collect: receiving log, controlled-copy stamp or watermark, distribution record. What a finding looks like: a customer drawing photocopied and circulated without revision tracking — major finding, especially in aerospace and automotive contexts.
Question 18: Is there a defined frequency for checking external document currency (e.g., new ISO amendments, regulatory updates)? What to ask: produce the monitoring schedule and one example of an action taken from a detected change. What evidence to collect: schedule, monitoring log, change action. What a finding looks like: monitoring schedule that lists "annually" but the standard was amended 18 months ago and the QMS still references the old version — major finding referencing IAF guidance on transition periods.
Question 19: Are obsolete external documents either destroyed or marked clearly when retained for legal or knowledge purposes? What to ask: produce three obsolete external documents and their retention status. What evidence to collect: marked copies, retention rationale, location. What a finding looks like: an obsolete safety data sheet sitting in the operator binder unmarked — major finding, particularly under ISO 45001 and FSSC 22000.
Question 20: Are accreditation-relevant external documents (e.g., calibration certificates, test method standards) tracked against their issuing body? What to ask: pull three calibration certificates and verify accreditation status of the issuing lab. What evidence to collect: certificates plus current ANAB or equivalent accreditation listing. What a finding looks like: a calibration cert from a lab whose accreditation lapsed three months ago — major finding under ISO 17025 and a critical finding when it touches product release decisions.
Domain 5 — Records and retention (Questions 21-25)
Question 21: Is there a records retention schedule that names every record type, its retention period, its storage location, and its disposal method? What to ask: produce the schedule. What evidence to collect: schedule with at least 30 record types covered. What a finding looks like: a one-page schedule covering only "quality records" generically — minor finding that escalates to major when statutory retention is missed.
Question 22: Are records protected from unauthorized alteration after creation? What to ask: attempt to edit a closed record (training record, calibration record, internal audit report). What evidence to collect: system response — denied, or version-locked. What a finding looks like: editable closed records without audit trail — major finding under the clause 7.5.3.1(b) protection-of-integrity requirement.
Question 23: Are records retrievable within a defined timeframe? What to ask: request five records spanning the last three years and time the retrieval. What evidence to collect: timestamps, retrieval method. What a finding looks like: records that take more than 24 hours to locate, or worse, cannot be located — major finding.
Question 24: Are records covering accreditation-critical activities (calibration, internal audit, management review, training, CAPA, supplier evaluation) all available, complete, and signed? What to ask: pull one record from each of those six categories. What evidence to collect: complete records with all required signatures and dates. What a finding looks like: a management review record missing the CEO signature — major finding under clause 9.3.
Question 25: At end of retention, are records disposed of in accordance with the retention schedule, with disposal evidence preserved? What to ask: produce the last disposal log. What evidence to collect: log showing record types, dates, method, witness. What a finding looks like: no disposal log, leading to records held indefinitely "just in case" — minor finding, but a data-protection and storage-cost red flag.
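Where the retention schedule is machine-readable, the end-of-retention check behind Question 25 can be sketched as below. The record types and retention periods are illustrative only, not recommendations; statutory periods vary by jurisdiction and industry:

```python
# Illustrative end-of-retention check. Retention periods here are made up;
# use your own schedule's statutory and contractual periods.
from datetime import date, timedelta

retention_schedule_days = {
    "calibration record": 7 * 365,   # hypothetical: 7 years
    "training record": 3 * 365,      # hypothetical: 3 years
}

def past_retention(record_type: str, created: date, today: date) -> bool:
    """True when a record has exceeded its scheduled retention period
    and is due for disposal per the schedule."""
    limit = timedelta(days=retention_schedule_days[record_type])
    return today - created > limit
```

A periodic sweep with a check like this is what prevents the "held indefinitely, just in case" pattern the question flags, while the disposal log itself remains a separate, preserved record.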
How to score the audit and prioritize findings
Each of the 25 questions is scored on a four-level rubric. Compliant means objective evidence is present, current, and consistent with the documented procedure. Minor means a single isolated lapse with no systemic impact — typically corrected within 30 days. Major means a systemic gap, a missing required control, or a finding that affects multiple processes — corrective action within 60 days plus root-cause analysis. Critical means an immediate risk to product, customer, regulatory compliance, or accreditation status — containment within 24 hours, full corrective action within 30 days.
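As an illustration, the four-level rubric above can be encoded as a lookup table mapping each severity level to its response deadlines. The structure and field names are hypothetical, chosen only to mirror the timelines stated in the paragraph:

```python
# Hypothetical encoding of the four-level severity rubric described above.
# Deadlines mirror the text: minor = 30 days, major = 60 days + root cause,
# critical = 24-hour containment + 30-day corrective action.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class SeverityRule:
    description: str
    containment_hours: Optional[int]       # immediate containment deadline, if any
    corrective_action_days: Optional[int]  # deadline for full corrective action
    root_cause_required: bool

RUBRIC = {
    "compliant": SeverityRule("Evidence present, current, consistent", None, None, False),
    "minor": SeverityRule("Isolated lapse, no systemic impact", None, 30, False),
    "major": SeverityRule("Systemic gap or missing required control", None, 60, True),
    "critical": SeverityRule("Immediate risk to product or accreditation", 24, 30, True),
}

def deadline_days(severity: str) -> Optional[int]:
    """Corrective-action deadline in days for a severity level."""
    return RUBRIC[severity].corrective_action_days
```

Encoding the rubric once, rather than restating it per finding, keeps the deadlines consistent across the audit report and the CAPA tracker.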
Findings are then ranked by audit risk. Three filters apply: does the finding touch a clause the certification body has emphasized in past audits, does it affect a process tied to customer-specific requirements, and does it appear in more than one of the five domains. A finding that satisfies all three filters jumps to the top of the prioritization list regardless of severity score, because it signals a systemic issue that will draw certification-body attention. Findings touching only one domain and one process can be handled in standard CAPA workflows.
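A minimal sketch of this three-filter ranking follows; the `Finding` fields are hypothetical stand-ins for however an audit team records each filter:

```python
# Sketch of the three-filter prioritization described above. Field names are
# illustrative; adapt them to your own audit records.
from dataclasses import dataclass

@dataclass
class Finding:
    question: int
    severity: str               # "minor", "major", or "critical"
    cb_emphasized_clause: bool  # filter 1: clause the certification body has emphasized
    customer_specific: bool     # filter 2: process tied to customer-specific requirements
    domains_affected: int       # filter 3: how many of the five domains it touches

SEVERITY_RANK = {"critical": 3, "major": 2, "minor": 1}

def priority_key(f: Finding) -> tuple:
    """Findings satisfying all three filters sort first, regardless of
    severity score; ties then fall back to severity."""
    all_filters = (f.cb_emphasized_clause and f.customer_specific
                   and f.domains_affected > 1)
    return (not all_filters, -SEVERITY_RANK[f.severity])

findings = [
    Finding(5, "major", False, False, 1),
    Finding(16, "minor", True, True, 2),    # passes all three filters
    Finding(8, "critical", False, True, 1),
]
ranked = sorted(findings, key=priority_key)
```

In this sample, the Question 16 finding sorts to the top even though it is only a minor, exactly the behavior the paragraph describes.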
Sample audit-day timing
A realistic internal audit of document control at a mid-sized plant runs five hours. The opening meeting takes 15 minutes — auditor, audit lead, document control owner, and process owners present. Domain 1 (identification) takes 45 minutes including floor walks. Domain 2 (revision control) takes one hour because it requires tracing recent revisions end-to-end. A 15-minute break separates morning and afternoon blocks. Domain 3 (distribution and accessibility) takes 45 minutes split between system review and floor verification. Domain 4 (external documents) takes 45 minutes — the document register review is fast, but tracing external documents to point of use takes time. Domain 5 (records and retention) takes 45 minutes including retrieval timing tests. The closing meeting takes 30 minutes — findings reviewed, severity confirmed, corrective action owners assigned, follow-up audit scheduled. Total elapsed time: five hours including the break, with the five domain blocks themselves accounting for four hours of active audit work.
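The timing above can be sanity-checked with a quick sum of the stated durations, in minutes:

```python
# Durations in minutes, exactly as stated in the sample timing above.
schedule = {
    "opening meeting": 15,
    "domain 1 (identification)": 45,
    "domain 2 (revision control)": 60,
    "break": 15,
    "domain 3 (distribution and accessibility)": 45,
    "domain 4 (external documents)": 45,
    "domain 5 (records and retention)": 45,
    "closing meeting": 30,
}

elapsed = sum(schedule.values())  # total elapsed time, breaks and meetings included
active = elapsed - schedule["break"] - schedule["opening meeting"] \
    - schedule["closing meeting"]  # the five domain blocks only
print(f"elapsed: {elapsed / 60:.1f} h, active audit work: {active / 60:.1f} h")
```

The sum confirms the paragraph: 300 minutes elapsed (five hours) and 240 minutes (four hours) of active domain-by-domain audit work.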
This 25-question checklist is the same one PinnacleQMS clients run 30 to 60 days before every Stage 2 audit. Across 250+ certifications spanning ISO 9001, 14001, 45001, 13485, 22000, IATF 16949, AS9100, FSSC 22000, 17025, and 22301, the first-attempt pass rate sits at 98%. The platform automates 22 of the 25 questions through built-in document control, version locking, distribution tracking, and retention scheduling — the remaining three (floor walks, language verification, retrieval timing) are human-verified during the internal audit. Organizations preparing for certification or recertification can engage accredited auditors and platform specialists through the contact page to scope an internal audit, configure the document control hub, and schedule the Stage 2 with confidence. A complete document control system, audited against this checklist, removes the single largest source of certification-audit findings — and frees the organization to focus on the work the standard was actually written to enable.


