15 EPD Mistakes That Delay Verification, Trigger Corrections, and Weaken Submission Quality
Environmental Product Declarations (EPDs) do not usually face verification problems because the framework itself is unclear. In most cases, the problem is execution. The recurring failures seen in EPD development come from weak modelling logic, incomplete documentation, inconsistent interpretation of PCR requirements, and poor data governance.
Below is a structured list of 15 critical mistakes that frequently undermine EPD submissions. Each issue is explained with practical guidance on how to avoid it before the verifier identifies it for you.
1. Environmental impact result tables contain errors or omissions
One of the most damaging mistakes in EPD development is submitting environmental impact tables with missing indicators, incorrect values, formatting inconsistencies, or results that do not match the final LCA model. This immediately raises concern because the published declaration is expected to reflect the verified model exactly.
How to avoid it: Reconcile every reported indicator against the final model export, verify required indicators against the applicable PCR and EN 15804 requirements, and perform a final table-by-table technical check before submission.
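The reconciliation step above can be sketched as a small script. This is a minimal illustration, not a reference implementation: the indicator names, values, and tolerance are placeholders, and the actual required indicator set comes from the applicable PCR and EN 15804.

```python
# Minimal reconciliation sketch: compare the indicators in the draft EPD
# table against the final LCA model export. All names and values below
# are illustrative placeholders.

REQUIRED_INDICATORS = {"GWP-total", "ODP", "AP", "EP-freshwater"}  # per PCR / EN 15804

model_export = {"GWP-total": 12.40, "ODP": 3.1e-7, "AP": 0.052, "EP-freshwater": 0.004}
epd_table    = {"GWP-total": 12.40, "ODP": 3.1e-7, "AP": 0.049}   # AP drifted, EP missing

def reconcile(model, table, required, rel_tol=1e-3):
    """Return a list of discrepancies between model export and EPD table."""
    issues = []
    for ind in sorted(required):
        if ind not in table:
            issues.append(f"{ind}: missing from EPD table")
        elif ind not in model:
            issues.append(f"{ind}: missing from model export")
        else:
            m, t = model[ind], table[ind]
            if abs(m - t) > rel_tol * max(abs(m), abs(t)):
                issues.append(f"{ind}: table value {t} != model value {m}")
    return issues

for issue in reconcile(model_export, epd_table, REQUIRED_INDICATORS):
    print(issue)
```

A check like this, run against every results table before submission, catches the "one value was updated in the model but not in the layout" class of error mechanically rather than by eye.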
2. PCR or c-PCR requirements are interpreted incorrectly
Many EPD developers move too quickly into modelling without first translating the PCR into operational modelling rules. As a result, they apply the wrong scope, the wrong declared unit, the wrong impact reporting structure, or the wrong scenario requirements.
How to avoid it: Build a PCR compliance checklist before any modelling begins. Review all clauses related to scope, modules, data quality, scenarios, and reporting format. Treat the PCR as a technical instruction document, not as background reading.
3. Background report documentation is incomplete or ambiguous
Even when the model is broadly acceptable, weak background documentation can create major verification friction. If assumptions, exclusions, data sources, conversion rules, and methodological choices are not clearly documented, the verifier cannot trace how results were produced.
How to avoid it: Ensure the background report explains the modelling logic in a transparent and traceable manner. Every major assumption should be explicit, justified, and linked to either primary data, secondary sources, or PCR rules.
4. System boundaries and life cycle modules are defined incorrectly
A frequent failure point is inaccurate treatment of A1–A3, A4–A5, B stages, C1–C4, or module D. Some submissions exclude required stages, include non-permitted stages, or describe the scope differently in different parts of the report.
How to avoid it: Map the system boundary visually and textually before modelling. Confirm the required lifecycle modules from the PCR, then align the model, background report, and EPD document so they all describe the same boundary consistently.
5. LCIA calculations contain mistakes or required impact categories are missing
Using the wrong characterisation method, omitting mandatory categories, or mixing incompatible indicator sets creates a serious compliance problem. These errors often arise when old templates are reused or when software defaults are accepted without verification.
How to avoid it: Lock the impact assessment method at project start, verify that the method matches the PCR and program operator requirements, and confirm all mandatory indicators are present in the final output set.
6. End-of-life scenarios are oversimplified or unrealistic
C1–C4 modelling often becomes a weak point because developers use overly generic disposal assumptions, unrealistic landfill or recycling rates, or poorly justified waste treatment routes. This weakens both technical credibility and compliance.
How to avoid it: Use regionally relevant waste treatment assumptions where possible, justify all end-of-life pathways, and document how scenario choices reflect actual or representative practice for the product category.
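A basic internal-consistency check on an end-of-life scenario can be sketched as follows. The product mass and route shares are illustrative assumptions, not representative values for any product category; the point is that the declared shares must sum to 100% and each route's mass must be traceable.

```python
# Sketch: split end-of-life mass across treatment routes and check the
# scenario is internally consistent. Mass and shares are illustrative
# assumptions only.

product_mass_kg = 100.0
eol_shares = {"recycling": 0.55, "incineration": 0.25, "landfill": 0.20}

total_share = sum(eol_shares.values())
assert abs(total_share - 1.0) < 1e-9, f"End-of-life shares sum to {total_share}, not 1.0"

route_masses_kg = {route: product_mass_kg * share for route, share in eol_shares.items()}
for route, mass in route_masses_kg.items():
    print(f"{route}: {mass:.1f} kg")
```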
7. Data sources, database versions, and data quality are not reported properly
Verifiers frequently flag unclear data lineage. Developers may fail to distinguish primary and secondary data, omit database version information, or provide no meaningful data quality discussion. This makes it difficult to assess reliability and representativeness.
How to avoid it: Declare all databases and versions used, identify which data are primary and which are secondary, and include a structured data quality assessment covering representativeness, completeness, consistency, and temporal relevance.
8. Electricity mix selection and energy modelling are incorrect
Electricity is often a high-influence input. Using the wrong grid mix, applying non-representative datasets, or mixing electricity assumptions inconsistently across processes can materially distort results and trigger verifier concern.
How to avoid it: Use location-specific electricity data aligned with the manufacturing context and PCR requirements. Apply the same logic consistently across the model and explain clearly whether the energy source is measured, contracted, or generic.
9. Life cycle scenarios and technical assumptions are poorly justified
Assumptions related to service life, maintenance, replacement, installation losses, or use-stage behaviour are often inserted too casually. When these assumptions are weak, unverifiable, or disconnected from product reality, the entire submission becomes vulnerable.
How to avoid it: Base technical assumptions on product documentation, engineering input, industry references, or client records. Avoid placeholder assumptions and ensure every scenario used in the model can be defended during verification.
10. Transport distances, modes, and fuel assumptions are modelled poorly
Transport modules such as A2 and A4 are often underestimated or generalized. Developers may use arbitrary distances, unrealistic logistics chains, or inappropriate vehicle datasets, which undermines result quality and makes the model less credible.
How to avoid it: Use actual logistics data wherever possible. If estimates are necessary, document the basis for them clearly, including transport mode, load assumptions, backhaul treatment, and fuel or emission factors.
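Transport burdens in modules such as A2 and A4 are commonly computed on a tonne-kilometre basis. The sketch below shows the arithmetic only; the payload, distance, and emission factor are placeholders, and a real model should take the factor from a dataset matching the actual vehicle class and load assumptions.

```python
# Sketch of a tonne-kilometre transport calculation (e.g. module A4).
# All three inputs are illustrative placeholders.

payload_t = 0.5          # tonnes shipped per declared-unit batch
distance_km = 300.0      # documented route distance
ef_kgco2e_per_tkm = 0.1  # placeholder factor for the chosen vehicle dataset

tkm = payload_t * distance_km                 # tonne-kilometres
emissions_kgco2e = tkm * ef_kgco2e_per_tkm    # transport GWP contribution
print(f"{tkm} tkm, {emissions_kgco2e} kg CO2e")
```

Documenting the three inputs separately, as above, makes the estimate auditable: a verifier can challenge the distance or the dataset without the whole number being a black box.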
11. Functional unit or declared unit is inconsistent across the submission
This mistake creates confusion quickly. The unit used in the model, the unit described in the report, and the unit displayed in the EPD table do not always match. Even small inconsistencies here can invalidate interpretation and comparability.
How to avoid it: Define the declared unit or functional unit at the beginning of the project and lock it across the model, report text, calculation sheets, and final EPD layout. Review all unit references before submission.
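The unit-locking step can be enforced with a trivial pre-submission check: collect the unit string as it appears in each deliverable and flag any deviation. The file labels below are hypothetical.

```python
# Sketch: verify the declared unit string is identical everywhere it
# appears. Source labels and unit strings are hypothetical.

declared_unit = "1 kg of product"

unit_references = {
    "LCA model":         "1 kg of product",
    "background report": "1 kg of product",
    "EPD results table": "1 m2 of product",  # inconsistent on purpose
}

mismatches = {src: u for src, u in unit_references.items() if u != declared_unit}
for src, unit in mismatches.items():
    print(f"{src}: '{unit}' does not match declared unit '{declared_unit}'")
```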
12. Allocation rules are applied incorrectly in multi-output systems
Allocation remains one of the more technically sensitive issues in LCA and EPD work. Problems arise when co-products, scrap flows, recycled inputs, or shared utilities are divided using weak or unjustified allocation methods.
How to avoid it: Follow the hierarchy defined by ISO 14044 and the PCR. Where allocation is unavoidable, justify the chosen basis clearly and show that the rule is applied consistently throughout the model.
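When a physical basis such as mass is the justified choice under the ISO 14044 hierarchy, the allocation itself is simple proportional arithmetic, as this sketch shows. The burden and co-product masses are illustrative; an economic basis would use revenues in place of masses.

```python
# Sketch: allocate a shared process burden across co-products by mass
# (one physical basis in the ISO 14044 hierarchy). Values are
# illustrative only.

shared_burden_kgco2e = 1000.0
coproduct_masses_kg = {"product A": 800.0, "product B": 200.0}

total_mass = sum(coproduct_masses_kg.values())
allocated = {p: shared_burden_kgco2e * m / total_mass
             for p, m in coproduct_masses_kg.items()}
# product A receives 800.0 kg CO2e, product B receives 200.0 kg CO2e
```

Whatever basis is chosen, the verifier will expect the same rule applied to every multi-output process in the model, not a different basis per process.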
13. Cut-off criteria are applied inconsistently or without evidence
Some developers exclude flows too aggressively, while others mention cut-off criteria in the report but fail to apply them systematically in the model. This creates a disconnect between stated methodology and actual execution.
How to avoid it: Define cut-off rules before modelling, apply them consistently, and document which flows were excluded and why. Do not rely on vague statements such as “minor inputs were neglected” without technical justification.
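A documented cut-off rule can be applied mechanically, as in this sketch. The 1% mass threshold and the flow quantities are illustrative; the applicable PCR defines the actual criteria, which may also cover energy and environmental significance.

```python
# Sketch: apply a mass-based cut-off rule and record exactly which
# flows were excluded. Threshold and flow masses are illustrative.

flows_kg = {"resin": 70.0, "filler": 25.0, "pigment": 0.6, "additive": 0.4}
threshold = 0.01  # exclude flows below 1% of total input mass

total = sum(flows_kg.values())
excluded = {f: m for f, m in flows_kg.items() if m / total < threshold}
excluded_share = sum(excluded.values()) / total

print("Excluded flows:", sorted(excluded))
print(f"Cumulative excluded mass: {excluded_share:.2%}")
```

Keeping the excluded-flow list and its cumulative share as an explicit output gives the background report the evidence that "minor inputs were neglected" statements otherwise lack.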
14. The LCA model and the final EPD report do not fully match
This is a common operational failure. During editing, revisions are made to the model or the report, but not both. The result is inconsistency in numbers, assumptions, lifecycle modules, or text descriptions between the technical files and the submission package.
How to avoid it: Conduct a full model-to-report reconciliation as the final quality control step. One responsible reviewer should check that the model, background report, EPD layout, and supporting annexes all reflect the same final version.
15. No independent pre-submission review is carried out
Many EPD developers submit directly to verification without an independent technical review. That means structural errors, documentation gaps, inconsistent assumptions, and reporting weaknesses are first discovered by the verifier instead of being resolved internally beforehand.
How to avoid it: Build a formal pre-submission audit step into the EPD process. An independent technical review before verification can identify methodological weaknesses, improve submission quality, reduce revision cycles, and protect credibility.
Why these mistakes matter
EPD verification is not only a compliance checkpoint. It is also a test of technical discipline, methodological consistency, and documentation maturity. When submissions are weak, organizations lose time, increase internal workload, and risk undermining stakeholder trust.
The strongest EPD submissions are not created by rushing to publication. They are created by applying disciplined modelling logic, rigorous data control, structured documentation, and independent review before the verifier ever opens the file.
Conclusion
If your organization is developing an EPD, the objective should not be to simply reach verification. The objective should be to submit an EPD that is already technically coherent, transparent, and verification-ready. Avoiding these 15 mistakes can significantly reduce correction cycles, strengthen confidence in the declaration, and improve the quality of the final published EPD.