This video explains how to track progress, validate deliverables, automate compliance checking, and capture lessons learned under ISO 19650. The written guide below covers how the MIDP and responsibility matrix support progress tracking, why deliverables should be validated against the Exchange Information Requirements, how automation replaces manual cross-checking across multiple screens, and why structured lessons learned create a feedback loop that improves every future project.
Why tracking and compliance determine whether ISO 19650 delivers real value
Planning documents, information requirements, and execution plans are only as valuable as the team’s ability to track whether deliverables are actually being produced, are complete, and meet the requirements that were agreed at the start. In practice, this is where many projects struggle. Teams end up juggling models on one screen, the BIM Execution Plan on another, the Exchange Information Requirements on a third, and a spreadsheet on a fourth, manually cross-checking whether information has been delivered to the right standard. This approach is frustrating, error-prone, and sometimes simply impossible to sustain as the volume of deliverables grows across milestones. ISO 19650 provides the framework, but without practical tracking and compliance mechanisms, the framework stays theoretical.
The first step toward practical compliance is using the Master Information Delivery Plan (MIDP) and the responsibility matrix as live tracking tools rather than static documents. The MIDP shows which information needs to be delivered and by when, while the responsibility matrix makes it clear who is accountable for each deliverable. Together, they create a roadmap that eliminates confusion about what has been done, what is outstanding, and who owns each piece of work. When these documents are connected to the actual model data rather than maintained in separate spreadsheets, tracking becomes visual, current, and far less reliant on manual effort.
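To make the idea of a live MIDP concrete, here is a minimal sketch in plain Python that treats each MIDP row as a record with a deliverable, an owner from the responsibility matrix, a milestone, and a status, then derives an outstanding-work view per owner. The field names and status values are assumptions for illustration, not Plannerly's data model.

```python
from dataclasses import dataclass

@dataclass
class MidpEntry:
    # One row of the MIDP: what is delivered, by whom, for which milestone.
    deliverable: str
    owner: str            # accountable party from the responsibility matrix
    milestone: str
    status: str           # e.g. "Not started", "WIP", "Shared", "Published"

midp = [
    MidpEntry("Architectural model LOD 300", "Architect", "Stage 3", "Shared"),
    MidpEntry("Structural model LOD 300", "Structural engineer", "Stage 3", "WIP"),
    MidpEntry("MEP model LOD 200", "MEP engineer", "Stage 3", "Not started"),
]

def outstanding_by_owner(entries, milestone):
    """Group deliverables that are not yet published for a given milestone."""
    result = {}
    for e in entries:
        if e.milestone == milestone and e.status != "Published":
            result.setdefault(e.owner, []).append(e.deliverable)
    return result

print(outstanding_by_owner(midp, "Stage 3"))
# {'Architect': [...], 'Structural engineer': [...], 'MEP engineer': [...]}
```

Kept live against the actual task and model data, this kind of view answers "what is outstanding and who owns it" without anyone reconciling spreadsheets by hand.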
The second step is validating deliverables against the Exchange Information Requirements. Every piece of information produced during the project should align with what was defined at the start. By cross-checking outputs against the EIR, teams can identify gaps, missing data, and non-conformities before they become problems during formal review or handover. The Verify module connects models directly to tasks and information requirements, automatically evaluating the percentage of completeness for each deliverable. Rather than opening multiple files and trying to compare information manually, the system links element data to task requirements and flags anything that is missing or incomplete. Each task shows a clear completeness percentage, and as elements are statused, the model updates visually so that teams can see exactly where things stand.
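As a rough illustration of what cross-checking against the EIR means in data terms, the sketch below compares the properties present on each model element with the properties a hypothetical EIR requires for that element type, then reports a completeness percentage and the missing fields. The property names and element structure are invented for the example; this is not the Verify module's internal logic.

```python
# Hypothetical EIR requirement: properties each element type must carry.
eir_required = {
    "Door": ["FireRating", "AcousticRating", "Manufacturer"],
    "Wall": ["FireRating", "ThermalTransmittance"],
}

# Simplified model elements: type plus the properties actually populated.
elements = [
    {"id": "D-101", "type": "Door", "props": {"FireRating": "FD30", "Manufacturer": "Acme"}},
    {"id": "W-201", "type": "Wall", "props": {"FireRating": "REI60", "ThermalTransmittance": 0.25}},
]

def check_against_eir(element):
    """Return completeness percentage and missing properties for one element."""
    required = eir_required.get(element["type"], [])
    missing = [p for p in required if p not in element["props"]]
    done = len(required) - len(missing)
    percent = 100 * done / len(required) if required else 100
    return percent, missing

for el in elements:
    percent, missing = check_against_eir(el)
    print(f'{el["id"]}: {percent:.0f}% complete, missing {missing}')
# D-101: 67% complete, missing ['AcousticRating']
# W-201: 100% complete, missing []
```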
The third step is automating the checking process itself. Automated model checking removes the dependency on manual verification, which is both time-consuming and vulnerable to human error. When models are connected to tasks and requirements in the same environment, the system can evaluate information completeness continuously rather than in periodic batch reviews. Teams can filter by discipline, milestone, or status to see exactly which elements are passing verification and which still need work. Models are only moved to a published or authorized status once the tasks associated with them have been verified, ensuring that no information is released for use until it has been properly checked. This approach transforms compliance from a stressful, reactive exercise into a structured, repeatable process that builds confidence at every stage.
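The publish gate described above reduces to a simple rule: a model changes state only once every task linked to it has been verified. A minimal sketch of that rule, with invented task names and statuses rather than the product's actual workflow:

```python
def can_publish(model_tasks):
    """A model may move to 'Published' only when all linked tasks are verified."""
    return all(task["verified"] for task in model_tasks)

tasks_for_model = [
    {"name": "Doors statused with fire ratings", "verified": True},
    {"name": "Rooms named and numbered", "verified": False},
]

if can_publish(tasks_for_model):
    print("Model can move to Published status")
else:
    blocked = [t["name"] for t in tasks_for_model if not t["verified"]]
    print("Model stays in Work in Progress; unverified tasks:", blocked)
```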
The final and often overlooked step is capturing lessons learned. Too many projects leave this until the end, when the team has already moved on and the insights that matter most are lost. Under ISO 19650, lessons learned should be tied to specific milestones and captured as the project progresses. When tasks arise that were not originally scoped, they should be recorded and reviewed as a team. These unscoped tasks represent real gaps in the original requirements that, if not captured, will repeat on the next project. In Plannerly’s template and reuse workflow, lessons can be incorporated directly into updated templates. When a new project is created from a template, the system flags any updates that have been made since the template was last used, allowing teams to incorporate those improvements into their current work. This creates a seamless feedback loop between project data and organisational learning, ensuring that every project benefits from the lessons of the one before it.
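One way to picture the template feedback loop is as a comparison between the requirements a project was created from and the current version of the template. The sketch below flags requirements added since the project started, which is the kind of update a team would then choose to pull in; it is a conceptual illustration rather than how Plannerly stores or versions templates.

```python
# Requirements in the template when the project was created.
template_at_project_start = {
    "Doors: fire rating",
    "Walls: fire rating",
    "Rooms: names and numbers",
}

# Current template, updated with a lesson learned from a later milestone review.
template_current = template_at_project_start | {
    "Doors: acoustic rating (unscoped task captured on a previous project)",
}

new_requirements = template_current - template_at_project_start
if new_requirements:
    print("Template updates available since this project was created:")
    for req in sorted(new_requirements):
        print(" -", req)
```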
How to implement tracking, compliance, and lessons learned
- Track progress using the MIDP and responsibility matrix – Use the MIDP and responsibility matrix as live tracking tools. Ensure every task has a clear owner, a defined milestone, and a visible status so that the entire team can see what has been delivered and what remains outstanding.
- Validate deliverables against the EIR – Cross-check every output against the Exchange Information Requirements to confirm that information meets the agreed standard. Use the Verify module to connect models to tasks and evaluate completeness automatically rather than comparing files manually.
- Automate the checking process – Replace manual cross-checking with automated model verification that links element data to task requirements and flags missing or incomplete information. This reduces human error and makes verification continuous rather than periodic.
- Use statuses and milestones to manage approval – Apply clear statuses to tasks and models as they progress through verification. Use grid, timeline, and Kanban views to visualise progress across disciplines and milestones. Only move models to published or authorized status once the associated tasks have been verified.
- Filter by discipline and milestone for focused reviews – When reviewing compliance, filter tasks by specific teams, disciplines, or milestones to focus on the deliverables that matter for each review cycle. This avoids information overload and makes compliance reviews manageable; a short sketch of this kind of filtered review appears after this list.
- Capture unscoped tasks as they arise – When tasks come up during delivery that were not part of the original scope, record them immediately. Review these as a team at each milestone to understand what was missed and why, building a structured record of gaps in the original requirements.
- Update templates with lessons learned – After each project or milestone review, incorporate lessons into reusable templates. When new projects are created from updated templates, the system flags changes so teams can incorporate improvements without manually tracking what has been updated.
- Create a feedback loop between projects – Treat lessons learned not as a one-off exercise at project close but as a continuous process. Feed insights back into templates, requirement libraries, and process documentation so that each new project starts from a stronger foundation than the last.
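To show what the filtered review described in the list above could look like as a query, this sketch narrows a flat task list to one discipline and one milestone and splits it into passing and needs-work items. The field names and the 100% pass threshold are assumptions for illustration.

```python
tasks = [
    {"discipline": "Architecture", "milestone": "Stage 3", "name": "Door data", "complete": 100},
    {"discipline": "Architecture", "milestone": "Stage 3", "name": "Room data", "complete": 60},
    {"discipline": "Architecture", "milestone": "Stage 4", "name": "Finish schedules", "complete": 40},
    {"discipline": "Structure", "milestone": "Stage 3", "name": "Column sizing data", "complete": 75},
]

def review_scope(tasks, discipline, milestone, threshold=100):
    """Return (passing, needs_work) task names for one review cycle."""
    in_scope = [t for t in tasks if t["discipline"] == discipline and t["milestone"] == milestone]
    passing = [t["name"] for t in in_scope if t["complete"] >= threshold]
    needs_work = [t["name"] for t in in_scope if t["complete"] < threshold]
    return passing, needs_work

passing, needs_work = review_scope(tasks, "Architecture", "Stage 3")
print("Passing:", passing, "| Needs work:", needs_work)
# Passing: ['Door data'] | Needs work: ['Room data']
```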
What you’ll learn
- Progress tracking with the MIDP and responsibility matrix – How using the MIDP as a live roadmap and the responsibility matrix as an accountability tool gives teams clear visibility of what has been delivered, what is outstanding, and who is responsible for each deliverable.
- Deliverable validation against the EIR – Why cross-checking outputs against the Exchange Information Requirements is essential for maintaining quality, and how connecting models to tasks makes this validation automatic rather than manual.
- Automated compliance checking – How automation replaces the frustrating process of juggling models, spreadsheets, and PDFs across multiple screens, reducing human error and making verification continuous throughout the project.
- Status-driven approval workflows – How applying clear statuses to tasks and models, and only publishing information once it has been verified, ensures that no deliverable is released for use until it meets the agreed requirements.
- Structured lessons learned – Why capturing lessons at every milestone rather than at the end of the project creates actionable insights that improve future delivery, and how unscoped tasks reveal gaps in original requirements.
- Template-driven improvement – How updating templates with lessons learned and using Plannerly’s update flagging system creates a seamless feedback loop that carries improvements from one project to the next.
Common questions
Why is manual compliance checking so problematic?
Manual checking requires teams to open multiple files simultaneously — the BEP, the EIR, the model, various spreadsheets — and compare information across them. This is slow, error-prone, and impossible to sustain consistently as the volume of deliverables grows. Every time someone switches between files, they risk missing requirements, working from outdated versions, or overlooking incomplete data. Automated verification solves this by linking model data directly to requirements and evaluating completeness in real time.
How does model-to-task linking work in practice?
In the Verify module, task names are connected to element properties inside the model, such as type names. The system automatically finds and links matching elements, then evaluates the percentage of information completeness for each task. As tasks are statused, elements change colour in the model view, giving teams a visual representation of progress and allowing them to identify exactly which elements still need work.
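Read in simplified terms: a task carries a name that corresponds to an element property such as the type name, matching elements are linked automatically, and the share of linked elements that have been statused gives the task's completeness. The sketch below mimics that matching step with invented data; it is not the Verify module's implementation.

```python
elements = [
    {"id": 1, "type_name": "FD30 Single Door", "statused": True},
    {"id": 2, "type_name": "FD30 Single Door", "statused": False},
    {"id": 3, "type_name": "Partition Wall 100mm", "statused": True},
]

def task_completeness(task_name, elements):
    """Link elements to a task by matching the task name against type names,
    then report the percentage of linked elements already statused."""
    linked = [e for e in elements if task_name.lower() in e["type_name"].lower()]
    if not linked:
        return None, []
    done = sum(1 for e in linked if e["statused"])
    return 100 * done / len(linked), linked

percent, linked = task_completeness("FD30 Single Door", elements)
print(f"Task completeness: {percent:.0f}% across {len(linked)} linked elements")
# Task completeness: 50% across 2 linked elements
```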
When should models be moved to published status?
Models should only be moved to published or authorized status once the tasks associated with them have been verified and approved. Publishing a model signals to other teams that the information it contains has been checked and is ready for use. If tasks are still incomplete or have not passed verification, the model should remain in a work-in-progress or shared state until the outstanding issues are resolved.
How do lessons learned feed into future projects in Plannerly?
When unscoped tasks or missed requirements are identified during a project, they can be added to reusable templates. When a new project is created from an updated template, the system flags that updates are available and allows teams to incorporate them. This means improvements are carried forward automatically rather than relying on people to remember what changed on previous projects.
Explore further
- Information verification, project tracking, compliance, and lessons learned – The full expert course lesson covering tracking, compliance, and lessons learned workflows in detail.
- Tracking project progress and delivery – The advanced course lesson on moving from guesswork to visibility in project tracking.
- BIM model quality and the Verify module – How automated model checking supports quality assurance and compliance across project deliverables.
- ISO 19650 concepts and workflows – The full help centre collection covering how each component of ISO 19650 works together in practice.