Volume 4: The Document Automation Consultant

Chapter 10: Delivering the Engagement — From Discovery to Go-Live

What Makes Implementations Succeed or Fail

The technical work of building document automation solutions — designing data models, writing template logic, importing client data — is learnable and teachable. What is harder to teach, and what distinguishes consultants who build thriving practices from those who struggle with client satisfaction, is the discipline of delivery: setting expectations correctly, communicating proactively, making the right decisions when unexpected problems arise, and knowing the difference between a scope adjustment that protects profitability and an accommodation that builds a relationship.

Most failed document automation implementations don't fail because the templates were wrong. They fail because:

  • The client's data was in worse shape than expected and nobody caught it before go-live
  • The scope expanded without documented agreement and the consultant either resented it silently or absorbed it until it became unprofitable
  • The client was never trained properly and the system sat largely unused
  • Something broke two months after go-live and the consultant was hard to reach

This chapter covers the complete delivery methodology — five phases from signed contract to thriving client — with specific attention to the failure modes and how to prevent them.


The Five Phases of Delivery

Phase 1: Deep Discovery (Weeks 1–2)

The deep discovery phase is distinct from the sales discovery conversation and must be treated that way. Sales discovery was designed to understand pain and demonstrate value at a high level, with a 30-minute conversation as the primary tool. Delivery discovery is designed to understand the business at a level of detail that allows you to build a system that actually works — not theoretically, but in practice, with real data, used by real people under real time pressure.

Schedule the right meeting with the right people.

Book a 90–120 minute structured interview that includes two people: the person who approved the engagement (usually the owner, managing partner, or director), and the person who does the actual document creation work every day. These are frequently not the same person. The owner approved the engagement but has never personally assembled a monthly owner statement; the property manager who does it every Tuesday at 8 AM knows every quirk of the current process, every edge case, every data source. You need both in the room.

The deep discovery interview agenda:

Complete document inventory (30 minutes). Walk through every document the organization creates, not just the ones you agreed to automate. Ask about documents they create daily, weekly, monthly, annually, and on-demand for unusual circumstances. Ask specifically about documents they create by modifying previous versions — these are often the most time-consuming and the most error-prone, and they frequently don't surface in sales conversations because people think of them as "just part of the job."

The goal is a complete inventory, not just the agreed scope. You will often discover documents that should be Phase 2 additions and occasionally discover documents that should replace a Phase 1 priority because the pain is sharper than what was initially described.

Data sources and current technology (20 minutes). For every document, identify where the underlying data lives: which software system, which spreadsheet, which paper file, which person's email. Understand what can be exported and in what format. Understand what currently exists only in a person's head. Understand the relationships between data sources — does the billing software connect to the contact database, or are they entirely separate?

Ask explicitly: "If you were to leave tomorrow and someone else had to build this document from scratch, where would they find every piece of information that goes into it?" The answer reveals data sources you would otherwise miss.

Workflow mapping (20 minutes). For each priority document: what triggers its creation? What review or approval does it require before it leaves the office? What happens to it after it's created — filed, emailed, printed, signed, archived? Who needs access to it later, and for what purpose? Understanding the full workflow prevents building a generation system that delivers documents in the wrong format, to the wrong person, at the wrong stage of the process.

Exception cases and special circumstances (20 minutes). Ask directly: "What are the cases that don't fit the normal pattern?" The lease for the tenant who has a special accommodation. The invoice for the client on a non-standard billing arrangement. The grant report for the funder who requires a different format than everyone else. The property in the jurisdiction with unusual local requirements.

These exception cases are where amateur implementations fail and professional implementations demonstrate their value. Every edge case you discover in discovery is a conditional block you build into the template. Every edge case you miss shows up as an error at go-live in front of your client.

Collect real document samples.

Ask the client to share five to ten recent examples of each document type you'll be automating. Actual documents, recently produced, from different scenarios. Not a blank template — filled-in documents that reflect how they are actually used.

These samples reveal critical information that would otherwise require weeks of work to discover:

  • The actual format, layout, and language they use — which may differ substantially from what they described
  • Fields that appear in real documents but weren't mentioned in discovery
  • Edge cases already embedded in real usage (the lease with three named tenants, the invoice with a discount applied, the grant report covering two fiscal periods)
  • Their quality expectations — what "professional" actually means to them

Conduct the data audit.

Request access to the actual data sources you'll be connecting to. If data lives in spreadsheets, open them. Count the records. Examine every column. Look at real values, not just column headers.

What you're looking for: inconsistent formatting in date fields, blank required fields, inconsistent values in fields that should be standardized (the same state appearing as "CA," "California," "Calif.," and "ca" across different rows), duplicate records, and missing relationship links between related tables.

Every data quality problem you identify during the audit is a problem you can fix before building templates. Every problem you miss at this stage is a problem that appears in document output at go-live — in front of your client, with their real records, and with much higher stakes.
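The audit checks described above can be sketched as a short script. This is an illustrative sketch, not part of any prescribed toolchain: the field names and sample rows are hypothetical, and the "standardized field" check deliberately does nothing clever — it just surfaces every distinct value so a human reviewer can spot "CA" next to "California" at a glance.

```python
from collections import Counter

def audit_rows(rows, required_fields, standardized_fields):
    """Flag blank required fields, list the distinct values in fields that
    should be standardized (so variant spellings stand out on review), and
    count exact-duplicate records."""
    findings = {"blanks": [], "value_variants": {}, "duplicates": 0}
    for i, row in enumerate(rows, start=1):
        for field in required_fields:
            if not str(row.get(field, "")).strip():
                findings["blanks"].append((i, field))
    for field in standardized_fields:
        values = Counter(str(row.get(field, "")).strip() for row in rows)
        if len(values) > 1:
            # More than one distinct value: review for "CA" vs "California" vs "ca"
            findings["value_variants"][field] = dict(values)
    dupes = Counter(tuple(sorted(row.items())) for row in rows)
    findings["duplicates"] = sum(n - 1 for n in dupes.values() if n > 1)
    return findings

# Hypothetical sample: two blank lease_start fields, two spellings of the
# same state, and one exact-duplicate row.
rows = [
    {"tenant": "A. Lee", "state": "CA", "lease_start": "2026-03-15"},
    {"tenant": "B. Kim", "state": "California", "lease_start": ""},
    {"tenant": "B. Kim", "state": "California", "lease_start": ""},
]
report = audit_rows(rows, required_fields=["tenant", "lease_start"],
                    standardized_fields=["state"])
```

Run against a real export, a report like this gives you the concrete punch list — which rows to chase, which values to standardize — before any template ever touches the data.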

Write and distribute the discovery document.

Within three business days of the discovery meeting, write a 2–4 page summary and share it with the client. This document should contain: the complete document inventory, the data sources identified, key decisions made and outstanding, your refined implementation plan, and a list of anything you need from the client before you can proceed.

This is not a formality. It is a professional practice that serves three critical purposes: it ensures you and the client have the same understanding of scope before you build a single template; it gives the client an opportunity to add anything you missed; and it creates a written record you can reference if scope disputes arise later. Every hour spent writing this document saves three hours of rework.


Phase 2: Data Structure Design (Weeks 2–3)

With discovery complete, design the full data model before writing a single template field. The data model is the foundation. Everything built on a correct foundation is easier to build, easier to test, and easier to maintain. Everything built on an incorrect foundation requires disruptive rebuilding when the mismatch surfaces — which it always does.
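For a property management engagement like the ones this chapter uses as examples, the data model might be sketched as typed records — every field name here is illustrative, chosen for the example rather than prescribed by any particular platform:

```python
from dataclasses import dataclass

@dataclass
class Property:
    property_id: str
    address: str
    state: str            # drives state-specific disclosure logic later

@dataclass
class Tenant:
    tenant_id: str
    full_name: str
    property_id: str      # links the tenant to a Property record

@dataclass
class Lease:
    lease_id: str
    tenant_id: str        # links the lease to a Tenant record
    start_date: str       # ISO date, e.g. "2026-03-15"
    monthly_rent: float

@dataclass
class MaintenanceRequest:
    request_id: str
    property_id: str      # links the request to a Property record
    description: str
    status: str           # e.g. "open", "scheduled", "closed"
```

The point of writing it down in this form is the links: every template decision — which disclosures a lease needs, which transactions an owner statement shows — traces back to a field or a relationship declared here.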

Present the structure to the client.

You don't need to present a technical entity-relationship diagram. Present the concept: "Here are the categories of information the system will manage: your property records, your tenant records, your lease records, and your maintenance records. Let me walk you through what we'll track for each." Two things happen when you do this: the client may tell you something you missed (a field they track that you didn't know about), and you prepare them for the data import conversation — they begin to understand what information they're providing and in what format.

The data import plan.

Write down exactly what data the client needs to provide, in what format, by what date. Create spreadsheet templates with the exact column headers you need for each table. Provide explicit instructions for exporting from their existing software if applicable. Include an example row showing what a complete, correctly formatted record looks like.

Then state it clearly: "I need this data by [date] in this format. If anything is unclear about the format, let me know before you invest time creating the export. Once I have it, I'll handle all cleaning and importing."

Never tell a client to "send the data in whatever format works for them." That instruction reliably produces: a PDF printout of the current system, three different spreadsheet formats with inconsistent column names, and a folder of old Word documents containing some of the information. The data import plan — explicit, formatted, with a template — prevents this entirely.
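Generating those spreadsheet templates can itself be scripted, so every client receives the exact headers and a correctly formatted example row. The column layout below is a hypothetical tenants table for illustration:

```python
import csv

# Hypothetical column layout for a tenants table; adjust per engagement.
IMPORT_TEMPLATES = {
    "tenants": {
        "columns": ["tenant_id", "full_name", "email", "unit_id", "lease_start"],
        "example": ["T-0001", "Jordan Alvarez", "j.alvarez@example.com",
                    "U-204", "2026-03-15"],
    },
}

def write_import_template(table, path):
    """Write a CSV the client fills in: exact headers plus one example
    row showing a complete, correctly formatted record."""
    spec = IMPORT_TEMPLATES[table]
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(spec["columns"])
        writer.writerow(spec["example"])

write_import_template("tenants", "tenants_import_template.csv")
```

The example row does most of the work: clients copy its format far more reliably than they follow written instructions about date or ID conventions.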


Phase 3: Template Development (Weeks 3–8)

Build in priority order — highest pain first.

Your first completed, functional document output should exist by the end of Week 4 at the latest. An early working deliverable — ideally the document the client described as most painful in discovery — builds client confidence, provides a concrete object for early feedback, and keeps the engagement feeling active rather than silent.

Clients who see nothing for eight weeks while you build silently become anxious regardless of how well the work is progressing. Clients who see their most important document working correctly in Week 4 remain enthusiastic through the remaining build phase.

The individual template development cycle:

Design on paper first. Before opening Word, sketch the document: what sections it contains, what fields appear in each section, what conditional blocks apply, what loops iterate over which child tables. Every design decision made on paper costs minutes; the same decision made mid-build while troubleshooting costs hours.

Build against sample data. Build and test every template against your controlled sample dataset — not the client's real data — until it produces correct output for every test case including edge cases. Your sample data is clean, complete, and includes the scenarios you've deliberately designed for. Real client data is none of those things.

Test edge cases systematically. For every conditional block, test both paths: the true case and the false case. For every loop, test with zero child records, one child record, and many child records. For every formatted field, test with values at the boundaries of normal range.
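That testing matrix — both conditional paths, loops with zero, one, and many children — can be made concrete with a toy stand-in for the real template engine. The render function below is hypothetical, invented only to show the shape of the test cases:

```python
def render_owner_statement(owner, transactions):
    """Toy stand-in for a real template: one conditional block (a
    late-fee notice) plus one loop over child transaction records."""
    lines = [f"Owner Statement — {owner['name']}"]
    if owner.get("has_late_fees"):       # conditional block: test both paths
        lines.append("NOTICE: late fees were assessed this period.")
    if not transactions:                 # loop with zero child records
        lines.append("No transactions this period.")
    for t in transactions:               # loop body: test with 1 and many
        lines.append(f"{t['date']}  {t['desc']}  ${t['amount']:.2f}")
    return "\n".join(lines)

# The systematic matrix: both conditional paths x zero/one/many children.
owner_clean = {"name": "Oak Street LLC", "has_late_fees": False}
owner_late = {"name": "Oak Street LLC", "has_late_fees": True}
txn = {"date": "2026-03-01", "desc": "Rent", "amount": 1850}

assert "NOTICE" not in render_owner_statement(owner_clean, [txn])
assert "NOTICE" in render_owner_statement(owner_late, [txn])
assert "No transactions" in render_owner_statement(owner_clean, [])
assert render_owner_statement(owner_clean, [txn, txn]).count("Rent") == 2
```

Whatever tool actually renders the documents, the discipline is the same: enumerate every branch and every loop cardinality, and check each one against expected output rather than eyeballing a single happy-path sample.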

Client review after each document. Send the generated sample output to the client after each document type is complete. Ask specifically: "Does this match what you'd expect to see? Is there anything that looks wrong, missing, or that you'd want formatted differently?"

Early feedback is cheap. Feedback at the end of the build phase is expensive. The client who sees the generated lease agreement after Week 4 and says "actually, we always put the lease commencement date in bold here" is saving you from discovering that preference during training.

Document your logic. In complex templates — especially compliance documents — add comments explaining the purpose of conditional blocks: [California only — applies mold and bed bug disclosure requirements per Civil Code §1954.603]. Six months from now, when a law change requires updating this template and you haven't looked at it since go-live, you will be grateful for this documentation. So will anyone you bring in to help with future implementations.

Managing scope creep professionally.

Scope creep is not a client character flaw — it is a natural consequence of a process that requires the client to understand what they need before they've seen what's possible. When a client says mid-project "could we also add the insurance certificate tracking document?" they're not being unreasonable; they've learned something from watching the build that they didn't know before it started.

Your response should be neither resentment nor silent absorption:

"Absolutely, I can add that. It's outside the scope we agreed to in the engagement letter, so there would be an additional fee of approximately [amount] — this one will take [hours] to build. If you'd like to add it now, I can fold it into the current timeline; otherwise we can handle it as a Phase 2 addition after go-live. Which would work better for you?"

This response does three things: it says yes, which maintains goodwill; it is transparent about the commercial implication, which prevents resentment; and it gives the client a choice, which respects their priority-setting authority. Most clients respond well to this framing. The few who push back against paying for additions were going to be difficult clients on other dimensions as well — and it's better to establish clear scope management early than to discover the misalignment six months in.

Weekly status communication.

Send a brief written status message to your primary client contact every Friday during the build phase, regardless of whether you have something tangible to show. The message contains three items: what you completed this week, what you're building next week, and any information or decisions you need from the client.

This rhythm prevents the "what's happening?" calls that consume time and signal anxiety. Clients who receive weekly updates feel informed and confident. Clients who hear from you only when you need something or when something goes wrong develop a distorted picture of the engagement's progress.


Phase 4: Data Import and System Testing (Week 9)

Once templates are complete and approved against sample data, import the client's actual data and rebuild your testing against it.

The import sequence:

Import in dependency order: reference data first (lookup tables, dropdown value lists, static reference information), then master records (clients, properties, members — the top-level entities), then detail records (transactions, line items, enrollments — the child records). Verify relationships after each import stage.
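A staged import with relationship verification can be sketched as follows. The table names and key fields are hypothetical, and the example is simplified to two stages (master then detail); a real engagement would add the reference-data stage in front:

```python
def import_tables(stages, datasets):
    """Run a staged import — parent tables before child tables — and
    verify, as each table loads, that every child row's foreign key
    resolves to an already-imported parent record."""
    loaded = {}   # table name -> set of primary keys seen so far
    errors = []   # (table, row pk, problem description)
    for table, pk, fk_field, parent in stages:
        keys = set()
        for row in datasets[table]:
            if fk_field and row[fk_field] not in loaded[parent]:
                errors.append((table, row[pk],
                               f"unknown {fk_field}: {row[fk_field]}"))
            keys.add(row[pk])
        loaded[table] = keys
    return loaded, errors

# Hypothetical two-stage import: master records first, then children.
stages = [
    ("properties", "property_id", None, None),
    ("tenants", "tenant_id", "property_id", "properties"),
]
datasets = {
    "properties": [{"property_id": "P-01"}],
    "tenants": [
        {"tenant_id": "T-01", "property_id": "P-01"},
        {"tenant_id": "T-02", "property_id": "P-99"},  # orphan — flagged
    ],
}
loaded, errors = import_tables(stages, datasets)
```

The orphaned tenant record (T-02 pointing at a property that was never imported) is exactly the kind of break that otherwise surfaces as a blank field in a generated document.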

Real data produces real surprises.

After import, run every template against real data and review actual output carefully. Real data reliably produces surprises that sample data didn't: a business name containing an apostrophe that creates a display issue, a date field that imported with timestamps appended (so "2026-03-15" becomes "2026-03-15 00:00:00" in the output), a currency field that arrived as text because the source system exported dollar signs as part of the value.

These are not template bugs — they are data format issues — but they surface in the document output, where the client cannot tell one from the other. Find and fix every one before training.
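The two surprises named above — timestamps appended to dates, currency exported as text — are common enough that small normalizers are worth keeping on hand. These helpers are an illustrative sketch (the function names are mine, not from any particular library):

```python
import re

def clean_date(value):
    """Strip a time component that some exports append,
    e.g. '2026-03-15 00:00:00' -> '2026-03-15'."""
    return str(value).strip().split(" ")[0]

def clean_currency(value):
    """Convert exported currency text like '$1,850.00' to a number,
    dropping dollar signs and thousands separators."""
    return float(re.sub(r"[^0-9.\-]", "", str(value)))
```

Run on every imported date and currency column as a matter of routine; it is far cheaper than hunting the same artifacts down one generated document at a time.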

The pre-go-live review.

Before the training session, generate one complete set of real documents — every template, two or three real examples each — and review them as carefully as a client would. This is your quality gate. Check every conditional for correctness (did the California lease get the California disclosures? did the Texas lease get the Texas disclosures?), every loop for completeness (does the owner statement show all the month's transactions?), every formatted field for proper display.

Errors caught at this stage cost 15 minutes to fix. Errors found by the client during training cost credibility and, depending on their severity, can undermine confidence in the system before it has even launched.


Phase 5: Training and Go-Live (Week 10)

Training philosophy: they operate, you narrate.

The most common mistake in implementation training is the consultant operating the system while the client watches. The client experiences this as a demonstration, not training. They leave the session having seen it work without having done it themselves — and when they sit down to use it independently the next day, they have no muscle memory for the steps.

Effective training requires the client to operate the system from the beginning, with you providing narration and guidance. "You're going to go into the task pane and select the lease agreement template. What do you see?" Let them navigate. Let them make small errors. Let them recover from those errors with your guidance. An error made and corrected during training, with you in the room, teaches more than ten errors avoided by letting you drive.

Training structure for a standard implementation:

Opening (10 minutes): Explain the three-layer architecture in plain language. "Your data lives in this spreadsheet — this is where all the information comes from. Your templates live here — these are the document designs that know how to use the data. Generating a document means telling the system which template and which records. Let me show you, then you try it."

Core generation workflow (25 minutes): Walk through generating one document of each type — lease agreement, owner statement, late payment notice — with the client operating from the start. Correct as needed, but don't take the controls. End this segment when the client can independently generate each document type without prompting.

Adding and updating records (20 minutes): Demonstrate adding a new tenant record, updating a payment status, recording a maintenance request. Then have the client do each one. The system is only as current as the data; the client must be confident in data entry to keep it accurate.

Edge cases and troubleshooting (15 minutes): Walk through what to do when something looks wrong. "If a document generates but there's a blank where you expected text, here's how we figure out what happened." Cover the two or three most common issues based on what you encountered during testing.

Questions and next steps (15 minutes): Open for questions. Confirm the support protocol for the first two weeks. Provide written quick-reference notes covering the most common tasks — a one-page summary they can keep at their desk without having to call you every time.

Go-live support commitment.

Be genuinely available — responsive within two hours during business hours — for the first two weeks after go-live. The issues that arise in this window are almost always small: a conditional not triggering as expected on an unusual record, a question about how to handle an edge case they hadn't encountered yet, a formatting preference they want to adjust. Addressed quickly, these become confirmations that the support relationship works. Left to accumulate, they become doubts about whether the system is reliable.

Schedule a 30-minute check-in call at two weeks post-go-live. This call is not optional — it is part of your delivery methodology. It surfaces any remaining issues before they become habits, collects early metrics for your case study, and provides an opportunity to identify additions or enhancements that will make the renewal conversation easier in eleven months.


Measuring Success and Building the Case Study

A successful implementation produces two things: a client whose operations have measurably improved, and documented evidence of that improvement that you can share with future prospects.

At 30 days post-go-live, ask three questions:

  • Which documents have been generated, and at what volume?
  • How has the time savings compared to what you expected?
  • Is there anything that works differently than expected — better or worse?

At 90 days, when the results are measurable and the initial enthusiasm hasn't faded, write the case study:

Client profile: Industry, approximate size, key role of primary contact, situation before implementation.

Before: Specific process described in discovery — hours per document, frequency, annual cost calculated.

The solution: Number and type of documents automated, key intelligence features, key compliance elements handled.

After: Measured results — hours saved, errors eliminated, compliance problems avoided, revenue impact, ROI.

In their words: A direct quote from the primary contact in their language, about their experience.

This case study becomes your most powerful sales asset for every subsequent prospect in the same vertical. A prospect who hears "I work with property management companies" is mildly interested. A prospect who reads a case study from a 350-unit property management company describing how lease compliance risk was eliminated and owner reporting time dropped from two days to three hours per month has seen their own future. This is the difference between being a vendor and being a solution.


The Delivery Quality Standard

Every implementation you deliver should be measured against the same four-part standard:

Accuracy: Every template produces correct output — right data in every field, correct conditional logic for every record state, complete loop iteration for every child record set.

Completeness: Every document contains all required information. No fields that should be populated are blank. No sections that should appear are missing.

Compliance: For every regulated document, the compliance requirements are correctly and completely encoded. A lease agreement that works beautifully but is missing a state-required disclosure has failed this standard regardless of how impressive the rest of the implementation is.

Professional appearance: Every output looks like it was produced by a professional who cares about quality — consistent formatting, appropriate typography, logical layout, the client's branding applied correctly. Clients judge the system partly by how the documents look. Beautiful output reinforces confidence; mediocre output undermines it even when the data is correct.

Hold every implementation to this standard before go-live. Your reputation — the asset that drives referrals and renewals — is built one implementation at a time.


Chapter Summary

  • Delivery success depends on disciplined process as much as technical skill; the most common failures are not wrong templates but missed data problems, unmanaged scope creep, inadequate training, and poor post-go-live communication
  • Deep discovery requires a 90–120 minute structured interview with both the decision-maker and the person who actually does the work; includes complete document inventory, data source audit, workflow mapping, and edge case identification
  • Always write and distribute a discovery document within three business days of the meeting — it aligns expectations before you build and creates a scope reference that prevents disputes
  • Build in priority order — first completed output by Week 4; send client review after each template; document conditional logic in comments
  • Manage scope creep professionally: say yes to additions, state the commercial implication, give the client a choice between adding now or phasing — never absorb silently or refuse reflexively
  • Training: client operates from the beginning, you narrate; muscle memory from doing beats observation every time
  • Be genuinely available for 2 weeks post-go-live; schedule the two-week check-in call and the 30-day questions; write the case study at 90 days
  • Every implementation should meet the four-part quality standard: accurate, complete, compliant, professionally presented

Next: Chapter 11 — Scaling Your Practice


Chapter 10 | The Document Automation Consultant | datapublisher.io/books