
The product that Vincent documents allows admins to manage user access and permissions through a web interface. After extensive debate, collaboration, and revisions with security, engineering, and product teams, he published the workflows for creating user accounts, assigning role-based permissions, and broadly setting up access controls. Vincent’s docs are used by IT admins at major enterprise customers who need to carefully control what their employees can access.

Vincent confirmed that the new content was live and wondered how soon customers would start using it, but someone had already asked when he would start on another project, and his attention shifted. The access control docs began to fade into memory as new priorities made themselves known, and life went on.

Six months later, Vincent got a ticket from the customer support team reporting that a screenshot in the access control docs was outdated. How could that have happened? he thought. Vincent had taken that screenshot himself when he wrote the docs. Then again, he hadn’t looked at the docs since he wrote them because it wasn’t yet time for the annual docs audit.

Confused, Vincent navigated to the product’s access control section. The screenshot was indeed out of date; there were a few new toggles and expandable options compared to when he last saw it. What used to be individual role assignments was now a system of permission groups with inherited access, resulting in different default access permissions.

But the docs still showed the old single-role model recommendations, which Vincent realized had led IT admins to think they’d properly restricted user access when they hadn’t.

It dawned on Vincent just how big a problem this was: This seemingly small UI and behavior change was creating actual security risks because users had unintended system access. Fixing the docs wasn’t enough here. He needed to communicate this issue to affected customers so they could fix their security settings, even if doing so would damage their trust in his company. There would be an uptick in support escalations from security teams. Auditing requirements were likely going to get tighter. There might even be regulatory compliance implications. This was a mess.

So he escalated. Vincent engaged product and customer support teams, who started to triage the customer issues. Upon investigation, the engineering and design teams said that they were simply updating the UI and product behaviors to be what they should’ve been in the first place, even if it was a change post-release. “Sorry, Vincent,” they said, “for not informing you. We figured you were busy with other projects and didn’t realize it impacted you.” They promised that they would tighten up review procedures and keep Vincent in the loop next time.

Vincent was amazed at how docs being “slightly wrong” could have such major consequences. Consequences that his company might literally not be able to afford again. And how many similar issues lurked in the docs and product, as yet unknown?

He couldn’t rely on manual docs audits anymore. Not if this was the potential scale of impact documentation inaccuracies could have. Vincent needed a systematic way to validate documentation against the current product UI. He needed automated testing that confirmed documented procedures still produced the expected results. But how?

The Problem with Broken Docs

Broken docs are a barrier to user success, and user success is the key to product adoption and usage. If prospective users can’t get started with your product, they simply won’t use it. They’ll find the nearest competitor and move on. If existing users can’t complete tasks, they get frustrated and leave. They’ll find a product that works better, or at least that they perceive as more reliable, and switch. And if frustrated users don’t leave, the best you can hope for is that they contact support, increasing your support costs and reducing your team’s effectiveness.

This applies to internal users, too. While external users are your customers, internal users are your coworkers and colleagues, the people who build, maintain, and support your product. If your docs aren’t accurate, internal users can’t rely on them and can’t do their jobs effectively. Instead of creating value for your company, they’re debugging issues and documenting solutions elsewhere, if they’re documenting their solutions at all, leading to fragmented knowledge and increased development costs.

But if this is such a large problem, why haven’t people solved it before now? They’ve tried.


Figure 1: The Broken Docs Spiral of Doom

Manual Validation

Since the earliest days of technical writing, and often still today, documentation testing has largely been a manual process. Technical writers carefully craft their content, then meticulously work through each procedure, step by step, to ensure accuracy.

This method, while thorough, is incredibly time-consuming and prone to human error. Adding to the problem, teams often verify content only when it’s created, rarely checking it again afterward.

Some organizations go through regularly scheduled “freshness” reviews. While this is the ideal within the manual validation strategy, the reality is that most technical writing teams are chronically understaffed and under-resourced. They simply don’t have the luxury of setting aside time in their schedules for a comprehensive review.

In those cases, the testing is still performed—by users. Each time a user follows the documentation, they’re effectively testing it. But while this passive validation approach sometimes catches inaccuracies, every issue it catches comes at the cost of customer trust.

Even for those who do schedule regular reviews, the number of person-hours it takes to manually validate content is at least directly proportional to the amount of content that needs validating. A single writer might be able to validate a small documentation set in a week, but it would take significantly longer to validate a doc set with hundreds or even thousands of pages, even split across a large team. Then there are multipliers for how difficult the tasks are to complete and whether testing resources are readily available.

Manual validation isn’t scalable from a time-investment perspective. Even worse, the testing is still performed by humans, so it’s prone to error. Far from a reliable solution, manual testing is likely to produce inaccurate docs.

The QA Era: Borrowing from Software Testing

Documentation testing evolved alongside software development practices. Many organizations began involving their Quality Assurance (QA) teams in the documentation process. QA professionals, accustomed to rigorously testing software, applied similar principles to documentation.

This approach brought several benefits:

  • Systematic testing: QA teams applied structured test cases to documentation, resulting in more comprehensive coverage.
  • Fresh perspective: Having someone other than the original author test the documentation helped catch issues that might’ve been overlooked.
  • Integration with product testing: Documentation testing could be incorporated into broader product testing cycles.

But this method also had its limitations:

  • Resource intensive: Involving QA teams in documentation testing required significant time and personnel.
  • Disconnect from writing process: QA teams, while skilled at testing, might not have the same depth of understanding about documentation principles and user learning styles as technical writers.
  • Reactive rather than proactive: Testing often happened after documentation was written, leading to a cycle of write-test-revise rather than ensuring accuracy from the outset.

While some teams found success with this approach, it still fell short of providing a scalable, efficient solution to documentation testing. Though teams could now measure how much of their docs were covered by tests, engineers had to write the code for those tests, and when resources were reduced or reassigned, the doc tests were often the first to go. After all, if you’re a product manager (PM), would you rather have a tested feature or tested docs? Many shortsighted PMs choose the former, and the docs—and therefore the users—suffer for it.

Additionally, with the rise of Agile, Developer Operations (DevOps), and Continuous Integration/Continuous Deployment (CI/CD) cultures, the pace of product development accelerated, and most traditional QA tasks fell to the engineers themselves. In many sectors, dedicated QA teams mostly vanished, leaving the QA testing strategy outright unavailable to many writing teams and likely unsustainable for many more in the long term.

For documentation validation to be sustainable, it needs to be the domain of the writers, not the engineers.


Figure 2: Comparing manual and QA testing approaches

 What Does That Mean?
Agile: A project management approach where work is broken into small chunks and completed in short cycles (usually 2-4 weeks), allowing for frequent feedback and adjustments.
Developer Operations: A practice that brings together software development and IT operations teams to automate and streamline the process of building, testing, and releasing software.
Continuous Integration: The practice of frequently merging code changes back into a shared codebase, usually multiple times per day. Each merge automatically triggers tests to catch problems early.
Continuous Deployment: The practice of automatically deploying code changes to production after they pass all tests and quality checks.

The First Automation Attempt: Leveraging Tools

Some forward-thinking organizations explored ways to automate aspects of documentation testing. This often involved using tools to check for broken links, running spell-checkers, and using complex grammar and style validators.

These automation attempts were a step in the right direction, but they were limited in scope. They could catch surface-level issues but struggled to verify the actual content and procedures described in the documentation. The tools checked presentation, structure, and language, testing the docs for syntactic and stylistic correctness, but they stopped short of verifying that the documented instructions were accurate.

To be clear, these tools are hugely valuable, but they didn’t help docs become any more resilient to change. They were also often custom-built for specific use cases, making them difficult to scale and maintain, and they required technical expertise to set up and run, putting them out of reach of many technical writing teams. These tools, like Vale, aren’t covered in detail in this book.

So How Do Engineers Do It?

To keep from introducing bugs into their code, engineers validate code by writing test suites, which are collections of tests (sometimes referred to as test cases). Each test is a specific scenario that the code is expected to handle correctly, and within each test case, there are one or more assertions, which are statements about the expected behavior of the code, such as the code running successfully or a variable matching a specific value.

When the test suite runs, the assertions are checked against the actual behavior of the code. If any of the assertions fail, the test fails.

Engineers run their tests frequently—as often as every time they introduce a change to their code—to catch issues early. If tests fail, the new code has to be fixed before it’s added to the existing code. This is how teams of engineers manage large, shared code bases without breaking things constantly.
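To make the suite/case/assertion hierarchy concrete, here’s a minimal sketch in Python using the standard unittest module. The create_user function is a made-up stand-in for real product code; the structure around it is what matters.

```python
# A minimal test suite using Python's built-in unittest module.
# The class is the test suite, each test_* method is a test case,
# and each assert* call is an assertion about expected behavior.
import unittest


def create_user(name, role="viewer"):
    """Stand-in for product code: create a user record with a default role."""
    if not name:
        raise ValueError("name is required")
    return {"name": name, "role": role}


class TestCreateUser(unittest.TestCase):
    def test_default_role(self):
        # Assertion: new users get the "viewer" role by default.
        self.assertEqual(create_user("vincent")["role"], "viewer")

    def test_explicit_role(self):
        # Assertion: an explicitly assigned role is preserved.
        self.assertEqual(create_user("vincent", role="admin")["role"], "admin")

    def test_empty_name_rejected(self):
        # Assertion: invalid input fails loudly instead of silently.
        with self.assertRaises(ValueError):
            create_user("")
```

When the suite runs, every assertion is checked against the code’s actual behavior: one failing assertion fails its test case, and one failing test case fails the suite.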

What Does That Mean?
Test suite: A collection of tests that are designed to validate a particular aspect of the code or product.
Test (or test case): A specific scenario that the code or product is expected to handle correctly. A test contains one or more assertions.
Assertion: A statement about expected behavior in the code or product that evaluates to either true or false when a test runs.

The Genesis of Docs as Tests

The concept of Docs as Tests emerged from a confluence of factors: the limitations of existing documentation testing methods, the increasing pace of product development, the reality of limited tech writing bandwidth and resources, and the growing adoption of automation in software development practices.

I was frustrated with the constant battle to keep documentation accurate and up-to-date, so I developed Doc Detective, an open-source tool designed to automate documentation testing. More importantly, my development experiences led to the formulation of the Docs as Tests strategy.

The core idea of Docs as Tests is simple yet powerful: treat documentation as a series of testable assertions about how a product works.

Docs are assertions that a tool is supposed to work a certain way. Because those assertions are verifiable, they are testable. That means that docs are tests, regardless of whether they’re run formally or informally.

The real power of Docs as Tests comes from formally automating these tests. Just as software developers write tests to verify their code functions correctly, technical writers can create tests to ensure their documentation accurately reflects the current state of the product.

The Docs as Tests strategy represents a shift in how documentation is created and maintained. It moves documentation from a static, after-the-fact process to an active, integrated part of the product life cycle, leaning on lessons learned from software development and quality assurance practices.

This approach addresses several key challenges in documentation maintenance:

  • Keeping pace with product changes: By automating tests, documentation can be quickly verified against the latest product version.
  • Reducing manual overhead: Automated tests can run frequently without requiring significant time from technical writers.
  • Catching issues early: Tests can be integrated into development pipelines, flagging documentation issues before they reach users.
  • Building trust: Consistently accurate documentation builds user confidence and reduces support burdens.

Core Tenets

Docs as Tests is a strategy that can be implemented in many ways, but there are a few essential tenets, each of which will be elaborated upon throughout this book.

Docs are tests. Docs as Tests is a strategy for testing your docs against your product. Each doc is a test suite, each procedure a test case, each step an assertion. Treat them as such. Your docs are testable statements that your product works a certain way, and if you don’t run those tests yourself, your users will do it for you.

Tests run against the product. Doc-based tests run against your product—not against mocks, and not against code (unless the code is the product). Doc-based tests need to validate the actual UX that your users experience. If your product’s UX changes, your docs should change too. If you can test against multiple product environments (such as production, staging, and development, as we’ll discuss later in the book), test each environment within reason.

Tests are repeatable. Engineering tests are repeatable, and doc-based tests should be too. If you run your tests once and never again, your docs are no more resilient than if you never ran them at all. Doc-based tests should repeat as often as is reasonable to keep your docs in sync with your product. Test your docs at least once for every change your users experience, such as once per release; testing more frequently, such as on every code push to a staging environment before release, can help you catch issues even earlier.

Resilient implementation, resilient tests. Everyone—tech writers, engineers, product managers, and more—contributes to docs, and therefore everyone contributes to tests. Your Docs as Tests implementation should be resilient to anticipated doc contributions. Automate what you can, like using a tool to make sure keywords and formatting are used appropriately, and educate your contributors on how to write docs that make good tests.

Doc-based tests don’t replace, they complement. While doc-based tests could be classified as end-to-end tests, they’re not a replacement for engineering best practices. Doc-based tests instead complement unit tests, integration tests, and other tests that are designed to validate your product’s code. Doc-based tests don’t validate code—they validate your product’s user experience as it is presented to your users.


Figure 3: Docs as Tests core tenets

What Docs as Tests Is, and What It’s Not

To fully understand the Docs as Tests strategy, it’s important to clearly define what it encompasses and what it doesn’t. This clarity helps in setting appropriate expectations (for both yourself and your stakeholders) and in effectively implementing the approach.

What Docs as Tests Brings to the Table

Docs as Tests is format-agnostic, working with any documentation system. The strategy adapts to your existing documentation tools and workflows—whether you use a CMS, a Docs as Code workflow, or any other content solution—rather than forcing you to change your content infrastructure.

Docs as Tests transforms static documentation into living, verifiable truth about your product. Like a vigilant sentinel, it continuously validates that what you tell users matches what they’ll actually experience.

Trust builds through consistency, and Docs as Tests delivers exactly that. When you implement doc testing, you create a bridge between your content and your product—a bridge that alerts you the moment it starts to crack. Your documentation becomes more than words; it becomes a reliable contract between your product and your users.

Automation stands at the heart of this transformation. Just as a smoke detector doesn’t need manual intervention to warn you of danger, automated doc tests constantly monitor for discrepancies between your docs and your product. Doc tests catch issues before your users do, maintaining trust and reducing frustration.

The beauty of Docs as Tests lies in its flexibility. Whether you’re using specialized testing tools or crafting custom solutions, the strategy adapts to your needs. It’s not about the specific tools—it’s about the fundamental approach of treating documentation as testable truth.

What Docs as Tests Doesn’t Replace

A hammer is fantastic for nails but terrible for screws. Similarly, Docs as Tests excels at validating accuracy but isn’t meant to solve every documentation challenge.

Style guides, content architecture, and clear writing remain essential. Docs as Tests won’t fix unclear prose or poor organization. Think of it as a safety net, not a substitute for solid documentation practices.

Human judgment remains irreplaceable. While automation catches mismatches between docs and product, it can’t evaluate clarity, completeness, or the subtle nuances that make documentation truly excellent. Technical writers bring the crucial human element that transforms accurate documentation into exceptional documentation.

Though it’s one of the core tenets, it bears repeating: Doc tests don’t replace engineering tests. While engineering tests validate code functionality, doc tests validate documentation accuracy and the user experience. They serve different purposes, and each team needs its own types of tests to validate its outputs.

Table 1: What Docs as Tests Is and Isn’t

What it is:

  • A testing strategy for documentation
  • An automation approach
  • A bridge between documentation and product
  • A trust-building mechanism
  • A way to catch documentation issues early
  • A tool-agnostic strategy

What it isn’t:

  • Format validators
  • Style checkers
  • A replacement for other documentation best practices
  • A substitute for engineering tests
  • A silver bullet for all documentation issues
  • A one-size-fits-all solution
  • Limited to text-based content
  • A replacement for human judgment

Understanding these distinctions helps in setting realistic expectations and in effectively integrating Docs as Tests into existing documentation workflows, which we’ll discuss in later chapters.

Isn’t This Behavior-Driven Development?

In short, no, but let me elaborate.

Behavior-driven development (BDD) is a software development process that emphasizes collaboration between developers, testers, and business stakeholders. BDD focuses on creating tests that describe the expected behavior of the system from the user’s perspective.

These tests are written in a natural language like English and are designed to be readable and understandable by all stakeholders. But while the tests themselves are easily readable, they still need to be backed by code, written by engineers, to run.

Docs as Tests and BDD are both valuable testing approaches, but the biggest differences are those of intent and involvement: Just like other engineering testing approaches, BDD validates that the code is sufficient and accurate, and it needs a comparatively large group of stakeholders to implement. Docs as Tests validates that the documentation is accurate and up-to-date, and it can often be implemented by a docs team without outside involvement.

If your workgroup implements BDD, wonderful! More testing is a good thing. But don’t let them prevent you from validating your doc content because they don’t understand Docs as Tests. Educate them instead, as we’ll discuss later.

Learning from Other Fields: The “Docs as…” Journey

Every technical writer knows the challenge: explaining complex systems in ways that users can understand and trust. Over the years, our field has found inspiration in unexpected places, borrowing wisdom from software development, product management, and beyond. This cross-pollination of ideas has given rise to several powerful “Docs as…” approaches.


Figure 4: Overlaps between docs, engineering, and product disciplines

Docs as Code: Writing Meets Engineering

One of the earliest and most influential of these approaches is “Docs as Code,” popularized by Anne Gentle’s book Docs Like Code.

Just as engineers use version control and other tools to manage complex piles of text (code), review changes coming from a variety of different sources and authors (other engineers), and publish new versions of their output frequently (release code), we content owners can use the same tools to manage complex piles of text (docs), review changes coming from a variety of different sources and authors (subject-matter experts), and publish new versions of our output (update the docs).

 Voice of Practitioner
“Docs as Code is basically taking developer techniques and applying them to documentation. So it’s using…the collaborative version control system, test automation. Then it’s also building something that can be published immediately.”
—Anne Gentle

With Docs as Code, a writer’s morning routine might include reviewing incoming changes, running automated doc set builds, and deploying content updates—all practices borrowed from software development.

This approach brings powerful tools to our fingertips:

  • Version control captures every change
  • Automated doc set builds ensure consistent output
  • Review processes catch issues early
  • Deployment pipelines deliver updates reliably
  • Linters (rule-checkers) verify compliance to your style, grammar, and formatting rules

While Docs as Code introduces a wide variety of tools, those tools mostly transform how content is written and delivered. The content might be written in a different format, but the approach doesn’t change the inherent behavior of the content itself.

Docs as Product: User Experience Takes Center Stage

Great products solve user problems. Great documentation should too. The Docs as Product mindset, popularized by The Product is Docs, transforms technical writing from a support function into a core user experience and a primary deliverable from your organization.

Picture your documentation as a product:

  • User research guides content decisions
  • Analytics reveal what works (and what doesn’t)
  • Regular updates respond to user needs
  • Success metrics drive improvements

Docs as Product treats documentation as a key part of the overall product experience, with its own lifecycle and development process. It aligns closely with product management practices, making sure that documentation meets user needs and business goals.

Docs as Ecosystem: Growing a Documentation Community

Documentation thrives on community involvement, but documentation can also be the heart of a product’s community. The Docs as Ecosystem approach, introduced in a book of the same name by Quetzalli Writes (published under the pen name Alejandra Quetzalli), recognizes that documentation, like a garden, needs tending to flourish.

This perspective changes how we work:

  • Community contributions enrich content
  • Cross-team collaboration improves accuracy
  • Active management ensures sustainability

Docs as Ecosystem views documentation as a living entity that thrives through active management and community involvement, in turn driving further involvement and investment in the product and community.

Docs Observability: Seeing Documentation in Action

Traditional documentation felt like shouting into the void—you wrote it, published it, and hoped it helped. Docs Observability changes that paradigm. Like a heart monitor showing vital signs, it reveals how users interact with your content in real time. Docs Observability, coined by Fabrizio Ferri-Benedetti, brings concepts from engineering telemetry and observability to the documentation space.

This approach brings new insights:

  • Usage patterns reveal content effectiveness
  • Integration with product metrics shows impact
  • Predictive analytics guide content strategy
  • Real-time monitoring catches issues quickly

Docs Observability treats documentation as a first-class product surface area, applying the same rigor to monitoring and improving it as one would to a software product.

 Docs as… Overviews
Docs as Code
• Version control
• Continuous Integration/Continuous Deployment (CI/CD)
• Collaboration
• Infrastructure as Code
Docs as Product
• User experience
• User research
• Iterative development
• Metrics-driven decision-making
Docs as Ecosystem
• Community engagement
• Cross-functional collaboration
Docs Observability
• Real-time monitoring
• Correlating metrics
• Predictive analytics

How Docs as Tests Complements the Picture

Each “Docs as…” approach solves specific challenges. Docs as Code makes documentation agile. Docs as Product makes it user-focused. Docs as Ecosystem makes it sustainable. Docs Observability makes it measurable.

Docs as Tests adds a crucial element: reliability. It transforms documentation from hopeful assertions into verified truth. While other approaches optimize how we create, manage, and measure documentation, Docs as Tests ensures what we publish matches what users experience.

The beauty lies in the flexibility. Whether you write in Markdown or DITA, publish to the web or PDF, Docs as Tests adapts to your workflow. It doesn’t demand you abandon your current tools or processes—it enhances them with the power of validation.

But How?

The Docs as Tests strategy is a powerful, flexible approach to documentation testing, but it can also seem intimidating. How do you turn your docs into tests, and how do you run those tests against your product? How do you convince your team and other stakeholders to help, or that testing your documentation is a good idea in the first place? All good questions. Let’s answer them in the next section.
