What 142 ops leaders told us about knowledge loss
The number one predictor of process drift wasn't size, sector or tooling. It was something we didn't expect.
In late 2025 we ran a structured survey of 142 operations leaders across UK small and medium-sized businesses, ranging from twenty to three hundred employees. We asked four questions: how often do your processes drift from their documented versions; what do you believe causes that drift; how do you currently detect it; and what happens when you do. The answers to the first three were largely as expected. The answer to the fourth was where things got interesting.
The finding that changed how we think about process drift had nothing to do with tools, size, or growth rate. It was about the shape of a team.
01 · Sample: Who answered
The 142 respondents spanned eleven sectors, with the largest clusters in professional services (34%), technology (22%), and logistics and distribution (18%). Headcount ranged from twenty to three hundred, with a median of 67. Most were in companies that had been operating for at least five years, which matters because process debt tends to compound with age.
We deliberately excluded companies undergoing active mergers or formal restructures at the time of the survey. Structural upheaval is its own category of risk, and conflating it with the quieter, steady-state kind would have muddied the results. The respondents we included were companies running normally, or as normally as any company of that size runs.
Job titles varied: Heads of Operations, Operations Directors, COOs at the smaller end of the range, and a handful of Ops Managers whose remit was broader than the title suggested. What they shared was direct accountability for process quality and day-to-day delivery, which made them the right people to ask.
02 · The hypothesis: What we expected
Before running the survey we made some predictions, partly to force ourselves to be honest about what the data would need to show in order to surprise us.
Our working hypothesis was that drift would correlate with three things: company size (larger teams have more surface area for processes to diverge); growth rate (faster-growing companies introduce more new people in less time, diluting institutional knowledge); and tooling churn (teams that switch platforms frequently carry the transition cost in their processes for longer than they expect).
All three of these turned out to be weakly correlated with drift. Size had the least predictive power. A thirty-person firm with poor documentation habits drifted just as readily as a two-hundred-person one with the same habits. Growth rate was slightly stronger, but only at the extremes: companies growing faster than 40% per year showed elevated drift, but the relationship was not linear below that threshold. Tooling churn had a modest effect, concentrated in the twelve months following a major platform migration.
None of these was the primary driver. We looked for what was.
03 · The finding: What actually predicted drift
The strongest predictor of serious process drift was neither size nor pace of growth. It was staff-mix: specifically, the combination of high average tenure and high new-hire rate in the same team.
Teams where the average employee tenure exceeded three years, and where new hires represented more than 20% of headcount in the previous twelve months, were 2.3 times more likely to report serious drift than teams without that combination. The effect held across sectors and across size bands.
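The high-risk profile above is a simple two-condition test, which can be sketched as a function. The thresholds (three years' average tenure, 20% new-hire rate) come from the survey finding; the function name and its inputs are illustrative, not from the study.

```python
def high_drift_risk(avg_tenure_years: float, new_hires: int, headcount: int) -> bool:
    """Return True when a team matches the high-risk staff-mix profile:
    long-tenured existing staff combined with a high share of recent hires."""
    new_hire_rate = new_hires / headcount
    return avg_tenure_years > 3.0 and new_hire_rate > 0.20

# A 40-person team averaging 4.5 years' tenure with 16 new hires in the
# past year (a 40% new-hire rate) matches the profile; the same team
# with 8 new hires (20%, not above the threshold) does not.
```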
This surprised us, because the intuitive model of knowledge loss focuses on departure: senior staff leave, take their knowledge with them, and the process deteriorates. That does happen, and it matters. But the more common and less-noticed pattern is different. It is not that the experienced people leave; it is that the ratio between them and the newcomers shifts fast enough that the knowledge cannot transfer.
In a team where eight out of forty people are new in the last year, the experienced majority still sets the tone. The new arrivals learn by proximity and informal observation. In a team where sixteen out of forty are new in the same period, the informal transfer system is overwhelmed. The experienced staff are outnumbered in enough day-to-day interactions that the tacit knowledge stops spreading as reliably. The documented processes, which were never quite accurate to begin with, become the primary reference for the new arrivals because the informal mentorship is no longer available at the density required.
Figure: Correlation between staff-mix index and process-drift score, 142 firms.
We called this measure the staff-mix index: the product of average tenure and new-hire rate, normalised to allow comparison across firm sizes. A high index value means long-tenured existing staff and a high proportion of recent arrivals, a combination that looks stable on the surface but carries significant knowledge-transfer risk. Across our sample, the correlation between the staff-mix index and self-reported process-drift scores was 0.61, which is substantially stronger than any of the variables in our original hypothesis.
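A minimal sketch of the index as defined above: the product of average tenure and new-hire rate, normalised for comparison across firms. The study does not publish its exact scaling or raw data, so the min-max normalisation here is an assumption, and the Pearson helper is included only to show how the 0.61 figure would be computed from firm-level data.

```python
def staff_mix_index(tenures: list[float], hire_rates: list[float]) -> list[float]:
    """Product of average tenure and new-hire rate per firm,
    min-max scaled to [0, 1] (scaling choice is an assumption)."""
    raw = [t * r for t, r in zip(tenures, hire_rates)]
    lo, hi = min(raw), max(raw)
    return [(x - lo) / (hi - lo) for x in raw]

def pearson(xs: list[float], ys: list[float]) -> float:
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

Given per-firm index values and self-reported drift scores, `pearson(index, drift_scores)` is the statistic the survey reports as 0.61.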
The practical reading is this: the riskiest moment for process integrity is not when experienced staff leave. It is when the company hires quickly without a corresponding investment in how knowledge moves from the tenured majority to the new arrivals.
04 · Implications: What to do about it
The first implication is to track the ratio, not just the headcount. Most hiring dashboards tell you how many people joined; few tell you what proportion of each team is now in their first twelve months. That figure, set against the average tenure of the remainder, gives you a reasonable early indicator of where process drift is most likely to accumulate before it becomes visible in output quality or customer experience.
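The ratio-tracking suggestion above reduces to two numbers per team: the share of staff in their first twelve months, and the average tenure of everyone else. A hedged sketch, assuming hire dates are available per team; the data shape and field names are illustrative, so adapt them to whatever your HR system exports.

```python
from datetime import date

def team_mix_report(hire_dates: list[date], today: date) -> dict:
    """Early-warning figures for one team: first-year share of headcount
    and average tenure (in years) of the longer-serving remainder."""
    tenures = [(today - d).days / 365.25 for d in hire_dates]
    first_year = [t for t in tenures if t < 1.0]
    remainder = [t for t in tenures if t >= 1.0]
    return {
        "headcount": len(hire_dates),
        "first_year_share": len(first_year) / len(hire_dates),
        "avg_tenure_of_remainder": (
            sum(remainder) / len(remainder) if remainder else 0.0
        ),
    }
```

Run per team rather than company-wide: a company-level average will hide exactly the team-level imbalance the finding points at.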
The second is to treat onboarding as a knowledge-transfer mechanism rather than an orientation exercise. A two-day induction that covers company history, benefits, and a tour of the tools is not a knowledge-transfer mechanism. It is a formality. The transfer happens over weeks, through repeated co-working, through the informal question-and-answer that occurs when a new hire cannot figure out why the process works the way it does. Companies that formalise this, by pairing new hires with mid-tenure staff on live work rather than in classroom-style training, showed meaningfully lower drift scores in our data. The effect was not large in absolute terms, but it was consistent.
The third implication is specific to documentation: when you know you are about to hire quickly, treat it as a trigger to audit your most critical runbooks before the new cohort arrives, not after. The gap between what your experienced staff actually do and what the documentation says they do will be smaller at the start of a hiring wave than at the end of one. Closing that gap while you still have the institutional knowledge to close it accurately is substantially easier than closing it afterwards, when the tacit layer has already fragmented across thirty different conversations and nobody can agree on the authoritative version.
· · ·
A note on the study’s limitations. This was a self-reported survey. Drift is subjective: different respondents interpreted “serious” differently, and the absence of a common measurement standard across organisations is a genuine constraint on the conclusions. We triangulated where possible, asking for specific examples alongside the ratings, but the data cannot carry the weight of a controlled study. What it can do is point to a pattern worth investigating more carefully, especially in organisations planning significant hiring rounds. The staff-mix finding replicated consistently enough across subgroups that we are confident it reflects something real, even if the precise magnitude of the effect will look different in different contexts.