Intentionally Designed Systems Can Go Wrong
In a previous post, I argued that most organizations don't design their management systems. They inherit them. And I suggested that engineering those systems with care and intent might offer a powerful path forward.
But that framing needs a second half.
Robert McNamara engineered a management system. It was deliberate, logical, and internally coherent. Whether the Vietnam War itself was strategically or morally defensible remains contested, but the way it was conducted, driven in large part by that system, was neither. The structure didn't just fail to prevent harm; it encouraged it.
According to Nick Hodges in a recent InfoWorld article, McNamara "took the view that anything that can't be measured should not be taken into account in the decision-making process" and "made metrics the sole means of determining the correct course of action." Body count, as a primary performance measure, didn't merely distort incentives—it encouraged brutality.
That mindset didn't end with Vietnam. It lives on in boardrooms and dashboards across every industry, creating what Hodges calls the McNamara Fallacy—reducing decisions to data and excluding anything that resists quantification.
The Pattern Repeats Everywhere
Consider software development, where teams chase metrics like lines of code written, story points completed, or pull request cycle time. These numbers are easy to collect and display—but they often have little connection to actual progress or value. A developer who writes bloated, unnecessary code can outperform a teammate who solves the same problem elegantly. A team that rushes to close tickets may look productive while delivering little that matters.
The tools are more sophisticated now, but the trap remains identical: mistaking activity for impact.
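To make that concrete, here is a tiny illustrative sketch. The two functions and the scoring rule are invented for the example; they aren't drawn from any real metrics tool:

    # A minimal, hypothetical sketch of the "activity vs. impact" trap.
    # The function names and the scoring rule are illustrative only.

    VERBOSE_SOLUTION = """
    def total(values):
        result = 0
        for index in range(len(values)):
            current = values[index]
            result = result + current
        return result
    """.strip()

    CONCISE_SOLUTION = """
    def total(values):
        return sum(values)
    """.strip()

    def naive_productivity_score(source: str) -> int:
        # Count non-blank lines of code, the way a simplistic dashboard might.
        return len([line for line in source.splitlines() if line.strip()])

    print("verbose solution scores:", naive_productivity_score(VERBOSE_SOLUTION))   # 6
    print("concise solution scores:", naive_productivity_score(CONCISE_SOLUTION))   # 2

Both functions do the same job. The naive line count simply rewards the one that took more motion to get there.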
Or take the procurement decisions I've written about, where organizations compare fully loaded internal costs to marginal external quotes. The math is precise. The methodology is standard. The savings are phantom. Finance approves the switch, but the promised cost reductions never materialize because the fixed costs remain while utilization diminishes.
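To see why the savings evaporate, here is a back-of-envelope sketch. The volumes, rates, and quote are hypothetical, chosen only to show the mechanism:

    # A back-of-envelope sketch of the phantom-savings comparison.
    # All figures are invented for illustration.

    VOLUME = 10_000            # units per year
    FIXED_COSTS = 600_000      # facilities and salaried staff: stay even if the work leaves
    VARIABLE_COST_PER_UNIT = 40
    EXTERNAL_QUOTE_PER_UNIT = 75

    # What the spreadsheet compares:
    fully_loaded_internal = VARIABLE_COST_PER_UNIT + FIXED_COSTS / VOLUME   # 100 per unit
    print(f"Fully loaded internal rate: {fully_loaded_internal:.0f} vs external quote: {EXTERNAL_QUOTE_PER_UNIT}")

    # What actually happens after the switch: the fixed costs don't leave.
    spend_before = FIXED_COSTS + VARIABLE_COST_PER_UNIT * VOLUME            # 1,000,000
    spend_after = FIXED_COSTS + EXTERNAL_QUOTE_PER_UNIT * VOLUME            # 1,350,000
    print(f"Total spend before: {spend_before:,}  after: {spend_after:,}")

The per-unit comparison promises 25 in savings on every unit, yet in this sketch total spend rises by 350,000, because the fixed costs never left the building.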
In each case, the system performs exactly as designed. That is the problem.
When Measurement Becomes Mission
What starts as a directional signal can quietly harden into assumed truth. Over time, no one remembers what the measure was meant to reflect—just that it needs to move upward. And once movement becomes the goal, behavior adapts to hit the number, not the purpose behind it.
I've seen this pattern across industries: manufacturing teams that optimize unit costs by running oversized batches, creating inventory problems downstream. Sales organizations that celebrate revenue from deals that actually lose money. Customer service departments that keep call times low while satisfaction scores crater.
Each group hits its targets. The business suffers anyway.
Systems like these persist not because everyone agrees with the outcomes, but because the process appears rational. As long as the spreadsheets stay full and the graphs trend upward, the cost of misalignment remains invisible—until it isn't.
The Leadership Trap
Here's what makes this particularly insidious: these systems often get implemented by competent, well-intentioned leaders who believe they're bringing scientific rigor to business decisions. The logic is seductive. If you can't measure it, how can you manage it? If you can measure it, shouldn't that drive your decisions?
The answer is more complex than the question suggests. Some things resist measurement but still matter enormously. Others can be measured precisely but don't indicate what you think they do. And the act of measuring—especially when tied to rewards and consequences—changes the behavior you're trying to assess.
McNamara wasn't stupid. He was applying what appeared to be disciplined, analytical thinking to an impossibly complex situation. But when measurement systems become disconnected from context and unmoored from judgment, they create scale without discernment.
That's not just a management risk. It's a leadership failure.
The Missing Element
So engineering isn't the villain here. System design is still the missing discipline in most organizations. But when design work ignores context, dismisses judgment, and assumes that precision equals importance, it becomes dangerous.
Real system work—the kind that actually improves outcomes—isn't just analytical. It's moral. It involves naming tradeoffs, questioning incentives, surfacing tolerances, and asking who benefits from the defaults.
Design must be anchored in discernment. Measurement needs context. And system curation requires the wisdom to know what shouldn't be optimized, even when it could be.
The Modern Versions
The Vietnam example feels extreme, but diluted versions play out constantly in contemporary business. I've encountered organizations where:
Quality metrics reward finding defects rather than preventing them, creating perverse incentives for inspection teams
Efficiency measures drive behaviors that optimize individual departments while destroying coordination across them
Customer satisfaction surveys get gamed through selective sampling rather than improved through better service
Training completion rates stay high while actual capability gaps persist and widen
These aren't accidents. They're symptoms of systems doing exactly what they were built to do—just without anyone asking whether they should.
What This Means for Leaders
The temptation, when faced with McNamara-style failures, is to abandon systematic thinking altogether. To rely on intuition, relationships, and informal coordination. But that's not the answer either. Intuition scales poorly. Relationships create bottlenecks. Informal coordination breaks down under pressure.
The solution isn't less system design—it's better system design. Design that includes context, acknowledges complexity, and preserves space for judgment. Design that asks not just "can we measure this?" but "should we?" Not just "does this number move?" but "does it move toward what we actually want?"
That requires leaders who understand that systems aren't neutral. They embody values, assumptions, and priorities. They shape what people notice, ignore, and act on. And they can either amplify human judgment or replace it.
The choice isn't between measurement and wisdom. It's between measurement in service of wisdom and measurement as a substitute for it.
The Work Ahead
If you recognize these patterns in your organization—if you see metrics that drive counterproductive behavior, processes that optimize for the wrong outcomes, or dashboards that show progress while performance stalls—the solution isn't to throw out systematic thinking.
It's to examine the systems themselves. To audit not just whether they're working as designed, but whether they're designed for what you actually need. To question not just the numbers, but the assumptions behind them.
This work can't be rushed. When measurement changes, behavior changes—and not always in predictable ways. System realignment requires understanding how current measurements shape behavior, where they conflict with actual objectives, and what constraints make change difficult.
Most leaders were trained to work within existing systems, not to examine the systems themselves. That's not a failing—it's just not common training. But it creates a gap between what gets measured and what gets managed.
The decision about what your systems are for lives upstream of every metric and every workflow. That decision shapes everything that follows.
And it can't be outsourced.
System curation is leadership work, but it's specialized leadership work requiring tools and perspective that most leaders weren't trained to use. If you're ready to examine the systems that run your business—and ensure they're designed for what you actually need—let's talk.

