Frame Break: Where the Shadow Falls

The gap between what organizations measure and what they value

Usman Sheikh

March 19, 2026


I finally have a version of my own AI agent running. It triages my emails, responds to routine messages, transcribes podcasts and prepares briefing documents that are finally becoming useful.

A year ago I was chatting back and forth with Claude, copy-pasting transcripts in and getting summaries. A few months ago there was a marked shift when it started handling some of the execution work, pulling research, turning raw notes into something structured.

Howard Marks, in his latest Oaktree memo (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/7qh7h8h9lpld55az/aHR0cHM6Ly93d3cub2FrdHJlZWNhcGl0YWwuY29tL2luc2lnaHRzL21lbW8vYWktaHVydGxlcy1haGVhZA==), called this shift the difference between a billion-dollar market and a trillion-dollar one.

Individual productivity for the highest performers at my portfolio firms is up dramatically. Yet at the organizational level, the outcomes are still not showing up in the P&L.

Every week I hear from a portfolio company that restructured a process with AI. It didn’t produce the outcomes they wanted. Now they have a hodgepodge of half-automated workflows that may be doing more harm than good.

Something is absorbing the productivity gains before they reach the organization.

In the 1890s, factories swapped steam engines for electric motors and nothing improved for decades. The electric motor could go anywhere, but the factory floor wasn’t redesigned to use it. Productivity gains only appeared much later when new factories were built from scratch.

The popular prescription right now follows this analogy directly. Organizations need to redesign how work flows between people and agents. Redesign the factory floor and the productivity will follow.

The analogy is seductive because it offers a clear prescription. It assumes the people running the factory know what the factory floor should look like.

Marc Andreessen, on a recent podcast with David Senra (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/owhkhqhw9l9xdvfv/aHR0cHM6Ly93d3cueW91dHViZS5jb20vd2F0Y2g_dj1xQlZlM00yZ19TQQ==), told a story about working at IBM at the peak of its power in the late eighties. He counted twelve layers of management between himself and the CEO. Each layer was lying to the one above it, not with malice; each person simply wanted to look good, so they put a little spin on what they reported upward.

One layer of spin is tolerable. Two or three and it starts to compound. Twelve layers, and the CEO had no idea what was happening inside his own company. IBM even had a name for it, what Andreessen called the big gray cloud, the entourage that followed the CEO everywhere and insulated him from anyone who was doing the work.

That’s one kind of opacity: bureaucratic and emergent. Nobody decided to deceive the CEO. The structure produced the deception on its own.

There’s a second kind that’s more deliberate. I’ve written before that organizations aren’t accidentally messy, they’re strategically opaque (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/6qheh8hlr6rxe3fo/aHR0cHM6Ly9mcmFtZWJyZWFrLmNvbS9wb3N0cy90aGUtaGFsZi1saWZlLW9mLWV4cGVydGlzZS0wMTI5MjAyNg==). The partner who doesn’t document reasoning is protecting leverage. The undocumented exception is job security.

AI threatens both kinds differently. It exposes the truth of operations that bureaucratic layers were smoothing out. It also makes the mechanics of protected expertise visible and transferable.

Both kinds of opacity sustained the gap between what organizations said they valued and what they actually measured, a gap that held together only as long as humans ran the factory.

Humans absorb contradictions.

They hold two realities simultaneously: what the organization says it values and what it actually measures. They make judgment calls that the organization’s metrics don’t capture and wouldn’t reward if they did.

Mission statements are marketing copy (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/25h2hoh3909lzrf3/aHR0cHM6Ly9mcmFtZWJyZWFrLmNvbS9wb3N0cy9yaXNlLW9mLXRoZS1tZXJjZW5hcnktZGVhdGgtb2YtdGhlLW1pc3Npb24tMDcxMTIwMjU=) at most firms. The gap between the copy and tracked metrics has been invisible for decades because people were willing to carry it.

AI doesn’t absorb contradictions. It optimizes for whichever signal is encoded most clearly. In most organizations, those signals are operational: speed, cost, throughput, resolution rate. Not the aspirational ones sitting in the values deck.

When AI enters and acts on those operational signals, the output feels wrong. Efficiency may have increased, but something the client used to receive without asking for it has been lost.

The easy diagnosis is that the technology lacks judgment. The harder one is that the technology did exactly what the organization’s infrastructure told it to do, and what was missing had always lived in the humans compensating for a misalignment nobody had to confront until now.

The two popular prescriptions for closing this gap both run into the same wall.

The architectural prescription says redesign the factory. But you can’t redesign a factory you won’t look at honestly. The redesign requires defining what the organization actually does, not the version in the strategy deck but the operational reality, and that means confronting where those two diverge. Most transformations stall not on the technology but on the attempt to define that gap.

The context prescription says encode the tribal knowledge. Build context layers (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/g3hnh5hm9o9g5lur/aHR0cHM6Ly9mcmFtZWJyZWFrLmNvbS9wb3N0cy9jb250ZXh0LWdyYXBocy10by1yZXNvbHV0aW9uLWxpYnJhcmllcy0wMTE1MjAyNg==), semantic definitions, the institutional memory that lives in people’s heads. But the tacit knowledge inside organizations isn’t all wisdom worth encoding. Some of it is workarounds that grew around misalignments nobody wanted to fix. Encoding those workarounds without addressing what created them just puts the problem on autopilot.

When organizations deploy AI into their processes, it works almost like an x-ray, making visible the mismatches that humans had been quietly absorbing. Klarna deployed AI customer service agents that handled millions of conversations and collapsed resolution times. The CEO, Sebastian Siemiatkowski, told Bloomberg (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/48hvhehmlqlwe2cx/aHR0cHM6Ly93d3cuYmxvb21iZXJnLmNvbS9uZXdzL2FydGljbGVzLzIwMjUtMDUtMDgva2xhcm5hLXR1cm5zLWZyb20tYWktdG8tcmVhbC1wZXJzb24tY3VzdG9tZXItc2VydmljZQ==) that they had focused too much on efficiency and cost, and the quality wasn’t sustainable.

The human agents had been absorbing a gap between the company’s stated value of customer relationships and its actual metrics of speed and cost. The AI optimized for the metrics and the mismatch between those two had nowhere to hide.

It happens at the individual level too. A loan officer who has always used judgment to approve applications now works alongside an AI that provides recommendations. When should she override it? The question sounds like it’s about the technology’s accuracy, but the real question is whether the bank ever understood that her judgment was part of the product or just let her quietly bridge the distance between policy and the person across the desk.

An x-ray doesn’t create fractures. It reveals the ones that were already there.

Chris Argyris called the gap between what people espouse and how they act the shadow between theory and practice (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/08hwh9h2ewen52il/aHR0cHM6Ly9mdWxsZXIuZWR1L25leHQtZmFpdGhmdWwtc3RlcC9yZXNvdXJjZXMvZXNwb3VzZWQtdGhlb3J5Lw==). What his framework and every organizational development intervention since couldn’t do is force the confrontation.

The consultant’s report traveled through the same management layers Andreessen described and got reframed at each one before it reached anyone with the authority to act on it. AI doesn’t travel through layers. It reads the operational signals directly and optimizes for what’s encoded, not what’s espoused. The gap becomes visible whether leadership is ready to confront it or not.

The accumulated distance between what gets measured and what gets valued is the organizational debt separating the companies thriving with AI from the ones whose pilots keep stalling. Not technical debt or process debt. Something older and harder to see, carried for so long that very few in leadership know it exists.

The organizations I see closing this gap aren’t doing it through better models or larger transformation programs. They’re reducing the distance between the person exercising judgment and the metric that captures whether that judgment was good.

Sometimes that means flattening layers (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/vqh3hrho030m7gbg/aHR0cHM6Ly9mcmFtZWJyZWFrLmNvbS9wb3N0cy9sYWJvci12cy1pbmZyYXN0cnVjdHVyZS0xMTIwMjAyNQ==) and other times it requires building smaller from the start. The common thread is fewer hops between decisions and consequences, which is another way of saying fewer places for the spin to accumulate.

Most organizations would rather put the humans back, restore the ceremony (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/m2h7h5h35052p2am/aHR0cHM6Ly9mcmFtZWJyZWFrLmNvbS9wb3N0cy93aGVuLXRoZS1jb250YWluZXItYnJlYWtzLTAzMDUyMDI2), and make the pain manageable again. That’s the choice sitting in front of every leader whose AI pilot failed for reasons nobody can quite name.

Between the idea and the reality, between the motion and the act, Eliot wrote, falls the shadow.

Join the conversation on FrameBreak (https://a6f846b6.click.kit-mail3.com/lmupv27wq9tmhn2q082i6h8v67dn5agh3qgd5/e0hph7h7vwvmgvf8/aHR0cHM6Ly9mcmFtZWJyZWFrLmNvbS9wb3N0cy93aGVyZS10aGUtc2hhZG93LWZhbGxzLTAzMTkyMDI2)

Usman Sheikh © 2025

