Discussion about this post

Neural Foundry

Super thoughtful framing of AI within the recursive improvement lineage. The compiler comparison is especially useful since that was arguably the first moment machines could write software for themselves, and most people don't know that history. I've been thinking about this same continuity angle after watching enterprise AI deployments quietly compound gains while consumer chatbots plateau. The Mokyr micro-inventions framework connects it all nicely.

PEG

The hockey-stick framing assumes there’s a viable product waiting for its friction-reduction moment. But historically, technologies that showed real value (electricity, cars, phones) saw aggressive early adoption despite massive installation costs—because the benefit was obviously worth it.

Current AI adoption patterns look different: high friction costs and an unclear value proposition. That's not a pre-hockey-stick moment—it's the expert systems trajectory, where impressive benchmarks don't translate to sustained business value in deployment.

The compounding innovation narrative mistakes what drove past transformations. It wasn’t marginal improvements accumulating—it was infrastructure and economic barriers collapsing once the core value was proven.
