Several colleagues recently shared Omer Singer’s astute analysis of Splunk’s innovator’s dilemma. While his insights are spot-on, we’re watching something even more fundamental unfold in the event data space – a story that goes beyond classic disruption theory.
The narrative begins with a simple truth: what started as a bargain has become a burden. Early adopters of Splunk remember our initial value proposition (I was head of product there for the company’s first seven years): its search approach dramatically simplified making sense of machine data, its pricing was reasonable, and there was no serious alternative.
But as data volumes exploded and use cases multiplied, something shifted. What was once a cost-effective option has become, for many organizations, their most expensive operational line item. This isn’t just about inflation or scaling costs. It’s about a fundamental misalignment between legacy architectures and modern data realities.
When you’re a market leader with an established revenue model, optimizing for customer cost savings isn’t just difficult — it can be existentially threatening. Your investors expect consistent growth. Your top salespeople rely on those large deals. Your entire organization is built around maintaining and growing that revenue structure. Splunk’s aging bones will bring new owner Cisco several billion dollars in inertial license renewals this coming year. Cisco won’t risk refactoring that architecture.