Large successful companies must inevitably confront the Innovator’s Dilemma: when you’re making billions of dollars from your core product, how can you do anything that might disrupt – or even disturb – that cash cow? In most cases, you don’t: you look for incremental gains, adding small wins here and there to keep the gravy train rolling, if not rocketing into the cosmos. This is the innovator’s dilemma: the successes of the past become the cage that keeps you from reaching deeply into the future.
Having worked at three different multi-billion-dollar companies, I’ve seen some different approaches to this play out at very close range. While far from scientific, I think they make for some interesting case studies; these are personal observations from up close, and I’m sure others who were in these situations have very different perspectives.
In this post I’m going to chat about one approach to “breaking the innovation logjam.”
Windows Longhorn: All-In Innovation
In the early 2000s, several major executives at Microsoft – notably Bill Gates and Jim Allchin – were getting tired of being perceived as “not innovative.” This was as Mac OS X was getting back on its feet and Apple was beginning a major innovation streak that would culminate – much later – in the wholesale reinvention of the personal computing space. All we knew at the time was that Mac OS X was starting to look really slick, and that innovations like Quartz (the desktop graphics composition engine) and Core Animation were making it feel far more modern. The Windows team was given a clear and simple message: go for it. Take all those big ideas you’ve been dying to get to but that didn’t make the cut in previous tentative, incremental releases, and take the time to do them right.
Longhorn was originally intended to be a “quick fix” release of Windows, knocking off some customer pain points before the next big moonshot. Instead, it became a moonshot itself, seeking to scratch every major itch in the Windows technology stack.
What did that entail? Primarily there were three components:
– Avalon: a completely new user interface framework, backed by markup (so that design teams and engineering teams could have a total separation of concerns) and with a deep and sophisticated animation architecture. The Windows desktop, or “shell,” was to be completely rewritten using Avalon. Early demos of the new shell showed rich 2D and 3D animation so deeply integrated into the desktop as to make Mac OS X’s minor window animations seem primitive in comparison.
– WinFS: Microsoft makes some of the most powerful database technology in the world in SQL Server. Wouldn’t this be a more powerful system for users to store their documents and data? WinFS sought to rebuild the end user’s storage and search experience atop the world’s most popular (at the time) database. (This is where I first entered the picture: as a new recruit to Microsoft Research, my first task was to design a new user model for document storage on a relational database. More on this another day.)
– Indigo: COM is the fundamental way that Windows applications talk to each other; Distributed COM, or DCOM, enables apps and components to talk to each other over the network. Indigo was “DCOM on steroids,” enabling Windows apps to communicate over the internet, with rich capabilities for apps and components to find each other across the internet, negotiate capabilities, and cooperate.
So here we have every area of the desktop computing ecosystem given a blank check to invent their ultimate form. At the same time.
And one more thing: all of these systems were being written in a language – C# – that was also being invented at the same time.
So what happened?
Technical culture: Windows had one of the largest collections of C programming experts in the civilian world. Taking these world-class experts and making them feel like beginners on a language that was still being refined was not a recipe for happiness. Tricks that had worked in COM or the Windows API didn’t necessarily work in the new world. All the old ways of getting things done, of breaking logjams and “just shipping,” were up in the air. Never have I seen “getting it right” and “getting it done” in such direct opposition.
Building on quicksand: any of these major subsystems would have been a massive endeavor – remember, this wasn’t an app, this was an ecosystem with millions of existing applications, drivers, and hardware variations. One senior Windows engineer referred to it as “doing open-heart surgery on a Formula 1 race driver – in the middle of a race – at 240mph.” Having virtually all the major subsystems of Windows being deeply reinvented while depending on each other was simply too much. Development slowed to a crawl, morale dropped through the floor, and simply getting the whole system to compile was becoming problematic.
Meltdown: at a critical juncture in 2004, another critical problem came to the surface. One of the great benefits of a “managed language” like C# is that it manages memory for you, using a process called garbage collection. In unmanaged languages, programmers must take care to return any memory to the system that they are no longer using. It’s a huge pain and greatly increases code complexity, but it’s an established methodology. In managed code, all this is handled for you. Unfortunately, there was a huge gotcha with C#: when the core “engine,” or runtime, runs out of memory itself, the whole thing just quits. This is tolerable – sometimes – if you’re an app. When you’re the operating system, it’s not acceptable.
At this point the execs had a brutal choice: push through and solve the problems, facing down a full-scale mutiny from some of their most precious technical talent, or admit defeat and junk the whole thing. They chose the latter: they dumped the entire code base, pulled the Windows Server 2003 kernel off the shelf, and started again, eventually delivering Vista on a painfully accelerated schedule. It was still an ambitious release, with deep architectural changes like GPU compositing of the desktop, deep security management of every internal API that could touch user data, and – my favorite feature – a new, blindingly fast and powerful search engine. But it basically threw away the new developer APIs, trying instead to deliver as much of the envisioned user experience as possible. In hindsight, Windows Vista was a pretty good beta for Windows 7.
What was the lesson? Watching this train wreck up close, I learned a lot about the difference between “good enough” and “perfect” – and about the need to manage the speed at which you attempt to innovate. A metaphor I’ve heard to describe this is “physics”: there is a certain pace beyond which a code base will not move, determined by a combination of legacy, culture, and clarity of purpose.
Could Longhorn have been saved? Hard to tell. It would have cost the company billions to find out. Would it have mattered? I think it would; a fresh code base, deep design tooling, and a new, more agile UX layer would have enabled a rapid pace of innovation for MSFT going forward. Instead, it took years – until Windows 7 – to get Vista’s code base to a decent level of quality. Still, I’m not sure I would have decided differently from Gates and Allchin in 2004, and I’m glad it wasn’t a call I had to make.