Saturday, January 23, 2010

Sustainability versus Efficiency

My friend Bradford Cross posted an interesting blog post on wages last week. It's a great piece, particularly his comments on Henry Ford's approach to business profitability.

To a great extent, the Ford model Brad refers to depends on the combination of volume and productivity. That aspect of the model came to a screeching halt for Ford in the 1920s, when the Model T simply passed its "sell by" date. Once the product outlived its market, sales volume dropped. Not only did Ford discontinue production in response to growing inventories, they didn't yet have their next product, the Model A, out of the design phase. They were forced to shut down the line for months. That put quite a dent in accumulated profitability. They also lost their lead in market share.

The focus on volume and productivity drives businesses to aggressively strip cost out of repeatable processes and wring more output from them to maximize profitability. In so doing, they're not focused on sustainability; they're focused on efficiency.

Sustainability requires constant change. We have to constantly think about the surrounding business conditions: labor patterns, competitive threats, customer needs and so forth. Sustainability requires us to be primarily concerned with where the business is going to be tomorrow. Efficiency requires everything to stay the same. We luxuriate in the simplicity of holding everything else constant when we focus solely on efficiency. When we pursue efficiency, we're focused on where the business is right now.

In the extreme, we optimize relative to the circumstances of this moment in the hope - the hope - that time will stand still long enough for us to draw an income, have 2-point-something kids, take a decent vacation every year, and accumulate sufficient wealth to retire.

Hope may be audacious, but it's a lousy strategy.

In efficiency-centric businesses, it’s not uncommon to find people doing substantially the same things that people were doing 10 years earlier. Because the definition of work is consistent, it’s repeatable, and that makes everybody's job that much simpler. That's true for everybody in the business: people on the line do the same tasks, people in HR recruit for the same positions, people in finance forecast costs in the same business model, and so forth. When things don't change all that much - markets, supply chains, etc. - a business can make a lot of money, and individual wage earners draw a steady income. But in an age when things change a lot, you can't make a lot of money this way for very long. A business optimized relative to a set of circumstances that are artificially held constant is a business in a bubble. Production of any sort can't operate in a bubble. At least, it can't operate in one for long. The longer it does, the bigger the mess when the bubble bursts.1

Brad mentions that Ford's model was more complete than volume and productivity. There's another dimension that, if executed well, makes a business sustainable and less prone to seismic interruption: constant innovation in response to external factors. With that must also come invention, which is of course not the same thing as innovation. Nor is externally-driven innovation the same as the internally-driven kind. To wit: while shaving a few seconds off the time it takes to tighten a bolt might make bolt-tightening more efficient, it's useless if the market has switched from bolts to rivets.

If we aggressively evolve both what we make and how we make it, nobody in production will be doing what people were doing 10 years ago, because those jobs didn't exist 10 years ago. They won't exist 10 years from now, either. In fact, we don't want entire categories of jobs that exist today to exist a decade from now. This means we have to be less focused on the known (what we're doing) and more focused on the unknown (what should we be doing?). This makes work a lot harder.

Well, as it turns out, building a sustainable business is hard work.

In innovation-centric firms, production isn't in a bubble. In fact, it's very much integrated with its surroundings. That's where Brad's reference to “worker skill” comes into play. In technology, it’s more than just a question of skills: it’s a question of both capability and the passion to acquire more knowledge.

This may seem blatantly obvious: of course those are the workers we want. How hard can it be to hire them? It's just a recruiting problem, right? Brad specifically makes the point that there's a (wildly mistaken) school of thought which assumes we can get the best people by spraying a lot of money around.

If only it were so simple, as Brad points out. It's very difficult to succeed at this, not only because it requires a change in recruiting behaviours, but also because it means significantly disrupting internal business processes. That's harder than you might think: the efficiency-centric mindset is firmly entrenched in business, government, and the universities that educate the management that runs them both.

Efficiency-centric firms are process heavy. The people in these firms - badged employee, independent contractor and supplier alike - are very heavily invested in the firm's processes. Consequently, they resist change to the processes and practices that they have worked so hard at mastering and making "efficient." This creates organizational headwinds strong enough to bend solid steel. Any "change initiative" that isn't blown away by these headwinds is corrupted by them.

So, the boss says knowledge is power, and he's told us we've got to have the most knowledgeable people in the business? No problem: we'll show how knowledge-hungry our people are. HR will set up some computer-based training and tie a portion of management bonuses to the number of training hours their people "volunteer" for. Managers will then measure their supervisors on the training hours their people receive. Supervisors will set a quota for laborers. Laborers will fill out the necessary form to show they've hit their training quota, and circulate answer keys if there's a test at the end so that nobody fails to meet it. The efficiency-centric system returns to balance with no net impact: laborers aren't inconvenienced, management receives its bonus, and the organization can now measure how much it "values knowledge." Everybody plays, everybody... well, it's complicated. Those who led the initiative "win" because they can report that a measured improvement took place on their watch. Those who sponsored the initiative are "reaffirmed" because the firm can now prove it has knowledge-hungry people. The rest don't necessarily win, but they certainly "don't lose." And isn't that the point these days, to make sure everybody is a winner?

We see this pattern repeated with all kinds of well-intended initiatives, whether it be a mission for zero defects or a drive to be Agile. People will do everything they can to sustain that which they have already mastered, even to the point - misguidedly or maliciously - of giving the appearance of innovating and changing. Efficiency-centric organizations have a stationary inertia that is extremely resistant to internally-initiated change. Only when an external event trumps every other priority - and most often it has to be a seismic event at that, such as the complete evaporation of revenue - will a bubble burst.

This kind of industrial thinking has made its way into IT. We assume our external environment (labor markets, technologies and so forth) is static, so we stand up a big up-front design, put together a deterministic project plan, and staff at large scale to deliver. We also see it more subtly when people look to code "annuities" for themselves: business-critical systems that they can caretake for many years. This creates an expectation of job security and, therefore, recurring income. This isn't just a behavior of employees or contractors looking for stable employment: there are consulting businesses built around this model.

Going back to Brad's blog post, this creates a wage discrepancy and, with it, a bubble. People who accept the annuity make the erroneous assumption that the rising tide of inflation will sustain their income levels. It's actually just the opposite: the minute somebody is working in one of those annuities, their skills are deflating, because they're not learning and accumulating new knowledge. So is the asset value of the thing they're caretaking. The people who do this misread the market (e.g., they overestimate the asset's value to the host firm) and consequently misunderstand the sustainability of their wage. The resultant wage bubble lasts until the "market" catches up: either the host firm takes costs out of maintenance (e.g., by labor replacement) or retires the asset. The person who was earning what amounted to an outsized income by being in this bubble faces the same seismic correction Ford did in the 1920s if they're not prepared with their own "Model A" of what they're going to do next.
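To make the arithmetic concrete, here's a toy sketch of that dynamic. The rates and salary are invented for illustration - nothing in Brad's post or mine prescribes them - but the shape of the outcome is the point: cost-of-living raises compound slowly while unrefreshed skills depreciate, and the widening gap is the bubble.

    # Toy model of the "annuity" wage bubble. All figures are illustrative
    # assumptions: the caretaker's nominal wage rises with annual cost-of-living
    # raises while the market value of their stagnating skills depreciates.

    INFLATION = 0.03      # assumed annual cost-of-living raise
    SKILL_DECAY = 0.08    # assumed annual depreciation of unrefreshed skills
    YEARS = 10

    wage = 100_000.0          # nominal wage inside the annuity
    market_value = 100_000.0  # what the open market would pay for those skills today

    for year in range(1, YEARS + 1):
        wage *= 1 + INFLATION
        market_value *= (1 + INFLATION) * (1 - SKILL_DECAY)
        gap = wage / market_value - 1
        print(f"year {year:2d}: wage ${wage:,.0f}  market ${market_value:,.0f}  bubble {gap:+.0%}")

    # The gap persists only until the host firm replaces the labor or retires
    # the asset, at which point the correction arrives all at once.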

The misfit of the industrial model in technology is that industrialization makes no provision for capability: each person is assumed to be the same, the only difference being that they're either more or less productive than the average, and indexed accordingly. That completely ignores the impact of the destabilizing change people make in what they do and how they do it. Disruptive, externally-driven innovation should be the rule, not the exception. Of all lines of business, this should be the case in technology. And with the right group of people, it is.

Disruptive innovation pops a bubble. A popped bubble threatens entrenched interests (e.g., those who have mastered life inside the bubble). But disruptive innovation is what makes a company sustainable.


1 I am indebted to my colleague Chereesca Bejasa for using the term "bubble" to describe a team operating to a different set of processes and behaviours within such an environment. Just as a team can be in a bubble relative to the host organization, the host itself can be in a bubble relative to its market.

Friday, January 22, 2010

Is Google to IBM as Apple is to Apple?

In the late 1970s, the microcomputer industry was still in its emergent stages. Microcomputers weren't nearly as powerful as mainframes and minicomputers. There also wasn't a clear "killer app" for them. But even at the time, it was obvious that microcomputers were going to have a significant impact on our lives. People bought computers for home and used them for work; they even brought them from home into the office. While the software was primitive, you could solve many different kinds of problems and perform sophisticated analyses more efficiently than ever (e.g., the simple "what if" forecasting that we can perform in an open-source spreadsheet today was a major breakthrough in productivity 30 years ago). Having a microcomputer in the office was something of a status symbol, if a geeky one. And they made work fun.
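As an aside for anybody who never saw it as a novelty: the kind of "what if" exercise meant here is nothing more than recomputing a forecast under a few alternative assumptions. A minimal sketch, with invented figures, of what an early spreadsheet user might have modeled:

    # A minimal "what if" forecast: project next year's revenue under a few
    # alternative growth assumptions. The figures are invented for illustration.

    baseline_revenue = 250_000  # hypothetical current annual revenue

    scenarios = {"pessimistic": -0.05, "flat": 0.00, "optimistic": 0.10}

    for name, growth in scenarios.items():
        forecast = baseline_revenue * (1 + growth)
        print(f"{name:>11}: ${forecast:,.0f}")

    # Recalculating the whole model the instant one assumption changed was the
    # breakthrough; before that, it meant redoing the arithmetic by hand.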

The microcomputer industry had some other interesting characteristics.

Most corporate technology people (working in a department known as "Information Systems" as opposed to "Information Technology") didn’t take microcomputers all that seriously. They were seen as primitive devices with little computing power. Toys, really. From the perspective of the technology establishment, “micros” were only really useful if they had terminal emulation software (such as VT100, 3270, 5250) so they could connect to a more “serious” computer.

It was a highly fragmented market. There were lots of different combinations of architectures, operating systems and CPUs. There were also lots of different manufacturers, each offering its own standard and pursuing business users: firms such as Osborne, Sinclair, Commodore, Tandy and a rather distinctive firm called Apple Computer.

No one microcomputer platform was dominant. Each sought to develop and sponsor a library of applications and add-ons in order to sell hardware. For the most part, each relied on value-added resellers as its primary channel.

IBM took a different tack when they entered the microcomputer market. IBM didn't compete against the rest of the microcomputer field; they created a new market for something they called a Personal Computer. Using off-the-shelf components, they built an open platform that anybody could replicate. Through the combination of brand, applications and reach, IBM defined the personal computer space. The prevailing wisdom at the time was that "nobody got fired for buying IBM." That made personal computers a safe corporate investment, and made IBM the standard.

For a few years, Apple and IBM waged a pitched battle. IBM, or perhaps more accurately the "personal computer" standard as defined by IBM, was victorious, and for all intents and purposes it remains the dominant platform today. And although they lost control of that which they had created, IBM enjoyed strong hardware sales, while Apple was for many years relegated to niche markets such as education and desktop publishing.

Fast forward 30 years.

A handheld computer / smartphone industry has emerged in recent years, and it shares many characteristics with the early stages of the microcomputer business.

Smartphones have been underpowered for much of the past decade, but it's pretty obvious that they'll soon become very powerful and will have a significant impact on our lives. The current "killer app" - e-mail - is really a utility function. The equivalent of the microcomputer's "what if" scenario capability hasn't yet been identified for the smartphone. But it will be, and these devices will change how we live and work. As with the early microcomputers, a lot of people have bought personal smartphones, and it's not uncommon for people to use their personal handheld for work (e.g., using an iPhone for maps/navigation). The smartphone a person carries is something of a status symbol, if a bit of a geeky one. And they're fun.

Until recently, we've been force-feeding big-screen (1440 x 900 pixel) form factors into small handheld devices. That is, until the current generation of smartphones arrived, mobile devices were useful primarily as "internet terminals" rather than as application platforms - no different from the terminal emulation of a previous generation.

It is a highly fragmented market, with competing CPUs and operating systems. There are also lots of different vendors with proprietary products, such as Nokia, Blackberry, Palm and, once again, a firm called Apple Computer.

No one platform is dominant. Each is seeking to create and sponsor a library of applications as a means of gaining market share. Most sell through value-added resellers.

Google recently entered this market, and in many ways they're taking the same approach IBM did. By offering an open platform, Google lets anybody build an Android-compatible phone. They've built out a sizable applications catalogue in a short amount of time. They also have brand and reach, although it can't be confirmed whether somebody has been fired for buying Google.

It's interesting to see not only the same market phenomenon playing out with a different technology, but Apple Computer (and specifically Steve Jobs) once again at the epicenter of it.

Perhaps it will turn out differently this time. Apple has been through this same dynamic once before. They can also learn from Microsoft's unsuccessful attempts to make Windows Mobile a ubiquitous platform. And Google has entered the hardware business around the Android platform, but they're not a hardware company. However, none of this may matter. In the 1980s, the value was in the hardware, but the lion's share of revenue in the Android market won't be in hardware sales. This means Google is following a similar pattern but changing the attributes. They're not pursuing a 30-year-old strategy so much as updating that strategy to become the dominant provider in a current market.

No matter how this plays out, it's shaping up to be an epic battle for platform supremacy, just as we experienced 30 years ago. The microcomputer industry was highly innovative in the 1980s. It was an exciting business to be in. No doubt the same will be true of the smartphone / handheld computer business in the 2010s.

Mark Twain is credited with the observation that "history doesn't repeat itself, but it does rhyme." We're witnessing this now. Best of all, we have a front-row seat.