Price is the determining factor in a lot of purchasing decisions. If I want a new case for my tablet, and I know which case I want, it's worth a considerable amount of my time to find the lowest price on offer for it. A penny saved and all that.
Utility purchases are driven by price sensitivity. If I can't really say that one product is a premium offering relative to another, I'll go with whatever's cheapest. I need calories after a run; a breakfast bar will do, and I don't need a designer breakfast bar.
While I was writing chapter 3 of my book, Activist Investing in Strategic Software, I spent time researching the rise of centralized procurement departments in the 1990s. Decentralization in the 1980s created inefficiencies in cost management: it wasn't uncommon to find that one division was paying far more than another division for an identically skilled position supplied by the same vendor. Centralized purchasing found efficiencies by standardizing roles and position specifications and granting preferred partner status to contract labor firms. In theory, standardized buying lifted the burden of negotiation from individual department managers and yielded cost savings for the company. Buyers could define what they were buying more atomically, and sellers swapped margin for volume.
And tech labor became a utility.
Procurement's ascendance didn't create industrial IT (there were already willing buyers and sellers of narrow skill-sets), but it certainly threw copious amounts of fertilizer on it. Within a few years, we saw significant expansion of contract labor firms (or "services", or "consulting", whichever you prefer): firms like Accenture and Infosys grew rapidly, while firms like IBM ditched hardware for services. Buying became an exercise in sourcing the lowest unit cost any vendor was willing to offer for a particular skill-set. Selling became a race to the bottom in pricing. In this way, tech labor was cast as a utility, like the indistinguishable breakfast bar mentioned above.
In captive IT, the notion of the "knowledge worker" that came to prominence in the 1980s was trampled by the late 1990s. Knowledge workers are a company's primary labor force, but through the miracle of standardization, tech people became collections of skills, and subsequently interchangeable go-bots. By extension, tech became a secondary labor force to its clients. Labor extracted rents from the client for which it toiled, but it had no equity in the outcomes it achieved. Tech labor was wage work. It might have been high-priced wage work, but it was wage work nonetheless.
With all cash and no equity, employees now had clear rules of the game, too. Certifications became the path to higher salaries. It didn't matter whether you were competent; what mattered was that Sun certified you as a Java developer, the Scrum Alliance as a Scrum Master, PMI as a Project Manager, or any employer at all as a Six Sigma black belt. In exchange for minor rent extraction by agencies exploiting an industrialized labor market, buyers received third-party reinforcement of their contract labor model.
With all the ink being spilled on subjects that managers of enterprises like to traffic in - things like Agile delivery, product organizations, platforms, disruptive technologies, and the idea economy (obviously, some more meaningful than others) - it's difficult to understand how companies still choose to source labor like it's 1997. The people I need to build long-lived products on my-business-as-a-platform-as-a-service using emerging technologies don't fit any standard procurement definition. These aren't left-brain skills; they're right-brain capabilities. If you buy the cheapest knob-twisters that money can buy, how could you possibly expect creative thought and innovative output?
At the same time, it isn't that surprising. Procurement sources all kinds of contract labor, from executive assistants to accountants to recruiters. Yes, technologies like Office, SAP and LinkedIn are fantastic, but they're not exactly the equivalent of serverless in tech. If the bulk of the labor you source is check-the-box, why would you expect - or more to the point, how could you be expected to comprehend - that tech is unique? Accounting is, well, accounting, after all. It's not a hotbed of innovation. In fact, it's usually bad news when it is a hotbed of innovation. "Innovation" in tech is - particularly for non-tech managers and administrators - just a buzzword.
In enterprises with dominant procurement functions, "worth" is a function of "cost", not "outcome". If we rent labor on the basis of how much a unit of effort denominated in time will cost, the "worth" of a development capability is the sum of the labor times its unit cost. We therefore value scale, because we treat productivity as a constant. If we don't understand sausage-making, we simply assume that more gears in the sausage-making machine will yield more sausage. We fail to appreciate the amount of energy necessary to drive those gears, the friction among them, and the distance those gears create between hoofed animal and grill-ready skinned product.
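To put that accounting in concrete terms, here is a minimal sketch; the headcount, rate, and hours are invented for illustration, and the only point is that delivered output never appears in the formula.

```python
# Effort-denominated procurement accounting: the "worth" of a development
# capability is headcount x hourly rate x hours, with productivity treated
# as a constant. Delivered outcomes never enter the calculation.

def effort_denominated_worth(roles, hours_per_year=2000):
    """Total annual cost of a capability, priced purely by metered effort."""
    return sum(headcount * rate * hours_per_year for headcount, rate in roles)

# Invented numbers: 100 contractors sourced at the lowest unit cost on offer.
large_team = [(100, 55)]  # (headcount, hourly rate in dollars)
print(effort_denominated_worth(large_team))  # 11000000; double the headcount and "worth" doubles
```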
Thus we end up with a payroll of hundreds doing the work of dozens.
Our economic definition of "worth" precludes us from understanding what's going on. We have the labor, so it must be some sort of operational deficiency. We look to process and organization, coaches and rules. All of which is looking in the wrong place. We're not a few coaches and a little bit of process removed from salvation. We staffed poorly, plain and simple.
What a development capability is "worth" has to be correlated to the value it yields, not to metered effort or even productive output. Something isn't "worth" what we're willing to pay for it; it's worth whatever it costs to replace it with something that provides the same degree of satisfaction. If we're only getting the output of dozens, we should only be paying for dozens. The capability of high-yield dozens will be more dear on a unit-cost basis, but clear accounting of systemic results will favor the cost of polyskilled dozens over locally optimized, low-capability, monoskilled masses.
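Here is the same kind of sketch with the accounting turned around, again using invented headcounts and rates: once cost is denominated in delivered outcome rather than metered effort, the dearer dozens come out ahead.

```python
# Hypothetical comparison: the same delivered outcome, priced two ways.
# All numbers are invented for illustration only.

HOURS_PER_YEAR = 2000

def annual_cost(headcount, hourly_rate):
    """Annual labor cost for a team, in dollars."""
    return headcount * hourly_rate * HOURS_PER_YEAR

# Locally optimized, monoskilled masses: cheap per hour, but it takes a
# hundred of them to deliver the outcome.
masses = annual_cost(100, 55)   # 11,000,000

# Polyskilled dozens: dearer per hour, far fewer people, same delivered outcome.
dozens = annual_cost(24, 130)   # 6,240,000

print(f"masses: ${masses:,}  dozens: ${dozens:,}")
# masses: $11,000,000  dozens: $6,240,000
```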
This is the economics of "worth".