Engines of Complexity

Any time a new technology is introduced, the market traditionally extends conventional modes of use and understanding from the new thing’s nearest neighbor. In their initial instantiations, automobiles were horseless carriages, television was visual radio, and cellular phones were phones without a wall plug. It took time and creative genius, both individual and collective, to find these technologies’ transformational power. Automobiles changed everything from courtship rituals to urban planning. Television news was transformed by such visionaries as Reuven Frank, while Ernie Kovacs, Sid Caesar, and their successors invented new forms of comedy and entertainment. In many parts of the world, mobile phones have evolved to the point where they are only marginally voice devices.

In the early years of enterprise computing, information technology accelerated known processes such as billing, statement generation, and basic accounting. Finding return on investment was fairly straightforward because baseline time and cost metrics could be compared to the results of automation. Advances in business practice soon complicated this relationship, even before networking: as spreadsheets such as Lotus 1-2-3 transformed the trading of financial instruments, and e-mail later made real-time global communications possible and cheap, computation no longer merely automated existing practices. Rather, it facilitated entirely new ones.

With the move from processor-centric computing architectures (mainframes, minis, then PCs) to a network-centric model, these tendencies appear to have intensified: calculating ROI on networked services such as antivirus or e-mail, or on service-centric architectures in the SOA vein, is difficult if not impossible. The costs and benefits are highly distributed (and disputed), the speed of change can be rapid, and the baseline for cost savings is hard to establish for activities with only loose precedents or none at all. The contested claims and experiences of business value can make deploying systems that support such processes as analysis, collaboration, and visualization difficult, more for managerial than technical reasons.

From a management standpoint, the situation gets worse. With local applications running on desktops or servers, risk, cost, and other consequences can be reasonably well anticipated and controlled. In a network of networks, I can catch not only your cold when you sneeze, but also the cold of virtually anyone on the networked planet, even when no sniffle is ever heard. With multiple dependencies, latency, and other facts of network computing life, monitoring service levels, tracing root causes, and validating behavior become significant challenges. Signing off on a process’s robustness, auditability, replicability, accuracy, and level of protection becomes an act of faith.

One key factor is an essential network property, the so-called fat tail. Whereas many routine algorithms in business and industry assume Gaussian bell-curve distributions of probabilities, many network scenarios adhere to power-law distributions in which notions of “normal” and “median” lose all meaning, as the gloss and the small simulation below illustrate.
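As a gloss of our own (not the author’s formulation), the difference lies in how fast the tails decay: Gaussian tails fall off exponentially, power-law tails only polynomially:

$$
p_{\mathrm{Gauss}}(x) \propto e^{-(x-\mu)^2/2\sigma^2},
\qquad
\Pr(X > x) \sim C\, x^{-\alpha}.
$$

For a power law with $\alpha \le 2$ the variance diverges, and for $\alpha \le 1$ so does the mean; sample averages never settle down, which is the precise sense in which “normal” stops being informative.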

- Chris Anderson’s “long tail” has become a commonplace in Silicon Valley. Rather than living on large-selling hits, virtual retailers such as eBay, Amazon, and Netflix make money from the onesies and twosies that their physical counterparts cannot afford to stock. In fact, Netflix would much rather you watched its one copy of a Hungarian banjo documentary than waited in line for one of its thousands of copies of Pirates of the Caribbean, which it will have to dispose of in a few months’ time.

- The predilection of networks toward winner-take-most outcomes is what inspired Anderson in the first place. It’s no secret that a small number of sites collect the vast majority of Internet traffic. Rather than following an 80/20 rule, power laws look more like 95/1: less than 1% of the site population accounts for 95% of traffic.

- The extreme scale of adverse events in networked scenarios is easily understood. While the tallest and shortest human adults in any population may differ by a factor of 2 or 3, with both extremes falling symmetrically off a Gaussian median, networked phenomena such as money or traffic can differ by factors of millions, with no necessary central tendency. The dramatic exposure to adverse events such as Amaranth’s or Long-Term Capital Management’s meltdowns, or a global liquidity crunch, would be impossible in a world in which banks were local, investment was primary (rather than derivative), and neither trade nor capital moved fluidly around the globe.
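A minimal simulation sketch in Python makes the contrast concrete. The parameters (a 170 cm mean height with 10 cm spread, a Pareto tail exponent of 1.1) are illustrative assumptions of ours, not figures from the discussion above:

```python
# A minimal sketch contrasting Gaussian and power-law (fat-tail) extremes.
# All parameters are illustrative assumptions, not data from the article.
import random

random.seed(42)
N = 1_000_000

# Gaussian: adult heights in cm (assumed mean 170, sd 10).
heights = [random.gauss(170, 10) for _ in range(N)]

# Power law: Pareto with shape alpha = 1.1, so the variance is infinite
# and the mean barely exists.
traffic = [random.paretovariate(1.1) for _ in range(N)]

def max_to_median(xs):
    """Ratio of the largest observation to the median."""
    xs = sorted(xs)
    return xs[-1] / xs[len(xs) // 2]

print("Gaussian max/median:", round(max_to_median(heights), 2))
print("Pareto   max/median:", round(max_to_median(traffic), 2))

# Concentration: share of total "traffic" held by the top 1% of sites.
top_share = sum(sorted(traffic, reverse=True)[: N // 100]) / sum(traffic)
print("Top 1% share of total:", round(top_share, 3))
```

On a typical run the Gaussian ratio stays near 1.3, while the Pareto ratio runs into the hundreds of thousands and swings wildly from seed to seed; the top 1% routinely captures more than half of the total, in the spirit of the 95/1 observation above.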

Given these dynamics, Stan Davis’s observation of the shift from “crunching to connecting” can be extended. It seems a short leap to postulate the following: whereas enterprise computing in its first two to three decades automated calculation and thereby increased efficiency, networked computing amplifies complexity, including noise. The consequences of this tendency are both positive and negative.

Downsides include the increased difficulty of fitting these environments into hierarchical forms of organization, in that collaboration extends across “home” enterprises. Whose money is the vendor team spending? To whom are they proximately and ultimately accountable? E-mail is a classic example: it removed layers of bureaucratic organization and aligned with a tendency toward greater social informality. The cost of e-mail (in both anti-spam technology and people’s time and attention) is vast, but no payback metric is forthcoming: “just do it” becomes the managerial justification.

“Securing the perimeter” of a networked organization (whether in the technological or organizational sense) becomes a contradiction in terms. This week’s news of a sophisticated attack on Hannaford Brothers supermarkets raises the stakes considerably: rather than hacking into repositories of stored credit-card data, this distributed attack compromised and then transmitted payment information from point-of-sale systems on the fly. There are other complications, to be sure.

The upsides of exposure to increased complexity befit a global economy premised more on services than products. One benefit lies in increased access to search space: a given problem can be addressed by solutions garnered from a potentially vast pool of suggestions, local optima, or combinations of disparate elements. Time can be dramatically accelerated: numerous examples confirm the potential for what Alfred Chandler called “economies of speed” to be realized by networks of various kinds.

The importance of improvisation, about which we’ve written previously (summer 2007), grows in such dynamic environments, where resources are ample and diverse; curiously, improvisation in the strict sense works less well in scenarios characterized by extreme scarcity. This is not the place to explore how organizational hierarchies, which manage physical scarcity better than they do virtual surplus, may have to evolve amid information and relationship abundance.

Thus it may be past time for new managerial metrics and practices within traditional enterprises. Hierarchical (bureaucratic) norms don’t appear to do a very good job of governing a mindset and a body of technology that fit poorly into paper-based, slow-moving, and strictly delimited organizations. Complexity is painful when compared to order, but once attributes such as ambiguity, emergence, and adaptation enter the vocabulary, that same complexity can be viewed (and exploited) as a resource rather than only being “managed” as a constraint.
