Saturday, August 25, 2012

The Next Great Growth Cycle — The American Magazine

Today's techno-pessimists say technology and America have plateaued. Such naysayers flourish during economic recessions. They have been wrong in every one of the 19 economic downturns we have experienced since 1912. They're wrong again.

Apple went public in December 1980, before today's 50 million millennials were born. There followed the longest run of economic growth in modern history, spanning five presidential terms from Reagan through Clinton. Apple grew to become the world's largest company by market capitalization and a tech icon.

That was then. This is now. According to today's techno-pessimists, nothing like that can happen again because technology and America have plateaued. Such naysayers, who flourish like mushrooms in the depths of economic recessions, have been wrong in every one of the 19 economic downturns we have experienced since 1912. And they're wrong again.

Let's quote a few prominent examples:

"We have failed to recognize that we are at a technological plateau." — Tyler Cowen, economist, popular blogger, and author of The Great Stagnation.

"The harsh reality … is that the next 25 years (2013-2038) are highly unlikely to see more dramatic changes than science and technology produced in the last 25 (1987-2012)." — Niall Ferguson, uber-historian, Harvard professor, and widely read author.

"No more fundamental innovations are likely to be introduced to change the structure of [today's] society .... Like every previous civilization, we have reached a technological plateau." — Jean Gimpel, technology historian, professor, and author.

There is one salient difference among the above three views. The first two were written in 2011 and 2012, respectively, while Gimpel's conclusion comes from his excellent 1975 book, The Medieval Machine: The Industrial Revolution of the Middle Ages, wherein he extrapolated history's lessons to inform the future. On the book flap of the 2003 re-release of Gimpel's book we find:

Gimpel … did not foresee the digital boom of the 1980s and 90s and the development of post-industrial economies. Nevertheless, his predictions may provide valuable material for historians of the recent past.

Indeed they should. The issue is more than an academic exercise. The techno-pessimists are innovation Malthusians cut from the same cloth as the resource Malthusians. Each time reality proves them wrong following a crisis, they say a variant of the same thing: I may have been wrong before, but I'm right this time.

That long, post-1980 run of "irrational exuberance" happened because of an underlying technological revolution: the advent of distributed computing and the Internet. Technological innovation is pivotal to whether the American economy will again experience growth and prosperity. In a world with a growing population but a tepidly expanding economic pie, we see shrunken expectations and a reversion to fighting over how to get one's "fair share." People lose faith that the pie will ever grow again; in essence, they lose faith in the future itself. Certainly there's limited optimism today about technology's future and what that might mean for the economy, jobs, debt, taxation, and fairness.

We've been here before. Back in 1980, America was deep in the mire of the Carter recession, with a wounded economy barely limping along. The real estate and job markets were in shambles. Boomers faced dim prospects as they poured out of colleges in record numbers. Then as now, the Middle East was in turmoil, and energy took center stage for the first time as a subject of national debate. And most people thought the big innovations that had transformed the world during the preceding three decades were essentially played out.

The big worry of the day was that Japan, having launched a national effort to leapfrog America's mighty computer industry, was about to overtake our economy. The Japanese juggernaut seemed unstoppable. The journalism and headlines of that era are eerily similar to those today bemoaning China's ascendance and America's lethargy.

Then came the post-1980 boom arising from the confluence of two great forces. There was a government that, through three successive administrations, held a favorable attitude toward the private sector. And that private sector was unleashed at the right moment in history, just when the next cycle of information industries began to emerge.

It was understandable that people then did not see the next wave coming. They had witnessed such momentous change over the 30 years since 1950 that it was hard to envision what could come next, other than incremental variants on what was already in place. They expected more computing and communications, to be sure, but mainly more mainframes and landlines.

From 1950 to 1980 the world had gone from vacuum tubes and copper wires to transistors and fiber optics; from the first transatlantic phone cable to geostationary satellites. Wired communication speeds had risen 10-million-fold.

From 1950, when Reagan was president of the Screen Actors Guild, to his inauguration as president of the United States, we went from the UNIVAC vacuum-tube computer to the ubiquitous IBM 370 mainframe. Computer speeds rose 1,000-fold while computing costs collapsed 10,000-fold.

That era saw the idea of software emerge from mathematical musings to an industry meriting an employment line-item in the Census. By 1980, no bank, business, or university worth its salt was without a computer. We were deep in the Age of Central Computing. 

But then came the post-1980 economy, built on the technology advances that had already occurred. We saw entirely new businesses, services, and opportunities that no one had imagined, but that were enabled by the technology that preceded them.

What can we say now about what will be built on the foundations of the advances that have taken place since 1980? We know one thing. Over the past 30 years, compute-communicate technologies have advanced even more than they did from 1950 to 1980.

Computing speeds are up 200,000-fold since 1980, while costs have collapsed a million-fold. We've seen the emergence of wireless networks with speeds 1 million times faster and bandwidth costs down 100-fold.
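
As a rough back-of-the-envelope sketch, those multiples imply compound annual improvement rates of roughly 50 percent for speed and nearly 60 percent for cost. The arithmetic below assumes a 30-year span (1980 to roughly 2010), which is an illustrative assumption rather than a figure from the article:

# Rough sketch: implied compound annual rates behind the article's multiples.
# The 30-year span is an assumption for illustration.
years = 30
speed_multiple = 200_000    # computing speed gain since 1980, per the article
cost_multiple = 1_000_000   # computing cost collapse since 1980, per the article

speed_cagr = speed_multiple ** (1 / years) - 1   # ~0.50, i.e. ~50% per year
cost_cagr = cost_multiple ** (1 / years) - 1     # ~0.58, i.e. ~58% per year

print(f"Implied annual speed gain: {speed_cagr:.0%}")
print(f"Implied annual cost decline: {cost_cagr:.0%}")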

What exactly does all this portend for our future, for new services, products, and companies, and for the next Apple? When it comes to predicting the future—especially of technology—with all due respect, one does not turn to historians or economists. Peter Drucker, the brilliant management consultant famous for his predictions, used to say that he only predicted what had already happened.

We are poised to enter a new era that will come from the convergence of three technological transformations that have already happened: Big Data, the Wireless Wired World, and Computational Manufacturing.

The era of Big Data is upon us, driven by the remarkable fact that for all practical purposes, computer processing power and data storage are free. When something truly useful becomes virtually free, growth can be explosive. Information technology is now undergoing a change as fundamental as the 1980s' emergence of the Internet itself, which—many have forgotten—emerged from and surpassed that era's already enormous telecom industry.

Big Data drives a fundamentally new information architecture. The emergence of the Internet—call it Internet 1.0—was primarily characterized by the proliferation of distributed personal computing. The architecture for Big Data—Internet 2.0—is epitomized by the proliferation of staggeringly large concentrations of easily accessible, but physically remote, super-computing that has been labeled the Cloud.

The Cloud comprises an evolving and growing network of tens of thousands of massive warehouse-scale data centers, any one of which would make a supercomputer of a decade ago look positively antediluvian.

The practically free, and nearly instantaneous, data muscle of the Cloud first enabled the creation of companies like Facebook and all manner of social media and e-commerce. Now it is beginning to enable industrial, commercial, scientific, and medical revolutions anchored in meta-data analyses. Astronomical feats of data crunching are now affordable, enabling new and previously unimaginable services and businesses. We have so far witnessed only the inklings of what is possible.

Maybe we have lost our capacity for amazement. But consider this: When the IBM 370 mainframe was introduced in 1970, it managed the blazing speed of 1 million instructions per second, or 1 MIPS. A modern tablet, the iPad, can process 1,000 MIPS, and at one-ten-thousandth of the cost. In just one Cloud data center, we pack in the equivalent of tens of thousands of such microprocessors.
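
Taken at face value, that comparison implies roughly a ten-million-fold gain in computing per dollar. Here is a minimal worked check using only the figures quoted above:

# Minimal check of the IBM 370 vs. modern tablet comparison, using the article's figures.
mainframe_mips = 1               # IBM 370 (1970): ~1 MIPS
tablet_mips = 1_000              # modern tablet: ~1,000 MIPS
tablet_cost_ratio = 1 / 10_000   # tablet costs one-ten-thousandth as much

speedup = tablet_mips / mainframe_mips                 # 1,000x raw speed
mips_per_dollar_gain = speedup / tablet_cost_ratio     # 10,000,000x price-performance
print(f"Raw speedup: {speedup:,.0f}x; MIPS per dollar: {mips_per_dollar_gain:,.0f}x")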

And the trend in storing information, not just processing it, follows an even steeper trajectory than that of computing power. To store a single e-book in 1980 required $10,000 of hardware; today it costs less than two cents. The data storage industry is growing at 50 percent a year. People will soon spend more money storing virtual bits than they do storing physical stuff. The commercial, industrial, research, and medical fields are storing so much information that a new class of services focused on peta-scale storage has arisen. (A petabyte is a million gigabytes.)
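
For a sense of scale, the figures above imply roughly a 500,000-fold decline in the cost of storing that e-book, and 50 percent annual growth means the storage industry doubles about every 21 months. A quick sketch of the arithmetic:

import math

# Storage arithmetic implied by the figures above.
cost_per_ebook_1980 = 10_000.00   # dollars of hardware in 1980, per the article
cost_per_ebook_now = 0.02         # dollars today, per the article
annual_growth = 0.50              # storage industry growth rate, per the article

cost_decline = cost_per_ebook_1980 / cost_per_ebook_now       # ~500,000x
doubling_time = math.log(2) / math.log(1 + annual_growth)     # ~1.7 years

print(f"Cost decline: {cost_decline:,.0f}x")
print(f"Industry doubling time at 50%/yr: {doubling_time:.1f} years")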

Enormous quantities of data and information processing are migrating into the Cloud across all sectors, from supply-chain management to travel, from manufacturing to transportation and construction, and from education to healthcare.

The Promethean challenge of the new era of Big Data resides in extracting value and actionable information from the data deluge. Here we find another new industry—even new college degrees—devoted to the techniques for managing and mining that exaflood. (An exabyte is 1,000 petabytes.)

Big Data analytics and services, nonexistent just a few years ago, already constitute a $3 billion industry and will reach $20 billion within half a decade. Cloud-centric services will drive productivity and, collaterally, job growth across all sectors.
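
Growing from $3 billion to $20 billion in half a decade implies a compound annual growth rate of roughly 46 percent. A quick check of the arithmetic, taking "half a decade" as five years:

# Implied growth rate of the Big Data analytics and services market, per the article's figures.
revenue_now = 3e9        # ~$3 billion today
revenue_future = 20e9    # ~$20 billion in half a decade
years = 5

cagr = (revenue_future / revenue_now) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.0%}")   # ~46%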

We have entered an era in which information about data is more valuable than the data itself. Raw data storage and computing power are so cheap that the key now is what to do with the data, and how to package and deliver the results.

This brings us to the second macro trend, the delivery of information—the connective tissue that brings all the computing power to people and businesses. Fiber optics created the wired broadband delivery backbone and unleashed Internet 1.0. Wireless broadband similarly unleashes Internet 2.0 and ubiquitous access to the Cloud.

It is already happening. There has never in history been a time when a billion people—eventually the majority of humans, and their machines—could effortlessly communicate, socialize, and trade in real time, all the time, anytime, anywhere, with anyone … and any thing.

The economic and social implications of the collapse in the cost of wireless mobile connectivity are as big as those that followed the dawn of telephony itself. It will bring opportunity and even chaos—witness the Arab Spring.

And video is the fastest-growing part of all network traffic. Cisco forecasts a 2,000 percent increase in video traffic over the coming five years, leading to more traffic on the Internet in five minutes than there was in all of 1996. As fast as network providers try to build out, they cannot keep up with demand. Faster is better, and as it inevitably gets cheaper, people will want more.

The third grand technology shift is already starting to happen in the emerging Computational Manufacturing revolution. This is the first core shift in how we manufacture things since Henry Ford launched the economic power of "mass production."

We have already seen some evidence of this transformation in how "automation" has been applied to manufacturing and the supply chain. But computational manufacturing is much deeper and broader. It begins with something commonly called 3D printing—or "additive" or direct-digital manufacturing. This is literally the "printing" of parts and devices directly from a computer model or image, using lasers, electron beams, or microwaves, and powdered raw materials.

3D printing is radically improving and accelerating the design process and already produces commercially viable final parts in some niche applications such as highly customized parts for aircraft or medical devices like knee joints.

A directly related aspect of the manufacturing revolution emerges from the computational design of the materials themselves. Engineers can use supercomputing power to design and build from the molecular level up, optimizing features, creating new materials, radically improving quality, and reducing waste. For example, there are materials like graphene, which offers as much promise as silicon itself did, and bizarre constructs of so-called metamaterials, which enable, literally, features like invisibility.

Computational manufacturing is poised to become a trillion-dollar industry, unleashing as big a change in how we make things as mass production did in an earlier era, and as the agricultural revolution did in how we grew things. It is a manufacturing paradigm defined not by cheap labor, but by high talent.

Of course you will hear the usual pessimists warn that this is yet another form of automation taking away jobs. We've seen this movie before. U.S. manufacturing output doubled in the last 30 years. But while the manufacturing labor force decreased, overall employment expanded. (The recent Great Recession is a temporary setback, but one that technology can turn around.)

More employment inevitably comes from productivity driving economic growth, not from labor-intensive activities—whether in manufacturing or energy production. Milton Friedman, the great economist, made a famous observation about this phenomenon while traveling in China years ago. He saw a large number of men digging with shovels to build a dam. The economist pointed out that fewer men with the right equipment could build more efficiently. But Friedman's host objected; if they did that they "wouldn't be employing the other men." Friedman's response: "Well, in that case, why not give them all spoons?"

Consider that today China and the United States have about the same manufacturing output. China has about 100 million manufacturing workers; in the United States, there are fewer than 12 million. China knows full well where the trajectory leads. It is, of course, the same one that began in American agriculture over a century ago. Once, 40 percent of American workers—about 12 million people—toiled on farms. Technology enabled a 600 percent rise in agricultural output, and America today has only 3 million people who call themselves farmers—barely 2 percent of our workforce.
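
The agricultural arithmetic can be made concrete. The workforce totals below are implied by the percentages above rather than stated in the article, so treat this as a rough sketch:

# Rough sketch of the farm-labor arithmetic above; workforce totals are implied, not stated.
farmers_then, share_then = 12_000_000, 0.40   # ~40% of workers once farmed, per the article
farmers_now, share_now = 3_000_000, 0.02      # ~2% of workers farm today, per the article
output_multiple = 7                           # a "600 percent rise" read as a sevenfold increase

workforce_then = farmers_then / share_then    # ~30 million workers, implied
workforce_now = farmers_now / share_now       # ~150 million workers, implied
output_per_farmer_gain = output_multiple * (farmers_then / farmers_now)   # ~28x

print(f"Implied workforce then: {workforce_then:,.0f}; now: {workforce_now:,.0f}")
print(f"Output per farmer rose roughly {output_per_farmer_gain:.0f}-fold")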

The emerging grand transformations—Big Data, Wireless Broadband, Computational Manufacturing—are all integral parts of the next great cycle of the information economy. Returning to Drucker, the evidence that this transformation has already happened is visible in Census data: the share of our economy devoted to moving bits—ideas and information—is already much bigger than the share associated with moving people and stuff.

If we add up everything involved in transporting stuff and people—from making and using cars and airplanes, from F-150s to Norfolk Southern and FedEx—it accounts for roughly half a trillion dollars of our GDP. The transportation-centric part of our economy drove much of the post-World War II growth. And while it is still vital, no one believes it is where future growth will come from.

By comparison, when we account for everything associated with information—from digital movies and server farms to iPhones and Intel, to health records and data mining—we already find 2 trillion dollars of our GDP.

And this information-centric future creates jobs far beyond the obvious opportunities for engineers. In his recent book The New Geography of Jobs, U.C. Berkeley economist Enrico Moretti finds that the innovation economy's jobs have triple the multiplier effect of traditional manufacturing jobs, creating more ancillary jobs. Moretti also finds that innovation-centric jobs generate higher pay.

There is an ancillary fact of interest here. The stuff-moving industries are all about liquid fuels and oil. The bits-moving industries are, of course, all about electricity. The activities associated with moving bits already consume more electricity than all of the office buildings in America, and more than the metal and chemical industries combined. This speaks volumes about the importance of the electric sector and our dependence on it. Electric reliability is even more important to our information economy than it is to India's emerging industrial economy.

Not only does the United States have the world's most sophisticated, reliable, and low-cost electric grid, a vital piece of infrastructure fueling the information industries, but it also leads in the development of each of these core technological transformations. All things considered, there is every reason to be optimistic about our future and that of the rising millennials.

Still, Robert Samuelson, the award-winning economics columnist, was not alone when he recently wrote:

It is an axiom of American folklore that every generation should live better than its predecessors. But this is not a constitutional right or even an entitlement, and I am skeptical that today's young will do so.

But I prefer what John Perry Barlow, co-founder of the Electronic Frontier Foundation and former Grateful Dead lyricist, said: "The best way to invent the future is to predict it."

You can't predict which company will be the next Apple—though investors try. But you can predict that there will be another Apple-like company. And there will emerge an entirely new family of companies—and jobs, and growth—arising from the transformational technology changes already happening. Of course, this will once again require government policies that encourage a dynamic private market.

Mark P. Mills is CEO of the Digital Power Group, an adjunct fellow of the Manhattan Institute, and former chief tech strategist for a tech venture fund. He writes the "Energy Intelligence" blog for Forbes.com and is co-author of The Bottomless Well.

FURTHER READING: Michael Sacasas discusses "Technology in America." Nick Schulz contributes "Mobility Matters: Understanding the New Geography of Jobs" and "The President's Internet Blunder." John Steele Gordon writes "The Henry Ford of Our Time." Mark J. Perry says "Information Technology 'Revolution' Will Aid Manufacturing."

Image by Darren Wamboldt / Bergman Group


