The Fourth Industrial Revolution & What It Means for the American Economy


The United States of America is at a crossroads. From the nation’s birth as an agrarian society through three industrial revolutions, this country has transformed in waves that brought manufacturing, mechanical engines, electrification, telecommunications, computers, and the internet. Through each of these technological upheavals, the American economy thrived and excelled. From the dawn of the Second Industrial Revolution through the late 20th century, America was a land of opportunity. The nation President Ronald Reagan famously described as a “shining city on a hill” represented the power of democracy and a free-market economy.

But a slow and steady transformation has taken place over the last half century, and the drivers of the American industrial economy have deteriorated past a critical point. From the early waves of factory automation through globalization and the offshoring of an ever-larger share of production, much of the opportunity promised by the American dream has evaporated. Now, as we move through the nascent stages of the Fourth Industrial Revolution, the American economy has shifted more than ever toward a consumer culture fueled by cheap goods produced at a fraction of the cost of American assembly lines staffed by well-paid human labor. Offshoring production and moving away from industries such as coal and gas that were staples of the American economy have left millions unemployed with no clear path to financial independence, let alone prosperity. As the Fourth Industrial Revolution accelerates, the future looks bleak for unskilled workers, who will inevitably be displaced by artificially intelligent machines taking over an increasing number of jobs in the coming decades.

Arguments have been made that America can reverse this trend by doubling down on fossil fuel extraction and erecting tariffs and other barriers against our global trading partners, but the genie is already out of the bottle. The economy’s future lies not in reverting to coal mining and large-scale manufacturing but in embracing what comes next and developing a plan to address this pending reality before it’s too late. Today we stand on the shore watching the tide recede ahead of the tsunami of the Fourth Industrial Revolution, which will bring technical advancements that transcend all that came before. We can either plan now and find higher ground or bury our heads in the sand and wait for the wave to wash over us.

Before delving into the current technological revolution, it is important to understand the nature of the three major technological shifts that have occurred in American history. By studying the impact of each, we can better prepare for the future that lies ahead.

The First Industrial Revolution took place in the 18th century, beginning in England before the United States had won its independence. It introduced mechanized production, new energy sources such as the steam engine, new basic materials (iron and steel), and the beginnings of the factory system. The Second Industrial Revolution was centered in the United States and began in the late 19th century. This wave of change was built on electricity, gas, and oil, which powered a new economy as much as they powered people’s homes and transportation. These technologies drove the world’s industries through two world wars and into the postwar world, which saw the advent of the next transformation. The Third Industrial Revolution began in the late 1960s with huge advancements in electronics, telecommunications, and computers. The United States was the epicenter of both the Second and Third Industrial Revolutions, which contributed greatly to the American Century, a time of great prosperity for the US and its allies, when the US dominated the Western world with its politics, economy, and culture.

Emerging from the Spanish-American War and the Boxer Rebellion, the United States spent the first half of the 20th century being dragged into a European war, riding the boom of the 1920s, suffering the ravages of the Great Depression, and mounting a strong recovery driven by a new Keynesian approach to government investment, which carried into a second world war from which America emerged as the new global leader. The era that followed came to be called “Pax Americana,” a product of the country’s powerful status and its role as a global good Samaritan.

The post-war period in the United States was a time of rejuvenation and prosperity marred only by the undeclared Cold War against Communism, which escalated into bloody conflict in Asian countries half a world away. For most Americans, however, the wars were abstract and unfelt. Even as political unrest mounted against the war in Vietnam and the associated draft, the American economy boomed.

Something changed from the booming 1960s into the subsequent decades. The country’s GDP continued to grow, but not quite as fast (see Chart 1). The economy was cooling. Explanations abound, ranging from the stagflation of the 1970s to Nixon’s elimination of the gold standard, but the insidious truth behind the numbers is that the slowdown in GDP growth tracks closely with the deterioration of the nation’s balance of trade.
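To see why “not quite as fast” matters so much, here is a minimal sketch in Python using purely hypothetical growth rates (4% versus 2.5% per year; illustrative numbers only, not actual US figures) to show how a modest slowdown compounds into a large shortfall over decades:

```python
# Illustrative only: compound two hypothetical annual GDP growth rates
# to show how a modest slowdown widens into a large gap over decades.
# The 4.0% and 2.5% rates are assumptions for demonstration, not data.

def compound(start: float, rate: float, years: int) -> float:
    """Return the value of `start` grown at `rate` per year for `years` years."""
    return start * (1 + rate) ** years

gdp_start = 1.0  # normalized starting GDP
for years in (10, 20, 30, 50):
    fast = compound(gdp_start, 0.040, years)  # boom-era pace (hypothetical)
    slow = compound(gdp_start, 0.025, years)  # cooled pace (hypothetical)
    print(f"{years:2d} years: fast {fast:.2f}x, slow {slow:.2f}x, "
          f"shortfall {(1 - slow / fast):.0%}")
```

On these assumed rates, after 50 years the slower economy is roughly half the size of the faster one. That is the quiet arithmetic behind a “cooling” economy.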

In the 1970s, the United States began shifting manufacturing offshore and importing goods from abroad. The shift was a boon to consumers, who benefited greatly from cheaper goods, seemingly unaware of the long-term cost of their Faustian bargain. Add the increasing automation of US factories and the dwindling power of trade unions, and the beginning of the end of the American manufacturing economy was upon us.

Twenty years later, another technological breakthrough gave Americans hope for the future economy: the dawn of the Internet age. Enthusiasm for the new technology, coupled with a laissez-faire stance from the US government and wild speculation by investors, fueled a stock market bubble that predictably collapsed when investors soured on valuations far exceeding any practical measure of companies’ true worth.

The dot-com crash ushered in a new millennium, and the trajectory of the American socio-economic system has since taken a decidedly different path from the century that preceded it. While the American Century brought unprecedented improvements in public health all over the world, that trend shifted in the US around the year 2000. Alongside the economic weakening, Americans’ average life expectancy stopped increasing. The reversal of the long-running trend came almost entirely within one group: white Americans without a college degree, who make up 38% of the US working-age population.

In their landmark book, Deaths of Despair and the Future of Capitalism, the husband-and-wife economists Anne Case and Angus Deaton (the latter a Nobel laureate) highlight that something is making life worse for this population, noting that much of the decline stems from higher rates of suicide, opioid overdoses, and alcohol-related illness. In these true deaths of despair, Case and Deaton show how Americans are “drinking themselves to death, or poisoning themselves with drugs, or shooting or hanging themselves.”

In their study of American deaths in the new millennium, Case and Deaton portray America as a land of haves and have-nots. The less-educated members of our society experience higher rates of serious mental distress, have more trouble with the “instrumental activities of daily life,” and report living with more pain. They contrast the plight of the less educated with that of bachelor’s degree holders, who continue to live longer, have more stable family environments, report happier lives, and are far less likely to abuse opioids and alcohol. Case and Deaton tie this divergence directly to the shift in the American economy, arguing that the disappearance of the good-paying manufacturing jobs that once allowed working-class families to succeed has led to disastrous consequences:

“Destroy work and, in the end, working-class life cannot survive. It is the loss of meaning, of dignity, of pride, and of self-respect that comes with the loss of marriage and of community that brings on despair, not just or even primarily the loss of money.”

In this environment, it is easy to see how an angry populist candidate like Donald Trump could inspire a legion of followers.

As the fundamentals of the American economy crumble, a voice runs counter to this narrative, shouting into the maelstrom: “This time it’s different!” Optimistic futurists, most prominently economist Jeremy Rifkin, point to the new “sharing economy,” which they call the first new economic system to enter the world stage in 200 years. They point to a zero-marginal-cost society that they claim will narrow the income divide, democratize the global economy, and create a more ecologically sustainable society. In this rose-colored view of the disruption to manufacturing, supporters of Rifkin’s “Collaborative Commons” say the digitization of all things will bring about unprecedented abundance: “Private enterprises are continually seeking new technologies to increase productivity and reduce the marginal cost of producing and distributing goods and services so they can lower prices, win over consumers and secure sufficient profit for their investors.”

They claim the near-zero-marginal-cost phenomenon has already disrupted the information-goods industry, pointing to the thousands of Internet enterprises that generated substantial profits by creating the applications and aggregating the networks that allow the emerging sharing economy to flourish.

This new economic revolution will ride on the backbone of the technologies that came before it, connecting the Internet of Things (IoT) to let millions of businesses and consumers exchange goods and services directly, eliminating middlemen and reducing costs. Sharing-economy evangelists say this will shift economic power from the few at the top to the masses. They point to solar energy as an example of how control will move from monopolistic power companies to individuals and communities with the capacity to produce for themselves: a reversal, in other words, of the lifestyle created by the Second Industrial Revolution.

Further, they cite developments in 3-D printing, the process of making three-dimensional solid objects from digital files, as a revolution in manufacturing that will democratize production of everything from medical devices to entire homes. While 3-D printing remains primarily the domain of hobbyists and experimenters, one need look no further than the evolution of 2-D printing, from low-resolution dot-matrix documents to today’s full-color, photo-quality images, to forecast where 3-D printing will be in the next 20 years. Followers of the Collaborative Commons ideology point to the availability of open-source design files and liken them to the music-sharing services of the early Internet. But to date, none of these prognostications have come to fruition, and the tens of millions of Americans displaced by the economic shift of the last few decades are left with no clear path to prosperity.

With no immediate solution for the displaced and jobs increasingly lost to automation, many are left to question the very economic system that brought us here. Capitalists’ faith in the system rests on the notion that in a free market, all participants have an equal opportunity for success. But the changes to the American economy over the last half century have tilted the playing field, and more and more members of society feel the paths to success have narrowed beyond reach. If the economy no longer supports the aspirations of its populace, it is time to consider an alternate approach.

One such approach has circulated for centuries among a disparate group of supporters ranging from Thomas Paine to Buckminster Fuller, Bertrand Russell, and even Chicago School economist Milton Friedman, who proposed a “negative income tax” under which citizens would receive a payment through the tax system if their income fell below a certain level. Today this idea is most commonly known as Universal Basic Income, or UBI. Its lofty goal is to reduce income inequality and provide a living wage for those otherwise unable to earn one. Some flavors of UBI call for a stipend to every citizen, while others call for payments only to those below the poverty line. Proposed funding mechanisms include increased taxes on the wealthy and taxes on corporations. UBI is increasingly bandied about as the world confronts an ever-growing number of jobs lost to technological change.

Modern proponents of UBI include Elon Musk, Richard Branson, and 2020 US presidential candidate, philanthropist, and tech entrepreneur Andrew Yang, who branded his version the “Freedom Dividend.” Yang argues it will counter forecasts that by 2032, one in three American workers will be at risk of losing their jobs to new technologies, and he notes that unlike in previous waves of automation, new jobs will not appear quickly enough to make up for those lost. He says UBI is needed “to avoid an unprecedented crisis” and calls the Freedom Dividend the “foundation on which a stable, prosperous, and just society can be built.”

One of the greatest issues with UBI is the cost of implementation, which would add trillions to the US deficit. Yang’s plan suggests consolidating existing welfare programs and implementing a Value Added Tax (VAT) of 10%. Under his plan, current welfare and social program recipients would choose between their current benefits and an unconditional stipend of $1,000 per month. Yang proposed that funding come from four sources, the rough arithmetic of which is sketched after the list:

  1. Current spending: Existing welfare programs cost US taxpayers between $500 and $600 billion per year, a portion of which would be repurposed to pay for UBI. Yang also notes the $1 trillion per year spent on health care, incarceration, and homelessness services, and suggests a UBI would reduce that bill by $100 to $200 billion per year.
  2. VAT: Yang proposes a VAT at half the European level, which would bring in up to $800 billion per year, adding that “A VAT will become more important as technology improves because you cannot collect income tax from robots or software.”
  3. New revenue: Yang points out what few economists would dispute: “Putting money into the hands of American consumers would grow the economy.” He cites a Roosevelt Institute forecast that the economy would grow by approximately $2.5 trillion and create 4.6 million new jobs, generating $800 to $900 billion in new revenue from economic growth.
  4. Taxes on top earners and pollution: Restructured and increased taxes on top earners, plus a carbon fee on polluters, would make up the balance of the cost, Yang says.
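To make the scale concrete, here is a minimal back-of-the-envelope sketch in Python. The dollar figures are the midpoints of the ranges listed above; the count of adult recipients is my own assumption (roughly 236 million US adults), since the essay does not state one.

```python
# Back-of-the-envelope check of the Freedom Dividend arithmetic.
# Dollar figures are midpoints of the ranges in the funding list above;
# the adult-population count is an assumption, not a figure from the essay.

ADULT_RECIPIENTS = 236_000_000  # assumed number of adult US recipients (approx.)
MONTHLY_STIPEND = 1_000         # $1,000/month Freedom Dividend

gross_cost = ADULT_RECIPIENTS * MONTHLY_STIPEND * 12  # annual gross cost

funding = {
    "repurposed welfare spending": 550e9,                      # midpoint of $500-600B
    "reduced health/incarceration/homelessness costs": 150e9,  # midpoint of $100-200B
    "value-added tax at half the European level": 800e9,
    "new revenue from economic growth": 850e9,                 # midpoint of $800-900B
}

covered = sum(funding.values())
print(f"Gross cost:     ${gross_cost / 1e12:.2f} trillion/year")
print(f"Listed sources: ${covered / 1e12:.2f} trillion/year")
print(f"Balance for taxes on top earners and pollution: "
      f"${(gross_cost - covered) / 1e12:.2f} trillion/year")
```

Even on these rough numbers (which ignore recipients who would keep their existing benefits instead), the first three sources leave roughly half a trillion dollars a year for item 4 to cover, which is why cost dominates the arguments against UBI.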

While Yang’s plan sounds reasonable as laid out, many oppose the concept, with arguments ranging from the philosophical to the political to the economic. Philosophically, critics argue that UBI ignores the crucial economic role that households, families, and other institutions play in income distribution, disregarding how people choose to live in a free society. Politically, most Americans oppose wealth redistribution, and some economists estimate that funding a UBI would require new taxes equal to roughly 27% of national income. Economically, critics argue that current welfare programs are calibrated by need; a UBI eliminates that need-based calibration, raising incomes across the board and driving inflation that would eventually offset any gains from the program.

An alternative target for government spending is education. As required job skills become increasingly technical and complex, the importance of an educated citizenry will only grow. A 2015 study by Ludger Woessmann at the University of Munich found that “Education is a leading determinant of economic growth, employment, and earnings in modern knowledge-based economies.” This is an area where the United States is actually moving backwards relative to much of the developed world: in a 2015 international assessment, the United States failed to crack the top ten countries in math, science, or reading. How can the US hope to keep up in an increasingly technical world when the foundations of our nation’s intellect are crumbling?

While the economy slowed and education faltered, technological advancement persisted, leading to what many now call the Fourth Industrial Revolution: the ongoing automation of manufacturing and traditional industrial practices, fueled by robotics and, more recently, by artificial intelligence and machine learning.

Each wave of technological advancement brought social and political changes that reshaped society: people moved out of rural areas and into cities, unskilled labor rose, and labor unions eventually formed and won laws that protected the working class.

But as of yet, no new laws or movements have emerged to offset the jobs lost to this latest wave of technological advancement and the losses predicted to come. While we can all understand that radical technological change requires radical action, there is no indication that government or private industry is working to address it. Meanwhile, the populace careens forward with its attention glued to a never-ending mix of social media streams, reality television, and political dissonance.

Each of the preceding eras of great technological advancement was accompanied by significant social and economic change that, while disruptive, ultimately moved humanity forward. The optimists may be right: it may be different this time. But perhaps not in the way they perceive. It is incumbent on us as a society to either embrace or brace for the coming change, for this next wave of technological advancement has yet to show a ready replacement for the jobs it will eliminate, and without a planned solution we are bound to be swept away in the technological current. The clearest path forward is to point public policy and public funding toward this new reality. A focus on UBI and on improving the American education system is a good place to start.
