It took sixty-six years from the moment humankind figured out how to pull off powered flight to our landing on the Moon. Never in human history has there been such a rapid pace of development as the journey from Kitty Hawk to the Sea of Tranquility, and much of it had to do with how the order of human civilisation changed between the First and Second World Wars, and subsequently the Cold War.
Like the story of human flight, the Second World War itself was marked by unprecedented progress in wartime technology. Bear in mind that humanity had essentially been fighting with variations of pointed sticks and stones for the best part of 6,000 years, and yet the Second World War started with tanks and ended with atomic fire in the span of six years.
Much of this progress had to do with the lessons of World War I. European powers learnt that the traditional deciders of battle, i.e. military organisation and tactics, could easily be overwhelmed by the advent of newer weapons and technology. Planes, gas, and tanks tore through the centuries of military tradition that generals carried with them into the 20th century, and they were sure as heck not to be left behind in that regard when hostilities kicked off again in 1939.
This time around the draft wasn't just for able bodies but for the best and brightest minds, a strategy unheard of in the history of warfare up until then.
Up until the latter part of the First World War, technological advancements in weapons were usually spin-offs from the latest developments in society. Spears were made from the sharp implements cavemen used for cutting; mounted cavalry derived from generations of breeding stronger horses; guns were a spin-off from gunpowder first intended for medicinal purposes. But harnessing nuclear energy didn't come before the bomb; this time, the weapon came first.
Likewise, the first application of long-range guided ballistic rockets was to deliver Hitler's 'vengeance' on London, while the first programmable electronic computer was devised by the British to crack the Nazis' Lorenz cipher.
The ingenuity behind these technologies came about thanks to the prioritisation of scientific study. For thousands of years, civilisation and the wealth of nations were built on the backs of brute labour, the hands that tilled the soil and fought their wars. Not anymore. In only a few short decades the order of things changed as states turned their agendas towards science as a means of expanding and maintaining regional power. This game-changing shift in thinking not only birthed the Atomic Age but ushered in a new world order, one which would dictate the outcome of the Cold War.
Had it been a traditional war, one involving conventional soldiers and weapons facing off against one another, the Soviet Union would have held a significant advantage, as demonstrated by its mobilisation of sheer manpower during the Battle of Stalingrad. But that wasn't to be the case when the hostility between political ideologies began.
With the spectre of mutually assured destruction looming over the two superpowers, both the United States and the Soviets avoided direct military engagement and instead pressed on in the sciences to develop new ways of breaking the stalemate, hoping that somewhere, some yet-to-be-developed technology held the key to victory.
While neither side found that mythical silver bullet, the men in white coats gave the world satellites, the foundations of the internet, and even LSD. But all this technological progress came at a price, and price was a concept communism was fundamentally ill-equipped to handle.
Today we deem the end of the Soviet Union and the Cold War as a triumph of capitalism and democracy. As self-serving as it sounds in a world built by capitalist victors, in many ways, that was the case.
The pursuit of progress and innovation is intrinsic to a capitalist system. The idea that you invest your current resources in search of an opportunity to reap a bigger return has lain at the heart of capitalism since the dawn of the Industrial Revolution. An ideology built on the redistribution of wealth, by contrast, lacks that competitive environment and undermines the incentive for self-improvement.
Unbound by such ideological conviction, the free markets of the United States and her allies flourished on a wave of scientific progress, spurred on by a symbiotic relationship between the state and the private sector. Meanwhile, that same drive for innovation gnawed away at the Soviets' funds with little or no military conquest, new lands, or resources to show for it. In the end, the Cold War became a 'war' the communist state couldn't afford, and it subsequently collapsed under its own systemic failings.
While the shifting balance of power in the 20th century irrevocably reshaped society around the guiding light of scientific enlightenment, it tells only half the story of how we are heading towards a future of the autonomous car, for scientific progress can only give us the means to produce technology that replaces the cognitive complexity of a human driver. The other half of this story belongs to the one essential trait of human civilisation: trade.
No Nation is an Island
In 1968, entomologist Paul Ehrlich published a book entitled The Population Bomb. In it he raised the alarm that the then human population of 3.5 billion was unsustainable and that the world would be plagued by devastating famines from as early as 1970.
Today the opposite is true: even with double the population Ehrlich based his predictions on, more people die from obesity than from starvation. But Ehrlich wasn't some basement-dwelling nut job, nor were his estimations dismissed as junk science; his book went on to sell two million copies, influenced public policy, and even made him a mainstream celebrity. So what happened?
Ehrlich’s book predicted that widespread famines would strike India and Pakistan by the 1970s. Those calamities never materialised. By the mid-1970s both countries were self-sufficient in wheat production thanks to the development of hybrid strains by biotechnologists. Ehrlich, for all his scholarly knowledge, didn’t account for the possible solutions science could uncover.
From splitting the atom to Jonas Salk's polio vaccine, the mantra espoused by society in this new age of enlightenment was that science held the key to solving many of the world's problems. However, the drive to seek a deeper understanding of the world we live in is only half the story of the modern era. As it turns out, the solution to India and Pakistan's impending apocalypse wasn't derived from within, but originated in a country on the other side of the world: Mexico.
As much as scientific discoveries and innovation did plenty in the way of resolving societal issues in the 20th century, there was another dynamic that had allowed such ideas to flourish around the world, and that was trade.
We often like to think of trade as a purely economic function, but trade is deeply intertwined with the spread of ideas and knowledge. We residents of the 21st century take for granted the ease with which knowledge can be acquired; it was a luxury few peasants in the kingdoms of the old world could enjoy. The ability to study religion, mathematics, and alchemy was often a privilege afforded to certain elite castes or members of a court. In an agrarian society, the time and effort needed to acquire such skills would be wasted on hands better put to tilling the soil.
However, through the exchange of goods from far-off lands, trade stoked the curiosity of many, and steadily that intrigue spurred the spread of knowledge throughout the old world, carried in the minds of travellers and merchants who plied the trade routes connecting cities. Nor did the curiosity abate in the 18th century, when British trade missions to foreign lands would often bring along teams of academics and scientists to study new lands and cultures. The lesson of history here is that no nation is truly an island, and it was a theme that would continue to play itself out between the two opposing sides of the Cold War.
The Cold War was a unique conflict in the sense that its ideological animosity was far from just an exchange of barbed words. Guided by a doctrine of acting as the "international police power", the United States sought to resist the advances of communism by building and strengthening its sphere of influence.
Knowing full well that post-war scarcity could foster populist revolutions (such as the infamous Beer Hall Putsch) and sway the populace towards communism's alluring promises of equality, the Americans set about rebuilding the economies of Western Europe, South Korea, and Japan, on top of mobilising military support. This, in turn, forged tight-knit, interdependent diplomatic and trade relationships between these countries.
Although there is a certain level of indebtedness between the Cold War benefactor and its beneficiaries, none of these countries was economically beholden to Washington DC the way the vassal states of the British Empire were to London.
Arguably this "club of nations" fostered more prosperity and enlightenment than the glory days of empire ever did. While empire was a great way to enrich the coffers of the ruling country, it often did so at the expense of the conquered nation itself, leaving little economic benefit for its subjects while fostering potentially destabilising nationalist sentiments. Not a sound strategy in a Cold War scenario.
Instead, in the spirit of free-market capitalism, the United States and its allies were free to pursue their own individual destinies. Unlike the empires of yesteryear, and because of the presence of nuclear weapons, the United States wielded its military as a stabilising geopolitical shield rather than brandishing it threateningly like a sword. This protection granted allied nations the security to form trade routes and agreements that reinforced one another's economic standing. It also threw open the doors for enterprises to venture beyond the safety of state borders and establish footholds in new markets around the world, safe in the knowledge that their investments were secure.
As trade and capitalism hit their stride, member states had to industrialise rapidly to compete. To fulfil the need for a highly skilled workforce, state agendas turned towards bolstering education and upskilling the populace, while the needs of the population were met by trade with other nations.
In the meantime, trade routes continued to serve their unintended purpose as conduits of knowledge: trade partnerships opened access to more markets, broadening the spread of ideas. One example of this movement of ideas can be found in the creation of the internet, the very invention that would ultimately speed the spread of information many times over. Though CERN is credited with creating the World Wide Web, the internet's infrastructure can be traced to ARPAnet, a network developed for the United States' Department of Defense during the Cold War. Rather than a single innovation, the internet was birthed from an amalgam of ideas from research and academic institutions in the States and in Western Europe.
This spread of knowledge, the reshaping of state agendas around scientific study, and the drive for innovation in a globalised world have had a compounding effect on the world's economy. The period between the late 20th century and the early 21st century has seen the rise of powerful and influential cities such as San Francisco, Tel Aviv, Tokyo, and Shenzhen in areas devoid of natural resources. Their key drivers aren't ore, grain, or gems but ideas and innovation, which have allowed these cities to achieve GDPs rivalling those of small resource-rich nations. The scales of global wealth have shifted from an era dominated by resource-based economies to one dominated by human capital.
A salient example of this new world order is Singapore. A trading hub for much of its history, the tiny island nation has remodelled itself as a specialised hub for highly skilled services. Many today laud Singapore as a shining example of globalisation: an island built not on natural resources but on its people, who transformed it into one of the wealthiest and most densely populated centres in the world. But Singapore's success, and its dependence on its primary resource, its people, serves as an example of why autonomous cars are an inevitability.
Protecting the Rice Bowl
Last year, Singapore was ranked fourth among the world's cities with the most millionaire CEOs, just behind New York, San Francisco, and London. But unlike those cities, its automotive landscape is a study in contradiction. Premium car brands regularly top the sales charts despite the exorbitant prices levied on acquiring the right to own a car, and given the high number of wealthy individuals, it comes as no surprise that Singapore is home to some of the rarest exotic cars in the world. Yet despite the country's appetite for fast exotics, Singapore has one of the strictest speed-enforcement regimes in the world, and was one of the first countries to express a desire to adopt autonomous vehicles on a nationwide scale.
Granted, Singapore's compactness as an island nation makes implementing autonomous infrastructure far easier than it would be for a large, expansive country. Even so, filling your streets with autonomous cars is far from an exercise in prestige. There is a far more compelling force behind Singapore's audacious decision, and that is its dependence on human capital.
The corridors of economic and political power in any capitalist democracy are highly dependent on the productivity of its citizens, even in a faux democracy like Singapore's. As CGP Grey outlined in his absolutely brilliant YouTube video, The Rules for Rulers, the foundation of every leadership is built upon money.
Like a CEO's, a politician's power in a democracy rests on the country's performance, particularly its ability to generate wealth. A rise in crime or a drop in quality of living might stir impotent unrest amongst the populace, but crash the economy and the leadership is likely to be thrown out of office as quickly as any corporation's board of directors would show an underperforming CEO the way out.
Therefore, like a king drawing on the resources of his land for the wealth that maintains his seat of power, a healthy human-capital-driven economy needs its citizens to be productive.
The more productive they are, the higher the nation’s GDP, the more tax revenue the state can collect, and the longer the ruling politician’s term in office is. This is key to understanding why autonomous cars are favoured by many modern capitalist societies.
Governments like Singapore's have great incentive to make roads as safe as possible because, as it turns out, letting your citizens kill and maim one another in traffic accidents is bad for business. Far from a mere inconvenience, traffic accidents and deaths leave ruinous effects down the line on everything from healthcare costs to the opportunity cost of lost talent.
A 2014 NHTSA study estimated that motor vehicle crashes inflict $836 billion in combined economic and societal harm on the United States, of which around $242 billion are direct economic losses. That is billions of dollars of potential government revenue needlessly wasted on human error. You wouldn't keep an accountant who accidentally wiped a few million in profits off your business' accounts with a slip of their hand, so why would bureaucrats want to keep this cavalcade of road calamities going?
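To give those headline figures a human scale, here is a back-of-envelope sketch in Python. The population figure of roughly 309 million (the approximate 2010 US census count, the year the study's data covers) is my own assumption, not a number from the study itself:

```python
# Back-of-envelope sketch of the NHTSA figures quoted above.
# The population figure is an assumption for illustration only.

ECONOMIC_LOSSES = 242e9   # direct economic losses, USD
SOCIETAL_HARM = 836e9     # total economic + societal harm, USD
US_POPULATION = 309e6     # assumed 2010 US population

per_capita_economic = ECONOMIC_LOSSES / US_POPULATION
per_capita_societal = SOCIETAL_HARM / US_POPULATION

print(f"Direct economic losses per citizen per year: ${per_capita_economic:,.0f}")
print(f"Total societal harm per citizen per year: ${per_capita_societal:,.0f}")
```

Crudely, that works out to several hundred dollars of direct losses per citizen per year before the wider societal harm is even counted; a sum no finance ministry would shrug off.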
Enforcing strict traffic laws isn't merely about saving your life and the lives of your loved ones. It isn't that your representatives care about your well-being out of a deep-seated goodness of their hearts; keeping you healthy serves their own interests and those of the nation.
The more lives and livelihoods they protect, the more citizen productivity they can guarantee, the more economic potential they can extract, and the greater the financial and political advantage they can secure. This logic also explains why bureaucrats are always finding ways to cut down on vehicular pollution and congestion, as doing so brings societal gains in civic efficiency and reduces the long-term harm of pollution on citizens' health.
While some will argue that more intensive training is the solution, the trouble is that there is only so far training the masses can go towards making roads safe. Just as not everyone can be a surgeon, we have to admit that not everybody can be that perfect, disciplined driver.
For every true Lewis Hamilton out there, there are a few dozen Pastor Maldonados who will always think they are Hamilton. And it doesn't matter if you are Hamilton; it only takes a Maldonado in a souped-up SUV to turn you into a statistic. Autonomous safety features are steadily proving better at minimising or preventing traffic incidents than the average driver, and the one thing such technology can achieve that we as a collective can't is a standardisation of capability.
Imagine if tomorrow a bureaucrat finds a flaw in the driver-training syllabus. To implement a correction, he would have to spend months or even years retraining driving instructors and conducting re-certification, and the effects on driver behaviour would only be seen years down the road, perhaps in a handful of drivers who are outliers in the studies' demographics, if at all.
But with an over-the-air update, all he needs to do is add a new parameter to the AI's program, test it in simulations, and zip, every autonomous vehicle of that type behaves accordingly at once. Quick, simple, and easy. What bureaucrat overseeing hundreds of thousands of vehicles wouldn't want that convenience and effectiveness?
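The contrast with retraining human drivers can be sketched in a few lines of code. Everything here, from the parameter name to the rollout function, is hypothetical and illustrative; no real manufacturer's update pipeline is implied:

```python
# Hypothetical sketch of a fleet-wide over-the-air parameter update.
# All names (Vehicle, rollout, following_distance_s) are invented
# for illustration; a real pipeline would involve signed, versioned,
# staged updates with telemetry and rollback.

from dataclasses import dataclass, field

@dataclass
class Vehicle:
    vin: str
    params: dict = field(default_factory=dict)

    def apply_update(self, update: dict) -> None:
        # In practice this would arrive over a secure update channel.
        self.params.update(update)

def rollout(fleet: list, update: dict) -> int:
    """Push one behavioural change to every vehicle; return count updated."""
    for v in fleet:
        v.apply_update(update)
    return len(fleet)

# A fleet of 100,000 vehicles, each keeping a 1.5 s following distance.
fleet = [Vehicle(vin=f"SGP{i:05d}", params={"following_distance_s": 1.5})
         for i in range(100_000)]

# The bureaucrat's correction: a longer, safer following distance.
updated = rollout(fleet, {"following_distance_s": 2.0})
print(f"{updated} vehicles updated in one pass")
```

One loop, one simulated sign-off, and the whole fleet's behaviour changes at once; the equivalent change in human drivers would take years of retraining and still not reach everyone.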
Just as the horse was made redundant by war and the rapid advance of technology, so too will our cars suffer the same fate. To credit the rise of the autonomous car solely to the technology at hand is to grossly ignore the socio-political changes that have reshaped the world in the span of a century.
Contrary to what that populist British motoring television presenter says, this "war" against the traditional combustion-engined car isn't merely a sinister agenda pushed by a cabal of environmentalists and socialists to bring about an egalitarian dystopia.
Set aside the arguments about the environment and safety, dig deeper into the structures of today's society, and you will find that automating such an essential part of our daily lives makes sound fiscal and political sense. And just as our lives have been influenced by the technology around us, so too will we reshape our lives around the autonomous car just as easily.
While this lengthy exposition has laid out the groundwork from which autonomous technology might bring about the end of the traditional car industry, there is one bigger-picture aspect that has dogged my thoughts on the subject.
As we have examined before, the story of the car, its rise and prophesied downfall, isn't an isolated subject, but is interconnected with the seminal events that took place in the world around it.
Artificial intelligence is coming, there is no doubt about that, and autonomous vehicles are among the heralds of that new epoch. It stands to reason that on the day autonomous technology takes the reins of driving from the human operator, another pertinent question will arise: will there be anybody left to use these vehicles when the concept of work itself has changed?