Why Obama and Abe Revisit Hiroshima and Pearl Harbor


“As the prime minister of Japan, I offer my sincere and everlasting condolences to the souls of those who lost their lives here, as well as to the spirits of all the brave men and women whose lives were taken by a war that commenced in this very place.”

With these carefully chosen words, Japanese prime minister Shinzo Abe yesterday paid homage to those who fell at Pearl Harbor in the December 1941 attack by the Imperial Japanese Navy. The surprise air raid brought the United States into World War Two against the Empire of the Rising Sun.

He was reciprocating the visit by that other master of politics, Barack Obama, who last May became the first U.S. president to visit Hiroshima, where his country dropped the first of only two atomic bombs ever used in a military conflict. The bomb over Hiroshima fell on August 6, 1945. Three days later, the U.S. struck Nagasaki (the mushroom cloud rising over the city after the bombing is pictured above).

Ever since, deterrence of nuclear warfare has rested on the doctrine of Mutually Assured Destruction, or MAD, as the acronym aptly has it. It’s the deadly logic of the irrational.

Only the dewy-eyed would believe that morality alone is driving Abe and Obama to exchange flowers over the tragedies the two nations inflicted on each other. That they did not apologize is beside the point, and of little political consequence.

Foremost among their concerns is the ascendancy of China and its undisguised geopolitical ambitions. This is an unprecedented development in the history of an empire that has never sought to expand beyond what it considered its natural boundaries, marked by Inner Mongolia to the north and by Tibet to the southwest. The other concern shared by the U.S. and Japan is North Korea, a communist dictatorship run by a madman armed with a nuclear arsenal, who rules over a starving population.

Indeed, Kim Jong Un has his match in the future tenant of the White House, both in his lunacy and his short fuse. Kim has “banned sarcasm” in his country – no joke – as the buffoon about to take over from Obama would too, if U.S. institutions allowed him to. But more importantly, both Kim and the tweeting charlatan are completely unpredictable.

Hence these closing remarks by Obama at Hiroshima should guide our moral compass in the challenging years ahead:

“The world was forever changed here, but today the children of this city will go through their day in peace. What a precious thing that is. It is worth protecting, and then extending to every child. That is a future we can choose, a future in which Hiroshima and Nagasaki are known not as the dawn of atomic warfare but as the start of our own moral awakening.”

Autonomous Vehicles Gain a Big Friend: The U.S. Government


The U.S. government recently came out strongly in favor of autonomous cars. Self-driving vehicles, officials said, would save lives and would make commuters’ lives less miserable.

To be sure, the government stopped short of issuing new regulations in the rapidly developing market. Still, the 15-point guidelines it issued were specific enough to signal its focus on safety, yet vague enough to avoid restricting further development.

The guidelines deal with four broad issues: safety standards for the design and development of autonomous vehicles; a recommendation for states to agree on uniform policies on self-driving cars; how current regulations apply to driverless vehicles; and opening the field to new regulations on the technology.

At Future Imperfect, we have repeatedly addressed the challenges posed by self-driving cars, not always welcoming the new technology. It would take this writer a lot of convincing, and perhaps some more forceful methods of persuasion, to ride a fast machine with nobody at the wheel.

Yet what we find commendable in the government’s attitude is what has often set the United States apart from other countries. In the face of inevitable technological progress, the government decided to embrace it, and hence have greater involvement in its development. Hindering it would not stop it, and might even imperil passengers and pedestrians in a regulatory vacuum. Conversely, a farsighted stance is pioneering, and serves best the public interest.

Inflation Phobia: Is the Cure Worse Than the Disease?


Aversion to inflation has been the salient feature of the economic thinking of our time. Keynes aside, the traumas of hyperinflation in interwar Germany and the Great Depression of 1929 have left scars deep enough to shape economists’ and policy-makers’ tenets for almost a century.

The Economist reports that officials at the Federal Reserve, “a few of them anyway,” seem to be rethinking their views “in some dramatic ways.” Yet former Fed chairman Ben S. Bernanke has suggested in a blog post that they should curb their enthusiasm. Should they?

As The Economist’s blog suggests, reality has challenged a long-held principle: the equilibrium point between unemployment and inflation. Basically, the Fed believed that the optimal unemployment rate hovered around 5 percent without the economy dangerously overheating. A rate much closer to full employment, it was feared, would lead to unbearable inflation. Recent employment figures, however, have shown this concern to be unfounded. The U.S. economy has withstood low unemployment (matched with low productivity) without significant inflationary pressures building up.

Whether this has been good is debatable, as one of its outcomes has been a sustained period of low growth. A U.S. economy at walking speed is as bad for the country and the world as, say, runaway inflation.

The suggestion, therefore, is that there is a strong case to be made for higher wages. More disposable income drives up demand. Herein may lie a key to reactivating growth in the U.S. and the world. Like stagnant waters, a lethargic economy breeds diseases.
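The trade-off the Fed is said to have assumed can be sketched with a textbook expectations-augmented Phillips curve. The function below is a toy illustration only: the parameter values (2 percent expected inflation, a 5 percent “natural” unemployment rate, a slope of 0.5) are assumptions chosen for the sketch, not actual Fed estimates.

```python
def phillips_inflation(u, pi_expected=2.0, u_star=5.0, beta=0.5):
    """Toy expectations-augmented Phillips curve.

    Predicted inflation (%) = expected inflation - beta * (u - u_star),
    where u is the unemployment rate (%) and u_star is the assumed
    "natural" rate. All default values are illustrative assumptions.
    """
    return pi_expected - beta * (u - u_star)

# At the assumed natural rate, inflation matches expectations.
print(phillips_inflation(5.0))  # 2.0

# Below it, the model predicts inflation creeping above expectations --
# the overheating the Fed feared, which recent data have not borne out.
print(phillips_inflation(4.0))  # 2.5
```

On this stylized view, unemployment falling a point below the natural rate should have added half a point of inflation; the recent combination of low unemployment and flat inflation is what calls the framework into question.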