White House Chronicle

News Analysis With a Sense of Humor


Britain’s Power Peril and Its Lesson for the United States

February 19, 2014 by White House Chronicle

In Britain, they are talking about "the year the lights will go out." The metaphor is based on the 1951 film "The Day the Earth Stood Still."
 
There are those who believe they can pinpoint the year: 2023. It is the year that all but one of Britain's 16 operating nuclear power reactors will have been withdrawn from service because of their age.
 
Britain commissioned its first nuclear power plant, Calder Hall, in 1956. For decades, Britain was at the forefront of the development of nuclear energy.
 
Then came natural gas. Discoveries in the North Sea coupled with improvements in gas turbine technology caused a boom in gas-powered electricity generation. At one point, it looked as though 50 percent more gas-fired electricity generation would be installed than needed.
 
The next surge of generating enthusiasm was for wind. Under the Labour government of Tony Blair, Britain planned to lead the world in wind generation, both on shore and off. Wind, as elsewhere, was subsidized because it was politically lovable. What better source of energy for a windswept island with a stormy coastline than wind, wind and more wind?
 
But the high cost of wind-generated electricity, coupled with intermittent availability, began to turn the country off wind. While the Conservative government of David Cameron is still pushing wind through subsidies, it has been forced into a painful re-think to avoid catastrophe.
 
Coal mines — the engine of the Industrial Revolution — began to be phased out under Margaret Thatcher's Conservative government, partly because of continuing labor problems but primarily because coal's cost was rising as the mines became less productive. Britain became an importer of coal.
 
Nuclear just languished; the fabrication capacity declined, the design shops closed up, and the universities turned out fewer graduates in the nuclear sciences.
 
Then came the gas boom of the 1980s and '90s. The North Sea was full of it, the plants were cheap to build and operate, and the emissions were half those of coal.
 
But gas production from Britain's North Sea fields peaked around 2000, and gas imports began to rise. The jig was up for cheap, non-controversial energy.
 
Cameron's government, looking toward the day when the lights will fail, has supported an aggressive nuclear building program — none of it designed or built by British companies. The French government-owned utility, Electricite de France (EDF), will build Britain's first new reactors; the technology will come from Areva, the French nuclear plant builder, and some of the construction funding will come from China.
 
But to lure EDF, a mechanism called the “strike price” had to be negotiated. Under this deal, the British government guarantees a floor price for the electricity generated at the new nuclear plants. The strike price for the EDF deal is $154 per megawatt hour, or about twice the current wholesale price of electricity in Britain.
 
British industry is screaming that it will be driven offshore, particularly chemicals. The European Union is screaming that this is a subsidy by another name. And British consumer groups are screaming that it will kill off old people, who will not be able to afford the Gallic electrons.
 
The Cameron government has its fingers in its ears, because it knows the screaming will be far worse if the lights do go out.
 
Across the Atlantic, a sequel to the year the lights will go out in Britain may be in production. We are already shuttering nuclear plants; the total is down from 104 to 99, with many more endangered as plants become either uneconomic, as a result of competition from our gas boom, or simply too old. Four big new nuclear plants are under construction in Georgia and South Carolina, but they are all that are likely to be built in the foreseeable future.
 
Currently, nuclear plants contribute 19 percent of our electricity, about the same percentage they contributed in Britain in the 1990s before plant retirements began. The numbers are being kept up by extraordinary gains in operating efficiency and by upgrading the plants, known in the industry as "uprating."
 
How long the gas boom will last is a matter of conjecture. The lifespan of the new hydraulically fractured fields is not known, but it is expected to be about one-third that of conventional fields. The full environmental consequence is not known either. Yet the euphoria of gas abundance is boosted by multimillion-dollar campaigns from the oil and gas industries, led by the giant American Petroleum Institute.
 
These advertisements give the impression that gas is forever in America. The way it was in the North Sea? — For the Hearst-New York Times Syndicate

Filed Under: King's Commentaries Tagged With: Electricite de France, North Sea oil and gas, U.K. coal, U.K. nuclear power, U.K. wind power, U.S. nuclear power

Cheerio, Your Job Has Been Computerized

February 10, 2014 by White House Chronicle

Some thoughts about work. It is under attack from a giant labor pool of maybe 200 million eager and qualified people in Asia and elsewhere, who will do it for less than it costs in the United States.
 
It also is under attack everywhere from computerization. Stated bluntly: if jobs are not going to Asia, they may be going to the cloud. The service sector, once the saving grace of the post-industrial world of work, is being computerized: no more people needed. 
 
The somber back story at the recent National Retail Federation annual convention and expo at the Javits Center in New York City, as recorded in The Washington Post, was not about new shopping centers, point-of-sale displays, the minimum wage or offshore call centers for warranties: it was about Amazon. Online retailing is eating up traditional retailing — retailers have seen the future, and it is bleak.
 
Two University of Oxford researchers, Carl Benedikt Frey and Michael A. Osborne, recently calculated that 47 percent of American jobs are under threat from computerization. The only major publication that dwelt on this extraordinary study was The Economist.
 
Even those spoiled children of society, university professors, are feeling the cold winds from the computer vortex. Online learning is shaking up the quietude of the ivory towers. While universities have to do something to improve the productivity of their academic staffs, this is not the way.
 
Against this threatening employment sky rages the debate over the minimum wage. But it is a debate that is too narrow; too much about the short-term interests of the employers of minimum-wage earners and too little, if at all, about the endangered workplace. The spurious argument is that any increase in the minimum wage will drive employers to substitute more computers for workers.
 
They are hell-bent on that anyway. Look around: checkout counters are being automated; book manufacture is threatened by e-readers; telephones are answered by other telephones, guided by the unseen hand of computers. Soon even those vilified call-center jobs in India will be under threat. Here, your doctor will not want as many support staff, as records go to the Web.
 
The minimum wage should be raised. It will not stop the rush to substitute humans with computer-driven gadgets. When a machine can be finely tuned to cook and serve hamburgers, a machine will be cooking and serving hamburgers. All those untruths about jobs in fast-food chains being only entry level will fade away. 
 
Meanwhile, go into any fast food outlet and count the people who are middle-aged: They are not there because it is a way in. It is a way of hanging on – especially for African Americans and Hispanics. The same is true for hotel room cleaning, chicken-plucking in processing plants, cleaning toilets in commercial buildings, warehouse working and those toiling in the night kitchens of bakeries. Entry into what? Hell?
 
I once earned the minimum wage in New York City. At the hiring hall, I can tell you, there were only those exiting the job world, not entering it.
 
You will not get rich driving a non-union truck, either. Delivery people do it because they have no other skill and almost none of them are candidates for retraining, another shibboleth. Wherever there is menial work that is not unionized, there is economic misery.
 
Recently, I attended a conference in Europe — where the jobs problem is as bad as here, and possibly more intractable — and speakers were talking openly about a decline in the standard of living. We, in the United States, are not immune. Those who have enjoyed middle-class comfort may have to face a devaluation in their quality of life: smaller and more crowded housing, less travel, a smaller, older car or no car, more hourly work and less security, no medical procedures for ailments that some computer may deem elective. Grimmer daily lives that are more 19th century than 21st century.
 
The debate over the minimum wage ought to be a national discussion of the future of work. A rising tide does not lift all the boats anymore. — For the Hearst-New York Times Syndicate


Filed Under: King's Commentaries Tagged With: Amazon, Carl Benedikt Frey, computers, jobs, Michael A. Osborne, National Retail Federation, The Economist, University of Oxford

Sorry, but There Are Areas Where We Need More Government

February 2, 2014 by White House Chronicle

 
Who is going to finance advanced drugs? Who is going to guarantee the electric supply in 30 years? Whisper this: It will be the government.
 
In these two areas and others, the risks are now so large that private enterprise — so beloved in so many quarters — can't shoulder the risk alone. When development risks run into the billions of dollars, the market won't sanction private companies taking those risks.
 
Drug companies, among the richest of corporations, are running up against the realities of risk. To develop a new drug, the pharmaceutical industry — known collectively as Big Pharma — has to commit well over a billion dollars.
 
It is a long and risky road. A need for the drug has to be established; a compound developed, after maybe thousands of failed efforts. Tests have to be conducted on animals, then in controlled human trials. If the drug works, the developers have to get it certified by the Food and Drug Administration. Then they have to market it and buy hugely expensive insurance — if they can get it — because it is almost a rite of passage that they will be sued.
 
Under this regime, complex diseases that may require multiple drugs get short shrift, not because the developers of drugs are greedy but because they honestly cannot afford that kind of research.
 
The result is that the pharmaceutical companies increasingly look to universities and individual researchers — sometimes in teaching hospitals — to find new therapies; research that is paid for by the government through grants from the National Institutes of Health (NIH), the Centers for Disease Control, even from the Department of Defense. Even so, drug research is lagging and NIH is turning down eight out of 10 grant requests.
 
In electricity supply, too, there is trouble ahead.
 
The electric utilities, since deregulation, have become risk averse. Only two utilities, the Southern Company of Georgia and Scana Corporation of South Carolina, are building new base-load nuclear power plants. These may be the last of the large nuclear power plants to be built in the United States. Both are located in states where electric utilities are regulated and can anticipate their costs being recovered in the rates, even during construction. The states are taking some of the risk.
 
For the rest of the country, and particularly the Northern and Western states, deregulation has had an unintended result: It has increased the risk of new construction and in so doing has set the utilities down the path of least resistance. They have turned to natural gas and — because of subsidies and tax breaks — to wind power, which has meant more gas capacity has to be installed to compensate for the variability of the wind.
 
Coal is being edged out of the market for environmental reasons. So the electric utility industry is being pushed into a strategic position it has always said it wanted to avoid: over-reliance on too few sources of power.
 
A kind of gas euphoria has gripped the nation as supplies from horizontal drilling and hydraulic fracturing have shot up. When the 99 reactors now operating go out of service, as they get to the end of their lives, there will be nothing comparable to replace them.
 
Many companies, some of them small, are working on new reactor designs that would put the United States back into world leadership in nuclear, while answering criticism of the big light water plants of today. Most of them would even burn nuclear waste.
 
In a time of deficits, the government tends, both with new electrical generating systems and in medical research, to scatter money in the hope that this will lead to the huge private commitments that are needed.
 
Sadly, this creates a dynamic in which companies rush in to consume the seed money without being able to bring the product to fruition. It is a push rather than a pull dynamic.
 
Government works well, even efficiently, when it establishes a pull dynamic, as in the space program and in supercomputers, or most military procurement. The Pentagon does not issue funds for companies to experiment with weapons systems: It commissions them.
 
The government may have to commission new drugs and new power technologies in the high-risk future. — For the Hearst-New York Times Syndicate


Filed Under: King's Commentaries Tagged With: Big Pharma, electric utilities, electricity, federal government, nuclear power, pharmaceutical industry, risk, Scana Corporation, small modular reactors, Southern Company

The Ties Don’t Bind Anymore

January 27, 2014 by White House Chronicle

A nationwide alert, no, a worldwide alert, should be issued for the necktie. It is in great danger. It is disappearing. Soon it may be consigned to history, to live on only in old movies, like people smoking and men in hats.
 
I am not sure who signed the death warrant for the necktie, but I have my suspicions. It is a long chain of perfidy.
 
First, there was Hollywood. Actors who appear on TV talk shows – and most actors do more appearing on talk shows than acting, in the hope that this will get them jobs, so they can do more acting than appearing on talk shows – did in the necktie. One cannot calculate what these innocent little strips of cloth did to the Hollywood Hills crowd – but actors will not be caught in a suit and tie unless they are playing someone who wears a suit and tie.
 
Then there is the dotcom crowd: billionaires who declared by their actions that creative people ought to dress as though they worked for a landscaper, not the estate owner. Remember Steve Jobs, who starred in many iterations of his own show “Genius in Jeans”?
 
Well, Jobs was a genius, but he was also dressed like a slob, flaunting an everyman image when he was anything but. Now every man is going around the way Steve Jobs did, minus the genius and the billions.
 

No! No! For me, the suit and tie is native habitat. It is where I am secure — as safe as ordering chardonnay.
 
It all began with my first day of school, when I first put on what was to become the suit of my life: shirt, tie, jacket, hat or cap. When I left school, my father bought me a suit, two shirts and four collars (those were the faraway days when shirts had detachable collars) and told me I would be paying rent if I chose to stay at home. Who said the good old days were so good?
 
My first serious sartorial crisis was at a newspaper in London. It was Saturday, and I ventured in wearing a sports jacket, tie and flannels. The news editor (city editor) exploded.
 
“Are you going to a cricket match?” he demanded.
 
“No, sir, I thought it would be all right, as it is Saturday.”
 
“All right? It is not bloody all right! I cannot send you to Buckingham Palace dressed like that.”
 
“You want me to go to Buckingham Palace?”
 
“No! I want you to go home and contemplate a career change.”
 
So I stuck with a suit and tie, but it did not save me from awkwardness. At a party in Tel Aviv, given so that I could meet members of the Knesset, I showed up in a summer suit and tie. I was the only man in a suit. The only man with a tie. The only man with a jacket. The odd man out.
 
I trailed around China as a member of the press corps accompanying President Clinton on his visit. My colleagues joked about my formality of dress, so I took the plunge. When we went to the Great Hall of the People, off Tiananmen Square, to watch Clinton appear with Chinese President Jiang Zemin, I went casual.
 
By some secret telegraph, to which I was not privy, my colleagues dressed up; every man in a jacket and tie except me, looking ridiculous and disrespectful in a golf shirt. That is what happens when you let go of your principles.
 
Sometimes sartorial failure is collective. At a U.S.-Japan conference on the Big Island of Hawaii, the American delegation, myself included, showed up the first morning in island wear. The Japanese delegation wore formal suits. After the refreshment break, the Americans rushed to their rooms to get into suits and the Japanese to get into island wear.
 
If President Obama were to appear at an international conference without a tie, it would be all over for the necktie; it would move from the endangered species category to extinct. He would do it in as thoroughly as bareheaded Jack Kennedy did in the gentleman's hat. Are we better off, I ask you? — For the Hearst-New York Times Syndicate



Filed Under: King's Commentaries Tagged With: clothing, dotcoms, men's hats, neckties, President Clinton, President Kennedy, President Obama

The Shame of Biomedical Research in the U.S.

January 19, 2014 by White House Chronicle

When the dark shadow of incurable disease settles across a life, it is brightened only by the hope that science is on the job: The cavalry will come.
 
Horribly, the cavalry — researchers in the big pharmaceutical companies and the government-run National Institutes of Health and the Centers for Disease Control — may not even have mounted.
 
New drug development is a murky business governed by huge risks, inertia, bureaucracy and politics.
 
I've been looking at the role of biomedical research and the development of new therapies and drugs through the lens of one disease, Chronic Fatigue Syndrome (CFS), also known as Myalgic Encephalomyelitis. But it is symptomatic of the whole struggle for cures, which means funds. It is a peephole into a system in chaos, where good intentions, economic reality, public pressure, politics and bureaucratic apathy all play a role in where the research dollars go.
 
I've been writing about CFS for several years now, so I understand the dilemmas faced by those in charge of biomedical research in government and private industry. It is a disease of the immune system, like AIDS, but it is mostly a medical enigma. It is hard to diagnose because there are no normal markers in blood or urine. It prostrates its victims essentially for life. In its severest form, patients lie in bed in darkened rooms, often feeling that their bones are going to explode. It cries out for more research, as do many other little-understood diseases.
 
A very small coterie of physicians — maybe not many more than 50 in the United States — specialize in CFS and have developed private clinics for research into alleviating therapies. None of them are set up to do major drug research in the way that pharmaceutical companies do.
 
Big Pharma — as the drug behemoths are known collectively — is at the heart of new drug development, aided by preceding biomedical research that takes place through government grants to researchers in universities, teaching hospitals and private clinics. It is a complex matrix.
 
A new drug can cost over $1.2 billion to develop. It is a very high-risk undertaking — maybe the riskiest investment decision made in the private sector is developing a new drug. It is also a tortuous undertaking.
 
First a target has to be selected where there is a large enough patient cohort to establish a market. Then the science begins. Diseases that are straightforward, in medical terms, edge out those where the causes may be multiple and the resolution may require a cocktail of drugs. Understandably, a rifle shot is more appealing than a shotgun blast. Eight out of 10 drugs fail and are abandoned at some point. The winners have to pay for the losers.
 
If, after years of research, a compound that may work is discovered, the laborious business of testing it on animals must precede human trials with control groups and years of analysis. Finally, the drug must be approved by the Food and Drug Administration, which looks for efficacy, safety, risk-benefit and manufacturing stability.
 
Into this already difficult world of new drug development, enter the politicians.
 
Some believe private enterprise will shoulder all the risks and is the right place for research. Others don't understand the vital role that government research grants — administered by NIH and CDC — play in the development of biomedical knowledge: the essential precursor to new drugs and therapies. NIH funding is on a see-saw; it fell under sequestration and has been restored, but not boosted, under the new budget deals. It tops out at $29.9 billion, a decline of 25 percent since 2003, according to The Atlantic magazine.
 
Chronic Fatigue Syndrome — which has 1 million Americans suffering hopelessly every day — gets about $6 million a year from NIH. What's wrong with that largesse? Well, remember, it costs $1.2 billion to develop a new drug once the biomedical case is made. As they say, you do the math – and don't expect the cavalry to ride to the rescue anytime soon.
 
Across the board, researchers are dependent on government funds augmented by foundations and charitable giving. Yet biomedical research pays as a national investment. American drugs are an export commodity, the cost of healthcare is contained and, yes, the suffering is reduced even as life is extended. China, by the way, has said it will surpass the United States in actual biomedical research dollars in five years. — For the Hearst-New York Times Syndicate

Filed Under: King's Commentaries Tagged With: Big Pharma, biomedical research, CDC, China, Chronic Fatigue Syndrome, drug industry, drugs, FDA, myalgic encephalomyelitis, NIH

State of the Union: Inspire Us to Explore, Mr. President

January 16, 2014 by White House Chronicle

On Jan. 28, President Barack Obama will deliver the State of the Union address to both houses of Congress, the Cabinet, the Supreme Court and, via television, the nation and the world.
 
I think I know what he will tell us. I think I know what he will trumpet: the economy, the thaw with Iran, the domestic oil and gas outlook and, of course, the courage of our troops and the resilience of our people.
 
I would rather he told us something quite different.
 
I would rather he told us that we are facing wrenching changes in the way we work and the nature of work. I wish he would tell us that we are in such a high state of computerization that we have to create entirely new concepts of work, and that the new will replace the old.
 
He could use the examples of travel and even golf. A century ago, the travel industry was confined to the very rich. Now it is global, almost everyone travels, and travel has become the world's largest industry. Golf was for the few; now it is an enormous employer: a mega-industry in its own right.
 
I hope he will tell us that while computers take away, they also give back, and whole new areas of activity will emerge. The unemployed and underskilled are, alas, the foot soldiers in this war of change. The president should acknowledge the hurt and seek to ameliorate it.
 
I hope he will tell us that one of our strengths is that we are a people who explore, and as we explore, we will open new frontiers with new jobs.
 
I hope he will urge major new funding for the National Institutes of Health so that it can fund all of the worthwhile biomedical projects seeking support, instead of one out of 10, as at present.
 
I hope that he will ask Congress to open the spigot for biomedical research. It is a great area for American genius to again lead the world in drugs and therapies, bringing down the cost of healthcare. A pill trumps a stay in the hospital.
 
I hope that he will point out that the government does some things well, from inventing the Internet to the technology of modern oil recovery. The research, he could say, would be done in universities and private institutes, but some of the risk would be undertaken by the government. He should tell his audience that although the government has had some big failures, it has also backed some extraordinary winners. Government can work well with industry, as it did with Mitchell Energy & Development Corp., in creating the technologies that have led to the oil and gas boom.
 
He should tell us inspiring stories of the world that is to come; for example, how 3D manufacturing is going to change the way things are made, manufacturing with less waste and more precision.
 
And I hope he will tantalize us with the endless possibilities of graphene, a new material derived from graphite. Graphene is the stuff of science fiction, but it is here and now; companies around the world are filing patents by the thousands for applications for its use.
 
Graphene is a two-dimensional material — meaning that it is a single layer of carbon atoms and yet is incredibly strong — and could bring about changes only wizards might have thought possible. Its early applications are going to be in cell phone screens, computer chips and the like. But in time, when manufacturing is perfected, it could replace a slew of big, heavy materials like concrete and steel. Supposing you could wrap a power plant in it? How about a roadbed that would never wear out? Those applications are in the out years, but look for computer and telephone screens that fold like bedsheets in the near future.
 
I wish that the president would give us a glorious transcendental speech that would astound his friends and undercut his enemies; a call to embrace the future as a rich and extraordinary place, more magical than the present — which is not without magic, when you think about it in terms of the human pilgrimage.
 
Lay it on us, Mr. President, the joy of being American and alive in 2014 — in the exploration society. — For the Hearst-New York Times Syndicate
 

Filed Under: King's Commentaries

Denigrating the Unemployed; at Christmas, Yet

January 3, 2014 by White House Chronicle

 
In order to execute an abomination, it helps to create myths about the victims: the Jews aim for world domination, all gypsies are thieves, all blacks abuse government assistance programs.
 
It's a national abomination that 1.3 million Americans lost their extended unemployment benefits over Christmas. Bring forth the myth: Extending benefits only causes the unemployed to prolong their search for work, or not to look for work at all. End the benefits and they'll find work.
 
This suggests that suddenly unemployment will fall nationally from 7 percent to who knows what? Myths are great substitutes for ratiocination. Want to bet that ending extended unemployment benefits won't move the unemployment number at all?
 
Being unemployed isn't a vacation. It's not a glorious excuse to watch television at home and snigger at working stiffs who get a paycheck, have savings, take vacations, hope for promotions, and whose children will be able to afford to go to college.
 
Unemployment means cold economic fear — fear of not being able to provide for yourself and your loved ones; fear that your marriage will crumble; fear that your children will have the humiliation of not having the clothes, the electronic gadgets, the sports equipment, the vacations, the meals out and the college education, without which one is doomed to second-rateness.
 
What happens when a breadwinner loses a job? Fear for the future becomes a constant companion: it erodes the good times of family life and confiscates future plans. The specter of hunger and homelessness pushes out laughter and dreams. Worry moves in and begins to dominate a household; an unwelcome but palpable presence.
 
People who are sick to their stomachs with economic worry don't laugh much. Joblessness silences the normal joys of life.
 
Unemployment is not something I've read about. As a young man, I suffered its debilitating privations in both London and New York. I was even evicted from an apartment in New York because I couldn't pay the rent. Where will I go? How will I eat? What will become of me? These survival fears are multiplied a hundredfold when there are dependent children.
 
The jobless, although they may be so through no fault of their own, blame themselves and sink into self-flagellating despair. The desire to work where there is no work is a hunger to belong, a hunger to be useful, a hunger to provide for loved ones, and a hunger for the simple dignity of going to work.
 
Going to work is a beautiful thing. Not going to work is an ugly thing – ugly in all the horrors that can descend on a person or a family.
 
Unemployment insurance is not the solution, but it's a help; it's not a substitution, just a help – a desperately important shelter in a storm. It's not, as one conservative commentator suggested, about paying people not to work. It's about paying people to live, until they find work in an economy that is changing the very nature of work.
 
In his masterpiece “The Sun Also Rises,” Ernest Hemingway wrote:
 
“How did you go bankrupt?” Bill asked.

“Two ways,” Mike said. “Gradually, then suddenly.”
 
If Congress follows Senate Majority Leader Harry Reid's plan to pass a three-month extension of unemployment benefits when it reconvenes on January 6, then a ghastly Christmas nightmare will be somewhat alleviated for 1.3 million Americans, who gradually or suddenly fell out of work – and some into bankruptcy – and will still have to pound the pavements, looking for those elusive jobs that will bring hope and dignity back into their shattered lives.
 
No unemployment checks for our fellow Americans is an abomination, originating with congressional indifference, buttressed by conservative mythology. — For the Hearst-New York Times Syndicate

Filed Under: King's Commentaries Tagged With: jobless, jobs, U.S. Congress, unemployed, unemployment benefits, work

The Shadow of 1914

December 31, 2013 by White House Chronicle

The new year is beginning with the shadow of an old year flitting around the retina of our consciousness. That year is 1914: the year that Europe was convulsed by the world's worst war – 9 million dead.
 
It was also the war from which the world never fully recovered. In its destruction of the old order in Europe, World War I laid the blueprint for the rest of the century; its emancipations and its enslavements, its triumphs and its horrors.

The century following World War I has been a century in which blood and ideas have flowed freely. As a consequence of the war and the Treaty of Versailles, which ended it:

1. The Russian Revolution ushered in communism, and later the Cold War.

2. Britain and France carved up the Middle East with boundaries that created new countries, such as Iraq and Saudi Arabia, without regard to the promises that had been made to the Arabs during the war or regard for their sensibilities.

3. The Ottoman Empire fell, making way for modern Turkey.

4. The Austro-Hungarian Empire fell, changing the face of central and eastern Europe.

5. Monarchical rule ended in Europe.

6. Germany was so emasculated by the peace that the ascent of Adolf Hitler was possible.

7. Mechanized war was perfected, and industrialized killing by gun, bomb and, for the first time, aircraft was unleashed.

8. The combatants lost the cream of their crop of young men, many of whom would have risen to affect the 20th century after the war. The consequences of the loss of a generation of young men can be speculated upon, but not calculated.

9. The stage was set for the United States — which played a decisive role in the war from the spring of 1917 on, but was not as deeply affected as the European powers — to become the dominant nation in the later part of the 20th century and to this day.

10. The social order throughout Europe began to liberalize. Its feudal underpinnings would remain until World War II, but there was a loosening of the old bonds of class across Europe.

11. Women were beginning to share their gifts with society.

12. African colonies were taken from Germany and handed to Britain for a kind of safe-keeping, but not for the imperial expansion that Britain had been enjoying for two centuries. Britain, France, Portugal and Holland remained the colonial powers — Britain's possessions were many times greater than the rest put together.

13. Fury at the colonial system was building, especially against British control of what are now India, Pakistan, Bangladesh and Sri Lanka. The end of the colonial concept had begun, but there were many hurdles and another world war to go before it all ended in an avalanche of independence movements.

 
World War I began with the assassination of Archduke Franz Ferdinand of Austria, heir presumptive to the Austro-Hungarian throne, and his wife, Sophie, in Sarajevo. The Balkans were the tinder for the war, but the fuel was everywhere: the growth of nationalism and its arrogance; a lack of understanding of what a modern war would look like; and militarism in many countries, especially in Germany, where the high command found a fatal friend in Kaiser Wilhelm II.
 
As tensions in Europe escalated, the players scrambled for allies and these alliances led to the broader war. For example, the German High Command did not think that Britain would join the war, despite Britain's commitments to France and Russia: It thought Britain could and would remain neutral.
 
The great myth of the time was that the European powers were so intertwined in their trading relationships that war would cost too much and so peace was secure. Yet all the ingredients of combustion were present in 1914, and they were abetted by a lack of great leaders in all the countries that would fling themselves at each other.
 
It was a time of crushing mediocrity in European governance. That may have been the real cause of the world's greatest, most terrible miscalculation, 100 years ago: a leadership vacuum. Beware. Happy New Year. –For the Hearst-New York Times Syndicate

Filed Under: King's Commentaries Tagged With: 1914, 2014, Europe, governance, leadership, World War I

Christmas Is Winning the ‘War’

December 21, 2013 by White House Chronicle

The sinister forces that are supposed to be vanquishing Christmas, in what is called the “War on Christmas,” are in retreat. In fact, they are celebrating it.

Across secular Europe the creches are on display and decorations adorn street lamps. In most towns and villages, the central square is transformed into a Christmas market with a skating rink and stalls selling good things to eat and, even better, to drink. A million amplifiers blast carols in many languages. More traditional carolers go door to door.

Across the United States Christmas fever has been building, like the strains of Maurice Ravel's “Bolero,” since Thanksgiving. It is humanity's greatest festival; a wonderful collective indulgence, a surrender simultaneously to our profound and trivial selves.

The “War on Christmas” is an argument advanced by commentators on Fox Cable News that centers on skirmishes over the First Amendment. Fox actually publishes on the Internet a map of sites where it believes the forces opposed to Christmas are in hand-to-hand combat with the defenders of the Baby Jesus. Really!

The crux of the argument from the “war” people is that Christmas is a religious celebration that has been taken over by the ungodly. In fact, historically, it is an ungodly festival that was taken over by Christianity. It was a pagan festival that became a Christian festival and adjusted to the lands where it spread—and to the religious intensity of the time.

There is no mention of snow in the Bible; but thanks to Northern and Eastern Europeans, snow is part of Christmas. In hot Africa and India, shop windows are decorated with cotton wool and children sing “Good King Wenceslas” with the acceptance that snow is part of their Christmas, too. Yes, people who have never seen snow can dream of a white Christmas. That is just part of the great cultural snowball that is Christmas.

There is a silliness attending those who persist in believing that the forces of atheism, secularism and all the other religions, especially Islam, are out to rip the religious soul out of Christmas. Not quite. In Islam, Jesus is a prophet and a messiah, and to be a believer, you must accept him. Others love the story of the nativity without accepting it as a threat to their beliefs.

One of the joys of Christmas is that it is such a wondrous bundle of beliefs, cultural agglomerations and ethnic inclusions that to strip out any of them is to do violence to the best time of year all over the world. Charles Dickens' masterpiece “A Christmas Carol” may embody the Christian spirit, but it features ghosts; Father Christmas comes from a union of German and Nordic mythology with the first Christian saint, Nicholas, who was known for his gifts to the poor. The old man who lives at the North Pole is now a global figure – incidentally, Megyn Kelly – of many ethnicities. There is an Indian version, a Turkish version and a Brazilian version of him. I doubt any of these three is thought of as Caucasian.

Christmas is a festival of many splendors: decorations, from Russian icons to tinsel made in China; flora, from fir trees and mistletoe to ferns, in tropical climes; food, from German stollen to Mexican bacalao; music, from Bach to Broadway.

Much of the argument nowadays is about Christmas greetings, “Merry Christmas” versus “Happy Holidays.” My father, who read the King James Bible every day, never read the U.S. Constitution, never heard of the separation of church and state, and lived all his life in British Africa, used to say, “Season's Greetings” or “Compliments of the Season.” His argument was that “not everyone is a Christian, but everyone has Christmas.” Quite so. Merry Christmas. — For the Hearst-New York Times Syndicate

Filed Under: King's Commentaries Tagged With: Christmas, Fox Cable News, Megyn Kelly

The Mandela Doctrine and McCain’s Heresy

December 15, 2013 by White House Chronicle

What does one do about John McCain? Why can he not play the senior statesman? He is a veteran who has endured more than anyone should endure during his imprisonment in North Vietnam. He is a Churchill scholar. He has been a distinguished senator, a worthy presidential aspirant and a powerful voice for many causes.
 
But he cannot help himself: the ill-considered statement is his trademark. Without knowing anything about the situation on the ground in Syria, McCain was foursquare for American intervention. Now he says that President Obama's shaking hands with Cuban President Raul Castro was akin to Neville Chamberlain's shaking hands with Adolf Hitler.
 
McCain knows much more about the events of 1938 than this cheap shot suggests – I have heard him hold forth in front of the Churchill Society on the unfolding of the Third Reich's European strategy. So he knows better than to compare Obama's handshake with Castro to Chamberlain's grasp of Hitler's contaminated paw.
 
It is little understood these days in the United States how few options Chamberlain had, and how he owed it to the British people to forestall war until they were somewhat more ready to fight it. That is why Churchill joined the cabinet — and why, at the time, he accepted Chamberlain's action.
 
But that is not the point. In his way, McCain's remark trashes the Mandela doctrine, laid out in “Long Walk to Freedom,” Mandela's 1994 autobiography: “If you want to make peace with your enemy, you have to work with your enemy. Then he becomes your partner.”
 
Mandela's stature grew as his personal serenity and sense of high moral purpose began to be known not only outside Robben Island, but also inside the prison as he began to affect his jailers.
 
Mandela was schooled by the Methodist missionaries, who educated him to persevere and to seek peace; to turn the other cheek. This was one part of his inner strength. The other came from his birth as a nobleman of his tribe; someone in line to be its king if the wider struggle had not been paramount.
 
From Mahatma Gandhi, who had led a civil rights campaign for Indians in South Africa in the first decade of the 20th century, from the missionaries and from his birth, Mandela knew who he was. He also had a selflessness. He could have been released from prison a decade earlier, if he had been prepared to renounce violence. He was not.
 
Unlike Gandhi, Mandela thought violence was a necessary tool in the struggle. Many otherwise good white South Africans thought he should have been put to death – much in the same way we feel about terrorists today.
 
Yet when apartheid fell, not least thanks to Mandela's great partner in the making of the new South Africa, former President F.W. de Klerk, Mandela insisted on peace and reconciliation, saving a troubled, beautiful land from more bloodshed.
 
Mandela shook the hands of his enemies, those who had imprisoned him for 27 long years. He shook their hands just as McCain went back to Vietnam and shook hands there.
 
In that atmosphere of celebrating the life of a man who had the genius to shake the hands of those who wanted him dead, and then to reconcile with them, it would have been a travesty of Mandela's legacy for Obama not to have shaken the bloodstained hand of Castro. That is what Mandela would have wanted and what he would have done himself.
 
It is probably what McCain would have done, too, had he won the presidency. — For the Hearst-New York Times Syndicate

Filed Under: King's Commentaries Tagged With: Adolf Hitler, Barack Obama, F.W. de Klerk, John McCain, Nelson Mandela, Neville Chamberlain, Raul Castro

