Categories
Uncategorized

So what?

Though it has not been commemorated — or even noticed by most Americans — a steadily diminishing number among us might regard January 27 of fifty years ago as a day of infamy. On that day in 1973, the United States formally ended its involvement in the Vietnam War with an accord signed in Paris. Many regarded that end, and some still do, as an American military defeat — we turned tail and ran, for the first time in our history. While it took two years for Communist North Vietnam to consolidate its victory and unify the country under its rule, that outcome was inevitable once the U.S. ended its involvement.

Was it a defeat for America? Like so many things, it depends on the definition. Defeat in war implies surrender, occupation by the enemy, reparations, regime change, and other humiliations. France under Napoleon was defeated; Germany and Japan certainly were defeated in World War II. America suffered none of these catastrophes. In this country the domestic fury and civil strife over the military intervention to support the South Vietnamese regime quickly abated. America focused on its domestic issues for the rest of the decade. The only serious foreign scrape was the Desert One debacle, when President Carter attempted to rescue the hostages from the U.S. Embassy in Iran — an episode totally unrelated to the Vietnam situation.

Assessment of whether ending the U.S. involvement was a defeat depends on the context. The Vietnam War occurred in the middle of the Cold War. The raison d’être for American involvement in Vietnam was to halt the spread of Soviet Communism, which was seen as a threat to world peace and freedom in the developing countries — the Third World, as it was termed. After World War II the USSR and its international Communist apparatus actively sought world domination — Soviet Premier Nikita Khrushchev famously declared “we will bury you” to the United States and its allies. Khrushchev appeared to be serious. Cuba — 90 miles from Florida — became a Soviet client state under Fidel Castro in 1959 and soon threatened the U.S. with intermediate-range missiles. Soviet-sponsored aggression in the Third World, thinly disguised as “liberation” movements, was on the march. The Eastern European nations, China, and North Korea were client states of the Soviets. A decade before full-scale American involvement in Vietnam, the United States, sanctioned by the United Nations, took the lead in the effort to stop Communist aggression in Korea. Though it became a stalemate, to the extent that intervention prevented the North Korean takeover of the South it was successful. Successive administrations in Washington believed a similar result in Vietnam was possible.

The nineteenth-century Prussian soldier and military theorist Carl von Clausewitz wrote a treatise, On War (Vom Kriege in German), in which he theorized that war is “a true political instrument, a continuation of political intercourse, carried on with other means.” In other words, war is one means to achieve a political end. The aim of North Vietnamese military commander Vo Nguyen Giap and his political leader Ho Chi Minh was unification of Vietnam under a Communist regime as a client of the Soviet Union. The policy of the United States and its allies was to curtail the global spread of Soviet influence.

After the American withdrawal, which was precipitated mostly by domestic politics, the South Vietnamese were not able to fend for themselves for long. Thus, Giap and Ho achieved their policy aims.

After the normalization of the United States’ diplomatic relations with Vietnam in the 1990s, the late Senator (and one-time Presidential candidate) John McCain met with General Giap. McCain, who had been a prisoner of war in the North, reportedly told Giap that the North and its Viet Cong guerrilla allies never defeated the U.S. military in battle. Giap agreed that was true, but irrelevant. In succinct words: “So what?”

Clausewitz’s theory works both ways. The overarching containment policy of the U.S. and the West in the 1980s was continued “with other means.” These means were the economic, cultural, and moral forces led by Ronald Reagan, Margaret Thatcher, Pope John Paul II, and the courage of Polish patriot Lech Walesa and many others. November 1989 saw the fall of the Berlin Wall. Two years later the hammer and sickle was hauled down from the Kremlin, signaling that the USSR was no more. Vo Nguyen Giap had won his battle, but Soviet Communism ultimately lost the global war. To the extent America might have lost in Vietnam, we may also respond “So what?”

Notes

— Clausewitz’s expanded view has been translated from the German as “that war is nothing more than a continuation of the political process by applying other means. By applying other means we simultaneously assert that the political process does not end with the conclusion of the war or is being transformed into something entirely different, but that it continues to exist and proceed in its essence, regardless of the means it might make use of.” See On War, translated and edited by M. Howard and P. Paret, Princeton University Press (1984 ed.). Clausewitz’s views have not gone unchallenged.
— After the Soviet Union collapsed, Dallas’s eccentric restaurateur Harvey Gough obtained a statue of Vladimir Lenin during a visit to Russia and placed it in front of his hamburger restaurant on Lovers Lane, inscribed with the words “America Won” on its base.
— Full disclosure: This writer served in the United States Army during the height of the Vietnam War. The vagaries of military personnel assignment sent me to Korea and a stateside post, not Vietnam. In that sense, I am not a “Vietnam veteran,” though I could have been.



88

If Elvis were still with us, he would be 88 years old today. I remember well the day his death was announced on the radio. He was only 42 when he died in his Memphis, Tennessee, mansion known as Graceland. Though I have been through and to Memphis several times, I have never visited Graceland, which became and remains a tourist spot.

This past November, however, on our way to the Smoky Mountains in western North Carolina, we spent the night in Tupelo, Mississippi, the place where Elvis was born. The next day on our way out we visited Elvis’ birth home in Tupelo. That house, where his mother actually gave birth, is as modest as Graceland is reported to be opulent. Excluding the front porch, the Tupelo house’s exterior measures about 16 x 21 feet, the size of an average middle-class living room today. Modest though the house was, Elvis’ father was unable to repay the loan on it, and it was foreclosed only a few years after Elvis was born. The family lived in various places in Tupelo until they moved to Memphis when Elvis was 13. The rest, of course, is history.

Elvis’ house is in a park located at 306 Elvis Presley Drive. The park has a museum dedicated to Elvis’ memory, as well as the church building the family attended, where he sang in the choir as a child.

Except as a pilgrimage for the dedicated Elvis fan (which I am not, particularly), the visit is probably not worth a special trip. But when passing through or otherwise in Tupelo, it makes an interesting stop, showing that humble beginnings in our country do not foreclose the possibility of fame and fortune. Also, during his career, especially in its beginning stages, his “cultural appropriation” of what was called Rhythm & Blues, or “race music,” probably was instrumental in the civil rights movement that began in the 1950s.

Elvis Birthplace

Church

Note: During my high school years, I, and at least one of my readers, worked a part-time job at the Circle Theater, a neighborhood cinema in Dallas (the building still exists, but hasn’t been a film venue for several decades). One of our tasks was to change the marquee when a new movie was to be shown. That required putting up a ladder and spelling out the name of the show and its star with individual 10″ letters on a lighted background above the theater entrance. When one of Elvis’ films was to be shown, I complained to the manager that there were not enough of the right letters in stock to spell the name of the film and “Elvis Presley” on both sides of the marquee. The manager said to just use “ELVIS” — there was no one around who would not know who he was. Of course, the manager was right — celebrity rules, then and now.


Prescient (I Hope) Words

I thought it was worthwhile to pass this on.

Our next President? Perhaps.

Florida Governor Ron DeSantis, in his second-term inaugural address, Jan. 3:

“It is often said that our federalist constitutional system—with 50 states able to pursue their own unique policies—represents a laboratory of democracy. Well, these last few years have witnessed a great test of governing philosophies as many jurisdictions pursued a much different path than we have pursued here in the state of Florida. The policies pursued by these states have sparked a mass exodus of productive Americans from these jurisdictions—with Florida serving as the most desired destination, a promised land of sanity.

“Many of these cities and states have embraced faddish ideology at the expense of enduring principles. They have harmed public safety by coddling criminals and attacking law enforcement. They have imposed unreasonable burdens on taxpayers to finance unfathomable levels of public spending. They have harmed education by subordinating the interests of students and parents to partisan interest groups. They have imposed medical authoritarianism in the guise of pandemic mandates and restrictions that lack a scientific basis.

“This bizarre, but prevalent, ideology that permeates these policy measures purports to act in the name of justice for the marginalized, but it frowns upon American institutions, it rejects merit and achievement, and it advocates identity essentialism.

“We reject this woke ideology. We seek normalcy, not philosophical lunacy! We will not allow reality, facts, and truth to become optional. We will never surrender to the woke mob. Florida is where woke goes to die!

“Now Florida’s success has been made more difficult by the floundering federal establishment in Washington, D.C.

“The federal government has gone on an inflationary spending binge that has left our nation weaker and our citizens poorer, it has enacted pandemic restrictions and mandates—based more on ideology and politics than on sound science—and this has eroded freedom and stunted commerce.

“It has recklessly facilitated open borders: making a mockery of the rule of law, allowing massive amounts of narcotics to infest our states, importing criminal aliens, and green-lighting the flow of millions of illegal aliens into our country, burdening communities and taxpayers throughout the land.

“It has imposed an energy policy that has crippled our nation’s domestic production, causing energy to cost more for our citizens and eroding our nation’s energy security, and, in the process, our national security.

“It wields its authority through a sprawling, unaccountable and out-of-touch bureaucracy that does not act on behalf of us, but instead looms over us and imposes its will upon us.

“The results of this have been predictably dismal. This has caused many to be pessimistic about the country’s future. Some say that failure is inevitable.

“Florida is proof positive that We the People are not destined for failure. Decline is a choice. Success is attainable. And freedom is worth fighting for.”

Appeared in the January 5, 2023, print edition of the Wall Street Journal as “Notable & Quotable: Gov. Ron DeSantis,” and elsewhere.

By the way, today is also Christmas to our Eastern Orthodox Christian (Greek, Russian, Serbian, etc.) brothers and sisters.


Merry Christmas to All

Beginning with Thanksgiving every year, there are numerous parties and parades in anticipation of Christmas. Recently many, if not most, of them have been officially designated by the rather sterile “holiday” appellation. Nevertheless, the reason for these events remains the festivities traditionally surrounding the anticipation and celebration of Christmas Day.

The period beginning with Thanksgiving and extending through New Year’s Day, or for some, Twelfth Night, is often referred to generically as the holiday season. Christmas, its centerpiece, has its origin in the celebration of the birth of Jesus Christ, and thus gave the season its name. This period, however, also includes the Jewish celebration of Hanukkah, which is not a major feast for Jews, and Kwanzaa, a recent innovation ostensibly rooted in sub-Saharan African tradition. Greek, Russian, and other Orthodox Christian denominations actually celebrate Christmas on January 7.

Now I have no trouble with honoring other traditions during this period, but the focus and the real reason there are such celebrations is Christmas, the commemoration of the birth of arguably the most influential man in history. The number of those who adhere to the Christian tradition, whether devout or nominal, exceeds that of every other.

The politically correct crowd seems to believe that terming the season and the parties and the parades “Christmas” excludes and dishonors the other traditions. Erecting Christmas scenes such as the Nativity in public places is feared as an endorsement of a particular religion by government. This notion is poppycock.

Christmas began as the celebration of the birth of Jesus Christ sometime in the early Christian era. No one knows for sure the day, or even the time of year, when the Nativity actually occurred. Around the winter solstice in December is as good a time as any. Christianity took root primarily in Europe. The Western Hemisphere, Australia, and New Zealand after the 16th century became essentially a greater Europe. European traditions, including those important to the celebration of the Christian religion, traveled with the European — primarily British, French, and Iberian — settlers. Thus, the dominant culture in those areas included Christian practices and traditions. Were it not for this origin, it is doubtful there would be a “holiday” season as significant as it is.

Christmas has evolved into being as much a secular holiday as a religious one. Gift giving, trees, poinsettias, lights, holly berries, eggnog, and so forth are as much a part of the season as church services, creches, and Advent wreaths. Carols are the great crossover between the religious and the secular. The great oratorios of Handel and Bach, as well as their less ambitious pieces, are enjoyed by the devout and irreligious alike.

Five years ago this past week there was an op-ed in the Dallas Morning News titled “This atheist loves Christmas, so stop the war on my favorite holiday.” In that column, Zachary Moore says “because I love Christmas so much … Squabbling over the public square diminishes my enjoyment of the season.” Moore goes on to propose that “Christmas [should] henceforth be treated as a secular holiday open to the interpretation and enjoyment of all. Christians are welcome to revel in its theological implications, while atheists and others may pick and choose whatever resonates with their own particular values. The Christmas tree in the square will be a malleable and inclusive symbol, able to support the weight of Magi, Menorah and Muhammad, as well as any other marginalized culture that would appreciate a little bit of fun in the darkness of winter (including we joyless atheists),” his parenthetical being a welcome tongue-in-cheek.

So, like this sensible unbeliever, let us all cut out the nonsense, politically correct or not. Christmas should not be a political or culture-war football, but for all a joyous time of some respite from the slings and arrows of daily life, whatever their religious or political persuasion.

I close with:

Crăciun fericit

Wesołych Świąt

Linksmų Kalėdų

Hyvää Joulua

Sretan Božić

Veselé Vánoce

Fröhliche Weihnachten

Buon Natale

Joyeux Noël

¡Feliz Navidad! (If you live in Texas and don’t know what this means, you reside in a cave.)

(In other words, Merry Christmas to all, and to all a good night.)

This is an updated version of an essay published five years ago.


He Saved Us All

In the American Heritage history of World War II, C. L. Sulzberger, New York Times columnist (and member of the family that owns that newspaper), wrote this:

“Remember him, for he saved all of you: pudgy and not very large but somehow massive and indomitable; baby-faced, with snub nose, square chin, rheumy eyes on occasion given to tears; a thwarted actor’s taste for clothes that would have looked ridiculous on a less splendid man… He fancied painting, at which he was good, writing, at which he was excellent, and oratory, at which he was magnificent….

“This was the man, bloodied at Omdurman [an 1898 battle in Sudan] and Cuba, among the Pathans and Boers, long before most of those he led were even born, who guided Britain to victory in World War II — and, one might add, who was the guiding spirit for the whole free world. For had Britain succumbed, as it had every logical reason to do in 1940, probably no successful coalition could have been formed.”

Sulzberger, of course, was lauding Winston Churchill, who was born on this day 148 years ago. Churchill took over as prime minister of Great Britain when Hitler’s Germany had run roughshod over all opposition on the European continent and was threatening that island nation. In a fierce air war, the Royal Air Force, inspired by Churchill’s indomitable spirit, shot down more of Hermann Göring’s Luftwaffe warplanes than it lost and effectively won the Battle of Britain. After it became clear that an amphibious invasion was not going to work, Hitler turned to terror bombing of English cities in what became known as the Blitz. Throughout all of this, and for the next four years, Churchill remained steadfast and defiant. His grit led Britain and its empire, along with its American allies, to total victory over the Nazi state.

In hindsight, and most historians appear to agree, had Great Britain made peace with Hitler in 1940 when it seemed prudent to do so, Nazi Germany would have won, as it almost did anyway.

On this November 30, fifty-seven years after his death, it is appropriate to commemorate Winston Churchill’s birthday. He was the man of the 20th century. Where Hitler attempted to destroy Western Civilization, Churchill saved it.

Note: Churchill was Prime Minister from May 1940 until July 1945. After Germany surrendered in May 1945, he called an election, but his party was defeated. He again became PM in 1951 and was Queen Elizabeth’s first (of 15) Prime Ministers when she succeeded to the throne the next year upon the death of her father, King George VI.


A Judgment on Columbus Day 1968?

Wednesday, October 12, commemorates the day in 1492 when Christopher Columbus and his ships landed on an island in the Bahamas, thus discovering what came to be known as the Americas. (It is celebrated as a holiday on Monday, to give those who observe it a three-day weekend.)

In recent times this holiday, and Columbus himself, have been held in disrepute by some, all of whom are on the political left, as a celebration of imperialism and colonialism. This is but one example of virtue-signaling by leftists and their fellow travelers in the Democratic Party, who seek to coalesce a sufficient number of supposedly aggrieved groups to advance candidates who would support insane socialist schemes.

Columbus’s voyages were indeed the advent of exploration and colonial settlement whose effect was to spread Western Civilization throughout the world. On balance, this led to immense benefit, if not always to the then indigenous populations of the European colonies, certainly to their descendants.

This writer posted essays on the subject in years past, the most recent criticizing as misplaced the Dallas City Council’s re-designation of Columbus Day as “Indigenous Peoples Day.”

(This post will not belabor the previous essays. For those interested, follow the links at the end of this post.)

The 2021 essay mentioned Michael Musmanno, a U.S. Navy officer, lawyer, and jurist who was instrumental during the 1930s in persuading the Franklin Roosevelt Administration to make Columbus Day a federal holiday so as to recognize the contributions of Italian-Americans. Musmanno was remarkably accomplished. He wrote numerous books and judicial opinions. In one book he argued that Columbus, not the Vikings, was the first European to discover the Americas. He served in both World Wars and was a military governor in Italy after the Mussolini government surrendered. He also presided over one of the courts trying Nazi war crimes perpetrators at Nuremberg. Elected to the Pennsylvania Supreme Court in 1951, Musmanno served there until his death in 1968. As a judge, politician, and private citizen, Musmanno was a colorful character, often quite the contrarian. He wrote a record number of dissenting opinions during his 17 years as a Justice on the Pennsylvania Supreme Court, some much to the annoyance of his colleagues.

Musmanno was an intensely devout Christian. He attended Mount Saint Peter Catholic Church, founded by Italian immigrants near Pittsburgh, for most of his life. While he supported the First Amendment separation of church and state, he believed that public expression of religious belief should also be protected. The last of his many dissenting opinions opposed overturning an attempted-rape conviction in a case in which the court ruled that the trial judge had improperly instructed the jury by telling them to decide the case on their consciences. Musmanno quoted from the record that the judge “particularly stated: ‘I’m not telling you what kind of verdict to bring in,’ and he then added, ‘but I’m telling you to stand up like men and women and do what you should do before your God to whom you will answer some day whether you answer to this court or not.’”

He wrote in his dissent “I was afraid it would come to this. It is becoming the fashion to make light of religious invocation. Books are being published asking whether God is dead. Well, God is not dead, and judges who criticize the invocation of Divine Assistance had better begin preparing a brief to use when they stand themselves at the Eternal Bar of Justice on Judgment Day.” Justice Musmanno, concluding his dissent, stated: “I am perfectly willing to take my chances with [the trial judge] at the gates of Saint Peter and answer on our ‘voir dire’ that we were always willing to invoke the name of the Lord in seeking counsel in rendering a grave decision on earth, which I believe the one in this case to be. — Miserere nobis Omnipotens Deus!” (Commonwealth v. Hilton, 432 Pa. 11 at 42 (1968)).

Justice Musmanno died the following day, October 12, 1968, Columbus Day, and presumably that voir dire took place.

Michael Musmanno was buried at Arlington National Cemetery. See his epitaph here:
https://www.arlingtoncemetery.net/mamusman.htm

Prior Columbus Day posts.


Requiem for a Queen (but not Her Realm)

“Far-called our navies melt away —
On dune and headland sinks the fire —
Lo, all our pomp of yesterday
Is one with Nineveh and Tyre.”

So wrote poet and Nobel Laureate Rudyard Kipling on the occasion of Queen Victoria’s Diamond Jubilee in 1897. On this occasion of Queen Elizabeth II’s passing, I say — “Not so fast.”

It may be observed that of the world-historical leaders of Great Britain, two of the three most effective and consequential were women: Queen Elizabeth I, who saved England from foreign domination and probable destruction in the 16th Century; Winston Churchill, who saved his country — and perhaps the rest of us — from Hitler and the Nazis; and Margaret Thatcher, who revived a declining nation and reasserted the greatness of Great Britain. Churchill, toward the end of his political career, and Thatcher served as Prime Ministers under our Queen Elizabeth II. Perhaps because of her 70-year reign, and the stability she lent in what were by any standard tumultuous times, we can add the late Queen to the list.

Among the commentaries heard and read so far are those voices that decry the very existence of the British monarchy as undemocratic, archaic, and the vestige of a brutal and rapacious empire that conquered and enslaved half the world. That is to be expected. There is money to be made by continuing to pick at the scabs of healing past wrongs — “wrong” being defined by present standards of a certain political persuasion. Pay them no mind, except to be aware of their existence and not fall for their nonsensical vowel movements.

As for the British Monarchy being archaic — well, that is the point. Since the accession of King George I around 300 years ago, the monarchy has been mainly ceremonial, with political power vesting in Parliament. That arrangement has made the monarchy more useful and survivable over the subsequent two centuries, during which many other monarchies in Europe and elsewhere fell. Before that time, many of the crises in English and British history — most notably the 1066 Norman Conquest and the long-running 15th Century Wars of the Roses — were the result of disputed succession. Since Queen Anne died in 1714, there has been no dispute over who succeeds to the throne upon the monarch’s death. The vagaries of political shifts occur, sometimes tumultuously, but the Crown endures. Much political rancor is avoided when it is not necessary to elect the head of state.

Of course, much of this depends upon the character and behavior of the monarch. Henry VIII, and any of the kings of France prior to the revolution, could not have lasted one week from the 18th Century forward. Great Britain evolved into a parliamentary democracy within the overall framework of a symbolic monarchy. Stability during profound change was in many ways a result of those wearing the British Crown. Elizabeth II, in large part because of her 70-year reign, but also because of her character, proved to be most accomplished in that regard. But her predecessors, notably Victoria and the most recent Georges (Elizabeth’s father and grandfather) provided that stability when it was needed.

As for the British Empire, to paraphrase William Faulkner, it is not dead; it is not even past. It is true that political domination of the Empire’s colonies is no more. But the cultural influence is manifest. If not the largest by sheer number of individuals who speak, read, and write it as a first or second language, English is the language of diplomacy, air commerce, most international business transactions, and now the Internet. Because of the facility the English language obtained by cross-fertilization — and dare I say it, cultural appropriation — from many places throughout the world, English will continue to be dominant. If for no other reason (and there are many), the sun has not, and will not, ever set on the now soft power of the Empire.

“Come Nineveh; Come Tyre”? Not for a while.

Along with language, the British brought a political and legal system that emphasizes individual rights and due process, not only to North America and the Pacific nation-islands, but also to many African and South Asian nations. Whether those systems have been, and are being, administered imperfectly and unevenly, is beside the point. The systems are there, and available as a framework to provide justice and protect life, liberty, and property. They have quite often done so in a manner superior to whatever they replaced.

King Charles III, together with his Prime Minister, will face the Sceptered Isle’s many challenges throughout the coming years, and probably decades. Charles has been considerably less popular among Britons than his mother was. That could change if he refrains from doing something barbarous. It does appear that he is off to a good start.

The Queen’s last official act was to appoint the new Prime Minister. Interestingly, the appointee is another Elizabeth, although she is popularly and officially known by her nickname, Liz. One wonders if that was deliberate, so as to avoid confusion with the late Queen. Anyway, Ms. Truss has challenges before her even more formidable than those facing Charles. Great Britain could be at the point where it really needs another Winston Churchill or Margaret Thatcher. Is Liz up to the task? For her, as well as the King, we will see.

Will the monarchy be abolished at some point? It is doubtful. No one in Britain who has any real influence is that stupid. The symbolism and cultural value to the British people aside, the pomp and splendor have great economic value for export and visitors from abroad who spend a lot of money there. Without the now King (and future successors to the throne) a visit to the United Kingdom would have all the panache of a trip to New Jersey.

As for our late Queen Elizabeth II, as she lies in state, to paraphrase another English poet, “She had a lovely face, God in his mercy lend her grace.” She had enough grace to lend us all. May she rest in peace.

Notes

The lead-in quote is from Kipling’s poem “Recessional”

“Come Nineveh, Come Tyre” is the title of Allen Drury’s 1973 dystopian novel in the series that began with “Advise and Consent”

The paraphrase in the final paragraph is from Alfred Lord Tennyson’s poem “The Lady of Shalott”

Native speakers of Chinese outnumber native English speakers, but not when combined with those whose second language is English. Anyway, most commerce between China and the rest of the world is conducted in English.

The Monarch has the theoretical power to veto a bill of Parliament. The last time that power was used was by Queen Anne in 1708. It is difficult to imagine a time it would be appropriate and would not provoke a constitutional crisis. One power the Queen, and now the King, could use when necessary is to choose a Prime Minister in the event of a hung Parliament; that is, when no leader has a majority. An analogous reserve power was exercised on Elizabeth II’s behalf by her Governor-General in Australia in 1975, when it was necessary to resolve a parliamentary deadlock and form a government.


Lesson from a Fallen Empire

August 14th and 15th, 2022, are respectively the 75th anniversaries of independence for Pakistan and India. India today is known as an economically thriving and stable republic; Pakistan, somewhat less so. The history of these two nations in their present form began with the end of Great Britain’s imperial rule of the subcontinent in 1947. One of the historical lessons we in the United States of America can learn, if we will, comes from the trauma that accompanied these two nations’ achievement of independence.

Prior to the British arriving in the early 17th Century, and for some time after that, the Muslim Mughals had established an empire over most of the subcontinent. That empire, though shrinking in area and power, lasted into the 19th Century. The British presence began with the East India Company, chartered by Parliament to establish commerce in east and south Asia. By the 1850s, that company had become a quasi-governmental power in India, even possessing its own private army manned by Indian soldiers called Sepoys. In 1857, following a rebellion by Sepoys in the company’s army, the British government assumed direct responsibility and control in what became known as the Raj. In 1876, Prime Minister Benjamin Disraeli had Parliament declare Queen Victoria “Empress of India,” adding that appellation to her other titles. That title for British monarchs lasted until the partition and subsequent independence of what is now India and Pakistan.

Prior to independence, the Indian subcontinent consisted of a number of provinces directly administered by Britain and numerous princely states ruled by potentates with various titles, such as “maharaja” if Hindu or “nawab” if Muslim. There were other titles as well; for example, “Nizam” in Hyderabad. The princely states accepted British hegemony in varying degrees, especially in external matters. The most significant of the British-administered provinces were Bengal in the east and Punjab in the north. There was a polyglot of languages and ethnicities throughout the subcontinent, and two main religions: the most numerous faiths were Hinduism and Islam. In Punjab the Sikhs, an offshoot of Hinduism, were a significant minority. Members of these communities often lived near each other, and sometimes shared the same village.

Serious indigenous agitation for the end of British rule began in the early 20th Century. A principal impetus for independence was economic, including what many Indians regarded as oppressive taxation by the British, along with the high-handed manner in which they were ruled. Hindu leaders, especially Jawaharlal Nehru, Vallabhbhai Patel, and the charismatic Mohandas Gandhi, engaged in various activities resisting British rule, most of them passive resistance rather than active violence. Mohammed Ali Jinnah emerged as the leader of the Muslims. (Interestingly, all of these individuals, including Gandhi, were British-educated lawyers.)

Subsequent to World War II, agitation for independence increased. Despite the Allied victory, Britain was politically and economically exhausted. Maintaining the Indian Empire was a net liability that had become unaffordable. The recently installed Labour government established a policy that India would be granted independence, and very soon. Prime Minister Clement Attlee appointed Viscount (later Earl) Louis Mountbatten, a cousin of King George VI and a naval commander in the Far East during the war, as Viceroy. Mountbatten was explicitly given the mission of establishing Indian independence.

Two issues with independence were evident. The first was the status of the princely states. With the exception of Kashmir, which remains troubled even today as a flashpoint between the two nations, and Hyderabad, which was forcibly integrated into India, this did not turn out to be a problem. The other, more serious, and ultimately a disaster, was the Hindu-Muslim question.

The Muslim leader, Jinnah, expressed concern that an independent monolithic India would become a theocratic Hindu state in which Muslims would be second-class citizens. Gandhi was Hindu, but he believed in toleration of all religions and expressed that toleration in many ways over his entire career. Nehru was Hindu, although non-observant and secular in outlook. Even so, Hindus were a majority, and an extreme sectarian nationalist group known as the RSS was increasing in influence.

Even if Jinnah really believed that a majority-Hindu nation-state encompassing the entire subcontinent would oppress Muslims, he doubtless also saw identity politics as his way to achieve personal power. He thus insisted upon a partition of the subcontinent along religious lines.

The Moguls were Muslims who ruled India as their empire for several centuries, and they do not appear to have been oppressive towards the Hindus. After the British assumed control, both Muslims and Hindus lived together more or less peaceably in Bengal and Punjab; at the least, they did not indulge in wholesale pogroms, or anything close. That comity became strained with the realization that independence was not far off, a strain fueled in particular by the fear-mongering of Jinnah and his followers. They insisted on a separate Pakistan for Muslims, and threatened a bloodbath if one was not created.

Pakistan was created and a bloodbath did occur. Several million Hindus, Muslims, and Sikhs were murdered in sectarian violence in the wake of independence and partition. Most of the violence occurred in Punjab and Bengal.

In order to expedite independence, Mountbatten and the government in London concluded that partition was inevitable and should be accomplished at the same time independence was granted. The British-governed provinces and princely states that were predominantly of one sect or the other would accede to the newly independent India if Hindu, or to Pakistan if Muslim. Punjab and Bengal were the difficulty: each had significant numbers of both sects, and Punjab also had the Sikhs and their holy city of Amritsar. A British lawyer named Cyril Radcliffe, who was completely unfamiliar with the cultures of India, was given the task of drawing partition lines across each of those provinces. With so many villages and areas intermingled, and with only five weeks to do the job, the task was hopeless from the beginning. Nevertheless, Radcliffe finished with a boundary line that pleased no one and left millions of Hindus and Sikhs in Pakistan and many Muslims in India. The western area of Punjab went to Pakistan, as did the eastern part of Bengal (since 1971, the independent nation of Bangladesh).

Almost immediately, the Muslims in their portion of each province set upon the Hindus left there, and Hindus wreaked similar violence upon Muslims left in India. Wholesale massacres occurred. Refugees streamed across the new borders; trainloads of dead Hindus arrived in India; likewise, dead Muslims arrived in Pakistan. Riots broke out in major Indian cities, particularly Delhi and Calcutta. The police and military were overwhelmed. The British Army was gone or going, and the government in London, still recovering from World War II, had neither the desire nor the means to use its forces to assist.

Gandhi lent his prestige with both Hindus and Muslims to encourage an end to the violence, and began one of his famous fasts. The fasting appears to have had some salutary effect in Calcutta, and he had some success in Delhi as well. But in January 1948, as Gandhi was on his way to prayer after ending his fast, a Hindu fanatic shot and killed him.

After Gandhi’s assassination — some might say martyrdom — much of the violence was brought under control. The two nations, however, have remained antagonistic toward each other for the 75 years since. As both are now nuclear-armed, one hopes that cool heads prevail.

Much of the blame for the partition, and for the murder and mayhem that occurred in its wake, was placed on Mountbatten and the British government generally for botching the grant of independence. There is probably enough blame to go around, but Mohammed Ali Jinnah stands out as the main culprit, along with those who acquiesced in his religious identity politics.

While analogies invariably break down when pushed too far, there are disquieting circumstances in our country today that bear some similarity. Identity politics is poison, be it based on race, ethnicity, religion, sex, or any other category that relies on collectivism and on irrelevant or immutable characteristics attributed to different groups of men and women. The respected historian David McCullough, who died this past week, believed that learning from history is necessary, though he cautioned that one should step back and take the long view. Well, the long view is that conflict based on identity politics has never turned out well. From the religious violence of the Crusades and Europe’s Thirty Years War in the 17th Century, through the Nazi Holocaust — and the Indian partition — to the present-day jihads, incalculable destruction and suffering have resulted. Here in the present-day United States, avaricious individuals have fostered, and are fostering, identity politics based on race and ethnicity as well as sex. This cannot end well. We must remember the intrinsic worth of individuals, and reject the poisonous collectivism of identity politics.

The account in this essay of the British Empire and its end with India’s independence and the creation of Pakistan barely scratches the surface. For those interested, two books for additional reading are Larry Collins and Dominique Lapierre’s Freedom at Midnight (1975) and Alex Von Tunzelmann’s Indian Summer: The Secret History of the End of an Empire (2007). Collins and Lapierre are journalists; their work is aimed at a popular audience and is readable. They omit references and footnotes, but they have a reputation for being thorough and accurate. Their most notable work, Is Paris Burning?, the story of the German general who defied Hitler’s order to destroy Paris before evacuating in the face of Allied forces in 1944, earned many accolades. Von Tunzelmann is an Oxford-educated British academic historian. Her work is researched and footnoted almost to a fault, and the detail in some chapters is difficult to plow through, but its history of the British Raj is immensely interesting. (She actually cites one of Collins and Lapierre’s monographs.)

A note that Collins and Lapierre include, but Von Tunzelmann omits, is that when the British announced August 15 as the date power would be transferred to the new nations, Indian astrologers determined that, because of the position of the stars, it was an extremely inauspicious day of an inauspicious month. Bowing to the astrologers, Mountbatten revised the date to 11:59 PM on August 14. Given the mayhem that followed, it appears the stars were not fooled.


“the good is oft interred”

The evil that men do lives after them; the good is oft interred with their bones. So let it be with Caesar. Julius Caesar, Act III, scene 2.

On previous Independence Days this blog spent some time parsing the Declaration of Independence, although it omitted saying much about the bill of particulars cataloging the abuses of which King George III was accused. A reader suggested that on subsequent occasions it should describe the relationship of those allegations to provisions in the Constitution. Many commitments got in the way of doing that as thoroughly as one would wish, so that part of the project must wait until Constitution Day in September. Nevertheless, there are a few observations to make on this Fourth of July.

There is a great deal of political division in our country these days. Its magnitude is probably somewhat overblown by the over-the-top rhetoric of the journalist class and by the social media so pervasive on the Internet. One effect that is raising hackles across the political spectrum is the promulgation of a revisionist version of the founding of the United States.

Two of the most prominent victims of this historical revisionism, who nevertheless should be honored on this day, are the founders George Washington and Thomas Jefferson. There is a contrary view held by some, mostly academics seeking publicity, and by others seeing a chance to make a few bucks off the notoriety. We all know who the usual suspects are. Their general thesis seems to be that regardless of all the virtues and lasting accomplishments of these two founders, all are eclipsed by the fact that these men once owned slaves. They and the nation are tainted by that original sin, and the only way their descendants and heirs can expiate it is to throw Washington and Jefferson, as well as the other Founders, and their monuments onto the ash heap of history. Or so they say.

Now, without getting into a philosophical or theological discussion, it is this writer’s view that the concept of original sin is a metaphor or allegory — one the original, primitive audience of the Old Testament could understand — for the obvious truth that human nature and its evolution are imperfect, not necessarily evil, and that all men must strive to make things better. (Lest anyone deduce otherwise from this statement and call for a revival of the Inquisition and the lighting of faggots: Darwinian evolution, together with its refinements, is not inconsistent with Christianity or other faiths.)

That same Old Testament, as well as the scriptures of other religions and philosophies, condoned human slavery. It existed, without serious question, in almost all civilizations and societies; it still does in a few places. Until the 18th Century, Western Civilization had been no exception. That era, of course, was the Age of the Enlightenment.

Professor Allen C. Guelzo, a Senior Research Scholar at Princeton University, has opined thus:

“Washington’s and Jefferson’s time was also the Age of Enlightenment, when the classical hierarchies of the physical and political worlds were overthrown, to be replaced by the natural laws of gravity and the natural rights of ‘Nature and Nature’s God,’ as the Declaration of Independence put it. Labor ceased to be a badge of subservience, and commerce became admirable. As commerce and labor gave people a greater sense of control over their lives for the first time in human history, slavery came to be seen as repugnant and immoral.

“In 1797, the expatriate painter Benjamin West dined with Rufus King, the American diplomatic envoy to Great Britain. West astounded King with a comment George III made when he learned that Washington had voluntarily surrendered his commission as general-in-chief of the Continental Army at the close of the Revolution, a voluntary submission of military power to civilian rule. ‘That act,’ said the king, ‘placed Washington in a light the most distinguished of any man living,’ and that he thought him ‘the greatest character of the age.’”

Thomas Jefferson’s lasting fame rests principally on his authorship of the Declaration of Independence. Its preamble — “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights” — is known to almost every American. The contours of the meaning of equality and “unalienable rights” have been in dispute since 1776, and tensions between individual liberty and rights have been a basis of division and conflict throughout American political history.

Jack N. Rakove, Emeritus Professor of History at Stanford University, writes: “In the cosmopolitan range of his interests, the tension between his aristocratic lifestyle and his egalitarian commitments, and most important, the manifest contradiction between the stirring language of the Declaration and his life as a Virginia slaveholder, Jefferson remains the most compelling figure of the American founding generation—but also the most troubling.”

It shouldn’t be. The sanctimonious condemnation of those in the past who did not live up to present moral standards is misplaced. Jefferson was a product of the centuries-long intellectual evolution that began with the Renaissance and proceeded to the Enlightenment. Certainly, his being a slaveholder was a transgression against our modern sensibilities and moral standards, but he must be judged in the context of his time. Jefferson’s achievements, not only in the Declaration (to which the input of John Adams and Benjamin Franklin was significant), but in his subsequent careers, public and private, outshine by miles the ex post facto flaw of his slaveholding.

As an earlier Renaissance author put it, “the evil that men do lives after them; the good is oft interred with their bones.” Let it not be so for George Washington and Thomas Jefferson.


Knights, Not Lackeys

Two U.S. Supreme Court decisions of this past week have generated some of the most hysterical rhetorical hyperbole ever heard and seen; it almost makes one long for the halcyon late 1960s and early ’70s. Some of the milder descriptions had the Court’s opinions producing a political, legal, and social earthquake. The Dobbs opinion declared that whether abortion is to be legal, regulated, or banned is for each state to decide, overruling the 49-year-old Roe v. Wade. Abortion being one of, if not the most, divisive issues of the last half century, Dobbs set the wires, airwaves, and cyberspace most afire.

Justice Clarence Thomas’s opinion in New York State Rifle & Pistol Association, Inc. v. Bruen, which took up New York’s law prohibiting individuals from carrying a handgun outside the home, also had serious constitutional significance. The New York law left the licensing of individuals to carry a handgun outside the home to the discretion of a public official, and then only if the individual could demonstrate a “special need” to the official’s satisfaction. The Court held that a constitutional right protected by the Second Amendment could not be subject to the whim of the government.

Subordinate in the public consciousness, but not entirely ignored, was what happened to the lawyers who prevailed in Bruen.

Paul Clement and Erin Murphy were told by their law firm, Kirkland & Ellis, that they could not represent any more clients in gun-rights cases, or would have to leave the firm to do so. Apparently Kirkland, a “white-shoe” Wall Street and D.C. firm, was afraid of offending its “progressive” big-business clients. Clement and Murphy did not hesitate; they left, and reportedly intend to form their own independent practice. Their stated reasons are worth quoting:

“There was only one choice: We couldn’t abandon our clients simply because their positions are unpopular in some circles.

“Some may find this notion strange or quaint. Many businesses drop clients or change suppliers as convenience dictates. To others, the firm’s decision will seem like one more instance of acceding to the demands of the woke. But law firms aren’t supposed to operate like ordinary businesses. Lawyers owe a duty of loyalty to their clients.

“A lawyer can withdraw from a representation for good reason, like a newly discovered conflict of interest [or not getting paid, if that was a condition of representation from the start]. But defending unpopular clients is what we do. The rare individuals and companies lucky enough to be universally popular (for the time being) have less need for lawyers. And the least popular clients are most in need of representation, from the British soldiers after the Boston Massacre to the defendant in the Boston Marathon bombing.”

The primary duty of a lawyer is to zealously represent the client within the bounds of the law. That is not to say that lawyers must represent or advocate a position they find repugnant or frivolous, and they may not suborn perjury, whether by the client or another witness. There are rules promulgated by the judiciaries of the several states and the federal courts as to how lawyers must conduct themselves. It’s also been said that a lawyer should be careful in choosing his clients. But once chosen, a duty of loyalty attaches, and to some degree, like keeping client confidentiality, remains even after the representation ends.

A one-time client, who was also a lawyer and had been an appellate judge, expressed to me his ideal that lawyers should strive to be knights, not lackeys. That ideal is why the wannabe socialist dictator Jack Cade’s henchman Dick the Butcher proposed to first kill all the lawyers. Henry VI, Part 2; Act 4, Scene 2.

Asides

Paul Clement represented the National Rifle Association in McDonald v. City of Chicago, 561 U.S. 742 (2010), in which the Supreme Court ruled that the right of an individual to “keep and bear arms,” as protected by the Second Amendment, is incorporated by the Due Process Clause of the Fourteenth Amendment and is thereby enforceable against the states.

Historically, the term “white-shoe” conveyed class envy and a ridicule of the Ivy League-educated effete. The wealthy could afford special shoes for boating, tennis, and other genteel pursuits, and in the summer they wore white bucks—perhaps with a bow tie and a seersucker suit—to the exclusive Wall Street firms where they worked. School connections played a central role in maintaining the boundaries of the white-shoe class. In 1962, more than 70 percent of the lawyers in Wall Street law firms had graduated from Harvard, Columbia, or Yale. See Elizabeth Chambliss, “The Shoe Still Fits,” Legal Affairs, September/October 2005, https://www.legalaffairs.org/issues/September-October-2005/toa_sepoct05.msp

Clement successfully argued Adoptive Couple v. Baby Girl, 570 U.S. 637 (2013), a decision of the Supreme Court of the United States that righted a terrible wrong visited upon a child and an adoptive family. In Baby Girl, the Court ruled that several sections of the Indian Child Welfare Act (ICWA) do not apply to Amerindian biological fathers who are not custodians of an Amerindian child. The Court held that the procedures the ICWA requires to end parental rights do not apply when the child has never lived with the father. Additionally, the requirement to make extra efforts to preserve the Amerindian family does not apply, nor is the preferred placement of the child in another Amerindian family required when no other party has formally sought to adopt the child.

Given the number of women who have entered the legal profession, perhaps the “knight” appellation is not strictly appropriate. In Great Britain and many Commonwealth nations, the title “Dame” is the female equivalent of “Knight,” and numerous prominent and accomplished women have been so recognized by the Queen. Here in the U.S., unfortunately, that word has served, in some parts of the country, as generic slang, not exactly offensive but not complimentary, for any female.

For the lawyers and judges, as well as the informed laypersons, who may be reading: I wonder whether the Dobbs opinion, to the extent it overruled Roe v. Wade, was obiter dictum, legal-speak for a court’s statement that is not necessary to decide the case, and thus not binding precedent. Chief Justice Roberts’ concurring opinion seems to indicate it might be. Perhaps more grist for the judicial mill.