A Judgment on Columbus Day 1968?

Wednesday, October 12, commemorates the day in 1492 when Christopher Columbus and his ships landed on an island in the Bahamas, thus discovering what came to be known as the Americas. (It is celebrated as a holiday on Monday, to give those who observe it a three-day weekend.)

In recent times this holiday, and Columbus himself, have been held in disrepute by some, all of whom are on the political left, as a celebration of imperialism and colonialism. This is but one of the virtue-signaling efforts of leftists and their fellow-travelers in the Democratic Party to coalesce a sufficient number of supposedly aggrieved groups to advance candidates who would support insane socialist schemes.

Columbus’s voyages were indeed the advent of the exploration and colonial settlement that spread Western Civilization throughout the world. On balance, this led to immense benefit: if not always to the then-indigenous populations of the European colonies, certainly to their descendants.

This writer posted essays on this subject in years past, the most recent criticizing as misplaced the Dallas City Council’s re-designation of Columbus Day as “Indigenous Peoples Day.”

(This post will not belabor the previous essays. For those interested, follow the links at the end of this post.)

The 2021 essay mentioned Michael Musmanno, a U.S. Navy officer, lawyer, and jurist, who was instrumental during the 1930s in persuading the Franklin Roosevelt Administration to make Columbus Day a federal holiday so as to recognize the contributions of Italian-Americans. Musmanno was remarkably accomplished. He wrote numerous books and judicial opinions. In one book he argued that it was Columbus, not the Vikings, who was the first European to discover the Americas. He served in both World Wars and was a military governor in Italy after the Mussolini government surrendered. He also presided over one of the courts trying Nazi war crimes perpetrators at Nuremberg. Elected to the Pennsylvania Supreme Court in 1951, Musmanno served there until his death in 1968. As a judge, politician, and private citizen, Musmanno was a colorful character, often quite the contrarian. He wrote a record number of dissenting opinions during his 17 years as a Justice on the Pennsylvania Supreme Court, some much to the annoyance of his colleagues.

Musmanno was an intensely devout Christian. He attended Mount Saint Peter Catholic Church, founded by Italian immigrants near Pittsburgh, for most of his life. While he supported the First Amendment separation of church and state, he believed that public expression of religious beliefs should also be protected. The last of his many dissenting opinions protested the overturning of an attempted rape conviction, in a case in which the court ruled that the trial judge had instructed the jury improperly by telling them to decide the case on their consciences. Musmanno quoted from the record that the judge “particularly stated: ‘I’m not telling you what kind of verdict to bring in,’ and he then added, ‘but I’m telling you to stand up like men and women and do what you should do before your God to whom you will answer some day whether you answer to this court or not.’”

He wrote in his dissent: “I was afraid it would come to this. It is becoming the fashion to make light of religious invocation. Books are being published asking whether God is dead. Well, God is not dead, and judges who criticize the invocation of Divine Assistance had better begin preparing a brief to use when they stand themselves at the Eternal Bar of Justice on Judgment Day.” Justice Musmanno, concluding his dissent, stated: “I am perfectly willing to take my chances with [the trial judge] at the gates of Saint Peter and answer on our ‘voir dire’ that we were always willing to invoke the name of the Lord in seeking counsel in rendering a grave decision on earth, which I believe the one in this case to be. — Miserere nobis Omnipotens Deus!” (Commonwealth v. Hilton, 432 Pa. 11 at 42 (1968)).

Justice Musmanno died the following day, October 12, 1968, Columbus Day, and presumably that voir dire took place.

Michael Musmanno was buried at Arlington National Cemetery. See his epitaph here.

Prior Columbus Day posts.


Requiem for a Queen (but not Her Realm)

“Far-called our navies melt away —
On dune and headland sinks the fire —
Lo, all our pomp of yesterday
Is one with Nineveh and Tyre.”

So wrote poet and Nobel Laureate Rudyard Kipling on the occasion of Queen Victoria’s Diamond Jubilee in 1897. On this occasion of Queen Elizabeth II’s passing, I say — “Not so fast.”

It may be observed that of the world-historical leaders of Great Britain, two of the three most effective and consequential were women: Queen Elizabeth I, who saved England from foreign domination and probable destruction in the 16th Century; Winston Churchill, who saved his country — and perhaps the rest of us — from Hitler and the Nazis; and Margaret Thatcher, who revived a declining nation and reasserted the greatness of Great Britain. Churchill, toward the end of his political career, and Thatcher served as Prime Ministers under our Queen Elizabeth II. Perhaps because of her 70-year-long reign, and the stability she lent in what were by any standard tumultuous times, we can add the late Queen to the list.

Among the commentaries heard and read so far are the voices of those who decry the very existence of the British monarchy as undemocratic, archaic, and the vestige of a brutal and rapacious empire that conquered and enslaved half the world. That is to be expected. There is money to be made by continuing to pick at the scabs of healing past wrongs — “wrong” being defined by the present standards of a certain political persuasion. Pay them no mind, except to be aware of their existence and not fall for their nonsensical vowel movements.

As for the British Monarchy being archaic, well, that is the point. Since the accession of King George I around 300 years ago, the monarchy has been mainly ceremonial, with political power vested in Parliament. That arrangement has made the monarchy more useful and survivable in the centuries since, during which many other monarchies in Europe and elsewhere fell. Looking back from that time, most of the crises in English/British history, most notably the 1066 Norman Conquest and the long-running 15th Century Wars of the Roses, were the result of disputed succession. Since Queen Anne died in 1714, there has been no dispute over who succeeds to the throne upon the monarch’s death. The vagaries of political shifts occur, sometimes tumultuously, but the Crown endures. Much political rancor is avoided when it is not necessary to elect the head of state.

Of course, much of this depends upon the character and behavior of the monarch. Henry VIII, and any of the kings of France prior to the revolution, could not have lasted one week from the 18th Century forward. Great Britain evolved into a parliamentary democracy within the overall framework of a symbolic monarchy. Stability during profound change was in many ways a result of those wearing the British Crown. Elizabeth II, in large part because of her 70-year reign, but also because of her character, proved to be most accomplished in that regard. But her predecessors, notably Victoria and the most recent Georges (Elizabeth’s father and grandfather) provided that stability when it was needed.

As for the British Empire, to paraphrase William Faulkner, it is not dead; it is not even past. It is true that political domination of the Empire’s colonies is no more. But the cultural influence is manifest. If not the largest language by the sheer number of individuals who speak it, read it, and write it as a first or second language, English is the language of diplomacy, air commerce, most international business transactions, and now the Internet. Because of the facility the English language has gained by cross-fertilization — and dare I say it, cultural appropriation — from many places throughout the world, English will continue to be dominant. If for no other reason (and there are many), the sun has not set, and will not ever set, on the now soft power of the Empire.

“Come Nineveh; Come Tyre”? Not for a while.

Along with language, the British brought a political and legal system that emphasizes individual rights and due process, not only to North America and the Pacific nation-islands, but also to many African and South Asian nations. Whether those systems have been, and are being, administered imperfectly and unevenly, is beside the point. The systems are there, and available as a framework to provide justice and protect life, liberty, and property. They have quite often done so in a manner superior to whatever they replaced.

King Charles III, together with his Prime Minister, will face the Sceptered Isle’s many challenges throughout the coming years, and probably decades. Charles has been considerably less popular among Britons than his mother was. That could change if he refrains from doing something barbarous. It does appear that he is off to a good start.

The Queen’s last official act was to appoint the new Prime Minister. Interestingly, the appointee is another Elizabeth, although she is popularly and officially known by her nickname Liz. One wonders if that was deliberate, so as to avoid confusion with the now late Queen. Anyway, Ms. Truss has challenges before her even more formidable than those facing Charles. Great Britain could be at the point where it really needs another Winston Churchill or Margaret Thatcher. Is Liz up to the task? For her, as well as the King, we will see.

Will the monarchy be abolished at some point? It is doubtful. No one in Britain who has any real influence is that stupid. The symbolism and cultural value to the British people aside, the pomp and splendor have great economic value for export and visitors from abroad who spend a lot of money there. Without the now King (and future successors to the throne) a visit to the United Kingdom would have all the panache of a trip to New Jersey.

As for our late Queen Elizabeth II, as she lies in state, to paraphrase another English poet, “She had a lovely face, God in his mercy lend her grace.” She had enough grace to lend us all. May she rest in peace.


The lead-in quote is from Kipling’s poem “Recessional”

“Come Nineveh, Come Tyre” is the title of Allen Drury’s 1973 dystopian novel in his series that began with “Advise and Consent”

The paraphrase in the final paragraph is from Alfred Lord Tennyson’s poem “The Lady of Shalott”

Native speakers of Chinese outnumber native English speakers, but not when combined with those whose second language is English. Anyway, most commerce between China and the rest of the world is conducted in English.

The Monarch has the theoretical power to veto a bill of Parliament. The last time that power was used was by Queen Anne in 1708. It is difficult to imagine a time it would be appropriate and not provoke a constitutional crisis. One power the Queen, and now the King, could use when necessary is to choose a Prime Minister in the event of a hung Parliament; that is, when no leader commands a majority. In Australia in 1975, the Queen’s representative, the Governor-General, exercised an analogous reserve power when it was necessary to form a government.


Lesson from a Fallen Empire

August 14th and 15th of this year, 2022, mark respectively the 75th anniversaries of independent Pakistan and India. India today is known as an economically thriving and stable republic; Pakistan, somewhat less so. The history of these two nations in their present form began with the end of Great Britain’s imperial rule of the subcontinent in 1947. One of the historical lessons we in the United States of America can learn, if we will, comes from the trauma that accompanied these two nations’ achieving independence.

Prior to the British arriving in the early 17th Century, and for some time after that, the Muslim Moguls had established an empire over most of the subcontinent. That empire, though shrinking in area and power, lasted into the 19th Century. British presence began with the East India Company, chartered by the Crown to establish commerce in east and south Asia. By the 1850s, that company had become a quasi-governmental power in India, even possessing its own private army manned by Indians, called Sepoys. In 1857, subsequent to a rebellion by the Sepoys in the company’s army, the British government assumed responsibility and control in what became known as the Raj. In 1876, Prime Minister Benjamin Disraeli had Parliament declare Queen Victoria “Empress of India” and added that appellation to her other titles. That title for British monarchs lasted until the partition and subsequent independence of what is now India and Pakistan.

Prior to independence, the Indian subcontinent consisted of a number of provinces directly administered by Britain and numerous princely states ruled by potentates with various titles such as “maharaja” if Hindu, or “nawab” if Muslim. There were other titles; for example, “Nizam” in Hyderabad. The princely states accepted British hegemony in varying degrees, especially for external matters. The most significant of the British-administered provinces were Bengal in the east, and Punjab in the northwest. There was a polyglot of languages and ethnicities throughout the subcontinent, and two main religions. The most numerous religious communities were the Hindus and the Muslims. In Punjab the Sikhs, an offshoot of Hinduism, were a significant minority. Members of these faiths often lived near each other, and sometimes shared the same village.

Serious indigenous agitation for the end of British rule began in the early 20th Century. A principal impetus for independence was economic, including what many Indians regarded as oppressive taxation by the British, along with the high-handed manner in which they were ruled. Hindu leaders, especially Jawaharlal Nehru, Vallabhbhai Patel, and the charismatic Mohandas Gandhi, engaged in various activities resisting British rule, most of them passive resistance rather than active violence. Mohammed Ali Jinnah emerged as the leader of the Muslims. (Interestingly, all of these individuals, including Gandhi, were British-educated lawyers.)

Subsequent to World War II, agitation for independence increased. Despite the Allied victory, Britain was politically and economically exhausted. Maintaining the Indian Empire was a net liability that had become unaffordable. The recently installed Labour government established a policy that India would be granted independence, and very soon. Prime Minister Clement Attlee appointed Viscount (later Earl) Louis Mountbatten, a cousin of King George and a naval commander in the Far East during the war, as the Viceroy. Mountbatten was explicitly given the mission of establishing Indian independence.

Two issues with independence were evident. The first was the status of the princely states. With the exception of Kashmir, which remains troubled even today as a flashpoint between the two nations, and Hyderabad, which was forcibly integrated into India, this did not turn out to be a problem. The other, more serious, and ultimately disastrous, issue was the Hindu-Muslim question.

The Muslim leader, Jinnah, expressed concern that an independent monolithic India would become a theocratic Hindu state in which Muslims would be second-class citizens. Gandhi was Hindu, but he believed in toleration of all religions and expressed that toleration in many ways over his entire career. Nehru was Hindu, although non-observant with a secular outlook. Even so, Hindus were a majority, and an extreme sectarian nationalist group known as the RSSS was increasing in influence.

Even if Jinnah really believed that a majority-Hindu nation-state comprising the entire subcontinent would oppress Muslims, he doubtless also believed that identity politics was his way to achieve personal power. He thus insisted upon a partition of the subcontinent along religious lines.

The Moguls were Muslims who ruled India as their empire for several centuries. They do not appear to have been oppressive towards the Hindus. After the British assumed control, both Muslims and Hindus lived together more or less peaceably in Bengal and Punjab. At least they did not indulge in wholesale pogroms, or anything close. That comity became strained with the realization that independence was not far off. This strain was particularly fueled by Jinnah’s fear-mongering and that of his followers. They insisted on a separate Pakistan for Muslims, and threatened a bloodbath if a separate Pakistan was not created.

Pakistan was created and a bloodbath did occur. Several million Hindus, Muslims, and Sikhs were murdered in sectarian violence in the wake of independence and partition. Most of the violence occurred in Punjab and Bengal.

In order to expedite independence, Mountbatten and the government in London believed that partition was inevitable and should be accomplished at the same time independence was granted. The British-governed provinces and princely states that were predominantly one sect or the other would accede to the newly independent India if Hindu, or to Pakistan if Muslim. Punjab and Bengal were the difficulty. Each had a significant number of both sects. Punjab also had the Sikhs, and their holy city of Amritsar. A British lawyer named Cyril Radcliffe, who was completely unfamiliar with the cultures of India, was given the task of drawing partition lines across each of those provinces. With so many villages and areas intermingled, and with only five weeks to do it, the task was hopeless from the beginning. Nevertheless, Radcliffe finished with a boundary line that pleased no one and left millions of Hindus and Sikhs in Pakistan and many Muslims in India. The western area of Punjab went to Pakistan, as did the eastern part of Bengal (since 1971, the independent nation of Bangladesh). Almost immediately the Muslims in their portion of each province set upon the Hindus left there, and Hindus wreaked similar violence upon Muslims left in India. Wholesale massacres occurred. Refugees streamed across the new borders; trainloads of dead Hindus arrived in India; likewise, dead Muslims arrived in Pakistan. Riots occurred in major Indian cities, particularly Delhi and Calcutta. The police and military were overwhelmed. The British Army was gone or going, and the government in London, still recovering from World War II, had no desire or even means to use its forces to assist.

Gandhi lent his prestige with both Hindus and Muslims to encourage an end to the violence, and began one of his famous fasts. Gandhi’s fasting appears to have had some salutary effect in Calcutta. He also had some success in Delhi, but in January 1948, while Gandhi was on his way to prayer after terminating his fast, a Hindu fanatic shot and killed him.

After Gandhi’s assassination — some might say martyrdom — much of the violence was brought under control. The two nations, however, have continued to be antagonistic towards each other over the ensuing 75 years. Since both are nuclear armed, one hopes that cool heads prevail.

Much of the blame for the partition, and the murder and mayhem that occurred in its wake, was placed on Mountbatten and the British government in general for botching the grant of independence. There is probably enough blame to go around, but Mohammed Ali Jinnah is probably the main culprit, along with those who acquiesced in his religious identity politics.

While analogies invariably break down when pushed too far, there are disquieting circumstances today in our country that bear some similarity. Identity politics is poison, be it based on race, ethnicity, religion, sex, or any category that relies on collectivism and irrelevant or immutable characteristics attributed to different groups of men and women. The respected historian David McCullough, who died this past week, believed that learning from history was necessary, although he did caution that one should step back and take the long view. Well, the long view is that conflict based on identity politics has never turned out well. From the religious violence of the Crusades and the European Thirty Years War of the 17th Century, through the Nazi Holocaust — and the Indian partition — to the present-day jihads, uncountable destruction and suffering have resulted. Here in the present-day United States, avaricious individuals have fostered, and are fostering, identity politics based on race and ethnicity as well as sex. This cannot end well. We must remember the intrinsic worth of individuals, and reject the poisonous collectivism of identity politics.

The history of the British Empire and its end with India’s independence and the creation of Pakistan in this essay barely scratches the surface. For those interested, two books for additional reading are Larry Collins’ and Dominique Lapierre’s Freedom at Midnight (1975) and Alex Von Tunzelmann’s Indian Summer: The Secret History of the End of an Empire (2007). Collins and Lapierre are journalists. Their work is aimed at a popular audience and is readable. They omit references and footnotes, but they have a reputation for being thorough and accurate. Their most notable work, Is Paris Burning?, is the story of the German general who defied Hitler’s order to destroy the city before evacuating in the face of American and Free French forces in 1944. It earned many accolades. Von Tunzelmann is an Oxford-educated British academic historian. Her work is researched and footnoted almost to a fault, and the detail in some of the chapters is somewhat difficult to plow through. Its history of the British Raj is immensely interesting. (She actually cites one of Collins’ and Lapierre’s monographs.)

A note that Collins and Lapierre added but which Von Tunzelmann omits is that when the British announced that August 15 would be the date the British transferred power to the new nations, the Indian astrologers ascertained that because of the position of the stars it was an extremely inauspicious day of an inauspicious month. Bowing to the astrologers, Mountbatten revised his date to 11:59 PM on August 14. Given the mayhem that followed, it appears that the stars were not fooled.


“the good is oft interred”

The evil that men do lives after them; the good is oft interred with their bones. So let it be with Caesar. Julius Caesar, Act III, scene 2.

On previous Independence Days this blog spent some time parsing the Declaration of Independence, although it omitted saying much about the bill of particulars cataloging the abuses of which King George III was accused. A reader suggested that on subsequent occasions it should describe the relationship of those allegations to provisions in the Constitution. Many commitments got in the way of doing the kind of job of it one would wish. For now, that part of the project must wait until Constitution Day in September. Nevertheless, there are a few observations to make on this Fourth of July.

There is a lot of political division occurring in our country these days. The magnitude of it is probably somewhat overblown by over-the-top rhetoric of the journalist class and the social media so pervasive on the Internet. One effect that’s raising the hackles across the political spectrum is a re-evaluation and promulgation of a revisionist version of the founding of the United States.

Two of the most prominent victims of historical revisionism who nevertheless should be honored on this day are founders George Washington and Thomas Jefferson. There is a contrary view held by some, mostly academics seeking publicity, and others seeing a chance to make a few bucks off the notoriety. We all know who the usual suspects are. Their general thesis seems to be that all of the virtues and lasting accomplishments of these two founders are eclipsed by the fact that these men once owned slaves. They and the nation are tainted by that original sin. The only way their descendants and heirs can expiate it is to throw Washington and Jefferson, as well as the other Founders, and their monuments on the ash heap of history. Or so they say.

Now, without getting into a philosophical or theological discussion, it is this writer’s view that the concept of original sin is a metaphor or allegory — one the original primitive audience of the Old Testament could understand — for the obvious truth that human nature and its evolution are imperfect, not necessarily evil, and that all men must strive to make things better. (Lest anyone deduce otherwise from this statement and call for a revival of the Inquisition and the lighting of faggots: Darwinian evolution, together with its refinements, is not inconsistent with Christianity, or other faiths.)

That same Old Testament, as well as the scriptures of other religions and philosophies, condoned human slavery. It existed, without serious question, in almost all civilizations and societies; it still does in a few places. Until the 18th Century, Western Civilization had been no exception. That era, of course, was the Age of the Enlightenment.

Professor Allen C. Guelzo, a Senior Research Scholar at Princeton University, has opined thus:

“Washington’s and Jefferson’s time was also the Age of Enlightenment, when the classical hierarchies of the physical and political worlds were overthrown, to be replaced by the natural laws of gravity and the natural rights of ‘Nature and Nature’s God,’ as the Declaration of Independence put it. Labor ceased to be a badge of subservience, and commerce became admirable. As commerce and labor gave people a greater sense of control over their lives for the first time in human history, slavery came to be seen as repugnant and immoral.

“In 1797, the expatriate painter Benjamin West dined with Rufus King, the American diplomatic envoy to Great Britain. West astounded King with a comment George III made when he learned that Washington had voluntarily surrendered his commission as general-in-chief of the Continental Army at the close of the Revolution, a voluntary submission of military power to civilian rule. ‘That act,’ said the king, ‘placed Washington in a light the most distinguished of any man living,’ and he thought him ‘the greatest character of the age.’”

Thomas Jefferson’s lasting fame is principally as the author of the Declaration of Independence. The preamble, “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights,” is known to almost every American. The contours of the meaning of equality and “unalienable rights” have been in dispute since 1776. The tension between individual liberty and rights has been the basis of division and conflict throughout American political history.

Jack N. Rakove, Emeritus Professor of History at Stanford University, writes: “In the cosmopolitan range of his interests, the tension between his aristocratic lifestyle and his egalitarian commitments, and most important, the manifest contradiction between the stirring language of the Declaration and his life as a Virginia slaveholder, Jefferson remains the most compelling figure of the American founding generation—but also the most troubling.”

It shouldn’t be. The sanctimonious condemnation of those in the past who do not live up to present moral standards is misplaced. Jefferson was a product of the centuries-long intellectual evolution that began with the Renaissance that proceeded to the Enlightenment. Certainly, his being a slaveholder was a transgression against our modern sensibilities and moral standards, but he must be judged in the context of his time. Jefferson’s achievements, not only in the Declaration (in which the input of John Adams and Benjamin Franklin was significant), but in his subsequent careers, public and private, outshine by miles the ex post facto flaw of his slaveholding.

As an earlier Renaissance author put it “the evil that men do lives after them; the good is oft interred with their bones.” Let it not be so for George Washington and Thomas Jefferson.


Knights, Not Lackeys

Two U.S. Supreme Court decisions of this past week have generated some of the most hysterical rhetorical hyperbole ever heard and seen. It almost makes one long for the halcyon late 1960s and early ’70s. Some of the milder descriptions of the Court’s opinions were to the effect that they produced a political, legal, and social earthquake. The Dobbs opinion declared that whether abortion is to be legal, regulated, or banned is for each state to decide, and overruled the 49-year-old Roe v. Wade case. Abortion being one of, if not the, most divisive issues of the last half century, Dobbs set the wires, airwaves, and cyberspace most on fire.

Justice Clarence Thomas’s opinion in New York State Rifle & Pistol Association, Inc. v. Bruen, which took up New York’s law restricting the carrying of a handgun outside the home, also had serious Constitutional significance. The New York law left licensing individuals to carry a handgun outside the home to the discretion of a public official, and then only if the individual could demonstrate a “special need” to the official’s satisfaction. The Court said the Constitutional right protected by the Second Amendment could not be subject to the whim of the government.

Subordinate in the public consciousness, but not entirely ignored, was what happened to the lawyers who prevailed in Bruen.

Paul Clement and Erin Murphy were told by their law firm, Kirkland & Ellis, that they could not represent any more clients in gun rights cases and would have to leave the firm to do so. Apparently Kirkland, a “white shoe” Wall Street and D.C. firm, was afraid of offending its “progressive” big business clients. Clement and Murphy did not hesitate; they left, and reportedly intend to form their own independent practice. It is worth quoting their stated reasons:

“There was only one choice: We couldn’t abandon our clients simply because their positions are unpopular in some circles.

“Some may find this notion strange or quaint. Many businesses drop clients or change suppliers as convenience dictates. To others, the firm’s decision will seem like one more instance of acceding to the demands of the woke. But law firms aren’t supposed to operate like ordinary businesses. Lawyers owe a duty of loyalty to their clients.

“A lawyer can withdraw from a representation for good reason, like a newly discovered conflict of interest [or not getting paid, if that was a condition of representation from the start]. But defending unpopular clients is what we do. The rare individuals and companies lucky enough to be universally popular (for the time being) have less need for lawyers. And the least popular clients are most in need of representation, from the British soldiers after the Boston Massacre to the defendant in the Boston Marathon bombing.”

The primary duty of a lawyer is to zealously represent the client within the bounds of the law. That is not to say that lawyers must represent or advocate a position they find repugnant or frivolous, and they may not suborn perjury, whether by the client or another witness. There are rules promulgated by the judiciaries of the several states and the federal courts as to how lawyers must conduct themselves. It’s also been said that a lawyer should be careful in choosing his clients. But once chosen, a duty of loyalty attaches, and to some degree, like keeping client confidentiality, remains even after the representation ends.

A one-time client, who was also a lawyer and had been an appellate judge, expressed to me his ideal that lawyers should strive to be knights, not lackeys. That ideal is why the wannabe socialist dictator Jack Cade’s henchman Dick the Butcher proposed to first kill all of the lawyers. Henry VI, Part 2; Act 4, Scene 2.


Paul Clement represented the National Rifle Association in McDonald v. City of Chicago, 561 U.S. 742 (2010), where the Supreme Court ruled that the right of an individual to “keep and bear arms,” as protected under the Second Amendment, is incorporated by the Due Process Clause of the Fourteenth Amendment and is thereby enforceable against the states.

Historically, the term “white-shoe” conveyed class envy and a ridicule of the Ivy League–educated effete. The wealthy could afford special shoes for boating, tennis, and other genteel pursuits, and in the summer they wore white bucks—perhaps with a bow tie and a seersucker suit—to the exclusive Wall Street firms where they worked. School connections played a central role in maintaining the boundaries of the white-shoe class. In 1962, more than 70 percent of the lawyers in Wall Street law firms had graduated from Harvard, Columbia, or Yale. See Elizabeth Chambliss, “The Shoe Still Fits,” Legal Affairs, September/October 2005.

Clement successfully argued Adoptive Couple v. Baby Girl, 570 U.S. 637 (2013). This was a decision of the Supreme Court of the United States that righted a terrible wrong visited upon a child and an adoptive family. In Baby Girl, the Court ruled that several sections of the Indian Child Welfare Act (ICWA) do not apply to Amerindian biological fathers who are not custodians of an Amerindian child. The Court held that the procedures required by the ICWA to end parental rights do not apply when the child has never lived with the father. Additionally, the requirement to make extra efforts to preserve the Amerindian family also does not apply, nor is the preferred placement of the child in another Amerindian family required when no other party has formally sought to adopt the child.

Given the number of women who have entered the legal profession, perhaps the “knight” appellation is not strictly appropriate. In Great Britain and many Commonwealth nations, the title “Dame” is the female equivalent of Knight. Numerous prominent and accomplished women have been so recognized by the Queen. Here in the U.S., unfortunately, that word has served, in some parts of the country, as generic slang, not exactly offensive but not complimentary, for any female.

For the lawyers and judges, as well as informed laypersons, who may be reading: I wonder whether the Dobbs opinion, to the extent it overruled Roe v. Wade, was obiter dictum, legal speak for a court’s statement that is not necessary to decide the case and thus not binding precedent. Chief Justice Roberts’ concurring opinion seems to indicate it might be. Perhaps more grist for the judicial mill.


78 & Counting

On this June 6, it is again appropriate to repeat an essay I wrote and published almost a decade ago (with an update). Here goes.

The pleasant town of Bayeux in northern France is famous for its eponymous tapestry depicting the events leading to the Norman Conquest of England in 1066. Across from the railway station there is a café that serves cold beer and the apple cider the region is also famous for. That establishment bears a sign in English: “Welcome to our liberators.” The sign might appear to be incongruous to some of us, except that ten kilometers to the northwest is a bluff overlooking a sandy expanse along the English Channel that for the past seventy-eight years has been known to the world as Omaha Beach.

Many words will be written and spoken on this 78th anniversary of D-Day, the beginning of what General Eisenhower called the “Great Crusade” to end the Nazi occupation of Europe, and ultimately win World War II. Today, the word “crusade” is politically incorrect in some circles as being offensive to those who have vowed to kill us and actually have achieved some success in doing so. And we have become accustomed to euphemisms, direct and to-the-point speech being too harsh for our sensitive ears. That is just as well. The loudest, and most eloquent, statements to be made come from the nearly 10,000 American graves at the top of the cliff and the sound of the waves below.

When visiting the beach even this long after the fact, it is not difficult to picture the horror and chaos experienced by the soldiers and sailors who stormed ashore that day. The Germans had fortified nearly the entire coastline of France, as well as the coasts of other occupied countries, into what was called the Atlantic Wall. Various barriers and obstacles had been placed in the water offshore to prevent landing craft from reaching dry land, and to channel invaders into killing zones covered by machine-gun bunkers dug into the 100-foot-high cliffs above. This required the assault to be made at low tide, leaving a 300-yard open expanse of sand to traverse before the slightest natural cover could be reached. Above the high-tide line is another 50-yard stretch of loose sand. Walking unencumbered on loose sand can be difficult; running with 60 pounds of weaponry and equipment, all the while facing withering small-arms and artillery fire, has to have been a nearly superhuman feat. Many of the invaders did not make it; that so many did is a credit to the quality of the military training and preparation, as well as the fortitude and power of the survival instinct of the troops. The actual film footage in the Normandy episode of the Victory at Sea documentary demonstrated some of the difficulty, but the bloodiest parts had to have been edited to make it suitable for a 1950s home audience. The fictional first 24 minutes of the film Saving Private Ryan might more accurately portray the horror and difficulty of the assault, but still may be an understatement.

Eisenhower said in his address to the American, British, and Canadian service members who were about to land on the beaches: “Your task will not be an easy one. Your enemy is well trained, well equipped and battle-hardened. He will fight savagely.” They were about to discover that he got that right.

It could have been worse. A major part of the plan was to deceive the defenders as to where and when the attack would be made. As previously mentioned, the entire coastline was fortified. The defending German army was battle-hardened, and exceptionally well led by Field Marshals Gerd von Rundstedt and Erwin Rommel. Their main problems were manpower and munitions. Five years of war and the continuing demands of the Russian front in the east made knowledge of the place and time of the landings critical to the defenders. The deception, with some cooperation from the weather, worked. The German defenders were caught off guard at Normandy, and were unable to bring the full weight of their forces to bear until a beachhead was established. In spite of the withering fire and the obstructions, even Omaha Beach was taken by day’s end. The Americans didn’t get much farther that day, though, and the casualties were huge. This beachhead, established by those soldiers, whose ranks are now thinning day by day, made it possible to end the war in Europe. Nazi Germany unconditionally surrendered eleven months and two days after D-Day. Those few that are left, and those who passed before them, merit the gratitude of us all.

For every victor there is a vanquished. So it must be added that within five years of the victory, the United States, and to some degree Great Britain and France, became allies, if not friends, with Germany, through forty years of Cold War and beyond. There was no doubt then, or today, that the German Army was fighting on behalf of evil masters and a bad cause. Soldiers, most of whom in World War II were not fighting because they wanted to, can nevertheless fight honorably for an ignoble cause (or dishonorably for a good cause, for that matter). Soldiers know this, and once the fighting is over, they are often more inclined than the civilians far from the horrors to let bygones be bygones.

A poignant story in a British history magazine relates the ordeal of two soldiers: an American and the German defender who shot him at Omaha Beach. Both survived the war. Heinrich Severloh manned a machine gun in a bunker in the cliff. He estimated that he fired over 12,000 rounds before he ran out of ammunition for it, and then picked up his carbine to continue shooting at the attacking Americans. Three of Severloh’s rounds hit David Silva as he and other GIs were scrambling for cover on the beach. The German was later captured and held in a POW camp until some time after the end of the war. He was repatriated in 1946 and took up farming. After reading Cornelius Ryan’s book The Longest Day, published in 1959, Severloh learned that he was the one who shot Silva. In 1963, the two former adversaries met in Germany. Silva by that time had taken Holy Orders as a Catholic priest. The two formed a friendship, as former soldiers who fought honorably for opposing sides are often known to do, and corresponded for many years. They both suffered from the circumstances that attend the fog and maelstrom of war.

But the story of Severloh and Silva’s later relationship is only an aside. The honor today goes to Silva and his fellow servicemen who stormed the beaches on that fateful day. Them we salute.


Guns and the Global Village

I have been a member of the National Rifle Association for decades and have supported most of their policies regarding Second Amendment rights. I still do. The right of individuals to keep and bear arms is not unlimited, but like other fundamental rights protected by our Constitution, limitations must be more than reasonable. They must address a compelling governmental interest. And even more, they must be narrowly tailored to further that interest.

Ohio Congressman Jim Jordan recently said something to the effect that we did not ban airplanes after 9/11, so why should we ban guns after miscreants misuse them. Most analogies break down if pushed too far, and I’m not sure Mr. Jordan’s is completely on point. I do believe, however, that some measures should be taken by Texas to reduce the likelihood that firearms will get into the wrong hands.

Wall Street Journal columnist Holman Jenkins last week wrote a column in which he suggested that social media companies could consider developing algorithms to flag extremist and violent threats and provide them to law enforcement, and perhaps other public services. This would not be probable cause for arrest or a basis for disarming individuals who make threats or other intemperate remarks, but law enforcement could pay them a visit, interview them, and alert possible targets. This might be worth considering. If such individuals are aware that they are being looked at — known to the police — it might have some deterrent effect.

One aspect of the Uvalde perpetrator’s obtaining his weapons from a licensed dealer should be examined. This guy purchased two AR-15 platform rifles, one of which retails for more than $1,800. No AR retails for less than around $1,000. Ammunition is also expensive. Buying them off the street would not have cost much less, if not more. Where did an 18-year-old unemployed school dropout obtain that kind of money? Did he steal it, perhaps from his grandmother, whom he shot? Anyway, legal or not, a licensed dealer should have recognized this and used some discretion, or at least expedited notification of the sale of multiple firearms to one person. If this transaction was not a red flag, there are no cows in Texas.

Another possible measure could be to raise the age requirement for firearms purchase or unsupervised possession to 21. Though the age of adulthood has been 18 for the past half-century or so, alcoholic beverages cannot be purchased until age 21. That is not to say that underage individuals would not be able to obtain weapons — they certainly do obtain alcohol. Nevertheless, it would give law enforcement another tool.

Speaking of law enforcement, police at all levels have been compromised by the perception, perpetrated by the political left, that use of force is always suspect. No doubt there has been some overreaction by law officers, including the ones in the notorious 2020 Minneapolis situation, but those incidents all involved individuals who were committing a crime, egregiously disturbing the peace, or resisting arrest. None of the so-called victims of police brutality could have been described as model citizens. The net effect has been a reluctance by police to be proactive in attempting to prevent crime. When law enforcement breaks down, vigilantism is bound to take over. No sane person would want that.

As for the proliferation of firearms, it will continue. There are gun shows in this state several times each month. If there is one this weekend, it probably would be necessary to park a ZIP Code away from the location to attend. The shows always have signs nominating a number of prominent public personas for gun salesman of the year — I understand Joe Biden is the current front-runner. (Perhaps it should be pointed out that there is no “gun show loophole” in the identification and background checks by attending dealers. Individual private sales, whether by those attending or by others elsewhere on the street, occur on a daily basis. Anyway, selling a firearm to an individual who is ineligible to buy or possess one is a crime, whether one is a licensed dealer or not.)

Many of those supporting severe restrictions on the purchase and possession of guns, or types of guns and accessories, try to compare the United States with other nations. They often bring up Great Britain and Australia. This is not helpful. No other country in the world has the multiplicity of cultures and demographics that the United States has. Australia banned most firearms after a particularly horrible mass shooting. It is pointed out that there has been one in that country since. What is not typically pointed out is that Australia did not have very much violence, and hardly any mass shootings, prior to enacting that measure. Perhaps a more useful comparison might be with the Czech Republic, where restrictions on firearms and the number of owners are comparable to the U.S. There is very little violence involving firearms in that country. It, of course, is culturally homogeneous.

The increase of violence, with or without the use of firearms, here in the United States over the past several decades is doubtless a result of many factors. The proliferation of guns in the hands of ordinary citizens is more the effect of these causes than the other way around. One possible culprit to be examined, though little can be done about it, is social media itself. That phenomenon has all of the drawbacks of living in a village, without the benefits. In the 1960s, Marshall McLuhan wrote extensively about media, which he called the extensions of man. McLuhan was observing broadcast radio and television, as well as the refinement of all telecommunications. He didn’t foresee the half of it. McLuhan also coined the term “global village” to describe what was coming. Advantages of living in a village or small town include social and economic cooperation, and informal mores and folkways that make for a pleasant quality of life. One of the drawbacks of living in such a place is that everybody knows everybody else’s private business. Nags and busybodies rule the roost. Those whose activities, expressed opinions, or even thoughts are disfavored are ostracized. Eccentricity, no matter if plainly harmless, is not tolerated. The ultimate sanction, of course, is shunning.

An observation made after one of the previous school shootings was that a small number of human beings are insane. There’s really nothing to be done. The “do something” cry is as insipid as the “thoughts and prayers” mantra. Not sure how this will shake out, but it has been in the making for a long time.



Tesla and PayPal entrepreneur Elon Musk tweeted:

“I hereby challenge [Vladimir Putin] to single combat,” Musk wrote, using Putin’s Russian name.

He added, “Stakes are [Ukraine],” using the Ukrainian spelling for Ukraine.

In Cyrillic script, and tagging the Kremlin’s official account, Musk asked, “Do you agree to this fight?”

Absurd? Probably, but it seems like a good idea. Putin likes to portray himself as muscular and accomplished in the martial arts. Don’t know about Musk, but he’s quite a bit taller than Putin.

Émile Zola, in his novel about the Franco-Prussian War (1870–71), aptly titled (in French) La Débâcle, portrayed a soldier on his way to the battlefield musing thus:

“If Badinguet [satirical sobriquet for Napoleon III] and Bismarck have a quarrel, let them go to work with their fists and fight it out and not involve some hundreds of thousands of men who don’t even know each other by sight and have not the slightest desire to fight.” — Zola: The Downfall, chap. ii. (1892)(translated by E. P. Robins)

Wonder how the Russian soldiers are taking this. The Ukrainians are fighting for their existence. Either way the war ends, it will be a debacle for both.


Winning a Rock

“A victory? What have we won? We’ve won a rock in the middle of a wasteland, on the shores of a poisoned sea.” — General Lucius Flavius Silva (attributed in the 1981 TV miniseries Masada, based on the historical novel The Antagonists, by Ernest K. Gann)

Silva was speaking of the final Roman conquest of the Jewish Zealots’ redoubt Masada, on a mountain above the Dead Sea, in A.D. 73–74. The defenders, except for a few women, were all dead when the surrounding wall was breached after a lengthy siege.

This quote, apocryphal though it probably is, may signal the result of Vladimir Putin’s conquest of Ukraine, assuming he succeeds. It appears he will win only a wasteland, and maybe a poisoned one at that.

The siege of Masada by the Roman army was described by the Roman-Jewish historian Flavius Josephus, writing of the Jewish-Roman War. The siege occurred between A.D. 73 and 74, after the fall of Jerusalem, the destruction of the Temple, and the subsequent diaspora.

Another quote, probably a paraphrase and often attributed to Albert Einstein, that the Russians and our leaders should take note of: “Not sure how World War III will be fought, but World War IV will be fought with sticks and stones.”


“We Want Them Broken”

“Did you really think that we want those laws to be observed? We want them broken. You better get it straight that it is not a bunch of boy scouts you are up against — then you’ll know that this is not the age for beautiful gestures. We’re after power and we mean it. You fellows were pikers, but we know the real trick, and you had better get wise to it. There’s no way to rule innocent men. The only power any government has is the power to crack down on criminals. Well, when there aren’t enough criminals, one makes them. One declares so many things to be a crime that it becomes impossible for men to live without breaking laws. Who wants a nation of law-abiding citizens? What’s there in that for anyone? But just pass the kind of laws that can neither be observed nor enforced nor objectively interpreted — and you create a nation of lawbreakers — and then you cash in on guilt. Now that is the system, that is the game, once you understand it, you’ll be much easier to deal with.” — Ayn Rand, Atlas Shrugged (1957; 25th Anniversary Ed. 1992), p. 411.

Piracy, counterfeiting, and treason are the only federal crimes mentioned in the United States Constitution. How many federal crimes has Congress created?

The Saturday (January 22, 2022) Wall Street Journal published an editorial that stated: “In the 2019 United States Code, [the Heritage Foundation and George Mason University’s Mercatus Center] found 1,510 criminal sections. By examining some of those sections at random, they estimated that they encompass 5,199 crimes in total. The Heritage Foundation report notes that ‘there is no single place where any citizen can go to learn’ all federal criminal laws, and even if there were, some ‘are so vague that . . . no reasonable person could understand what they mean.’

* * *

“But even when it comes to conduct everyone agrees should be criminal, the inexorable expansion of the Code has serious consequences for justice and federalism. The Constitution envisioned that most lawbreaking would be handled by state governments, while the federal government’s jurisdiction would be narrower.

* * *

“Both political parties should recognize the risks of an ever-expanding roster of federal crimes, which invites abuse by prosecutors. How about a commitment by Congress to re-examine the necessity of an existing crime for every new one it creates?”

The Heritage Foundation and the Mercatus Center appear to have done a large part of the job of identifying obscure criminal statutes. But what about the Code of Federal Regulations (CFR)? The offices of U.S. Representatives and Senators have become, for most members, a sinecure; that is, a position of being paid for status rather than doing productive work. Congress has delegated broad powers in making regulations, even powers to criminalize conduct, to the various boards, commissions, and agencies created by a myriad of statutes. The so-called experts who run those organizations can declare citizens to be felons, without express Congressional approval, by criminalizing conduct of which they disapprove.

Many of the current abuses of federal criminal law involve regulatory offenses that have no mens rea (criminal intent, knowledge, or recklessness) element. This appears to be contrary to fundamental fairness and due process, but so far courts have been reluctant to rule that the lack of a mens rea element is fatal to a prosecution for violation of many of those regulatory offenses. There may be hope, as the present composition of the Supreme Court seems to be in favor of reining in administrative fiat and executive ukase.

This writer believes there are two measures that would go a long way toward ameliorating this situation. (1) Rewrite the Federal Criminal Code and put EVERY violation that calls for a fine or imprisonment in Title 18 of the United States Code, the present criminal code, where everyone can see and read it. If it is not in Title 18, it cannot be a crime. (2) Enact a statute (it might take a Constitutional Amendment, but that is political heavy lifting) that makes it clear that NO criminal offense can be created by administrative fiat or executive order. Write (OK, email) your views to your Representative and Senators.

Note: for additional reading on this issue, I recommend Three Felonies a Day by Harvey Silverglate (2009). Mr. Silverglate is a lawyer based in Massachusetts who has extensive “white collar” criminal defense experience. Silverglate, along with University of Pennsylvania professor Alan Charles Kors, founded the Foundation for Individual Rights in Education (FIRE), an organization that advocates for and defends students, faculty members, and other employees in free speech and expression issues (on both sides of the political spectrum).


Recent Significant News

If there was any doubt that the Democratic Party, at least in Arizona and probably in the rest of the country, is today led by fools, there is now proof. That Party censured Senator Kyrsten Sinema for not following its national leadership in getting rid of the filibuster — the requirement of 60 votes in the Senate to end debate and bring a bill to a vote. They thus converted Sinema from a party pariah into a martyr.