Guns & Felons

Whether convicted felons should be able to possess firearms seems to be a no-brainer, at least on the surface. It appears obvious that if anyone is to be restricted from such ownership or possession, it should be those persons who have been duly convicted of a serious crime. A federal statute, 18 U.S.C. § 922(g), provides that a person who has been convicted of a crime punishable by imprisonment for more than one year (or, in the case of a state misdemeanor, more than two years) is banned from possessing any firearm, or other dangerous weapon (whatever that might be), for the rest of their life.

The constitutionality of this statute has been questioned by legal commentators, and in a number of cases, to the extent it applies to non-violent crimes such as regulatory offenses. So far the federal appeals courts have upheld it, though with dissents from participating judges in a few cases. The Supreme Court has not yet heard a case involving this issue.

During the recent U.S. Senate Judiciary Committee hearings on whether Amy Coney Barrett, a federal Court of Appeals judge and former law professor, should be confirmed as a Supreme Court Justice, at least one Senator on the committee questioned Barrett about this issue. What prompted the questioning was a dissent Barrett wrote, in her role as a judge on the Seventh Circuit Court of Appeals, from an opinion upholding the statute.(1) That case involved an individual who had been convicted of mail fraud for representing that the orthopedic shoe inserts he was selling were Medicare approved when they were not. Her dissent argued that, as applied to a defendant convicted of a non-violent felony, the statute is unconstitutional under the Second Amendment to the United States Constitution and the Supreme Court decisions recognizing the individual right to keep and bear arms. If Barrett is confirmed as a member of the Supreme Court, which appears likely at this juncture, and the Court were to hear a similar case, the “felon in possession” law, at least at the federal level, could be modified so as not to apply to nonviolent offenses.

This prospect distresses many who favor stricter gun control laws, particularly lawmakers on the Democrat side of the aisle. Why would anyone believe that someone convicted of a serious crime should not lose this right? After all, a felon has been convicted of a crime that Congress or a state legislature has determined to be a serious affront to public safety and order. And one who is convicted can lose liberty through incarceration, and property through fines or forfeiture. In some states, a convicted felon loses the right to vote, to hold public office, to serve on juries, and other civic rights.

Conviction of a crime does not allow infringement of religious liberty, freedom of speech, freedom from cruel and unusual punishment, and so forth. Should such a conviction allow the government to curtail one’s right to self-defense, especially in one’s own home, and other lawful use of arms, as protected by the Second Amendment?

There are several things to keep in mind in this controversy.

There is a concept in law that categorizes offenses as either malum in se, meaning evil in themselves by commonly accepted standards, or malum prohibitum, that is, bad because they are prohibited for some reason other than being inherently evil. It is obvious that, for example, murder, rape, robbery, assault, and similar offenses involving physical force, its threat, or its potential, are evil. At the other end of the spectrum, jaywalking, driving without a license, damming a stream on one’s property that feeds into a navigable waterway, or shipping lobsters to market in the wrong kind of packaging, are not inherently evil.

At common law, felonies were few, and limited to acts of violence for the most part. Treason was a separate category at common law, and could include merely wishing for harm to the king. Felonies were all punishable by forfeiture, corruption of blood, and even (but not always) death. For some felonies such as theft, the punishment could be mutilation or branding — serious and exceptionally harsh punishment. Anyone who committed these crimes was considered unfit to live in the community, or could remain there only under significant legal and even physical disabilities. This, of course, is no longer the case in any common law jurisdiction. These punishments had been abolished in the English-speaking world, and in many other cultures, by the end of the 19th century, except for the most serious offenses such as murder.

The statute discussed here, 18 U.S.C. § 922(g), extends the prohibition to conviction of any crime punishable by more than one year of imprisonment. It does not use the word “felon” or “felony” in its language.

Offenses that provide for possible punishment of more than a year of incarceration include over a thousand acts criminalized under the United States Code and state laws. Many of these are regulatory offenses defined by administrative bodies, not by legislatures or Congress. Those administrative or regulatory crimes, with few exceptions, do not involve any violent act or threat of violence. The title of lawyer Harvey Silverglate’s Three Felonies a Day (2009) suggests how many felonies the average adult in the United States might unknowingly commit while going about everyday activities.(2) What is worse, while Title 18 is the federal criminal section of the Code, numerous other crimes are defined elsewhere in the Code — some in really obscure places — and in administrative rules that the average person, and even Justice Department lawyers, are unaware of. And this in a legal system where ignorance of the law is no excuse!

There have been recent calls from across the political spectrum for criminal justice reform. Many have called for an end to “mass incarceration,” particularly for non-violent drug offenses. There is also widespread criticism of strict liability crimes, that is, offenses that do not require a culpable mental state such as intent, knowledge, or recklessness. Whether there is the political will to accomplish any such reform is not entirely clear.

In any event, this issue needs to be addressed by the Supreme Court. It “cries out for a serious and fresh look – the first serious look since the 1920s, and arguably the first ever in light of the historical right. The need is particularly acute given the cancerous growth since the 1920s of ‘regulatory’ crimes punishable by more than a year in prison, as distinct from traditional common-law crimes. The effect of this growth has been to expand the number and types of crimes that trigger ‘felon’ disabilities to rope in persons whose convictions do not establish any threat that they will physically harm anyone, much less with a gun.”(3)

The over-criminalization of activities that are not inherently evil, coupled with a lifetime ban on the exercise of a fundamental right enumerated in the Constitution and recognized by the Supreme Court, is a danger to liberty. This is so even when the ban remains a mere threat. It might seem remote to most of us. But consider the following warning, written over 60 years ago:

Did you really think that we want those laws to be observed? We want them broken. You better get it straight that it is not a bunch of boy scouts you are up against — then you’ll know that this is not the age for beautiful gestures. We’re after power and we mean it. You fellows were pikers, but we know the real trick, and you had better get wise to it. There’s no way to rule innocent men. The only power any government has is the power to crack down on criminals. Well, when there aren’t enough criminals, one makes them. One declares so many things to be a crime that it becomes impossible for men to live without breaking laws. Who wants a nation of law-abiding citizens? What’s there in that for anyone? But just pass the kind of laws that can neither be observed nor enforced nor objectively interpreted — and you create a nation of lawbreakers — and then you cash in on guilt. Now that is the system, that is the game, and once you understand it, you’ll be much easier to deal with. (4)

Think about that.

Endnotes

  1. Kanter v. Barr, 919 F.3d 437 (7th Cir. 2019).
  2. Follow this link to see eight ways to inadvertently commit a felony. There are doubtless many others. https://www.mic.com/articles/86797/8-ways-we-regularly-commit-felonies-without-realizing-it
  3. See C. K. Marshall, “Why Can’t Martha Stewart Have a Gun?” 32 Harvard Journal of Law & Public Policy 695 (2009).
  4. Ayn Rand, Atlas Shrugged (1957) pb. 410.

Celebrate Columbus Day

October 10, 1492 — Here the men lost all patience, and complained of the length of the voyage, but the Admiral encouraged them in the best manner he could, representing the profits they were about to acquire, and adding that it was to no purpose to complain, having come so far, they had nothing to do but continue on to the Indies, till with the help of our Lord, they should arrive there.

October 12, 1492 — At two o’clock in the morning the land was discovered, at two leagues’ distance; they took in sail and remained under the square-sail lying to till day, which was Friday, when they found themselves near a small island, one of the Lucayos, called in the Indian language Guanahani.

— As translated in Journals and Other Documents on the Life and Voyages of Christopher Columbus (1963) by Samuel Eliot Morison, pp. 62–64 (derived from a summary made by Bartolomé de Las Casas).

One of the current targets of the American left’s chagrin appears to be Christopher Columbus, the discoverer of the lands of the Western Hemisphere. Commemorative statues of the explorer have been vandalized and even removed in many cities and places in the United States. There is a movement — among some institutions, and in some places already successful — to rename October 12 “Indigenous Peoples Day.” Perhaps those of us who were born in this country should not be offended, being currently indigenous and native to America ourselves. But, of course, what the cancel culture has in mind is a sop to those individuals (assuming most even care) whose ancestors immigrated to the Western Hemisphere prior to Columbus’s arrival.

Anthropologists and those of related disciplines have demonstrated that the original “discoverer” was the first human who crossed the Bering Strait, or the land bridge that formerly existed there, several millennia ago. And Vikings briefly visited what are now the Canadian Maritime Provinces, and perhaps the coast of Maine, four or five centuries before 1492.

Columbus’s voyages of exploration and discovery, however, commenced the permanent connection between Europe and the Western Hemisphere, and later the rest of the world. In most significant respects, the New World over the next five centuries became the New Europe. The culture termed Western Civilization came to dominate the world, and the world has been much better off ever since. Anyone who does not believe that it has is not paying attention.

The left in the United States, especially the majority of the university professoriat, are certainly among them. They are the blind who will not see. Their fiction is that Columbus and his immediate successors rapaciously conquered “paradise” and brought war, disease, racism, slavery, and oppression to the inhabitants who had long communed in harmony with nature.

Of course, this is nonsense. Western Civilization did not invent racism, disease, slavery, or warfare. Those maladies have existed among humans from time immemorial, and still do, chiefly in places that have not yet accepted Western culture. The West, however, during the post-Columbus centuries ended slavery, drastically curtailed disease throughout the entire world, and made subsistence so abundant that as much food is thrown away as is consumed, if not more. The slavery that still exists is in places that have rejected Western Civilization. While racism, that is, tribalism, may have been mitigated, leftist identity politics has given it new life under a different name. War has not been eliminated; perhaps that is impossible for so long as there are those who eschew reason.

Christopher Columbus was in the vanguard of the West. He was the individual with the courage to strike out and risk his very life by sailing into the unknown to discover a new route, not for conquest, but for trade and commerce. He found this new hemisphere where Western Civilization was to expand and flourish. If it had not been Columbus, doubtless another European would have come to the American continents. But no matter: he was the first, and over four voyages he made the connection between the old and new worlds permanent. This commenced what Texas historian Walter Prescott Webb called the Great Frontier, a four-century economic boom that benefitted the entire world.

_____________________________________________________________________________

Note: Just this past week, adjunct professor Richard Taylor was punished by St. John’s University in New York for posing the question “Do the positives outweigh the negatives?” about the Columbian Exchange to his history class. That was the exchange of plants, animals, and diseases between the Old and New Worlds that occurred subsequent to Columbus’s voyages. The most noted parts of the exchange were the smallpox and other diseases the Europeans brought to the Americas, and the maladies unknown to Europeans that the sailors brought back. The short-term effects of that exchange were devastating to the then-indigenous inhabitants, but the long-term effects may well have been, and are, beneficial to later ones. A debate on both sides of this topic is certainly an exercise that students and faculty should be able to have without being censured or canceled. See https://www.thefire.org/teaching-history-not-permitted-st-johns-bulldozes-academic-freedom-punishes-professor-for-posing-question-about-columbian-exchange/

Systemic Racism

(Endnotes in parentheses)

“Racism is the lowest, most crudely primitive form of collectivism. It is the notion of ascribing moral, social or political significance to a man’s genetic lineage—the notion that a man’s intellectual and characterological traits are produced and transmitted by his internal body chemistry. Which means, in practice, that a man is to be judged, not by his own character and actions, but by the characters and actions of a collective of ancestors.”
* * *
“Today, that problem is growing worse — and so is every other form of racism. America has become race-conscious in a manner reminiscent of the worst days in the most backward countries of nineteenth-century Europe. The cause is the same: the growth of collectivism and statism.”
* * *

“The smallest minority on earth is the individual. Those who deny individual rights cannot claim to be defenders of minorities.” (1)

Ayn Rand wrote these words over 50 years ago, but they live on as a present-day description of so-called “Progressivism.”

Fifty-plus years after a successful movement to eliminate legal race discrimination and institute remedies for such conduct, the cry of racism has emerged at three-digit decibels here in the United States. Why? The answer is contained in Rand’s essay. Racism is but a subset of collectivism, and collectivism has by no means been eliminated, whether officially, culturally, or socially. In fact, it has been encouraged by official acts, and imposed by popular culture, threats by employers, and, in some cases, mob action.

Thus, the flaw in the attempts to end race discrimination was that the legislation enacted and the judicial decisions rendered with that goal in mind perpetuated the collectivism that is inherent in racism.

“Systemic” means “relating to a system as a whole; inherent in the system” or “fundamental to a predominant social, economic, or political practice.” (2) As early as 1967, the term “institutional racism” began to be used to justify such practices as “affirmative action” and other race-conscious activities in government. (3) That term has not been heard recently. One can suppose that some believe “systemic,” a term generally used in the medical context, lends a scientific aura to racism as something pervasive, a disease of the entire body politic.

Is there “systemic racism” in the United States today? Well, yes. But not as our leftist cousins would like us to believe.

My most recent experience with this racism occurred just the other day, but similar experiences have recurred throughout my adult life, and, without fail, in every tenth year such as the present one.

The recent occasion spoken of above was my purchase of a firearm from a dealer. In order to make the purchase, I was required to complete ATF Form 4473. The purpose of this form is to ensure eligibility to purchase and possess a firearm; that is, that I was not a felon, mentally ill, et cetera. The questionnaire asks for identifying information, such as date and place of birth, height, weight, residence, and so forth. It does not ask for eye color or complexion hue. It does, however, ask for the buyer’s race and whether the buyer is “Hispanic,” regardless of race. In other words, it asks what tribe, as recognized by the government, you belong to. It means the government classifies you by your accident of birth. This is also the case for the U. S. Census and a myriad of other forms. To purchase a firearm, or to complete the Census, answering is mandatory. True, one can classify himself or herself. (4) For non-government agencies and organizations, many such forms have options for “multi-racial” or “prefer not to answer,” but those still have a number of racial categories to choose from.

What is interesting about each person choosing his or her own racial and ethnic classification is that the choice cannot be successfully challenged. (5) Biologically, almost everyone is of mixed race, and thus there is really only one race: human. (6)

As an aside, since most anthropologists appear to agree that humans originated in eastern and southern Africa, one can suppose that all individuals in the United States can claim to be “African-Americans.” (7)

The ATF is not the only agency that insists that an individual who interacts with it be classified by supposed race. Other agencies do as well. For example, EEOC regulations require employers who have more than a certain number of employees to do so. (8)

The national Census was required by the Constitution to implement the apportionment of members of the House of Representatives. A controversy arose as to whether slaves, who could not vote, would be counted for that purpose. The states whose ratification of the Constitution was essential if the new federal government was to succeed, mainly Virginia and New York (which, interestingly enough, had significant numbers of slaves), wished to include the slaves. Other states essential to ratification objected, because counting the slaves would give disproportionate power to the states where they were a significant portion of the population. Thus the compromise, in which “three-fifths of all other persons” would be counted, was included in Article One. (9)

After the Reconstruction Amendments, it was no longer necessary to identify who was a slave and who was a free person. Nevertheless, later enumerations identified persons by race or ethnicity, though such identifications were to an extent arbitrary, and sometimes ludicrous. More categories were added until the number in the current 2020 Census reached 14, plus “some other race,” which I suppose could include Martian or Vulcan (which would be logical for Mr. Spock). The current form includes “Chamorro,” for residents of Guam or those whose origin is there. Why this is a separate “race” can doubtless be traced to a bureaucrat’s desire to curry favor with a denizen of K Street.

While an actual enumeration of individuals continues to be necessary to decide the number of members of the U. S. House of Representatives allocated to each state, identifying the race or ethnicity of each person does not serve any constitutional purpose. It should be eliminated, once and for all.

The late nineteenth and early twentieth centuries saw academic acceptance of racial hierarchy and eugenics. Many intellectuals, including President Woodrow Wilson, educator John Dewey, Justice Oliver Wendell Holmes, Jr., and feminist hero Margaret Sanger, promulgated ideas that were termed “progressive” in those days. (10) Apparently our current “Progressives” agree, at least for their own purposes.

An official government website enumerates the ostensible reasons for asking the race question:

• Establish and evaluate the guidelines for federal affirmative action plans under the Federal Equal Opportunity Recruitment Program.
• Monitor compliance with the Voting Rights Act and enforce bilingual election requirements.
• Monitor and enforce equal employment opportunities under the Civil Rights Act of 1964.
• Identify segments of the population who may not be getting needed medical services under the Public Health Service Act.
• Allocate funds to school districts for bilingual services under the Bilingual Education Act. (11)

Affirmative action as it has been practiced is the most egregious form of government race discrimination. It, of course, assumes that one is inferior in some way because of one’s racial classification, and needs special treatment on that account to achieve some benefit or achievement. (12) More to the current point, a supposed beneficiary of an affirmative action program may be, and quite often is, assumed to have attained a position, profession, or occupation only because of race rather than ability, and, perhaps, to have edged out a more qualified person. Whether or not many affirmative action beneficiaries are in fact objectively unqualified for the status or position they attain, the perception is widespread, and it is unhelpful in the quest to abate race or sex discrimination. (13)

Regarding voting rights, the Census does not publicly identify individuals, so to ascertain a pattern of supposed discrimination, those interested would have to assume that qualified voters in a particular voting precinct failed to vote because they were excluded from the polls on account of their race, which might not be the case in fact. Many citizens do not vote out of indifference or simple laziness. In any event, racial or ethnic discrimination in voting, housing, or employment should be remedied through individual actions based on intentional acts. The “disparate impact” concept as proof of discrimination is rampant collectivism, which is inherently both over-inclusive and under-inclusive.

Additionally, the Census is used for allocation of government largesse and is useful for private commercial purposes. (14) But dividing and categorizing citizen (and non-citizen) residents by race only perpetuates the concept that an individual’s race should matter to the government. Obviously, in this case our government is still discriminating by race.

Many countries do not enumerate their inhabitants by race and ethnicity. France is one of the most notable. In that country, an individual is a French citizen or not. On July 13, 2018, the day prior to Bastille Day, the National Assembly voted to remove the word “race” from the Fifth Republic’s constitution. This comes some years after the word was removed from French legislation in 2013. (15)

Dr. Carolyn Liebler, an associate professor of sociology at the University of Minnesota, worked with the Census Bureau to document how answers to the race question changed from 2000 to 2010. She found that 6.1% of respondents (9.8 million people) changed their answers. If that trend continues in 2020, the Census’s expressed purposes for distribution of government largesse will become invalid. (16) Nevertheless, Dr. Liebler answered her own rhetorical question of why we ask the race question thus: “Because our society is organized by race. We treat it as real.” (17) Perhaps so, though one can certainly argue that no such “organization” exists. But the official treatment of individuals in this country as members of tribes only perpetuates the existence of racism. That racism is inevitable, and “systemic,” so long as the government continues to treat race as real.

Supreme Court Chief Justice John Roberts has declared, “The way to stop discrimination on the basis of race is to stop discriminating on the basis of race.” (18) Our government should start with the U.S. Census, by purging the racial and ethnic information from current and past Censuses and eliminating the question from future ones. In his dissent in the 1896 case of Plessy v. Ferguson, which upheld the “separate but equal” doctrine for racial segregation, Justice John Marshall Harlan stated that “our Constitution is color-blind, and neither knows nor tolerates classes among citizens.” The doctrine was repudiated in Brown v. Board of Education (1954), but our governments still tolerate discrimination on the basis of race. If “systemic” or “institutional” racism exists, it will persist so long as Justice Harlan’s words are ignored. (19)

Note: Doubtless there are those who would question this analysis, or parts of it. This blog always welcomes such questions, if they are respectful.

Endnotes

  1. Ayn Rand, “Racism” (1963), reprinted at https://ari.aynrand.org/issues/government-and-business/individual-rights/racism/
  2. “systemic, adj. and n.” OED Online. Oxford University Press, September 2020. https://www.oed.com/
  3. See S. Carmichael & C. V. Hamilton, Black Power 4 (1967). (Cited by the OED in its entry “institutional.”)
  4. In our brave new world, it’s been asserted that one should be permitted to choose which sex (“gender” in wokespeak, a dialect of George Orwell’s Newspeak, I suppose) one wishes to identify with.
  5. Ann Morning, “It’s Impossible to Lie About Your Race” (July 1, 2015, updated July 1, 2016), https://www.huffpost.com/. To enforce the “Jim Crow” laws in the early 20th century, some states adopted the so-called “one-drop rule”: any discernible Negro or black ancestry makes one “black” for legal purposes. This “is taken for granted as readily by judges, affirmative action officers, and black [and white] protesters as it is by Ku Klux Klansmen,” interestingly enough (emphasis added). See F. James Davis, Who Is Black? One Nation’s Definition (1991), excerpted at https://www.pbs.org/wgbh/pages/frontline/shows/jefferson/mixed/onedrop.html
  6. Many on the left, in their Orwellian doublethink way, challenge this concept as being racist itself. One particularly outrageous example: Ibram X. Kendi, the director of the Center for Antiracist Research at Boston University, recently stated on Twitter that Supreme Court nominee Amy Coney Barrett and her husband’s adoption of two black children from Haiti proves they are racists and “white colonizers.” One cannot make this stuff up.
  7. See Mounier, Aurélien & Lahr, Marta, “Deciphering African late middle Pleistocene hominin diversity and the origin of our species,” Nature Communications 10(1): 3406 (2019).
  8. See https://www.eeoc.gov/employers/eeo-1-survey/eeo-1-instruction-booklet.
  9. U. S. Constitution, Article I, Section 2. The “three-fifths” obviously referred to the aggregate number of persons counted, not three-fifths of each individual.
  10. Margaret Sanger has recently been removed from the pantheon by the woke crowd, notwithstanding her contributions to their cause, because they discovered she held some heretical beliefs. Revolutionaries always eat their old, as well as their young.
  11. See https://www.census.gov/acs/www/about/why-we-ask-each-question/race/
  12. One wonders if an ostensible racial category could be a disability and “affirmative action” considered a reasonable accommodation under the Americans with Disabilities Act.
  13. It might be added that many small businesses are formed with women (often spouses or siblings) or members of favored “minority” groups as token owners to obtain government benefits.
  14. See endnote 11.
  15. https://www.connexionfrance.com/ July 13, 2018. This move has not been without controversy, apparently for the same reasons there would be opposition here: it would prevent the use of identity politics and wedge issues based thereupon.
  16. As will its use for identification of alleged “disparate impact.”
  17. Jo Craven McGinty, “Documenting Race Proves Tricky for Census” The Wall Street Journal, July 25 – 26, 2020, page A2. (Print edition).
  18. Parents Involved in Community Schools v. Seattle School District No. 1, 551 U.S. 701 (2007).
  19. Plessy, 163 U.S. 537 (1896). Harlan should not be confused with his eponymous grandson, who also served as a Supreme Court Justice from 1955 to 1971. The flaw in Brown was that it did not base its holding on the categorical grounds Harlan articulated, but upon sociological theories that relied largely on the work of Swedish academic Gunnar Myrdal.
                           

Good Week for the Constitution

This has been a good week for the U. S. Constitution and good guys.

On the Fifth Amendment front, as of yesterday, colleges and universities conducting Title IX proceedings are required to use a First Amendment-compliant definition of sexual harassment and to guarantee basic due process protections for the accused (such as a presumption of innocence, the right to an advisor, and the right to question one’s accuser). Accusers and victims, too, will benefit from newly required measures that offer them support — without punishing the accused before they are actually found to have committed the offense. The new regulations, adopted in strict compliance with the Administrative Procedure Act, overturned the “suggestions” in the infamous “Dear Colleague” letter issued by the Department of Education during the Obama administration. That letter suggested that institutions providing the above-enumerated protections could be penalized by the loss of public funds. Due process is the cornerstone of liberty. It includes the requirement that an accuser prove the case, not that the accused prove innocence; counsel familiar with the adjudication process; and the ability to confront one’s accuser. A tribunal that denies these rights is not worthy of respect and amounts to a kangaroo court.

Two federal courts have denied relief in suits brought to enjoin the enforcement of the new rule. See https://www.thefire.org/legal/fire-fights-back-against-lawsuits-challenging-2020-title-ix-regulations/

It is still possible that a new President and his or her administration could reverse this, though it would be a lengthy process.

________________________________________________________

On the Second Amendment front, in a case challenging the California statute banning possession of high-capacity magazines for firearms, the U.S. Court of Appeals for the Ninth Circuit upheld a district court’s ruling that the law was unconstitutional.

In Duncan v. Becerra, No. 19-55376 (9th Cir. August 14, 2020), Judge Kenneth Lee, writing for himself and Judge Consuelo Callahan, gave the reasons behind their ruling.

“Armed self-defense is a fundamental right rooted in tradition and the text of the Second Amendment,” said the majority opinion. “Even well-intentioned laws must pass constitutional muster. They passed the law in the wake of heart-wrenching and highly publicized mass shootings, but it isn’t enough to justify a law that is so sweeping that half of all magazines in America are now unlawful to own in California.”

The opinion also noted that such magazines “are typically used for lawful purposes, and are not ‘unusual arms’ that would fall outside the scope of the Second Amendment.”

As important as this ruling is, it only applies to laws prohibiting the possession of all magazines that hold more than ten rounds. The court concluded by stating “[w]e also want to make clear that our decision today does not address issues not before us. We do not opine on bans on so-called “assault weapons,” nor do we speculate about the legitimacy of bans on magazines holding far larger quantities of ammunition.”

This leaves open the possibility that a ban on a rifle magazine that holds, say, 20 or more rounds could be constitutional, at least in the Ninth Circuit (the western United States).

A fly in the ointment is that there was a dissent in this case. Our own Barbara Lynn, Chief District Judge of the Northern District of Texas, sitting by assignment, wrote that the majority opinion is wrong and contrary to rulings in other circuits. A dissent and a split in the circuits make U.S. Supreme Court review more likely, though not certain. The Supreme Court has been reluctant to take any cases regarding gun control since McDonald v. Chicago in 2010. But who knows? Many believe the Justices think the issue should be handled by the individual states and the circuits.

My own opinion is that a nationwide ban on possession of large-capacity magazines (or, for that matter, semi-automatic rifles) would be unenforceable, would potentially criminalize half the population of the United States, and would do nothing to take those firearms and accessories out of the hands of those who would use them unlawfully.

The full opinion in Duncan v. Becerra is available at https://law.justia.com/cases/federal/appellate-courts/ca9/19-55376/19-55376-2020-08-14.html

Hiroshima & Nagasaki + 75

On August 6, 1945, and again on August 9, the United States military dropped atomic bombs on the Japanese cities of Hiroshima and Nagasaki. Days later, Japan surrendered, thus ending World War II.

In the 75 years since, not one more nuclear weapon has been used in anger, despite escalations of tension over that time and the proliferation of those weapons. A nearly 50-year Cold War saw a number of wars between client states of the Soviet Union and the United States that, despite the ferocity with which they were fought at times, never resulted in an exchange of atomic weapons.

Perhaps one reason hostile nations have so far avoided the use of nukes is because Hiroshima and Nagasaki demonstrated the sheer horror of the use of those weapons.

During this annus horribilis of 2020, we have the usual suspects calling for a “conversation” (which really means recrimination), questioning whether President Harry Truman’s use of those weapons was necessary, or perhaps profoundly immoral. Some have even accused the President and those involved in dropping the bombs of being war criminals. That is sheer nonsense.

Sixteen years ago, the British writer A. C. Grayling wrote Among the Dead Cities: The History and Moral Legacy of the WWII Bombing of Civilians in Germany and Japan. I wrote a review of the book for a local blog. It bears repeating on this semisesquicentennial anniversary of the Hiroshima and Nagasaki bombings, as follows.

A friend once opined that if the United States and its British ally had pulled their punches in World War II as they have in every war since, including the present one, we’d all be speaking German and/or Japanese. Rhetorical hyperbole this might be, and it would in no sense justify a no-holds-barred approach to the current conflict in the Middle East. It should be undeniable, nevertheless, that the total war Britain and America fought was necessary to beat the Axis. After all, Nazi Germany and Japan began the concept with a vengeance, and fought ferociously until the bitter end. Air Marshal Sir Arthur “Bomber” Harris, head of the Royal Air Force Bomber Command, is reported to have observed while watching the fires around St. Paul’s during the London Blitz that “they’ve sown the wind and will reap the whirlwind.” Harris, of course, was the chief windmaker, the architect, if one can use that appellation in such circumstances, of the utter devastation of German cities in the air war that ensued. His bombardiers sowed the seeds of the tornadic firestorms that engulfed Hamburg, Dresden, and other cities, incinerating tens of thousands of civilians and reducing houses, shops, museums, and public buildings to hideous skeletons. The U.S. Air Force in the Pacific, once islands in range of Japan had been captured, carried out similar raids on Japanese cities, creating even greater destruction. The final two raids witnessed the only wartime use ever of nuclear weapons.

A. C. Grayling’s Among the Dead Cities is the latest of a number of histories of the strategic bombing in World War II. Its dramatic title (possibly an allusion to I Samuel 31:7) alone sets it apart from the prosaic works of more methodical historians. Grayling styles himself a philosopher rather than a historian and focuses on the morality of the area bombing – sometimes called “saturation” or “carpet” bombing – of German and Japanese cities. That such bombing was indiscriminate and served to terrorize the targeted populations, kill civilians in great numbers, and destroy their cities makes the whole concept morally repugnant to Grayling. The author claims that, while the stated purpose was to break the enemy’s morale and spirit and disrupt the daily lives and economy of the German people, it served only to increase the resolve of the Germans – much as the 1940-41 Blitz steeled the British to resist. All that the bombing accomplished was the wanton and useless destruction of centuries-old cultural treasures and the slaughter of civilians, and it had little effect on the outcome of the war.

That thesis is nothing new. A postwar assessment of the effect of strategic bombing indicated that German industrial production continued to increase almost up to the end, in spite of nearly continuous attacks during the last year of the war. Grayling’s conclusion, however, is that the area bombing was unjustified by military necessity, and thus amounted to a moral outrage and a war crime. Harris, Churchill, and the commanders who carried out their orders (Grayling, perhaps protesting too much, includes a disclaimer that he does not intend to impugn the bravery or morality of the RAF and American pilots and crews) perhaps escaped prosecution because no international protocols like the Geneva Conventions proscribed aerial bombing of civilian targets, and, most importantly, because the Allies won the war.

Similarly, Grayling believes the area bombing raids of Japanese cities were American war crimes, and, by implication, Roosevelt, LeMay, and Nimitz were war criminals. The firebombing of Tokyo and other cities, even more destructive, is of course overshadowed by the nuclear devastation of Hiroshima and Nagasaki, criticism of which from time to time is the subject of unctuous breast-beating by certain elements – but that is another story.

Area bombing in Europe was destructive and deadly. Did it win or hasten the end of the war? Did it have any salutary effect at all? Was the loss of civilian life and the ruin of historic structures and artifacts worth the cost? Was there any justification to continue the bombing after late 1944 in Germany, or after April or May of 1945 in Japan, when the war was all but won? While a majority of Germans never voted for Hitler (when it was still possible to vote for leaders, prior to 1933), few protested Nazi policies, most acquiesced in the anti-Jewish laws, and probably a huge majority were thrilled by Hitler’s diplomatic and early military victories. So-called terror bombing was first used by the Nazi-controlled Luftwaffe against Holland and Britain. When tit was given for tat, the bombing in Germany was not carried out wantonly against a defenseless people. The German military fought back ferociously. Over 50,000 British airmen (and a considerable number of Americans) were casualties of the campaign, and thousands of aircraft were shot down. Until the United States geared up sufficiently to help in Europe (remember, the American military had its hands full with Japan in the first two years of the Pacific war, while the Russians were reeling from a withering German offensive), Britain was essentially alone. It had itself been subjected to a Nazi terror bombing campaign from May 1940 through June 1941, halted only when Hitler turned his attention to the Soviet Union. The British fought the only way they could. Given the technology of the time – a far cry from the kind that allowed U.S. forces to pinpoint and kill terrorists with drones and computer-guided missiles, resulting in minuscule collateral damage – and the European weather conditions, nighttime area bombing was the only method that could be remotely effective.
The main accomplishment the British wanted was to create sufficient disruption to discourage the Nazi bombers from coming back to their homeland. The diversion of resources to air defense, particularly after the Cologne raid of 1942 and Operation Gomorrah over Hamburg in 1943, surely kept the German air force from attacking Britain again, at least with manned aircraft. The bombing likewise surely hindered the effort on the Russian front. The Soviets begged the western Allies to open a western front for over two years before the invasion of France on D-Day. Aerial bombing was the best that Britain and the U. S. could do until sufficient resources were marshaled for the Normandy invasion.
Grayling argues that after the establishment of a western front and the liberation of most of France by September 1944, combined with contemporaneous Russian drives into Poland, every indication was that Germany was defeated, and all was over but the shouting. Continuing the relentless bombing of Germany was thus unnecessary.


This is hindsight; it was not all that apparent at the time. To illustrate this point, in September 1944, the Anglo-American forces were dealt a severe setback in Operation Market Garden in the Netherlands, and in December of that year, the German army launched a fierce offensive in the Battle of the Bulge. All the while, Great Britain again was subjected to air raids; this time by the unmanned V-1 and V-2 missiles, the latter being supersonic and striking without warning of any kind. The only defense against the V-2s was to prevent their being available to be launched in the first place. Area bombing, haphazard as it was, was the only possible way. Even after the Bulge, the Anglo-American-Canadian forces were fiercely opposed every step of the way. The Soviet Army in the East was even more ferociously opposed. The Russians suffered nearly a half million casualties in the final drive to Berlin, and had to fight for the city block by block. It is incontrovertible that Britain and the U.S. had to use everything at their disposal to end and win the European War.


As for Japan, the resistance of the enemy was even stronger. Japan began its war with the United States with a sneak attack. Japanese forces contested every battle by fighting, almost literally to the last man. American and British prisoners of war were treated abominably. And when the war was clearly going against Japan, the Kamikaze suicide campaign began. After the liberation of the Philippines, the U.S. was faced with the necessity of invading the home islands of Japan to end the war. Given this situation, could any rational American commander in chief not conclude that serious softening up of the Japanese homeland was a necessary prelude to invasion? When the atomic bombs became available, why should it have been preferable to spare Hiroshima and Nagasaki so the United States and its allies could suffer a million casualties (the estimate at the time, never seriously refuted) invading Japan?


The measureless human suffering caused by the bombing is evident. The loss of lives, particularly of innocent children who had no part in making the world they were born into, is an unfortunate reality of war. The resultant loss, particularly in Germany, of cultural treasures is one of the saddest legacies of the area bombing. Photographs of the pre-war German cities – Berlin, Hamburg, Dresden, and others – reveal charm and beauty that was utterly destroyed. Berlin suffered the most, not only from the bombing, but from the devastation of the last battle and its division between two hostile powers for the next two generations. During my first visit to Berlin, twenty-one years after the end of the war, the scars were still there, and where rebuilding had taken place, it was mostly soulless modern construction. At the time of a later visit, a few years before the Wall fell, it had not changed much. Even in 1995, large tracts were still rubble-strewn vacant areas. But Berlin has come back; much of the city has been restored to its pre-war appearance, and the newer architecture has its own beauty. Dresden is more remarkable for the restoration of the old city area, including plans, much delayed by the former East German communist regime, for the restoration of the totally destroyed Frauenkirche (which is now complete and was rededicated in 2005, Britain’s Queen – during the war, Second Lieutenant Elizabeth Windsor – sending her best wishes). Nuremberg’s old city center, especially targeted because it was a Nazi hotbed, has been almost completely restored to its pre-war appearance. This demonstrates that artifacts can be rebuilt. Civilizations, however, might well not be. World War II was a struggle for civilization, Western Civilization as it had advanced in its highest and finest order. One of its finest exemplars had been hijacked by evil forces that harkened back to barbarism, superstition, and savagery.
The reasons this happened are the subject of a surfeit of writings with many more doubtless to follow, so this phenomenon will not be examined here. But happen it did, and was an evil that had to be defeated, at whatever cost, for our civilization to survive.

Grayling acknowledges that Nazism was a profound evil, and that Japanese militarism was not much better and committed worse atrocities than could ever be laid at the feet of the British and Americans. He maintains, however, that two wrongs do not make a right, and that there is no justification for sinking to the same moral level as the Nazis. True enough, but beside the point. Allied strategic bombing was not calculated genocide or wanton cruelty toward conquered peoples and prisoners of war. It had the legitimate goal of defending against and defeating the forces that practiced such atrocious conduct.


A final point that Grayling ignores completely is what would happen when the fighting was over. Winning the war was one thing; maintaining a peace afterward is quite another. The World War I armistice, which occurred while the German army was intact and still on French and Belgian soil, and before any part of Germany had been invaded, gave credibility to the Nazi explanation that the victorious German army was “stabbed in the back” by reformers, bankers, pacifists, and, especially, Jews. At the end of World War II, Germany and Japan knew they had been beaten – badly. While the comparison is apt, the victory did not quite impose a Carthaginian Peace, as the Romans did after being troubled three times by the same foe. The defeated German and Japanese adversaries were devastated to the point that they had to be rebuilt from the ground up. They were, and reconstructed in the image of capitalist representative democracies. For sixty years after World War II ended, the world, beset by conflict and bloodshed as it has been, was not to be troubled by military aggression emanating from Germany or Japan. Perhaps, then, at least in two corners of the earth, Arthur Harris’ whirlwind managed to uproot the grapes of wrath.

R.I.P. for Another Civil Rights Hero

During the past two weeks there has been quite a bit of pomp and circumstance and celebration of the life of John Lewis, a civil rights icon and hero. Lewis doubtless well-deserves these accolades.

Lewis was in the vanguard of the civil rights movement of the late 50s and 60s, was in harm’s way of the segregationist resistance, and was beaten and jailed for his pains. He brought attention to the injustice of racial segregation and consequent oppression that existed in the southern United States, and elsewhere in the nation. For that everyone should honor him.

Easy to miss amid the mourning for Lewis is the death of another civil rights hero, if less of an icon: Herman Cain. Few of us had ever heard his name until he offered himself as a candidate for President of the United States in 2012.

Cain’s accomplishments in the private sector showed what intelligence, education, and persistence could do for an individual, even one born and raised as a black person in a racially segregated society. The elimination of legal barriers to individuals like him did not itself change the hearts and minds of many who thought that black persons were inherently less capable than those of lighter complexion. Nevertheless, Cain did a lot to dispel that notion.

He was educated at Morehouse College in Georgia, a historically black institution, and received a Master’s degree from Purdue University. His college major was mathematics, certainly not the easiest discipline one can study.

Cain worked first for Coca-Cola, became a vice president with Pillsbury, and then was appointed to run its struggling Burger King unit in the Philadelphia area. His success prompted Pillsbury officials to ask him to take over its floundering Godfather’s Pizza chain. His success in industry showed that a person of his complexion and ancestry certainly had the qualifications for running large businesses, including bringing one back to profitability. As an executive he certainly had experience to rival that of the present incumbent of the White House, and vastly more than that of its previous occupant, the latter of whom he sought to challenge for the office.

Many statues commemorating public and private individuals of the past are currently under attack, and in some cases have been destroyed by vandals. Some of those persons probably do not deserve to be so commemorated, but all heroes have flaws, including Cain and Lewis, both of whom are worthy of eponymous streets, statues, and institutions.

There is a lot more to say about Herman Cain, and others will surely say it.  I close with this thought: May he rest in peace.

The Declaration of Independence

Earlier this year we acquired a “Betsy Ross” American flag — the one with thirteen stars in a circle in the canton. Somehow it seems appropriate to display it on the Fourth of July.

The following is a post published on previous Independence Days. It is worth repeating on this day (with a few non-substantive updates).

The Fourth of July is a day for picnics, fireworks, parades, and all kinds of activities celebrating our Nation’s birth. It should also be a time for reflection on the founding principles. The source of those principles is most immediately found in the Enlightenment. This was a cultural, intellectual, and, later, political movement of the late Seventeenth and Eighteenth Centuries that historians regard as the culmination of a sea-change of thought about the relations of human beings with the universe that began with the Renaissance and Reformation and their precursors. Enlightenment thinkers emphasized the use of reason as the basis of knowledge and understanding and the primary method of discovering moral and physical truths which many regarded as inseparable. Their use of reason led to the idea of the essential sovereignty of the individual person, and the rights of man (in the generic sense) to life, liberty, and estate, or property. The Enlightenment is associated with such thinkers as Isaac Newton, John Locke, Montesquieu, Voltaire, Rousseau, Adam Smith, and others in Europe, and Thomas Jefferson, Benjamin Franklin, and Thomas Paine in America. The United States of America is the only nation founded on the basis of common ideas, rather than accidents of geography, of kinship or tribe, or conquest. Some historians have described our primary founding document as an Enlightenment Manifesto. Leonard Peikoff in his book The Ominous Parallels described America as the “Nation of the Enlightenment.” I have spent some time parsing the Declaration of Independence to show why this is so. Please read on.

When in the Course of Human Events…

The first seven words of the Declaration of Independence are themselves revolutionary. Before Thomas Jefferson (with help from Benjamin Franklin, John Adams, and others) penned this document, nearly every important legal document began with a paean to God or to the monarch. The Magna Carta and the Mayflower Compact (which some call the first American constitution) are examples; there are countless others. This Declaration recognized that human events, not supernatural or authoritarian ones, drive this change. That is not to say it rejects a deity, or even Christianity. It emphasizes that this is the act of human beings, done in the name of a group of people freely associating.

… it becomes necessary…

The word “necessary” in the Enlightenment sense means naturally caused; that is, inevitable because it is of nature. It is akin to a natural law like Isaac Newton described in his treatises on motion and gravity. The law of gravity requires – makes necessary – that an object falls to the ground. As the Declaration goes on to say, events have made American independence necessary.

… for one People to dissolve the Political Bands which have connected them with another…

Political connections are a human construct, not the divine right of kings. The King is not the state – Le roi n’est pas l’état.

…and to assume among the Powers of the Earth the separate and equal Station…

The people of the colonies are assuming, by their own act, a status of independent sovereignty equal to that of all the other nations on earth. This assumption is not a grant; it is not held at the sufferance of the colonial master. It is inherent by right, by the law of nature.

… to which the Laws of Nature and of Nature’s God entitle them,…

This is the core of Enlightenment thinking. God is revealed through nature, and the laws of nature are the laws of God. Jefferson may have been treading lightly here. He probably was a Deist, as were many Enlightenment figures, including Benjamin Franklin and, most notoriously, Thomas Paine. He had to recognize the Judeo-Christian tradition because most of the colonial leaders, not to mention the ordinary colonists, were at least nominally Christian. Deism, as such, was not hostile to Christianity or other forms of religion, but those faiths did not tolerate Deists.

… a decent Respect for the Opinions of Mankind requires that they declare the causes which impelled them to the Separation.

The people seeking independence are telling the world why. They are justifying their actions to the rest of the world, not just their former British overlords. Reason is what gives actions legitimacy, according to Enlightenment principles. Reason is given to men by God, or nature, and they are expected to justify their actions by it. In order for the world to grant its approval and sanction, human actions must be reasonable.

The next paragraph of the Declaration is a treatise on government and gives the underlying philosophical basis and general justification for independence.

We hold these Truths to be self-evident…

The introductory phrase is an epistemological statement that breaks from long-standing Aristotelian Scholasticism’s presumed authority of the past. The Enlightenment held that empirical observation, and reasoning from verifiable observations, is the basis of knowledge. The intellectual tradition of the Western world – indeed, of the entire world – had been that the received wisdom of the past should not be deviated from and should form all premises on which knowledge was based. The primary Western authorities, of course, were the Scriptures and the Greek philosophers, particularly Aristotle. Beginning with Francis Bacon, the early modern thinkers gradually broke with this method, at least insofar as it attempted to explain the workings of the physical universe. To them, self-evident truths are those that could be apprehended by ordinary minds neither clouded by superstition nor addled by passion. The Enlightenment scholars, and other humanists, did not necessarily reject Christianity as a source of moral guidance and of man’s relation to God in eternity, but believed that God manifested truths about the physical universe in nature.

… that all Men are created equal…

The notion of equality of human beings in the Enlightenment did not mean that everyone was the same; that is, equal in physical, intellectual, and moral character. Neither did it mean that all should be leveled to the same economic status. The operative word here is “created.” Its use in this context means that no one is given any special status in relation to others merely by the accident of birth. The Aristotelian description of the universe included the Great Chain of Being. This construct held that there is an order, from God in heaven down to the inanimate rocks, in which every species of being has a place. In the human order, the King and his nobles have their places at the top, and the peasants and serfs have theirs at the bottom, with different levels of status or importance in between. The Chain of Being was not a ladder, and one’s place was immutable. It was a crime, or a sin, to attempt to rise above or sink below the status to which one was born. The Seventeenth Century doctrine of the Divine Right of Kings was a logical deduction from the concept of a Chain of Being: the King was God’s lieutenant on earth. The Enlightenment broke with that concept and declared that there was no inherent aristocracy based merely on the accident of birth. The turmoil of the English Civil War, the Restoration, and then the Glorious Revolution broke the chain in England by the end of the Seventeenth Century. It would persist in France until that country’s Revolution a hundred years later – not long after the American colonies won their independence, with a little help from their friend France, ironically while still under the ancien regime.

… that they are endowed by their Creator with certain unalienable Rights,…

The quality of being human means that there are rights which are given by God, the Creator of nature and the universe, which cannot be abrogated by the whim of human authority. Those rights may be forfeited, but only by conduct of the individual as rationally determined in the due process and course of valid law.

… that among these are Life, Liberty, and the Pursuit of Happiness –…

This enumeration of rights is declared, by the use of the word “among,” not to be exclusive; these rights are the basis of, and imply, others. It comes directly from John Locke’s formulation that when human beings are in a state of nature, these are their individual rights. Locke’s Second Treatise of Government termed these rights the protection of “life, liberty, and estate” – estate being interchangeable with the concept of property. Jefferson changed “estate” or “property” to the “pursuit of happiness,” which included the right to possess and enjoy property but was broader in scope. The proposition that the pursuit of happiness was a fundamental right was revolutionary in itself. Almost from time immemorial, and certainly in the Christian tradition at least until the Reformation, life on earth was not supposed to be happy. Life was an arduous journey through a vale of tears on the way to an afterlife of happiness, or punishment, depending on how one conducted oneself in this world. The virtues were self-denial and mortification, not enjoyment or the betterment of living standards and conditions here on earth. This, of course, was a doctrine that kept the Great Chain of Being intact, as well as the hoi polloi in line. Individual liberty was nonexistent, because the individual person was subject to the collective, that is, the King or state. One’s life as well belonged to the same sovereign.

That to secure these Rights, Governments are instituted among Men, deriving their just Powers from the Consent of the Governed…

Legitimate governments are created by the consent of the people, not imposed from the top down. The people, who in a state of nature are unable to adequately protect their lives, their liberty, and the ability to pursue happiness, including the protection of their private property, form a government for this purpose. In order to accomplish these ends, certain aspects of the fundamental rights are limited, and ceded to a constituted authority by consent, whose primary – and only legitimate – function is to secure the essence of those rights.

… That whenever any form of Government becomes destructive of these Ends, it is the Right of the People to alter or abolish it, and to institute new Government, laying its Foundation on such Principles, and organizing its Powers in such Form, as to them shall seem most likely to effect their Safety and Happiness.

A government can become destructive of life, liberty, and the pursuit of happiness. In fact, the use of the word “whenever” seems to imply that it is inevitable that at some point the government will become destructive of such. It is part and parcel of these unalienable rights for the people to alter or abolish it, and create a new government. A new government, however, must be instituted on the core principles stated.

Prudence, indeed, will dictate that Governments long established should not be changed for light and transient Causes; and accordingly all Experience hath shewn, that Mankind are more disposed to suffer, while Evils are sufferable, than to right themselves by abolishing the Forms to which they are accustomed.

When a government has existed for a long time, some deference must be given to it for that quality alone. Stability and just expectations are aspects of the unalienable rights, and must themselves be respected. This passage recognizes that governments will not be perfect, and that there may be better ways of protecting life, liberty, and the pursuit of happiness at various times and under different conditions. Just because a government acts in an imperfect manner temporarily is no reason to take the drastic step of abolishing it. The phrase that “mankind are more disposed to suffer, while evils are sufferable” echoes Shakespeare’s observation in Hamlet that we “rather bear those ills we have than fly to others that we know not of.”

But when a long Train of Abuses and Usurpations, pursuing invariably the same Object, evinces a Design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government and to provide new Guards for their future Security.

This sentence states the conditions necessary to overcome the presumption that governments long established should not be changed. When they are fulfilled, revolution becomes a right – and duty.

Such has been the patient Sufferance of these Colonies; and such is now the Necessity which constrains them to alter their former Systems of Government. The History of the present King of Great-Britain is a History of repeated Injuries and Usurpations, having in direct Object the Establishment of an absolute Tyranny over these States. To prove this, let Facts be submitted to a candid World.

Thus begins the justification for Revolution and nature of the grievances against British rule. It is followed by a bill of particulars containing twenty-seven specific grievances committed by the Crown, personified by King George III. If one studies the Constitution later drafted and ratified, it is possible to find a provision there which addresses nearly every one of the complaints found in this bill of particulars.

In every stage of these Oppressions we have Petitioned for Redress in the most humble terms: our repeated Petitions have been answered only by repeated Injury. A Prince, whose Character is thus marked by every act which may define a Tyrant, is unfit to be the Ruler of a free People.

After the bill of particulars, the Declaration provides additional justification for independence by asserting that the people of the American colonies have repeatedly brought their grievances to the attention of the King and his ministers, to no avail, receiving only further injury in return.

Nor have we been wanting in Attentions to our British Brethren. We have warned them from Time to Time of attempts by their Legislature to extend an unwarrantable Jurisdiction over us. We have reminded them of the Circumstances of our Emigration and Settlement here. We have appealed to their native Justice and Magnanimity, and we have conjured them by the Ties of our common Kindred to disavow these Usurpations, which would inevitably interrupt our Connections and Correspondence. They too have been deaf to the Voice of Justice and of Consanguinity. We must, therefore, acquiesce in the Necessity, which denounces our Separation, and hold them, as we hold the rest of Mankind, Enemies in War, in Peace, Friends.

This penultimate paragraph reminds the people of Great Britain that the American colonists have notified them of these grievances, and that they have nevertheless done nothing to prevail upon Parliament and the King’s ministers to change policy and redress them. It concludes by defining the relations the Americans will henceforth have with the British: a separate and equal station among the other nations of the Earth, enemies in war, friends in peace.

“We, therefore, the Representatives of the united States of America, in General Congress, assembled, appealing to the Supreme Judge of the World for the Rectitude of Our Intentions, do, in the Name, and by the Authority of the good People of these Colonies, solemnly Publish and Declare, that these United Colonies are, and of right ought to be Free and Independent States; that they are absolved from all Allegiance to the British Crown and that all political Connection between them and the State of Great-Britain, is and ought to be totally dissolved; and that as Free and Independent States, they have full Power to levy War, conclude Peace, contract Alliances, establish Commerce and do all other Acts and Things which Independent States may of right do. And for the support of this Declaration, with a firm Reliance on the Protection of divine Providence, we mutually pledge to each other our Lives, Our Fortunes, and Our Sacred Honor.

The use of the lower case “united” indicates that each of the new entities is a separate State, though united in purpose. Unification would come later, and would remain tenuous in many ways, even unto this day. The Declaration appeals to God as a witness, but is made in the name of the “good People of these Colonies,” who are to be the sovereign. Divine Providence will protect them. The signers pledged their “sacred honor,” the most precious possession of an Enlightenment man. As for their lives and fortunes, they were aware that they were committing treason against the British Crown, an offense subject to the severest of penalties.

Seven long years of war and privation passed before Great Britain, in the Treaty of Paris, gave up all claim of sovereignty over its former colonies, and the words written and approved by the patriots in Philadelphia two hundred and forty-four years ago finally became a reality. They remain so to this day: perhaps imperfect, but there is nothing better. Indeed, there is nothing like it in the world.

Note: The writings of Professor Alan Charles Kors of the University of Pennsylvania, who is the editor of Encyclopedia of the Enlightenment, gave me the idea for this essay. He teaches 17th & 18th Century intellectual history. Professor Kors is one of the founders of the Foundation for Individual Rights in Education (FIRE), a watchdog organization that fights the increasing denial of freedom of expression on American college campuses.

Are you kidding me?

Abolish/defund the police? Consider the following observation from a writer who has observed and commented on the police for fifty years.

“The traffic on Revolución [street in Tijuana, Mexico] was nearly bumper-locked. Young Americans hanging out of car windows were whistling, clapping, yelling, thumping on car doors, flipping the bird at pedestrians, cutting off cars, mooning any female older than 12, and spewing the contents of their stomachs onto the streets of a country they considered third-rate and Third World. In short, it was a scene that might be replayed in just about any U.S. city if the police were underpaid, underfunded, undermanned, undermined, and as desperately corrupt as the police of Tijuana.” — Joseph Wambaugh, Finnegan’s Week (1992).

Of course, uninhibited carousing, fueled by alcohol and other chemicals, nearly always results in violence to persons and property, often including bystanders.

70 Cold Winters

70 years ago today, June 25, 1950, North Korean forces, supported by Stalin’s Soviet Union and Mao Zedong’s newly established People’s Republic of China, invaded the Republic of Korea (ROK), which, because of the Cold War, still had American and other allied forces stationed there. The North Korean (NK) Army advanced down almost the entire length of the peninsula and appeared just about ready to overrun and conquer the entirety of the ROK.

President Harry S Truman responded almost immediately, and the American military in Korea and occupied Japan, together with ROK army units, established a defensive perimeter around an area centered on the southern port of Pusan. The United States moved the UN Security Council to authorize a United Nations response to this aggression. Because the Soviet Union was boycotting the Security Council over other issues at the time, it was not able to use its veto against the UN resolution. Thus the United Nations Command was established and authorized to respond militarily.

In September of that year, UN forces, now under the leadership of General Douglas MacArthur, broke out of the Pusan perimeter and simultaneously landed at Inchon, near the mouth of the Han River and the ROK capital of Seoul, subsequently routing the NK forces.

By November, American soldiers, Marines, and allies — mainly British — had pushed the NK Army all the way out of the South and almost to the Chinese border at the Yalu River. Then Chinese forces, ostensibly “volunteers” but actually Mao Zedong’s People’s Liberation Army, counterattacked and drove the UN forces out of the north and back into the South, where the UN established another defensive line south of Seoul. The full weight of the Americans and their allies pushed the NK/PRC forces back into the north by the spring of 1951. After that came a stalemate that lasted two more years, until a shaky armistice was signed in July 1953, establishing a demilitarized zone that remains in place to this day.

The attempt of the North Korean leader at the time, Kim Il Sung, to unify Korea by force failed. But Korea remained divided, and still is. In the interim, the north has been ruled by Kim (until his death in 1994), his son Kim Jong-il (1994 – 2011), and his grandson Kim Jong-un (2011 – present). It has been, and is, a totalitarian state that makes George Orwell’s Oceania pale by comparison. It continues to exist only by virtue of Chinese support and its acquisition of nuclear weapons, which the United States and other nations should never have allowed to happen. (The Soviet Union also provided a measure of support until it collapsed in 1990-91. Was that collapse an opportunity missed?)

For comprehensive histories of the war, see T. R. Fehrenbach’s This Kind of War (1963) and David Halberstam’s The Coldest Winter (2007). Fehrenbach, a noted Texas historian (Lone Star and Fire and Blood, histories of Texas and Mexico, respectively), served as an infantry platoon leader in the Korean War. Halberstam was a journalist who covered both the Korean and Vietnam wars.
Also see: https://bobreagan13.com/2017/04/15/now-hes-too-rich-to-kill/