South Carolina’s Iconic Palmetto and Crescent Flag
Speaking as one who has harbored a fascination with flags and symbolism for as long as I can remember, I have been intrigued with the amount of time legislators in several states have put into resolving issues related to their state’s symbols, notably flags.
Granted, the bulk of these challenges have been taken up by Southern legislatures as they confront the sundry issues associated with the symbolism of the very late Confederate States of America, some of which is incorporated into their state symbols.
Within the last generation two Southern states, Georgia and Mississippi, have undertaken wholesale revisions of their state flags, though Georgia opted in the end to retain a design inspired directly by the Confederacy’s first national flag, the Stars and Bars.
More recently, South Carolina is dealing with what could prove to be one of the most vexing challenges of all: settling on a standard for the state’s iconic Palmetto and Crescent symbol. Fortunately for South Carolina, this symbol predates the Confederacy and stems from the state’s distant Revolutionary past.
As it happens, the Palmetto and Crescent flag hoisted daily over the Statehouse bears a somewhat different design from those displayed in the House and Senate chambers and the governor’s office.
Complicating matters is the fact that South Carolina, like many other frugal state governments, relies on private manufacturers to supply the flags it displays in official offices and on public grounds. And because the Palmetto and Crescent symbol never had been standardized, these companies supply different versions.
Consequently, the Legislature is now being challenged to adopt a standardized version of the banner, a task that has proven more challenging than any of the legislators anticipated.
Speaking as a proud Alabamian, I have to concede that I envy South Carolina immensely. No other state can hold a candle to the Palmetto and Crescent, except Texas, which possesses the nation’s most iconic state symbol, the Lone Star flag, recognized the world over.
If only Alabama’s Yellowhammer carried as much symbolic punch. But alas, it is rooted in the Old Confederacy and sooner or later will be consigned to extinction – the symbol, not the bird – much like the several Alabama college administrative buildings bearing the name of Gov. Bibb Graves, a noted educational reformer who also maintained KKK membership.
Whatever the case, I do think that the recent dust-up over the Palmetto and Crescent may prove highly instructive from a cultural standpoint.
In the face of an increasingly fraying American identity, state flags and symbolism are likely to become more significant in the future.
Way back in the mid-1970s, my classmates and I at Russellville Junior High School were blessed with an unusually gifted and dedicated 8th grade history teacher named Mary Alexander.
Mrs. Alexander, now long deceased, loved pointing out the irony of history, particularly in terms of how facets of it – whether these happened to be political or cultural ideals or ways of doing things – often re-expressed themselves at times when we least expected them, even when we thought that they had become discredited or simply had played out.
I never forgot her lesson. Indeed, as an avid reader of history, I am reminded of it on a frequent basis. Just when we think that some ideas have been discredited or forgotten and, consequently, consigned to history’s ash bin, they return with a vengeance, often with the same sense of vibrancy and relevance that distinguished them in previous decades or even centuries.
The rekindling of American federalism and even, perish the thought, states’ rights serves as an unusually timely example. I grew up at a time when federalism expressed as state sovereignty seemed thoroughly discredited. What seemed to have been an inexorable march toward human progress – LBJ’s Great Society programs, locked arm in arm with the civil rights struggle and the federal courts’ efforts to expunge the stigma of racial discrimination – seemed to have dealt, if not a fatal blow to states’ rights, at least a searing defeat that would leave this constitutional doctrine in what amounted to a semi-comatose state.
We were assured by teachers at every level of public education that states’ rights was a relic of the past – not just a quaint one but even a disquieting one. I recall several political science courses in which the professor, a Great Society liberal, likened federalism to a marble cake. The federal government was the cake, though the states provided a measure of enhancement, sort of like chocolate marbling.
Yet, history seems to be repeating itself with a vengeance. In the face of American federal impasse and national division, states, large and small alike, are reasserting themselves. As I have pointed out on numerous occasions in this forum, it started more than a decade ago when then-California Gov. Arnold Schwarzenegger began characterizing his state as something resembling a nation within a nation. His successor, Jerry Brown, even began conducting a kind of incipient foreign policy related to climate change.
Recently, a prominent GOP leader, Allen West, has lobbied for a secession vote in the Texas Legislature, a move that at least one GOP leader in another Western state characterizes not only as a positive one but also one that bears close watching.
More recently in Oregon, state Sen. Jeff Golden (D-Ashland) has proposed legislation that would reintroduce a state bank concept for Oregon, primarily with the aim of serving as a backstop for community banks and credit unions.
Golden holds up the Bank of North Dakota as the model for his efforts, stressing the role that this bank played in minimizing foreclosures during the Great Recession.
The Washington Post reports that small businesses in North Dakota were ably served by this model compared with their counterparts in other states. In fact, they secured more Paycheck Protection Program (PPP) loans relative to the state’s workforce than businesses in any other state – more than $5,000 per private-sector worker as of May 8, 2020.
Yet, why is all of this surprising? States, by their very nature, possess the accoutrements of nationhood. And this is as much a matter of practicality as a historical fact.
As a student of constitutional history, I find this not only fascinating but also instructive in terms of how it underscores the increasing inefficiency of centralized federalism. If developments such as these demonstrate one thing, it’s that no central government – certainly not one as big, bloated and overextended as the imperial behemoth in Washington – possesses the omnicompetence to manage a polity of the scale of the United States.
The late Mrs. Alexander was spot on: History does repeat itself.
Confederate Provisional Congress Meeting in Montgomery, AL, in 1861
I wondered how much longer it would be before the Confederate Constitution, much like Confederate statues, would fall victim to cancel culture. Quite honestly, though, I don’t know what is more maddening: cancel culture or the intellectual laziness evinced by journalists, even relatively elite ones, who, either intentionally or unintentionally, aid and abet this malignant cultural trend.
AP journalist Jay Reeves characterizes the Confederate Constitution, which, incidentally, was debated and drafted in the Capitol in Montgomery in my native state of Alabama, as a vestige of white supremacy without even bothering to consider the document within its full historical context. And let’s make no mistake here: The Permanent Confederate Constitution was conceived within a wide intellectual and historical Anglo-American constitutional context and, for that reason alone, is worthy of serious discussion, despite its provisions safeguarding the institution of slavery.
It is appalling to me that Reeves never even bothered to explore this unusually rich context, which would have been standard practice among journalists as recently as a decade ago.
A Watershed Document
Before public discourse became so poisoned, the Confederate Constitution, despite the controversy associated with it, would have been characterized by some writers and academics as a watershed document, one that represented the outcome of a protracted, intense and often acrimonious debate on the nature and scope of federal power that began immediately following the drafting of the U.S. Constitution in 1787.
The Permanent Confederate Constitution could be accurately characterized as embodying the Jeffersonian School argument, which maintains that the federal government – the “general government,” as it was characterized by many in the decades following constitutional ratification – simply functioned as the agent of the contracting sovereign states. This was underscored by the Confederate Constitution’s preamble, which affirmed that each state, in ratifying the document, was acting in its “sovereign and independent character.”
Aside from reaffirming the Jeffersonian view of federal power, this revised constitution also introduced some remarkable innovations that not only are instructive today but that still hold currency as contemporary Americans struggle to rein in federal power and, even more significantly, contend with mounting interest in sectionalism and even secession. Indeed, the case could be made that these innovations are especially relevant today amid new sectional divisions pitting predominantly liberal blue-coastal states against predominantly and implacably conservative red heartland states – issues not all that different from the ones that plagued federal relations in the early 19th century.
A Six-Year Presidency and a Line-Item Veto
One notable innovation was how the Confederate framers altered the office of the presidency, both limiting and strengthening it. While restricting the chief executive to a single six-year term, the Confederate Constitution also empowered him with a line-item veto. Such a constitutional prerogative potentially would have gone a long way toward reining in the Leviathan federal state, one that not only extends its hand into ever more facets of American life but even holds tremendous sway over the affairs of nations in far-flung corners of the world. Moreover, with such a constitutional safeguard, we likely wouldn’t be contending today with multitrillion-dollar deficits.
The constitution also prohibited Congress from levying protective tariffs that tended to benefit one section of the country over others, an issue that proved contentious in the formative stages of the young American Republic and that virtually rent it apart in the early 1830s.
The long-term effects of protective tariffs arguably have had an especially deleterious effect on the fortunes of American development and national cohesiveness, not only by allowing one section of the country, namely, the mercantile Northeast, to grow rich at the expense of most of the others but also by enabling it to transform much of the rest of the country, notably the war-ravaged, economically prostrate post-Civil War South, into an economic extraction zone.
Reining in Federal Judicial Power
In what arguably could be regarded as the most noteworthy innovation of them all, state legislatures were entitled to remove corrupt or constitutionally unscrupulous federal judges living in their states by a two-thirds vote of both houses. Ponder for a moment all of the contentious 21st century issues that could have been resolved by this provision. It would have obviated the need for state legislatures to resort to strategies such as interposition and nullification that contributed significantly to two serious constitutional crises stemming from passage of the Alien and Sedition Acts in 1798 and the Tariff Act of 1828. Each of these contributed significantly to the protracted political impasse that culminated in a national breakup in 1861. Even more significant, though, such a constitutional safeguard likely would have contributed significantly not only to higher levels of restraint in the judicial branch but also in the federal legislative branch, as lawmakers would have been more cognizant of the futility of passing laws that encroached on state sovereignty.
Yes, the Confederate Constitution was both innovative and instructive, one among a long line of written constitutions within the Anglo-American tradition, a line that also incorporates those of the Commonwealth realms. And that is why it, along with others, should figure prominently in any undergraduate or graduate coursework dealing with the protracted historical debate about the nature and scope of central power within a federal system. But like so much else in woke 21st century America, the Confederate States Constitution is now so thoroughly tainted by the stigma of white supremacy that it can never be regarded as anything more than a “forgotten relic of an ignoble cause,” borrowing Reeves’ description, and, consequently, should remain locked away in an archive and forgotten.
This only ensures that substantive debate in this country will grow even more constrained. But, of course, by now it should have dawned on most of us that this is one of the underlying aims of wokeness and cancel culture, which aren’t so much about fairness and inclusiveness as they are about stigmatizing views that threaten their hegemonic standing within American politics and culture.
Reeves’ article only served to underscore that we no longer function as a vibrant, open and free society, only one that pretends to be. And many of us are beginning to wonder how much longer elites, ever more confident of the political and cultural power they increasingly wield, will bother with maintaining this pretension.
Remember mad anchorman Howard Beale’s admonition in Network (1976) that television was the most “awesome goddamned propaganda force in the whole godless world!”? Well, the hapless Mr. Beale only got it partly right. The most awesome force in the world is American culture, a phenomenon that now drives much of the dialogue and culture throughout the Western world and beyond.
As we approach the second quarter of the 21st century, we’re seeing the cresting of a remarkable cultural force that was incubated by Washington’s victory at Yorktown and that has gained increasing levels of traction since the U.S. Civil War, World War I and particularly World War II, which placed this phenomenon at a particularly distinct advantage vis-a-vis its war-ravaged, materially depleted counterparts and erstwhile rivals in Western Europe.
A Pulverizing, Flattening Social Force
Indeed, looking back over the past 30 years following the collapse of Soviet communism, it’s worth recalling how many on the left finally concluded, however reluctantly, that American culture – with all the pulverizing, flattening effects associated with it – ultimately proved to be, paraphrasing Beale, history’s most awesome, radicalizing force in the whole godless world.
The whole world has been Americanized – and it leads one to wonder whether any ancient institution, including one of the most ancient of all, the British monarchy, is equipped to withstand this force over the course of time.
It’s fascinating to consider all of the subtle ways that this cultural force is playing out in every facet of modern life.
The Late Princess Diana Wasn’t British
Consider the late Princess Diana, who was eulogized at her very Americanized funeral by her brother, Earl Spencer, as a “very British girl.” Actually, she arguably wasn’t at all. Despite her very noble and very English pedigree, she embodied many of the aspirations of global American culture – a penchant for personal independence, self-expression and self-actualization.
Even her comparatively sober, responsible son, Prince William, the royal heir, has expressed his qualms about assuming the Windsor mantle, and in ways that sound, well, rather American. And this really isn’t all that new. The Duke of Windsor, the former Edward VIII, who always evinced a special affinity for American culture, even incorporating American slang into his casual discourse, possibly wedded a twice-divorced American social climber as a pretense for abandoning the British throne, likely because he, too, had been infected with American notions of personal independence.
Now the monarchy is imperiled once again by an even more explicit expression of this awesome cultural force: a grasping, B-list American actress whose personal agenda has been hiding in plain sight for the past four years, one that puts her late mother-in-law’s rather ill-defined and hastily improvised agenda to shame.
A Fresh Face among Staid but Stuffy, Lily-White In-Laws
Some royal watchers speculate that Meghan initially harbored a desire to transform the monarchy from within, carving out her own distinct royal identity amid her staid but rather stuffy, lily-white in-laws. She aspired to be the fresh face among the Windsor clan, not only equipped to energize this thousand-year-old institution but one who, over the course of time, would be regarded as so valuable and indispensable to the Crown’s long-term success that she would be afforded the opportunity to establish her own distinct style and agenda.
Predictably and in remarkably short order, she realized that this ancient institution operated by its own time-honored and distinctly rigid rules. The full weight of this new reality fell on her carefully sculpted shoulders: She had been assigned a non-negotiable set of job responsibilities that not only diverted her from her personal career aspirations but also effectively consigned her to what amounted to gilded oblivion – a mere face and a tightly constrained voice, consoled only by the knowledge that her fate was shared with the world’s wealthiest and most exclusive family.
She balked, predictably resorting to American arguments about one’s being entitled to happiness, self-actualization and self-expression.
Then followed disruption – the quintessential American desire for separation and a fresh start, albeit with the tacit understanding that she would continue to profit from Windsor family connections.
While this likely amounted to a deviation from her original plan, Meghan, with poor, dimwitted Harry in tow, had drawn closer to her goal of carving out a sort of semi-autonomous woke Windsor counter-monarchy, one in which she could fuse the legacies of Princess Diana, Mother Teresa, Rosa Parks and the theology of Oprahism into a neatly crafted, compellingly new alternative brand.
In time, though, elements of Plan B proved to be as problematic as Plan A, notably in running up against the Royal Family’s obstinate refusal to allow the Sussexes to profit from their residual royal ties.
A Scorched-Earth Plan C
From the Sussexes’ swanky digs in exclusive Montecito, California, Meghan improvised a new, scorched-earth Plan C, turning wokeness up to full throttle, accusing the Windsors of mental abuse (one of the standby American strategies in divorces and HR disputes) and even characterizing them as a vestige of white supremacy.
Granted, millions of people see through this grifting, grasping woman and have from the very start. But millions of others have predictably swallowed this Cinderella narrative hook, line and sinker, just as they did Diana’s version. Moreover, there are plenty of facets of woke elite culture, particularly within the corporate sector, that very well may lend a lucrative helping hand to the Sussexes over time. And that is precisely the outcome that Meghan planned for and expects. And she likely will be proven right.
So, it’s entirely possible, if not likely, that Meghan will be remembered generations from now as a truly singular historical figure: not only as a fashion icon and trendsetter but even as a dynastic matriarch of sorts – the founder of a new, radically chic form of monarchy, one that represents the culmination of global American culture, leavened by a heaping serving of wokeism.
Protesters toppling a statue of Shah Mohammad Reza in Iran in 1978.
If memory serves, I’ve mentioned French philosopher Etienne de la Boetie a time or two in this forum.
His observations about how the fortunes of government, any government, no matter how democratic or authoritarian, ultimately rest on the sentiments of its subjects, invariably remind me of the tumultuous events culminating in the overthrow of Shah Mohammad Reza during Iran’s 1979 Islamic Revolution.
The ways that the Shah’s besieged caretaker government under Shapour Bakhtiar desperately clung to power following the Shah’s hasty departure – issuing edict after edict, proclamation after proclamation, in the forlorn hope of reining in revolutionary discontent – would have resounded with de la Boetie. He even coined a succinct phrase, one employed by paleo-libertarian writers time and again to describe those rare inflection points in history when a large segment of a society’s population simply has had enough, writing off governmental authority as utterly debased, illegitimate and unentitled to obedience, despite the potentially deadly consequences this behavior often invites. I have racked my brain for years and still can’t recall the phrase, though it brilliantly conveyed the essence of this historical inflection point, which invariably portends an abrupt, irrevocable break with the old order.
As de la Boetie would have anticipated, Bakhtiar’s efforts amounted to nothing, as millions of rank-and-file Iranians, obstinately ignoring all of them, pushed ahead with insurrection. Iran had reached an inflection point of popular discontent, one that bore close parallels to the descriptions of popular disillusionment that de la Boetie supplied in his own writings.
I was a high school student way back in 1979, too intellectually unsophisticated at the time to grasp the full implications of what was unfolding in Iran. But I possessed at least enough insight to discern that some sort of line had been crossed. And I also suspected that it marked not only a significant historical departure for ordinary Iranians but also a monumental shift in the geopolitical balance – namely, the ways the United States subsequently ordered its affairs in this tumultuous region.
Granted, most Americans of the time held no sympathy for radical Islam and knew that what followed would impose significant hardship on the Iranian people. But based on all the facts that we were able to garner at the time through broadcast and print media – this, after all, was almost a full generation before the advent of digital media – many of us knew that longstanding American support for the hated Pahlavi regime was a significant driving factor behind this uprising.
Empires, especially global ones, require client states, and the Shah’s regime served American interests in a variety of ways, despite their running counter to the aspirations of millions of ordinary Iranians, especially those in rural locales, far removed from the material prosperity unfolding in Iranian cities.
To be sure, “dark forces,” notably the Soviet Union, may have been working behind the scenes to exacerbate these social, cultural and political cleavages, but I, for one, still believed that the raging anger of the Iranians was rooted in genuine grievance. Yet, who could ever have imagined that this conflagration ultimately would lead months later to the storming of the American Embassy in Teheran?
By that time I had graduated from high school and enrolled in college to earn a political science degree. I can still recall almost verbatim how one professor described the embassy occupation as an event of profound geopolitical significance, one that likely would be remembered many years later as one of the watershed events of the post-war 20th century. He was right: Iran’s Islamic Revolution marked a significant reformulation of American strategy in the Middle East, one that would be followed by an immense expenditure of American blood and wealth.
The Pahlavi regime’s collapse foreshadowed not only the erosion of American influence in that region but also the decline of the comparatively short-lived American Empire, which had been hastily improvised little more than a quarter century earlier to fill the breach left by a beleaguered British Empire in the aftermath of World War II.
Americans were in store for a long and arduous journey, though one punctuated by the assurances of U.S. governing elites that all setbacks were only temporary and that the expenditure of American blood, wealth and geopolitical capital to contain and ultimately to reverse the viral eruption of Islamic radicalism would tip the scales, drawing us finally toward a new flourishing of the American-fostered liberal-democratic imperium, in which democracy and secularism finally would take root and thrive in previously inhospitable Mideastern soil.
We know better now – at least, growing numbers of us do. And we also perceive how this vast expenditure of blood and treasure in this region of the world has sapped American strength not only abroad but also at home, embodied in the decaying infrastructure and boarded storefronts as well as in the social pathology and breakdown evident in so many small cities and towns across the vast American heartland. Tens of millions also perceive how our elites, increasingly exposed, cornered and threatened as a result of the wind they sowed decades ago, have turned to the same desperate tactics to which previous ruling classes have resorted in the face of imperial decline and rising levels of discontent.
Our rulers and their media enablers characterize the occupation of the U.S. Capitol in January as essentially QAnon conspiracy fearmongering run amok. Millions of us aren’t buying it. We even suspect that decades from now, this event very well may be recalled as a turning point, perhaps even as the harbinger of a de la Boetiean-style watershed event in America not that far removed from what transpired in Iran more than two generations ago. Indeed, for tens of millions of us, this event only served to shine a light on the perfervid anger of millions of rank-and-file Americans, not only over the rot that has set into many, if not most, of this country’s political and cultural institutions but also over the ways that our governing class and their enablers (e.g., academia, media and Silicon Valley) have contributed immensely to it.
Borrowed for purposes of illustration from the Dishcast with Andrew Sullivan
I have been intrigued with the recent behavior of Andrew Sullivan, one of the most innovative and gifted political commentators of the age.
Note in this column Michael Anton’s description of how exasperated Sullivan, an ardent NeverTrumper and self-described conservative (albeit of the wet Tory variety), became during a podcast interview in the face of interviewee Anton’s refusal to acknowledge the validity of the 2020 election outcome. Sullivan would abide none of this and, over the course of the interview, kept dragging Anton back to the topic.
Personally, I think that Sullivan’s exasperation with this topic possibly provides a fascinating glimpse into the soul of the American political cognoscenti, especially those in the thinning ranks of the centrist camp, of which Sullivan is the most conspicuous and talented member.
Anton is only one of several commentators who have pointed out the fractiousness with which Sullivan and other political commentators have treated those who have summoned the temerity to question the validity of the election outcome. But then, why wouldn’t they?
For at least the past century and a half, most Americans have regarded their country as one of humanity’s singular achievements, one built significantly, if not entirely, on the basis of ideals rather than from the Old World ingredients of language, culture and ethnicity. And this narrative typically has also encompassed the argument that this singularity has been sustained – backstopped – by governing institutions, notably an electoral system that, at least until the last few election cycles, has set a benchmark not only for every other Western constitutional democracy but also for nations that aspire to lofty standards of governance.
This singularity has supplied much of the adhesive that has held this country together for at least the past century, especially following the tidal wave of immigration from Eastern and Southern Europe in the late 19th century, which threatened to dilute the moorings that previously had connected the country to its strong Anglo-Saxon cultural and political legacies.
In the face of this rapid demographic transformation, American intellectuals began improvising an updated national identity that over time was expressed as propositional nationhood. It is grounded on the premise, foreshadowed in the Gettysburg Address, that America derives its identity from the ideals outlined in the Declaration of Independence and that these are sustained by a rigid adherence to the rule of law. Many liberals and a few conservatives would contend that this improvisation has worked reasonably well, at least, until recently.
Yet, cultural and political upheavals since the 2016 Trump election upset have drawn growing numbers of Americans on both ends of the political spectrum to question whether these idealistic foundations have frayed to the point of threadbareness.
While it’s impossible to discern an individual’s motives, I suspect that Sullivan is one among several in the elite punditry who harbor serious misgivings about what is unfolding in America. After all, Sullivan, a Briton by birth, is a naturalized American who has affirmed more than once in his columns and blogs how the idealistic underpinnings of American national identity ultimately inspired him to acquire citizenship.
Yet, I wonder if this enthusiasm has been beset recently with the same gnawing doubts that have gripped other Americans. Sullivan is no naif by any stretch of the imagination. He has demonstrated time and again in his commentary not only vast erudition but also a highly nuanced understanding of virtually every prevailing political trend.
Over the course of his wide reading, he’s undoubtedly encountered Czech playwright and later Czechoslovak President Vaclav Havel’s seminal essay “The Power of the Powerless,” wherein Havel likens the embattled Czechoslovak Communist regime and its underpinning ideology to a hermetically sealed package prone to rapid spoilage at the mere prick of the seal.
A time or two I’ve wondered if Sullivan, pondering the parlous state of American unity, has been reminded of this seminal essay and noted parallels with present-day America.
I readily confess that I have.
Sullivan undoubtedly understands that a nation such as the United States, founded on and sustained largely by abstract ideals, survives as a functioning constitutional democracy only so long as the majority of its citizens evince faith in those ideals.
What if the spoilage described by Havel is ultimately setting into American idealism? Likewise, what happens if growing numbers of Americans no longer express confidence in these ideals? What if they come to the point of openly expressing doubts that these ideals still comprise an adequate basis for American unity?
To be sure, Sullivan’s exasperation with Anton may simply have been a means of reinforcing his standing as an Establishment commentator standing at the temperate center of American elite discourse. And who can blame him? Sullivan has no incentive to run afoul of elite media, despite the fact that it’s increasingly evincing proto-totalitarian traits. After all, where could a gay man with an Oxbridge/Ivy League educational pedigree possibly go?
Still, I doubt that I’m the only one who has closely followed Sullivan’s career and noted his recent behavior. He’s too smart and perceptive to ignore the specter that is haunting America: the increasingly evident doubt among millions of Americans about the efficacy of this nation’s ideals as well as the institutions charged with sustaining national unity.
Maybe this accounts for Sullivan’s recent exasperated podcast exchange with a defiant Michael Anton.
There has been a lot of chatter lately within conservative and libertarian circles about the increasing dysfunction that has set into our judicial branch, which, however ill-advisedly, now regards itself as the Union’s defender of last resort.
Lots to unpack here but I’ll return to something that I have argued before in this forum – something that was driven home to me years ago reading British constitutional scholar James Bryce’s appraisal of the American constitutional system in his classic tome The American Commonwealth, first published in 1888. Even way back then, Bryce had perceived how dysfunctional and unwieldy the federal legislative branch had become in the face of the nation’s rapid demographic and geographic expansion.
By the late 19th century it was impossible for the House of Representatives to function as a bona fide legislative assembly. Virtually all of its vital daily work was conducted via committee, with all of the backroom Machiavellianism this entailed. Meanwhile, the Senate had grown far beyond its ability to function as a comparatively small, elite advisory council to the executive branch, as conceived by the constitutional framers.
By the late 19th century the judicial branch, embodied in most American minds then and now as the Supreme Court – one given comparatively short shrift by the Constitution’s framers – was poised for its ascent to the commanding heights of American politics and culture.
Its earliest custodians, notably Chief Justice John Marshall, had, like all elites in virtually all political systems throughout history, engineered the first tenuous steps toward an accretion of power beginning with Marbury v. Madison. But even Marshall, careful to avoid overreach and the backlash that inevitably would follow from the majority Jeffersonian camp, stepped away from one especially contentious constitutional issue of the day, conceding, however reluctantly, that the recently enacted Bill of Rights applied only to the federal government, not to the states.
The most libertarian- and constitutionalist-minded of early American statesmen expressed qualms about enacting an explicit statement of rights, fearing that it ultimately would be construed by Congress or the courts as affecting state as well as federal authority.
These fears rather predictably proved prescient, following the post-Civil War passage of three constitutional amendments – the 13th, 14th and 15th – that set the Supreme Court firmly on the path toward the enunciation of the Incorporation Doctrine, which effectively worked to erode the states’ sovereignty, reducing them to de facto provinces.
Equally significant, though, is how the Supreme Court has employed the Incorporation Doctrine with many subsequent expansionist rulings in a manner that essentially has transformed it into a de facto supreme governing council – effectively, the American Union’s final arbiter.
What many observers surprisingly overlook – no doubt intentionally in the vast majority of instances – is that the court employs enhanced powers partly to compensate for the dysfunction of the legislative branch, which the Framers regarded as the wellspring of federal policy, not to mention the branch charged with safeguarding the balance between state sovereignty and that which had been delegated – conditionally, it should be stressed – by the states to the federal government.
The behavior and public pronouncements of the current Supreme Court Chief Justice John Roberts and his immediate predecessors seem to reflect this fact. The case could be made that the court has been aware for decades of the role it has served, however unconstitutional, in shoring up the deep dysfunctionality of the legislative branch, one whose efficacy has been badly eroded within the past century and a half but especially in the years after World War II, when the United States emerged as a global empire.
Yet, increasingly, the Court finds itself hemmed in, if not trapped, by the demographic and cultural changes overtaking the country, many of which are of its own making. One recent example: its decision following the 2020 election not to hear the lawsuit challenging late changes to Pennsylvania’s election process.
Despite a thunderous dissent by Justice Clarence Thomas, two justices previously regarded as being in the tank for the right, Brett Kavanaugh and Amy Coney Barrett, voted with the majority. And why should we find that at all surprising? Given the way the Mainstream Media organs characterized Thomas’ opinion as dissent bordering on sedition, it’s easy to discern why a court that they regard as a majority-conservative one has gotten into the habit of carefully hedging its bets.
SCOTUS, to employ one of the Orwellian Newspeak-style terms that characterize so much of cultural and political discourse nowadays, is walking an increasingly thin tightrope. It carries on what it undoubtedly regards as a lofty and valiant struggle to safeguard not only a dysfunctional legislative branch but an increasingly divided, if not fraying, American Union. Yet, as a marginally conservative court, regarded as illegitimate by many, if not most, of our Mandarin class entirely for that reason, it imposes limits on the manner in which it weighs in on the most pressing issues of the day.
This amounts to one of the most remarkable ironies in U.S. political history: The judicial branch that, at least for the last century, has regarded itself as the panel of last resort and that has played a major role in the sweeping changes within American society now feels constrained and even threatened by this transformation – so threatened that it is now limiting its judicial activism.
This raises a troubling question: Who mans the rudder of state, certainly during an extreme national crisis? If the legislative and judicial branches have been rendered either too dysfunctional or too threatened to step in during a major upheaval, who will?
It serves as another reminder to me and many other red heartlanders of the precarious times in which we live.
Glad to know that they’re finally closing in on New York’s bloviating cad-in-chief: Governor Nipple Ring.
Even so, this late-served comeuppance only serves to expose CNN for what it is: late America’s version of agit-prop, actually not that much different from the media apparatus that served the ruling class of the late Soviet Union. And, yes, I know all about Fox News – Yada, Yada, Yada – but there’s one big difference.
Fox lacks the backing of the culturally hegemonic segments of American society – academia, Silicon Valley, and Big Entertainment, to name a few. It may be the voice of Con, Inc. – Big (K-Street) Conservatism – but it still lacks the cultural clout of what I’ve come to call #AmericanPravda.
“As the nation reckons with its racist history, legislation calling for the removal of Confederate commemorative works from national parkland is likely to be reconsidered this year,” solemnly writes Kim O’Connell of the National Parks Traveler.
She adds that “one might be forgiven for believing that the South won, based on a reading of the monuments alone.”
In that case, I’ll never set foot in a federal park again. I’ll even go a step further by expressing my fervent hope that young Southern men and women withdraw their support from the American imperial enterprise, opting not to serve in any branch of the American military – yes, refusing to support the geopolitical interests of a government that resembles less a constitutional republic and more a tyranny with each passing day and that, like many earlier empires, sustains its power by pitting one cultural segment of society against another.
What is conveniently ignored by writers such as O’Connell in the midst of this proto-totalitarian woke struggle is that national unity and the ultimate construction of what amounts to a global American empire was secured through the construction of thousands of such monuments in town squares, cemeteries and, yes, national parks in every corner of the vanquished Confederacy.
It ultimately was achieved only because the Northern conquerors concluded, however half-heartedly, that post-war unity was achievable only through an acknowledgement of the bravery and sacrifices of the Confederate fighting man.
Without this acknowledgment, the South very well could have ended up as the American version of Ireland or even the Balkans, a soft, vulnerable underbelly of an aspiring empire. And given where we are heading with all of this neo-Puritanical cleansing, we may end up with something resembling Northern Ireland during the Troubles or, even worse, the Balkans during Yugoslavia’s breakup.
John Calvin (1509-64), French theologian and reformer
Speaking as an amateur historian of frontier religious history, I find this development quite fascinating but not all that surprising: Baptists, in this case, presumably Southern Baptists, have returned to their pre-frontier roots, namely, Reformed Christianity. A couple of things stand out in the material that a new Reformed Baptist church in Alexander City, Alabama, has posted to its Web site, notably an allusion to Communion as a sacrament rather than an ordinance.
This represents a significant departure from the historic frontier Baptist and earlier Radical Protestant emphasis on Communion simply as a memorial of Christ’s atoning grace. Also, the comments are quite interesting, especially among those who discern this as a Baptist embrace of Protestantism.
There has always been a strongly held view among many Baptists, historically regarded as the Landmark tradition, that they represent the restoration of the New Testament Church – another interesting Baptist distinctive, though also shared among other movements, one that also dates back to pioneer settlement of the American Back Country. Many egalitarian-minded frontiersmen regarded settlement as an opportunity to abandon creeds and confessions and to set everything right by returning to the pristine attributes of the First Century Church.
However, earlier, pre-frontier Baptists had hewed to many of the teachings of Reformed Christianity, which is not surprising, considering that this was the regnant form of Protestantism not only in Britain but also in the American colonies in the 17th and 18th centuries.
History has demonstrated time and again that many movements, political and religious alike, have returned to facets of their original roots. Baptists, who have generally followed a divergent path over the 200 years following settlement of the American frontier, may prove no exception.
It’s interesting to consider the factors that have contributed to this. So-called New Testament Christianity, which gained traction during American frontier settlement, offered the advantage of lean messaging – at least, the case could be made that it did. Moreover, these teachings were exceptionally well-suited to a relatively unlettered, itinerant and rather culturally unrooted people who had become both intellectually and temperamentally untethered from the creeds and confessions that had prevailed in Europe and along the American coast.
Yet this lean messaging, popular and arguably practical within a frontier setting, seemed to many increasingly threadbare in late 20th century America in the face of rising levels of education and affluence and within a nation struggling with the demands of post-modernism as well as the complexities of a post-industrial, technological society. In fact, in 1977 a group of disaffected evangelical intellectuals, convinced that evangelical Christianity had become untethered from much of the substance – notably the creeds, confessions and liturgies – that had sustained the faith for two millennia, issued the Chicago Call, admonishing their fellow churchmen to return to the ancient teachings of the faith.
However, this effort largely fell on deaf ears, leaving many of these disaffected intellectuals to embark on the path to Rome, Constantinople, Lambeth and, in some cases, Geneva.
Now, as Christianity and particularly the evangelical faith seem more imperiled than ever, especially in the face of a rapidly mutating left that seems increasingly intent on subduing Christianity – at least, the conservative expressions of it – as a formative force in American life, many evangelicals likely will become more receptive than ever to admonishments such as the Chicago Call.
Indeed, evangelical Christians may feel stronger inclinations than ever to return to what growing numbers within their ranks perceive as more enduring foundations.