The incomparable Victor Davis Hanson weighs in again on the deeply disturbing trend of rising intolerance of whites.
He raises a point, one that has always fascinated me, about how rhetoric, particularly its political forms, is subject to endless mutation. And as he stresses, one of the most disturbing examples of this is reflected in the increasing use of the term “whiteness” among liberals and progressives to describe the underlying pathology within American society.
Until recently, the common term was white privilege, though it ultimately proved inadequate in the face of everyday reality. After all, as Hanson observes, it’s simply untenable to argue that “a white Dayton, Ohio tire-changer is innately blessed in a way an unfortunate Eric Holder or Jay-Z purportedly is not.” So, consequently, white privilege has given way simply to “whiteness,” which, needless to say, evokes disturbing parallels to “Jewishness,” of which the Nazis made such ready use in the years leading up to their seizure of power.
Plenty of ordinary people, certainly when engaged in private discourse with family and friends, readily discern the genocidal implications of this rhetoric.
Yet, there are legions of whites, even those from comparatively stereotypical “deplorable” socioeconomic backgrounds, who remain insouciant in the face of this rhetoric and the horrifying effects it likely will produce over the next few generations.
Indeed, reading Hanson’s account earlier this morning, I was inevitably reminded of a family of especially rabid “yellow-dog Democrats” in my native Northwest Alabama hometown who incongruously remain committed evangelical Christians but who still eagerly regurgitate whatever tripe their ancestral party puts in front of them.
They still hold maniacally to this identity even today as their region’s economy, largely as a result of their party’s neoliberal policies, has undergone headlong decline.
Northwest Alabama, once one of the country’s most obstinate bastions of yellow-dog Democratic sentiment, now stands as one of the reddest of red GOP bastions in the country, even as this family still waves the blue flags of dissent on social media. One even embarked on a pathetically obscene Trump rant a few months ago, invoking the f-bomb multiple times.
They are proverbial Kool-Aid drinkers. Will they ever be awakened to what is unfolding?
When their grandchildren are singled out some day in public places and beaten senseless merely for bearing white skin, they likely will still be eagerly, even perfervidly, propagating their party’s line.
But then, some people, irrespective of political conviction, will never yield to reality, especially when the truth proves too painful to accept.
I wrote a piece along very similar lines weeks ago, only this writer has said it much better.
In terms of the Supreme Court, we have witnessed a fascinating drama play out over the past two centuries. In many respects the heightened prestige of the court and, even more significant and troubling, our increasing reliance on it, speak volumes about the breakdown of American federalism. In many notable respects the court has come to redress the inefficiencies of the legislative branch, which the Founders envisioned as the principal, if not sole, source of domestic policy making.
However, the legislature simply is ill-equipped to serve a federal system so vastly extended and, frankly, so unwieldy and increasingly inefficient. Yet, as this columnist stresses, political and cultural divisions in this federal union are now so acute that the Supreme Court has to be extremely judicious about the issues it adjudicates, lest it destroy its remaining reservoirs of legitimacy.
The consequence has been increasing judicial branch impasse. And this raises the question: What element of federal power is capable of resolving what ultimately could prove to be an existential black swan crisis, one that even may involve the viability of the Federal Union?
This characteristically trenchant observation by Dietrich Bonhoeffer, the late Lutheran theologian and martyr to Nazism, speaks to an egregious failing of modern society, certainly the way it is expressed today in the digital age.
Simultaneously inspired and troubled by what I have witnessed over the past two generations, I have spent the last few years developing a concept I have come to call the “Networked Human Exoskeleton.”
I argue that not only human progress but also humanity itself sprang from a networking phenomenon that started with simple forms of technology (e.g., clubs and cutting tools) the effects of which over a long stretch of time were augmented by rudimentary language, followed much later by writing and mathematics.
Over many millennia the fusion of these four concepts not only shaped our hominin forebears into fully developed human beings but also enclosed our species into a kind of cocoon – an increasingly dense network that sustains and protects us but from which we are unable to escape. For several reasons, I settled on the term networked exoskeleton to convey the sense of ensconcement that characterizes our species. While some of it is quite tangible and reflected in the technology that we use, much of it is incorporeal in nature, though all of these elements are fused together to provide our species with the protection that corporeal exoskeletons provide crustaceans and insects.
Yet, by its very nature it is like no other exoskeleton on earth – a remarkable achievement that sets us apart from every other species on the planet. Its growth within the past seven decades following the end of the Second World War has been nothing short of astonishing. We now inhabit an exoskeleton that extends its conceptual reach into the deepest reaches of our planet’s oceans and even beyond our solar system.
Yet, our exoskeleton, despite its age and enormous complexity, essentially is no different from the survival strategy of any other species in the sense that it represents only an improvisation across a very long stretch of time. Like every other evolutionary strategy on earth, our networked exoskeleton is only approximately rather than ideally suited to our species’ needs. Indeed there may come a day decades, centuries or millennia from now in which the evolutionary strategy embodied in our exoskeleton runs into an effective brick wall.
As I have expressed a few times in this forum, we very well may be fast approaching such an impasse. For many millennia, human beings were governed by overarching narratives supplied by myth and religion or a mixture of the two.
However, within the last few centuries, largely through rapid advances in scientific knowledge, these over-arching narratives have undergone steady erosion, perhaps most aptly embodied in Nietzsche’s observation about our having killed God. Consequently, society is now ignorant of a great many moral and ethical insights that were regarded as essential to the functioning of a healthy society only a few decades ago.
The Bonhoeffer quote above expresses this unsavory fact of contemporary reality better than most. All the more troubling to me and many others is the fact that so many moral and ethical appeals are now lost within a network that has become so vastly extended and multifaceted. Indeed, it leads one to wonder if humanity will ever succeed in developing anything resembling a new over-arching narrative.
During the height of the Cold War, the expatriate Russian novelist and sage Alexander Solzhenitsyn observed that free speech, so widely affirmed as a sacred pillar of Western society, now essentially amounts to a dead letter because in a vastly extended and multifaceted consumer society such as ours, dissident speech has been rendered effectively meaningless.
So much has changed in the four-plus decades since Solzhenitsyn offered that observation. Indeed, due to advances in digital technology our network is now so vast and complex that all manner of philosophical and political appeals, even those issued within moral and ethical contexts, have been rendered effectively meaningless.
Consequently, Bonhoeffer’s warning about the prospects of morality winning out in the face of rank stupidity seems more cogent and prophetic than ever before.
Why is it that so much of what is written today in the victimhood genre strikes me and many others as parody?
One of the most notable examples of late is a screed in The Nation written by a person of color who describes how he has ensconced himself within a dense web of material and technological comforts while anticipating the day, the dreaded day, when he reluctantly will have to emerge from this cocoon to confront once again all the indignities of white society.
Incidentally, he refers to this cocoon as his “whiteness-free castle.”
Hell, forget the parody and consider for a moment the irony bound up in all of this. But then, I doubt that he perceives any irony at all – the fact that virtually all of the contemporary comforts in which he has enveloped himself all these months were achieved by the very civilization that he so obviously despises.
It’s also worth mentioning that this writer also possesses a singular educational pedigree, having graduated with a B.A. and J.D. from the educational institution that most embodies historic whiteness: Harvard.
Well, let’s just hope that the screen door – invented in 1887 by an Iowan named Elizabeth C. Harger, presumably of European extraction, and, for that matter, refined over the past century and mass produced and marketed within a global economic system regarded as one of the many crowning achievements of Western civilization – doesn’t hit him in the ass on his way to his first post-COVID outing.
I’ve spent the last few decades engaged in a strange intellectual pursuit: studying the conditions that gave rise to the totalitarian dystopias of the 20th century.
As the first discernible fissures set into the foundations of Soviet Communism in Eastern Europe in the late 1980s, I undertook a rather assiduous study of the factors that ultimately contributed to the collapse of Soviet communism. I became an avid reader of the International Section of the New York Times, which provided superb coverage of all the subtle ways that rot was setting into this conquered domain, particularly within the Soviet Union’s imperial crown jewel, East Germany.
I complemented this with deep reading of a wide range of books dealing with how both forms of totalitarianism, Communism and Nazism, became rooted in Central and Eastern Europe in the first place.
I was also treated to a knock-on effect, because this reading yielded remarkable insight into how both of these systems invariably required bogeymen – essentially the manufacturing of existential threats – which supplied these regimes with two essential and invaluable tools: a means of creating a perpetual siege mentality among the masses and an effective strategy for diverting public scrutiny away from their manifold shortcomings and failures.
Indeed, this proved to be one of the major insights driven home to me via all this reading: that all ideologies require strategies that afford a means of both focus and deflection, and that is why bogeymen have proven such a valuable tool.
Small wonder that I am simultaneously fascinated and repelled by the rhetoric of wokism, which evinces many, if not most, of the traits of incipient totalitarianism. But then, wokism, like all hard ideologies, is inherently weak, because it demands a radical departure from lived reality.
Given this fact, it’s not surprising at all that one especially scabrous polemicist of wokism, Damon Young, seems to be employing language smacking of the same rhetoric that resulted in the deaths of hundreds of thousands of Kulaks in the Soviet Union and millions of Jews in Nazi-occupied Europe.
It’s horrifying, to say the least, and should be regarded as a wake-up call – a deeply troubling portent of the dystopia that awaits the Western world, at least, what remains of it. But I wonder: How many of us not only are willing to acknowledge this rhetoric for what it is – genocidal speech – but also to speak out against it?
That remains one of the most vital questions as we move into the second quarter of what is shaping up to be a very troubled century.
Mainstream media’s conspicuous silence in the aftermath of Biden’s very conspicuous fall on the steps leading to Air Force One is one of the many reasons why I will NEVER be lectured ever again by any liberal about anything.
If you’re old enough, you recall the unrelenting fun that SNL made of Republican President Gerald Ford’s repeated stumbles.
More recently, #AmericaPravda spared no effort to analyze anything and everything associated with Trump’s presumed physical and mental decline. But that is not surprising because Mainstream Media are Establishment media. They have been in service to a narrative since at least the FDR presidency and arguably earlier.
Whatever the case, American liberalism is nothing but a sick self-parody nowadays, evidence of this empire’s precipitous decline on all fronts, a decline significantly of liberalism’s making. #LateAmerika #BrezhnevRedux
Speaking as one who has harbored a fascination with flags and symbolism for as long as I can remember, I have been intrigued with the amount of time legislators in several states have put into resolving issues related to their state’s symbols, notably flags.
Granted, the bulk of this work has been taken up by Southern legislatures as they confront the sundry challenges associated with the symbolism of the very late Confederate States of America, some of which is incorporated into their state symbols.
Within the last generation two Southern states, Georgia and Mississippi, have undertaken wholesale revisions of their state flags, though Georgia opted in the end to retain a design inspired directly by the Confederacy’s first national flag, the Stars and Bars.
More recently, South Carolina is dealing with what could prove to be one of the most vexing challenges of all: settling on a standard for the state’s iconic Palmetto and Crescent symbol. Fortunately for South Carolina, this symbol predates the Confederacy and stems from the state’s distant Revolutionary past.
As it happens, the Palmetto and Crescent flag hoisted daily over the Statehouse bears a somewhat different design from those displayed in the House and Senate chambers and the governor’s office.
Complicating matters is the fact that South Carolina, like many other frugal state governments, relies on private manufacturers to supply the flags it displays in official offices and on public grounds. And because the Palmetto and Crescent symbol never had been standardized, these companies supply different versions.
Consequently, the Legislature is now being challenged to adopt a standardized version of the banner, a task that has proven more challenging than any of the legislators anticipated.
Speaking as a proud Alabamian, I have to concede that I envy South Carolina immensely. No other state can hold a candle to the Palmetto and Crescent, except Texas, which possesses the nation’s most iconic state symbol, the Lone Star flag, recognized the world over.
If only Alabama’s Yellowhammer carried as much symbolic punch. But alas, it is rooted in the Old Confederacy and sooner or later will be consigned to extinction – the symbol, not the bird – much like several Alabama college administrative buildings bearing the name of Gov. Bibb Graves, a noted educational reformer who also maintained KKK membership.
Whatever the case, I do think that the recent dust-up over the Palmetto and Crescent may prove highly instructive from a cultural standpoint.
In the face of an increasingly fraying American identity, state flags and symbolism are likely to become more significant in the future.
Way back in the mid-1970s, my fellow classmates and I at Russellville Junior High School were blessed with an unusually gifted and dedicated 8th-grade history teacher named Mary Alexander.
Mrs. Alexander, now long deceased, loved pointing out the irony of history, particularly in terms of how facets of it – whether these happened to be political or cultural ideals or ways of doing things – often re-expressed themselves at times when we least expected them, even when we thought that they had become discredited or simply had played out.
I never forgot her lesson. Indeed as an avid reader of history I am reminded of this on a frequent basis. Just when we think that some ideas have been discredited or forgotten and, consequently, consigned to history’s ashbin, they return with a vengeance, even with the sense of vibrancy and relevance that had distinguished them in previous decades or even centuries.
The rekindling of American federalism and even, perish the thought, states’ rights serves as an unusually timely example. I grew up at a time when federalism expressed as state sovereignty seemed thoroughly discredited. What appeared to be an inexorable march toward human progress – LBJ’s Great Society programs, locked arm in arm with the civil rights struggle and the federal courts’ efforts to expunge the stigma of racial discrimination – seemed to have dealt, if not a fatal blow to states’ rights, at least a searing defeat that would leave this constitutional doctrine in what amounted to a semi-comatose state.
We were assured by teachers at every level of public education that states’ rights was a relic of the past – not just a quaint one but even a disquieting one. I recall several political science courses in which the professor, a Great Society liberal, likened federalism to a marble cake: the federal government was the cake, though states provided a measure of enhancement, sort of like chocolate marbling.
Yet, history seems to be repeating itself with a vengeance. In the face of American federal impasse and national division, states, large and small alike, are reasserting themselves. As I have pointed out on numerous occasions on this forum, it started more than a decade ago when then-California Gov. Arnold Schwarzenegger began characterizing his state as something resembling a nation within a nation. His successor, Jerry Brown, even began conducting a kind of incipient foreign policy related to climate change.
Recently, a prominent GOP leader, Allen West, has lobbied for a secession vote in the Texas Legislature, a move that at least one GOP leader in another Western state characterizes not only as a positive move but also one that bears close watching.
More recently in Oregon, state Sen. Jeff Golden (D-Ashland) has proposed legislation that would reintroduce a state bank concept for Oregon, primarily with the aim of serving as a backstop for community banks and credit unions.
Golden holds up the Bank of North Dakota as the model for his efforts, stressing the role that this bank played in minimizing foreclosures during the Great Recession.
The Washington Post reports that small businesses in North Dakota, compared with their counterparts in other states, were ably served by this model. In fact, they secured more Paycheck Protection Program (PPP) loans relative to the state’s workforce than other states, with more than $5,000 per private-sector worker as of May 8, 2020.
Yet, why is all of this surprising? States, by their very nature, possess the accoutrements of nationhood. And this is as much a matter of practicality as a historical fact.
As a student of constitutional history, I find this not only fascinating but also instructive in terms of how it underscores the increasing inefficiency of centralized federalism. If developments such as these demonstrate one thing, it’s that no central government, certainly one so big, bloated and overextended as the imperial behemoth in Washington, possesses the omnicompetence to manage a polity of the scale of the United States.
The late Mrs. Alexander was spot on: History does repeat itself.
I wondered how much longer it would be before the Confederate Constitution, much like Confederate statues, would fall victim to cancel culture. Quite honestly, though, I don’t know what is more maddening: cancel culture or the intellectual laziness evinced by journalists, even relatively elite ones, who, either intentionally or unintentionally, aid and abet this malignant cultural trend.
AP journalist Jay Reeves characterizes the Confederate Constitution, which, incidentally, was debated and drafted in the Capitol in Montgomery in my native state of Alabama, as a vestige of white supremacy without even bothering to consider the document within its full historical context. And let’s make no mistake here: The Permanent Confederate Constitution was conceived within a wide intellectual and historical Anglo-American constitutional context and, for that reason alone, is worthy of serious discussion, despite its provisions safeguarding the institution of slavery.
It is appalling to me that Reeves never even bothered to explore this unusually rich context, which would have been standard practice among journalists as recently as a decade ago.
A Watershed Document
Before public discourse became so poisoned, the Confederate Constitution, despite the controversy associated with it, would have been characterized by some writers and academics as a watershed document, one that represented the outcome of a protracted, intense and often acrimonious debate on the nature and scope of federal power that began immediately following the drafting of the U.S. Constitution in 1787.
The Permanent Confederate Constitution could be accurately characterized as embodying the Jeffersonian School argument, which maintains that the federal government – the “general government,” as it was characterized by many in the decades following constitutional ratification – simply functioned as the agent of the contracting sovereign states. This was underscored by the Confederate Constitution’s preamble, which affirmed that each state, in ratifying the document, was acting in its “sovereign and independent character.”
Aside from reaffirming the Jeffersonian view of federal power, this revised constitution also introduced some remarkable innovations that not only are instructive today but that still hold currency as contemporary Americans struggle to rein in federal power and even more significant, contend with mounting interest in sectionalism and even secession. Indeed, the case could be made that these innovations are especially relevant today amid new sectional divisions pitting predominantly liberal blue-coastal states against predominantly and implacably conservative red heartland states – issues not all that different from the ones that plagued federal relations in the early 19th century.
A Six-Year Presidency and a Line-Item Veto
One notable innovation was how the Confederate framers altered the office of the presidency, both limiting and strengthening it. While restricting the chief executive to a single six-year term, the Confederate Constitution also empowered him with a line-item veto. Such a constitutional prerogative potentially would have gone a long way toward reining in the Leviathan federal state, one that not only extends its hand into ever more facets of American life but even holds tremendous sway over the affairs of nations in far-flung corners of the world. Moreover, with such a constitutional safeguard, we likely wouldn’t be contending today with multitrillion-dollar deficits.
The constitution also prohibited Congress from levying protective tariffs that tended to benefit one section of the country over others, an issue that proved contentious in the formative stages of the young American Republic and that virtually rent it apart in the early 1830’s.
The long-term effects of protective tariffs arguably have had an especially deleterious effect on the fortunes of American development and national cohesiveness, not only by allowing one section of the country, namely, the mercantile Northeast, to grow rich at the expense of most of the others but also by enabling it to transform much of the rest of the country, notably the war-ravaged, economically prostrate post-Civil War South, into an economic extraction zone.
Reining in Federal Judicial Power
In what arguably could be regarded as the most noteworthy innovation of them all, state legislatures were entitled to remove corrupt or constitutionally unscrupulous federal judges living in their states by a two-thirds vote of both houses. Ponder for a moment all of the contentious 21st century issues that could have been resolved by this provision. It would have obviated the need for state legislatures to resort to strategies such as interposition and nullification that contributed significantly to two serious constitutional crises stemming from passage of the Alien and Sedition Acts in 1798 and the Tariff Act of 1828. Each of these contributed significantly to the protracted political impasse that culminated in a national breakup in 1861. Even more significant, though, such a constitutional safeguard likely would have contributed significantly not only to higher levels of restraint in the judicial branch but also in the federal legislative branch, as lawmakers would have been more cognizant of the futility of passing laws that encroached on state sovereignty.
Yes, the Confederate Constitution was both an innovative and an instructive document, one among a long line of written constitutions within the Anglo-American tradition, a line that also incorporates those of the Commonwealth realms. And that is why it, along with others, should figure prominently in any undergraduate or graduate coursework dealing with the protracted historical debate over the nature and scope of central power within a federal system. But like so much else in woke 21st-century America, the Confederate States Constitution is now so thoroughly tainted by the stigma of white supremacy that it can never be regarded as anything more than a “forgotten relic of an ignoble cause,” borrowing Reeves’ description, and, consequently, should remain locked away in the archives and forgotten.
This only ensures that substantive debate in this country will grow even more constrained. But, of course, by now it should have dawned on most of us that this is one of the underlying aims of wokeness and cancel culture, which aren’t so much about fairness and inclusiveness as they are about stigmatizing views that threaten their hegemonic standing within American politics and culture.
Reeves’ article only served to underscore that we no longer function as a vibrant, open and free society, only one that pretends to be. And many of us are beginning to wonder how much longer elites, increasingly confident of the political and cultural power they wield, will bother with maintaining this pretension.
Remember mad anchorman Howard Beale’s admonition in Network (1976) that television was the most “awesome goddamned propaganda force in the whole godless world!”? Well, the hapless Mr. Beale only got it partly right. The most awesome force in the world is American culture, a phenomenon that now drives much of the dialogue and culture throughout the Western world and beyond.
As we approach the second quarter of the 21st century, we’re seeing the cresting of a remarkable cultural force that was incubated by Washington’s victory at Yorktown and that has gained increasing levels of traction since the U.S. Civil War, World War I and particularly World War II, which placed this phenomenon at a particularly distinct advantage vis-a-vis its war-ravaged, materially depleted counterparts and erstwhile rivals in Western Europe.
A Pulverizing, Flattening Social Force
Indeed, looking back over the past 30 years following the collapse of Soviet communism, it’s worth recalling how many on the left finally concluded, however reluctantly, that American culture – all the pulverizing, flattening effects associated with it – ultimately proved to be, paraphrasing Beale, history’s most awesome, radicalizing force in the whole godless world.
The whole world has been Americanized – and it leads one to wonder if any ancient institution, including one of the most ancient of all, the British monarchy, is equipped to withstand this force over the course of time.
It’s fascinating to consider all of the subtle ways that this cultural force is playing out in every facet of modern life.
The Late Princess Diana Wasn’t British
Consider the late Princess Diana, who was eulogized at her very Americanized funeral by her brother, Earl Spencer, as a “very British girl.” Actually, she arguably wasn’t at all. Despite her very noble and very English pedigree, she embodied many of the aspirations of global American culture – a penchant for personal independence, self-expression and self-actualization.
Even her comparatively sober, responsible son, Prince William, the royal heir, has expressed his qualms about assuming the Windsor mantle, and in ways that sound, well, rather American. And this really isn’t all that new. The Duke of Windsor, the former Edward VIII, who always evinced a special affinity for American culture, even incorporating American slang in his casual discourse, possibly wedded a twice-divorced American social climber as a pretense for abandoning the British throne, likely because he, too, had been infected with American notions of personal independence.
Now the monarchy is imperiled once again by an even more explicit expression of this awesome cultural force: a grasping, B-list American actress whose personal agenda has been hiding in plain sight for the past four years, one that puts her late mother-in-law’s rather ill-defined and hastily improvised agenda to shame.
A Fresh Face among Staid but Stuffy, Lily-White In-Laws
Some royal watchers speculate that Meghan initially harbored a desire to transform the monarchy from within, carving out her own distinct royal identity amid her staid but rather stuffy, lily-white in-laws. She aspired to be the fresh face among the Windsor clan, not only equipped to energize this thousand-year-old institution but one who, over the course of time, would be regarded as so valuable and indispensable to the Crown’s long-term success that she would be afforded the opportunity to establish her own distinct style and agenda.
Predictably and in remarkably short order, she realized that this ancient institution operated by its own time-honored and distinctly rigid rules. The full weight of this new reality fell on her carefully sculpted shoulders: She had been assigned a non-negotiable set of job responsibilities that not only distracted her from her personal career aspirations but that also effectively consigned her to what amounted to gilded oblivion – a mere face and a tightly constrained voice, consoled only by the knowledge that her fate was shared with the world’s wealthiest and most exclusive family.
She balked, predictably resorting to American arguments about one’s being entitled to happiness, self-actualization and self-expression.
Then followed disruption – the quintessential American desire for separation and a fresh start, albeit with the tacit understanding that she would continue to profit from Windsor family connections.
While this likely amounted to a deviation from her original plan, Meghan, with poor, dimwitted Harry in tow, had drawn closer to her goal of carving out a sort of semi-autonomous woke Windsor counter-monarchy, one in which she could fuse the legacies of Princess Diana, Mother Teresa and Rosa Parks and the theology of Oprahism into a neatly crafted, compellingly new alternative brand.
In time, though, elements of Plan B proved to be as problematic as Plan A, notably, running up against the Royal Family’s obstinate refusal to allow the Sussexes to profit from their residual royal ties.
A Scorched-Earth Plan C
From the Sussexes’ swanky digs in exclusive Montecito, California, Meghan improvised a new, scorched-earth Plan C, turning wokeness up to full throttle, accusing the Windsors of mental abuse (one of the standby American strategies in divorces and HR disputes) and even characterizing them as a vestige of white supremacy.
Granted, millions of people see through this grifting, grasping woman and have from the very start. But millions of others have predictably swallowed this Cinderella narrative hook, line and sinker, just as they did Diana’s version. Moreover, there are plenty of facets of woke elite culture, particularly within the corporate sector, that very well may lend a lucrative helping hand to the Sussexes over time. And that is precisely the outcome that Meghan planned for and expects. And she likely will be proven right.
So, it’s entirely possible, if not likely, that Meghan will be remembered generations from now as a truly singular historical figure: not only as a fashion icon and trendsetter but even as a dynastic matriarch of sorts – the founder of a new, radically chic form of monarchy, one that represents the culmination of global American culture, leavened by a heaping serving of wokism.