©2020 Dust in the Light. All rights reserved.
An optimist can hope, as Barton Swaim does, that future generations will recognize that the supposed experts of our times have been, “at crucial moments, idiots,” but such optimism would miss the fundamental explanations and incentives at work.
The errors of those to whom our society looks for sage guidance have been broadly infectious, but for the sake of clarity, let’s focus on the political. In 2016, Swaim suggests, “America’s best and brightest political adepts turned out to know very little about the elections they claim to understand.”
That observation isn’t quite accurate; the fact that the adepts didn’t know enough doesn’t mean they know “very little.” Just so, as skeptical as we should be about credentialism, credentials aren’t only fancy acronyms (or rather, they aren’t always only that). The experts are experts; it’s just that their expertise applies to a narrower range of possible realities than they (or we, as a reliant community) like to admit.
But if the experts know, on some level, that beyond the ornamented edges of their institutional degrees lies a vast unknown in which their runes have no power, we can see how susceptible they must be to a corrupt elite who promise to operate within the experts’ necessary frame, thereby maintaining the relevance of their expertise. Politicians who more or less follow the recognized formulas make their activities predictable for those whose credibility relies upon their predictions about politics.
A problem has increasingly arisen because the inertial force of the elites’ corruption — drawn toward the gravity of total control promised by progressivism — prevents them from keeping within the necessary boundaries. The more the People challenge the progressive project, the farther progressive politicians must move, to the point that they are pressing untenable falsehood, lest truth overwhelm them before they have the total control they desire.
Thus, for example, they began with the shared principle that even speakers of ideas we find detestable have a right to the public square on the grounds of social tolerance, intellectual humility, and cultural confidence. This moved to the argument that those speakers deserved public funding (if they were progressive) so they might be better heard, which was followed by the demand that other speakers, whose ideas were opposed to the formerly detestable, must be denied government support. Next, the formerly accepted, but newly detested, ideas (i.e., traditional norms) have been said to be so pernicious that they must be declared unspeakable — with, finally, those who persist in speaking them shamed, canceled, and prevented from maintaining an ordinary life. Because progressive ideas cannot bear scrutiny at the level of their assumptions, the ban is now extending not only to adherents of old, newly detested ideas, but also to others who continue to state the originally shared principle that deplorables have a right to the public square.
For their part, the experts have no choice but to keep pace, not only because they wish to be seen as elites, but also because delusion is a comfortable bubble in which to float through the unknown beyond their expertise. Just so, they must proclaim as indisputable the fantasy that pop-culture reality star Donald Trump was some sort of Russia-controlled fascist riding a wave of palpable and systemic white supremacy, because otherwise, the experts would have to recognize that their political friends are not serving the public, in some way that they — both the intellectual and political elite — do not understand or cannot accept.
No matter how much damage it might cause, the symbiotic relationship between power and knowledge in every area — politics, foreign policy, climate change, social issues, monetary theory, and more — requires that the political elite must continually double down on disproven policies while the experts explain why the evidence proves their necessity.
We needn’t follow this path very far before it becomes existential. Eventually, one must either accept reality or rebel against it. One must either accept the nature of reality’s creator and source of volition or insist that one’s own feelings are, in actuality, the creative force of the universe and its true source of volition.
To be sure, only a vanishingly small minority of those who lean toward rebellion against reality and God will put this much thought into it. Indeed, many will actively resist any temptation or urging to do so.
In an essay about fear of religion among secular intellectuals, Matt Nelson quotes “philosopher and atheist” Thomas Nagel as writing, “The thought that the relation between mind and the world is something fundamental makes many people in this day and age nervous.” Nelson speculates that the reason for this nervousness is that “people despise religion and are afraid it is true.”
As nicely simple as it is, this formulation fills the pronoun, “it,” to the bursting point. Which “it” is true? Perhaps the fear derives from the need to answer that very question, because ultimately, again, one must either accept external rules — which may impose restrictions and which others may genuinely better understand — or accept the responsibility of being the source of rules.
Turning to New Age spirituality, Nelson cites C.S. Lewis’s description of “Life Force philosophy,” which (Lewis wrote) “gives one much of the emotional comfort of believing in God and none of the less pleasant consequences”:
When you are feeling fit and the sun is shining and you do not want to believe that the whole universe is a mere mechanical dance of atoms, it is nice to be able to think of this great mysterious Force rolling on through the centuries and carrying you on its crest. If, on the other hand, you want to do something rather shabby, the Life-Force, being only a blind force, with no morals and no mind, will never interfere with you like that troublesome God we learned about when we were children.
Put more concretely, the function of this Life Force concept is to provide a sense of connection when one wants to feel a part of something grand and good, but to painlessly and temporarily sever that connection when one wishes to do something that would harm the sparkling universal flow of life if we truly were connected. It is as if a husband or wife who values the long-term security and cultural approbation of marriage believes that removing his or her wedding band for an evening prevents an adulterous fling from harming the relationship.
As materialist New Age spirituality is to theology, so a credentialist Belief in Science is to public policy. Experts and politicians want neither to acknowledge their limits nor to take responsibility when their overextension does not work as they promised. When predictions are being made and new restrictions imposed on the public, progressive politicians provide a shape for their preferred solutions, and experts find it fulfilling to think that their expertise can have historic significance in the wellbeing of humankind. When their scheme fails and unanticipated consequences emerge, the experts insist that they lacked sufficient data, and the politicians claim their political opposition imposed too much restraint.
So, social engineers determine to dismantle more of the cultural machine, and progressive economists call for bigger, more-risky stimulus. The ideologues’ cloud of dismissal moves its shadow over truths that are even more obvious to the average person, and citizens who are even more clearly moderate face the partisans’ mandate to submit or be banned.
On it will go until a sufficient mass of the public imposes a return to sanity or reality asserts itself through irrefutable ruin. In the course of either corrective process, however, the politicians will merely shift their tones, and the experts will behave as if it were their data, rather than their principles, that were corrupt.
The likelihood that any of them will in any sense be held to account for their lunacy is vanishingly small. The best we can hope for — and an objective toward which we who have eyes to see should begin working now — is that they will be displaced by a new generation of experts who are willing to admit the limits of their expertise.
As our present era progresses toward the elevation of “gaslighting” to status as the term of the century, the image of Jesus in Rembrandt’s Christ in the Storm on the Sea of Galilee has repeatedly emerged from the disorder in my mind. The work — stolen from the Isabella Stewart Gardner Museum in Boston in 1990 — depicts a scene that appears in the gospels of Matthew and Mark:
… as evening drew on, he said to them, “Let us cross to the other side.” Leaving the crowd, they took him with them in the boat just as he was. And other boats were with him. A violent squall came up and waves were breaking over the boat, so that it was already filling up. Jesus was in the stern, asleep on a cushion. They woke him and said to him, “Teacher, do you not care that we are perishing?” He woke up, rebuked the wind, and said to the sea, “Quiet! Be still!” The wind ceased and there was great calm. Then he asked them, “Why are you terrified? Do you not yet have faith?” They were filled with great awe and said to one another, “Who then is this whom even the wind and sea obey?” (Mark 4:35-41)
Homilies about this passage tend to put us in the position of the apostles, with the message that we should trust in the power of Jesus during the storms of our lives. While this advice is of perennial application, our times may make it more relevant to put ourselves in the position of Jesus, himself, as our model.
After all, why shouldn’t he be calm? He’s the Son of God, and his time was not yet come during this particular voyage. “Why are you terrified?” he asks the apostles. “Do you not yet have faith?” He might well have asked, “Do you not yet know what world you live in?”
A traveler might sleep on an airplane despite traveling at high speeds at a fatal altitude, because he knows the world is such that airplanes can fly. A patient can sleep before a routine surgery because she knows her civilization has made such practices as safe as can be. Weightlifters revel in their pain because they know it foretells increased strength, and students mightn’t hesitate to throw themselves into the agony of confusion because at the other end lies confident knowledge.
Even more, the believing Christian should know that he or she lives in the world of God’s covenant, in which even death itself is not a terror. This point comes home with another attempted temptation of Christ in the Garden of Gethsemane. Appeals to his pride in the desert at the beginning of his ministry had not worked, but now the whispers appeal to doubt, and this time, Jesus rebukes his apostles for sleeping:
When he returned he found them asleep. He said to Peter, “Simon, are you asleep? Could you not keep watch for one hour? Watch and pray that you may not undergo the test. The spirit is willing, but the flesh is weak” (Mark 14:37-38).
The gospels do not say, but perhaps Jesus remembered his own preaching: “Do not be afraid of those who kill the body but cannot kill the soul; rather, be afraid of the one who can destroy both the soul and the body” (Matthew 10:28). That is the field on which our battle takes place, and the proof of our victory is our ability to remember what world we live in!
This imperative filters down to our relationship with truth. In a timely essay titled, “On the First Duty of Intelligent People,” Dr. Tod Worner (an internal medicine physician) explains his title by means of timeless quotations:
- “We have now sunk to a depth at which the restatement of the obvious is the first duty of intelligent men.” — George Orwell
- “We must always tell what we see. Above all, and this is more difficult, we must always see what we see.” — Charles Péguy
- “What we suffer from to-day is humility in the wrong place. Modesty has moved from the organ of ambition. Modesty has settled upon the organ of conviction; where it was never meant to be. … We are on the road to producing … men too mentally modest to believe in the multiplication table.” — G.K. Chesterton
- “Live not by lies.” — Aleksandr Solzhenitsyn
- “I won’t insult your intelligence by suggesting that you really believe what you just said.” — William F. Buckley
Two modes of thought — strategies of combat — are available in the battle of ideas:
- You must be wrong because I am right.
- I will strive to understand what you are saying as if it might be true so as to fairly assess who is right and who is wrong.
Note that both are distinct from the acknowledgement of objective facts encouraged in the great quotations above.
Gaslighting has become such a powerful force in the twenty-first century because its practitioners present themselves as if they adhere to the second approach, but they are adherents of the first. So persistent are they that their “I am right” engulfs the obvious reality that sits between the disputants. That is the trap.
Yet, we must always question ourselves, so as to balance between the temptation toward doubt (wherein we refuse to acknowledge that truth exists) and the temptation toward arrogance (wherein we refuse to acknowledge that truth exists outside of ourselves). And the only sure way to keep that balance is to root ourselves in the single fundamental truth. As Worner quotes Pope Benedict XVI: “a mature adult faith is deeply rooted in friendship with Christ … that opens us up to all that is good and gives us a criterion by which to distinguish the true from the false, and deceit from truth.”
Remembering what world we live in means being capable of acknowledging error without risking the loss of Truth. The deceiver requires that the facts be seen in the way that points toward an end that he or she desires for some other reason; if the person is a self-deceiver, he or she will not even recognize that this other reason is separate from the facts. The believer strives to understand the facts in order to improve his or her relationship with Truth, which he or she recognizes to ultimately transcend material facts. The facts don’t have to be made to conform with reality, because they implicitly emit from it. That transcendent reality is the world we live in, and remembering it means having the courage to face the storm that plainly speaking the truth can raise in turbulent times because we are not truly at risk of perishing — at least in the sense that matters.
Resigning from her post at the pinnacle of the commentary world at the New York Times with a magnificent document of huge cultural significance, Bari Weiss became a hero of the movement to restore classically liberal ideals like freedom of speech, mutual respect of those with differing opinions, and trust in a marketplace of ideas. She picked up this theme in a recent essay titled, “The Great Unraveling,” warning that the loss of that spirit of freedom is beginning to manifest itself in physical reality.
Her thinking on this matter found form in a dinner conversation with Catholic intellectual Robert George, who read to her a prose poem by Heinrich Heine that warned about the consequences as the German spirit shook off the “subduing talisman” of Christianity a century before Hitler’s rise.
Notable, given the theme of Weiss’s essay, is a seemingly unnecessary interjection into her narrative:
Robby is among the most important Catholic intellectuals of our era. He is a Princeton professor, a lover of great wine, a wonderful writer, a total gentleman, and one of the most articulate opponents of gay marriage in the country.
Now is a good time to say that as soon as the pandemic ends I plan to invite all of my friends to an inappropriately large wedding where I will stand under a chuppah and marry a woman (Nellie Bowles, the love of my life). I am profoundly grateful that we have that right. And I’m grateful for all of those, including my friend Andrew Sullivan, who waged the battle to win it.
Robby might not want to go to a gay wedding. But I love that at least for now I still live in an America where he and I can sit together, over good food on a dark night in the middle of a pandemic and talk about what is broken and how we might join together to fix it. That act is the whole point of the American experiment.
Obviously, this detail is relevant by way of showing how Weiss and Robby George can converse in a friendly way despite dramatic differences on one of the most profound areas of difference in our times, but how very interesting that she would feel the need to leaven her cross-ideological sympathy with a nod of gratitude to her ideological allies and a genuflection in the direction of “the battle to win” same-sex marriage.
Stylistically, I use the phrase “how very interesting” because I’m currently caught up in the voice of Thomas Mann’s The Magic Mountain, another German production that can be understood to have predicted the rise of Nazism by judging the German character. Intellectually, I emphasize the paragraph’s interestingness because it seems to me that the “battle” she lauds was an unveiling of and test case for the trends in the American character that she is beginning to lament.
The ingredients were all there. Emotional manipulation. Evidence of academia’s proselytization among the cultural elites. A news media refusing to treat the traditional side as if it had any validity worthy of consideration. Corporations leveraging their power to promote an ideological side. Courts rewriting the definition of words as a means of changing the law, thus signaling to a large portion of the country that we are not permitted to self-govern when our cultural betters have made up their minds.
This point sharpens to a supremely relevant two-word coinage from Sullivan’s book, Virtually Normal (my analysis of which is actually mentioned on the book’s Wikipedia page), wherein Weiss’s friend wrote:
Some might argue that marriage is by definition between a man and a woman; and it is difficult to argue with a definition. But if marriage is articulated beyond this circular fiat, then the argument for its exclusivity to one man and one woman disappears. (Emphasis added.)
As I explained in 2010, “my proximate concern was that this reasoning justifies any change to marriage, and its method any change to anything.” (Emphasis added.)
And so it has gone. We should applaud both Bari Weiss and Robert George for their willingness — their desire, even — to pursue interactions with those who harbor profound disagreements. Still, we should also be honest about their demonstrated fealty to the principles they ostensibly share. When it comes to a privilege for which she is grateful, Weiss does not bend to the beliefs and rights of her ideological opponents. Rather, she waves a flag for the battle that subjugated them.
In short, she has a seat of her own on the train that is plowing through our culture and flattening the subduing talisman of our classically liberal heritage like a penny on the tracks. She may feel kinship with the “strange” people “who see clearly that the fight of the moment, the fight that allows for us to have those disagreements in the first place, is the fight for liberalism,” but when it comes to her issue, disagreements aren’t actually permitted — at least not to the extent of allowing them to produce a policy that she does not like. In such cases, the opposition’s beliefs are pounded into a circular fiat to be rolled away.
Weiss may wish to lock away the weapons by which her battle was won before others take them too far, but the great unraveling she observes is evidence that she cannot. Her fellow “strange” people certainly cannot secure the armory unless they acknowledge the role that they have played and the damage they have done.
That damage is massive, and it is hydra-headed. Even as the heads of the media, the courts, and other cultural and political forces gnaw away at our discourse, civil rights, and civic institutions, another head tears at an institution that would be central to stopping the decay: the family. To wit:
Fewer U.S. adults now than in past years believe it is “very important” for couples who have children together to be married. Currently, 29% say it is very important that such a couple legally marry, down from 38% who held this view in 2013 and 49% in 2006.
If she were to ask, Mr. George would no doubt tell Ms. Weiss that this is no surprise to him. After all, to achieve her desired goal, the cultural powerhouses and the courts changed the definition of marriage such that the institution is not ultimately about children, but about adults, and if the adults feel like they are committed to each other, then why do they need some old-fashioned ceremony? Moreover, if it is all about them, why should they make it more difficult to change their minds in the future?
Weiss closes her essay with the prediction that the “credentialed journalist[s] and liberal public intellectual[s]” who are cheering on the development of Big Tech censorship “will look like fools much faster than they realize.” Perhaps we can hope her previously demonstrated courage is substantial enough that she can honestly explore the contributing role of developments for which she, herself, cheers. If she has such courage, perhaps she can blaze a path for the acknowledgment of error that would be essential to find a way back toward re-raveling.
Three items that jostled near each other in the continual stream of my information input seemed fated to join together. The first was a video ad that blurted out unbidden from an article that I was trying to read, with words pretty close to: “Don’t give me what I ask for. I’m a kid.” Clicking the mute button for the browser tab cut off the sound, but as the video rolled on silently, I saw that the advertisement was for a college savings fund.
A separate item with an obvious connection was mention of a transgender activist who has come to the logical (albeit monstrous) conclusion that doctors ought to medicate all children in order to block puberty until they are old enough to consent to sex-change operations, if they decide that’s the way they want to go.
Thus, we come across yet another example of a simple, recurring principle. Reasonable, moral people will conclude that the child in the college savings commercial was correct: It is irresponsible to trust children to make life-changing decisions based on their immediate feelings and desires. From there, however, we must make a binary choice that will lead us toward irreconcilable realities.
Either children’s inability to consent to a sex change before their bodies have made the reality manifest means we move back toward the science-based understanding that human bodies are male or female and we treat conflicting feelings as disorders to address… or it means we use science to give people the option to deny biological reality, even to the point of freezing their development until they can make the decision.
When our choices are binary and existential, they tend toward logical inevitability. The initial choice is based on a first principle that follows through all subsequent choices. A particular decision may lead to resting points at which it is possible to stop short of logical inevitability, but these will be dependent upon temporary barriers of emotion or circumstances that will require something other than logical arguments to maintain.
Whether the logical consequences of a decision are, in fact, inevitable or are only apt to create a sort of momentum, it behooves us to make decisions with a full understanding of their implications. This brings us to a third item in my recent information stream.
A group of three men in California has now produced a second child utilizing donated embryos and surrogate mothers, with the three men all listed as parents on the birth certificates. The semantics are helpful to the theme of this essay. If we’re dealing with certificates, the natural question is: What are we certifying?
Certainly, it can’t be the birth, which as a plain matter of fact involves sperm from a man and an egg from a woman. If that event were the essence of the certification, it might extend to another woman who carried the child, but a birth certificate conveys information about the child, as a record of how that child came to exist.
What the three men — tellingly and inaccurately referred to as a “polyamorous couple” in the article — are certifying is something much more like ownership. It is not a birth certificate, but a title of parenthood, like the title to a car. You can dislike this fact for emotional reasons (and if I were still using Twitter, the social media publisher would likely ban Dust in the Light again for suggesting it), but it is a straightforward and honest statement of reality. Acknowledged or not, it comes with an internal logic, and if widely accepted, that logic will pull toward social consequences.
If our society were progressing in a reasonable and humane way, we would recognize such problems and account for them, but radicals justifiably fear we would account for them by not going down the progressive path in the first place. The claim of the movement may be that it is searching for equality and fairness, and many of its advocates surely see that as their motivation, emotionally, but if that were truly the case, intended effects would be measured against (possibly) unintended effects.
Establishing the principle that birth is more about the parents than the child will have implications for children, just as universally blocking puberty would have implications for a generation, as does refusal to privilege biological reality. As does, to be sure, the tolerance that treats radical cultural experimentation as a respectable opinion.
In early September, Roman Catholic Bishop of Providence Thomas Tobin resumed his regular column in the diocese’s newspaper, Rhode Island Catholic, with an essay titled in accord with his characteristic humor, “Okay, God, You Can Stop Now.” In brief, his point was that the COVID-19 pandemic has taught some important spiritual lessons, and now that we’ve learned them, God can safely set things aright.
The lessons Bishop Tobin lists are these:
- “We’re not completely in control of our lives and fortunes.”
- “Our behavior affects others. … And what’s true on a physical level is equally true in the spiritual realm. … Our righteous conduct and good example encourages others, but our sin contaminates society and brings us all down.”
- We “need to keep our priorities in order. … Every day we should treasure the blessings the Lord has given us as if they’ll disappear tomorrow, because maybe they will.”
As much as we might wish the bishop’s literary device were true, however, we’re nowhere near done learning the lessons that our current predicament can teach.
Consider the story of Canadian nonagenarian Nancy Russell. Already struggling with the newly restrictive lifestyle of her nursing home, Russell did not want to go through another full lockdown. So, she sought and received a medically assisted suicide.
Wesley Smith notes a deep perversity to the story: “for her death, she could be surrounded by friends and family! … So companionship to be made dead but not to remain alive.” Indeed, by choosing death, Mrs. Russell gained permission to leave her nursing home as it went into full lockdown and spend eight days with her family before they all gathered around her bed to sing her to death.
A deeper illness than the coronavirus has settled upon our civilization. We are so frightened of death that we’ve consented to turn over our freedom and much of the substance of our days in order to avoid a small elevation of risk. Yet, we’re so callous about life that we consider death preferable to inconvenience.
In our response to COVID-19, we’ve laid entire industries to waste and cost countless people their livelihoods. We’ve restricted religious services and social activities. We’ve cut children off from their friends and elderly spouses off from their dying soulmates. But can we muster a willingness to expend the resources and accept the inconveniences of making life worth living?
This question reflects not only on the limited circumstances of our global panic. Did we have a willingness to sacrifice for others before the pandemic, and will we have it after? Bishop Tobin’s point is well taken, that we should treasure our blessings in the moment, because they may not be there in the next. Just so, we should be willing to make sacrifices as if we are addressing a temporary crisis even when we are not aware that one exists.
That doesn’t mean living always in panic or eschewing long-term planning and a measure of comfort. It does mean a more-reasonable balance — accepting a little more risk in crisis and a little less complacency during times of ease.
As the bishop also says, our behavior and decisions affect each other. In acquiescence to the fear of some, our government has imposed a strict regimen on all of us based not on actual illness, but merely on positive tests. Our ability to tell, with modest accuracy, whether somebody has the virus in his or her body has shifted the standard from being sick with the disease to simply having the virus.
Thus, a positive test can have devastating effects on a family, which influences the decisions of its members. People who might make one decision based on the risk of actually getting sick are having to make more-restrictive decisions based on the risk of testing positive. With such a test in the household, children can no longer go to school and parents can no longer go to work, which ripples to other families to the extent our coworkers and clients truly require our presence.
Yes, our behavior affects others, but we can’t apply this principle selectively. One person’s reckless behavior may bring harm to another. But that other’s timorous excess can spread harm, as well. The request to sacrifice some security so as to minimize despair has as just a claim as the request to sacrifice some liberty so as to minimize illness.
From the story of Nancy Russell, we should learn that the scale shifts if on one side we place despair unto medically assisted suicide while on the other side we place a mere positive result from a nose swab.
In a case of intellectual serendipity, I happen to be listening to an audiobook version of The Black Swan: The Impact of the Highly Improbable, by Nassim Taleb, during this peculiar election season. More acutely, just the other day, I heard his compliment of the United States as a place where it’s still acceptable to take risks:
My colleague Mark Spitznagel understood that we humans have a mental hang-up about failures. “You need to love to lose,” was his motto. In fact, the reason I felt immediately at home in America is precisely because American culture encourages the process of failure, unlike the cultures of Europe and Asia, where failure is met with stigma and embarrassment.
An apparent deterioration of this American trait is among the illnesses that have coincided with the proliferation of social media. “Just do your best,” “better to have tried and failed than to have done nothing,” even “better to have loved and lost than never to have loved at all” — all such sentiments are more difficult to maintain when petty, envious people might capture a moment of weakness or failure and replay it without mercy for your entire life. When it comes to anything cultural, political, or simply visible, there will be some who take to social media gleefully at your every misstep.
Even without antagonists, though, it’s all too easy to imagine some future potential employer or love interest bringing up for explanation some mark that you missed years or decades earlier. So… no risks of ideas, words or actions. It’s safer to adhere to the common fashion (whatever it is), even as it thrashes wildly around. At least then a great many people will be working to excuse your shared past errors.
But we should take a higher perspective than our curated media personas, and we should be suspicious of people who want temporary failure to be taken as another’s endemic state. Indeed, there’s something evil about wanting failure to be somebody else’s defining feature.
What motivates such people?
In episode 2 of Roman Catholic Bishop Robert Barron’s Catholicism series, then-Father Barron meditates on the Beatitudes. “Blessed are they who are persecuted for the sake of righteousness, for theirs is the kingdom of heaven”! In summary, Barron suggests that giving up worldly needs and satisfying them with God — aligning one’s life with His plans — fills one up, leaving one without need. Giving up a need for approval makes one free to love. A person who does not need anything from the other cannot have the most-essential thing taken from him or her, so he or she has no need for hatred, envy, or insecurity, because the only thing that ultimately matters is fully secure. That person is blessed.
Conversely, it may be an error to think of our worldly antagonists as motivated by hatred. Maybe hatred is the surface expression, and a useful shorthand to describe their demeanor, but behind it is a neediness and insecurity — cursedness.
This theme emerges in episode 3 of Catholicism, wherein Bishop Barron addresses the problem of evil. Evil, he says, is not an active force of opposition, but a deprivation. It is the absence of goodness and love. Just so, hatred does not exist as a thing in itself, but either as a disordered love (loving that which should not be loved) or an expression of deprivation of something that one needs.
The hater senses that something important is missing, so he or she looks for a scapegoat upon whom to place the blame. Hating the bad is a manifestation either of an inability to truly love the good or an insecurity about the good, as if it can be taken away. And anything that actually can be taken away is not a suitable resting place for our security.
In the final analysis, the only suitable security is God. If you have love of Him, then nothing else can harm you. Evil has no holding point, because every possible outcome or circumstance — pleasure, pain, health, illness — is a blessing in its way.
Bishop Barron says that God creates the universe not through conflict, as if banging against some substance that was originally formed as a different thing. Rather, He coaxes reality, guiding it toward His purpose. Consequently, no outcomes or circumstances are deformities to be hammered out of the mold; they are instead stages toward God’s intention, and thus they are good.
So, too, with risk and loss. We should always be striving, and to insist that we should never fail is not only to prove ignorance of the process of learning and improvement, but also to miss the point of what we should be striving toward.
Existentially, we should be striving toward God, but this existential perspective should filter down into practical life and professional endeavors, even for those who struggle with faith. Ultimately, we should be striving to fulfill God’s will, but at a closer level (one that we are better able to discern), we should be striving to build and to improve the world.
If such is our heart’s desire, then the effort alone is success, and we should welcome humbling failures as opportunities to improve ourselves. Inasmuch as America is a shining city upon a hill, this is its source of righteousness — not that it is always successful or that it was born sinless from the forehead of civilization, but that Americans strive and are not afraid to fail, because we have a higher purpose in mind.
“Love is love” is one of those wonderful-sounding slogans that people ought to be able to dismiss as a serious idea almost immediately upon hearing it. It’s a good t-shirt, but ridiculous analysis and foolish policy.
Do you love your spouse as you love your friends? Do you love your child as you love your parents? For that matter, do you love your mother as you love your father? Some forms of love can be weighed against others, as comparing your children against your friends, and others can’t, or shouldn’t be, as comparing one child against another or your mother against your father. But love is not simply love.
That is why Greek philosophers drew love into multiple categories very early in the formation of our culture. Playful love, or ludus, must be distinct from pragma, or the mature, life-long love that develops over decades of marriage. Loving one’s self, philautia, cannot be the same as agape, which is a selfless love for all. (Indeed, the former is not a unitary whole, inasmuch as it involves both narcissism and the much-healthier self-compassion.) And abiding love for a friend, philia, is distinct from passion for a lover, eros.
Arguably, this taxonomy is not complete. Where, for example, does love in the form of obligation (as for our parents) become distinct from love in the form of responsibility (as for our children)? What about the difference between love of our home and love of our homeland?
What the slogan, “Love is love,” means to insist is that the sort of love that characterizes marriage does not depend on the sexes of those involved. In the lifecycle of marriage, a man can experience ludus turning into eros, which then develops into philia and pragma with a woman, ideally with healthy development of philautia as the reward for feeling loved and needed. If instead that man follows the same process with another man, then the slogan proclaims them to be the same sort of love.
That is too limited, though, when it comes to what marriage is. Traditional marriage also fosters agape, and we must at least entertain the possibility of difference when the relationship does not mix the two sexes of our species; being bound in love with a person of the other sex surely assists sympathy with that half of humanity.
Practical distinctions also exist between men and women, most especially that they jointly can have children. Thus, marital eros places one face to face with agape as intimacy generates children and places the couple in the continuum of humanity.
To be sure, same-sex couples can adopt or come to raise children by some other means, but that is a decision separable from their coupling. At the heart of the matter is whether our society should — indeed, should be permitted to — acknowledge as distinct a type of intimacy and of love that creates children by its very nature.
But isn’t all of this just an over-intellectualized rehashing of a debate that’s already been lost in the public square? Maybe, but it’s important for us to remind ourselves regularly so, as the consequences emerge, society won’t be puzzled as to the reason.
Consequences, there will be. For example, a recent analysis of research found that fathers’ involvement with their children tends to increase those children’s desire to be involved parents themselves, and then this trend builds on itself from one generation to the next. Other studies repeatedly confirm the importance of fathers (especially biological fathers), which can be distinguished from the importance of mothers.
In messy life, such families are not universally possible, and compassion requires us to mitigate the harm to everybody involved (rather than amplify it through stigma). Still, how could it not have consequences when a society refuses even to recognize the ideal circumstance as something unique? If procreation is not intrinsic to marriage, then fathers have less encouragement to develop healthy families for their children when having them was not a deliberate choice.
The fundamental error of “Love is love” — indeed, the fundamental error of the juvenile ideology that would erase all distinctions — is the insistence that unless all items in a broad category are the same, then some are devalued. This error harms all of us, not only by taking away the tools by which we encourage each other toward better decisions, but even by depriving us of the ability to be enriched by differences.
If all love is simply love, then ideas like friendship and parenthood are either corrupted by lingering questions about eros or they cannot involve “love.” As much as our unconscious social heritage may make such statements seem bizarre to us now, a generation or two of “Love is love” could make them seem unremarkable and even desirable. When such shifts produce their inevitable harm, we’ll need a record of dissent that can help future generations rediscover the new old truths that were nearly lost in the social revolution.
Increasingly, we’re being called upon to raise our fists as a show of solidarity. This is yet another way in which the zeitgeist of our times is demanding that we affirm a falsehood. Several falsehoods, actually.
The most direct is the notion that Western civilization is founded on and persistent in systemic racism. It is not. But a more subtle and more profound falsehood brings us to the deeper battles of humanity, on the plane of inner demons and better angels.
Namely, clenched fists are not a sign of solidarity, at least not the type of solidarity modern advocates claim to be promoting. In terms of social justice, the Catechism of the Roman Catholic Church provides a strong definition, quoting from Pope Pius XII as follows:
An error, “today abundantly widespread, is disregard for the law of human solidarity and charity, dictated and imposed both by our common origin and by the equality in rational nature of all men, whatever nation they belong to.”
The text goes on:
Solidarity… presupposes the effort for a more just social order where tensions are better able to be reduced and conflicts are more readily settled by negotiation.
Perhaps universally across humanity, a clenched fist is the opposite. It’s a sign of inner tension and of outward aggression. The idea modern radicals are striving to infuse into social justice and solidarity is older, arguably reactionary: solidarity with one group in opposition to another. That is the meaning of a closed fist.
Pick up any self-improvement book about communication strategies, and you’ll likely find sections about presenting with an open posture. They’ll encourage an open stance and open hands. That is a posture of solidarity. “I am open to what you have to say. I am interested in you, as a human being.”
When a mob of young adults surrounded and berated an outdoor diner in Washington, D.C., in August, they were not asking her to show her openness to all humanity. They were demanding that she take their side against another. Moreover, they have drawn a hard line by which solidarity with them means repudiation of the Other. Thus, those promoting so-called “anti-racism” hold that it is insufficient not to be racist; one must actively oppose those on whom they affix the patch of “racist.”
Providing this open check to solidarity hucksters is the price of being recognized as a human being. Just so did a Black Lives Matter activist recently refuse to speak with John DePetro, saying that he was the only journalist who had refused to pay that price.
Lindsay Iadeluca, of WJAR, channel 10, recently wrote on Twitter that, “Social justice isn’t a personal view. It’s human decency.” Such a statement from a journalist might be defensible under the Catechism’s definition of social justice. Under the activists’ oppositional, aggressive definition of “solidarity,” it is not.
What Iadeluca (or the trap that caught her) has done is to smuggle one definition of a word into a principle of behavior that is appropriate to a different definition. Ostensibly neutral reporters can side with social justice when it means “the good of all humanity,” but changing the meaning to “support for this ideological group against another” rewrites the social agreement under which journalists operate.
Writer Andrew Sullivan encountered an intellectual’s version of this recently. He was forced out of New York magazine because it became suddenly controversial that he had published an extract from the book The Bell Curve alongside “13 often stinging critiques” for a 1994 issue of the New Republic. That is, he published a baker’s dozen of essays about a controversial book and provided readers with a little bit of context from the book itself. He now writes:
The fact I had not recanted that decision did not, mind you, prevent TIME, the Atlantic, Newsweek, the NYT and New York magazine from publishing me in the following years. But suddenly, a decision I made a quarter of a century ago required my being canceled.
Again, the terms of a social agreement were changed under Sullivan’s feet. Presenting information alongside responses to it was just expected of an intellectual magazine back then, and acknowledging that some question of science is still open when it is, in fact, open was expected of intellectuals. Now those acts are, as a New York Times reporter editorializing about Sullivan insists, indefensible.
Our society has been on this path for a long while, even if the precipitous slope is a more-recent development. Perhaps the most intellectually striking sentence Andrew Sullivan has ever written was in his 1995 book, Virtually Normal, making the case for same-sex marriage.
Some might argue that marriage is by definition between a man and a woman; and it is difficult to argue with a definition. But if marriage is articulated beyond this circular fiat, then the argument for its exclusivity to one man and one woman disappears.
The institution of marriage — and the laws that evolved around it — were premised on the definition that it was a relationship involving members of the opposite sex. But, suggested Sullivan, if we just change that definition, then the meaning, the institution, and the laws will all change around it. Well… of course.
Unfortunately, rationalizing that route to social change has consequences, because one can’t simply say, “as far as I want and no farther.” Once we accepted that an intellectual time machine could go back in history and change words and, thereby, rules, we opened the way for ideologues to mass produce such vehicles.
Sex distinction was written out of “marriage” (with number distinctions now following). The concept of a unified humanity, with understanding of its differences, is now being written out of “social justice.” And openness is being written out of “solidarity.”
If these things are done as, and understood to be part of, the evolution of humanity (whether in a better or worse direction), then we recognize them as results of change. A debate has been conducted and a conclusion reached. We move forward with a new understanding of the terms. But if definition and context are simply rewritten, then one is always at fault for believing differently than others might believe in the future, because the definition and context behind one’s actions are simply erased.
No better emblem for the destination of this process could be found than a closed fist.
Featured image: Black Lives Matter activists demand conformity at an outdoor restaurant in Washington, D.C., on August 25, 2020.
Disagreement between those who emphasize science and those who emphasize religion tends to be reducible to a matter of ignoring boundaries. This can be direct, as when a religious person holds that a scientific finding cannot be true because it contradicts his or her faith, or indirect, as when a secularist implies that science is providing meaning or moral judgment.
One can see the indirect version — crossing boundaries by assumption — when scientists fail to include the possibility that a religious belief is correct while attempting to explain their findings. Whether they realize it or not, they have given to science the realm of discerning what is and left to religion the realm of what we want to believe. As we see in public education, the first principle is to leave God out of it and focus, instead, on human beliefs about Him when it can’t be avoided. This dictum crosses science’s boundary to smuggle in the belief that God can’t be real.
In a recent UPI article, Brooks Hays reports on a paper by psychology professor Adam Green. The fact that Green works at Georgetown University, an ostensibly Catholic college, illustrates how deeply ingrained is the faulty notion that real science is not permitted to acknowledge the possibility of God as an actually existent Being.
Green and his team found that people with a strong capacity for “implicit pattern learning” — meaning that they subconsciously pick up on patterns — were disproportionately religious, especially those who are very religious and believe that God “intervenes to establish order in the universe,” in Hays’s language. Without knowing that a pattern existed or being told that their task was to find patterns, these folks picked up on them.
Given increasing doubts about the replicability of social science experiments, put aside the question of whether Green’s findings are accurate. (Although they intuitively make sense.) Of interest, here, is the researchers’ conclusion:
“This is not a study about whether God exists, this is a study about why and how brains come to believe in gods,” said Green, who also serves as the director of the Georgetown Laboratory for Relational Cognition. “Our hypothesis is that people whose brains are good at subconsciously discerning patterns in their environment may ascribe those patterns to the hand of a higher power.”
If one takes away the defensiveness about possibly generating evidence of God’s existence, one can see that the conclusion isn’t strictly in line with the findings. It is the conclusion to which one would come when the first priority is to find an explanation that doesn’t possibly imply that religious people are correctly identifying something true in reality. Green’s experiment didn’t find that religious people were more likely to believe there was a pattern. Rather, it found that they were more likely to spot a pattern that did exist.
Why wouldn’t the conclusion be that his religious subjects were more adept at spotting an actual divine pattern in the universe? They weren’t more likely to want to see a pattern, and they weren’t more likely to think they’d identified a pattern when there wasn’t one. They subconsciously understood what the pattern was in the experiment. That doesn’t prove that their powers of discernment must extend to the broad complexities of the universe, but it’s possible.
If the study had found that people good at implicit pattern learning were disproportionately successful stock traders, our first hypothesis wouldn’t be that they “ascribe” patterns to the market’s animal spirits. It would be that they’re good at seeing patterns, which gives them an all-important edge in picking and choosing investments at split-second speeds.
Indeed, in that case, investment firms might start hiring Green to test their job applicants!
This isn’t to say that we should use the pattern test to find people who can tell us what to believe about the universe. But if the role of science is to discern and describe what is, it will be seriously handicapped to the extent its practitioners refuse to acknowledge that people’s beliefs can be true… even if they derive from some other source than a rational experiment.
Featured image: A serendipitous cloud formation captured by Carl Tidy.
The pop-cultural interpretation of Robert Frost’s famous poem, “The Road Not Taken,” is that it is an encouragement to diverge from the herd — to discern the path that others have eschewed and choose that one.
A professor of literature will teach, or at least would have taught some twenty years ago, that the poem is actually ambiguous. The narrator never states which path he took, or even whether he intended to take “the one less traveled by.” He says that he will “be telling this with a sigh,” not now, but in the distant future, without detailing what kind of a sigh he means or what difference was made. The reader can’t really say whether the narrator has already confirmed that he took the less-traveled road and that it made a difference or rather is still predicting what will prove to have been the case. A different kind of society than ours might very well read this poem as a lamentation over inadvisable rebellion.
Perhaps what captures the imagination in “The Road Not Taken” is not the decision, but the affirmation that the decision matters. This wasn’t the only choice the narrator had to make. “Way leads on to way”; one choice follows the previous. In our more-or-less comfortable half-century, we hunger for the sense that it makes a difference whether we go this way or that, stand up or sit down, live or die.
For this reason, a tweet caught my eye a few weeks back from blogger, writer, and Roman Catholic priest Father Dwight Longenecker announcing the release of his latest book, Immortal Combat. The title and the cover both scream that what we do (what we decide) does, in fact, matter.
Fr. Longenecker describes the world that Jesus entered and the way he acted as the archetypal “secret son” — the unknown savior who sneaks past the enemy unseen until it is too late. He writes of the historical Gehenna, where the cult of Moloch would slide children into their idol’s mouth to the fire within as a sacrifice. With the death and resurrection of Jesus, it was as if the Father had sent in a child whom the flames could not harm, and who could ensure that no child need ever burn again.
“The torture, death, and resurrection of the Lord Jesus broke open the floodgates of a new power of life and love in the world” (p. 121). The modern Western ear has heard such phrases again and again, to the point that they seem clichés of poetic dogma. What does that mean? How does it work? Jesus walked the Earth two thousand years ago. If he defeated Satan, then what are we still doing wandering around in our Original Sin?
We are not, Fr. Longenecker assures us, engaged in a mop-up operation, simply cleaning up the mess of the battle after it has ended. Nor are we simply biding time until He comes again. And we’re certainly not living in the Eden of Christ’s victory. Rather, “every action of self-sacrifice — no matter how secret and small — helps to bring alive in every moment and hammer home the eternal victory of the Crucified” (p. 125).
Expanding beyond the argument of Immortal Combat, this line of thought recalls the notions of paganism discussed in this space last week. The pagan world was one of battling tribes under competing gods. In our “post-Christian” world, that sense of society is returning, only our paganism has gotten more abstract.
The battle between tribes once was a battle between their gods. Whoever won a battle, that army’s gods were said to be with them. In humanity’s early awareness of Yahweh, the God called “I am” was simply the greatest god. Pharaoh obstinately refused to heed Moses as long as his magicians were able to replicate Yahweh’s miracles. Only when Aaron and Moses brought forth gnats from the dust did the magicians credit the unique power of God, and even then Pharaoh needed more evidence.
This sort of divine battle faded with the concept that there is only one God, and our task is to discern His will. The battle then became a question of who was right. Bloody as those disputes have been, the change is still an improvement, because at least everybody is focused on the same divine person, presumably with an agreed-upon text as a guide.
But now, relativism has brought the pagan divergence back in a worse way. Instead of all having different gods, we suddenly find ourselves in different realities. The battle isn’t to defeat the others’ God; it’s to obliterate their entire understanding of the meaning of the universe. It is no longer “your god is defeated”; it’s “God is dead.”
If the will and power of the gods were once known by who won in battle, the warriors were, in a sense, acting as the bodies of those gods. In the abstract battles of relativism, we more literally give body to competing realities defined by different meanings, different intentions for creation, which is to say, different capital-G Gods. Thus, when we “bring alive in every moment and hammer home the eternal victory of the Crucified,” in Fr. Longenecker’s words, we are giving Christ a Body that makes Him real. To an ecumenical relativist, this can mean that every variation of God is equally valid. To the nihilist, a God who requires human beings to will Him into existence cannot be real in the first place, so there can be no such thing as God.
Years ago, I proposed that we should see reality as a mesh: infinite threads of probability leading from one moment to the next, with free will being our ability to choose which path to take. In that model, souls communicate and pull each other toward one direction or another. The idea might appeal to relativists except that God is the ultimate reality, being the One who existed before He created this web of possibilities called the universe and the One who will continue to exist when it has run its course.
To give relativists their due, free will still means (must mean) that we can drift off so far from God that we become practically incapable of hearing His Spirit communicating across the threads. We can choose a different intention of the universe, a different meaning, which is synonymous with a different God, by making that the underlying principle that guides our choices from moment to moment. The more people we draw in, the more real that God becomes. Even so, the physical rules of how the universe works, and the rules of our human nature, are only fully in harmony with the God who created them.
The continuing mission of Christians, therefore, is to live along the path (the physical dimension) in which God saves us all. Fr. Longenecker comes very close to this precise statement when he writes that “The Way of the Lamb… means living in a new dimension of reality — a supernatural renewal of heart and mind that draws us ever closer into an intimate union with the Lord Jesus Himself” (p. 123).
Even if we conclude that our decisions matter in this way, the question of our lives remains: How do we know that we’re actually on the right path and not just talking ourselves into believing it for nothing? How do we know that “somewhere ages and ages hence,” as Robert Frost put it, we will be assured that our path made all the difference, and how can we be assured, unlike Frost’s narrator, that the difference will have been positive?
The answer draws on the two great sides of our personalities — our two basic ways of knowing. We first must have faith, drawing on the intuition of our consciences and our feelings. We must then apply our scientific reason. If we find that the velocity of our understanding is toward harmony of fact, feeling, and faith, then our chosen path is more apt to be true. If we find, however, that “way leads on to way,” and that it is proving necessary to deny physical reality because it is out of harmony with the Reality we take as the purpose of the universe, we should retrace our steps.
Featured image: Cover art from Immortal Combat, by Fr. Dwight Longenecker.