Living a Meaningful Life

For my students:

Living a Significant Life

When many people discuss what they want to achieve, they talk about being “successful.” In their view of the world, success is the ultimate goal in life. But have you ever thought about taking it one step further and becoming significant? Just take a look around you, and you’ll see that most people use their knowledge, resources, and experience to acquire things and status in an attempt to satisfy their personal desires. This, in their minds, constitutes success. But becoming significant means using your knowledge, resources, and experience to serve and benefit others.

“Success is indeed a journey, but if you stop at [only] adding value to yourself, you miss the reward of significance.”

—John C. Maxwell

For those who truly understand life, success and significance are one and the same. However, for a significant portion of our society, success begins and ends with the achievement of a certain list of personal goals with little regard to the impact on others. These people confuse success with significance, and regardless of their wealth and professional accomplishments, they won’t acquire the true greatness that only comes through making significant contributions to something other than oneself. Success is in the eye of the beholder, whereas significance is a view of you that is held by others.

Moving from success to significance relies completely on motivation. Are you solely seeking fun, fame, fortune, and recognition, or are you seeking to serve and benefit others with what you have? A genuine desire to help others comes from the heart, not just the head. “Significant” means weighty and highly meaningful, but if we ignore the simple things, we risk being inauthentic. Significance is embedded in everyday things, and making the simple a priority can make a big difference. Pursuing significance takes us out of our comfort zone, but once you’ve sensed it, nothing else will satisfy you.

More than anything, significance means instead of finding joy in your own success, your joy is the result of the success of others. Adding value to others will become most important to you, and you will be able to say that you love what you do and feel that it’s making a difference in the lives of others.

A great way to start is by being consistently present in others’ lives. Being the person who can be counted on to pick up the phone, be at your desk, answer the text, or simply listen, is invaluable and sets you apart from those who have agendas or are projecting care that isn’t genuine. Few successful people actually make the transition to significance, but every person of significance is successful.

The journey to significance takes time. Set the bar high for yourself by reevaluating your goals and objectives to ensure that you’re on a path towards significance. Don’t allow yourself to be blinded and halted by your own success. Rather, leverage that success to build a lasting and significant legacy of which you and your family can be proud. If you have the ability to live successfully, you have the ability to live significantly.


Americans Don’t Want Their Children to Become School Teachers

The following article, which appeared today on the Daily Wire site, should startle everyone who reads it. While I am happy that enrollment in UW’s teacher preparation program seems to be up (anecdotal, because I am not sure anyone really knows the number), I know that by semester’s end a good percentage of these starry-eyed freshmen will have dropped Education as a major. And it couldn’t come at a worse time. As examples:

I could go on.

Everyone has their opinions as to why there’s a shortage, but I think this article pretty well sums it up: Teachers Want to Be Supported. And not just with more money (although that would be a start). Absent support and the honoring of those who choose to enter the profession, we will see more departures and recurring shortages.


Americans Don’t Want Their Children to Become School Teachers

By Jeremy Adams

DailyWire.com

Americans don’t want their own children to become schoolteachers. This is a sudden and troubling development and is one of the most important issues we must address if we are going to fix our schools and educate our children.

Last week, a disturbing poll about the state of the teaching profession was released.

One of the questions asked in the PDK “Poll of the Public’s Attitudes Toward the Public Schools” was, “Would you like a child of yours to become a public-school teacher in your community?”

Only 37% answered yes.

To give this number some historical context, consider that when the same question was asked in 2018, 46% answered in the affirmative. In 1969, 75% positively responded.

So, what’s going on here? And what does it say about the current state of American education?

To discover what’s at work, it is important to note that in the exact same poll Americans’ rating of their community public schools reached an all-time high in the 48-year history of the poll. A whopping 54% would give their community schools a grade of an “A” or a “B.” This fascinating duality, a public that gives its schools high marks yet wouldn’t want its own children to enter the profession, presents a crystal-clear reality: the job has become both unpleasant and unappealing.

Only 29% of the respondents cited poor pay as a reason for wanting their progeny to avoid the profession. Instead, they cited “the difficulties, demands, and stress of the job,” “a lack of respect or being valued,” and “a variety of other shortcomings.”

In Ohio last week, teachers went on strike, not in the oddball ideological tradition of the Chicago Teachers Union, but because they simply wanted air-conditioning and better working conditions.

When I travel and speak to teachers across the country, their complaints are about aberrant student behavior, poor working conditions, and disrespect from the public more often than paltry pensions or subpar pay.

Jake Miller, an award-winning teacher who quit the teaching profession last year, powerfully explained in an op-ed:

Maybe it was the in-service where my colleagues and I wanted time to catch up on emails, grading, parent phone calls, and other things lost in the substitute shortage shuffle. Instead, we were finger painting.

Maybe it was the day prior when I found 3 inches of urine flooding out of the boys’ bathroom.

Or the day when a student hurt themselves and, after reporting the situation, they didn’t get the help they needed and returned to class the next day.

Mr. Miller isn’t being dramatic or engaging in rhetorical bravado. He is telling the truth and we would be wise to listen.

Students are overdosing in bathrooms. Violence toward teachers has increased in recent years. The pernicious obsession with cell phones has robbed students of anything resembling a healthy attention span. Fellow citizens — and yes, I am sadly thinking of a great many of us — mistakenly equate all teachers with teachers’ unions and the broken-souled educators on Libs of TikTok.

We can scream about CRT and the 1619 Project (I have), but really, at the end of the day, I just want my students to be able to make eye contact, learn how to take lecture notes, understand why they can’t listen to music through earbuds when class is going on, and maybe gain a revitalized eagerness to learn in so doing.

Far too many of our students — especially the most at-risk — don’t sleep well. They don’t eat well. They don’t exercise. They don’t socialize. They are utterly stressed out and plagued with “anxiety.” They do not have adult exemplars in their lives. They live their lives untethered to the nourishing power of high expectations and real accountability. Violence and drug use surround them. At the end of the day, it doesn’t matter how good a pitcher is if the rest of the team is off the field.

Of course, in an ideal world, schools would be palaces (with air conditioning). The most educated and talented people in America would enter the classroom because a democracy cannot survive if its citizens are not educated and imbued with the skills of reading, writing, and critical thought. Vacant teaching positions would garner multiple applications. Parents and teachers would work together instead of seeing each other as potential adversaries.

This is why Americans don’t want their children to teach. Not because it isn’t noble. Not because it isn’t important. Not because we want our children to take a vow of poverty.

No, it’s because the long-term habit of our policymakers is to view schools as meccas of social intervention and as hubs of public policy triage for a broken society. And to a certain extent, that makes sense. All of the social pathologies and community dysfunction present themselves on a daily basis on the frontlines of the American school system.

Teachers don’t need a million-dollar salary. They need to know that when they send out a disruptive student, that student won’t be back in class 20 minutes later. They want air-conditioning. They want a computer that isn’t over 10 years old. They want to be able to abolish cell phones in their classrooms without mom and dad breathing down their necks.

This isn’t rocket science. It’s common sense.

Sadly, common sense isn’t so common these days.


[Russo: I do not allow technology in my classrooms, which, for college, is somewhat unheard of. But after I explain the “why,” my students gladly put away their phones, tablets, and laptops. And for what it’s worth, most if not all of my students would gladly see the so-called “smart classroom” go the way of the buggy whip. We have invested money where investment wasn’t needed. Instead, we need to stand back and decide what is truly important. Maybe a return to blackboards and chalk would be a start. But then I’d hear from OSHA. God help us.]


An Open Letter to Incoming Freshmen

Dear College Freshman,

The University of Wyoming is back in session. I am happy about it because I love to teach.

But this year we begin with the backdrop of President Joe Biden’s $300 billion plan to forgive student debt. Is that decision a brilliant idea that will relieve a burdened generation? Or a terrible policy that leaves the working class footing the bill for the country’s college graduates?

That debate will rage in the so-called “public square.” Millions are already talking about it and soon so will you. I can imagine now how it will result in all manner of protests and moralizing on either side, all of which will be a waste of time.

Remember – you are here to learn how to learn, not to protest.

So, with this letter, I want to give you my perspective, albeit rather dated (I began college in 1974). I hope to impart some hard-won words of wisdom, specifically surrounding all the false choices you will face — and therefore, the challenge of living authentically on campuses where sameness has become the rule.

They say that our universities are charged with educating tomorrow’s citizens, since they are said to be the incubators of our ideas, our language, our social movements, and our politics. I don’t necessarily agree. I think that you can become a citizen, perhaps even a better one, having never gone to college at all.

In the end, the value of a real education lies in learning how to make some Important Choices. Not the meaningless ones. As examples:

Your new “.edu” inbox is by now undoubtedly full of emails from administrators asking you about your preferred dining options—Are you vegan? Do you want your meals to go? Are you sure you’re not a vegan? And roommates—How do you feel about night owls? How clean do you expect your new BFF to be? Is it okay if this person smokes? And all the clubs you might join. Badminton, Brazilian Jiu Jitsu, Glee, Ski Racing, The ChicanX Caucus, Anime Society . . . you name it, they’ve got it.

Then, there’s your preferred pronoun. Not exactly an essential or Important Choice, but such is the college campus of today.

Or how about your dorm room? Like so many of us “first worlders,” you’ve probably spent hours on Amazon, shopping for the perfect dorm room essentials. Perhaps a beautiful duvet?

And then, once you’re on campus, you’ll have to choose your special spot in the library, your seat in the lecture hall, your on-campus gym, and your off-campus bar (with choices aplenty as to the state you want on your fake ID).

But before you devote untold hours to mapping out exactly how you think the next four years of your life will go, I want to offer you this thought:

None of those decisions really matter.

They’re not important because they’re not real. They might feel like actual choices – I know they did to me when I started school – but I have come to understand them as fake ones. They distract us all from the fact that college has become a place where students no longer make hard intellectual and moral choices; in other words, the choices that actually matter.

Beneath the surface of all those meaningless choices, whose purpose is to assert meaningless differences, is an overpowering sameness. You may have different diets and duvets, but let me warn you: In the end, you are all supposed to think the same. It will be forced upon you by an array of social forces, cues, nods, and emojis. Sometimes it’s overt. Sometimes it’s just a vibe.

I suspect it’s already familiar to many of you. It’s the lack of discussion in lecture halls because people think they think the same things or because they’re afraid to use the wrong pronoun and then get called out. It’s the rallies where people have no idea what they’re calling for and don’t care because they just want their presence to be noted, tagged, and liked. It’s the students and professors and Teaching Assistants nodding along, clapping uproariously, at all the correct platitudes that nobody bothers to unpack: Free Palestine! Defund the police! Make college free! Believe all women! (All of them? Really?)

But sometimes, unbelievably, it will be the quiet studious professor with decades of experience and accrued wisdom who, quietly, behind closed doors, acknowledges that your unpopular opinion—the one you were bold or dumb enough to give voice to in class—was actually valid, that he didn’t want to say as much in front of his students, because, you know, it’s easier to go along. (True story.)

There’s this dead white guy, Baruch Spinoza, who tells us that the only true freedom is freedom of thought. It comes from what you believe, not what you say you believe or what you look like. If Spinoza were to reappear on campus today, he’d see a lot of people with different colored hair and tattoos and piercings who insist they are “living their truth” but are, in fact, unwitting prisoners of someone else’s truth (whatever it may be).

Go ahead: Tattoo and pierce your beautiful legs, your muscled arms, your eyebrows, whatever. But please, I beg you, don’t do it to fit in.

Often, these choices are subtle. Take, for example, my lifelong friends. I have exactly four of them. I’m not a weirdo. It’s just that in my lifetime, it’s come down to the four people who insisted on being free; people who have shown me, more than anyone else, what it means to stand up for something, to make a cogent argument, to listen, and possibly to admit to being wrong. These are the people whom I admire and respect and really, truly know. It is because of these four friendships that I am now inching toward old age with the knowledge that I didn’t need to be afraid of being myself. They didn’t surround themselves with like-minded people, and they encouraged me not to either. It was subtle encouragement, but it was real.

We had a history teacher way back in high school whose motto was, “ask many [questions and people], listen to all [the people and answers], then make up your own mind.”

That was high school, and the class was an elective. I didn’t need to take it, nor did my friends. But without question, it taught us how to think, how to distinguish between fake and real choices, how to find meaning in other people — and ultimately, how to make up our own minds.

In college, I was encouraged to take ethics courses as electives. I am so glad that I did. I ended up minoring in Business Ethics and in that process, learned so much about definitions of human nature. And I had to “put up” with so many of the now-accepted relativistic ideas surrounding morality. I had to learn to stand down from “putting up” with those ideas and, instead, seek first to understand what the hell they were talking about. I asked questions, listened intently, then made up my own mind.

Make sure you do that too. There will be plenty of oddball arguments made in class, some or all of which will have your eyes rolling. But resist if you can. Ask penetrating questions, listen intently, and then … you got it … make up your own mind.

In other words: Try to escape sameness.

One other important choice my friends and I made was to go to class. Attendance matters.

First of all, you paid for it. Whether you borrowed the money (put it on a credit card), your parents are paying, or you are paying for it yourself, not going to class is like paying for a hotel room you never use.

Also worth considering, and I am being snarky here, is the idea that unless you go to class, you won’t learn the subject matter. Maybe you know it all – fair enough – but what you don’t know is how your particular professor teaches the material. You won’t learn the subtleties that come from his or her energetic lecture on that material.

Most importantly, you won’t be practicing self-discipline.

Go to class.

I certainly had opportunities to skip classes. In my days in college, the Vietnam War was the cause du jour, and plenty of classes were empty as “students” (using that term loosely) went out to protest. The war didn’t end any earlier, by the way. Meanwhile, I aced examinations that they didn’t “ace,” simply because I was in class, and they weren’t.

I kept going. Not just because my parents, who’d paid for it all, expected me to, but because I (me, myself, and I) had elected to take the class. I wasn’t about to skip class just because a bunch of ideologues were telling me what I was supposed to believe. I refused to conform to the mob, to the blob, to give in to the sameness. Discipline.

To quote a recent article,

So, as I sat in the lecture hall chair, biting my nails and tapping my foot anxiously against the floor, I debated my next course of action. If I stayed seated, I would surely be excluded from the study guides that my friends would pass around in group meetings.

I pondered getting up with everyone else and walking out of the lecture hall. If I did walk out, I told myself, I wouldn’t chant along with everyone else or clap or add to the madness. I’d simply leave class, keep my head down, and be rewarded with study guides and maybe even a Facebook “friend.” And I’d try to pretend I wasn’t ashamed of myself.

And then I thought: No way.

He was right to stay.

Please don’t become part of the reason why in America, in the 21st century, a teacher needs a bodyguard to teach class. Or a civil rights attorney to fight the inevitable complaints filed against him. Or, worse, the reason why a teacher avoids the tough stuff.

For it is the tough stuff you are here to learn about.

Stay in class. Listen. Take notes. Ask the proverbial stupid questions. Do the readings.

Yes, you must act according to your conscience—choosing to stand apart without falling apart—but please see that as the most important, real choice you will make.

Duvets don’t matter. Indeed, all the stuff of country-club living that our universities offer does not matter.

What matters is learning how to think.

Yours sincerely,

/s

Dr. Russo

August 2022


His Motive was Unclear?

A man stabs Salman Rushdie and The New York Times says, “the attacker’s motive was unclear.”

Unclear? Seriously? Are we that compromised by relativistic motives and legalisms? Do they not recall the fatwa issued against Mr. Rushdie’s life? I am surprised they didn’t add “alleged” to that sentence. But then again, even the NYT couldn’t ignore the video of the brutal attempted murder.

Bari Weiss would know. She left the NYT in disgust after the blowback it got a few years back for publishing an op-ed by a Republican senator, Tom Cotton. It marked the beginning of the “words are violence” trend in America (and elsewhere), and she knew it.

I am old enough to remember how “sticks and stones may break my bones, but words can never hurt me.”

Today?

You know the answer. All of us keep our mouths shut. Free speech is dead.

This piece, authored by Ms. Weiss at her blog (her work, not mine), is excellent. I put it here for my own reference, and perhaps yours.


We Ignored Salman Rushdie’s Warning

by Bari Weiss

We live in a culture in which many of the most celebrated people occupying the highest perches believe that words are violence. In this, they have much in common with Iranian Ayatollah Ruhollah Khomeini, who issued the first fatwa against Salman Rushdie in 1989, and with Hadi Matar, the 24-year-old who, yesterday, appears to have fulfilled his command when he stabbed the author in the neck on a stage in Western New York.

The first group believes they are motivated by inclusion and tolerance—that it’s possible to create something even better than liberalism, a utopian society where no one is ever offended. The second we all recognize as religious fanatics. But it is the indulgence and cowardice of the words are violence crowd that has empowered the second and allowed us to reach this moment, when a fanatic rushes the stage of a literary conference with a knife and plunges it into one of the bravest writers alive.

I have spoken on the same stage where Rushdie was set to speak. You can’t imagine a more bucolic place than the Chautauqua Institution—old Victorian homes with screened-in porches and no locks, a lake, American flags, and ice cream everywhere. It was founded in 1874 by the Methodists as a summer colony for Sunday school teachers. Now, it attracts the kind of parents and grandparents who love Terry Gross and never miss a Wordle. It is just about the last place in America where you would imagine an act of such barbarism.

And yet as shocking as this attack was, it was also 33 years in the making: The Satanic Verses is a book with a very bloody trail.

In July 1991, the Japanese translator of the condemned book, Hitoshi Igarashi, 44 years old, was stabbed to death outside his office at the University of Tsukuba, northeast of Tokyo. The same month, the book’s Italian translator, Ettore Capriolo, was also stabbed—this time, in his own home in Milan. Two years later, in July 1993, the book’s Turkish translator, the prolific author Aziz Nesin, was the target of an arson attack on a hotel in the city of Sivas. He escaped, but 37 others were killed. A few months later, Islamists came for William Nygaard, the book’s Norwegian publisher. Nygaard was shot three times outside his home in Oslo and was critically injured.

And those are just the stories we remember.

Think back to 1989, when 12 people were killed at an anti-Rushdie riot in Mumbai, the author’s birthplace, where the book was also banned. Five Pakistanis died in Islamabad under similar circumstances.

As for Rushdie himself, he took refuge in England, thanks to round-the-clock protection from the British government. For more than a decade, he lived under the name “Joseph Anton” (the title of his memoir), moving from safe house to safe house. In the first six months, he had to move 56 times. (England was not immune from the hysteria: Rushdie’s book was burned by Muslims in the city of Bradford—and, at the advice of police, two WHSmith shops there stopped carrying it.)

Salman Rushdie has lived half of his life with a bounty on his head—some $3.3 million promised by the Islamic Republic of Iran to anyone who murdered him. And yet, it was in 2015, years after he had come out of hiding, that he told the French newspaper L’Express: “We are living in the darkest time I have ever known.”

You would think that Rushdie would have said such a thing at the height of the chaos, when he was in hiding, when those associated with the book were being targeted for murder. By 2015, you might run into Rushdie at Manhattan cocktail parties, or at the theater with a gorgeous woman on his arm. (He had already been married to Padma, for God’s sake.)

So why did he say it was the “darkest time” he had ever known? Because what he saw was the weakening of the very Western values—the ferocious commitment to free thought and free speech—that had saved his life.

“If the attacks against Satanic Verses had taken place today,” he said in L’Express, “these people would not have defended me, and would have used the same arguments against me, accusing me of insulting an ethnic and cultural minority.”

He didn’t have to speculate. He said that because that is exactly what they did.

See, when Salman Rushdie was under siege, the likes of Tom Wolfe, Christopher Hitchens, Norman Mailer, Joseph Brodsky, and Seamus Heaney stood up to defend him. The leader of the pack was Susan Sontag, who was then president of PEN America, and arranged for the book to be read in public. Hitchens recalled that Sontag shamed members into showing up on Rushdie’s behalf and showing a little “civic fortitude.”

That courage wasn’t an abstraction, especially to some booksellers.

Consider the heroism of Andy Ross, the owner of the now-shuttered Cody’s Books in Berkeley, which carried the book and was bombed shortly after the fatwa was issued.

Here’s Ross:

“It was pretty easy for Norman Mailer and Susan Sontag to talk about risking their lives in support of an idea. After all they lived fairly high up in New York apartment buildings. It was quite another thing to be a retailer featuring the book at street level. I had to make some really hard decisions about balancing our commitment to freedom of speech against the real threat to the lives of our employees.”

After the bombing, he gathered all of his staff for a meeting:

“I stood and told the staff that we had a hard decision to make. We needed to decide whether to keep carrying Satanic Verses and risk our lives for what we believed in. Or to take a more cautious approach and compromise our values.  So, we took a vote. The staff voted unanimously to keep carrying the book. Tears still come to my eyes when I think of this. It was the defining moment in my 35 years of bookselling. It was at that moment when I realized that bookselling was a dangerous and subversive vocation. Because ideas are powerful weapons, I didn’t particularly feel comfortable about being a hero and putting other people’s lives in danger. I didn’t know at that moment whether this was an act of courage or foolhardiness. But from the clarity of hindsight, I would have to say it was the proudest day of my life.”

That was the late 1980s.

By 2015, America was a very different place.

When Rushdie made those comments to L’Express it was in the fallout of PEN, the country’s premier literary group, deciding to honor the satirical French magazine Charlie Hebdo with an award. Months before, a dozen staff members of Charlie Hebdo had been murdered by two terrorists in their offices. It was impossible to think of a publication that deserved to be recognized and elevated more.

And yet the response from more than 200 of the world’s most celebrated authors was to protest the award. Famous writers—Joyce Carol Oates, Lorrie Moore, Michael Cunningham, Rachel Kushner, Michael Ondaatje, Teju Cole, Peter Carey, Junot Díaz—suggested that maybe the people who had just seen their friends murdered for publishing a satirical magazine were a little bit at fault, too. That if something offends a minority group, perhaps it shouldn’t be printed. And those cartoonists were certainly offensive, even the dead ones. These writers accused PEN of “valorizing selectively offensive material: material that intensifies the anti-Islamic, anti-Maghreb, anti-Arab sentiments already prevalent in the Western world.”

Here’s how Rushdie responded: “This issue has nothing to do with an oppressed and disadvantaged minority. It has everything to do with the battle against fanatical Islam, which is highly organized, well-funded, and which seeks to terrify us all, Muslims as well as non-Muslims, into a cowed silence.”

He was right. They were wrong. And their civic cowardice, as Sontag may have described it, is in no small part responsible for the climate we find ourselves in today. (As I wrote this, I got a news alert from The New York Times saying the attacker’s “motive was unclear.”)

Motive was unclear?

The words are violence crowd is right about the power of language. Words can be vile, disgusting, offensive, and dehumanizing. They can make the speaker worthy of scorn, protest, and blistering criticism. But the difference between civilization and barbarism is that civilization responds to words with words. Not knives or guns or fire. That is the bright line. There can be no excuse for blurring that line—whether out of religious fanaticism or ideological orthodoxy of any other kind.

Today our culture is dominated by those who blur that line—those who lend credence to the idea that words, art, song lyrics, children’s books, and op-eds are the same as violence. We are so used to this worldview and what it requires—apologize, grovel, erase, grovel some more—that we no longer notice. It is why we can count, on one hand—Dave Chappelle; J.K. Rowling—those who show spine.

Of course, it is in 2022 that the Islamists finally get a knife into Salman Rushdie.

Of course, it is now, when words are literally violence and J.K. Rowling literally puts trans lives in danger and even talking about anything that might offend anyone means you are literally arguing I shouldn’t exist.

Of course, it’s now, when we’re surrounded by silliness and weakness and self-obsession, that a man gets on stage and plunges a knife into Rushdie, plunges it into his liver, plunges it into his arm, plunges it into his eye.

That is violence.


Postscript: Rushdie will likely lose an eye and could be permanently disabled in a variety of other ways.



The Business of Business is Business

First, it was that business had to “save the planet.” Then it was that business had to save the whales, the dolphins, and other assorted species. Today it is ESG – environmental, social, and governance – code for social justice and the so-called “triple bottom line.” It is all bullshit. In the end, a business must make a profit to stay in business. One profit, one bottom line. And by staying in business, it employs people who can take their earnings and do with them whatever they please. But for a business to abandon its mission of making a profit by producing goods and services that consumers and other businesses want or need is to commit a kind of misfeasance.

This article, written 50+ years ago by Dr. Milton Friedman, is seminal. I am capturing it here for my own future reference and as a memory of when rational business minds tended the till.

The Business of Business is Business

The New York Times Magazine

September 13, 1970, originally titled:

The Social Responsibility of Business is to Increase its Profits

by Milton Friedman

When I hear businessmen speak eloquently about the “social responsibilities of business in a free-enterprise system,” I am reminded of the wonderful line about the Frenchman who discovered at the age of 70 that he had been speaking prose all his life. The businessmen believe that they are defending free enterprise when they declaim that business is not concerned “merely” with profit but also with promoting desirable “social” ends; that business has a “social conscience” and takes seriously its responsibilities for providing employment, eliminating discrimination, avoiding pollution and whatever else may be the catchwords of the contemporary crop of reformers. In fact, they are — or would be if they or anyone else took them seriously — preaching pure and unadulterated socialism. Businessmen who talk this way are unwitting puppets of the intellectual forces that have been undermining the basis of a free society these past decades.

The discussions of the “social responsibilities of business” are notable for their analytical looseness and lack of rigor. What does it mean to say that “business” has responsibilities? Only people have responsibilities.

A corporation is an artificial person and, in this sense, may have artificial responsibilities, but “business” as a whole cannot be said to have responsibilities, even in this vague sense. The first step toward clarity in examining the doctrine of the social responsibility of business is to ask precisely what it implies for whom.

Presumably, the individuals who are to be responsible are businessmen, which means individual proprietors or corporate executives. Most of the discussion of social responsibility is directed at corporations, so in what follows I shall mostly neglect the individual proprietor and speak of corporate executives.

In a free-enterprise, private-property system, a corporate executive is an employee of the owners of the business. He has direct responsibility to his employers. That responsibility is to conduct the business in accordance with their desires, which generally will be to make as much money as possible while conforming to the basic rules of the society, both those embodied in law and those embodied in ethical custom. Of course, in some cases his employers may have a different objective. A group of persons might establish a corporation for an eleemosynary purpose — for example, a hospital or a school. The manager of such a corporation will not have money profit as his objective but the rendering of certain services.

In either case, the key point is that, in his capacity as a corporate executive, the manager is the agent of the individuals who own the corporation or establish the eleemosynary institution, and his primary responsibility is to them.

Needless to say, this does not mean that it is easy to judge how well he is performing his task. But at least the criterion of performance is straightforward and the persons among whom a voluntary contractual arrangement exists are clearly defined.

Of course, the corporate executive is also a person in his own right. As a person, he may have many other responsibilities that he recognizes or assumes voluntarily — to his family, his conscience, his feelings of charity, his church, his clubs, his city, his country. He may feel impelled by these responsibilities to devote part of his income to causes he regards as worthy, to refuse to work for particular corporations, even to leave his job, for example, to join his country’s armed forces. If we wish, we may refer to some of these responsibilities as “social responsibilities.” But in these respects, he is acting as a principal, not an agent; he is spending his own money or time or energy, not the money of his employers or the time or energy he has contracted to devote to their purposes. If these are “social responsibilities,” they are the social responsibilities of individuals, not of business.

What does it mean to say that the corporate executive has a “social responsibility” in his capacity as businessman? If this statement is not pure rhetoric, it must mean that he is to act in some way that is not in the interest of his employers. For example, that he is to refrain from increasing the price of the product in order to contribute to the social objective of preventing inflation, even though a price increase would be in the best interests of the corporation. Or that he is to make expenditures on reducing pollution beyond the amount that is in the best interests of the corporation or that is required by law in order to contribute to the social objective of improving the environment. Or that, at the expense of corporate profits, he is to hire “hard-core” unemployed instead of better qualified available workmen to contribute to the social objective of reducing poverty.

In each of these cases, the corporate executive would be spending someone else’s money for a general social interest. Insofar as his actions in accord with his “social responsibility” reduce returns to stockholders, he is spending their money. Insofar as his actions raise the price to customers, he is spending the customers’ money. Insofar as his actions lower the wages of some employees, he is spending their money.

The stockholders or the customers or the employees could separately spend their own money on the particular action if they wished to do so. The executive is exercising a distinct “social responsibility,” rather than serving as an agent of the stockholders or the customers or the employees, only if he spends the money in a different way than they would have spent it.

But if he does this, he is in effect imposing taxes, on the one hand, and deciding how the tax proceeds shall be spent, on the other.

This process raises political questions on two levels: principle and consequences. On the level of political principle, the imposition of taxes and the expenditure of tax proceeds are governmental functions. We have established elaborate constitutional, parliamentary and judicial provisions to control these functions, to assure that taxes are imposed so far as possible in accordance with the preferences and desires of the public — after all, “taxation without representation” was one of the battle cries of the American Revolution. We have a system of checks and balances to separate the legislative function of imposing taxes and enacting expenditures from the executive function of collecting taxes and administering expenditure programs and from the judicial function of mediating disputes and interpreting the law.

Here the businessman — self-selected or appointed directly or indirectly by stockholders — is to be simultaneously legislator, executive and jurist. He is to decide whom to tax by how much and for what purpose, and he is to spend the proceeds — all this guided only by general exhortations from on high to restrain inflation, improve the environment, fight poverty and so on and on.

The whole justification for permitting the corporate executive to be selected by the stockholders is that the executive is an agent serving the interests of his principal. This justification disappears when the corporate executive imposes taxes and spends the proceeds for “social” purposes. He becomes in effect a public employee, a civil servant, even though he remains in name an employee of a private enterprise. On grounds of political principle, it is intolerable that such civil servants — insofar as their actions in the name of social responsibility are real and not just window-dressing — should be selected as they are now. If they are to be civil servants, then they must be selected through a political process. If they are to impose taxes and make expenditures to foster “social” objectives, then political machinery must be set up to make the assessment of taxes and to determine through a political process the objectives to be served.

This is the basic reason why the doctrine of “social responsibility” involves the acceptance of the socialist view that political mechanisms, not market mechanisms, are the appropriate way to determine the allocation of scarce resources to alternative uses.

On the grounds of consequences, can the corporate executive in fact discharge his alleged “social responsibilities”? On the one hand, suppose he could get away with spending the stockholders’ or customers’ or employees’ money. How is he to know how to spend it? He is told that he must contribute to fighting inflation. How is he to know what action of his will contribute to that end? He is presumably an expert in running his company — in producing a product or selling it or financing it. But nothing about his selection makes him an expert on inflation. Will his holding down the price of his product reduce inflationary pressure? Or, by leaving more spending power in the hands of his customers, simply divert it elsewhere? Or, by forcing him to produce less because of the lower price, will it simply contribute to shortages? Even if he could answer these questions, how much cost is he justified in imposing on his stockholders, customers, and employees for this social purpose? What is his appropriate share and what is the appropriate share of others?

And, whether he wants to or not, can he get away with spending his stockholders’, customers’, or employees’ money? Will not the stockholders fire him? (Either the present ones or those who take over when his actions in the name of social responsibility have reduced the corporation’s profits and the price of its stock.) His customers and his employees can desert him for other producers and employers less scrupulous in exercising their social responsibilities.

This facet of “social responsibility” doctrine is brought into sharp relief when the doctrine is used to justify wage restraint by trade unions. The conflict of interest is naked and clear when union officials are asked to subordinate the interest of their members to some more general social purpose. If the union officials try to enforce wage restraint, the consequence is likely to be wildcat strikes, rank-and-file revolts, and the emergence of strong competitors for their jobs. We thus have the ironic phenomenon that union leaders — at least in the U.S. — have objected to government interference with the market far more consistently and courageously than have business leaders.

The difficulty of exercising “social responsibility” illustrates, of course, the great virtue of private competitive enterprise — it forces people to be responsible for their own actions and makes it difficult for them to “exploit” other people for either selfish or unselfish purposes. They can do good — but only at their own expense.

Many a reader who has followed the argument this far may be tempted to remonstrate that it is all well and good to speak of government’s having the responsibility to impose taxes and determine expenditures for such “social” purposes as controlling pollution or training the hard-core unemployed, but that the problems are too urgent to wait on the slow course of political processes, that the exercise of social responsibility by businessmen is a quicker and surer way to solve pressing current problems.

Aside from the question of fact — I share Adam Smith’s skepticism about the benefits that can be expected from “those who affected to trade for the public good” — this argument must be rejected on the grounds of principle. What it amounts to is an assertion that those who favor the taxes and expenditures in question have failed to persuade a majority of their fellow citizens to be of like mind and that they are seeking to attain by undemocratic procedures what they cannot attain by democratic procedures. In a free society, it is hard for “good” people to do “good,” but that is a small price to pay for making it hard for “evil” people to do “evil,” especially since one man’s good is another’s evil.

I have, for simplicity, concentrated on the special case of the corporate executive, except only for the brief digression on trade unions. But precisely the same argument applies to the newer phenomenon of calling upon stockholders to require corporations to exercise social responsibility (the recent G.M. crusade, for example). In most of these cases, what is in effect involved is some stockholders’ trying to get other stockholders (or customers or employees) to contribute against their will to “social” causes favored by the activists. Insofar as they succeed, they are again imposing taxes and spending the proceeds.

The situation of the individual proprietor is somewhat different. If he acts to reduce the returns of his enterprise in order to exercise his “social responsibility,” he is spending his own money, not someone else’s. If he wishes to spend his money on such purposes, that is his right and I cannot see that there is any objection to his doing so. In the process, he, too, may impose costs on employees and customers. However, because he is far less likely than a large corporation or union to have monopolistic power, any such side effects will tend to be minor.

Of course, in practice the doctrine of social responsibility is frequently a cloak for actions that are justified on other grounds rather than a reason for those actions.

To illustrate, it may well be in the long-run interest of a corporation that is a major employer in a small community to devote resources to providing amenities to that community or to improving its government. That may make it easier to attract desirable employees, it may reduce the wage bill or lessen losses from pilferage and sabotage or have other worthwhile effects. Or it may be that, given the laws about the deductibility of corporate charitable contributions, the stockholders can contribute more to charities they favor by having the corporation make the gift than by doing it themselves, since they can in that way contribute an amount that would otherwise have been paid as corporate taxes.

In each of these — and many similar — cases, there is a strong temptation to rationalize these actions as an exercise of “social responsibility.” In the present climate of opinion, with its widespread aversion to “capitalism,” “profits,” the “soulless corporation” and so on, this is one way for a corporation to generate good will as a byproduct of expenditures that are entirely justified in its own self-interest.

It would be inconsistent of me to call on corporate executives to refrain from this hypocritical window-dressing because it harms the foundations of a free society. That would be to call on them to exercise a “social responsibility”! If our institutions, and the attitudes of the public make it in their self-interest to cloak their actions in this way, I cannot summon much indignation to denounce them. At the same time, I can express admiration for those individual proprietors or owners of closely held corporations or stockholders of more broadly held corporations who disdain such tactics as approaching fraud.

Whether blameworthy or not, the use of the cloak of social responsibility, and the nonsense spoken in its name by influential and prestigious businessmen, does clearly harm the foundations of a free society. I have been impressed time and again by the schizophrenic character of many businessmen. They are capable of being extremely farsighted and clearheaded in matters that are internal to their businesses. They are incredibly shortsighted and muddle-headed in matters that are outside their businesses but affect the possible survival of business in general. This shortsightedness is strikingly exemplified in the calls from many businessmen for wage and price guidelines or controls or incomes policies. There is nothing that could do more in a brief period to destroy a market system and replace it by a centrally controlled system than effective governmental control of prices and wages.

The shortsightedness is also exemplified in speeches by businessmen on social responsibility. This may gain them kudos in the short run. But it helps to strengthen the already too prevalent view that the pursuit of profits is wicked and immoral and must be curbed and controlled by external forces. Once this view is adopted, the external forces that curb the market will not be the social consciences, however highly developed, of the pontificating executives; it will be the iron fist of government bureaucrats. Here, as with price and wage controls, businessmen seem to me to reveal a suicidal impulse.

The political principle that underlies the market mechanism is unanimity. In an ideal free market resting on private property, no individual can coerce any other, all cooperation is voluntary, all parties to such cooperation benefit or they need not participate. There are no “social” values, no “social” responsibilities in any sense other than the shared values and responsibilities of individuals. Society is a collection of individuals and of the various groups they voluntarily form.

The political principle that underlies the political mechanism is conformity. The individual must serve a more general social interest — whether that be determined by a church or a dictator or a majority. The individual may have a vote and say in what is to be done, but if he is overruled, he must conform. It is appropriate for some to require others to contribute to a general social purpose whether they wish to or not.

Unfortunately, unanimity is not always feasible. There are some respects in which conformity appears unavoidable, so I do not see how one can avoid the use of the political mechanism altogether.

But the doctrine of “social responsibility” taken seriously would extend the scope of the political mechanism to every human activity. It does not differ in philosophy from the most explicitly collectivist doctrine. It differs only by professing to believe that collectivist ends can be attained without collectivist means. That is why, in my book “Capitalism and Freedom,” I have called it a “fundamentally subversive doctrine” in a free society, and have said that in such a society, “there is one and only one social responsibility of business — to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game, which is to say, engages in open and free competition without deception or fraud.”

Milton Friedman (1912-2006) received the Nobel in economic science in 1976. He was a leader of the Chicago school of economics, associated with the University of Chicago, and a lifelong advocate for free markets. His economic ideas influenced a generation of conservative politicians, notably Ronald Reagan and Margaret Thatcher, making him one of the most consequential economists of the 20th century.


Posted in Business, State of the Nation | Comments Off on The Business of Business is Business

Travesty of Travesty – Apparently, Some Doctors Out There are Pro-Life

Recently, University of Michigan medical students walked out of their white coat ceremony after their “demands” weren’t met. Problem is: They missed a transcendent lecture about staying human in an age of machines.

The following piece, appearing at Bari Weiss’ Common Sense blog on Substack, was written by a professor of medicine at UCSF. It is his work, not mine, and is worth pondering. Imagine – there are doctors who will speak publicly about being pro-life.

Amazing.


Dr. Kristin Collier is an assistant professor of internal medicine at the University of Michigan, where she has served on the faculty for 17 years. She is also the director of the medical school’s Program on Health, Spirituality and Religion, and her work has been published in journals including the Journal of the American Medical Association and the Annals of Internal Medicine.

Many describe her as a consummate physician and superb teacher—deeply liked and respected by her peers. That’s why, out of some 3,000 faculty at Michigan, Dr. Collier was chosen by students and her peers to be this year’s White Coat Ceremony speaker. The White Coat Ceremony is one bookend of medical school (graduation is the other), where students put on their white coats for the first time, take the Hippocratic oath, and begin the long path to becoming a doctor.

The trouble is that Professor Collier has views on abortion that are out of step with many Michigan medical students—likely the majority of them. She has stated that she defines herself as pro-life, though she does not state the extent of her position (i.e., whether she allows exemptions for rape or incest). In that same interview, in which she talks about her personal transformation from a pro-choice atheist to a Christian, she laments the intolerance for religious people among medical colleagues.

“When we consider diversity in the medical profession, religious diversity is not—should not—be exempt from this goal.”

After Michigan announced her speech, the university made it clear that Dr. Collier would not be addressing abortion in her talk. “The White Coat Ceremony is not a platform for discussion of controversial issues, and Dr. Collier never planned to address a divisive topic as part of her remarks,” the Dean of the medical school, Marshall Runge, wrote to students and staff earlier this month.

That didn’t stop hundreds of students and staff from signing a petition demanding Dr. Collier be replaced with another speaker. “While we support the rights of freedom of speech and religion, an anti-choice speaker as a representative of the University of Michigan undermines the University’s position on abortion and supports the non-universal, theology-rooted platform to restrict abortion access, an essential part of medical care,” they wrote. “We demand that UM stands in solidarity with us and selects a speaker whose values align with institutional policies, students, and the broader medical community.”

The school stood firm.

But on Sunday, just as Dr. Collier rose to give her remarks, dozens of students and family members began walking out of their white coat ceremony.

By now we are all accustomed to such displays from American students.

The particular shame here is that those who walked out missed a transcendent lecture about the meaning of practicing medicine in a culture that increasingly treats human beings like machines.

“The risk of this education and the one that I fell into is that you can come out of medical school with a bio-reductionist, mechanistic view of people and ultimately of yourself. You can easily end up seeing your patients as just a bag of blood and bones or human life as just molecules in motion,” Dr. Collier said.

“You are not technicians taking care of complex machines, but human beings taking care of other human beings,” she said. “Medicine is not merely a technical endeavor but above all a human one.”

Dr. Collier has handled the whole thing with grace. She tweeted yesterday: “I’ve heard that some of the students who walked out have been harassed and targeted—please stop. Everyone has a right to stand up for what they believe in.”

The logic of walking out of a lecture because one disagrees with the speaker on an unrelated topic apparently has no limit.

In medicine, abortion is an important life-or-death issue. So too are universal health care, immigration, and school closures. All these topics have the highest stakes. And all are controversial. If students walk out on speakers discussing unrelated issues, where does it end?

Would they learn about the nephron from a nephrologist who favors strict immigration limits? Could they learn how to perform CPR from an instructor who lobbied to keep schools open during Covid-19?

Most concerning, what does it mean for American patients if their future doctors cannot sit through a speech by a beloved professor who has a different view on abortion? Could you trust a physician knowing that they may judge you for holding views they deem beyond the pale?

As a professor at UCSF medical school, I worry deeply that we are not preparing our future doctors for practicing medicine on real people in the real world. Medicine has to meet patients where they are; often that means caring for people, and working with people, with whom we disagree.

We can’t walk out on that.

Posted in General Musings, Victimhood | 2 Comments

Right on Schedule: A Killer Virus Emerges Just Before an Important Election


I expect either COVID or monkeypox, or both, will be cited as yet another reason to stay away from the election this coming Fall.

Funny how that happens.

The following was written by Donald G. McNeil Jr. and appeared in Bari Weiss’ Common Sense blog. I am reposting it here for my own reference, and yours (should it be of value). It is their work product, not mine.

Mr. McNeil writes …


As if one plague weren’t enough, we now have another: monkeypox. And we’re not handling it well.

At the moment, unless you are a gay man with multiple anonymous or casual sexual partners, you are probably not at much risk. In this new non-African outbreak that began in May, most of the cases have been inside that network—at least thus far. In seven central and west African countries, there have been tens of thousands of suspected cases and hundreds of deaths attributed to the virus over the last two decades. This new outbreak has risen from a dozen cases in Portugal, Spain, and Britain in mid-May to more than 12,000 in over 50 countries, from Iceland to Australia. Over 1,000 of them are in the U.S., many of them in cities like New York, Los Angeles, San Francisco, and Chicago.

But there’s no guarantee it will stay inside that network. Monkeypox is transmitted by sex, by skin-to-skin contact, by towels and sheets, and possibly even by kissing or coughing by patients with sores inside their mouths. A few nurses have caught it from patients, as have family members. There’s much we don’t know about it yet.

Some sexually transmitted diseases—like AIDS—have stayed mostly inside gay male sexual networks. (In the West, that is. In Africa, more than 50 percent of HIV cases are in women and girls.) Some, like syphilis, circulate generally but are more common among gay men. Others like herpes and HPV are widespread among heterosexuals. There’s no way to know yet where monkeypox will go.

We have a vaccine—two vaccines, actually—and also a treatment, so we are in a far better position than we were at the beginning of HIV, Covid or virtually any other previous epidemic. But right now, it’s still spreading fast, with the global number of cases rising by about 1,000 a day.

If there are two effective vaccines for this disease and one solid treatment, why are we losing the fight?

I blame several factors: shortages of vaccines and tests, the initial hesitancy by squeamish health agencies to openly discuss who was most at risk, and the refusal by the organizers of lucrative gay sex parties to cancel them over the past few months—even as evidence mounted that they are super-spreader events.

The monkeypox virus—misnamed because it normally circulates in African rodents, not simians—is related to smallpox, but it’s not nearly as lethal. The successful 25-year effort to eradicate smallpox held it in check: The smallpox vaccine also prevents monkeypox.

But vaccination ended in 1980 because the old vaccines had some rare but very dangerous side effects. So, in the 1990s, monkeypox cases began reappearing in rural Africa, mostly in children born after 1980.

In central Africa, two strains circulate: one with a fatality rate of about 10 percent, another of about one percent. It now looks as if, sometime in 2017 in southeast Nigeria, an even less lethal but more transmissible variant of the second strain emerged. It has circulated in Nigerian cities since then and a few cases were found among Nigerians traveling to Europe and the U.S. One new study suggests that variant was circulating in Europe at least as early as this past March and picking up new mutations.

This May, it appeared among gay men, especially those who had visited four venues: the Darklands leather fetish festival in Belgium; the annual Pride Festival in Spain’s Canary Islands; a gay rave at Berlin’s Berghain techno club; and the Paraiso sauna in Madrid, which, since it had darkened cubicles for orgies, a bondage cell and a bar, was really more of a huge sex club than a spa.

Even though one “sex-positive” party after another has turned into a super-spreader event, there has been no willingness by the organizers of such parties to cancel or even reschedule them until more men can be vaccinated. June was Pride Month in New York, and cases are surging in the city now. Two recent parties in San Francisco, Electrolux Pride and the Afterglow Blacklight Discotheque, had cases linked to them. And yet more events, like Provincetown’s Bear Week, are going forward anyway.

Outside of Africa, the disease has not killed anyone yet. But for some victims, the pain—especially from pustules inside the mouth or rectum—is so severe that it requires hospitalization. Others feel miserable for weeks; a few get permanent scars. Also, those with monkeypox are supposed to stay in isolation until all the pox crust over and the scabs fall off, which can take up to a month.

The negligible fatality rate won’t necessarily persist if the virus escapes its current network: mostly young, mostly healthy adult men. In Africa, children and pregnant women are the most likely to die from monkeypox. (Older people, by contrast, are not. Many Africans born before 1980 and most Americans born before 1972 were vaccinated as children and doctors believe they still have some residual protection.)

We have two vaccines:

The older one, ACAM2000, approved in 2007, was made after fears were raised that Saddam Hussein had stocks of weaponized smallpox. The National Strategic Stockpile contains 100 million doses. But it has a very small risk of seriously hurting or even killing someone with undiagnosed H.I.V. or another immune-suppressive condition, or someone with widespread skin problems, such as eczema.  It also has about a one-in-500 risk of heart inflammation, which is scary but can usually be treated.

The newer one, Jynneos, made by Bavarian Nordic in Denmark and approved in 2019, is much safer. It has been tested on people with H.I.V. and with skin problems.  As of three years ago, Bavarian Nordic had supplied our Strategic National Stockpile with 28 million doses, but all those have expired. Nordic had a contract to make a longer-lasting freeze-dried version, but when this epidemic began, the stockpile held only 64,000 usable doses; another 800,000 are in bulk frozen form in Denmark. The company is currently converting them into a usable form. (According to New York magazine, the company did not do this sooner because, during the Covid pandemic, it couldn’t get the FDA to inspect its new factory. The Washington Post reports that the inspection has now been done and the vaccines are being processed and loaded onto freezer planes in batches of 150,000 doses at a time.)

We also have an antiviral medicine, tecovirimat or Tpoxx, which was developed by Siga Technologies as a defense against a smallpox bioterrorism event. It also lessens monkeypox symptoms and comes in both oral and intravenous forms. But many men are finding it hard to get because the Centers for Disease Control requires that men getting it be enrolled in a clinical trial.

So where are we now? As in all epidemics, we’re in the early “fog of war” period. We don’t have rapid tests, so we have no idea how many cases the country really has. (The current test requires swabbing a pustule—those may not appear until many days after the initial infection.) We don’t know all the ways it’s transmitted. We don’t know if there is asymptomatic transmission. And we don’t have nearly enough Jynneos vaccine. When a city like New York gets a shipment, all the appointments are gone within minutes.

I started writing on Medium about monkeypox on May 23, when most media outlets were saying either, “Monkeypox? Eww!” or “Don’t worry, it’s not Covid.” I argued that we needed to take the threat seriously.

In five subsequent articles, I’ve made some fairly strident suggestions.

First, that we talk frankly about risky gay male sex networks instead of fretting about stigmatization. Stopping an epidemic is more important than greenwashing it. Second, that this summer’s Pride celebrations for men (not the parades, the “sex-positive” after-parties) should be rescheduled until autumn, when many more men will be vaccinated, and rapid tests should be available to enable testing right at the party door. Third, that we stop pretending that ring vaccination could ever work and instead offer vaccine to all men with multiple sex partners, and to all sex workers. (Vaccinating the “ring” of contacts around each case is impossible when men have anonymous sex—they don’t know who their contacts are.) Fourth, that we roll out both the Jynneos and the ACAM2000 vaccines and screen men for the risks posed by the older vaccine. Fifth, that the government offer a month’s shelter to all men who test positive so they can isolate safely under medical surveillance.

Some of these things are finally being done, but I still don’t think we’re doing enough.

In mid-June, the CDC did finally issue warnings that were blunt about the risks of anonymous hook-ups, even mentioning fetish gear and sex toys.

On June 23, New York City’s health department imitated Montreal and started offering vaccine to all gay men who had recently had multiple or anonymous sex partners. Other American cities followed suit.

This is progress, but there still isn’t enough of the safer Jynneos vaccine. And the CDC doesn’t think the threat yet warrants releasing the huge stockpile of ACAM2000.

The Food and Drug Administration needs to do whatever it can to speed up access to the stocks of Jynneos owned by the U.S. government but frozen in Denmark. If they aren’t enough to stop the epidemic, some hard choices about ACAM2000 will have to be made. The FDA must also speed up access to Tpoxx. In an epidemic, it is unfair to demand that every suffering recipient enroll in a clinical trial in order to be treated.

Moreover, gay men—particularly the owners of businesses and organizations sponsoring parties at which sex is encouraged—are not, in my opinion, making the sacrifices needed to slow down the spread.  Their reluctance to reschedule reminds me of the early 1980s, when the owners of San Francisco’s bathhouses bitterly fought the city’s attempts to close them, cloaking themselves in the mantle of gay freedom even as their clients died of AIDS. There was just too much money to be made. For Bear Week—as in the baths 40 years ago—the sponsors now claim their events will be “educational.”

I see the need to stop this epidemic as urgent. Helping more men avoid misery is in itself a worthy goal. But beyond that, after 50 years of progress, a serious backlash against gay acceptance is growing in this country. If the virus, which is already pegged a “gay disease,” keeps spreading and even kills some children or pregnant women, that will get much worse.

For once, we have the tools to stop a budding epidemic. But we don’t yet have enough of them. We should ask men to show restraint until enough rapid tests and vaccines are ready. Then we should crush it.

After that, for the long term, we should encourage other nations to join us in investing billions in a campaign to wipe out this virus at its source, in Africa. Not only would that save the lives of Africans—most of them children—but it’s far cheaper than having to fight this battle again and again on our soil.

 

Posted in Anxiety, Blogging, Death, People (in general), State of the Nation | Comments Off on Right on Schedule: A Killer Virus Emerges Just Before an Important Election

Successful People Keep Lists


Successful people keep lists so that they do not forget things, or simply to track their progress. The lives of more than a thousand Jews were forever saved because Oskar Schindler kept his list. That was, of course, an extraordinary example of list-making.

In everyday life, and most of the time, such lists contribute positively to the success of successful people because they offer balance and direction. A list also helps you stay organized and focused. I keep my lists on my iPhone (in the Notes app), on my tablet, and in my Commonplace Book, so that I always have them at hand. They live in all three places for a variety of reasons, not least that paper never runs out of battery.

Wherever you keep your lists, here are some things you should keep written down to help you attain success.

1. Creative ideas

Ideas can be very volatile. They are important to your success, as one idea could change the direction and course of your whole life. This is why it is essential to write your ideas down. Capture the ones you want to act upon, or that excite you, just as you receive their spark. Many successful minds carry a journal or notebook at all times; that way, when an idea pops up, they can write it down so they won’t forget it.

2. Books to be read

Every successful person reads. Reading boosts your chances for success, because books challenge you intellectually and inspire smart decisions. Whether it is non-fiction or fiction, always keep a list of the books that will serve your career or success interests. Sometimes such books are recommended by others, while other times we simply stumble upon them.

3. Thoughts to share with others

Just as creative ideas can trigger your own success, it is important to write down thoughts that could inspire and educate others. Success is not just about you; it is also about offering others a hand in achieving their own success. You could share those thoughts or knowledge in the form of blog posts, an article in a well-known magazine, or a book. Doing so could also propel your career and make you an authority in your field.

4. The interesting people you have met

I once met an important and knowledgeable person at a conference. Even after a very interesting discussion, I did not take down his contact information. I still regret this, because who knows when I might have needed his expert advice again? It is always important to keep a list of the interesting people you have met and how you can reach out to them again. Their knowledge and expertise could add to the talent pool of a business you are building or working for.

5. Media persons

From TV hosts to journalists, media people can be pivotal in giving you the recognition you need to be successful. Reaching out to such people, and keeping a list of them, can be rewarding and can deliver the boost in publicity you need. Don’t be shy about connecting with a media person or engaging them on social media. Make sure you are noticed by them.

6. Progress journal

Success requires that you keep track of your progress. Many successful people, such as Oprah, Eminem, and J.K. Rowling, keep a journal to track their progress. Progress may not mean initial and momentary success, but rather a holistic view of what you define as success. With such a journal, you can identify your areas of strength and weakness and see how to navigate your terrain in the future.

7. To do lists (I call mine a TooDue™ List)

This is a list of things that you intend to accomplish in the future. For some, it is a “bucket list,” yet it helps you identify what is important and what is not. It also helps you ascertain what direction you want to take. Each item on this list should help make you a more fulfilled person.

8. Things to see list

Many may not consider this essential, but just as books engage and develop the mind, certain movies, documentaries, and TV shows can have a considerable impact on our success. Try checking out reviews, and listen to or read the critics, to learn which shows align with your goals.

 

Do you keep lists? I’d love to hear from you!

 

— from www.lifehacker.com

Posted in Business, General Musings, List Making, Positive Mental Attitude | Comments Off on Successful People Keep Lists

How to Think like an Adult: Dr Russo’s Review of Cognitive Distortions

This is a rather long post, intended for clients and students as an adjunct to what I’ve covered in sessions or classes. It is a repeat of a much earlier post (from 2016) and is worth re-reading.

In my practice as a mentor and life coach I have worked with many clients who present with clinically significant distress that, if not wholly based on distortive thinking, is largely the result of so-called “monkey mind.” Simply put, when we cognitively distort what has happened to us, our minds jump around like excited monkeys, in ways that produce clinical distress far out of proportion to what actually occurred (or is occurring).

We begin this little treatise on distortive thinking by examining what the great ancient Stoic, Epictetus, had to say. He taught that philosophic inquiry is a way of life, not just a theoretical approach to the world. To Epictetus, all external events are beyond our control; further, he taught that we should calmly, and without passion, accept whatever happens. Individuals are responsible for their own actions, which they can moderate through rigorous self-discipline. The quote attributed to him sums it up nicely:

“Men are disturbed not by things, but by the view which they take of them.”

Within counseling, the seminal theorists Aaron Beck and Albert Ellis seized upon Epictetus as they developed their respective therapeutic techniques: cognitive behavior therapy (CBT) and rational emotive behavior therapy (REBT). As we examine cognitive distortions, keep in mind that both theorists had turned their backs on then-traditional psychoanalytic techniques, which saw depression as arising from motivational-affective considerations; in other words, as misdirected, swallowed, or “bottled up” anger.

By the way, I do not use the term “anger management.” I stress to clients that anger, like any other emotion, is meant to be learned from. What occasioned your anger? Did it arise from a violation of your core beliefs? Injustice? No matter the activating event, it is important to salute the emotion – it’s real, after all – and to master how you handle it. Don’t bottle it up. Feel it and then use it to change your world. I call this “anger mastery,” a term borrowed from Kevin Burke.

Anyway, in his practice, Beck found that his clients reported their feelings of depression in ways that differed from these psychoanalytic conceptualizations. Like Ellis, Beck found that his clients showed evidence of irrational thinking, which he called systematic distortions. The basic premise of Beck’s cognitive-behavioral therapy therefore concerns these distortions and follows the philosophy of Epictetus: it is not things that make us unhappy, but how we view them. Consequently, if we stop struggling to change things, and instead change our own interpretations of them, we change how we feel and how we act in the future.

So, then, what is cognition? Cognitions are verbal or pictorial representations, available to the conscious mind, which form according to the interpretations we make about the things that happen to us. These interpretations and assumptions are shaped by a bunch of unconscious presuppositions we make about people and things, based on past experience ‐ and when I say past experience, I mean experiences going all the way back to birth.

When we are infants, we take in the whole world in a rather naive and unthinking fashion. Some term this process as one of “accepting introjected beliefs.” We live with these introjections throughout life (some refer to this as the “appraisal approach” to the world). By way of a simple example, most of us are reluctant to touch a hot stove because of what our mothers and fathers commanded us NOT to do; namely, to touch a hot stove.  We lived with that introjected understanding of stoves for a long time, accepting that they were right. But we never knew for ourselves that they were right! That is, until we accidentally touched a hot stove.

In order to make sense of our world, we recursively form cognitive filters, schemas, or a set of assumptions and expectations of how events will transpire, and what they mean to us. Such expectations can be (and often are) illogical and irrational.  It is as if our accumulated introjections preclude us from making even the simplest of logical leaps of faith.

These introjections get in the way and often result in the aforementioned clinical distress. Beck’s therapy seeks to uncover instances where distorted, illogical thoughts and images lead to unwanted or unproductive emotions. We say unproductive emotions because, while these emotions can be either good or bad, both can lead to unproductive behaviors.

Beck’s typology of cognitive distortions is somewhat like Ellis’s notion of “irrational beliefs,” which the latter challenged in therapy through a process of disputation.  All distortions represent evidence of our emotions subsuming a logical thought process. We may therefore label them as logical fallacies.

Regardless of the over-riding clinical efficacy of appealing to our clients’ intellectual reasoning abilities (which, by itself, can be problematic), it is helpful to share with clients a list of irrational beliefs, or cognitive distortions, as a starting point in the work we will do:

  1. Catastrophizing or Minimizing ‐ weighing an event as too important, or failing to weigh it enough.
  2. Dichotomous Thinking ‐ committing the false dichotomy error ‐ framing phenomena as an either/or when there are other options. (Remember here the “genius of AND versus the tyranny of OR”).
  3. Emotional Reasoning ‐ feeling that your negative affect necessarily reflects the way a situation really is.
  4. Fortune Telling ‐ anticipating that events will turn out badly. I see this perhaps more so than any other distortion.
  5. Labeling ‐ this occurs when we infer the character of a person from one behavior, or from a limited set of behaviors; i.e., a person who forgets something one time is “an idiot.” This amounts to “judging a book by its cover” thinking.
  6. Mental Filter ‐ we all have mental filters, but this distortion refers to specific situations where we filter out evidence that an event could be other than a negative one for us.
  7. Mind Reading ‐ believing that we can know what a person thinks solely from their behaviors. This one is related strongly to the notion of “projection” onto someone else the feelings that we, ourselves, might have in a similar situation.
  8. Overgeneralization ‐ one event is taken to be proof of a series or pattern of events. Basic statistics courses teach us (or SHOULD teach us) that patterns are hard to find in nature. They only appear to be such. We can view them as otherwise.
  9. Personalization ‐ here we assume that we ourselves are at fault for some negative external event. Said another way, we take complete responsibility for something that is nowhere near our fault!
  10. Should Statements ‐ statements that begin with “Shoulds” or “Musts” are often punishing demands we make on ourselves. Generally, the assumption that we Must or Should do something is absolutist, and therefore most likely false.

Ellis, Beck, and the other theorists and therapists who employ CBT approaches in their practices take similar approaches to cognitive distortions. They begin with a laser-like focus upon symptom relief, which in turn seeks out the cognitive distortions from which many clients suffer. Beck’s CBT is short-term in nature, as we know, but remember that empirical research has shown symptom reduction to be as effective as longer-term help. The idea here is that by focusing on symptoms we can help effect “core” character changes.

Beck’s CBT and Ellis’s REBT have many therapeutic approaches in common. And while this treatise was not intended to be a detailed review of those therapies, it might be helpful to see how each addresses the notion of “cognitive distortive thinking.”

In both approaches, the therapist is active, didactic, and directive. This means the therapist tells you what she is doing, why she is doing it, and even teaches you how to do it for yourself. For example, if she assigns homework, she tells the client the reasoning behind it. This has the additional benefit of allowing the client to practice new behaviors in the actual environment where they will occur. Keeping a handy list of cognitive distortions is one example: the therapist asks the client to practice recognizing them.

I am an REBT devotee. As such, I often provide clients with a toolkit of self-help techniques. I want my client to, in effect, become a specialist in dealing with his own problems. My intent is to help the client become more independent. In other words, I am putting myself out of business by freeing my clients from the need for therapy. I want them to actively prepare for similar events in their lives in the future and be prepared with techniques they can employ themselves.

So, for example, when the client senses that they are suffering from distortive thinking, they will be able to whip out the list of ten cognitive distortions and do the work themselves.

One thing that CBT and REBT therapists (among others) focus upon is the here-and-now, the present, as well as the future. Without question, events in our past have shaped who we are. We need to embrace that fact, but at the same time, we need to look at what we can do now to change our view of life. We cannot change the past. We can only adjust our view of the present and future events. Consequently, I approach the therapeutic relationship in a highly collaborative fashion, as opposed to an authoritarian, adversarial or a neutral fashion. In effect, and while I begin as the authority on what we are about to do in therapy, I actively transfer the power of the relationship to the client.

To that end, and like what Beck did, I often set an agenda at the beginning of therapy and then return to that agenda time and time again to gauge progress. Beck set a great example when he outlined these precise steps (in terms of agenda and session structure):

  1. Set an agenda
  2. Review self‐report data
  3. Review presenting problem
  4. Identify problems and set clearly definable and measurable goals
  5. Educate patient on the cognitive model – discuss cognitive distortions
  6. Discuss the patient’s expectations for therapy
  7. Summarize session and assign homework
  8. Elicit feedback from the patient

Let’s focus for a moment on step 5. Without question, behavioral therapists have many tools at their disposal, including psychoeducation, relaxation training, coping skills, exposure, and response prevention. But it is cognitive restructuring that most specifically addresses distortive thinking, and which can offer the aforementioned symptom relief.

At step 5, I acquaint my client with the ideas of both Beck and Ellis and talk about how both (but mostly Ellis) seek to focus on the client’s core evaluative beliefs about himself. Of course, Ellis tended to revert somewhat to the past, explaining how unconscious conflict may exist based upon past experiences, while Beck tended to eschew this. Beck was more concerned with working with observable behaviors, and thereby potentially uncovering the distortive thinking.

To that end, I focus more on what are referred to as one’s “automatic thoughts,” which tend to link back to core beliefs. Ellis would have me attack those core beliefs (to the extent they are maladaptive), while Beck would simply try to change them. But when you stop and think about it, BOTH would have the therapist help the client change those core beliefs.

We all have automatic thoughts – indeed, such automatic thinking helps to keep us alive. The so-called Gift of Fear comes into play here, which while largely unconscious, is what governs our approach to the world. It is when such automatic thinking results in distress that we as therapists are called upon.

Beck did not view automatic thoughts as unconscious in a Freudian sense. He merely saw them as operating without our notice; in a word, “automatic.”

Remember from above that such thinking arises from the underlying assumptions and rules we have accepted (via introjection) and made up (through experience) about how to deal with the world. And it is HOW we have previously dealt with the world, for good or for ill, that has produced our core beliefs (about ourselves and about others around us). They are, almost by definition, highly charged and rigid “takes” on the world. They govern what we do.

Beck and, to a large extent, Ellis engaged in what they called cognitive restructuring. First, you identify the cognitive distortions that appear in those automatic thoughts, which point to the self-defeating core beliefs the client has allowed to set into his or her cognitions. Often this is done through active disputation. I prefer to approach disputation in a somewhat scientific way: through guided discovery, hypothesis testing, supporting with evidence, and looking for alternative theories. Clients are, for the most part, receptive.

Here is an example:

Automatic thought: I can’t do this; it is too hard.

Assumption: I will fail.

Core Belief: Because I am a loser.

I would begin with the statement, “I cannot do this.” I would work with the client to uncover “real evidence” of their inability to do … whatever. I simply ask, “What evidence do you have?” And often, there is NO evidence. The statement “I can’t do this” is not literally true. Perhaps he’s having trouble because he’s trying to do too much at once. The core belief is what I then attack: by asking questions about the times they have NOT been a loser; by asking for examples of a time when they resolved a situation to their satisfaction (read: successfully); and by asking, “Is that what you truly believe about yourself?”

In the most general sense, we can discuss cognitive restructuring in the following fashion: Perception and experiencing in general are active processes that involve both inspective and introspective data, in that the client’s cognitions represent a synthesis of internal stimuli (mental filters) and external stimuli (the world about him). How people appraise a situation is generally evident in their cognitions (thoughts and visual images). These cognitions constitute their stream of consciousness or phenomenal field, which reflects their configuration of themselves, their past and future, and their world. Alterations in the content of their underlying cognitive structures affect their affective state and behavioral patterns. Through psychological intervention, clients can become aware of their cognitive distortions. Correction of those faulty, dysfunctional constructs can lead to clinical improvement.

According to cognitive theory, cognitive dysfunctions are the core of the affective, physical, and other associated features of depression. Apathy and low energy result from a person’s expectation of failure in all areas. Similarly, paralysis of will stems from a person’s pessimistic attitude and feelings of hopelessness.

Take depression – a negative self‐perception whereby people see themselves as inadequate, deprived, and worthless. They experience the world as negative and demanding. They learn self‐defeating cognitive styles, expecting failure and punishment, and expecting it to continue for a long time. The goal of cognitive therapy is to alleviate depression and to prevent its recurrence by helping clients identify and test negative cognitions, develop alternative and more flexible schemas, and rehearse both new cognitive and behavioral responses within the confines of the therapeutic setting. By changing the way people think, the depressive disorder can be alleviated.

The beginnings of Cognitive Restructuring employ several steps:

  1. Didactic aspects.   The therapy begins by explaining to the client the theoretical concepts of CBT or REBT, focusing on the belief that faulty logic leads to emotional pain. Next, the client learns the concepts of joint hypothesis formation and hypothesis testing. In depression, the relationship between depression and faulty, self-defeating cognitions is stressed, as well as the connection between affect and behavior, and the rationales behind treatment.

  2. Eliciting automatic thoughts.   Every psychopathological disorder has its own specific cognitive profile of distorted thought, which provides a framework for specific cognitive intervention. In depression, we see the negative triad: a globalized negative self-view, negative view of current experiences and a negative view of the future.

    For example, in hypomanic episodes we see inflated views of self, experience, and future. In anxiety disorder, we see irrational fear of physical or psychological danger. In panic disorder, we see catastrophic misinterpretation of bodily and mental experiences. In phobias, we see irrational fear in specific, avoidable situations. In paranoid personality disorder: negative bias, interference by others. In conversion disorder: the concept of motor or sensory abnormality. In obsessive‐compulsive disorder: repeated warnings or doubts about safety, and repetitive rituals to ward off these threats. In suicidal behavior: hopelessness and a deficit in problem solving. In anorexia nervosa: the fear of being fat. In hypochondriasis: the attribution of a serious medical disorder.

  3. Testing automatic thoughts.   Acting as a teacher, the therapist helps a client test the validity of her automatic thoughts. The goal is to encourage the client to reject inaccurate or exaggerated thoughts.  As therapists know all too well, clients often blame themselves for things outside their control.

  4. Identifying maladaptive thoughts.    As client and therapist continue to identify automatic thoughts, patterns usually become apparent. The patterns represent rules of maladaptive general assumptions that guide a client’s life.

    As an example, “To be happy, I must…” The primary assumption is: “If I am nice, and suffer for others, then bad things won’t happen to me,” with a secondary assumption: “It is my fault when bad things happen to me, because I was not nice enough.” Therefore: “Life is unfair, because I am nice and still bad things happen.”

    You can see how such rules inevitably lead to disappointment and, ultimately, depression.

Some concluding thoughts which Beck had about depression and his view of how psychopathology occurs in general:

  • Emotional disorders are the results of distorted thinking or an unrealistic appraisal of life events.
  • How an individual structures reality determines his emotional state.
  • A reciprocal relation exists between affect and cognition wherein one reinforces the other, resulting in escalations of emotional and cognitive impairment.
  • Cognitive structures organize and filter incoming data and are acquired in early development.
  • Too many dissonant distortions lead to maladjustment.
  • Therapy involves learning experiences that allow the client to monitor distorted thinking; to realize the relation between thoughts, feelings, and behavior; to test the validity of automatic thoughts; to substitute more realistic cognitions; and to learn to identify and alter the underlying assumptions that predispose the client to distorted thoughts in the first place.

Finally, both Beck and Ellis came up with what they saw as the rudiments of so-called Mature Thinking as compared to primitive thinking. They comprise a set of ways of thinking about yourself, the world, and the future, that lead to cognitive, emotional and behavioral success in life.

Primitive thinking is non-dimensional and global: I am the living embodiment of failure.

Mature thinking is multidimensional and specific: I make mistakes sometimes, but otherwise I can be clever at many things.

Primitive thinking is absolutistic and moralistic: I am a sinner, and I will end up in hell.

Mature thinking is relativistic and non-judgmental: I sometimes let people down, but there is no reason I can’t make amends.

Primitive thinking is invariant: I am hopeless.

Mature thinking is variable: There may be some way…

Primitive thinking resorts to “character diagnosis” and labeling: I am a coward.

Mature thinking examines behaviors and engages in behavior diagnosis: I am behaving like a coward right now.

Primitive thinking is irreversible and sees things as immutable: There is simply nothing I can do about this.

Mature thinking is reversible, flexible, and ameliorative: Let’s see what I can do to fix this…

Hopefully this piece has taught you something about cognitive distortions and how everyone – all of us – suffers from them from time to time. The key is to try always to engage in mature thinking. Hard to do! And often it can take a lifetime! But I urge you to try!

© Dr. Joseph V Russo (2019), All Rights Reserved

Posted in Counseling Concepts, General Musings | Comments Off on How to Think like an Adult: Dr Russo’s Review of Cognitive Distortions

We Are Raising Excellent Sheep (only these sheep cannot be shorn)

Without question, this article nails it, at least to my way of thinking and teaching, as I do with freshmen coming into the University. I am reminded each and every semester of how the high schools are doing their best to turn out exemplarily woke graduates who cannot write to save their lives, are barely passable in terms of math skills, and haven’t yet read even one of the classics (you know, dead white guys like Aristotle, Plato, et al.).

Yet they have their preferred pronouns down straight, have variously colored hair (I have yet to figure out that meta-message), and are anxious to look around for a statue they can desecrate or even tear down. Mention (in my classes, anyway) that boys are boys and girls are girls and you get hauled before a civil rights star chamber. XX and XY, they say, are dog whistles signaling a desire to continue something called toxic masculinity and the oppressive patriarchy.

Yes, we are turning out sheeple. This article speaks to the whole idea of how we do so only at our peril. Read on.


We Are Raising Excellent Sheep

By William Deresiewicz

(c) 2022, All Rights are His (assuming his preferred pronoun is indeed “his”)

I taught English at Yale University for ten years. I had some vivid, idiosyncratic students—people who went on to write novels, devote themselves to their church, or just wander the world for a few years. But mostly I taught what one of them herself called “excellent sheep.”

These students were excellent, technically speaking. They were smart, focused, and ferociously hard-working.

But they were also sheep: stunted in their sense of purpose, waiting meekly for direction, frequently anxious and lost.

I was so struck by this—that our “best and brightest” students are so often as helpless as children—that I wrote a book about it. It came out in 2014, not long before my former colleague Nicholas Christakis was surrounded and browbeaten by a crowd of undergraduates for failing to make them feel coddled and safe—an early indication of the rise of what we now call wokeness.

How to reconcile the two phenomena, I started to wonder. Does wokeness, with its protests and pugnacity, represent an end to sheephood, a new birth of independence and self-assertion, of countercultural revolt? To listen to its radical-sounding sloganeering—about tearing down systems and doing away with anyone and anything deemed incorrect—it sure sounded like it.

But indications suggest otherwise. Elite college graduates are still herding toward the same five vocational destinations—law, medicine, finance, consulting, and tech—in overwhelming numbers. High-achieving high school students, equally woke, are still crowding toward the same 12 or 20 schools, whose application numbers continue to rise. This year, for example, Yale received some 50,000 applications, more than twice as many as 10 years ago, of which the university accepted less than 4.5%.

Eventually, I recognized the deeper continuities at work. Excellent sheephood, like wokeness, is a species of conformity. As a friend who works at an elite private university recently remarked, if the kids who get into such schools are experts at anything, it is, as he put it, “hacking the meritocracy.” The process is imitative: You do what you see the adults you aspire to be like doing. If that means making woke-talk (on your college application; in class, so professors will like you), then that is what you do.

But wokeness also serves a deeper psychic purpose. Excellent sheephood is inherently competitive. Its purpose is to vault you into the ranks of society’s winners, to make sure that you end up with more stuff—more wealth, status, power, access, comfort, freedom—than most other people. This is not a pretty project, when you look it in the face. Wokeness functions as an alibi, a moral fig leaf. If you can tell yourself that you are really doing it to “make the world a better place” (the ubiquitous campus cliché), then the whole thing goes down a lot easier.

All this helps explain the conspicuous absence of protest against what seem like obviously outrageous facts of life on campus these days: the continuing increases to already stratospheric tuition, the insulting wages paid to adjunct professors, universities’ investment in China (possibly the most problematic country on earth), the draconian restrictions implemented during the pandemic.

Yes, there have been plenty of protests, under the aegis of wokeness, in recent years: against statues, speakers, emails about Halloween costumes, dining hall banh mi. But those, of course, have been anything but countercultural. Students have merely been expressing more extreme versions of the views their elders share. In fact, of the views that their elders have taught them: in the private and upscale public high schools that have long been dominated by the new religion, in courses in gender studies, African American studies, sociology, English lit.

In that sense, the protesters have only been demonstrating what apt pupils they are. Which is why their institutions have responded, by and large, with pats on the head. After the Christakis incident, two of the students who had most flagrantly attacked the professor went on to be given awards (for “providing exemplary leadership in enhancing race and/or ethnic relations at Yale College”) when they graduated two years later.

The truth is that campus protests, not just in recent years but going back for decades now, bear only a cosmetic resemblance to those of the 1960s. The latter represented a rejection of the authority of adults. They challenged the very legitimacy of the institutions at which they were directed, and which they sought to utterly remake. They were undertaken, at a time when colleges and universities were still regarded as acting in loco parentis, by students who insisted on being treated as adults, as equals; who rejected the forms of life that society had put on offer; and who were engaged, at considerable risk—to their financial prospects, often to their physical safety—in a project of self-authoring.

I was involved in the anti-apartheid protests at Columbia in 1985. Already, by then, the actions had an edge of unreality, of play, as if the situation were surrounded by quotation marks. It was, in other words, a kind of reenactment. Student protest had achieved the status of convention, something that you understood you were supposed to do, on your way to the things that you’d already planned to do, like going to Wall Street. It was clear that no adverse consequences would be suffered for defying the administration, nor were any genuinely risked. Instead of occupying Hamilton Hall, the main college classroom building, as students had in 1968, we blocked the front door. Students were able to get to their classes the back way, and most of them did (including me and, I would venture to say, most of those who joined the protests). “We’ll get B’s!” our charismatic leader reassured us, and himself—meaning, don’t worry, we’ll wrap this up in time for finals (which is exactly what happened). The first time as tragedy, the second time as farce.

And so it has been ever since: the third, fourth, tenth, fiftieth time.

In a recent column, Freddie deBoer remarked, in a different context, that for the young progressive elite, “raised in comfortable and affluent homes by helicopter parents,” “[t]here was always some authority they could demand justice from.”

That is the precise form that campus protests have taken in the age of woke: appeals to authority, not defiance of it. Today’s elite college students still regard themselves as children and are still treated as such. The most infamous moment to emerge from the Christakis incident, captured on a video the world would later see, exemplifies this perfectly. Christakis’s job as the head of a residential college, a young woman (one could more justly say, a girl) shriek-cried at him, “is not about creating an intellectual space! It is not! Do you understand that? It’s about creating a home!”

We are back to in loco parentis, in fact if not in law. College is now regarded as the last stage of childhood, not the first of adulthood. But one of the pitfalls of regarding college as the last stage of childhood is that if you do so then it very well might not be. The nature of woke protests, the absence of Covid and other protests, the whole phenomenon of excellent sheephood: all of them speak to the central dilemma of contemporary youth, which is that society has not given them any way to grow up—not financially, not psychologically, not morally.

The problem, at least with respect to the last two, stems from the nature of the authority, parental as well as institutional, that the young are now facing. It is an authority that does not believe in authority, that does not believe in itself. That wants to be liked, that wants to be your friend, that wants to be thought of as cool. That will never draw a line, that will always ultimately yield.

Children can’t be children if adults are not adults, but children also can’t become adults. They need something solid: to lean on when they’re young, to define themselves against as they grow older. Children become adults—autonomous individuals—by separating from their parents: by rebelling, by rejecting, by, at the very least, asserting. But how do you rebel against parents who regard themselves as rebels? How do you reject them when they accept your rejection, understand it, sympathize with it, join it?

The 1960s broke authority, and it has never been repaired. It discredited adulthood, and adulthood has never recovered. The attributes of adulthood—responsibility, maturity, self-sacrifice, self-control—are no longer valued, and frequently no longer modeled. So, children are stuck — they want to be adults, but they don’t know how. They want to be adults, but it’s easier to remain children. Like children, they can only play at being adults.

So here is my commencement message to the class of 2022. Beware of prepackaged rebellions; that protest march that you’re about to join may be a herd. Your parents aren’t your friends; be skeptical of any authority that claims to have your interests at heart. Your friends may turn out to be your enemies; as one of mine once said, the worst thing you can do to friends is not be the person they want you to be.

Self-authoring is hard. If it isn’t uncomfortable, it isn’t independence.

Childhood is over.

Dare to grow up.

 
