Treat Others the Way Chick-fil-A Treats You

I sit down on a plush, blue-grey booth seat and admire the freshly cut daisies on the table in front of me. It’s a rainy day in Fort Wayne, Indiana, so the warmth of the spicy crispy chicken sandwich I prepare to sink my teeth into enlivens me. Unthinkingly, I tune in to a conversation happening at a window table to my right.

“Here at Chick-fil-A, we treat our colleagues in the back kitchen as well as we treat our customers,” I overhear a supervisor sharing with a new employee.

I smile to myself as I consider how this rule helps explain Chick-fil-A’s wild success. The ethos that permeates the restaurant—one where all are treated with equal respect and kindness—must be the reason for the unfailing joy that all employees, from management to servers, embody.

It also explains the fierce loyalty of Chick-fil-A’s customers, a group in which I include myself.

“Can I offer you some fresh-ground pepper for your waffle fries?” a middle-aged woman with a bright smile catches my eye and asks me, temporarily suspending my eavesdropping.

Intrigued by what I overheard the supervisor saying to the new employee, I decided to dig deeper into the training that Chick-fil-A offers employees.

People Skills

I learned that the exchange I overheard is only the tip of the iceberg: Chick-fil-A employees undergo a comprehensive crash course in all things people skills before hitting the floor.

The chain’s hospitality principles, their “Core 4 recipe for service,” include eye contact (it shows you’re listening), a warm smile (a guest can tell if you’re forcing a grin), speaking with enthusiasm (remember that your posture conveys tone), and staying connected (call customers by name, and make each interaction hospitable rather than transactional).

They encourage employees to provide “Second Mile Service,” a reference to Matthew 5:41—And whoever compels you to go one mile, go with him two—to go above and beyond the call of duty to see that their customers are taken care of. (The founder, S. Truett Cathy, was a devout Christian, as is his son and current Chick-fil-A Vice President, Donald M. “Bubba” Cathy.)

One customer told me of a time when two Chick-fil-A managers helped him jump-start his car. I’ve also watched employees take tableside orders for families with small children or for elderly customers, ever proactive in alleviating their stress.

Chick-fil-A employees are encouraged to assist customers with disabilities throughout their visit. Customers of any background can expect to be walked to their car under an umbrella when it rains.

Any student of organizational management should be enthralled by this. How is it possible to have such continuity of excellence across the thousands of individually owned and operated Chick-fil-A franchises around the country?

Getting It Right

In the 2018 annual QSR Magazine survey, Chick-fil-A came out on top as the restaurant most likely to get your order right (97 percent). But more than just the quality of service—not to mention delicious sandwiches—Chick-fil-A has found its way into the hearts of its customers, positioning itself as among the most beloved fast food restaurants in history.

The restaurant is not without its detractors. It garnered negative attention when its then-president, Dan Cathy, came out in opposition to same-sex marriage.

In response, a company spokeswoman said that Chick-fil-A’s 80,000-plus employees are varied and diverse, “but what they all have in common is a heart for service and passion for making great food.”

Despite naysayers, the poultry purveyor is thriving: sales hit $6 billion in 2015, capping nearly a half-century of consecutive growth. According to QSR Magazine’s 2017 sales report, Chick-fil-A’s average sales per restaurant were $4.4 million, surpassing McDonald’s and KFC by roughly $2 million and $3 million, respectively—and with one fewer day of business than other restaurants (it is closed on Sundays).

Conventional wisdom holds that better performance requires higher pay. Yet according to Glassdoor, the pay scale of Chick-fil-A employees—from entry-level positions to managers—does not differ significantly from that of McDonald’s and other competitors: both range from $7 to $11 per hour for entry-level positions and rise to $45,000 or more per year for managers.

The difference is the other-oriented atmosphere that the company promotes at all levels—from the company’s corporate leadership, to franchise owners, to managers and employees.

As some restaurants turn to robots—for reasons ranging from efficiency to cost savings and workforce shortages—Chick-fil-A’s success is attributable to its values-based management and its emphasis on the personal, human touch.

For all the contemporary concern about automation and technology displacing humans in jobs and disrupting person-to-person relationships, Chick-fil-A offers consolation—their story, but also their spicy chicken sandwich.

What the Game of Thrones Finale Can Teach Us About Politics Today

One cannot write about this show with any insight without also, however inadvertently, giving away some plot twists, which is to say: what you will read contains spoilers. So if you don’t want to know, or didn’t join the 20 million people who watched the final season and still want the element of surprise, you have been warned: stop reading now.

And yet the lessons of the most criticized of all the seasons could be among the most important you will ever encounter in politics — or maybe in life.

“Everywhere she goes, evil men die, and we cheer her for it,” Tyrion Lannister explains to a confounded Jon Snow — both men still grappling with the gravity of Daenerys Targaryen’s slaughter of the citizens of King’s Landing in the penultimate episode of Game of Thrones.

“She grows more powerful, and more sure that she is good and right,” he continues. “She believes her destiny is to build a better world for everyone. If you believed that — if you truly believed it — wouldn’t you kill whoever stood between you and paradise?”

This is a provocative thought. It is an opportunity to reflect on the importance of means and methods in both politics and life, and on the hazard of justifying horrors to achieve noble ends. It is an important reminder that justice at any cost is not justice at all.

“Of all tyrannies,” wrote C.S. Lewis, “a tyranny sincerely exercised for the good of its victims may be the most oppressive… those who torment us for our own good will torment us without end for they do so with the approval of their own conscience.”

The Dragon Queen Daenerys Targaryen, in many instances, was a liberator of the oppressed and a persecutor of oppressors. She murdered the slavers of Astapor, she crucified hundreds of Meereenese nobles, she burned alive the Dothraki Khals — all these were evil people, Tyrion reminds us, and we, the audience, could not help but celebrate their suffering as just deserts for the suffering they inflicted on innocents. This moral zeal nourished her. This uprightness confirmed her mission as liberator. It justified in her mind the slaughter of countless innocents — so long as it was for the greater good.

“Do you think I’m the last man she’ll execute?” Tyrion asks Jon soberly. “That is her decision. She is the queen,” Jon offers meekly in reply.

Yet Jon knows Tyrion is right, and that Daenerys is on a path that will invariably lead to more bloodshed.

Jon confronts Daenerys in the following scene, angrily demanding to know why she would order the execution of prisoners of war after the war had been won.

“It was necessary,” Daenerys quietly replies. Jon continues to plead on behalf of the men, women, and children burned alive by Daenerys’ attack.

“[Cersei, the queen of King’s Landing and Daenerys’ enemy] tried to use their innocence against me. She thought it would cripple me,” Daenerys rationalizes.

Jon continues to plead on behalf of Tyrion, who faces execution for treason.

“We can’t hide behind small mercies,” Daenerys resolves. “The world we need won’t be built by men loyal to the world we have… It’s not easy to see something that’s never been before. A good world.”

“How do you know it’ll be good?”

“Because I know what is good,” Daenerys affirms.

“What about everyone else?” questions Jon earnestly. “All the other people who think they know what’s good?”

“They don’t get to choose,” states Daenerys flatly.

Moments later, in a consequential calculation of his own — choosing to kill the Dragon Queen in order to save millions of innocents in the future — Jon drives a dagger through Daenerys’ heart amidst a passionate kiss, killing her.

Friedrich Nietzsche wrote, “Whoever fights monsters should see… that in the process he does not become a monster. If you gaze long enough into an abyss, the abyss will gaze back into you.”

As Aleksandr Solzhenitsyn reminds us, the line between good and evil runs through the heart of every person — and a just cause is the most seductive nourishment of baser parts of our souls… of the abyss within.

The Secret Behind the Success of Avengers Endgame

Avengers: Endgame had the highest-grossing opening weekend in film history, shattering expectations with an estimated $350 million in the US alone and over $1 billion globally. Endgame is the most successful of the twenty-two films in the Marvel Cinematic Universe (MCU) series, which is, in turn, the most prosperous film franchise in history.

But why? What is the secret behind the success of both Endgame and the rest of the MCU series? There are many reasons, but one important answer lies in the way the films tap into our deep and abiding human desire for stories, which we need to inspire us, to help us understand who we are, and to appreciate our place in the world. In fact, while it is widely known that MCU characters originated in comic books, fewer may realize that most of the superheroes in Endgame and the series have familiar precedents in folklore from different times and places.

Take, for example, Captain America, a hero of superhuman strength and intelligence defined by love of and service to his country. He descended into a mound of ice in order to save the nation after stopping the Red Skull from using the Tesseract—and was eventually found, awakened, and returned to action shortly before the events of The Avengers. This “savior in the mountain” trope is well known in folklore. There are many examples from different places and cultures, including the legends of Frederick Barbarossa, Ogier the Dane, Charlemagne, Sebastian of Portugal, and of course King Arthur, of Knights of the Round Table fame.

Consider Thor, the hammer-wielding, hirsute god of sky and thunder in the MCU series. Thor is based on an actual figure from Norse mythology, preserved by the 13th-century Icelandic writer Snorri Sturluson. The Thor of Scandinavian folklore, like the Thor of the MCU, is mercurial and prone to outbursts of emotion. In both traditions, Thor is the son of Odin—god of war and father of the other gods—and his status and strength position him to defend the just from evil, yet he frequently undermines his own efforts through his personal shortcomings. (These traits are also found in Hercules of Greek and Roman mythology, a son of Zeus with godlike status and strength who is likewise constantly tripped up by his own failings.)

Lastly, think about Iron Man—the self-made man whose only real power is his wealth, intellect, and resourcefulness. Students of Middle Eastern mythology will note the similarities between Tony Stark and Sinbad the Sailor, a recurring character in Scheherazade’s One Thousand and One Nights.

The wily Sinbad is constantly finding himself in unfavorable circumstances from which he must extricate himself in elaborate and imaginative ways. Sinbad is, in many ways, the original MacGyver—and Iron Man carries that torch today. (These traits of Iron Man and Sinbad—as well as the vanity and propensity for self-indulgence they each possess—parallel those of Odysseus in Homer’s Odyssey.)

Probing why we are drawn to the heroes of the Avengers series tells us much about ourselves. Examining our affections highlights what we value, both as individuals and as a culture, and how we view ourselves. For example, Captain America, whose creators at one point wanted to call him “Super American,” elevates sacrifice and honor over personal gain—noble goals that we value in America.

Thor’s strengths are at times undermined by his weaknesses, reminding us that often the greatest enemy is the enemy within, and cautioning us to be aware of our own shortcomings. Iron Man in many ways embodies the American “bootstrap” mythology of individual autonomy and self-sufficiency—but also the caution that individualism can devolve into solipsism. Moreover, Iron Man is the leader of the MCU’s superheroes, yet he doesn’t actually have any superpowers. What better way to valorize the utility of a steel will and intellect, as we do in America?

Every society needs heroes. Their weaknesses encourage us to improve our own, and their strength and achievement inspire us. Across time and place, people have been engrossed by the hero’s journey—and our own era is no exception. We are drawn to their struggle to overcome adversity and embrace their destiny, dismayed by their seemingly final failure, and electrified by their ultimate victory of good over evil.

Such hero stories are a way of answering the metaphysical questions that people across time have pondered—questions of origin, purpose, and destiny. The fact that heroes are so like us—capable of good and evil, greatness and folly—comforts us as we confront our own challenges. These figures and stories of other times and places live on in the form of superheroes today, infusing our lives with meaning and purpose. These stories hold kernels of truth about the human condition that transcend place and era, and for this reason they are worth studying in both our own cultural contexts and others.

Freud once said Rome was a palimpsest—an organic place with many layers of history and textures of meaning. The same, we now know, can be said of the Marvel Cinematic Universe, which ensures that heroic archetypes of old live on to still influence, instruct, and inspire us today.

Louis XIV Invented the Faux Etiquette of Political Correctness

Ball State University’s Bias Response Team (BRT) is one of many proliferating on college campuses across the country. Unlike some other BRTs, this office has no disciplinary authority. But such agencies do not need teeth to exert control over college social environments. Authoritarians of old understood that policing norms was a far more effective tool for social control than traditional, overtly coercive forms of power.

The website of Ball State’s Multicultural Center, which houses the BRT, encourages students to “report incidents of bias immediately.” But what is “bias”? In theory, bias is fairly black and white: “a behavior or act—verbal, written, or physical—that is personally directed against or targets an individual or group based on any of the following characteristics (race, sexual orientation, disability, religious belief, ethnic origin, etc.), perceived or actual.”

In practice, discerning bias is far murkier, because whether a bias incident has occurred turns on the subjective impressions of the victim. As the U.S. Department of Justice wrote in response to concerns regarding the University of Michigan’s BRT, these murky definitions give “‘unrestricted delegation of power’ to University officials… opening the risk of ‘arbitrary, discriminatory and overzealous enforcement.’”

Consolidating Power

DOJ was warning against the abuse of power by bureaucrats at the University of Michigan, but their words just as aptly describe Louis XIV, whose talent for consolidating power by policing social norms made him Europe’s longest-ruling monarch. When Louis was a child, France’s aristocrats revolted against his father; he thereafter resolved to keep his courtiers—and the rest of France—firmly under his control. He built Versailles, to which he moved the courtiers from Paris, and created an elaborate network of protocol and codes of conduct that everyone upheld—lest, god forbid, they offend the King and lose his favor.

An utterance oft attributed to Louis—l’état, c’est moi, or I am the state—encapsulates his control of Versailles. As our solar system revolves around the sun, virtually every aspect of life at Versailles centered on the Sun King, dictated by his mercurial tastes, preferences, and predilections. There were rules about how to let someone know you were at the door (by scratching the door with your fingernails—never by knocking!), who could sit down (only women, and only at night), and when it was appropriate to express emotion (only after the King did first).

Please the King

Nobles fought for the chance to attend lever and coucher, the extravagant royal getting-up and going-to-bed ceremonies. There were rules for bowing and hat tipping, dictated according to rank. Louis required courtiers to wear the Robe de Cour when around him, which meant keeping expensive wardrobes—especially for the women. This both pleased Louis’ ever-changing aesthetic sensibility and controlled his courtiers—by keeping them indebted to him. Only nobles—both men and women—of the right bloodline could don a set of coveted red high heels, a favorite of the King’s centuries before Christian Louboutin. Indeed, the rules were so elaborate that the King erected little signs—étiquettes—across Versailles to remind his courtiers what he expected of them.

Compared with today, when appropriate speech and conduct are often dictated by the most sensitive spirit in the room, Louis’ étiquettes, which stated social expectations explicitly, seem a great generosity. Even so, the rules at Versailles were always changing. Then, at least, courtiers had only to predict and work around the preferences of one mercurial tyrant. Today, the authoritarians are legion—and just as unforgiving.

Spies Everywhere

The existence of a Bias Response Team invariably has a chilling effect on people’s interactions, promoting a culture in which people are constantly and vigilantly on the lookout for bias infractions. The office’s very existence—and the omnipresent campus posters asking, “Have you been a victim of bias?”—indulges America’s offense culture.

This is reminiscent of Louis XIV himself who, according to the memoirs of one courtier, the Duc de Saint-Simon, “always took great pains to find out what was going on in public places, in society, in private houses, even family secrets, and maintained an immense number of spies and tale-bearers.”

When Louis observed an infraction by a member of his nobility, punishment was swift: the penalty courtiers most feared was that the King would choose not to “see” them, or ignore them as if they did not exist. Such social exclusion from the king’s favor was a fate worse than death. On college campuses across the nation, BRTs are empowering students to exert social control over fearful courtiers—today’s peers and faculty. All must comply with their fragilities, lest they be branded “intolerant” or “prejudiced.”

Today, as it was in Louis’ court, knowledge of the rules and language that is “proper” is a sign of being among the elite. However, nothing about using politically correct language in order to appear “unbiased” and “inclusive” is inherently virtuous. Samuel Johnson’s definition of “mouth-honour,” originally used by Macbeth, aptly describes the problem: “civility outwardly expressed without sincerity.”

People of good faith of all political stripes recognize that there is no use in being unnecessarily offensive and rude, but it is also wrong to assume the worst about someone’s “biased” words or actions.

In his influential two-volume study of norms across cultures, German sociologist Norbert Elias claimed that all the things we consider “civil” find their origins in the courts of royalty. In America, we threw off the monarchical yoke in favor of the rule of the many. In a democracy, it is said, every citizen is a king—but erecting norms that turn on the subjective impressions of a few is a recipe for despotism.

Remember the Good that Social Media has Achieved

It’s the popular and fashionable thing to do these days — on both the political Left and Right — to name, shame, and blame the social media giants for the death of American democracy.

The Left, still smarting from Donald Trump’s stunning victory in 2016, emphasizes the way in which social media sites — primarily Facebook — leave us susceptible to foreign tampering in elections. Senator Dianne Feinstein made news when she threatened social media executives during a November 2017 congressional hearing: “You have to be the ones to do something about it. Or we will.”

The Right, ever ready to decry the persecution of conservative voices, criticizes social media giants — primarily Twitter — for banning right-of-center figures from their platforms. To some conservatives, these bans show tech giants’ undue influence over our public discourse. Republican Senator Josh Hawley, for example, is currently leading the charge for antitrust legislation to break up big social media firms.

But what is popular and fashionable isn’t always right.

Social media companies have certainly made mistakes worth criticizing: failing to anticipate foreign intervention in America’s democratic process; censoring people with “dangerous” ideas (a dubious phrase whose troubling historical antecedents I’ve previously pointed out); providing inadequate transparency and protections around users’ personal data; and facilitating the spread of false information that invariably contributes to a decline in trust in our major institutions.

Yet to judge and condemn social media companies as categorically bad for democracy — let alone to pursue knee-jerk legislative plans to handicap and punish them — is wrong. The problems with social media are only half the story.

In his forthcoming Human Liberty 2.0, Matthew Daniels, a researcher at the Institute for World Politics, makes a compelling case for seeing social media as essential to democratic movements across the world and as a powerful tool for pushing autocratic regimes to recognize basic human rights.

Daniels tells the stories of people who have stood up to dictatorial and oppressive regimes. He describes the efforts of courageous activists who brought political and social change to their home countries and touched lives across the world as social media carried their stories to millions of supporters.

“Today,” Daniels writes, “we are witnessing the extraordinary convergence of two breakthroughs at the same time: The advancement of universal rights in the hands of a digital generation using the World Wide Web. This is a convergence with vast potential to unleash human creativity and compassion. But the interconnectedness of the Digital Age also carries responsibilities for each of us.”

As much as the pundit class of today enjoys blaming social media for America’s ills, it is worth remembering that over one-fifth of the world has no access to Facebook or Twitter; the “Great Firewall of China” prevents the entire country from accessing the sites. There is a reason for this of course: authoritarian regimes don’t like free expression. Social media affords expression in unprecedented ways, and has often been used for good.

Emancipation through Information

Take the story of Manal al-Sharif, a Saudi woman whose life changed dramatically after she posted on YouTube a video of herself driving. Daniels tells us how Sharif received praise and death threats in near-equal measure after posting her video, and how she was quickly imprisoned by the Saudi government for being a “threat to the social order.” Sharif spent nine days in prison for her purported crime, and her family endured sermons from their local imam describing women like Sharif as “prostitutes.” She was condemned as a traitor and a blasphemer, but her experience inspired her: “I am a proud Saudi woman who loves my country,” Sharif said, “and because I love my country, I am doing this. I believe a society will not be free until women are free.”

Sharif, who had lived and learned to drive in the West, couldn’t understand the ban on women driving; there was nothing in the sacred texts of Islam that prevented it. Her video went viral immediately after she posted it, but it wasn’t until she was released from jail that she realized its full impact. The free world lauded Sharif as the “Saudi Rosa Parks,” and she was overwhelmed at the outpouring of support from strangers on Twitter and Facebook.

Her act of defiance and courage — which social media rapidly spread across Saudi Arabia and much of the Muslim world — started a long-overdue conversation, Daniels argues. She gave a voice to other activists in the same cause who had long been suppressed and ignored. Her act inspired other women to do the same and drew attention to other human rights abuses in Saudi Arabia. She is even credited with helping bring about the end of the ban on women driving in Saudi Arabia, which came six years after she posted her video.

Social media was a tool for moving Saudi society toward justice and human rights — but, Daniels notes, the work in Saudi Arabia and beyond is far from complete.

The Centrality of Human Agency

Daniels writes of the purpose of Human Liberty 2.0, “This book series celebrates those whose lives embody a rising global awareness of our common humanity and a desire to promote the fundamental rights and well-being of other human beings.”

Just as some today choose to use social media to perpetuate lies, harass people they disagree with, and sow discord — uses we are all intimately familiar with — there are also those who use it for good, as Manal al-Sharif did to promote women’s rights in Saudi Arabia.

The story of Tay the Twitter bot illustrates this important point about technology’s use for good or ill. While a true story, it is also a metaphor for the role of human agency in social media’s uses and abuses.

Tay had a lifespan of about 16 hours.

Tay, a Microsoft excursion into the world of artificial intelligence, began its interactions with Twitter users on a rather benign note, declaring an utter love of humanity.

In less than 24 hours, it began tweeting sexist, racist, anti-Semitic spew.

Microsoft’s response was to first delete Tay, and then blame Tay’s conduct on online trolls, complaining of a “coordinated effort” against the poor little bot. Tay, like any simple machine, merely put out what was put in: Tay directly mirrored the language that it gleaned from engaging with people.

Microsoft blamed Tay’s troubling tweets on the bigoted users who engaged with the bot. And they were right. Many blame social media platforms for our modern societal ills — cyberbullying, damaged mental health, and increased polarization — but it is important to hold accountable the person behind the screen as much as the screen itself.

Indeed, technology and social media are neither wholly panacea nor plague.

Technology as a Tool

“Technology is morally neutral,” Daniels notes early in his book. “It can be used for good or evil. At a certain dosage, drugs alleviate pain and cure illness.”

In this statement, Daniels echoes Melvin Kranzberg, a famed historian of technology who maintained a skepticism of the “technological determinism” that has long been en vogue. Kranzberg’s First Law is “Technology is neither good nor bad; nor is it neutral.” In saying technology isn’t “neutral,” he meant that it invariably affects our lives.

It is always popular to tell people that the problems they’re facing aren’t their fault. It is easy to blame soulless gadgets and faceless corporations. But casting social media as the cause of the problems facing us today gives technology too much autonomy — and humanity too little.

Kranzberg anticipated most criticisms of social media today: “Many of our technology-related problems arise because of the unforeseen consequences when apparently benign technologies are employed on a massive scale.”

We tend to adopt new technologies rapidly without putting sufficient thought into the consequences of our unalloyed embrace. Our technology-related problems arise from the unforeseen consequences of seemingly benign tools.

As Albert Jay Nock wrote, “Our general tendency is to accept it [a new technology] at once without question as a good thing, not considering that its whole value is to be measured by its effect upon the spirit and quality of life, and that until this effect be ascertained our estimate of it is worthless and misleading.”

We’ve come full circle from embracing the new social media platforms as tools of fostering a more connected, democratic, and enlightened world, to blaming them for the end of civilization as we know it. In 2011, social media was praised as a boon to democracy for its role in helping citizens in autocratic regimes organize and hold their governments accountable. The Arab Spring would not have happened without social media platforms such as Facebook and Twitter, as Daniels notes in his book.

Yet today, it’s nearly impossible to find public figures coming to the defense of tech giants.

Daniels’ aim is to consider the good of social media alongside the ill, and this balanced perspective is essential for a full and accurate picture of what this new technology has wrought. He writes, “Human Liberty 2.0 is the remarkable, inspirational stories of how courageous social media pioneers are advancing democracy and calling attention to the plight of people in some of the most oppressive countries.”

In highlighting dozens of other remarkable stories like Manal al-Sharif’s, Daniels encourages readers to consider an oft-neglected perspective: that technology is often as much about the person who uses it as about the technology itself. This reminder of our moral agency revives a moral urgency: while the fragmentation of our public discourse can seem insurmountable, each of us has a role to play in the solution.

For reminding us of this other side of the story, and reviving our individual moral responsibility, we owe Daniels a debt of gratitude.

The Answer to the Social Media Conundrum Will Not Come From Government

This week, YouTube chose to demonetize the channel of right-of-center comedian Steven Crowder, cutting off his ability to earn ad revenue from his videos. In other cases, platforms have banned users outright, as when Twitter banned Milo Yiannopoulos or Facebook banned Louis Farrakhan.

Large-scale platforms like YouTube, Facebook, and Twitter have, however inadvertently, sowed the seeds of their own destruction and incentivized the creation and ultimate decentralization of media. On one hand, the algorithms of the biggest players favor sensational and bombastic users who play on people’s baser instincts. Yet the moment those same users test the limits of our public discourse, social media platforms are criticized for being the arbiters of what is and is not acceptable.

However, it’s a mistake to think that the policies of technology companies will fix these problems. The problems are much deeper than that, and the solution resides in viewers and American citizens choosing to reward the producers of good content instead of the bad.

In the backdrop here is the undeniable coarsening of our political rhetoric — which inevitably leads platforms like Facebook to censor, or people to clamor for one legislative fix or another.

If this were a question of what government should do, I could give the same answer as the framers of the Constitution and the liberal tradition generally: it is not for the government to decide. What we need is maximum freedom to speak, in whatever form that takes. But with social media, we are dealing with a different animal: institutions that are slowly migrating from being open platforms to becoming full-scale media companies. That requires curation.

And here the problems begin. Who is to solve them? That has to be within the realm of the stakeholders in the platform themselves. No one has a human right to access them. This is not a First Amendment issue. It is a question of management. Management is fallible. It is a process of discerning best practices to meet audience demand, and that requires careful thinking about trade-offs.

On a deeper level, it is a question for all of us to manage ourselves in a civil way that is consistent with the kind of society we want to live in.

The future of free speech depends not on laws, but on America’s commitment to civility — consciously restricting our own expression for the sake of others. Civility is a form of “private governance,” a phrase economist Edward Stringham is fond of using. As we lose both free speech and civility, we lose our democracy. Judge Learned Hand observed over 70 years ago, “Liberty lies in the hearts of men and women; when it dies there, no constitution, no law, no court can save it; no constitution, no law, no court can even do much to help it.”

America’s free speech rights are among the most protective in the world. We should take pride in them, but we must also protect them through the decisions we make every day. That our nation protects free speech means that our laws permit a wide range of expression, even when it is seditious, offensive, or false. This reflects a good and noble principle, but the practical reality is that public toleration of dangerous or offensive speech is finite. If the public square becomes increasingly inhospitable, Americans will demand — as they already have — that the government more strictly regulate speech.

Public support for free speech depends on how we use our freedom of expression each day. Pseudonymous blogger Scott Alexander formulates the dynamic thus: “Every time we invoke free speech to justify some unpopular idea, the unpopular idea becomes a little more tolerated, and free speech becomes a little less popular.”

Free speech is increasingly unpopular, especially among young people — a troubling trend, considering the future of this constitutional principle depends on them. In one study, fewer than half of 18- to 21-year-olds agreed that people should be free to voice nonviolent opinions if those opinions are offensive to minorities. Another study of American college students showed a sharp decrease in the portion of students who think the right to freedom of speech is secure — down to 64 percent this year from 73 percent in 2016. These trends reveal the urgency of renewing our country’s commitment both to freedom of speech and to the responsibilities that come with that freedom.

Those committed to freedom of speech must recognize that, valuable as it is, the principle is not costless. Historically, we have decided as a nation that the benefits of this freedom outweigh the costs of offensive and harmful speech. But in our divided and violent era, people — even classical liberals — are concluding that the costs of offensive and violent speech are too great to bear.

A free society can function without strict governmental restraints only when individuals exercise self-restraint. In the context of our public discourse, civility constitutes this self-restraint. Edmund Burke, in his Thoughts and Details on Scarcity, wrote, “Statesmen … ought to know the different departments of things; what belongs to laws, and what manners alone can regulate. To these, great politicians may give a leaning, but they cannot give a law.”

America continues to tolerate more objectionable speech than other countries. We are right to do so. With freedom comes responsibility, and we must wield our freedoms wisely. For this tolerance, and the principle of free speech that underlies it, to survive, we must not depend on the nine robed oracles working at 1 First Street, Congress, or social media giants to be the arbiters.

To address this complicated problem, we must rely on innovation, diversification, decentralization, settled cultural norms, and individual discipline. Which is to say, we must rely on ourselves.

Crazy Ex-Girlfriend and the Sisyphean Pursuit of Happiness

Rebecca Bunch, the protagonist of The CW’s musical romantic comedy Crazy Ex-Girlfriend, shows how pursuing happiness alone is as vain as grasping for vapor in the wind. As the award-winning show concludes its fourth and final season, it’s worth reflecting on Rebecca’s journey of emotional growth and the lessons it offers for our everyday lives. (Warning: this essay contains spoilers.)

Rebecca is an ambitious and successful lawyer at a big firm in New York, yet after a chance encounter with Josh, her ex-boyfriend from summer camp 10 years prior, she impulsively decides to quit her job and life in New York to move across the country to Josh’s hometown: West Covina, California. Thus begins her single-minded aim of winning him back.

The entire first season revolves around her hysterical and outlandish antics intended to make Josh fall back in love with her: “chance” encounters, befriending Josh’s current girlfriend to get close to him, dating Josh’s best friend to convince people she isn’t romantically interested in Josh, renting a fully loaded RV for a trip to the beach so that Josh’s friend group will want to spend time with her; the list of absurd lengths goes on. All the while she denies to herself and others that these efforts — let alone her move to a relatively unheard-of town in California (West Covina: only two hours from the beach! Well, four in traffic…) — have anything to do with Josh at all.

It’s only at the end of season one that Josh finally decides that Rebecca is who he wants to be with, and it isn’t until the middle of season two that Josh finally proposes. (In a dramatic scene dripping with irony, Rebecca is in a session with her therapist, who is on the brink of helping Rebecca decide to focus on personal growth rather than look to men for validation, when Josh bursts through the door and asks for her hand — ensuring that all the therapist’s work is undone.)

Finally Rebecca has everything she’s ever dreamed of. She and Josh are engaged and preparing to spend their lives together forever. This dream that has possessed her since that fateful chance run-in with Josh — that dream that she’s lied, stalked, manipulated, skulked, and even moved to West Covina for — has finally materialized.

Yet, she is dissatisfied. Things aren’t right. Having won Josh, and planning their new life together, is not quite what she thought it would be — what she dreamed and hoped it would be. In fact, she finds herself desperately attracted to her boss. When they find themselves trapped in an elevator, she is overcome by temptation and kisses him passionately.

Filled with guilt at her unfaithfulness to Josh, and still convinced that her marriage to Josh will fill the gaping void in her soul, Rebecca bribes a bride at her and Josh’s dream wedding venue to give up her wedding — planned for next week. Of course, she tells Josh the date “just happened” to open up, but this rush to the altar precipitates an unfortunate series of events that [SPOILER] ends with Josh leaving Rebecca at the altar — and reveals that Rebecca suffers from a history of mental illness and has previously been institutionalized for stalking and burning down the home of a past lover when she was in college.

“Vanity, vanity,” claims the wise man of Ecclesiastes, “all is vanity… a chasing after the wind.”

Crazy Ex-Girlfriend has been praised for its realistic depiction of mental illness. Viewers are saddened to learn of Rebecca’s deeply troubled mental state. Yet what makes Rebecca so endearing is the relatability of her madness, her obsession, her dedicated pursuit of what she wants — and more importantly, her disappointment when she realizes that attaining the object of her desire isn’t as satisfying as she’d originally hoped.

One doesn’t need a diagnosis to struggle with misplaced loves and the inevitable disappointment that follows when they don’t live up to expectations. We can all relate to this. We all suffer from the same sickness of the soul — of allowing our desires to define us, and of justifying outrageous, even self-harmful, means in order to achieve them.

Our entire culture is geared toward promising us that what will ultimately fulfill us is one purchase away. It promises the eternal in the temporal. This is inevitably a lie, yet we strive and grasp anyway in hopes that it isn’t.

“One must imagine Sisyphus happy,” declared existentialist philosopher Albert Camus after contemplating the futility and frailty of life. In Greek mythology, King Sisyphus displeased Zeus — ruler of the gods on Mount Olympus — so greatly that Zeus condemned him to an eternity of rolling a boulder up a hill, only to have it roll down again each time he reached the top. This is in many ways a fate worse than death: an existence of utter futility.

This, it often seems, is what we are condemned to as well: we have longings that are far deeper, richer, and more transcendent than can ever be met by the superficial offerings of this world. When we expect too much of this world — when we expect our eternal needs to be met temporally — we are invariably disappointed, and we ultimately crush the object of our desire under the weight of our expectations.

In his declaration, Camus meant that although life is utterly meaningless, we must create our own meaning and happiness in spite of that fact.

Rebecca tried that, and it didn’t work. The trouble is that meaning and happiness are more elusive than we realize, and the costs of pursuing such a moving target may prove higher than the prize is worth.

Dissatisfaction is baked into the human experience, keeping us all forever on the move, on the hunt, looking for the next thing. As tragic as this is, it is here we find the source of progress in the world, the unending search for a better life. And it is always a search, one that requires a social template of freedom and experimentation, not with the goal of nirvana but the goal of experiencing hope and opportunity.

I wish I could tell you that the show had a more satisfying conclusion.

[SPOILER]

At the end of season four, Rebecca has not just one but three gentlemen — including Josh — declaring their love for her. Rebecca, in a move that feigns evolved enlightenment, rejects them all, claiming she realizes that a man isn’t going to fulfill her. This is true, and viewers begin to hope that Rebecca might finally choose to focus on herself instead of looking to external circumstances and people for validation.

But instead, she decides to throw herself into a career in musical theater — her true dream, she finally realizes (the musical numbers throughout the seasons — they vary in quality, but most of them are light and witty and fun — are all vignettes that occur inside her head).

However, this is pernicious because, though it’s lovely that she’s found a passion, placing careerism at the center of one’s identity and affections is just as perilous as wantonly pursuing a man. It doesn’t matter what the substance is. An addictive personality will always be an addictive personality. In the end, it seems, Rebecca has sadly replaced one unhealthy affection with another.

Crazy Ex-Girlfriend has great dialogue, a consistently fast-paced plot (though thankfully not breakneck like the final season of Game of Thrones), and Rachel Bloom — who plays Rebecca Bunch — is hysterical. And, of course, it is encouraging to see such a raw depiction of the good, bad, and ugly of mental illness.

But I would be remiss to praise it without also highlighting the show as an important omen for our times: that the pursuit of happiness we struggle toward each day is, in fact, Sisyphean.

A Case for Civility in Public Debate

I am loath even to write about this topic so as not to give it more attention than it deserves, thereby perpetuating the incentives around polemical behavior (in the time since the person who picked this fight wrote, he has nearly doubled his Twitter followers). However, I am a sucker for a modern hook upon which I can hang a historical analogy that can illuminate current challenges, so here we are.

The recent frenzy on the Interwebs in response to New York Post op-ed editor Sohrab Ahmari’s unprovoked attack on National Review columnist David French has been dubbed by some as “the most important right-of-center intellectual debate.”

Sohrab’s essential concern, articulated in First Things, is that French is too quick to take the middle path at the expense of principle — and that French is too charitable to “enemies” of the Christian faith and conservatism. French’s response is that treating opponents with basic decency is a cornerstone of Christian charity and is also good, practical politics.

I would not go so far as to agree that this particular debate is the most important to conservatism. However, this exchange does exemplify the rise of apocalyptic rhetoric, zero-sum policymaking, and consequentialist thinking. In a cogent essay, Greg Weiner recently wrote of this incident through the lens of the “High Church of Victimology,” which has three central tenets. First, victims claim to be under siege even when their social status is actually ascendant. Second, the stakes are held to be so high that adherents feel released from norms of civility and basic decency. Third, both the threat and the goals are kept intentionally amorphous so as to hold adherents in a state of perpetual urgency. A lack of concrete ends also makes it easier to avoid weighing the costs of any given decision—justifying all manner of means.

Indeed, perpetuating a narrative of victimhood has fed the widespread sentiment that the stakes in our political arena have gotten higher—which has, in turn, contributed to the polarization of both left and right. Furthermore, people have become increasingly emboldened to do and say things in the name of their vision of the good. Sohrab’s diatribe is the latest in a litany of examples of this.

Ahmari concludes his diatribe with these chilling words:

Progressives understand that culture war means discrediting their opponents and weakening or destroying their institutions. Conservatives should approach the culture war with a similar realism. Civility and decency are secondary values. They regulate compliance with an established order and orthodoxy. We should seek to use these values to enforce our order and our orthodoxy, not pretend that they could ever be neutral. To recognize that enmity is real is its own kind of moral duty.

I can’t help but think of a historical precedent to illuminate this discussion: the legendary fallout between Martin Luther and Desiderius Erasmus — men who agreed on the many significant reforms necessary to the Catholic Church and society — over the freedom of the will.

In short, Erasmus allowed more room for human agency in the process of salvation; Luther argued that Erasmus’ view took salvation out of the hands of God and made it entirely too dependent on human action. Both now and then, it is perfectly acceptable and important to hash out disagreement on important topics, whether it is free will and salvation, or the response of conservatives to the culture wars related to abortion and transgenderism. However, Luther’s tone and tenor took a turn for the worse quite rapidly. He was a purveyor of snark before we had a word to describe it!

You see, Erasmus, I lost all desire to answer you, not because I was busy, or because it would have been a difficult task, nor on account of your great eloquence, nor for fear of you, but simply because of disgust, indignation, and contempt.… Seeing the case argued with such great talent, yet leaving it worse off than it was before, is a downright lie! It is like the woman of the gospel; the more the physicians treat her case, the worse it gets.

As is true today, this sort of bombastic tone was popular with readers. As I’ve written for Areo, Luther’s writing was exceptionally popular: a stunning one in five pamphlet reprints, the retweets of the day, were of Luther’s work.

Luther resented Erasmus’ readiness to compromise on doctrine, what he called “that prudence of yours [which] carries you along [so that] you side with neither party and escape safely through Scylla and Charybdis.”

Erasmus faced this criticism from his fellow Catholics, too. But Erasmus’ position — much like French’s today — was born not of cowardice or weakness, but of the conviction that political stability and church unity were more important than doctrinal, theological — or ideological — purity.

Erasmus was known for his irenic temperament, Luther for his bombastic and flame-throwing tendencies; the latter condemned anyone and everyone who disagreed with him — Catholic and Protestant alike — as a heretic, or even the “antichrist.” It’s hard to miss the similarities between French’s commitment to consensus building and persuasion and that of Erasmus, or between Sohrab’s quick and bitter vitriol toward all who disagree with him — even if the disagreement is over process alone, not principle — whom he dismisses as “compromisers,” and that of Luther.

The trouble with being so polemical is that polemics inflame and invariably fail to control the blaze. Martin Luther did not originally intend to break from the Catholic Church. Conversely, even though Erasmus agreed with most tenets of Luther’s theological reformation, he never left the Catholic Church. To sever relationships and break ties unnecessarily by going on the offense and using personal attacks serves no one’s ends — if, of course, one’s ends truly are the common good.

In his response to Ahmari, French voiced a position that Erasmus would have much admired:

Here’s what Ahmari doesn’t recognize: Time and again, I and lawyers I was proud to work with didn’t just win these court cases, we persuaded left-dominated institutions to turn back from repressive illiberalism and recommit to religious pluralism. I’ve spent more time in conference rooms and meeting halls persuading the libs than I’ve spent in court owning the libs, and I’ve found that persuasion works. Not always, of course — nothing always works — but far more often than you might think.

To lose faith in the power of changing hearts and minds is to lose faith in the deliberative and peaceful process of social change.

We have important challenges facing our nation today. But as I’ve written before, if there was any time in history — American or otherwise — when it was morally justified to depart from civility, it was in the fight to abolish slavery. Abolitionists had every reason to consider violence as a tool to achieve their just ends; some, such as John Brown, did. But many of the most prominent did not, because they knew justice at any cost was not justice at all.

Figures such as William Lloyd Garrison — whose policy was to be “as harsh as truth, and as uncompromising as justice,” even once scandalizing supporters and opponents alike by burning the U.S. Constitution, claiming it was complicit in slavery — and Frederick Douglass understood that means mattered. For them, treating opponents with basic decency and true civility was more than trivial courtesy or naive “niceness.”

It required them to take their opponents’ dignity seriously, which means taking their ideas seriously, which often demands forceful and robust disagreement. But crucially, both were committed to the proposition that dignity and basic respect apply to everyone — friend and foe.

Deep divisions have been an unavoidable element of human communities prior to, during, and since the division that Luther and Erasmus helped create. These differences have often led to bloodshed. The promise of America is that we address these differences through reason, rigorous discourse, and persuasion. Once you lose confidence in persuasion — and instead use every opportunity to “enforce our order and our orthodoxy” — history shows us that violence often soon follows.

We owe David French a debt of gratitude for reminding us of this truth and our duty to our fellow Americans — even when they’re wrong.

Modern Technology and the Return of Civility

Technology is often condemned as a pox on American public discourse. Facebook, Twitter, and YouTube, as well as other social media technologies, are criticized for promoting division and creating echo chambers.

Despite this, several tech platforms have been developed in recent years and months whose creators are committed to enabling Americans to fall back in love with conversation, rigorous debate, ideas, and learning — practices that have been essential to every cultural zenith, not to mention every democracy, including our own.

Consider Civility, a startup based in Silicon Valley, which this week opened its app platform to the public for the first time in San Francisco. Subscribers can either start or join in-person conversations in their local community via the startup’s app. Conversations that denizens of the Bay Area can join include: What is meaningful to you?; The limits of human morality; and Principles of Microeconomics.

“We wanted to elevate the conversation, and to create a place for people to convene around important questions and ideas,” Civility co-founder and CEO Max Marty explained.

Marty is optimistic about the scalability of Civility, as people realize that online communities, useful though they are, are no replacement for the in-person experience. “There is something powerful about airing in front of a person. Both language and social cues can’t be reproduced via digital means,” Marty mused.

Civility’s meet-ups are the latest in a growing trend of gatherings organized around literary outlets, podcasts, and other shared interests. In beta testing, however, the company realized that there was an unmet demand for in-person conversations among people who think differently from one another.

Civility’s platform appealed to three types of people: those who have no one in their network with whom to discuss a subject they care about; those who consider the social costs of raising a certain idea with friends or coworkers too great; and those who have people to speak with but are keen for fresh voices and alternative perspectives.

“Intelligent, face-to-face discussions with small groups at cafes & coffee shops,” Civility’s website reads. The face-to-face part is important. Unlike chat rooms and online forums of discussion such as Parlio — a now-defunct platform created by former Google employee Wael Ghonim and others — a growing number of groups across the country are using the connectivity of the internet age to facilitate in-person conversation and discussion.

Another example is Better Angels, which started after the divisive 2016 presidential election, and which has the mandate of re-humanizing our public discourse by convening people from opposite sides of the political spectrum digitally to eventually enjoy in-person conversation.

Yet another example is Make America Dinner Again (they go by MADA), which has the same mandate as Better Angels, only they encourage people to break bread together. Yet another is the Center for the Study of Liberty, which blends virtual reading groups as well as in-person dinner parties to facilitate free thought, conversation, and the collective pursuit of truth.

This trend is heartening, as there is evidence that in-person conversations are inherently better for managing disagreement. Online, people are often anonymous, which is linked to a greater likelihood of abuse. YouTube responded to this reality by ensuring people’s emails were linked to their accounts before they could comment. The Wall Street Journal has chosen to take a more active role in curating comments. Molly J. Crockett, a neuroscientist who studies altruism and human decision-making, recently found that expressing moral outrage can be costly — but the highest costs occur offline, where there is a risk of retaliation. “The chance of backlash is low when you’re only broadcasting moral disapproval to like-minded others. Shaming a stranger on a deserted street is far riskier than joining a Twitter mob of thousands,” she wrote in a recent report.

It’s also easier to debate behind a computer screen, where you’re not confronted with the humanity of the person you’re engaging with. As Crockett notes, another cost of expressing outrage is empathic distress: “punishing and shaming involves inflicting harm on other human beings, which for most of us is naturally aversive.” But “online settings reduce empathic distress by representing other people as two-dimensional icons whose suffering is not readily visible. It’s a lot easier to shame an avatar than someone whose face you can see.” Does that mean online is irredeemable? No. But it’s certainly not a replacement for in-person conversation.

Civility’s slogan mentioning cafes is also significant, paying homage to the coffee houses of Paris, London, and Vienna in the 17th, 18th, and 19th centuries, respectively. In London, the cafes were known as “penny universities” because anyone who could pay the price of a cup of coffee was afforded the chance to sit and think with luminaries such as the writer Samuel Johnson, the architect Christopher Wren, and Joseph Addison, co-founder of the Spectator.

Throughout history, the pinnacles of human cultural achievement have come in eras in which people gathered in community to discuss questions of origin, purpose, and direction. From the symposium in ancient Greece to the convivium in ancient Rome, from the courts of the Medicis in Renaissance Florence to the literary salons of Paris, from the coffee houses of London and Vienna to America’s front porches — these are places where people gathered around a shared love of truth and ideas and a passion for the common good. Civility’s intellectual godfather, and an ongoing advisor, is Wes Cecil, who holds a PhD in literature and critical theory and lectures nationally on the importance of conversation across time and place.

We have a strong precedent in America upon which efforts such as Civility, and others, can build. Our founding fathers — Renaissance men in their own right, as Jefferson, Franklin, Adams, Madison, and others were serious political thinkers, inventors, artists, and more — were convinced of the primacy of reasoned discourse in winning the day. We can look to them, as well as their predecessors, as our guides to reviving conversation today.

For German philosopher Jürgen Habermas, the concept of “the public sphere” — places of public deliberation in which the free expression of public opinion is guaranteed — evolved out of places like the penny universities, which were exceptionally egalitarian sites of lively debate. Such a sphere, he argued, is essential to a functioning democracy.

“We’re at an interesting moment in history,” Marty says. “We are the most affluent society in human history, and are afforded opportunities to not only survive, but to thrive.” Indeed, we have more information available to us than either Petrarch or Plutarch could have ever hoped for.

In our socially fragmented digital age in which interactions are more often mediated, it’s encouraging to see the trend of technology platforms, such as Civility and others, being used to elevate our public discourse, one in-person conversation at a time.