Outdoor Dining Is Doomed

The Atlantic

www.theatlantic.com › health › archive › 2023 › 01 › restaurants-outdoor-dining-winter-covid › 672904

These days, strolling through downtown New York City, where I live, is like picking your way through the aftermath of a party. In many ways, it is exactly that: The limp string lights, trash-strewn puddles, and splintering plywood are all relics of the raucous celebration known as outdoor dining.

These wooden “streeteries” and the makeshift tables lining sidewalks first popped up during the depths of the coronavirus pandemic in 2020, when restaurants needed to get diners back in their seats. It was novel, creative, spontaneous—and fun during a time when there wasn’t much fun to be had. For a while, outdoor dining really seemed as though it could outlast the pandemic. Just last October, New York Magazine wrote that it would stick around, “probably permanently.”

But now someone has switched on the lights and cut the music. Across the country, something about outdoor dining has changed in recent months. With fears about COVID subsiding, people are losing their appetite for eating among the elements. This winter, many streeteries are empty, save for the few COVID-cautious holdouts willing to put up with the cold. Hannah Cutting-Jones, the director of food studies at the University of Oregon, told me that, in Eugene, where she lives, outdoor dining is “absolutely not happening” right now. In recent weeks, cities such as New York and Philadelphia have started tearing down unused streeteries. Outdoor dining’s sheen of novelty has faded; what once evoked the grands boulevards of Paris has turned out to be a janky table next to a parked car. Even a pandemic, it turns out, couldn’t overcome the reasons Americans never liked eating outdoors in the first place.

For a while, the allure of outdoor dining was clear. COVID safety aside, it kept struggling restaurants afloat, boosted some low-income communities, and cultivated joie de vivre in bleak times. At one point, more than 12,700 New York restaurants had taken to the streets, and the city—along with others, including Boston, Los Angeles, Chicago, and Philadelphia—proposed making dining sheds permanent. But so far, few cities have actually adopted any official rules. At this point, whether they ever will is unclear. Without official sanction, mounting pressure from outdoor-dining opponents will likely lead to the destruction of existing sheds; already, people keep tweeting disapproving photos at sanitation departments. Part of the issue is that as most Americans’ COVID concerns retreat, the potential downsides have gotten harder to overlook: less parking, more trash, tacky aesthetics, and, oh God, the rats. Many top New York restaurants have voluntarily gotten rid of their sheds this winter.

The economics of outdoor dining may no longer make sense for restaurants, either. Although it was lauded as a boon to struggling restaurants during the height of the pandemic, the practice may make less sense now that indoor dining is back. For one thing, dining sheds tend to take up parking spaces needed to attract customers, Cutting-Jones said. The fact that most restaurants are chains doesn’t help: “If whatever conglomerate owns Longhorn Steakhouse doesn’t want to invest in outdoor dining, it will not become the norm,” Rebecca Spang, a food historian at Indiana University Bloomington, told me. Besides, she added, many restaurants are already short-staffed, even without the extra seats.

In a sense, outdoor dining was doomed to fail. It always ran counter to the physical makeup of most of the country, as anyone who ate outside during the pandemic inevitably noticed. The most obvious constraint is the weather, which is sometimes pleasant but is more often not. “Who wants to eat on the sidewalk in Phoenix in July?” Spang said.

The other is the uncomfortable proximity to vehicles. Dining sheds spilled into the streets like patrons after too many drinks. The problem was that U.S. roads were built for cars, not people. This tends not to be true in places renowned for outdoor dining, such as Europe, the Middle East, and Southeast Asia, which urbanized before cars, Megan Elias, a historian and the director of the gastronomy program at Boston University, told me. At best, this means that outdoor meals in America are typically enjoyed with a side of traffic. At worst, they end in dangerous collisions.

Cars and bad weather were easier to put up with when eating indoors seemed like a more serious health hazard than breathing in fumes and trembling with cold. It had a certain romance—camaraderie born of discomfort. You have to admit, there was a time when cozying up under a heat lamp with a hot drink was downright charming. But now outdoor dining has gone back to what it always was: something that most Americans would like to avoid in all but the most ideal of conditions. This sort of relapse could lead to fewer opportunities to eat outdoors even when the weather does cooperate.

But outdoor dining is also affected by more existential issues that have outlasted nearly three years of COVID life. Eating at restaurants is expensive, and Americans like to get their money’s worth. When safety isn’t a concern, shelling out for a streetside meal may simply not seem worthwhile for most diners. “There’s got to be a point to being outdoors, either because the climate is so beautiful or there’s a view,” Paul Freedman, a Yale history professor specializing in cuisine, told me. For some diners, outdoor seating may feel too casual: Historically, Americans associated eating at restaurants with special occasions, like celebrating a milestone at Delmonico’s, the legendary fine-dining establishment that opened in the 1800s, Cutting-Jones said.

Eating outdoors, in contrast, was linked to more casual experiences, like having a hot dog at Coney Island. “We have high expectations for what dining out should be like,” she said, noting that American diners are especially fussy about comfort. Even the most opulent COVID cabin may be unable to override these associations. “If the restaurant is going to be fancy and charge $200 a person,” said Freedman, most people can’t escape the feeling of having spent that much for “a picnic on the street.”

Outdoor dining isn’t disappearing entirely. In the coming years there’s a good chance that more Americans will have the opportunity to eat outside in the nicer months than they did before the pandemic—even if it’s not the widespread practice many anticipated earlier in the pandemic. Where it continues, it will almost certainly be different: more buttoned-up, less lawless—probably less exciting. Santa Barbara, for example, made dining sheds permanent last year but specified that they must be painted an approved “iron color.” It may also be less popular among restaurant owners: If outdoor-dining regulations are too far-reaching or costly, cautioned Hayrettin Günç, an architect with Global Designing Cities Initiative, that will “create barriers for businesses.”

For now, outdoor dining is yet another COVID-related convention that hasn’t quite stuck—like avoiding handshakes and universal remote work. As the pandemic subsides, the tendency is to default to the way things used to be. Doing so is easier, certainly, than coming up with policies to accommodate new habits. In the case of outdoor dining, it’s more comfortable, too. If this continues to be the case, then outdoor dining in the U.S. may return to what it was before the pandemic: dining “al fresco” along the streetlamp-lined terraces of the Venetian Las Vegas, and beneath the verdant canopy of the Rainforest Cafe.

The Internet Loves an Extremophile

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 01 › internet-youtube-podcast-guru-influencers-andrew-tate › 672867

On YouTube, a British influencer named Tom Torero was once the master of “daygame”—a form of pick-up artistry in which men approach women on the street. “You’ll need to desensitise yourself to randomly chatting up hot girls sober during the day,” Torero wrote in his 2018 pamphlet, Beginner’s Guide to Daygame. “This takes a few months of going out 3-5 times a week and talking to 10 girls during each session.”

Torero promised that his London Daygame Model—its five stages were open, stack, vibe, invest, and close—could turn any nervous man into a prolific seducer. This made him a hero to thousands of young men, some of whom I interviewed when making my recent BBC podcast series, The New Gurus. One fan described him to me as  “a free spirit who tried to help people,” and “a shy, anxious guy who reinvented himself as an adventurer.” To outsiders, though, daygame can seem unpleasantly clinical, with its references to “high-value girls,” and even coercive: It includes strategies for overcoming “LMR,” which stands for “last-minute resistance.” In November 2021, Newsweek revealed that Torero was secretly recording his dates—including the sex—and sharing the audio with paying subscribers to his website. Torero took down his YouTube channel, although he had already stopped posting regularly.

This was the narrative I had expected to unravel—how a quiet, nerdy schoolteacher from Wales had built a devoted following rooted in the backlash to feminism. Instead, I found a more surprising story: Tom Torero was what I’ve taken to calling an “extremophile,” after the organisms that carve out an ecological niche in deserts, deep-ocean trenches, or highly acidic lakes. He was attracted to extremes. Even while working in an elementary school, he was doing bungee jumps in Switzerland.

As churchgoing declines in the United States and Britain, people are turning instead to internet gurus, and some personality types are particularly suited to thriving in this attention economy. Look at the online preachers of seduction, productivity, wellness, cryptocurrency, and the rest, and you will find extremophiles everywhere, filling online spaces with a cacophony of certainty. Added to this, the algorithms governing social media reward strong views, provocative claims, and divisive rhetoric. The internet is built to enable extremophiles.

In his daygame videos and self-published books, Tom recounted a familiar manosphere backstory of being bullied by his male peers and friend-zoned by girls. But that wasn’t the whole picture. While doing my research, I received a message from Tom’s ex-wife. (In the podcast, we called her Elizabeth, a pseudonym, because she feared reprisals from his fans.) Elizabeth said she had been at university with Tom Ralis—his birth name—at the turn of the century. They’d met in the choir. He was “quite tall, and quite gawky … he had a kind of lopsided grin and he was sort of cheery and chirpy and wanted to make people laugh,” she told me. Elizabeth was a music student, and she was—unusual for Britain—a follower of the Greek Orthodox faith. How funny, Tom had said. He was interested in that religion too. But he didn’t expect to become her boyfriend. He was happy just to be friends.

[Read: To learn about the far right, start with the ‘manosphere’]

When Elizabeth’s father had a car accident, though, Tom started love-bombing her. He turned up at her room in college with tea bags and biscuits, and told her that he did in fact want to date her. This proposal came with an implicit threat: “If I wouldn’t be with him, he would disappear,” she told me. “And the way that he talked about it … there was a kind of threat of suicide, that he would kill himself if I wouldn’t be with him.”

Confused, worried, and under pressure, Elizabeth said she “let him take over.” She began to date Tom, and they got married while still at university. Then, she recounted, they moved to a Greek island, where Elizabeth taught English, and Tom, who had started dressing all in black, went on a pilgrimage to Mount Athos—an Orthodox monastic community that bans women and even female animals to maintain its purity. When he returned, Elizabeth said, Tom announced that he wanted to become a monk.

I was surprised by this revelation: The man who became famous for teaching seduction had considered a vow of celibacy? But to Elizabeth, the announcement made perfect sense. When she first met Tom, he was a biology student who “hero-worshipped” the geneticist and atheist Richard Dawkins, she said, before he became “disillusioned with science and rationalism.” The common thread between all of these different Toms—Ralis and Torero; ardent atheist, wannabe monk, and YouTube pick-up artist—was a psychological need, a desire to be respected, to be listened to, to be a preacher. It was the role he wanted. The subject matter that he preached about came second.

[Read: Am I being love bombed? Are you?]

Not every internet guru follows this pattern. Some influencers have developed a genuine interest in a single topic and decided to make it into a career. But many other corners of the internet are full of serial enthusiasts who have pinballed from one ideology to another, believing in each one deeply as they go. These flexible evangelists are perfectly suited to becoming online gurus. They believe, and they need to preach—and because of the lack of gatekeeping on social media, the most talented talkers can easily find an audience online.

Andrew Tate is another extremophile. The misogynist influencer, a former kickboxer and reality-show contestant, used to describe himself as an atheist, but he announced last year that he had converted to Islam because—as one interviewer, the British rapper Zuby, summarized Tate’s view—“Christianity is kinda cucked.” Once Tate decided that God exists—which he had deduced because evil exists, and therefore so must its opposite—it was important to him to find the religion he deemed the most hard-core. (After all, a man who keeps swords in his house could not have become a mild-mannered Episcopalian.) On the other side of the gender divide, Mikhaila Peterson, a second-generation influencer who became known for advocating a “lion diet” as a cure for immune conditions, revealed in 2021 that she had found God through taking psychedelics. She now talks about religion healing her soul with the same intensity that she speaks about her all-meat diet healing her body.

Shortly after Tom Ralis returned from Mount Athos, Elizabeth escaped the Greek island, and their marriage. When they divorced in 2006, YouTube was in its infancy. Throughout the 2010s, she would search for him online occasionally, and she watched him develop his daygame model. It was like the love-bombing technique he had used on her but condensed from several months into a single date. In December 2021, she discovered from a text message sent by a mutual friend that Tom had taken his own life. He had often spoken of his experience with depression, but his death still shocked her. In April last year, several of his online friends organized a tribute in London, and talked about Torero’s effect on their lives. He had successfully become the secular online version of a preacher—a YouTube guru.

Tom Torero wanted to be an authority figure, and he found the cultural script that best fulfilled his needs. On my journey through the gurusphere, I encountered many stories like his. Take Maajid Nawaz, whom The New York Times anointed a member of the “Intellectual Dark Web” in 2018. Before becoming famous as a heterodox public intellectual, Nawaz had been jailed in Egypt for four years in the early 2000s for being a member of the Islamist group Hizb ut-Tahrir. After renouncing that ideology, he became an antiextremism adviser to David Cameron, then the Conservative prime minister, and at the same time stood as a candidate for Britain’s centrist party, the Liberal Democrats. Having failed in politics, Nawaz became a talk-radio host and was radicalized again, this time into COVID denialism. He left the broadcaster LBC in January 2022 after claiming that mandatory vaccination was “a global palace coup” by “fascists who seek the New World Order.”

[Cynthia Miller-Idriss: Extremism has spread into the mainstream]

Nawaz is, I would argue, another extremophile. This 2015 description of him by The Guardian could just as easily apply to Tom Torero: “Nawaz’s powers of verbal persuasion are something even his detractors concede. There’s a strong line to take in every answer. But equally, there’s very little sense of being open to persuasion himself.” Unlike most of us, with our needling doubts and fumbling hesitation, extremophiles are fervent in whatever their current belief is. And they want to tell other people about it.

For this reason, extremophiles have always made particularly good op-ed columnists—and now podcasters and YouTubers. The Hitchens brothers are a traditional example: Christopher was a Trotskyist as a young man, yet he became a supporter of the ultimate establishment project, the Iraq War. Peter moved from socialism to social conservatism, and has used his Mail on Sunday column to oppose strict COVID policies. Their analogue in the social-media age is James Lindsay. He believes that America is under threat from a Marxist-pedophile alliance, and he frequently collaborates with the Christian Nationalist Michael O’Fallon. But Lindsay first entered public life in the 2010s, writing books in support of New Atheism. At that time, he saw himself on the left. Although his middle name is Stephen, he told me that he wrote his atheist books as “James A. Lindsay” to deflect any backlash from the conservative community where he lived. As far as he is concerned, he has always been a rebel against the prevailing political climate.

Not everyone with an internet following is an extremophile. Someone like Russell Brand, a left-wing British comedian and actor now dabbling in anti-vax rhetoric and conspiracy theories about shadowy elites “concretizing global power,” strikes me as having a different psychological makeup. He is merely a heat-seeking missile for attention. His mirror image on the right is Dave Rubin, a gay man who has built a fan base among social conservatives opposed to homosexuality, as well as a Trumpist who—sensing the wind changing—recently boasted about attending the inauguration of Florida Governor Ron DeSantis.

Extremophiles are more like the social philosopher Eric Hoffer’s “true believers,” the people who fuel mass movements. “The opposite of the religious fanatic is not the fanatical atheist but the gentle cynic who cares not whether there is a God or not,” Hoffer wrote in 1951. Hoffer’s formulation reminded me of a friend telling me about a mutual acquaintance who had been in two cults. I felt like Oscar Wilde’s Lady Bracknell: To be in one cult may be regarded as a misfortune; to join two looks like carelessness. Or think about the Mitford sisters, the quintessential English aristocrats of the early 20th century. As children, Unity was a fascist, and Decca was a communist. Their childhood sitting room was divided down the middle; one side had copies of Der Stürmer and Mein Kampf; the other had hammers and sickles. The only point of political agreement between the two girls was that the mere conservatives and liberals who visited the house were boring.

My journey reporting on the gurusphere has led me to confront my own extremophile tendencies. After being raised Catholic, I became interested in New Atheism in the 2000s, because it was a countercultural phenomenon. Like pretty much everyone else, I would argue that my political beliefs are all carefully derived from first principles. But the ones that I choose to write about publicly are clearly influenced by my own self-image as an outsider and a contrarian. Being self-aware about that helps me remember that my fear of normiedom has to be kept in check, because the conventional wisdom is often right.

Researchers of extremism are now studying its psychological causes as keenly as they are its political ones. “Psychological distress—defined as a sense of meaninglessness that stems from anxious uncertainty—stimulates adherence to extreme ideologies,” wrote the authors of a 2019 paper on the topic. Many people become radicalized through “a quest for significance—the need to feel important and respected by supporting a meaningful cause.” The COVID pandemic was so radicalizing because one single highly conspicuous issue presented itself at exactly the same time that many people were bored, lonely, and anxious. Cults usually try to isolate their followers from their social-support networks; during the pandemic, people did that all by themselves.

The extremophile model helps us make sense of political journeys that are otherwise baffling to us, like the monastery-to-pick-up-artist pipeline. We might be tempted to ask: Who was the real Tom Torero—atheist bro, aspirant monk, or master seducer? The answer is: all of them. He was a true believer, just not a monogamous one.

Are American Men Finally Rejecting Workism?

The Atlantic

www.theatlantic.com › newsletters › archive › 2023 › 01 › american-rich-men-work-less-hours-workism › 672895

This is Work in Progress, a newsletter by Derek Thompson about work, technology, and how to solve some of America’s biggest problems. Sign up here to get it every week.

One of the weirdest economic stories of the past half century is what happened to rich Americans—and especially rich American men—at work.

In general, poor people work more than wealthy people. This story is consistent across countries (for example, people in Cambodia work much more than people in Switzerland) and across time (for example, Germans in the 1950s worked almost twice as much as they do today).

But starting in the 1980s in the United States, this saga reversed itself. The highest-earning Americans worked longer and longer hours, in defiance of expectations or common sense. The members of this group, who could have bought anything they wanted with their wealth, bought more work. Specifically, from 1980 to 2005, the richest 10 percent of married men increased their work hours by more than any other group of married men: about five hours a week, or 250 hours a year.

In 2019, I called this phenomenon “workism.” In a time of declining religiosity, rich Americans seemed to turn to their career to fill the spiritual vacuum at the center of their life. For better or (very often) for worse, their desk had become their altar.

Since then, the concept of workism has been attached to a range of cultural and political phenomena, including declining fertility trends in the West. I’ve blamed workism for U.S. policies that resist national parental and sick leave because of an elite preference for maximizing the public’s attachment to the labor force.

Then the pandemic happened. I didn’t know how the forcible end of white-collar commutes and the demise of the default office would change affluent American attitudes. I assumed that remote work would make certain aspects of workism even more insidious. Researchers at Microsoft found that the boomlet in online meetings was pushing work into odd hours of the week, leading to more “just finishing up on email!” late nights, and Saturday mornings that felt like mini-Mondays. Working on our computer was always a “leaky” affair; with working from home and COVID, I feared the leak would become a flood.

But I was wrong. This year, Washington University researchers concluded that, since 2019, rich Americans have worked less. And less, and less. In a full reversal of the past 50 years, the highest-educated, highest-earning, and longest-working men reduced their working hours the most during the pandemic. According to the paper, the highest-earning 10 percent of men worked 77 fewer hours in 2022 than that top decile did in 2019—or 1.5 hours less each week. The top-earning women cut back by 29 hours. Notably, despite this reduction, rich people still work longer hours overall.

This analysis may have been thrown off by untrustworthy survey responses received during the chaos of the pandemic. But according to The Wall Street Journal, separate data from the Census Bureau back up that conclusion. From 2019 to 2021, married men reduced their workweek by a little more than an hour. Unmarried men had no similar decline.

So why are rich married men suddenly—and finally—reducing their working hours, by an unusual degree? Yongseok Shin, an economist at Washington University and a co-author of the paper, told me that he had “no doubt that this was a voluntary choice.” When I asked him if perhaps rich married men had worked less in dual-earner households to help with kids during the early pandemic period, he told me that their working hours continued falling in 2022, “long after the worst periods of school closures and issues with child-care centers.”

The title of the new paper is a bit misleading: “Where Are the Workers? From Great Resignation to Quiet Quitting.” The authors make frequent references to quiet quitting, the notion that workers in 2022 suddenly decided to reduce their collective ambition and effort. But their analysis doesn’t actually find anything like that. In the past three years, the median worker hardly reduced his or her hours. All of the decline in hours worked happened among the highest-earning Americans, with the longest workweeks. Is that an outbreak of quiet quitting? I’d say no. It’s more like the fever of workism is finally breaking among the most workaholic Americans.

“I think the pandemic has clearly reduced workaholism,” Shin told me. “And by the way, I think that’s a very positive thing for this country.”

I’m inclined to agree. In the years since I wrote the workism essay, I’ve toggled between two forms of writer’s guilt. Some days, I worry that I went too hard on people who are devoted to their job. If people can find solace and structure and a sense of control in their labor, who am I to tell them that they are suffering from an invisible misery by worshipping a false and marketized god?

But on other days, I think I wasn’t hard enough on workism, given how deeply it has insinuated itself into American values. The New York Times and Atlantic writer David Brooks has distinguished between what he calls “résumé virtues” and “eulogy virtues.” Résumé virtues are what people bring to the marketplace: Are they clever, devoted, and ambitious employees? Eulogy virtues are what they bring to relationships not governed by the market: Are they kind, honest, and faithful partners and friends?

Americans should prioritize eulogy virtues. But by our own testimony, we strongly prefer résumé virtues for ourselves and especially for our children. This year, Pew Research Center asked American parents: What accomplishments or values are most important for your children as they become adults? Nearly nine in 10 parents named financial security or “jobs or careers [our children] enjoy” as their top value. That was four times more than the share of parents who said it was important for their children to get married or have children; it was even significantly higher than the percentage of parents who said it’s extremely important for their kids to be “honest,” “ethical,” “ambitious,” or “accepting of people who are different.” Despite large differences among ethnicities in some categories, the primacy of career success was one virtue that cut across all groups.

I can’t read those survey results without thinking about the fact that teenage anxiety has been steadily rising for the past decade. Commentators sometimes blame a technological cocktail of smartphone use and social media for the psychological anguish of American youth. But perhaps a latent variable is the reverberation of workism in the next generation. These surveys suggest that everything society ought to consider bigger than work—family, faith, love, relationships, ethics, kindness—turns out to be secondary.

The message from American parents, in a century of economic instability, seems to be Your career is up here, and everything else is down there. Is there any scenario in which this is good for us? People can control their character in a way that they can’t control their lifetime earnings. In the ocean of the labor market, we’re all minnows, often powerless to shape our own destiny. It can’t be healthy for a society to convince its young people that professional success, the outcome of a faceless market, matters more to life than values such as human decency, which require only our own adherence.

I don’t know what will happen to workism in the next decade, but if rich American men are beginning to ease up on the idea that careerism is the tentpole of identity, the benefits could be immense—for their generation and the ones to come.
