The Rise of AI Taylor Swift

The Atlantic

AI Taylor Swift is mad. She is calling up Kim Kardashian to complain about her “lame excuse of a husband,” Kanye West. (Kardashian and West are, in reality, divorced.) She is threatening to skip Europe on her Eras Tour if her fans don’t stop asking her about international dates. She is insulting people who can’t afford tickets to her concerts and using an unusual amount of profanity. She’s being kind of rude.

But she can also be very sweet. She gives a vanilla pep talk: “If you are having a bad day, just know that you are loved. Don’t give up!” And she just loves the outfit you’re wearing to her concert.

She is also a fan creation. Following tutorials posted to TikTok, many Swifties are using a program to create hyper-realistic sound bites in Swift’s voice and then circulating them on social media. The tool, the beta of which ElevenLabs launched in late January, offers “Instant Voice Cloning.” In effect, it allows you to upload an audio sample of a person’s voice and make it say whatever you want. It’s not perfect, but it’s pretty good. The audio has some tonal hitches here and there, but it tends to sound fairly natural—close enough to fool you if you aren’t paying close attention. Dark corners of the internet immediately used it to make celebrities say abusive or racist things; ElevenLabs said in response that it “can trace back any generated audio to the user” and would consider adding more guardrails, such as manually verifying every submission.

Whether it has done so is unclear. After I forked over $1 to try the technology myself—a discounted rate for the first month—my upload was approved nearly instantly. The slowest part of the process was finding a clear one-minute audio clip of Swift to use as a source for my custom AI voice. Once the clip was approved, I could use it to create fake audio right away. The entire process took less than five minutes. ElevenLabs declined to comment on its policies or on the ability to use its technology to fake Taylor Swift’s voice, but it provided a link to its guidelines about voice cloning. The company told The New York Times earlier this month that it wants to create a “universal detection system” in collaboration with other AI developers.

The arrival of AI Taylor Swift feels like a teaser for what’s to come in a strange new era defined by synthetic media, when the boundaries between real and fake might blur into meaninglessness. For years, experts have warned that AI would lead us to a future of infinite misinformation. Now that world is here. But in spite of apocalyptic expectations, the Swift fandom is doing just fine (for now). AI Taylor shows us how human culture can evolve alongside more and more complex technology. Swifties, for the most part, don’t seem to be using the tool maliciously: They’re using it for play and to make jokes among themselves. Giving fans this tool is “like giving them a new kind of pencil or a paintbrush,” explains Andrea Acosta, a Ph.D. candidate at UCLA who studies K-pop and its fandom. They are exploring creative uses of the technology, and when someone seems to go too far, others in the community aren’t afraid to say so.  

[Read: Welcome to the big blur]

In some ways, fans might be uniquely well prepared for the fabricated future: They have been having conversations about the ethics of using real people in fan fiction for years. And although every fandom is different, researchers say these communities tend to have their own norms and be somewhat self-regulating. They can be some of the internet’s most diligent investigators. K-pop fans, Acosta told me, are so good at parsing what’s real and what’s fake that sometimes they manage to stop misinformation about their favorite artist from circulating. BTS fans, for example, have been known to call out factual inaccuracies in published articles on Twitter.  

The possibilities for fans hint at a lighter side of audio and video produced by generative AI. “There [are] a lot of fears—and a lot of them are very justified—about deepfakes and the way that AI is going to kind of play with our perceptions of what reality is,” Paul Booth, a professor at DePaul University who has studied fandoms and technology for two decades, told me. “These fans are kind of illustrating different elements of that, which is the playfulness of technology and the way that it can always be used in the kind of fun and maybe more engaging ways.”

But AI Taylor Swift’s viral spread on TikTok adds a wrinkle to these dynamics. It’s one thing to debate the ethics of so-called real-person fiction among fans in a siloed corner of the internet, but on such a large and algorithmically engineered platform, the content can instantly reach a huge audience. The Swifties playing with this technology share a knowledge base, but other viewers may not. “They know what she has said and what she hasn’t said, right? They’re almost immediately able to clock, Okay, this is an AI; she never said that,” Lesley Willard, the program director for the Center for Entertainment and Media Industries at the University of Texas at Austin, told me. “It’s when they leave that space that it becomes more concerning.”

Swifties on TikTok are already establishing norms regarding the voice AI, based at least in part on how Swift herself might feel about it. “If a bunch of people start saying, ‘Maybe this isn’t a good idea. It could be negatively affecting her,’” one 17-year-old TikTok Swiftie named Riley told me, “most people really just take that to heart.” Maggie Rossman, a professor at Bellarmine University who studies the Swift fandom, thinks that if Taylor were to come out against specific sound bites or certain uses of the AI voice, then “we’d see it shut down amongst a good part of the fandom.”

But this is challenging territory for artists. They don’t necessarily want to squash their fans’ creativity and the sense of community it builds—fan culture is good for business. In the new world, they’ll have to navigate the tension between allowing some remixing while maintaining ownership of their voice and reputation.

A representative for Swift did not respond to a request for comment on how she and her team are thinking about this technology, but fans are convinced that she’s listening. After her official TikTok account “liked” one video using the AI voice, a commenter exclaimed, “SHES HEARD THE AUDIO,” following up with three crying emoji.

TikTok, for its part, just released new community guidelines for synthetic media. “We welcome the creativity that new artificial intelligence (AI) and other digital technologies may unlock,” the guidelines say. “However, AI can make it more difficult to distinguish between fact and fiction, carrying both societal and individual risks.” The platform does not allow AI re-creations of private people, but gives “more latitude” for public figures—so long as the media is identified as being AI-generated and adheres to the company’s other content policies, including those about misinformation.

But boundary-pushing Swift fans can probably cause only so much harm. They might destroy Ticketmaster, sure, but they’re unlikely to bring about AI Armageddon. Booth thinks about all of this in terms of “degrees of worry.”

“My worry for fandom is, like, Oh, people are going to be confused and upset, and it may cause stress,” he said. “My worry with [an AI fabrication of President Joe] Biden is, like, It might cause a nuclear apocalypse.”

A History of Work in Six Words

The Atlantic

This is Work in Progress, a newsletter by Derek Thompson about work, technology, and how to solve some of America’s biggest problems. Sign up here to get it every week.

Here is a history of work in six words: from jobs to careers to callings.

Until quite recently, we had little concept of “progress” in our labor. Around the world, people hunted or harvested, just as their parents and grandparents had. They hammered nails. They assembled gears and sewed thread and patched homes. Their work was a matter of subsistence and necessity; it was not a race for status or an existential search for meaning. These were jobs. And for hundreds of millions of people everywhere, work is still work—grueling or boring or exploited or poorly paid, or all of the above.

In the 19th century, the railroads and the telegraph forced American companies to change the way they organized their labor. In 1800, traveling from Manhattan to Chicago took, on average, four weeks; in 1857, it took two days. With goods and information moving faster than ever, firms headquartered in major cities had to track prices from Los Angeles to Miami. To conduct this full orchestra of operations, they built a new system for organizing labor. They needed managers. “As late as 1840, there were no middle managers in the United States,” Alfred Chandler observed in The Visible Hand, his classic history of the rise of America’s managerial revolution.

Rail and telegraphs made new kinds of businesses possible, including department stores, mail-order houses, and the national oil and steel behemoths. Large companies required massive, multilevel bureaucracies. And within these laborious labyrinths, workers could ascend from grunt to manager to executive. These corporations invented the modern journey of a career, that narrative arc bending toward a set of precious initials: VP, SVP, CEO.

As the managerial revolution created a sense of professional progress, the decline of organized religion and social integration in the 20th century left many Americans bereft of any sense of spiritual progress. For some, work rose to fill the void. Many highly educated workers in the white-collar economy feel that their job cannot be “just a job” and that their career cannot be “just a career”: Their job must be their calling.

What’s wrong with that? Perhaps nothing. Some people simply love their job, and it would be ridiculous for me to tell them that actually, they are quietly suffering from some disease they cannot perceive. But many of them are also adherents to a cult of productivity and achievement, wherein anything short of finding one’s vocational soulmate amounts to a wasted life. These workers have founded a new kind of religion—one that valorizes work, career, and achievement above all else. And it’s making them a little bit crazy.

I call this new religion “workism.” Workism is not a simple evil or virtue; rather, it’s a complex phenomenon. It is rooted in the belief that work can provide everything we have historically expected from organized religion: community, meaning, self-actualization. And it is characterized by the irony that, in a time of declining trust in so many institutions, we expect more than ever from the companies that employ us—and that, in an age of declining community attachments, the workplace has, for many, become the last community standing. This might be why more companies today feel obligated to serve on the front lines in political debates and culture-war battles.

The credo that work should be the centerpiece of one’s identity quietly governs several stages of modern life. For many children and their parents, it has created an obsession with educational achievement that is igniting an anxiety crisis. For adults, it leads to overwork in the labor force and less time focused on family, friends, and personal pursuits.

In some cases, the worship of work squeezes out other values and relationships that are more conducive to a healthy life and community. In an era of diminishing attachments, career and work sometimes seem like the last truly universal virtues. In a 2019 Pew Research Center survey, roughly half of Americans said that the most important part of a fulfilling life is work that provides joy and meaning. Fewer than a third said the same about being in a committed relationship or having children. Well, one might say, that’s just one report. But this week, a widely circulated Wall Street Journal survey found that traditional values such as patriotism, marriage, and community seem to be falling out of favor. Although the headline and viral graphs almost certainly exaggerate the degree of decline, the underlying survey found that one virtue finished first, above tolerance, community, and even self-fulfillment: “hard work.”

I think we’re at the cusp of a fourth revolution in work. If I were to write the lede of a similar essay in 20 years, I would have to say, “Here is a history of work in eight words: from jobs to careers to callings to …” Except I’m not sure what the eighth word should be yet.

Today, work and workism are facing a double-barreled revolution—the remote-work phenomenon and the dawn of generative AI.

By snipping the tether between work and home, telecommuting is changing the way that millions of people work, the kinds of companies they start, and where they live. The immediate implications are already fascinating: Fewer commutes and empty offices have decimated public-transit revenue, buckled the commercial-real-estate industry, and triggered showdowns between in-office bosses and white-collar workers seeking flexibility in their schedules. But the second-order effects might be even more interesting. Remote work has encouraged many Americans to seek bigger homes to accommodate their home offices, which has created a “donut effect” of plumped-up housing prices in the suburbs of many metro areas. This year, a new paper noted that female remote workers are more likely to intend to have a baby than their all-office counterparts, suggesting that working from home could increase fertility rates. Some data even suggest that WFH has encouraged men to pull back on working hours, possibly putting a dent in workism itself.

Imagining utopian scenarios is easy: Perhaps the flattening of the job market will make labor more equal around the country and around the world; perhaps the legacies of workplace sexism, ageism, ableism, and racism will come crumbling down with the demise of the office. But imagining dystopian scenarios is equally easy: Perhaps the disappearance of the workplace will increase modern anomie and loneliness. If community means “where you keep showing up,” then, for many people, the office is all that’s left. What happens when it goes the way of bowling leagues and weekly church attendance?

The other prong is AI. Over the past few years, news of fresh AI breakthroughs—in solving games, in predicting protein shapes, in mimicking human language—has come fast and furious. The release of ChatGPT and GPT-4, the latest large language model from OpenAI, has transformed the way millions of people think about the future of work. People once believed that although machines were talented at replicating human brawn, general intelligence and creativity were firmly in the “for humans only” category. But we may discover that the opposite is true: Text-to-image tools such as Midjourney give ordinary people with little artistic genius access to a superhuman savant of pastiche, allowing them to mix and match styles to create characters, design homes, and produce extraordinary images. ChatGPT, using GPT-4 technology, can write code, poetry, parodies, news articles, book summaries, idea outlines, literature reviews, bibliographies, and bespoke Wikipedia pages about obscure historical events. The implications of this kind of program for white-collar industries are both thrilling and terrifying.

Confidently predicting the future of any larval technology is a fool’s errand. I’ll give it a go anyway. In the 1994 paper “Household Appliances and the Use of Time,” the economists Sue Bowden and Avner Offer found that time-using technologies (such as TV) diffused faster throughout the consumer economy than time-saving technologies (such as vacuum cleaners and refrigerators). The reasons are complex; breadwinner husbands may have demanded TVs before vacuums because they didn’t clean the house. But Bowden and Offer conclude that time-using novelties might also spread faster because they delight people and confer status. By extension, one might expect that generative AI tools such as Midjourney and ChatGPT will be gangbusters consumer technologies before they are mainstream producer technologies. For the next few months, the most obvious AI use case for nonprogrammers will be making stuff to share on social media and group texts in the spirit of “lol these machines are still dumb” or “lol this is kind of amazing.” Generative AI will waste 1 billion hours of time before it saves 1 billion hours of time.

But eventually, I expect that the technological descendants of these tools really will prove revolutionary, bringing the modern workforce into its fourth age of work: from jobs to careers to callings to chimeras.

The chimerical age of work will have several components. First, as people become fluent in the language and faculties of their AI tools, work in almost every economic domain will represent a co-production between human and machine. (If you think of the internet as a kind of proto-AI that allows individuals to work with a database of collective intelligence, you could argue that we’re already in the initial stages of this chimera age.) It will be common for architects and illustrators to work with text-to-image AI, and for home buyers to use these tools to mock up their dream home and rooms. It will be de rigueur for consultants and writers to outline, research, and brainstorm with LLMs; steadfastly refusing to use these tools will, in time, seem as arbitrary as never using search engines. Software engineers will consider AI co-programmers to be as fundamental to their work as computer keyboards. “Was this done with AI?” will soon be as strange and redundant a question as “Was this done with an internet connection?” (Every technology incurs a backlash culture, and I also suspect that, just as the rise of streaming music coincided with the resurgence of vinyl records, the eerie ubiquity of AI-inflected work will create a niche market of bespoke, explicitly anti-AI products.)

The word chimera has two very different meanings. The first is a mythological creature composed of three different animals, and it is in that spirit that I’m predicting a future of human-machine co-productions. But a chimera is also an illusory dream—something profoundly hoped for that doesn’t come to pass.

The technologies that most empower humanity almost always produce a shadow ledger of pain. The steam engine unleashed the industrial revolution and brutally shortened life spans. Nuclear technology can power energy reactors or atomic bombs. The internet makes us productive and unproductive, delighted and miserable, informed and deluded. Like the future of everything else, the future of work will be, above all, messy.

This article has been excerpted from Derek Thompson’s forthcoming book, On Work.