The Rise of Synthetic Ideology

Technology has always been good at spreading bad ideas. Now, it is starting to produce them directly.

Dror Poleg

In 1964, the first compact cassette recorder was launched in the United States. For the first time, anyone could record music or voice at home using a small and inexpensive device. Unlike earlier tape formats, the "compact" cassette was small enough to fit in one's pocket. It went on to dominate the music industry for three decades.

Also in 1964, the cleric Ruhollah Khomeini was forcibly exiled from Iran for instigating riots and undermining the Shah. Khomeini settled in Najaf, a city in neighboring Iraq, and, later, in a suburb outside Paris, France. The cleric could no longer preach directly to his disciples. But he could record his sermons.

Pilgrims and visitors took Khomeini's cassettes back into Iran. In some cases, recordings were played over the telephone and recorded by local operatives. While Khomeini was in France, 90,000 mosques inside Iran were duplicating and distributing his tapes, reaching millions of people. Within hours, Khomeini's ideas and instructions reached people on the other side of the world, initiating strikes, instigating riots, and ultimately toppling the Shah. When Khomeini flew back to Iran in 1979, the country was already his. He and his disciples have dominated it ever since. (Bin Laden, too, used cassettes to reach, recruit, and direct disciples.)

In exile, Khomeini was aided by idealists of various kinds: Marxists, freedom fighters, moderate clerics, and even a variety of French intellectuals. Different groups sought to use Khomeini's character and charisma for their own purposes: to resist tyranny, resist modernization, resist capitalism, or promote one utopian vision or another.

But as Kim Ghattas points out in her excellent Black Wave, in the end, it was Khomeini who used them. Once he took power, he gradually cut his old ties, literally or figuratively. Many of his enablers never thought Khomeini meant what he said in his recorded sermons. And, in any case, they never thought he'd become powerful enough to implement his ideas. But he was serious, and he did obtain power (some, like Michel Foucault, failed to denounce him even then).

New technologies have always spread bad ideas, and most people underestimate their potential impact. In 1930s Germany, Nazi ideology was disseminated over radio, then still a relatively new medium. Joseph Goebbels, the Nazi propaganda minister, supported the development of cheap receivers that every worker could own. And as with Khomeini, Hitler's rise was initially applauded by various groups who thought they could use him and who ignored some of his outlandish ideas.

Ultimately, both Hitler and Khomeini used new media to convert ideas into power. But back then, technology merely helped dangerous ideas go viral. Humans still had to come up with the actual ideas and refine their appeal over long periods of time. Today, technology expedites the process of unearthing and refining ideas that could sway the masses. And it can even come up with new ideas of its own.

From Human Curation to Machine Curation

In 2020, Jeffrey Katzenberg launched a new content app that was expected to reshape the media landscape. Katzenberg knew a thing or two about media. He had previously served as chairman of Walt Disney Studios, co-founded DreamWorks, and produced some of the most successful films of all time. He thought he knew what people wanted, was planning to give it to them, and expected to make a lot of money doing it.

Katzenberg lined up some of the world's leading directors, entertainers, investors, and executives. But his app, Quibi, was dead on arrival. Instead, another app changed the media landscape in 2020.

TikTok, an offshoot of the Chinese app DouYin, lets users create and share short videos of their own. It relies on algorithms rather than old-Hollywood instinct to figure out what people want to watch.

TikTok was not the first app to use content-recommendation algorithms. But it was the best of its kind, and its product prioritized algorithms above all else. On Facebook, Twitter, or YouTube, algorithms helped users sift through content from people they followed or topics they searched for. On TikTok, users were no longer required to follow anyone or express any explicit preferences: the app threw content at them and homed in on their preferences. Remarkably, TikTok figured out those preferences before users got bored and switched to a different app.
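TikTok has never published the details of its recommendation system, but the mechanic described above (serve something, measure the reaction, and shift the next picks toward what worked) can be illustrated with a toy explore-and-exploit loop. The sketch below is only a rough illustration, not TikTok's actual design: the content categories, the watch-time signal, and the epsilon-greedy rule are all assumptions made for the example.

```python
import random
from collections import defaultdict

# Toy model of "throw content at the user and home in on their preferences."
# Categories, the engagement signal, and the epsilon-greedy rule are illustrative assumptions.
CATEGORIES = ["dance", "cooking", "politics", "pets", "diy"]
EPSILON = 0.2  # fraction of the time the loop keeps exploring at random


def simulated_watch_time(category: str) -> float:
    """Stand-in for a real engagement signal (e.g., seconds watched)."""
    hidden_taste = {"dance": 2.0, "cooking": 5.0, "politics": 9.0, "pets": 4.0, "diy": 1.0}
    return random.gauss(hidden_taste[category], 1.0)


def recommend(stats: dict) -> str:
    """Mostly serve the best-performing category so far; sometimes explore a random one."""
    if not stats or random.random() < EPSILON:
        return random.choice(CATEGORIES)
    return max(stats, key=lambda c: stats[c]["total"] / stats[c]["count"])


stats = defaultdict(lambda: {"total": 0.0, "count": 0})
for _ in range(500):  # 500 videos served to a single user
    category = recommend(stats)
    engagement = simulated_watch_time(category)
    stats[category]["total"] += engagement
    stats[category]["count"] += 1

# After a few hundred videos, the loop has homed in on this user's preferences.
for category, s in sorted(stats.items(), key=lambda kv: -kv[1]["count"]):
    print(f"{category:10s} served {s['count']:3d} times, avg engagement {s['total'] / s['count']:.1f}")
```

The specific algorithm matters less than the shape of the loop: it needs nothing from the user except their attention, and a few hundred iterations are enough for it to "figure out" what keeps that attention.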

TikTok is, essentially, an engine that unearths the world's most appealing content — most appealing to humans as a whole and most appealing to specific people. It is an engine that encourages and facilitates fierce competition between ideas and enables the swift dissemination of ideas with the highest chance of success. These ideas are not chosen based on the public good or their alignment with one ideology or another; instead, they are selected because they are appealing. Whatever resonates wins.

In a world with such an engine, bad ideas can emerge and spread quickly. There is no longer a need for a Khomeini to spend 15 years recording cassettes or for a Hitler to spend a decade spreading hate on the radio. A destructive idea that appeals to the masses, that hits a nerve, can spread within minutes.

This is why TikTok itself is not available in China, even though the app was made there. And it is why DouYin, the local version, is heavily censored and moderated.

Note that TikTok's threat is not driven by any ideological bias. The danger is not that TikTok is Chinese or that it promotes dangerous ideas per se. Instead, the threat is the speed at which new ideas emerge on TikTok. The faster memes can emerge and spread, the bigger the threat to social stability and the powers that be.

TikTok relies on millions of videos uploaded by more than a billion users. It sifts through them and determines what is most likely to appeal to other users. It is a remarkable app, yet its effectiveness is limited by the amount of content that its users generate. But not for long.

From Human Production to Machine Production

Testing random ideas is a machine's way of being creative. An algorithm cannot replicate Jeffrey Katzenberg's instincts. But it can test how millions of people react to millions of videos and determine which videos are more likely to become popular. Humans produce ideas, and software figures out which ideas to promote.

But it doesn't end there. Software is now beginning to produce its own ideas and content. Over the past year, there has been an explosion of new tools that generate images, video, text, and even code with little human input. See, for example, DALL-E 2, Headlime, Copilot, Copy.ai, Copysmith, and this thread from Ali Abdaal.

Soon enough, social media will be flooded with machine-generated content. Most of it might not be good or appealing, but algorithms will figure out which bits are. And as production and curation become integrated, the world will see viral content that is more appealing and more potent than ever.
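One way to picture production and curation merging is a generate-then-filter loop: a model proposes many candidate pieces of content, the curation machinery scores them, and only the highest-scoring few ever reach a wide audience. The sketch below is hypothetical; generate_variants and estimated_engagement are crude stand-ins for a real generative model and a real learned ranking model.

```python
import random

# Hypothetical stand-ins for the two halves of the loop:
# a generator that proposes content and a ranker that predicts its appeal.


def generate_variants(topic: str, n: int) -> list[str]:
    """Pretend generator: produces n crude variations on a topic."""
    templates = [
        "You won't believe the truth about {}",
        "Why {} changes everything",
        "{} in 30 seconds",
        "Nobody is talking about {}",
    ]
    return [random.choice(templates).format(topic) for _ in range(n)]


def estimated_engagement(text: str) -> float:
    """Pretend ranker: a noisy score; in reality, learned from billions of user reactions."""
    return random.random() * len(text)


def produce_and_curate(topic: str, n_variants: int = 1000, keep: int = 3) -> list[str]:
    """Generate many candidates, score them all, and keep only the top performers."""
    candidates = generate_variants(topic, n_variants)
    return sorted(candidates, key=estimated_engagement, reverse=True)[:keep]


if __name__ == "__main__":
    for winner in produce_and_curate("synthetic ideology"):
        print(winner)
```

The volume is the point: a generator that churns out a thousand mediocre variants only needs the filter to find the one that resonates.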

Unlike Khomeini's or Hitler's ideas, the viral ideas generated by machines will not have a specific agenda. But they will emerge quickly and move people to do things that impact the real world — things that could disrupt public order and threaten the powers that be.

Social media is a way to mobilize crowds. Initially, based on human ideas. Ultimately, based on whatever software predicts is most likely to get people excited. This presents a severe challenge to social stability and to our existing political systems. How can we meet this challenge? All this and more in next week's article. Click here to subscribe, so you don't miss it.

Have a great weekend.


📷 The cover image for this article was generated by Craiyon / DALL·E mini.
