Imagine a pile containing every piece of information humans created from the beginning of time until 2016. Every library, news broadcast, song, letter, movie, carved stone, oil painting, book, bathroom graffiti—everything.
The stack of information created between 2016 and 2018 is 10 times higher.
We’re living through the most profound technological revolution in human history. As radically as our physical, meat-space technology is changing, something even more powerful is happening.
The information available to us—and our relationship to that information—is completely unrecognizable to any generation that’s come before us. And we need to pay attention to how it’s affecting us.
Humans preserve meaning.
It’s what we do. For hundreds of thousands of years, we’ve toiled to find ways to infuse the world around us with meaningful information. We created language, painted caves, carved tablets, invented paper. Not only were those things resource-intensive to create, they were even more difficult to maintain.
It’s no wonder our brains treat information-containing artifacts as valuable. But what happens when making artifacts becomes trivial?
In solving one problem, technology often creates another.
By 2016, most Americans (77%) owned smartphones, and the ancient problem of effortful artifact-making all but disappeared. Suddenly, creating information-containing artifacts became a matter of reaching into our pockets and pointing our rectangles—a tad simpler than carving stone or mastering oil paints.
But increasing the amount of content available to us created a new problem. Our attention is a finite resource. Even if our content pile grows 100-fold, our capacity to consume it remains about the same.
To deal with this flood of new content, we invented social media algorithms. They offered to sort through this vast body of new content and find what’s interesting to us.
But there’s a problem here. Not everything that grabs our attention is meaningful. Within the set of “things that are interesting to us,” we have some deep-rooted, internal sense that some of these things are meaningful and others are wastes of time.
Meaning grabs attention, but not everything that grabs our attention is meaningful.
What, exactly, do we mean when we say something is “meaningful”? A phone book is filled with information, but it’s not something we’d call meaningful. A song can be meaningful without containing a single word.
When people use the word “meaning” this way, it’s a metaphor. They’re saying something in their life fits together the way the words of a sentence do.
Meaning isn’t what social algorithms are seeking, and meaning hasn’t scaled with content. Meaning is still a rare, valuable resource, and we’re making it harder than ever to find.
As long as something grabs your attention, it has the stopping power social algorithms are trained to seek. From the social algorithm’s perspective, a reality TV show’s cliffhanger and a long-forgotten song from your childhood look about the same. But the experience of each is dramatically different.
By seeking “engaging” content without regard for how it affects the conscious being consuming it, these algorithms are accidentally creating serious problems.
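To make that objective concrete, here is a toy sketch in Python, invented for illustration rather than drawn from any platform’s actual code. Everything the score can see is attention; nothing in it can see how the viewer felt afterward.

```python
# A toy, invented sketch of an engagement-only feed ranker.
# Not any platform's real system; it only illustrates the objective.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_watch_seconds: float  # how long we expect you to stay
    predicted_click_rate: float     # how likely you are to tap

def engagement_score(post: Post) -> float:
    # The objective rewards time and clicks. "Was this meaningful?"
    # and "How did the viewer feel afterward?" appear nowhere.
    return post.predicted_watch_seconds * post.predicted_click_rate

feed = [
    Post("Reality-TV cliffhanger", 90.0, 0.40),
    Post("Long-forgotten song from your childhood", 85.0, 0.40),
]

# Ranked purely on engagement, the two look nearly identical,
# even though the experience of each is dramatically different.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):5.1f}  {post.title}")
```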
Our current solution to the content flood is already causing harm.
Researchers like NYU’s Jonathan Haidt sounded the alarm in 2018, arguing that social media algorithms weren’t just correlated with the teen mental health epidemic. They were the cause.
In 2011, 36% of teen girls said they experienced persistent hopelessness or sadness. Today, it’s 57%. Every major indicator, from suicide attempts to depression diagnoses, points to 2016 as the start of a serious downward turn in the mental health of American youth.
This isn’t a new phenomenon. This is technology creating a new problem by trying to solve another. What’s dangerous, though, is that new technologies build atop old ones, and the speed at which we’re inventing new technology is unprecedented.
Social technology is building on a rocky foundation, and we will be its casualties.
Understandably, tech companies built social algorithms to deal with the massive influx of content. Despite those algorithms causing serious problems, especially for our children, those companies are plowing forward in the pursuit of profit.
And they’re companies, right? Why shouldn’t they?
Because the cost of their experiment is paid in mental health.
These “companies” have created more than a consumer product. Choosing to engage in public discourse on Twitter is in no way comparable to picking a brand of toilet paper.
Social technology is too important to serve as an engine for advertising revenue.
Researchers have already demonstrated a clear relationship between overconsumption of social media and negative mental-health outcomes. Unfortunately, neither the significance of these new social technologies nor the urgency of solving the problems they create has quite reached our cultural consciousness.
Adults joke about our addiction to social media; meanwhile, for teenagers, it’s replacing the normal, face-to-face interactions we know were fundamental to our own social development.
We are mistakenly evaluating these social technologies based on their ability to squeeze their audiences’ attention for advertising revenue. We’re failing to realize just how important these technologies are. We’re also failing to realize just how important it is that we solve the problems they’re creating before those problems become much more complicated.
If 2016 started a content flood, 2025 will be a tsunami.
Today, as we near the end of 2023, a new wave of content approaches our digital shores, one that dwarfs the content growth of the smartphone revolution. We call it “generative AI.”
Most of us have encountered its party tricks—ChatGPT or DALL·E or Midjourney. “Watch this!” I can generate a realistic photo of a guy wrestling an alligator for some pizza. A one-sentence prompt can generate a 10-page research paper. Here’s “Gangsta’s Paradise” sung in Joe Biden’s voice.
Snapchat filters show us what we’d look like given ideal facial proportions. Anyone can create augmented reality porn of anyone else. Your most embarrassing moment in high school might just be viewed by millions of people.
The problem isn’t simply the amount of content that’s coming. It’s how complicated our relationships will be with that content.
Social technology is a knot that needs untying.
Social media is a knot. Bound together are these enthralling social media networks, massive amounts of advertising revenue, and a mental health crisis.
The problem isn’t TikTok. It isn’t Twitter or screen time or social media in general. At the heart of the meaning crisis is a problem of incentives.
Advertisers buy impressions from social media companies. To create more impressions, social media companies can either grow their audience or increase the amount of time their audience spends on the platform.
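As a back-of-the-envelope sketch (every number below is invented), the arithmetic looks like this: impressions are roughly audience size times minutes spent times ads shown per minute. Once ad load is fixed, the only levers left are more users or more minutes.

```python
# Back-of-the-envelope impression arithmetic; all figures invented.
def daily_impressions(users: int, minutes_per_user: float, ads_per_minute: float) -> float:
    return users * minutes_per_user * ads_per_minute

baseline = daily_impressions(users=1_000_000, minutes_per_user=30, ads_per_minute=1.5)
more_time = daily_impressions(users=1_000_000, minutes_per_user=45, ads_per_minute=1.5)

print(f"30 min/day: {baseline:,.0f} ad impressions")
print(f"45 min/day: {more_time:,.0f} ad impressions")  # 50% more inventory to sell
```

Growing an audience is slow and eventually saturates; growing minutes is something a ranking algorithm can do every single day. That’s why the optimization pressure lands on our attention.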
We wouldn’t trust cigarette companies to solve the lung cancer epidemic, and we can’t trust social media companies to solve today’s mental health epidemic. As long as social media companies are compensated based on how addicted their audiences are, they’ll continue to build in that direction.
Treating the mental health crisis begins with solving the problem of social media company incentives.
If you’re reading this article, and you’re thinking, “Ban social media!” then you’re missing the point. We humans invent social technologies. We’ve always done that, and we always will.
But we’re pretending that this newest and most revolutionary social technology is a product to be evaluated based on its profitability. In doing so, we’ve created a sinister set of incentives for social media companies.
Meaning is hard to find. It always has been. But the proverbial haystack in which it’s hiding is growing at an unmanageable rate.
We need to solve the fundamental problems underlying the social media industry or risk burying meaning just to make a little hay.