When I reflect on what truly matters, I often think about regret. It's useful to view life through this lens and adopt a regret minimization framework. If you pay attention, you’ll find that the majority of your regrets won’t come from the things you’ve done, but rather from the things you haven’t done.
The most fascinating research I have found on this topic was conducted by Bronnie Ware, a palliative carer who asked those nearing the end of their lives about their regrets. She compiled the following list, ranked from most to least common:
1) I wish I'd had the courage to live a life true to myself, not the life others expected of me.
2) I wish I hadn't worked so hard.
3) I wish I'd had the courage to express my feelings.
4) I wish I had stayed in touch with my friends.
5) I wish I had let myself be happier.
I would very much like to avoid these, but if I am being honest, I can also see myself making some of these mistakes nonetheless. Overall, I am planning fairly prudently for the long run, yet at least three of the five scare me more than I’d like to admit. That said, I very much appreciate Paul Graham’s practical idea of putting them at the top of your to-do list; who knows, maybe I’ll even incorporate that idea into Telos.
What are the most devastating regrets that will torment my generation 60 years from now?
These are the regrets of our elders today. Unfortunately, much like following your parents’ advice on how to succeed, taking them at face value means operating, at best, on software that is thirty years out of date. We would be wise to re-derive the formula for ourselves, which brings me to my next point: what will be the most devastating regrets that haunt my generation at the end of our lives?
For a new generation, I believe a new regret is inevitable. For me to make this claim, however, two things must hold true: the regret needs to be predictable enough that many will be affected by it, yet non-intuitive enough that most will fail to address it personally. What I suspect this new regret will be is rooted in a remarkable quote from 1890: “Our life experiences equal what we have paid attention to,” William James remarked, “whether by choice or default.” I believe the following regret will not just be an addition to the list; it will be its frontrunner. I believe the most dreaded regret of my whole generation will be: I wish I had directed my life by choice, not by default.
The inadvertent consequences of a new, perverse system (what economists term negative externalities) have taken hold and in turn corrupted our everyday technology in pervasive ways. By extension, they have degraded our ability to hold the large-scale conversations on which democracy is built; deteriorated our lifelong relationships; and, worst of all, depleted our brains and undermined our self-authorship, our ability to be stewards of our own destiny.
The root cause is an economic phenomenon: the introduction of a seemingly impossible new business model. Suddenly, something we once paid full price for is sold dirt cheap, below cost, or even offered entirely free of charge. On the surface, everything appears fine. The quality appears intact, demand surges, and the architects behind this model are momentarily celebrated as visionaries. Yet beneath the surface, history tells a different story: a warning about the hidden costs of what Tim Wu calls the Attention Economy.
The attention economy is a pervasive force that has impacted a wide range of industries, often with adverse consequences. Examples include the news industry, the Great Poster Problem in Paris, and the rise of global passive consumerism.
Benjamin Day is best known for founding The Sun newspaper in New York City on September 3, 1833. The Sun was one of the first successful “penny press” newspapers. Day priced his newspaper at just one cent per copy, significantly cheaper than other newspapers of the time, which usually sold for around six cents. This unheard-of price point made seemingly no economic sense, but it rendered The Sun dramatically more affordable and broadened its readership to the entire working class. The Sun was somehow being sold below the marginal cost of production. How was this possible?
Instead of selling newspapers directly to readers, Day implemented a novel business model that resold readers’ attention to advertisers. This marked a significant departure, shifting revenue away from subscriptions and toward advertising. It also gave rise to a new economic imperative Day could hardly have foreseen: by entering the business of reselling attention, he became chained to the need not only to capture readers’ attention but also to hold it indefinitely. What could go wrong?
Under increasing competitive pressure, in August 1835 The Sun published a series of sensational articles claiming that Sir John Herschel, a famous astronomer, had discovered life on the Moon using a powerful telescope. Elaborate lunar landscapes inhabited by strange creatures, including bat-like humanoids and other fantastical beings, had supposedly been spotted. Sales surged. There was only one problem: everything was a lie. The falsehood spread incredibly far and wide, fueled by the self-reinforcing nature of social proof and by the elaborate pseudo-scientific effort made to pass it off as real journalism. Many point to The Great Moon Hoax as the birth of sensational journalism, the start of the erosion of the news industry, and the seed of the devastating post-truth era of “alternative facts” we face today.
A similar pattern emerged in late 19th-century Paris, where the development of lithography led to an overwhelming proliferation of advertising posters. This new technology enabled the mass production of colorful, high-quality images at relatively low cost. Without regulations to protect the city’s architectural integrity, public spaces were bombarded with advertisements covering buildings, fences, walls, kiosks, and even street furniture such as lampposts and benches. Companies engaged in advertising wars, aggressively pasting over competitors’ posters in high-traffic areas. The city’s beautiful streets were littered with layers of torn, weathered paper, leaving public spaces disorganized, dirty, and cluttered, and leaving Parisians feeling they had lost their city’s beauty.
Lastly, the attention economy found a new medium. The major American broadcasting networks’ programming could now be accessed for free, provided one purchased the receiver: a television set. By the mid-1950s, more than half of American households had a TV, and families would gather to watch programs together in the living room. When asked, people reliably underestimate how much television they watch, but tracked viewing shows the average American adult watching close to five hours per day, totaling around 1,770 hours annually, or approximately 74 continuous days of watching each year.
The incentives established by the attention economy in television are pervasive. Episode storylines are artificially dragged out, and unsatisfying cliffhangers are inserted to hook viewers into the next episode. In pursuit of ever-larger audiences, insightful documentaries and cinematic storytelling make way for programming that appeals to the lowest parts of ourselves. In every hour of content, 10 to 20 minutes are dedicated to advertisements introduced with the phrase, “We will be right back after these messages from our sponsors.” What is lamentable is that, as Hungarian-American psychologist Mihaly Csikszentmihalyi and many others have found, people on average self-report this time as best described as “uneventful” or “mildly depressing.”
Instead of selling products and services to end users, the attention economy opts to resell users’ attention to advertisers. On the surface, things might at first appear untouched, perhaps even improved by a sudden reduction or outright elimination of cost. However, looking historically at American journalism, French architecture, and global entertainment, a clear pattern emerges. Shifting the customer base, and thus the revenue source, alters incentives in a perverse direction: it invariably corrupts the integrity of what the product purports to be. It creates the illusion of a gain, only to reveal a deeper loss of something at first difficult to define, something we failed to appreciate until we lost it.
This separation of customer from end user disrupts the balance between supply and demand, leads to systematic pricing errors, and, in extreme cases, causes outright market failure. Ultimately, whatever you incentivize will happen.
There is a reason why consumer-facing products, like those from Apple, reliably delight millions by offering new capabilities, conveniences, and noticeable sparks of joy through their premium experiential goods: they are massively incentivized to do so. Their customers and their end users are almost perfectly aligned; in other words, they are one and the same.
Now contrast that with enterprise software, which you avoid like the plague. It makes you feel overwhelmed, actually distracts you from building things people want, and induces a state of mind that would make you eligible for prescription drugs. This is, again, because its makers are incentivized this way. Their end users are not their customers; their customers are the managerial class, who make decisions based on whether the quoted pricing looks low enough, or the list of supported API integrations long enough, to minimize their chances of being blamed by superiors. These feedback loops might make for good Excel sheets, but they will not make for great products.
Let’s take this example of the misaligned incentives of enterprise software a step further. Imagine you’re running a successful large-scale business, and a salesperson calls you with an ‘offer you can’t refuse.’ You are among the select few granted access to their brand-new project management tool. ‘But wait, there is more.’ This particular tool miraculously comes free of charge and is ominously called ADvantage. You would hang up so fast your phone wouldn’t even have time to register the call in your history. No responsible person would consider running their business this way. Now, imagine a tool that you would use to manage something even more important than your business: your life. Would you pick up the phone?
Neuroscientist Adam Gazzaley remarks in his book The Distracted Mind that most of us inhabit an information-saturated world where no one can keep up with the increasing volume and velocity of modern-day life. Saying this aloud feels like a fish describing the world to its fellow fish, only to be greeted with the question, ‘Wait, what is water?’
Herbert A. Simon, a Nobel Prize-winning economist and cognitive psychologist, commented in 1971 on an unexpected side effect of information overabundance. At first glance, one might intuitively think that the wealth of information offered by, say, the internet would lead to something akin to a second scientific revolution. This belief was widespread among early internet proponents, who envisioned an inevitable explosion of wisdom among the populace. But history tells a different story. As Simon famously warned: “In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention.”
With the rise of the attention economy comes a perverse imperative: to increase screen time and engagement at all costs. Over the last two decades, this has become the dominant business model behind the free and open Internet, diverting us from experiences that support our intentions and steering us toward a dystopian future focused on capturing and holding people’s attention. Today, the average American spends more time in front of screens than sleeping. An average Gen Z individual spends about 4 hours on social media and a total of 8 hours and 54 minutes on screens daily. That adds up to roughly 3,250 hours per year, amounting to approximately 24 years over a lifetime. How much of this time is truly well spent?
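For the curious, here is a quick back-of-the-envelope check of those screen-time figures. It is only a minimal sketch: the 66-year span of heavy screen use is my own illustrative assumption, not a number taken from the studies above.

```python
# Back-of-the-envelope check of the screen-time figures quoted above.
# Assumption (mine, for illustration): roughly 66 years of life lived at this rate.

daily_screen_hours = 8 + 54 / 60              # 8 hours 54 minutes per day
hours_per_year = daily_screen_hours * 365     # ~3,250 hours per year

assumed_years_of_use = 66                     # illustrative assumption
lifetime_hours = hours_per_year * assumed_years_of_use
lifetime_years = lifetime_hours / (24 * 365)  # expressed as continuous 24-hour years

print(f"{hours_per_year:,.0f} hours of screens per year")        # ~3,250
print(f"{lifetime_years:.1f} continuous years over a lifetime")  # ~24.5
```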
In a society more materially rich than our ancestors could ever have dreamed of, the data today point to a decline in well-being indicators across the board. Technology is miraculous. The internet is amazing. But letting social media function as the default operating system of most people’s lives has arguably been a catastrophic mistake. I am more than willing to accept that if the supply of cool and interesting content increases, it follows that we would allocate more of our time to it. If, for example, those who were most engaged with online content experienced greater well-being, built better relationships, or found greater purpose and direction, it would make for an interesting discussion. Unfortunately, the reality is, respectively: they do not, they don’t, and the reverse is true.
When these platforms are used as designed, the most avid social media users seeking connection end up feeling lonelier; those who follow the news become verifiably less informed; and those who look for role models are left to follow ‘influencers’, a nonsensical term from an era of confused individuals that no sane person should legitimize. These so-called influencers primarily succeed in promoting pettiness, self-obsession, and get-rich-quick schemes. They lure young men into thinking investing means betting on crypto and entrepreneurship means drop-shipping, while young women are tempted to reduce themselves to a collection of objectified body parts for others to jerk off to. And worst of all, it gave us Logan Paul.
“Each medium, like language itself, makes possible a unique mode of discourse by providing a new orientation for thought, for expression, for sensibility,” Neil Postman writes in Amusing Ourselves to Death. “[…] to take a simple example of what this means, consider the primitive technology of smoke signals. While I do not know exactly what content was once carried in the smoke signals of American Indians, I can safely guess that it did not include philosophical arguments.”
Each technology has its own agenda, and thanks to the attention economy, today’s communication technology is not designed to foster understanding, nor is our information technology in the business of informing us. Instead, these technologies are crafted to capture and hold our attention indefinitely, regardless of the long-term cost. The driving force can be summarized as sophisticated adversarial technology creeping into our lives, disguised as free modern conveniences. These technologies interfere with our attention, undermining individuals’ aspirations and dreams, and drastically shift the balance of human attention from endogenously driven to exogenously manipulated.
Put bluntly, the internet in general, and social media specifically, have essentially become the technological narcotics of the 21st century. People are working tirelessly to distract us, making each moment spent online more addictive than the last. Even worse, we are increasingly living behind screens, wearing our mobile devices like gloves and carrying them with us from cradle to grave. As a result, the world is becoming an ever more addictive place, one for which we have yet to develop the antibodies needed to protect ourselves. We are left to fight for our attention, volition, and self-determination, 24/7/365, against machines that continually improve and never sleep.
This is why I fear that, at the end of their lives, the most widely shared regret among my generation will be a new one: “I wish I had directed my life by choice, not by default.” I believe this is the unfortunate yet entirely predictable fate of those in my generation who paid the ultimate price by paying no attention at all.
The way out of this challenge is to address its root cause: the Attention Economy, as defined by Tim Wu. To understand how we aim to offer a viable alternative to the Attention Economy and tackle the Social Dilemma, explore our Mission, where we outline our vision in detail.
Below you can read a story about Samantha, a teenage girl who is part of the first generation raised by social media, and the pernicious effects it has had on her and her friendships.
Learn more about Telos and our story.