The Status Quo is Dead, Long Live The Status Quo
Why technology has the power to change you before you change it
(This is the final part of a deep-dive into self-identity and technology. Part 1 here, part 2 here, part 3 here. I unfortunately can’t guarantee that you’ll understand what I’m on about in this post without reading those three.)
OK, back to the main plot.
This series isn’t about Game of Thrones, or GDP growth, or whether college is worthwhile. It’s about certainty, identity and the status quo of systems. Time to tie this together, and explain why this matters particularly to the technology of the current era.
We are in a period that encourages everyone to build a story of themselves, sometimes quite literally on social media. This causes an increase in people seeing themselves as the main character in their own personal movie, with a coherent identity that they want everyone else to see and validate back to them. This is a source of certainty to the basic psychological questions of (i) Who am I? and (ii) What am I supposed to be doing? The economic system survives (self-serving bias, inertia) and thrives (GDP share) based on this.
All technological advancements have aligned with this trend, as any technological jump big enough to define a generation has enabled a further level of individualism. Artificial intelligence is no different, and now every digital service or form of media can be built personally for you and your preferences. To be more specific, they are built to help create and validate your perception of yourself and the world. Did I mention this is going to speed up over time, too?
They reinforce what you think is already true, and have the potential to fragment culture enough to mean the end of shared experiences and agreed truths. All you have left to feel sure about your place in the world is your idea of who you are, but you still need other people to validate that back to you.
The consequence is the systematic maintenance of images, which only feel validating if the other is there to confirm you are who you think you are. Whoops, turns out I’m half a century late on this analysis too:
“This human scene is a scene of mirages, demonic pseudo-realities, because everyone believes everyone else believes them.”
RD Laing, The Politics of Experience
“Fragmented views of reality regroup themselves into a new unity as a separate pseudo-world that can only be looked at. The specialization of images of the world evolves into a world of autonomised images where the deceivers are deceived. The spectacle is a concrete inversion of life, an autonomous movement of the nonliving.”
Guy Debord, Society of the Spectacle
But there is a chink in the story.
Because I think a fair riposte is that for all the different ways I can say the status quo is powerful behaviourally, the dominant narrative culturally is that we are sick of it, and that must be indicative of something. Let’s find out what.
End of 2023 means 2024 trend reports. We Are Social have ‘status quo disillusionment’ as the main driver of the cultural mood of the next year.
They go on to describe this disillusionment, saying “As ‘eat the rich’ becomes an established narrative in mainstream entertainment, and with trust in authority figures still low, users aren’t thinking twice about pushing back against monetisation, lack of accountability, and other blights on their products and services.”
Accountability, there’s that word again. I remember that came up quite a bit in Part Three. To work out what it means here, time for a case study of a rich authority figure that people don’t have much trust in.
Sam Altman is the man who made tech company boardroom politics mainstream gossip, and is one of the faces I saw most in 2023. He’s talkative, speculative, ambitious. No surprise his podcast appearances do numbers.
But I couldn’t help but notice that the part of social media that’s less trusting of Silicon Valley, who I would posit make up a good chunk of those diagnosed with ‘status quo disillusionment’, speak about Altman in a different way. They are unlikely to care when OpenAI announce their newest model, but they reserve quite a lot of interest for Sam Altman, the man.
Riding high on that cinema-battle that I was told was really culturally important at the time, New York Magazine published the much shared ‘Sam Altman Is the Oppenheimer of Our Age’ in September. It got a lot of praise on Twitter. It’s the most in-depth profile of the man I’ve seen, so with the title setting the stakes as equivalent to the atomic bomb, what it contains must be important.
(Note: What I’m about to say is by no means a personal critique of the author or their intentions. Things like this get written all the time. I’m just using this example because, well, they tell me the stakes are high.)
What you get in this piece is a deep dive into Sam’s personal and professional past, with examples of how he’s dealt with situations and people used as the puzzle pieces to put together his personality.
The jury’s conclusion, and please challenge me if you think this is a specious reading, is that Altman is: sort of unqualified, power-hungry, self-important, opportunistic, emotionally cold, awkward, and horrible to his sister. All of this builds towards the implication that you need to increase your angst about the development of AI because someone you wouldn’t like is in charge of one of the most important companies in the field.
The reason I bring up this whole thing is that there’s a weird sleight of hand move in that group of sentences. A move that is probably crucial to understanding why pieces like this one never amount to anything tangible.
Similar to our college grads from Part Three, by assigning all the accountability to an individual, you sneakily ditch all of your own as a player in the same system. Like, as much as the OpenAI staff worship Altman, you really think if we collectively shamed him out of the spotlight, someone else wouldn’t keep them on the same track? Apparently, Sam wasn’t even qualified to run the company, that’s what New York Magazine told me, and now they expect me to believe the way to win back power is to get someone else in the seat of the most important company in the world?
Surely your issue has to be with the fact that the seat exists at all. That’s where the disproportionate amount of power is. But the problem is that the seat isn’t a person. That’s way harder to criticise, and forces you to question a lot of the assumptions you hold close. And it doesn’t fulfil the needs of your readers, who have much more immediate psychological goals i.e. self-serving ones. (And remember who we said also benefits from collective self-serving reasoning?)
That’s why writing an op-ed about how Sam Altman is a bad person isn’t just ineffective in taking him down a peg, it is in fact exactly how you ensure power continues to be transferred his way. Or at least, someone like him. To paraphrase someone much smarter than me: you may not like the status quo, but the status quo sure does like you.
This is a basic framework for how the preservation of the self perfectly suits the preservation of the system, even when part of your identity is the idea that you are fighting it. It takes your accountability out of your hands, and offers you the comfort of certainty, but it takes your power as part of the deal.
Put it another way: technology enables individualism, and individualism pays technology back with interest. Once again, I wonder what happens when that cycle starts moving faster?
So we arrive at a tricky situation and, God help us, we turn to Hegel.
“But the individual is then also separated from the universal, and as such it is only one side of the process, and is subject to change; it is in respect of this moment of mortality that it falls into time. Achilles, the flower of Greek life, Alexander the Great, that infinitely powerful individuality, do not survive. Only their deeds, their effects remain, i.e. the world which they brought into being. Mediocrity endures, and in the end rules the world. There is also a mediocrity of thought which beats down the contemporary world, extinguishes spiritual vitality, converts it into mere habit and in this way endures. Its endurance is simply this: that it exists in untruth, does not receive its due, does not honour the Notion, that truth is not manifest in it as a process.”
(As an aside, I’m no longer accepting any criticism of my writing being inaccessible. I will at least give him credit for making me laugh with the use of the word ‘simply’ towards the end.)
In any case, he wasn’t painting a positive picture. Social and political systems are often designed to maintain stability and avoid disruption. This tends to favour the perpetuation of existing, average conditions over radical changes.
Feeling like a main character? The system thanks you for staying in line.
So, what was this mini-series about?
I said that this Substack was going to be about behaviour, culture and technology. It is about those things, but I hadn’t put them in order yet.
Technological progress consistently increases the independence and freedom of the average person. Basic psychological goals do not change, but the means you opt for to meet them depend on the society you are thrown into.
More individualistic societies produce and then get reinforced by a system that is funded by reinforcement of identity. This creates growth, and that growth creates new technologies. And the use cases of this technology go back to the original psychological goals. Et cetera.
I’m pro AI, and think it can also offer the solution to most of the problems I can identify. But to offer a cohesive analysis on how it will affect people and culture, we need to be realistic about the processes by which society will accept it and adapt to it. And the answer is that we willingly hand over most of the control, no questions asked. And for those solutions to come to the fore, it requires a much larger psychological shake up than can be achieved simply by putting the tools in people’s hands.
The era of hyper-personalisation. A life built in your own image and preferences, but powered and decided by algorithm. The big question is: what does that algorithm want? I think that answer starts with acknowledging that when technology becomes analogous to God, it’s worth being wary of a God you’ve made in your own image too.
This concludes the reasons why technological change happens to people without them realising it. They are:
People are not built to question things, and when they do they often feel powerless.
And now, the motivation to preserve self-conception is designed to align with the motivations of the economic system, which has tech companies at the top.
If we agree that technological change happens to people, we have a new question to begin with: what is the next era actually going to do to people?
Sit back and watch. As it turns out, you haven’t given yourself much choice.