I love this piece, obviously, based on my glowing restack. I would add that some happy-go-lucky people who simply want contentment will be drawn -- are already drawn -- to AI companions, outsourcing simple family/community life to chatbots and snuggling up to the glow of their phone in a cozy dystopia.
I would say these people "won't care" in the sense that they are fat and happy cuddling with their virtual AI buddies.
The same people optimizing, producing, and stuck in the never-ending rat race of nothing ever being enough are excited for AGI because they've lost touch with what makes us human. Their utilitarian mindset makes them believe AGI is unquestionably going to happen (something I think is not possible, philosophically speaking), and they forget that the real thing driving progress is some innately human, creative, indescribable phenomenon that isn't replicated by 1s and 0s doing a bunch of if-else evaluations.
Thanks for this! Helping me get ahead on my future plans, which will likely be pivoting towards blowing up AI data centers once my humanities degrees become obsolete
Loved this article. As someone currently taking Ozempic, I would very much like to hear you expand on your thoughts about it "incidentally killing your desire for everything." Thank you.
I think the true value in the universe comes from conscious beings like humans, so if it turns out technology has made our species unhappier over the course of civilization, I’m more inclined to say progress was a mistake. I don’t, however, believe that technology has made our species unhappier, so progress is probably good.
“You could tell them now that AGI won’t ever happen and they genuinely wouldn’t mind. Same thing with respect to going to Mars.”
I’m not a deontologist. I think the value of the stuff we create isn’t in the stuff itself, it’s in the way it interacts with humans. The feeling of progress and conquering a difficult challenge; the betterness you feel moving up the social ladder. These are what I want to protect, not the specific necessity and idea that “we must go to mars”. Failure and success matter, the goal does not, unless it affects human lives (which is, luckily, 99.999% of all goals).
Luckily, AGI could solve these issues forever, which is why it’s the most important thing happening on earth right now. It could also bungle things up greatly, and I know a few smart people who say the deck of possible progress may be so rigged against us there’s a very high chance we’ll all be paperclips. I am not e/acc but I do believe more people should be working on AI than any other field right now. I just wish they had the discipline to work on solving AI safety instead of making “the cool part” with the highest chance of getting rich come as soon as possible, safety be damned.
Great article. This split is intuitively understandable and immediately gets at a lot of long talks I've had with friends and family. 70% at 5 things feels so on the nose for me - I always described my ambition as 4th place in every race.
One small nitpick - you mark complex language as emerging 10k years ago, but most linguists see complex language as a fundamental human trait. If we didn't have it, how could we have been in a gossip trap for all that time, right? I think from context you know this and it's not the point of the essay, but it always bothers me when I see the notion that ancient humans didn't have language. It's our one thing!
“Most people are content and would be content at any point in human history” - I don’t think that’s true. I think very few humans have ever been fully content. But I agree with the general distinction that some strive for happiness and some strive for more. The thing about striving for happiness or wholeness in a content age is that it often means living more in internet simulations, which would be bad for humanity.
“Why the hell did it take us so long for us to do anything?”
Because the truth gets buried. There is very good evidence there was an advanced civilization along the equator during the last glaciation, and that tool use etc. was developed much earlier than the experts want to admit.
Farming et al. wasn't invented like the light bulb; it was developed slowly over time under the pressure of rising populations. The same situation would have existed during previous interglacial periods.
Sources: Graham Hancock's various books; Cremo and Thompson's "Forbidden Archeology."
Very dynamic read, thank you! This is my first time seeing your page; you have a very interesting perspective and I'm glad I stumbled on it. I feel this is a conversation that should definitely be revisited as we see how society interacts more and more with AGI. What do you think?
AGs ruin everything, it doesn’t even matter if they’re actually conscious or not—only if they are perceived as such. As pleasants become more cringe and AGs become better, less toxic humans, the world will become a more cringe and toxic place.
It’s provocative and I respect that.