Work ongoing on the next longer post. Speak soon.
A polarising tweet from this week:
And some (much-liked) responses:
(here’s the original tweet if you’d like to examine the replies and quotes yourself, where you will see many more of the below)
Maybe I’m going to get called a moron too by the end of this one, let’s find out.
But when I read those three responses, I thought the same thing each time: have you people ever met a college student? Have you ever been one?
I totally agree that the idea of college is to allow ambitious young people to dive deep into the knowledge well of their choosing, but LORD does it take some overconfidence to suggest that that is what actually happens.
Here’s the story: universities scale up from intimate breeding grounds for the elite of the elite to everyone else. This is good. Now if you want to learn something, e.g. psychology, you can. So do 500 others born in the same year as you. The university now has a measurement problem, and needs a clean way to separate the great from the just ok. So it designs a curriculum of assignments and final exams spanning three years. Your performance on these decides how good you are at psychology before it offers you to the world.
One of the tweeters says this: “the job of a student is not to output "schoolwork" as if it's a commodity, but to learn through the processes of creating and completing that work.”
But that is not what universities tell their students their job is. They tell them to get good grades. To be more specific, to write the words on the paper that their teacher will like seeing. To be even more specific, to write the type of words the teacher wants to see. Not what a good answer necessarily is, but what it looks like. To learn more about why that is a bad thing, I recommend Lou Keep’s Uruk series. Or reading Seeing Like a State. Or allowing a business major to run your company.
I’ve sat in a lot of lecture halls. I’ve done a lot of assignments and exams. I did well on some of them. I also know some people who did really well on all of them. And you know what’s weird? How often those people cannot hold a 10-minute conversation about their area of expertise. How they flinch at a direct question.
Ever hear of Goodhart’s law? Maybe it was in the optional readings.
Replier 1 said: “The fundamental logical flaw in this tweet is the apparent assumption that students produce assignments for the teacher and not their own understanding. The work is the point.”
And the real flaw is this: if you want the work to be the point, don’t go to college. Want to stay in college? Consider using ChatGPT.
If your counterpoint is that having a degree is still the only way to prove competence in something, so not going isn’t a good option, my counterpoint is EXACTLY.
But the reason the tweeters above are wrong is slightly more complicated. It’s not that they just don’t know how college grades work. In fact, this doesn’t need to be about college at all.
The issue is that they see ChatGPT producing 500 words with references as fake, but a student writing a similar output as real. That a student is cheating themselves and their teacher out of a legit learning exchange if AI follows the rules for them. Abstracted out, the digital is the land of simulation, and the physical is the land of raw, unadulterated experience.
But that is a flattering interpretation of reality, and there’s layers to this stuff. For further information, contact your nearest French intellectual with post-something on their Wikipedia profile.
Point is this:
People don’t use technology to offload their real lives and passions to the simulation.
People offload the things that they are simulating already.
Let’s do one more example. Here’s a scene I’ve made up, but which has nevertheless 100% happened a million times in a thousand forms:
Paris. 31st December. 11:59pm. On that little park thing in front of the Eiffel Tower. Countdown is at 10, and the phones start to come out.
Guy at the back says to his mate: “What’s wrong with all these people? This is going to be beautiful, and they’re all watching through their phones. Like what their online followers can see them doing is more important than real life.”
Mate says back: “Yeah, exactly. Look at that couple over there, waiting to get a selfie when it hits 0, and a kiss for the camera, not for themselves. Why don’t they just put down the phones and remember this moment together?”
This misses the point of the behaviour.
Because if that couple did put their phones down, and just smiled at the beauty of the lights streaming above them, their smiles would be artificial too. If he puts his arm round her and kisses her on the cheek, even if no one sees, it would be just as fake. When they go home and fuck in the new year that night, they will be pretending.
Just like we pretend college kids are learning by doing their assignments themselves.
Otherwise, Instagram’ing fireworks and ChatGPT’ing essays wouldn’t have such an appeal.
These technologies don’t turn your real life fake. They point out how much of your life is fake already. Maybe you won’t like what you see, but it’s worth weighing up.
Unfortunately, you’ll have to evaluate yourself this time.