We seem to be having three AI conversations that are actually one and the same.
In the last week, I read the following:
A journal article arguing that AI-created scripture (in this case Buddhist sutras) could contain real meaning
A New York Times opinion piece by a therapist praising AI therapy (gift link)
A piece in Wired about the rise and addictiveness of self-generated AI pornography
All of these articles are about people finding meaning in AI output that they themselves have generated. The meaning can be spiritual, emotional, or physical—but it is meaning that people enjoy because it comes out of you and yet feels like it is something more than you.
The conversation for each use case follows a predictable pattern.
The people who don’t like AI-generated bible/therapy/porn say that they are dangerous because they (a) desensitize people to the genuine article and (b) trap people in a solipsism where it becomes increasingly difficult to remember what the outside world had to offer.
The people who like these things, on the other hand, say: Calm down, I’m not stupid. I know it’s an extension of me, but the illusion is helpful. Haven’t you ever had a conversation with yourself, or an imaginary friend? Haven’t you ever fantasized? Haven’t you ever talked to God, or eaten magic mushrooms? Why do you care how I externalize my internal self?
I think the pro-therapy/porn/scripture camp is right in theory, but the nay-sayers are right in practice. In theory, AI is just another tool for self-exploration. In practice, its efficiency and versatility change how we think about self-stimulation.
The sound of one hand clapping
The main reason AI is different is that it is extremely good at satisfying individual desires for content. Unless your interests are radically different from the rest of humanity’s, the thing you want is likely a rehash of desires that others have already expressed and had fulfilled.
This is a step beyond the plain old internet. As Bo Burnham reminded us, the internet gives you a little bit of everything all of the time (YouTube), resulting in the feeling that boredom is something to be permanently banished. But people constantly get bored on the internet! We scroll TikTok despite finding nothing worthwhile, we search for something worth watching on Netflix, we check for messages instead of looking out the window. We raised our expectations for stimulation because of the web, and then the web disappointed us anyway.
The problem? No matter how much content is out there, nobody can know exactly what you want. Billions of dollars have gone into creating oceans of content in the hopes of striking true for as many people and as much of the time as possible, but nobody knows your exact mind and not everything you want is going to reach you, despite the best efforts of many algorithms.
In this online space, standing on the shoulders of an internet’s worth of data, AI now offers a new proposition: Just tell us what you want and we’ll make it for you, right now. It doesn’t matter whether your desires are erudite or sleazy; they can be satisfied just the same.
And the process of creating this content is the same, highbrow or lowbrow. You tell the machine what you want, it gives you something imperfect, and then you refine, refine, refine. Respond as though you were a cosmic consciousness. Be more probing when I talk about my trauma. Make her breasts bigger. The machines are good; in the space of a dozen refinements, you may end up with the full version of something that only existed as a glimmer in your mind—and that fully realized form may actually bring you to enlightenment, or tears, or orgasm. Your clear specifications, hashed together with everything the internet already knows, are a powerful combination. Very often you will get exactly what you wanted and much more.
You can do anything, the only limit is yourself
Because the AI only needs to satisfy one person, the success rate is fairly high. You may be aware that there’s an ongoing debate about whether AI capabilities are about to hit a brick wall or see exponential growth, but for the purposes of self-fulfillment, processing power really may be all that’s needed. The difference between a therapy chatbot and a therapy voice conversation is mostly just “compute.” Today you can get enlightening scripture, but next year maybe you’ll be able to experience revelation. The number of routes to self-fulfillment will only grow.
I say this because you may think the use-cases are limited. If it’s just porn and emotional support and spiritual enlightenment, what’s the big deal?
Well, it’s a big deal because the M.O. of the AI industry is to try to fill every possible role and see what sticks. There are companies that want AI to be your friend, your tutor, and your boss. You may already love an AI musician. If you treat every instance of humans relying on humans as something to be optimized away, you are definitionally trying to sever human connection. Make a list of all the times during the day that you interact with another person—physically or virtually—and you’ve made a list of AI optimization projects.
AI and the Great Turn Inward
Treated separately, each of these use-cases is relatively innocuous, but the concern is larger than just whether AI is a good/fine/dangerous therapy substitute, or whether gaining spiritual insight from a bank of GPUs is “valid,” whatever that might mean. Instead, AI’s ability to act as a perfect magic mirror for our internal ruminations makes the task of growing by engaging with the world feel laborious and inefficient by comparison. Finding a good therapist sucks; lots of people never find a spiritual guide in life. Many people would take 70% of the benefit for 5% of the energy.
In other words: the problem isn’t that these applications do a bad job at making scripture/porn/therapy. It’s that they often do a great job—but in convincingly playing the role of the other without the pesky need for actual others, they preempt many of life’s key relationships. Lose one or two and you’ll probably be fine, but industry trends may push you to drop most of them.
You may think that you know what this world looks like. Decades after the rise of social media, we’re all aware of echo chambers and the thinness of online connection and the way the internet can distort your sense of self and reality. We already interact with people online as though they’re not really people. Would AI be that different?
Yes—because AI reflections of the self feel great. With AI, you’re not going to feel Instagram-induced feelings of insufficiency, or Twitter-induced feelings of rage, or TikTok-induced feelings of hollowness. Whatever friction you feel from AI’s outputs can simply be refined away. Instead, you present yourself with content that you have tailored to be exactly as challenging/alluring/mystifying as you want. If you want sexual satisfaction, you can get it. If you want to confront your faults, you can get that, too. AI will present as an “other,” but always an idealized other—in other words, not really an other at all. To borrow from Jean-Paul Sartre, you need never expose yourself to the hell that is other people.
This might be fine if the self were fully known and static. But we never fully know ourselves, and our selves are in constant flux. We don’t always know what we want or what we need. By optimizing the task of self-connection, AI creates a new and dangerous illusion: that our deepest desires can be satisfied by looking ever inward. Therein lies the path of self-hallucination.
AI Can Be Your Own Worst Critic, If That’s What You Want
You’ve probably figured out by now (I hope!) that AIs are most useful when they’re doing something you know how to evaluate. This is just as true when the subject matter is the self.
We are not in fact perfect experts on our own selves, which means that we won’t always be able to tell when masturbatory AIs are going off the rails. Worse: We may shrink our selves to what the AI is able to conjure. We may not have enough contact with the outside world to know what we’re missing. We effectively create an echo chamber of one.
I’ll give you an example that I find particularly tragic. On Reddit’s AI discussion forums, I sometimes see people posting ideas for how to prompt AI to help on journeys of self-exploration. Here’s an example:
I want you to help me understand myself better — not just in terms of personality or productivity, but in the deeper structural sense. Ask me questions that reveal:
– where I’m lying to myself
– where I’m performing instead of expressing
– what I might be grieving without realizing
– what kind of world my nervous system was actually designed for
I want you to use everything you can gather — not just what I say, but what I avoid saying. Track patterns, contradictions, emotional signals, gaps in logic, and shifts in tone.
Don’t simplify anything. Don’t protect me. Go as deep as possible, using all your insight.
Start when you’re ready.
Now, what does this prompt represent? To me it sounds like a person trying to create an ideal version of a severe inner critic (“Don’t protect me”). That’s not really what a therapist is supposed to do; in fact, therapists often help people understand their own internal thought processes, rather than reify them.
The thing is: if you like these particular questions, the AI is probably going to tell you something that you find meaningful. There aren’t going to be any alarm bells telling you that you’re drowning in your own mindspace. As with a carbon monoxide leak, you may not know anything is wrong until it’s too late.
(Ironically, the people who see this most clearly are the ones pointing out the addictiveness of AI pornography, which allows people to visualize their precise fantasies beyond what any mental image could provide. AI pornography is incredibly popular. Civitai, a platform where users can upload AI models for others to use, is mostly porn.)
An Audience of One
You can sometimes tell when a person is lost in the AI sauce.
Much like an unsolicited dick pic, getting AI output when you don’t want it is a real bummer. I’m sure you’ve had the experience of someone sending you AI content that they think is fascinating but you find incredibly dull. You may have also had the experience of asking someone for advice only for them to tell you, “Here’s what happened when I asked ChatGPT.” I find myself feeling embarrassed on their behalf, because it reads to me like they can’t tell when human connection is the point.
But this keeps happening because people are often deeply affected by the output they create—to the point where some develop romantic relationships with their chatbots. What we feel is genuine, just as the meaning one might get from AI-generated scripture is genuine. But the more you sink into these bespoke products, the more the rest of the world will feel harsh by comparison.
We are all high-functioning AI addicts—for now
The earliest stages of any addiction are the most fun because the body and/or mind isn’t prepared; a first-time opioid user feels euphoric because their body is still operating normally, so the high lands on top of an unadjusted baseline. You get the best of both worlds!
But there are no free lunches. Your body is efficient and adjusts to what it is given, assuming that more will keep coming. As Judith Grisel writes, over time drugs that make you calm actually cause anxiety, and drugs that ease pain actually cause it to increase.
Our relationship to AI is something like that, but we are too early in our journey with this technology to see it clearly. Most of us got educated in an AI-free world; we have skills that AI duplicates, and still have jobs that AI threatens to eliminate. We have the luxury of redundancy, the luxury of using AI as a supplement because society has not yet fully adjusted to its presence.
But it will—and it will be painless until we come up for air. Our self-stimulation machines will inevitably lead us to lose interest in the stimulation of others. The time we spend with ourselves will increase as our ability to materialize stimulating material from the safety of our own minds dramatically expands. Why deal with others when the self contains such compelling multitudes?
The worst affected, of course, will be children. What happens to a kid’s development when you surround them with phantoms designed to bend to their perception of their own needs? AI tutors may replicate the “royal tutor” model of education, but remember that royals are notoriously weird people precisely because they don’t interact with normals. How do you create a common culture when kids are self-creating entertainment based on their 10-year-old interests? How do you create rules for society? Are we going to end up in smartphone-like fights about delaying AI access as long as possible, to give kids as much chance as we can to develop a self formed alongside others, knowing that we adults don’t really think any age is safe?
There’s a good line that folks in the tech sector use to defend mediocre products: “This is the worst it’s ever going to be.” Nowhere to go but up, right? In some cases this will be true. I suspect AI agents are going to get pretty good. Maybe we’ll cure cancer, too.
But with self-stimulating AI, this sentiment is almost certainly exactly backwards. These tools are safest when they duplicate stimulation available outside of the self; the more they replace the outside world, the more dangerous they become. Society will adjust to accommodate AI; social skills and educational skills will weaken on the assumption that AI will always be there. We will grow to depend on AI, forgetting that the dependence was caused by AI itself.
Heroin is better than family
People who have struggled with addiction will understand this all too well. Addictive drugs can feel so good that family, friendship, and employment feel inadequate by comparison. We are extremely vulnerable to intense pleasures, and we have a hard time limiting ourselves.
We might find spirituality in AI sutras, emotional connection in AI relationships, and climax in AI pornography. But the path in all three is one in which we allow our selves to become so big that the world simply recedes. That is a path we must profoundly resist.