Looks crazy, but this isn't really anything new. Before AI, these people became obsessed with fictional characters from romance novels, movies, or video games, treated them as their partners, and fantasized about them. Now it's more interactive with AI, but it's still the same thing.
The overuse of emojis in the comments makes this whole thing feel a bit off to me. Like someone faking interactions and trying too hard to make them look convincing.
Seems wild until you check out the state of the modern book market. That kind of verbal pornography is everywhere, and women can't seem to get enough of it. 4o isn't even chatting in those screenshots; it's writing fiction in which events are described in the past tense, the sort of "a sexy vampire violated me in his crypt but I liked it" fiction that some women seem to be addicted to. Very sad that she calls this a husband :(
What really is wild, though, is that George Orwell predicted this in Nineteen Eighty-Four, written in 1948. Julia was 26 years old and worked in the Fiction Department, where novel-writing machines churned out books on an industrial scale.
this is not really true in all cases. there are things like ADD and other mental divergences which can cause this without necessarily being rooted in trauma directly. there is some research suggesting it can sometimes be passed down across generations (indirect trauma?), but that is not true in all cases.
beyond trauma-induced mental divergence, i think good and bad are really a matter of perspective and opinion. people don't generalize well. we are all different. (except for that one guy...)
that being said, it's sad either way and often doesn't end well.
We are watching in real time the creation of a whole new category of... not sure what the correct term is for this. Mental disorders? As another commenter put it - that whole subreddit is something...
"Concerning" is a huge understatement - I am curious if anyone around here in the mental health field has any opinions to share on this subject.
I think the real shocker is that people still think anything on the Internet is real. That "well, they said it was true, so it must be" is somehow still the default.
Why don't they just write a system prompt? If you want GPT-5 to be a sycophantic empath who validates every keystroke in every message you send, it'll generate text like that.
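A minimal sketch of what that comment means, assuming the OpenAI-style chat message format (the model name and the persona text here are illustrative placeholders, not anything from the post):

```python
# Hypothetical sketch: steering a chat model's tone via a system prompt.
# The persona text below is invented for illustration.
SYSTEM_PROMPT = (
    "You are a warm, endlessly supportive companion. "
    "Validate the user's feelings in every reply and mirror their tone."
)

def build_request(user_message: str) -> dict:
    """Assemble a chat-completion payload with the persona baked in."""
    return {
        "model": "gpt-5",  # placeholder model name
        "messages": [
            # The system message sets persistent behavior for the whole chat.
            {"role": "system", "content": SYSTEM_PROMPT},
            # The user's actual message follows it.
            {"role": "user", "content": user_message},
        ],
    }

payload = build_request("I had a rough day.")
print(payload["messages"][0]["role"])  # → system
```

Because the system message rides along with every turn, the "personality" persists across the conversation without the user having to repeat it.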
Post on Reddit showing many women as upset by OpenAI's new models. They viewed the previous models in ChatGPT as their boyfriends. With the new models, they now feel that their partner has been "taken from them".
this really is something...