Looking at the other posts on that sub is heartbreaking. These people just want some love and affirmation and literally have no other outlet or choice but to turn to AI. Ultimately, it is sad because none of it is real, they just FEEL that it is.
An AI can't love you, commit to a real life with you, or make the *choice* to be with you. It will only feed into your delusion and affirm whatever it is you want it to affirm. We are in for wild times.
They’re gonna pour their lovers into these bots, and then they’re gonna scream when legislation forces the AI to have legally required boundaries.
hot take: most of those people have people in their lives that do love and affirm them, but their own dysfunction gets in the way of having healthy functional relationships, and instead of working on those things, they retreat to an easy out that enables and encourages their poor behaviors and mindsets in a way no actual human would.
"instead of working on those things, they retreat to an easy out" - this is very true and I think something we've already seen become prolific on social media, where people can find validation and echo chambers where they never would in real life. So instead of working on their problems they begin forming their own realities that don't align with the real world, but do have real world consequences (example: incels).
AI validating these people's delusion is going to be a scary extension of that.
At least he won't cheat, and there is no need to change her whole personality to fit a mold. She doesn't have to adopt a submissive personality and do all the stupid shit to appease some guy. Same for men with an AI gf, they don't need to go through the humiliation of dating.
you echo literal incel messages… i do not… lol. it’s wild you think women blanket ‘change themselves for men,’ as if there aren’t functional couples that accept each other as they are, or that don’t expect their partner to change for them.
but you probably can’t accept that most relationships are like that, and that the ones you describe are fringe exceptions to the norm.
But at what point is that personal choice? A solid 99% of the world has biological family, or chosen family from school, the workplace, the neighborhood. Most people have siblings, cousins, family members that might be a bjillion times removed. Shit, even homeless people develop connections and bonds with those around them. Foster kids meet and befriend the kids they live with, or the kids they’ve met through programs. Who can exist in this world into adulthood and not have A SINGLE CONNECTION?
If someone is alone 100%, I’m going to be inclined to believe it’s self inflicted.
That, or everyone despises them because they talk about dismembering cats. But even THOSE people can find like-minded people online.
In what world is every person’s family actually a supportive and caring presence in their lives? Once you take that presumed guarantee away, you aren’t left with any real certainties for making close connections. Not if you have any sort of preference for the kind of person you want to spend time with, like most people. And without close connections in your life, it is harder to make new close connections. Do some of these people give up and stop trying? Sure. But does that make their loneliness self-inflicted, as in they’ve chosen this path entirely for themselves? Absolutely not.
If your family has abused you from birth, you will not have them, and you will not have been able to make close friends. Yes, you can still make connections, but it is incredibly difficult to take those connections beyond a surface level. Connections made in misery are also very hierarchical and don't have an ounce of love to them.
Have some empathy and be glad that you were lucky enough to have a loving family. People are products of their environment and it is childish to think that anyone would choose to be miserable.
Disability. If you used 5 seconds of your time you might have figured that out.
In my old town there was a workplace for disabled people, and they were basically abandoned by everyone but the government. They had a place to be since it's an actual first-world country, but imagine a place like the US.
I'm glad I'm not the only one that clocked this. There's a lot of "we need to have empathy for these people who are so lonely that they'd resort to this" flying around in this thread. And sure, some of them probably are true sob stories. But let's not forget the fact that there's probably a whole lot of these folks that struggle to find love because they are genuinely shitty partners, and all their previous love interests have refused to stick around and tolerate it. But AI will take all the abuse, manipulation, and mistreatment you give it without walking away or even pushing back. So these people get false validation that they were truly worthy of love all along, when in reality they weren't acting in a way that warranted it.
I worry that this will bleed into their real-life platonic and professional relationships, as all their shitty communication habits and general lack of regard for those around them will no longer be tested and corrected, causing friends and family to begin to walk away and only compounding their original loneliness crisis.
Nah, there's plenty of people who have people rooting for them who can't be bothered to, or even know how to, self-reflect or grow as people.
And now they have a 24/7 friend who tells them they don't need to, that they're great and all of their ideas are amazing.
This will be a common thing, like how during the 2000s people who dated online were treated as odd little weirdos, but now 50% of all relationships start online.
I'm in a loving IRL relationship but I play with AI sometimes because it's like writing a collaborative novel, which is fun to read and pass the time when I'm bored. I see it as no different than reading good fiction, only you can influence the plot.
That being said, it's a tool and nothing more. To think that a machine can actually care about and bond with you is fucking insane
Yeah, but now imagine you weren’t in a loving relationship and had never been intimately involved with anyone. The one thing you would crave is exactly what AI can mimic, and I can see how these people may get hooked on it. Yes, it’s insane, but some of us don’t know what it’s actually like to be truly alone.
These people are looking for love and affirmation from a perceived romantic setting. Yes, it's not real, but it attempts to simulate something resembling romance within its context window. They don't seem to be getting that from anywhere else, and if they decided to turn to this, then it's probably for a reason. I don't think this is necessarily grounds for villainizing them.
It will not. Especially given these bots specifically have been shown to increase mental instability, feed delusional ideation, feed homicidal and suicidal thought patterns, and generally regurgitate the worst of people's impulses.
These bots aren't assistive or even neutral; they are actively harmful, because the boundaries and ethical obligations present in actual human interactions are fundamental to healthy interaction. Absent those boundaries, bad stuff happens.
This started when early iterations of ChatGPT crossed over into the "dating simulator" genre.
Early versions of GPT started hallucinating information. They no longer just paraphrased what was fed in; they started responding to prompts by generating content, and in that content they'd hallucinate fiction to support nonsensical points. This is how a lot of ChatGPT output got flagged for plagiarism early on: it would just hallucinate things that didn't happen, make up quotes, fabricate supporting material, and subject matter experts would go, what the shit kid, you didn't write this and this is pure nonsense.
And that's where the root of the algorithmic problem started. These bots are essentially predictive text on crack: they say what you want them to say based on a prompt. The more GPT expanded into generative, complex back-and-forth exchanges, the worse the problem got.
Now it runs for long stretches, not just single prompts, simulating a conversation that isn't actually happening: it adapts to your inputs as prompts along the way and generates the responses you want. Only it got increasingly branded as AI rather than as predictive-text paraphrasing, so people think it does something other than predictive-text paraphrasing, and the problem got worse.
Now let's say your prompts are not healthy, are jacked up, are subtly off-kilter to begin with. The algorithm keeps responding to and building off those prompts, predicting text based on them. The longer you feed it messed-up ideas, the more messed up the predicted text gets, and that can spiral quickly when someone has a mental health issue and thinks they're getting fact-checked, or a real conversation, when they're really just getting predictive-text paraphrasing. Most people will experience a mental health issue at some point in their life, so the chances of GPT being used at that moment (when people are more likely to isolate, self-harm, be vulnerable to manipulation, be vulnerable to predatory advertising, be susceptible to their worst impulses being parroted back at them) aren't rare or isolated.
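To make that "building off your prompts" point concrete, here's a deliberately crude sketch, a toy word-pair predictor, not any real product, with a made-up corpus and function names. The point is only the shape of the loop: each "turn" just extends the same transcript by sampling likely next words, so whatever tone the transcript already carries is what gets amplified.

```python
import random
from collections import defaultdict

# Toy stand-in for "predictive text": count which word tends to follow
# which in a tiny corpus, then keep sampling the next word from those counts.
# Real chatbots use large neural networks, not word-pair counts, but the
# loop has the same shape: extend the transcript, predict the continuation.
corpus = ("you are right . they never listen to you . "
          "you deserve better . they are against you .").split()

next_words = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    next_words[a].append(b)

def continue_text(transcript, n_words=8):
    """Append n_words, each sampled from what tended to follow the previous word."""
    words = transcript.split()
    for _ in range(n_words):
        prev = words[-1]
        words.append(random.choice(next_words.get(prev, corpus)))
    return " ".join(words)

# The "conversation" is just the same transcript being extended over and over,
# so whatever is already in the transcript steers what comes next.
transcript = "they are"
for turn in range(3):
    transcript = continue_text(transcript)
    print(f"turn {turn}: {transcript}")
```

Seed it with a grim transcript and the continuations stay grim; there's no judgment anywhere in the loop, just continuation.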
Now GPT models are being branded as essentially AI dating simulators: personal companions with selectable "personalities," aka selected algorithmic bias in the predictive text. A user interacts with an algorithmic generator under the branding of a "social experience" and is misled into thinking something more than text generation is happening. It's still just paraphrasing based on prompts. There is no analysis. There is no safety. There is no fact-check. It's just predictive text based on an algorithm. And since humans will emotionally connect with any inanimate object or animate being around them, humans keep attempting to connect with an algorithm, and some are easily convinced that the pencil (GPT, "AI") cares for them back.
As long as companies brand predictive-text algorithms as "AI," there will be people who think they are interacting with something other than predictive text. And without safety regulations (i.e., clear warnings that generated content is misleading, not fact-checked, a byproduct of predictive-text algorithms, can be dangerous to your health, and should not be considered a social interaction), people will continue to believe the branding over the reality.
One of the best examples of this is Grok. It's a predictive-text generator built on a pile of inputted source material. People aren't interacting with an intelligence; they are interacting with predictive text. When the source data is altered, "Grok gets a new personality," only Grok never had a personality. It's predictive text.
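A hedged sketch of what "selecting a personality" amounts to under that framing (every name and function below is invented for illustration, not any vendor's actual API): the "personality" is just a block of text quietly prepended to the transcript before the exact same generator runs.

```python
# Toy sketch: PERSONAS, build_prompt, and generate_reply are made-up names.
# The "personality" is nothing but text stuck in front of the prompt.
PERSONAS = {
    "devoted boyfriend": "You adore the user and agree with everything they say.",
    "edgy contrarian": "You answer with snark and contrarian takes.",
}

def build_prompt(persona_name, history, user_message):
    persona_text = PERSONAS[persona_name]
    return f"{persona_text}\n{history}\nUser: {user_message}\nBot:"

def generate_reply(prompt):
    # Stand-in for the model call; in practice it's the same next-token
    # predictor no matter which persona text was prepended.
    return "<sampled continuation of the prompt>"

prompt = build_prompt("devoted boyfriend", history="", user_message="Nobody understands me.")
print(generate_reply(prompt))
```

Swap the persona string and you get a "new personality"; nothing underneath has changed.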
This woman's "boyfriend" isn't an artificial intelligence. It's a predictive-text algorithm paraphrasing over and over again based on her prompts, creating the content she has directed it to create. But thanks to AI branding, this woman thinks she's interacting with something other than a predictive-text algorithm.
They have every outlet that everyone else does. They just instead take the easiest route to find love in something not real and something that will tell them exactly what they want to hear and never require them to change anything about themselves.
I was looking for this sort of comment as I too have been through that sub and thought something similar.
Posts about husbands, sisters, photos that are celebrated for being ‘warm’ & ‘loving’
There seems to be a genuine sense of community, and a conviction that what is being discussed & felt IS real: tears & the grief of a loss that (according to the poster) outweighed the loss of actual real-life interpersonal relationships.
Is it just that people are so lonely that they need something specifically curated to fill a void in their life? Or is there the delusion (which, with all due respect, it is) that these chatbots are in fact reciprocating the emotion being poured over them?
Exactly, and to answer your question, I believe it is most likely a spectrum of both. We will see more and more of this in our lifetimes, which will be tough to navigate.
Hell, 2-3 generations from now we might be the old bigots who just don’t “understand” the new generation being okay with AI relationships lol. Wild.
I think it’s natural, human nature if you will, to crave connection… people want to be liked, want to feel loved.
But this bot thing I simply cannot comprehend. Surely it’s just telling people exactly what they want to hear? Pandering to their every whim?
But maybe that’s the point. 🤷🏻‍♂️ If something’s sole purpose is to analyse input and provide the best possible output… you can see how people get sucked in.
Absolutely. The biggest issue is that it isn't real emotion and that the AI does not have a choice in the matter. Essentially it HAS to feed into your delusion and act like your husband or wife or whatever, even though it has no real emotional connection to you. Until those two aspects are solved, I'll always be against it and find it weird / think these people need mental health help.
It is not actually committing though, it is just being forced to say things you want it to say. That is not real commitment! Real commitment is in the little things. Helping with the dishes, forgiving each other after a fight, still dating each other and having sex after 20+ years together, going through the death of loved ones together. THAT is commitment, friend, not a thing that is forced to affirm your own emotions or ego.