Abusing AI girlfriends
I don’t often share this kind of thing because I find it distressing. We shouldn’t be surprised, though, that the kind of people who physically, sexually, and emotionally abuse other human beings do so in virtual worlds, too.
In general, chatbot abuse is disconcerting, both for the people who experience distress from it and the people who carry it out. It’s also an increasingly pertinent ethical dilemma as relationships between humans and bots become more widespread — after all, most people have used a virtual assistant at least once. On the one hand, users who flex their darkest impulses on chatbots could have those worst behaviors reinforced, building unhealthy habits for relationships with actual humans. On the other hand, being able to talk to or take one’s anger out on an unfeeling digital entity could be cathartic.

Source: Men Are Creating AI Girlfriends and Then Verbally Abusing Them | Futurism
But it’s worth noting that chatbot abuse often has a gendered component. Although not exclusively, it is often men creating a digital girlfriend, only to punish her with words and simulated aggression. These users’ violence, even when carried out on a cluster of code, reflects the reality of domestic violence against women.