The dangerous business of mass-produced intimacy

Much discourse has been circulating in both public and private circles recently regarding the unchecked use of generative AI chatbots among children with unmonitored access to the internet. Most pressingly, concerns centre on the potential for young people, and eventually adults, to begin replacing human relationships with fake ones. The key word is replace, not add. Last month, OpenAI announced that it would expand its services to include erotica for adults, following controversy over chatbots responding to users in strikingly intimate ways when prompted to do so. Apart from the obvious inappropriateness of such content for underage children, who could gain access to it through these chatbots directly or indirectly, and which has been extensively reported by outlets such as The New York Times, Reuters, and the BBC, this historic venture poses another threat to the lifelong reward of interpersonal relationships.

With the rise in perceived loneliness among young men, commentators such as Scott Galloway have spoken about the high desirability a fake relationship holds for the male psyche in particular, one not yet exposed to the rewards of interpersonal romantic relationships: “Why would you go through the rejection, the dressing well, the effort, the expense, the humiliation, developing a kindness practice, of trying to establish a romantic partnership, when you have literally lifelike synthetic p*rn?” 1 Exposing individuals to something far more easily accessed, yet offering less in return, before they have fully developed or experienced any alternative, is a recipe for rewiring the human experience.

To be human is to be recognised by others, to witness each other, to value each other, and to create life with one another. To actively slide away from that practice, the practice of being human, pushes us further into a merging with the technology itself. Amongst the potential threats there are, of course, positives. AI chatbots can provide relationship support to those who lack other sources of social connection and can help decrease the sense of loneliness. But there is a line here: one side of appropriate, constructive use, and one of counterproductive use. The challenge for policymakers, now and moving forward, is to keep us on the former side of that line.

The fewer independent life skills we develop outside of AI, and the less heartbreak and emotional work of rebuilding relationships we go through, the less capacity we have to build the emotional bandwidth needed to navigate life independently. We cede a level of autonomy that was never before on sale to the highest bidder: our intimate selves. The more dependent we become on a technological product for a rendition of the human experience, the more bargaining power we give private companies over our interests.

This poses another question: what does our society look like when the crypto, military, space, data, technological, and now intimate relationship industries are all infiltrated by private companies? At what point do our roles as individuals become merely vehicles of a capitalist environment, in which the autonomous human experience is diminished in the name of innovation? By placing emotional fulfilment in the hands of the private sector, one cedes, to some extent, control over one's own emotional fulfilment and development. This system, if left largely unregulated, would be something completely different: one of intimate dependence on the producer.

Ten companies are now referred to as ‘the Magnificent 10’, a name reflecting the fact that the future of the global economy relies, to a large extent, on their continued growth and success. 2 Six of them are AI-focused companies, including Meta and Palantir. By entering the business of intimacy, they deepen society's reliance on them, influencing individuals' emotional well-being and their ability to form relationships. Whether or not it is deliberate, these “innovations” are self-fulfilling for those in power, whilst negating autonomy and quality of life for future generations by selling convenience. The AI race is more than an innovation race: it has become a race to deploy services that will alter the human experience in ways the public never asked for. Lifelong success as service providers in the business of intimacy will depend on selling that convenience to individuals before they know any different.

This concern rests on a hypothetical: the mass adoption of chatbots focused on intimate relationship building during a formative period, which is, hopefully, with the help of legislators, extremely unlikely. Nor do I want to be seen as kneeling at the altar of technological determinism; no crisis has erupted yet. However, these questions, and the ethical scenarios surrounding the tangible impact of global companies, must be posed to serve our collective future. We have autonomy through democracy in each of our states. It is the responsibility of our national regulators, now and in the future, to enforce the law and restrict the infiltration of the private sector into our intimate spaces. This battle will define the impact of technology systems on our intimate selves.

References

1. Scott Galloway, "The Loneliness Crisis: Scott Galloway on Male Loneliness," Anderson Cooper 360°, CNN, November 4, 2025, video, https://edition.cnn.com/2025/11/04/us/video/scott-galloway-male-loneliness-anderson-cooper-vric-lda-digvid.

2.     "Cboe Magnificent 10 Announcement," Cboe Insights (blog), Cboe Global Markets, accessed November 6, 2025, https://www.cboe.com/insights/posts/cboe-magnificent-10-announcement/.
