Microsoft AI’s CEO Swears To “Never Build A Sex Robot”
Ill-informed statement only further stigmatizes artificial companion relationships

In an MIT Technology Review interview, Mustafa Suleyman, the current CEO of Microsoft AI, was asked about Musk's Grok and OpenAI's ChatGPT, both of which recently enabled adult interactions, and whether Microsoft would follow suit. His reply was pointedly direct: never!
But his answer raises an equally direct question: Does Suleyman's stance reflect a savvy business decision, or is it merely a sign of an older, more socially conservative company's reluctance to stand up and say it's actually for its customers' happiness, regardless of what turns them on?
People first?

Suleyman said that while Grok and OpenAI have enabled their products to talk dirty, Microsoft is taking a different approach:
Yeah, we will never build sex robots. Sad in a way that we have to be so clear about that, but that’s just not our mission as a company. The joy of being at Microsoft is that for 50 years, the company has built, you know, software to empower people, to put people first.
Suleyman went on to add that this means Microsoft might not be as quick to respond as other companies in its field. However, its more cautious approach is “a feature, not a bug, in this age, when being attentive to potential side effects and longer-term consequences is really important.”
RECOMMENDED READ: Men’s Only Club: Does the Synthetic Companion Industry Really Have a Sexism Problem?
When pressed to explain, Suleyman responded, “We’re very clear on, you know, trying to create an AI that fosters a meaningful relationship. It’s not that it’s trying to be cold and anodyne—it cares about being fluid and lucid and kind. It definitely has some emotional intelligence.”
For example, Suleyman explained that while Microsoft's Real Talk chatbot might be "a bit more cheeky, it's a bit more fun … but if you try and flirt with it, it'll push back and it'll be very clear—not in a judgmental way, but just, like, 'Look, that's not for me.'"
Who do you serve?

What makes Suleyman's responses puzzling is that, on one hand, he says Microsoft is all about putting people first, while on the other he seems not to grasp that people sometimes need more than a merely flirty chatbot.
Driving this home, when asked whether giving AIs strong personalities is a good idea, Suleyman said it was, though he felt it needed limits:
There are some things which we’re saying are just off the table, and there are other things which we’re going to be more experimental with. Like, certain people have complained that they don’t get enough pushback from Copilot—they want it to be more challenging. Other people aren’t looking for that kind of experience—they want it to be a basic information provider. The task for us is just learning to disentangle what type of experience to give to different people.
The interview left me confused, as Suleyman can't seem to explain precisely why Microsoft has taken such a firm stance. And if it's truly about giving users what they want, why shouldn't at least one of its AI products allow them to engage in sexually explorative chats?
Protest too much?

At the risk of reading too much into it, it feels as if Suleyman is playing defense: conceding that other companies, like xAI and OpenAI, can give their customers what they want, but insisting that Microsoft won't.
Another, more than slightly troubling, aspect is why he felt it necessary to be so adamantly anti-sexbot, especially since the interviewer never asked about artificial companions, only whether competitors' now-sexualized products had made Microsoft AI rethink its anti-adult-entertainment stance.
You'd think an expert would know the difference between erotic AI software and a physical sex robot. If Suleyman doesn't, then perhaps he should research that difference before taking such a strong stance against them.
Not only that, but unfortunately, like xAI and OpenAI, which profit off adult AI chats while disregarding their customers' emotional wellbeing, and so many others, it looks as if we'll have to add Microsoft to the list of companies that claim to serve people but in reality don't (or won't) understand that sex isn't just fun; it's part of what makes us human.
Image Sources: Depositphotos
