As a rule, any technology that can be used for sex will be used for sex, even if it shouldn’t be. In accord with this rule, researchers and engineers have been improving sexbot technology. By science-fiction standards, current sexbots are crude and are probably best described as sex dolls rather than sexbots. But it is wise to keep ethics ahead of the technology, and a utilitarian approach to this matter is appealing.
On the face of it, sexbots could be seen as nothing new: they are simply an upgrade of the sex dolls that have been around for quite some time. Sexbots are, of course, more sophisticated than the infamous blow-up sex dolls, but the idea is the same: the sexbot is an object that a person has sex with.
That said, one thing that makes sexbots morally interesting is the fact that they are often designed to mimic humans not just in physical form (which is what sex dolls do) but also in mind. For example, the 2010 Roxxxy sexbot’s main feature is its personality (or, more accurately, personalities). As a fictional example, the sexbots in Almost Human do not merely provide sex; they also provide human-like companionship. While fully person-like sexbots are still science fiction, human-mimicking sexbots can nonetheless be seen as something potentially new under the ethical sun.
An obvious moral concern is that human-mimicking sexbots could have negative consequences for humans, be they men or women. Not surprisingly, many of these concerns are analogous to existing moral concerns about pornography.
Pornography, so the stock arguments go, can have strong negative consequences. One is that it teaches men to see women as mere sexual objects. This can, it is claimed, influence men to treat women poorly and can affect how women see themselves. Another point of concern is the addictive nature of pornography: people can become obsessed with it to their detriment.
Human-mimicking sexbots would seem to have the potential to be more harmful than pornography. After all, while watching pornography allows a person to see other people treated as mere sexual objects, a sexbot would allow a person to use a human-mimicking object sexually. This might have a stronger conditioning effect on the person using the object, perhaps habituating them to see people as mere sexual objects and increasing the chances they will mistreat people. If so, selling or using a sexbot would be morally wrong.
People might become obsessed with their sexbots, as some do with pornography. Then again, people might simply “conduct their business” with their sexbots and get on with life. If so, sexbots might be an improvement over pornography. After all, while a guy could spend hours watching pornography, he would presumably not last very long with his sexbot.
Another concern raised about some types of pornography is that they encourage harmful sexual views and behavior. For example, violent pornography is believed to influence people to become more inclined to violence. As another example, child pornography is supposed to have an especially pernicious influence. Naturally, there is the concern about causation here: do people seek such porn because they are already that sort of person or does the porn influence them to become that sort of person? I will not endeavor to answer this here.
Since sexbots are objects, a person can do whatever they wish to their sexbot: hit it, burn it, “torture” it, and so on. Presumably there will also be specialty markets catering to unusual interests, such as those of pedophiles and necrophiliacs. If pornography that caters to these “tastes” can be harmful, then presumably being actively involved in such activities with a human-mimicking sexbot would be even more harmful. The person might be, in effect, practicing for the real thing. So, it would seem that selling or using sexbots, especially those designed for harmful “interests,” would be immoral.
Not surprisingly, these arguments are also analogous to those used against violent video games. Violent video games are supposed to influence people so that they become more likely to engage in violence. So, just as some have proposed restrictions on virtual violence, perhaps there should be strict restrictions on sexbots.
When it comes to video games, one plausible counter is that while violent video games might have a negative impact on some people, they allow most people to harmlessly enjoy virtual violence. This seems analogous to sports and non-video games: they allow people to engage in conflict and competition in safer and less destructive ways. For example, a person can indulge her love of conflict and conquest by playing Risk or Starcraft II after she works out her desire for violence by sparring a few rounds in the ring.
Turning back to sexbots, while they might influence some people badly, they might also provide a means by which people could indulge desires that would be wrong, harmful, and destructive to indulge with another person. So, for example, a person who likes to engage in sexual torture could satisfy her desires on a human-mimicking sexbot rather than an actual human. The critical issue here is whether indulging in such virtual vice with a sexbot would harmlessly dissipate these desires or would instead fuel them, making a person more likely to inflict them on real people. If sexbots allowed people who would otherwise harm others to vent their “needs” harmlessly on machines, that would seem good for society. However, if using sexbots would simply push such people towards doing these things for real, with unwilling victims, that would be bad. This, then, is a key part of addressing the ethical concerns about sexbots and something that should be duly considered before mass production begins.
