5 Comments

The real problem with human-like robots is not potential “injustice” to THEM, but a frightening slippery slope: what ill treatment of synthetic, non-suffering humanoids might reveal about US, and actually do to US. Think, for example, of the sexual behavior that might prevail among some humans towards robots created to look like children. The moral problem is not the treatment of those robots; it’s the horrible consequences of eliminating the benefits of self-regulation where actual children are concerned. We need to wrestle with the idea of regulating human behavior on a secondary legal level: not outlawing deviant behavior towards convincing humanoid machines because it’s bad for the machines, but outlawing it because we can all agree it’s very likely to open a Pandora’s box of horrors through increased permissiveness and “normalization” of behaviors previously considered heinous.

But good luck with passing laws like that (“Hey, it’s not a REAL kid!”), and if we can’t pass laws like that, what will our future be like? Pretty damn dark.

This is a really great observation - and I completely agree with your analysis.

It reminds me of what Kant said about animals: that we should treat them well not because of *their* interests, but because we don't want to be cruel people. I think he was wrong to apply this idea to animals (who deserve to be treated well for their own sake), but it applies perfectly to robots.

So I agree that we should have laws against the 'abuse' of robots, and the issue you mention is a perfect case in point. I'm more optimistic than you about the likelihood of getting laws like that passed, though - for example, I think the US banned the creation and importation of child sex dolls a few years ago. So there's hope yet.

"Should Robots Have Rights?"

The very notion is ridiculous.

They're machines.

Rights of Pavlov’s Dogs as a Standard for Rights of Persons and Machines

It would be harder for a machine to become a person than for a person to become a machine. So in some countries, and the number of those countries is growing, the easier strategy is being pursued: turning persons into machines, for example through a social credit system that conditions humans like Pavlov’s dogs.

Thanks for your comment, Witold. Certainly, the mechanisation of human life through big tech is a worry - although we shouldn't be misled by that metaphor into thinking that it's substantively changing the human life form (yet!).
