
Source: The Conversation (AU and NZ) – By Sam Baron, Associate Professor, Australian Catholic University


US-based military robot manufacturer Ghost Robotics has strapped a sniper rifle to a robotic dog, in the latest step towards autonomous weaponry.

Some people have reacted with moral outrage at the prospect of making a killer robot in the image of our loyal best friend. But if this development makes us pause for thought, in a way that existing robot weapons don’t, then perhaps it serves a useful purpose after all.

The response to Ghost Robotics’ latest creation is reminiscent of an incident involving Boston Dynamics, another maker of doglike robots (which, in contrast, strongly frowns on the idea of weaponising them).

In 2015, Boston Dynamics caused controversy after posting footage online of technicians kicking a doglike robot to demonstrate its stability. Many viewers sympathised with the robot, and claimed kicking it was morally wrong.

Ghost Robotics Chief Executive Jiren Parikh has offered an evolutionary explanation for the moral qualms about sniper robo-dogs, claiming they evoke our “evolutionary memories of predators” because they have legs.

But I’m not convinced. I think people are reacting strongly because the robots look like dogs.

A killer best friend?

Dogs are intelligent. When we see a weapon shaped like a dog, it becomes very easy to view that weapon in the same terms. We instinctively see it as a gun that makes its own decisions, because that’s what dogs do.

For many people, that is the true horror of automated weaponry: a deadly robot that makes its own choices about whom it kills. There’s no guarantee the choices the robot makes will be constrained by human morality.

In truth, the robots produced by Ghost Robotics are no more autonomous than existing weapons systems. Like most drones, these gun-toting robo-dogs are fully piloted by a remote operator.

But there’s a crucial difference in our perception: the doglike robots seem more like fully autonomous killing machines, even though they’re not. They just appear that way because we take the intelligence we associate with dogs, and project it onto a bunch of bolts, wires and guns that happens to look like them.

Read more: Abusing a robot won’t hurt it, but it could make you a crueller person

A moral opportunity

If robotic dogs equipped with sniper rifles fill us with unease, should we stop making them? Well, ideally yes. But if, as seems inevitable, humans are going to continue building robots with guns, maybe we should make them all look like dogs. Or, better yet, domestic pets more generally.

Pets are our family. We know our pets are largely free to act as they please, but the familial bonds we develop with them encourage us to feel accountable for their actions.

That’s because, although we see pets as having a capacity for choice, we also see them as morally inert. When a dog steals the last pork chop from the table, we don’t think the dog is morally culpable. Granted, we might say “bad dog”, but that’s just a way to curb its behaviour. A sharp tone and gruff voice discourage future chop theft, but we aren’t blaming the dog for doing what comes naturally.

Rather, it is we who must take moral responsibility for our pets. That means caring for them, avoiding cruelty and, above all, assuming responsibility if they do something wrong. If my dog bites someone, I feel morally bad even if the dog doesn’t.

If robots with guns look and act like pets, and if we come to see them as such, then some of the moral responsibility we assume for pets might just rub off onto these robots. We may well be more willing to take moral responsibility for the actions of a doglike robot than a faceless military drone.

Read more: Lethal autonomous weapons and World War III: it’s not too late to stop the rise of ‘killer robots’

The danger of moral detachment

The real horror of autonomous weapons is not the possibility that they will turn against us in a Skynet-style robotic uprising. It is the horror of detachment. Autonomous weaponry makes it easy to distance ourselves from killing done in our name.

I’m not talking about physical distance (although that is certainly part of it), but rather about moral and emotional distance. Just as buying neatly packaged meat at a supermarket lets us feel detached from the violence done to animals in producing it, so we might feel morally distanced from violence against humans if perpetrated by a remote machine.

Read more: AI researchers should not retreat from battlefield robots, they should engage them head-on

We need to discourage moral detachment from autonomous weapons. And our existing moral attitude to our pets might offer a shortcut to this goal.

Let me be clear: I don’t support the development of autonomous weaponry or weaponised artificial intelligence. But if we are going to develop such technology, let it be in the shape of pets. Maybe then we won’t kill quite so many people.


Sam Baron receives funding from the Australian Research Council.

ref. Gun-toting robo-dogs look like a dystopian nightmare. That’s why they offer a powerful moral lesson – https://theconversation.com/gun-toting-robo-dogs-look-like-a-dystopian-nightmare-thats-why-they-offer-a-powerful-moral-lesson-170267
