In line with my last blog post, I carried on thinking about different perspectives on AI. One of my favourite subjects when it comes to perspectives on AI is The Terminator.
When we think of a terminator, we think of an unstoppable killing machine. This idea, however, holds true for only part of the film franchise. In the first film, where Schwarzenegger’s character has the sole role of tracking down and killing Sarah Connor, it is a killing machine. One line from the film really illustrates this point:
“Listen, and understand! That Terminator is out there! It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop… ever, until you are dead!”
But what nobody seems to remember is that in the second film, the character Arnold plays isn’t a machine hell-bent on killing, but a protector. Having been re-programmed, the T-800 model Terminator is sent to protect John Connor. This quote is quite illustrative:
“The Terminator would never stop. It would never leave him, and it would never hurt him, never shout at him, or get drunk and hit him, or say it was too busy to spend time with him. It would always be there. And it would die to protect him.”
The reason I think this is really important is that in the public imagination, in the West at least, the idea of The Terminator as an unstoppable killing machine is an archetypal image of what a militarised robot would be. Remember, too, that we always view ourselves as the victims of these systems. I think this exposes two issues.
Firstly, we only think of militarised robots as being for destruction, not for protection. I think this prevents the public from realising that it is programmers and users who decide the purpose, use, and direction of robotics. We project all of our fears onto the robots themselves, when, in actuality, we should be thinking about the programmers and users of these machines. At least Arnold remembers…
Secondly, the idea that we in the West are going to be the victims of killing machines reveals an underlying colonial view. We are ignoring the fact that a number of Western nations are developing these weapons, and that these countries tend to engage in military action in the developing world. So, rather than being afraid for ourselves, we should be afraid for others. Or, we could ensure that these machines are used lawfully and ethically.