Last week the new F-35 jet struck its first airborne target, a drone (UAV/RPA). The interesting thing about this was the amount of involvement that computer systems had in the strike.
The F-35 was able to identify and target the UAV with its mission systems sensors, pass the targeting information to the missile, and enable the pilot to verify the targeting data on the helmet-mounted display (HMD) before launching the missile that destroyed the UAV.
This human verification of the target, rather than actual target selection by the pilot, was really to be expected. Flying a jet is difficult enough; flying a jet as complex as the F-35, which runs on some 8 million lines of code, while also being expected to find, select and engage targets is very difficult indeed.
Computerised target selection has been around for a while, at least since the Iran-Iraq War, when the USS Vincennes, on a monitoring mission, mistakenly identified an Iranian passenger airliner as an Iranian Air Force jet and blew it out of the sky. The real issue was not the misidentification itself, but that the individual in charge of the weapon system did not consider the information in front of him. Had he done so, he could have recognised the misidentification and stopped targeting the airliner. Schmitt and Thurnher argue that this shows a human controller is not the panacea that campaigners against autonomous weapon systems would have you believe (pp. 248-249).
So, humans can’t do these complex tasks alone, human-machine teams aren’t a panacea, and fully-autonomous technologies aren’t yet capable of doing these complex tasks either. Very messy. But I suppose that with proper training and intuitive, symbiotic design, where systems are created to be as easy as possible for humans to use, the human-machine team offers the best of both worlds, at least until we can see what fully-autonomous systems are really capable of.
Legally, the computerised aspect of target selection doesn’t really add anything: the pilot remains in control of weapons release, and so all responsibility for ensuring compliance with the Law of Armed Conflict lies with the human in the human-machine team, in this case the pilot.
Until next time!