F-35 First Strike


Last week the new F-35 jet struck its first aerial target, a drone (also known as a UAV or RPA). What made this interesting was how heavily computer systems were involved in the strike.

The F-35 identified and targeted the UAV with its mission systems sensors, passed the targeting information to the missile, and enabled the pilot to verify the targeting data on the helmet-mounted display (HMD) before launching the missile that destroyed the UAV.

Target verification by a human, rather than actual target selection, was really to be expected. Flying a jet is difficult enough, and flying one as complex as the F-35, which runs on some 8 million lines of code, while also being expected to find, select and engage targets is harder still.

F-35 cockpit mock-up, 2010. Photo by Ahunt, public domain.

Computerised target selection has been around for a while, at least since the Iran-Iraq War, when the USS Vincennes, on a monitoring mission, mistakenly identified an Iranian passenger plane as an Iranian Air Force jet and blew it out of the sky. The real issue was not the misidentification itself, but that the individual in charge of the weapon system did not consider the information in front of him. Had he done so, he could have recognised the misidentification and stopped targeting the airliner. Schmitt and Thurner argue this shows that a human controller is not the panacea that anti-autonomous weapon system campaigners would have you believe (pp. 248-249).

USS Vincennes, 2005. Photo: U.S. Navy, Photographer’s Mate 1st Class Robert C. Foster Jr

So, humans can't do these complex tasks alone, human-machine teams aren't a panacea, and fully autonomous technologies aren't yet capable of doing these complex tasks either. Very messy. But I suppose that with proper training and intuitive, symbiotic design, where systems are created to be as easy for humans to use as possible, the machine-human team offers the best of both worlds, at least until we can see what fully autonomous systems are really capable of.

F-35 at night, 2015. Photo by Skeeze/Pixabay, Public Domain.

Legally, the computerised aspect of target selection doesn't really add anything: the pilot remains in control of weapons release, and so all responsibility for ensuring compliance with the Law of Armed Conflict lies with the human in the machine-human team, in this case the pilot.

Until next time!
