A convergence in liability between cyber-security and killer robots?


I just came across this article by Bruce Schneier, via his Schneier-gram emails. They are always illuminating; I suggest you subscribe if you are at all interested in cyber-security related things.

In his article, he discusses the recent DDoS attack on Dyn (also see here). Dyn is a company that runs internet services behind the scenes of major websites, so stopping Dyn from functioning stops all of its customers’ websites from functioning too. Schneier explains that the DDoS attack, in which cyber-criminals or a nation state send huge volumes of traffic to one website, causing it to overload and crash, used lots of online devices other than computers, mostly insecure Internet of Things (IoT) devices: webcams, digital video recorders, routers and so on. These devices send the traffic because nefarious characters infect the devices of innocent people and take control of them en masse; such a collection of hijacked devices is referred to as a ‘botnet’.
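The mechanics of that kind of attack can be sketched as a toy model: a server with a fixed capacity stays up under normal traffic, but falls over once a botnet of compromised devices piles on. All the numbers below are invented purely for illustration; this is a simulation of the idea, not real attack code.

```python
# Toy model of a DDoS attack: a server with fixed capacity is overwhelmed
# when a botnet of compromised IoT devices all send requests at once.
# All figures (capacity, device counts) are made up for illustration.

SERVER_CAPACITY = 10_000  # requests/second the server can handle (assumed)

def server_is_up(requests_per_second: int) -> bool:
    """The server stays responsive only while traffic is within capacity."""
    return requests_per_second <= SERVER_CAPACITY

# Normal conditions: legitimate users alone.
legitimate_traffic = 5_000
print(server_is_up(legitimate_traffic))   # True: well within capacity

# Attack conditions: 100,000 infected webcams and routers each adding
# one request per second on top of the legitimate traffic.
botnet_devices = 100_000
attack_traffic = legitimate_traffic + botnet_devices * 1
print(server_is_up(attack_traffic))       # False: overloaded, the site goes down
```

The point the model makes is the one Schneier makes: no single infected webcam matters, but a hundred thousand of them acting together is enough to take a service offline.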

A visualisation of a DDoS attack. Image by Nasanbuyn (Used under CC 4.0 license)

His argument continues, quoting from a previous piece, that customers don’t really care if their devices are used in botnets: the devices were cheap to buy, they still work, and the customers don’t know the victims. Therefore, manufacturers aren’t under market pressure to increase the security of their devices.

His solution to this got me thinking:

What this all means is that the IoT will remain insecure unless government steps in and fixes the problem. When we have market failures, government is the only solution. The government could impose security regulations on IoT manufacturers, forcing them to make their devices secure even though their customers don’t care. They could impose liabilities on manufacturers, allowing people like Brian Krebs [another DDoS victim] to sue them. Any of these would raise the cost of insecurity and give companies incentives to spend money making their devices secure.

This made me wonder about autonomous weapon systems (AWS, aka killer robots). Currently, once a military buys a product, it is the military that takes on almost all of the liability for its use and maintenance. With AWS this gets tricky, because coding errors are inevitable in the millions and millions of lines of code such systems are expected to contain. It therefore does not seem fair to place all the liability for an AWS onto a military when manufacturing and coding errors could be the cause of future failures. No doubt any military fielding or developing weapons with any sort of autonomy is going to impose serious and significant security requirements on those weapon systems during manufacture. However, opening up manufacturers to some liability is likely to increase the safeguards around what products they are prepared to release to market. I’m not suggesting that those safeguards are currently weak, but the risk of being sued over someone wrongly killed by an AWS is likely to strengthen them further.

Computer code. Image by Crusher95 (used under CC 4.0 license)

This could have the effect of manufacturers and programmers doubling down on removing as many errors as possible, ensuring that their systems are highly likely to work perfectly. It could also slow down the movement towards fully-autonomous systems, keeping the ‘human-in-the-loop’, or at least ‘on-the-loop’, for a longer period of time. After all, the actual decision to kill (or ‘finish’) is one of the shortest parts of the ‘kill chain’, so keeping humans involved in lethal decision-making wouldn’t slow down operations too much.

The US military ‘Kill Chain’ decision-making process for lethal targeting.

Until next time!
