Editorial Viewpoint: The LAWS of Robotics
Our September issue explores how the Internet of Things concept – a high profile topic in automation circles – has crept its way into military hardware.
Typically, issues of Design Engineering are developed around a pair of broad themes that overlap in some way (e.g. automation and the automotive industry). Our September issue, however, which focuses on motion control and the Canadian defence industry, always poses a problem. Normally, these topics have little in common, and we struggle to find a story that blends the two elegantly.
This year, however, has been refreshingly, if somewhat disturbingly, different.
As our feature story on the Canadian military’s SIPES project illustrates, the Internet of Things concept – a high profile topic in automation circles – has also crept its way into military hardware.
Packed with all the electronics found in a high-end smartphone (camera, potentiometer, gyroscope, GPS, data transmission, etc.), plus some other stuff they wouldn’t talk about, the SIPES prototype explores some potentially transformative ideas. Imagine an infantry unit armed with weapons that can tell HQ not only where each grunt is located, but where he’s headed, how fast he’s moving, whether he’s firing and in what direction.
In addition, such a weapon could also receive data from command. Combining satellite imagery, drone flybys, thermal photography and plain old status reports (plus other stuff they won’t talk about), operations leaders could feed troops information that discerns combatants from bystanders and pinpoints enemy targets through an aiming sight/heads-up display. Taken one step further, it’s not hard to envision the weapon using that data, combined with its own sensor input and muzzle velocity, to calculate a firing solution. Pull the trigger and the gun doesn’t shoot until it’s aimed exactly right to hit the target.
Development of these next-generation weapons is potentially beneficial and probably overdue, given that the current U.S. and Canadian standard issue rifles, the Colt M16 and C7 respectively, were both introduced decades ago. But just how smart these or any higher-order weapon systems should be allowed to get is a question of increasing urgency as artificial intelligence (AI) draws eerily closer to passing the Turing test.
In 2014, Kitchener’s own Clearpath Robotics — makers of autonomous all-terrain and waterborne robotic vehicles — became the first robotics firm to sign on with the Campaign to Stop Killer Robots, an international coalition of NGOs working to ban fully autonomous weapons. At the time, it might have seemed premature and alarmist to warn against so-called Lethal Autonomous Weapon Systems (LAWS). Now, two short years later, it seems prescient.
Take, for example, U.S.-based AI development company Psibernetix Inc., which announced in June 2016 that it had created an artificial fighter pilot that consistently outperformed seasoned “Top Gun” flyboys in high-fidelity flight combat simulations.
If a bit of fuzzy logic code can best human fighter pilots, how hard would it be to adapt that software to less demanding situations, like a tank, an aerial drone or any other military weapon?
In some ways, it’s comforting to imagine robotic combatants battling toward a bloodless victory. And barring an amendment to the Geneva Convention, the development of such autonomous weapons may be inevitable. Still, as a few early adopters of the Tesla Model S autonomous driving mode found out the hard way this year, technology doesn’t absolve us from responsibility or culpability.
Ultimately, a human, the man in the loop, must remain, if for no other reason than to bear the burden of war.
– Mike McLeod, Editor
I enjoy hearing from you so please contact me at MMcLeod@design-engineering.com and your letter could be published in an upcoming issue.