
Looking for Thanksgiving Day table talk that isn't politics or professional sports? Okay, let's talk killer robots. It's a concept that jumped from the pages of science fiction to reality long ago, depending on how loose a definition you use for "robot." Military drones abandoned Asimov's first law of robotics decades ago: "A robot may not injure a human being or, through inaction, allow a human being to come to harm."
The topic has been simmering again lately due to the rising prospect of killer robots in domestic law enforcement. One of the best-known robot makers of the era, Boston Dynamics, raised some red flags in 2019 when it showed footage on our stage of its Spot robot deployed as part of Massachusetts State Police training exercises.
The robots were not armed; rather, they were part of an exercise designed to determine how they could help keep officers out of harm's way during a hostage or terrorist situation. But the prospect of deploying robots in scenarios where people's lives are in immediate danger was enough to trigger an investigation from the ACLU, which told TBEN:
We urgently need more transparency from government agencies to be upfront with the public about their plans to test and deploy new technologies. We also need nationwide regulations to protect civil liberties, civil rights and racial justice in the age of artificial intelligence.
Meanwhile, the NYPD broke off a deal with Boston Dynamics last year following strong public backlash, after footage surfaced of Spot being deployed in response to a home invasion in the Bronx.
For its part, Boston Dynamics has been outspoken in its opposition to the weaponization of its robots. Last month, it, along with fellow industry leaders Agility, ANYbotics, Clearpath Robotics and Open Robotics, signed an open letter condemning the practice. It notes:
We believe that adding weapons to robots that are remotely or autonomously controlled, widely available to the public, and capable of navigating previously inaccessible locations where people live and work raises new risks of harm and serious ethical issues. Weaponized applications of these new robots will also damage public confidence in the technology in ways that harm the tremendous benefits they will bring to society.
The letter was believed to be, in part, a response to Ghost Robotics' work with the US military. When footage surfaced on Twitter of one of its robot dogs fitted with a rifle, the Philadelphia company told TBEN it took an agnostic stance on how its military partners use the systems:
We don’t make the payloads. Are we going to promote and advertise these weapon systems? Probably not. That is difficult to answer. Since we sell to the military, we don’t know what they do with them. We are not going to dictate to our government customers how to use the robots.
We do draw the line where they are sold. We only sell to US and allied governments. We don’t even sell our robots to corporate customers in hostile markets. We get a lot of questions about our robots in Russia and China. We don’t ship there, even for our corporate customers.
Boston Dynamics and Ghost Robotics are currently embroiled in a multi-patent lawsuit.
This week, local news site Mission Local surfaced renewed concerns about killer robots — this time in San Francisco. The site notes that a policy proposal under review next week by the city’s Board of Supervisors contains language about killer robots. The “Law Enforcement Equipment Policy” begins with an inventory of robots currently owned by the San Francisco Police Department.
There are 17 in total, 12 of which are functioning. They are largely designed for bomb detection and disposal — that is, none are specifically designed to kill.
“The robots listed in this section should not be used outside of training and simulations, criminal arrests, critical incidents, exigent circumstances, execution of warrant or during assessments of suspicious devices,” the policy notes. It then adds, more disturbingly, “Robots will only be used as a lethal force option when the risk of loss of life to members of the public or officers is imminent and outweighs any other force option available to SFPD.”
In effect, according to that language, the robots can be used to kill in order to potentially save the lives of officers or the public. In that context, it might seem innocent enough. At the very least, it appears to fall within the legal definition of "justifiable" lethal force. But new concerns are emerging over what appears to be a major policy shift.
For starters, using a bomb disposal robot to kill a suspect is not without precedent. In July 2016, Dallas police officers did just that, in what is believed to be a first in US history. “We saw no option but to use our bomb robot and put a device on its extension so it could detonate where the suspect was,” police chief David Brown said at the time.
Second, it’s easy to see how such a precedent could be used in a CYA scenario if a robot is intentionally or accidentally used this way. Third, and perhaps most disturbing, one might imagine the language eventually governing the acquisition of a future robotic system that is not designed purely for explosive detection and disposal.
Mission Local adds that Aaron Peskin, chair of SF’s Board of Supervisors Rules Committee, tried to insert a more Asimov-friendly rule: “Robots must not be used as a use of force against any person.” The SFPD apparently struck out Peskin’s change, replacing it with the current language.
The renewed conversation about killer robots in California stems in part from Assembly Bill 481. The bill, signed into law by Governor Gavin Newsom last September, aims to make policing more transparent. That includes requiring an inventory of military equipment used by law enforcement.
The 17 robots included in the San Francisco document are part of a longer list that also includes the Lenco BearCat armored vehicle, flash bangs, and 15 submachine guns.
Last month, the Oakland Police Department said it would not seek approval for armed remote-controlled robots. The department said in a statement:
The Oakland Police Department (OPD) is not adding armed remote vehicles to the department. OPD did participate in ad hoc committee discussions with the Oakland Police Commission and community members to explore all possible uses for the vehicle. However, after further discussions with the Chief and the Executive Team, the department decided it no longer wanted to explore that particular option.
The statement followed public backlash.
The toothpaste is already out of the tube for Asimov’s first law. The killer robots are here. As for the second law — “A robot must obey the orders given it by human beings” — that one is still largely within our grasp. It is up to society to determine how its robots behave.