New York City Councilor Ben Kallos says he “watched in horror” last month as city police responded to a hostage situation in the Bronx using Digidog, a remotely operated robot from Boston Dynamics equipped with surveillance cameras. Photos of the Digidog went viral on Twitter, in part because of its uncanny resemblance to the world-ending machines in the Netflix sci-fi series Black Mirror.
Now Kallos is proposing what could be the country’s first law banning police from owning or operating armed robots.
“I don’t think anyone expected them to actually be used by the NYPD right now,” Kallos says. “I have no problem using a robot to defuse a bomb, but it has to be the right use of a tool and the right kind of circumstance.”
Kallos’ bill would not prohibit unarmed utility robots like the Digidog, only robots armed with weapons. But robotics experts and ethicists say it taps into concerns about the increasing militarization of police: their growing access to sophisticated robots through private sellers and a controversial military equipment pipeline. Police in Massachusetts and Hawaii have also tested the Digidog.
“Non-lethal robots can be quite lethal,” says Patrick Lin, director of the Ethics and Emerging Sciences Group at California Polytechnic State University, San Luis Obispo. Lin, who briefed CIA employees on autonomous weapons during the Obama administration, supports a ban on armed robots and sees their growing availability as a serious concern.
“Robots can save police lives, and that’s a good thing,” he says. “But we also have to be careful that they don’t make police violence easier.”
In the Bronx incident last month, police used the Digidog to scope out the house where two men were holding two hostages, checking hiding spots and cramped nooks. Police eventually apprehended the suspects, but privacy advocates raised concerns about the robot’s technical capabilities and the policies governing its use.
The ACLU questioned why the Digidog was not listed in the police department’s disclosure of surveillance devices required under a city law passed last year; the robot was mentioned only in passing, in a section on “situational awareness cameras.” The ACLU called the disclosure “highly inadequate,” criticizing the department’s “poor data protection and training” policies regarding the Digidog.
In a statement, the NYPD said it has used robots since the 1970s to save lives in hostage situations and hazardous incidents. This model of the robot “is being tested to evaluate its capabilities against other models used by our Emergency Service Unit and bomb squad.” Boston Dynamics did not respond to a request for comment.
Local reaction to the use of the Digidog was mixed, says Councilor Kevin Riley, who represents the Bronx neighborhood where the incident took place. Some residents opposed the police’s use of the robot, while others wanted more police presence. A third group thought the robots could help prevent police misconduct by putting distance between officers and suspects.
Riley says he continues to talk with residents who want to feel safe in the area. “It is our job as elected officials to educate residents and make sure they have a seat at the table,” he told WIRED.
The divergent concerns echo those raised in Dallas in 2016, when, during a standoff with a sniper who had shot dead five police officers, local law enforcement used a robot to remotely deliver and detonate an explosive device, killing him.
The incident raised questions about how police obtain robots. Dallas police had at least three bomb-disposal robots in 2016. Two were obtained from the defense contractor Northrop Grumman, according to Reuters. The third came through the federal government’s 1033 program, which allows the transfer of surplus military equipment to local police departments. Since 1997, more than 8,000 police departments have received more than $7 billion in equipment through the program.