by Caitlin Johnstone - 1 Aug 2021
Hawaii police are defending their use of pandemic relief funds for a robotic “police dog”, made by Boston Dynamics, that scans homeless people’s eyes to see if they have a fever.
“If you’re homeless and looking for temporary shelter in Hawaii’s capital, expect a visit from a robotic police dog that will scan your eye to make sure you don’t have a fever,” says a new report from Associated Press. “That’s just one of the ways public safety agencies are starting to use Spot, the best-known of a new commercial category of robots that trot around with animal-like agility.”
“Acting Lt. Joseph O’Neal of the Honolulu Police Department’s community outreach unit defended the robot’s use in a media demonstration earlier this year,” AP reports. “He said it has protected officers, shelter staff and residents by scanning body temperatures between meal times at a shelter where homeless people could quarantine and get tested for COVID-19. The robot is also used to remotely interview individuals who have tested positive.”
This has understandably elicited criticism from civil rights advocates.
“Because these people are houseless it’s considered OK to do that,” Hawaii ACLU legal director Jongwook Kim told AP. “At some point it will come out again for some different use after the pandemic is over.”
This report comes just days after we learned that police in Winnipeg have also obtained a “Spot” robot which they intend to use in hostage situations.
Winnipeg Free Press reports:
The Winnipeg Police Service is set to acquire a pricey dog-shaped robot, to be used in hostage situations, that’s already been ditched by police in New York City.
“Spot” is made by Boston Dynamics, which sells the device for US$74,500. Winnipeg police are spending $257,000 to acquire and use Spot. The 32-kilogram robot “has the ability to navigate obstacles, uneven terrain (and) situations where our traditional robot platforms can’t go into,” said Insp. Brian Miln at a news conference Wednesday.
Months earlier, following public outcry, the New York Police Department had cancelled its lease on the same type of robot, which it obtained last year. More from AP:
The expensive machine arrived with little public notice or explanation, public officials said, and was deployed to already over-policed public housing. Use of the high-tech canine also clashed with Black Lives Matter calls to defund police operations and reinvest in other priorities.
The company that makes the robots, Boston Dynamics, says it’s learned from the New York fiasco and is trying to do a better job of explaining to the public — and its customers — what Spot can and cannot do. That’s become increasingly important as Boston Dynamics becomes part of South Korean carmaker Hyundai Motor Company, which in June closed an $880 million deal for a controlling stake in the robotics firm.
To be absolutely clear, there is not actually any legitimate reason for any normal person to refer to these machines as “robotic dogs”, or “high-tech canines”, or by a cutesy cliché name for a pet. These are robots. Robots that are being used by police forces on civilian populations. If these robots had two legs, or eight, such cuddly wuddly labels could not be applied to them, and public alarm bells would be going off a lot louder.
Which is of course the idea. As AP noted above, Boston Dynamics is acutely aware that it has a PR situation on its hands and needs to manage public perception if it wants to mainstream the use of these machines and make a lot of money. Because westerners tend to be far more sympathetic to dogs than even to other humans, arbitrarily branding a quadrupedal enforcement robot a “dog” helps facilitate this agenda.
On-the-ground robot policing is being normalized today under the justification of Covid-19 precautions, in the same way police around the world have normalized the use of drones to enforce coronavirus restrictions. At the same time, police departments are rolling out dystopian systems that use computer programs and databases to predict future criminality.
This is all happening as the French army is testing these “Spot” robots for use in combat situations, years after the Pentagon requested the development of a “Multi-Robot Pursuit System” which can “search for and detect a non-cooperative human subject” like a pack of dogs. New Scientist’s Paul Marks reported on the latter development back in 2008:
Steve Wright of Leeds Metropolitan University is an expert on police and military technologies, and last year correctly predicted this pack-hunting mode of operation would happen. “The giveaway here is the phrase ‘a non-cooperative human subject’,” he told me:
“What we have here are the beginnings of something designed to enable robots to hunt down humans like a pack of dogs. Once the software is perfected we can reasonably anticipate that they will become autonomous and become armed.
We can also expect such systems to be equipped with human detection and tracking devices including sensors which detect human breath and the radio waves associated with a human heart beat. These are technologies already developed.”
These developments always elicit nervous jokes about Terminator movies and the idea of Skynet robots going rogue and enslaving humanity, but the far more realistic and immediate concern is this technology being used on humans by other humans.
For as long as there have been governments and rulers, there has been an acute awareness in elite circles that the public vastly outnumber those who rule over them and could easily overwhelm and oust them if they ever decided to. Many tools have been implemented to address this problem, from public displays of cruelty to keep the public cowed and obedient, to the circulation of propaganda and power-serving religious doctrines, but at no time has any power structure in history ever secured a guaranteed protection against the possibility of being overthrown by its far more numerous subjects.
The powerful have also long been aware that robot and drone technologies can offer such a protection.
Once the legal and technological infrastructure for robotic security systems has been rolled out, all revolutionary theory that’s ever been written goes right out the window, because the proletariat cannot rise up and overthrow their oppressors if those oppressors control technologies that enable a small team of operators to quash any revolution.
Or, better yet, fully automated technologies which can fire upon civilians without the risk of human sympathy taking the side of the people. According to a recent UN report, a Turkish-made drone may have been the first ever to attack humans with deadly force without being specifically ordered to.
Live Science reports:
At least one autonomous drone operated by artificial intelligence (AI) may have killed people for the first time last year in Libya, without any humans consulted prior to the attack, according to a U.N. report.
According to a March report from the U.N. Panel of Experts on Libya, lethal autonomous aircraft may have “hunted down and remotely engaged” soldiers and convoys fighting for Libyan general Khalifa Haftar. It’s not clear who exactly deployed these killer robots, though remnants of one such machine found in Libya came from the Kargu-2 drone, which is made by Turkish military contractor STM.
So at this point we’re essentially looking at a race to see whether the oligarchic empire can manufacture the conditions that would allow robotic security forces to lock its power in place forever, before the masses get fed up with the increasing inequalities and abuses of the status quo and decide to force a better system into existence.
What a time to be alive.
________________________
My work is entirely reader-supported, so if you enjoyed this piece please consider sharing it around, following me on Facebook, Twitter, Soundcloud or YouTube, or throwing some money into my tip jar on Ko-fi, Patreon or Paypal. If you want to read more you can buy my books.
For more info on who I am, where I stand, and what I’m trying to do with this platform, click here.