The Urgent Need to Review Security Technology

The phrase “move fast and break things” has been the guiding ethos of this era of technological development. Many of the innovations of the last two decades have been aimed specifically at disrupting industries and societal processes, with little public forethought or discussion of the negative consequences that might arise from their deployment.

Few areas of technological evolution have a greater potential to erode democratic norms and values than those in the security and criminal justice arenas. Security technologies — ranging from Tasers, to opaque algorithms used to help determine individual and collective risk, to a growing constellation of surveillance systems — are deployed without a proper understanding of their impact on the communities in which they are unleashed.

The current system under which such technologies are reviewed consists mainly of the procurement arms of the organizations seeking to purchase them — and then only to verify that they meet bid specifications. For example, city councils are rarely aware that they are approving facial recognition technologies, surveillance drones, and data-integration systems for their local law enforcement agencies. Similarly, state and local legislatures have been kept in the dark about technology acquisitions through non-disclosure agreements and programs such as the Department of Defense’s 1033 program, under which law enforcement agencies can acquire surplus military equipment without having to seek appropriated funds.

What is needed to create more transparency around the purchase and use of security and criminal justice technology? And how can these technologies be acquired in a way that is compatible with the right to privacy?

WHAT IS THE CURRENT SYSTEM?

At the national level, several agencies are involved in evaluating security and criminal justice technology. The National Institute of Justice (NIJ), housed within the Department of Justice, conducts technological reviews of a limited selection of law enforcement equipment. It also sets voluntary standards for the manufacture of bullet-resistant vests and other items. Its stated goal is “to ensure to the degree possible that equipment is safe, reliable and performs according to established minimum requirements.” The NIJ process essentially asks: does a particular piece of equipment work for law enforcement? It does not ponder whether it should have been brought to market in the first place.

The National Institute of Standards and Technology (NIST) plays a role in evaluating everything from defensive cybersecurity technology and related standards to the development of standardized procedures for forensic science. It co-chaired the National Commission on Forensic Science, which was founded to help implement the recommendations of a National Academy of Sciences report that called much of the underlying science of forensics into question. More than a decade later, few of the recommendations have been implemented, and the commission itself was abolished by the Trump administration.

The Federal Trade Commission (FTC) also plays a role, but it is largely limited to protecting consumer privacy after violations have occurred; the agency has failed to proactively ensure that the privacy of Americans is not compromised. Along with the FTC, the Consumer Product Safety Commission also touches this arena, having approved the TASER for use based on “theoretical” operation rather than human or animal studies.

The current system, therefore, consists of a variety of agencies interested mainly in how different security technologies aid their missions and stated goals. While technically these agencies should serve the American public, it has become glaringly clear that the rights of the American consumer are not the top priority, even though these technologies are supposed to make Americans feel “safer.” What is needed is a more comprehensive system of review that holistically examines security and criminal justice technologies before they are released into the public domain, where they can have lasting and potentially deadly outcomes. And that system involves a review board.

THE NEED FOR AN IRB

The concept of an Institutional Review Board (IRB) was born from some of the worst horrors of human experimentation undertaken in the United States in the twentieth century. The Tuskegee Syphilis Study — in which Black men with syphilis were deliberately left untreated and deceived about their condition, without their informed consent — and other incidents of human experimentation on vulnerable populations led to the passage of the 1974 National Research Act, which codified the IRB concept for biomedical research.

The national commission set up by the act produced the Belmont Report, which articulated a new set of ethical principles for biomedical and behavioral research built around three important elements. The first is “respect for persons,” which includes bodily autonomy and the protection of vulnerable persons. The second is “beneficence,” or doing no harm while maximizing the benefits of the research and minimizing the risks to research subjects. The final is “justice,” which means ensuring that reasonable, non-exploitative, and well-considered procedures are administered fairly, and that the risks and rewards of the research are distributed equitably. The IRB is now a fixture of university campuses around the world, helping to ethically guide research practices based on the principles outlined in the Belmont Report.

The concept of the IRB could be adapted for security and criminal justice technologies at the national level in order to examine those technologies before they are publicly deployed. The evaluation would be designed to include perspectives from psychologists, sociologists, medical professionals, lawyers, engineers, computer and data scientists, and civil society groups — including those advocating on behalf of minority communities, such as Data and Society, Citizen Lab, and the Surveillance Technology Oversight Project.

There have been too many examples of technologies thrust into use with little to no discussion of their safety, effectiveness, or disparate negative impact until after problems arise. Airport body scanners, for example, were put into service without rigorous safety testing, to say nothing of the likelihood that they are ineffective at detecting weapons. Scientists did raise alarms at the time, but the decision to go ahead with deployment became a political one rather than one based on a careful review of the evidence. The internet is replete with accounts of transgender people, Black women, breastfeeding mothers, and the elderly who have all triggered the devices due to “anomalies” detected on their person. If a device cannot differentiate between different types of organic material and has subjected an untold number of travelers to humiliating searches, should it still be in service?

The emerging technology of facial recognition has also been thrust into the public sphere with almost no oversight at the national level. The data sets used to train the algorithms that power these systems rely on pictures scraped from social media and culled from DMV images, in some cases contravening terms of service and state law. The technology also inherits well-known issues with CCTV cameras and their ability to effectively render darker skin, making misidentification more likely. The most important — and fascinating — thing to note is that the entire notion of using the face as a primary identifier fell out of favor more than a century ago, partially because a prisoner who arrived at Leavenworth Prison had measurements that were a near-perfect match to another prisoner already serving his sentence. There are currently no standards for what constitutes a match. While a system’s manufacturer might say that a 95% probability of a match is the standard, a 70% match might look good enough to law enforcement officers, who could take action against a person who otherwise would never have been drawn into an investigation. Aside from the question of effectiveness, the IRB could get to the heart of the issue: the loss of anonymity the moment you leave your house — and whether that is compatible with protecting the rights of minorities, women, and the LGBTQ community.

Lethal technologies should also be examined by this IRB. The word “drone” entered the lexicon in the early 2000s as the aircraft were deployed first for surveillance in the Global War on Terror and later as platforms for missiles designed to bring precision to the business of killing. Despite drones being sold as a more sanitized version of war, the number of civilian deaths associated with drone strikes is staggering. The Air Force operators themselves, meanwhile, routinely experience psychological effects ranging from PTSD to suicide. Neither the safety of innocent civilians nor that of the military personnel who operate the drones was taken into account at the outset. Yet drones are being eyed by civilian law enforcement organizations for domestic surveillance. It is only a matter of time until armed drones patrol US skies without intervention.

HOW TO ESTABLISH AN IRB

The IRB could be established as an independent federal agency by an act of Congress, or even as a congressionally chartered nonprofit corporation, organized to act without political interference to the greatest extent possible. It would be guided by three overriding principles: do as little harm as possible, consider the worst possible outcomes, and determine how the technology could be used by various sectors. With that guidance, experts across disciplines and constituencies could evaluate whether a technology’s possible harms can be mitigated and, if not, decide whether agencies and private sector organizations should be able to use it at all. The technology would then be given a label determining whether it could be used by private sector organizations, local or state law enforcement, or the National Guard, or restricted to military or intelligence use overseas.

We have seen many tools developed for one arena, such as overseas counterinsurgency operations, make their way back home for domestic use. These technologies include IMSI catchers, persistent surveillance overflights, LRAD crowd-control devices, MRAPs, and many more. Even spyware makers whose products have been used to surveil dissidents and reporters are courting local law enforcement agencies in the United States. Spyware could receive a “label” for federal use (with a warrant) or for use overseas only.

Additionally, the IRB could consider the equities of foreign nationals, both to prevent a technology from being used in a manner inconsistent with the Geneva Conventions and to keep it from being turned against minority groups, journalists, or human rights workers. If an agency goes on to use a technology “off label,” it would cede qualified immunity for its agents and employees in the use of that technology. Any harm caused by the off-label use of a particular technology would be presumed to be negligent, and a federal offense.

A PIECEMEAL APPROACH IS INADEQUATE

Jurisdictions throughout the country are starting to regulate their police departments’ use of military and surveillance technology. But a piecemeal approach will not be enough to reliably regulate security and criminal justice technologies. A national IRB could set minimum restrictions on how a given technology may be used, with individual jurisdictions free to place further restrictions on top of them.

What technology we use and whom we are trying to protect with it is one of the most fundamental questions we can answer as a modern society. And we should start by establishing an IRB for security and criminal justice technologies.

Ryan Mason is a corporate security professional and an MSc student in the security studies program at Liverpool John Moores University.

This essay was one of seven selected as honorable mentions in the New America Foundation’s “Reshaping US Security Policy for the COVID Era” essay contest.
