An Overview of Arms Treaties, Emergent Technologies and Citizens' Activism
Frederico Carvalho
A contribution to the proceedings of Working Group 1
89th Executive Council Meeting of the WFSW
Paris, 29-30 April 2019
War threats and regulatory instruments
We are living through difficult and dangerous times.
The predatory nature of unbridled capitalism is slowly closing all emergency exits for mankind to survive on our planet with a minimum acceptable quality of life. We should not let the disaster happen.
The question of regulations is one of overwhelming importance.
The subject of regulation covers a wide range of fields of activity: social, economic and cultural. Armed conflicts in general, nuclear weapons, space and cyber weapons and, in the present day, the regulation of emergent technologies, especially artificial intelligence and genome-editing techniques, deserve particular consideration.
To be effective, regulations of wider international scope require an agreement between state parties that is the culmination of an often lengthy and arduous process of negotiations. In some cases Treaties, Conventions or Protocols are the result of bilateral negotiations or of negotiations involving a limited number of parties. The Intermediate-Range Nuclear Forces (INF) Treaty (1987-2019?) is an outstanding example. START, the Strategic Arms Reduction Treaty (1994-2009), and New START (2011-2021?) are other examples of bilateral treaties binding the USA and the Russian Federation.
Another example is the Anti-Ballistic Missile (ABM) Treaty (1972-2002).
The Treaty on the Non-Proliferation of Nuclear Weapons (1970-open ended), commonly known as the Non-Proliferation Treaty or NPT, is a multilateral treaty with 190 state parties.
The Comprehensive Nuclear-Test-Ban Treaty (CTBT), another multilateral treaty that bans all nuclear explosions, for both civilian and military purposes, in all environments, was adopted by the United Nations General Assembly on September 10, 1996, but has not entered into force, as eight specific states have not ratified the treaty[1].
Through all the years since its foundation in 1945, the United Nations Organization and its specialized agencies, such as the IAEA, have been instrumental in sponsoring and mediating talks that led to several important international treaties. A number of them remain pillars of the world's infrastructure devoted to consolidating peace. It is unfortunate that the role and ability of the United Nations system to achieve consensus on matters of global importance for mankind is being eroded.
Beyond the realm of nuclear weaponry, a number of other international instruments concerning the development and use of weapons deserve particular attention. Such is the case of the Chemical Weapons Convention (CWC) (1997-open ended) and of the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (CCW or CCWC) (1983-open ended).
The latter Convention covers landmines, booby traps, incendiary weapons, blinding laser weapons and the clearance of explosive remnants of war. Up to the present time, however, Lethal Autonomous Weapons Systems (LAWS), often referred to as "killer robots", are not covered.
The Biological and Toxin Weapons Convention (BTWC) (1975-open ended) was the first multilateral disarmament treaty banning the production of an entire category of weapons.
Although the multilateral international treaties and conventions in force have been signed and ratified by a large majority of the member countries of the United Nations, there is in every case a small but significant number of states that are non-signatory. Two examples: four nuclear-armed states, India, Israel, North Korea and Pakistan, are not bound by the NPT; the Chemical Weapons Convention has not been ratified by Egypt, Israel, North Korea and South Sudan. Israel never signed or ratified the Biological and Toxin Weapons Convention. Egypt and Syria have signed but not ratified it.
The most recent initiative aiming at an international agreement on nuclear disarmament, the Treaty on the Prohibition of Nuclear Weapons (TPNW), or Nuclear Weapon Ban Treaty, dates from December 2016, when, under a mandate adopted by the United Nations General Assembly, negotiations on a treaty to that end were set to begin in March 2017. The purpose was to draft what would be the first legally binding international agreement to comprehensively prohibit nuclear weapons, with the goal of leading towards their total elimination. The draft was concluded, submitted to the vote and approved on July 7, 2017, by a majority of 122 member states. However, more than one third of the member states did not take part in the vote, among them all nuclear-armed states and all NATO members with the exception of the Netherlands, which voted against. As of today the Treaty is still not legally in force, since the required number of ratifications has not yet been reached.
It is worth noting that several nuclear powers, including Russia and the USA, have explicitly expressed opposition to the Treaty. Several of the non-nuclear-armed NATO member states claim that the treaty will be "ineffective in eliminating nuclear weapons" and have called instead for advanced implementation of Article VI of the Non-Proliferation Treaty[2]. Experience shows, however, that the nuclear powers have not taken a single step in the last fifty years to implement the said Article VI of the NPT. The USA, the UK and France, the nuclear-armed NATO members, issued a joint statement indicating that they did not intend "to sign, ratify or ever become party to it". Behind such a stand on nuclear disarmament is the consideration, shared by several powers, that giving up nuclear weapons would be "incompatible with the policy of nuclear deterrence, which has been essential to keeping the peace in Europe and North Asia for over 70 years"[3].
It may be argued that the possession of nuclear weapons, as well as of other technologically advanced devices susceptible of being used against an opposing party, is essential to deter aggression, and that renouncing them should not be accepted as long as the conditions for a simultaneous and controlled disarmament are not met. Such considerations may apply to nuclear arms as well as to chemical, biological and autonomous weapon systems.
In the case of nuclear weapons, including the means of delivery, regulatory instruments, namely international or bilateral treaties, are being dismantled. This effectively weakens the basis of the said "policy of nuclear deterrence". The development of more sophisticated, smaller-sized nuclear explosives and of stealthier means of delivery goes in the same direction. It appears that the possibility of launching and winning a limited nuclear war is considered a viable option in high-ranking military circles, namely in the USA, and a justification for that development effort.
We are living under an increased danger of an all-out nuclear conflict triggered by a miscalculation of the opponent’s reaction. This adds to the long existing danger of nuclear war being triggered by mistake or by the failure of a faulty component in the paraphernalia of equipment and systems involved in the immense machinery of war.
As far as Robotics, Artificial Intelligence (AI) and Gene Editing techniques are concerned, regulatory instruments are in their infancy or non-existent. There is, however, a growing demand for them. The next section provides information concerning military and surveillance applications of AI.
Genome editing is a complex issue that raises a host of questions embracing a variety of fields. As with most technical advances, it can be used for the good of humanity or it may be the source of new threats. In all cases it raises serious ethical questions. Scientists have called for a moratorium on genome editing during which its safety implications could be assessed and adequate regulations on its use implemented.
Genome editing techniques make it possible to change specific DNA sequences in a precise and controlled way. The so-called CRISPR/Cas9 technology is probably the easiest to set up and to apply. It can be used in basic science, for human health, or for improvements to crops. But it can just as well be used to kill or sterilize a plant. In fact, CRISPR/Cas9 technology has attracted the interest of the military. DARPA, the US Defense Advanced Research Projects Agency, is working on it[4]. Another possible field of application of CRISPR is human enhancement for non-medical purposes. In a recent article in the Journal of Bioethical Inquiry the authors write: "Human performance optimization has long been a priority of military research in order to close the gap between the advancement of warfare and the limitations of human actors."[5]
Collective public stances against military applications of scientific and technological advances and against increased funding of military R&D
In recent years scientific research has led to the swift development of new technologies that lend themselves to military applications, in a war theater as well as in domestic law-enforcement operations against citizens' movements. This state of affairs is particularly worrying in a context of escalation of the arms race, degradation of international relations between the main powers and the dwindling effectiveness of international bodies, mainly the United Nations, in dealing with crises.
There are, however, signs of hope.
A number of initiatives have emerged within the international scientific community expressing opposition to the increased expenditure of public funds on military research projects. In the past the intervention of scientists has been at the heart of successful international agreements in the nuclear and other fields. Such initiatives are welcome and deserve support. Today we witness the development, at an ever-increasing pace, of technologies that can be rapidly applied in the military field. This fast progress makes it particularly difficult for the non-specialist to evaluate, or even realize, the potential social impacts of these technologies in key areas such as employment and the labor market, intrusion into private life, security and peacekeeping. Applications of Artificial Intelligence and Robotics are particularly sensitive when they concern lethal armament systems capable of making decisions without human intervention.
Approximately four years ago (July 2015), on the initiative of the "Future of Life Institute", a US-based non-governmental association, an "open letter" by researchers in Artificial Intelligence and Robotics was made public, drawing the attention of the scientific community to the dangers of using devices with decision-making autonomy as weapons of war on a battlefield.[6] At the time of the announcement, at a scientific meeting in Buenos Aires, the letter had been signed by about 1,000 scientists and other intellectuals of world renown. The late Stephen Hawking and Noam Chomsky, the well-known American linguist, were two of the signatories. The letter continued to circulate and has by now collected more than 31,000 signatures. Thirty-four of the signatories are Portuguese. The letter points out that "Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms".
Two years later (August 2017), 116 executives of "companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons", from 26 countries, made public an open letter addressed to the representatives of the 124 state parties to the UN Conference of the above-mentioned Convention on Certain Conventional Weapons, welcoming the Conference's decision to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems (LAWS). "Many of our researchers and engineers", they added in the letter, "are eager to offer technical advice to (the) deliberations" of the Group. The signatories stated that, as professionals, "we feel especially responsible in raising this alarm" [7].
“These can be weapons of terror — they added — weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s Box is opened, it will be hard to close”.
The GGE has met several times since its inception. To date there is no indication of any substantial agreement between the parties.
When the GGE met in August 2018, 26 states supported a ban on fully autonomous weapons systems, four more than at the April 2018 meeting. However, 12 states, including Russia, the U.S. and the U.K., opposed even negotiating a treaty on LAWS.
In the European Union, activism for peace and against policies that exacerbate the risks of an arms race is alive. On November 28, 2018, a group of 177 European scientists from 17 member states, "alarmed about the military turn the EU is taking", made public a letter expressing their concern over "both the massive amount of public funds that the EU is allocating for military research and the potential development of autonomous weapons (…)"[8]. Thirteen of the signatories are Portuguese.
A few months earlier, "Researchers for Peace", an initiative of eight European associations of scientific workers, published a letter drawing attention to the fact that the EU had, for the first time, in 2018, set up a military research program with the objective of "helping to preserve the competitiveness of the arms industry". The so-called Preparatory Action on Defense Research (PADR) allocated a total sum of 90 million euros to military research projects over a three-year period up to 2020. The letter stressed that the Preparatory Action was meant to be "only a first step in paving the way for a full blown European Defense Fund of an estimated 40 billion euros for research and development of military hardware over the next ten years". That is, until 2030. The letter added: "Investing EU funds in military research will not only divert resources from more peaceful areas, but is also likely to fuel arms races, undermining security in Europe or elsewhere". The letter has been signed by more than a thousand scientific workers[9]. Eighty of the signatories are Portuguese or working in Portugal.
In recent years military applications of AI and Robotics, especially autonomous weapons, have been at the forefront of the concerns of socially responsible scientific workers and peace activists worldwide. In July 2018, a pledge to "neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons" was made public in Stockholm. The pledge has been signed to date by over 160 AI-related companies and organizations from 36 countries, and by 2,400 individuals from 90 countries. OTC and the Lisbon Nova University are among the signatory organizations, as are a dozen Portuguese scientific workers.
In September 2018 the European Parliament passed a resolution on banning autonomous weapon systems by 566 votes to 47, with 73 abstentions.
Recently, a different type of action has surfaced that deserves particular mention.
We refer to reactions by employees of big multinational tech companies who oppose their company's involvement in contracts with the military to develop weapons. At least three cases have become public. They concern Microsoft, Google and Amazon.
Microsoft employees protested the company’s US$480 million contract to supply the US Army with augmented-reality headsets in a letter stating that they “did not sign up to develop weapons.”[10] In the letter they add: “We are a global coalition of Microsoft workers, and we refuse to create technology for warfare and oppression”[11].
This letter follows an October 2018 open letter by anonymous Microsoft employees in which they demanded the company withdraw its bid for a US$10 billion Joint Enterprise Defense Infrastructure (JEDI) contract with the US Department of Defense. A passage of the letter reads as follows: “Many Microsoft employees don’t believe that what we build should be used for waging war. When we decided to work at Microsoft, we were doing so in the hopes of “empowering every person on the planet to achieve more,” not with the intent of ending lives and enhancing (weapons) lethality.”[12]
In June 2018, in an article by Drew Harwell, the Washington Post announced that “Google is banning the development of artificial-intelligence software that can be used in weapons.”[13] This appears to have been a response to what the author describes as “(…) a firestorm of employee resignations and public criticism over a Google contract with the Defense Department for software that could help analyze drone video, which critics argued had nudged the company one step closer to the “business of war.””
An open letter addressed to Google executives, in support of the estimated more than three thousand Google employees and tech workers involved in this process, was published by a group of approximately twelve hundred scholars, academics and researchers from different countries. Professor Luís Moniz Pereira, who is a member of OTC's governing bodies, is one of the signatories.
In the case of Amazon, employee activism is directed against the sale of the company's facial recognition software, Rekognition, to law enforcement agencies. Workers have circulated an internal letter addressed to Amazon's CEO asking the management to discontinue partnerships with companies that work with US Immigration and Customs Enforcement (ICE). "Our company should not be in the surveillance business; we should not be in the policing business; we should not be in the business of supporting those who monitor and oppress marginalized populations," the employees' letter states[14]. They add: "We don't have to wait to find out how these technologies will be used. We already know that in the midst of historic militarization of police, renewed targeting of Black activists, and the growth of a federal deportation force currently engaged in human rights abuses — this will be another powerful tool for the surveillance state, and ultimately serve to harm the most marginalized."
This appears to represent a clear recognition that the software in question can be used as a tool in the building-up of a police state.
Frederico Carvalho
April 16, 2019
[1] China, Egypt, India, Iran, Israel, North Korea, Pakistan, United States
[2] Article VI of the NPT: “Each of the Parties to the Treaty undertakes to pursue negotiations in good faith on effective measures relating to cessation of the nuclear arms race at an early date and to nuclear disarmament, and on a treaty on general and complete disarmament under strict and effective international control”.
[3] Joint Press Statement from the Permanent Representatives to the United Nations of the United States, United Kingdom and France following the adoption of a treaty banning nuclear weapons, 7 July 2017
[4] https://otc.pt/wp/2019/02/22/a-new-bioweapon-system/
[5] “Ethical Issues of Using CRISPR Technologies for Research on Military Enhancement”, Greene, M. and Master, Z., Journal of Bioethical Inquiry, September 2018, Volume 15, Issue 3, pp 327–335
[6] “Autonomous Weapons: An Open Letter from AI & Robotics Researchers”, July 2015. (https://futureoflife.org/open-letter-autonomous-weapons/)
[7] https://futureoflife.org/autonomous-weapons-open-letter-2017/
[8] https://www.vredesactie.be/unmanned-autonomous-weapons-how-eu-adrift
[9] https://www.researchersforpeace.eu/form/researchers-pledge-form#english
[10] https://www.euronews.com/2019/02/23/microsoft-employees-demand-military-contract-be-dropped (February 2019)
[11] https://www.documentcloud.org/documents/5746790-Microsoft-HoloLens-Letter.html
[12] https://medium.com/s/story/an-open-letter-to-microsoft-dont-bid-on-the-us-military-s-project-jedi-7279338b7132
[13] https://www.postguam.com/business/technology/google-bans-development-of-artificial-intelligence-used-in-weaponry/article_7af8feb4-6ae3-11e8-866c-a72d5ab46141.html
[14] https://gizmodo.com/amazon-workers-demand-jeff-bezos-cancel-face-recognitio-1827037509 (June 2018)