SCIENCE CAPTURED BY POWER STRATEGIES

Military applications of scientific work

Frederico Carvalho

Technological developments are taking place at an accelerated pace. We are all aware of such developments, of their potential and of their likely consequences in the short and medium term. However, people’s awareness of the current situation varies, depending on the information they possess and on their ability to interpret it critically in the specific context of the society or social group they belong to, live and work in.
Technological evolution is inevitable, as is the quest for new knowledge about the natural world. That quest is the task of scientific research, the source of the advancement of science.
Societies are shaped by forces whose nature and correlation is constantly evolving. Those forces determine the use that is made of scientific and technological knowledge as well as the objectives and course of scientific work itself.
War as a social phenomenon has existed for as long as mankind itself. But the forms that war has taken and the means it has employed have changed significantly over time, as have the social and environmental impacts of wars. Scientific knowledge and the technological developments it has generated have always been associated with the evolution of military means, weapons and systems.

This mosaic of images illustrates conflicts, the clash between warring groups and combat hardware in different epochs. Top right, a cave painting depicts a fight between two human groups brandishing spears (Mesolithic, 10 to 20 thousand years BC). Top left, a two-horse chariot (biga); the Hyksos, who dominated Egypt between approximately 1600 and 1500 BC, introduced the war chariot and the horse into warfare. Top middle, a robotic mule developed for the US Department of Defense, capable of carrying about 150 kg over uneven terrain. Below left, a remotely controllable device known as a four-rotor drone or quadcopter. Below centre, drawing of a machine gun designed by Leonardo da Vinci for Cesare Borgia (1482). Below right, a bas-relief from the palace of Ashurbanipal (Mesopotamia, today’s Iraq): an Assyrian officer presents a new king to the vanquished Elamites at Madaktu after the battle of Til Tuba (seventh century BC).

 In the second quarter of the last century ― nearly eighty years ago ― the development of nuclear weapons ― man’s control over the “atomic fire”, as it is sometimes depicted ― created a new situation: it gave mankind the possibility of exterminating life on Earth. It is a terrible threat that has remained with us and is presently becoming more serious, despite the brave and persistent efforts to defend peace being made everywhere in an organized fashion by women and men who are aware of the dangers that the world faces.

On March 1, 1954, the USA detonated a hydrogen bomb at Bikini Atoll, in the Marshall Islands (see image). The radioactive fallout of this nuclear test, code-named Castle Bravo, with a yield of 15 megatons of TNT, spread around the world.

Traces of radioactive material were detected as far away as Australia, India and Japan, and even in the United States and parts of Europe. The most powerful device ever detonated was the Tsar Bomba, tested by the USSR on October 30, 1961, over the Novaya Zemlya archipelago in the Arctic Ocean. Its yield, some 50 megatons, was roughly three times that of the Castle Bravo test.

At the end of July 2015, the 24th International Joint Conference on Artificial Intelligence was held in Buenos Aires. A document was made public at the event that bore the title “Autonomous Weapons: an Open Letter from AI & Robotics Researchers” [1]. It was an initiative of the Future of Life Institute, based in Cambridge, USA [2]. More than 20,000 researchers, professors and other scientific workers from various fields signed the open letter. Of these, some 3,000 are researchers in Artificial Intelligence and Robotics. The well-known linguist and humanist Noam Chomsky and the physicist and cosmologist Stephen Hawking are two of the many signatories who come from a range of fields including the social sciences, humanities, physical sciences and others. One of the signatories from the first group is Professor Luís Moniz Pereira, a member of the Board of Organização dos Trabalhadores Científicos [3] and a renowned expert in Artificial Intelligence.

The open letter states that: “Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters [4] that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.”

But what are robotic weapons with autonomous decision-making capabilities? Robot is a word derived from the Czech “robota”, meaning “arduous work” or “toil” [5]. It is a machine capable of automatically performing complex tasks, usually under the control of a computer or similar embedded device that communicates with an emitting-receiving centre, which may be located nearby or far away. Information collected locally through different types of on-board sensors is transmitted to this centre, from which instructions issued by an operator may be sent back. An autonomous robot differs from an automatic robot in that it reacts to the information collected by its sensors without any human intervention: its actions do not depend on any analysis or scrutiny of that information by a human operator. Obviously the device comes off the assembly line with “default” settings that determine the kind of actions it may execute, but the robot’s behaviour and reactions under specific circumstances are not subject to the control of an operator. That is why the aforementioned open letter states that robotic weapons with autonomous decision-making capabilities “can search for and eliminate people meeting certain pre-defined criteria”. A robot may, for instance, eliminate “a suspicious-looking individual”, regardless of what is meant by “suspicious-looking”. An autonomous robotic weapon is able to identify, select and attack a target without human supervision.
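To make the distinction concrete, here is a minimal sketch of the two control loops in Python. It is a deliberately abstract illustration under invented names (SensorReading, operator_approves and the “engage”/“hold” actions are all hypothetical), not code drawn from any real system.

    from dataclasses import dataclass

    @dataclass
    class SensorReading:
        position: tuple         # where the on-board sensors located the object
        matches_criteria: bool  # did on-board pattern matching flag the object?

    def automatic_step(reading, operator_approves):
        # "Automatic" operation: the reading is relayed to a human operator,
        # and the machine acts only if that operator approves.
        if reading.matches_criteria and operator_approves(reading):
            return "engage"
        return "hold"

    def autonomous_step(reading):
        # Autonomous operation: the pre-programmed criteria alone decide;
        # no human scrutinizes the sensor data before the action is taken.
        return "engage" if reading.matches_criteria else "hold"

The entire ethical debate turns on the single call that disappears in the second function: the one that places a human judgment between the sensor data and the action.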

Autonomous robotic weapons may be designed with varying degrees of autonomy. They may get their instructions “at inception” without the possibility of “behavioral” changes by learning and independent autonomous reprogramming. When the latter possibility exists, the systems are said to be based on a “genetic algorithm”, a familiar notion in Artificial Intelligence. Such weapons raise very serious ethical and legal issues, namely but not exclusively related to the liability for so-called “collateral damage”. From an ethical and legal perspective, the issues raised by these particularly sophisticated systems as well as by any other autonomous weapons systems are such that, in my view, the only acceptable solution is a formal ban.
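For readers unfamiliar with the term, the toy sketch below shows a genetic algorithm in its standard textbook form: candidate “behaviours” are encoded as lists of parameters, the fittest candidates are selected, and random mutation produces variants, so that behaviour changes across generations without any operator reprogramming the system. The optimization target here, a simple numerical function, is deliberately innocuous and purely illustrative.

    import random

    def fitness(genome):
        # Score a candidate; in a real system this would measure task performance.
        # Here the best possible genome has every gene equal to 0.7.
        return -sum((g - 0.7) ** 2 for g in genome)

    def mutate(genome, rate=0.2):
        # Randomly perturb some genes: the source of new, unprogrammed behaviour.
        return [g + random.gauss(0, 0.05) if random.random() < rate else g
                for g in genome]

    # Start from random candidates and evolve them over many generations.
    population = [[random.random() for _ in range(10)] for _ in range(30)]
    for _ in range(200):
        population.sort(key=fitness, reverse=True)
        elite = population[:10]                              # selection
        population = elite + [mutate(random.choice(elite))   # variation
                              for _ in range(20)]

    print(max(map(fitness, population)))  # approaches 0 as genomes approach 0.7

The point relevant to the ethical discussion is that the final behaviour is discovered by the process itself rather than written down in advance by a programmer, which is precisely what makes liability so hard to assign.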

On the other hand – and this point is worth stressing – these are weapons that, as physical objects, cannot be distinguished from other devices that have different features and peaceful, useful applications, such as delivering mail or parcels or filming a crowd.

Depicting the emergence of autonomous robotic weapons systems as “(…) the third revolution in warfare, after gunpowder and nuclear arms (…)” reflects an awareness of the danger that they represent for peace, the peaceful coexistence between nations and the survival of our species. It is this awareness that has led many, in different places and fora, to further discuss the existence and use of such lethal weapons systems and the ethical and legal issues that they raise.

In April this year, the 3rd Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) was held in Geneva in the framework of the UN Convention on Inhumane Weapons (CCW) that entered into force in December 1983 [6]. A series of recommendations were drafted on the nature and implications of the use of Lethal Autonomous Weapons in limited offensive actions or theatres of war. The recommendations were addressed to the Fifth Review Conference of the High Contracting Parties to the CCW, set to take place in December in Geneva.

Abundant documentation can be found on the internet about the submissions made by organizations from the various States participating in the 3rd Meeting of Experts. I will focus on two of them: the statements by the International Committee of the Red Cross and by the Holy See. In both cases, I will emphasize the passages that express concern over the use of autonomous weapons systems (AWS).

The International Committee of the Red Cross (ICRC) claims that a number of developments “might make increasingly autonomous weapon systems become less predictable”. Such developments include: increased mobility, meaning the weapon system would encounter more varied environments over greater time periods; increased adaptability, such as systems that set their own goals or change their functioning in response to the environment (e.g. a system that defends itself against an attack) or even incorporate learning algorithms; and increased interaction of multiple weapon systems in self-organizing swarms. The ICRC statement notes that autonomous weapons systems that are able to define their own targets or even “learn” and adapt their operational characteristics will, by definition, be unpredictable.

According to the ICRC, increasingly autonomous weapons systems will have very serious legal and ethical implications.

The four-page working document on which the Holy See based its statement at the April Meeting of Experts in Geneva bears the title “Elements Supporting the Prohibition of Lethal Autonomous Weapons Systems”. It presents a very clear stance on the matter. The following quote addresses issues that go well beyond the technological questions associated with the development and use of such weapons.

“(…) the disappearance of the human fighter will induce the disappearance of what the relationship of a person to a person and the discovery of the face of the other could provoke. A machine cannot have real empathy (this requires the experience to feel in one’s body what the other feels in his body. The machine has no real corporeality). A machine is not open to the unexpected forgiveness and to a real possibility of reconciliation or pacification”.

There are some who would like to lead us down this road. It is a dangerous road that I believe will pave the way to more suffering, destruction and conflicts. It is a road that can only be of interest to forces that are guided by their lust for profits and that seek to dominate the world.

About two months before the Meeting of Experts on Lethal Autonomous Weapons Systems referred to above, the use of such weapons was also discussed by the UN Human Rights Council. A report disclosed in February 2016 [7] included the following recommendation: “Autonomous weapons systems that require no meaningful human control should be prohibited, and remotely controlled force should only ever be used with the greatest caution.”

In a report dated August 2014, Christof Heyns, UN Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions [8], emphasizes the concerns raised by the increasing use of technologies that “depersonalize” the use of force, including armed drones, not only in armed conflict but also in law enforcement and domestic contexts. Such technologies are now on sale everywhere and are used by law enforcement agencies and private security companies for “crowd control; action against specific classes of perpetrators, such as prison escapees (…) and provision of perimeter protection around specific buildings”. The report also notes that an armed drone remotely controlled by a human will hardly act as a law enforcement official is supposed to act, that is, employ only the minimum force required by the circumstances, assist people in need, and so on. Heyns adds that the situation will tend to become even more problematic if police officers make use of weapons with an increased degree of autonomy, that is, weapons with integrated computers that are able to decide on the use of force autonomously.

The science of robotics and its associated technologies have evolved rapidly. This has naturally led to the development of many useful applications and devices that can assist us in our daily lives. Whenever we mention the autonomy of robotic devices, it is important to stress that such autonomy is based on a specific scientific domain, that of Artificial Intelligence.

More recently, the prestigious international journal Nature dedicated its Editorial to the issue of Artificial Intelligence, addressing the topic in a very apt manner [9].

The Editorial avoids the polarized debate between unyielding sceptics and those who believe in the brightest of futures brought about by AI, focusing instead on the growing consensus that research in AI will have a profound impact on our lives sooner rather than later. Large corporations like Microsoft and Toyota, but also Google and Facebook, are investing billions of dollars in AI and robotics research in the belief that the breathtaking developments in these powerful technologies could be the next Eldorado and boost their profits. However, the Nature Editorial notes that safeguards should be envisaged against the potential pitfalls of such technologies, whose impact is likely to be felt very quickly as AI converges with progress in robotics, leading to very significant technological change. Among other dangers, the Editorial refers to the “all too clear threat” that drones and other autonomous offensive weapons systems will allow machines to make lethal decisions alone.

The Editorial looks at the issue of AI from the broader perspective of the social impact of its widespread use in productive sectors, stressing the danger of a mass extinction of jobs and adds: “A society dependent on AI could yield broad benefits if increased wealth resulting from gains in productivity is shared. But currently, most such benefits are concentrated in companies and the capital of their shareholders — including the infamous 1%”.

Other aspects are worth mentioning in the context of the development of technologically-advanced new weapons. Such aspects are also related to advances in AI and robotics, combined with the outcomes of research and innovation in other fields.

This is the case, for instance, with “electronic warfare”, which includes the use of “directed-energy weapons”, namely high-power laser emitters; with so-called “cyberwarfare”; and also with miniaturized nuclear weapons combined with undetectable airborne delivery systems. There is abundant evidence that significant investments are being made in these technologies by various US federal agencies, in collaboration with prestigious universities. There are also indications that research in these fields is being carried out in the Russian Federation, the People’s Republic of China and other technologically-advanced countries. In the latter part of this paper, I will address this issue briefly [10].

In April 2015, the Center for a New American Security, a think-tank with close ties to the US Administration, published a report entitled “Directed-Energy Weapons: Promise and Prospects” [11]. The author, Jason Ellis, is a senior scientist at the Lawrence Livermore National Laboratory, one of the largest federal research labs in the US, focused in particular on military and security-related issues [12]. He argues for a significant increase in expenditure and for closer coordination of research within the euphemistically named Department of Defense (DOD), pointing out that by around 2022 China may overtake the United States in total research and development spending.

The operation of directed-energy laser weapons (whose acronym, LaWS, for Laser Weapon System, is easily confused with LAWS, the acronym for lethal autonomous weapons systems) is based on the emission of a laser beam against a target to be destroyed or disabled. The beam can hit its target from a great distance ― 1 km or more ― before atmospheric dispersion and energy losses become large enough to prevent it from causing the desired effect. The effect on the target is usually a steep rise in temperature at the point of impact, which can melt metal, start a fire or, if the target carries explosives, cause an explosion. The US Navy has tested and declared operational laser weapons that have destroyed several types of targets: rocket-propelled grenades, small vessels, drones and small missiles. At the end of 2014, a directed-energy laser system was installed aboard the USS Ponce, an amphibious transport dock ship. The system was tested, is operational and can be used against drones, small aircraft and high-speed boats. The power of the new weapon has not been disclosed but is estimated at between 15 and 50 kW. At the beginning of 2015, the USS Ponce was deployed to the Persian Gulf.
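To give a sense of the physics involved, a back-of-the-envelope estimate may help. Assume, purely for illustration, a 30 kW beam (the middle of the range quoted above) dwelling for one second on a spot, with all of the energy absorbed as heat by about 100 g of steel (specific heat roughly 500 J kg⁻¹ K⁻¹) and no losses:

    \Delta T = \frac{P\,t}{m\,c} \approx \frac{3 \times 10^{4}\,\mathrm{W} \times 1\,\mathrm{s}}{0.1\,\mathrm{kg} \times 500\,\mathrm{J\,kg^{-1}\,K^{-1}}} = 600\,\mathrm{K}

Under these idealized assumptions, a single second of dwell time raises the temperature of the spot by some 600 K, enough to weaken metal or set off explosives; real-world performance is lower, since absorption is partial and the atmosphere scatters and absorbs part of the beam.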

 (U.S. Navy photo by John F. Williams)

As regards new nuclear weapons, it is a well-known fact that the Obama Administration has adopted an atomic revitalization plan at an estimated cost of one trillion dollars, to be spent over three decades. The decision is a clear violation of Article VI of the Non-Proliferation Treaty and shows that the United States is intent on keeping nuclear weapons operational rather than eliminating them, and on being able to use them on the battlefield or in retaliation against supposed enemies.

In the case of cyberwars carried out on Earth or in outer space, where thousands of orbiting satellites ensure the operation of infrastructures deemed essential to our daily lives, the risk of disruption caused by offensive action, or even by accident or miscalculation, is huge. Communications, navigation (GPS) and meteorology, for example, depend on the technological devices that human knowledge and ingenuity have managed to place in orbit around the Earth. It would be easy for a military satellite to selectively destroy or disable other satellites. This could be done, for instance, by means of high-energy beams emitted by one of those orbiting devices.

Moreover, comprehensive and inclusive legislation on space activities is still lacking, a fact that has also contributed to a dangerous accumulation of space debris orbiting the Earth at very high speed.

Cyberwars carried out on Earth, including the selective dissemination of computer viruses and hacking activities (although it should be noted that not all actions by so-called hackers are malicious), are fertile ground for disabling key infrastructures: energy supply, transportation, health services, water supply, communications, industrial plants and so on. In this way a potential enemy may be paralyzed without the precise origin of the attack being known. That is why the more technologically-advanced nations are presently investing significant human, material and financial resources in preventing cyberattacks but also, it must be said, in creating the capabilities to carry them out [13].

Weapons, drugs and trafficking in human beings are among the most lucrative businesses of our time. They are carried out on the margins of lawfulness, ethics, morals and human and environmental rights. All make use of technological developments and the advancement of scientific knowledge. Only by making the public aware of the dangers of the inappropriate use of such developments can we minimize the associated risks. The role of the scientific community could be decisive in this regard: it is a community of women and men whose education allows them to separate the wheat from the chaff when it comes to the new possibilities opened up by science and technology. However, to engage public opinion so that it may act as a catalyst for a political process, the identification and explanation of potential risks cannot be confined to the scientific community itself, a community, incidentally, in which many are unaware of such issues for lack of interest, time or relevant information outside their specific professional expertise. Promoting a civic activism that engages ordinary men and women is of the utmost importance. There is therefore an urgent need for dialogue and information-sharing at various levels and with a diversity of partners.

We see the present paper as a modest contribution to that end.

 

October 10, 2016

Frederico Carvalho is Chair of the Executive Board of OTC, the Portuguese Organization of Scientific Workers, and Vice-President of the Executive Council of the World Federation of Scientific Workers. He holds a PhD in Neutron Physics.


[3] OTC, the Portuguese Organization of Scientific Workers.

[4] Four-rotor helicopters.

[5] The term was first used by the Czech writer K. Čapek in his science-fiction play R.U.R. (1920), where R.U.R. stood for “Rossum’s Universal Robots”, the name of the robot factory in the play.

[6] Currently 122 States are parties to the Convention on Certain Conventional Weapons (CCW), also known as the “Inhumane Weapons Convention” whose purpose is to ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately. Mines, booby-traps, incendiary weapons and blinding laser weapons are some of the weapons that come under the Convention.

(http://www.unog.ch/80256EE600585943/(httpPages)/4F0DEF093B4860B4C1257180004B1B30?OpenDocument).

[7] UN Human Rights Council, Doc. A/HRC/31/66, “Joint report of the Special Rapporteur on the rights to freedom of peaceful assembly and of association and the Special Rapporteur on extrajudicial, summary or arbitrary executions on the proper management of assemblies”, February 4, 2016. Report to the 31st regular session of the Human Rights Council (February 29 to March 24, 2016), UNOG, United Nations Office at Geneva.

[8] Special Rapporteur’s report (A/69/265), http://www.ohchr.org/EN/newyork/Pages/HRreportstothe69thsessionGA.aspx

[9] “Anticipating artificial intelligence”, Editorial, Nature, vol. 532, p. 413, April 28, 2016.

[10] In July 2015, at his confirmation hearing in the Senate as supreme commander of the US armed forces, General Joseph F. Dunford said that Russia posed the greatest “existential” threat to the United States. In December, Deputy Secretary of Defense Bob Work reiterated the same concern, depicting the Russian Federation as a “resurgent great power” and China as a “rising power with impressive latent military technological capabilities [that] probably embodies a more enduring strategic challenge”. In February this year, Secretary of Defense Ashton Carter spoke of Russian “aggression” in Europe, adding that sadly “we haven’t had to worry about this for 25 years, and while I wish it were otherwise, now we do”. cf. David Ignatius, Opinion writer, “The exotic new weapons the Pentagon wants to deter Russia and China”, The Washington Post, February 23, 2016

[11] Jason D. Ellis, “Directed-Energy Weapons: Promise and Prospects”, Center for a New American Security, April 2015 (60 pp.)

[12] The homepage of the laboratory’s website features the following quote from its Director: “Our mission is to make the world a safer place.”

[13] According to credible sources, the Pentagon’s budget for 2017 includes the following allocations (in USD): $3 billion for advanced weapons to counter, say, a Chinese long-range attack on U.S. naval forces; $3 billion to upgrade undersea systems; $3 billion for human-machine teaming and “swarming” operations by unmanned drones; $1.7 billion for cyber and electronic systems that use artificial intelligence; and $500 million for war-gaming and other testing of new concepts (cf. endnote 10).