Lethal Autonomous Weapons Systems: Future Challenges

Article
Center for Security Studies (CSS): CSS Analyses on Security Policy - No 164, November 2014
Matthias Bieri and Marcel Dickow

The challenge that armed drones pose for international law and arms control has only existed for a few years. Yet experts are already dealing with questions that will arise in the course of further technical advances – such as how to handle weapons systems that will one day be able to carry out attacks without human intervention.

The use of armed drones in the so-called “war on terror” since 2001 has triggered ethical and legal controversies. While many questions remain to be clarified when it comes to armed drones and targeted killings, advances in the development of autonomy in weapons systems are already sufficient grounds for thinking ahead. The notion of so-called lethal autonomous weapons systems (LAWS) making independent decisions on the use of weapons illustrates the urgency of containing the proliferation of armed drones if a qualitative arms race is to be prevented.

In many respects, the debate on LAWS builds on the current experiences with armed drones. On the one hand, the advocates of a preventive ban on LAWS cite moral and ethical concerns. On the other hand, they assume that such systems will never be able to meet the requirements of international law, such as the distinction between civilians and combatants. From a military point of view, the advance towards greater autonomy appears logical. For some experts, the focus is therefore not so much on banning these future weapons systems as on ensuring that legal obligations are met in their use.

In May 2014, high-ranking international experts debated LAWS in the context of the UN Convention on Certain Conventional Weapons (CCW). It became clear that the states cannot yet agree on a shared understanding of the topic. An agreement therefore seems a distant prospect at this point. Nevertheless, a full-blown debate is underway as to whether such future weapons should be limited or banned altogether. Switzerland, too, is participating in those discussions. In the framework of this debate, it is engaged in strengthening international humanitarian law. In mid-November 2014, the CCW annual meeting will decide in which framework the talks on LAWS should be continued.

Beyond Drones

Although drones account for only a small fraction of the political complications involving military robots, they are currently the focus of debate. In terms of technology, they only constitute the point of departure for (partially) autonomous robotics. Remotely piloted unmanned aerial vehicles (UAVs) are currently the spearhead of military robotics. However, the platforms currently in use do not represent the state of the art in military technology. Drones are much sought after for military missions due to their capability for long loitering times over areas of operations without exposing military operatives to danger. In addition to real-time reconnaissance with high-resolution optical and radar imagery, they are also capable of carrying and firing weapons. The US practice of selective assassinations, carried out mainly using armed UAVs in states like Yemen, Pakistan, or Somalia, has had a major impact in terms of stimulating political debate on the use of drones (cf. CSS Analysis No. 137). The platforms in use are slow-flying, have no defenses against ground-based anti-aircraft systems or hostile aircraft, and can therefore only be deployed in conflicts where their own side has established air superiority. If their mission spectrum is to be expanded, further development will be essential. Particular emphasis is on MALE-class (Medium Altitude, Long Endurance) drones, which the US military has been procuring and deploying since the early 2000s. Unmanned HALE-class (High Altitude, Long Endurance) surveillance aircraft that can fly even higher are currently only used for electronic reconnaissance and are unarmed.

Military Purposes

Aviation is a task that can be relatively easily automated, as there are few obstacles to overcome and few forces involved beyond gravity, aerodynamic resistance, and wind. This is why the advance of military robotics is now beginning with aerial vehicles. For now, the platforms are largely remote-controlled from ground stations. However, the technological trend towards autonomous operations is already apparent, for four reasons: First of all, remote control implies significant dependence on a communications infrastructure, which can in principle be jammed or manipulated, and whose signals can betray the location of the airborne platform. The transfer of large quantities of data from the platform to the ground station requires huge bandwidth on satellite transponders. Already today, dedicated military satellites alone are insufficient to provide this bandwidth, which is why Western military forces have grown dependent on commercial providers of satellite services. Secondly, remote control via long-distance radio involves a delay of up to several seconds.
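The order of magnitude of this delay can be illustrated with a rough, back-of-the-envelope calculation, assuming the control link is relayed via a geostationary satellite at roughly 36,000 km altitude (a common, though not universal, arrangement). One full control cycle – command up to the satellite and down to the aircraft, sensor imagery back the same way – covers about four such legs, so the propagation time alone is

\[
t_{\text{prop}} \approx \frac{4 \times 36{,}000\,\mathrm{km}}{300{,}000\,\mathrm{km/s}} \approx 0.5\,\mathrm{s},
\]

to which signal processing, encoding, and terrestrial relaying add further latency, making total delays of a second or more plausible.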

That is not a problem with the current slow-flying surveillance drones. For the faster airborne platforms of the future, which will have to be able to prevail even in aerial combat, such a delay may be fatal. Third, autonomous systems promise faster and more consistent reactions than human operators. Fourth, autonomous systems are expected to deliver higher flight performance, since human limitations such as susceptibility to stress and limited physical endurance can be disregarded.

Moreover, expanding the use of this technology would probably further reduce casualties on one's own side. One reason for the development of armed drones has been the desire to minimize such losses as far as possible in order to assuage public opinion. At the same time, collateral damage from the use of armed drones causes comparatively little public uproar in Western societies.

Replacing the Human

Due to the difficulties arising from bandwidth requirements and delayed signals, the next step according to military logic is to strengthen the autonomy of the platforms. Movement on the ground and the surveying of the environment, by contrast, pose great problems for current-day land robots. Even more challenging would be their deployment in an environment with many human interactions, such as a city. Locomotion as well as sensing, grasping, and moving objects are still highly complex tasks for a robot to perform. Controlled manipulation of the environment by robots remains a systematic weakness, as does communication between humans and machines. Two trends have emerged in dealing with these inadequacies. On the one hand, it is believed that swarms of relatively simple, communicating individual units can solve complex tasks together; the advantage being that the large number of swarm participants creates redundancy and reduces the complexity of the whole. On the other hand, developers are concentrating on spaces and tasks that are better “suited” to machines than to humans. In the first case, for instance, a forward outpost in the theater of operations could be replaced by a large number of small, cheap disposable sensors, deployed by aircraft or artillery, that form a communication network among themselves. In the second case, flying is the simpler mode of movement – airspace being less fraught with obstacles than the ground – and the task of the robots is not to manipulate their surroundings, but to observe, analyze, and evaluate.

Distancing humans from the dangers of the battlefield through the use of robotic systems may help compensate for human inadequacies – especially when it comes to dealing with stress, danger, and lack of stamina – but at the same time, new problems are created. The danger is that humans are progressively removed from the chain of decision-making and responsibility. At the technical level, the current solution is to require an unequivocal human decision on whether or not the machine is to execute a certain action, for instance regarding the use of force. The key question is whether the human operator is really aware of all that is going on in a given situation. The operator essentially perceives the world through the eyes of the assistance system. When experienced in real time, the reasons why an algorithm has taken a decision to act, or has preselected and suggested such a course of action, can no longer be comprehended. This problem has existed for years. An additional concern is that the political threshold for the use of force could be lowered. It is also possible that killing human beings by confirming an attack from afar would cause an emotional distancing from the act of violence. Greater abstraction will only aggravate that tendency.

LAWS – A Matter for the CCW

Against this background, in May 2014, a multilateral expert group in Geneva began to deal with the issue of LAWS. Within the framework of the CCW, a number of possible outcomes were discussed, with the matter already having been raised in the UN Human Rights Council (UNHRC). The convention currently includes five protocols and bans the use of blinding laser weapons, for example. Demands are now being voiced for agreement on a sixth protocol banning the use of LAWS. While a basic ban on LAWS would be possible in the framework of the CCW, the way it has operated in the past has been by regulating the use of weapons, rather than through banning technologies as such.

These discussions revealed the fundamental problems of the debate. There was no agreement as to what constitutes an “autonomous system”. Furthermore, while there was a general consensus that humans should always have meaningful influence on lethal weapons systems, there were disagreements over the meaning of the word “meaningful”. At the conference, Cuba, Ecuador, Egypt, Pakistan, and the Vatican supported a ban on LAWS. The urgency of such a prohibition was emphasized; its supporters argued that the past shows how difficult it is to ban a weapons system once it has been put into service. However, some states stressed that it was too early to impose limitations or even bans in view of the lack of certain knowledge and a shared understanding of the topic.

Another, more important objection is that autonomous weapons systems, should they ever be put to use, would have to meet the same criteria as all other weapons. Additional Protocol I to the Geneva Conventions requires each state to scrutinize new weapons systems for possible violations of obligations under international law. For LAWS, three obligations are of special concern: the capability to distinguish between civilians and combatants, proportionality in the use of force, and the personal responsibility of the person in charge of a mission. Representatives of civil society assume that autonomous weapons can never be programmed to meet those requirements, and thus call for a precautionary ban. Moreover, they do not believe that the use of such weapons can ever be morally justifiable. Machines, they argue, should never have the last word over matters of human life and death.

According to the majority of states, the Additional Protocol ensures that, should LAWS fail to meet these requirements, they will never be deployed under prevailing law. Therefore, they say, the focus should be on enforcing international humanitarian law. A new legal regime that not all states adhere to would diminish this standard. However, the case of armed drones exemplifies how difficult it is to verify compliance with international law; such verification depends to a large extent on the goodwill of the deploying country. Thus, the suggestion has been made that all states should confirm their obligations under international law in a special agreement focusing on autonomous weapons systems. Moreover, a moratorium on autonomous weapons systems pending clarification of all important questions has been proposed.

The states parties to the CCW will decide at their annual meeting in November 2014 how and in which framework the talks should be continued. A mandate for negotiating the issue appears unlikely, as the majority of states do not see the need to negotiate a sixth CCW protocol at this time. On the other hand, a continuation of talks with the involvement of civil-society experts appears realistic.

Precautions for the Future?

In addition to the CCW, the debates on LAWS in the UNHRC are of vital importance. This body deals in particular with the implications of LAWS for human rights. Other venues might soon come into play as well. Should no protocol banning LAWS be agreed within the CCW, as currently appears likely, civil-society groups have already suggested that a convention banning their use could be developed outside of this framework. Examples of such a path include the Ottawa Treaty (Anti-Personnel Mine Ban Convention), the Convention on Cluster Munitions, and the Chemical Weapons Convention. Civilian research on and use of autonomous systems would constitute a special challenge for the verification of such a regime. There is no suggestion that unarmed autonomous systems should be outlawed. However, mounting weapons on nominally unarmed platforms would not pose any technical difficulties. Thus, the dual-use character of autonomous systems would be a problem, as is already apparent in the case of drones today. A UN special rapporteur noted in 2013 how troubling it was that no information was available as to who was developing and procuring armed drones.

The proliferation of relevant technologies therefore poses a difficult challenge for international export controls. Besides strengthening export control regimes, creating transparency is one of the most urgent aims if the problems arising in connection with proliferation are to be contained. UN special rapporteurs have called on the states to create as much transparency as possible and to comply with applicable laws. Moreover, the European Parliament in February 2014 passed a resolution calling for armed drones to be included in disarmament and arms control regimes.

The Arms Trade Treaty (ATT), which enters into force on 24 December 2014, as well as the UN Register of Conventional Arms impose limits on the trade in armed drones and provide some information about it. However, participation in the ATT is limited; China and Russia, for example, prefer not to join for the time being. The example of the ATT also illustrates one of the greatest shortcomings of present-day arms control mechanisms: Quantity is increasingly outweighed by the quality of weapons systems and the doctrines governing their use, yet there are very few mechanisms in place for monitoring these two components. Capability-oriented arms control is therefore urgently required. However, introducing such inspections is extremely difficult, as armed forces would be forced to reveal a great deal of information about their capabilities. Convincing the military of such a necessity is anything but simple. By contrast, negotiating transparency measures for armed drones appears somewhat more realistic and more swiftly achievable. Globally speaking, however, even that seems unlikely, although in the European space, the exchange of data and information within the OSCE could be extended to unmanned systems. Notably, this would require the US, which is technologically superior to all other actors, to change its position.

Overall, the chances for success are better when it comes to the definition of norms – for instance, the elaboration of a code of conduct. Such a code could govern the deployment of armed drones in domestic and irregular conflicts as well as their use by private military companies. The democratic control of armed forces could also explicitly be extended to armed drones. The relevance of such a commitment can be seen, for instance, in the case of the armed drones deployed in Libya during 2011: US President Barack Obama did not seek parliamentary approval for this deployment, arguing that no casualties were to be expected. In view of the ongoing civilian research in the field of autonomization, there is also a strong argument to be made for raising awareness of ethical issues among researchers.

The Role of Switzerland

Switzerland, as the depositary state of the Geneva Conventions, is a strong advocate for the enforcement of international humanitarian law. In 2012, together with the International Committee of the Red Cross (ICRC), it launched a diplomatic initiative aimed at creating mechanisms for better compliance with international humanitarian law. In the framework of the CCW talks, it has advocated the implementation and strengthening of existing laws.

Switzerland has a standardized process for monitoring compliance with international humanitarian law in its arms procurement planning. Before purchasing a new weapons system, a circle of experts drawn from various departments in the federal administration verifies whether this system could be in violation of prevailing rules, in which case limitations on the weapon’s use would have to be defined. By sharing such proven practices, the importance of international humanitarian law could be promoted at the global level, too. The notion of creating “best practice guides” for the evaluation of new weapons systems appears feasible.

In the summer of 2014, it was reported that the Swiss armed forces aim to purchase Hermes 900 reconnaissance drones from Israel as part of the 2015 arms procurement program. However, purchasing armed drones is not on the table for the foreseeable future. According to these reports, Swiss companies are participating in the further development of the Hermes 900 as part of a defense offset agreement. Not least due to such involvement, dual-use issues will be relevant for Switzerland, too.
