Lethal Automated Weapons Systems and Humanitarian Law

Introduction

Lethal Automated Weapons Systems (LAWS) operate without human intervention; such is the advancement of technology in the present-day world. They are weapons capable of selecting targets using sensors, image recognition and software, and of locking onto and attacking those targets on their own. In 2012, a coalition of NGOs launched the 'Campaign to Stop Killer Robots', calling for a complete ban on LAWS. Indeed, these automated weapons have been termed 'killer robots' because of their potential to destroy mankind. Their development has taken on the character of a modern cold war between two of the world's great powers, the U.S. and Russia. The UN has agreed to consider and review controls on the use of such technology in automated weapons. Even testing these weapons may violate certain humanitarian principles, and the UN needs to act quickly to ban them.

2. History and Technology

Technologically advanced countries like Russia, the U.S. and the U.K. have begun to deploy LAWS in their defence forces. One of the best-known examples is the unmanned drone, which operates on commands received through signals from ground stations and can be used in the air, on land, at sea or in space. Defence sectors have started to integrate artificial intelligence machines into their military operations. A 2018 US Defense Department report noted that Russia has developed a new intercontinental, nuclear-armed, nuclear-powered, undersea autonomous torpedo. It is capable of striking U.S. coastal targets and military bases, is completely unmanned and operates on its own from the moment a target is set. The report raised concerns on the U.S. side, and Russia has not revealed any details.

2.1 Automated Submarines

Australia's Collins-class submarines, which operate in the Indian Ocean, are extremely effective in service. They are also expensive, and in exercises they have 'sunk' many U.S. Navy submarines and destroyers. The Royal Australian Navy has considered replacing submariners with Unmanned Undersea Vehicles (UUVs), which would take over the role of crew members. It is understood, however, that such machines are effective only at short range, since transmitting signals under the ocean is difficult. In support of longer missions, Boeing has unveiled a long-range prototype autonomous submarine named Echo Voyager. It is designed not merely for short sorties but for endurance: it can operate autonomously for months at a time. This shows how far autonomous technology has developed and stands as an example of what automated submarines could achieve on long-distance programmes.

2.2 Automated Drones

Automated drones are those that operate without human intervention. These drones are highly automated with artificial intelligence, enabling them to learn and adapt to new challenges on their own. 'Swarms' of drones are currently being tested that can follow other aircraft and take tasking from them; the U.S. has been testing this concept under the name 'Loyal Wingman'. The idea is that unmanned drones dispatched from a manned aircraft can then operate independently of it. This technology could enable surgical strikes that limit wider damage and could also be used inside buildings and hideouts. The potential advantages are such that marine and ground drones are also being researched in the U.S.

3. Humanitarian Issues

Basic questions of humanitarian law arise whenever LAWS are used. The question of target selection, and whether LAWS are capable of distinguishing combatant soldiers from civilians, remains unanswered. There are further doubts as to whether LAWS can assess an individual's intent before attacking. Calls and demands for human intervention over such technology have been growing and are increasingly urgent. Leading these calls for a ban is the 'Campaign to Stop Killer Robots', started in 2012. More recently, a group of 116 founders of robotics and artificial intelligence companies, led by Tesla chief Elon Musk and DeepMind co-founder Mustafa Suleyman, has called for a ban on these killer robots, describing them as a threat to humanity.

3.1 Geneva Convention

The consensus that there must be human involvement in automated weapons raises the question of whether robots can respect International Humanitarian Law (IHL). At present the question of whether they can do so remains unsettled, and neither side has proved the argument in its favour. The United States has contended that no existing norms block the development of autonomy and that IHL does not prevent it. France, for its part, has concluded that the development of LAWS cannot in itself be regarded as contrary to IHL and that prohibition would be premature while the technology is still in its infancy. Under Article 36 of Additional Protocol I to the Geneva Conventions, States have an obligation to verify that any envisaged new weapon conforms to IHL. In other words, whenever a new weapon, means or method of warfare is developed, the State must determine whether its employment would be prohibited by the Protocol or by any other rule of international law.

3.2 Martens Clause

The Martens Clause, found in the Preamble of the Hague Conventions, states that where no specific regulations exist, persons remain under the protection of the principles of international law. The same can be applied to the automation technology of LAWS, since the present situation is comparable: there is no regulation governing their autonomy. It is therefore significant to ask whether machines that can kill human beings without any human intervention would violate human rights. The laws of humanity are protected under the Martens Clause, which makes clear that any use of force by civilised nations must be judged against the dictates of public conscience and the laws of humanity. On that standard, allowing automated machines to kill without human intervention is, in the present scenario, regarded as unacceptable.

4. United Nations and LAWS

Lethal Automated Weapons Systems are presently being developed in the form of drones, submarines and the like, but automation and robotics are concepts that can be developed and applied in any area. There is the possibility of automation in the nuclear sector as well: Russia, for example, has developed a nuclear underwater drone that can function autonomously. Every sector of defence keeps developing automation, and this points to a threat in the future. The general concern is that autonomous weapons of war should not be built, because they pose a security threat to all countries. In this regard, over 100 founders and CEOs of A.I. firms have signed an open letter to the UN, warning that even missiles and high-speed rockets could be built with this technology.

The United Nations first responded to the 'Campaign to Stop Killer Robots' in 2013, when it announced that an informal meeting of experts would be convened in 2014. The talks between nations yielded no result, and in the following years, 2015 and 2016, the issue was taken up within the UN disarmament framework, where the chair of the talks, Mr. Amandeep Gill, pointed out that robots would not be taking over. The first formal meeting was held on 13 November 2017, and the fact to be appreciated is that this was the first official step. India led the talks on these issues when its representative to the Conference on Disarmament stressed the need for increased systemic controls.

Conclusion

LAWS are a destructive force and pose a threat to mankind in warfare. The reason for condemning them, and for seeking to regulate or ban them, is that they violate humanitarian principles. LAWS are still in their infancy, and as they develop they will become a serious threat in warfare, where civilians and combatants alike are entitled to basic rights and protections. LAWS need to be regulated by the United Nations now, because developed countries have already built and tested them. Their use in warfare must be banned, yet the United Nations has so far only initiated discussions with its members. The purpose of banning LAWS would be to prevent States using these technologies from violating humanitarian principles.

Author

Rana Prithvi
4th Year, B.A. LL.B. (Hons.) at Tamil Nadu National Law University

You can reach Rana Prithvi at ranaprithvi@gmail.com

 
