Indiscriminate by Nature, or the principle prohibiting indiscriminate attacks, is found in Article 51 paragraph (4) letters (b) and (c) of Additional Protocol I of 1977. Letter (b) states that "indiscriminate attacks are attacks in which the means or methods cannot be directed at a specific military objective," and letter (c) states that "indiscriminate attacks are attacks in which the effects of the means or methods used in an armed conflict cannot be limited as required by this Protocol." The way AWS works has in fact sparked debate in the international community and is currently under discussion in the UN General Assembly's Disarmament and International Security Committee (First Committee). In line with the First Committee's view, AWS can satisfy this principle: the way AWS works gives it a high level of accuracy in determining targets, reaching 98.5%, so as long as an AWS is not equipped with indiscriminate munitions such as chemical or biological weapons, it can be said to accord with the principles of IHL. The book Army of None: Autonomous Weapons and the Future of War reports that modern AWS is equipped with advanced target-recognition systems that can distinguish between combatants and non-combatants with an accuracy rate of 98.5%, far exceeding the average ability of human soldiers in intensive combat situations. Looking at the implementation of AWS itself, there is already some evidence of AWS being used successfully without violating the principles of international humanitarian law. First, the Phalanx CIWS (Close-In Weapon System) and SeaRAM (Sea Rolling Airframe Missile) from the United States are ship-defense systems that work automatically to shoot down enemy missiles and aircraft threatening warships. The Phalanx CIWS was developed by General Dynamics in the 1970s and is now used in more than 25 countries.
According to a Naval News report of August 12, 2023, the system has been used by the US Navy in the Red Sea to counter the threat of drones and ballistic missiles. Given their purely defensive function, Phalanx and SeaRAM do not violate the principle of proportionality in IHL, because they do not attack humans directly and act only when the ship is in danger. Second, the Maritime Mine Counter Measures (MMCM) system owned by the UK is another example of AWS used for peaceful and humanitarian purposes. The system was developed by Thales Group in collaboration with the UK and French Ministries of Defence to detect and neutralize dangerous sea mines. According to a UK Ministry of Defence report of June 5, 2022, this AWS was tested in the waters of the British Gulf and successfully identified and eliminated more than 50 active mines in a joint NATO mission. The existence of MMCM is crucial to maintaining maritime security, especially for merchant ships and fishermen, who are often victims of sea mines left over from past conflicts.

Principles of International Humanitarian Law that Contradict AWS

For all the advantages of AWS, let us now turn to the controversial aspects of its application. One fundamental principle of International Humanitarian Law is the Principle of Distinction, found in Article 51 paragraphs (1)-(3) of Additional Protocol I of 1977, which requires the military to distinguish between combatants and non-combatants. A real example of the negative impact of AWS technology in warfare is the US drone strike in Afghanistan in 2021, which killed 10 civilians, including 7 children, owing to misidentification of the target by the automated system. This case shows that even with advanced technology, artificial intelligence still has limitations in complex decision-making. Next is the principle of Military Necessity, found in Article 52 paragraph (2) of Additional Protocol I of 1977, which defines legitimate military objectives: an attack on a target must offer a definite military advantage while minimizing the losses suffered by civilians. Correlated with the use of AWS, it can be said that it is difficult for AWS, even when equipped with AI, to fulfill this principle. As Dr. Ronald Arkin, a war-robotics expert at Georgia Tech, has stated, designing such an algorithm will pose very specific difficulties for the developers of this technology, because the available data and technology are still insufficient. In sum, given all the risks and the reservations voiced by experts in the law of war, it is unrealistic to continue approving the use of AWS, because it is dangerous and contrary to the principles of international humanitarian law itself.
In international humanitarian law, the principle of proportionality is likewise a fundamental rule: military attacks must weigh the military advantage gained against the potential civilian casualties and infrastructure damage caused. This principle is explicitly regulated in Article 51 paragraph (5) letter (b) and Article 57 paragraph (2) letter (a) points 1-3 of Additional Protocol I to the 1977 Geneva Conventions, which require maximum precautionary measures before an attack is carried out. Even when equipped with AI, AWS is not adequate to handle complex judgments, especially the weighing required by proportionality. The conflict between Palestine and Israel in the Gaza Strip offers a case study. According to a New Scientist report, the Israel Defense Forces (IDF) allegedly used AI drones to identify and strike targets in Gaza, causing the deaths of 248 Palestinians. The weapons used in these attacks have AWS characteristics: an AI system was used to analyze, select, and attack targets without direct human involvement. As a result, the attacks were considered to violate the principle of distinction in international humanitarian law, because they targeted areas with high civilian density. This case demonstrates that AWS has the potential to disregard the moral and legal aspects of warfare and to increase the risk of disproportionate attacks.

Conclusion

The pros and cons surrounding Autonomous Weapon Systems (AWS) reflect the complexity of the relationship between the development of weapons technology and the principles of International Humanitarian Law (IHL). In the context of jus ad bellum, the main issue lies in the legality of using AWS as part of the right to wage war. Meanwhile, in the aspect of jus in bello, the main challenge is ensuring that AWS operations do not violate fundamental IHL principles such as distinction, proportionality, humanity, and military necessity. Thus, the legality of using AWS within the framework of international law is determined not by the existence of the technology alone, but by the extent to which that technology is designed, operated, and strictly supervised so that it remains within the corridor of law and humanitarian values. Therefore, AWS should be seen neither as a threat nor as a solution, but as a legal phenomenon that demands the formation of comprehensive, precaution-based international regulation.