The Ninja missile does not rely on an explosive warhead to destroy or kill its target. It uses the speed, accuracy and kinetic energy of a 100-pound missile fired from up to 20,000 feet, armed with six blades which deploy in the last moments before impact.
The Ninja missile is the ultimate attempt – thus far – to accurately target and kill a single person. No explosion, no widespread destruction, and no deaths of bystanders.
But other weapon developments will also affect the way we live and how wars are fought or deterred. Russia has invested heavily in these so-called super-weapons, building on older technologies. They aim to reduce or eliminate technological advantages enjoyed by the United States or Nato.
Russia’s hypersonic missile development aims are highly ambitious. The Avangard missile, for example, won’t need to fly outside the earth’s atmosphere. It will remain within the upper atmosphere instead, giving it the ability to manoeuvre.
Such manoeuvrability will make it harder to detect or intercept. China’s DF-17 hypersonic ballistic missile is similarly intended to evade US missile defences.
The autonomous era
At a smaller scale, robot dogs with mounted machine guns are emerging on the weapons market. The weapon development company Sword International took a Ghost Robotics quadrupedal unmanned ground vehicle – or dog robot – and mounted an assault rifle on it. It was one of three robot dogs on display at a US army trade show.
Turkey, meanwhile, is claiming it has developed four types of autonomous drones, which can identify and kill people, all without input from a human operator, or GPS guidance. According to a UN report from March 2021, such an autonomous weapon system has been used already in Libya against a logistics convoy affiliated with the Khalifa Haftar armed group.
Autonomous weapons that don’t need GPS guidance are particularly significant. In a future war between major powers, the satellites which provide GPS navigation can be expected to be shot down. Any military system or aircraft which relies on GPS signals for navigation or targeting would then be rendered ineffective.
The USA, China, India and Russia have developed weapons to destroy satellites which provide global positioning for car sat-nav systems and civilian aircraft guidance.
The real nightmare scenario is combining these, and many more, weapon systems with artificial intelligence.
New rules of war
Are new laws or treaties needed to limit these futuristic weapons? In short, yes – but they don’t look likely. The US has called for a global agreement to stop anti-satellite missile testing, but there has been no uptake.
The closest to an agreement is the signing of NASA’s Artemis Accords. These are principles to promote peaceful use of space exploration. But they only apply to “civil space activities conducted by the civil space agencies” of the signatory countries. In other words, the agreement does not extend to military space activities or terrestrial battlefields.
In contrast, the US has withdrawn from the Intermediate-Range Nuclear Forces Treaty. This is part of a long-term pattern of withdrawal from global agreements by US administrations.
Lethal autonomous weapon systems are a special class of emerging weapon system. They incorporate machine learning and other types of AI so that they can make their own decisions and act without direct human input. In 2014 the International Committee of the Red Cross (ICRC) brought experts together to identify issues raised by autonomous weapon systems.
In 2020 the ICRC and the Stockholm International Peace Research Institute went further, bringing together international experts to identify what controls on autonomous weapon systems would be needed.
In 2022, discussions are ongoing between countries the UN first brought together in 2017. This group of governmental experts continues to debate the development and use of lethal autonomous weapon systems. However, there has still been no international agreement on a new law or treaty to limit their use.
New rules for autonomous weapon systems
The campaign group Stop Killer Robots has called throughout this period for an international ban on lethal autonomous weapon systems. Not only has that not happened, there is an undeclared stalemate in the UN’s discussions on autonomous weapons in Geneva.
Australia, Israel, Russia, South Korea and the US have opposed a new treaty or political declaration on lethal autonomous weapon systems. Opposing them at the same talks, 125 member states of the Non-Aligned Movement are calling for legally binding restrictions. With Russia, China, the US, the UK and France all having a UN Security Council veto, they can prevent such a binding law on autonomous weapons.
Outside these international talks and campaigning organisations, independent experts are proposing alternatives. For example, in 2019 the ethicist Deane-Peter Baker brought together the Canberra Group of independent international experts. The group produced a report, Guiding Principles for the Development and Use of Lethal Autonomous Weapon Systems.
These principles don’t solve the political impasse between superpowers. But if autonomous weapons are here to stay, they are an early attempt to understand what new rules will be needed.
When Pandora’s mythical box was opened, untold horrors were unleashed on the world. Emerging weapon systems are all too real. Like Pandora, all we are left with is hope.
Peter Lee is a Professor of Applied Ethics in the Faculty of Creative and Cultural Industries and is Director of the Security and Risk research theme.
This article is republished from The Conversation under a Creative Commons Licence.
Read the original article.