The use of military and combat robots has traditionally been considered a topic for science fiction. What most people do not realise is that autonomous weapons, capable of engaging targets and using lethal force, are right on our doorstep, if not already a reality.
Combat robots are currently being researched and developed as a possible future means of fighting wars and conflicts of all kinds. Some believe that future wars will be fought by automated weapons systems.
If this is beginning to sound like a fictional dystopia, then you should realise that the future is a lot closer than most people think. Military robots date back to the Second World War and the Cold War, in the form of the German Goliath tracked mines and the Soviet teletanks. You may be surprised to hear that the USSR deployed a series of wirelessly remote-controlled unmanned tanks produced in the 1930s and early 1940s. This adds a whole new perspective to the forced industrialisation under Stalin and to the nature of combat in the Winter War against Finland and at the start of the Nazi-Soviet conflict.
Today, autonomous weapons are capable of accomplishing a mission with limited or no human intervention, and the degree of involvement by a human-in-the-loop is shrinking over time. These systems are capable of self-propulsion and of independently processing and responding to their environment. They also vary in their lethality and capacity to use deadly force: some operate with full independence but are restricted to the core functions of surveillance and reconnaissance.
The most prominent form of autonomous weapon currently in use is the unmanned aerial vehicle (UAV). Commonly referred to as drones, early models were fairly small and were used over distances of a few kilometres for simple intelligence-gathering.
However, the past few years have seen a marked development in the use of armed drones, such as the RQ-1 Predator and the MQ-9 Reaper. These are able to rain down fire from the skies while their operators sit in an ‘air-conditioned trailer’ up to 7,000 miles away at a base near Las Vegas.
If the names sound menacing, it is because these weapons do exactly what it says on the tin, so to speak. They operate in the shadows and are the practical outcome of the “CIA’s decade-old fantasy of using aerial robots.”
In the rush to develop and exploit this autonomous technology, it appears that ethical and legal questions are being brushed aside. A dangerous implication is that it removes one of the key restraints on warfare – the risk to one’s own forces. It seems logical to conclude from this that war will become more likely in the future. There may be a resurgence in inter-state conflict, and these robots could be turned on a population if they were to fall into the hands of a particular kind of leader.
Is it too early to tell?
U.S. forces were engaged in six armed conflicts in 2011 alone. Military experts claim that the campaigns in Iraq, Afghanistan, Libya, Yemen, Somalia and Pakistan would not have been possible without such drones.
In Yemen and Pakistan, the U.S. is using drones to carry out targeted killings of ‘high-profile’ targets. The UN Special Rapporteur has consistently called on the U.S. to explain how it justifies extra-judicial killings and assassinations under current international law.
But this is not a technology confined to the U.S. Britain is also using them, while Israel, too, has used armed drones to launch attacks on the Gaza Strip. Its condemnation of Hezbollah as a group that employs the tactics of terror seems hypocritical in this light.
The decision to intervene in Libya was explained as a moral duty to prevent a massacre of rebel forces in Benghazi. But Italy, too, used drones to eliminate targets from the skies.
A recent defence market report predicted that annual global expenditure on drones would double to $11.5 billion over the next few years. The danger of these weapons seems to be compounded by a NATO inquiry into an attack on a convoy in Afghanistan in February 2010, in which 23 civilians were killed; it reports that civilian casualties were ‘downplayed’ because operators wanted to proceed with the attack.
Largely as a result of this, some analysts have argued that human lives are reduced to pixels on a computer screen. The video feed has been described as incredibly dull, because a drone may patrol a particular area for hours, days or even weeks. Within military circles it has been called ‘Death TV.’
There has been some commentary suggesting that mistakes are bound to be made, and that young operators raised on a diet of video games will find it difficult to distinguish between real people and pixels on a screen. One would hope, though, that an operator would be sufficiently trained, or sensitive enough, to understand the difference between a game such as Grand Theft Auto and a real-life battle zone.
The British Ministry of Defence produced a remarkable report early last year that sounded a clear warning about the ramifications of such weaponry: “We must establish quickly a clear policy on what will constitute appropriate machine behaviour…a significant body of scientific opinion believes in banning autonomous weapons outright…embarking us all on an incremental and involuntary journey towards a Terminator-like reality?” We should listen to that scientific opinion and consider the obvious damage these weapons will do to peace. They are, by definition, weapons after all.
Autonomous weapons should have no place in the future, because they are machines capable of collateral damage and chaos. There have been reports that research is under way to allow a single operator to monitor several drones at once, and to enable drones to “detect and recognise people from their face, gait and shape.” Increased sophistication could result in systematic and surgical destruction.
Most conflicts today are considered ‘wars among the people’ rather than wars between nations, and it follows that in stabilisation operations it is better, and less painful, to use people who are capable of helping the population or addressing a failed state. Autonomous weapons risk impersonal destruction and indiscriminate death, which threatens to breed further insecurity and to hinder the development of infrastructure.