Frankie Medina
Dr. Steven Wexler
English 654
11 December 2012
Evolution of War and the Remediation of the Human Soldier
War is all around us. History has chronicled it, and through that chronicle we see that war has evolved. Its pace has quickened: target locations can be reached sooner, and troops can be deployed more quickly. Primitive warfare has given way to modern tactics. War is no longer restricted to one's immediate environment; it now reaches across oceans to the other ends of the globe. Advances in technology have allowed for the incorporation of newer, more sophisticated weaponry, transport, and communications. War has also evolved to a point at which specific governing bodies decide the most logical and effective means of military operation, and these governing bodies may be not only local but also international, e.g., the United Nations. Much of war has changed through time, but one element, up until now, has remained static: the human. The human's involvement may vary quantitatively or in its voluntariness, but war has always involved some measure of the human element. This may be changing. Advances in robotic technologies may soon do away with the need for, or reliance on, the human element, at least as far as literal combat is concerned. Although this has not yet come to fruition, it appears we may soon witness a remediation of the human soldier.
In order to truly examine the evolution of war and the possibility of the absence of the human element, a history of warfare must be understood. Quincy Wright, in his 1942 book A Study of War, recognizes three key types of warfare: primitive, historic, and modern. "Primitive peoples," insists Wright, "only rarely conduct formal hostilities with the object of achieving a tangible economic or political result" (58). This contradicts what many would view as the driving forces behind many U.S. military operations today. If not driven by economics or politics, what does drive primitive war? Wright lists the drives: food, when pasturage is short; sex, when non-members of the group breach the sex mores; defense of territory; activity, or war as sport; self-preservation, when no means of escape presents itself; and society, the maintenance of social solidarity (75-8). The weapons of primitive man are "confined to arm-, foot-, or mouth propelled instruments," e.g., the spear, bow, and sword (Wright 81). Mobility is "limited to hands and feet" (Wright 81). It is very clear that primitive man, when compared to the modern soldier of war, is "lacking" (Wright 82).
Historic war occupies the time "within or between the literate civilizations from Egypt and Mesopotamia down to the age of discovery in the fifteenth century" (Wright 101). Historic war, contrary to primitive war and its want of social solidarity, "functioned to promote change rather than stability and…to disintegrate rather than to integrate civilization" (Wright 125). The drives of historic warfare are the same as those of primitive peoples, but "their relative importance has been very different" (Wright 131). Food and sex are of less importance; dominance and independence have become operative; and self-preservation has become less of a worry, as civilized man is protected by political and legal institutions (Wright 131, 138). Advances in weaponry and tactics may be seen in the development of helmets and armor; the application of mechanical elasticity, torsion, and momentum in siege instruments; the integration of animal aid, i.e., the horse; fortification; and the presence of a specialized military class (Wright 144-7).
The fifteenth and sixteenth centuries brought us into modern warfare, whose drives are understandably more recognizable to us today: politics, as politicians take interest in war as a means to maintain or augment power; economics, as economic gain can be achieved by selling war supplies, securing advancement in the military profession, and contributing various services to the conduct of war; culture, as societies believe war to be the appropriate response to breaches of mores; and religion, as crusading for socially approved symbols (Wright 278-88). Weapons and tactics changed with the advent and development of the gun and the airplane, the professionalization of armies, the use of steam power for land and water military transport, the armored vessel, and the use of mines and submarines (Wright 293-302). Because many of us are familiar solely with the modern warfare of the present time, it is difficult for us to comprehend the characteristics of primitive and historic warfare; all one has to go on is the retelling of these characteristics through film and literature. Still, the one characteristic we may grasp more completely than all others is the one which still continues to develop: mechanization.
In each instance of war discussed so far, a present human element was required to wield weaponry and enact military tactics. With the advancement of robotics and mechanization, this may no longer be the case. Robert Martinage, senior defense analyst at the Center for Strategic and Budgetary Assessments, comments, "'In the Gulf War, we had smart weapons. Now, increasingly, we are fielding brilliant weapons'" (Martin 67). Brilliant is without a doubt preferred to smart. The rhetoric alone instills the hope that fewer civilian casualties due to miscalculation are possible, that the end result may be achieved with little to no resistance, and that the tactics of war continue to evolve even at a point when all wartime necessities and wants are possibly met. Still, the term "brilliant" is applied to a non-human entity. The question arises: with brilliance now a characteristic of weaponry, will the human become obsolete or unnecessary? Will a human presence cease to be required to wield military weaponry? Is brilliance enough to allow for fully autonomous operation? Of course we cannot answer these questions today, but developments in specific military technologies may soon lead to the possibility of answering yes to each of them. Then again, even if we do not achieve completely autonomous operation in weaponry, we can be sure of one thing: the human soldier, when incorporated with such technologies, will become remediated.
Raytheon Company is currently developing a robotic suit destined for military use. The suit, named the Exoskeleton (XOS 2), "will help with the many logistic challenges faced by the military both in and out of battle" (TechNewsDaily.com). The Exoskeleton allows its wearer to repeatedly lift 200 pounds without tiring and to punch repeatedly through three inches of wood. That the XOS 2 uses 50% less power than its predecessor, the XOS 1, and uses hydraulic power more efficiently shows that technology of this nature is not only present but being improved and sought after. It is hard to argue why it should not be developed: the Exoskeleton can do the work of two or three soldiers and relieve the stress caused by heavy lifting, yet it is still graceful enough to kick a soccer ball (TechNewsDaily.com). As of today the Exoskeleton requires a "pilot" to maneuver the suit and has a 40-minute battery life, but once these problems are solved there is no reason to assume the Exoskeleton will not soon be "striding across the battlefield like an armored, missile-launching gazelle" (Zimbio.com).
This is an extraordinary thought: an unmanned, armored, and armed entity deployed by the military to subdue combatants. One's thoughts may go immediately to the belief that human loss will become almost unheard of, since the absence of the human element will allow for only the loss of machinery rather than human life. This is encouraging, especially with the development of these brilliant weapons held by multiple militaries, not just the U.S. Yet how will the unmanned robot act? Will it act like a human or a machine? Will it know its objectives? Will it recognize friend from foe? Is the risk of failure worth the gain of success? These questions, which revolve around morals, ethics, and social acceptance of the technology, must be answered.
Alan Turing and Hans Moravec both developed tests to probe the relation between man and machine. Turing, author of "Computing Machinery and Intelligence," devised a test which places a human subject in front of a computer terminal. Using this terminal, the subject is to "communicate with two entities in another room," neither of which can be seen. The subject's job is to "pose questions that can distinguish verbal performance from embodied reality" and to decide which of the entities he or she is communicating with is man and which is machine. If the subject cannot correctly distinguish between the two, Turing's hypothesis that "machines can think" is borne out (Hayles xi). The successor to the Turing test is the Moravec test, which aims to prove "that machines can become the repository of human consciousness-that machines can, for all practical purposes, become human beings" (Hayles xii).
A machine thinking; a machine becoming a human being. How is this possible? To aid in answering this question, it may be beneficial to look at the definition of information formalized by Claude Shannon and Norbert Wiener, which "conceptualized information as an entity distinct from the substrates carrying it." N. Katherine Hayles goes on to comment that "[f]rom this formulation, it was a small step to think of information as a kind of bodiless fluid that could flow between different substrates without loss of meaning or form" (xi). This led to Moravec's proposition that "human identity is essentially an informational pattern rather than an embodied enaction" (Hayles xii). With information an independently sustaining entity, and human identity nothing more than a pattern of information, it is easy to understand the claim that "embodiment is not essential to human being" (Hayles 4).
If we are to take these theories as true, then why can we not assume that robotic machinery, such as the previously mentioned Exoskeleton, can perform the tasks of a human soldier while processing any logic that accompanies those tasks? It would appear that all that needs to be done to achieve a thinking, autonomous machine is the transfer of information; and if information can be passed between substrates without change in form, this should be quite feasible. Speaking to this, Hayles reminds us that "for information to exist, it must always be instantiated in a medium" (13). While this throws a wrench into the theory of information and materiality as distinct entities, technological advances in robotics justify the hope, dream, or fantasy that information embedded in an autonomous machine will allow it to become human, or at least close to it.
While many will agree with Hayles that information and materiality cannot function as distinct entities, it is interesting to continue the discussion and suppose that autonomous machinery will be accepted in the military as a replacement for, or remediation of, the human soldier. Colin Allen and Wendell Wallach, co-authors of Moral Machines: Teaching Robots Right from Wrong, respond to P. W. Singer's article "Robots at War: The New Battlefield." Their focus is on Singer's failure to "mention the possibility of using artificial intelligence to mitigate ethical problems" (6). Allen and Wallach do not believe it possible to "create such machines" with "moral decision-making faculties" (6), and without the ability to distinguish between right and wrong, they state, "autonomous robots are a bad idea" (6). David Axe, author of War Bots, responds to the same Singer article, in which Singer does acknowledge the danger of replacing "thinking, feeling soldiers with emotionless robots…but [also] describes military circles as enthusiastic about this new technology"; Axe counters that "as much as ever, a young infantryman with a rifle, two hands, two eyes, and a brain is the single most important and powerful weapon in an army's inventory" (6). William O. Waddell, Director of the Command and Control Group at the Center for Strategic Leadership, U.S. Army War College, adds to the discussion by questioning the ethics of robots in war. Waddell insists robotic technology "involves the usurpation of decision-making processes," and goes on to ask, "what message do we send by dispatching robots to fight humans?…How do these new technologies fit in with existing legal and ethical codes?" He also questions "the future availability of these technologies to any nation or group that could afford them" (7).
It is quite clear that a heightened sense of skepticism accompanies the thought of robotic technology in war, and this alone may be enough to slow its implementation in combat. Federico Pistono, author of Robots Will Steal Your Job, But That's OK: How to Survive the Economic Collapse and Be Happy, comments that a technology may be slowed by non-acceptance, although this only hinders the inevitable. It is therefore safe to assume that the implementation of robotic technology in the military, and thus the replacement and remediation of the human soldier, is only a matter of time.
Now that we have covered, albeit briefly, human feelings about robotic technology in war, perhaps it would be interesting to theorize how the robotic machines themselves would feel. In his short work "The Feelings of Robots," Paul Ziff considers the possibility of "attribut[ing] feelings to the machine and so blur[ring] the line between man and machine" (64). This is quite similar to the theories of Turing and Moravec discussed previously: if human identity is nothing more than an informational pattern, and information can be passed unchanged between substrates, then machines should be able to be made to feel. To this Ziff says no. Ziff argues, "Only living creatures can literally have feelings" (64). Because, according to Ziff, "robots are not persons," a robot "may kill but not literally murder" (65). Ziff states that "there are no psychological truths about robots," and that is why "no robot could be sensibly said to feel anything" (67). A robot acts exactly as it is "programmed" to act (68). This may be a benefit in war. If a robot feels exactly as it is made to feel, it will never question authority, orders, purpose, or ethics. The robot will never feel fear or remorse. And because robots are "replaceable," there is theoretically no end to the quantity of robots that may be placed at the military's disposal.
A seemingly endless supply of identical, obedient, unfeeling machines capable of relieving the human soldier of the risk of death is an amazing thought. Imagine the quantity of mechanical soldiers that could be placed on the front lines: numbers perhaps unattainable with human soldiers, given the voluntary nature of our military. At a 2007 Military History Symposium, Major General (Retired) Robert H. Scales, Jr. commented on the need for more infantry:
But as I said, I think, increasingly, as you move to the small units, it’s probably
more important to focus on the human. What do we have to do to fix the problem in
this new age of infantry?
Number one is make more of them. If you put every infantryman in this nation in
one stadium…they will not fill FedEx stadium. We have more first line Air Force and
Navy fighter aircraft, costing between $50 and $450 million apiece than we have
infantry squads. 2,475 if you want to know the number. So we just need more. (258-9)
Of course, Major General Scales is most likely speaking of an increased number of human infantry, but nonetheless the need for more is made apparent: the state of the infantry needs to improve quantitatively. Technological advances in robotics may allow autonomous machinery to fill that need.
The technology is still in development, and once development is complete the technology must be accepted into combat situations. This acceptance will be based on many things, with morals and ethics chief among the deciding factors. Will it benefit military operations to replace the human soldier with a remediated form of itself? If the Exoskeleton becomes capable of operating without a pilot and of receiving information passed from a human source, what then would be the limits of this technology? It is hard to say. Jay David Bolter and Richard Grusin point out that "technological limitations simply point to its great potential" (22). If this is true, the possibilities of robotic technology are boundless, so much so that it may raise the fear of "humans [being] displaced as the dominant form of life on the planet by intelligent machines" and thus the belief that "the age of the human is coming to a close" (Hayles 283).
Works Cited
Allen, Colin, Wendell Wallach, David Axe, William O. Waddell, and Alex Roland. “Robots at
War.” The Wilson Quarterly 33.2 (2009): 6-7, 9. JSTOR. Web. 27 Nov. 2012.
Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge: The
MIT Press, 2000. Print.
Brooks, Michael G., and Kendell D. Gott, eds. Warfare in the Age of Non-State Actors:
Implications for the US Army. Fort Leavenworth: Combat Studies Institute Press, 2007.
Print.
Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and
Informatics. Chicago: The University of Chicago Press, 1999. Print.
Lowry, Ritchie P. “To Arms: Changing Military Roles and the Military-Industrial Complex.” Social
Problems 18.1 (1970): 3-16. JSTOR. Web. 4 Nov. 2012.
Martin, Randy. An Empire of Indifference: American War and The Financial Logic of Risk
Management. Durham: Duke University Press, 2007. Print.
Pistono, Federico. Interview by Phil Bowermaster. Transparency Revolution. Feb. 2012.
YouTube. Web.
Wright, Quincy. A Study of War. 2nd ed. Chicago: The University of Chicago Press, 1965. Print.
Ziff, Paul. “The Feelings of Robots.” Analysis 19.3 (1959): 64-8. JSTOR. Web. 4 Nov. 2012.