P.W. Singer, fully Peter Warren Singer

1974

American Political Scientist, International Relations Scholar, Specialist on 21st century warfare, Author, Founding Director of the Project on U.S. Policy Towards the Islamic World in the Saban Center at Brookings, Founding Organizer of the U.S.-Islamic World Forum

Author Quotes

Fear Supplements Firepower. When forces do face engagement with child soldier forces, best practice has been to hold the threat at a distance and, where possible, initially fire for shock. The goal should be to maximize efficiency and prevent costly externalities by attempting to break up the child units, which often are not cohesive fighting forces. In a sense, this is the micro-level application of "effects based warfare," just without the overwhelming dependence on high technology. Demonstrative artillery and mortar fires (including the use of smoke), rolling barrages (which give a sense of flow to the impending danger) and helicopter gunship passes have been proven especially effective in breaking up child soldier forces.

It's impressive when you break it [robotics] down into the three different directions that robotics and war are headed in. For one thing, there's the raw numbers, in terms of the use of these robotic systems. We've gone from a handful of drones during the Iraq invasion to more than 7,000 now in the U.S. military inventory. On the ground, we had zero unmanned vehicles before the invasion of Iraq. We now have over 12,000. And this is just the start.

And it's not just an American expansion. It's global. There are 43 countries working on military robotics right now. So you have this just huge... immense growth. The best way to imagine where we're going is to look at what Bill Gates says about robotics. He says, "Robotics are about where computers were in 1980."

Follow-up Yields Success. The defeat of a child soldier-based opposition does not just take place on the battlefield, no matter how successful. A force must also take measures to welcome child soldier escapees and POWs quickly, so as to dispel any myths of retribution and induce others to leave the opposition as well. This also entails certain preparations being made for securing child detainees, something U.S. forces have had no doctrine or training for, even down to not having properly sized cuffs. Once soldiers have ensured that the child does not present a threat, any immediate needs of food, clothing, and/or shelter should be provided for. Then, as soon as possible, the child should be turned over to health-care or NGO professionals. The business of imprisoning juveniles is not the mission of the military and certainly not positive for the health of the organization.

It's the fundamental difference between the bomber pilots of WWII and even the bomber pilots of today. It's disconnection from risk on both a physical and psychological plane. When my grandfather went to war in the Pacific, he went to a place where there was such danger he might not ever come home again. You compare that to the drone pilot experience. Not only what it's like to kill, but the whole experience of going to war is getting up, getting into their Toyota Corolla, going in to work, killing enemy combatants from afar, getting in their car, and driving home. So 20 minutes after being at war, they're back at home and talking to their kid about their homework at the dinner table. So this whole meaning of the term "going to war" that's held true for 5,000 years is changing.

And then you get to the interactivity of these robots. There's incredible work on social robots that can recognize facial expressions and then, in turn, give their own facial expressions. And this is going to continue, because you have Moore's Law going on here, where our systems -- our microchips -- are doubling in their computing power about every two years. And that means that the kind of systems that we have today really are the Model T Ford. They're the Wright Brothers' flyers as compared to what's coming. If Moore's Law holds true, the way it has held true for the last several decades, within 25 years our systems may be as much as a billion times more powerful than today. And so this all sounds like science fiction, and yet it is real right now. It's a technologic reality and a political reality.

For some strange reason, a few people have concerns about super-smart robots carrying machine guns that can shred entire buildings.

Jacques de Vaucanson was born in Grenoble, France, in 1709. At the age of twenty-six, he moved to Paris, then the center of culture and science during the Age of Enlightenment. Inspired by Isaac Newton's idea of the universe as a great clock that had been set in motion by the Creator, the Deist philosophers of the time saw the world as guided by mechanical forces. They believed that everything, from gravity to love, could be understood if you could just scientifically reason it out.

Are USAF attitudes changing on drones? Somewhat, but slowly. The senior leadership, which replaced a previous set who were fired in part for not meeting Gates' push for more Predator flights, is doing better, ramping up buy rates and pilot training for these systems. But it is still somewhat begrudging and not well integrated with the rest of the force. And indeed, there is talk of creating a new air wing that would focus just on counterinsurgency missions. But, of course, the only plane the leadership talked about buying for it was a light, manned, propeller plane (a back-to-the-future mentality), never once exploring why it would be better than the unmanned systems. Within the force, it's mixed as well. There is more and more experience being built up with these systems, so a greater constituency, but you still see resistance. I was recently at the U.S. Air Force Academy, and one of the young officers there was soon to join a Predator squadron. He was sorely disappointed about it, describing how he wanted to fly and instead his new job was "going to be boring." So, while his unit will likely engage in more combat and more missions critical to American national security than almost any other in the force, it is still "boring" to him, and he was the object of his mates' jokes. I told him there was likely some young officer at West Point in 1919 who was equally disappointed to be joining a tank unit instead of the more exciting and prestigious horse cavalry units.

Given the destructive potential of weapons based on triggered nuclear isomers, and the fact that they would have an effect similar to neutron bombs, since the stored energy is released in the form of gamma radiation, killing without creating a great deal of physical destruction, could you get behind a move to ban such weapons? On the other hand, can you imagine any commercial application for triggered nuclear isomers that would be safe - in terms of both protection from the radiation produced and security to prevent the material from being diverted into weapons - and economically viable, given that creating the material in the first place is likely to be energetically inefficient? Powering unmanned, long-duration, high-value assets seems like the perfect match, since neither the radiation nor the fact that it might take 100 kWh in to produce 1 kWh out would be of much concern.

Just like software, warfare is going open source. That is, we're starting to use more and more systems that are commercial, off-the-shelf -- some of it is even DIY. You can build your own version of the Raven drone, which is a widely used military drone, for about $1,000. So we have a flattening of the landscape of war and technology that is just like what happened in software. A wide variety of actors can utilize these systems.

Are we going to let the fact that what is unveiling itself right now in war sounds like science fiction keep us in denial? Are we going to face the reality of 21st century war? Is our generation going to make the same mistake that a past generation did with atomic weaponry, and not deal with the issues that surround it until Pandora's box is already opened up? Now, I could be wrong on this, and one Pentagon robot scientist told me that I was. He said, "There are no real social, ethical, moral issues when it comes to robots. That is," he added, "unless the machine kills the wrong people repeatedly. Then it's just a product recall issue." And so the ending point for this is that we can actually turn to Hollywood. A few years ago, Hollywood gathered all the top characters and created a list of the top 100 heroes and top 100 villains of all of Hollywood history, the characters that represented the best and worst of humanity. Only one character made it onto both lists: the Terminator, a robot killing machine. That points to the fact that our machines can be used for both good and evil, but for me it points to the fact that there is a duality of humans as well. This week is a celebration of our creativity. Our creativity has taken our species to the stars. Our creativity has created works of art and literature to express our love. And now we're using our creativity in a certain direction, to build fantastic machines with incredible capabilities, maybe even one day an entirely new species. But one of the main reasons that we're doing that is because of our drive to destroy each other. And so the question we all should ask: is it our machines, or is it us, that's wired for war?

Groups use all sorts of means to indoctrinate children, be it through brutality, abuse, forcing them to take drugs, political training, watching films of violence, you name it. Many are the same that armies and rebel groups use, but when applied to children are clearly abuse. Children also develop all sorts of coping mechanisms, such as giving themselves "jungle names" (calling themselves by some sort of nickname like "commander killer" or "blood never dry") that both sound fearsome and also disassociate themselves mentally from the horrors of war.

Mankind's 5,000-year-old monopoly on the fighting of war is breaking down in our very lifetime. I spent the last several years going around meeting with all the players in this field, from the robot scientists, to the science fiction authors who inspired them, to the 19-year-old drone pilots who are fighting from Nevada, to the four-star generals who command them, to even the Iraqi insurgents whom they are targeting, and what they think about our systems. What I found interesting is not just their stories, but how their experiences point to these ripple effects that are going outwards in our society, and our law, and our ethics, et cetera. And so what I'd like to do with my remaining time is basically flesh out a couple of these. So the first is that the future of war, even a robotics one, is not going to be purely an American one.

As Anthony Lewis once wrote, "our military technology is so advanced that we kill at a distance and insulate our consciences by the remoteness of the killing."

Hafnium 178 is one of the isotopes into which energy can be stored by the creation of an isomer, and from which that energy can be released, either in a controlled manner or all at once. [re Hafnium Bomb]

Many, including nearly every roboticist I met while writing this book [Wired for War], hope that these new technologies will finally end our species' bent toward war. The fear among soldiers is the very opposite of the scientists' hope: they worry that war is disappearing.

As anyone who's played Grand Theft Auto knows, we do things in the video world that we wouldn't do face to face. So much of what you're hearing from me is that there is another side to technologic revolutions, and that it is shaping our present, and maybe will shape our future of war. Moore's Law is operative, but so is Murphy's Law. The fog of war isn't being lifted. The enemy has a vote. We're gaining incredible new capabilities, but we're also seeing and experiencing new human dilemmas. Now, sometimes these are just "oops" moments, which is how the head of a robotics company described it: "You just have oops moments." Well, what are oops moments with robots and war? Well, sometimes they're funny. Sometimes they're like that scene from the Eddie Murphy movie Best Defense, playing out in reality, where they tested out a machine-gun-armed robot, and during the demonstration it started spinning in a circle and pointed its machine gun at the reviewing stand of VIPs. Fortunately the weapon wasn't loaded and no one was hurt. But other times oops moments are tragic, such as last year in South Africa, where an anti-aircraft cannon had a "software glitch" and actually did turn on, and fired, and nine soldiers were killed. We have new wrinkles in the laws of war and accountability.

Historical experience has demonstrated a number of effective methods to handle situations when professional troops are confronted by child soldiers.

Moore's law explains how and why we have entered a world in which refrigerator magnets that play Christmas jingles have more computing power than the entire NORAD nuclear defense system had in 1965.

As you note, a particularly pernicious characteristic of the phenomenon is its potential to ruin the lives of children and, in doing so, lay the groundwork for future conflicts that harm society writ large. The challenge we face is therefore how to reverse the effects of the doctrine and, in doing so, restore the children's future. Healing takes not one step; it is a process. It involves disarming and demobilizing the children, an arduous process of rehabilitation, and then capping the transition back to childhood through reintegration with their families and communities.

However, these were the exceptions to what the rule used to be, that children had no place in war. Throughout the last four thousand years of war as we know it, children were never an integral, essential part of any military forces in history. But the rules of war have changed. The participation of children is now not a rarity, but instead a growing feature of war.

My hope in writing it [Children of War] was three things: 1) When you come across an issue like this, you are compelled to tell the stories of tragedy and bravery that too few are aware of. 2) But I hoped to do so in a manner that didn't just evoke empathy, but also led to understanding. That is, only by understanding the causes and dynamics of this appalling phenomenon can we develop appropriate responses to it. 3) Finally, I wanted to move this issue beyond just heartbreak and show how the supposedly soft issue of children is actually becoming a hard security issue. That is, by looking at this through a new lens, we see that we have not just a moral obligation not to shirk or be "stingy," but also a strategic mandate to act to ensure our own security, something that is important in a period when security is the new currency in political dialogue... I think it gets too little media coverage, because it is a difficult issue that takes place often far away. The children are both victims and perpetrators of violence, usually fighting in messy, complex wars. It's hard for media to cover that with its minimal international coverage in this day and age.

At a broader level, governments that want to stay ahead of the issue should mobilize the United Nations, as well as local political leaders and religious experts to condemn the practice for what it is, a clear violation of international law as well as local cultural and religious norms.

I think that we are going to have to turn back to the original pillars of the laws of war that are supposed to guide whether a system can be judged as legal or not, such as its ability to discriminate between lawful and unlawful targets, etc. We have been ignoring them with our latest generation of technology. But there is a last one of these pillars I sense we may soon hear more of: whether society finds them objectionable in some way. This is why blinding lasers, for example, are banned, and it was also the driver behind the ban on bioweapons, even though they might be militarily useful. But again, this is why we need to engage with the technology, rather than wait until after the fact, akin to what happened with land mines. As well, it is critical under the law that we ensure accountability, that there is a clear chain of responsibility for who to turn to when things go awry. This is something that many roboticists don't want to talk about (and one recently got angry with me for even suggesting it in the book), but I think it is imperative here. If we can't establish that clear accountability, which I believe extends from the user in the field all the way back to the inventor, so that we can find out just exactly where things went wrong, rather than putting all the blame on the end user, then again, the system may not be one we ought to use.
