Revisiting America’s War of Choice in Iraq
Wars are fought not only on the battlefield but also in domestic political debates and in histories written after the fact. In the case of the US invasion of Iraq 20 years ago, we are still in this final phase, seeking an elusive consensus about the war’s legacy.
NEW YORK – One advantage that historians have over journalists concerns time, not so much in the sense that they are free from urgent deadlines, but that they have the deeper perspective conferred by the years – or decades – between events and the act of writing about them. Twenty years is not a lot of time in historical terms, of course. But when it comes to understanding the war that the United States launched against Iraq in March 2003, it is all we have.
Not surprisingly, even two decades after the war began, there is no consensus regarding its legacy. This is to be expected, because all wars are fought three times. First comes the political and domestic struggle over the decision to go to war. Then comes the actual war, and all that happens on the battlefield. Finally, a long debate over the war’s significance ensues: weighing the costs and benefits, determining the lessons learned, and issuing forward-looking policy recommendations.
The Decision to Intervene
The events and decisions that led the US to go to war in Iraq remain opaque and a matter of considerable controversy. Wars tend to fall into two categories: those of necessity and those of choice. Wars of necessity take place when vital interests are at stake and no other viable options exist to defend them. Wars of choice, by contrast, are interventions initiated when the interests at stake are less than vital, when options other than military force could be employed to protect or promote those interests, or both. Russia’s invasion of Ukraine was a war of choice; Ukraine’s armed defense of its territory is one of necessity.