Musharraf, Mukasey, and Checks and Balances

Published in the Maryland Daily Record November 26, 2007

This month, the lawyers of Pakistan have been in the streets, and closer to home the Senate has agonized over the Michael Mukasey nomination for Attorney General. The protests of the Pakistani lawyers and the misgivings of the U.S. Senate, it is apparent, address similar problems.

The laws of Pakistan ordain a separation of powers and presidential term limits. They do not permit Pervez Musharraf, the Pakistani president, continued control both of the armed forces and of the presidency. Had constitutional processes continued without interruption, he might have retained command of neither. And the courts, claiming their own measure of sovereignty under a system of checks and balances, had served notice on him that they intended to enforce those processes. So he had closed down the courts, dissolved the legislature, and interfered with the electoral machinery.

Pakistan’s lawyers had gone into the street to protest, with waves of demonstrations that seemingly cooled only when he had beaten and arrested thousands of them, lodging patently phony charges, the most outrageous arising under the anti-terrorism laws, which (shades of Guantanamo) deny them access to the courts to clear their names.

Pakistan faces multiple forms of distress, but the heart of its angst is doubtless caused by the fork in the legal road at which it stands. Either it will be a nation of laws, where the courts can serve as a check on the executive, or it will not. Musharraf’s test of wills with the fledgling legal establishment in his country will probably decide this question for a generation. If he wins, the constitutional machinery of Pakistan won’t be worth much.

He probably knows this, but probably believes that in the welter of threats and difficulties Pakistan faces, the continuation of his presidency is of greater importance than the continuation of the rule of law. He is almost certainly wrong. A willingness to submit to the checks and balances of divided government and to permit succession in keeping with constitutional mechanisms, especially those mechanisms which place real limits on the powers of the head of state and/or require him to yield place to a successor, gives every side a reason not to resort to arms and insurrection. A successful resistance by Musharraf to such laws will confirm to the regime’s various opponents the futility of awaiting their legitimate turn. No wonder the lawyers are taking a stand.

Closer to home, the Mukasey nomination came close to foundering on the question of waterboarding. Ostensibly a narrow issue (unofficial sources reportedly say it has been used only three times in the “war on terror”), it is a stand-in for a creeping constitutional crisis in our land, also centered on checks and balances. The constitution, laws, and treaties of the land clearly outlaw waterboarding. Mukasey’s unwillingness to acknowledge this for the record was transparently motivated by an awareness that if he did so, he might very well provide an opening for the U.S. judiciary and Congress to take actual corrective action. And though Mukasey stated he found the practice repugnant, he would not allow his answers to become a starting point for accountability for the administration that had appointed him. Mukasey likewise refused to say that all warrantless wiretapping was unconstitutional.

Mukasey secured his confirmation by issuing a pledge to Senators Schumer and Feinstein that if Congress passed a law specifically outlawing waterboarding, he would see that the administration enforced it. He also made a number of remarks that signaled that as a former assistant U.S. attorney and federal judge he understood the importance of institutional independence for the Department of Justice – an independence the Senate knows DOJ needs to restore.

As Senator Schumer summarized in the New York Times: “The department has been devastated under the Bush administration. Outstanding United States attorneys have been dismissed without cause; career civil-rights lawyers have been driven out in droves; people appear to have been prosecuted for political reasons; young lawyers have been rejected because they were not conservative ideologues; and politics has been allowed to infect decision-making.” To address such concerns, Mukasey assured the Senate that partisan politics would not be allowed to influence the bringing or timing of charges, that he viewed protecting civil liberties as vital to national security, and that in his understanding, the President does not stand above the law.

In other words, Mukasey claimed he had imbibed the traditional institutional values of the Justice Department, as his predecessors Alberto Gonzales and John Ashcroft had so conspicuously and appallingly not done.

Traditionally, Justice exhibits a strong culture of competence, ethics, insulation from partisan politics, and adherence to the rule of law – all things which have at times required the Department to act at odds with a White House to which it nominally reports, but which at times (for the last seven years for instance) has exhibited none of these things. That culture has had its downside; its members and even its alumni often teem with a sense of self-righteousness and an arrogance that can make them inflexible and personally insufferable. But it is a culture vitally worth preserving nonetheless – a sort of internal executive check and balance of its own.

In Bush, though, DOJ has encountered a president with an MBA rather than a law degree, and innocent of any legal acculturation whatsoever. To him, those institutional values and the independence they fostered were just a business obstacle to overcome.

In the end the Senators’ desire to see DOJ independence and professionalism restored trumped their desire to challenge other forms of Bush resistance to the constitutional order.

We can expect Mukasey to improve the battered morale of his agency. But will Mukasey live up to his Congressional puffery and stare down his boss? Well, after less than a week in office, he had already urged the president to veto a bill that would endow the FISA court with the exclusive power to authorize intelligence wiretaps. When a former federal judge resists a judicial check on executive powers, it is a very bad sign.

Mukasey’s resistance to checks and balances is the same as Musharraf’s. Musharraf has repeatedly justified his “state of emergency” as necessary to the security of his state, just as Bush has justified warrantless wiretaps, torture, Guantanamo, etc., etc. as necessary to resist Islamic terrorism. The unifying thrust of all the secret legal memos the administration’s lawyers have generated over the last few years in defense of its gallery of horrors has been a reliance on presidential powers to defend the country.

Mukasey’s November 14 letter about the FISA bill is of a piece with the worst of the Gonzales sophistry. The prerogatives of the courts and the Congress have been continually treated as if Article II of the Constitution, setting forth inter alia the President’s powers to defend the nation, were an amendment that superseded Articles I and III establishing Congressional and judicial powers, including checks and balances. But Articles I-VII were all passed at once; reading Article II as if it were an amendment to the rest is bad textualism and worse government.

Our ancient and strong system of checks and balances will probably survive Bush and Mukasey; Musharraf has assured that Pakistan’s new and weak system of checks and balances will most likely fail.

Copyright (c) Jack L. B. Gohn


Intelligent Design Revisited

Published in the Maryland Daily Record October 29, 2007

 

          I wrote about Intelligent Design theory here in May of 2004.  In that column, I said, in essence, that Intelligent Design (“ID”) seemed good enough science to receive some consideration in the curriculum, and that critics who insisted on equating it with Creationism (obviously not science at all) were missing the point.  In December 2005, Judge John Jones of the Middle District of Pennsylvania issued an encyclopedic ruling in Kitzmiller v. Dover Area School District, 400 F.Supp.2d 707, which addressed almost every conceivable issue about ID, including the ones I had raised.  The case arose from the efforts of a school district to require biology teachers to read a statement expressing skepticism towards pure natural selection and encouraging consideration of ID.  Jones struck down that effort on First Amendment grounds.

 

          I’ve been wanting to revisit the subject in light of Kitzmiller ever since, though as my readers know, I’ve been kind of busy with other questions.  But my desk is now cleared. 

 

          Turns out some of the things I thought were open issues really weren’t, most notably the extent to which ID was passable science.  ID, in case you’ve tuned in late, is the theory that we can best account for “irreducible complexities” that appear in the course of evolution by inferring the activity of an intelligent designer, most likely God.  Irreducible complexity would exist when an organism manifested a combination of features which seemed too complicated to have arisen simultaneously as mere products of random mutation. 

 

          As the Kitzmiller trial revealed, however, there were only three “exhibits” for this theory: the flagellum (a feature of cell biology), a “clotting sequence” in certain animals, and the immune system.  And for each of them, the evidence at trial showed that science does after all have plausible explanations of how they arose from ordinary random mutation.  Their complexity was not irreducible.  Moreover, there is no peer-reviewed scientific literature supporting the ID theory.  So at least for now, ID is bad science.  There simply isn’t enough substance there yet for it to be recommended for serious consideration by any science teacher.

          Where Kitzmiller left me unconvinced was the finding that ID isn’t science at all – as opposed to merely being bad science, which I now think it must be. 

 

          Let me make clear at once that the motives of the Dover school board in pushing ID had nothing to do with the desire to further genuine science.  Under the searching analysis of Judge Jones, the school district’s decision stands revealed as nothing more than proselytizing (an attack on Darwin for purely religious motives) attempting to disguise itself as science education.  Hence Judge Jones was right about the First Amendment question.  But as I said in 2004 and say now, you need to distinguish between the message and the messenger.  The doctrine the Dover Fundamentalists pushed could still have been good science.  Kitzmiller shows it wasn’t good.  But was it science?

 

          Not according to Judge Jones.  Assume a God who a) exists in a supernatural sphere and b) intervenes in the natural world by influencing the course of evolution.  If that intervention left physical traces, apparently, per Judge Jones, a scientist investigating those traces would be barred from pursuing the true explanation for them.  For, according to Judge Jones, science seeks explanations of phenomena only from the natural sphere. 

 

          He paraphrases a scientist witness: “[O]nce you attribute a cause to an untestable supernatural force, a proposition that cannot be disproven, there is no reason to continue seeking natural explanations as we have our answer.”  Though not literally phrased as a non sequitur, the remark actually is one; what the scientist and the judge really mean, as the context makes clear, is that somehow the very hypothesis of a supernatural cause would exclude the possibility of any natural causes.  And that is nonsense. 

 

          People of faith, no doubt including many scientists, can and do entertain alternative natural and supernatural explanations all the time.  Undoubtedly the most common is speculation whether a “miraculous” cure was due to medical treatment or providentially answered prayers.  Surely it trenches too deeply on the scientist’s prerogative to say that he or she must not even inquire, or apply scientific tools to eliminating one or the other alternative.  Yes, of course, in the end, the scientist has no tools to probe directly any supernatural world which may exist.  But apparently in Judge Jones’ book the scientist is forbidden even to consider the possibility that natural phenomena may be supernaturally caused.

 

          This goes beyond agnosticism to become a positive rejection of the hypothesis of divine intervention in the natural world.  And yet, if one is forbidden to inquire – and to do so with an open mind – how can one objectively rule out such intervention in the first place?  Most religions teach that such intervention sometimes occurs, and many of them teach further that such intervention may leave lasting natural manifestations (cures for instance).  Apparently scientists addressing claims of such manifestations may only hypothesize natural causes, and must approach questions of supernatural agency only in the role of debunker – never taking seriously the possibility that religious explanations might be accurate.

 

          Judge Jones’ response to this protest, and that of many scientists, no doubt, would be that science might possibly lead an investigator to conclude that there was no scientific explanation available for some phenomenon, but that then the office of science would be at an end.  That sounds reasonable; but it isn’t.

 

          Given the extensive scope of our knowledge of the natural world, the absence of any ready natural explanation for something we see certainly raises at least the question of supernatural agency.  And if there were phenomena on the boundary of the natural and supernatural, they might have unique natural characteristics, hallmarks if you will, deriving from their location on that boundary, and hence be worthy of study for that very reason.

 

          Admittedly, the category of observable phenomena plausibly straddling such a boundary looks small today.  In light of the apparent failure of ID, for instance, it now seems unlikely that the evolutions of  flagella, of unusual clotting sequences, or of the immune system are phenomena suggesting direct supernatural intervention.  But there could be others.  And science should approach them – every aspect of them, including causation – with an open mind.

 

          I do not say this out of anxiety that the absence of scientifically reviewable evidence of providential intervention would threaten  religious belief.  As Judge Jones said more than once, discrediting those whose skepticism of natural selection arises from “missing links,” absence of evidence is not evidence of absence.  But that principle equally confronts skeptics of divine agency, which remains possible without observable supernatural intervention.  A Creator’s providential spin on Creation could have been as completely imparted at the moment of the Big Bang as is the trajectory of a baseball when it leaves the pitcher’s hand.  Prayers today could have been answered then.  My anxiety is for science, not for faith.  The methodology of science should be open to all possibilities.


          That said, though, I agree now that Intelligent Design is not a controversy that need be taught, or ought to be.  But I say so on the grounds that it’s bad science, not that it’s no science at all.

 

Copyright (c) Jack L. B. Gohn

 

Kingsley Amis

          As recounted in About the Author, I did my doctoral dissertation on Kingsley Amis.  When Zachary Leader published his biography of Amis in 2007, Mark Lasswell of the Wall Street Journal tracked me down, and asked me to write a review and to be sure to incorporate my personal experiences of Amis.  Here’s that review, which in part explores that uneasy territory between an interviewer bent on understanding his subject and a subject who wants to be only partly understood.  Click here for an image of the piece as published: amis-review (you may want to use your browser’s zoom feature), or here: amis-review-submitted for the text as I submitted it before Mr. Lasswell’s wise emendations — while noting that this piece is reprinted with permission of The Wall Street Journal © 2007 Dow Jones & Company, Inc. All rights reserved.

          I hope to add the dissertation itself to this page eventually. For the time being, and for the truly obsessed, the bibliography to the dissertation was issued as a separate book by the Kent State University Press, which you are welcome to check out on Amazon.

Copyright (c) Jack L. B. Gohn


War Powers, War Lies: A Series:

Part 25: Conclusion: Mission Accomplished

 

Published in the Maryland Daily Record August 27, 2007 

 

            Back in February 2005 I wrote these words in this column: “The U.S. is at war.  Our soldiers die daily, our treasure is poured out, and our international prestige hemorrhages.  No one has asked us citizens if we desire it.  No one has asked Congress, or at least not properly.  No one has leveled with us, and especially no one leveled with us when it could have mattered.  And daily we are fed a diet of lies.”  (Original Intent, 2/4/05.)  The only statement there that does not still apply two and a half years later is the declaration that no one has asked the citizenry.  The 2006 election was transparently a plebiscite on the War, and the War lost.  So far, though, the voice of the public has made no difference.  The prediction of Pierce Butler, a delegate to the 1787 Constitutional Convention, that the president “will not make war but when the Nation will support it” has proven laughably shortsighted.

 

            I have said throughout this series of articles that only one voice counts in matters of the wars our country wages, and that is the president’s.  If he is determined to wage war, no power can realistically stop him.  The Founders failed us on this one.  It had been their intent to make wars hard to start.  They did so by entrusting the power to declare war not to the President but to Congress.  This was a mistake; the Founders already knew that wars were frequently begun without declaration.  In tying Congress’ power to the formality of declaration, then, the Founders were asking for trouble, taking it on faith that presidents would not start wars without the formality.

 

            Presidents have routinely betrayed the Founders’ faith, and in our hundreds of military adventures they have seldom asked Congress for a declaration, although sometimes lesser forms of assent have been sought.  One trouble with lesser formalities is that they can be characterized and sought as something else.  For instance, the Gulf of Tonkin Resolution was voted on with the assurance that it would not be treated as authorization for a land war in Asia; later Congress found that that was exactly how it would be used.  (Tonkin Spook, 3/25/05.)

 

            A parallel problem is that the definition of war is slippery, and presidents have taken grievous advantage.  (Imperfect War, 2/25/05.)  If war is ultimately defined as what Congress declares, then any undeclared armed conflict the President wages by definition is not war, and hence the president need not await a declaration.  Thus John Adams’ naval skirmishes with the French, Abraham Lincoln’s full-fledged struggles with the insurgent South, or the Cold War which embraced all the presidencies between Truman’s and Bush 41’s and in which there were never direct military engagements with our principal adversaries – none of these engagements qualified as war.  These most important projections of military power were started without the formality of a declaration, and our courts never blew the whistle.

 

            We have seen, as well, that Congress has failed to find any mechanism to control ongoing “imperfect” wars.  The War Powers Resolution has proven toothless.  (Outgunned, 6/3/05.)  And we have recently witnessed a very public display of the limits of the Congressional power of the purse.  Effective use of that power requires something almost never found in American politics: a veto-proof majority in both houses of Congress willing to explain to a jingoistic electorate a vote to defund our at-risk sons and daughters.  With one more election we may now actually reach that rare point; but before the resulting Congress takes office it will have been five years since the beginning of the war.

 

            Presidential war powers are awesome.  In addition to the ability to send the world’s best-equipped military machine into action, presidents have at their disposal legions of special forces and militarized CIA units to perform clandestine and covert missions, and in addition have developed proxy armies in the military forces of other countries, particularly in this hemisphere.  The latter, trained especially at the School of the Americas, a/k/a Western Hemisphere Institute of Security Cooperation, give deniability to military operations against anyone standing up against a neocolonial status quo.  (War Off the Books, 7/8/05.)  Add to that the sole power to deploy at a moment’s notice a nuclear arsenal that could simultaneously wage our largest war and end history as we know it – all undeclared, of course.  (MADness, 6/25/07; Metastasis, 7/30/07.)

 

            Broad as presidential war powers are, the present administration has pursued a relentless agenda of expanding them.  It has created a system of more or less hidden gulags around the world in defiance of our treaty obligations, and fought tooth and nail against any judicial review,  justifying this as a war power.  (Captive Taxonomy, 8/5/05.)  It has practiced and tried to legitimize torture (Playbook, 8/26/05), expanded the legal process of rendition into the legally questionable extraordinary rendition, perhaps violated U.S. law against assassination, and shipped covertly held prisoners of war to countries where it is known they will be tortured or killed (Away Games, 10/28/05).  It has created a parallel system of transparently phony military and quasi-military courts, i.e. military commissions and combatant status review tribunals, whose real function is to maintain custody of detainees for as long as the White House thinks best, not to reach any bona fide determinations as to whether they belong in custody.  (Kangaroo, 11/18/05.)

 

            And as an adjunct to a permanent state of undeclared war, the administration has sought permanently to limit governmental accountability in various ways.  It has promulgated an official policy of complying as little as possible with the Freedom of Information Act, and defied the Presidential Records Act.  It has violated the law requiring declassification of documents, has (largely without admitting it) begun a program of reclassifying previously declassified documents, created legally unauthorized forms of classification, and removed documents from governmental reading rooms and websites.  When enormities like the warrantless NSA wiretap program and the foreign gulag are revealed by the press, the government vilifies the messenger.  This too in the name of war.  (Mine to Know, 2/23/07.)

 

            How is it that the populace, which pays so dearly for misbegotten and ill-conducted wars, has not risen up against the constitutional monstrosity that presidential war powers have become and are increasingly becoming?  There are several answers.

 

            At the bottom of them all is the fact that presidents lie about their wars.  They all do.  President Polk challenged Mexican sovereignty by stationing troops on Mexican soil, and when the Mexicans defended themselves, Polk obtained a declaration of war telling the nation of the Mexican attack but not the U.S. provocation.  Franklin Roosevelt lied about Lend Lease and about his intentions to lead America into World War II.  He lied about having sold out Poland to Russia to speed the end of that war.  (Willingly Deceived, 4/29/05.)  And, as noted, Johnson lied to ease us down the last step to total engagement in the Vietnam War, knowing that there had been no Tonkin Gulf attack.

 

            This administration, however, has probably pushed war lying to its worst extreme.  It lied about why we were going to war (the elusive truth apparently being that the object was to satisfy the geopolitical grand design of certain theorists within the administration).  (Why We Fought, 1/27/06.)  It lied that there was such a thing as a Global War on Terror.  (Not GWOT, 2/24/06.)  It lied that it was really engaged in a military struggle with al Qaeda, when in fact it had not been defending the country seriously against that organization either before or after 9/11.  (The War That Wasn’t, 3/31/06.)  It lied to itself that it had a reasonable plan for occupying Iraq.  (Super Bowl, 5/26/06.)  It lied to the world that it truly believed Saddam had weapons of mass destruction, when the only reliable intelligence pointed in the opposite direction.  (Weapons of Mutual Deception, 6/30/06.)

 

            But this goes beyond lies.  Historically, war opponents, who usually have a tendency to dispel lies with inconvenient truths, have been the object of prosecution and imprisonment to silence them.  Though the struggle never ends, over two and a half centuries we have gradually established the First Amendment as a bulwark against such campaigns.  We followed that story from John Adams’ Alien and Sedition Laws through Abraham Lincoln’s efforts to imprison Confederate sympathizers, through Wilson’s efforts to imprison anyone who spoke out against World War I or the draft, through the Truman purges of suspected Communists, and through the McCarthyite hysteria, in the wake of which the First Amendment finally largely prevailed.  (Speechcrimes and Groupcrimes, 8/25/06; Wilson’s Gag, 9/29/06; Proxies, 10/27/06; Are You Now or Have You Ever Been?, 1/26/07.)

 

            This hard-won progress, however, means little without a press that reports proscribed truths.  It usually does not.  The press was well aware that the Johnson administration must be lying about Tonkin, and let it slide.  And in the runup to the current war, instead of asking the inconvenient questions, the press lent credibility to the “aluminum tubes” story, did not ask questions of the White House about the International Atomic Energy Agency’s authoritative dissent, editorialized overwhelmingly in favor of the war, almost ignored the peace movement, and in general made an honest public discussion impossible until it was far too late – until we were already impaled on Iraq.  (Day Late, Dollar Short, 4/2/07.)

 

            Yet blaming presidents and the press alone would leave us out.  We crave those lies.  It must be faced; we are a warlike people much of the time.  By and large war has been good to us and for us.  And we have historically been willing to commit great atrocities like genocide against Native Americans, area bombing and nuclear warfare to satisfy our war lusts.  (Willingly Deceived; Not One Stone, 5/29/07.)

 

            And yet we are not always consumed by those lusts.  We are always somewhere on a spectrum between a proper view of war (i.e. that it is a necessity of last resort) and the cowboyish view that presently holds sway in the administration.  At the moment, despite all the lies and despite a press that only slowly woke up to its constitutionally-contemplated duty of counteracting them, the cowboys’ outlook is found only in the administration, not in the Congress or in the populace.  But the power of the presidency can frustrate the will of the people, at least until the next election.  And this is a great tragedy.

 

            Is there any remedy? 

 

            I am skeptical that we can legislate one; the practical failure of both the War Powers Resolution and the power of the purse establishes that.  Nor can we easily amend our constitution to give Congress greater leverage or force the courts to review presidential assertions of war powers.  Amending the constitution is very hard to do.  And the practical problems of figuring out what might replace the present system are daunting.  Amendment is certainly worth thinking about, but how to amend is not immediately apparent.

 

            For the immediate future, therefore, the remedies seem to be simple but nearly unattainable.  We must begin electing presidents who respect the constitutional separation of powers and the people’s right to the truth.  History has not supplied us with many examples, even among presidents who were in many respects admirable.

 

            We need to elect legislators who are not demagogues or in fear of demagogues.  And that is hard because flag-waving demagogues so often drive the electorate.  And legislators want the electorate to reelect them.

 

            We need also to obtain from the large and irresponsible conglomerates that own most of it, a press with courage and honesty, more committed to telling the truth than to entertaining the masses and pleasing advertisers.  Right.


            As hard as it may seem to achieve these things, though, little else will suffice.  And so we must work to elect presidents who accept limits, and dauntless legislators, and we must use our consumer power to force the press to demand straight answers to the important questions.  That effort, already afoot, is a critical struggle of our time.  Good luck to it; good government and honest debate would countenance only better and, no doubt, fewer wars.

 

Copyright (c) Jack L. B. Gohn

 

War Powers, War Lies: A Series

Part 24: Metastasis

 

Published in the Maryland Daily Record July 30, 2007

 

            As we have seen, the exercise of nuclear war powers was envisioned during the Cold War as being an essentially Executive Branch affair, because of the short decision times involved.  This did not mean the rest of the government was thought dispensable should holocaust arrive. 

 

            To the contrary, in the early days of the Cold War, when nuclear decapitation of the U.S. government loomed as a serious threat, there was a serious plan to preserve all branches.  As is now well known, a secret bunker was built around 1960 under the Greenbrier Hotel in White Sulphur Springs, West Virginia, which would have housed the entire Congress, or at least whichever portion of it survived Armageddon, until the fallout had cleared.   Reportedly there were similar contingency plans for the Supreme Court as well, although little has ever been made public about them.   These complemented perennial plans for locating and spiriting away the President or whoever stood highest in Presidential succession, in the event of nuclear attack, to Mount Weather in the Blue Ridge Mountains, or to a mountain six miles north of Camp David. 

 

            Until the Carter Administration, the planning both for warmaking itself and for the survival of warmakers and of the larger government was rudimentary.  At that point, more careful thought was given to the preservation of the government in the face of nuclear catastrophe.  This evolved into the Doomsday Project, which reached its height during the Reagan Administration and, to judge by what is known about it, was focused on maintaining not only the lives of the Executive Branch warmakers but also communications among them and between them and the military.  The communications hardware never worked properly, however, and so, with the decline of the Cold War, these plans were scrapped during the Clinton years.

 

            Suddenly, with the arrival of 9/11 and the 2001 anthrax attacks, the question of governmental continuity grew more urgent again.  There were news reports in 2002 that senior White House, Defense, and State Department officials had been dispatched to fortified bunkers at two undisclosed East Coast locations.  Unlike previous plans, these were not a matter of response to attack or the temporary taking of shelter.  This time, the deployment was permanent.  While the individual officials would rotate, the bunkers were always in use.  This was a strictly Executive Branch show; the White House did not even bother to inform Congress, which found out about it from the press. 

 

            If the notion of an Executive Branch governing from secret locations of which not even Congress is informed sounds disquieting, more recent actions of the Bush administration are even more so.  On May 9, President Bush signed National Security Presidential Directive 51, blandly captioned “National Continuity Policy.”  Arguably it is of a piece with previous presidential orders establishing continuity of government policies, including Executive Order 11490 (Nixon, 1969), Executive Order 12656 (Reagan, 1988), and Presidential Decision Directive 67 (Clinton, 1998).  To the extent these orders have been declassified, they generally direct federal agencies to make and coordinate contingency plans.  PD51 does this too, but there is a difference of tone.

 

            First, the evil to be addressed is different.  While EOs 11490 and 12656 spoke in identical language of “any national emergency type situation that might conceivably confront the nation,” there was only one real emergency in mind, specifically named: “a massive nuclear attack.”  That uniqueness was for good reason; nothing else readily imaginable would threaten the continuity of the government.  PD51 by contrast provides against a “Catastrophic Emergency,” defined as “any incident, regardless of location, that results in extraordinary levels of mass casualties, damage, or disruption severely affecting the U.S. population, infrastructure, environment, economy, or government functions.”  Arguably, this could be a much smaller incident.  Arguably, it could have been 9/11.

 

            The objective of PD51 is “Enduring Constitutional Government,” which is “a cooperative effort among the executive, legislative, and judicial branches of the Federal Government, coordinated by the President, as a matter of comity with respect to the legislative and judicial branches…”  Although this language is immediately followed by language about “proper respect for the constitutional separation of powers,” it rings a little hollow after language in which the President appears to be appointing himself coordinator of the other branches of government.  This note is struck again with this language: “The President shall lead the activities of the Federal Government for ensuring constitutional government.”  Not the Executive Branch: the Federal Government.

 

            Given the contempt with which the Bush Executive Branch has treated other branches of government and the reluctance of the Executive to submit to checks and balances, these bland words do not sound like a promise of ongoing vitality and independence for Congress and the courts.  It is quite foreseeable that another 9/11 event could fall within the category of “Catastrophic Emergency.”  At that point, PD51 says the Administration could seize all governmental operations, until, in its view, it had achieved “Enduring Constitutional Government.”  If this is anything like as ironic a name as Operation Enduring Freedom, which has endowed Afghanistan with resurgent warlords, a resurgent opium trade, and a resurgent Taliban, we are in trouble.

                                                                                               

            We have, in short, observed a metastasis precipitated by area bombing and then nuclear bombing, a metastasis in which the war powers of other branches of government have been invaded and may end up nullified by the Executive.  Under PD51, the other branches might end up effectively nullified too.

 

            Similar rumblings of worry should affect us as a people.  Once upon a time, in the era of movies seen by millions of us as schoolchildren, like Duck and Cover and About Fallout,[1] the Civil Defense authorities at least pretended to have a plan for civilian survival in the face of nuclear war.  The agencies behind those film classics morphed into FEMA, which everyone now knows has no realistic plans to save anyone from anything.  The truth is, no one cares about us citizens.  PD51 is about saving our rulers.  So if there ever were a nuclear attack, about all that we could expect to survive in one piece would be the President, his generals and a puppet government, not us.  Fortunately, the nightmare nuclear prospect with our present enemies threatens destruction less pervasive than a Soviet first wave could have inflicted.  The populace could survive a nuclear 9/11.  But if there were a broad nuclear attack or eco-catastrophe, the populace could meet the same fate as the Legislative and Judicial Branches.  The Executive could consume us too, by leaving us exposed while it sat out the trouble under some mountainside.

 

            Chad Stuart and Jeremy Clyde saw it all quite clearly in their 1968 song, The Ark.  When the Ark lands after a world-destroying military catastrophe, and the doors open, disgorging the survivors:

 

But who is this, staring at the sunshine?

  laughing, shoving, crying?

Not you, not me, my friend.

Are not these the men of the iron mountain?

Were not these the leaders of the fighting?

 

            You bet.  Them and nobody else.  That is where the current state of war powers and war lies could bring us.  Buckle your seatbelts.

 

Copyright (c) Jack L. B. Gohn 

The Big Picture Home Page | Previous Big Picture Column | Next Big Picture Column

War Powers Page | Previous War Powers Column | Next War Powers Column

War Powers, War Lies: A Series

Part 23: MADness

 

Published in the Maryland Daily Record June 25, 2007

           

“Hell of a weapon, really, when you come to think of it.  Imagine these damned things shooting up out of the sea anywhere in the world and blowing some capital city to smithereens.  We’ve got six of them already and we’re going to have more.  Good deterrent when you come to think of it.  You don’t know where they are or when.  Not like bomber bases and firing pads and so on you can track down and put out of action with your first rocket wave.”

 

            With these words of the character Felix Leiter describing George Washington-class nuclear submarines armed with Polaris missiles, from the 1961 James Bond thriller Thunderball, Ian Fleming aptly summarized the evolving status of nuclear deterrence at that moment.

 

            Last time, we discussed how incendiary weapons (including nuclear bombs) actually had little battlefield use.  And, as we shall see, this rapidly made the world a very dangerous place.  Leiter’s quoted monologue represents about the halfway point in paradoxical efforts to make the world safer by making it more dangerous.

 

            There were three problems with the initial nuclear weapons that left them really only suitable for use against civilians. 

 

            First was survivability.  A bomb had to survive to be deployed, and then had to survive until it reached its target.  Bombers could be destroyed on the ground by incoming intercontinental ballistic missiles (ICBMs) and in the air by anti-aircraft fire and fighters.  ICBMs were at least vulnerable to the former.

 

            Second was precision.  The history of WWII bombing made clear that aircraft, at the mercy of wind conditions, anti-aircraft fire, visibility problems, etc., could not be counted on to deliver bombs to small, precise targets like military installations and units.

 

            Third was intelligence.  Even if U.S. bombs could have been delivered with pinpoint accuracy, it would not have been easy to target them.  Many military targets (e.g. units, ships, or missiles mounted on railcars or trucks) move.  And in the early Cold War days, mobile targets were hard to track.  Spy planes could only provide snapshots.  And at the outset the same was true of satellites.  The latter might take photos from space, but they would then need to be brought down and retrieved, and the film inside developed.  Consequently, the freshest available satellite intelligence might be a month old.

 

            The first problem dictated that nuclear weapons deliverable from a stealthy source, like a nuclear (and therefore untraceable) submarine, were at a high premium, because they were invulnerable to a first strike.  This necessitated heavy reliance on Polaris missiles, which U.S. submarines initially delivered.  But Polaris was notoriously imprecise (the second problem).  Coupling that deficit with perennial intelligence problems (the third), the bottom line was that, just as Felix Leiter said, U.S. nuclear strategy pointed most strongly toward “blow[ing] some capital city to smithereens.”  Soviet cities were large (and so harder to miss) and immobile (meaning no real time intelligence was required to locate them).  They – and their populations – therefore became prominent targets.

           

            And so, for similar reasons, did U.S. cities and their populations.  And after a certain early point, each side, we and the Soviets both, had submarine launch pads that assured that, no matter who struck first (perhaps destroying the other side’s aircraft and land-based missiles), the other side would still be capable of laying the first striker’s cities to waste.  This reality was known as MAD: Mutual Assured Destruction.  MAD was understood to assure that neither side would strike the other first. 

 

            Actually, there was never much risk that either would strike the other first.  But the U.S. continued to plan against the contingency that the Soviets would roll their armor through Germany into Western Europe, a thrust assumed to require us to respond with nuclear weapons.  And the Soviets continued to expect us to strike first, as much because of their wartime experience with Hitler’s surprise attack while a nonaggression treaty was in place as because the U.S. refused to take a no-first-use pledge.

 

            The madness (never mind the acronym) of this course of action did not stop with contemplation of the effective loss of all of our cities.  There was a substantial risk of ecological catastrophe as well.  Carl Sagan and colleagues published an influential 1983 paper which popularized the phrase “nuclear winter.”  It appeared to them that if a substantial portion of the world’s nuclear arsenal were detonated, the result would be a flooding of the upper atmosphere with fine particles that would becloud the sun’s rays for long enough to kill off most of the agriculture in at least the northern hemisphere.  While subsequent research has moderated the direness of these predictions slightly, the science seems basically sound to this layman.

           

            In short, planning for nuclear contingencies required national leaders to contemplate the instantaneous destruction of vast portions of their own citizenry – and that of all the other nations too.  Madness indeed.

 

            It was not that presidents sought out national self-immolation and world eco-catastrophe as war powers; it was that they could not avoid getting to that point once the logic of nuclear weapons was first embarked upon.  There is a fascinating memoir by William Odom, a member of the Carter administration’s National Security Council, on the state of the planning they found when Carter came to power in 1977.  Assuming the arrival of a nuclear crisis, the Single Integrated Operational Plan (SIOP) effectively left the President with a 10-12-minute window in which to initiate a nuclear war; he had no viable alternatives about how to do it, despite lip service having been paid by a generation of planners to limited nuclear options, and the SIOP did not meaningfully address what would happen more than 12 hours out.  Hence even if the President survived, he would have had no guidance on how to navigate the country or the world through recovery from unimaginable destruction.

 

            Fortunately, things have improved in certain ways.  First, the technical hurdles that kept military units from replacing cities as the main targets of nuclear weapons have been largely cleared.  For instance, the Trident, which replaced Polaris, is much more precise.  We now have telemetry, so that our satellites can give us real-time intelligence on the activity and location of military units and assets, diverting nukes from cities to more strategic targets.  The number of nuclear weapons has decreased because of arms limitation treaties.  Most important, the Cold War has ended, and the world’s two chief nuclear antagonists are much less committed to destroying each other.

 

            What remains, however, (in addition to the legacies of nuclear proliferation and the diffusion of fissile material and knowhow in a world full of rogue states and terrorists) are bad mental habits that take us ever further from the Founders’ notion of war as something initiated by the nation’s elected representatives.  Only the Executive could exercise war powers that must be wielded in under 12 minutes.  Owing to the logic of nuclear arms embodied in SIOP, then, Congress had ceased to exist for planning purposes, even though nuclear war would determine the fate of millions and the planetary ecology – obvious policy decisions, especially appropriate for legislative discretion.  Once you take the Legislature out of the decision-making on such matters, you gravely alter the checks and balances system.

 

            Next time we shall see that even in a somewhat post-nuclear world, the damage to checks and balances goes on.

Copyright (c) Jack L. B. Gohn

The Big Picture Home Page | Previous Big Picture Column | Next Big Picture Column

War Powers Page | Previous War Powers Column | Next War Powers Column

War Powers, War Lies: A Series

Part 22: Not One Stone

Published in the Maryland Daily Record May 29, 2007

          Operation Gomorrah, commenced July 24, 1943, was, as philosopher A.C. Grayling put it, “something new and terrible even by the standards of industrialised violence so far experienced in the Second World War.”[1]

          A Firestorm

Gomorrah was mounted for the specific purpose of obliterating the city of Hamburg.  Royal Air Force Lancasters, Halifaxes, Stirlings and Wellingtons came loaded, not with conventional bombs, but with incendiaries.  In his 2006 book, Among the Dead Cities (upon which I draw heavily below), Grayling has described what they accomplished.  Here is part of what resulted on just one of the four nights of the raid, that of July 27-28:

Fires in different streets progressively joined together, forming into vast pyres of flame that grew rapidly hotter and eventually roared upwards to a height of 7,000 feet, sucking in air from the outlying suburbs at over a hundred miles an hour to fuel their oxygen hunger, creating artificial hurricanes ‘resonating like mighty organs’ … which intensified the fires further… Its greatest intensity lasted for three hours, snatching up roofs, trees and burning human bodies and sending them whirling into the air.  The fires leaped up behind collapsing facades of buildings, roared through the streets, and rolled across squares and open areas ‘in strange rhythms like rolling cylinders.’  The glass windows of tramcars melted, bags of sugar boiled, people trying to flee the oven-like heat of air-raid shelters sank, petrified into grotesque gestures, into the boiling asphalt of the streets.[2]

          In the wake of these four nights of raids, 45,000 identifiable corpses were found, and 30,480 buildings were reduced to rubble, half the Hamburg housing stock.  One and a quarter million refugees were created.

          Other exigencies of the war necessitated a pause in the “carpet bombing” or “area bombing” of population centers until February 1944, but then the pattern resumed: Stuttgart, Brunswick, Halberstadt, Regensburg, Schweinfurt and Augsburg.[3]  After another pause for D-Day and invasion support, the effort continued in February 1945: Dresden, Leipzig, Worms, Mainz, Würzburg, Hildesheim, Gladbeck, Hanau and Dulmen.[4]

          Laying the Carpet

          In Europe, although the Luftwaffe had certainly been the first to strike civilian centers from the air, with blitzkrieg in Eastern Europe, and in England with the Blitz, the RAF was the clear leader in the practice of aerial obliteration of civilians, their infrastructure, and their culture.  By contrast, the U.S. Eighth Army Air Force, when it arrived on the scene, insisted on efforts to focus on Hitler’s war industries, notably the ball bearings and oil which were central to the military effort.  (So effective was the Eighth that eventually the Luftwaffe was grounded for lack of fuel.)

          In the Pacific, of course, the U.S.  made different choices.  When the U.S. finally got its B-29s within striking distance of the Japanese mainland, those bombers, designed to carry maximum payloads, were directed to fly low, allowing them to carry even greater loads – of incendiaries.  These were then dropped on the wooden Japanese cities of Tokyo, Nagoya, Osaka, and Kobe.  Tokyo was attacked on March 9-10, 1945.  Fifteen square miles in one of its most densely populated districts experienced “a ferocious firestorm that killed more than 85,000 people.”[5]  From there, the U.S. laid waste to “nearly half of the built-up areas of … sixty-six Japanese cities.”  The nuclear destruction of Hiroshima and Nagasaki came at the end, in August 1945.  Between them the nuclear attacks killed perhaps 100,000 people immediately (over 100,000 more later on) and destroyed half the buildings in each city.[6]

          Illegal, But With Impunity

          Whatever the morality of these bombing campaigns, they certainly violated international law.  The Hague Conventions of 1907, never superseded at the time of the Second World War, specifically forbade employment of “arms, projectiles, or material calculated to cause unnecessary suffering.”  Art. 23(e).  Under Article 25, “The attack or bombardment, by whatever means, of towns, villages, dwellings, or buildings which are undefended is prohibited.”  And Article 27 provided: “In sieges and bombardments all necessary steps must be taken to spare, as far as possible, buildings dedicated to religion, art, science, or charitable purposes, historic monuments, hospitals, and places where the sick and wounded are collected, provided they are not being used at the time for military purposes.”  And it is worthy of note that in trying the Nazis at Nuremberg, the tribunal held that the Hague Conventions were part of the customary law of war, and binding on Germany whether or not it was a signatory.  Perhaps equally noteworthy, however, was that, although the twelfth and last of the “minor” Nuremberg trials placed on trial architects of war strategies and tactics that had violated international law, no German, so far as I am able to determine, was ever tried for atrocities against civilians from the air.  Perhaps reasons of state (i.e. the implications for the RAF and the U.S. Air Force) guided prosecutorial discretion.

          There had been efforts between the World Wars to update the Conventions, in view of the flagrant bombing violations perpetrated by all sides in World War One.  A running conference from 1922 to 1937 sponsored a draft set of Rules of Air Warfare, which would have forbidden Gomorrah and Hiroshima in even more express terms.  The Rules were never ratified for various complex reasons.  But in the course of the proceedings, the U.S. and (apparently) the British delegates had endorsed the view that bombardment of civilian centers violated extant law,[7] which the Rules merely articulated.[8]  Area bombing had thus clearly been recognized, in advance, as illegal by representatives of the very nations that later perpetrated it in places like Hamburg and Hiroshima.

          Such illegality posed two separate problems for the Allies: how to justify their actions to themselves, and how to justify them to the populace.  The story of how the Allied leadership muddled through those problems has very large implications.

          Apologetics and Justifications for Atrocities

          As the Allies and the Axis drew closer to war in the 1930s, there was never any question that our side intended to bomb civilians.  As early as the 1920s, the chiefs of British Bomber Command were formulating policy explicitly centered on area bombing.  The thinking was that bombing was so destructive of civilian morale that the national will to fight of adverse powers would crumble.[9]  While Britain had committed itself at the outset to complying with the draft Rules of Aerial Warfare, that commitment lasted only until May 1, 1940, when the order articulating that commitment was rescinded.

          American bombing doctrine, as noted, focused instead on pinpointed destruction of enemy industrial sites.  However, in practice U.S. and RAF military doctrines were much closer together than would appear.  If it was acceptable to bomb industrial sites (the U.S. approach), it was equally acceptable to bomb the infrastructure that made running those sites possible: the bridges, the railways, the water supply, the power grid.  If such bombing meant that life became impossible for civilians in large areas surrounding those sites, then so be it.  And in practice, the distinction was even smaller than theory might suggest.  American bombers simply lacked the technology and the correct weather in either theater of war to make precision bombing a full-time tactic, though in Europe they tried.[10]  In Japan, they did not try; there they were under the command of Gen. Curtis LeMay, who became notorious in a later war, Vietnam, for urging that we bomb the North Vietnamese “back to the stone age.”  And he was not a late convert to that view.

          In any event, the internal justification for area bombing either espoused a view that civilians were collateral damage to attacks on the industrial war machine or that in modern warfare, the civilian/combatant distinction was not viable or important.  In some cases, bombing of civilians was, ironically, presented as humanitarian and in keeping with the larger goals of the law of war, in that collapse of the enemy could be precipitated faster, and at a lower cost in human life overall, if civilian morale could be broken from the skies.

          For some, it was not even a problem.  Extermination of the enemy populace and culture seemed like a self-evident goal to pursue.  For instance, the day after Pearl Harbor, Congressman Charles Eaton of New Jersey was clearly contemplating genocide against Japan in urging Congress and the country to summon “determination once and for all to wipe off of the earth this accursed monster of tyranny and slavery.” [11] Henry Morgenthau Jr., Secretary of the Treasury in 1944, as victory became clearly inevitable, put forward a plan for the complete demolition of the remainder of German industry postwar, leaving Germany permanently as a “pastoralized” country.[12]

          It was enough of a problem to most national leaders, however, that a frank dedication to extermination of enemy civilians was not prominently embraced.  Grayling is almost amusing in recounting the frustration of Air Marshal Sir Arthur Harris, the leader of RAF Bomber Command, at the official pronouncements and directives of those above him trying to steer him away from area bombing or at least to distance themselves from it.  Harris bluntly described himself this way: “It is my business to kill people: Germans.”[13]  And much boastful publicity was given to the Hamburg raid.  Yet when British humanitarian voices were raised in protest, and questions were asked in the House of Commons, Minister for Air Sir Archibald Sinclair responded firmly: “The targets of Bomber Command are always military.”[14]  This was of course true if and only if everything civilian was military by definition, a mental qualification that Sinclair almost certainly was resorting to.  (Shades of “I have never ordered torture”?)

          “A Military Base”

          Where it came to Japan, and in particular to Hiroshima, a similar mental qualification, amounting to a lie, may have been indulged in by none other than Harry Truman, so often presented as a paragon of honest speaking.  “The world will note,” Truman said, “that the first atomic bomb was dropped on Hiroshima, a military base.”[15]  There were two significant military bases in the vicinity, but the bomb was not dropped, nor was it intended to be dropped on them.[16]  The Target Committee deliberately chose a location in the heart of the town, specifically rejecting going out of town to get near to the bases.  While Truman told the Secretary of War to avoid civilian targets[17], everyone involved in the targeting process knew that order was obsolete when given.  If Truman had been honestly mistaken in the radio address just quoted, he would doubtless have been informed of the gaffe afterwards, and if he had been totally honest, he would have owned up to it.  He did not.

          None of this is a discussion of the morality of area bombing in general, or Hamburg and Hiroshima in particular.  Rather it is a discussion of the way that laws of warfare are broken by even the best nations on earth and lies are told by the leaders of those best nations to hide it.

          It can indeed be argued that the Hamburgs and Hiroshimas were morally correct, even if illegal; but if they were truly believed to be correct, then why the dishonesty?  Why did the Allied leaders not adopt a moral stance challenging the laws of war, or at least the laws that forbade area bombing of civilians?

          I would posit, by way of an answer, that the real thought process here was likely more elemental than either a legal or a moral one.  Nations grow lawless in wartime – an ancient observation, memorably phrased by the Roman orator Cicero: Silent leges inter arma.  (Literally “the laws are silent amid arms.”)  So it may be that the very concept of the laws of war is mostly a fig leaf and a sham.  There is propaganda value, however, in pretending to hold to those laws so that there is a yardstick for the enemy not to measure up to.  But hypocrisy is required to make people believe there is a yardstick at all.

          Meanwhile, the real law of war may be this: destroy the alien, the other, the enemy.  Burn, sack and pillage the cities.  Leave not one stone upon another (as Jesus prophesied would happen to Jerusalem).[18]  Erase the enemy from the book of life, as mankind has tried to do in countless wars, at Auschwitz, in the killing fields of Cambodia, in Rwanda, in Kosovo, in Darfur.  View enemy armies merely as impediments to the chief goal of war: obliterating the civilians. Certainly there is something in the warrior mind that will always tend in this direction.  Even in somewhat more civilized wars there will always be My Lais and Hadithas.

          Candor

          But assuming, without conceding, that this is the real law of war, then even so it would have been productive to dispense with the hypocrisy, and to speak openly of the moral, tactical and strategic choices being made.  Surely it would have been a good idea to talk honestly about Hamburg.  If the nations that billed themselves as the last hope of civilization against barbarity wanted to go ahead and roast cities alive after such a discussion, “civilization” would at least have chosen what was done in its name.

          Honest debate would have been even more helpful before the end.  The choice to bomb Hiroshima may have seemed only a small step beyond the choices to bomb Hamburg and Tokyo.  It is arguable that to the air force personnel who carried out the actions, there was little difference, except that far fewer aircraft were involved.  To the airmen, an incendiary bomb was an incendiary bomb.  In hindsight, though, we know that the Hiroshima projectile had quite different implications.

          And if for security reasons the debate could not be had before the fact, surely it would have been in “civilization’s” best interest to have had a frank national dialogue about the atomic bomb afterwards, and not to have obfuscated with happy talk about military bases.  The deceptions deprived us of much of that dialogue.  The U.S. had just unleashed a weapon whose best use, and perhaps sole purpose, was the obliteration of whole cities at a blow.  It was therefore a weapon whose whole point was to violate international laws we ostensibly observed – unlike conventional incendiary bombs, which at least also had legitimate applications against military targets.

          Beware of Imitators

          Even a nation eager to elevate the “true” law of war above the “impossibly idealistic” Hague Conventions might have wanted to consider whether it was safer to observe the Conventions anyway — and thereby retain their protections.  After all, even a nation frankly bent on merciless obliteration of the enemy might still have had qualms about establishing a precedent that bore a distinct and novel threat of obliterating us as well.  For that is, of course, the strategic situation into which Hiroshima ushered us. 

          The nature of that strange and unprecedented strategic situation will be considered next time.


[1]   Among the Dead Cities (2006) at 16.

[2]   Id., at 18.

[3]  Id., at 65.

[4]  Id., at 73.

[5]   Id., at 77.

[6]   Id., at 78.

[7]  “At the beginning of this conference Ambassador Hugh Gibson, speaking for the United States delegation, said that civilization was threatened by the burden and dangers of the gigantic machinery of warfare then being maintained. He recalled that practically all the nations of the world had pledged themselves not to wage aggressive war. Therefore, he said, the conference should devote itself to the abolition of weapons devoted primarily to aggressive war.  Among the points advocated by Ambassador Gibson [of the United States in Geneva in 1932] were the following: Special restrictions for tanks and heavy mobile guns, which were considered to be arms peculiarly for offensive operations; computation of the number of armed forces on the basis of the effectives necessary for the maintenance of internal order plus some suitable contingent for defense; abolition of lethal gases and bacteriological warfare; effective measures to protect civilian populations against aerial bombing.”  http://www.mtholyoke.edu/acad/intrel/WorldWar2/disarm.htm.

[8]   See Grayling at 146 and preceding pages.

[9]  Id., at 134 and following.

[10]   Grayling at 141, citing Stuart Halsey Ross, Strategic Bombing by the United States in World War II: The Myths and the Facts (2003).

[11]   Cong. Rec., 77th Cong., 1st Sess., Vol. 87, pt. 9 at 9520-27 (1941), cited in J. McWhorter, Doing Our Own Thing (2003) at 44.

[12]   Grayling at 159 and following.

[13]   Id., at 118.

[14]   Id., at 189.

[15]   Radio address of August 9, 1945.   Cited in Grayling at 156. Cited to:Public Papers of the Presidents of the United States: Harry S. Truman, Containing the Public Messages, Speeches and Stements of the President April 12 to December 31, 1945 (Washington D.C.: United States Government Printing Office, 1961) page 212. The full text also was published in the New York Times, August 10, 1945, page 12.  Sourced in http://www.dannen.com/decision/hst-ag09.html,

[16]   See the Targeting Committee’s report, preserved at http://www.dannen.com/decision/targets.html .  

[17]   Truman confided to his diary that he had ordered that civilians be spared, and the targets be exclusively military.  http://www.dannen.com/decision/hst-jl25.html.

Unfunny Imus


 

            It was certainly legal to fire Don Imus.  He was employed by private employers (at least to the extent that public corporations whose stock in trade is use of public airwaves can realistically be called private), so by conventional reckoning there was no state action, and hence no First Amendment violation.  It was equally legitimate for the advertisers to pull their commercials so as not to be associated with Imus.  There is really no legal question presented.

 

            The story concerns only our society’s mores, not its laws.  And on balance the loss of Imus was a gain for those mores.

 

            We each harbor some racism.  You cannot be born in our society, a society in which the echoes of slavery continue to resound, without suffering from it to some degree.  The best way to cope is to acknowledge it and try to move on, the way you may have to acknowledge being alcoholic or having arthritis.  You certainly don’t have to let racism define you.  But you have to deal with it.  Black and white, we all contend with it, and anyone, black or white or other, who denies this is lying to himself or herself.  It is reflexive in our hearts.

 

            The good news is that most of us, black and white, have got way beyond the point where (most of the time, anyway) the racist reflex dominates our behavior.  Most of the time, most of us refuse to act on our prejudices, and refuse to let our differences deny us friendships, working relationships, and all the benefits of living in a diverse world.  We understand at an increasingly profound level that what we have in common with those who look and/or speak differently is far more important than what distinguishes us.

 

            All the same, the post-slavery conversion is incomplete, and won’t be complete in our lifetimes.  The direction may be clear, but we haven’t reached the goal just yet.  This gives rise to tremendous anxiety, just like any other discontinuity between our instincts and our consciously chosen ways of behaving.  That is why jokes about sex are so powerful: we all are afflicted by instincts that, if we followed them, would quickly wreak havoc with our own lives and the lives of all those we hold dear.  We are ashamed, perhaps justly, of certain thoughts or feelings.  And the gap between our desires and our social roles is papered over with a lot of lies.  This discontinuity is not only distressing, but awfully funny.  And it is very similar to our struggles with racism, which also have their intensely comic aspect.

 

            Which brings us to shock jocks like Imus.  They make their living by working between two limits: the limits of what is polite to say and the limits of what is truly taboo.  To do their work, they have to exceed the first limit, day after day.  In matters of race and sex, how we wish we thought and felt, how we would like to be believed by others to think and feel, represents the first limit.  Beyond that limit lies much comic terrain.  For some reason we like to laugh at the gap between how we think and feel and how we wish we thought and felt.  And when we see that same gap in others, we laugh even harder; self-deception and hypocrisy are often hilarious.  The jokes may be offensive, but they contain some kind of fidelity to things we know about ourselves or others.

 

            But to continue to be allowed to work, shock jesters have to observe the second limit, too.  They need to be shocking enough to wring painful laughs from us, but they must play off emotional reality of some sort, preferably realities that make us or others ashamed.  Shock jocks are licensed to give voice to views that the better side of us wisely rejects, and to rebel for the nonce against a conventionality that squelches the expression of those views.  As long as it is clearly understood that these performers are just spokesmen for our rebellious ids, and not for coherent or serious political or social points of view, they probably do some good.  They blow off steam – the anxiety created by the discontinuity between the good behavior we have by and large chosen and what we can’t help feeling.  But where there is no emotional reality to the humor, the second limit is reached.

 

            And, as Imus discovered, the limits move.  While he wasn’t watching, jokes in which the whole point was the expression of a supposedly shared viewpoint that there was something ugly about female black athletes had become out of bounds.  Why?  Oversimplifying a lot, the crux of it is probably that there is no longer a critical mass of people whose unconscious view is that black female athletes are unattractive.  (Note, incidentally, that two non-starting members of the roster were white.)  There doubtless used to be such a critical mass, and Imus would probably have been correct in working on that assumption not so many years ago.  But there are just not enough people around who, even in the racist recesses of their minds, hold that attitude now for the joke to work.  Even our rebellious ids have moved on.

 

            If I had to guess, I’d say the moment we knew the point had been reached would be the night in 2002 when Halle Berry and Denzel Washington won the Academy Awards.  Certainly we were already there at the moment this last year when Barack Obama emerged as a perfectly viable presidential candidate.  There’s just an increasing number of ways for people to look normal (as most people unconsciously think of normal).  Probably most Americans, regardless of color or gender, looked at the photos of the Rutgers Scarlet Knights and saw the exact opposite of what Imus was saying: a group of attractive young women.  Not knockouts, but quite attractive enough so Imus’ comments simply found no purchase in our subconscious.  None of us is color blind, but you don’t have to be in order to evaluate objectively the attractiveness of people belonging to other races.  And only someone literally blinded by racism could miss that these young women were perfectly comely.

 

            And one thing about being a shock jock is that you have to shoot, but you’d better not miss too often.  If the sum and substance of a joke is that you are blinded by racism to who is and is not attractive, and you trust your listeners are too, then, sorry, you’ve missed.  And at that point your sponsors should be pulling the plug in their own economic self-interest, and your network should be taking back the megaphone it had handed you.

 

            There has been a lot of indignant talk about the fact that Imus was pilloried by the likes of Jesse Jackson, whose odious remark about “Hymietown” reveals racist aspects of his own character, and Al Sharpton, whose disgraceful demagoguery over Tawana Brawley (which led to a defamation judgment against him) tells you everything you need to know about his integrity.  There’s been the suggestion that Jackson and Sharpton were no better than Imus.  That could actually be true (though in Jackson’s case at least I would disagree) but it’s not very relevant.  Jackson and Sharpton are really politicians, and the things that can or should bring down politicians are different from the things that bring down entertainers.

 

            The vaudeville hook that has pulled Imus off the stage has been wielded, as such hooks have always been wielded, because the joke wasn’t funny.  If the Scarlet Knights had been truly ugly, the joke might still have been offensive, but people would have laughed, at least.  The uncomfortable truth beneath the ugly joke would perhaps have saved Imus.  Here there simply was no psychological truth, and no reason for anyone to laugh.  And once that was understood, it was also clear there was no good reason for the man to be on the air.  He was paid to make people laugh, and had only disgusted them with his own moral ugliness.

 

            The real significance of the affair, then, isn’t that so many people found the joke offensive.  It’s that so many people failed to laugh.  There was no ring of truth to the joke, not even to the racist lurking in each of us.  Our inner racists weaken a little bit, year by year, and here they just weren’t strong enough any more to do harm.  It’s possible to make too much of it, but the consensus on that point is a telling and encouraging sign of where we are on our long march away from slavery.

 

Copyright (c) Jack L. B. Gohn

 


War Powers, War Lies: A Series: Part XXI: Day Late, Dollar Short

 

            As we have seen, in the contest for control of the nation’s powers to start and to wage wars, the Executive usually wins.  The Framers dealt Congress a hand basically limited to the power to declare wars (not anticipating today’s world where wars are never declared by anyone) and the power of the purse (which cannot effectively be exercised to rein in existing wars because it leaves troops unsupported).  Such generic cards are routinely trumped by the President, expressly made the commander-in-chief.  The Supreme Court seldom gets in the game, even when it manifestly should.  The Founders would have been surprised at the lopsidedness of the contest, but over the last two hundred years, we as a nation have become inured to the Executive outgunning the other branches on this issue.  The Founders might have been even more surprised, however, at the toothlessness of the Press.

 

            James Madison, author of the First Amendment, clearly saw the electorate as the ultimate counterweight to any branch of government, and the Press as the essential empowerer of the electorate.  He famously wrote: “A popular government, without popular information, or the means of acquiring it, is but a prologue to a farce or a tragedy; or perhaps both.”   The Press is a primary “means of acquiring” that “popular information.” Justice Hugo Black, in the “Pentagon Papers” case, New York Times Co. v. United States, 403 U.S. 713 (1971), made the point more explicitly, and in a wartime context to boot:

 

The press was to serve the governed, not the governors. The Government’s power to censor the press was abolished so that the press would remain forever free to censure the Government. The press was protected so that it could bare the secrets of government and inform the people. Only a free and unrestrained press can effectively expose deception in government. And paramount among the responsibilities of a free press is the duty to prevent any part of the government from deceiving the people and sending them off to distant lands to die of foreign fevers and foreign shot and shell.

 

            Freedom of the press, then, was devised in effect to render the Press nearly a fourth branch of government, and – in matters of war – to give it, as the electorate’s tribune, the right and, in Justice Black’s words, “the duty” to inquire into and report upon the doings and the lies of the government.  When the Legislative and the Judiciary fail us, the Press is supposed to be the failsafe against deceit that sends us into reckless wars.

 

            So, as the band Smash Mouth asked, “What the hell happened?”   Where was the Press when, as discussed in many earlier pieces in this series, the government told lie after lie to inveigle us into a war that can best be characterized using James Madison’s exact phrase: “a farce or a tragedy; or perhaps both”?  (Well, scratch the “perhaps” part.)  The answer is that the Press was mostly AWOL from its constitutional responsibilities.  And the electorate, unprotected by its supposed guardians, was left to swallow the lies whole.

 

            One would have thought that, with two centuries of progressive inoculation of the country against sedition laws (discussed in earlier columns) – just as Madison no doubt intended – the Press would have felt perfectly free to find out and tell the truth, to warn of the oncoming White House lies like Paul Revere sounding the alarm against the approaching redcoats.  And the Press was free, but in the end that freedom counted for next to nothing, until it was too late.

 

            This was not an accident.  It was the culmination of a generation of newly sophisticated mechanisms of press control, what Michael Wolff has called “the great conservative message apparatus”: a multi-level, multi-pronged machine for putting out the chosen story, and drowning out all others.

 

            At the bottom is the cheap, sleazy and multifarious agitprop underworld described by David Brock in Blinded by the Right (2002), his memoir of life within it, during which he put out hatchet job “biographies” of Anita Hill and Hillary Clinton; the Swift Boaters are a more recent and notorious manifestation.  Also inhabiting the bottom are the make-believe journalists seeded throughout the media, e.g. Jeff Gannon, the pseudonymous rent boy mysteriously issued White House press credentials and used to toss softball questions at George Bush in press conferences, and flacks Armstrong Williams, Karen Ryan, Michael McManus, and Maggie Gallagher, each on the government payroll while posing as a journalist in newspapers and on television, promoting the Bush agenda.   Add also the fake news reporting done by the Pentagon, which wrote stories for republication as news in Iraqi newspapers, brought to light in 2005.

 

            In the middle are the explicitly right-wing media, the Bill O’Reillys and Rush Limbaughs, who populate “Angry White Male” radio and Fox News, and their somewhat more refined brethren at the American Enterprise Institute and the Hoover Institution, whose real product is often not news or commentary but an attitude, specifically an attitude toward the mainstream media (frequently called by their initials, “MSM”).  That attitude is one of embattled righteous indignation.  The attitude is a tactic, more sophisticated than may at first appear, to “decertify” the press in the public eye, as Jay Rosen, former journalism chair at NYU, put it.   The goal is, as Salon columnist Eric Boehlert summarized in his recent book Lapdogs (2006): “to create a news culture where there are few if any agreed upon facts, therefore making serious debate impossible.”

 

            When it comes to the top, i.e. the part of the government that interfaces with the press, most typically through background briefings and leaks, the message is even clearer.  A White House aide, probably Karl Rove, told author Ron Suskind in 2004 that “a judicious study of discernible reality” is “not the way the world really works anymore… We’re an empire now, and when we act, we create our own reality.”  So much for facts; so much for what Rove (?) scornfully called “the reality-based community.”

 

            The typical right-wing knock on the MSM is that it is “liberal,” meaning that it proceeds with a “liberal” bias, and thus with a bias.  These are two separate accusations.  The first attributes to the MSM a series of attitudes known as “liberal,” a term whose meaning is now somewhat vague but which certainly includes things like multilateralism, concern for the ecology, resistance to Evangelical hegemonism, and skepticism toward military solutions to national problems.  These attitudes, thought to be bad things, are attributed without much evidence to the practitioners of mainstream journalism.  And at that point, the second accusation becomes viable, namely, that since the MSM proceed from a bias, the objectivity of their reporting, and hence the accuracy of what they report, including but not limited to reports of governmental dishonesty about war and peace, may be rejected out of hand by reasonable voters.

 

            But what is the real record here?  The truth is that the MSM have been coopted and cowed.  They do not articulate “liberal” views or evidence a “liberal” bias.  And, unfortunately – probably owing in part to the success of the right-wing attacks – when it came to George Bush’s wars, they failed for a very long time to report what they knew, to investigate what they should have investigated, or to give proper context to their reasonable suspicions.

 

            Some specifics.  Here I acknowledge that I rely entirely on Boehlert’s Lapdogs and Frank Rich’s The Greatest Story Ever Sold (2006) for a few representative examples (and I am only scratching the surface):

 

•          Regarding WMD, Judith Miller of the Times, in 2001, during the run-up to the war, provided crucial credibility to the tall tales of Iraqi defector Adnan Ihsan Saeed al-Haideri, who claimed to have worked on renovated underground sites for biological, chemical and nuclear weapons.  Later she reported on the “aluminum tubes” red herring (tubes supposedly useful only as centrifuges for uranium refinement).  This story was enormously influential in swaying public opinion.  (If the “liberal” Times said it, it had to be true, surely.)  Members of the scientific community immediately stepped forward to challenge the tubes story.  Miller refused to run their doubts.  She kept on believing in WMD long after sanity had prevailed in most of the media.  Miller became, in general, such an Administration pet that she ultimately went to jail to protect Scooter Libby as a confidential source in the White House effort to discredit Ambassador Joseph Wilson, who had told the truth about phony White House intelligence suggesting Niger uranium sales to Iraq.

 

•          On March 7, 2003, the authoritative Mohamed ElBaradei, Director General of the International Atomic Energy Agency, suggested strongly that the documents whose contents Wilson had denounced were forged.  The White House press corps failed to ask a single question about ElBaradei’s remarks until March 14, five press briefings later.

 

•          In February 2003 alone, the Washington Post editorialized in favor of going to war nine times (after fifteen additional favorable editorials in the preceding five months).

 

•          In October 2002, retired Marine General Anthony Zinni, former head of Central Command in the Middle East, gave the keynote speech at a meeting of the Middle East Institute, a Washington think tank, in which he warned that war was unnecessary, and that Saddam was containable.  The Post buried the story on Page 16.

 

•          According to Fairness and Accuracy in Reporting, a watchdog group, of the 393 people interviewed on-camera for network news reports about the upcoming war in the month before the Iraq invasion, only 6 percent expressed skepticism.  But then, according to media analyst Andrew Tyndall, of the 414 Iraq stories broadcast on NBC, ABC and CBS from September 2002 to February 2003, almost all could be traced to Administration sources.  The mainstream television news coverage was a 24-hour Bush Administration propaganda-fest.

 

•          MSNBC fired Phil Donahue, whose show had the best rating on the network, in early 2003 in the wake of an internal NBC memo pointing out that its media rivals were waving the flag about the war and Donahue was a doubting liberal.

 

•          CNN cleared with the Pentagon the retired generals it planned to use as on-air commentators during the war.

 

•          The MSM consistently and outrageously downplayed the existence and strength of the U.S. antiwar movement.  On October 26, 2002, more than 100,000 people gathered in Washington to protest the war.  The Times covered it in a small story on Page 8, falsely stating that it drew “fewer people … than organizers had said they hoped for.”  The Post covered it halfway down the front page of the Metro Section, with what ombudsman Michael Getler called “a couple of ho-hum photographs that captured the protest’s fringe elements.”  When Cindy Sheehan set up her August 2005 vigil outside the President’s home in Crawford, she was initially ignored.  Boehlert points up this startling figure: between August 5 and August 8 CNN mentioned her eight times – and Britney Spears eighteen times.  When Sheehan led a protest in Washington on September 24 which drew between 100,000 and 200,000 participants, the MSM effectively ignored her, opting to lavish coverage on Hurricane Rita, which was nowhere near as destructive or dangerous as Hurricane Katrina a few weeks earlier, and in many ways a less important story than the protest.

 

•          Even Bob Woodward, the man whose typewriter had been so instrumental in bringing down Richard Nixon, was sucked in.  His notable contributions during the war were a trilogy of books about the Bush White House, the product of an extraordinary degree of access.  For the first two books, Bush at War (2002) and Plan of Attack (2004), almost nothing derogatory, nothing revelatory of the concerted campaign of deceit, came out.  A first-class muckraker had been turned into a lapdog.  The third volume, State of Denial (2006), no longer maintained the worshipful tone, but only demoted Bush and his war cabinet to a crew of self-deluded incompetents, and still failed to acknowledge the amoral deceit which was their hallmark in selling the war.

 

            And the MSM are still being lambasted for being too “liberal.”

 

            Naturally, you can only fool all of the people some of the time.  Katrina seems to have been the tipping point that ended the Press’s largely free ride for the Administration’s war lies.   But the tameness of the MSM, the going-along with stage-managed presidential press conferences, the acceptance of flacks as commentators, and the silencing of dissenting voices, delayed that moment, which should have arrived before the first soldier set foot in Iraq.  Now we are in the hole to the tune of thousands of lives and billions of dollars, and, very possibly, a lost place at the center of the history of this century.

 

            Way to go, Fourth Estate!

 

Copyright (c) Jack L. B. Gohn

 


War Powers, War Lies: A Series: Part XX: Mine To Know

 

            As we have seen, our national leaders lust for impunity from criticism, and in time of war they have tended to arrogate to themselves, specifically as war powers, means to suppress critics.  Because the First Amendment has developed to make it nearly impossible to jail wartime critics simply for being wartime critics, our government has sought out new tactics.  A newer one, particularly employed by the current administration, is an effort to starve criticism by cutting off information.

 

            In theory the government belongs to us all and works for us all, and the actions it takes and the documents it generates and retains with taxpayer money should be available to taxpayers upon request.  In practice, government bureaucracies sequester massive quantities of information.  War affords heightened justification for that sequestration.

 

            To curb the governmental impulse to conceal, Congress has passed many laws: the Presidential Records Act,  the Federal Advisory Committee Act,  the Foreign Intelligence Surveillance Act,  the General Accounting Office Act, and whistleblower provisions of the Civil Service Reform Act among them.  Perhaps most important is the Freedom of Information Act (“FOIA”),  which allows anyone to require the government to stand and deliver information within its possession.  As originally passed, none of these laws made exceptions for wartime circumstances.

 

            The men now occupying the White House have always hated these laws.  Their opposition was chronicled in Prof. Alasdair Roberts’ study, Blacked Out (2006).   For instance, when first passed, FOIA had no provisions for judicial review; if you were turned down by the agency holding the documents you wanted, you were out of luck.  In 1974, outraged by information abuses in Watergate and Vietnam, Congress amended FOIA to provide for judicial review of denials.  President Ford vetoed the change.  Among those who counseled the now sainted Gerald Ford to take this indefensible step were Donald Rumsfeld, by then Ford’s Chief of Staff (and Dick Cheney’s boss), and Antonin Scalia, then head of the Justice Department’s Office of Legal Counsel. (Congress fortunately overrode this veto.)

 

            In January 2002 Dick Cheney told ABC News that the gradual passage of acts like those referenced above constituted “an erosion of the powers and the ability of the President of the United States to do his job.” And of course Cheney famously led efforts to keep the public from learning the identities of those who had participated in the 2001 meetings of a governmental organization, the National Energy Policy Development Group, presumably to rob critics of the ability to demonstrate factually the way the energy industry was driving the formulation of policy.

 

            Likewise, starting in March 2001, Alberto Gonzales, then White House Counsel, reportedly issued the first of three orders delaying beyond a lawful date compliance with the Presidential Records Act as to the release of Reagan Administration records.  This was followed by an Executive Order claiming a right of current administrations to block the release of records created by previous administrations.

 

            Then came 9/11, creating a justification not only for the disastrous Iraq adventure but also for massive exclusion of the citizenry from learning the workings of its own government. 

 

            An early intimation of the assertion of a pro-secrecy policy as, in effect, a war power, came in an October 12, 2001 memorandum of Attorney General John Ashcroft to all government agencies. Shorn of double-talk praising FOIA, the memo urged all agencies to slow down the disclosure of all information under FOIA, in part because of concerns about security.  In the name of security, Congress blew another hole in FOIA with the Homeland Security Act of 2002  and yet another with the Critical Infrastructure Information Act of 2002.   These laws shielded from public disclosure information relating to “critical infrastructure” operated by the private sector.   The effect of these laws on terrorism is unknown; a known effect was to roll back the availability of information to communities seeking to learn about environmental hazards created by the presence of industrial facilities in their neighborhoods.

 

            Ashcroft used the USA PATRIOT Act, a response to the terrorist threat, as justification for a claimed option to exclude the public from previously public deportation hearings.   He successfully fought lawsuits by the ACLU and others seeking to learn the names of the secret detainees seized around the world in 2001 and 2002.  He largely stymied Congressional inquiries to learn the same information.   It took years for defense lawyers and the press to secure the names of those held at Guantanamo; we still lack an accounting of those held in the secret international CIA gulag.  The secret detentions and all of the issues that surround them (torture, military tribunals, habeas corpus, etc.) are at the core of the debate about this country’s political course.  Hence this information, of all governmental information developed in the last five years, is perhaps the most critical to the fostering of a well-informed public debate.

 

            At the same time, the Administration began a program of aggressive classification of government documents and a slowdown or even reversal of declassification.  FOIA allows for withholding of classified information.  Under the new regime, however, there were a host of new classifications not contemplated under FOIA or the practice thereunder, and of unknown validity or impact: “Sensitive But Unclassified,” or “Sensitive Security Information” or “For Official Use Only” or “Homeland Security Sensitive” or “Law Enforcement Sensitive.”   In 2004, 15.6 million documents were classified, nearly double the number in 2001, while the statutory declassification process dropped from 304 million pages in 1997 to just 28 million pages in 2004.   In many cases, information which had been available publicly was taken off public shelves and websites, without notice or acknowledgment.

 

            The impact, as the New York Times complained in 2005, was that “innocuous White House press pool reports are now subject to classification, while historians complain of yearlong delays before academic requests are even acknowledged, never mind fulfilled.  Environmentalists can’t see routine dam and river drainage maps in the name of homeland security.” 

 

            We have been told by the White House that we shall be on a war footing with Islamic fundamentalist terrorists for a generation.   Apparently this “wartime” expansion of governmental secrecy will therefore likewise continue indefinitely, unless the public musters the indignation and the will to stop it.

 

            It must be stopped, of course.  It is intolerable that so much information, so much of it innocuous, should be gathered and created in our name and at our expense, and we be denied it.  But it is not merely a matter of declassifying innocuous information without military significance.  We also need to know more about things that are quite arguably war secrets (conceding only for the sake of argument that war is what we are in).

 

            The hidden CIA prisons and the secret NSA surveillance program, revealed, respectively, by the Washington Post on November 2, 2005 and by the New York Times and Los Angeles Times on December 16, 2005, would certainly have qualified as war secrets under this analysis. But they were also matters in which the public had a legitimate interest and an urgent need to learn of and debate. It may well be that the rage displayed by the White House at these revelations (Bush called the NSA leak a “shameful act”) was a “bridge too far” in its campaign to keep the public behind its concealments – and its “war.” While as of late 2005 the American public was still behind the “war” (with 50% believing we had done the right thing to invade), that number dropped to 46% by the end of the year. The public manifestly cared not only about the mission but about the means by which it was accomplished.

            The Administration should have leveled with the American people about these means. These were matters far too important to have been kept as war secrets. The reflexive stance of the Administration, as we have all learned, is that everything that has any conceivable intelligence use, either for us or our adversaries, is a war secret. As such it is by definition too sensitive to be revealed to anyone, be that person a journalist, a scholar, the defense counsel for the accused in a Guantanamo military commission, or someone like Maher Arar suing the government because he was abducted, rendered to a third country, and tortured, under a misapprehension as to his identity. If that standard were effectively implemented, however, the ongoing national debate about our values in a time of war would be stifled for want of basic data.

            The response of the Administration and its apologists has been, essentially, that protection of national security trumps solicitude for national debate about wartime values. This is, of course, terribly convenient for any administration that wants to silence critics, especially since the war is avowedly permanent. The silencing of critics would then become permanent as well. This is not acceptable.

            Obviously, then, we need a new approach, one more solicitous of the public’s need and right to know. We need a revived and strengthened FOIA, implemented with a new bureaucracy to undo the overclassification that the Bush bureaucracy has wrought. Just as obviously, we won’t get these things tomorrow.

            In the meantime, however, there remain ways in which the balance can be righted and the public informed, notwithstanding all governmental efforts to prevent it. What does not come out through the front door is apt to exit the windows. Professor Roberts makes a comparison worth pondering.

            In 1969, Daniel Ellsberg, then a RAND Corporation analyst engaged to write a classified history of the Vietnam War, was given the documents that the world later knew as the Pentagon Papers. Those documents were secured in his office safe under a password so secret it was outside even the RAND security system. When he decided that the documents needed to be leaked to the New York Times, it was no light matter, even from a logistical standpoint. It took him six weeks of covert effort to make photocopies of the 7,000 pages. Breaking his pledge of secrecy and overcoming security precautions designed to enforce that pledge was quite difficult.

            By contrast, in December 2002, Treasury Secretary Paul O’Neill was dismissed for want of political loyalty, i.e., for criticizing Bush’s economic policies and for confirming that the Iraq invasion had been planned since the first National Security Council meeting of the Bush administration. As Roberts puts it, O’Neill “walked out of his office with a CD-ROM that contained 19,000 documents.” Just like that.

            You cannot really classify the truth for all that long, even in “wartime.”

            We would be far better served if the notion of information classification were completely severed from the concept of war powers. We need one Constitution, as the Supreme Court said long ago, in war and in peace. And we need one FOIA too.

Copyright (c) Jack L. B. Gohn

The Big Picture Home Page | Previous Big Picture Column |  Next Big Picture Column

War Powers Page | Previous War Powers Column | Next War Powers Column