Tourists of Empire

Afghanistan war cartoon

By William Astore, a retired Air Force lieutenant colonel, who discusses America’s peculiar brand of global imperialism: in Afghanistan and elsewhere, he argues, the U.S. is suffering from Imperial Tourism Syndrome. Published: October 28, 2015 | Author: William Astore | TomDispatch | Op-Ed

(Photos and cartoons have been inserted from media)

The United States is a peculiar sort of empire.  As a start, Americans have been in what might be called imperial denial since the Spanish-American War of 1898, if not before.  Empire — us?  We denied its existence even while our soldiers were administering “water cures” (aka waterboarding) to recalcitrant Filipinos more than a century ago.  Heck, we even told ourselves we were liberating those same Filipinos, which leads to a second point: the U.S. not only denies its imperial ambitions, but shrouds them in a curiously American brand of Christianized liberation theology.  In it, American troops are never seen as conquerors or oppressors, always as liberators and freedom-bringers, or at least helpers and trainers.  There’s just enough substance to this myth (World War II and the Marshall Plan, for example) to hide uglier imperial realities.

Denying that we’re an empire while cloaking its ugly side in missionary-speak are two enduring aspects of the American brand of imperialism, and there’s a third as well, even if it’s seldom noted.  As the U.S. military garrisons the planet and its special operations forces alone visit more than 140 countries a year, American troops have effectively become the imperial equivalent of globetrotting tourists.  Overloaded with technical gear and gadgets (deadly weapons, intrusive sensors), largely ignorant of foreign cultures, they arrive eager to help and spoiling for action, but never (individually) staying long.  Think of them as the twenty-first-century version of the ugly American of Vietnam-era fame.

Cartoon: Afghanistan

The ugliest of Americans these days may no longer be the meddling CIA operative of yesteryear; “he” may not even be human but a “made in America” drone. Think of such drones as especially unwelcome American tourists, cruising the exotic and picturesque backlands of the planet loaded with cameras and weaponry, ready to intervene in deadly ways in matters its operators, possibly thousands of miles away, don’t fully understand.  Like normal flesh-and-blood tourists, the drone “sees” the local terrain, “senses” local activity, “detects” patterns among the inhabitants that appear threatening, and then blasts away.  The drone and its operators, of course, don’t live in the land or grasp the nuances of local life, just as real tourists don’t.  They are literally above it all, detached from it all, and even as they kill, often wrongfully, they’re winging their way back home to safety.

Imperial Tourism Syndrome

Call it Imperial Tourist Syndrome, a bizarre American affliction that creates its own self-sustaining dynamic.  To a local, it might look something like this: U.S. forces come to your country, shoot some stuff up (liberation!), take some selfies, and then, if you’re lucky, leave (at least for a while).  If you’re unlucky, they overstay their “welcome,” surge around a bit and generate chaos until, sooner or later (in places like Iraq and Afghanistan, much, much later), they exit, not always gracefully (witness Saigon 1975 or Iraq 2011).

Drone death protest

And here’s the weirdest thing about this distinctly American version of the imperial: a persistent short-time mentality seems only to feed its opposite, wars that persist without end.  In those wars, many of the country’s heavily armed imperial tourists find themselves sent back again and again for one abbreviated tour of duty after another, until it seems less like an adventure and more like a jail sentence.

The paradox of short-timers prosecuting such long-term wars is irresolvable because, as has been repeatedly demonstrated in the twenty-first century, those wars can’t be won.  Military experts criticize the Obama administration for lacking an overall strategy, whether in Syria, Iraq, Afghanistan, or elsewhere.  They miss the point.  Imperial tourists don’t have a strategy: they have an itinerary.  If it’s Tuesday, this must be Yemen; if it’s Wednesday, Libya; if it’s Thursday, Iraq.

Drone deaths

In this way, America’s combat tourists keep cycling in and out of foreign hotspots, sometimes on yearly tours, often on much shorter ones.  They are well-armed, as you’d expect in active war zones like Iraq or Afghanistan.  Like regular tourists, however, they carry cameras as well as other sensors and remain alert for exotic photo-shoots to share with their friends or the folks back home.  (Look here, a naked human pyramid in Abu Ghraib Prison!)

As tourists, they’re also alert to the possibility that on this particular imperial safari some exotic people may need shooting.  There’s a quip that’s guaranteed to win knowing chuckles within military circles: “Join the Army, travel to exotic lands, meet interesting people — and kill them.”  Originally an anti-war slogan from the Vietnam era, it’s become somewhat of a joke in a post-9/11 militarized America, one that quickly pales when you consider the magnitude of foreign body counts in these years, made more real (for us, at least) when accompanied by discomforting trophy photos of U.S. troops urinating on enemy corpses or posing with enemy body parts.

Here’s the bedrock reality of Washington’s twenty-first-century conflicts, though: no matter what “strategy” is concocted to fight them, we’ll always remain short-time tourists in long-term wars.

Imperial Tourism: A Surefire Recipe for Defeat

Drone shooting missile

It’s all so tragically predictable.  When it’s imperial tourists against foreign “terrorists,” guess who wins?  No knock on American troops.  They have no shortage of can-do spirit.  They fight to win.  But when their imperial vacations (military interventions/invasions) morph into neocolonial staycations (endless exercises in nation-building, troop training, security assistance, and the like), they have already lost, no matter how many “having a great time” letters — or rather glowing progress reports to Congress — are sent to the folks back home.

By definition, tourists, imperial or otherwise, always want to go home in the end.  The enemy, from the beginning, is generally already home.  And no clever tactics, no COIN (or counterinsurgency) handbook, no fancy, high-tech weapons or robotic man-hunters are ever going to change that fundamental reality.

It was a dynamic already obvious five decades ago in Vietnam: a ticket-punching mentality that involved the constant rotation of units and commanders; a process of needless reinvention of the most basic knowledge as units deployed, bugged out, and were then replaced by new units; and the use of all kinds of grim, newfangled weapons and sensors, everything from Agent Orange and napalm to the electronic battlefield and the latest fighter planes and bombers — all for naught.  Under such conditions, even the U.S. superpower lacked staying power, precisely because it never intended to stay.  The “staying” aspect of the Vietnam War was often referred to in the U.S. as a “quagmire.”  For the Vietnamese, of course, their country was no “big muddy” that sucked you down.  It was home.  They had little choice in the matter; they stayed — and fought.

Combine a military with a tourist-like itinerary and a mentality to match, a high command that in its own rotating responsibilities lacks all accountability for mistakes, and a byzantine, top-heavy bureaucracy, and you turn out to have a surefire recipe for defeat.  And once again, in the twenty-first century, whether among the rank and file or at the very top, there’s little continuity or accountability involved in America’s military presence in foreign lands.  Commanders are constantly rotated in and out of war zones.  There’s often a new one every year.  (I count 17 commanders for the International Security Assistance Force for Afghanistan, the U.S.-led military coalition, since December 2001.) U.S. troops may serve multiple overseas tours, yet they are rarely sent back to the same area.  Tours are sequential, not cumulative, and so the learning curve exhibited is flat.

There’s a scene at the beginning of season four of “Homeland” in which ex-CIA chief Saul Berenson is talking with some four-star generals.  He says: “If we’d known in 2001 we were staying in Afghanistan this long, we’d have made some very different choices.  Right?  Instead, our planning cycles rarely looked more than 12 months ahead.  So it hasn’t been a 14-year war we’ve been waging, but a one-year war waged 14 times.”

True enough.  In Afghanistan and Iraq as well, the U.S. has fought sequentially rather than cumulatively.  Not surprisingly, such sequential efforts, no matter how massive and costly, simply haven’t added up.  It’s just one damn tour after another.

But the fictional Saul’s tagline on Afghanistan is more suspect: “I think we’re walking away with the job half done.” For him, as well as for the Washington establishment of this moment, the U.S. needs to stay the course (at least until 2017, according to President Obama’s recent announcement), during which time assumedly we’ll at long last stumble upon the El-Dorado-like long-term strategy in which America actually prevails.

Of course, the option that’s never on Washington’s table is the obvious and logical one: simply to end imperial tourism.  With apologies to Elton John, “sorry” is only the second hardest word for U.S. officials.  The first is “farewell.” 

Bumper Sticker VVAW

A big defeat (Vietnam, 1975) might keep imperial tourism fever in check for a while.  But give us a decade or three and Americans are back at it, humping foreign hills again, hoping against hope that this year’s trip will be better than the previous year’s disaster.

In other words, a sustainable long-term strategy for Afghanistan is precisely what the U.S. government has failed to produce for 14 years!  Why should 2015 or 2017 or 2024 be any different than 2002 or 2009 or indeed any other year of American involvement?

Unarmed Vietnamese Hide from US My Lai Assault

At some level, the U.S. military knows it’s screwed.  That’s why its commanders tinker so much with weapons and training and technology and tactics.  It’s the stuff they can control, the stuff that seems real in a way that foreign peoples aren’t (at least to us).  Let’s face it: past as well as current events suggest that guns and how to use them are what Americans know best.

But foreign lands and peoples?  We can’t control them.  We don’t understand them.  We can’t count on them.  They’re just part of the landscape we’re eternally passing through — sometimes as people to help and places to rebuild, other times as people to kill and places to destroy.  What they aren’t is truly real.  They are the tourist attractions of American war making, sometimes exotic, sometimes deadly, but (for us) strangely lacking in substance.

And that is precisely why we fail.

Vets against Afghan and Iraq wars


New Testament Facts on Easter in Dispute

Yokosuka Chapel

This article originated from Father Dan's Blog

(Chapel in Yokosuka, Japan, where I attended and studied the Bible when I was 15 and 16)

I was going to save this until Sunday but then I started thinking - "Gee, this could be fun to discuss in church, after church at brunch - maybe Friday night during family fun time. . . ." After all, the Bible is the infallible, divinely-inspired word of God, right? The entire Christian religion is based on the resurrection of the Christ - so we know that part of the Good Book will be very accurate! The very crux of the argument for Christianity being the one true religion is that it is the only religion in which the Savior actually rose from the dead to fulfill prophecy - so let's look closer to see if there were any conflicts in the observations recorded in the Holy Book:

Father Dan's Easter Quiz:

1. Who first came to the tomb on Sunday morning? a. one woman (John 20:1) b. two women (Matt. 28:1) c. three women (Mark 16:1) d. more than three women (Luke 23:55-56; 24:1,10)

2. She (they) came a. while it was still dark (Matt. 28:1; John 20:1) b. after the sun had risen (Mark 16:2)

3. The woman (women) came to the tomb a. to anoint the body of Jesus with spices (Mark 16:1-2; Luke 24:1) b. just to look at it (Matt. 28:1; John 20:1)

4. The women had obtained the spices a. on Friday before sunset (Luke 23:54-56; 24:1) b. after sunset on Saturday (Mark 16:1)

5. The first visitor(s) was/were greeted by a. an angel (Matt. 28:2-5) b. a young man (Mark 16:5) c. two men (Luke 24:4) d. no one (John 20:1-2)

6. The greeter(s) a. was sitting on the stone outside the tomb (Matt 28:2) b. was sitting inside the tomb (Mark 16:5) c. were standing inside the tomb (Luke 24:3-4)

7. After finding the tomb empty, the woman/women a. ran to tell the disciples (Matt. 28:7-8; Mark 16:10; Luke 24:9; John 20:2) b. ran away and said nothing to anyone (Mark 16:8)

8. The risen Jesus first appeared to a. Mary Magdalene alone (John 20:14; Mark 16:9) b. Cleopas and another disciple (Luke 24:13,15,18) c. Mary Magdalene and the other Mary (Matt. 28:1,9) d. Cephas (Peter) alone (1 Cor. 15:4-5; Luke 24:34)

9. Jesus first appeared a. somewhere between the tomb and Jerusalem (Matt. 28:8-9) b. just outside the tomb (John 20:11-14) c. in Galilee - some 80 miles (130 Km) north of Jerusalem (Mark 16:6-7) d. on the road to Emmaus - 7 miles (11 Km) west of Jerusalem (Luke 24:13-15) e. we are not told where (Mark 16:9; 1 Cor. 15:4-5)

10. The disciples were to see Jesus first a. in Galilee (Mark 16:7; Matt. 28:7,10,16) b. in Jerusalem (Mark 16:14; Luke 24:33,36; John 20:19; Acts 1:4)

11. The disciples were told that they would meet the risen Jesus in Galilee a. by the women, who had been told by an angel of the Lord, then by Jesus himself after the resurrection (Matt. 28:7-10; Mark 16:7) b. by Jesus himself, before the crucifixion (Matt. 26:32; Mark 14:28)

12. The risen Jesus a. wanted to be touched (John 20:27) b. did not want to be touched (John 20:17) c. did not mind being touched (Matt. 28:9-10)

13. Jesus ascended to Heaven a. the same day that he was resurrected (Mark 16:9,19; Luke 24:13,28-36,50-51) b. forty days after the resurrection (Acts 1:3,9) c. we are not told that he ascended to Heaven at all (Matt. 28:10, 16-20; John 21:25; the original Gospel of Mark ends at 16:8)

14. The disciples received the Holy Spirit a. 50 days after the resurrection (Acts 1:3,9) b. in the evening of the same day as the resurrection (John 20:19-22)

15. The risen Jesus a. was recognized by those who saw him (Matt. 28:9; Mark 16:9-10) b. was not always recognizable (Mark 16:12; Luke 24:15-16,31,36-37; John 20:14-15)

16. The risen Jesus a. was physical (Matt. 28:9; Luke 24:41-43; John 20:27) b. was not physical (Mark 16:9,12,14; Luke 24:15-16,31,36-37; John 20:19,26; 1 Cor. 15:5-8)

17. The risen Jesus was seen by the disciples a. presumably only once (Matt. 28:16-17) b. first by two of them, later by all eleven (Mark 16:12-14; Luke 24:13-15,33,36-51) c. three times (John 20:19,26; 21:1,14) d. many times (Acts 1:3)

18. When Jesus appeared to the disciples a. there were eleven of them (Matt. 28:16-17; Luke 24:33,36) b. twelve of them (1 Cor. 15:5)

Hey, when has religion ever let facts or figures get in the way of a good quote? If this quiz has in any way shaken your faith, simply open the Bible and pull out sentences at random that make you feel good or (completely out of context) reaffirm any belief you want to hold.

Modern-day Easter is derived from two ancient traditions: one Judeo-Christian and the other Pagan. Both Christians and Pagans have celebrated death and resurrection themes following the Spring Equinox for millennia. Most religious historians believe that many elements of the Christian observance of Easter were derived from earlier Pagan celebrations.

The equinox occurs each year on March 20, 21 or 22. Both Neopagans and Christians continue to celebrate religious rituals in the present day. Wiccans and other Neopagans usually hold their celebrations on the day or eve of the equinox. Western Christians wait until the first Sunday after the next full moon. The Eastern Orthodox churches follow the Julian Calendar, so their celebration is generally many weeks after that of the Western churches.

The name "Easter" originated with the names of an ancient Goddess and God. The Venerable Bede (672-735 CE), a Christian scholar, first asserted in his book De Ratione Temporum that Easter was named after Eostre (a.k.a. Eastre). She was the Great Mother Goddess of the Saxon people in Northern Europe. Similar "Teutonic dawn goddess of fertility [were] known variously as Ostare, Ostara, Ostern, Eostra, Eostre, Eostur, Eastra, Eastur, Austron and Ausos." Her name was derived from the ancient word for spring: "eastre." Similar Goddesses were known by other names in ancient cultures around the Mediterranean, and were celebrated in the springtime. Some were:

Aphrodite, from Cyprus
Astarte, from Phoenicia
Demeter, from Mycenae
Hathor, from Egypt
Ishtar, from Assyria
Kali, from India
Ostara, a Norse Goddess of fertility.

But WAIT! Various early church writers, such as Irenaeus (Bishop of Lyons; circa 120 to ?), Justin Martyr (Christian apologist; 100 to 165), and Tertullian (Christian theologian; circa 160 to 220+), concluded that the Pagan/Christian similarities were a Satanic attempt at "diabolical mimicry." Satan was said to have used "plagiarism by anticipation." That is, the Devil replicated the life experiences of Jesus, centuries before his birth. The reason was to confuse the public into thinking that Jesus was merely a copy of previous godmen.

Sepulveda Unitarian Universalist Society "Onion" (the structure where liberal services, weddings, concerts, poetry and book signings, open mike nights, and other non-denominational, child-friendly activities are held)


Our “Merciful” Ending to the “Good War”

Or How Patriotism Means Never Having To Say You’re Sorry

By Christian Appy

Hiroshima explosion; bomber over Hiroshima; Hiroshima torii burning

Hiroshima Einstein

WWII Enola Gay dropped the atomic bomb on Hiroshima August 6, 1945

“Never, never waste a minute on regret. It's a waste of time.”

-- President Harry Truman

Here we are, 70 years after the nuclear obliteration of Hiroshima and Nagasaki, and I'm wondering if we've come even one step closer to a moral reckoning with our status as the world's only country to use atomic weapons to slaughter human beings. Will an American president ever offer a formal apology? Will our country ever regret the dropping of “Little Boy” and “Fat Man,” those two bombs that burned hotter than the sun? Will it absorb the way they instantly vaporized thousands of victims, incinerated tens of thousands more, and created unimaginably powerful shockwaves and firestorms that ravaged everything for miles beyond ground zero?  Will it finally come to grips with the “black rain” that spread radiation and killed even more people -- slowly and painfully -- leading in the end to a death toll for the two cities conservatively estimated at more than 250,000?

Given the last seven decades of perpetual militarization and nuclear “modernization” in this country, the answer may seem like an obvious no. Still, as a historian, I've been trying to dig a little deeper into our lack of national contrition. As I have, an odd fragment of Americana kept coming to mind, a line from the popular 1970 tearjerker Love Story: “Love,” says the female lead when her boyfriend begins to apologize, “means never having to say you're sorry.” It has to be one of the dumbest definitions ever to lodge in American memory, since real love often requires the strength to apologize and make amends.

It does, however, apply remarkably well to the way many Americans think about that broader form of love we call patriotism. With rare exceptions, like the 1988 congressional act that apologized to and compensated the Japanese-American victims of World War II internment, when it comes to the brute exercise of power, true patriotism has above all meant never having to say you're sorry. The very politicians who criticize other countries for not owning up to their wrong-doing regularly insist that we should never apologize for anything. In 1988, for example, after the U.S. Navy shot down an Iranian civilian airliner over the Persian Gulf killing all 290 passengers (including 66 children), Vice President George H.W. Bush, then running for president, proclaimed, “I will never apologize for the United States. Ever. I don't care what the facts are.”

It turns out, however, that Bush's version of American remorselessness isn’t quite enough. After all, Americans prefer to view their country as peace-loving, despite having been at war constantly since 1941. This means they need more than denials and non-apologies. They need persuasive stories and explanations (however full of distortions and omissions). The tale developed to justify the bombings that led to a world in which the threat of human extinction has been a daily reality may be the most successful legitimizing narrative in our history. Seventy years later, it’s still deeply embedded in public memory and school textbooks, despite an ever-growing pile of evidence that contradicts it. Perhaps it’s time, so many decades into the age of apocalyptic peril, to review the American apologia for nuclear weapons -- the argument in their defense -- that ensured we would never have to say we're sorry.

The Hiroshima Apologia

On August 9, 1945, President Harry Truman delivered a radio address from the White House. “The world will note,” he said, “that the first atomic bomb was dropped on Hiroshima, a military base. That was because we wished in this first attack to avoid, insofar as possible, the killing of civilians.” He did not mention that a second atomic bomb had already been dropped on Nagasaki.

Truman understood, of course, that if Hiroshima was a “military base,” then so was Seattle; that the vast majority of its residents were civilians; and that perhaps 100,000 of them had already been killed. Indeed, he knew that Hiroshima was chosen not for its military significance but because it was one of only a handful of Japanese cities that had not already been firebombed and largely obliterated by American air power. U.S. officials, in fact, were intent on using the first atomic bombs to create maximum terror and destruction. They also wanted to measure their new weapon’s power and so selected the “virgin targets” of Hiroshima and Nagasaki. In July 1945, Secretary of War Henry Stimson  informed Truman of his fear that, given all the firebombing of Japanese cities, there might not be a target left on which the atomic bomb could “show its strength” to the fullest. According to Stimson's diary, Truman “laughed and said he understood.”

The president soon dropped the “military base” justification. After all, despite Washington's effort to censor the most graphic images of atomic annihilation coming out of Hiroshima, the world quickly grasped that the U.S. had destroyed an entire city in a single blow with massive loss of life. So the president focused instead on an apologia that would work for at least the next seven decades. Its core arguments appeared in that same August 9th speech. “We have used [the atomic bomb] against those who attacked us without warning at Pearl Harbor,” he said, “against those who have starved and beaten and executed American prisoners of war, against those who have abandoned all pretense of obeying international laws of warfare. We have used it in order to shorten the agony of war, in order to save the lives of thousands and thousands of young Americans.”

By 1945, most Americans didn't care that the civilians of Hiroshima and Nagasaki had not committed Japan's war crimes. American wartime culture had for years drawn on a long history of “yellow peril” racism to paint the Japanese not just as inhuman, but as subhuman. As Truman put it in his diary, it was a country full of “savages” -- “ruthless, merciless, and fanatic” people so loyal to the emperor that every man, woman, and child would fight to the bitter end. In these years, magazines routinely depicted Japanese as monkeys, apes, insects, and vermin. Given such a foe, so went the prevailing view, there were no true “civilians” and nothing short of near extermination, or at least a powerful demonstration of America's willingness to proceed down that path, could ever force their surrender. As Admiral William “Bull” Halsey said in a 1944 press conference, “The only good Jap is a Jap who's been dead six months.”

In the years after World War II, the most virulent expressions of race hatred diminished, but not the widespread idea that the atomic bombs had been required to end the war, eliminating the need to invade the Japanese home islands where, it was confidently claimed, tooth-and-nail combat would cause enormous losses on both sides. The deadliest weapon in history, the one that opened the path to future Armageddon, had therefore saved lives. That was the stripped down mantra that provided the broadest and most enduring support for the introduction of nuclear warfare. By the time Truman, in retirement, published his memoir in 1955, he was ready to claim with some specificity that an invasion of Japan would have killed half-a-million Americans and at least as many Japanese.

Over the years, the ever-increasing number of lives those two A-bombs “saved” became a kind of sacred numerology. By 1991, for instance, President George H.W. Bush, praising Truman for his “tough, calculating decision,” claimed that those bombs had “spared millions of American lives.” By then, an atomic massacre had long been transformed into a mercy killing that prevented far greater suffering and slaughter.

Truman went to his grave insisting that he never had a single regret or a moment's doubt about his decision. Certainly, in the key weeks leading up to August 6, 1945, the record offers no evidence that he gave serious consideration to any alternative.

“Revisionists” Were Present at the Creation

Twenty years ago, the Smithsonian's National Air and Space Museum planned an ambitious exhibit to mark the 50th anniversary of the end of World War II. At its center was to be an extraordinary artifact -- the fuselage of the Enola Gay, the B-29 Superfortress used to drop the atomic bomb on Hiroshima. But the curators and historical consultants wanted something more than yet another triumphal celebration of American military science and technology. Instead, they sought to assemble a thought-provoking portrayal of the bomb's development, the debates about its use, and its long-term consequences. The museum sought to include some evidence challenging the persistent claim that it was dropped simply to end the war and “save lives.”

For starters, visitors would have learned that some of America's best-known World War II military commanders opposed using atomic weaponry. In fact, six of the seven five-star generals and admirals of that time believed that there was no reason to use them, that the Japanese were already defeated, knew it, and were likely to surrender before any American invasion could be launched. Several, like Admiral William Leahy and General Dwight Eisenhower, also had moral objections to the weapon. Leahy considered the atomic bombing of Japan “barbarous” and a violation of “every Christian ethic I have ever heard of and all of the known laws of war.”

Truman did not seriously consult with military commanders who had objections to using the bomb.  He did, however, ask a panel of military experts to offer an estimate of how many Americans might be killed if the United States launched the two major invasions of the Japanese home islands scheduled for November 1, 1945 and March 1, 1946. Their figure: 40,000 -- far below the half-million he would cite after the war. Even this estimate was based on the dubious assumption that Japan could continue to feed, fuel, and arm its troops with the U.S. in almost complete control of the seas and skies.

The Smithsonian also planned to inform its visitors that some key presidential advisers had urged Truman to drop his demand for “unconditional surrender” and allow Japan to keep the emperor on his throne, an alteration in peace terms that might have led to an almost immediate surrender. Truman rejected that advice, only to grant the same concession after the nuclear attacks.

Keep in mind, however, that part of Truman's motivation for dropping those bombs involved not the defeated Japanese, but the ascending Soviet Union. With the U.S.S.R. pledged to enter the war against Japan on August 8, 1945 (which it did), Truman worried that even briefly prolonging hostilities might allow the Soviets to claim a greater stake in East Asia. He and Secretary of State James Byrnes believed that a graphic demonstration of the power of the new bomb, then only in the possession of the United States, might also make that Communist power more “manageable” in Europe. The Smithsonian exhibit would have suggested that Cold War planning and posturing began in the concluding moments of World War II and that one legacy of Hiroshima would be the massive nuclear arms race of the decades to come.

In addition to displaying American artifacts like the Enola Gay, Smithsonian curators wanted to show some heartrending objects from the nuclear destruction of Hiroshima, including a schoolgirl's burnt lunchbox, a watch dial frozen at the instant of the bomb's explosion, a fused rosary, and photographs of the dead and dying. It would have been hard to look at these items beside that plane’s giant fuselage without feeling some sympathy for the victims of the blast.

None of this happened. The exhibit was canceled after a storm of protest. When the Air Force Association leaked a copy of the initial script to the media, critics denounced the Smithsonian for its “politically correct” and “anti-American” “revision” of history. The exhibit, they claimed, would be an insult to American veterans and fundamentally unpatriotic. Though conservatives led the charge, the Senate unanimously passed a resolution condemning the Smithsonian for being “revisionist and offensive” that included a tidy rehearsal of the official apologia: “The role of the Enola Gay... was momentous in helping to bring World War II to a merciful end, which resulted in saving the lives of Americans and Japanese.”

Merciful? Consider just this: the number of civilians killed at Hiroshima and Nagasaki alone was more than twice the number of American troops killed during the entire Pacific war.

In the end, the Smithsonian displayed little but the Enola Gay itself, a gleaming relic of American victory in the “Good War.”

Our Unbroken Faith in the Greatest Generation  

In the two decades since, we haven't come closer to a genuine public examination of history's only nuclear attack or to finding any major fault with how we waged what Studs Terkel famously dubbed “the Good War.” He used that term as the title for his classic 1984 oral history of World War II and included those quotation marks quite purposely to highlight the irony of such thinking about a war in which an estimated 60 million people died. In the years since, the term has become an American cliché, but the quotation marks have disappeared along with any hint of skepticism about our motives and conduct in those years.

Admittedly, when it comes to the launching of nuclear war (if not the firebombings that destroyed 67 Japanese cities and continued for five days after “Fat Man” was dropped on Nagasaki), there is some evidence of a more critical cast of mind in this country. Recent polls, for instance, show that “only” 56% of Americans now think we were right to use nuclear weapons against Japan, down a few points since the 1990s, while support among Americans under the age of 30 has finally fallen below 50%. You might also note that just after World War II, 85% of Americans supported the bombings.

Of course, such pro-bomb attitudes were hardly surprising in 1945, especially given the relief and joy at the war's victorious ending and the anti-Japanese sentiment of that moment. Far more surprising: by 1946, millions of Americans were immersed in John Hersey's best-selling book Hiroshima, a moving report from ground zero that explored the atomic bomb's impact through the experiences of six Japanese survivors. It began with these gripping lines:

“At exactly fifteen minutes past eight in the morning, on August 6, 1945, Japanese time, at the moment when the atomic bomb flashed above Hiroshima, Miss Toshiko Sasaki, a clerk in the personnel department of the East Asia Tin Works, had just sat down at her place in the plant office and was turning her head to speak to the girl at the next desk.”

Hiroshima remains a remarkable document for its unflinching depictions of the bomb's destructiveness and for treating America's former enemy with such dignity and humanity. “The crux of the matter,” Hersey concluded, “is whether total war in its present form is justifiable, even when it serves a just purpose. Does it not have material and spiritual evil as its consequences which far exceed whatever good might result?”

The ABC Radio Network thought Hersey's book so important that it hired four actors to read it in full on the air, reaching an even wider audience. Can you imagine a large American media company today devoting any significant air time to a work that engendered empathy for the victims of our twenty-first century wars? Or can you think of a recent popular book that prods us to consider the “material and spiritual evil” that came from our own participation in World War II? I can't.

In fact, in the first years after that war, as Paul Boyer showed in his superb book By the Bomb’s Early Light, some of America's triumphalism faded as fears grew that the very existence of nuclear weapons might leave the country newly vulnerable. After all, someday another power, possibly the Soviet Union, might use the new form of warfare against its creators, producing an American apocalypse that could never be seen as redemptive or merciful.

In the post-Cold War decades, however, those fears have again faded (unreasonably so since even a South Asian nuclear exchange between Pakistan and India could throw the whole planet into a version of nuclear winter).  Instead, the “Good War” has once again been embraced as unambiguously righteous. Consider, for example, the most recent book about World War II to hit it big, Laura Hillenbrand's Unbroken: A World War II Story of Survival, Resilience, and Redemption. Published in 2010, it remained on the New York Times best-seller list in hardcover for almost four years and has sold millions of copies. In its reach, it may even surpass Tom Brokaw's 1998 book, The Greatest Generation. A Hollywood adaptation of Unbroken appeared last Christmas.

Hillenbrand’s book does not pretend to be a comprehensive history of World War II or even of the war in the Pacific. It tells the story of Louis Zamperini, a child delinquent turned Olympic runner turned B-24 bombardier. In 1943, his plane was shot down in the Pacific. He and the pilot survived 47 days in a life raft despite near starvation, shark attacks, and strafing by Japanese planes. Finally captured by the Japanese, he endured a series of brutal POW camps where he was the victim of relentless sadistic beatings.

The book is decidedly a page-turner, but its focus on a single American's punishing ordeal and amazing recovery inhibits almost any impulse to move beyond the platitudes of nationalistic triumphalism and self-absorption or consider (among other things) the racism that so dramatically shaped American combat in the Pacific. That, at least, is the impression you get combing through some of the astonishing 25,000 customer reviews Unbroken has received on Amazon. “My respect for WWII veterans has soared,” a typical reviewer writes. “Thank you Laura Hillenbrand for loving our men at war,” writes another. It is “difficult to read of the inhumanity of the treatment of the courageous men serving our country.” And so on.

Unbroken devotes a page and a half to the atomic bombing of Hiroshima, all of it from the vantage point of the American crew of the Enola Gay. Hillenbrand raises concerns about the crew's safety: “No one knew for sure if... the bomber could get far enough away to survive what was coming.” She describes the impact of the shockwaves, not on the ground, but at 30,000 feet when they slammed into the Enola Gay, “pitching the men into the air.”

The film version of Unbroken evokes even less empathy for the Japanese experience of nuclear war, which brings to mind something a student told my graduate seminar last spring. He teaches high school social studies and when he talked with colleagues about the readings we were doing on Hiroshima, three of them responded with some version of the following: “You know, I used to think we were wrong to use nukes on Japan, but since I saw Unbroken I've started to think it was necessary.” We are, that is, still in the territory first plowed by Truman in that speech seven decades ago.

At the end of the film, this note appears on the screen: “Motivated by his faith, Louie came to see that the way forward was not revenge, but forgiveness. He returned to Japan, where he found and made peace with his former captors.”

That is indeed moving. Many of the prison camp guards apologized, as well they should have, and -- perhaps more surprisingly -- Zamperini forgave them. There is, however, no hint that there might be a need for apologies on the American side, too; no suggestion that our indiscriminate destruction of Japan, capped off by the atomic obliteration of two cities, might be, as Admiral Leahy put it, a violation of “all of the known laws of war.”

So here we are, 70 years later, and we seem, if anything, farther than ever from a rejection of the idea that launching atomic warfare on Japanese civilian populations was an act of mercy. Perhaps some future American president will finally apologize for our nuclear attacks, but one thing seems certain: no Japanese survivor of the bombs will be alive to hear it.

 

Hiroshima 60th anniversary never again

Hiroshima Do Your Peace

Dalai Lama Peace message avatar


Nina Simone’s Time IS Now, Again

Nina Simone’s Time Is Now, Again (New York Times)

By SALAMISHAH TILLET JUNE 19, 2015

   

Nina Simone in 1969. A new documentary, “What Happened, Miss Simone?,” opens on Wednesday. Credit Jack Robinson/Hulton Archive - Getty Images


The feminist writer Germaine Greer once declared: “Every generation has to discover Nina Simone. She is evidence that female genius is real.” This year, that just might happen for good.

Nina Simone is striking posthumous gold as the inspiration for three films and a star-studded tribute album, and she was name-dropped in John Legend’s Oscar acceptance speech for best song. This flurry comes on the heels of a decade-long resurgence: two biographies, a poetry collection, several plays, and the sampling of her signature haunting contralto by hip-hop performers including Jay Z, the Roots and, most relentlessly, Kanye West.

Fifty years after the height of her prominence, Nina Simone is now reaching her peak.

The documentary “What Happened, Miss Simone?,” directed by Liz Garbus (“The Farm: Angola, USA”) and due on Wednesday in New York and two days later on Netflix, opens by exploring Simone’s unorthodox blend of dusky, deep voice, classical music, gospel and jazz piano techniques, and civil rights and black-power musical activism.

Not only did she compose the movement staple “Mississippi Goddam,” but she also broadened the parameters of the great American pop artist. “How can you be an artist and not reflect the times?” Simone asks in the film. “That to me is the definition of an artist.” And in “What Happened,” Simone emerges as a singer whose unflinching pursuit of musical and political freedom establishes her appeal for contemporary activism.

Simone’s androgynous voice, genre-breaking musicianship and political consciousness may have concerned ’60s and ’70s marketing executives and concert promoters, but those are a huge draw for today’s gay, lesbian, black and female artists who want to be taken seriously for their talent, their activism or a combination of both.

“Nina has never stopped being relevant because her activism was so right on, unique, strong, said with such passion and directness,” Ms. Garbus said in an interview at a Brooklyn bakery. “But why has she come back now?” she asked, answering her own question by pointing to how little has changed, citing the protests over the police killings of unarmed African-Americans like Michael Brown, Rekia Boyd, Eric Garner and Freddie Gray.

 


 

While Simone’s lyrical indictment of racial segregation and her work on behalf of civil rights organizations connect her to our contemporary moment, those closest to her felt more comfortable telling Simone’s story after her death in 2003. As Ms. Garbus said, “From a filmmaking point of view, the answer for her return is also because of the estate, and people being ready to relinquish some control of her story.”

In this case, it was Simone’s daughter, the singer and actress Lisa Simone Kelly, who shared personal diaries, letters, and audio and video footage with Ms. Garbus and has an executive producer credit on the film. Speaking by phone from her mother’s former home in Carry-le-Rouet, France, Ms. Kelly said: “It has been on my watch that this film was made. And I believe that my mother would have been forgotten if the family, my husband and I, had not taken the right steps to find the right team for her to be remembered in American culture on her own terms.”

Simone in N.Y. 1965            

Simone in New York in 1965. Credit Sam Falk/The New York Times

Ms. Kelly is only partly right. Over the last decade, a steady stream of reissued albums and previously unheard interviews and songs, as well as unseen concert footage, has flooded the market. But the estate has both enabled and impaired Simone’s revival. There has been a dizzying array of lawsuits over the rights to her master recordings in the last 25 years, a tangled situation that includes a recent Sony Music move to rescind a deal with the estate.

The most high-profile controversy about Simone’s legacy, however, involves Cynthia Mort’s biopic, “Nina,” due later this year. Starring Zoe Saldana in the title role, the film was initially beleaguered by public criticism over the casting, an antagonism further fueled by leaked photos of Ms. Saldana with darkened skin and a nose prosthetic. Eventually, the film’s release was set back even more by Ms. Mort’s own 2014 lawsuit against the production company, which she accused of hijacking the film, as The Hollywood Reporter put it.

Though Ms. Saldana told InStyle magazine that “I didn’t think I was right for the part,” the fallout and online petition calling for a boycott of the film nevertheless revealed a deep cultural investment in both Simone’s politics and aesthetics by a new generation.

The director Gina Prince-Bythewood said in a phone interview that she used Simone as the muse for her lead character, Noni (Gugu Mbatha-Raw), a biracial British pop sensation, in her 2014 film “Beyond the Lights” because “during her time, Nina was unapologetically black and proud of who she was, and it was reflected in the authenticity of songs like ‘Four Women.’ And this is something that Noni absolutely struggles with because she has been instructed to be a male fantasy.”

But for Ms. Prince-Bythewood, Simone is not simply an alternative to today’s image of an oversexualized or overmanufactured female artist, but the idol most suited for the multilayered identity politics of our social movements. “This moment of ‘Black Lives Matter,’ ” she said, “is a resurgence of racial pride but also a time in which black women are now at the forefront.”

Minnie Driver and Gugu Mbatha-Raw

Minnie Driver, left, and Gugu Mbatha-Raw in “Beyond the Lights.” Credit Suzanne Tenner/Relativity Media, via Associated Press

Like the renaissance of interest in Malcolm X in the early 1990s, Simone’s iconography arises in yet another time of national crisis. However, her biography, as an artist who was proudly black but steadfastly rejected the musical, sexual and social conventions expected of African-American and female artists of her time, renders her a complicated pioneer.

Born Eunice Waymon in 1933, Simone grew up in segregated Tryon, N.C. At 3, she was playing her mother’s favorite gospel hymns for their church choir on piano; by 8, her talents garnered her so much attention that her mother’s white employer offered to pay for her classical music lessons for a year. Determined to become a premier classical pianist, Simone trained at Juilliard for a year, then sought and was denied admission to the Curtis Institute of Music in Philadelphia — a heartbreaking rejection that led to a series of reinventions — renaming herself Nina Simone, performing in Atlantic City nightclubs and adopting jazz standards in her repertoire.

She would go on to have her only Top 40 hit with “I Loves You, Porgy” in 1959 off her debut album, “Little Girl Blue.” To further her music career, Simone moved back to New York, where she befriended the activist-writers Lorraine Hansberry, James Baldwin, Langston Hughes and Malcolm X. Influenced by these political friendships and the momentum of the civil rights movement itself, Simone went on to compose “Mississippi Goddam” in 1964 in response to the assassination of the civil rights leader Medgar Evers in Mississippi and the murder of four African-American girls in a church bombing in Birmingham, Ala., a year earlier. The song was Simone at her best — a sly blend of the show tune, searing racial critique and apocalyptic warning.

Oh but this whole country is full of lies

You’re all gonna die and die like flies

I don’t trust you any more

You keep on saying “Go slow!”

“Go slow!”

Simone’s growing political involvement affected both her professional and personal life. Though she was bisexual, her longest romance was her 11-year turbulent marriage to Andy Stroud, a former police officer who managed her career for most of the ’60s. Stroud would use physical and sexual abuse to limit Simone’s activism and friendships, and to control her unpredictable emotional outbursts. Unfortunately, it would take another 20 years for Simone’s “mood swings” to be diagnosed as a bipolar disorder. In the interim, Simone left her marriage and country, becoming an expatriate in Liberia, Switzerland, then France. (In the film, Ms. Kelly says that because her mother became more symptomatic and abusive toward her, she had to move back in with her father.)

She had not only become more militant by aligning songs like “To Be Young, Gifted and Black” with Stokely Carmichael and the black power movement, but also found it increasingly difficult to secure contracts with American record companies. Looking back on this period in her 1991 memoir, “I Put A Spell on You,” Simone recalled, “The protest years were over not just for me but for a whole generation and in music, just like in politics, many of the greatest talents were dead or in exile and their place was filled by third-rate imitators.” She died in 2003 at her home in France.

“Nina Simone, more than anyone else, talked about using her art as a weapon against oppression, and she paid the price of it,” said Ernest Shaw, a visual artist who last year painted a mural featuring Malcolm X, James Baldwin and Simone on the wall of a Baltimore home just two miles from the scene of Freddie Gray’s arrest.

Today Simone’s multitudinous identity captures the mood of young people yearning to bring together our modern movements for racial, gender and sexual equality.

This is a large part of the appeal of the documentary “The Amazing Nina Simone,” by Jeff L. Lieberman, which features more than 50 interviews with Simone’s family, associates and academics (including me), scheduled to be released later this fall.

Nina Simone in 1965

Nina Simone in 1965 with her daughter, Lisa. Credit Associated Press

Mr. Lieberman said he wanted to explore the relationship between Simone and Marie-Christine Dunham Pratt, a former model and the only child of the dancer Katherine Dunham, because “many gay men and lesbians have long connected with Nina Simone because she was this outsider in her many worlds, sometimes sad, sometimes lonely, but always determined, and unrelenting in her fight for freedom.”

Still, the preoccupation with Simone has more to do with her sound than her life story. Those who have covered Simone on recent albums — including Algiers, a Southern gospel and punk band; Xiu Xiu, an experimental post-punk group; and Meshell Ndegeocello, the neo-soul, neo-funk artist — are remarkably different from one another. Their common use of Simone speaks to how her music cuts across race, gender and genre.

But it has been hip-hop, the genre that Simone once said had “ruined music, as far as I’m concerned,” that has kept her musically relevant more than anything else.

Lauryn Hill

Lauryn Hill. Credit Chad Batka for The New York Times

The two hip-hop artists most responsible for Simone’s current ubiquity are Kanye West and Lauryn Hill. Mr. West has rendered Simone hip-hop- and pop-friendly by sampling her in songs like “Bad News,” “New Day” and “Blood on the Leaves.” While he declined to comment on Simone, like her, he fashions himself as a controversial if not misunderstood rebel — a figure who wants to be appreciated as much for his refusal of artistic genres as for his musical virtuosity.

Ms. Hill was one of the first rappers to mention Simone in song — on the Fugees’ “Ready or Not” in 1996 — and she recorded several songs for “Nina Revisited: A Tribute to Nina Simone,” an album (due July 10) tied to “What Happened, Miss Simone?”

Jayson Jackson, Ms. Hill’s former manager and a producer of Ms. Garbus’s film, conceived “Nina Revisited,” and said that while working on the album, Ms. Hill told him, “I grew up listening to Nina Simone, so I believed everyone spoke as freely as she did.”

Paradoxically, Simone’s comeback also reveals an absence. A majority of pop artists — with the exception of a few like D’Angelo, J. Cole and Killer Mike — have largely been musically silent about police violence in Ferguson, Mo.; New York; and Baltimore.

Nina Simone 1968

Nina Simone in about 1968. Credit Getty Images

John Legend, who covered Simone on his own 2010 protest album with the Roots, “Wake Up!,” and recently started Free America, a campaign to end mass incarceration in the United States, attributes this absence to artists unwilling or unable to take positions outside the mainstream. “I don’t think it is career suicide to take on these positions, but I think there is actually a limited number of artists who really want to say something cogent about social issues, so most do not even dive in,” he said in an interview.

He added, “To follow in her footsteps, I think it takes a degree of savvy, consciousness, communication skills, and a vibrant intellectual community that most artists aren’t encouraged to cultivate.”

Today, Simone’s sound and style have made her a compelling example of racial, sexual and gender freedom. As Angela Davis explained in the liner notes for the album, “In representing all of the women who had been silenced, in sharing her incomparable artistic genius, she was the embodiment of the revolutionary democracy we had not yet learned how to imagine.”

Correction: June 28, 2015

An article last Sunday about the influence of the late singer Nina Simone on modern musicians misspelled the name of her hometown. It is Tryon, N.C., not Tyron.

Salamishah Tillet is an associate professor of English at the University of Pennsylvania, the author of “Sites of Slavery: Citizenship and Racial Democracy in the Post-Civil Rights Imagination” and is writing a book on Nina Simone.


The Monster of Monticello

This Fourth of July we should read Professor Paul Finkelman's excellent article, after many recent incidents of racial hatred and gun violence and President Obama's eulogy in Charleston, S.C., for the victims of another senseless, racially motivated killing at a Bible study inside a sanctuary. That church had been the target of racists before. We might look at the third president of our country and see where some of these roots derive from, despite the traditional reverence accorded to Jefferson because of his role in writing the Declaration of Independence. Even the flag of the Confederacy has provoked many to act to bring it down or to ask that governors consider doing so. Obama mentioned in his eulogy that bringing it down would not be an act of political correctness, nor would it detract from those who fought in the Civil War, but rather an acknowledgment that the cause the South fought for, the preservation of slavery, was wrong. Perhaps now we can face the need to make it harder to put guns into the hands of people not fit to handle them. Here are some sobering thoughts on one of our national heroes, who was a slaveholder when he drafted the Declaration of Independence. You may be surprised to find that his behavior fell far short of what we now expect of our leaders, yet in his time, and even now, he was, and is, revered for his passionate embrace of independence and the American Revolution.

By PAUL FINKELMAN NOV. 30, 2012

Jefferson        

Durham, N.C.

THOMAS JEFFERSON is in the news again, nearly 200 years after his death — alongside a high-profile biography by the journalist Jon Meacham comes a damning portrait of the third president by the independent scholar Henry Wiencek.

We are endlessly fascinated with Jefferson, in part because we seem unable to reconcile the rhetoric of liberty in his writing with the reality of his slave owning and his lifetime support for slavery. Time and again, we play down the latter in favor of the former, or write off the paradox as somehow indicative of his complex depths.

Neither Mr. Meacham, who mostly ignores Jefferson’s slave ownership, nor Mr. Wiencek, who sees him as a sort of fallen angel who comes to slavery only after discovering how profitable it could be, seems willing to confront the ugly truth: the third president was a creepy, brutal hypocrite.

Contrary to Mr. Wiencek’s depiction, Jefferson was always deeply committed to slavery, and even more deeply hostile to the welfare of blacks, slave or free. His proslavery views were shaped not only by money and status but also by his deeply racist views, which he tried to justify through pseudoscience.

There is, it is true, a compelling paradox about Jefferson: when he wrote the Declaration of Independence, announcing the “self-evident” truth that all men are “created equal,” he owned some 175 slaves. Too often, scholars and readers use those facts as a crutch, to write off Jefferson’s inconvenient views as products of the time and the complexities of the human condition.

But while many of his contemporaries, including George Washington, freed their slaves during and after the revolution — inspired, perhaps, by the words of the Declaration — Jefferson did not. Over the subsequent 50 years, a period of extraordinary public service, Jefferson remained the master of Monticello, and a buyer and seller of human beings.

Rather than encouraging his countrymen to liberate their slaves, he opposed both private manumission and public emancipation. Even at his death, Jefferson failed to fulfill the promise of his rhetoric: his will emancipated only five slaves, all relatives of his mistress Sally Hemings, and condemned nearly 200 others to the auction block. Even Hemings remained a slave, though her children by Jefferson went free.

Nor was Jefferson a particularly kind master. He sometimes punished slaves by selling them away from their families and friends, a retaliation that was incomprehensibly cruel even at the time. A proponent of humane criminal codes for whites, he advocated harsh, almost barbaric, punishments for slaves and free blacks. Known for expansive views of citizenship, he proposed legislation to make emancipated blacks “outlaws” in America, the land of their birth. Opposed to the idea of royal or noble blood, he proposed expelling from Virginia the children of white women and black men.

Jefferson also dodged opportunities to undermine slavery or promote racial equality. As a state legislator he blocked consideration of a law that might have eventually ended slavery in the state.

As president he acquired the Louisiana Territory but did nothing to stop the spread of slavery into that vast “empire of liberty.” Jefferson told his neighbor Edward Coles not to emancipate his own slaves, because free blacks were “pests in society” who were “as incapable as children of taking care of themselves.” And while he wrote a friend that he sold slaves only as punishment or to unite families, he sold at least 85 humans in a 10-year period to raise cash to buy wine, art and other luxury goods.

Destroying families didn’t bother Jefferson, because he believed blacks lacked basic human emotions. “Their griefs are transient,” he wrote, and their love lacked “a tender delicate mixture of sentiment and sensation.”

Jefferson claimed he had “never seen an elementary trait of painting or sculpture” or poetry among blacks and argued that blacks’ ability to “reason” was “much inferior” to whites’, while “in imagination they are dull, tasteless, and anomalous.” He conceded that blacks were brave, but this was because of “a want of fore-thought, which prevents their seeing a danger till it be present.”

A scientist, Jefferson nevertheless speculated that blackness might come “from the color of the blood” and concluded that blacks were “inferior to the whites in the endowments of body and mind.”

Jefferson did worry about the future of slavery, but not out of moral qualms. After reading about the slave revolts in Haiti, Jefferson wrote to a friend that “if something is not done and soon done, we shall be the murderers of our own children.” But he never said what that “something” should be.

In 1820 Jefferson was shocked by the heated arguments over slavery during the debate over the Missouri Compromise. He believed that by opposing the spread of slavery in the West, the children of the revolution were about to “perpetrate” an “act of suicide on themselves, and of treason against the hopes of the world.”

If there was “treason against the hopes of the world,” it was perpetrated by the founding generation, which failed to place the nation on the road to liberty for all. No one bore a greater responsibility for that failure than the master of Monticello.

Paul Finkelman, a visiting professor in legal history at Duke Law School, is a professor at Albany Law School and the author of “Slavery and the Founders: Race and Liberty in the Age of Jefferson.”