Monday, December 31, 2007

President Bloomberg?

According to the New York Times, Michael Bloomberg, Democrat-turned-Republican-turned-independent mayor of Gotham, is moving closer to declaring for president of the United States. That would, of course, raise to two the number of candidates whose sole experience in government consists of overseeing five small, overpopulated boroughs wedged between New Jersey and Connecticut. Aside from trade junkets and the like, about the only time New York City's mayor deals with anything resembling foreign policy is when he travels north of Dutchess County and attempts to negotiate more financial aid from rural upstate legislators. I know Sinatra said that if you could make it in Manhattan you could make it anywhere, but really, guys, that was just a song. Mayors are supposed to run for governor; governors are supposed to run for president.

Still, Mayor Bloomberg does have an ego, a title, and several billion dollars, and that alone gives him more credibility than, say, the governor of New Mexico (sorry, Bill). He also has the good fortune to show up at a time when retired politicians and pundits are counseling bipartisanship as the ideal solution to what ails us. Who better to represent bipartisanship than a lifelong Democrat who switched parties to clear his path to the mayoralty and then abandoned them altogether once he found that his ambitions could not be satisfied within the walls of Gracie Mansion? Opportunism, after all, is just another word for nothing left to lose.

The American people, by and large, like the idea of independence, or at least they like the label. When given a chance, however, they seldom elect independents to office. In large part, this stems from the fact that non-partisan candidates rarely have the name recognition necessary to be taken seriously or the money to buy that recognition. In a few cases, U.S. voters have supported independents at the state level, particularly when those independents bring with them a certain degree of celebrity. Think Jesse Ventura or Arnold Schwarzenegger, who was, despite the GOP label, not really a partisan figure.

The only recent candidate with the wherewithal, guts, and leisure time to mount a serious independent bid for the White House was Ross Perot in 1992 (his subsequent run in 1996 was little more than proof of Marx's aphorism that history repeats itself, the second time as farce). There was a moment in the summer of '92 when Perot actually led both Bill Clinton and President George H.W. Bush in the horse-race polls. Fortunately for the parties, and especially for Clinton, Perot's dime-store Harry Truman act wore thin by the fall, particularly when the Texas billionaire began to concoct weird, paranoid stories about the Republican campaign sabotaging his daughter's wedding. When last heard from, Perot was losing a debate about NAFTA to Al Gore and warning Americans of a "giant sucking sound" (he meant the potential loss of U.S. jobs to Mexico, but maybe it wasn't the best choice of metaphors).

Perot's implosion in 1992 leaves open the possibility that another well-heeled independent with fewer personality quirks might have a shot at upending the two-party monopoly that has prevailed without interruption since the days of kerosene lanterns. It is, of course, impossible to know how Perot would have fared had he turned out to be less loopy, but before Michael Bloomberg hands over a billion or so of his wealth to admen and consultants, he might want to reflect on this simple fact: despite winning nearly one in every five ballots cast for president in 1992, Ross Perot did not receive a single vote in the Electoral College. People remember this, and Bloomberg will not only have to build a national campaign apparatus from scratch, he will also have to break through the psychology of citizens who do not want to waste their votes on a hopeless, if appealing, candidacy.

I am assuming here that Bloomberg's candidacy actually turns out to be appealing. The truth is that we have no idea how Michael Bloomberg will fare in a no-holds-barred presidential campaign. Nor do we know how he will stand after the media—and the parties—conduct the inevitable full body cavity search of his personal and political history. The value of contesting the party primaries is that a candidate can be fully vetted by journalists and opponents prior to taking the big stage in the fall. If a contender, say, has a penchant for escorting women other than his wife to Bimini on a boat called the "Monkey Business", this fact will emerge before it brings down the hopes of an entire political party. It may well be the case that Mayor Bloomberg has lived a life so clean as to make Mitt Romney look like Larry Flynt, but if not, we will not learn about it until next summer.

Another difficulty faced by hizzoner is what we might call the Fred Thompson Problem. As Mr. Spock once said, passing up the chance to get laid, "After a time, you may find having is not so pleasing a thing after all as wanting. It is not logical – but it is often true." During the courtship period, candidates appear pristine and desirable; we can assign to them whatever qualities and issue positions strike our fancy. Once they actually enter the ring, however, they're just another piece of meat. They make mistakes, say stupid things, and take unpopular sides on controversial matters. Or, like Thompson, the lifeless reality on the stump simply doesn't match the charismatic fantasy of the mind. Michael Bloomberg has never been tested by anyone stronger than New York Democrat Mark Green, a perennial loser whom he defeated by a mere two points less than eight weeks after the September 11 attacks in 2001, even with strong support from temporary hero Rudy Giuliani.

The final problem for Bloomberg involves the choice of a Vice Presidential candidate. By the time Ross Perot got around to picking a potential veep, he had already tarnished himself with bizarre utterances. As a result, successful politicians shied away from him and he was ultimately forced to select the late Admiral James Stockdale. A distinguished American and former Vietnam-era POW, Stockdale was, by 1992, old, hard of hearing, and ill-prepared for the limelight. His star turn at that year's vice presidential debate was humiliating; Stockdale became best known for his unintentionally revelatory introductory comments: "Who am I? Why am I here?" Questions about Perot's judgment only deepened after Stockdale's performance.

Bloomberg may think that he can land an A-list nominee for the VP slot, but this may be easier said than done. No successful Democrat or Republican is likely to take on the job, since doing so would almost certainly result in his or her being blackballed from the party, at least in terms of further advancement. Arnold Schwarzenegger would be an obvious choice, but his Austrian birth disqualifies him from the position. In all likelihood, Bloomberg would be stuck with some political has-been (Bill Cohen, anyone?) or another risky outsider. Either way, the result would be to diminish Bloomberg's credibility and standing with the electorate.

(Joe Lieberman would, perhaps, be another possibility, but his turncoat act has so alienated Democrats that Bloomberg would find himself in the market almost exclusively for GOP and independent voters, hardly the bipartisan tone that he is supposed to set.)

My suggestion to Bloomberg would be to save the billion or so dollars he might spend on a presidential race. He can give me $500 million as a consulting fee for persuading him not to run, and keep the remaining $500 million for himself. Maybe he can use it to buy the Knicks and fire Isiah Thomas, a move that would solidify his popularity in New York for decades to come. Regardless, he comes out way ahead. I see it as a clear win/win solution. Drop a line in the comments section, Mike, and I'll tell you where to send the check.

Sunday, December 30, 2007

Bipartisanship and Other Bad Ideas

From David Broder comes word that a gaggle of political old-timers has assembled with the express purpose of agitating for bipartisanship and exploring the possibility of supporting a third-party presidential candidacy in 2008. This who's who of has-beens is evidently led by David Boren, an undistinguished oil-state senator who, upon retirement, was handed the presidency of an undistinguished state university. On January 7, a team of 1980s all-stars will make their way to Boren's new playground, the University of Oklahoma, where they will reminisce about the era of good feelings that was the Reagan years and throw what little remaining weight they have into the political mix.

That a conference on political unity will take place in a state that has recently elected such unhinged extremists as James Inhofe and Tom Coburn to the U.S. Senate is only the beginning of our wonderment. The cast of characters evokes still more head-scratching. Evidently, we will soon be receiving lectures on bridging our national divides from Sam Nunn, the former Georgia senator who angrily took down Bill Clinton over gays in the military; John Danforth, the Missourian who shepherded the bitterly ideological Clarence Thomas through the Supreme Court nomination process; Christie Todd Whitman, who, as George W. Bush's EPA chief, allowed herself to be overruled and humiliated by the anti-environmental wackos in the Vice President's office; and Bill Brock, Chairman of the Republican National Committee during the Carter years, a time in which the GOP waged all-out, and not remotely bipartisan, war against our 39th president. Gary Hart will also apparently make the trip to the Sooner State, hoping the world will forget that his original contribution to ideological moderation was his service as George McGovern's campaign manager in 1972 (and, yeah, maybe he wants us to forget a couple of other things, too).

The subtext of next week's football school summit meeting is the notion that bipartisanship is something that must be recaptured, something that we lost somewhere along the sixty-year road from Berlin to Baghdad, from civil rights to gay marriage. "Today," write Boren and Nunn, summoning their full powers of cliché, "we are a house divided." It is difficult to go wrong borrowing from Jesus and Lincoln, but one may certainly wonder about the time reference added by the two former senators. If our house is divided today, exactly when was it united?

Surely, there was nothing bipartisan about Franklin Delano Roosevelt, who used Democratic supermajorities in Congress to overwhelm fierce GOP opposition to his New Deal policies. Harry Truman was an instinctive partisan gut fighter, and the Republicans who opposed him were not exactly gentle in their condemnation of his policies both at home and abroad, particularly his prosecution of the Korean War. Eisenhower's somnolent presidency can only be considered non-conflictual if one ignores the nascent civil rights movement, the National Guard in Little Rock, and the vicious opposition and endless filibusters that characterized early congressional efforts to provide equal opportunity for African Americans. Oh yeah, and Joe McCarthy. Then, of course, came the 1960s, Vietnam, Watergate, and so forth. If there was a Golden Age of Bipartisanship, it probably occurred for about a month after the attacks on September 11, 2001, and ended just around the time that the Bush Administration decided to use terrorism to its own political advantage.

(I recognize, of course, that many of the political disputes of the 1950s and 1960s did involve bipartisan coalitions of Southern Democrats and Conservative Republicans against social liberals from both parties. This is obviously an accident of history, a time when anger at the party of Abraham Lincoln had not yet subsided in the South, leading to the election of Democratic members of Congress who were well to the right of the national party. There may have been a semblance of bipartisanship, but it came at the expense of excruciating national rancor. Presumably, even Boren and Nunn understand that any seeker of brotherhood and national unity would not choose to set his or her watch back to 1958.)

The David Broders of the world, who inevitably fall victim to the siren song of political Kumbaya, do perhaps have one point. The level of distrust between the parties is greater today than at any time in recent memory. Next time C-Span shows the Nixon and Clinton impeachment hearings, be sure to compare and contrast. Both had their moments of partisan grandstanding, but the Nixon hearings were characterized by serious-sounding people on both sides who were aware of and burdened by the gravity of their decision. In 1998, by contrast, neither Democrats nor Republicans acquitted themselves particularly well, and the most profound constitutional remedy available was treated as just another weapon in the soul-draining culture wars of the 90s. (Let's be clear, though: the Republicans chose to bring this circus to town, and thus bear the greatest responsibility for embarrassing the nation.)

Anyhow, it would be helpful if each side of the political debate would work to reject the notion that their opponents are driven by evil or venal motives. Aside from their obvious elephantine ambitions, Hillary Clinton and John McCain, Barack Obama and Mitt Romney, John Edwards and Fred Thompson do care about their country and want to make it a better place. It would, of course, be easier to hold this position if we were not currently enduring a presidential administration that lied its way into war and brought unthinkable torture back into the mainstream. Nobody is obligated to give Dick Cheney the benefit of any doubt. Nevertheless, there are numerous grownups in the Republican Party, and it still makes sense to reach out to them. Not all compromise is capitulation.

Having said that, the group that will soon be arriving in Oklahoma is already starting off on the wrong foot. The implicit threat of a third party candidacy, perhaps that of New York mayor Michael Bloomberg, betrays a petulance and lack of seriousness on the part of these supposedly deep thinkers. Rebuilding bonds is difficult work, and it is not accomplished by taking one's ball and looking for a different game. Not only do third parties rarely succeed, they also permit presidents to be elected without majority public support, hardly a blueprint for bringing the country together.

As with all efforts of this sort, little progress will be made until we all address the elephant in the living room. The Bush administration has taken power-grabbing and merciless partisanship to places heretofore unimagined by constitutional scholars, let alone practicing politicos. Until all sides acknowledge this and agree to step back from the precipice of the unitary executive-cum-dictator, no healing, much less bipartisan unity, will be possible.

Saturday, December 29, 2007

Get Me Rewrite!

Aside from the obvious insensitivity, there is another reason why only fools speculate about the domestic significance of a foreign tragedy in its immediate aftermath. That would be the enormous likelihood of getting it all embarrassingly wrong.

In this case, the fools, known as political pundits, have dominated the airwaves in the forty-eight hours since the assassination of Benazir Bhutto. They have assured us in tones of unshakable confidence that the current unrest in Pakistan will benefit Hillary Clinton, Rudy Giuliani, and John McCain, the three candidates with the greatest degree of foreign policy experience. They have insisted that Iowans and New Hampshirites, the only people in the republic who currently matter, are now tasked with reassessing the presidential race for what must be at least the tenth time in the last six months.

First, please allow me to digress. Precisely what foreign policy experience do the two senators and the former mayor possess? Clinton traveled the globe at the behest of her husband; this, in itself, makes her no more qualified to lead the world than an average contestant on "The Amazing Race". We now know that the junior senator from New York did not even have a security clearance during Bill's administration. This is not, in itself, disqualifying, but it does undercut any argument that the senator earned her foreign policy chops during the 1990s.

McCain, like Clinton, sits on the Senate Armed Services Committee and has done so for a number of years. In that capacity, he has watched other people—presidents—make foreign policy decisions, he has held a security clearance, and he has participated in countless hearings. So have Barbara Boxer and Norm Coleman over on the Foreign Relations Committee, yet neither is regarded as an international affairs guru. It is sometimes implied that his experience as a Vietnam POW also adds to McCain's credibility on these issues, but that is, if you take even a moment to think about it, preposterous.

As for Rudy Giuliani, how on Earth does being mayor of New York City on September 11, 2001, substitute for, among other things, knowledge, information, and good judgment? By that reasoning, shouldn't we find out who the mayor of Shanksville, Pennsylvania, was on that terrible day and nominate him or her for vice president? Being the victim of a crime does not make you a criminologist.

I'm being a little hard on Hillary Clinton and John McCain. They have in fact acquired some foreign policy expertise over the years. I simply doubt that even a wealth of information, or anything else for that matter, truly prepares one for the demands of being Commander in Chief. Henry Kissinger's encyclopedic understanding of world diplomacy, for example, brought us (indirectly) Pol Pot and (directly) Augusto Pinochet. Still, the idea that Rudolph ("Let's Put the Emergency Command Headquarters in the World Trade Center") Giuliani is better prepared to lead a superpower than, say, the average graduate student in international relations, should evoke uncontrollable laughter from anyone fit to be called a journalist. That it does not is both telling and frightening.

Anyhow, end of digression.

Back to the tragic case of Benazir Bhutto and the pundits who raced one another to the Green Rooms upon news of her death. As if talking about kindergartners, they speculated that the terrible events in Pakistan would remind the American people that the world is a mean and scary place (as if we don't re-learn that on every trip through an airport). Ms. Bhutto was, nearly all of them insisted, the victim of al Qaeda, the Taliban, or some other group of, to use Lou Dobbs' trifecta of hysteria, "radical Islamic extremists". Dutifully quaking in our penny loafers, we would now turn our attention toward those candidates most likely to keep us safe, which Barack Obama and Mike Huckabee obviously cannot do because they are men of such appalling ignorance that they probably thought Bhutto was the guy who used to beat up Popeye. Or something like that.

Well, Thursday's truth sometimes becomes Friday's retraction, and, while we're not quite there yet, the simple narrative is already unraveling. Showing the kind of ham-handedness that we normally associate only with Dick Cheney, the Musharraf government has, in less than half a week, already offered three different causes of death for the Pakistani president's leading rival. First, she was killed by bullets, then by shrapnel, and finally, by (and under other, less awful circumstances this would be funny) bumping her head on her car's sunroof. Really hard, apparently. People have died in such a manner, of course, but rarely at the very moment they are also the target of gunfire and a suicide bomb. But, you know, maybe she was just really, really unlucky.

Ms. Bhutto, however, apparently didn't believe in luck, so she dictated a message, to be released upon her untimely death, indicating her belief that the Musharraf government would bear responsibility for any successful attempt on her life. A friend of the martyred former prime minister passed this message on to CNN's Wolf Blitzer, which may be the first time Wolf broke a story without first dropping it on the pavement. The coincidence of Bhutto's own words with the slippery and ever-changing reports coming from the Pakistani government made any further discussion of al Qaeda's or the Taliban's role in the assassination immediately suspect. It remains possible, of course, that these groups committed the deed, but the question now concerns the extent to which Musharraf allowed it to happen through insufficient security, or worse.

I, too, am guilty of premature speculation (geez, maybe I should re-phrase that). Yesterday, I suggested that the Bush and Musharraf governments would concur on the terrorists-killed-Bhutto narrative, and that Pakistan's elections would go on as scheduled. This may still occur, but it is clear that Musharraf's unforeseen ineptitude has imperiled not only the "democratic" process in his country, but also the last remaining legitimacy of his regime. These are disquieting times in the Muslim world's only nuclear power.

Returning to the supposed implications of the assassination for America's coming elections, we can no longer be so sure that the watchword will be stability and that a frightened populace will be looking for Big Daddy or Big Mommy. Indeed, given this additional evidence of the failure of our foreign policy leadership, it might turn out that the agents of change (ours, not theirs) could be back in the ascendancy. Paging Obama and Huck: Mr. DeMille may be ready for your close-up after all.

Seriously, though (and this is a very serious situation), we still have no idea what will transpire over the next several days in Pakistan. Thus, it is futile and likely misleading to assume we have any sense of how or even whether the Pakistani crisis will affect the upcoming caucuses and primaries. As usual, our pundits speak with great authority, while actually providing little that is of value.

Friday, December 28, 2007

A Pakistani Tragedy

Watching the coverage yesterday of the assassination of Benazir Bhutto, it might be easy to forget that this was, first and foremost, something that happened to Pakistan and its people. From CNN to Fox News to MSNBC, the coverage centered overwhelmingly on how the former prime minister's death would affect the United States and American interests. Perhaps the crassest reaction, which occurred within minutes of the first terrible reports from Rawalpindi, involved speculation about the impact of the killing on the standing of the U.S. presidential candidates just one week before the Iowa caucuses.

Pakistan is a complicated place, and Benazir Bhutto was a complicated woman. Daughter of her country's most important and controversial leader of the 1970s, she took control of her father's political party (sometimes in exile) after the elder Bhutto's 1979 execution on charges of plotting the killing of a political opponent. She eventually won election twice as prime minister in her own right and was dismissed each time on charges of corruption. Those charges, while vigorously disputed by Ms. Bhutto, extended well beyond Pakistan, with substantial evidence coming from several European countries, including Switzerland.

The American media, not exactly masters of nuance, tend to paint in broad, childlike strokes, assigning each subject a black or white hat. Even before her death, and especially now, Ms. Bhutto was treated as a heroic figure, the first woman leader of a Muslim country and an individual of towering personal courage. She was certainly that, returning this past year to the land that had killed her father—and quite possibly had a hand in the deaths of two of her brothers—and remaining after an October assassination attempt and a brief period of house arrest. Still, she was neither the Mahatma Gandhi nor the George Washington of Pakistan, and it will now remain forever unclear exactly what her restoration as prime minister might have meant to her country.

What is known is that the current Pakistani leader, Pervez Musharraf, is a shadowy figure with little taste for democracy. His government's relationship with al Qaeda, at least prior to 9/11, has been murky, and it was during his presidency that Dr. A.Q. Khan, the country's leading nuclear scientist, apparently engaged in a highly profitable sideline advancing nuclear proliferation around the world. Musharraf, a military strongman of shifting allegiances, seems a throwback to an earlier Cold War era in which the U.S. government gave its backing to all sorts of unsavory characters in sometimes careless pursuit of broader objectives.

Nobody has to contact CSI to recognize that there are really only two major suspects in the Bhutto assassination. One, of course, is the Pakistani government. Benazir Bhutto's return to her native land was strongly encouraged by the United States, which wanted a more reliable and democratic ally as prime minister to counterbalance Musharraf's problematic presidency. Since Ms. Bhutto's arrival in October, 2007, Musharraf has postponed elections, declared martial law, and placed his rival under house arrest. Each of these actions was reversed under strong American pressure, but it is nevertheless clear that the elimination of Ms. Bhutto would be very much in the Musharraf government's interests.

The other suspect—and the one that has been receiving far more attention from the U.S. media—is al Qaeda. Once again, the motivation is undeniable. Osama bin Laden's outfit has of late been on a bit of a losing streak, having seen its alliances with Sunni leaders in Iraq crumble. While it is nonsense to say that the Americans are now winning the Iraq war—victory involves more than just the cessation of the daily deaths of U.S. soldiers—it is clear that al Qaeda is losing. The murder of Benazir Bhutto would have been a spectacular calling card as well as a devastating blow to the Bush administration's aspirations for Pakistan.

Regardless of the truth, it seems increasingly clear that al Qaeda will receive credit for this terrible crime. The terrorist theory plays into both the American and Pakistani governments' preferred narrative. If Musharraf or his people are responsible for Ms. Bhutto's death, not only does he become the Ferdinand Marcos of his own country, he also creates an untenable situation in Washington. Even the Bushies would be unable to sustain public and congressional support for Musharraf if it turned out that his government bore responsibility for yesterday's events. The problem, however, is that the U.S. administration is convinced that the alternatives to their strongman, including a possible victory by Islamic extremists, are far worse.

This is, unfortunately, much more than a story about lying down with dogs and waking up with fleas. It is, instead, a dilemma caused directly by the breathtaking ignorance and arrogance of the Bush administration's foreign policy. That is not to say that the American government is responsible for Benazir Bhutto's death. Her ambitions would likely have eventually led her back to Pakistan with or without U.S. help, and Pakistani leaders have a long history of dying of other than natural causes.

The sequence of events, however, and particularly the Bush administration's desperate dependence on Musharraf are a direct byproduct of the decision to engage in a gratuitous war in Iraq before completing the assignment of neutralizing al Qaeda and nursing Afghanistan back to some form of health and normalcy. The critical importance of Pakistan to the United States stems in large part from its border with Afghanistan and its ability to provide a safe haven to bin Laden and other members of al Qaeda. Dick Cheney's obsession with Saddam Hussein and his grand strategy to dominate the Islamic world have now yielded another foreign policy disaster that will haunt, at the very least, the next presidential administration.

With the loss of Benazir Bhutto, Pervez Musharraf now holds all the cards. We will almost certainly never learn whether his government, either by design or by malign neglect, was involved in the Bhutto assassination. Free of his most formidable rival, the Pakistani president is virtually assured of winning elections that will receive loud and embarrassing approval from Washington. Meanwhile, the Muslim world will continue to seethe and Pakistan, unlike Saddam's Iraq a real nuclear power, will continue its oversized and ominous presence on the world stage.

Thursday, December 27, 2007

...And Let God Sort Them Out

December is, as always, the great burial ground for stories of significant national import. People gear up for the holiday season, they travel to visit relatives, they deal with a houseful of children freed from school for three glorious—or not so glorious—weeks. When such a December falls just a month before the first presidential primaries, what little attention is paid to the news will be overwhelmed by broadcasts with such unlikely datelines as Sioux City and North Conway (Iowa and New Hampshire, respectively, for those who have truly been hibernating).

Regardless, life and death go on, even as we storm the malls and beg Amazon.com for last-minute overnight shipping. One underreported December story that clearly does involve life and death was the recent decision in New Jersey to repeal capital punishment, making Tony Soprano's home state the first to outlaw government executions in over three dozen years. This action, while dramatic, was also primarily symbolic, given that the Garden State has not actually executed a criminal since John F. Kennedy was president. Nevertheless, no state has voluntarily surrendered the right to kill since 1965, and New Jersey's action can be seen as part of a larger national moment of doubt.

The latest pushback against capital punishment has its roots in both activism and science. Beginning in the 1990s, a number of attorneys and law school professors, many taking advantage of the emerging technology of DNA "fingerprinting", decided to investigate various cases in which a high likelihood existed that wrongful conviction had taken place. The results were staggering and, to most Americans, unexpected. A system in which the standard for guilt is "beyond a reasonable doubt" yielded case after case in which the innocent were condemned to prison, sometimes to death row, for crimes that they did not commit. In Illinois, the problem was so great that outgoing Governor George Ryan (himself soon to be imprisoned for crimes he did commit) commuted all of his state's death sentences in early 2003.

Widespread misgivings about capital punishment, however, have not been with us for long. As recently as last decade, the willingness to sign killing orders was considered a standard by which the public would judge the toughness of governors running for president. Thus did Bill Clinton return to Little Rock during his 1992 campaign to oversee the demise of a brain-damaged man named Ricky Ray Rector. Thus did George W. Bush, shocking even insufferable preppie Tucker Carlson, mock the recently executed Karla Faye Tucker's pleas for her life.

Scholarly debates continue to rage over whether the death penalty has a deterrent effect on would-be murderers. While most studies suggest a negligible impact at best, a number of papers, especially by economists, indicate that each state-sanctioned execution saves a handful of lives. Social science models, of course, are highly sensitive to the assumptions made by investigators and the variables included and excluded from their analyses. Economists, in particular, believe that human beings are rational actors, guided by at least a primitive sense of the costs and benefits of their actions. By this reasoning, the prospect of being executed will cause some to reconsider the benefit of committing first degree murder, making Americans as a whole safer.

The deterrence argument has always struck me as an odd one. Presumably, the important threshold faced by any rational prospective criminal is the probability of being caught and punished. Thus, greater certainty of punishment should logically have a more immediate effect on the criminal's decision making than the type of punishment, so long as it is sufficiently undesirable. More to the point, however, the notion that first degree murderers behave rationally seems laughably far-fetched. Many are clinically, if not legally, insane. Most suffer from serious impulse control issues that would seem to refute notions of careful, or even superficial, calculation. Perhaps this supposed deterrent effect is felt by hired hitmen and the like, but surely we aren't talking about a lot of people here.
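
For anyone who prefers skepticism with numbers attached, here is a back-of-the-envelope sketch of the economists' expected-cost logic. The figures are entirely invented, and the whole exercise assumes (as the economists do, and I don't) that a would-be murderer calculates anything at all. Even on those friendly terms, certainty swamps severity:

# A toy expected-cost comparison; every number here is hypothetical, chosen only to illustrate the point.
# The economists' premise: a would-be offender weighs the severity of punishment
# discounted by the odds of ever being caught and convicted.

def expected_cost(p_caught, severity):
    # Expected punishment = probability of capture x perceived severity
    return p_caught * severity

LIFE_IN_PRISON = 100   # hypothetical units of disutility
EXECUTION = 150        # hypothetical: somewhat worse than life in prison

baseline = expected_cost(0.10, LIFE_IN_PRISON)            # 10.0
harsher_penalty = expected_cost(0.10, EXECUTION)           # 15.0
better_enforcement = expected_cost(0.50, LIFE_IN_PRISON)   # 50.0

print(baseline, harsher_penalty, better_enforcement)
# Swapping life for death nudges the expected cost upward;
# raising the odds of capture multiplies it.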

But all of this scientific and pseudo-scientific analysis is, for the most part, beside the point. Capital punishment will, in the end, endure or cease on the basis of factors that are primarily visceral. When people hear about the rape and murder of children, their first response is not to consult the nearest academic economist. When they learn that an innocent convict has been irretrievably lost, they do not consider him a new entry on a spreadsheet to be weighed against lives potentially saved by his gratuitous execution. People who are afraid of one another will opt for the death penalty. People who are frightened by the power of the state and its blindly ambitious prosecutors will agitate against it.

Indeed, before we celebrate the end of state-sponsored killings, we should face one additional variable. Those who are improperly executed are not, in most cases, very similar to those who vote. They are nearly always poorer, and very often members of racial and ethnic minority groups. Wrongly convicted individuals tend to have prior criminal records, the sort that sent the police in their direction to begin with. Most middle- and upper-middle-class Americans will identify with the victims of crime long before they identify with suspected perpetrators. Moreover, the knowledge that most death row inmates have long rap sheets may persuade fence-sitters that nobody truly innocent is being executed, regardless of the facts of any specific case.

Some day, probably very soon, a new crime wave will hit, or we will be falsely convinced by hysterical media coverage that one is upon us. Or some particularly heinous crime will occur, perhaps in New Jersey, and people will clamor for vindication of their outrage through capital punishment. They'll say that we have to trust our judicial system. They'll say that the Bible demands an eye for an eye. They'll say that tax money shouldn't be used to feed and shelter a monster of this magnitude.

But what they'll really be saying is kill the bastards, kill them all until I feel safe again. Satisfy my need for vengeance. Against such attitudes, it is unlikely that an army of social scientists or a mountain of exculpatory DNA evidence will ever fully prevail.

Wednesday, December 26, 2007

Doctor, Doctor, Give Me the News

If you were, like most of us, preoccupied with holiday preparations this past week, you may have missed the news that Rudy Giuliani spent last Wednesday night in a hospital in St. Louis. The immediate word was that the former New York mayor suffered from flu-like symptoms, a slightly odd locution that drew the immediate attention of the press corps. Was it a high fever, early-stage pneumonia, stomach illness? Or did Giuliani's handlers use the influenza story to cover up something bigger, perhaps a recurrence of hizzoner's cancer? The fact that Giuliani emerged from the hospital the next day appearing flu-free only intensified the questioning.

Presently, the former mayor gave an interview to George Stephanopoulos, which was featured on the latter's Sunday morning gabfest. According to Giuliani, a bad headache, made worse by the pressurization of his airplane cabin, had forced him to order a return to Missouri, at which time he checked into Barnes-Jewish Hospital. Could reporters speak to his doctor, Stephanopoulos asked? Sure, replied Giuliani, but not until after Christmas because the doctor was "tied up until then, and also I’ve got to make sure I get all the cancer tests back, and he’s going to put together a complete picture." And the dog ate the X-rays. Now I don't know of too many people who enter a hospital battling flu-like symptoms and wind up undergoing tests for malignant tumors, but Rudy is a cancer survivor, as they say these days, so maybe that's standard operating procedure.

Anyhow, we'll evidently soon learn that the former mayor is as healthy as a thrice-married horse, allowing us to return to more important matters such as Hillary Clinton's stubborn unwillingness to wear dresses. The larger question, however, will remain out there just waiting to re-emerge as soon as Barack Obama sneezes one time too many or George W. Bush loses another battle with a salted pretzel. How much do we deserve to know about the health of those who seek to lead us?

I suppose we can begin with the obvious. We should certainly be permitted to learn whether a president or presidential candidate suffers from significant mental impairment. And by this I mean something other than the inability to tell the truth, or string together a coherent sentence, or revise strategy in the face of overwhelming evidence of failure. These deficiencies, all possessed by the incumbent, would, alas, not show up on any brain scan and are thus not disqualifying. Rather, I am thinking about dementia or Alzheimer's or something of that nature. It remains unclear, for example, just when Ronald Reagan's troubles began, but if his doctors suspected a problem during the Gipper's second term in office, the country definitely should have been informed.

Short of that, however, what else do we really need to know? Both Giuliani and Fred Thompson have had diseases that could have been, but were not, fatal. It is often difficult even to predict imminent death unless the person is so desperately ill that the average voter will immediately recognize that fact without the benefit of professional consultation. Franklin Roosevelt was near the end when he was elected to a fourth term as president in 1944. But it was impossible to know precisely when he would expire until he actually did. In the meantime, his mind was sound and what the electorate didn't know didn't hurt them. Besides, as Bush has shown, even the healthiest of us is but one improperly ingested snack food away from doom.

This obsession we—or more properly, the news media—have with health stories and medical records betrays a naïve sense that the presidency rests in the hands of a single man. Presidents die, they get replaced. Life goes on. Presidencies are interrupted in any number of terrible ways, some by acts of God, some by lone gunmen in empty warehouses. This idea that we have any real sense of a candidate's future is simply illusory.

Therefore, let's put an end to all this discussion of medical history and medical records. The modern presidential campaign demands vigor, and those who lack the energy to endure it, for any reason, will quickly fall to the side (see Thompson, Fred). If a candidate makes it through two years of endless public appearances, airplane rides, New Hampshire winters, and Rotary Club chicken dinners, then he or she should be considered, prima facie, healthy enough to serve as Commander in Chief. And if something should happen somewhere along the line, whether natural or criminal, we will turn, as the Constitution tells us to, in the direction of the Vice President.

If the past century taught us anything, it is that our powers to predict are nonexistent. The oldest elected president made it through eight full years in office; the youngest didn't even reach his forty-seventh birthday. The men who replaced the fallen (and, in one case, dishonored) chief executives were themselves, on average, equal to the jobs they inherited. Some (LBJ and Truman, most recently) subsequently won election in their own right.

The quest for the American presidency is quite likely the most invasive public spectacle ever developed by human beings. Thousands of highly qualified individuals pass on the opportunity to contend for their parties' nominations because they don't wish to surrender every grain of privacy in their lives. Some of this surrender is necessary and some is simply the inevitable byproduct of our celebrity-centered culture. But we can and should draw the line at a person's medical records.

Next time, Rudy, tell Stephanopoulos to stuff it.

Tuesday, December 25, 2007

A Christmas Carol

Jennifer was my sixth grade girlfriend, a bright-eyed brunette with the sweetest smile I had ever seen. We talked incessantly on the playground, passed notes in class, and once, during a special school assembly, danced to "Sugar, Sugar" by the Archies. As young as we were, our relationship probably fell at least one step short of puppy love, but "dog fetus love" doesn't have much of a ring to it, so there you go.

Christmastime was the best. We both belonged to the school choir, which allowed us time out of the classroom to practice the usual collection of non-challenging songs designed to test our meager pre-teen harmonies. With no seating chart to separate us, Jennifer and I could sit next to one another, holding hands beneath the long cafeteria tables. Our song was "The Twelve Days of Christmas", because it was endless and gave us ten glorious minutes of uninterrupted physical contact.

Then one day in early December, I floated into the cafeteria at the appointed time and found Jennifer gone. She was in class that morning and I couldn't figure out why she would possibly miss our time together. I cursed the turtle doves and the leaping lords and waited impatiently as choir practice suddenly turned interminable.

When I found Jennifer on the playground later in the day, I asked her what happened. She responded that her parents had pulled her out of choir. They had discovered their daughter singing songs about the Virgin Mary and the Baby Jesus and were apparently outraged that nobody had warned them. Jennifer, I discovered that afternoon, was Jewish.

Silently, I raged against my girlfriend, her parents, and a belief system that would deprive two kids of their first childish expressions of love over a bunch of stupid songs. How could it hurt anyone to sing about Christmas? Not once did it occur to me to reflect on the naked presumptuousness of a public school district blithely expecting a little Jewish girl to give musical praise to someone else's lord and savior.

A couple of months later, Jennifer's father got transferred and the family moved out to the Midwest. I was briefly devastated, but time heals quickly at that age and I moved on without too much effort. She and I exchanged letters for a few months, but there really wasn't much to say. She was there and I was here, and even in sixth grade I had a practical side that instructed me that this was a relationship without a future.

I thought about my sixth grade girlfriend many years later when talking to an acquaintance during the holiday season. Alison (unlike Jennifer, not her real name) spoke resentfully of the Christmas celebrations she was forced to endure in her northeastern public elementary school. She, too, was Jewish and she still felt the sense of exclusion nearly two decades later. One month out of every year, she said, she was reminded by teachers and classmates alike that she was different. Not just different, but less worthy, an outsider in her own hometown.

Alison, and Jennifer's parents for that matter, were not at war with Christmas. They had spent their entire lives around Christians and probably even threw a few coins into the Salvation Army pot as they passed the bell ringer at the mall. They undoubtedly knew just about every Christmas carol by heart and may have even mindlessly hummed along with one or two of them when they played on the radio.

But they didn't expect their public schools to treat Christmas, and particularly the religious observance of Christmas, as pre-eminent and universal. They did not expect to be treated as aliens within walls that their tax dollars helped to construct. Nor, as Alison pointed out, were they necessarily mollified by the 10-1 ratio of Christmas to Chanukah songs performed by some of the more progressive institutions.

Aggressive religious conservatives like to say that the United States was founded as a Christian nation. Even if this were not hopelessly ahistorical, it would still miss the—pardon the pun—fundamental point. The Constitution, for all its faults, was a document intended first and foremost to protect numerical minorities. Majorities need no protection. Sometimes, however, majorities forget that their views and practices are not held unanimously, and the Constitution is there to remind them of that fact.

Really, though, this shouldn't even be a constitutional concern. It should be a matter of simple courtesy. In the public places where we all gather, in the spaces that we share with one another, why would we want to impose our traditions and beliefs on our friends and neighbors? Why would we not want to be as welcoming and inclusive as possible?

When we choose to say "Happy Holidays" we are not demeaning Christmas; we are simply recognizing the diversity of our American family. When we remove manger scenes from our city parks, it is not out of disrespect for Christianity, but rather out of respect for everyone else who works, lives, and pays taxes within our various jurisdictions. We still have Christmas in our families, in our homes, in our churches, and, where appropriate, in our hearts. When did that become not enough?

Christmas is a celebration. That some people have chosen to turn it into a wedge issue is a sad perversion of both the holiday and its meaning. It is these culture warriors, always ready to politicize and divide, who have themselves determined that nothing is, at long last, truly sacred. First, they created a fictional War on Christmas, and then, by turning the holiday into a bludgeon, became its biggest enemies. They need to stand down before they do any further damage.

So Happy Holidays, Jennifer, wherever you are. I still think of you when I hear the Archies.

Monday, December 24, 2007

The Decline and Fall of Mitt Romney

As late as Thanksgiving, I was convinced not only that Mitt Romney would win the Republican presidential nomination, but also that he stood a reasonably good chance of beating the Democrats in the general election. Since then, the former Massachusetts governor has done everything in his power to prove me wrong. As we reach Christmas Day, 2007, Romney faces the growing prospect of losing both the Iowa Caucuses and the New Hampshire Primary. Should that occur, he will almost certainly join his dad, who failed in 1968, as yet another footnote in the annals of promising campaigns that fizzled long before the swallows returned to Capistrano.

Given that some of you actually thought that Fred Thompson was going to win, I really shouldn't have to defend myself, but I will anyway. Here's what I figured: John McCain would get pummeled on the immigration issue. Check. Rudy Giuliani would find himself, at long last, unable to outrun the shadow of his own myriad scandals. Check. Thompson's much vaunted charisma would dissipate as soon as he stopped reading from other people's scripts. Check. Mike Huckabee, though superficially appealing, would emit extremist vibes that would be picked up and luxuriated over by the national media. Check. Sam Brownback's, Tom Tancredo's, and Duncan Hunter's candidacies would exist primarily in their own minds and they would never venture outside the precincts of single-digit obscurity. Check, check, and check.

Romney, then, stood to benefit from this process of elimination. Voters uneasy with the candidate's Mormonism would ultimately choose principle over prejudice and throw in with the candidate whose views on the issues, at least this time around, were consistently and unfailingly conservative. The GOP base would conclude that Romney never really meant all those things about abortion and gay rights that he said back in Massachusetts. He simply did what he had to do to fool the liberals and capture the governorship of the only state that voted for George McGovern in 1972. He did, after all, put up a fierce fight against gay marriage during his final days in Boston.

So what went wrong? Well, first it turned out that Romney, who had the good looks of a department store mannequin, also possessed the mannequin's personality. Even his jokes sounded scripted and he delivered each bon mot with the sincerity of a second-year high school drama student. During the presidential debates, his Spock-like demeanor contrasted badly with Giuliani's nervous intensity, McCain's calm frankness, and Huckabee's Gomer Pyle witticisms. Romney was the cold political calculator selling, more than anything else, a clear-eyed, bloodless competence. In the end, he evoked unflattering memories of an earlier Massachusetts governor, Michael Dukakis.

Still, as a rich and stable man in a poor and often unhinged field, Romney remained in a strong position in the Iowa and New Hampshire polls. He and Giuliani competed for the frontrunner tag, and most pundits assumed that the two would ultimately face their big showdown in South Carolina or Florida. Romney, opposing a twice-divorced Catholic, decided that his Mormonism required no explanation even to the millions of Christian conservatives who regarded his faith as heretical at best and cultish at worst.

And then along came Mike Huckabee.

The former Arkansas governor, on the strength of standout debate performances, cute advertisements, and good press, started to rise in both the Iowa and New Hampshire polls. If nothing else, Huckabee's ascendance made clear that Romney had not yet closed the deal with evangelicals. The Mormon murmurs increased and suddenly the airwaves were abuzz with damaging dissections of the church's teachings. Romney quickly shelved his previous strategy of don't ask, don't tell and decided it was time to give The Speech.

I have already commented on the candidate's address at the Bush Library in College Station, Texas, so I won't belabor the point here. Suffice it to say that Romney badly misjudged his target audience. The candidate barely addressed Mormonism per se, indeed only mentioning the church by name on one occasion. Instead, he delivered a largely boilerplate speech, laden with clichés about family and marriage and patriotism, evidently dedicated to persuading the religious right that he shared their various prejudices. And indeed he did share them, all but one.

Only Mitt Romney seemed not to understand that his mandate on that December morning was not to pander, but to reassure. He needed to demystify the Mormon Church, explain what his faith meant to him, and turn attention away from the controversial 19th Century teachings and toward the thriving, modern Church of Jesus Christ of Latter Day Saints that exists today. The fact that he didn't do so was unhelpful; the fact that he appeared to be consciously avoiding doing so was devastating.

Having blown this opportunity, Romney not only allowed Huckabee to gain on and finally pass him in Iowa, he also somehow opened the crypt and permitted John McCain once more to walk among the living. Romney's response to this sudden burst of ill-fortune has been shrill and ineffective. One day, he loudly demands that Huckabee apologize for daring to suggest that President Bush's arrogant foreign policy is arrogant. The next, he attacks John McCain for having changed positions on the question of tax cuts, inexplicably bringing the issue of flip flopping back to the front burner.

More than anything else, though, the loss of frontrunner status seems to have deprived Romney of the glowing self-confidence that had been his greatest strength. Rather than humanizing him, the candidate's attacks on his surging opponents have only served to make Romney seem colder and more petulant. Like Dukakis before him, Romney, a candidate without ideological anchor, seems unable to emerge from his first serious skid. But at least Dukakis managed to win his party's nomination before spinning irreversibly toward oblivion.

To be fair, Romney may still emerge as the Republican nominee. The GOP field remains weak. Mike Huckabee is not reacting well to his initial weeks in the white-hot spotlight. Rudy Giuliani may be permanently tarnished by his ethical difficulties. John McCain must still survive the immigration dead-enders who dominate his party's base. Fred Thompson continues to sleepwalk and mumble.

But even if he does survive, Romney is no longer the formidable contender that he appeared to be two or three months ago. He may yet be the man that Hillary and Obama most fear. But they almost certainly don't fear him as much as they once did.

UPDATE: Welcome to all of you who are visiting this site for the first time. There's a lot of Election '08 commentary on the main page and in the archives. Please indulge. I'll be back to respond to comments this afternoon (last-minute shopping calls me away right now). I hope you'll return often.

Sunday, December 23, 2007

Hanks for the Memories

I am not a big movie fan. Very few motion pictures, regardless of the hype, compel me to surrender ten dollars to witness some director's faux-artistic vision. It's not a matter of snobbery; live theater leaves me just as cold, as do opera and the philharmonic. I like to think of myself as having a restless mind, but perhaps I suffer from that adult-onset A.D.D. that I keep hearing about on television. In any event, I have managed to avoid every film that generated buzz in 2007.

The buzz itself, however, cannot be escaped. Reading material in my doctor's waiting room is spare, and I periodically find myself perusing Entertainment Weekly and People Magazine during the cold and flu season. The only alternative is Reader's Digest, which remains little more than an imbecile's guide to life, filled with lame homespun humor and trite Middle American homilies. Thus, despite my lack of interest in the latest cinematic achievement, I inevitably learn which movies are hot and which stars are dreamy. This at least allows me to make conversation with those annoying strangers who want to continue to engage me beyond the basic weather report.

It was during my latest attempt to fleece my HMO that I learned of a new Tom Hanks/Julia Roberts vehicle called Charlie Wilson's War. According to the reviews, Hanks portrays Wilson, a former Congressman, as one of those swashbuckling, larger-than-life Texans who, from LBJ to GWB, have done so much damage to the republic over the past half century. In this movie, however, Wilson is evidently something of a hero, a middle aged playboy who finds his life's meaning in releasing Afghanistan from the clutches of the mighty U.S.S.R. back in the 1980s.

As opposed to today, when politics is scary and depressing, politics in the 80s was merely depressing. The Reagan era brought us the fourth uninterrupted decade of Cold War and still more ceaseless proxy skirmishes across the continents. History accurately records the Soviet Union as a malign presence in the world, but America's response to the Evil Empire generally involved arming a succession of vicious third-world thugs who were, by tactics and brutality, largely indistinguishable from their Communist oppressors. Despite what we tell ourselves today, there was very little that was glorious about Ronald Reagan's foreign policy.

This is particularly true with respect to Afghanistan, that benighted Central Asian nation that hosted the final battle between the forces of Good and Evil. Forgive the Afghans, however, if they were unable to tell the difference. On the one hand, an increasingly desperate and repressive Soviet puppet regime found itself propped up by an unwilling army of young Russian conscripts who treated the locals with all the respect normally proffered by unhappy draftees. On the other hand, a ragtag coalition of Mujahedin waged fierce and unrelenting war against the Communists. Their ranks included thousands of foreign Islamic fundamentalists, including a young Saudi millionaire named Osama bin Laden. Many, if not most, espoused a fanatical brand of Islam that would eventually serve as the founding creed of the Taliban.

By the 1980s, we were no longer a naïve republic. An earlier generation may have believed that it was possible to transform a petty kleptocracy in Saigon into a flourishing Asian democracy. No such illusions existed during the Reagan era. The label of Freedom Fighter, granted to Osama and his brutal allies, possessed only cheap propaganda value, reeking of the same flagrant dishonesty as terms like People's Republic or Worker's Paradise. Our leaders did not care who won so long as the Soviets lost.

Democrats, having sacrificed their own credibility on the altar of anti-Communism during the 1960s, were reluctant to act precipitously when the Soviet Union invaded Afghanistan in 1979. Jimmy Carter, his presidency already unraveling, reacted mainly in symbolic terms, canceling American participation in the 1980 Moscow Olympics and reinstituting draft registration for young men over seventeen. Congressman Wilson, however, from his perch on a key defense subcommittee, took action and ordered an increase in the covert CIA funds directed toward the anti-Soviet fighters. (Did you know that individual members of Congress could do this? Neither did I.)

The election of Ronald Reagan gave Wilson an ally in the White House and he redoubled his efforts to send millions of taxpayer dollars to Afghanistan in support of the Mujahedin. The result was the Soviet Union's own Vietnam, an unwinnable and costly fight that alienated people on the home front and rocked the Russian aura of invincibility. The humiliating defeat may or may not have contributed significantly to the downfall of the U.S.S.R., but it certainly gave reformer Mikhail Gorbachev valuable leverage vis-à-vis the discredited military and intelligence services.

And of course, the victory by the Mujahedin also gave us the Taliban and al Qaeda. Osama bin Laden became a superstar among certain fanatical Muslims. In this era of Reagan-worship, it is popular to insist that nobody could have foreseen the perils of propping up Islamic fundamentalists in the fight against the Russians and their secularist puppet government.

That may be true insofar as it is impossible to know exactly where the strands of history will lead. Certainly, nobody celebrating the Soviets' defeat was aware that the World Trade Center would someday be in jeopardy as a consequence of their efforts. Nevertheless, the links between fundamentalism and terrorism were well established by the 1980s, and the fall of the Shah of Iran had already provided a glimpse into the dangers of westerners muscling their way around the Islamic world. The point, in any event, is not that Reagan and Wilson should have known every detail about the men whose war they were bankrolling; the point is that they didn’t even care.

For his part, the former congressman says he has no regrets. "We were fighting the evil empire," he tells Time Magazine. "It would have been like not supplying the Soviets against Hitler in World War II."

Then he added: "Anyway, who the hell had ever heard of the Taliban then?"

Saturday, December 22, 2007

The U.S. and Russia: A Tale of Two Presidents

Accompanied by blaring trumpets and a choir of schoolboys, Time Magazine has announced its Person of the Year for 2007. I'm not sure when this turned from a curiosity into an event, but it apparently now merits its own prime time television special. Because it was a slow year and Al Gore simply cannot be allowed to win everything, the honor, such as it is, went to Vladimir Putin, the increasingly autocratic Russian president.

Since Hitler was once named Man of the Year, the bar is not set especially high. Hell, last year, You won it. But Putin's selection does reflect his country's first significant rise up the ladder of scariness since Boris Yeltsin faced down a Communist coup in 1991 and followed up that accomplishment by assiduously poisoning his liver for the next decade. By the time Big Boris was done dancing and slurring his way into legend, Russia seemed so harmless that practically nobody noticed that Yeltsin's hand-picked successor had spent most of his pre-political life as an operative for the not-so-amusing Soviet KGB.

Upon meeting Mr. Putin, George W. Bush, that keen judge of character, bragged that he looked into his colleague's heart and saw a man with whom he could work, a fellow devotee of the democratic arts. Putin rewarded this observation by consolidating power, suppressing opposition parties, and rattling sabers to a degree unseen since the days of Brezhnev and Andropov. Aside from proving once again that Bush is a pathetic and unserious man, Putin's actions also called into question the degree to which the Cold War was ever truly won.

All this brings us back to the last time a Russian made it to the top of Time's annual list of the world's movers and shakers. At the end of 1989, Mikhail Gorbachev not only became that year's cover boy, he was actually named Man of the Decade. This, of course, refers to the decade of the 1980s, which will come as quite a shock to anyone brought up on the Gipper-centric narrative that dominates contemporary thinking about the era. But in the waning days of the Soviet empire, the people who were actually there and reporting the story chose Gorbachev, and not Ronald Reagan, as the one irreplaceable man.

Today, of course, the notion that Reagan won the Cold War is taken as an article of faith, even by those who ought to know better. It all started, we are now told, with Ronnie's "Evil Empire" speech, his massive military buildup, his defense of Freedom Fighters in Afghanistan (such as Osama bin Laden), and his bold Star Wars missile defense initiative. By the time Reagan famously demanded that Gorbachev tear down the Berlin Wall, the Russian leader, according to this Republican fairy tale, simply had no choice.

But of course Gorbachev did have a choice. He could have held together his empire by any means necessary, mobilizing armies and deploying tanks. He could have threatened the United States with nuclear brinksmanship. Remember that the stakes involved were no less than the very existence of the regime that he was selected to preserve. Gorbachev could have pushed the button.

Though nuclear war was an unlikely scenario even under the direst conditions, the fact remains that the Soviet Union didn't have to go down without a fight. Most regimes wouldn't, and few guessed that this one would. Indeed, the U.S.S.R. may be the only world power in human history that ever threw in the towel without first losing a war of national survival. Two decades ago, Gorbachev received credit and gratitude for his extraordinary restraint as the Marxist-Leninist experiment drew to its unsuccessful conclusion. Today, he is nearly forgotten, another victim of the ever-expanding myth of Ronald Reagan and his supposedly epochal presidency.

To be sure, Mikhail Gorbachev was no saint. Indeed, nearly until the end, he worked tirelessly to save the Soviet Union and its corrupt and brutal Communist Party. Still, when finally faced with the choice between gratuitous bloodshed and the surrender of an empire, Gorbachev opted for the latter. This time, because of his leadership, the tanks didn't roll into Prague and Budapest.

Interestingly enough, Ronald Reagan actually does deserve praise for his role in the dismantling of the U.S.S.R., but not for the reasons usually cited. Reagan entered the presidency a committed Cold Warrior and surrounded himself with others who shared similar views. Many of these advisors warned their boss not to trust Gorbachev. The White House hardliners insisted that all Commies were alike and that Gorby was nothing more than Brezhnev with a friendlier style and a disarming birthmark. Had Reagan listened to these people, had he not given Gorbachev room to maneuver, the old Communist bosses might have replaced their General Secretary with a more ruthless leader, someone who might have taken violent, decisive action when the satellite countries began to break free of their Russian orbit.

But to his credit, the ol' Gipper, a far better judge of character than George W. Bush, decided to take a chance on his new negotiating partner. Maybe he looked into Gorbachev's heart and soul. More likely, Ronald Reagan chose, after forty years of Cold War, to gamble on the possibility of lasting peace. Gorbachev responded by bringing the teetering Soviet empire to a safe and bloodless landing.

Just as in his Hollywood days, President Reagan's best moment came in a supporting, rather than lead, role.

Friday, December 21, 2007

The Sky is Falling (as Usual)

Whenever right-wing critics of higher education run out of ideas, they begin spelunking through college catalogs looking for ridiculous-sounding course titles like "The Sociology of Star Trek" (I think I made that one up, but you never know). The aim here, of course, is to score cheap points with the rubes while simultaneously chipping away at the legitimacy of the academy. This sort of thing comes as naturally as mouth breathing to the folks at the American Council of Trustees and Alumni, an organization whose goal is to exile every liberal college professor to Guantánamo (I exaggerate, but only by degree).

From the ACTA blog comes word of the latest set of supposedly outrageous university offerings, courtesy of one Mark Bauerlein, a culture warrior who draws his paychecks from Emory University. From this perch atop Atlanta's most elite and private ivory tower (conservative academics face such frightful discrimination, do they not?), Professor Bauerlein favors us with the latest survey of "fluff" courses, each designed to soften the brains of America's youth to the point where our new Chinese masters can take over without a fight. You probably already know, but all this is the fault of the aging ex-hippies who dominate the halls of academe and enforce sensitivity training on anyone who still refers to flight attendants as stewardesses.

Because he is a culture warrior instead of a garden-variety anti-academic nag, Bauerlein conflates the old style gut courses ("Reading Superheroes") with catalog entries that strike him as unacceptably leftist or otherwise politically correct ("Che Guevara: The Man and the Myths"). Thus, not only does Bauerlein decry the usual cultural fluff, he also takes on courses that are little more than "tendentious counters to mainstream Americanisms". He fails to deconstruct his notion of "mainstream Americanisms", but one assumes these probably have something to do with the virtues embedded in movies by John Wayne and songs by Pat Boone. Why else would we be expected to find objectionable a course subtitled, "Exploring Gay, Lesbian, Bisexual and Transgender Issues"?

Bauerlein, however, assumes that his audience (he is blogging for the Chronicle of Higher Education) may consist largely of ovo-lacto vegetarians and Joan Baez fans, so he requests that we regard these courses not "in the customary (and accurate) light of political bias", but rather "in relation to something else: the intellectual condition of the students". Think about the children! It is a delightful rhetorical gambit, the sort that conservatives frequently employ when they want to get "down" with their liberal "homies".

But all right, let's go ahead and think about the children, or in this case young adults. Bauerlein points to last year's survey by the National Assessment of Educational Progress that asserted that many U.S. college graduates know less about American history than the average Manchurian house pet. According to the author, fewer than half of all American collegians could meet even basic history standards, and they may actually have known less than they did immediately upon graduating high school.

As an example—and one obviously hand-picked to appeal to his liberal target audience—most "could not explain an old photo of a theater sign that announced, 'COLORED ENTRANCE.'" If true, this is, indeed, tragic and in need of correction. Perhaps Professor Bauerlein might agree with me that a mandatory course on African American studies is indicated. Whaddya say, Mark?

Lesser outrages include the fact that "only 54 percent of [survey respondents] were able to read a sample ballot correctly, and only 16 percent provided 'complete' or 'acceptable' explanations of how the legislature and judiciary check a president's power." On the first point, it is odd to see a right-wing culture warrior decry the one missing skill most responsible for the election of George W. Bush in 2000. As to the second, and speaking of Bush, is it any wonder that young people who came of age during the past seven years have no idea how Congress and the courts limit presidential power?

I don't know, or care, enough about the National Assessment of Educational Progress to explore the issue much further. I do know that the NAEP is an instrument of the U.S. Department of Education, yet another highly politicized subsidiary of the Bush administration. Regardless, the conclusion that higher education is causing a decline in knowledge vis-à-vis high school—which is how right-wingers chose to interpret the report—is laughable. The proper test of such a hypothesis would, of course, be to compare college seniors with randomly selected peers of the same age (21 or so) whose educational experience ended upon completing twelfth grade. I would be willing to bet some serious coin that, in such a test, the college kids would do much better than their real world counterparts.

Clearly, too, we should recognize that any yardstick defining "basic" levels of knowledge is necessarily arbitrary. Reasonable people could certainly disagree over just how much information is enough to meet this standard. We shouldn't assume, therefore, that over half of our college graduates sound like Paris Hilton on the set of Jeopardy.

In the end, Bauerlein's brief polemic is unpersuasive. Occasional gut classes, if that's in fact what they are, never hurt anyone. Learning probably should be fun sometimes, and a seemingly silly course here and there does nothing to prevent universities from requiring multiple history courses, or Western Civ, or Shakespeare, or anything else that warms the right-wing heart. If there is a problem—and it's a big if—it has little to do with the stray topical course students might choose to take over their four years in college.

As for the question of political bias (which is Bauerlein's real hobby horse), anecdotal evidence demonstrates nothing. As I've mentioned before on this blog, the overwhelming majority of college classes are without political content, and even in the social sciences and humanities, professors typically strive to represent all viewpoints. Indoctrination is a myth perpetuated by people who misperceive their own biases as unvarnished truth.

With respect to education, both higher and lower, the sky has been falling all my life. Claims about the decline in standards predate my memory and go all the way back to Socrates. When my parents were in school, it was John Dewey who had ruined America's youth; when I was young it was Dr. Spock. Today, right-wingers blame the Woodstock generation. Tomorrow it will probably involve professors brought up listening to rap music. Yet life goes on and the sky remains above our heads, as blue and promising as ever.

We'll be all right, so long as everyone reads their ballots properly next time.

Thursday, December 20, 2007

Are You Experienced?

If Barack Obama is denied the Democratic presidential nomination, it will almost certainly have something to do with voters' fears that he lacks the experience to be, as we said in quainter times, leader of the free world. As Team Clinton never tires of pointing out, Obama has spent just three years on the national stage, as a U.S. Senator, and prior to that toiled briefly in the obscurity of the Illinois State Legislature. This much is obviously true: should Obama become president, he will indeed have one of the shortest resumés of any postwar occupant of the Oval Office.

But how much does that really matter? The empirical record is actually quite mixed. On the one hand, it is easy to look at two of our more recent chief executives and conclude that inexperience is a significant handicap. Jimmy Carter and George W. Bush had very limited public service records prior to taking on the top job. Carter was a one-term governor of Georgia, and a state legislator before that. Bush the Second had held only one elected position, spending six years as Texas governor. Prior to his 1994 election to that office, Bush's career had consisted mainly of failing in the oil business. Carter and W are, of course, both regarded as seriously subpar presidents.

The inexperience argument, however, works far better for Jimmy than it does for George. Carter stormed the capital in 1977 with an army of cronies from Atlanta, and many of his early moves were poorly advised and amateurish. Bush 43, on the other hand, enjoyed the assistance of men and women who had themselves spent many years at the highest levels of national government. Dick Cheney, in particular, served as Chief of Staff for one president and Secretary of Defense for another, with a few terms in Congress in between. That unprecedented experience bought us Iraq, Guantánamo, the relief disaster in New Orleans, and a stale economy. Carter, who at least brokered peace between Israel and Egypt, did far less damage.

And what of those men with outstanding pre-presidential records of service? Well, the first President Bush had done it all: Congress, U.N. ambassador, envoy to China, and eight years at the right hand of his immediate predecessor, Ronald Reagan. Bush 41 gets kudos today for his mastery of foreign policy and the Gulf War of 1991, but the American people showed him the door after just one term. And his eldest son has already demolished Poppy's legacy in the Middle East.

Richard Nixon also came to the White House with a strong curriculum vitae. Nixon had been congressman, senator, and Vice President for two terms under Dwight Eisenhower. And look how well that turned out.

By contrast, Ronald Reagan's government experience exceeded George W. Bush's by only two years. To be sure, California's governor is a far more powerful official than the governor of Texas. Still, just in terms of years on the job and length of the resumé, Reagan was not a particularly seasoned public servant. Nevertheless, his presidency is considered by many to have been highly successful. Reagan remains, in my view, the most overrated president in history, but he was clearly more effective than any of the four men listed above (Carter, Nixon, and the Bushes).

Indeed, Franklin Roosevelt, the only truly great president of the twentieth century, had a background that was positively Carteresque. FDR served only as Assistant Secretary of the Navy and, for four years, as governor of New York before going on to unseat Herbert Hoover in 1932. Roosevelt turned that unlikely scenario into twelve celebrated years in the White House, a reign ended only by his death.

We obviously have a small sample here, only forty-two presidents total and just eleven since World War II. Thus, it is unwise to try to form too many generalizations. But we can say that the empirical record does not support our notions about the link between experience and presidential success. Relatively inexperienced men have flourished in the job, while long-time public servants have floundered.

One possibility is that we simply need more data points. Another is that the job and its circumstances are too idiosyncratic to allow for prediction. There may, indeed, be nothing that can truly prepare one for the presidency other than the experience of being president. Bill Clinton, despite a full decade as Arkansas governor, spent his first two years in the White House as the second coming of Jimmy Carter, an amateur surrounded by hometown yokels. But he gained his footing as he went along and ultimately led a popular and effective administration.

(And let's also not assume that yokels come only from places like Atlanta and Little Rock. Ronald Reagan's California mafia got him into all sorts of trouble during his eight turbulent years in Washington. It was the inside-the-beltway Baker boys—James and Howard—who helped to right the ship.)

Finally, what about the notion that prior executive experience is critical? Again, we have relatively little to go on. On the one hand, the most successful recent presidents—FDR, Reagan, and Clinton—were in fact former governors. On the other hand, as noted above, so were Carter and George W. Bush. John F. Kennedy's first executive experience came as president, yet his lack of a CEO's background didn't prevent him from making all the right calls during the Cuban Missile Crisis.

The bottom line, then, is that we simply don't know. Barack Obama may be no Jack Kennedy, but Hillary Clinton may be no Bill. We won't find out until we elect one of them. And by then it will be too late.

Wednesday, December 19, 2007

She Don't Lie, She Don't Lie, She Don't Lie...

Within the next decade or so, presidential politics will be dominated by people who came of age during the 1970s and early 1980s. As noted yesterday, Barack Obama is the first representative of that group to make a serious run for the White House. Not coincidentally, he is also the first to own up to using cocaine during his youth. Agents and supporters of Hillary Clinton's campaign have made this an issue, claiming that Obama's confession hurts his electability.

When political operatives muse about their opponent's electability, they are usually being disingenuous. That would certainly seem to be the case here. While the Clintonites obviously want Democratic voters to worry about Obama's prospects next fall, they also hope to make the Illinois senator's drug use a talking point among newspaper columnists and television pundits. They have, of course, succeeded.

For post-baby boomers, this issue will continue to cause problems. People born before the 1940s are likely to regard cocaine use as bizarre and troubling. Younger voters who grew up in the "Just Say No" era of the late 1980s and beyond may not only consider such behavior criminal, they may also associate former users with crackheads, the desperate and dangerous men and women whose experience with cocaine caused a decade-long, and often justified, national panic. Thus, members of Obama's generation could get squeezed from both ends of the age spectrum.

Worst of all, there is no way to water down the issue. Former marijuana users can insist that they only tried the drug once, they didn't like it, and it didn't affect them. These rhetorical tactics will not work with cocaine. Most Americans still believe that even one-time coke use is unacceptable. And everyone who tries cocaine is "affected"; it is not a subtle drug.

There is also, as a certain Nobel prize winner might say, an inconvenient truth lurking beneath all the discussion of the excesses of the 1970s. Simply put, millions of young people--and some not-so-young people--used cocaine, often with some regularity, and lived to talk about it. Their presence today as respected teachers, lawyers, and U.S. Senators implicitly contradicts the claims of drug czars and public service advertisers that experimenting with coke is equivalent to stepping into the abyss. This is not a conversation the country wants to have, and voters may punish the candidates who, through no fault of their own, force the issue onto the agenda.

Don't get me wrong: cocaine use is a bad idea. It can cause at least psychological addiction, it puts a strain on vital organs, and for the very few but very unlucky, it can be a ticket to the mortuary. But the reality, as bad as it can be, does not square with the generally hysterical arguments made about the drug's power and danger.

Cocaine use is down in the United States, if statistics are to be believed, and that is a good thing. Indeed, it remains possible that the overheated campaign against the drug has contributed to that decline. Nobody, therefore, wants to open a discussion that ends with the realization that most non-crack users from the 70s and 80s emerged unscathed. What would we tell the children?

Well, we could tell them the truth, and trust that they are bright and sophisticated enough to handle it. Yes, Mom and Dad and President Obama once tried coke. We all lived to tell the tale. Cocaine probably won't kill you and marijuana likely won't turn you into a babbling imbecile (well, actually it will for an hour or so, but that's another matter). But drugs are far from safe, they distract you from your personal goals, and they are illegal. Mom and Dad once purchased Barry Manilow records, too. You should learn even from their non-fatal mistakes.

If we don't have an honest national conversation about what happened under the disco ball, people like Barack Obama and those who follow him will continue to face potentially damaging whispering campaigns. The silence of George W. Bush is no longer an option. By choosing to be frank about his past, Obama has already done a significant service to his generation. It remains to be seen, however, to what extent he will be punished for that honesty.

Tuesday, December 18, 2007

Our First Post-Modern Presidential Candidate

Depending on where you draw the line, Barack Obama may or may not be a Baby Boomer. He is, in any event, the first serious presidential candidate born during the 1960s. Thus, many of his formative experiences took place years after the early boomers were well into adulthood. It may be time, therefore, for politics to come to grips with the 1970s.

The issue of recreational drugs emerged in 1992 with the candidacy of Bill Clinton. The former president famously responded to the question of marijuana use by claiming to have smoked pot without inhaling. This made him, and not for the last time, a temporary object of public ridicule. To be honest, I always thought that, in this one case, Clinton may have been telling the truth. You're at a party with friends and you don't want to appear uncool, so you pretend to toke. Makes sense to me.

By the presidential campaign of 2000, the nation had come to terms with the adolescent use of marijuana. Dozens of politicians had copped to the charge, many insisting that they only tried it once, a claim I always found far less believable than Clinton's. The more interesting controversy involved the question of whether George W. Bush had ever abused cocaine. Bush refused to address the issue even as he owned up to a serious alcohol problem prior to his fortieth birthday. Most people assumed W's non-denial to be a confirmation, but it nevertheless succeeded in ending the discussion, allowing the Republican nominee to survive until his court-assisted victory that November.

America was a very different place in the mid-1960s, the time during which Clinton and Bush came of age. It wasn't until 1965 or so that the first stirrings of the counterculture were felt outside of, say, San Francisco or New York. Drug use was far less common in those days than it would become several years later. Indeed, Bush's likely experiments with cocaine probably only occurred because he extended his wanton youth all the way to the onset of middle age. By 1978, when W was still apparently binging, Bill Clinton was busy being elected Governor of Arkansas, having already served as the state's attorney general.

In 1978, Barack Obama turned seventeen, his own adolescence perfectly timed to correspond with the excesses of the disco era. Not everyone tried drugs during the late 70s, of course. But weed was ubiquitous and coke prevalent, and it would be hard to find a young person from that era who did not at least know several people who indulged.

For his part, Obama acknowledges that he sampled both marijuana and cocaine. This would place him in a category with several million other teenagers of the period, including numerous honors students, valedictorians, and probably even Eagle Scouts. It was, in retrospect, a stupid and somewhat risky thing to do, but it simply wasn't abnormal at the time.

Now, of course, Obama's youthful indiscretions have become part of the 2008 campaign. Team Clinton is whispering (and sometimes saying out loud) that voters will be turned off not only by their opponent's past, but by the fact that he is so forthcoming and relatively unapologetic about it. Having spent their entire adult lives explaining and justifying the 1960s, Bill and Hillary now hope to save the flagging family franchise by running against the 1970s.

More about this tomorrow.

Monday, December 17, 2007

Turncoat Joe

So when Joe Lieberman refers to himself as an "Independent Democrat", that apparently means he's a Republican. No surprise, that. Still, just seven years removed from accepting the Democratic Party's Vice Presidential nomination, the Connecticut senator finally cut his remaining ties to his long-time allies by endorsing John McCain for president. At this point, he may as well cross over to the other side, complete the lie he told his constituents two years ago, and sit as a full member of the GOP.

Clearly, Lieberman remains obsessed with the Iraq War. He delivered his endorsement to the most hawkish plausible candidate, Mr. Surge. Rudy and Mitt may talk tough, but McCain went all in on the escalation when almost everyone not named Cheney was convinced it would fail. Lieberman, too, supported the Bush Administration's efforts to double down after four years of quagmire and no doubt sees his colleague as a brave armchair warrior like himself (though, of course, McCain once put himself at great physical risk, while Joe has never done anything more dangerous than drive the beltway).

Regardless, one suspects that Lieberman's transformation had at least as much to do with domestic politics as it did with foreign policy. By all accounts, the senator deeply resents the successful drive by progressive bloggers and their allies to deprive him of the Democratic nomination in his 2006 re-election campaign. That is understandable, I suppose, but it ignores the resentment he himself generated by publicly questioning the patriotism of fellow Democrats who did not believe, as he did, that the best way to respond to the attacks of 9/11 was to invade a country utterly uninvolved in the atrocities.

Nevertheless, the attempt to unseat Lieberman was not the left blogosphere's finest hour. First, it displayed embarrassing amateurism. Nobody had apparently bothered to notice that Connecticut law allowed the senator to run as an independent if he lost the primary. Also unanticipated was the possibility that the GOP would abandon its own hopeless candidate in favor of delivering a humiliating blow to the Democrats by supporting Lieberman's victorious third party bid.

Second, Lieberman was, for all his faults, a reliably liberal senator on most issues other than the War on Terror. His conduct leading up to the Iraq War may have been disgraceful, but one must never lose sight of the bigger picture. And in 2006, the bigger picture was to recapture Congress from the Republicans. The loss of Lieberman almost thwarted that goal, and even today puts the Dems one seat behind as they fight to build a true majority in 2008.

Had the bloggers held their fire, Lieberman would almost certainly be a committed, if annoying and sometimes disloyal, Democrat to this day. He would likely be endorsing Hillary Clinton, rather than John McCain, for president. Instead, it seems clear that he will caucus with the GOP come January, 2009.

Sometimes, folks, it's just best to grit your teeth and bear it.

Sunday, December 16, 2007

No Country for Old Anchormen

Tom Brokaw was born somewhere in South Dakota with a pretty but masculine face and a deep resonant voice. He parlayed these gifts into a forty-year career reading news from a teleprompter. This, evidently, qualified him to spend his retirement years as one of America's favorite pop historians.

Brokaw is best known for his work on the men and women who endured the Second World War, documented in his bestselling tome, "The Greatest Generation". The book is, at first glance, a harmless ode to the folks who stormed the beaches, liberated the islands, and kept the home fires burning. The title, however, tips Brokaw's hand. The use of the superlative, greatest, suggests a comparison, and it should be obvious exactly who is, by implication, less great. Nobody remembers anything about the World War I generation and the Gen X'ers are just beginning to make their mark. So Brokaw is rather clearly suggesting that it is, in fact, the baby boomers who were and are inferior to their heroic parents.

Thus, when Tom Brokaw decided to focus his attention on the 1960s, it was clear that he had already chosen sides. I don't have a clue as to Tom Brokaw's personal politics, but I know that his veneration of those born during the 1910s and 1920s plays directly into the prejudices and resentments of conservative culture warriors. To this day, right-wing firebreathers, even those born after 1970, insist that the hippies and radicals of the 1960s hijacked Ward Cleaver's country, leaving military defeat, sexual promiscuity, and political correctness in their wake. The Greatest Generation, by contrast, exemplified the manly virtues of physical courage, moral certitude, and rugged individualism, traits revered—if rarely exhibited—by social conservatives.

I doubt I will bother reading Brokaw's latest book. But I did force myself to sit through his documentary on the year 1968, presented by the History Channel. To hear the ex-anchorman tell it, 1968 was the pivotal year of the decade, altering history and inaugurating many of the cultural debates that rage up to this very day.

Much of Brokaw's work is hopelessly self-referential, as he reminisces about being around the famous and powerful as they made their marks on history. About a half hour in, one begins to imagine that Brokaw has selected 1968 mainly because it allows him to talk incessantly about himself and to show copious clips of his early years as an L.A. broadcaster. Truth be told, 1963 and 1964 were at least as consequential as 1968, but, alas, Brokaw spent those years still in newsreader finishing school, leaving him unable to place himself, Forrest Gump-like, at the center of every story.

The documentary itself paints with broad strokes and hits just about every cliché of the era. Think of it as "1968 for Dummies". You've got Lyndon Johnson and Gene McCarthy, the King and Kennedy assassinations, the student takeover of Columbia University, and the violence at the Chicago Democratic convention. Graying heads discuss their moments in a sun that set many, many years ago. Richard Nixon makes a few appearances, creepy and sweaty as ever.

The central theme of the piece is the battle between elitist, over-educated young radicals and the salt-of-the-earth blue collar folks they alienated. For his part, Brokaw finishes one segment (it might have been the one about Chicago) with a little story about going home and talking to his working class father. I flipped the station at that point (I can only take so much of this sort of thing), but I'm sure daddy lamented the breakdown of patriotism and the nerve of those uppity young college kids and their Vietcong flags. And, of course, as goes Tom Brokaw's father, so goes the Silent Majority. Or something like that.

(By the way, what is it with these guys and their blue collar dads? First Russert, now Brokaw. Could it be that they have some need to work out their filial issues on our time? Or is it simply that smuggling hard-working papa into the picture makes them seem less like overpaid celebrity fops?)

At one point, Brokaw interviews a woman who protested at the Democratic convention and a former Chicago police officer who participated in the violent law enforcement overreaction. The ex-cop is unapologetic for what took place, and Brokaw baits him with a question about whether he, as a son of the working class, resented these spoiled young Ivy Leaguers spending their summer vacation cheering for the enemy. But the ex-cop isn't having any. No, he says, it was just another day at work for him, suggesting that busting heads was simply what he did and he wasn't especially concerned about whose melon got squashed.

Brokaw, however, is undaunted. In scene after scene, he contrasts angry, long-haired radicals with pictures of workin' men who could have stepped out of a Chevy truck ad. When Eugene McCarthy puckishly suggests that his supporters are "A" students and Bobby Kennedy's are the "B" crowd, Brokaw trots out Jeff Greenfield to declaim on Clean Gene's obvious elitism. Pat Buchanan appears before the camera to delight in the many ways the radicals offended Middle America and delivered the presidency to Dick Nixon.

Mainly, though, the documentary is just shallow, much like Brokaw's earlier work. Even Ken Burns, for chrissakes, could have delivered a meatier look at the 60s. The decline of working class white support for the Democratic Party is a multifaceted phenomenon that clearly predates 1968, but you wouldn't know it from watching Brokaw. The Chicago convention may have been a debacle, but, as even the documentary points out, Democrat Hubert Humphrey still came within an eyelash of winning the presidency, and probably would have done so had he sided with the hippies on Vietnam a few weeks sooner than he did.

But the biggest failing of Brokaw's retrospective is its unwillingness to acknowledge that which the historical record makes painfully clear: the freaks were right and the squares were wrong. The Vietnam War was a hopeless travesty, the failed byproduct of administration lies and unchecked national hubris. Racism was and is a cancer on our society, which is to this day not entirely in remission. The patriarchal society that existed before the 1960s deprived half the country's population of the opportunity to address their ambitions and dreams. We are, therefore, a better society because of 1968 and the years that immediately preceded and followed it.

Even worse, Brokaw refuses to face the fact that the forces of reaction in 1968—the pro-war dead-enders, the unrepentant racists, and the bullying sexists—were led by members of his own beloved Greatest Generation. The history of 20th Century America is a complicated and sometimes difficult one. Its heroes and villains are neither clear nor, in many cases, consistent. Brokaw's favorite generation may have won their war, but it took their children to address the gaping inequities between the races, sexes, and—yes, Tom—even the social classes.

Perhaps history, in the end, should be left to the historians rather than some famous broadcaster with too much time on his hands.

Saturday, December 15, 2007

Baseball, Hot Dogs, Apple Pie, and HGH

If you detest hypocrisy and sanctimony, this might be a good month to steer clear of the sports pages. George Mitchell of Maine, former senator and peacemaker in Northern Ireland, has lent his abundant talents to the hypocritical, sanctimonious world of Major League Baseball, leading an investigation into just how many human beings might be tempted to inject themselves with potentially harmful drugs in exchange for lifelong fame and staggering fortune. The result of this unneeded inquiry into Human Nature 101 is a lengthy report that provides the obvious answer: a lot of people would, in fact, yield to that temptation.

Somehow, this unsurprising conclusion has rocked the world of baseball, that singular American sport still regarded as pristine even after generations of racism, gambling, and ruinously inept mismanagement. Professional weasel and current Major League Commissioner Bud Selig said of the Mitchell report that "it is a call to action and I will act". Selig, of course, rose from the ranks of ownership to lead his little money-grubbing fraternity and actually held the top job—and thus the power to act—throughout the period in which rumors of steroid use were as common as $7.50 cups of beer.

As bad as the owners and their marionette commissioner may be, baseball's fans may be even worse. Devotees of hardball maintain a bizarre relationship with the objects of their affection. They deeply resent ballplayers' salaries and yet they demand that their favorite teams shell out top dollar for premier talent. They vent their fury at the athletes whenever a labor dispute arises, and then they insist that their cities provide billionaire owners with taxpayer-financed stadiums. They worship every jackass, bigot, and cheater from the distant past who ever belted 500 home runs or hit safely 3,000 times. But they lustily boo the very best baseball player that they are ever likely to see in person.

And the reporters are the worst of all. Journalists, they call themselves, but it took them over a decade to track down the biggest story of their careers, one that would have required relatively simple detective work. Ballplayers don't talk; losers do. And the journey from druggist to athlete nearly always includes several jock-sniffing losers as middlemen. These are people desperate for reflected glory and some sense of personal significance. Target their unstable egos, make them heroic whistleblowers, and they will sing like Bette Midler: loudly, annoyingly, unstoppably. The writers failed to do this, however, so now they are left to denigrate the game and players of the past fifteen years, as if Barry Bonds and Mark McGwire had stolen our national innocence rather than a couple dozen home runs.

But the most ridiculous claim of all is that we owe it to the past to erase the period from 1990 to 2005 from our collective memories. Ah, baseball and its past. The Babe and the Iron Horse; Willie, Mickey, and the Duke; the Big Red Machine vs. the Green Monster. Do you ever hear a football fan drone on about our debt to Y.A. Tittle and Slingin' Sammy Baugh? Have you ever known a pro basketball writer to pen odes to his grandfather's heroes, George Mikan or Dolph Schayes? But baseball aficionados still revere the ghosts of the early 20th Century and make risible claims about the ability of eighty-year-old segregated teams—the 1927 Yankees, for example—to be competitive in a modern era of taller, stronger, and darker men.

The past, of course, is just as tainted as the present. Early baseball excluded African Americans, allowing the best white players to dominate in a way they never would have in a fairer sport. Grandpa's players looked for an edge, too, legal or illegal, spitting on the baseball, stealing signs, and corking their bats. Jim Bouton's Ball Four revealed the prevalence of amphetamine use during the 1960s, with players convinced of the performance-enhancing capabilities of what they called "Greenies". Nobody has yet asked that the records from these eras be expunged.

I have no idea how much of an impact steroids and human growth hormone (HGH) had on the records compiled during the past generation. I do know that drug use obviously didn't turn Josias Manzanillo into Barry Bonds, so it probably didn’t turn Bonds into Bonds, either. Maybe without the cream and the clear, Henry Aaron would still be the new home run king. Maybe without the power-friendly effects of Aaron's home park in Atlanta, Babe Ruth would own the title to this day. Or maybe in a society without such a dreadful racial history, the honor would belong to Negro Leaguer Josh Gibson.

Speaking of race, it will be fascinating to watch the current developments surrounding Bonds and Roger Clemens. Until now, fans and writers have insisted that their disgust with Barry has nothing to do with the color of his skin, punctuating that insistence with ostentatious praise of Aaron, the African American whose record Bonds obliterated. Now an equally talented white man has been labeled as a fellow cheater. Bonds is the greatest hitter of his era, Clemens the greatest pitcher. Many Americans, not only black, will be measuring the outcry from the press boxes and the sports talk shows to see what it reveals about us.

Kentucky Senator Jim Bunning, a Hall of Fame pitcher, had this to say about the latest revelations:

"[T]here is one glaring hole in the Mitchell report, and that is the failure to address how to handle the records of those players who not only cheated by using steroids, but also broke a federal law that has been on the books since 1991. The selfish acts of those individuals who tried to cheat the system have brought the integrity of the game to its knees. It brings into question the legitimacy of any records achieved while using performance enhancing drugs."

This from a man who, dealing with far more serious issues, has supported nearly every action taken by George W. Bush, whose administration could, if several words were changed, be perfectly described by Bunning's condemnation of baseball.

Is it any surprise, then, that Bush is himself the first president to have previously served as the owner of a Major League Baseball team? The corrupt just seem to find one another, don’t they?