MayerBlog: The Web Log of
David N. Mayer

 

 

    See, I Told You So! - October 30, 2014

     

     


     

                For the past sixteen months (since I posted my last entry, “Thoughts for Summer 2013” (May 30, 2013)), MayerBlog has been on hiatus so that I could give priority to my sabbatical project for the 2013-14 academic year: writing the manuscript of my next book, Freedom’s Constitution: A Contextual Interpretation of the Constitution of the United States.  Now that I’ve returned to full-time teaching for the Fall semester, it seems a good time to resume posting to MayerBlog (even though I’m still working on the final group of chapters for the book!)

                See, I Told You So! was the title of Rush Limbaugh’s second book, published in 1993. As Limbaugh explained in the introduction to the book, “so many important developments have occurred” since the publication of his first book (The Way Things Ought to Be, in 1992) “that there was a lot of ground to cover,” along with issues and topics he didn’t get a chance to address in the first book.  “But, most important,” he added – in his typically modest Limbaugh style – “I realized early on just how right I have been about so much.”  (Actually, he has been wrong about much, especially since 2012, as I’ll discuss below.) 

                Resuming MayerBlog, I feel much the same way as Rush did in 1993.  Although there have been many important developments in the worlds of politics and pop culture (my main concerns here) over the past year or more, there’s not much that is really new, for these developments mostly were events that I could (and often did) predict in past blog postings.  Hence, like Rush, I feel like saying, “See, I told you so!”  And to paraphrase the Declaration of Independence, let the following facts (and opinions) be submitted to a candid world! 

      

     

    “Emperor” B.O. Really Is Naked!

                 As regular readers of this blog know, since the 2008 elections it has been my policy to refer to the current “occupier” of the White House, Barack Obama, by his initials, “B.O.,” for the obvious reason.  As I’ve explained, “other modern presidents have been known by their initials – TR, FDR, JFK, LBJ – and using this president’s initials in lieu of his name seems appropriate because B.O. as president, in a word, stinks.”  (For a full explanation, see the first entry in my “Fall-deral 2008” essay (Nov. 6, 2008).)  And I describe him as “the current ‘occupier’ of the White House,” rather than as the 44th president of the United States, because he doesn’t deserve the title “president.”  Although duly elected in 2008 and reelected in 2012, he is manifestly unfit for the office that he occupies.  (Also, I use the word occupier rather than occupant to acknowledge the role B.O. played in inspiring the nihilistic, anti-capitalist “Occupy Wall Street” movement.)

                Since before the 2008 election, I’ve also compared B.O. to the Emperor in Hans Christian Andersen’s children’s story, “The Emperor’s New Clothes.”  (See my essay “The Emperor Is Naked!” Oct. 16, 2008.)  As I’ve written, B.O. is like the emperor in the story.  He’s truly naked – unfit for the office he holds – yet he and his political advisers (and his apologists in the Democrat party and their allies in the news media) have clothed him, not in a set of magical clothes, but in an image based entirely on bullshit.  As I maintained, B.O. is a master purveyor of bullshit – a true bullshit artist.  He got his party’s nomination and was elected president because of a campaign based almost entirely on bullshit.  And he was reelected to a second term because he has continued to be a master purveyor of bullshit – “the Bullshitter-in-Chief,” as I’ve called him – keeping up the “Emperor’s New Clothes” pretense, as I have discussed in Part I of my 2013 “Prospects for Liberty” essay (Jan. 17, 2013).  

                B.O.’s image portrayed him as a “uniter,” but in fact he has employed the Democrats’ standard demagoguery based on race and class, dividing Americans even more sharply along racial, class, and partisan lines.  (As Pat Caddell and Doug Schoen, moderate Democratic political consultants, have observed, B.O. is the most divisive president in modern history; he’s “tearing the country apart.”)  B.O.’s image portrayed him as a “centrist,” but in fact he has proved himself to be one of the most left-wing politicians ever to hold the presidency.  He promised that he’d bring “change” to Washington, yet B.O.’s regime has brought only more of the same – more of the same old, tired, semi-socialist, paternalist policies that the federal government has been tinkering with, under both Democrat and Republican administrations, for the past century or so, since the beginning of the 20th-century regulatory/welfare state.

                Worst of all, B.O. was elected in 2008 and reelected in 2012 because of racism – his and his supporters’ cynical exploitation of his racial identity as “black” (which is a kind of bullshit too, for B.O. is really mulatto, or biracial, having a white mother and a black father and having been raised largely by his white maternal grandparents).  He came into office on a wave of popularity – intense popularity, verging upon adulation for many of his supporters, who called him a new political “Messiah” – that perhaps was unprecedented in modern American history.  Americans of both races, and of various political stripes, optimistically believed that election of “the first African-American” president meant that racial divisions in the country had healed, at long last, 150 years after the abolition of slavery.  Yet B.O. and his supporters have continued to exploit racism – a kind of reverse racism – by accusing anyone who criticized him, or his regime’s policies, of being “racist” if they are white or of being race-traitors, or “Uncle Toms,” if they are black.  Sadly, because racism remains a problem in America, the ploy generally has worked, thanks to white Americans’ willingness to accept unearned guilt and to black Americans’ propensity to identify themselves by race rather than as individuals.  Racism has substituted for the fantastical story of a magic suit of clothes in Hans Christian Andersen’s story.  And fear of being called racist or a race-traitor (like the people’s fear of being called stupid or incompetent in Hans Christian Andersen’s story) has prevented many Americans from openly acknowledging what we all really know – that “emperor” B.O. truly is naked!   

    Nevertheless, as I wrote in the first section of Part II of my essay “Thoughts for Summer 2013 – and Beyond” (May 30, 2013), now that B.O. is in his second term – thankfully, he’s term-limited and so is a “lame duck” president – he’ll lose his charm.  That’s the general reason why most two-term presidents, throughout U.S. history, have had problems – including various types of political scandals – during their second terms.  B.O. is no exception; moreover, given his arrogant style of governance, it’s virtually inevitable that scandals will emerge involving him and his regime (which, as I also discuss in the next section, I’ve called “the most lawless in U.S. history”).

    By May 2013, just four months into his second term, B.O. was facing what conservative columnist George Will called a “trifecta” of scandals:  Benghazi (the militant Islamic terrorist attack on the U.S. consulate in Benghazi, Libya, on September 11, 2012, and the subsequent effort by B.O.’s regime to cover up the facts concerning the attack); “IRS-Gate” (as I called it, the targeting of certain conservative and libertarian groups by the IRS); and the Justice Department’s surveillance of AP news reporters and other journalists (“Scandals Debase the Currency of Gov’t Trust,” Investor’s Business Daily, May 17, 2013).  As Will observed at the time, these three scandals were really just the tip of the iceberg, for there are other scandals – some of which already broke into the news during B.O.’s first term, like the “Fast and Furious” illegal gun-running operation in Mexico, involving Eric Holder’s Justice Department and the FBI – and others that have broken into the news over the past year or so.  For example, this summer B.O. was slapped down by the Supreme Court in a unanimous opinion holding that his “recess” appointments to the National Labor Relations Board were unconstitutional.

    B.O.’s abuses of power have alarmed not only the news media and many civil libertarians (such as George Washington University law professor Jonathan Turley, discussed below) but also many of B.O.’s most fervent supporters and apologists.  They’ve become, metaphorically, just like the little boy in Hans Christian Andersen’s story – the boy who sees the Emperor for what he really is, who by shouting “But he’s naked!” opens the eyes of the people generally.  When supermarket tabloids have cover stories calling B.O. “Worse than Nixon” (see, for example, the April 14 issue of Globe), many Americans start paying attention to the rising “imperial presidency.” 

                Sadly, however, many if not most Americans – those ignorant masses whom Rush Limbaugh charitably calls “low-information voters” – really don’t understand the Constitution, or vital constitutional principles such as separation of powers or checks and balances, or (even worse) don’t even care that the “occupier” of the White House is abusing his constitutional powers.  But they do understand – and do care – that B.O. is not doing his job competently.  His unfitness for the high office he holds has become manifest in recent months, with his inability to handle such matters as the “ISIS crisis” or the Ebola scare (discussed in sections below).  One simple example: many Americans were shocked to learn that, mere moments after ISIS militants had posted the video showing their first beheading of an American journalist – and after the murdered man’s parents had made a public statement – B.O. was back on the golf course.

                Perhaps most of all, Americans are getting fed up with all of B.O.’s lies.  As discussed below, the IRS and Benghazi scandals are so damaging to the credibility of B.O. and his regime because they involved outright lies that were told to the American people.  And the most infamous lie that B.O. has told, about ObaminableCare – “If you like your health care plan, you can keep it” – has been named by the fact-checking organization PolitiFact the “Lie of the Year” for 2013 (“PolitiFact ‘Lie of the Year’: ‘If you like your healthcare plan you can keep it,’” Politico.com, Dec. 12, 2013).

    Americans also are fed up with B.O.’s failure to take responsibility for any of his actions – or inaction – and his propensity to blame problems on someone else.  During his first term, he blamed everything on his predecessor, George W. Bush; and now in his second term, when that excuse just doesn’t fly, he blames his subordinates – for example, former HHS Secretary Kathleen Sebelius, for the failed ObaminableCare “roll-out,” or Director of National Intelligence James Clapper, for the failure in intelligence that led to the ISIS crisis.

                Not surprisingly, B.O.’s numbers have been falling precipitously in most public-opinion polls, especially in recent months.  In the latest IBD/TIPP poll, 53% characterized B.O.’s presidency as a “failure,” while only 41% rated it a success.  B.O.’s standing is even worse among independents, 58% of whom called his presidency a failure (“Obama Viewed as Failure by Majority of Americans,” I.B.D., October 7).  The respected Rasmussen presidential poll has shown a steady widening of the gap between B.O.’s disapproval and approval numbers.  And John Hinderaker recently posted an article on the Powerline blog suggesting that the numbers in the national approval/disapproval polls are even worse when viewed state-by-state (“How Unpopular Is Barack Obama? More Unpopular Than You Think,” October 9).

    Perhaps the surest sign of B.O.’s unpopularity is the way Democratic politicians in this year’s elections have been trying to avoid associating themselves with him, or vice versa, as if doing so would hang a giant albatross around their necks.  Democrats running for Congress (either incumbents running for reelection or challengers to Republican incumbents) don’t want to have B.O. campaign with them and don’t want to be seen with him.  And the Democratic challenger to Senator Mitch McConnell (R.–Ky) refused even to answer the question whether she voted for B.O.!

    That’s why so many Democrats were shocked and alarmed to hear B.O. say, in a recent interview, that even though he wouldn’t be on the ballot, he – being the narcissist that he is – considered the election to be a referendum on him and his policies!

     

     

    The “I” Word (Impeachment), On the Horizon?

                In the 2012 edition of my essay “Rating the U.S. Presidents” (Feb. 22, 2012), I called B.O. “the worst president in U.S. history.”  In that essay, I summarized B.O.’s record as president – and explained why I rank him as a “failure,” at the bottom of the list:  

    “The least qualified person ever to hold the office of Chief Executive, he is unquestionably the worst president in American history.  B.O. was elected president by promising to bring ‘change’ to Washington, D.C.; instead, he brought more of the same old semi-socialist, welfare/regulatory state, paternalistic policies that the worst of his predecessors, whether Democrat or Republican, brought to Washington – except that B.O. did it at unprecedented levels, resulting in spiraling federal budget deficits and . . . adding more to the national debt than all previous presidents – from George Washington to George W. Bush – combined.  . . .  In comparison, B.O. makes Jimmy Carter seem competent, Bill Clinton seem moral, and FDR seem faithful to the Constitution.”

     

    Then in my essay last year, “The Unconstitutional Presidency” (Feb. 21, 2013), I discussed the various ways in which modern U.S. presidents – both Democrat and Republican – have abused the powers of their office, thereby breaking the “chains of the Constitution” by which the Framers sought to restrain the most powerful political office in the world.  I also discussed how the current “occupier” of the White House, B.O., is the most dangerous of the lot – not only the worst president but also, as I’ve called him (giving my top ten reasons), “the most lawless president in U.S. history.”  By calling B.O. “lawless,” I mean that he not only acts in unlawful ways but also acts as though he’s not subject to, or controlled by, the law.

                Since the time I wrote the above, dozens of other commentators have expressed similar concerns about B.O.’s abuse of power.  Moreover, they are not all limited-government conservatives or libertarians.  For example, Jonathan Turley, a constitutional law professor at George Washington University Law School (and a contributor to USA Today) has compared B.O. to Richard Nixon, the president who resigned in 1974 under threat of impeachment.  Professor Turley wrote in 2012 that B.O. “is using executive power to do things Congress has refused to do, and that does fit a disturbing pattern of expansion of executive power.”  Professor Turley adds: “In many ways, [B.O.] has fulfilled the dream of an imperial presidency that Richard Nixon strived for.  . . . [T]his is a president who is now functioning as a super-legislator.  He is effectively negating parts of the criminal code because he disagrees with them.  That does go beyond the pale.”  

                Contrary to his oath of office and in violation of his duty under Article II of the Constitution to see that the laws are “faithfully executed,” B.O. has abused his executive powers – especially through the device of issuing supposed “executive orders” – thereby usurping the legislative power that Article I of the Constitution vests in Congress.  (Consider, for example, the dozens of changes in the “ObaminableCare” compulsory health-insurance law that he has unilaterally created, for political purposes, as a section below discusses.)  With a Republican majority in the U.S. House of Representatives, B.O. in his second term has become increasingly vocal about ignoring Congress altogether and governing on his own.  Earlier this year, he revealed his plans to use executive orders unilaterally to “bypass” Congress.  His infamous words were: “We are not just going to be waiting for legislation. . . . I’ve got a pen and I’ve got a phone.  And I can use that pen to sign executive orders and take executive actions and administrative actions that move the ball forward.”  Currently, rumors are flying “inside the Beltway” that, soon after the November elections, B.O. will use – or abuse – his power to take “executive actions” by essentially granting amnesty to millions of illegal immigrants in the United States.

                Although he supposedly taught Constitutional Law at the University of Chicago, B.O. (sadly, like many other law professors) doesn’t understand the Constitution.  Executive orders are meant to be just what their name suggests: directives issued by the Chief Executive to executive-branch officers of government, regarding the enforcement of the laws that have been passed by Congress.  To use his authority to issue executive orders to make new law or to change the law that has been made by Congress is to usurp the legislative branch’s constitutionally-delegated authority.    

                B.O.’s various abuses of power are simply too numerous to list here.  I attempted to do so in both my February 2012 and February 2013 essays, but even to update those lists with other important examples of B.O.’s abuses since then would be a difficult task.  Fortunately, Andrew McCarthy – a former federal prosecutor and one of America’s top experts on national security issues – has done a superb job of comprehensively identifying B.O.’s lawlessness and abuse of power in his recently-published book, Faithless Execution: Building the Political Case for Obama’s Impeachment (New York: Encounter Books, 2014).  I highly recommend the book. 

                In the second half of his book, McCarthy offers six articles of impeachment against B.O., which I will briefly summarize here: 

    (1)   His willful refusal to execute the laws faithfully and usurpation of the legislative authority of Congress.

    (2)   Usurping the constitutional authority and prerogatives of Congress.

    (3)   Dereliction of duty as president and commander-in-chief of the U.S. armed forces.

    (4)   Fraud on the American people.  (This section focuses on the Libya war, Benghazi, ObaminableCare, and Solyndra.)

    (5)   Failure to execute the immigration laws faithfully.

    (6)   Failure to execute the laws faithfully (focused on the Department of Justice).

     Based on B.O.’s record thus far, the case for his impeachment is a strong one.  He certainly has committed “high crimes and misdemeanors” and deserves to be impeached and removed from office, McCarthy convincingly maintains. 

                In the last section of my 2013 essay on “The Unconstitutional Presidency” (Feb. 22, 2013), in which I discussed impeachment as “the enema of the state,” I made the following bold statement: “Virtually any of the lawless actions or abuses of power that B.O. has committed thus far during his occupation of the White House would constitute impeachable offenses.”  Indeed, in my “Election 2012 Postmortem” essay (Nov. 10, 2012), I predicted that B.O. would be impeached and removed from office before he completes his second term:

    “The abuses of power committed by B.O. over the past four years – abuses that justify my calling him ‘the most lawless president in U.S. history’ . . . ought to have disqualified him from being elected to a second term.  Now that he has been reelected, however, I seriously doubt whether he will complete a second term.  I am confident that his disregard for the Constitution and the rule of law will prompt him to continue to commit offenses – not merely ‘high crimes and misdemeanors,’ but also possibly treason (making him the first traitor to hold the office of president) – that will justify his impeachment and removal from office before his second term ends in 2017, in other words, sometime during the next four years.”

     

    I added that he might be the first president to be impeached for treason, given evidence of how far B.O. is willing to go to appease the enemies of the United States, particularly Russia or militant Islamic nations or groups.   

    Continuing my discussion of impeachment, however, I added the following: 

    “But as I noted in my essay ‘The Legacy of Watergate’ (June 17, 2005), the lesson of both Nixon’s and Clinton’s presidencies – of the successful attempt to impeach Nixon and of the failed effort to convict Clinton – is that a successful impeachment effort must be bipartisan; it cannot be begun by the Republicans alone, unless they hold commanding majorities in both the House and the Senate.

     . . . Hence, as I discussed in the concluding section of my ‘Election 2012 Postmortem’ essay, the Republicans face a major challenge – both to educate the American people and to make them care, to make them care whether the man in the Oval Office is abusing the awesome powers of his office.”

     

    Andrew McCarthy agrees with me.  The secondary theme of his book, as suggested in its subtitle, is “Building the Political Case” for B.O.’s impeachment.  As McCarthy puts it, “Impeachment is not a legal matter of proving ‘high crimes and misdemeanors.’  It is a political matter of will.”  The political case for impeachment must be convincingly presented to the American people; public opinion must be on its side.

                As I concluded – drawing upon history, and particularly a comparison between the near-impeachment of Richard Nixon and the impeachment of Bill Clinton and the failed effort to remove him – a successful impeachment must have bipartisan support.  Will the Democrats continue to “stand by their man,” as they did with Clinton, or will they abandon B.O., especially as he becomes more and more of a lame duck as the 2016 elections approach?  Of course, that’s the critical question.  I’m optimistic enough to believe that the Democrats will abandon B.O. when he becomes a political liability for them in 2016 – and that the effort in fact may be led (or secretly supported, behind the scenes) by the Clintons, both Bill and Hillary, to help her candidacy as the Democrats’ heir apparent in the next presidential election.

                Whatever their political motivations for impeaching B.O., Congress ought to do it – to preserve the Constitution and to prevent B.O.’s dangerous record from setting precedents for future presidents’ abuse of power.  If we don’t impeach B.O., then impeachment becomes a useless tool for preserving the constitutional separation of powers – and the constitutional rights of all Americans – against an executive tyrant.  

    There’s one additional advantage.  If Congress waits until December 2016 or January 2017 to impeach and try B.O., then V.P. Joe “Smiley” Biden would become president for just a few weeks – not long enough to do much additional damage to the Constitution or the country, but just long enough for a few much-needed laughs.

     

      

    B.O.’s Bitches: The “Lamestream” News Media’s Slobbering Love Affair Continues (For Now)

                 As I wrote in my May 30, 2013 essay, the fake tailors in the real-life story of the naked Emperor, B.O., are not just his political handlers, the White House “spin doctors,” but also the so-called “mainstream” news media (called the “lamestream” media by former CBS newsman and media critic Bernard Goldberg), who by and large have continued their “slobbering love affair” with B.O. (as Goldberg so aptly put it in the title of his recent book).  Predominantly leftist in their political orientation, media “journalists” are not only ideological kinsmen of B.O. but also elitists who take a kind of perverse pleasure in hoodwinking the American people.   

                Taking their lead from B.O. himself and his political handlers, they have emphasized his racial identity as “black” – implying that anyone who fails to support him is “racist.”  In other words, B.O.’s media lapdogs are engaged in what some people call “reverse racism” (bigotry in favor of someone because of their racial identity); but it ought to be considered for what it is – blatant racism, as racism properly ought to be defined: treating persons not as individuals but as members of a racial group.  (For more on this, see my discussion “Let’s Talk Frankly About Race” in my 2012 “Thoughts for Summer” essay (May 25, 2012), which I also mention below, in the section on Ferguson, Mo.)  The charge of racism is absurd, for the vast majority of B.O.’s political opponents and critics oppose him because of his policies, or his actions, not because of his race (or his racial identity).  Indeed, as I’ve frequently commented here on MayerBlog, the assertion that it’s “racist” to criticize B.O. is itself racist:  just as it would be racist to dislike B.O. or to oppose his policies simply because of his race, it’s equally racist, by the true meaning of the word, to give B.O. a pass – to blind oneself to his many flaws as president, as his supporters do – also because of his race.

                It’s not just racism, however, that explains the “slobbering love affair” the media have had with B.O.  It’s also the leftist political philosophy (whether you call it “liberal,” “progressive,” “statist,” “socialist,” “semi-socialist,” or even “fascist” or “collectivist,” it’s the philosophy on which the 20th-century regulatory “welfare state,” or “Big Government Nanny State,” has been founded), which the overwhelming majority of journalists, academics, and other supposed “intellectuals” share with B.O.  Because he’s also the most far-left president in U.S. history, the news media were “in the tank” for him throughout his first term and did all they could, through their biased reporting of the news, to help assure his reelection in 2012.  B.O.’s narcissism and arrogance, rather than turning them off, have made the media even more enthusiastic in their support.  They have been, to put it bluntly, B.O.’s “bitches.”

    It’s a symbiotic (mutually supportive) relationship, as libertarian columnist Larry Elder (who also happens to be black) observed last year in an op-ed: 

     “[B.O.’s] arrogance flows from our fawning, gushing, Bush-hating `news’ media, which shirk their responsibility to fairly report the news.  The media’s fecklessness creates overconfidence.  With good reason, [B.O.] expects his media cheerleaders to look the other way, accept excuses without much challenge and turn the president’s critics and whistleblowers into enemies.”

     

    With a news media that fails to call B.O. on his blatant, outrageous lies, “why shouldn’t [he] feel that he operates under different, special rules, and can do so without risking loss of support?” Elder asks.  “By refusing to hold [B.O.] to the same standard they would hold any garden-variety Republican, the media now face the monster they created” (“In Defiant Obama, Media Face Monster They Made,” Investor’s Business Daily, May 24, 2013).    

                That’s why the media now seem so “shocked – shocked” (to quote Claude Rains’ character in the classic movie Casablanca) by the various scandals that have afflicted B.O.’s regime during his second term – and by the too-obvious-to-be-ignored evidence of his fickle and feckless foreign policy (discussed below).  For many of B.O.’s most fervent supporters and apologists, the “thrill” is gone.  That seems to be the case for MSNBC’s Chris Matthews, the Hardball host who in 2008 famously declared he felt “this thrill going up my leg” every time B.O. speaks.  Last year Matthews observed that B.O. “obviously likes giving speeches more than he does running the executive branch.”  Politico – another friend of B.O. – described Matthews’ remarks as “a rare, unforgiving grilling of the president as severe as anything that might appear on Fox News.”  Matthews grumbled that B.O. “doesn’t like dealing with other politicians . . . his own cabinet . . . members of the Congress, either party . . . doesn’t like giving orders or giving somebody the power to give orders.  He doesn’t seem to like being an executive.”  Instead, “he likes to write the speeches . . . likes going on the road, campaigning” (“The Thrill Is Gone,” I.B.D., May 17, 2013).

                Matthews’ disenchantment with B.O. is quite telling, for it reveals the new narrative being pushed by B.O.’s apologists in the media.  Faced with obvious evidence of the man’s incompetence and the failure of his regime’s policies, the media’s new line is that B.O. is simply too “detached.”  That’s why he’s spending so much time on the golf course or at political fundraisers; he’s no longer challenged by the job – or so they say.  Indeed, most recently, while commenting on B.O.’s mishandling of both the ISIS and Ebola crises (discussed below), Matthews criticized B.O. for being “intellectually lazy” and for trying “to downplay concerns at the expense of being a truth-teller.”  Or, in other words, for being a liar!  (“Chris Matthews on ISIS, Ebola,” realclearpolitics.com, October 2).

    Most importantly, other prominent members of the “lamestream” media are finally seeing the light – seeing the naked “Emperor” for what he really is.  NBC’s Andrea Mitchell has accused B.O. of “the most outrageous excesses I’ve seen” in her years of journalism, going back before Watergate.  And the Washington Post’s “Fact Checker” – which hands out little “Pinocchios” to politicians who stretch the truth – last year gave B.O. “Four Pinocchios” (the worst rating on its lie-ometer) for his repeated claim that “the day after [the Benghazi attack], I acknowledged that this was an act of terrorism” (“Now the Lapdog Media Tell Us,” I.B.D., May 15, 2013).  And, as noted above, PolitiFact in 2013 called B.O.’s infamous claim, “If you like your health care plan, you can keep it” the “Lie of the Year.”

      

     

    B.O.’s “Glitches”: The ObaminableCare Fiasco

                 In the third part of my 2013 “Prospects for Liberty” essay (Feb. 7, 2013), I discussed the 2010 federal compulsory health-insurance law that is officially titled “The Patient Protection and Affordable Care Act” but is popularly known as “ObamaCare,” because it is the signature legislative “achievement” of B.O.’s presidency.  (The official title of the law is quite disingenuous, for it provides genuine “protection” for no one and it makes health care more expensive, not more “affordable.”  Interestingly, advocates of the law, including B.O. himself, lately have been calling it simply “the Affordable Care Act” – dropping the first lie but perpetuating the second.)  I then wrote: 

                “Whatever one calls it – whether the PPACA, the ACA, “ObamaCare,” etc. – the 2010 federal law was a monstrosity, arguably the worst piece of legislation ever to pass Congress.  To put it another way, one might call it the biggest Mongolian clusterfuck of them all.  (The Urban Dictionary defines Mongolian clusterfuck as ‘a generally futile attempt to solve a problem by throwing more people at it rather than more expertise,’ or more generally as something ‘that is spinning or has already spun out of control with disastrous results.’  The term, which originated in military slang, seems an apt description of the 2010 legislation.)  The massive, 2,801-page law – with its 159 new bureaucracies, $2.6 trillion in new spending, 1,968 new federal powers, and 13,000 pages of regulations (so far) – was rammed through the Democrat-controlled 111th Congress during the final days of its “lame duck” session, despite strong opposition by the Republican minorities in both houses as well as strong opposition by the American people, according to most opinion polls.  Even supporters of the law acknowledged that it was too complex for them either to understand or to explain exactly what it would do.  (Remember the infamous comment by former House Speaker Nancy Pelosi that Congress would first have to pass the bill before we could know exactly what it provided?)

     

                “Since its enactment, the law has remained unpopular.  In fact, the more Americans find out about the law and its actual effects on the nation’s health-insurance market, the less they like it.”

     

                I’ve now coined a new, even more appropriate name for this abomination: ObaminableCare.   Last year’s so-called “roll-out” of the federal government’s health-insurance exchange, on the Healthcare.gov website, was truly disastrous – a real fiasco – although the regime’s lapdogs in the news media typically described the website’s problems as mere “glitches.”  Tim Phillips, president of Americans for Prosperity, recently described it far more honestly: “Healthcare.gov became the laughing stock of the nation last year when it couldn’t go 10 minutes without a catastrophic glitch.  It wasn’t too funny to the millions of people who were forced to spend hours, days, weeks, and even months simply trying to find a health care plan” (Tim Phillips, “Worst Features Hidden Until After Election,” Investor’s Business Daily, October 29). 

    The so-called “glitches” mounted each day, proving the truth of what ObaminableCare critics had been saying about the law – that the more Americans familiarized themselves with it, the less they would like it.  Hundreds of millions of dollars were paid to a Canadian firm, with ties to the B.O. White House, to develop the website, and it didn’t work; then more millions were paid to a new firm to “fix” the flawed website.  Many state-operated exchanges fared little better.  Americans who had health insurance that they liked discovered, contrary to B.O.’s promise, that they lost it – either because their employer chose a different company’s plan, or because financially-pressed employers dropped coverage altogether or reduced employees to part-time status, below the 30 hours a week at which the law defines the “full-time” employees who must be covered.  Then they discovered that the plans available on the government exchange did not allow them to keep their preferred doctors or hospitals – again contrary to B.O.’s promise that “if you like your doctor, you can keep your doctor.”  Finally, this spring, Health and Human Services Secretary Kathleen Sebelius was fired – or “resigned,” as the fiction goes – having been made B.O.’s scapegoat for all the “glitches,” and a new secretary, Sylvia Mathews Burwell (less of a politician than Sebelius, and more of a technocrat), was nominated and confirmed by the Senate in early June.  (Have you noticed how little Ms. Burwell – or is it Ms. Mathews Burwell? – has been in the news?  Considering all the power that the law vested in the secretary, we should be hearing about her, and all the important decrees she is making, almost every day, as we did with Ms. Sebelius.  But, then again, maybe that was the problem.) 

                These “glitches” are not simply the result of political corruption or incompetence, however.  They are the inevitable result of a fatally flawed law that is doomed to fail because government-run compulsory health insurance, like any other kind of government monopoly, simply cannot function as efficiently, or as fairly, as a free market.  Many conservative and libertarian commentators believe that ObaminableCare was deliberately designed to fail – as “a step” toward a “single-payer,” or fully-socialized national government health-care plan, as B.O. himself conceded the objective of the law to be.  (He was speaking to a group of far-left supporters who wanted truly socialized medicine and explaining that the 2010 law – which instead of being called “socialized medicine” ought rather to be called “fascist medicine,” because it involves total governmental control of private health insurance – was but an incremental step toward their common dream of a “single-payer” system.  Of course, Americans ought to know how poorly government-run health care systems work, by looking at the experience in Britain or Canada – or even in this country with the Veterans Administration system, the scandalous problems of which were in the news during much of the past year.) 

    In one of his typically interesting columns, Charles Krauthammer last year observed, “The law was designed to throw people off their private plans and into government-run exchanges where they would be made to overpay – forced to purchase government-mandated services they don’t need – as a way to subsidize others” (“Nationalized health care isn’t going over so well,” Columbus Dispatch, Dec. 22, 2013).   This spring B.O. bragged that the law was “working,” and his regime claimed that some 8 million Americans signed up for a health insurance plan via the government exchanges – a number that virtually everyone outside the regime believes to be truly phony.  The regime revised it downward, closer to 7 million, by the end of the summer, but even that number is grossly inflated; for example, it has not been adjusted to take into account people who signed up for a plan but failed to follow through by paying their insurance premiums.  The critical question about the workability of the law – whether enough healthy young people, who really don’t need all the mandated insurance coverage but whose premiums help subsidize the coverage of older, sicker people who do, have actually signed up and paid – remains unanswered.  If the regime knows (as it probably does, because it’s probably demanding that insurance companies report the data), it ain’t tellin’. 

                When most Americans find out – after this year’s elections – how much their insurance premiums will increase for next year, they’ll realize that the “Affordable Care Act” is indeed a really big lie.  It will – as the critics of the law all predicted several years ago – make their health care more costly and it will give them fewer choices.  The government itself has projected that by 2016, when most of the law’s provisions will have been fully implemented, there will still be about 31 million uninsured Americans, compared with 55 million before the law.  It seems that a lot of time and effort and money (now estimated at $2 trillion) could have been saved – and that more of Americans’ health-care freedom could have been preserved – if instead of ObaminableCare, the government had simply cut a check for those 24 million Americans whom the law theoretically will help.  But it gets even worse. 

    Violating his constitutional duty to see that the laws are “faithfully executed,” B.O. has been unilaterally rewriting the law, granting waivers to various groups and delaying implementation of various provisions, in order to hide its disastrous impact on Americans’ health care – prior to both the 2012 elections (to help assure his election to a second term) and this year’s midterm congressional elections (to help assure the election of Democrats to Congress).  

    When at least 6.3 million Americans received health insurance cancellation notices last fall, B.O.’s regime ordered that noncompliant policies be allowed to continue for another year.  That deadline was later extended into 2016 – past this year’s midterm elections and into the next presidential election cycle, when it will no longer be B.O.’s problem (except to the extent it will tarnish his “legacy”). Then there’s the employer mandate, which requires companies with 50 or more full-time employees to offer them health insurance or pay a hefty financial penalty.  Fearing retribution from the business community, whose health care costs are all but certain to skyrocket, B.O. delayed the employer mandate (which according to the law was to go into effect on Jan. 1, 2014) by one year, to Jan. 1, 2015 – two months after November’s elections.   

    All told, B.O. has used administrative action to rewrite ObaminableCare 24 times – “an astonishing figure considering the law still hasn’t been fully implemented,” Phillips notes.  “While each rewrite varies in its details, the goal appears to be the same for most of them: [ObaminableCare] is not working as advertised and voters may hold the law’s supporters” – in other words, Democrats – “accountable for its failure.” Given this “toxic political cocktail,” B.O. has unilaterally changed the law “in order to delay constituent anger until after the next election.”  The most recent example of this ploy is the one-month delay in the start date of Healthcare.gov’s open enrollment period for next year’s plans, to November 15 – a full 45 days later than last year’s start date and (the critical point) 11 days after this fall’s midterm elections.   

                Fully 51% of Americans still oppose the ObaminableCare law, with only 39% in support of it, according to recent polls.  Although some legal/constitutional challenges are still pending in federal courts – and might again reach the U.S. Supreme Court (particularly the case involving interpretation of the statutory language dealing with subsidies for the state vs. federal exchanges) – the best hope for opponents of the law (which is to say for supporters of Americans’ health care freedom) is for Republicans first to gain control of both houses of Congress in this year’s elections and then to enact some reforms, starting with those on which they could get some bipartisan consensus.   Real reform – free-market solutions to the problems with America’s healthcare system – may have to wait until after 2017, with a new Republican Congress and a new Republican president (if it’s not too late)!

      

     

    n “IRS-Gate”: Worse Than Watergate

                    In Part I of my “Thoughts for Summer 2013 – and Beyond” (May 23, 2013),  I wrote about the then-unfolding scandal of B.O.’s regime that I have called “IRS-Gate”: the illegal targeting of, harassment of, and discrimination against conservative and libertarian groups by the Internal Revenue Service (IRS), as it reviewed the organizations’ requests for tax-exempt status. 

    At the time I wrote that essay, the story had just broken, some two weeks earlier, as a result of the release of an IRS inspector general’s report.  Investigative reporting by some of the news media (most notably by USA Today) and ongoing investigations by Congress revealed more of the story.  In early spring 2010, IRS workers in Cincinnati, Ohio (the main processing center for tax-exempt organizations) began singling out applications for tax-exempt status from certain groups (initially reported as groups with names containing “tea party,” “patriot,” or other buzzwords).  As USA Today summarized it, “Applications from political groups warranted extra attention, but such attention should have been scrupulously neutral and non-partisan.  It wasn’t.  After singling out conservative groups, workers sent letters with intrusive questions to some.  Meanwhile, some liberal groups sailed through the process” (“IRS inquiry rests on four key questions,” USA Today editorial, May 21, 2013).  According to the newspaper’s breaking news story about the IRS scandal (“IRS gave a pass to liberals,” May 15, 2013), a review of IRS data showed that for a 27-month period, from March 2010 through May 2012, the IRS granted no Tea Party group tax-exempt status.  During that same time, however, the IRS approved applications from similar left-liberal groups; indeed, some groups with obviously liberal names (including words like “Progress” or “Progressive”) were approved in as little as nine months.        

    Besides targeting conservative groups for extra scrutiny – refusing or delaying their applications for tax-exempt status – the IRS also harassed such groups, making unusual document requests and asking for massive amounts of information the agency couldn’t possibly need to determine tax-exempt status.  For example, groups were asked to provide donor names, blog posts, transcripts of radio interviews, resumes of top officers, board minutes, and summaries of materials passed out at meetings.  Some groups were asked about connections to other conservative groups or individuals; for example, the IRS demanded that the Center for Constitutional Law “explain in detail your organization’s involvement with the Tea Party.”  Even worse, IRS employees engaged in selective leaks, providing information gleaned from their special scrutiny of conservative groups to certain left-liberal groups.  ProPublica, a left-liberal-leaning nonprofit journalism organization, has revealed that the IRS leaked to it nearly a dozen pending applications, including one submitted by Karl Rove’s Crossroads GPS (“Did IRS Try To Swing ’12 Election?” I.B.D., May 16, 2013).   

    Congressional investigations are still being conducted, especially after the IRS official at the center of the scandal – Lois Lerner, head of the Tax Exempt Division – in May 2013 pleaded the Fifth Amendment, refusing to answer questions (after making a brief opening statement denying culpability) in hearings held by the House Oversight Committee.  Chaired by Rep. Darrell Issa (R.–Calif.), the Committee has many capable Republican members who are determined to get to the truth, including one of my former students, Rep. Jim Jordan (R.–Ohio), a leader in the limited-government conservative caucus.  (Democrats on the committee, in contrast, have been working to undermine the investigations by claiming that there wasn’t any targeting of conservative groups at all.)  Congressman Jordan was especially effective in questioning the new IRS commissioner, John Koskinen (a major donor to B.O.’s campaign) in late June, about the critical “lost” emails written by Lois Lerner during the Tea Party-targeting period, asking Koskinen how it was that he waited two months to tell Congress about the missing emails when Treasury and White House officials already knew about them.  Koskinen, following the pattern of other IRS officials in “stonewalling” the committee, said he had no idea who at the IRS told them about it.  The editors at Investor’s Business Daily concluded, “The endless stream of IRS visitors to the White House and the involvement of campaign donors in the Tea Party-targeting scandal belie the claim by the head of the agency that there is no White House involvement” (“No White House Role in IRS Case, Insists Koskinen, an Obama Donor,” June 25).  Needless to say, the committee is uncovering evidence of far more than a “scintilla” of wrongdoing, contrary to B.O.’s claim that there was nothing to the scandal.     

    The critical question, yet to be fully answered, is whether and to what extent B.O.’s White House was involved in this despicable politicization of the IRS – and consequently, the infringement of many Americans’ constitutional rights.  Although congressional investigators are still looking for the “smoking gun,” it’s not hard to “connect the dots” and see the direct connection between B.O.’s (and congressional Democrats’) antipathy to the Supreme Court’s 2010 Citizens United decision and the IRS operation, as my colleague Brad Smith has maintained (Bradley A. Smith, “Connecting the Dots in the IRS Scandal,” Wall Street Journal, Feb. 26, 2014). 

    To understand why this scandal is so serious, consider the close parallel to the “Watergate” scandal that brought down Richard M. Nixon’s presidency, causing him to resign from office in August 1974 under threat of probable impeachment and removal from office by Congress.  The first clause in Article II of the articles of impeachment against Nixon approved by the House Judiciary Committee at the end of July 1974 alleged: 

    “He has, acting personally and through his subordinates and agents, endeavored to obtain from the Internal Revenue Service, in violation of the constitutional rights of citizens, confidential information contained in income tax returns for purposes not authorized by law, and to cause, in violation of the constitutional rights of citizens, income tax audits or other income tax investigations to be initiated or conducted in a discriminatory manner.”

     

    This allegation was followed by other clauses in the second Article of impeachment, accusing Nixon – again either “acting personally” or “through his subordinates and agents” – of misusing other federal agencies, including the FBI, the CIA, and the Department of Justice, to violate individuals’ constitutional rights.  Because Nixon resigned even before the articles of impeachment were voted on by the House of Representatives, his impeachment case was never tried in the Senate.  As far as I know (and I’ve read several books about the Nixon impeachment and the Watergate affair), there has been no credible evidence proving this allegation.  The so-called “White House enemies list” maintained by the Nixon administration – allegedly the source of names of persons to be subjected to harassment by the IRS and other federal agencies – has never been proven to be anything other than a “do-not-invite” list for White House social functions. 

    Thus, in a sense, the potential allegations against B.O. arising from “IRS-Gate” could be far more serious than this impeachment article against Nixon.  From what Americans (including the members of Congress holding the ongoing hearings) already know, the IRS did abuse its power – collecting information on certain individuals and groups, and auditing or otherwise threatening them, in an unlawful, discriminatory manner – and presumably did so in a way that not only deprived the groups involved of their constitutional rights but also perhaps affected the results of the 2012 elections. 

    There was no “smoking gun” linking Nixon to the IRS allegations, as there was linking him personally to the allegations contained in Article I of the articles of impeachment – the main allegations, about obstruction of justice and abuse of power in covering up the break-in at the Democratic National Committee offices in the Watergate office building on June 17, 1972.  (For more on Nixon and the Watergate affair, see my essay “The Legacy of Watergate,” June 17, 2005 – posted on the 33rd anniversary of the Watergate break-in.)  But, in approving the articles of impeachment against Nixon, the House Judiciary Committee adhered to a broad standard of what constitutes impeachable offenses – a standard that held the president responsible for acts done “through his subordinates and agents.”  That standard is reasonable, given the office of the presidency as created by the Constitution, what some scholars call the “unitary” Chief Executive.  It also comports with Americans’ popular understanding of the office – “the buck stops here,” as the sign on President Harry Truman’s desk in the Oval Office famously read.  (Ironically, this broad standard of impeachment was also urged by a young Democratic staffer of the House Judiciary Committee, a recent graduate of Yale Law School named Hillary Rodham Clinton!) 

    Based on the precedents set by the Nixon near-impeachment, B.O. certainly could be held accountable for the illegal acts committed by the IRS during his watch – even if there’s no evidence of his personal involvement.  Morally and politically, if not legally, he ought to be held responsible for the actions of the IRS because of the way he has conducted his administration – which I have called a “regime” because it has been so lawless.  Besides bringing his “Chicago style” of gangster government to Washington, D.C., and thus generally encouraging a spirit of lawlessness (of disrespect for the rule of law) throughout the federal government, B.O. himself has taken the lead in demonizing conservatives and libertarians – particularly groups involved in the Tea Party movement – and has encouraged his fellow Democrats as well as his lapdogs in the news media to demonize them, precisely because of their opposition to his policies and their concern about the direction in which the United States is headed.  Given the highly-charged partisan atmosphere that B.O. has cultivated in Washington, D.C., is it at all surprising that someone at the IRS – whether they were “minions” or top-level administrators – took B.O.’s angry outbursts about Tea Party groups, Republicans, conservatives, libertarians, etc., as if they were an order to target them for special treatment? 

      

     

    n Ferguson, Mo.: “Reverse Racism” Still Haunts America

                 In my “Thoughts for Summer 2012” (May 25, 2012), in the section entitled “Let’s Talk About Race,” I observed, “Rather than being a `uniter,’ B.O. has divided Americans along racial lines far more than they were prior to his presidency.  He has exacerbated the problem of racism in America.”  I cited three main reasons: “B.O. became president for racist reasons; his supporters defend him by using racist arguments; and both the policies of his administration and B.O. himself are indeed racist.”

    Racism, as I use the term, is a form of collectivism that treats persons not as individuals but as members of some perceived racial group.  As philosopher and novelist Ayn Rand described it, racism is “the lowest, most crudely primitive form of collectivism.”  She defined it as “the notion of ascribing moral, social, or political significance to a [person’s] genetic lineage.”  Or to paraphrase Dr. Martin Luther King, racism involves judging persons not by the “content of their character” but by the “color of their skin”; it is a denial of a person’s individuality.  Virulent anti-Semites like Adolf Hitler and his Nazi followers, for example, are racists; but so too are “Nation of Islam” leader Louis Farrakhan and his black Muslim followers.  Contrary to the assertions of some so-called “civil rights” activists – many of whom, like Al Sharpton or Jesse Jackson, have been called “race hustlers,” because they have built their careers upon exploiting racist emotions – black persons (or other members of “minority” groups) can be just as racist as white persons (or other supposed majority groups).  So-called “affirmative action,” or racial preference, programs, which select persons not because of their abilities or qualifications as individuals but solely because of their race, are also racist.  (They also are counter-productive because they ultimately help reinforce negative racial stereotypes, as I discuss in my essay “Affirmative Racism” (Jan. 23, 2006).)  When race-conscious “affirmative action” programs discriminate against white persons – or when black persons are themselves guilty of racism – some people have called it “reverse racism,” but so-called reverse racism is simply another form of racism.

    As I observed in 2012, “B.O.’s apologists have tended to characterize criticisms of him and of his policies as `racist,’ ignoring the real reasons why Americans have justifiable concerns about his agenda as president. . . . The assertion that it’s racist to criticize B.O. is itself racist:  just as it would be racist to dislike B.O. or to oppose his policies simply because of his race, it’s equally racist, by the true meaning of the word, to give B.O. a pass – to blind oneself to his many flaws as president, as his supporters do – also because of his race.”

    I also noted, “That B.O. himself is guilty of racism has been shown by a number of things he has said, both on the campaign trail and in the White House: for example, his infamous characterization of his own (white) grandmother as `a typical white person’ (typical, according to him, because of her prejudice against black persons); or his hasty assertion that the Cambridge, Mass. police `acted stupidly’ in arresting his friend, black Harvard professor Henry Louis Gates, Jr.” 

    In 2012 B.O. inserted himself into the controversy surrounding the fatal shooting of black teenager Trayvon Martin by George Zimmerman, a neighborhood watch leader typically described in the news media as a “white Hispanic,” who at his trial successfully proved that he acted in self-defense.  (The news media kept identifying Martin as an “unarmed teenager,” showing photos of him taken several years before his fatal encounter with Zimmerman.  Martin at the time of his death was a young man who was much taller and in better physical shape than Zimmerman.  Moreover, facts brought out during the trial showed that Martin had thrown Zimmerman to the ground, jumped on top of him, and was threatening his life by repeatedly bashing Zimmerman’s head against the cement sidewalk.) 

    Without knowing any of the facts other than those reported in the media (just as he had done in the Gates controversy), B.O. at a press conference openly sided with Martin’s family, who were calling for “justice” (meaning Zimmerman’s arrest and prosecution for murder).  Emphasizing his racial identity, B.O. said, “If I had a son, he’d look like Trayvon” – and thus, rather than defusing the simmering racial divide, he exacerbated it.  Indeed, B.O. piled on to the media “lynching” of Zimmerman rather than contributing positively to the public dialogue, as he could (and should) have done, as a former lawyer and law professor, by reminding Americans of our constitutional principle of “innocent until proven guilty.”  But (as I wrote at the time) “that may be asking too much of a politician who is a former `community organizer’ and who has demonstrated no commitment to the rule of law.”

    B.O. and his regime similarly have exacerbated racial tensions in America this year, following the fatal shooting of an 18-year-old black man, Michael Brown, by a white police officer, Darren Wilson, on August 9 in Ferguson, Missouri, a predominantly black suburb of St. Louis.

    Contrary to the false narrative being propagated by the race hustlers and so-called “civil rights” activists – who are exploiting the incident in order to push their own agenda – the salient facts as reported in the news media are as follows:  Michael Brown (“Big Mike”), accompanied by his friend, Dorian Johnson, committed a “strong-arm robbery,” stealing a box of cigars at a convenience store and pushing the owner out of the way as the owner tried to stop him from exiting the store (a crime that was captured on the store’s surveillance video); Brown and Johnson, suspiciously walking down the center of a street, were spotted by Officer Wilson in his car; when Wilson pulled up to them, Brown reached in through the police car’s window, beating Wilson – causing Wilson to suffer facial fractures, including an orbital blowout fracture to his eye socket – as the officer and Brown apparently fought over Wilson’s gun; nearly blinded and justifiably fearing for his life, Wilson exited his car and shot Brown in self-defense, with the two men facing one another.  Contrary to the phony story told by Brown’s accomplice, Johnson, Brown was not surrendering to Wilson, with his hands up in a “don’t shoot” position.  The testimony of a dozen eyewitnesses corroborates Officer Wilson’s account, as do the final autopsy and toxicology reports on Brown (which show that Brown, who had marijuana in his system, was shot in the forehead, upper arm and twice in the chest, with the fatal shot to the top of his head indicating he was falling forward or lunging toward the shooter). 

    Thus the facts support a finding that Officer Wilson shot Brown in self-defense, after Brown had seriously injured the police officer and was about to attack him again.  The facts do not support the false narrative being pushed by race hustlers like Al Sharpton, of a racist white cop “profiling” an innocent young black man and then murdering him in cold blood.  That reality did not stop the alleged “civil-rights” leaders from pushing their racial politics, further alienating black Americans from law enforcement, turning Ferguson into a hotbed of violent street protests, and then drawing a “moral equivalence” between police and street rioters.  (I’m not denying there is a real problem with racism in law enforcement, typically with white racist cops unfairly treating innocent black citizens.  But that wasn’t the situation in Ferguson.  And as noted in the following section and by Jason Riley – a conservative commentator at the Wall Street Journal who happens to be black – the more typical situation for racial violence in America involves black-on-black or black-on-white crimes.  As Riley discussed on Fox News’ Special Report with Bret Baier on August 18, “I know something about growing up black and male in the inner city . . . . The real difficulty is not getting shot by other black people . . . . Cops are not the problem.  Cops are not producing these black bodies in the morgues every weekend in Chicago, in New York and Detroit and so forth.  That’s not cops.  Those [are] other black people shooting black people.”)

    B.O. encouraged and participated in the one-sided false narrative about the Ferguson incident that was being pushed by the race hustlers.  He sent Attorney General Eric Holder and a cadre of federal prosecutors to investigate not only Wilson’s shooting of Brown – and to see if they could get a grand jury indictment of Wilson on federal charges of civil-rights violations – but also the entire civil-rights record of the Ferguson police department.

    Sending the most racist attorney general in U.S. history to Missouri, and having his Department of Injustice (as I call it) investigate the matter, with the intent of bringing federal charges against Wilson, certainly has not helped defuse the situation.  Indeed, it has so raised the stakes that in coming weeks – when the grand jury is likely to fail to indict, because of insufficient evidence – Ferguson, Mo. may become a racial tinder-box, igniting a new wave of black crime and violence reminiscent of 1968.  That’s not progress, by any measure.

      

     

    n Eric Holder’s Escape from Justice

                 As noted above, the particularly troubling development in the Ferguson, Mo. case has been the involvement of U.S. attorney general Eric Holder.  Indeed, one of the “tricks” I discussed in “Tricks and Treats 2010” (Oct. 29, 2010) was the politicization of the Department of Justice (DOJ) under the B.O. regime.  I argued that the DOJ ought to be called the Injustice Department, for that’s what it has become under Holder.   

                Holder, who early in his tenure as AG called Americans “cowards” on matters of race, seems obsessed with the issue.  As Investor’s Business Daily has editorialized, “Holder’s racial bias is tainting how his department prosecutes civil-rights cases.”  One case in point is the event that occurred on Election Day 2008, when three members of the New Black Panther Party (King Samir Shabazz, Malik Zulu Shabazz, and Jerry Jackson) were videotaped intimidating voters as they stood, dressed in military garb and brandishing nightsticks, outside a Philadelphia polling place.  Although it was perhaps the most clear-cut case of voter intimidation ever, Holder’s DOJ essentially dropped the case in May 2009, letting two of the three walk and issuing a weak injunction against King Shabazz.  The U.S. Civil Rights Commission wanted to know why the case wasn’t pursued and if political considerations were involved.  The DOJ initially stonewalled, claiming that all documents involved in the case were “privileged.”  But after the watchdog group Judicial Watch pursued the case, and a court ordered the DOJ to provide it with the withheld documents, it was revealed that several political appointees were involved in the decision not to pursue the New Black Panther Party.  J. Christian Adams, a career lawyer in the DOJ voting rights section who had resigned to protest the decision not to pursue the case, testified to the Commission that, under the B.O./Holder regime, DOJ officials “over and over” showed “hostility” to prosecution of voter-intimidation cases involving “black defendants and white victims.”  Specifically, Adams testified that Julie Fernandes, a deputy assistant attorney general in the Civil Rights Division in charge of voting matters, told Voting Section leadership that the Obama administration “would not file election-related cases against minority defendants – no matter what the alleged violation of the law.”  Adams’ testimony has been corroborated by Christopher Coates, former voting chief of the Civil Rights Division, before the U.S. Civil Rights Commission (“Post-Racial Racialism,” Oct. 1, 2010; “Black Panthergate,” Sept. 22, 2010; and “Travesty of the Justice Department,” Sept. 28, 2010, Investor’s Business Daily).   

                The DOJ also failed to prosecute the New Black Panther leaders for their publicly-announced bounty on George Zimmerman, “dead or alive” – even though such incitement is a federal crime.  “Nonprosecution of the Black Panther bounty seems to reveal that [the B.O. / Holder regime’s] overtly racist policy does now apply more generally,” Notre Dame business professor John Gaski noted in a provocative op-ed published last summer.  B.O. “could have been a racial healer, but instead chose to be a crude, petty agitator – true to form.  What he has been doing, transparently” – and in collaboration with his race-conscious attorney general – “is fomenting racial division in order to stoke his racial base in a way that has already turned violent in some instances.”  Federal involvement in cases like the Trayvon Martin or Michael Brown killings helps keep alive “the racial victimhood industry” and the myth of rampant white-on-black violence that simply doesn’t fit the facts.  What are the facts?  That the vast majority of violent crimes against black persons are committed by other black persons; that about 90% of interracial violent crime in the United States is committed by blacks against whites, with the black-on-white murder rate in the U.S. exceeding the white-on-black rate by about 2.5-to-1 and the black-on-white assault rate exceeding the white-on-black assault rate by at least 10-to-1.  As Gaski concludes, “the hysterical public mantra of epidemic white-on-black violence is thus exposed as a fraud” (John F. Gaski, “The Race Discussion That Eric Holder Doesn’t Want,” I.B.D., Aug. 27, 2013). 

                No doubt Holder’s racist agenda also explains why the DOJ has failed to step in and prosecute as federal civil-rights violations the epidemic of black thugs’ “bash-mob” attacks on whites that broke out in various cities (including Baltimore, Philadelphia, Chicago, Milwaukee, Long Beach, and Washington, D.C.) last year, in the wake of George Zimmerman’s acquittal.  The victims of these violent attacks have included an Australian attending college in Oklahoma on a baseball scholarship, who was shot in the back by some teenage black thugs while he was jogging; a white 13-year-old boy who was brutally beaten on a bus in Florida by three black 15-year-olds; and – two clear-cut cases of hate crime – a 13-year-old white boy in Kansas City who was doused with gasoline and lit on fire by two black guys who said, “You get what you deserve, white boy,” and a white guy named Matthew Owens who was beaten on his porch in Alabama by 20 black kids who did it, they said, “for Trayvon” (“Obama’s, Media’s Double Standard on Interracial Shootings,” I.B.D., Aug. 23, 2013). 

                The IBD editors also noted in 2010 that “if that bias weren’t bad enough, check out what’s going on in the division’s housing and civil enforcement section,” which is conducting “a witch hunt against supposedly racist bankers.”  Several banks have been prosecuted for lending discrimination and ordered to set aside millions in loans to black persons – regardless of whether they were actual victims.  Other banks have been bullied by DOJ officials into relocating operations and marketing efforts into urban areas to serve the “credit needs” of black persons, regardless of the profitability of those areas.  DOJ prosecutors rely on a dubiously expanded definition of lending discrimination called “disparate impact,” which does not require actual discrimination – just seemingly negative “impact” on minority areas.  As the IBD editors observed, “These cases aren’t about individual discrimination.  They’re about forcing banks to subsidize protected classes and redevelop blighted areas.  That’s not justice; that’s politics” (“`Post-Racial’ Racialism,” Oct. 1, 2010). 

                Holder’s shakedown of the banking industry has resulted in multi-billion dollar payments by major banks, to settle the DOJ’s investigation into their risky subprime mortgages, which allegedly caused the housing collapse in 2006-2007 that in turn brought about the financial crisis in 2008.  (Conveniently absent from the DOJ’s narrative is the fact that wrongheaded governmental policies, particularly from the Clinton administration, had attempted to force the banks and other mortgage providers to extend credit to persons who really couldn’t afford the loans, thereby causing them to make these “risky subprime” mortgage loans in the first place.)  In mid-July Citigroup agreed to pay $7 billion (comprising a $4 billion civil monetary payment to the DOJ, $500 million in compensatory payments to state attorneys general and the FDIC, and $2.5 billion in payments to consumers).  Months earlier, a similar deal was struck between the DOJ and JPMorgan Chase & Co., the nation’s biggest bank, which agreed to pay $13 billion.

    Holder is not only racist but also one of the most corrupt or incompetent attorneys general in modern U.S. history, as revealed by the ongoing Congressional investigations into “Operation Fast and Furious,” the illegal Mexican gunrunning operation in which agents of the Bureau of Alcohol, Tobacco, Firearms, and Explosives pressured U.S. gun dealers to sell weapons to Mexican cartels, all in an apparent effort to raise public support for gun control in the U.S.  (See “Holder’s Cover-Up Gets Criminal,” I.B.D., Jan. 31, 2013.)  Holder has been stonewalling the congressional investigation into the affair, withholding documents that had been subpoenaed by Congressman Darrell Issa’s House Oversight Committee, invoking the doctrine of “executive privilege” (the same excuse used by the Nixon administration to withhold the Watergate tapes that would eventually bring it down).  Holder was held in criminal contempt of Congress in June 2012 – the first such citation against a sitting attorney general in American history.

                A recent development in the Fast and Furious affair likely prompted Holder’s announcement on September 25 that he would resign as AG – but, significantly, only after his successor has been nominated and confirmed.  Many political commentators have questioned the timing of Holder’s resignation, but the most interesting theory is that it happened just after a federal judge had denied a request from the DOJ to delay the release of Fast and Furious documents – the same documents that Holder withheld from the House Oversight Committee, claiming executive privilege, and also from the American people, despite a Freedom of Information Act request.  Kelly Terry-Willis, the sister of one of the border agents killed during the Fast and Furious operation, Brian Terry, thinks that the recent court defeat for the DOJ may be the reason why Holder so suddenly resigned.  Tom Fitton, president of Judicial Watch, agrees, calling Holder’s exit “past-due accountability for Holder’s Fast and Furious lies, and I hope it brings some solace to the family of U.S. Border Patrol Agent Brian Terry and the hundreds of innocent Mexicans likely killed thanks to the Holder Justice Department’s scheme that armed the murderous Mexican drug cartels” (“Holder Resignation May Be No Accident,” I.B.D., September 29). 

                His resignation also may have saved Holder from being impeached by the House of Representatives.  Now it appears he will never be tried in the Senate for the “high crimes and misdemeanors” he has committed against the American people, as one conservative commentator has argued.  “If there is any justice, after the Obama era mercifully draws to a close, Holder will be prosecuted for a litany of abuses that have been ably documented by J. Christian Adams in Injustice: Exposing the Racial Agenda of the Obama Justice Department and by John Fund and Hans von Spakovsky in Obama’s Enforcer: Eric Holder’s Justice Department” (Matthew Vadum, “Eric Holder Out, Special Prosecutor to Follow?” American Thinker, September 25).   

                In addition to the previously-mentioned actions (and inactions), Holder’s sorry record as AG includes his directives to scrub the word jihad and all Islamic references from FBI counterterrorism training materials, and barring agents from infiltrating mosques – all due to political correctness and “Islamophobia-phobia” – which weakened homeland security; as well as the DOJ’s spying on AP reporters, seizing their phone records, and persecuting Fox News reporter James Rosen (the scandal that some media commentators called “AP-Gate” in 2013).  Summing up Holder’s record, the editors of Investor’s Business Daily observe: “Holder was the first African-American to be named the nation’s top cop, a welcome and poignant milestone in our troubled history.  But to the nation’s detriment, he proved a racialist and conducted himself in the most dishonest manner, enforcing the law for some groups but not for others.  He injected politics into the law at every turn.  He ranks among the worst attorneys general ever to serve.  He’s also one of the longest-serving, and the damage he’s done may be lasting. . . .  His exit can’t come fast or furious enough” (“Holder Leaves, Damage Done,” I.B.D., September 26).  

                George Washington University law professor Jonathan Turley, a left-liberal who (as noted above) has been concerned about B.O.’s abuses of power, wrote a more charitable obituary of Eric Holder’s tenure as AG soon after Holder announced his resignation.  Holder “could have been [a] truly great” attorney general, Professor Turley argues, but he squandered his opportunity by compiling “a destructive legacy on civil liberties and constitutional government,” as B.O.’s accomplice.  Holder “personally announced Obama’s `kill list’ policy, in which the president claimed the right to kill any U.S. citizen without a charge, let alone conviction.”  Moreover, the DOJ under Holder “used the controversial Espionage Act of 1917 to bring twice as many prosecutions as all prior presidents.  He oversaw Nixonian surveillance of journalists and led a crackdown on whistle-blowers.  Holder fought to justify massive warrantless surveillance and unchecked presidential authority” (Turley, “Holder could have been great,” USA Today, September 29).

                Meanwhile, Holder stays on as attorney general until B.O.’s nominee for his successor is confirmed by the Senate – a process that may drag on for months, or more, if B.O.’s pick is as controversial as Holder.  Among the rumored candidates is labor secretary Thomas Perez, the former head of the Injustice Department’s Civil Rights Division – the same man who committed perjury before the U.S. Civil Rights Commission when he testified that there was no “political leadership” involved in the decision to drop the New Black Panthers case.  Perez is even more radical than Holder – an overt extreme leftist, as well as race-baiting Latino activist – whom the editors of Investor’s Business Daily recently described as “an agenda driver for the `social justice’ wing of the Democrat Party.”  “He won’t helm the Justice Department to enforce laws equally – or enforce them at all in the case of illegal immigration.”  Perez recently told the National Press Club that the country needs to “fix our broken immigration system” by expanding legal status for immigrants through “aggressive executive action.”  And he’d continue Holder’s agenda to shake down bankers for billions in payola, under the threat of bringing more racial discrimination lawsuits, using the dubious “disparate impact” theory that Perez pushed as civil rights chief (“A Known Liar to Enforce Our Laws?” October 27).

      

     

    n Alienating Friends, Appeasing Enemies: B.O.’s “3-Fs” Foreign Policy

                 In “Thoughts for Summer 2012” (May 25, 2012), I summarized B.O.’s record on international matters as follows: 

    “[R]egarding foreign policy and national defense, B.O. has failed to make Americans more secure or the world more stable.  Rather than taking a strong stance against militant Islam (the greatest threat to world peace today), he and Secretary of State Hillary Clinton have naively supported `democratic’ movements in countries like Egypt and Libya – the so-called `Arab spring’ – resulting in the empowerment of groups like the Muslim Brotherhood.  Although accelerating U.S. troop withdrawals from Iraq, B.O. has intensified the war in Afghanistan – his Vietnam – while being powerless to stop Iran from developing nuclear weapons, leaving our best ally in the region, Israel, vulnerable to attack unless Israel acts on its own, with a preemptive strike for its own national survival.  . . . Most ominously, B.O. in effect has raised the white flag of surrender under threats by Vlad Putin, as Russia seems ready to renew the cold war.  Hence, he’s backed down from plans to install defensive missiles in eastern Europe – betraying allies like Poland – and apparently has promised the Russians that he’ll continue to push for unilateral American nuclear disarmament (the context of the comments recently picked up on a live microphone in Seoul, where B.O. was meeting with Russian President Medvedev, when B.O. sent a private message to Putin, that he needed `space’ until the election, after which he’d have `flexibility’ with Russia over the issue of missile defense).” 

     

    As to the last point, I added: “What some political commentators have called `Open-Mic-gate’ is a chilling preview of what Americans can expect, generally, from a second term of B.O.’s presidency – if they are foolish enough to re-elect him.” 

                Unfortunately, Americans were foolish enough to reelect B.O.  And now that he’s a lame-duck president (or “occupier” of the White House), no longer under the pressure of courting voters for his reelection, what will he do when he can act with unlimited “flexibility”?  It’s a frightening prospect, considering B.O.’s fundamentally anti-American agenda: to weaken the United States, not only economically but also militarily, in order to transfer America’s wealth and power to countries in the so-called third world, to promote what he regards as global “social justice.”  (As noted above, that’s the goal of the radical leftist anti-colonialist ideology he inherited from his father, Barack Hussein Obama, Sr., as documented by Dinesh D’Souza, in his two books, The Roots of Obama’s Rage (2010) and Obama’s America: Unmaking the American Dream (2012), and his documentary film, 2016: Obama’s America.  No wonder D’Souza has been targeted by the feds for alleged campaign-finance violations, in a failed attempt to silence him.)   

                B.O.’s foreign policy can be summed up as a policy of alienating the friends of the United States (such as Israel, Canada, Britain, and eastern European emerging capitalist nations like Poland) while simultaneously appeasing enemies (such as Russia, Iran, and militant Islamic groups like the Muslim Brotherhood and Hamas).   

                It is a policy that is not only fickle and feckless but also fucked-up – hence, I call it B.O.’s “3-Fs” foreign policy.  “Fucked-up” is a broad phrase that encompasses the apparent confusion in B.O.’s policy, which is at best internally contradictory and at worst perfidious.  (My dictionary defines perfidy as “a deliberate breach of faith or trust,” or “treachery.”  As discussed in the section below, on the Ebola crisis, conservative columnist Thomas Sowell has suggested that B.O.’s occupancy of the White House has resulted in a “perfidious presidency.”)  Indeed, B.O.’s actions in office repeatedly have bordered on being treasonous – which, as I have suggested above, would make B.O. the first president who would deserve to be impeached on charges of treason.  He’s weakened the ability of the United States to defend its own interests, undermining our credibility abroad, and giving aid and encouragement to our global enemies.  

                Consider, for example, B.O.’s imprudent “prisoner swap” of five high-level Taliban detainees in Guantanamo Bay for U.S. Army Sgt. Bowe Bergdahl, described as a POW but more like a deserter from his post who was kidnapped by militant Muslims in Afghanistan.  As U.S. Senator Rob Portman (R.-Ohio) has pointed out, B.O.’s decision violated the National Defense Authorization Act’s requirement that the president give Congress 30 days’ notice before releasing Guantanamo detainees.  Furthermore, as several commentators have suggested – including Fox News judicial analyst Andrew Napolitano and former federal prosecutor Andrew McCarthy – B.O.’s “prisoner swap” also violates the federal criminal law against providing material support to terrorism.  (One commentator has called it “the most preposterous prisoner exchange in history”; it “would be like the U.S. in World War II handing over Himmler, Goebbels, and three other top Nazis to release one American POW”  (Michael Fumento, “So-Called Prisoner Exchange Was Merely a Huge Gamble,” I.B.D., June 6).) 

                Consider also, about a year ago, how close B.O. came to involving the United States in yet another war in the Middle East – the Syrian civil war, pitting militant Islamic insurgents against the regime of Syrian president Bashar al-Assad – because of the “red line” on chemical weapons that he foolishly drew.  Now, with the air strikes he has ordered against ISIS in Syria, it’s anyone’s guess whether the U.S. military intervention will tip the balance in favor of Assad or the rebels (or, if it’s the latter, whether it will benefit most the militant Islamists or the supposed “moderates”).           

                The threat to Americans from “home-grown” militant Islamic terrorists operating on their own within the United States was dramatically called to Americans’ attention by Army Major Nidal Hasan’s killings at Fort Hood, Texas, in November 2009, and by the Boston Marathon bombing.  Now, thanks largely to B.O.’s decision to wage war against the so-called Islamic State (discussed in a section below), others have been inspired by ISIS to commit atrocities inside the U.S. – such as the fired Muslim employee who beheaded a co-worker at a food-processing company in Oklahoma – or in the countries of our allies, such as the recent shooting attack on the Canadian Parliament building in Ottawa, Ontario.  

                It is little wonder, then, that recent polls show a large majority of Americans feel less secure and question the ability of government to protect its citizens.  The latest Associated Press–GfK poll, released early in October (about a month before Election Day) found that only one in five persons (approximately 20%) say they are extremely or very confident that the government can keep them safe from another terrorist attack.  The poll found that Democrats tended to express more faith in the government’s ability to protect them than Republicans, yet even among Democrats only 27% are confident the government can keep them safe.  (The same poll also asked which issues they thought were most important in this year’s elections.  “Climate change” was at the bottom of the list, mentioned by fewer than one in five; at the top was “the economy,” with nine in ten persons calling it an extremely or very important issue.)

     

     

    n Benghazi, 2 Years Later: All (Ben) Rhodes Lead to . . . Hillary!

                This past September 11 marked not only the thirteenth anniversary of the al-Qaeda terrorist attacks on the United States but also the second anniversary of the militant Islamic attack on the U.S. consulate in Benghazi, Libya, which resulted, among other things, in the brutal murder of four Americans (including U.S. Ambassador Christopher Stevens).  It was an outrageous act that exposed the fecklessness of B.O. in dealing with the threat of militant Islam; it also exposed the overall failure of B.O.’s “3-Fs” foreign policy, which has been disastrous.  And, with the false story that was pushed by B.O., his then secretary of state, Hillary Clinton, and other officials (particularly Susan Rice, then U.N. ambassador and now B.O.’s national security adviser) – attributing the attack in Benghazi to “spontaneous” Muslim outrage over an anti-Mohammad video that had been posted on the Internet – B.O. and his regime also told a whopper of a lie to the American people.  (And, in the case of Mrs. Clinton, it was a lie that she shamelessly told, face-to-face, to the father of one of the murdered Americans, when his body was returned to the United States.)  The cover-up apparently was done – with the full cooperation of B.O.’s “bitches” in the “lamestream” news media – to prevent Benghazi from becoming a scandal that might have jeopardized B.O.’s reelection in 2012. 

                For the past two years, Congress has been investigating the matter – with House Republicans on the Oversight and Government Reform Committee taking the lead, while Democrats generally have been stonewalling, in an attempt to shield from scrutiny Hillary “What difference does it make?” Clinton.  The House Select Committee chaired by Rep. Trey Gowdy (R.–S.C.), whose able members include Rep. Jason Chaffetz (R.–Utah) and my former student Rep. Jim Jordan (R.–Ohio), has been doing yeoman’s work in questioning State Department employees (uncovering such interesting facts as a weekend operation in a Department basement office to filter out and hide from the Benghazi Accountability Review Board any damaging documents that might put Mrs. Clinton in a bad light).    

                A critical development occurred in late April 2014 when, pursuant to a Freedom of Information Act (FOIA) lawsuit by the conservative watchdog group Judicial Watch, the State Department finally released the non-redacted text of a Sept. 14, 2012 email that had been sent by a White House official, on the eve of Susan Rice’s appearance on five Sunday talk shows, where she repeatedly told the lie blaming the attack on “spontaneous” protests against an allegedly anti-Mohammed video.  The author of the memo was a White House “spin” doctor, Ben Rhodes (identified as the “deputy national security adviser for strategic communication and speech writing”); it was sent to other key White House staffers such as then-communications director David Plouffe and press secretary Jay Carney; and among those cc’d on the email was a key assistant to Hillary Clinton.  The subject line of the email was “RE: PREP Call with Susan: Saturday at 4:00 p.m. ET,” and it discussed the need for Ms. Rice “to underscore that these protests are rooted in an Internet video, and not a broader failure of policy.”  In her Sunday talk-show appearances she was also to “reinforce the President and Administration’s strength and steadiness in dealing with difficult challenges.”  In other words, as the editors of Investor’s Business Daily aptly summed it up, “Her job was not to tell the truth but to put lipstick on the Obama administration’s Benghazi pig” (“Rice Prepped To Lie on Benghazi,” April 30).   

                Notwithstanding this clear evidence of White House orchestration of a cover-up, Mrs. Clinton in media interviews in mid-June “reached into her bag of clichés” and blamed on “the fog of war” the lie that the Benghazi terrorist attack was caused by that inflammatory video (“Liar, Liar, Pants Suit on Fire,” I.B.D., June 19). 

                Many critical questions about the Benghazi attack – and the B.O. regime’s culpability in it – still remain to be answered.  As Charles Krauthammer asked in an op-ed this spring:  “Where and to what extent was there dereliction of duty as memos, urgent pleas, and mounting evidence of danger were ignored [by the State Department] and the U.S. ambassador allowed to enter a deathtrap?” “What happened during the eight hours of the Benghazi attack, at the end of which the last two Americans (of four) were killed by mortar fire? Where was the commander in chief and where was the responsible Cabinet secretary, Hillary Clinton?  What did they do?” And, knowing as we now do that the White House was pushing the “video made them do it” cover-up, “Who was involved in that decision, obviously designed to protect a president campaigning on the [false] idea that al-Qaida was `on the run’?”  (Charles Krauthammer, “How the GOP Should Handle Benghazi Probe,” I.B.D., May 9).  There’s another critical question that also needs to be answered:  Why was Ambassador Stevens at the consulate in Benghazi, and what were CIA operatives doing there – perhaps running guns to Syrian rebels?  (“Was CIA Running Guns in Benghazi?” I.B.D., Aug. 5, 2013). 

                What were they trying to cover up?  Maybe the embarrassing fact that B.O.’s wrong-headed military intervention in Libya (and also perhaps his attempt to intervene in Syria) was the root cause of the attack on our consulate in Benghazi.  In other words, further evidence that B.O. is unfit for the office he holds – and that former secretary of state Hillary Clinton is equally unfit.  “What difference does it make,” Mrs. Clinton?  It ought to disqualify you from running for the presidency in 2016.

      

     

    n Putin on the Blitz

                 B.O.’s policy toward the United States’ greatest enemy during the Cold War – the former Soviet Union, whose successor state, Russia, is now almost as dangerous under the dictatorship of former KGB officer Vladimir Putin – apparently is, just like his policy toward militant Islamic terrorists, a policy of appeasement. When Hillary Clinton, as B.O.’s secretary of state, pledged to help “reset” U.S.-Russian relations (using a prop “reset” button that was mislabeled: the Russian word chosen for “reset” actually meant “overcharge” – a Freudian slip, perhaps?), she was following the administration’s M.O. in bashing the previous Bush administration. But B.O. has moved from appeasement to downright recklessness, endangering the national security of the United States, by signaling to Putin his willingness to unilaterally back away from a European missile-defense system, the one effective trump card the U.S. had had in negotiating with Russian thugs. 

                B.O.’s assurance to the Russians that he will have more “flexibility” in his second term apparently has given a green light to Vlad Putin to flex his muscles, at the expense of his neighboring country, the Ukraine.  Soon after the Ukrainian people deposed Putin’s pro-Russian puppet from the presidency in Kiev, Putin brazenly invaded the Crimean peninsula, orchestrated a phony election in which the people of Crimea voted to secede from the Ukraine, and then announced that Russia had formally annexed Crimea.  He then started fomenting a separatist rebellion among the ethnic Russians in eastern Ukraine, arming the rebels – resulting in, among other things, the downing of a commercial airliner by rebels probably using Russian-supplied missiles – and finally injecting Russian special forces into their ranks and adding Russian army support.   

                The speed with which Putin invaded the Ukraine was reminiscent of Hitler’s “blitzkrieg” invasion of Poland in 1939, which triggered World War II.  Even Hillary Clinton, at a private fundraiser in Long Beach, Calif. in early March, made the “eyebrow-raising remarks” comparing Putin’s tactics to those used by Adolf Hitler “back in the `30s,” when Hitler used claims of ethnic Germans being mistreated in Czechoslovakia and Romania as a pretext for invading those countries (“Clinton likens Putin’s tactic to Hitler's,” Columbus Dispatch [reprinted from Los Angeles Times], Mar. 6, 2014). 

                The comparison to Hitler is not as far-fetched as it seems.  Like Hitler, Putin seems to be motivated by nationalism – an ardent desire to restore the lost Russian Empire, not necessarily the old Soviet Union but the empire of czarist Russia, with himself (of course) as the emperor.  Re-taking Crimea, and hence restoring to Russia a warm-water port on the Black Sea, is a critical part of that plan; whether it also includes taking all of the Ukraine, or just the eastern, Russian-colonized side of the Ukraine, remains to be seen.  (I suspect that Putin’s design, at minimum, is to split the Ukraine into two nations and to make eastern Ukraine a Russian client-state.)  George Will perceptively observed in a recent column that “Putin’s essence is anger.  It is a smoldering amalgam of resentment (of Russia’s diminishment because of the Soviet Union’s collapse), revanchist ambitions (regarding formerly Soviet territories and spheres of influence), cultural loathing (for the pluralism of open societies) and ethnic chauvinism that presages ‘ethnic cleansing’ of non-Russians from portions of Putin’s expanding Russia.”  Will calls it “more than merely the fascist mind” because “its ethnic-cum-racial component makes it Hitlerian” (“An Angry Putin More Vicious Than Jihadists,” I.B.D., September 4). 

                The real danger of B.O.’s policy of weakness toward Russia is that it leaves the future of eastern Europe, and of the NATO alliance, entirely up to the hope that Putin might exercise some self-restraint.  George Will warns: “Suppose Putin, reprising his Ukrainian success, orchestrates unrest among the Russian-speaking minorities in Latvia, Lithuania, or Estonia.  Then, recycling Hitler’s words that his country `could not remain inactive,’ Putin invades one of these NATO members.  Either NATO invokes Article 5 – an attack on any member is an attack on all – or NATO disappears and the Soviet Union, NATO’s original raison d’etre, is avenged.”

      

     

    n “Clutch” Kerry: Hamas-culated in the Mideast

                 In “Spring Briefs 2013” (Mar. 23, 2013), in a section titled “Second-Raters for a Second Term,” I quoted former V.P. Dick Cheney’s description of  B.O.’s second-term Cabinet picks as, collectively, a bunch of “second-rate people.”   I observed, “He’s actually giving them too much credit.  B.O.’s first term Cabinet were a bunch of mediocrities, and his picks for replacement of departing Cabinet members in his second term are even worse – not `second-raters,’ but third-, or fourth-, or fifth- (and so on) raters.  They’re just a bunch of “yes”-men and -women, people who have risen far above their levels of incompetence (according to the “Peter Principle”), the kind of people our second-rate, narcissistic president surrounds himself with, just to feel good about himself.” 

                Perhaps no one better epitomizes B.O.’s second-term “second-raters” than John Kerry, who as Secretary of State has succeeded only in making his predecessor, Hillary Clinton (one of the most mediocre secretaries of state in modern American history), look competent in comparison.  As I noted, “Kerry is a left-wing Democrat, former U.S. Senator from Massachusetts, and is further to the left than his predecessor,” Mrs. Clinton.  Conservative commentator Geoffrey P. Hunt, writing in American Thinker, described Kerry as a “speciously decorated” Vietnam veteran, “disgracing his service and his uniform with dubious testimony before the Senate Foreign Relations Committee in 1971 [comparing the U.S. military with “Genghis Khan”], now the champion for the Afghan quagmire, and third in succession to the White House.”  Moreover, “Kerry’s top treaty-making priority is defeating global warming” (“Second-Rate Appointments from a Third-Rate President,” Feb. 13, 2013).   

                I call him “Clutch” Kerry.  Thanks to some recent cosmetic makeover – whether it was a surgical face lift, or BOTOX injections, has been the subject of some speculation – Kerry has a strange-looking, smoothed-out face and talks so stiffly, with little facial movement other than his lips, that he reminds me of the early 1960s animated character, “Clutch Cargo.”  (The animators of “Clutch Cargo” did their human characters on the cheap: rather than animating their entire faces, they animated only their lips.  Kind of like the way John Kerry looks now.)  

                “Clutch” Kerry’s missteps as secretary of state are already legion, so I will note only the most egregious.  Last September he signed the United Nations Arms Trade Treaty, which would jeopardize Americans’ rights under the Second Amendment, despite repeated indications that the treaty would be dead on arrival in the U.S. Senate.  He has declared, “The era of the Monroe Doctrine is over” – thus jettisoning a fixed star in U.S. foreign policy for the past 190 years.  Following a page right out of the playbook of Neville Chamberlain (the British prime minister known as the “Great Appeaser” because of his naïve Munich Agreement with Hitler in 1938), he has negotiated a deal with Tehran that supposedly will bring an end to Iran’s atomic weapons program.  And earlier this year he absurdly claimed that “climate change,” or global warming, is “the world’s most fearsome weapon of mass destruction.” 

                Worst of all, however, have been “Clutch” Kerry’s attempts at negotiating peace between Israel and the Palestinians.  He has not only insulted the Israelis, our key allies in the region, but also managed to piss off America’s Arab partners (particularly Egypt and Saudi Arabia) and even to alienate Fatah, the main element of the Palestinian Authority, by rejecting an Egyptian cease-fire initiative backed by the Arab League and accepted by Israel.  Instead, Kerry flew to Paris to secretly negotiate with Qatar and Turkey, allies of Hamas – the Palestinian terrorist group that is the bitter enemy not only of Israel but also of Fatah (whose members Hamas terrorized, expelled, and/or killed when the group seized Gaza in a 2007 coup) and Egypt (because of Hamas’s support for the Muslim Brotherhood and terror attacks on Egyptian soldiers in Sinai).  Hamas is also “deeply opposed by Jordan, Saudi Arabia, and the Gulf states [who] see it, correctly, as yet another branch of the Islamist movement that threatens relatively moderate pro-Western Arab states,” Charles Krauthammer has observed in a recent column.  By helping Hamas, Kerry seems ignorant of political realities in the Middle East – and blind to America’s interest, which, as Krauthammer describes it, is “to endorse and solidify this emerging axis of moderate pro-American partners (Israel, Egypt, Jordan, Saudi Arabia, the United Arab Emirates, and other Gulf States, and the Palestinian Authority) intent on seeing Islamist radicalism blunted and ultimately defanged.”   

    Kerry’s betrayal of our Mideast allies has “legitimized Hamas’ war criminality,” which “makes his advocacy of Hamas’ terms not just a strategic blunder – enhancing a U.S.-designated terror group just when a wall-to-wall Arab front wants to see it gone – but a moral disgrace” (Charles Krauthammer, “Clueless Kerry Blunders Along in the Mideast,” I.B.D., August 1).

     

      

    n The ISIS Crisis: Between Iraq and a Harder Place  

                 Perhaps nothing better illustrates B.O.’s failed “3-Fs” foreign policy than the new war in Syria and Iraq that he announced in a 15-minute speech to the nation from the White House on September 10, the eve of the 13th anniversary of the 9/11 Islamic terrorist attacks on the United States.   

    Our new enemy, presumably, is the self-proclaimed “Islamic State,” or ISIS (the Islamic State in Iraq and Syria) – or ISIL (the Islamic State in Iraq and the Levant), the name that B.O. and his regime – and only B.O. and his regime – prefer to use, for some peculiar reason.  (I wonder why?  It’s less accurately descriptive, because Levant is the name given to the entire eastern shore of the Mediterranean, which may indeed be the ultimate aim of conquest for the IS’ers, but is an odd choice for American politicians to use as a label.  Is it some arbitrary decision made by a faceless policy wonk in the White House, or by Valerie Jarrett – B.O.’s Rasputin – or some other political handler, to deemphasize what is truly novel about B.O.’s new war, that for the first time in U.S. history it involves the use of American military force in Syria, in fact intervening in the Syrian civil war?)  According to media reports, B.O.’s hand was “forced” by a sudden shift in American public opinion – which until recently seemed to have grown weary of U.S. military involvement in the Mideast, resulting in the death and maiming of thousands of American servicemen and women – because of two horrifying videos posted on the Internet, showing ISIS militants beheading two American journalists.  (Under the circumstances, it seems that ISIS deliberately dared the United States to take action against them – presumably to help their global recruitment efforts, by exploiting the image of America as the “great Satan,” as the Iranian militant Islamists did during Jimmy Carter’s presidency.  And B.O. jumped right into their trap – egged on further by hawkish “progressives” and/or “neocons” in both political parties, such as John McCain and B.O.’s would-be successor, Hillary Clinton (who is, essentially, John McCain in a skirt – or should I say a pantsuit?)) 

    ISIS is a new twist in a very old story – the centuries-old story of fundamentalist, militant versions of Islam threatening not only their fellow Muslims in the Middle East but also the West, including Europe and the United States.  What’s new about ISIS is not only its brutality – murdering great numbers of men, women, and children and even destroying landmarks and shrines beloved by other Muslims who do not share their extreme vision of Islam – but also its rapid success as it tries to achieve its dream of a new “caliphate,” a theocracy that would unite the whole Islamic world under its militant vision of the faith.  

    Yet militant Islamic terrorists do not pose the same existential threat to the United States as did the old Soviet Union, for the simple reason that the Islamists do not (yet) possess nuclear weapons.  Those who are most directly threatened by ISIS are its Islamic neighbors, not only in the two countries ISIS is attempting to conquer, Iraq and Syria, but also the countries most threatened by ISIS plans for regional hegemony – Turkey, Iran, and Saudi Arabia.  Moreover, most analysts estimate the number of ISIS fighters in both Iraq and Syria to be about 20,000 – a number that is dwarfed by its foes in the Syrian and Iraqi armies, both in numbers and firepower.  (The Iraqi military and police force are estimated at more than 1 million; the Syrian army is estimated at 300,000 soldiers, with more than 100,000 Syrian rebels.  Tens of thousands of Kurdish Peshmerga forces are fighting the group in Iraq.  Still, as an AP wire story reported on September 11, ISIS is “a formidable force that has amassed a significant amount of weapons and hardware captured from Iraqi and Syrian military installations during the recent conquests.”)  The theory apparently behind the strategy that B.O. and his advisers have devised is that U.S. air strikes in Syria would help destroy ISIS’s capabilities, but so far – with the United States having already spent nearly $1 billion, and with the mission’s costs estimated at between $200 million and $320 million per month going forward, according to the Washington Post – the air strikes do not seem to have been very effective in stopping ISIS.   

    There’s only one thing that’s certain about B.O.’s new war: it’s unconstitutional, unless he gets a congressional authorization for the use of force.  That B.O. would act on his own, without congressional authorization, in this area is not at all surprising, given B.O.’s general propensity to abuse “executive actions” (discussed above) as well as his overall contempt for the Constitution – and given the history of all U.S. presidents, whether Democrats or Republicans, since World War II, abusing their authority as commander-in-chief of U.S. military forces.  As I discuss in Chapter 10 of my Constitution book, the president’s legitimate commander-in-chief power is only the power to determine how to use U.S. military forces, after the Congress determines where and when to use U.S. military force (pursuant not only to Congress’s sole power to declare war but also a bundle of other powers that the Constitution gives Congress regarding military policy).  Only when the United States is directly attacked may the president act without authorization of Congress – and then only defensively.  By initiating yet another war in the Middle East, B.O. is indeed abusing his presidential powers – which is yet another impeachable offense.  

    Nevertheless, Republicans and “conservatives” (except those few who are genuinely limited-government conservatives or libertarians) are equally guilty, along with B.O., in urging him to go to war against ISIS – and in maintaining that he doesn’t need congressional authorization to do so.  Most of the Republican and conservative critics of B.O.’s foreign policy, influenced by the “neoconservative” wing of the Republican Party and by nutty war-hawks like John McCain and former V.P. Dick Cheney, blame B.O. for the ISIS mess – asserting that it was caused by his premature withdrawal of U.S. troops from Iraq, arising out of his failure to negotiate a status-of-forces agreement with the Maliki government.  These critics are right only insofar as the U.S. did leave a power vacuum in the Middle East, by allowing the governmental power in Iraq to pass to a militant Shiite prime minister who alienated Iraq’s Sunni and Kurdish populations.  But that power vacuum was originally created by B.O.’s predecessor, George W. Bush (Bush the younger), when he – pursuant to Congress’s post-9/11 blank-check authorization for the use of force – ordered U.S. military forces to invade Iraq, in order to overthrow Saddam Hussein’s regime.  (And thus to finish the job that his father, George H.W. Bush – or Bush the elder – left unfinished, in the eyes of the neocons, following the first Gulf War, which I’ve called “the war to restore the Kuwaiti monarchy” – a war in which no vital U.S. interests were at stake.  Exactly like Bill Clinton’s war in the former Yugoslavia (Bosnia), another war against Muslim terrorists, or B.O.’s intervention in Libya to overthrow Moammar Gadhafi.  (Actually, Bush the elder, like his son, also got authorization from Congress – months after he had sent U.S. forces to the Middle East, poised to invade Iraq – but he at least recognized he needed congressional authorization.  Clinton and B.O. openly flouted the Constitution and abused their commander-in-chief powers by acting on their own.)   

    I confess that I was wrong in supporting Bush the younger’s war in Iraq – which was arguably the worst U.S. foreign policy blunder since the Vietnam War.  Far too many American servicemen and servicewomen have either lost their lives or become permanently injured or maimed because of the useless Iraqi war.  And it was indeed a useless war.  The neocons are wrong in arguing that B.O. blew it by withdrawing U.S. forces prematurely.  The intervention was doomed from the start because Iraq is an artificial entity, created after World War I by British and French intervention in the Middle East, that attempts to govern three separate peoples – Shiite Muslims, Sunni Muslims, and the Kurds – whose religious and cultural divisions can be overcome only by a brutal dictatorship like that of Saddam Hussein.  By overthrowing Hussein – on the pretext (which today, in 20/20 hindsight, has been proved totally wrong) that he possessed “weapons of mass destruction” that threatened not only his neighboring countries but also the U.S. and the West – the United States created the mess in which ISIS has emerged.  Senator Rand Paul – about the only Republican who can make some sense of the Middle East and U.S. foreign policy (and also about the only Republican U.S. Senator who consistently follows the Constitution) – has pointed out that the failed U.S. policy in the Middle East hasn’t resulted only from the failure of B.O.’s foreign policy; rather, it’s the inevitable result of the broader failed policy of military intervention where the vital interests of the United States are not at stake.  It’s the failed policy of the neoconservatives (in the Bush administration) and the so-called “progressives” of both major political parties (both the Clintons and John McCain, among others), who have supported the use of U.S. military force to overthrow the four “secular dictators” (as Rand Paul calls them) – Hussein in Iraq, Mubarak in Egypt, Gadhafi in Libya, and Assad in Syria – who were brutal tyrants over their own people but who provided a buffer against the even more brutal tyranny of militant Islamic terrorist groups like al-Qaeda and ISIS. 

    From a constitutional standpoint, the critical point is this: B.O., on his own authority as “commander in chief,” cannot initiate a war; it must be authorized by Congress, by either a formal declaration of war or a resolution authorizing the offensive use of military force.  Yet, just as he did in intervening in Libya, B.O. takes the position that congressional authorization is unnecessary, while the leaders of Congress – Democrat and Republican alike – have abdicated their responsibility to debate and vote on the issue, at least until after the midterm elections.  

    Meanwhile, both the B.O. regime and its lapdogs in the news media continue to follow the “politically correct” line of denying that militant Islamic terrorists, acting in concert through groups like al-Qaeda and ISIS or acting on their own, are indeed at war – from their perspective, an Islamic “holy war,” or jihad, against the United States and our allies.  Hence, the murders committed by the Muslim jihadist Nidal Hasan at Fort Hood have been classified by the B.O. regime as mere “workplace violence.”  At least Canadian prime minister Stephen Harper has the honesty to identify the perpetrator of the recent shooting at the Canadian Parliament in Ottawa as a “terrorist” (while the Associated Press – giving more evidence that “AP” stands for “Administration Propaganda” – identifies the killer only as an “ex-con,” not as a convert to Islam and a jihadist who apparently was inspired by ISIS).  “Home-grown terrorism” is a real threat to the national security of the United States, and to the lives and property of Americans, but the B.O. regime stubbornly refuses to “profile” Muslims in the U.S. as potential terrorists, or even to acknowledge that it is indeed religion – a militant fundamentalist version of Islam – that is behind global terrorism. 

    Speaking in terms of both the Constitution and the national security of the United States, the only thing worse than an undeclared war is an undeclared half-assed war – but that is precisely what B.O. has led the United States into, in the Middle East.

      

     

    n The Ebola Shinola: Between Hype and Hysteria

                 One genuinely new issue that has emerged this fall has been the Ebola epidemic in western Africa and the question whether it will spread to the United States.  By “it,” I mean the epidemic – for the disease already has reached the U.S., in a few isolated cases, causing Americans much fear.  It doesn’t help that the news media have so hyped the story that it is generating real hysteria.   

                It reminds me of the mid-1980s, when a mysterious disease, Acquired Immune Deficiency Syndrome (AIDS), which initially seemed to infect only sexually promiscuous gay men, had begun to spread to other people (including IV drug users and even hospitalized children who had been exposed to infected blood during certain medical procedures).  Rather than dying of AIDS itself, patients would die from cancer or other opportunistic diseases that invaded their bodies because of weakened immune systems.  In time – after the hysteria that similarly had been generated by media hype, and by misinformation about the disease (or technically the “syndrome”), had been replaced by rational thinking and sensible precautions (including the practice of “safer sex” by homosexuals and heterosexuals alike and stricter protocols for the use of blood products) – AIDS became more manageable.  Although it is still a deadly disease, new drugs have been developed to extend the lives of persons who have been infected by HIV (the probable cause of the syndrome).  As a result, young Americans have become so complacent about the danger of HIV/AIDS that they are again engaging in risky sexual behavior – a subject that’s been virtually ignored by the news media, which have moved on to more sensationalistic stories.  

                There are more parallels between Ebola and AIDS.  Like AIDS, Ebola is a deadly disease caused by a virus; it is contagious – a communicable disease – but (according to health experts) only in a limited way.  Ebola is spread by direct contact with bodily fluids (blood, urine, semen, sweat, vomit) of an infected person, but apparently (unlike AIDS) only when the infected person is exhibiting symptoms of the disease (particularly fever).   

                But Ebola, a type of hemorrhagic fever, is an especially nasty, deadly disease: the virus shuts down the body’s production of interferon (our first defense against viruses) and then continues to replicate itself over and over. Because the virus attacks platelets – which help the blood to clot – patients infected with Ebola can bleed profusely and vomit blood.  Even a small amount of an infected person’s blood is teeming with Ebola virus, so the disease can easily spread to caregivers who have the most intimate contact with patients’ bodies – doctors, nurses, and other health-care workers in intensive care units, family caregivers, and mortuary workers (“Why Ebola’s infection risk rises in ICUs,” USA Today, October 15).  That is why health providers must scrupulously follow the correct protocols when dealing with Ebola patients – and why it makes sense to treat such patients in specialized hospitals that are fully equipped with isolation units to deal with the disease.  (There are only four such hospitals in the United States: the National Institutes of Health in Bethesda, Md.; Emory University Hospital in Atlanta, Ga.; the University of Nebraska Medical Center in Omaha; and St. Patrick Hospital in Missoula, Mont.)  As Professor Peter Piot, who discovered the Ebola virus in 1976, has warned, “The smallest mistake can be fatal.”   

                In West African countries – especially the three countries most affected by the epidemic, Guinea, Liberia, and Sierra Leone – the death toll is rapidly rising.  More than 10,000 people have been infected and nearly half of them (over 4,900 people) have died, according to the World Health Organization, which has warned that unless the disease is contained, the infection rate in western Africa could jump from 1,000 new cases a week to 10,000 cases a week within two months, with a fatality rate of about 70%.  The epidemic is largely uncontrolled in these “third world” countries because of their primitive health-care systems, lack of adequate sanitation, and often perverse customs – such as the Islamic pre-burial ritual of relatives washing the corpses of loved ones from head to toe, after first pressing down on the abdomen to excrete fluids still in the body.  This ritual, known as Ghusl, is considered a duty for Muslims, lest they leave the deceased “impure” and hence jeopardize his ascension to Paradise.  Another dangerous custom, prescribed by Shariah law, is for Muslims attending the funeral to pass a common bowl for use in ablution or washing of the face, feet, and hands, which of course compounds the risk of infection.  Roughly 85% of Guinea’s and 73% of Sierra Leone’s people are Muslim, as are about 13% of Liberians (Paul Sperry, “Islamic Burial Ritual to Blame for Spread of Ebola in Africa?” I.B.D., October 17). 

                Hence the Ebola epidemic is essentially limited to these “third-world” countries, particularly in West Africa.  Nevertheless, given that B.O. and his regime have been trying to convert the United States into a third-world country, it is little wonder that Americans fear that the epidemic will spread to our shores, as the disease already has done in several isolated cases.  (First there was Dr. Kent Brantly, who was diagnosed with Ebola while he was working in Africa and then brought back to the United States and successfully treated for the disease; now an Ebola survivor, he has donated blood plasma that has helped treat other Ebola patients in the U.S., especially the two nurses who were infected while treating Thomas Eric Duncan, the Liberian national who brought the disease to the United States when he visited relatives here, and who subsequently died at Texas Health Presbyterian Hospital on October 8.  Meanwhile, another American doctor, working with the nonprofit Doctors Without Borders group, returned to the United States infected with the disease, bringing the threat to New York City – prompting the governors of New York and New Jersey to order mandatory quarantines for persons traveling from West Africa.)  As the writer of a letter to the editor just published in my local newspaper, the Columbus Dispatch, noted: “[D]o the government and medical groups really expect us to take their word that Ebola spreads only with certain types of contact after all the cover-ups and lies we’ve been fed recently?” (“Health-care providers must put others first,” October 29). 

                Given the regime in Washington – not only its incompetence but also its politicization of virtually every agency of government with its leftist ideological agenda – Americans are justifiably skeptical of statements made by its supposed “experts” in public health.  Dr. Tom Frieden, director of the Centers for Disease Control, doesn’t even understand the true meaning of “public health.”  As Health Commissioner for New York City under former Mayor Michael “Big Brother” Bloomberg, from 2002 until his appointment to the CDC by B.O. in 2009, he pushed Bloomberg’s paternalistic agenda – banning tobacco use, trans fats in foods, even sugary soft drinks – limiting persons’ choices, controlling their private health.  It was an agenda that had nothing to do with public health, properly considered, which concerns contagious diseases (like Ebola).  My parents, who were from the World War II generation, would say: Dr. Frieden “doesn’t know shit from Shinola”!  (Shinola was a famous brand of shoe polish.) 

                State and local governments should not depend on the unreliable CDC, or any other federal agency, to develop protocols and to train health-care providers in implementing the protocols, to control the spread of Ebola.  The states must take the initiative (pursuant to their legitimate constitutional powers to safeguard public health, a part of the “police power” that the Constitution vests exclusively in the states); the CDC and other federal agencies should confine their actions to the sole federal responsibility – preventing the entry of contagious diseases like Ebola into the United States.  One obvious action that the federal government should take is to generally bar entry into the U.S. of any persons (whether or not they are U.S. citizens) who are traveling, either directly or indirectly, from any West African country where the Ebola epidemic is rampant; travelers granted exceptions from the travel ban (such as returning American citizens who were engaged in humanitarian missions in those countries) should be permitted entry into the U.S. only after they have been cleared by strict screening procedures or after they have been quarantined for an appropriate period of time.   

                Amazingly, this obvious federal action is one that the B.O. regime refuses to take – according to CDC director Frieden (apparently echoing the views of his boss), because these West African nations are “fledgling democracies,” with fragile economies that might be negatively affected by such a travel ban.  The duty of the CDC director is to safeguard public health in the United States – not to put the interests of other nations above those of Americans, or to simply be a “yes man” to the politician who appointed him to the job.  Dr. Frieden has proven himself incompetent in the job he holds; he should resign, or be fired, or be impeached and removed from office by the Congress. It’s not just because he “doesn’t know shit from Shinola,” but also because he (like his boss) is simply unfit for his job. 

                Besides becoming a growing public-health crisis, the spread of Ebola into the United States has become a political crisis – yet another major scandal of B.O.’s regime – and thus effectively constitutes an “October surprise,” on the eve of the November congressional elections.  Republicans in the House are holding hearings on the U.S. government’s actions (or inactions), putting Dr. Frieden on the hot seat.  And B.O. took the extraordinary step (for him) of canceling two political fundraisers, in order to have a “photo op,” following an emergency Cabinet meeting, to suggest to the American people that he and his regime are “doing something” about the Ebola scare.  (What B.O. has done is simply add to the federal bureaucracy by appointing a Democratic lawyer and political operative – Ron Klain, former chief of staff to V.P.s “Smiley Joe” Biden and “Algore” – as an Ebola “czar,” giving him the title “Ebola response coordinator.”  Not only does Klain lack any medical background (the only kind of doctor he is, is a political spin doctor) or any expertise regarding public health matters, but his credentials as a political organizer are greatly exaggerated.  What are his claims to fame among fellow Democrats?  He signed off on the Solyndra fiasco, which wasted $535 million in taxpayer money on a failed solar panel manufacturer; he also supervised the even more massive waste of taxpayer money in B.O.’s failed economic “stimulus” program.)  

                At the very least, the Ebola scare provides further evidence of the incompetence of B.O. and his regime.  More than this, however: the developing Ebola crisis provides evidence that B.O. is worse than incompetent – that his occupation of the White House has resulted in a “perfidious presidency,” as conservative economist Thomas Sowell calls it in a recent op-ed: 

    “There was a time when an outbreak of a deadly disease overseas would bring virtually unanimous agreement that our top priority should be to keep it overseas. Yet Barack Obama has refused to bar entry to the United States by people from countries where the Ebola epidemic rages, as Britain has done. The reason? Refusing to let people with Ebola enter the U.S. would conflict with the goal of fighting the disease. In other words, the safety of the American people takes second place to the goal of helping people overseas.

     

    “As if to emphasize his priorities, Obama has ordered thousands of American troops to go into Ebola-stricken Liberia, disregarding the dangers to those troops and to other Americans when the troops return. What does this say about Obama?

     

    “At a minimum, it suggests that he takes his conception of himself as a citizen of the world more seriously than he takes his role as president of the United States. At worst, he may consider Americans' interests expendable in the grand scheme of things internationally.

     

    “If so, this would explain a lot of his foreign policy disasters around the world, which seem inexplicable otherwise. Those critics who have been citing Obama's foreign policy fiascoes and disasters as evidence that he is incompetent may be overlooking the possibility that he has different priorities than the protection of the American people and America's interests as a nation.”

     

    (Thomas Sowell, “The Monstrous Possibility of a Perfidious Presidency,” Investor’s Business Daily, October 7). 

                Sowell is quite right – and perhaps does not go far enough in identifying B.O.’s perfidy.  His apparently failed, or feckless (or fucked-up), foreign policy can be explained by an agenda that is deliberately anti-American: an agenda not only to promote the interests of “third-world” nations over those of the United States, but even to sacrifice the interests of the United States in order to bring about what B.O. regards as a more “equal” global redistribution of wealth and power (and now, apparently, disease, too).

      

     

    n (Tea) Party On, Dudes: Looking Ahead to November – and 2016

                 In my “Election 2010 Roundup” (Nov. 4, 2010), I began by observing, “Americans have dealt with their B.O. problem by applying deodorant to just one armpit.”  That’s how I summarized the historic 2010 midterm Congressional elections, in which the Republicans swept into control of the U.S. House – truly a tsunami of historic proportions – with a net gain of at least 60 seats.   

                The Tea Party movement (which in other blog essays I’ve called “the deodorant America needs to solve its B.O. problem”) was largely responsible for those Republican gains, by supporting many superb, principled limited-government conservatives, as well as by encouraging voters to turn out for GOP candidates generally (providing “the wind behind the sails of the Republicans’ victories,” I wrote).  Unfortunately, the GOP failed to gain control of the U.S. Senate – and, in the process, lost a great opportunity to defeat Senator Harry Reid (D.-Nev.), the corrupt Senate Majority Leader, by running a weak candidate, Sharron Angle, against him.  As I cautioned at the time: 

    “Nevertheless, applying deodorant to just one armpit is at best a temporary solution to the stench emanating from the national body politic (from the Washington, D.C. machinery of government dominated by B.O. and his regime).  To really rid themselves of their B.O. problem, Americans need to take a nice, long shower – to have a good, thorough cleansing – which will happen when they elect limited-government conservative (and libertarian) Republicans to be in charge of all three political branches of the national government – the U.S. House, the U.S. Senate, and the Presidency – two years from now, in the 2012 general elections.”

     

    Unfortunately, I was far too optimistic in my prediction.  The Tea Party movement failed to help the GOP in the 2012 elections – not only because it was being harassed by the IRS, but also because many Tea Party groups actually hurt the party that year.  By supporting questionable GOP candidates (such as Richard Mourdock and Todd Akin, the anti-abortion nuts who were the U.S. Senate candidates, respectively, in Indiana and Missouri, seats in two “Red” states that the GOP ought to have won), the Tea Party allowed the Democrats to retain control of the U.S. Senate.  Even worse, by failing to support Mitt Romney (whom many conservative activists regarded as too “moderate”), Tea Party activists allowed B.O. to be re-elected president with fewer popular votes than he had received in 2008.    

    Thankfully, in this year’s congressional elections, the Tea Party movement has been, on balance, far more of a help than a hindrance to the Republican Party.  In one of its greatest victories in this year’s GOP primaries, a Tea Party-supported candidate, Dave Brat, defeated House Majority Leader Eric Cantor in Virginia’s 7th District.  Brat, an economics professor at Randolph-Macon College, won by spending only about $200,000 in his campaign against Cantor’s $5.4 million.  “But Brat had something more valuable than money,” the editors of Investor’s Business Daily noted; however, it was not the immigration issue (as the editors suggest) or any other single issue that explains Brat’s victory (“Clear Lessons in Cantor Loss,” I.B.D., June 12).  Cantor lost because he was not as principled as Brat, who has a clearly articulated limited-government, free-market message.  Cantor, in contrast, was perceived by voters as too much of a Washington “wheeler-dealer,” willing to compromise with Democrats, and – perhaps most fatal of all – an “asshole,” in his relations with constituents.  Brat’s victory in the primary – and presumably in the upcoming general election (it’s a heavily Republican district, and Brat’s Democratic opponent is a fellow Randolph-Macon College professor) – illustrates how the Tea Party movement, by encouraging and supporting credible candidates, can strengthen the GOP by giving the party some principled backbone.   

    Contrary to the narrative being pushed by the leftist news media and by certain conservative commentators (such as Rush Limbaugh), there is no clear split in the GOP between “Tea Party” or “grassroots” Republicans and so-called “establishment” Republicans.  The “Tea Party” insurgency began after both major parties had supported the 2008 bailouts of the nation’s biggest banks, and the movement has emphasized issues of fiscal conservatism such as uncontrolled deficits, the burgeoning national debt, and generally the explosive growth of government spending and regulations.  Today, as Los Angeles Times columnist Doyle McManus observed in an insightful op-ed printed this spring, “it’s hard to find much daylight on fiscal issues between `establishment’ conservatives such as [Sen. Mitch] McConnell and `insurgent’ conservatives such as Sen. Rand Paul, R-Ky., who endorsed him.”  McManus concludes, “The GOP’s civil war now looks more like a merger.  The establishment has moved right, and many of the tea party’s voters are rejoining/reconciling with that new mainstream – even if some of their self-appointed leaders are not.”  In other words, the Tea Party movement has “won the battle for influence in the Republican Party” (“Tea party is now a part of the Republican mainstream,” Columbus Dispatch, May 26, 2014). 

    Although some conservatives are calling for the GOP to campaign this year on a clearly-defined platform, a new version of the famous “Contract with America” that brought Republicans to power in Congress 20 years ago (and made Newt Gingrich speaker of the House), the better advice is that given by columnist Charles Krauthammer, who maintains that it’s enough that Republicans campaign as “the party of no” – the party that will stop B.O. and the Democrats from continuing to “transform” America into a fully-socialized, regulatory/welfare state.  “With the welfare state having reached the outer limits of its competency and solvency, it is in desperate need of restructuring and reform.  With an ideologically ambitious president committed instead to expanding entitlements, regulation and government itself, principle alone would compel the conservative party to say stop.”  He adds, “`Stop’ was more than enough in 2010,” the last midterm election in which Republicans won a majority in the House.  2010 was “a referendum on the liberal overreach of the first two Obama years,” with Democrats in control of both chambers of Congress; and it resulted in a “shellacking” of the Democrats, as B.O. himself conceded.  “With the president in decline and his presidency falling apart,” saying “Stop” will be enough in 2014, too, Krauthammer argues.   

    It’s important for Republicans to take back control of the Senate because they cannot stop B.O.’s agenda by controlling only the House, as the past four years have demonstrated.  Contrary to the Democrats’ line that the Republican House does nothing but “block and oppose” – the false narrative of a “do-nothing Congress” that B.O. cites as his excuse for executive power overreaching – the GOP House has in fact passed hundreds of bills, “only to have them die upon reaching the desk of Senate Majority Leader Harry Reid,” who has ensured that any bill that might present a politically difficult vote for his Democratic colleagues never even comes to the floor. “Winning control of the Senate would allow Republicans to pass a whole range of measures now being held up by Reid, often at the behest of the White House.”  The new Republican-controlled Congress can then push a major reform agenda – tax reform, the Keystone XL pipeline and natural gas exports, border security, energy deregulation, and health care reform that repeals the more onerous ObaminableCare mandates – and dare B.O. to veto it.  “The vetoed legislation would become the framework for a 2016 GOP platform.  . . . Show the country what you stand for.  Then take it to the nation in 2016” (“Here’s Why Winning the Senate in November Matters,” I.B.D., October 3). 

    All the factors – particularly the critical factor of voter enthusiasm, with Republicans excited and Democrats dispirited – point toward a big Republican victory in the congressional elections this year, with the GOP gaining at least the six additional seats it needs to control the Senate and also having double-digit gains in its majority in the House.  First things first, as Krauthammer writes: first get control of Congress and then restore Congress’s exclusive constitutional powers to set policies and to make laws. 

     

     

    n Rush, Beck, a “Gut”-Check, and “Bolling for Dollars”

                 Two of the leading “conservative” talk-radio hosts – Rush Limbaugh and Glenn Beck – continue to be disappointing in their inconsistent defense of liberty and limited government.  Meanwhile, libertarians and limited-government conservatives can take comfort in at least one truly libertarian commentary show on Fox News. 

                Rush Limbaugh persists in his simplistic view of the political world being divided between “liberals” and “conservatives,” without acknowledging the true division between collectivists of all stripes (not only left-liberals but so-called “progressives,” social conservatives, neo-conservatives, and other Big Government conservatives) and individualists (true limited-government conservatives and libertarians).  Limbaugh also clings to his naïve view that “conservatives” are in the majority in the American electorate, despite abundant polling data showing that a large plurality of American voters identify themselves as independents.  With his false assumptions about political identity, Limbaugh was partly responsible for B.O.’s reelection because his unjust criticisms of Mitt Romney as too “moderate” helped to dampen the Republican turnout, thereby enabling B.O. to be reelected with fewer popular votes than he had in 2008.  (He shares that responsibility with some national “tea party” groups – such as Jenny Beth Martin’s Tea Party Patriots – which similarly dampened enthusiasm for Romney among the Republican and independent voters who should have given him the victory that pollsters were predicting on the eve of the 2012 election.)  Now Limbaugh is beating the war drums, unquestioningly accepting the premises of the neocons, which may help doom Rand Paul’s chances in the Republican primaries – leading to the election of Hillary Clinton in 2016. 

                Glenn Beck, who has been calling himself a libertarian and who has criticized both major American political parties, has a far more sophisticated understanding of politics than Limbaugh: for example, he recognizes that modern “progressives” in both parties are the true enemies of freedom today.  Unfortunately, Beck – who is a convert to Mormonism – is a religious fanatic whose thinking is warped by his religiosity.  Claiming to uphold the ideals of America’s Founders, he follows the guidance of David Barton, a pseudo-historian who claims that America was founded as a “Christian nation” – while, in reality, America was founded as a nation based on individual freedom of conscience.  Because of his religiosity, Beck is often depressingly apocalyptic and too easily falls into the traps of conspiracy theorists.  But this time of year, Beck’s radio show has a Friday segment, “More On Trivia” – a trivia contest pitting clerks at convenience stores in two cities whose NFL teams face off over the weekend – which is a hilarious exposé of the large segment of the American population whom Rush Limbaugh too charitably calls the “low-information voters,” i.e., morons.  (Beck’s radio show, unfortunately, also has a very annoying theme song: “We all want to belong . . . .”  No, WE DON’T!!!) 

                Greg Gutfeld at one time was the most libertarian panelist on The Five, Fox News TV’s weekday program at 5:00 p.m. (Eastern time).  Lately, however, he’s talking more like a conservative – perhaps even a neoconservative – with his hawkish views about the war against militant Islamic terrorism.  He rightly criticizes the B.O. regime, along with the “media-academic complex,” for their “Islamophobia-phobia”: their irrational fear of being called “Islamophobic” if they identify the “terrorists” for what they are – militant Muslims who adhere to a brutally violent fundamentalist version of Islam.  But Gutfeld has been beating the war drums far too loudly, criticizing libertarians for advocating a foreign policy of non-intervention (not “isolationism,” as many conservatives assert) and even embracing the neocons’ (and the so-called progressives’) vision of the United States as the policeman of the world.   

                With “Gut-less” Gutfeld taking a turn to the right, it is his co-panelist on The Five, Eric Bolling, who has emerged in recent months as the most prominent libertarian commentator on Fox News (with the exception of John Stossel, who hosts an occasional special and who occasionally appears as a guest on populist demagogue Bill O’Reilly’s show).  Bolling not only changed his politics, from social conservative to libertarian, but he also explicitly tries – not always successfully – to be a more consistent libertarian.   For example, he now realizes that the “War on Drugs” has been a failed policy, with drug prohibition causing more problems than drugs themselves, and so has embraced decriminalization or legalization, particularly of marijuana.  And on foreign policy, he often agrees with The Five’s leftist panelist, Bob Beckel, in being skeptical of U.S. military intervention in the Middle East.   

                Bolling also hosts the best public-affairs show on Fox News TV, Cashin’ In, a half-hour show (it should be an hour) broadcast on Saturday mornings at 11:30 a.m. (Eastern time).  Two of Bolling’s regular panelists are unabashed advocates of free-market capitalism: Jonathan Hoenig, financial advisor and syndicated columnist (“Capitalist Pig”), and Wayne Rogers, former actor (he played “Trapper John” in the first several seasons of the TV series M*A*S*H) and now financial advisor.  (A third regular panelist, Michelle Fields, is a limited-government conservative who seems moderate only in comparison with Bolling, Hoenig, and Rogers; the fourth panelist is a leftist, often either Juan Williams or Bob Beckel, who functions as a foil.)  At the end of each show, Jonathan and Wayne give their stock-market suggestions (in a segment called “What I Need To Know,” which lately has been broadcast only online), but their real contribution to the show is in the earlier segments, discussing key political issues, when they express strident libertarian ideas and values (both often citing the works of Objectivist philosopher/novelist Ayn Rand or Austrian-school economists F.A. Hayek and Ludwig von Mises).  Although Rogers is more conservative than libertarian (“illegal” immigration is one of his blind spots), Hoenig calls himself “Objectivist,” not “libertarian,” and hence defends capitalism and opposes the welfare state on moral grounds.  (On immigration, for example, Hoenig believes in open borders but opposes giving taxpayer-funded “entitlements” to immigrants as well as to American citizens, on the grounds that it is immoral for government to use force to confiscate wealth produced by some people and to redistribute it to others.)   

                Rush Limbaugh recently observed on his radio show that Cashin’ In, notwithstanding its Saturday morning time slot, averages about a million viewers – more viewers than CNN and MSNBC have for their weekday evening prime-time programs.  As the most free-market-oriented show on Fox News, perhaps Cashin’ In should be renamed (in honor of its host) “Bolling for Dollars”!

     

     

    n Marriage, American Style (An Update)

     

                Over ten years ago I posted my MayerBlog essay “Marriage, American Style” (May 19, 2004), giving truly “liberal” (that is, classical liberal or libertarian) arguments for legal recognition of same-sex marriage.  I discussed why it would be consistent with the American constitutional tradition (with its emphasis on the sovereignty of the individual) to redefine marriage to include same-sex relationships – and why such a reform would be truly “progressive,” in the proper sense of the word.  (As Thomas Jefferson wrote, “Laws and institutions must go hand in hand with the progress of the human mind.  As that becomes more developed, more enlightened, as new discoveries are made, new truths disclosed, and manners and opinions change with the change of circumstances, institutions must advance also, and keep pace with the times.”)   

                In the ten years since I posted that essay, much progress (true “progress,” as Thomas Jefferson used the word) has been made, both in the United States and around the world.  In 2004 only one U.S. state recognized same-sex marriages, Massachusetts, as the result of a controversial state supreme court decision that conservatives decried as “judicial activism.”  And the United States was only the fourth nation in the world to recognize same-sex marriages.  By May 2013 (when I last posted on this topic) a total of 14 nations had done so – including France and Spain (despite their being heavily Roman Catholic countries).   

                In the United States today, same-sex marriage is legal in about two-thirds of the states (32 at last count) and in the District of Columbia.  I write “at last count” because the number of states recognizing same-sex marriages has rapidly expanded in the past several months (it was only 19 as of early this summer), thanks to a flurry of lawsuits that have been brought in federal courts, challenging the constitutionality of laws and/or constitutional provisions in those states that prohibit marriage except between a man and a woman (in other words, states that limit marriage to the traditional definition of a heterosexual couple).   Federal district court judges have struck down same-sex marriage prohibitions in several states – including Michigan, Utah, Texas, Oklahoma, Oregon, Pennsylvania, and Virginia – mostly on equal protection grounds.  In addition, federal judges in several states (including Ohio, Kentucky, and Tennessee) have ruled that state officials must recognize same-sex marriages legally performed out-of-state (also generally on equal protection grounds, rather than under the Full Faith and Credit Clause).  With several U.S. Courts of Appeals scheduled to hear cases by the end of 2014, it seemed likely, as of late summer, that the U.S. Supreme Court would address the issue by its 2015 term (Richard Wolf, “Gay marriage: Which case will the Supreme Court choose?” USA Today, August 29). 

    However, the Supreme Court recently decided to take a pass on the issue, refusing to review decisions by several federal appeals courts (including the Fourth, Seventh, and Tenth Circuits) that had struck down same-sex marriage bans in five states (Virginia, Indiana, Wisconsin, Oklahoma, and Utah).  Court-watchers immediately offered various theories why the Court punted (see Richard Wolf, “High court steps aside,” USA Today, October 7); but I think the critical factor is that the appellate courts have not yet split over the issue.  So far all the federal courts that have considered the issue have ruled against same-sex marriage bans, except one (a district court in Louisiana).  The U.S. Court of Appeals for the Sixth Circuit, which includes Ohio, will rule on the issue sometime this fall; the three-judge panel considering the case here includes two judges who generally adhere to a philosophy of judicial restraint and thus might conclude that a ruling in favor of same-sex marriage, overturning states’ adherence to the traditional definition of marriage, would be impermissible judicial activism.  (Although I strongly support legal recognition of same-sex marriage on policy grounds, I am less certain that state laws or constitutional provisions limiting marriage to heterosexual couples are unconstitutional.  I think the stronger constitutional challenge is on due process grounds – denying same-sex couples their liberty to enter into a marriage contract with the partner of their choice – while the challenge on equal-protection grounds is right on the dividing line between proper judicial enforcement of the Equal Protection Clause and illegitimate judicial activism, because equality is a tricky concept in this context.) 

    Theodore Olson, the Republican lawyer (and former U.S. Solicitor General) who argued the challenge to California’s Proposition 8, has said that the Supreme Court, by allowing the courts of appeals decisions to stand, has passed “the point of no return” on same-sex marriage: “I do not believe that the United States Supreme Court could rule that all of those laws prohibiting marriage are suddenly constitutional after all these individuals have gotten married and their rights have changed,” he said in a recent interview.  Olson makes a valid point: for the Supreme Court, at this point, to rule in favor of state prohibitions of same-sex marriage would really raise serious equal-protection problems.  The legal trend is unmistakable.  As Abraham Lincoln said in his famous “House-Divided Speech” in 1858, “a house divided against itself cannot stand”: Either the Court will rule that people in the remaining 18 states have the same fundamental right that federal judges have recognized for people in the other 32 states, or the Court will rule that the definition of marriage is a matter reserved to the states by the Tenth Amendment (and hence the people in all 50 states will have the power to determine, through their state legislatures or in a popular referendum, whether to broaden the legal definition of marriage to include same-sex couples).  If it’s the latter, I predict that the eventual outcome will still be the same: public-opinion polls show that the majority in most states now support the right of same-sex couples to marry.  (Here in Ohio recent polls show that public opinion has completely switched since 2004, when voters approved the state constitutional amendment limiting marriage to a man and a woman.)  As Jefferson wrote, “Laws and institutions must go hand in hand with the progress of the human mind. . . .”

      

     

    n Weeding Out Prohibition, One State at a Time

     The increased acceptance of same-sex marriage isn’t the only amazing social phenomenon in America over the past couple of years.  Another has been the increased acceptance of legalizing, or decriminalizing, marijuana.   

    Summarizing the “mixed bag” of state referendum results in my “Election 2010 Roundup” (Nov. 4, 2010), I expressed my disappointment that California’s voters had rejected Proposition 19, which was the first modern proposal – since the public policy of drug prohibition began in the 20th century – to legalize recreational use of marijuana.  In my previous essay, “Tricks and Treats 2010” (Oct. 29, 2010), one of the “treats” I discussed was Proposition 19 and what it signified.  I was one of the signers of an open letter, “Law Professors Endorse Proposition 19,” which read in part: 

    “For decades, our country has pursued a wasteful and ineffective policy of marijuana prohibition.  As with alcohol prohibition, this approach has failed to control marijuana, and left its trade in the hands of an unregulated and increasingly violent black market.  At the same time, marijuana prohibition has clogged California’s courts alone with tens of thousands of non-violent marijuana offenders each year.  Yet marijuana remains as available as ever, with teens reporting that it is easier for them to buy than alcohol across the country.

     

    “Proposition 19 would remove criminal penalties for private use and cultivation of small amounts of marijuana by adults and allow California localities to adopt – if they choose – measures to regulate commerce in marijuana.  Passage of Proposition 19 would be an important next step toward adopting an approach more grounded in reason, for California and beyond.”

     

    I particularly liked the reference to “reason” in the statement, because the policy of drug prohibition in the United States historically has been based on irrational fears – fears about drugs (most of which irrationally blame the use of drugs themselves, rather than the policy of their prohibition, for all their supposed harmful effects, including drug abuse and crime), as well as irrational hatred of certain ethnic groups.  (The first federal drug law, the Harrison Narcotics Act of 1914, was rooted in anti-Chinese prejudice, for it was Chinese immigrants who most widely used opium, in the Pacific coast states, in the late 19th and early 20th century.  Similarly, federal prohibition of marijuana was rooted in anti-Mexican prejudice.)     

    In addition to the utilitarian arguments for decriminalizing drugs cited in the open letter, I added the moral argument that carries the greatest weight with me, as a libertarian:  that individuals own their own lives and bodies and therefore have the right (one aspect of their fundamental natural rights to life and liberty) to determine for themselves what substances they use or ingest, for either medicinal or recreational use.  Any law that interferes with that right, whether state or federal, is an unconstitutional abuse of government’s legitimate powers to criminalize actions that invade the rights of others.  (I consider all federal “drug laws” to be unconstitutional abuses of Congress’s so-called Commerce Power – for they are not legitimately laws “regulating commerce” among the states but rather an unconstitutional attempt by Congress to legislate on “police powers,” which the Tenth Amendment reserves to the states.  And I consider state laws criminalizing drug possession and use to be unconstitutional because they are abuses of the police power, which legitimately is limited to actions directly harmful to others.  One person getting high on cannabis, especially in the privacy of his own home, harms no one – not even himself, according to the best available medical evidence.) 

                In my November 2010 essay, I observed, “Many states have legalized medical use of marijuana – a trend that’s likely to continue to grow.”  That has proved to be not only true but also a great understatement.  As of November 2010, medical marijuana had been legalized in 15 states, with one more state plus the District of Columbia poised to join the list, which generally comprised Pacific and Western states, some New England states, and even (socially) conservative Michigan.  Today, medical marijuana is legal in 22 states, plus D.C. 

                Four years ago I also observed, “Notwithstanding the defeat of Proposition 19 in California, however, marijuana activists in Colorado are planning to launch two separate campaigns to legalize the drug for adults in 2012.”  The campaign in Colorado ultimately was successful, as voters in that state made it the first to legalize recreational use of marijuana and to regulate (and tax) production and sale of cannabis (legal retail sales began there on January 1 this year).  The state of Washington, whose voters approved a similar measure in 2012, followed this summer, when its first retail pot stores opened.  (Although the regulatory regimes in the two states are broadly similar, there are some interesting differences: for example, Colorado permits residents to grow a small number of plants for their personal use, while Washington requires all marijuana to be grown by licensed farmers; Washington caps the number of stores and sets overall limits on marijuana growing, while Colorado has no caps, choosing instead to let the free market decide.) 

    Policymakers in other states, and even in foreign countries, are closely watching the experience with marijuana legalization in Colorado and Washington.  Prohibitionists and paternalists, along with some leftist elitists, already are circulating horror stories about innocents being harmed by edible marijuana products (leftist New York Times columnist Maureen Dowd, for one, detailed her hallucinatory episode after eating a pot-laced chocolate bar).  Meanwhile, polls show a major shift in public opinion: Gallup found last October that 58 percent of Americans favored marijuana legalization, with 39 percent opposed (compared to the mid-1990s, when only 25 percent favored legalization with 73 percent opposed). 

    The floodgates of personal freedom, once opened, cannot be easily closed.  In this year’s November elections, proposals to legalize recreational marijuana will appear on the ballots in two more states, Alaska and Oregon, plus a proposal to legalize medical marijuana in Florida (which I believe would be the first southern state to jump on the bandwagon).  Among those supporting the Oregon legalization measure is travel-book author (and public radio and TV host) Rick Steves.  No wonder he so enjoys visiting Amsterdam!

      

     

    n An Alphabet Soup of Hyper-sensitivity

                     As I’ve previously noted, B.O. and his regime have exacerbated the problems of race and class divisions in modern American society.  The political Left, which dominates both professional journalism and higher education in the United States, has made these problems even worse by its constant carping about members of certain groups being “victims” of supposed “oppression” by members of other groups (usually characterized as wealthy, white, heterosexual men). 

    Hence it is not merely coincidental that American culture has become increasingly intolerant of perceived racism, sexism, homophobia, and other “-isms” and “phobias” that are condemned by  “politically correct” American left-liberals.  Indeed, hypocritically in the name of fostering greater “toleration,” the “p.c. police” (as I call them) have become remarkably intolerant, not only of these perceived “-isms” and “phobias,” but even of perceived “insensitivity” to particular groups of people.   

    For example, advocates for the rights of “disabled people” have urged that we stop using the word handicapped to describe those who are either physically or mentally disabled; and indeed, they have advocated banning the word retarded because it is allegedly a hurtful and insensitive way to refer to people with mental disabilities.  The word, however, originated as an accurate clinical description of those whose mental faculties were not fully developed – just as the word homosexual is an accurate description of those whose sexual orientation is homosexual. 

    Speaking of sexual orientation, the p.c. police (allied with “gay” activists) in the recent past insisted that persons with homosexual orientation be identified as “gay.”  Today, that term – which originally meant happy or carefree but became a slang term for homosexuality – isn’t sufficient; the p.c. police insist that the correct term is the incredibly awkward acronym “LGBTQ,” which stands for “Lesbian, Gay, Bisexual, Transgendered, and Queer/Questioning,” and which supposedly is a more “inclusive” description of the “sexual minority,” or non-heterosexual, “community.”  Apparently neither the p.c. police nor the LGBTQ activists have realized that the more they try to be “inclusive,” the more they alienate others – not just the supposed majority of heterosexuals but also other “sexual minorities” not included in the formula (such as transvestites, sadists and masochists, polygamists and polyandrists, eunuchs, virgins, and asexualists).  

    Although it’s arguably good – another sign of “progress,” in the true sense of the term – that Americans are trying to be more tolerant of diversity, in all aspects, it’s also true that the approach taken by the “p.c. police” is, frankly, silly – and counterproductive.  True toleration of – and indeed, respect for, and even celebration of – diversity will come when Americans stop labeling one another as belonging to particular groups and instead recognize that we are all individuals.  As Ayn Rand wrote in her essay “Racism” (published in her book The Virtue of Selfishness (1964)): “[T]he smallest minority on earth is the individual.  Those who deny individual rights, cannot claim to be defenders of minorities.”   

    A case in point: the campaign being waged by a few p.c. activists against the name of the Washington, D.C. NFL franchise, the Washington Redskins.  The activists claim that Redskins is insulting to “Native Americans” (the p.c. term for American Indians), notwithstanding polls that show most American Indians have no problem with the name (perhaps because they understand that the term refers to the bounty that the British government paid for native scalps during the French and Indian War – a disgraceful chapter in our history that ought not be forgotten).  This summer the U.S. Patent and Trademark Office’s Trademark Trial and Appeal Board (TTAB) held that the name Redskins was no longer a valid trademark because it is “disparaging” (a prohibition found in the federal trademark statute, the Lanham Act, that ought not to exist because it is inconsistent with First Amendment freedom of speech).  The p.c. police are now petitioning the Federal Communications Commission (FCC) – an agency whose very existence is a violation of the First Amendment – to use its regulatory powers to punish broadcasters who use the word Redskins when reporting on the Washington Redskins.  George Will in a recent column has characterized the FCC petition as “a form of bureaucratic bullying known as regulation by `raised eyebrow.’”  He aptly summed up the situation: “Using the FCC to break another private institution to the state’s saddle for the satisfaction of a clamorous faction illustrates how the government’s many tentacles give it many means of intimidating people who offend it” (“Word bullies run amok in absurd `Redskins’ fight,” Columbus Dispatch, October 15).  Perhaps the best solution to the controversy is the one proposed by Rush Limbaugh: Let the Redskins keep their name, but make them change their logo from an Indian head to a potato! 

    The list of forbidden words – that is, words that are regarded as verboten by the politically-correct American left – has grown to include not only the classic “n-word” (yes, I’m referring to nigger) but also certain variants of the “f-word” (not fuck but fag), the “r-word” (retarded, as previously noted), the “h-word” (handicapped, also previously noted) and even the “i-word” (illegals, when used in reference to illegal immigrants, although to many Democrats the really scary “i-word” is impeachment, as noted above).  For example, bloggers at the Huffington Post (what Limbaugh aptly calls “the Huffing-and-puffing Post”) and other far-left blogs got all bent out of shape over a remark made by Bob Beckel, the left-liberal panelist on the Fox News TV show The Five, who on July 10 referred to Chinese students in the U.S. as “Chinamen.”  (As viewers of The Five well know, Beckel is an old-line liberal who in many respects seems to be stuck in American culture as it was in the decades immediately following World War II.  He’s also bluntly honest, especially with regard to those foreigners whom he believes pose the greatest threat to American national security: the Chinese Communists (what some of us still call the “Red Chinese”) and militant Islamic terrorists.  Hearing him call Chinese “Chinamen” isn’t a surprise to anyone familiar with Beckel; at least he didn’t say “chink” or – Heaven forbid! – “Oriental.”)  (The preferred p.c. term is Asian, for all peoples of the continent of Asia – a term that’s imprecise and misleading, for it fails to distinguish Chinese, Korean, Vietnamese, Filipino, Pakistani, and Indian, for example.)   

                It’s outrageous that celebrity chef Paula Deen has had her career and her business virtually destroyed simply because she was honest enough to admit, in a deposition taken during a legal dispute with a former employee, that she had once used the “n-word.”  Alexis de Tocqueville observed in his classic book Democracy in America (1835): “I know of no country in which there is so little independence of mind and real freedom of discussion as in America.”  Tocqueville was concerned about what he called “the tyranny of the majority,” for he perceived that in democratic societies “the majority raises formidable barriers around the liberty of opinion” and that “within these barriers, an author may write what he pleases; but woe to him if he goes beyond them . . . . he is exposed to continued obloquy and persecution.”  But in America today, it’s not the majority who are abridging Americans’ freedom of speech: it’s the leftist elite that still controls much of the news media and the academy – not to mention the federal government.

     

     

    n Beware the Religious Left!

                In my “Thoughts for the Summer 2006 Hiatus” (May 30, 2006), I wrote about “The Dangerous Religious Left,” noting the following: 

    “It has become commonplace for the left-biased news media to demonize religious conservatives – the so-called `Religious Right’ – when covering issues like the debate over same-sex marriage, or abortion, or other `social’ or `moral’ issues of concern to conservatives.  What the media largely ignores, however – as a result of the left-wing political bias that pervades virtually the entire journalism profession – is the growing political influence of leftist-oriented Christian clergy, the Religious Left.”  

     

    Noting the involvement of religious left-liberals (who include evangelical Protestants as well as Catholics and other “mainline” sects’ clergy) in the campaign to raise the government-mandated minimum wage, I predicted that the Religious Left would “become increasingly politically active over issues about which they’re passionate – poverty, third-world hunger and disease, sex trafficking, the environment, and the war in Iraq – and will be pushing their left-wing agenda, which generally speaking, will seek to expand the paternalism of the 20th-century regulatory/welfare state, in those areas that left-liberals want more government control as a “solution” to perceived social problems.” 

                The Religious Left is not a new phenomenon in American politics.  Indeed, it was the Religious Left, as embodied in the so-called “Social Gospel” movement of the late 19th century, that was the leading force for socialism in America – and an important constituency of the so-called “Progressive” movement of the early 20th century.  (For more on why I call it “the `so-called’ Progressive” movement, see my essay “Reactionary Progressives” (Mar. 16, 2006).) 

                Today, with a far-left activist occupying the White House (and moving his political party even further to the left side of the political spectrum), the political influence of the Religious Left has become even more dangerous, as it allies itself with Democrats, Big Labor, the NAACP, and a host of other left-wing activist groups, pushing such public policies as minimum-wage increases, gun control, nationalized health care, and “green” energy (i.e., anti-carbon energy policies supposedly intended to forestall “climate change”).  For example, in August several clergy members representing various faiths gathered at the Ohio Statehouse in Columbus, “to kick off a week of prayer aimed at encouraging Ohio legislators to govern with morality and a concern for the least among us,” according to an article in my local newspaper.  The so-called “Moral Week of Action” was led by clergy members in a dozen states; its week-long agenda included such topics as labor rights, “fair and living wages,” “economic justice,” women’s rights, Medicaid expansion, and “environmental justice” (“Faiths unite to pray daily for moral policies,” Columbus Dispatch, August 22).   

    How typical of the Religious Left to be such arrogant elitists – a trait they have in common with the secular leftists – to claim to have a monopoly on “morality,” and to pervert the meaning of the word justice by qualifying it with the adjectives social, economic, or environmental!  (For more on the phrase social justice as used by the Left – and how it really means socialist injustice – see my essay “`Social Justice’ and the `Two Americas’” (Jan. 31, 2008).)  I dare say the threat posed by the Religious Left to all Americans’ rights – particularly their property rights and their rights to economic freedom, but also many “personal” liberties as well – is far more dangerous than that posed by, say, anti-abortion activists on the “Religious Right.” 

                As I concluded eight years ago, “the Religious Left, like the `Religious Right,’ is motivated by collectivist premises that are at odds with the individualism on which American society is based.   This is yet another reason why everyone who cares about reason, individual freedom and responsibility, ought to be alarmed at the continuing influence of religion over public affairs.” 

     

      

    n Tricks, Treats, and a Few BFDs: A Roundup of Pop Culture  

                For my roundup of developments in popular culture, I’ll resume another sometimes-regular feature on MayerBlog, especially for posts in October.  In light of the upcoming Halloween holiday, I note some “Tricks” and “Treats” – that is, both negatives and positives among the recent developments in pop culture.  And, just for fun, I’ll add in a couple of “BFDs” – that is, issues that have caused much needless hand-wringing, for only in the sarcastic sense are they really (in the words of V.P. “Smiley” Joe Biden) “Big Fucking Deals.” 

     

    n  Trick: The Suicide of Robin Williams:  The world was shocked on August 13 by the news that Robin Williams, the Oscar-winning actor and comic genius, had committed suicide at age 63.  Williams, who was one of America’s most-beloved celebrities, had first burst on the scene in television (as the star of the ABC sitcom Mork & Mindy) and in stand-up comedy, but his real achievement was a truly impressive film career, which included memorable comedies such as Good Morning, Vietnam (1987), Mrs. Doubtfire (1993), The Birdcage (1996), and Patch Adams (1998); poignant dramas such as Dead Poets Society (1989) and Good Will Hunting (1997); and even thrillers such as Insomnia (2002) and One Hour Photo (2002), in which Williams chillingly portrayed the villain.  (One of my favorite performances – one that has been either ignored or underrated by the critics – is his portrayal of “Andrew,” the robot/android in Bicentennial Man (1999).)  Media coverage sought to explain his suicide by pointing out the “personal demons” that Williams was battling (substance abuse, depression, and the early stages of Parkinson’s disease); and commentators’ reactions ranged from sadness to anger (the latter because Williams’ suicide had left his wife and children heartbroken).  People obviously have different views about the “right to die,” but it seems to me the real tragedy in this case is that, because of laws against physician-assisted suicide, Williams had to end his life alone, in his California home, choking himself with his own belt.  Perhaps the best commentary on the premature completion of Robin Williams’ life came from the Academy of Motion Picture Arts and Sciences, which tweeted a simple phrase, “Genie – You’re Free,” from the Disney animated film Aladdin (in which Williams had voiced the role of the Genie).  
It was an acknowledgement of the truth – a truth that is unpleasant to many people, for various reasons – that Williams owned his own life and thus had the right to decide when to complete it, even if others thought his act was irrational. 

     

    n  Treat: America’s Got Talent, for Real:  My favorite “reality” TV talent competition show is NBC’s America’s Got Talent, mainly because it showcases a true variety of talented acts, not just singers.  It also has – now – the best panel of judges: the chemistry between Howard Stern, Howie Mandel, Heidi Klum, and Mel B is itself entertaining (thankfully making us forget former judge Piers Morgan, whom I call the “Brit twit”); and host Nick Cannon perfectly balances humor with empathy for the contestants.  But this summer, it was the contestants who really shone.  Every season, the quality of the acts seems to get better and better; this year had an extraordinary array of talent, which meant that many excellent contestants – people who might have won in previous seasons, or on other talent shows – had to be eliminated.  The six finalists illustrate the diversity of talent, with three singers (each quite different from the other two), a unique band (which included a harp and a cello, along with a keyboard player and lead singer), an amazing acrobatic troupe, and a magician.  And I successfully predicted the ranking of the finalists, except for the top two (whom I had reversed): 6. Miguel Dakota, 5. Quintavious Johnson (a precocious 12-year-old), 4. Sons of Serendip, 3. AcroArmy, 2. Emily West (my top choice, a truly splendid singer), and 1. Mat Franco (the best of the magicians – a new David Copperfield, who’ll get $1 million and his own show in Vegas). 

     

    n  Trick: The Too-Soon Completion of Joan Rivers’ Life:  Yet another celebrity whose sudden death shocked everyone was Joan Rivers, the pioneering queen of comedy, who died prematurely at age 81 on September 4.  She was hospitalized on August 28 after going into cardiac arrest at a doctor’s office, during what was supposed to be a routine procedure on her throat.  (It looks suspiciously like medical negligence: she stopped breathing, depriving her brain of oxygen, which the New York City medical examiner determined to be the cause of death.)  Joan Rivers was one of the first female stand-up comedians, known for her sharp wit, delivered in a raspy voice and a brash New York accent, along with her tag line, “Can we talk?”  She crashed the male-dominated realm of late-night talk shows by becoming a permanent guest host on Johnny Carson’s Tonight Show, only to be blackballed by Carson when she got her own competing show on another network.  She also overcame deep personal tragedy – the suicide of her husband, Edgar Rosenberg, after her show was cancelled – to start essentially a second career: as a jewelry designer (selling her designs on the TV shopping network QVC), as a fashion commentator on the red carpet during celebrity awards shows (with a new tag line, “Who are you wearing?”), and as host of Fashion Police, the hilarious show on the E! network that used fashion commentary as a vehicle for her outrageous commentary on celebrities.  At the time she was killed in her doctor’s office, she was promoting her latest humor book, Diary of a Mad Diva, which followed her best-selling 2012 book, I Hate Everyone Starting with Me.  As Rivers herself described her humor, she “succeeded by saying what everyone else is thinking” – but what only she dared to say. 

     

    n  Treat: Binge-Watching:  New technologies – including DVRs and video-streaming services – have given Americans even more choices for TV viewing, including a new fad: binge-watching, or spending several hours (or even a whole day, or weekend) watching back-to-back episodes of one’s favorite shows.  Being the old-fashioned (or technologically retro) guy that I am, I’ve only binge-watched DVDs, usually whenever a new complete season of a favorite series has been released (or perhaps either an older series that I missed or watched only sporadically, or a series I entirely missed because it was shown on a premium cable channel, like HBO or Showtime, to which I no longer subscribe).  Among the many series that I admit to having binge-watched are: Angel; Coach; Columbo; Dexter; Frasier; Lois & Clark; Murder, She Wrote; Perry Mason; Quincy, M.E.; Rumpole of the Bailey; United States of Tara; and Wings. 

     

    n  Trick: Ken Burns’s The Roosevelts:  From Sunday, Sept. 14 until Saturday, Sept. 20, PBS aired the latest extravaganza from filmmaker Ken Burns, the 14-hour The Roosevelts: An Intimate History, a three-way biography (more like a hagiography) of Theodore (Teddy) Roosevelt (TR), Franklin Delano Roosevelt (FDR), and Eleanor Roosevelt, spanning a century, from TR’s birth in 1858 until Eleanor’s death in 1962.  The documentary follows the all-too-familiar Burns formula: a script by Burns’s frequent collaborator, historian Geoffrey C. Ward; narration by Burns’s favorite narrator, actor Peter Coyote, supplemented by the words of the major figures, recited, in character, by famous actors (Paul Giamatti as TR, Edward Herrmann as FDR, and Meryl Streep as Eleanor); an abundance of photos and film clips; and commentary by talking heads, nearly all of whom share Burns’s and Ward’s idolatry of the Roosevelts.  The only voice of objectivity – let alone criticism – is that of commentator George Will.  It is Will, for example, who observes that for the two presidents Roosevelt, “the Constitution was a mere inconvenience” – which is a rather mild way of saying that they violated their oaths of office and showed nothing but contempt for the Constitution and the limitations it set on governmental powers.  In truth, no single family has done as much damage to the Constitution as the Roosevelts.  (Maybe that’s why Hillary Clinton plans to run for president in 2016: so she and Bill can break the record set by TR and FDR, who vastly expanded the powers of the federal government generally and presidential powers particularly, creating the modern “imperial” presidency.)  Today, when Americans are suffering the consequences of the massive national regulatory state built upon TR’s “progressive” agenda and FDR’s “New Deal,” they deserve to know, for example, that the New Deal was not only a failure but actually worsened the Great Depression – but Burns’s film does not tell them this uncomfortable truth.  An Intimate History?  It’s more like an imitation of history, a fairy tale or myth, filled with left-wing bromides.    

     

    n  BFD: Another Secret Service Fuck-up:  The U.S. Secret Service – originally a division of the federal Treasury Department created to combat counterfeiting, and in the modern era the agency responsible for safeguarding the president – has had a number of scandals during B.O.’s occupation of the White House.  But recently it hasn’t been just “morals” breaches involving prostitutes in South America that have received attention from the news media; rather, it’s the apparent failure of the Secret Service to perform its basic duty of providing security at the White House itself.  On September 19 around 7:20 p.m. a man jumped the fence on Pennsylvania Avenue, ran straight to the North Portico, entered the unlocked front doors of the mansion, and went as far as the East Room before he was finally apprehended by Secret Service agents.  The intruder, a 42-year-old military veteran from Texas (who, according to his family, suffers from PTSD), was carrying a small folding knife.  Neither B.O. nor other members of the “First Family” were in any danger, for they were away, at the presidential retreat at Camp David.  The embarrassed director of the Secret Service ordered an internal review of its security procedures.  Although B.O. publicly expressed confidence in the Secret Service, the Washington Post has reported that both he and Michelle (Mrs. B.O.) were furious with the agency after a more serious security breach had occurred in 2011 (when a 21-year-old man from Idaho had fired shots that hit the mansion).  And the Post in an editorial suggested that the Service was “floating the notion” of pushing visitors even further back from the perimeters of the president’s home – keeping people off the sidewalks near the White House, creating additional barriers, and establishing new checkpoints further back.  As the Post editors correctly point out, “making more of Washington off-limits should itself be off-limits.”  It was a huge mistake to close Pennsylvania Avenue to traffic and to transform once-public places (where Americans could exercise their First Amendment rights) into secured government parking lots, giving the White House grounds the atmosphere of an armed camp more suitable for a totalitarian dictatorship than a constitutional republic.  The White House is truly “the people’s house” – whoever currently occupies the presidency is only a temporary tenant – and it should be accessible to its owners, the American people, subject to reasonable (and necessary) security measures.  Rather than further restricting public access, the Secret Service simply should do its job more competently.  The agency should be restructured and returned to the Treasury Department.  (In 2003 it was transferred to the bureaucracy of the Department of Homeland Security, a department created as part of the overreaction to 9/11 – a department that should not even exist.)  The director, Julia Pierson, has resigned; that’s not only a good start on some much-needed house-cleaning, but also a model that others in the B.O. regime, going all the way to the top, ought to follow. 

     

    n  Treat: Atlas Shrugged, Part III:  The third part of the movie trilogy based on Ayn Rand’s magnificent philosophical novel – Atlas Shrugged, Part III: Who Is John Galt? – played in a scattering of movie theaters nationwide in late September.  Anyone who has read MayerBlog knows that I’m a huge fan of Atlas Shrugged – and have been, ever since I first read the novel during my winter break in my sophomore year of college, 1974-75 – and that I regard it as the most important book ever written.  (See my essays on the 50th anniversary of the novel: “Atlas Shrugged at Fifty,” Part I (Sept. 27, 2007), Part II (Oct. 3, 2007), and Part III (Oct. 10, 2007).)  Hence, I’ve had mixed feelings about the movie trilogy, for no series of three 90-minute or two-hour films (especially if they are produced on a shoe-string budget) could possibly do justice to Rand’s complex 1000-page novel.  Frankly, I was pleasantly surprised, overall, by the first two Atlas Shrugged films – and quite disappointed by Part III.  Why then do I label it as a “Treat”?  Put simply, it’s because the extremely negative reviews I’ve read – most written by conservatives or libertarians who admit they’re not fans either of Rand or her novel – are quite unfair.  As a fan, I was most disappointed by the reduction of Hank Rearden (to me, the most compelling character in the novel – and a central character in both Parts I and II of the movie franchise) to a minor character in Part III, barely more than a cameo appearance near the end of the film.  I have mixed reactions to the casting choices (I liked the actors playing Galt, Jim Taggart, and Hugh Akston, among others, but disliked the actors who played Dagny Taggart, Francisco d’Anconia, Ellis Wyatt, and Midas Mulligan, among others).  
But I did like the re-telling of the story of the Twentieth Century Motor Company at the beginning of the film, its (appropriate) focus on the character of John Galt and the “strike” theme, and even the way the film handled Galt’s speech (arguably the most challenging aspect of filming Part III).  Overall, I’m happy to see the trilogy finished, despite the flaws (most of which are understandable given the limited budget and the problems associated with doing three separate movies each with a different cast).  My hope is that it will motivate people not familiar with Atlas Shrugged finally to read it – as well as motivate people who have read the book to read it again! 

     

    n  BFD: Ray Rice-gate:  With a new war breaking out in the Middle East and a host of other problems on both the international and domestic fronts, the news media in mid-September was obsessed with the story of pro football player Ray Rice’s suspension from the NFL.  Rice was caught on video (taken by a hotel elevator’s security camera) hitting and knocking unconscious his then-fiancee (now wife) on Feb. 15 – a video that was publicized by the sensationalistic celebrity news channel TMZ – prompting a two-game suspension by the NFL and then, when the full video surfaced, his release by his team, the Baltimore Ravens, and an indefinite suspension by the league.  Various activist groups have called for NFL commissioner Roger Goodell to resign, allegedly for his delayed response to this incident of “domestic violence”; meanwhile, the NFL Players Association has appealed Rice’s suspension, arguing that he was denied his due-process rights under the collective bargaining agreement.  Evidence has also surfaced against another pro football player, Adrian Peterson of the Minnesota Vikings – likewise indefinitely suspended – for allegedly abusing his four-year-old son by beating him with a stick, putting additional pressure on the NFL to “do something” about players guilty of child abuse as well as spousal abuse.  I certainly don’t mean to trivialize abuse – which is a criminal act, assault, aggravated when committed by a physically powerful man against a woman or a child – but should the NFL really be in the business of policing players’ actions off the field?  Critics say “yes,” because professional athletes are “role models.”  But should they be?  Why?  These are questions far more important than the case of Ray Rice or other NFL “scandals.” 

     

    n  Trick: Make-a-Difference Day:  USA Weekend (the weekend supplement to USA Today and various other newspapers nationwide) since 1992 has been sponsoring “Make a Difference Day” (this year on October 25).  It’s billed as “the nation’s largest day of service,” an annual event on which Americans are encouraged to volunteer for projects big and small, “a day for Americans to set aside some time to improve the lives of others” – in other words, to put other persons’ interests above their own.  As usual, I didn’t observe it, as a matter of principle.  As a neo-Objectivist, I reject a moral code based on altruism; I agree with Ayn Rand’s statement that “the world is perishing from an orgy of self-sacrificing.”  What we really need is “Take Responsibility Day,” a day on which everyone would take responsibility for their own lives – paying the price for their bad choices and reaping the benefits of their good choices.  Or maybe “Mind Your Own Business Day,” because that’s the true “Golden Rule,” the fundamental guiding principle for individuals who live in a free society.  The best way anyone can “make a difference,” in their own life and in the lives of others, is to do the very best job they can in whatever they do to earn their livelihoods.  As Adam Smith put it in Wealth of Nations, “It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest. . . .”     

     

    n  Treat: Pumpkin-flavored Foods:  Comedian John Oliver (who’s not very funny, in my opinion) has railed against the fall fanaticism for the pumpkin-spice latte, which he calls “a coffee that tastes like a candle.”  “We tolerate pumpkin spice because we like the fall, . . . but just about anything that reminds us of autumn is better than pumpkin spice,” Oliver said.  “I personally, for instance, would rather drink a cable-knit-sweater spice latte or a Major League Baseball spice latte . . . .”  Well, personally, I like pumpkin-flavored foods.  And this time of year, I enjoy not only pumpkin pie, but also pumpkin ice cream (Homemade brand is the best), pumpkin donuts (especially Entenmann’s), and, of course, pumpkin-spice coffee (especially Green Mountain Coffee’s K-cups for the Keurig coffee-maker).  But I concede that pumpkin-spice OREOs do indeed take a good thing too far!

     

    n  Trick: The “Best Damn Band” – and the Worst Damn President (Outside of Washington, D.C., That Is):  As an alumnus of the University of Michigan – which here in Columbus, Ohio is often referred to as “that school up north” – I usually avoid commenting about The Ohio State University, that other school up north (up north on High Street from downtown Columbus, that is).  But as a fan of The Ohio State University’s splendid marching band – “TBDBITL,” as it’s called, “The Best Damn Band in the Land” – I cannot resist commenting on the unfairness of the firing of the band’s director, Jon Waters, on July 24.  The new OSU president, Dr. Michael V. Drake – the dweebish former chancellor of the University of California, Irvine – made the draconian decision (or should we say “Drake-onian”?) to fire Waters, without even giving him a hearing, citing a report that alleged Waters failed to change the band’s “sexualized culture.”  The OSU marching band’s alumni club has rallied to Waters’ defense, issuing its own report in September concluding that Waters was “sacrificed” as a scapegoat to appease federal investigators from the U.S. Department of Education, who are investigating whether the university’s policies on sexual discrimination are in compliance with Title IX (in other words, an unconstitutional overreach of national government powers).  Waters has filed a lawsuit against OSU, demanding his job back and at least $1 million in damages; he has a strong case, based on both the denial of his due-process rights and his own claim of sexual discrimination (because the university last year gave a female cheerleading coach a second chance to keep her job when she faced similar allegations of sexual misconduct by two assistant coaches).  The university has responded by claiming that Waters was an employee at will – and by making despicable personal attacks on Waters.  A once-great university certainly doesn’t look good anymore.

     

      

     | Link to this Entry | Posted Thursday, October 30, 2014.  Copyright © David N. Mayer.


    Thoughts for Summer 2013, and Beyond (Part II) - May 30, 2013

     

     

    Thoughts for Summer 2013 – and Beyond

     

    Part II

      

    As I noted in my introduction to Part I, MayerBlog will be on a lengthy hiatus – not only for the summer but also continuing during the 2013–14 academic year, when I will be on sabbatical leave, completing the manuscript of my next book, Freedom’s Constitution: A Contextual Interpretation of the Constitution of the United States.  So, this will be the last essay posted to MayerBlog until Fall 2014 (probably Labor Day weekend in 2014). 

    Before going on hiatus, however, I could not resist the temptation to comment on a number of important issues in public policy and popular culture – issues that are in the news today and are likely to remain in the news throughout the summer and beyond.  (If you scroll all the way down to the end of Part I, you’ll also see my summer reading recommendations and my annual preview of summer movies.)  

     

     

    n The Emperor Is (Still) Naked!  America’s B.O. Problem (Update)  

    The perfect metaphor for B.O.’s presidency is the famous children’s story by Hans Christian Andersen, “The Emperor’s New Clothes” – as I have written here, since well before the 2008 elections.  (See my blog essay “The Emperor Is Naked!” Oct. 16, 2008.)  The story tells the tale of a vain monarch and his gullible followers who were hoodwinked by some clever con men masquerading as tailors.  They dressed the Emperor in a supposedly magnificent set of clothes but really in nothing at all; they said that only intelligent persons would be able to see the supposedly invisible wardrobe, so the Emperor and all his followers, fearing they’d brand themselves as stupid, were too afraid to admit they saw nothing.  That is, until one day when the Emperor was parading down the street in his “new clothes,” and one outspoken little boy – with the honesty of a young child – shouted, “But he’s naked!”  That brought an end to the charade.  

                In this metaphor, the real villains are the tailors who perpetrate the fraud that deceives the people.  Who are the tailors in the real-life story of the naked Emperor, B.O.?  Clearly, it’s the so-called “mainstream” news media (called the “lamestream” media by former CBS newsman and media critic Bernard Goldberg), who by and large have continued their “slobbering love affair” with B.O. (as Goldberg so aptly put it in the title of his recent book).  

                Predominantly leftist in their political orientation, media “journalists” are not only ideological kinsmen of B.O. but also elitists who take a kind of perverse pleasure in hoodwinking the American people.  Taking their lead from B.O. himself and his political handlers, they have emphasized his racial identity as “black” – another part of his charade because he’s really of mixed racial heritage (his father was a black African but his mother was white, and he was raised by his white maternal grandparents) – and so, as the tailors in the media constantly tell Americans, B.O.’s presidency is “historic.”  The implication is that anyone who failed to vote for him in 2008 and 2012 must be “racist.”  In other words, B.O.’s media lapdogs are engaged in what some people call “reverse racism” (bigotry in favor of someone because of their racial identity); but it ought to be considered for what it is – blatant racism, as racism properly ought to be defined (treating persons not as individuals but as members of a racial group).  For more on this, see my discussion “Let’s Talk Frankly About Race” in last year’s “Thoughts for Summer” (May 25, 2012).  As I wrote there – explaining why I’ve called B.O. the “affirmative action president”: 

    B.O. is perhaps the least qualified person ever elected president – someone who would not have been seriously considered a contender for the office of Chief Executive, if it were not for his race and for his exploitation of racism.  His campaign deftly played the “race card,” cashing in on the dregs of America’s past problems with racism:  black group-identity politics and white liberal guilt.  If he had not racially identified himself as a black man and cashed in on racism, he would not have been nominated by the Democrats’ party, let alone elected president.

     

    B.O.’s apologists continue to “play the race card,” claiming that anyone who criticizes B.O. is “racist.”  That’s absurd, for the vast majority of B.O.’s political opponents and critics oppose him because of his policies, or his actions, not because of his race (or his racial identity).  Indeed, as I’ve frequently commented here on MayerBlog, the assertion that it’s “racist” to criticize B.O. is itself racist:  just as it would be racist to dislike B.O. or to oppose his policies simply because of his race, it’s equally racist, by the true meaning of the word, to give B.O. a pass – to blind oneself to his many flaws as president, as his supporters do – also because of his race.  Or as Fox News’ Greg Gutfeld has pointed out:

    “The fact is, if you’re a liberal Democrat, you’d vote for Obama whether he was black or white (and, get this: He’s both).  But if you’re a conservative or a right-leaning libertarian, you wouldn’t vote for Obama whether he was black, white, or chartreuse.  So when the left says the reasons behind your choice are racial instead of intellectual or ideological, it’s way beyond arrogant – it’s offensive.  And it deserves a punch in the kisser.  Fact is, I don’t dislike Obama; I dislike Obama’s beliefs.  And I disliked them when they were Hillary Clinton’s.  And Bill Clinton’s.  And Teddy Kennedy’s. And Jimmy Carter’s.  But that’s the beauty of stupidity – it knows no color.”

     

    (The Bible of Unspeakable Truths, 2010, p. 116). 

                It’s not just racism, however, that explains the “slobbering love affair” the media have had with B.O.  It’s also the leftist political philosophy (whether you call it “liberal,” “progressive,” “statist,” “socialist,” “semi-socialist,” or even “fascist” or “collectivist,” it’s the philosophy on which the 20th-century regulatory “welfare state,” or “Big Government Nanny State,” has been founded), which the overwhelming majority of journalists, academics, and other supposed “intellectuals” share with B.O.  Because he’s also the most far-left president in U.S. history, the news media have been “in the tank” for him throughout his first term and did all they could, through their biased reporting of the news, to help assure his reelection in 2012.  (The fact that B.O. also racially identifies as “black” is just the icing on the cake, which gives leftists even more smug satisfaction – patting themselves on the back for being p.c. and for “transcending” race – in not only supporting but also practically worshipping him, as a new kind of political “messiah.”)  B.O.’s narcissism and arrogance, rather than turning them off, have made the media even more enthusiastic in their support.  

    It’s a symbiotic (mutually supportive) relationship, as libertarian columnist Larry Elder observed in a recent op-ed: 

     “[B.O.’s] arrogance flows from our fawning, gushing, Bush-hating `news’ media, which shirk their responsibility to fairly report the news.  The media’s fecklessness creates overconfidence.  With good reason, [B.O.] expects his media cheerleaders to look the other way, accept excuses without much challenge and turn the president’s critics and whistleblowers into enemies.”

     

    Among the many examples Elder cites are:  the refusal of black Democrat politicians (like Rep. Maxine Waters of California) to blame a 15.9% black unemployment rate (which Rep. Waters called “unconscionable”) on B.O.’s job-killing policies; the refusal of the media generally to challenge outrageous claims that B.O. has made, such as the 3.5 million jobs he’s claimed he “saved or created,” or his assertion that “ObamaCare” would cause the “cost curve” for health insurance to “bend down” (when all the evidence is that it’s causing steeply rising health-care costs), or even the false story B.O. told about his own mother (that as she lay dying from cancer in a Hawaii hospital, she fought with her insurance carriers over her medical bills) – an absolutely false narrative the press allowed B.O. to use to personalize his fight for “ObamaCare.”  Or, to cite another example, the claim that then-Senator B.O. made with a straight face – that, contrary to the policies of former President Bush, he would oppose any military intervention unauthorized by Congress unless the country faced imminent risk of attack.  But as president, B.O. joined with France and Britain in bombing Libya, “a country that posed no imminent threat to America,” whose leader, Moammar Gadhafi, had long before surrendered his weapons of mass destruction to the Bush administration.  “President Obama paid no political price for what Senator Obama would have opposed,” notes Elder (“In Defiant Obama, Media Face Monster They Made,” I.B.D., May 24). 

    With a news media that fails to call B.O. on his blatant, outrageous lies, “why shouldn’t [he] feel that he operates under different, special rules, and can do so without risking loss of support?” Elder asks.  “By refusing to hold [B.O.] to the same standard they would hold any garden-variety Republican, the media now face the monster they created.”  That’s why the media now seem so “shocked – shocked” (to quote Claude Rains’ character in the classic movie Casablanca) by the recent scandals discussed below. 

    But now that B.O. is in his second term – thankfully, he’s term-limited and so is a “lame duck” president – he’ll lose his charm because, by January 2017 (if not sooner), he’ll lose his power and become merely an ex-president.  That’s the general reason why most two-term presidents, throughout U.S. history, have had problems – including various types of political scandals – during their second terms.  B.O. is no exception; moreover, given his style of governance (discussed more fully in the next section), it’s virtually inevitable that scandals will emerge involving him and his regime (which, as I also discuss in the next section, I’ve called “the most lawless in U.S. history”). 

    Already, just four months into his second term, B.O. is facing what conservative columnist George Will has called a “trifecta” of scandals:  Benghazi (the militant Islamic terrorist attack on the U.S. consulate in Benghazi, Libya last September 11, and the subsequent effort by B.O.’s regime to cover up the facts concerning the attack), “IRS-Gate” (as I called it, in Part I, the targeting of certain conservative and libertarian groups by the IRS), and the Justice Department’s surveillance of AP news reporters (and other journalists).  As Will observes, these three scandals are really just the tip of the iceberg, for there are other scandals – some of which already broke into news during B.O.’s first term, like the “Fast and Furious” illegal gun-running operation in Mexico, involving Eric Holder’s Justice Department and the FBI – and others that are breaking into news right now.  A possible fourth scandal involves “power being wielded by executive branch officials (at the National Labor Relations Board and the Consumer Financial Protection Bureau) illegally installed in office by presidential recess appointments made when the Senate was not in recess.”  A fifth scandal “may be Secretary of Health and Human Services Kathleen Sebelius’ soliciting, from companies in industries that HHS regulates, funds to replace what Congress refused to appropriate,” to “educate” Americans about the “ObamaCare” mandates.  “Because [B.O.’s] entire agenda involves enlarging government’s role in allocating wealth and opportunity, the agenda now depends on convincing Americans to trust him, not their lying eyes,” concludes Will.  “In the fourth month of his second term, it is already too late for that” (“Scandals Debase the Currency of Gov’t Trust,” I.B.D., May 17). 

    These recent scandals – and particularly the scandal involving the AP and other journalists – have alarmed not only the news media and many self-described civil libertarians (because it concerns real threats to freedom of the press) but also many of B.O.’s most fervent supporters and apologists.  They’ve become, metaphorically, just like the little boy in Hans Christian Andersen’s story – the boy who sees the Emperor for what he really is, who by shouting “But he’s naked!” opens the eyes of the people generally. 

    For MSNBC’s Chris Matthews, apparently the “thrill” is gone:  the Hardball host (who in 2008 famously declared he felt “this thrill going up my leg” every time B.O. speaks) recently observed that B.O. “obviously likes giving speeches more than he does running the executive branch.”  Politico – another friend of B.O. – described Matthews’ remarks as “a rare, unforgiving grilling of the president as severe as anything that might appear on Fox News.”  Matthews grumbled that B.O. “doesn’t like dealing with other politicians . . . his own cabinet . . . members of the Congress, either party . . . doesn’t like giving orders or giving somebody the power to give orders.  He doesn’t seem to like being an executive.”  Instead, “he likes to write the speeches . . . likes going on the road, campaigning.”  Gee, how perceptive.  Matthews suddenly realizes B.O. isn’t fit to be president (“The Thrill Is Gone,” May 17). 

    Matthews isn’t alone; other prominent members of the “lamestream” media are finally seeing the light – seeing the naked “Emperor” for what he really is.  NBC’s Andrea Mitchell has accused B.O. of “the most outrageous excesses I’ve seen” in her years of journalism, going back before Watergate.  Thanks to the scandal involving the Justice Department’s surveillance of AP reporters, the Washington Post’s Dana Milbank has accused B.O. of “a full frontal assault on the First Amendment.”  And the Post’s “Fact Checker” – which hands out little “Pinocchios” to politicians who stretch the truth – recently gave B.O. “Four Pinocchios” (the worst rating on its lie-ometer) for his repeated claim that “the day after [the Benghazi attack], I acknowledged that this was an act of terrorism” (“Now the Lapdog Media Tell Us,” I.B.D., May 15). 

    I’ve previously written here in MayerBlog that “the deodorant America needed to deal with its B.O. problem” was the “Tea Party” movement.  That was certainly true in 2010, when the Tea Party movement played a critical role in energizing Republicans, leading to Republican control of the U.S. House of Representatives following the 2010 midterm elections.  Unfortunately, the Tea Party movement failed to help the GOP in the 2012 elections – not only because it was being harassed by the IRS, but also because it actually hurt the party that year.  By supporting questionable GOP candidates, the Tea Party allowed the Democrats to retain control of the U.S. Senate and, worse, by failing to support Mitt Romney, Tea Party activists allowed B.O. to be re-elected president with fewer popular votes than he had in 2008.  Now, the “deodorant” America needs to really get rid of its “B.O. problem” is . . .  his impeachment and removal from office.

      

     

    n Countdown to Impeachment  

                In my essay “The Unconstitutional Presidency” (February 21), I discussed the various ways in which modern U.S. presidents – both Democrat and Republican – have abused the powers of their office, thereby breaking the “chains of the Constitution” by which the Framers sought to restrain the most powerful political office in the world.  I also discussed how the current “occupier” of the White House, B.O., is the most dangerous of the lot – not only the worst president in U.S. history but also, as I’ve called him (giving my top ten reasons) “the most lawless president in U.S. history.”  (See in addition to my February essay, my previous post, “Rating the U.S. Presidents: 2012” (Feb. 22, 2012).) 

    The stench emanating from Washington, D.C. these past four and a half years is indeed America’s “B.O. problem.”  Specifically, it comes from the “culture of corruption” that B.O. and his regime have brought to the nation’s capital – what several commentators have described as the “Chicago style” of dirty politics, which in B.O.’s case (given his background as a “community activist” in the “Windy City”) means a combination of general thuggery with the dirty tactics expounded by radical leftist Saul Alinsky, in his handbook, Rules for Radicals.  (For more on this, read the excellent books by Michelle Malkin and David Freddoso, respectively, Culture of Corruption (2010) and Gangster Government (2011).)  B.O.’s “thugocracy” (as David Freddoso has called it) can be described as a combination of misfeasance, malfeasance, and nonfeasance – a veritable smorgasbord of all the types of abuse of power that can be committed by a U.S. president and his lawless regime.   

    That’s why it’s especially ironic that in his recent speech at The Ohio State University’s spring graduation exercises, B.O. advised graduates to put all their trust in government and reject those shrill “voices” that say it’s the source of our problems.  Ignore those limited-government types, he told the class of 2013, who warn “tyranny lurks just around the corner.”  (Not coincidentally, those limited-government types were the groups that the IRS targeted for special scrutiny.)  The irony is that B.O. himself “has proved our fears are well-founded.  Government, particularly governance by this rogue regime, needs more checks, not fewer; more skepticism, not less.  Tyranny isn’t lurking around the corner.  It’s now upon us, manifest in the pattern of misuse and abuse of government power by this presidency” (“A Scary Pattern of Power Abuse,” I.B.D., May 16). 

    It’s also ironic – and more than just a little hypocritical – for B.O. and his apologists to now say, in response to news of the recent “trifecta” of scandals (more fully discussed below), that he was ignorant of these abuses of power, claiming that he learned about them at the same time the public did, when he read accounts in the news media (the same news media that his Justice Department has targeted for surveillance, in another one of the recent scandals).  He’s “not responsible,” the White House claims, because the federal government has become too big and complex. (Never mind that B.O.’s basic agenda since 2009 has been to make the government even bigger – to be in charge not only of every American’s health care but also, among other things, to police every consumer credit transaction, whether it’s a mortgage, or a student loan, etc., etc.)  As B.O.’s chief political adviser, David Axelrod, put it, “part of being president is there’s so much underneath you that you can’t know because the government is so vast.”  As the editors of Investor’s Business Daily observe, “The president can’t have it both ways.  The genius who could direct an expanded government to solve our problems can’t now claim ignorance as scandal explodes” (“Having It Both Ways,” May 16). 

    In the last section of my February essay, I discussed the solution that the framers of the Constitution provided for the problem of presidents who abused their power – the means by which a corrupt president could be, in effect, “flushed away,” removed, put out of office.  We might consider it “the enema of the state.”  It’s called impeachment. 

    As I observed, virtually any of the lawless actions or abuses of power that B.O. has committed thus far during his occupation of the White House would constitute impeachable offenses.   Indeed, in my “Election 2012 Postmortem” essay (Nov. 10, 2012), I predicted that B.O. would be impeached and removed from office before he completes his second term:

                “The abuses of power committed by B.O. over the past four years – abuses that justify my calling him `the most lawless president in U.S. history’ . . .  ought to have disqualified him from being elected to a second term.  Now that he has been reelected, however, I seriously doubt whether he will complete a second term.  I am confident that his disregard for the Constitution and the rule of law will prompt him to continue to commit offenses – not merely `high crimes and misdemeanors,’ but also possibly treason (making him the first traitor to hold the office of president) – that will justify his impeachment and removal from office before his second term ends in 2017, in other words, sometime during the next four years.”

     

    Reviewing the history of past presidential impeachments – the cases of Andrew Johnson, who in 1868 was impeached, tried in the Senate, and acquitted; of Richard Nixon, who was nearly impeached (the House Judiciary Committee approved articles of impeachment) in 1974 but who resigned before proceedings went further; and of Bill Clinton, who in 1998-99 was also, like Johnson, impeached, tried in the Senate, and acquitted – I then considered the prospects for impeaching B.O. and successfully removing him from office.  (He would be the first president to be impeached and removed.)   Drawing lessons from both the Nixon and Clinton presidencies – the near-impeachment that forced Nixon’s resignation and the failed effort to convict Clinton – I concluded that a successful impeachment effort must be bipartisan; it cannot be begun by the Republicans alone, unless they hold commanding majorities in both the House and the Senate.  (I did add, however: “Perhaps some particular abuse of power – such as the drone strikes discussed in the `Assassin-in-Chief’ section [in the February essay] – will receive attention from the news media and arouse concern among left-liberals as well as conservatives and libertarians.”)  Hence, as I discussed in the concluding section of my “Election 2012 Postmortem” essay, the Republicans face a major challenge – both to educate the American people and to make them care whether the man in the Oval Office is abusing the awesome powers of his office.  

    Some conservative commentators have cautioned that, politically, it would be nearly impossible to impeach B.O.  Michael Medved, writing an op-ed in USA Today, reminds us that all three of the serious presidential impeachment drives (against Johnson in 1868, Nixon in 1974, and Clinton in 1998-99) occurred when the president’s opponents controlled both houses of Congress by hefty margins (“No, Obama Won’t Be Impeached,” May 20).  And Rush Limbaugh, commenting recently on his radio show, has bluntly claimed that it’s impossible for B.O. to be impeached, simply because he’s black.  “There’s no way Congress will impeach the first `African-American president’,” Rush asserts. 

    The recent “trifecta” of scandals may be game-changers, however.  First were the Congressional hearings that exposed the B.O. regime’s attempts to cover up the deplorable actions in Benghazi, Libya (discussed in the section below, “The Ghosts of Benghazi”).  Next came the revelation that the IRS had been targeting “Tea Party” groups and other conservative and libertarian groups (generally groups that advocated smaller government) for extra scrutiny – a matter so serious that even B.O. had to try to distance himself from the IRS actions, calling them “outrageous” and feigning ignorance.  (Why is “IRS-Gate” so serious?  Because it closely parallels one of the impeachable offenses charged against Richard Nixon, as I’ve discussed in Part I.)  Then came the news that Eric Holder’s Department of Justice had seized two months of phone records at the Associated Press to track down leaks that led to a 2012 story about a foiled al-Qaida bombing plot.  The AP scandal was quickly followed by the revelation that the Department of Justice (maybe we should call it In-Justice) not only spied on Fox News reporter James Rosen but also labeled him an unindicted “co-conspirator and/or aider and abettor” in an espionage case involving a State Department whistle-blower.  His “crime” was persuading the State Department source to give him information about North Korean missile tests – in other words, his “crime” was simply to report the news. 

    As I’ve noted above, in the previous section, by this point some of B.O.’s most fervent apologists are wondering if he’s really the “fierce defender of the First Amendment” that his spokesman Jay Carney says he is.  The New York Times declared that the administration “has moved beyond protecting government secrets to threatening fundamental freedoms of the press to gather news.”  As the editors of Investor’s Business Daily observe, that’s “strong stuff, considering the source.”  Even stronger is the criticism of James C. Goodale, the lawyer who represented the Times in the Pentagon Papers case.  Goodale says the Rosen case “is a further example of how [B.O.] will surely pass President Richard Nixon as the worst president ever on issues of national security and press freedom.”  As the I.B.D. editors conclude, “Goodale gets what many liberals still can’t bring themselves to believe – [B.O.] really has no love for the First Amendment.  His administration treats freedom of the press as a reward for good behavior, not a sacred right.  Up to now, the media have mostly behaved, with a few exceptions such as Fox News.  They’re now finding out that, when they do take their role as public watchdogs seriously, the administration bares its teeth” (“Creeping Along Tyranny’s Road,” May 28).  Indeed, the front page story in a recent issue of USA Today had the headline, “Media fear [B.O. is] out to `criminalize journalism’” (May 28). 

    Another commentator who has compared B.O. to Richard Nixon is Jonathan Turley, a constitutional law professor at George Washington University Law School (and a contributor to USA Today), who wrote – last year – that B.O. “is using executive power to do things Congress has refused to do, and that does fit a disturbing pattern of expansion of executive power.”  Professor Turley adds: “In many ways, [B.O.] has fulfilled the dream of an imperial presidency that Richard Nixon strived for.  On everything from [DOMA] to the gaming laws, this is a president who is now functioning as a super-legislator.  He is effectively negating parts of the criminal code because he disagrees with them.  That does go beyond the pale.”  More recently, in response to the scandal involving Justice Department surveillance of AP reporters, Turley has a column in USA Today arguing that Attorney General Eric Holder – who, it is becoming increasingly apparent, lied to congressional investigators about his knowledge of the surveillance operations – ought to be fired, despite Holder’s role thus far as B.O.’s “sin eater” – a high-ranking associate who shields the president from responsibility for his actions.  Turley notes that Nixon had H.R. Haldeman and John Ehrlichman; Ronald Reagan had Oliver North and Robert “Bud” McFarlane.  He should have added that although North and McFarlane successfully shielded President Reagan from the “Iran-Contra” scandal, Haldeman and Ehrlichman did not shield Nixon from culpability in Watergate.  Indeed, their fall was just a prelude to Nixon’s (“Attorney General Must Go,” May 29). 

    It’s telling that my local newspaper, The Columbus Dispatch – a paper that can fairly be described as middle-of-the-road in its political views – in a recent editorial has suggested that B.O.’s methods have “set the stage” for the current scandals.  “Despite the president’s denials, defenses, and dodges, there is a thread running through the multiple scandals that now beset him and the underlings who serve him in the executive branch: the arrogance of power.”  The editors observe (as I have, in my discussion of “IRS-Gate” in Part I) that none of the recent “trifecta” of scandals (the IRS, the Justice Department and the AP, and Benghazi) can be blamed merely on “low-level” employees in the executive branch, for B.O. as president bears the responsibility.  Not just because of his position at the top of the executive-branch flow chart, but also “because the leadership of the organization sets the example and tone for those underlings who carry out the administration’s policies and who apply them directly to the nation’s citizens.”  Not only the recent trifecta of scandals but also other examples of abuse of power permeate B.O.’s presidency: among other things, “the high-handed `recess appointments’ of members of the National Labor Relations Board, actions that two courts have now ruled unconstitutional because the Senate was not in recess when the appointments were made”; “the president’s invocation of executive privilege to stymie congressional investigation of the U.S. Justice Department’s disastrous Fast and Furious program”; “his refusal to open up fully to Congress and the public about his unchecked use of drones to kill terrorist suspects, including American citizens, overseas”; and so on.  
“If, in word and deed, the leadership runs roughshod over the rule of law, denies the legitimacy of alternative political viewpoints and regards election as a carte blanche to do as one pleases,” not surprisingly “the rank and file will misbehave accordingly” (“Leading by example,” May 19).

    As I observed in the final paragraph of my February essay, “I don’t expect anyone as arrogant and narcissistic as B.O. to do a similar thing [as Nixon did, to resign from the presidency] when faced with the real possibility of impeachment by the House and a trial in the Senate.”  That’s why I concluded that “it’s vitally important for Republicans to educate the public and, frankly, to not only court but also shape public opinion.”  The Democrats’ success in doing that helps explain why they retained both the White House and control of the Senate in the 2012 elections.  The Republicans will need to be equally successful, to retain control of the House and perhaps regain control of the Senate in the 2014 midterm elections – if we are to hope for a restoration of a constitutional presidency, before B.O.’s lawlessness destroys both the office and the Constitution.

     

     

    n Politics 2013: It’s All About 2014 (and 2014’s About 2016) 

                It’s a sign of the intense partisanship in Washington over the past several years (especially since the hyper-partisan B.O. became president) that most political pundits see the current political battles in the nation’s capital as merely efforts by politicians of both political parties to better position themselves for the next election – in other words, the “mid-term” congressional elections in 2014 and the general election (including the next presidential election) in 2016.  And, to a great extent, that’s true.   Let me stick my neck out here, and make a few bold predictions about these upcoming elections, based on today’s politics (and especially the developing scandals surrounding the B.O. regime). 

    Mid-term elections generally favor the party that doesn’t control the White House, so Republicans should be able to comfortably hold their majority in the U.S. House and also have a good chance to regain control of the Senate.  Republicans need a net gain of only six seats to have a majority in the Senate; Democrats not only have more seats to defend (21, compared to the GOP’s 14) but several Democrat incumbents – many of them in so-called “red” states, which typically tilt Republican – have announced their retirement.  Most recently, it was Senator Max Baucus (D.–Montana) who announced he would not seek re-election. 

    Even if the scandals currently enveloping the B.O. regime will not (realistically) result in a serious (or successful) effort to impeach B.O. and remove him from office, they should help the Republicans in the congressional elections, because American voters would prefer to see “divided government,” or “gridlock,” rather than continued tyranny from an unbridled executive branch.  Moreover, as the federal health-care control law known as “ObamaCare” continues to be implemented, in 2013 and 2014, it will only add to B.O.’s unpopularity.  This is one “train wreck” (to borrow retiring Senator Baucus’s famous phrase describing the law) that B.O. cannot blame on his predecessor, as so-called “ObamaCare” is the signature legislative “achievement” of B.O.’s presidency (as I discussed in Part I of this essay, in the section “The Disease Known as `ObamaCare’”).  The more Americans learn about “ObamaCare” – especially when they see it in actual operation, steeply escalating the costs of health care while also limiting Americans’ choices – the less they’ll like not only the law but also B.O. and the Democrats, the party that used political chicanery to push it through Congress in 2010.  Republican politicians just have to remember to keep clearly reminding the American people who’s responsible for this horrid law – and to keep pledging to reform or repeal it.  (That’s why the recent vote by the Republican majority in the House to repeal the law – although “dead on arrival” in the Democrat Senate and in the Oval Office, because of B.O.’s veto threat – was so important, not just symbolically but also as a reminder to American voters of the GOP’s unity in opposition to “ObamaCare.”) 

    So, I predict big GOP wins in the 2014 congressional elections.  What about 2016? 

    In this year’s “Spring Briefs” essay (March 23), in the section entitled “Saint Hillary?” I observed, in the wake of revelations about the Benghazi fiasco (discussed more fully below, in the section “The Ghosts of Benghazi”), that the Democrats’ “heir-apparent” to the White House, Hillary Rodham Clinton, may face an uphill battle, both to get her party’s nomination (as she tried to do against B.O. in 2008) and to get elected as the nation’s first female president in 2016: 

    [T]he pebbles strewn along Hillary Clinton’s path back to the White House are more like boulders, the inconvenient facts about her past record as U.S. secretary of state, U.S. senator, and first lady of both the U.S. and Arkansas – a past that ought to disqualify her from higher office, except in a political party that chose B.O. as its standard-bearer.  Mrs. Clinton’s tenure as secretary of state can be fairly described as disastrous: over the past four years, militant Islamic terrorists have seized more control of the Middle East, as the murder of the U.S. ambassador and three other Americans at the U.S. consulate in Benghazi, Libya, so pointedly illustrated.  Hillary was a key conspirator in the B.O. regime’s cover-up of that foreign-policy debacle, helping to perpetuate the lie that the Benghazi terrorist attack was motivated by an anti-Mohammed video broadcast on the Internet.  And Hillary’s much-touted “reset” of U.S.–Russian relations has yielded only B.O.’s capitulation to Vlad Putin on such issues as missile defense.  Even more embarrassing to Hillary (if she had any shame) is her record as Slick Willy’s cuckolded wife and criminal co-conspirator.  (Remember those infamous billing records from Hillary’s Rose law firm that mysteriously appeared in the Clinton White House living quarters – documents that were at the heart of the Whitewater land fraud?  All of Hillary’s whining about a “vast right-wing conspiracy” out to get her and Bill will never fully lay that story to rest.)  Hillary’s political past may now be in the history books (most of which are written by people with a leftist bias that’s favorable to Hillary); nevertheless, that past will come back to haunt her, if she should be so foolhardy as to reenter political life. 

     

    Maybe like a cockroach that keeps coming back the more one tries to kill it, Mrs. Clinton may come back to political life.  (Like her husband, “Slick Willy,” she may truly be a “Teflon” politician.)  And maybe no other viable candidate will emerge among the Democrats, in the wake of the huge power vacuum left by B.O. 

    At this point, I see no viable presidential candidate emerging for the Democrats in 2016.  On the other hand, as I briefly discussed in my “Election 2012 Post-mortem” (Nov. 10, 2012), there is a whole stable of attractive, energetic, and relatively young Republican members of Congress or governors who could be viable presidential candidates in 2016.  (Forget about New Jersey Gov. Chris Christie, however.  Despite his recent gastric band weight-loss surgery – which clearly shows he’s as interested in running for president as he is in staying alive – Christie’s moderate policies and his blatant pandering to B.O. in the wake of tropical storm Sandy ought to disqualify him from the GOP primaries.)  If I were a betting man, I’d put my money either on Senator Marco Rubio (R.– Fla.), especially if the immigration-reform bill he’s co-sponsoring passes Congress, or on Senator Rand Paul (R.–Ky.), to be the next president of the United States. 

    As a libertarian Republican, obviously I like Rand Paul.  As I commented here on MayerBlog during the 2012 primary season, it’s a shame that it wasn’t Rand Paul who ran for president in 2012, rather than the Senator’s father, former Congressman Ron Paul.  Although the elder Paul has justly been credited with injecting libertarian principles into the mainstream of the GOP and hence into American major-party politics (the so-called “Ron Paul Revolution,” nicely discussed in a recent book by Brian Doherty of the same title), the elder Paul had difficulty communicating his message to a wide range of voters.  He seemed like a “Johnny-one-note” in his obsessive criticism of Mideast military intervention by both the Bush and B.O. presidencies; and in general Ron Paul’s devotion to principle came across as a crazy kind of “extremism.”  As a politician, even though he’s less experienced in years, the younger Paul is more mature than his father.  Rather than being a self-conscious rebel against the GOP establishment, he’s been working within the party.  And he earned the respect of a wide range of Americans (from across the full political spectrum) by his principled stance against the B.O. regime’s use of unmanned drones to assassinate civilians, in his recent filibuster (a genuine old-fashioned Senate filibuster), as I discussed in “Spring Briefs 2013” (in the section “Droning On – To a Point”).   Earlier this year, Senator Paul spoke at the GOP Lincoln Day dinner in Cedar Rapids, Iowa – a traditional launchpad for presidential campaigns – where he ripped his potential Democratic rival, Hillary Clinton, and continued to urge the Republican Party to broaden its appeal (see my discussion “Rand Paul and Charles Murray Are Even More Right,” also in “Spring Briefs 2013”).  As one news story put it in its headline, “Sen. Paul’s Iowa trip has feel of 2016 run.” 

    And so, I’m looking forward to President Rand Paul’s inauguration in January 2017!

     

      

    Hey, Big Spenders!

                As I have previously commented here on MayerBlog, B.O. and the Democrats suffer from an ailment I call “Deficit-Attention Disorder” (D.A.D.).  As I explained in the section “Debt’s the Difference: The Democrats’ Deficit-Attention Disorder” in last year’s “Thoughts for Summer” (May 25, 2012), one of the biggest differences between the parties is their response to the problem of obscenely high federal deficit spending (with deficits averaging $1 trillion each year that B.O. has been in office) and the resulting hemorrhaging of the U.S. national debt to dangerously high levels (approaching 100% of the nation’s GDP).  I observed: 

    “B.O. and the Democrats refuse to even recognize that there’s a problem; in other words, they suffer from what might be called a `deficit-attention disorder.’  The Republicans, in contrast, are serious about dealing with the problem in the only truly effective way – by cutting government spending – but find themselves targets of Democrat demagoguery for simply trying to do so.” 

     

    And as I further noted in the section “Sequester, Schmequester” in this year’s “Spring Briefs” essay (March 23), “D.A.D. in turn is a disorder that results from the Democrats’ underlying disease:  paternalism (their paternalistic philosophy of government).”  Moreover, the Democrats’ stubborn refusal to recognize both their D.A.D. and the paternalism in which it is rooted is, in fact, the basic problem in Washington, D.C. today. 

    This key difference between the two parties – and particularly the Democrats’ D.A.D. – was quite evident in last year’s debate over so-called “sequestration,” the compulsory, across-the-board budget cuts that resulted from the failure to achieve a real solution to the national-debt-ceiling impasse.  The compromise tax legislation that B.O. signed into law at the beginning of 2013 – generally retaining the “Bush tax cuts,” the lower rates on individual federal income taxes except for Americans earning more than $400,000 – was considered a defeat by some conservatives but ought to be considered the first of several victories by Republicans (or defeats for B.O. and the Democrats) thus far this year.  When B.O.’s White House (particularly former chief of staff Jack Lew, who’s now the treasury secretary) first proposed sequestration as a kind of poison pill during the 2011 budget negotiations, they did not imagine that Republicans would be willing to stomach slashing the Pentagon’s budget in return for “cuts” (or rather, reductions in the rate of growth) in domestic programs and a more realistic opportunity to bring federal deficits – and hence the mushrooming national debt – under control.

    Meanwhile, the B.O. regime has continued its crass political scheme to blame Republicans for the sequester by inflicting pain on the American people – allegedly caused by the sequester but really caused by executive-branch mismanagement of the budget.  The cancellation of public tours at the White House and the delayed opening of certain national parks were just the tip of the iceberg.  More serious were the flight delays resulting from FAA furloughs of air-traffic controllers.  But B.O.’s scheme failed: the American people clearly saw who was responsible for the cancellation of White House tours and the like, and the flight delays caused by the furloughs proved so unpopular – not just with the traveling public but with members of Congress – that Congress quickly passed, and B.O. was forced to sign into law, a bill providing sufficient funding for the FAA.  Insightful political commentators (like the editors of Investor’s Business Daily) opined that the whole fiasco involving FAA furloughs only proves that we ought to privatize air traffic control operations, as many countries (including Canada) have done, with much success (“Air Traffic Control: A Private Affair,” May 1).  Of course, such real reform cannot be accomplished so long as we have a socialist in the White House.  

    The deficit/debt issue will again heat up by Labor Day, when it is expected that the level of the nation’s debt will hit the $16.7 trillion ceiling and the Treasury Department will again warn that the United States may be in danger of defaulting on its obligations.  At that point, as Republicans line up against B.O. and the Democrats in their game of fiscal “chicken,” congressional Republicans ought to call the Dems’ bluff by passing a bill that prioritizes revenues so that certain obligations (not only interest payments on the national debt but also “entitlement” payments such as Social Security checks) will be met.  That would take the specter of “default” off the table, as both sides try to negotiate yet another compromise between the responsible budget proposed by the House Republicans (another modest effort by House Budget Chairman Paul Ryan’s committee, which at least makes a serious effort to reduce spending) and the irresponsible budgets proposed by the “big spenders” in the White House and the Democrat-controlled Senate.

      

     

    B.O.’s Recession: A Status Report

    Here on MayerBlog I’ve frequently described the lousy U.S. economy – commonly called the worst “recovery” since the Great Depression – as “B.O.’s recession,” because it’s the direct result of his failed, misguided economic policies.  (See, for example, my discussion of “Keynesian `Stimulus’ Bullshit,” in Part II of this year’s “Prospects for Liberty” (January 31), supplemented by my discussion of “B.O.’s Anti-Growth Agenda,” in this year’s “Spring Briefs” (March 23).)  Although the economy technically fails to fit most economists’ definition of recession (two or more successive quarters of negative economic growth), the continuing high unemployment rates and the sluggish annual growth (averaging less than 2%) in GDP really make the economy look more like a recession than a recovery.

    The best that can be said about B.O.’s recession is:  It could be worse.  That’s so, in a couple of different senses – one that is hopeful; one that is ominous. 

    The hopeful sense in which “It could be worse” is this:  It’s unlikely that even more disastrous policies will be implemented during the next four years.  Why?  Simply because B.O. is a failure not just in his policies (in the substance of the policies he’s been pushing) but also as a leader (in actually implementing those policies).  He’s the “Naked Emperor,” from the famous Hans Christian Andersen story (as noted above) – a fraud, whose presidency is based on bullshit.  But, thankfully, B.O. has a lousy work ethic (which is a polite way of saying he’s lazy); he’s a narcissist who’d rather just be POTUS (President of the United States) than do the job of the POTUS.  The Government Accountability Institute mined the records of the White House calendar and discovered that B.O. “has spent a total of [only] 474.4 hours (or 47.4 10-hour workdays) in economic meetings or briefings of any kind throughout his presidency” thus far.  In comparison, he “has played 115 total rounds of golf and spent 86 days on vacation, for an estimated combined total of 976 hours.”  Lest we forget, B.O. was the president who in 2009 began his occupancy of the White House by declaring, “My economic agenda . . . begins with jobs,” and who has since repeatedly declared that his “No. 1 focus” would be “jobs, jobs, jobs” (“A Presidency Driven Into the Rough,” I.B.D., April 30).  Just imagine how bad the economy would be if B.O. really were focused, “laser-like,” on it.

    The other sense in which B.O.’s recession could be worse, unfortunately, is ominous:  it’s indeed likely to become worse.  Even though an effective Republican opposition can prevent the enactment of even more disastrous legislation being pushed by B.O. and the Democrats (simply through the GOP’s control of the U.S. House and continuing gridlock), the major pieces of legislation that were passed during the first two years of B.O.’s presidency, when both houses of Congress were controlled by Democrats – “ObamaCare,” the Dodd-Frank law, and “stimulus” spending – will continue to hamstring the U.S. economy.  So too will the massive increase in federal government regulations that has taken place during the past four years.  (See, for example, my discussion of “Regulation Nation” in last year’s “Thoughts for Summer” (May 25, 2012).)  And, as the next section discusses, the Dodd-Frank “reform” legislation (what I call the “Frank-n-Dodd monster”) not only failed to fix the underlying problems that led to the 2008 fiscal crisis but is likely to hasten yet another fiscal crisis. 

      

     

    Déjà Vu: The Coming (Returning) Fiscal Crisis

                Second only to the so-called “ObamaCare” law, as the worst piece of legislation passed by Congress (at least in recent years), is the monstrous Dodd-Frank law (named after its chief sponsors, former Senator Chris Dodd and former Congressman Barney Frank, both Democrats), which also passed Congress in 2010 by a mostly party-line vote when Democrats controlled both houses of Congress.  The law was supposed to “solve” the problems that led to the financial crisis in 2008, but it did nothing to cure the basic underlying problem of the “housing bubble,” and it has actually made things more difficult for banks – especially small banks, which are increasingly being hurt, and in many cases driven out of business, by the steep regulatory costs imposed by Dodd-Frank.  (For more on all of this, see especially the superb book by former BB&T chairman – and now president of the libertarian Cato Institute – John A. Allison, The Financial Crisis and the Free Market Cure: How Destructive Banking Reform Is Killing the Economy (McGraw-Hill, 2013).)

    Most insightful analysts of the 2008 financial crisis (like Mr. Allison) have concluded that it was triggered by the bursting of the housing bubble, which in turn was caused by irresponsible government regulations pushing “affordable housing” – which in practice meant a relaxing of mortgage standards for buyers with weaker credit – traced back to the Community Reinvestment Act, passed originally during Jimmy Carter’s presidency and then broadened under Bill Clinton, together with similar initiatives pushed by the FHA and mortgage giants Fannie Mae and Freddie Mac.  Now, perversely, the B.O. regime is pushing those same policies “that got us into this mess in the first place” (to quote the favorite mantra B.O. uses when he tries to blame economic problems on his predecessor, George W. Bush).  But this problem cannot in any way be blamed on Bush, who tried to reform Fannie Mae and Freddie Mac, only to be rebuffed by those entities’ defenders in Congress – Democrat politicians led by Congressman Frank and Senator Dodd!  As the editors of Investor’s Business Daily recently warned, the B.O. regime “wants banks to lower lending standards, and Fannie and Freddie are back in the black.  The stage is set for a replay of some very unpleasant history” (“Look Out, Taxpayers: It’s Bubble Time,” April 4).

    And in another ominous development, B.O. has named Democratic Congressman Mel Watt (former chairman of the Congressional Black Caucus and one of the leading “affordable housing” zealots who caused the housing bubble) as the new executive overseeing Fannie Mae and Freddie Mac, the quasi-private entities that back up nearly 90% of U.S. mortgage loans.  Will the insanity never end?

      

     

    Gun-Control Fascists: Their Ammo Is Bullshit

                Anti-gun fascists – activists pushing for more government control over guns (and hence further erosion of Americans’ Second Amendment rights) – have been trying to disguise their agenda by claiming they’re concerned about “gun violence” and are really calling just for so-called “common-sense reforms” to help assure “gun safety.”  Those claims are simply bullshit, as I maintained in Part III of my “2013: Prospects for Liberty” essay (February 7, in the section on “`Gun Violence’ Bullshit”).  It’s not guns, but the misuse of guns by criminals, that jeopardizes Americans’ rights; efforts by the government to expand its control over firearms, in the name of reducing “gun violence” or promoting “gun safety,” only make law-abiding Americans more vulnerable to criminals, less able to protect themselves, and more dependent on government for their safety.  As I noted in the February essay, “gun control” really means government control over the ownership, possession, and use of firearms by individuals – in other words, government control over firearms owners – and the most extreme advocates of gun control (the most extreme advocates of “solutions” to the “gun violence” problem) really want (as their ultimate objective) a government monopoly over firearms.  A government monopoly means the prohibition of private gun ownership, including the confiscation of all firearms possessed by individuals not authorized by the government to possess them.

    Following a key dogma in the leftists’ playbook (to “never let a crisis go to waste,” as B.O.’s former White House chief of staff (and now mayor of Chicago) Rahm Emanuel has put it), gun-control fascists have been trying shamelessly to exploit the tragic school shootings at a Newtown, Connecticut elementary school earlier this year – in which 20 children and six adult staff were murdered – as a kind of emotional ammunition, to push their agenda.   

    B.O. in particular has been exploiting the Newtown tragedy in order to try to ram his gun-control initiatives through Congress.  In an obvious abuse of presidential power, B.O. flew some of the Newtown victims’ families on Air Force One to Washington, D.C., to lobby members of Congress.  And in a speech on April 8 at the University of Hartford, accompanied by some Newtown parents, B.O. claimed that his push for gun control was apolitical.  “Why, then, was he yelling?” asked the editors of Investor’s Business Daily.  B.O. “was thundering over and over that `this is not about politics.’ [But] in the course of his speech, he spoke of `politics in Washington,’ of `political stunts to prevent votes,’ of `political victory or defeat for me,’ and of how he `did not believe the country was as divided as our politics would suggest.’”  “That’s a lot of talk about politics on something that has nothing to do with politics,” the editors observe, adding: “Of course, what [B.O.] is up to has everything to do with politics – and nothing to do with saving children from bullets.”  B.O., the “most-ideological president in U.S. history,” as well as one of the most divisive and one of the worst narcissists ever to occupy the White House, sees in gun-control politics “another opportunity to body-slam a component of the Republican Party’s base, those bitter non-urban GOP voters who `cling to guns or religion,’ as [B.O.] has said of them” (“Politicizing a Mass Killing of Kids,” I.B.D., April 10). 

    Yet, as a USA Today editorial on April 4 acknowledged, the momentum on “gun safety” (meaning gun-control) legislation after the Newtown shooting has slowed.  The editors (themselves members of the gun-control fascist club) attribute that to the supposed power of the “gun lobby.”  What they fail to mention is that all the measures proposed by gun-control zealots – including the new draconian gun controls passed into law by the Democrat-controlled legislatures of Maryland and Colorado – would not have prevented the Newtown massacre.  (One of the Newtown victims’ parents who refused to play along with B.O.’s attempt to politicize the tragedy was Mark Mattioli, who in a Fox News interview asked “Why should I be hampered in protecting myself when someone can come into my home and outgun me?”  He called for increased enforcement of existing laws and more mandatory sentencing, pointing out that criminals won’t follow new laws any more than they do existing ones.  And he has praised the detailed “School Shield” proposal the National Rifle Association has put forth since the massacre to train armed guards to protect schoolchildren from mass killers.)  

    When the battle was taken up in the Congress this spring – in what might be called “a gunfight at the Not-so-OK Corral” – B.O. and the other advocates of gun control suffered a major political defeat.  In the Democrat-controlled Senate, gun-control backers came up far short of the 60 votes they needed for passage (with several Democrats joining the Republicans in opposition).  The compromise background-check bill, co-sponsored by Senators Joe Manchin (D.–W.Va.) and Pat Toomey (R.–Pa.), got 54 votes.  The amendment that would renew a federal ban on so-called “assault weapons” (the bill pushed by the author of the original Clinton-era ban, Senator Dianne Feinstein (D.–Calif.)) got only 40 votes, and the effort to put federal limits on the size of ammunition clips got only 46 votes.

    The much-vilified “gun lobby” groups that the gun-control fascists like to demonize – the National Rifle Association and other gun-rights organizations, including the even more strident Gun Owners of America – are not only effective grass-roots lobbying organizations but are, in fact, the nation’s largest (and most effective) civil rights organizations, for they are defending not only their members’, but all Americans’, fundamental freedom: their Second Amendment rights.

      

     

    Reality Check: The Boston Marathon Bombing Massacre (BM2)

                The bombing at the Boston Marathon on April 15 – which killed three persons (including an 8-year-old boy) and injured or maimed over 170 people – was, depending on one’s interpretation of the facts as reported in the news media, either: (A) another one of a series of recent mass killings perpetrated by deranged young white males, or (B) the most significant militant Islamic jihadist terrorist attack on U.S. soil since the horrid al-Qaeda attacks on September 11, 2001, or (C) something else.  The correct answer is C, and the “something else” I’m referring to is actually a combination of A and B.  (In other words, A and B are narratives based on a false dichotomy resulting from confusion over the definition of the words terrorist and terrorism – and a failure to recognize the significance of the global Muslim jihadist threat.)

    As news stories over the past month and a half have revealed, the perpetrators of the bombing were two brothers, Tamerlan Tsarnaev, 26 (who was killed in a gunfight with police), and his younger brother, Dzhokhar, 19 (who was apprehended in a nearby neighborhood, just outside the area where Boston police conducted house-to-house searches, and is now in custody, awaiting trial for the murders).  The Tsarnaev brothers are Muslim immigrants from Russia – specifically, their family origins are in Chechnya, the rebellious Russian region near the Caucasus Mountains (they are therefore, quite literally, Caucasian Muslims), which has become a breeding ground for terrorists.  By all accounts, the brothers were fully Americanized, permanent residents attending U.S. schools, but as such, they were perfect “sleepers,” terrorists in waiting, ripe for recruiting.  And from what media accounts tell us thus far, it was the older brother, Tamerlan, who became most disenchanted with life in America and so traveled back to Russia, where he received training in militant Islamic camps near his homeland.  When Tamerlan returned to the United States, he in turn “radicalized” his younger brother, Dzhokhar.  Statements made by Tamerlan on his website before the bombing and by Dzhokhar after the bombing clearly show they regarded themselves as Islamic jihadists, killing American civilians in retaliation for the killing of civilians by U.S. forces in Afghanistan.  Despite abundant warning signs (particularly with regard to Tamerlan’s activities), U.S. government authorities failed to anticipate the danger.  (Once again, fears of “politically incorrect” ethnic or religious profiling apparently prevented federal authorities from either raising red flags, or acting on them, concerning the Tsarnaev brothers’ plot.  Thus far, thankfully, it seems that other than some friends who aided or abetted Dzhokhar’s attempt to escape, no other jihadists were involved in the bombing itself.)

    That all said, nevertheless, those conservatives who called for the surviving Tsarnaev brother to be treated as an “enemy combatant” – and thus to be denied those procedural protections that American law affords to other accused criminals – are also wrong, in their irrational overreaction to the militant Islamic jihadist threat.  

    The Tsarnaev brothers were indeed terrorists – specifically, militant Islamic terrorists – but that does not make them enemy combatants.  Terrorism is generally understood today as the use of criminal acts calculated to provoke a sense of terror in the general public, a group of persons, or particular persons for political purposes.  That’s the general definition of terrorism that the U.N. General Assembly has used since 1994 in various resolutions condemning terrorist acts, as the Wikipedia article on “Terrorism” notes, adding that in neither U.S. law nor international law is the term terrorism precisely defined.  Since the September 11, 2001 al-Qaeda attacks on the United States, militant Islamic terrorists essentially have declared war on the United States, and the U.S. government has responded by waging war in the Middle East (primarily in Iraq and Afghanistan), though without a formal declaration by Congress because (from the U.S. perspective) it is a defensive war.  Militant Islamic terrorists killed or captured in a foreign theater of war are indeed enemy combatants; but “homegrown terrorists” like the Tsarnaevs – legal residents and/or citizens of the U.S., acting within the United States – are, legally, common criminals.  Terrorism explains their motive; it does not define their criminal acts – which in this case, are murder and attempted murder.

    In this case, thankfully, given the circumstances of the Boston attack (the placement of the bombs in a very public place, near the Marathon’s finish line, where a multitude of cameras, both public and private, captured the suspects’ images), modern technology enabled the police to identify them.  And the usual operation of the American criminal justice system – both the police and FBI in apprehending suspected criminals, and the courts in putting them on trial – seems quite adequate to deal with this particular criminal, according to the rule of (civilian) law.

    According to a military expert recently interviewed by a USA Today reporter, however, the Boston bombing is not an “anomaly,” and we can expect more such cases of “home-grown” militant Islamic terrorism challenging our criminal justice system.  Michael Barbero, a retired Army lieutenant general who led the Joint IED Defeat Organization (JIEDDO), has warned that the threat from homemade bombs – the top killer (and maimer) of U.S. troops in Afghanistan and Iraq – is “here to stay,” will persist for decades, and is likely to become a more prevalent menace domestically (“Military expert: Boston bombing `not an anomaly,’” May 20).  The recent brutal knife attack that resulted in the death of a British soldier on the streets of south London – with the militant Muslim perpetrators caught literally “red-handed,” with blood-stained hands, still wielding the knife with which they did the murder and brazenly invoking jihadist rhetoric on video – reminds us not only that militant Islamic terrorist attacks can and do occur in the West, but also that jihadists do not need firearms to commit murder.

    The failure of the U.S. government to fully recognize the dangers of militant Islamic terrorism – and to deal with the problem appropriately, according to the Constitution and U.S. law – may be traced back to the George W. Bush administration, which bowed to political intimidation (mostly from the left, pushing the agenda of so-called “political correctness” to avoid religious discrimination against Muslims) and used the euphemistic term terrorism as a blanket description, when what it really meant was “militant Islamic terrorism.”

    But the B.O. regime – in its zeal to “reset” American foreign policy, to use the term coined by B.O. himself and his former secretary of state, Hillary Clinton – has taken political correctness to ridiculous new depths, resulting in a U.S. government that seems powerless to deal with the real threats to national security – and to the rights, and the lives, of American citizens – that militant Islamic terrorism poses.  One result (as noted in the section below) is the regime’s attempt to cover up the facts about the militant Islamic terrorist attack on the U.S. consulate in Benghazi, Libya, on September 11, 2012.  Another striking case in point: Nidal Hasan, the Fort Hood killer (who committed mass murder – shooting and killing 13 people and injuring over 30 – at Fort Hood, near Killeen, Texas, on November 5, 2009).  Many political commentators have reasonably called the killings an act of (militant Islamic) terrorism (Hasan allegedly invoked the name of Allah as he shot, targeting U.S. soldiers in uniform).  Yet the B.O. regime stubbornly refuses to recognize the obvious; the Department of Defense and federal law enforcement authorities have classified the incident as an act of “workplace violence” (similar to a U.S. Postal Service worker going nuts – or going “postal,” literally – and shooting his or her fellow employees).  Because Hasan is a member of the U.S. armed forces – a U.S. Army major serving as a psychiatrist (ironically!) – he’s being prosecuted (appropriately) in a military court.  So far he’s been charged with 13 counts of premeditated murder (for the persons he killed) and 32 counts of attempted murder (for the persons he wounded) under the Uniform Code of Military Justice; additional charges may be added at his court-martial, and if convicted, he may be sentenced to death (as he should be).  Outrageously, however, it was recently reported that he is still collecting his salary (nearly $300,000 annually) as he awaits trial!

    When will the B.O. regime pull its head out of the sand and acknowledge the real threat of militant Islamic terrorism?     

      

     

    The Ghosts of Benghazi

                No wonder “Benghazi” – the militant Islamic attack on the U.S. consulate in Benghazi, Libya, on September 11, 2012, which resulted in the murder of the U.S. ambassador, Christopher Stevens, and three other Americans – has been associated with “IRS-Gate” and the Justice Department’s surveillance of news reporters, in the so-called “trifecta” of scandals that currently plague the B.O. regime.  Many commentators (among them, the usually insightful and politically-astute Charles Krauthammer) regard Benghazi as the most serious scandal, the one that might actually bring down B.O.’s presidency.  The parallels to Richard Nixon and the Watergate scandal that brought down Nixon’s presidency are quite clear – particularly in the way both administrations have tried to “cover up” an underlying act of malfeasance (in Watergate, it was a petty crime, a “third-rate burglary,” as it’s been aptly characterized; in Benghazi, it’s the fecklessness of B.O.’s foreign policy, generally, and especially his mismanagement of the war against militant Islamic terrorism).  (For more on this, see my discussion “The `Arab Spring’ Turns Into an Islamo-Fascist Autumn,” at the beginning of last fall’s “Tricks and Treats” (Oct. 25, 2012).)  And, as many conservatives have pointed out in arguing that Benghazi is even more serious than Watergate, no one died in Watergate.

    That Benghazi may become B.O.’s Watergate is even more apparent when we consider the regime’s attempt to “stonewall” ongoing congressional investigations.  The Senate Armed Services Committee hearing – which included testimony by two men with key roles in the Libya debacle, departing Defense Secretary Leon Panetta and Gen. Martin Dempsey, chairman of the Joint Chiefs of Staff – painted a picture of a disengaged president, particularly when, under questioning by Sen. Lindsey Graham, Panetta revealed that B.O. talked with him only once during the Benghazi assault and never called him back for updates.  More recently, three State Department “whistleblowers” testified before the House Oversight and Government Reform Committee: Gregory Hicks, former deputy chief of mission in Libya (Ambassador Stevens’ second-in-command, the No. 2 diplomat in Libya during the Benghazi attack); Mark Thompson, acting deputy assistant secretary for counter-terrorism; and Eric Nordstrom, the former regional security officer in Libya.  Hicks’ testimony was particularly explosive: after an emotional retelling of the night of the attack, including the death of Ambassador Stevens, Hicks revealed that he was rebuffed by Washington when he pushed for a stronger military response to the attack, particularly during the critical hours that elapsed before the last two Americans were killed at the consulate.  Hicks testified that a four-man team of military special operations forces was in Tripoli, geared up and ready to drive to an aircraft to come to Benghazi, to help those still trapped in the consulate, when its commander was ordered to stop by his superiors (at Special Operations Command Africa).  Hicks said the commander told him: “I have never been so embarrassed in my life that a State Department officer has bigger balls than somebody in the military.”  Taken together, the testimony of the three State Department officials reveals that the B.O. regime, including both the president and former Secretary of State Hillary Clinton, was aware in the first hour that it was a terrorist attack, knew it was no spontaneous protest caused by a video, refused to send what help it could, and then deliberately lied about it.  (See “Hillary Lied, Four Died in Benghazi,” I.B.D., April 25; “Whistle-Blower: We Saw Them Die,” I.B.D., May 7; and “Whistle-Blowers: Yes, `It Matters,’” I.B.D., May 9.)

    Even more explosive than his account of the night of the attack was Mr. Hicks’ testimony about the B.O. regime’s subsequent efforts to cover it up.  Hicks testified that he was appalled – that it was “jaw dropping” to him – when he heard the regime’s narrative that the attack resulted from a spontaneous demonstration prompted by an anti-Mohammed video posted on YouTube, the narrative that U.N. Ambassador Susan Rice repeated on five Sunday talk shows.  He also said that Beth Jones, acting assistant secretary of state for Near Eastern affairs, dressed him down shortly after he criticized the regime’s narrative; she also ordered him not to talk to congressional investigators.  After he did talk with them, he received an angry reprimand from Cheryl Mills, chief of staff to former Secretary of State Hillary Clinton (“The Intimidation of Gregory Hicks,” I.B.D., May 10).  And as the congressional hearings have revealed, the infamous Benghazi “talking points” – essentially the script that regime officials used to push their false narrative about the YouTube video – were rewritten at least a dozen times, removing all references to al-Qaeda and prior attacks (references that were contained in the original CIA text), at the direction of the secretary of state’s office (“A Lie Hillary Agreed To Peddle,” I.B.D., May 13).

    Despite all that congressional investigators have learned thus far, several critical questions still remain unanswered.  Why did the B.O. regime begin the false narrative that the video sparked the attack?  Who changed the talking points, and why did they want to mislead the American people?  What U.S. military assets were readily available within striking distance of Benghazi after the attack had begun?  Why were they not made ready, or if they were ready, who gave the order to “stand down” – and why?  What intelligence was available before the attack, and how specific was that intelligence?  Embassy personnel had been requesting additional security for months; why were those requests denied?  What was Ambassador Stevens doing in Benghazi?  What happened to the terrorists who were captured by an American team inside the CIA Annex and then turned over to a Libyan rescue team?  Who, if anyone, is being held accountable for the attack?  What efforts are being made to identify and apprehend the terrorist attackers?  (Meanwhile, the producer of the video is still sitting in jail, arrested on an unrelated parole violation.)  The FBI interviewed survivors of the attack on Sept. 15 and 16; why has the B.O. regime denied access to those survivors, and when will that information be made available to Congress and the public?  And finally, what was B.O. doing during the critical hours of the attack?  As Senator Ron Johnson (R.–Wis.) noted in an op-ed earlier this spring, “A complete timeline of what the president was doing that night, like we saw during the operation against Osama bin Laden, has never been provided.  Did he ever assemble a team in the situation room to monitor events?  Or did he simply check out, rest up, and then fly to a campaign fundraising event in Las Vegas only a few hours after those brave Americans had died?” (“5 Months After Benghazi, Justice Remains Elusive,” I.B.D., March 4).

    Democrats have asserted that congressional investigations into the Benghazi tragedy/fiasco, especially in the Republican-majority U.S. House of Representatives, are a “witch hunt.”  In a sense, they’re right – because the witch (or maybe another word, one that rhymes with witch, is more appropriate) is named Hillary Rodham Clinton.  Contrary to the campaign ads she ran (opposing B.O. in the Democrat primaries in 2008), claiming that she’d be the one to call when there’s an international emergency at 3:00 a.m., Mrs. Clinton seems to have failed spectacularly in her job as Secretary of State.  She ignored requests for additional security at the consulate, in response to intelligence showing Muslim terrorist threats, and she neglected to act to help save the lives of American personnel during the critical hours between the initial attack and the time when four Americans, including the U.S. ambassador, were killed.  Worse, she actively participated in the B.O. regime’s false story that the Benghazi attack was motivated by spontaneous outrage at an anti-Mohammed video posted on the Internet – a story providing political cover for B.O.’s foreign-policy failures, a story that she knew was false, even as she looked the father of one of the dead Americans in the eye and promised him that the government would apprehend the video maker.  As I noted in the section on “2013 Politics,” above, Hillary’s malfeasance in connection with the Benghazi matter, by itself, ought to disqualify her from ever again holding any high political office. 

    In response to the infamous rhetorical question that she asked during her congressional testimony, when a visibly angry (or exasperated) Mrs. Clinton almost shouted, “What difference, at this point, does it make?” we should respond:  Mrs. Clinton, it makes a great deal of difference to the families of Ambassador Stevens and the other murdered Americans – and to the American people, who deserve to know the full truth about the murders, including the answers to all the “Who? What? and Why?” questions noted above. 

    The “ghosts of Benghazi” will continue to haunt B.O. and his regime (including his former secretary of state) until the truth finally and fully comes out.

      

     

    n Assad, Abbas, Hamas – What the Hezbollah! 

                The Middle East today is a powder keg, primed and ready to explode into another major war during the next few years, while B.O. continues to occupy the White House and U.S. foreign policy seems feckless in dealing with the global threat of militant Islamic extremism.  (Indeed, as I briefly discussed at the beginning of last fall’s “Tricks and Treats” essay (Oct. 25, 2012), B.O.’s support for the so-called “Arab spring” seems to have resulted in more militant Muslim groups seizing control in such countries as Egypt and Libya.)   

    The only question is what kind of war it will be:  Will the Muslim regimes in the region again unite in an attack on Israel, as the leaders of Iran have been threatening to do for years?  (In addition to continuing to develop nuclear weapons and long-range missiles that would deliver them to Israel, the Iranians are also giving support to the terrorist organizations Hamas and Hezbollah, in their bases in Gaza and Lebanon, respectively – groups which have pledged to destroy Israel and replace it with a Muslim Palestinian state.)  Or will the civil war currently raging in Syria, between the brutal regime of dictator Bashar Assad and the Syrian rebels, spread into other neighboring nations, leading to a regional war between the two major strands of Islam – Sunnis and Shiites?  As a recent news article in USA Today reports, Sunni Muslims are the majority in Syria, and Sunnis – including the royal families of the Persian Gulf oil sheikdoms as well as the leadership of al-Qaeda – have sent arms and troops to support the Syrian rebels; while Assad has appealed to the Shiites, and has received military support from the Shiite theocracy of Iran as well as its associated terror group Hezbollah in Lebanon.  Already the fighting has bled across Syria’s border with Lebanon, and rockets have been fired between Syria and Turkey and Iraq (“Syria’s wounds bleed into other nations,” May 30). 

    So far B.O. has limited U.S. involvement in the Syrian civil war to non-weapons aid (mostly food and medical supplies) to the rebels.  But will he get the United States involved in yet another Mideast war?  Earlier he had declared that Syria’s use of chemical weapons against its people would cross a “red line,” and in late April White House press secretary Jay Carney told reporters that B.O. “retains all options to respond” to Syria, including military strikes on the country.  Whether or not intervening militarily is wise (and I think it’s clear that the American people will not tolerate yet another Mideast war in which U.S. servicemen and women are killed and maimed for no well-defined purpose), has B.O.’s announced “red line” boxed the U.S. into a position where failure to intervene might cause us to further lose world standing? (“Taking Away U.S. Credibility,” I.B.D., April 30). 

    Apart from our alliance with Israel, the United States no longer has any vital national interests at stake in the Middle East.   Thanks to the energy “revolution” – technological advances that have made it now feasible to extract the abundant fossil-fuel resources in North America – the United States is now on the verge of being self-sufficient, no longer dependent on Mideast oil for its energy needs (even in spite of B.O.’s war on carbon energy, as I’ve discussed in Part I of this essay).  As Victor Davis Hanson has maintained in a provocative recent op-ed, “the Middle East is becoming irrelevant” – once again, a global backwater, as it was for centuries before the discovery and exploitation of its vast oil reserves.  He adds that B.O. “senses there is no support for U.S. intervention in the Middle East,” and that even his idea of “leading from behind” in Libya “led to the loss of American personnel in Benghazi.”  Hanson predicts that after Iraq, “the U.S. will not nation-build in Syria.  Apparently, Americans would rather  be hated for doing nothing than be despised for spending trillions of dollars and thousands of lives to build Middle East societies” (“The Monotonous Middle East,” I.B.D., May 3).  I hope he’s right.   

      

     

    n The Korean War II? 

                In this year’s “Spring Briefs” essay (March 23), I wrote about the “Mischievous Young `Un,” North Korea’s 30-year-old communist dictator, Kim Jong Un (the son and grandson of dead communist dictators Kim Jong Il and Kim Il-sung, respectively), and the threat his lawless regime poses to world peace.  Angrily reacting to the U.N. Security Council’s recent unanimous decision to tighten sanctions against his rogue country, Kim’s regime announced that it was nullifying all nonaggression agreements with South Korea, declaring invalid the armistice agreement that ended the Korean War in 1953.  So, as far as North Korean rhetoric is concerned, the chapter of the Cold War known as the Korean War has again heated up.  Is it Korean War II? 

    As I also noted in March, it’s not just South Koreans who are nervous.  “As Pyongyang marches relentlessly toward deliverable nuclear weapons and the long-range missiles to carry them, one top North Korean general has claimed that his country has nuclear-tipped intercontinental ballistic missiles (which could reach the United States, not only Alaska but also the Northwest coast) ready to blast off.”  The editors of Investor’s Business Daily warned that the three-stage missile North Korea launched last December also orbited a “package,” which experts say could be a test to orbit a nuclear weapon that then could be de-orbited on command anywhere over the U.S. and exploded at a high altitude, releasing an electromagnetic pulse (EMP).  That would fry electronic circuitry and collapse the electric power grid and other critical infrastructures – communication, transportation, banking and finance, food and water – that sustain modern civilization and the lives of 300 million Americans (“Can North Korea Destroy U.S.?” April 5). 

    I concluded my discussion in March by observing that even B.O., who has been a critic of missile defense since long before his 2008 campaign (and who has been very cooperative with Russia in scrapping missile defense in eastern Europe), finally sees the merit in having some missile defense to protect the U.S. homeland from rogue regimes like Kim’s in North Korea.  “I can tell you that the United States is fully capable of defending against any North Korean ballistic missile attack,” White House spokesman Jay Carney has said.  Given all the lies that Carney has recently told on behalf of B.O. (especially with regard to the “trifecta” of scandals involving Benghazi, the IRS, and the Justice Department), let’s hope that this time he’s telling the truth.

      

     

    n The EU: eeyeou! (Lessons from Mrs. Thatcher) 

                As Greece and several other European countries wrestle with fiscal crises generated by their mushrooming national debts (largely due to out-of-control national government “welfare-state” spending), many astute observers have begun questioning whether the much-vaunted European Union (EU) was a good idea.  That’s especially so when EU authorities insist that the solution to member nations’ bankruptcy problems is a policy of “austerity,” defined more in terms of tax increases than cuts in government spending. 

    One leader who questioned the value of the EU was Britain’s former prime minister Margaret Thatcher.  Mrs. Thatcher, who completed her life on April 8 (at age 87), was the first woman to become prime minister of Britain and the first to lead a major Western power in modern times.  But she was best known as the “Iron Lady” who reversed Britain’s economic decline by dismantling socialism and substituting free-market policies.  When her Conservative Party came to power in May 1979, the British economy was in shambles, with inflation running at 20%, ballooning budget deficits, and the highest unemployment rate since the Great Depression.  Thatcher’s solution was to promote the free-market views of Milton Friedman and Friedrich Hayek, emphasizing the privatization of the inefficient state-owned enterprises that had caused the nation’s economic stagnation.  “Privatization was fundamental to improving Britain’s economic performance,” she wrote in her memoirs.  But it was more than that: it was “one of the central means of reversing the corrosive and corrupting effects of socialism.”  (In another one of my favorite quotations, Thatcher said, “The problem with socialism is that you eventually run out of other people’s money.”)  And so she privatized the big, state-owned industries that were losing money, including utilities, British Airways, shipping, railways, and auto manufacturing; standing up to the powerful mineworkers union in 1984-85, she also broke its stranglehold over the state-run coal industry (the mines themselves were privatized under her successor).  (Comparing Thatcher to her good friend and ally, her partner in freedom and free markets, President Ronald Reagan, the editors of Investor’s Business Daily note that her government’s victory over trade union might – epitomized by the miners’ surrender in 1985 after more than a year of illegal striking – made Reagan’s 1981 firing of PATCO air traffic controllers “look like small ball” (“Margaret Thatcher’s Legacy of Leadership,” April 9).) 

    Thatcher’s government also brought inflation under control by lowering taxes and cutting spending – essentially the same “supply-side,” pro-growth economic policies that President Reagan pursued in the United States.  As she told her ministers, “Gentlemen, if we don’t cut spending we will be bankrupt.  Yes, the medicine is harsh, but the patient requires it in order to live. . . . We did not seek election and win in order to manage the decline of a great nation.”  By restoring economic prosperity and British national pride, Thatcher and her Conservative Party won a second election in 1983 and a third in 1987.  But the tough economic medicine that Thatcher administered made her unpopular (especially among leftists); and in her final year in office, she even faced a revolt among her own cabinet ministers, in part over the question of further integration with Europe.  Thatcher resisted the loss of British autonomy in a European Union, becoming what some people have called a “Euro-skeptic.”  “We have not successfully rolled back the frontiers of the state in Britain, only to see them reimposed at a European level, with a European super-state exercising a new dominance from Brussels,” Thatcher declared.  In response to the then-president of the European Commission, Jacques Delors – who had proposed that the European Parliament become the popular assembly of the EU, with the Commission to be the executive and the Council of Ministers to be the senate – she famously shouted, “No! No! No!” in the House of Commons. 

    Thatcher’s nearly 12 years in power – from May 1979 until November 1990 – made her the longest-serving British prime minister since the early 19th century (since the Earl of Liverpool, who governed from 1812 to 1827).  Yet even her political enemies have to acknowledge that Thatcher turned the British economy around; hence, when Tony Blair’s Labour Party came back to power, it moderated its socialism because of the economic lessons that Thatcher taught – lessons that, unfortunately, have not been learned by the socialist politicians who control the EU. 

    The “austerity” policies that EU authorities have imposed on Greece as a condition for its fiscal bailout are virtually the opposite of the recipe for economic success that Mrs. Thatcher’s legacy teaches.  They’re anti-growth, not pro-growth.  Thankfully, another European country provides a model for a better way forward: Estonia. 

    Estonia “took its medicine” as soon as the global financial crisis broke.  It cut government spending dramatically relative to its pre-crisis level – by 2.8% in 2009 and 9.5% in 2010 – and is now one of Europe’s fastest-growing economies.  “Moreover, Estonia’s central bank refused to prop up banks that shipwrecked on the rocks of a real estate bubble.”  Its economic recovery has been impressive: unemployment is now below the eurozone average, and by 2012 the country had made up its total economic losses (Matthew Melchiorre, “U.S. Should Look to Estonia, Where Austerity Is a Success,” I.B.D., April 25). 

    If Europe won’t follow Mrs. Thatcher’s words of caution about misguided EU economic policies, maybe it will follow the present-day model of Estonia.  Maybe U.S. policymakers should follow the Estonian example, too.

     

      

    n National Service: Bah, Humbug! 

                From time to time certain political commentators (almost always from the left side of the political spectrum) call for some form of compulsory “national service.”   For example, DeWayne Wickham, a leftist political commentator for USA Today, in a recent column (published the day after Memorial Day) bemoaned the fact that “since the start of the post-9/11 wars [in Iraq and Afghanistan], less than one-half of 1% of the population has been in the armed services.”  He contrasted that figure with military service in World War II – when there was a national draft – when nearly 9% of Americans served in the military.  He also noted a 2011 Pentagon survey which found that 57% of the servicemen and women on active duty were the children of current or former members of the military, adding that “the composition of today’s armed forces looks more like a family business than a military force that is drawn widely from the nation’s population,” as he claimed it was during the era of national military conscription (between World War II and 1973, when the national military draft ended, during Nixon’s presidency – although compulsory registration for the draft has continued since Jimmy Carter’s presidency).  “The ending of conscription has narrowed the population base from which this nation’s military is drawn and made military service a less democratic ideal.” 

    Further bemoaning that “patriotism” has become “for too many people . . . [simply] wearing a lapel pin, hanging a flag on their front porch, or covering their heart during the singing of [the national anthem],” Wickham urges that the United States “should demand more of its citizens”: 

    “It should require all able-bodied Americans to perform some sort of national service.  If not in the military, then in a national organization such as the Corporation for National & Community Service, a federal agency that uses volunteers to provide services to the poor, the elderly, and disaster victims.”

     

    Only with such compulsory volunteerism – a truly Orwellian concept, a contradiction in terms – would America have meaningful, and “democratic,” patriotism, Wickham asserts (“Reinvigorate patriotism with national service,” USA Today, May 28). 

    Another leftist who has called for compulsory national service, in the form of reinstating the military draft, is Charlie Rangel, the Democratic congressman from New York, who had disgraced himself because of his political corruption (taking bribes and other violations of House ethics rules, for which he was censured but not ousted from Congress).  Rep. Rangel earlier this year – after the Pentagon announced that it would end the ban on women serving in combat – called for a reinstatement of the national military draft, for both women and men.  The bill he has proposed, the “Uniform National Service Act,” would require all men and women between the ages of 18 and 25 to give two years of service “in any capacity that promotes our national defense.”  Like Wickham, Rangel bemoans the fact that “the burden of defending our nation is carried by less than 1 percent of the population,” arguing that reinstating the draft is the only way to ensure “a more equal military.”  (What Rangel didn’t argue explicitly in his announcement was the notion that in the current all-volunteer armed forces, the “burden” of military service falls disproportionately on members of “minority” groups, black and Hispanic Americans – a bit of race-baiting demagoguery that no doubt helped him get reelected, in his overwhelmingly black Harlem district, in 2012, notwithstanding his ethics problems.) 

    Rangel is partly right.  As I maintained in my “Spring Briefs” essay (March 23), in the section that I provocatively titled “Draft Them Gals!,” now that the Pentagon has approved women for combat roles in the military, there is no rational basis for the government to require only men over the age of 18 to register for the draft; the Supreme Court, if faced with the question, ought to hold that limiting draft registration to males is a violation of the constitutional principle of equal protection of the laws.  

    However, the problem with the draft, constitutionally speaking, goes beyond its application only to males, as I also noted in the March essay.   Military conscription – and compulsory service to the government, in all its forms – is a direct violation of the Thirteenth Amendment, which prohibits not only slavery but also “involuntary servitude,” in the United States.  (Even Abraham Lincoln, who supported a national military draft law during the Civil War – before the Thirteenth Amendment was added to the Constitution – acknowledged that conscription was a form of involuntary servitude.  When faced with a constitutional challenge to the World War I draft law, the U.S. Supreme Court upheld the law, with hardly any justification; it certainly wasn’t the first or only time the Court has erred in its interpretation of the Constitution.) 

    Military conscription and all other forms of compulsory national service are not only unconstitutional but also, quite literally, un-American.  They are ideas whose time came – and went – centuries ago, in medieval England.  Bear with me, as I briefly discuss some English legal history here.  In Anglo-Saxon England all able-bodied men were expected to serve in the national militia, or fyrd, whenever called upon by the king to serve, which usually meant whenever the kingdom was at war.  This principle of universal allegiance (universal duty to serve the king, militarily) was continued after the Norman Conquest by William the Conqueror and his successors; King Henry II, in his Assize of Arms, ordered not only that all able-bodied free men provide military service to the king, but also that they equip themselves with the appropriate weaponry, according to their social status.  (Hence, long before the “right” to bear arms was affirmed, at least for Protestant subjects of the king, in the 1689 English Bill of Rights, there was – in a real sense – a duty to bear arms, in medieval England.) 

    By the early modern era, however, as medieval feudal society was transformed, economically speaking, into modern capitalist society, a similar revolution took place in the law – and especially in the scope of monarchical power.  The king did not have unlimited powers; his authority was constrained by the law – a principle affirmed in Magna Carta (1215) but made real by a series of revolutions in 17th-century England, particularly the English Revolution (or Civil War) of the 1640s and the “Glorious Revolution” of 1688-89.  These two events coincided with the founding of most of the original English colonies in America.  While in England, the aftermath of the Glorious Revolution meant that it was not the king but the Parliament that was “sovereign” (and which therefore held the ultimate political power in society), in America it meant that it was not government at all, but the people, who were “sovereign.” 

    The state and national governments of the United States of America were established during the intellectual movement known as the Enlightenment (a movement that, for the first time in human history, regarded all individuals, regardless of their station in life, as equally possessing fundamental rights as human beings); and they were founded on the political principles of another philosophical movement known as “liberalism” (in the classic sense, or “classical liberalism,” as modern libertarians call it).  Those principles, stated so eloquently in the Declaration of Independence, include the fundamental ideas that all human beings inherently (and naturally) hold certain individual rights, including the rights to life, liberty, and the pursuit of happiness; that government is established for the purpose of better “securing,” or safeguarding, these individual rights; and that the legitimate powers of government derive from the “consent of the governed.”  (The latter principle is often called “popular sovereignty,” but it does not mean that “the people” have unlimited power, as English kings claimed to have, centuries ago.  Politically, the people possess – and may grant to government – only those powers necessary to safeguard individual rights, powers that themselves must not violate individual rights.  That’s the fundamental constraint put on the power of government by the American constitutional system.) 

    Under the principles of America’s founding, the government exists for the sole legitimate purpose of protecting the fundamental rights of the individual.  For government to call upon individual citizens to sacrifice their liberty – and even their lives – in the name of “patriotism” or in the name of a universal duty to provide “community service” (that is, to compel individuals to serve some collective), is to betray the very principles on which this nation was founded.   True “patriotism” literally means “love of [one’s] country”; in a nation explicitly founded on the primacy of individual rights, it means that no individual can be called upon to sacrifice his life or liberty for the sake of others – and that no one who truly loves America and the ideals on which it was founded would call for such a sacrifice.  And for those who regard service to others (whether to one’s family, one’s local community, or one’s state or national government) as a moral ideal, it must be remembered that the essence of morality is free will, or voluntary choice.  Compulsory service is not “moral” – just as compulsory “charity” is not moral, either.  (Indeed, as many conservatives have argued, the modern “welfare state,” by compulsory redistribution of wealth from some members of society to others, has in fact undermined private charity and its moral foundations.) 

    On Memorial Day (and again on Veterans Day, in November), Americans appropriately honor those exceptional individuals who, by volunteering to serve in the U.S. military, have not only shown their “patriotism” (in the true sense of the word, by showing how much they love, or value, their country) but also have truly been of “service” to their fellow Americans (in the moral sense), helping to preserve all Americans’ freedoms.  That’s why we should honor veterans: they chose to serve in the military, rather than being forced to follow some unconstitutional (and misguided) law.

      

     

    n “Marriage, American Style” Revisited 

                Nine years ago I posted my essay “Marriage, American Style” (May 19, 2004), giving truly “liberal” (that is, classical liberal or libertarian) arguments for legal recognition of same-sex marriage.  (Note that I’m using the more accurate term same-sex marriage – not “gay marriage” – because no one really ought to oppose the latter.  All marriages ideally should be “gay,” in the historic sense of the term as “happy,” in other words, that they’re emotionally rewarding for the married couple.)  I discussed why it would be consistent with the American constitutional tradition (with its emphasis on the sovereignty of the individual) to redefine marriage to include same-sex relationships – and why such a reform would be truly “progressive,” in the proper sense of the word.  (As Thomas Jefferson wrote, “Laws and institutions must go hand in hand with the progress of the human mind.  As that becomes more developed, more enlightened, as new discoveries are made, new truths disclosed, and manners and opinions change with the change of circumstances, institutions must advance also, and keep pace with the times.”)  

                In the nine years since I posted that essay, much progress (true “progress,” as Thomas Jefferson used the word) has been made, both in the United States and around the world.  In 2004 only one U.S. state recognized same-sex marriages, Massachusetts, as the result of a controversial state supreme court decision that conservatives decried as “judicial activism.”  And the United States was only the fourth nation in the world to recognize same-sex marriages.  In the United States today, 12 states and the District of Columbia have legalized same-sex marriages.  And a total of 14 nations have similarly done so – most recently, France (despite its being a heavily Roman Catholic country).   

    Meanwhile, the U.S. Supreme Court will soon decide two important cases relating to same-sex marriage.  In the case involving a constitutional challenge to California’s Proposition 8, a ballot initiative banning recognition of same-sex marriage in the state, I predict the Court will uphold the initiative, viewing the issue of same-sex marriage as a policy question that the people of each state ought to decide for themselves.  (Despite some persuasive arguments by libertarians and even some conservatives maintaining that same-sex marriage is a fundamental right that states cannot deny, without running afoul of the Fourteenth Amendment’s due process and/or equal protection clauses, I think the conservative majority on the Court will uphold federalism as the leading constitutional principle.)  In the other major case, involving a challenge to the federal Defense of Marriage Act (DOMA), which prohibited the U.S. government from recognizing same-sex marriages, even in states where they had been legalized, I predict that the Court will hold for the challengers and strike down the federal law.  Why?  The basic answer, again, is federalism:  the definition of marriage is a matter that the Tenth Amendment reserves to the states.  Legal recognition of same-sex marriage ought to come not from the judiciary (critics would continue assailing so-called “activist” judges who attempt to “legislate” from the bench), but from the people of the state (or at least their representatives in the state legislatures), as they become “more enlightened.”   

    True progress cannot happen overnight, nor can it be imposed by the government on the people. 

      

     

    n Aborting Extremism 

                    On most issues concerning individual rights, I agree with Barry Goldwater: "Extremism in the defense of liberty is no vice, and moderation in the pursuit of justice is no virtue."  What many people (usually those who self-identify as “moderates” or “centrists”) condemn as “extremist,” I regard as a principled defense of freedom, in all its aspects:  for example, I fully support First Amendment freedom of speech, the Second Amendment right to keep and bear arms, and economic freedom and property rights – and hence I oppose, respectively, all legal restrictions on campaign financing or on so-called “indecent” speech, all gun-control measures (including background checks), and virtually all legal restrictions on business owners (including minimum-wage laws, compulsory unionization laws, smoking bans, antitrust laws, even zoning requirements). 

    There is one issue, however, on which I take a position that might be called “moderate”; that issue is in today’s politics the other important “social” issue (besides same-sex marriage) – abortion (and specifically, legal restrictions on a pregnant woman’s freedom to obtain an abortion).  The abortion issue is unique because it does not involve a clear dichotomy between individual rights on the one hand and government controls on the other.  Rather, it involves an unavoidable conflict between two kinds of individual rights, both equally fundamental (the pregnant woman’s right to liberty and self-ownership and the unborn child’s right to life).  Because of this conflict, abortion is the one issue where a “moderate,” or “centrist” position, rather than being a compromise with evil (as it is on most other issues), is the only rational solution – and where the “extremist” positions, on both sides of the issue, are truly irrational “vices.”  

                    What is the moderate, or centrist, position?  It turns on the point in pregnancy when a fetus becomes viable – that is, when it is capable of living outside the mother’s body.  By that point, historically (as a matter of customary Anglo-American common law, dating back centuries), the baby is recognized under the law as a human being holding the fundamental right to life.  Prior to that point, in the early stages of pregnancy, the mother’s right to liberty (her ownership of her body) is paramount, for the fetus is only a potential human life.  Thus, any legal restrictions on a woman’s freedom to obtain an abortion in the early stages of pregnancy, for any reason, are infringements of her fundamental rights.  After viability, of course, the baby is not merely potential human life but actually a human being; its killing, through so-called “late-term” abortion procedures, is indeed murder and thus ought to be prohibited by the law.  This balancing of fundamental rights is what the Supreme Court tried to do in Roe v. Wade (1973); it makes sense under the law and also fits with majority public opinion (although the precise legal definition of when human life begins, under our federal constitutional system, is a matter reserved by the Tenth Amendment for the states).

    Since the 2012 election (when Mitt Romney’s negative comments about Planned Parenthood and its use of taxpayer funds to provide abortions might have alienated some female voters, as the left-liberal political narrative about the Republicans’ alleged “war on women” claimed), the leftist, “pro-choice” news media have been publicizing horror stories about “extreme” anti-abortion laws passed by legislatures and signed into law in “red” states (such as a newly proposed North Dakota law defining human life as beginning at conception, or an Arkansas law – currently being challenged in court – that bans most abortions after 12 weeks of pregnancy).  But the anti-abortion, “pro-life” movement may have received a boost, in public opinion, after the media finally started to cover the recent criminal case involving Dr. Kermit Gosnell, the Pennsylvania abortion doctor who performed late-term abortions at his run-down clinic in west Philadelphia.  Earlier this month, Dr. Gosnell was convicted of three counts of first-degree murder, for killing three viable babies (born alive), and one count of involuntary manslaughter, for allowing a pregnant woman to die following an abortion.  He escaped the death penalty by agreeing to waive his right to appeal his murder convictions; he will serve life in prison without parole. 

    Even the most fervent “pro-choice” activists were embarrassed by Dr. Gosnell’s case.  The atrocities revealed during his trial were truly horrendous.  Although a spokesperson for NARAL Pro-Choice America tried to argue that “peeking into Gosnell’s clinic is like peeking into U.S. history pre-Roe v. Wade,” most pundits agreed with Fox News analyst Kirsten Powers (a left-liberal) that Gosnell’s atrocities were no “aberration” and that they raise troubling questions about abortion clinics elsewhere across America.  Polls taken after the Gosnell case show a shift in public views, away from support of abortion (especially the type of late-term abortions that Dr. Gosnell performed).  The one thing on which pundits have agreed is that the case has changed the debate over abortion.  I predict that the “moderate,” or “centrist,” view will eventually win.

      

     

    n The Rebirth of Freedom of Labor  

                     Membership in labor unions plummeted in 2012 to its lowest level since the 1930s, according to government figures released by the Bureau of Labor Statistics earlier this year.  As John Hinderaker reported on the Power Line blog, overall union membership declined from 11.8 percent to 11.3 percent of the workforce, or 14.4 million workers (a decline of about 400,000).  More than half the loss, about 234,000 members, came from government workers including teachers, firefighters, and public administrators.  “But unions also saw losses in the private sector even as the national economy created 1.8 million new jobs in 2012,” the AP wire story reported.  “That membership rate fell from 6.9 percent to 6.6 percent, a troubling sign for the future of organized labor, as job growth generally has taken place at nonunion companies.”  Hinderaker adds that it’s not coincidental:  “Unions kill jobs, and eventually the job losses take their toll on union membership.  One is left wondering why such a spent movement is able to play a disproportionate role in our political life” (“Unions’ Decline Continued in 2012,” January 23).  

    As I’ve previously noted here on MayerBlog, the decline of “Big Labor” in the United States – both in terms of membership and its political clout – is good news for American workers and taxpayers.  (See my essay “Summer 2012 in Review” (Sept. 12, 2012), the sections on “Those Damn Labor Unions!” and “Victories in Wisconsin and Indiana.”)  

    Labor unions may have served an important purpose, in protecting the interests of workers in the private sector, in the early 20th century (at a time when American courts, following English common-law precedents, regarded most labor unions as criminal organizations that illegally restrained trade).  But since the New Deal era, when Congress unconstitutionally exercised its power to “regulate commerce among the states” by passing the first federal labor laws, the government has artificially protected labor unions, making them the favored creature of the law, giving them an unfair advantage in negotiating with employers – and depriving individual workers who chose not to join a labor union of their fundamental rights.  (Earlier in the 20th century, at a time when the U.S. Supreme Court protected “liberty of contract” as a fundamental constitutional right – see my book Liberty of Contract: Rediscovering a Lost Constitutional Right (2011), discussed in my essay “Rediscovering Liberty of Contract” (Jan. 14, 2011) – workers were not required to join labor unions as a condition of their employment.  But the federal labor laws enacted by Congress in the 1930s, 1940s, and 1950s not only mandated many union-contract standards – including minimum wages and compulsory overtime pay – in most employment contracts but also gave unions unfair advantages in organizing and bargaining with employers.  Among other things, federal labor laws provide that a majority of workers, by choosing to bargain collectively through a labor union, may bind the minority who choose not to join the union.  One big exception in federal labor laws, discussed below, is the provision permitting states to enact “right-to-work” laws that allow workers to opt out of joining a labor union.) 

    Especially since the end of World War II – the time when labor union membership was at its peak in U.S. history – labor unions have contributed to the economic decline of the United States, particularly in manufacturing industries (consider especially the U.S. auto industry), chiefly through the inflation of labor costs (not just salaries but also health-care and pension benefits, the costs of which have been exploding in recent years).  And public-sector unions (that is, the unionization of government workers, including teachers, fire fighters, and police officers) have practically bankrupted many states and local governments (again largely because of the exorbitant, hemorrhaging costs of health care and pensions, as well as “sweetheart” contracts that provide for workers to contribute only a small fraction of their salaries to these benefit plans).  The only jobs that labor unions protect are union jobs – resulting in excessive labor costs that percolate throughout the economy, which in turn results in fewer jobs for the majority of Americans who are not union members as well as higher taxes for local, state, and federal taxpayers (who are footing the bill for the benefits provided unionized government workers).   

    Little wonder, then, that state governments – generally, states with Republican majorities in their legislatures as well as Republican governors (because labor unions, and especially government-worker unions, remain core constituencies of the Democratic Party) – have begun to reform their labor laws, particularly regarding government-worker unions.  As I noted in my September 2012 essay, it was a huge victory – again for both taxpayers and for workers – when Wisconsin (the birthplace of “progressivism,” the first state to allow government workers to unionize) in 2011 enacted a much-needed bill to reform government-worker unions which, among other things, raises state employees’ share of pension and health costs, limits union negotiations of wages, and prohibits unions from automatically deducting union dues from workers’ paychecks – all common-sense (and actually rather modest) reforms that were sorely needed in order to help reduce the skyrocketing costs of government workers’ pensions and health-care plans.  (For more on the Wisconsin reforms, see my 2011 “Spring Briefs” essay (Mar. 18, 2011) – and scroll down to the entry titled “Real Cheeseheads: Wisconsin Union Thugs and Their Democrat Allies”).  Efforts by Big Labor and the Democratic Party to overturn the law in the courts failed – as did their effort to recall Governor Scott Walker.   

    The success in Wisconsin has inspired other states – especially in the Great Lakes region, part of the nation’s “Rust Belt” where manufacturing industries have been in steep decline due to rising labor costs – to follow with their own reforms.  Indiana, for example, became a “right to work” state – one that allows workers the freedom to decide for themselves whether or not to join a labor union.  (Indiana was the 23rd state to pass a “right to work” law.  Most states with such laws are in the South, and they have been experiencing economic growth despite the national recession.  As reported by economist Richard Vedder in a 2010 Cato Journal article, data show a 23% higher rate of per capita growth in right-to-work states.  And according to a recent report by the U.S. Census Bureau, between April 1, 2010 and July 1, 2012 a net total of nearly 809,000 Americans moved into one of the then-22 right-to-work states from elsewhere in the U.S. – a continuation of the massive exodus of employees and their families from forced-unionism states that the Census Bureau has documented ever since it began tracking state-to-state domestic migration during the 1990s.)  

    Perhaps the most striking development in labor law at the state level has occurred in my original home state, Michigan.  As I reported in my “Election 2012 Postmortem” (Nov. 10, 2012), Michigan voters on Election Day 2012 rejected two proposed constitutional amendments that were pushed by Big Labor: Proposal 2 would have guaranteed collective bargaining, prohibiting the legislature from passing any new laws that would have restricted unionization, such as a “right-to-work” law; and Proposal 4 would have mandated minimum wages and unionization of home health-care workers.   

    In the wake of these major defeats for Big Labor, Republican Governor Rick Snyder (a moderate Republican businessman) and the Republican-controlled legislature passed a law in mid-December making Michigan the 24th “right to work” state.  Again, “right-to-work” laws give workers the right not to join unions and prohibit the coercive collection of dues from those not choosing to join.  The time was right to break a long-standing tacit truce in Michigan politics on union rules: Michigan had the nation’s sixth-highest state jobless rate at 9.1%, and it had one of the lowest rates of personal income growth between 1977 and 2011.  As the editors of the Wall Street Journal put it, “it could be the best thing to happen to Michigan’s economy since the internal combustion engine” (“Worker Liberation in Michigan,” Dec. 11, 2012).  Meanwhile, Indiana now has a record number of businesses choosing to expand or set up in the state (220 companies, including Amazon and Toyota, which will invest $3.6 billion and create some 21,000 new jobs, according to the Indiana Economic Development Corporation).   

    The success of neighboring states in passing right-to-work laws has renewed the political pressure on the Republican-majority legislature (and Republican Governor John Kasich) here in Ohio to follow suit.  The Buckeye Institute for Public Policy Solutions, the state’s free-market think tank, has been trying to educate Ohio politicians about the economic benefits that a “right-to-work” law would bring to all Ohioans.  (See “Ohio Right-to-Work: How the Economic Freedom of Workers Enhances Prosperity,” a March 2012 study posted on the Buckeye Institute website.)  Not surprisingly, in response, Big Labor has already begun its campaign of misinformation, with TV ads and billboards ludicrously claiming that right-to-work, or workplace freedom, is “Communist”!  

    Finally, there are other ways in which unions limit all Americans’ freedom – and stand in the way of a dynamic, prosperous economy.  Among them are minimum-wage laws, which unions typically support (even though their members, whose labor is priced well above the minimum, do not directly benefit), because they limit competition in the labor market and thus have the economic effect of keeping labor costs inflated (which does benefit union members).  Democrat politicians, eager to do the bidding of their labor-union allies, keep pushing for increases in the minimum wage, at both the state and federal levels; B.O. in his “State of the Union” address earlier this year called for an increase in the federal minimum wage from $7.25 to $9.00 an hour.  That’s a truly asinine idea, as I discussed in “Spring Briefs 2013” (March 23), the section on “Minimum Wage, Maximum Folly,” as well as my earlier essay entitled “Minimum Wage, Maximum Folly” (Oct. 20, 2006).  

    The problem with minimum-wage laws, from the perspective of individual rights, is that they deprive workers whose labor is not worth the minimum price arbitrarily set by government (usually younger, less skilled workers) of their freedom to work – in other words, their freedom to enter into a contract with a willing employer at a price less than the minimum set by government.  Such laws thus abridge a fundamental right: the right to earn a living.  And from a utilitarian, or policy, perspective, such laws hurt most those individuals they are supposed to help:  younger, unskilled workers (who also are often members of minority racial groups, which is why the black free-market economist Walter Williams condemned minimum-wage laws in his classic book, The State Against Blacks (1982)). 

    Fortunately, B.O.’s proposed increase in the federal minimum wage – like so many other initiatives he has proposed – is doomed to fail in the Congress, not just because of Republican opposition.  It’s also doomed to fail, like his other policies, because it defies economic reality.

      

     

    n Finally Getting Up to Speed in Ohio 

                Earlier this spring Ohio lawmakers passed a measure to increase speed limits on many miles of interstate highways in the state to 70 mph.  The new speed limits, which will go into effect in July, apply mostly to rural areas; speed limits on interstate highways in urban areas – such as I-270, the beltway around Columbus (the highway I drive most frequently) – will remain at 65 (or even 55 in some areas). 

                It’s about time.  Most of Ohio’s neighboring states (including Michigan, Indiana, and West Virginia) raised speed limits on many interstate highways to 70 mph several years ago.  I can’t even say that “finally Ohio’s entering the 21st century,” because the last time speed limits in Ohio were set at 70 mph was 1963.  Then, like nearly all other states, Ohio was coerced by the federal government (which threatened to withhold federal funding for highways) into reducing speed limits to 55 mph during the “energy crisis” of the 1970s (a crisis created by bad federal government policies), supposedly to conserve fuel.  Not until the Reagan era, in the late 1980s, did 65 mph speed limits reappear on Ohio highways.  (That’s not progress but regress:  instead of traffic moving faster and more efficiently, it’s back to where it was in the 1960s!) 

                This modest change in highway speed limits raises some questions.  First, why was Ohio slower to act than its neighboring states?  The answer seems quite clear to me: it’s because Ohio’s not only centrally located (in the U.S. east of the Mississippi); it’s also centered on the political spectrum – a state where both major political parties are boringly “moderate,” in the mushy middle, with little difference between Democrats and Republicans and little desire on the part of politicians in either party to challenge the statist status quo, the “Nanny State.” 

                That unfortunate reality of Ohio politics also suggests the prospects are rather bleak for real reform in other areas.  Ohio’s not likely to be a trend-setter but rather a follower.  Still, on what other issues might there be prospects for reform?   One, as suggested in the previous section, is labor-law reform, particularly the adoption of right-to-work laws, another issue on which Ohio’s neighboring states have taken the lead.  To compete effectively with Michigan and Indiana in attracting businesses to Ohio, the state’s lawmakers may have to act, sooner rather than later. 

                Legal recognition of same-sex marriage is another subject of possible reform, even though Ohio is one of many states that have constitutionally limited the definition of marriage to a man and a woman.  It’s significant that U.S. Senator Rob Portman, a conservative Republican, now supports same-sex marriage, a “change of heart” (in Portman’s words) after he learned that his son Will is gay.  So far it seems that Portman’s acceptance of same-sex marriage has not damaged him politically – perhaps because, as noted in the “Marriage, American Style” update above, other conservative Republicans have changed their position on the issue.  Ohioans in general may have had a change of heart:  according to a recent (March 2013) poll by The Columbus Dispatch, 54% of the adults surveyed favor a proposed amendment to the Ohio Constitution permitting two consenting adults to marry, regardless of their sex – a significant shift in Ohioans’ sentiments since voters overwhelmingly supported the 2004 ban on same-sex marriage.  FreedomOhio, the group advocating the proposed constitutional amendment, is encouraged and plans to put it on this year’s Nov. 5 ballot. 

                Another important issue where Ohio lags behind other states (including neighboring state – and rival – Michigan) is the liberalization of laws criminalizing marijuana.  As I noted in my “Election 2012 Postmortem” (Nov. 10, 2012), the trend at the state level is definitely in favor of adults’ right to use and possess marijuana (particularly for medical uses).  In the fall 2012 elections, proposals to legalize the recreational use of marijuana were approved by voters in two states (Colorado and Washington), and a proposal to legalize the medical use of marijuana was approved by voters in Massachusetts.  So far, 18 states and the District of Columbia have adopted medical marijuana laws, and two more – Illinois and New Hampshire – are expected to enact them within the next few months.  The group pushing for marijuana reform in the state, OhioRights.org, recently received from the Ohio Ballot Board the go-ahead to begin collecting signatures on petitions for a proposed constitutional amendment.  The amendment (named the Ohio Cannabis Rights Act) would allow people 18 or older with a “debilitating medical condition” to “use, possess, acquire, and produce” marijuana, under the regulation of an Ohio Commission of Cannabis Control; it also would legalize the growing of hemp – declassifying it as a drug (hemp plants are related to cannabis and, although a valuable crop in early American history, have been illegal under the modern “war on drugs”) and allowing it to be grown as a crop with oversight by the Ohio Department of Agriculture.  (How typical of “Nanny State” Ohio that something cannot be legalized without subjecting it to government regulation!)  The measure will likely be on the ballot in fall 2014. 

                There yet may be hope for Ohio, as it lurches forward into the 21st century.

      

     

    n Scouting for a Solution 

                In this year’s “Spring Briefs” essay (March 23) – in the section provocatively titled, “Do They Give a Merit Badge for Homophobia?” – I discussed the Boy Scouts of America (BSA) as the organization was considering a change in its policy barring openly “gay,” or homosexual, members.  (It was often described in media reports as a “long-standing policy,” but it had been in place only since 1991.)  At the time it appeared the Scouts were headed toward a compromise – ending the national ban but still permitting individual chartering groups (many of which are religious organizations) to set membership restrictions for themselves.  But when the Boy Scouts took a vote at the organization’s National Annual Meeting on May 23, they adopted a different compromise:  it ended the ban on gay youth, allowing them to participate in scouting, but retained the ban on gay adult Scout leaders.  (61% of the more than 1,400 voting members of the Scouts’ national council voted in favor of the proposal, to become effective January 1, 2014.)  In other words, an openly gay boy (presumably) would be welcome to join – or to stay – in the Scouts, but when he turns 18, he’ll have to cease his association with the group; and the gay male and lesbian parents of Boy Scouts won’t be able to help out their sons’ “pack.”  (The national council didn’t vote affirmatively to ban gay adults from participating, but their vote to rescind the ban on membership was limited only to kids.) 

    It’s “a half-step forward,” declared the editors of USA Today on May 24.  “Cheers should surely be muted because the Scouts couldn’t bring themselves to take the next obvious step and welcome gay leaders.  But . . . it’s a remarkable shift in a very short time.  Just last summer, Scouting leaders reaffirmed their total ban on gays, a policy they had defended all the way to the Supreme Court – and won, because the First Amendment guarantees private groups the right to choose their members.”   

    As I noted in my “Spring Briefs” essay, the BSA was under tremendous pressure, both from outside and from within (including present and former Scouts, both homosexual and heterosexual), to change its policy – as the attitudes of Americans generally are changing, becoming much more accepting of homosexuality (as the recent trend in favor of same-sex marriage indicates).  Nevertheless, perhaps because of those changing attitudes in the general population, many conservatives are adamantly opposed to a change in membership policy.  About 70% of troops are chartered to faith-based groups, including the Catholic Church, which condemns homosexuality as a sin; many of these religious groups do not consider homosexuality compatible with the Scouts’ value of being “morally straight” (no pun intended).  Forced to welcome openly gay Scouts and leaders, some groups would end their BSA affiliation, seriously weakening an organization already declining in numbers and influence.  Yet, as the USA Today editors also observed, it’s a positive sign that the Mormon Church, the single largest sponsor of Scout troops, has said the change won’t affect its support for Scouting.  (The Catholic Church said it would need to study the issue, “but at least it didn’t threaten to quit,” as some conservative Christian groups have done.) 

                Progress comes slowly when it requires change in people’s attitudes and beliefs – especially when it comes to breaking down prejudices and bigotry.  (“Homophobia,” the irrational fear and/or hatred of homosexuality, is a real phenomenon, rooted in Americans’ historic culture of repression when it comes to sexuality generally, as I discussed in my essay “In Defense of Sex” (May 16, 2005).)  Still, BSA’s decision to change its membership policy regarding sexual orientation is indeed a positive development: whether a full step or only a half step, it is movement toward greater acceptance not just of homosexuality but of the full complexity of human nature. 

                The next challenge for the Boy Scouts – other than deciding whether to continue the ban on gay leaders – will be the organization’s requirement that members believe in God, which effectively excludes agnostic and atheist kids (and their parents).  Already some commentators are calling for the Scouts to open membership to non-believers (see, for example, Tom Krattenmaker’s column, “Good Boy Scouts Don’t Need God,” USA Today, May 13).   

                Just as homosexual Scouts can be as “morally straight” as heterosexuals, so too can agnostic or atheist Scouts.  As comedian/magician Penn Jillette maintains in his recent book Every Day Is an Atheist Holiday! (Blue Rider Press/Penguin, 2012), “[his 7-year-old] son’s morality does not come from God.”  Like Thomas Jefferson, Jillette believes that morality is grounded in human nature, which teaches us the basic principle, “Don’t cause pain”: 

    “[A]ll normal children understand the moral code at an early age.  Children are cruel, children are violent, children have no patience, children are moody, and children can’t seem to understand that sneaking under Daddy’s desk when he’s lost in thought while writing his book and grabbing his feet is going to scare the living shit out of him.  Maybe they do understand that last one.  Children have a lot of work to do on impulse control, but morality takes its place early on.  There are studies about normal children knowing the moral difference between a teacher saying it’s okay to stand up during circle time and a teacher saying it’s okay to lie and hit.  They know the teacher can’t turn something that is morally wrong into something right by just saying it.  They understand that right and wrong are separate from authority.”

     

    Other atheists – namely, Objectivists and neo-Objectivists like myself – maintain that although a moral sense isn’t innate in human beings, it is grounded in reason, and that a moral code should recognize certain truths about human nature and be based on rational self-interest, combined with a respect for the equal rights of others.  The bottom line is that one need not believe in a god in order to be moral, or morally upright, as the Scouts profess to be.  Let’s hope that the progress they’re making with regard to sexual orientation might also lead them to understand this key point. 

      

     | Link to this Entry | Posted Thursday, May 30, 2013.  Copyright © David N. Mayer.


    Thoughts for Summer 2013, and Beyond (Part I) - May 23, 2013

     

     

    Thoughts for Summer 2013 – and Beyond

     

    Part I

      

    It’s time for another annual tradition:  once again, MayerBlog will be on hiatus, while I continue writing the manuscript of my next book, Freedom’s Constitution: A Contextual Interpretation of the Constitution of the United States.  But this year the hiatus will extend beyond the summer season and continue through the 2013–14 academic year, as I will be on sabbatical leave, so I can (at last) finish writing the book.  In other words, this will be the last essay posted to MayerBlog until Fall 2014 (probably Labor Day weekend in 2014). 

    Before going on hiatus, however, I could not resist the temptation to comment on a number of important issues in public policy and popular culture – issues that are in the news today and are likely to remain in the news throughout the summer and beyond.  Because there are so many topics (over 30!), I’ll be posting this essay in two parts, Part I today (before the Memorial Day holiday weekend) and Part II next week (after the holiday weekend).  If you scroll all the way down to the end of this entry, you’ll also see my summer reading recommendations and my annual preview of summer movies.

       

     

    n B.O.’s Fuelish War on Carbon Energy 

                Yet another annual tradition – and the traditional beginning of my “Thoughts for Summer” blog entry – unfortunately, has been the topic of high gasoline prices, a perennial issue because gas prices almost always rise every summer, simply as a result of higher demand.  But the problem has become much worse since B.O. began his occupation of the White House.  The day before B.O.’s first inauguration in January 2009, the average price of a gallon of gasoline was a mere $1.83.  Today, gas prices average at least $3.50 nationally and, according to some analysts, will slowly rise to a “new normal” of $4 or even $5 a gallon.  Thus, gas prices have nearly doubled during B.O.’s presidency – fulfilling one of the few promises that he actually has kept.  As I wrote in my entry “Spring Briefs 2011” (March 18, 2011): 

                “High gas prices are the direct result of the policies of B.O.’s regime in Washington – indeed, they’re the deliberate policy of the regime, whose energy policy is best described as an anti-energy policy, designed to create an artificial shortage in carbon-based fuels, that is, oil, natural gas, and coal – a veritable war on America’s domestic oil, gas, and coal industry.  It’s a deliberate program of restricting domestic energy to make so-called `green energy’ (which cannot pay for itself without massive government subsidies) more attractive and necessary, all in order to fulfill B.O.’s campaign promise that energy prices would `necessarily skyrocket’ on his energy agenda.  . . . [Former] Energy Secretary Steven Chu (a physicist with no experience in the energy industry, who’s also a true believer in radical environmentalists’ global warming, or `climate change,’ theories), before he was appointed energy secretary, had expressed a fondness for high European gas prices as a means of reducing consumption of fossil fuels.  In September 2008, Chu told the Wall Street Journal, `Somehow we have to figure out how to boost the price of gasoline to the levels in Europe’ – which at that time averaged about $8 a gallon.”

               

    And as I added, “Virtually all the decisions made by the B.O. regime not only document its hostility to fossil fuels but have had the effect of raising energy prices.”   

                The “green” policies of the B.O. regime have resulted in recent steep increases in the price of gasoline.  Oil refining capacity continues to be limited – no new oil refinery has opened in the United States since Gerald Ford was president – so that any problems at existing refineries (such as outages or extended maintenance, which recently occurred at several oil refineries that serve the Midwest) will, by limiting supply, drive up gasoline costs.  Moreover, this spring the Environmental Protection Agency proposed strict new “clean fuel” standards which will further increase the cost of fuel.  (The EPA said the so-called Tier 3 rule would cut emissions of smog-forming pollutants as well as toxic emissions like benzene.  What the EPA didn’t say was that levels of these pollutants have been steadily falling for years, and would continue to fall even without the new rule, which the oil industry says will cost tens of billions of dollars (“Pollution Levels Have Plunged, But EPA Plans Costly New Rules,” Investor’s Business Daily, April 22).) 

    Why are B.O. and his regime so hostile to carbon-based energy (the so-called fossil fuels of coal, natural gas, and oil)?  As I discussed in Part III of my “2013: Prospects for Liberty” essay (February 7, in the section on “`Green’ Bullshit”), there are two principal reasons.  First is B.O.’s adherence to the bullshit radical environmentalist theory of “global warming” or “climate change” (discussed in the next section, below).  Second, B.O. deliberately aims to weaken the United States economically, to transfer America’s wealth to countries in the so-called third world, to promote what he regards as global economic “justice.”  That’s the goal of the radical leftist anti-colonialist ideology he inherited from his father, Barack Hussein Obama, Sr., as documented by Dinesh D’Souza, in his two books, The Roots of Obama’s Rage (2010) and Obama’s America: Unmaking the American Dream (2012), and his documentary film, 2016: Obama’s America.  As I observed in my “Tricks and Treats” essay (Oct. 25, 2012), D’Souza’s analysis provides perhaps the best explanation for B.O.’s double standard on energy:  he has waged war on carbon-based energy – using his Interior Department and EPA to impede development of America’s vast natural reserves of oil, natural gas, and coal (just when technological advances such as “fracking” have made exploitation of these reserves economical) – but instead has helped bankroll deep oil drilling in Brazil, Colombia, and Mexico.  As D’Souza explains, the anti-colonial theory “predicts that Obama should want to enrich the previously colonized countries at the expense of previous and current colonizers.  Anti-colonialists insist that since the West grew rich by looting the resources and raw materials of the colonies, it is time for global payback.  The West must have less and the rest must have more.” 

                Whether it’s “green” bullshit or anti-colonialist ideology (another kind of bullshit) that’s behind B.O.’s agenda, it’s clear that during the past four years the B.O. regime has been pushing an anti-carbon agenda and is likely to continue pushing this agenda in the coming years, during B.O.’s second term (when, as a “lame duck” president, he’ll be even less accountable to public opinion than he was during his first term).  That means, among other things: 

    n  The imposition of even more stringent fuel-economy (CAFE) mandates on auto and truck manufacturers – which will force Americans to drive smaller, lighter, less safe, and less convenient vehicles.  (As I noted in my February 7 essay, that’s also part of B.O.’s ultimate goal: to make America more like Europe, with both its sky-high gasoline prices and its tiny cars.  And it’s yet another important example of how the B.O. regime has deprived American consumers of their freedom of choice.) 

    n  Continued restrictions on drilling for oil and natural gas off the shores of the United States.  B.O.’s former Interior Secretary Ken Salazar imposed a moratorium on oil drilling in the Gulf of Mexico in the wake of BP’s Deepwater Horizon disaster – a moratorium which, as I’ve previously noted, was far more destructive to the Gulf economy than the spill itself.  (Sally Jewell, B.O.’s new Interior Secretary, may be less of a radical environmentalist than Salazar – she was CEO of outdoor retailer Recreational Equipment, Inc. – but can be expected to continue implementing B.O.’s “green” agenda.)  During his tenure at the Department of the Interior, Salazar removed major swaths of energy-rich U.S. land from production and exploration, costing the U.S. hundreds of billions of dollars.  As a federal report last year showed, oil output on federal lands fell 11% from 2010 to 2011, while natural gas output dropped 6%.  (During the same period, thanks to the energy “revolution” discussed below, oil and natural gas output on private land surged 14% and 12%, respectively.)  Today, the U.S. government leases less than 2.2% of federal offshore areas for oil and gas development, and just 6% of its lands.  All told, B.O./Salazar energy policies have taken an estimated 1.2 trillion barrels of oil and 21 trillion cubic feet of natural gas off the nation’s table – enough to last the U.S. hundreds of years (“Master of Disaster,” I.B.D., January 18). 

    n  Continued refusal to green-light the Keystone XL pipeline (more on this below). 

    n  Continued anti-carbon regulations implemented by B.O.’s Environmental Protection Agency.  (B.O.’s appointee as EPA director during his first term, Lisa Jackson, was not only a radical environmentalist but also politically corrupt.  B.O.’s pick for Ms. Jackson’s successor during his second term, Gina McCarthy, is just another radical environmentalist.  She headed the agency’s “clean air” efforts during B.O.’s first term, and as EPA director is expected to push B.O.’s radical anti-carbon agenda (including efforts to reduce carbon dioxide emissions – despite the fact that CO2 isn’t really a pollutant).)  As noted above, the EPA’s strict new “clean fuel” standards will significantly increase the price of gasoline.  Moreover, the EPA has announced new carbon-dioxide emission rules that will force new power plants to install expensive new equipment – equipment that doesn’t even yet exist – to capture and bury emissions underground.  Because the new EPA rules are so draconian, they effectively ban new coal-fired power plants by making them uneconomical to build.  Although the EPA recently announced it would delay implementation of the rule (after the electrical-power industry objected on legal grounds), when the rule finally is implemented it will result in higher electricity prices for all Americans, as coal today provides about 40% of electricity generation in the United States (“EPA’s War on Energy Continues,” “EPA Effectively Bans New Coal Plants,” I.B.D., March 28, 2012; “EPA delays tough rule on carbon emissions,” Columbus Dispatch, April 13). 

    n  More “green” initiatives which, rather than producing alternative energy, instead have produced more political scandals – for example, the case of Solyndra, the bankrupt company that has become the poster child for this regime’s crony capitalism (or, more properly, crony socialism or fascism).  And rather than creating any real American jobs, they have resulted in more “outsourcing” of energy-industry jobs overseas – for example, by the use of stimulus dollars to build electric cars in Finland, or the support of the U.S. Export-Import Bank’s plan to loan Brazil’s state-run oil company, Petrobras, $2 billion to do deep-water oil drilling in the Atlantic. 

     

    What’s especially perverse about the anti-carbon energy policy of the B.O. regime is that it’s happening at the same time that the United States is about to experience a true revolution in energy, thanks to North America’s abundant natural resources in so-called fossil fuels.  Technological innovation (particularly hydraulic fracturing, or “fracking”) has now made it economically feasible to extract carbon fuels from deposits embedded deep in shale rock.  This revolution in carbon energy production at last is making realistic the dream that the United States could be fully self-reliant, no longer dependent on Middle East countries – a dream of U.S. policymakers since the 1970s.  Indeed, the U.S. is poised to become “the new Saudi Arabia.” 

    As I observed in my February 7 essay, the estimated supply of carbon fuels in North America is truly awesome:  with regard to oil, some 400 billion barrels of crude that could be recovered using existing drilling technologies, plus at least 1.4 trillion barrels of recoverable oil embedded in shale rock – enough to meet all U.S. oil needs for about the next 200 years, without any imports; with regard to natural gas, a 40% increase in U.S. production (which since 2002 has increased from less than 2% to some 34% of global production); and with regard to coal, some 1.7 trillion tons of identified coal resources and 3.1 trillion tons of total coal resources – enough recoverable coal reserves to last 239 years, at present rates of coal use.  With regard to oil output alone, the International Energy Agency recently predicted that the new supplies of North American crude oil will be a true game-changer, revolutionizing the global energy market, “as transformative to the market over the next five years as was the rise of Chinese demand over the last 15 years” (“New U.S. Oil Output Is a Game-Changer for Energy Industry,” I.B.D., May 15). 

                Yet, as I also noted in my February 7 essay, the good news is that B.O.’s anti-energy policies may be the one part of his second-term agenda that’s likely to meet determined congressional opposition, not only from Republicans but also from moderate Democrats from states rich in carbon-fuel resources (such as Senator Joe Manchin of West Virginia or the new senator from North Dakota, Heidi Heitkamp).  Thus, although radical environmentalists are encouraged by the lip-service B.O. paid to “climate change” in his Second Inaugural, some energy leaders are also optimistic that political pressures may force B.O. to adopt an “all of the above” energy policy. 

    The critical test will be B.O.’s decision regarding the Keystone XL pipeline from Canada, a critically important project that his regime thus far has been blocking.  The pipeline would carry nearly a million barrels of oil per day, bringing supplies from the oil sands of Alberta, Canada, as well as U.S. crude from the Bakken oil fields of North Dakota and Montana, some 1,700 miles to Texas refineries, creating thousands of American jobs while also essentially eliminating U.S. dependence on Middle East oil.  Although Nebraska’s Department of Environmental Quality recently gave the revised Keystone XL pipeline route through that state a thumbs-up, the U.S. State Department still hasn’t given its approval (which is necessary because the pipeline would cross an international border).  The State Department started its review of the original Keystone XL permit application over four years ago and has since publicly released over 15,500 pages of documents relating to its analysis – showing that the pipeline would have minimal environmental risks and would meet pipeline safety standards.  Moreover, the State Department’s most recent supplemental environmental impact statement found that the project would support over 42,000 annual jobs over its construction period.  Many labor unions, outraged over the B.O. regime’s apparent decision to put “green” politics over jobs, have rallied together to express their support for the project and a desire to move forward.  And many members of Congress, both Democrats and Republicans, are tired of waiting for B.O. to decide which road he will choose – and thus may take matters into their own hands, with a House bill (H.R. 3, the Northern Route Approval Act) that would end the regulatory delays by moving Keystone XL beyond the regulatory process and finally allow construction of this landmark jobs-and-energy project to begin, as Rep. Marsha Blackburn (R.–Tenn.), vice chairman of the House Energy and Commerce Committee, noted in a recent op-ed (“It’s Time To Act on Keystone,” I.B.D., May 21).     

    As Rep. Blackburn concluded, “The inescapable truth is that with or without the Keystone XL pipeline, Canada’s oil sands will continue to be developed.  The only question is whether this valuable oil supply will be refined in the U.S. or shipped overseas to places like China.”

      

     

    n “Green Bullshit” Gradually Dissipates 

    For over 40 years radical environmentalists have been pushing a political agenda based on a series of beliefs that I have called “`green’ bullshit” (see Part III of my “2013: Prospects for Liberty” essay (February 7)): the theory of “global warming” – or “climate change,” as it’s now euphemistically called – in other words, the theory that average global temperatures are increasing, that the Earth’s warming will lead to cataclysmic disasters such as massive flooding of coastal areas as the polar ice caps melt and the world’s oceans rise to dangerously high levels, etc., etc., and that this dangerous global warming is caused by human activity, namely, by man-made carbon dioxide (a so-called “greenhouse gas”) created by the burning of carbon-based “fossil fuels” such as coal, oil (and other petroleum products), and natural gas. 

    The global-warming thesis is a theory that, despite the propaganda of global-warming alarmists, is far from being scientifically proven.  Indeed, it is a flawed theory that fails to fit the facts; in other words, it’s a theory based on faulty “junk” science – to which radical “green” true believers adhere based mostly on faith, not reason.  To them, it’s a kind of religion that has at its core a fanatical hatred for man-made things and for the human beings who make them.  Greg Gutfeld, a commentator on Fox News (libertarian panelist on The Five and host of Red Eye), summed it up quite well in his first book, The Bible of Unspeakable Truths (Grand Central Publishing, 2010, p. 27): 

    “Sadly, this antihuman bullshit keeps flowing, unfiltered and unquestioned.  Global warming theology and the journalism that soaks it up avoid focusing on fact, and instead build a story – one that is easily understood by the arrogant, simplistic minds that prefer to think humans (or rather, capitalistic humans) are to blame for everything wrong in the world.  It’s not only dishonest, it’s murderous.  We are now scaring a populace into thinking that the biggest threat to mankind is man’s role in climate change, instead of ideologies that promote death and violence.”

     

                Over the past few years, it has become even more clear that the “climate change” or “global warming” theory is nothing but a scam – indeed, it’s been called “the greatest scam in history” by John Coleman, meteorologist and founder of The Weather Channel.  The revelations from what has been called “Climate-gate” – e-mail exchanges and other documents hacked from computers at the Hadley Climate Research Unit at the University of East Anglia in Great Britain – show that there has been a conspiracy among some in the science community to spread alarmist views of global warming and to intimidate, if not silence, those who disagree (the skeptics of global-warming theory, called “denialists” by the true believers, whom I call “warm-mongers”).  In other words, what Michael Crichton warned about in the fictional story of his novel State of Fear – a global hoax by radical environmentalist terrorists – has practically come to pass.   

                Fortunately, however, more and more Americans are realizing that the radical environmentalists’ theory really is nothing more than a massive scam.  Public opinion polls show that concern about alleged “global warming” or “climate change” has drastically declined – particularly as a new “revolution” in carbon-based energy resources (discussed above) has the promise of re-energizing the United States and the moribund American economy.   

    As I noted in my February 7 essay, earlier this year a group of more than 20 retired NASA scientists and engineers, who call themselves “The Right Climate Stuff,” issued a report that decisively shatters the global warming myth.  After reviewing, studying and debating the “available data and scientific reports regarding many factors that affect temperature variations of the earth’s surface and atmosphere,” they’ve found (among other things):  “the science that predicts the extent of anthropogenic [man-caused] global warming is not settled science”; “there is no convincing physical evidence of catastrophic anthropogenic global warming”; and “because there is no immediate threat of global warming requiring swift corrective action, we have time to study global climate changes and improve our prediction accuracy.”  They conclude that Washington is “over-reacting” on global warming and suggest that a “wider range of solution options should be studied . . . .” (“Facts About Climate,” I.B.D., January 28). 

                The unraveling of the “global warming” scare continues, with a number of recent important pieces of science news, as the editors of Investor’s Business Daily have noted.  Among them:  Scientific American reports that the Arctic “used to be a lot warmer,” a good 8 degrees Celsius warmer, than today (that was 3.6 million years ago – well before man-made carbon dioxide!); a BBC report that over the coming decades “global average temperatures will warm about 20% more slowly than expected”; predictions of sea-level rise by the UN’s Intergovernmental Panel on Climate Change have fallen to only 7 – 23 inches (not the 80-inch figure touted by some alarmists, including UN Secretary-General Ban Ki-moon); new evidence from the Australian Broadcasting Corporation confirming a 2010 report that rather than shrinking, many low-lying Pacific islands are actually growing; and a New York Times story reporting that, according to a new study, CO2 concentrations (at record high levels, last seen about 2 million years ago, according to recent media reports) would have to triple or quadruple before any of the global warming catastrophes that the alarmists have predicted would actually occur (“Global Warming Cools Off,” May 21).  (For more myth-bashing, see some of the writings of Patrick J. Michaels, director of the Center for the Study of Science at the Cato Institute, particularly his recent essay, published in Forbes, “The Climate Change Horror Picture Show” (April 18).) 

    I’m old enough to remember the late 1970s – just a few years after the modern radical environmentalist movement began, with “Earth Day” in 1970 – when the news media was warning about global cooling.  (Remember the Newsweek cover story, “The Coming Ice Age”?)  It seems that “climate change” science hasn’t improved much over the past 35-40 years.  The I.B.D. editors suggest that “maybe the alarmists [should] apologize for the decades of public disservice they have provided.”  But I wouldn’t hold my breath.

     

       

    n The Disease Known as “ObamaCare” 

    In its shameful decision in National Federation of Independent Business v. Sebelius last summer (at the end of June 2012), the U.S. Supreme Court upheld the constitutionality of the 2010 federal health-care control law, officially (but disingenuously) titled “The Patient Protection and Affordable Care Act” (PPACA) but popularly known as “ObamaCare,” because it is the signature legislative “achievement” of B.O.’s presidency.   (The pivotal opinion in the case, written by Chief Justice John Roberts, ridiculously upheld the individual mandate – the keystone of the massive bureaucratic edifice created by the law – as a “tax,” thus providing the crucial 5th vote for the Court’s overall decision upholding the law, 5–4.  As I discussed in my special blog post last summer, “Supreme Folly 2012: The Supreme Court’s `ObamaCare’ Decision” (July 5, 2012), Roberts thus earned the acronym name that my good friend Rod Evans coined for him: “Fierce Jurist Botches”!)  

    The massive, 2,800-page “ObamaCare” law was passed by Congress early in 2010, at a time when both houses of Congress were controlled by the Democrats.  It passed by a strict party-line vote, without a single Republican (in either house of Congress) voting in favor of it, and also by some parliamentary chicanery (including evasion of the Constitution’s requirement that revenue bills – a category into which the law falls, especially given the Court’s decision – must originate in the U.S. House, not the Senate, where the bill that ultimately passed Congress and was signed into law by B.O. really had originated).  As I have frequently noted here on MayerBlog, the law was truly “historic,” but not in the positive sense:  it was the most important law ever passed by Congress on such a strictly partisan basis, and it is also (arguably) the worst – the most monstrous – piece of legislation ever passed by Congress.  In Part III of my “2013: Prospects for Liberty” essay (February 7), I called it “the biggest Mongolian clusterfuck of them all,” with an explanation why that slang term is particularly apt in describing the PPACA – not just a horrid law but also, arguably, the worst disease afflicting the American health-care system (a supposed “cure” that’s far worse than the problems it was supposedly designed to solve).  

    Public opinion polls at the time the law was pushed through Congress showed that the majority of Americans were opposed to it; and over the past two and a half years, the more Americans learn about the law, the more unpopular it has become.  (According to a recent Kaiser Family Foundation poll, only 35% of Americans view “ObamaCare” favorably, while 53% support efforts to continue to block and repeal it.)           

    Nevertheless, the Court’s disappointing decision, coupled with the results of the 2012 elections – the election of B.O. to a second term as “occupier-in-chief” of the White House, and the Democrats’ continuing majority in the U.S. Senate – meant that it is highly unlikely, politically, that the law will be repealed (as most Republicans have pledged to do) before its full implementation effectively begins in January 2014.  Indeed, as long as B.O. remains in the White House and as long as he adamantly insists on vetoing any repeal legislation, it seems that the American people are stuck with his horrid “ObamaCare” law.  Those of us who are adamantly opposed to the law on constitutional, policy, and philosophical grounds – most importantly, as an abridgement of Americans’ fundamental right to health-care freedom (see my discussion in “Unhealthy, Unconstitutional `Reform’,” April 8, 2010) – must rest our hopes for getting rid of the law on such thin reeds as:  additional court cases challenging the constitutionality of the law on other grounds (which I discussed in Part III of my “2013: Prospects for Liberty” essay (February 7)); efforts on the part of state governments to “nullify” the federal law (which are even more controversial than “ObamaCare” itself) or to block its implementation (particularly with regard to the proposed expansion of Medicaid); and political change, namely big GOP gains in the 2014 Congressional elections – resulting in Republican majorities in both houses of Congress that, with some Democrat support, might be able either to block implementation of the law or to repeal “ObamaCare” (and then override B.O.’s veto of the repeal legislation). 

    The last option isn’t as far-fetched as it sounds, for not only Democrats in Congress but many of their “core” constituencies (such as labor unions) are becoming increasingly worried about the law and its growing unpopularity.  That’s because, even before it’s fully implemented, “ObamaCare” is increasingly showing signs that it will collapse under its own weight.  Indeed, one prominent Democrat who was one of the law’s chief architects (called a “co-author” of the law by some commentators) – Senator Max Baucus – recently called the law “a train wreck.”     

                Every promise B.O. made about the law already has been broken, particularly the two big promises implied by the law’s disingenuous official title: it has not “protected” patients but instead has caused millions of Americans to lose their health insurance coverage; and rather than making health care more “affordable,” it has caused costs to skyrocket.  In February, the Congressional Budget Office said that 7 million people likely will lose their employer coverage thanks to the higher costs imposed on employers by ObamaCare – nearly twice the CBO’s previous estimate.  The House Energy and Commerce Committee recently discovered internal cost estimates from 17 of the nation’s largest insurance companies indicating that “ObamaCare” will increase premiums on average by nearly 100%.  Young adults may face premium increases that will soar as high as 400%.  And despite the law being hailed by B.O. as a deficit reducer, the GAO reports that “ObamaCare” will increase federal deficits by $6.2 trillion over the next several years. 

    “ObamaCare” is also a job-killer – and one of the chief causes of the nation’s continuing high unemployment rates.  Implementing the law’s regulations alone has killed 30,000 jobs, according to a study by the American Action Forum.  A Federal Reserve report listed “ObamaCare” as a leading deterrent for businesses hiring new workers.  (As many economists have noted, that’s the predictable result of the law’s mandates on businesses.  Because they generally apply to businesses with 50 or more full-time employees, with “full-time” defined as working 30 hours or more per week, many businesses simply aren’t expanding beyond 50 employees or are shifting more workers to part-time status.  Franchisees of Taco Bell, Dunkin’ Donuts, and Wendy’s are only some of the businesses that are in the process of reducing their employees’ hours below 30 per week to avoid the costs of covering their health care.  But the law’s negative impact on small-business hiring is even worse.  Because it gives a substantial tax credit to businesses that employ 25 or fewer people, owners of small enterprises have an incentive not to expand their workforce.  Indeed, since the full tax credit is available for those businesses with 10 or fewer workers, there is even less incentive to hire more than 10 employees.) 

                (Among the few jobs that the “ObamaCare” law arguably did help create are those of authors and publishers of books about how Americans might cope with the new law – a veritable cottage-industry that has sprung up in the past year or so, as Americans dread the coming implementation of the law.  One of the best-selling books, at least on Amazon.com, is the ObamaCare Survival Guide, by Nick J. Tate, which is quite readable and informative.  As for books critiquing the health care law, I recommend The Truth About ObamaCare by Sally Pipes, president of the free-market Pacific Research Institute.  And Ms. Pipes is also the author of one of the best books about what should be done to repeal ObamaCare and replace it with real free-market-oriented health-care reform, The Pipes Plan, which is available from Amazon in e-book format for the Kindle reader.) 

    One of B.O.’s main selling points for “ObamaCare” was that it would help the most vulnerable people, those who have been denied insurance coverage because of “pre-existing conditions.”  But the program established under the law to cover them is refusing new applicants and is already out of money – even though less than 1 percent of those with pre-existing conditions have received coverage.  And another part of the law – the CLASS program, which was supposed to provide long-term care through a government-run assisted-living insurance program – already has died (officially suspended by the B.O. regime).  The White House decided in October 2011 to pull the plug on the program, admitting that the plan could not financially sustain itself over the long term through enrollee contributions. 

                It’s doubtful whether “ObamaCare” will be fully implemented by January 1, 2014 – the date the individual mandate officially kicks in – because neither the expansion of Medicaid nor the establishment of “health insurance exchanges,” both of which are mandates that the law imposes on the states, will be in place.   Indeed, as Michael F. Cannon observes in a white paper published by the Cato Institute, 50 Vetoes: How States Can Stop the Obama Health Care Law (Mar. 21, 2013), “By refusing to create Exchanges or expand Medicaid, states can block many of the PPACA’s worst provisions and push Congress to reconsider the entire law.”  

    In its infamous “ObamaCare” decision last summer, the Supreme Court did hold, 7–2 (with Chief Justice Roberts and two of the Court’s “liberal” justices, including B.O. appointee Elena Kagan, joining the four conservative justices), that states cannot be compelled to expand Medicaid, the chief way in which the law aims to cover millions of low-income uninsured Americans.   Realizing that Medicaid already is a failed program, a number of states (most with Republican governors) have decided not to expand their Medicaid coverage.  (Texas Governor Rick Perry – calling Medicaid “a system of inflexible mandates, one-size-fits-all requirements, and wasteful bureaucratic inefficiencies” – said that to expand this program was “not unlike adding a thousand people to the Titanic.”)  Other states with Republican governors (including, unfortunately, Ohio’s John Kasich), unable to resist the temptation of promised federal funding for the first three years, have announced plans to expand Medicaid as dictated by the “ObamaCare” law; however, in some of those states (including Ohio), Republican-majority legislatures may thwart the governor’s proposal by refusing to play along with this unconstitutional power-grab by the national government.   

    As I discussed more fully in my February 7 essay, a health-insurance “exchange,” as defined under the law, is “a mechanism for organizing the health insurance marketplace to help consumers and small businesses shop for coverage.”  Thus described, it appears to be a kind of clearinghouse where consumers without employer-based coverage can obtain supposedly “affordable” health insurance (to comply with the law’s individual mandate); in practice, however, it is the chief means by which the federal government – acting through the broad discretionary power of the Secretary of Health and Human Services to issue directives (“as the Secretary shall decide” is the most repeated phrase in the “ObamaCare” law) – will control the health insurance market.  That’s because the only health insurance policies included in the exchanges are those that comply with the federal (H.H.S.) regulations.  All exchanges are supposed to be operational by January 1, 2014, but the states are not required to create state-run exchanges; they may decide to “opt out” and instead either allow the federal government to create an exchange for their residents or do a “state-federal partnership” exchange.   

    So far 33 states have refused to create exchanges – an astounding number, considering that HHS Secretary Sebelius in early 2012 predicted that only 15 to 30 states might decline to create exchanges.  Creating an exchange is not mandatory (for the Supreme Court has recognized that the Tenth Amendment bars the national government from “commandeering” states in such a manner), and most states have realized that to create their own exchange will neither save them money nor spare them from federal control.  Again Texas Governor Perry summed it up nicely with his pithy comment, “It is clear there is no such thing as a state exchange.  This is a federally mandated exchange with rules dictated by Washington.”  And Wisconsin Governor Scott Walker noted, “No matter which option is chosen, Wisconsin taxpayers will not have meaningful control over the health care policies and services sold to Wisconsin residents.”  As Sally Pipes observes, “Walker is right.”  The federal H.H.S. department dictates that all policies sold on the exchanges must meet one of four classifications – “platinum,” “gold,” “silver,” or “bronze,” depending on the percentage of health costs a plan covers – but given the caps on deductibles imposed on all plans, even the least expensive “bronze” plans, the mandates prevent insurers from offering low-cost products that may best fit a family’s budget.  And because “ObamaCare doesn’t just set the rules – it also tasks states with enforcing them . . . running a [state] exchange could therefore get pricey” (Pipes, “ObamaCare Exchanges Turning into Disasters,” I.B.D., January 30).   Moreover, some states (including Ohio) – those with state constitutional provisions or statutes guaranteeing their citizens’ health-care freedom – are constitutionally or legally prohibited from implementing an “ObamaCare” exchange.   

    With even the leftist rag The New York Times reporting that creating and operating federal exchanges “will be a herculean task that federal officials never expected to perform,” it is doubtful whether the exchanges will be operational by the stipulated date, January 1, 2014.  And even if exchanges were set up by that date in all fifty states – whether they’re state-created exchanges, federal exchanges, or state-federal partnership exchanges – it’s even more doubtful that they will work as intended, to give Americans more choices as health-insurance consumers and to lower health-care insurance premiums.  That’s partly because, in many states, there are virtual monopolies, with in-state markets dominated by a single insurance company.  (That’s why a much simpler – and far more workable – solution to the problem of rising health-insurance costs would have been for Congress to pass a simple law allowing Americans to buy insurance across state lines – in other words, ending the states’ ability to monopolize their health-insurance markets.  But such an act would be a market-oriented reform anathema to the schemes of B.O. and Democrat “progressives” to have the national government capture control of the health-care industry.) 

    Three recent developments have raised even more alarm bells about “ObamaCare.”  One is the developing scandal that I call “IRS-Gate” (discussed below), the outrageous news that the IRS targeted pro-growth, conservative and Tea Party groups due to their political beliefs.  It’s especially alarming because the IRS is the key agency tasked with enforcing ObamaCare, both the individual mandate (the provision upheld by the Supreme Court) and the employer mandate.  Even more alarming, the current head of the IRS health-care office, Sarah Hall Ingram, was in charge of the tax-exempt division when agents first started improperly scrutinizing conservative groups over their applications for tax-exempt status.  (According to the IRS, Ingram was reassigned to help the agency implement the health-care law in December 2010, about six months before a Treasury inspector general’s report said her subordinate, the director of exempt organizations, learned about the targeting.  “There isn’t any evidence that Sarah Ingram had any inkling of the problems,” said Rep. Sander Levin, D.–Mich., ranking Democrat on the Ways and Means Committee, which oversees the IRS.  But Ms. Ingram was the one responsible – thus leaving the question of why she was reassigned, as well as the broader, more troubling question of whether the IRS can be trusted to enforce the law impartially.)       

    A second troubling development, also involving the IRS, has been the agency’s illegal attempt to enforce penalties on employers in states that do not create health-insurance exchanges.  Under the “ObamaCare” law, certain employers are subject to penalties if they fail to provide “minimum value” health benefits (as dictated by the law) for their employees: namely, companies with more than 50 employees whose workers obtain insurance subsidies through a state-created exchange.  Under the letter of the law, employers in states that have refused to create their own exchanges – as noted above, the 33 states that have decided not to create their own state-run exchanges but instead have opted for federal exchanges or state-federal partnerships – are not subject to this penalty (probably an oversight as Congress rushed to write this hastily-drafted, poorly-crafted law, which the then-Speaker of the House, Nancy Pelosi, said Congress would first have to pass before anyone could fully know what’s in it).  But the IRS, in a blatant power-grab, has decided to enforce the provision on all employers, even those in states without state-run exchanges.  (A lawsuit filed on behalf of some such employers, in five states, is currently pending in the federal courts.  One of the plaintiffs, Dr. Charles Willey, a physician and CEO of Innovare Health Advocates, a medical practice group in St. Louis, recently wrote an op-ed in Investor’s Business Daily describing this outrageous development, noting that the illegal action by the IRS forces him and other employers and their families “to purchase costly insurance that they cannot afford, do not need, or do not want”  (“IRS Illegally Usurps Congress with New Law in ObamaCare,” I.B.D., May 22)). 

    Finally, in yet another scandal about the B.O. regime – one that has not received much media coverage, except on Fox News and in the pages of I.B.D. – as funding for “ObamaCare” implementation has dried up, HHS Secretary Kathleen Sebelius has been soliciting funds from private health-care companies to implement the law.  It’s yet another politicized move (typical of the “Chicago-style” thuggery of the B.O. regime), essentially a “shake-down.”  

    Let’s hope that some way will be found within the next year or so to block or repeal this horrid law before it’s too late – before it not only deprives Americans of their freedom but also seriously endangers their health (and their lives), destroying the best health care in the world.

     

     

    IRS-Gate: More Than Just a Taxing Situation            

    England’s King Henry II was a great reformer; his reign during the 12th century (1154–1189) marks the formative period of the English “common law” system, which continues to serve as the basis of the law in both England and America.  Yet King Henry is perhaps most famous for his confrontation with his former friend, Thomas Becket, whom he appointed Archbishop of Canterbury, head of the Church in England.  Becket championed the autonomy of the Church, opposing Henry’s (ultimately) successful attempts to put clergymen accused of crimes under the jurisdiction of the royal common-law courts instead of the Church’s own ecclesiastical courts.  As dramatized in the classic 1964 film Becket (co-starring Richard Burton in the title role and Peter O’Toole as Henry II), at one fateful moment the King voices his exasperation with Becket – “Will no one rid me of this meddlesome priest?” – and a group of knights, hearing the King’s angry outburst and interpreting it as a command, proceed to Canterbury Cathedral, where they brutally murder Becket while he is presiding over the early evening (vespers) services.  Becket became a martyr (he was named a saint by the Church, and his tomb at Canterbury became a destination for pilgrims for the next 350 years); and King Henry II did penance (submitting himself to the lash, as the filmmakers imagined it) to absolve himself of guilt and to atone for his role in the crime. 

    Why am I telling this story here?  Because B.O. is facing his own Becket moment, and it involves a rapidly-unfolding story that many political commentators believe might cause the downfall of his presidency:  the illegal targeting of, harassment of, and discrimination against conservative and libertarian groups by the Internal Revenue Service (IRS), as it reviewed the organizations’ requests for tax-exempt status – the scandal I call “IRS-Gate.”  Parallels to Richard Nixon’s presidency and the Watergate scandal that was its undoing are obvious.  And, just as with Nixon 40 years ago, the critical question seems to be:  Is the president (the current “occupier” of the White House, as I like to refer to B.O.) responsible for this blatant abuse of power by a federal agency?  (If so, then the related question will be:  Is this an impeachable offense?  I’ll discuss that question more fully in Part II of this essay.) 

    As of the time I’m writing this, it has been less than two weeks since the news story first broke (on Friday, May 10) that the IRS had been targeting certain conservative/libertarian groups (initially reported as just “Tea Party groups”), following the release of an IRS inspector general’s report.  Since then, investigative reporting by some of the news media (most notably by USA Today) and the ongoing investigations by Congress have revealed more of the story.  But investigations are still being conducted, especially after the IRS official at the center of the scandal – Lois Lerner, head of the Tax Exempt Division – recently invoked the Fifth Amendment, refusing to answer questions (after making a brief opening statement denying culpability) in hearings by the House Oversight Committee on May 20.  Here’s a summary of what we know so far.  

    Sometime in the early spring of 2010, still-unidentified IRS workers in Cincinnati, Ohio (the main processing center for tax-exempt organizations) began singling out applications for tax-exempt status from certain groups (initially reported as groups with names containing “tea party,” “patriot” or other buzzwords).  As USA Today recently summarized it, “Applications from political groups warranted extra attention, but such attention should have been scrupulously neutral and non-partisan.  It wasn’t.  After singling out conservative groups, workers sent letters with intrusive questions to some.  Meanwhile, some liberal groups sailed through the process” (“IRS inquiry rests on four key questions,” USA Today editorial, May 21).  According to the newspaper’s breaking news story about the IRS scandal (“IRS gave a pass to liberals,” May 15), a review of IRS data shows that for a 27-month period, from March 2010 through May 2012, the IRS granted no Tea Party group tax-exempt status.  During that same time, however, the IRS approved applications from similar left-liberal groups; indeed, some groups with obviously liberal names (including words like “Progress” or “Progressive”) were approved in as little as nine months.    

    Besides targeting conservative groups for extra scrutiny – refusing or delaying their applications for tax-exempt status – the IRS also harassed such groups, making unusual document requests, asking for massive amounts of information the agency couldn’t possibly need to determine tax-exempt status.  For example, groups were asked to provide:  donor names, blog posts, transcripts of radio interviews, resumes of top officers, board minutes and summaries of materials passed out at meetings.  Some groups were asked about connections to other conservative groups or individuals; for example, the IRS demanded that the 1851 Center for Constitutional Law “explain in detail your organization’s involvement with the Tea Party.”  Even worse, IRS employees engaged in selective leaks, providing information gleaned from its special scrutiny of conservative groups to certain left-liberal groups.  ProPublica, a left-liberal-leaning nonprofit journalism organization, has revealed that the IRS had leaked to it nearly a dozen pending applications, including one submitted by Karl Rove’s Crossroads GPS (“Did IRS Try To Swing `12 Election?” I.B.D., May 16).  And law professor John Eastman has maintained, in a recent op-ed in USA Today, that the IRS disclosed to a “gay rights group,” the Human Rights Campaign (HRC, which favors same-sex marriage), the confidential tax returns of the conservative group he’s associated with, the National Organization for Marriage (NOM, which is opposed to same-sex marriage), and that this information was immediately republished by The Huffington Post and other left-liberal news media outlets.  He adds that the HRC and NOM are “the leading national groups on opposing sides in the fight over gay marriage,” and hence that the IRS leak of confidential information – including the identities of major donors – “fed directly into an ongoing political battle” (“IRS committed political sabotage,” May 16).  

    It should be noted that most groups applying for status as tax-exempt organizations under sections 501(c)(3) or 501(c)(4) of the Internal Revenue Code rely heavily on contributions from individuals for their funding, and having status as a tax-exempt organization is critically important for them to receive contributions (because it encourages contributors, who can then claim their donations as deductible expenses on their own taxes).  That’s especially true of “grass roots” organizations, composed of “ordinary” Americans, like most groups involved in the Tea Party movement.  The timing of the IRS “crack-down” on such groups is especially interesting, because these groups were generally credited with getting Republicans elected in the 2010 Congressional elections, allowing the GOP to regain control of the U.S. House of Representatives.  And they were expected to play an equally important role in the 2012 general elections.  (Although they might jeopardize their tax-exempt status by campaigning for particular candidates, they are entitled to educate voters and to inform them about the candidates – just as left-wing tax-exempt groups are.)  As the I.B.D. editors noted in their May 16 editorial, “Did IRS Try to Swing `12 Election?” during a hotly-contested election, the IRS “managed to put its thumb on the political scale by squelching political activity on the right.”  Some groups report that they curtailed their get-out-the-vote efforts, spending piles of money on legal fees or disbanding altogether in the face of IRS inquisitions.  Thus, the effort to delay or impede certain groups’ tax-exempt status had the effect of killing them, suppressing them, or at least minimizing their impact during the 2010 and 2012 election years. 

    As the editors of Investor’s Business Daily have noted, the IRS targeting of Tea Party groups is something that ought to concern all Americans:   

    “The formation in 2009 of hundreds of Tea Party groups, [all over the U.S.] . . . was a spontaneous reaction by ordinary American citizens to an overspending, oversized government whose overreach called into question its constitutionality.

     

    “Yet it was these very Tea Parties that were specially targeted by the IRS in its outrageously intrusive efforts to learn their every secret, to leak those secrets to a hostile, slanderous media, and to delay beyond justification their approvals as tax-free institutions in an effort to suppress them – while granting those permissions easily to leftist ones.”

     

    Noting that such groups were the “civil society” groups, the voluntary associations, that Alexis de Tocqueville lauded in his classic book Democracy in America, as groups that helped mitigate the dangers of majoritarian tyranny, the editors conclude: “The fact that the IRS tried and may have succeeded in destroying some Tea Party and other groups is a sign that the broader agenda wasn’t just suppression of political dissent, but imposition of the kind of tyranny found in places like Cuba, where law-abiding civil society has been replaced by fanatical government-directed mobs.”  That’s what Tocqueville warned about, and that’s why “these IRS transgressions ought to be treated as what they are – a fundamental attack on our democracy” (“An Attack on America’s Civil Society,” May 17).  And as the editors also noted in a previous editorial – referring to the IRS leaks of information from conservative groups to the left-liberal news operation ProPublica – “Democracy is threatened when a presidential campaign co-chair [referring to HRC’s president Joe Solmonese] can use IRS documents to attack a political opponent and a campaign donor gets to write the rules” [noting that heavyweight Obama donor and fundraiser Jeffrey Katzenberg, CEO of DreamWorks Animation, is a key supporter of ProPublica] (“Obama’s Internal Re-Election Service,” May 16). 

    It wasn’t just “Tea Party” groups that were targeted for special attention by the IRS, moreover.  The Washington Post (hardly a conservative paper) reports that of nearly 300 groups selected for special scrutiny, 72 had “tea party” in their title, 13 had “patriot,” and 11 had “9/12.”  According to documents obtained by the Post, IRS exempt organizations chief Lois G. Lerner – who “apologized” for the agency’s actions before she decided to plead the Fifth before the House Oversight Committee – objected at a June 29, 2011 meeting with IRS staffers in which they described giving special attention to groups where “statements in the case file criticize how the country is being run.”  Yet only six months later, on January 15, 2012, the agency decided to look at “political-action type organizations involved in limiting/expanding government, educating on the Constitution and Bill of Rights, social economic reform movement,” according to an appendix in the Inspector General’s report.  In 2010, the pro-Israel organization Z STREET filed a lawsuit against the IRS, claiming it had been told by an IRS agent that because the organization was “connected to Israel,” its application for tax-exempt status would receive additional scrutiny.  Reportedly, an IRS agent told a Z STREET representative that the applications of other Israel-related organizations had been assigned to “a special unit in the D.C. office” to determine whether the organizations’ activities “contradict the administration’s public policies” (“ObamaCare, IRS, and Tax Power,” I.B.D., May 14).  

    And here in Ohio, Maurice Thompson, the executive director of the 1851 Center for Constitutional Law – a non-profit organization that provides legal representation to Ohioans whose constitutional rights have been violated (including but certainly not limited to tea party groups or their members) – has revealed that in processing the 1851 Center’s application for tax-exempt status, which ultimately was granted, the IRS in its May 20, 2010 letter demanded that the Center “explain in detail your organization’s involvement with the Tea Party.”  (The Center applied for status as an educational and/or civil public policy charity under section 501(c)(3); it is a public interest law firm that litigates civil rights cases without engaging in politics.) (“IRS Targeting of 1851 Center in May of 2010 Demonstrates Broader Corruption,” 1851 Center press release dated May 16). 

    Congressional committees obviously ought to continue their investigations into this especially serious matter.  Eventually (sooner rather than later, I hope), an independent special counsel with subpoena power will be appointed.  It is “the only possible solution,” argues Lawrence Kudlow in a recent op-ed (“Revenuers Saw Tea Party as Political Threat,” I.B.D., May 20).  Kudlow notes that B.O. “has charged Treasury Secretary Jack Lew with straightening this out,” but adds that Jack Lew is “an Obama political operative.”  That’s putting it mildly: as Kudlow himself observed in an op-ed earlier this year (questioning Lew’s credentials to be Treasury secretary), Lew was not only B.O.’s former chief of staff but also an intensely partisan operative, far to the left of his predecessor, Tim Geithner – in Kudlow’s words, “a left-liberal Obama spear-carrier, whose very appointment signals a sharp confrontation with the Republican House over key issues such as the debt ceiling, the spending sequester, next year’s budgets and taxes.”  Only an independent special counsel can investigate “any possible connections with senior Treasury officials, connections that could lead to the Oval Office,” Kudlow concludes. 

    As noted above, there are strong parallels between the B.O. regime’s “IRS-Gate” scandal and President Nixon’s Watergate scandal, not only because they both involved alleged White House participation in illegal (or unconstitutional) activities by federal agencies (and, more importantly, abuse of power in the “cover-up” of those activities) – but also because some of the allegations against Nixon in 1974 also specifically involved the IRS.  The first clause in Article II of the articles of impeachment against Nixon approved by the House Judiciary Committee at the end of July 1974 alleged: 

    “He has, acting personally and through his subordinates and agents, endeavored to obtain from the Internal Revenue Service, in violation of the constitutional rights of citizens, confidential information contained in income tax returns for purposes not authorized by law, and to cause, in violation of the constitutional rights of citizens, income tax audits or other income tax investigations to be initiated or conducted in a discriminatory manner.”

     

    This allegation was followed by other clauses in the second Article of impeachment, accusing Nixon – again either “acting personally” or “through his subordinates and agents” – of misusing other federal agencies, including the FBI, the CIA, and the Department of Justice, to violate individuals’ constitutional rights.  Because Nixon resigned even before the articles of impeachment were voted on by the House of Representatives, his impeachment case was never tried in the Senate.  As far as I know (and I’ve read several books about the Nixon impeachment and the Watergate affair), there has been no credible evidence proving this allegation.  The so-called “White House enemies list” maintained by the Nixon administration – allegedly the source of names of persons to be subjected to harassment by the IRS and other federal agencies – has never been proven to be anything other than a “do-not-invite” list for White House social functions. 

    Thus, in a sense, the potential allegations against B.O. arising from “IRS-Gate” could be far more serious than this impeachment article against Nixon.  From what Americans (including the members of Congress currently holding hearings investigating the IRS abuses) already know, the IRS did abuse its power – not only by collecting information on certain individuals and groups, auditing them or otherwise threatening them, in an unlawful, discriminatory manner – but also presumably in a way that both deprived the groups involved of their constitutional rights and perhaps affected the results of the 2012 elections. 

    There was no “smoking gun” linking Nixon to the IRS allegations, as there was linking him personally to the allegations contained in Article I of the articles of impeachment – the main allegations, about obstruction of justice and abuse of power in covering up the break-in at the Democratic National Committee offices in the Watergate office building on June 17, 1972.  (For more on Nixon and the Watergate affair, see my essay “The Legacy of Watergate,” June 17, 2005 – posted on the 33rd anniversary of the Watergate break-in.)  But, in approving the articles of impeachment against Nixon, the House Judiciary Committee adhered to a broad standard of what constitutes impeachable offenses – a standard that held the president responsible for acts done “through his subordinates and agents.”  That standard is reasonable, given the office of the presidency as created by the Constitution, what some scholars call the “unitary” Chief Executive.  It also comports with Americans’ popular understanding of the office – “the buck stops here,” as the sign on President Harry Truman’s desk in the Oval Office famously read.  (Ironically, this broad standard of impeachment was also urged by a young Democratic staffer of the House Judiciary Committee, a recent graduate of Yale Law School named Hillary Rodham Clinton!) 

    Based on the precedents set by the Nixon near-impeachment, B.O. certainly could be held accountable for the illegal acts committed by the IRS during his watch – even if there’s no evidence of his personal involvement.  Morally and politically, if not legally, he ought to be held responsible for the actions of the IRS because of the way he has conducted his administration – which I have called a “regime” here on MayerBlog because it has been so lawless.  (Indeed, I have discussed why B.O. is “the most lawless president in U.S. history,” in both my “Rating the U.S. Presidents – 2012” (Feb. 22, 2012) and my more recent essay, “The Unconstitutional Presidency” (February 21).)  Besides bringing his “Chicago style” of gangster government to Washington, D.C., and thus generally encouraging a spirit of lawlessness (of disrespect for the rule of law) throughout the federal government, B.O. himself has taken the lead in demonizing conservatives and libertarians – particularly groups involved in the Tea Party movement – and has encouraged his fellow Democrats as well as his lapdogs in the news media to demonize them, precisely because of their opposition to his policies and their concern about the direction in which the United States is headed.  

    Given the highly-charged partisan atmosphere that B.O. has cultivated in Washington, D.C., is it at all surprising that someone at the IRS – whether they were “minions” or top-level administrators – took B.O.’s angry outbursts about Tea Party groups, Republicans, conservatives, libertarians, etc., as if they were an order to target them for special treatment?  That’s what I meant above, when I referred to “IRS-Gate” as possibly B.O.’s “Becket moment.”  Just as Henry II encouraged the murder of Thomas Becket, B.O. has encouraged the violation of certain Americans’ rights.  

    But there may even be a “smoking gun” – evidence of B.O.’s personal involvement in the abuse of IRS powers.  According to a special report by Jeffrey Lord just published in the American Spectator, evidence of that smoking gun might be found in the White House visitor logs, which show that on March 31, 2010 someone named Colleen Kelley – who happens to be president of the National Treasury Employees Union, an anti-Tea Party group that includes the unionized employees of the IRS – had an appointment with “POTUS,” that is, with B.O., the President of the United States.  The very next day after her White House meeting with the president, according to the Treasury Department’s Inspector General’s report, IRS employees – the very same employees who belong to the NTEU – set to work in earnest targeting the Tea Party and conservative groups around the U.S. (“Obama and the IRS: The Smoking Gun?”, May 20).

     

      

    All the News That’s Fit To Control – Or To Quash            

    “IRS-Gate” is one of three major scandals – called a “trifecta” of scandals by conservative political columnist George Will – that currently are dogging B.O. and his regime.  The other two concern Benghazi (the militant Islamist attack on the U.S. consulate in Benghazi, Libya last September 11, and the B.O. regime’s subsequent cover-up, which I will discuss in Part II of this essay) and the seizure of Associated Press phone records by the Justice Department.  Perhaps we could call this third scandal “AP-Gate”? 

    Like the IRS scandal, “AP-Gate” involves serious allegations of abuse of power, this time by the B.O. regime’s Department of Justice (DOJ, which maybe should be called the “Injustice Department”), headed by Attorney General Eric Holder.  On Monday, May 13, just a few days after the IRS scandal broke (which, as noted above, was on Friday, May 10), the Associated Press learned that the Justice Department had conducted a sweeping seizure of its phone records, spanning a two-month period.  The subpoenas under which the DOJ seized the phone records were, as USA Today describes them in an editorial, “so broad they covered outgoing calls from the AP’s general numbers in New York, Washington, and Hartford, Conn., and for reporters who cover Congress.  The intrusion didn’t stop there.  Records of office, home and cellphones of some individual reporters were also seized as part of an investigation into leaks about a failed al-Qaeda plot last year” (“Quest to plug leaks yields torrent of abuse,” May 15).   

    The Justice Department insists that the subpoenas fell within lawful guidelines; and in recent testimony before a House panel, Holder defended the action as necessary in order to foil a terrorist plot (“This is among the top two or three [most] serious leaks that I’ve ever seen,” said Holder. “It put the American people at risk,” he asserted, without elaboration.)  But AP CEO Gary Pruitt said “there can be no possible justification” for the action, considering its wide-ranging scope.  And House Judiciary Committee chairman Rep. Bob Goodlatte (R-Va.) said the Justice Department requests “appeared to be very broad and intersect important First Amendment freedoms.”  He added, “Any abridgment of the First Amendment right to the freedom of the press is very concerning.” 

    It’s ironic that the AP was the DOJ’s target, considering that the AP generally has a left-liberal bias, so much so that it may be considered a lapdog for the B.O. regime.  (I jokingly say “AP” stands for “administration propaganda.”)  But this time, when its own ox is being gored – when its own news reporters are being investigated – the AP suddenly becomes aware of potential abuse of power by the B.O. regime, and the violation of its staffers’ constitutional rights.  Little wonder that, of the recent “trifecta” of major scandals, it’s this one that has received the most attention from the news media.  As the editors of Investor’s Business Daily recently noted, “The AP flap has drawn a properly outraged response from the news agency, because the White House’s obsessive efforts to find leaks cast such a broad, indiscriminate net against reporters just doing their jobs.” 

    And just as the IRS scandal has been shown to involve more than just “Tea Party” groups, the Justice Department’s abuse of its investigative powers apparently involves more media persons than AP reporters.  In an even more disturbing news story, it has been revealed that the Justice Department investigated Fox News reporter James Rosen and two other newsmen, and even threatened to treat Rosen as an unindicted “co-conspirator,” in another matter involving an alleged leak of national security information.  As the I.B.D. editors note, “it was a tiny leak of a matter – North Korea’s plan to launch more nuclear tests if United Nations sanctions went through in 2009 – that would have changed nothing, whether reported or not.”  Yet it somehow merited a case against Rosen, “for the kind of reporting that Obama’s allies in the New York Times and Washington Post conduct all the time.”  “And yes,” they add, “we think the White House did it because Fox News is a dissident news organization” – to which one might add, a news organization that B.O. and the Democrats have routinely demonized.  “For a while, it looked like the White House wanted just to control `the narrative.’  But its seizure of AP phone records and surveillance of Fox employees now show its real aim: to control the news” (“The Obama Objective: To Control the News,” May 21).   

    One of the questions yet to be answered in this scandal is whether the B.O. White House and the Holder Injustice Department will stonewall Congressional efforts to investigate it, by claiming not only “national security” considerations but also “executive privilege,” as they have done with efforts to investigate the Department’s “Fast and Furious” project, involving illegal gun-running in Mexico.

      

     

    Francis, the Socialist Pope? 

                In “Too Pooped To Pope,” the section on the retirement of Pope Benedict XVI in this year’s “Spring Briefs” (March 23), I briefly discussed the new pope, the former Cardinal Jorge Mario Bergoglio, archbishop of Buenos Aires, Argentina, who has taken the name Francis (after Saint Francis of Assisi).  As the first pope to come from the Americas, as well as the first Jesuit pope, Francis is “historic.”  Both his Jesuit background and his experience in Argentina will shape his character as pope – for good or ill.  In my March entry, I noted some reasons to be both cautious and optimistic about Francis: 

    Wisely, the new pope has fought against the Marxist “liberation theology” movement that swept Latin America in the 1980s; he also has been an outspoken critic of Argentine President Cristina Fernandez’s leftist government, using his homilies to criticize economic and social conditions as well as political corruption.  Some commentators – such as the editors of Investor’s Business Daily – have idealistically predicted that Pope Francis might be an opponent of leftist dictatorships in South America (including Venezuela, Ecuador, and Bolivia) the same way Pope John Paul II opposed communist dictatorships in eastern Europe (“Pope and Change in the Vatican,” March 15).  Nevertheless, Francis sees himself as a champion of the poor and outcast; and he shares with leftist “liberation” theologians the same perverted notion of “social justice” that is rooted in a moral philosophy of altruism, or self-sacrifice, which, along with the Church’s adamant stance against contraception, is responsible for most of the poverty in the “Third World.”  Like his predecessors, Pope Francis is unlikely to embrace the only true hope for happiness and prosperity in the world:  free-market capitalism.

     

                It seems that the I.B.D. editors were overly optimistic, however – and I was rather prescient.  The editors recently reported that Francis has spoken about “the tyranny of money” and called for countries to impose more control over their economies to prevent “absolute autonomy” (in other words, individual freedom and responsibility) and foster the “common good” (in other words, government tyranny).  As the editors noted, it appears that Pope Francis “has been infected by the local economic pathologies of his homeland, Argentina, and its liberation theology among the Jesuits.” 

                The editors added that the pope’s policy prescription “has already been tried in Argentina,” where it “has driven millions of Argentines into poverty.”  It’s also been tried in socialist Venezuela (under the dictator Hugo Chavez, who is still dead), “where expanded control over the economy has brought capital controls, and predictable-as-nightfall shortages, the latest of which is of toilet paper, joining corn, flour, sugar, salt and every foodstuff the government has a hand in.”  Reminding readers that the pope has professed his goal of ensuring “dignity” for the poor, the editors ask: “Is it dignified for the poor to live without toilet paper as a result of government control?”  By contrast, “everywhere free enterprise is practiced, prosperity follows, reaching even to the poorest people.”  A better model is provided by Argentina’s successful neighbor, Chile, which in 1974 adopted “the most far-reaching series of free-market reforms in history, driving a once-poor, socialist ravaged country to the ranks of the first world within 30 years”: 

    “Trade was freed, bureaucracy cut, its currency stabilized, its debts paid, and pensions were privatized, turning workers into capitalists and creating a vast pool of capital to develop the country.  As a result, Chile has Latin America’s highest educational level, its highest rate of charity-giving, its lowest infant mortality, lowest corruption, and the highest economic freedom on the continent.”

     

    “The hard fact here is that free markets free people to do and be their best.”  If Pope Francis really wants to help the poor – to make them less poor – “he should take a second look at what capitalist societies have achieved” (“Toilet Paper Lesson,” May 20).

        

     

    n The “Most Dangerous Man in America” 

                Cass Sunstein, former University of Chicago and Harvard law professor, was called “the most dangerous man in America” by political commentator Glenn Beck, when Sunstein served as the B.O. regime’s “regulatory czar” (administrator of the Office of Information and Regulatory Affairs).  Beck had a valid point, for Sunstein epitomizes the elitist “philosopher-king” ideology of the so-called “Progressive” movement.  Now that Sunstein has returned to academia, however, Beck has a new candidate for the title of “most dangerous man in America.”   He’s another “progressive” reformer who poses a major threat to Americans’ liberties:  Michael Bloomberg, the billionaire who is mayor of New York City. 

    I’ve previously written about “Big Brother” Bloomberg, as I call him because he can’t seem to stop imitating “Big Brother,” the paternalistic dictator of George Orwell’s dystopian novel 1984 (see “Summer 2012 in Review” (Sept. 12, 2012) and this year’s “Spring Briefs” (March 23)).  Bloomberg’s efforts to control the lives of New Yorkers have included a ban on smoking in public places, a ban on trans-fats in restaurant food, a ban on sales of soft drinks in cups or containers larger than 16 ounces, and restrictions on hospitals’ distribution of infant formula to new mothers.  Not only have these measures deprived New Yorkers of their freedom, but they have inspired other municipal governments across the United States to imitate Bloomberg’s paternalism, by enacting similar “nanny state” measures of their own.   

    More recently, Bloomberg has been one of the most vocal anti-gun fascists in the United States.  Since the Newtown, Connecticut school massacre, he has demagogued the issue of “gun violence,” and in his role as leader of the organization “Mayors Against Illegal Guns” he has been pushing for various measures – including a new federal ban on so-called “assault weapons” and restrictions on ammunition – increasing government control over firearms.  (Thus, besides being an enemy of individual freedom and responsibility generally, Bloomberg is also an enemy of Americans’ Second Amendment rights.)

    He has spent a small fortune – about $12 million, by some estimates, a small fraction of his personal fortune – to air across the USA a series of TV ads promoting gun control.  (One of these ads, ironically titled “Responsibility,” features a self-proclaimed “gun owner” – or an actor hired to portray a Midwestern gun owner, dressed up like a “bubba,” consistent with left-liberal elitists’ stereotype of rednecks who “cling to their guns” – who broke the NRA’s first two rules of gun safety: (1) ALWAYS keep the gun pointed in a safe direction, and (2) ALWAYS keep your finger off the trigger until ready to shoot.  As one blogger rhetorically asked, “Is the actor in the video simply an irresponsible gun owner or does he just play one on TV?” (“Michael Bloomberg’s Anti-Gun Ad Flops,” MrConservative.com, March 27).)

    In “Spring Briefs” I called Bloomberg a “megalomaniac,” but perhaps a better term is “petty tyrant” – a petty tyrant who has been abusing his power as mayor to force his own half-baked ideas about personal health on the people unfortunate enough to live in his city.  He’s abusing the legitimate “police power” that state and local governments possess (which concerns only public health – preventing epidemics of infectious disease and the like – not the private health of individuals).  Bloomberg’s disdain for individual freedom and responsibility extends even to the Constitution that is designed to protect individual rights – the Constitution that he, as a government official, took an oath to defend.  After the Boston Marathon bombing, Bloomberg pompously declared, “We live in a complex world where you’re going to have to have a level of security greater” than we’ve had.  “Our laws and our interpretation of the Constitution, I think will have to change.” 

    By himself, Bloomberg isn’t “the most dangerous man in America,” after all.  But he does epitomize a type of politician who truly has been most dangerous, since the so-called “Progressive” movement of the early 20th century:  a so-called “progressive” reformer, who is really a reactionary – harkening back to a centuries-old paternalistic tradition in government (as I discuss in my essay “Reactionary `Progressives’,” Mar. 16, 2006).  Merely corrupt politicians aim at enriching themselves at the public expense.  But “progressive” reformers are elitists who aim at controlling other people’s lives.  (Politicians hungry for power are always more dangerous than politicians merely hungry for material wealth.)   

    Thankfully, “Big Brother” Bloomberg’s tenure as NYC mayor will soon end because he is term-limited (further evidence of the value of term limits!).  Among those who have announced their candidacy to succeed him is former Congressman Anthony Weiner, who resigned from Congress in 2011 after a scandal involving his over-exposure on social media (more precisely, after tweeting “lewd” photos of his crotch, both covered and uncovered).  How appropriate, to have one Weiner succeed another!

      

     

    n A Lesson from Neal Boortz 

                I’ve been reading Neal Boortz’s autobiographical book, Maybe I Should Just Shut Up, published earlier this year in e-book format, to mark Boortz’s retirement from talk radio.  (Boortz retired on January 18, after 42 years on the radio.  Thanks to Justin Gruver for this splendid video clip marking the occasion.)  The book is chock full of entertaining and informative stories about Boortz’s long career.  Among them is the chapter on “The Public’s Airwaves,” a devastating critique of the myth that underlies government regulation of radio and TV broadcasting.  

    There’s also the following story – a kind of parable, or a thought experiment – from the chapter “Why Liberals Suck at Talk Radio.”  Boortz writes: 

    “Here’s the simple, basic, easy-to-understand scenario I love to present to liberals, just to see if they can successfully defend a basic tenet of liberalism – government-enforced charity.

     

    “I’ve gone through this routine countless times on the air with liberal callers, and not one of them – not one – has ever successfully withstood the challenge.

     

    “I’ll be the compassionate, caring, and loving liberal for this scenario, and you can be the selfish, stingy, greedy conservative.  As we walk along a downtown street, discussing the efficacy of a degree in gender studies in an increasingly technological society, we encounter a panhandler.

     

    “This poor guy doesn’t have any arms or legs, yet he’s playing 26 musical instruments simultaneously.  There’s a bucket in front of him with a few dollars in it.  Behind the bucket is a sign reading, `Tips please.  Thank you.  Stump the Band.’

     

    “Being the compassionate liberal, I pull out a 5 and put it in Stump’s bucket.  Then I turn to you and tell you to put $5 in the bucket, too.  You say that you just don’t have the money to spare.  There are bills to pay and mouths to feed.

     

    “OK, here’s the question:  At that point, would it be OK if I were to pull a gun out of my pocket, put that gun to your head and tell you that you don’t have a choice?  Can I simply use deadly force to make you put a 5-dollar bill in that bucket?  I doubt that you’re going to say, `Sure!  Go ahead.  I don’t mind!’

     

    “Your likely answer here is that I have no right to force you to give.

     

    “As I said, I’ve never had a liberal tell me that it would be OK for me to pull a gun out of my pocket and force them to give to the panhandler.  Yet, this is what they do, using the police power of government to seize and transfer (or redistribute, to use Obama’s favorite word) wealth, day after day.

     

    “OK, fine.  Here’s my follow-up question:  If government derives its power from the consent of the governed, and if I, part of the governed, can’t use deadly force to make you give money to someone else, how then can I ask the government to do that for me?”

     

    It’s a splendid parable that gets right to the heart of the perverse morality of the “welfare state,” of paternalistic government.  Of course, when Boortz uses the term liberal, he means left-liberals – not “liberals” in the classical sense of the term, synonymous with “libertarians” (like Boortz himself – or me).   

    What distinguishes libertarians from non-libertarians (whether they’re leftist “liberals” or conservatives) is that libertarians understand that government is force – potentially lethal force – and that whenever someone says, “There ought to be a law” to do X, they’re really advocating the use of force, or threat of force, to compel someone to do something (or to prohibit them from doing something) against their will.  That’s why libertarians support minimizing governmental power:  they want to reduce to a minimum the use of force, the coercive power of the law, to control people’s lives and instead want to maximize individual freedom and responsibility, allowing individuals to voluntarily enter into contracts with others, for their mutual benefit.  In other words, to replace coercion with cooperation. 

    It’s sad that Boortz is retiring at just the time when America needs most an entertaining, informative, and principled libertarian commentator on talk radio.  Rush Limbaugh, Sean Hannity, and Glenn Beck – each of whom is more conservative than libertarian – are not the consistent advocates of individual liberty, in all its aspects, that Neal Boortz was.  (Even Boortz had a few blind spots – issues where he was more conservative than libertarian:  his advocacy of government control over immigration, for example.)  Let’s hope that someday soon the nation’s airwaves will have a worthy successor to Boortz – someone who, like him, will rise from obscurity to national prominence, if only given the opportunity.

      

     

    n The End of an Era

               “Tonight, tonight,

                Won’t be just any night,

    Tonight there will be no morning star . . . .”

     

            -- “Tonight,” lyrics by Stephen Sondheim, from West Side Story (1957) 

     

                In early April, NBC announced that by spring 2014, Jay Leno will be retiring as host of The Tonight Show, to be replaced by Jimmy Fallon, currently host of NBC’s Late Night.  It’s a “generational shift,” reported USA Today, noting that NBC’s iconic Tonight Show has had only five hosts thus far in its 59-year history:  Steve Allen, Jack Paar, Johnny Carson, Jay Leno, and (briefly, before Leno’s return) Conan O’Brien (“`Tonight Show’ revs up for a new generation,” April 4).

                Robert Bianco, USA Today’s TV critic, praised NBC’s announcement, calling the shift from Leno to Fallon “a safe, easy, logical transition.”  He notes that NBC acted appropriately in making “public and official what it had clumsily allowed to become public and unofficial, via rumor mill, in March.”  This time, there were “no on-air battles à la Leno and Conan O’Brien” (referring to NBC’s embarrassing experiment in 2009-10, when the network had given the Tonight Show briefly to O’Brien, only to pull the plug in less than a year when Leno’s ill-fated prime-time show tanked and O’Brien’s ratings faltered).  Nor were there protracted media debates – “late-night wars” – over who should get the job “à la Leno and David Letterman” (which had prompted Letterman’s move from NBC to CBS).  Still, Bianco adds, when the transition is done in 2014, “The Tonight Show that once was – that dominant late-night program that launched stars and set the cultural conversation – will finally be gone for good” (“`Tonight’ takes the safe path,” April 4). 

                Truth is, The Tonight Show started to go downhill when Johnny Carson retired in 1992, after hosting the show for a record 30 years.  Carson truly was the “king of late night.”  He was a masterful comedian (with especially good comedic timing) and yet also could be a serious interviewer (really serious – with such guests as philosopher/novelist Ayn Rand).   

                Nevertheless, it’s not just luck that explains Jay Leno’s 22 years as Tonight host (1992 – 2009, 2010 – present).  Leno is a funny, likeable guy.  Although not as skillful as Carson, he’s a fairly good interviewer and has some regular segments on the show that are hilarious.  (I especially like his “Headlines” segments, but also enjoy his “Jaywalking” quizzes and the occasional “Battle of the Jaywalk All-Stars,” even though it can sometimes be depressing to see how dumb many “average” Americans are.)  And Leno is nonpartisan in his political humor (as Carson was), poking fun at politicians from both major political parties, and showing a great deal of common sense in his monologues, as he makes humorous observations about current events in politics as well as pop culture. 

                Leno is, above all, a good sport.  He’s been treated abominably by NBC, even though he continues to hold the lead in ratings for the 11:35 p.m. time slot (regularly beating David Letterman on CBS and even his new competition on ABC, Jimmy Kimmel).  Leno even helped smooth the way for Fallon’s succession, appearing with Fallon in a video duet parodying the song “Tonight” from West Side Story.  And he graciously bowed out in an official announcement on his show.  “Congratulations, Jimmy,” he said.  “I hope you’re as lucky as me and hold on to the job until you’re the old guy.”

                Jimmy Fallon is not a worthy successor – certainly not to the great Tonight Show hosts from the past (Allen, Paar, and Carson), nor even to Leno.  He’s relatively young (38), with a “boyish enthusiasm and natural skills as a performer,” according to NBC publicists.  But Fallon’s sense of humor strikes me as smarmy, even smug – reminiscent of his stint as the fake newsman on Saturday Night Live.  Worse, he has an overtly leftist political bias; he has used his late-night show not only to regularly poke fun at Republicans but also as a campaign platform for B.O. during the 2012 election season.  (Remember when B.O., the “Preezy” of the United “Steezy,” slow-jammed on Fallon’s late-night show?)  When Fallon takes over in 2014, the Tonight Show will return from L.A. to New York City, in a shiny new studio in Rockefeller Center, where it will be produced by SNL’s Lorne Michaels.  It will become, essentially, a weak week-night version of SNL, followed by a Late Night hosted by Jimmy Fallon’s successor, another SNL alum, Seth Meyers.  (Seth is the less-talented older brother of Josh Meyers, one of the cast members of MADtv and, briefly, That ’70s Show.)

                Until Leno’s departure, I’ll still try to watch Jay Leno’s “Headlines” segments on Monday nights (usually).  On other nights when I’m still awake after 11:30, I’ll be more frequently watching the Jimmy Kimmel show on ABC – a show that’s far more entertaining than either The Tonight Show or the mean-spirited David Letterman show.  That Jimmy, like Leno, is non-partisan in his humor, poking fun at Democrats as well as Republicans.  (In other words, he’s an “equal opportunity” offender when he jokes about politics.  Not surprising, as Kimmel started his TV career co-hosting with Adam Carolla the hilariously politically-incorrect Man Show and then moved on to hosting Win Ben Stein’s Money.)  Indeed, I usually try to stay awake on Thursday nights (or occasionally on Friday nights), to watch my favorite segment on Jimmy Kimmel’s show:  “This Week in Unnecessary Censorship” – his truly hilarious “tribute” to the f*c*ing FCC!  (See my essay “Abolish the F*CCing FCC,” Feb. 8, 2006.)

     

      

    n (Recommended) Summer Reading  

                As I noted last year, the “lazy, hazy, crazy days” of summer are perfect times for reading my favorite genre of pleasure-reading books:  historical murder mysteries.  I’ve recently updated my “Guide to Historical Mysteries,” to include the latest novels by some of my favorite authors.  Among these are two new mysteries set in ancient Rome – Robert Colton’s Pompeii and Alan Scribner’s Mars the Avenger – as well as a second mystery by Bruce MacBain involving Pliny the Younger (The Bull Slayer) and another book in Ruth Downie’s “Medicus” series (Semper Fidelis), concerning a doctor with the Twentieth legion stationed in Roman-occupied Britain.  Lindsey Davis, author of the splendid Marcus Didius Falco series, has a new ancient Roman novel coming out in mid-June, The Ides of April, with a female protagonist, Flavia Albia, Falco’s adopted daughter.  (It’s the first book in a promising new series, described by one reviewer as “Falco: The Next Generation.”)  Gary Corby’s Sacred Games is the third book in his critically-acclaimed series set in classical Athens.  There are also two new mystery novels set in Victorian England:  Gyles Brandreth’s Oscar Wilde and the Murders at Reading Gaol (the newest title in Brandreth’s Oscar Wilde mystery series), and (coming out in mid-July) Peril in the Royal Train, the tenth title in Edward Marston’s “Railway Detective” series.

                Another great series of books that are perfect for summer reading are Patrick O’Brian’s Aubrey/Maturin novels.  Set mostly aboard ships of the British Royal Navy during the era of the Napoleonic wars, the 20-volume series realistically depicts naval warfare of the era, along with other adventures in exotic locales all over the world.  But, most of all, it focuses on the compelling story of the friendship between Captain Jack Aubrey and Stephen Maturin, ship’s surgeon and also spy for the British government.  Indeed, it was the story of the Aubrey/Maturin friendship that was the centerpiece of the splendid 2003 film based on the first and tenth novels in the series, Master and Commander: The Far Side of the World, starring Russell Crowe and Paul Bettany, respectively, as Aubrey and Maturin.  The movie got me interested in the first book in the series; reading that book, along with the next several books, got me hooked on the series.

                And with the planned Fall 2014 release of the movie Atlas Shrugged Part 3 (the third and final installment), this summer is also the perfect time for people who haven’t yet read Ayn Rand’s magnificent novel, Atlas Shrugged  (especially people who saw Parts 1 and 2 of the movie and were intrigued about the book on which they were based) to experience it for the first time – and for people who have read the novel to read it again.  Rand’s philosophic novel, although written over 50 years ago, brilliantly discusses the critical issue of our time:  freedom vs. statism, or individualism vs. collectivism.  Rand herself acknowledged the book’s relevance to then-current cultural, economic, political, and legal issues in her 1964 talk “Is Atlas Shrugging?” (subsequently printed as an essay in her book Capitalism: The Unknown Ideal), and a video produced by LibertyPen and posted on YouTube, narrating excerpts from Rand’s essay juxtaposed with stories from today’s news headlines, underscores the fact that the book is even more relevant (or prescient) today.  

                As for non-fiction books (in biography, history, politics, and public policy), there are several that I recommend.   

    Judge Andrew P. Napolitano’s recent book, Theodore and Woodrow: How Two American Presidents Destroyed Constitutional Freedom (Thomas Nelson, 2012) discusses the destructive impact, on American government and constitutional law, of the two so-called “progressive” U.S. presidents of the early 20th century, Theodore Roosevelt and Woodrow Wilson.  TR and Wilson engineered the greatest shift in power in American history, from individuals and the states to the national government.  Among their disastrous legacies are the “progressive” income tax, the Federal Reserve, presidentially-initiated wars and other foreign military interventions, a bloated federal bureaucracy including administrative agencies that dictate virtually all aspects of Americans’ lives, and a national government that constantly encroaches on business, coddling special interests and discouraging true marketplace competition.  The book also contains an interesting postscript, added by Judge Napolitano as the book was going to press, dealing with Chief Justice John Roberts and his disappointing opinion on “ObamaCare.”  (Napolitano compares Roberts with his overrated early-19th-century predecessor, John Marshall, who also was a political hack who expanded the scope of federal power.) 

    John Allison, president and CEO of the libertarian Cato Institute (and retired chairman and CEO of BB&T), draws upon his considerable experience as a banker/businessman and his knowledge of (and appreciation for) free-market capitalism in his book The Financial Crisis and the Free Market Cure (McGraw-Hill, 2013), discussed in the section on “Déjà Vu,” in Part II of this essay.  He shows how bad government regulatory policies caused America’s financial “meltdown” in 2008, the Great Recession and the subsequent weak “recovery.”  More than an astute diagnosis, however, the book also explains what is needed to “cure” these problems – not more misguided government interventions (which have only made matters worse) but, rather, a true free market in financial services based on sound philosophical principles (the principles that once made America great). 

    Two other new books on public policy expose the disastrous policies of B.O., the current “occupier” of the White House, as he begins his second term in power.  First, David Harsanyi, the libertarian-conservative columnist (and author of Nanny State), in his book Obama’s Four Horsemen (Regnery, 2013) discusses (as the subtitle of the book says) “the disasters unleashed by Obama’s reelection” – the four great catastrophes that threaten to destroy the United States of America we know and love:  debt (federal deficits and a national debt that has hemorrhaged out of control), dependency (a paternalistic federal bureaucracy that controls our lives), surrender (a foreign policy that weakens America’s ability to defend itself and its allies), and death (modern Democrats’ promotion of abortion as a positive good to be subsidized at taxpayer expense).  And second, economist John Lott (author of the best-selling books More Guns, Less Crime and Freedomnomics), in his new book At the Brink (Regnery, 2013) provides a stunning assessment of what B.O. has done thus far – and what he intends to do in his second term – as he continues his “transformation” of America.  He documents the various ways in which the United States today is “at the brink” of catastrophe, economically and socially:  among other things, how the “stimulus” was the most expensive economic failure in history, with consequences that could last for generations; how the continually-growing national debt will have a devastating impact on Americans’ economic security; how “ObamaCare” will result in ever-soaring health care costs and will destroy the best health-care system in the world; and, of course (given Lott’s expertise on the subject of gun rights), how the B.O. regime’s gun-control agenda would in fact make crime worse.  I especially like Chapter Four (on B.O.’s “war on business”) and Chapter Five (on taxes, which exposes, among other things, the bullshit of so-called “fair” tax rates).

    Amity Shlaes’ biography of the 30th president of the United States, Coolidge (HarperCollins, 2013), is another must-read book for anyone who is concerned about the current state of our national government and who wants to understand the myths on which the 20th-century regulatory “welfare state” has been built.  Coolidge was unfairly vilified by the “progressive” and New Deal historians, who have wrongfully portrayed him as a “do-nothing” Republican president during the frivolous decade of the “Roaring” 1920s, which led to the Great Depression of the 1930s, “the inevitable hangover, for which FDR administered the cure,” as columnist Michael Barone has aptly characterized it, in his positive review of Shlaes’ eloquent biography (Barone, “Calvin Coolidge Finally Getting Credit He’s Due,” I.B.D., March 4).  Thus this biography of Coolidge may be considered yet another splendid addition to the growing body of scholarship (including Shlaes’ earlier book, The Forgotten Man) that has provided a fresh interpretation of the Great Depression, showing how it was caused by wrong-headed government fiscal policies and made worse by FDR’s New Deal, which perpetuated and expanded those policies.

    It’s no accident that Ronald Reagan admired Coolidge so much that one of his first acts as president was to direct the White House curator to hang Coolidge’s portrait in the Cabinet Room.  In my rating of the U.S. presidents, I rank Coolidge as the “greatest president of the 20th century because he did the least damage to the Constitution” (see “Rating the U.S. Presidents 2012,” Feb. 22, 2012), and Shlaes’ book nicely provides the details that justify my assessment.  Coolidge is known as “Silent Cal,” a man of few words.  But it’s his policy of economy in government that most merits Americans’ admiration today.  Shlaes calls him “our great refrainer.”

               When he assumed the office of president after Warren G. Harding suddenly died in August 1923, Coolidge made the important decision of continuing Harding’s program of cutting taxes, tariffs, and government expenditures.  “I am for economy.  And after that I am for more economy,” the 30th president said.  He met regularly with the director of the new Bureau of the Budget, paring spending any way he could; and he vetoed spending bills – including veterans’ bonuses and farm subsidies – popular with Republican members of Congress.  At the same time, he pressed Congress for tax cuts.  He followed the advice of his Treasury Secretary, Andrew Mellon, who advocated “scientific taxation,” which anticipated supply-side economics theory by prescribing lower tax rates, to stimulate economic growth and increase revenues.  After Coolidge won a full term in 1924, the top income-tax rate was reduced from the wartime 70 percent to 25 percent.  As Michael Barone succinctly summarizes the results:  “An economy that lurched from inflation to recession between 1918 and 1922 suddenly burst into robust economic growth,” allowing Coolidge to achieve budget surpluses every year – “surpluses that he used to pay down the national debt.”  In a fateful decision, in the summer of 1927 while vacationing in the Black Hills of South Dakota, Coolidge announced, simply, “I do not choose to run for president in 1928.”  Barone notes that all indicators showed that Coolidge would have won a second full term – meaning he would have been in office when the stock market crashed in October 1929.  Instead of the perverse economic policies of his successor, the “progressive” Republican Herbert Hoover (who initiated the New Deal policies made even worse by his successor, FDR), we might have had the “great refrainer,” whose policies of fiscal restraint could have avoided, or at least lessened, the Great Depression. 

                Finally, there is an important new book on a topic I’ve written about frequently here on MayerBlog: the claim that Thomas Jefferson fathered the children of his slave Sally Hemings, what I’ve called “the Sally Hemings myth.”  (See my previous essays “Jefferson’s Legacy of Liberty” (April 11, 2013) and “The Truth About Thomas Jefferson” (April 12, 2012).)  M. Andrew Holowchak’s Framing a Legend: Exposing the Distorted History of Thomas Jefferson and Sally Hemings (Prometheus Books, 2013), is a worthy addition to a growing body of scholarship that should be called “corrective” history (rather than “revisionist” history) because it corrects the erroneous interpretation made by proponents of the Jefferson–Hemings paternity thesis and therefore helps shatter the Sally Hemings myth. 

                Holowchak is a professor of philosophy, and in the first half of his book (“Three Prominent Spins”) he applies a philosopher’s training to his devastating critique of books by proponents of the paternity thesis (Fawn Brodie, Annette Gordon-Reed, and Andrew Burstein), exposing such logical defects in their arguments as selective use of evidence, ungrounded speculation, and over-psychologizing.  In the second half of the book (“Unframing a Legend”), Holowchak focuses on Jefferson’s philosophic thinking, showing how evidence of Jefferson’s character and racial attitudes would make a liaison with Sally Hemings quite unlikely.  Although Holowchak does not cite the 2001 Scholars Commission report in the main text of his book, he adds an addendum summarizing its findings – and has a foreword written by the Scholars Commission chairman, Professor Robert Turner.  And throughout the book he frequently cites my own concurring opinion, from the Scholars Commission report, particularly with regard to my views on how the Hemings myth has “politicized” history, due to the influences of political correctness, radical multiculturalism, and postmodernism.

                Anyone who wants to know the truth about Thomas Jefferson’s life ought to read Holowchak’s splendid book.

     

     

    n Summer 2013 Movies: A Preview  

                Summer movies tend to follow one or more grand themes, and this summer the grand theme seems to be big-budget “action” films, particularly in the superhero and sci-fi genres, with an emphasis on sequels and prequels.  In other words, it will be another summer in which moviegoers can expect little originality – just more of the same “tried and true” models for summer box-office success.  Here’s my summary of the movies I’ll be watching for – films I anticipate either seeing or (in some cases) avoiding. 

                Already playing in the theaters are the two films most likely to be the big blockbuster hits of the summer.  Iron Man 3 (which premiered May 3) grossed $175 million in just its first weekend.  It’s a sequel, not to the forgettable Iron Man 2 but rather to last summer’s hit The Avengers, for it picks up soon after that adventure (involving the whole Marvel Comics superhero crew defending against an alien attack on New York City).  Robert Downey Jr. is back in the title role, with Gwyneth Paltrow as his assistant/love interest.  This installment in the franchise isn’t just all special effects, for it has a compelling story as well as an impressive performance by Ben Kingsley as the villain.  (Or, as I should say, the supposed villain – SPOILER ALERT! – for it is Guy Pearce as the real villain who gives the most impressive performance in the movie.)  The second movie, Star Trek: Into Darkness (which premiered May 17), took in $71 million in its first weekend (somewhat less than the projected $90–100 million).  The sequel to director J.J. Abrams’ “reimagined” Star Trek (2009), it brings back the younger cast (headed by Chris Pine as Kirk and Zachary Quinto as Spock), in a story that continues in many ways to challenge the original Star Trek canon.  It has mind-blowing, explosive special effects (especially mind-blowing for those viewers reckless enough to watch it in 3-D), and plenty of inside jokes for veteran Trekkers (not “Trekkies,” as non-fans erroneously call them).  At times, with its fast-paced chase sequences, the movie felt to me more like a Bruce Willis Die Hard film than a Star Trek movie.  And as in Iron Man 3, the standout performance in Star Trek: Into Darkness comes from the villain, played brilliantly by British actor Benedict Cumberbatch, known to American audiences from his performance as a modern-day Sherlock Holmes in the BBC series Sherlock.  (Sorry, another SPOILER ALERT! – he plays the new Khan, for the movie as a whole is a kind of homage, a pastiche or even a parody, of the original Star Trek II: The Wrath of Khan.)

                The other big sci-fi film of May is After Earth (May 31), directed by M. Night Shyamalan.  Set 1,000 years in the future, it’s about a father and son (Will and Jaden Smith) stranded after crash-landing on a savage, inhospitable planet . . . called Earth.  The premise may be just crazy enough to make it work – helped out by the real chemistry between father and son. 

                Other movies premiering in May (and therefore either already playing or soon to play in movie theaters) include several films that I plan to avoid because I have no desire to waste my money seeing them (even during a discount weekday matinee showing).  The Great Gatsby (May 10) stars Leonardo DiCaprio and Tobey Maguire in an unnecessary remake by Baz Luhrmann (flamboyant director of Moulin Rouge! and Romeo + Juliet).  Instead of authentic 1920s jazz music, the movie tries to reimagine the Jazz Age with hip-hop and pop music from the likes of Jay-Z and Beyoncé.  Even worse, it was filmed in 3-D.  It’s an insult to the classic novel by F. Scott Fitzgerald, which was faithfully adapted for the screen in 1974’s The Great Gatsby starring Robert Redford (a version that didn’t need all the gimmickry).  The Hangover Part III (May 24) is the final installment (we hope!) in the men-behaving-badly comedy franchise starring Bradley Cooper, Ed Helms, and Zach Galifianakis.  The original Hangover movie in 2009 was quite funny, but its sequel was much less so; if the first sequel was uncalled for, this second sequel is really uncalled for.  (Some comedy films are so classic that one cannot imagine any sequel doing them justice.  Imagine, for example, The Graduate II, or Animal House II.  The Hangover should have been in that category.)  In Before Midnight (May 24), Ethan Hawke and Julie Delpy reprise the roles they played in the previous “Before” movies: 1995’s Before Sunrise and 2004’s Before Sunset.  Although I like both Hawke and Delpy individually, for some reason I find their pairing annoying, and so I’ve avoided these movies, which work only if one cares about the characters (and I don’t).  Fast & Furious 6 (May 24) is the latest installment of the franchise based on street-racing life in L.A.; its fans don’t include me.  
Finally, the sci-fi/horror film The Purge (May 31) stars Ethan Hawke in a story filled with gratuitous violence and based on the despicable premise that for one night a year citizens can commit any crime without punishment. 

                June’s big film – and another potential blockbuster hit of the summer – is Man of Steel (June 14), the Superman remake directed by Zack Snyder and starring Henry Cavill in the title role.  With the DC Comics hero turning 75 this year, one might ask:  Why another reboot of the Superman story – especially so soon after 2006’s Superman Returns?  The answer (according to Entertainment Weekly) lies in Warner Brothers’ plan to use a revamped Superman “to lay the groundwork for a planned Justice League film that would team up many DC characters and possibly launch several new franchises.”  Unlike the 2006 remake, Man of Steel promises to be a complete reboot of the classic 1978 Superman movie that starred Christopher Reeve and launched a multi-film franchise of its own.  The classic origin story is kept largely intact, with Jor-El (Russell Crowe) saving his infant son by launching him away from the imperiled planet Krypton on an interstellar journey toward Earth.  The baby’s craft lands on the Kansas farm of the Kents (Kevin Costner and Diane Lane), who adopt the boy and teach him to hide his super-powers.  The familiar story continues in Metropolis, where intrepid Daily Planet reporter Lois Lane (Amy Adams) provides not only the love interest but also the main threat to the Man of Steel’s secret identity.  The real appeal of Man of Steel is its star: hunky 29-year-old British actor Henry Cavill, who was the best thing about Showtime’s highly flawed series The Tudors.  His performance alone may make it worthwhile to sit through yet another (arguably unnecessary) remake.   

                June has another big-budget sci-fi action film – World War Z (June 21), Brad Pitt’s zombie apocalypse movie – but the other noteworthy films in the month are all comedies.  The Internship (June 7) reunites the comedic duo of Vince Vaughn and Owen Wilson (who were so good in their smash 2005 comedy, Wedding Crashers) in a story about two out-of-work 40-something salesmen who try to reboot their careers by taking a brutally competitive internship at Google.  This Is the End (June 12) reunites friends (and co-stars of Judd Apatow’s TV show, Freaks and Geeks) James Franco, Jason Segel, and Seth Rogen (who co-directs), in an apocalypse comedy set in a Hollywood home ostensibly owned by Franco (and filled with outrageous art, including some paintings done by the actor himself).  Much of the comedy in the film derives from inside jokes made at the expense of the actors’ professional and private lives (whether it’s rumors of Franco’s homosexuality or Rogen’s much-panned performance in The Green Hornet).  And The Heat (June 28) pairs Sandra Bullock and Melissa McCarthy as, respectively, an FBI agent and a Boston cop, in a comedy-thriller directed by Paul Feig, the same director who made McCarthy a hit in Bridesmaids. 

                July begins and ends with two more big action films.  The Lone Ranger (July 3) is the long-awaited pairing of Armie Hammer in the title role and Johnny Depp in another strikingly original performance as Tonto, in a fast-paced remake of the story of the legendary Western hero, directed by Pirates of the Caribbean veteran Gore Verbinski.  (Why do I say remake?  Most people have forgotten the 1981 film The Legend of the Lone Ranger, starring the still-obscure actor Klinton Spilsbury in the title role, with Jason Robards as President Ulysses S. Grant and Christopher Lloyd in an amazing performance as the villain – amazing because at that time Lloyd was best known to American audiences for his performance as Jim, the burned-out cab driver in the classic TV comedy Taxi.  How’s that for a bit of trivia?)  And near the end of the month, Hugh Jackman reprises his starring role as the steel-clawed X-Man in The Wolverine (July 26), another installment in the Marvel Comics franchise and an adaptation of the character’s most famous comic-book story, set in Japan. 

                Finally, August also has two action films (both sequels) worthy of note.  300: Rise of an Empire (August 2) is the sequel to the 2007 hit 300, which was based on Frank Miller’s graphic novel and featured digital animation and ripped abs in an ultraviolent retelling of the story of the doomed 300 Spartans who resisted the Persian invasion of Greece at Thermopylae.  (300 was interesting, but its comedic parody, Meet the Spartans, was hilarious.)  Rise of an Empire, based on the yet-unreleased graphic novel Xerxes, tells a parallel story focused on the naval battles between the Persians and the Athenians.  And Percy Jackson: Sea of Monsters (August 7), a sequel to 2010’s Percy Jackson & the Olympians: The Lightning Thief, has its young star Logan Lerman returning as Percy – the half-human son of the sea god Poseidon, who leads his fellow myth-based pals into the Sea of Monsters in search of the Golden Fleece.  (The film is based on the second book in Rick Riordan’s young-adult series.  Another film based on a best-selling young-adult fantasy series will be released later in the month (August 23): The Mortal Instruments: City of Bones.  What’s with all these complicated titles?)   

      

     | Link to this Entry | Posted Thursday, May 23, 2013.  Copyright © David N. Mayer.


    Jefferson's Legacy of Liberty - April 11, 2013

     

    Jefferson’s Legacy of Liberty

     

      

                Twenty years ago, on April 13, 1993, I was fortunate to participate in the celebration of Thomas Jefferson’s 250th birthday at the University of Virginia in Charlottesville.  The University typically observes Jefferson’s birthday as “Founder’s Day,” for he was its founder.  (Indeed, his efforts to establish U.Va. – which included designing the buildings that constitute the original “academical village” and selecting its first faculty members – were not only the great project of the final years of his life but also one of only three major accomplishments for which he wished to be remembered in his epitaph, as I note below.)  And April 13, 1993, marking the 250th anniversary of Jefferson’s birth, was a very special Founder’s Day.  

                I was among several speakers (who may be fairly described as a motley crew) who spoke on the U.Va. grounds that day.  (“Grounds” is the word used in Charlottesville to identify the University’s campus; it is one of several unique terms traditionally used at U.Va., where undergraduate students are called “first-year,” “second-year,” and so on, rather than “freshmen” or “sophomores,” etc., and where faculty are addressed as “Mr.” or “Ms.” rather than as “Professor” or “Doctor,” supposedly because of “Mr. Jefferson’s” abhorrence of titles.  Charlottesville and the University are often described as places where people speak of Mr. Jefferson as if he were in the next room.  Notwithstanding these traditions, however, I discovered during my years as a graduate student there in the early 1980s – I earned my M.A. in History in 1982 and my Ph.D. in History in 1988 – that much of the reverence for Jefferson at U.Va. is mere lip service.  Surprisingly, at the time I was the only history graduate student writing a dissertation about Jefferson – “The Constitutional Thought of Thomas Jefferson,” which was published in book form by the University Press of Virginia (now known as the University of Virginia Press) in 1994 – and it often seemed that I was the only student in my department who genuinely respected Jefferson.  The attitude toward the “Founder” held generally by the whole University community – not just by students but also by faculty and administrators – seemed a bit like that of rebellious teenage children who do not think it’s “cool” to respect their parents.  More than a few U.Va. faculty members at that time – and no doubt many more since – have expressed open contempt for Jefferson and his legacy.)      

                Back to April 13, 1993, and the “motley crew” who spoke on the U.Va. grounds that day, most under the direct auspices of the University but some (like myself) sponsored by student organizations.  It was a group that included, besides me, some real celebrities:  William Rehnquist, then Chief Justice of the United States; George Will, conservative political columnist; Katie Couric, at that time a host of NBC’s Today show and a U.Va. alumna; and former Soviet dictator Mikhail Gorbachev!  (The wholly inappropriate choice of Gorbachev as a speaker by itself shows how little respect and understanding the University administration and faculty had for Jefferson and his legacy; they were simply enamored of Gorbachev’s celebrity and of the naïve belief that his supposedly liberal “reforms” had brought about the end of the Soviet Union, when it was really the steadfastness of Ronald Reagan and the inherent weakness of the Communist regime that did so.  Having Gorbachev as a guest speaker was not only inappropriate, but also rather sad.  Jefferson’s purpose in establishing the University of Virginia was to create a university where, as he wrote to James Madison in 1826, the “vestal flame” of republicanism – the libertarian principles of the American Revolution – would be kept alive, “to spread anew” over Virginia and her sister states in the United States of America.  Sadly, U.Va. has become just like other “elite” American universities, with an administration and a faculty who are predominantly leftist in their political orientation, ignorant of Jeffersonian republican principles, and more enamored of celebrities like Gorbachev and Couric than of real American heroes.) 

                Twenty years ago on April 13 my observance of the special day began with a sunrise program at Shadwell, the site of Jefferson’s birth in 1743, where some brief remarks were made by Merrill D. Peterson, the Thomas Jefferson Memorial Foundation Professor Emeritus at U.Va., the greatest living Jefferson scholar (at that time) and my dissertation director.  (Because Merrill had agreed to direct my dissertation after he had retired and been given emeritus status, I was literally his last Ph.D. student.  I like to think that he agreed to do so because he saw real merit in my study of Jefferson’s constitutional thought, which was the culmination of many years of research, beginning when I was an undergraduate student at the University of Michigan in the early 1970s, continuing through my three years of study as a law student at Michigan in the late 1970s, my four years of study as a graduate student at U. Va. in the early 1980s, my three years practicing law in Washington, D.C. in the mid-1980s, followed by a year as a postdoctoral fellow – even though I really was still predoctoral – at the Institute for Humane Studies in 1987-88.  During those years of study – totaling about 13 years, before I completed my dissertation in August 1988 – I had read not only virtually all the scholarly books written about Jefferson, including several by Peterson, but also had read virtually all of Jefferson’s own writings, both published and unpublished, including the largest collection of Jefferson’s manuscripts, at the Library of Congress.  I earned the title of “Jefferson scholar,” as Merrill Peterson had done before me.) 

                My talk took place in the early evening of April 13, sandwiched in between the late-afternoon talk by George Will and the nighttime fireworks that marked the culmination of the University’s observance.  Two student groups sponsored my talk:  Students for Individual Liberty (S.I.L.) and the Honor Committee.  S.I.L. was the leading libertarian student group at U.Va.; it was the successor organization to the libertarian student group that I had co-founded in 1983, the Libertarian Student Association (LSA).  By inviting me to come to Charlottesville and be their guest speaker on that special day, the leaders of S.I.L. were doubly honoring me:  first, by giving me an opportunity to observe Jefferson’s 250th birthday; and second, by honoring me, as the libertarian group’s own “founder,” on the tenth anniversary (or so) of its founding.  (I was pleased to learn that LSA, which began as a little libertarian/Objectivist group that met mostly to talk about philosophy, had morphed into the more politically active libertarian group, S.I.L., and that group was one of several libertarian groups that were thriving “on grounds” in the early 1990s.)  The Honor Committee, the student group that administers the University’s Honor Code (a student-run honor system is another Jeffersonian tradition at U.Va.), had decided to co-sponsor my talk (which itself was another honor, quite literally, for me – and more than a little ironic, for it meant that my talk was sponsored by the most tradition-bound student group and by perhaps the most “radical” student group at U.Va.!  Mr. Jefferson would be proud!) 

                The text of my April 13, 1993 talk, “The Libertarian Legacy of Thomas Jefferson,” is reprinted below, with a few minor revisions.  It is important, not only as a 20-year personal reminiscence, but also as a kind of antidote against the poisons (not too harsh a term for the outrageous myths) that many people have been propagating about Jefferson during the past two decades. 

                    Two major myths about Jefferson, one about his ideas and the other about his life, have dominated in recent years.   

    Twenty years ago, the chief myth about Jefferson’s ideas was the notion first circulated by his political enemies, the Hamiltonian Federalists, in the 1790s: that Jefferson was a “democrat,” a champion of democracy, or majority rule.  Although he more fully embraced the principle of majority rule than most of his contemporaries, Jefferson shared with other Americans of the Founding generation an abhorrence of democracy, in the pure sense.  Democracy, like aristocracy and monarchy, was one of the “pure” forms of government that – according to centuries-old political thought going all the way back to ancient Greece and Rome (including Aristotle, in his Politics) – would degenerate into either tyranny (what Alexis de Tocqueville called “the tyranny of the majority,” in his Democracy in America) or anarchy (or “mob rule”).  Like the other Founders, Jefferson championed not democracy but “republicanism,” which he defined generally as a form of government representative of the will of “the people.”  The United States of America is not a democracy but a republic – a constitutional republic; that is, a republic with a constitutionally-limited government.  (For more on this, see my essay “A Republic, Not a Democracy” (June 6, 2005).) 

    Jefferson and his good friend (and political collaborator), James Madison, called the opposition political party that they co-founded in the 1790s the “Republican” party because they believed their political opponents, the Federalists (and particularly the Hamiltonian wing of the Federalist party), were trying to undermine republican government in the United States by attempting to transform it into an aristocratic, monarchical system like Great Britain’s.  That is why the core principles of the Jeffersonian Republican party emphasized the constitutional limits on the powers of government, particularly the national government (a government of limited powers enumerated in the U.S. Constitution) and – on the other side of the coin – the maximization of individual freedom and responsibility.  When Jefferson and Madison spoke of “self-government,” they were not talking about democracy – the “people” governing society, which means (in practice) the majority riding roughshod over the rights of the minority – but instead were talking about self-government, quite literally: the freedom of individuals to govern themselves, to “own” their own lives, without interference from government.  Jefferson always tempered his comments in favor of majority rule with an even greater reverence for the rights of “minorities” – and, of course, the rights of the individual, the most vulnerable “minority” of them all.  For example, in his First Inaugural Address (March 4, 1801), Jefferson announced the “sacred principle” that “although the will of the majority is in all cases to prevail, that will to be rightful must be reasonable; that the minority possess their equal rights, which equal law must protect, and to violate would be oppression.”   

    The terms democrat and democracy did not begin to have positive connotations in American political culture until the 1830s – just after Jefferson’s death – when the supporters of Andrew Jackson, who came from the radical faction of Jefferson’s Republican Party, began calling themselves “Democrats.”  (It was in this era, during Jackson’s presidency, that the French aristocrat Tocqueville visited the United States and wrote his book Democracy in America.)  Jefferson and the radical wing of his Republican Party – which became the Democratic Party in the 19th century – became associated in the “public mind” (especially in the North) with Southern “state rights” doctrines, because of Jefferson’s emphasis on federalism and his “strict construction” of federal powers.  In the decades following the Civil War, Jefferson – in part because he was a Virginian as well as a slave-owner, and in part because some of his ideas inspired Southern secession – was associated with the “lost cause” of the Confederacy.  Even though Lincoln’s Republican Party got its name from Jefferson’s Republican Party (and Republican leaders like Abraham Lincoln himself claimed to have derived their fundamental principles from Jefferson), by the early 20th century the Republican Party looked not only to its forerunners in the American Whig Party of the 1830s and 1840s (the Whigs had evolved out of the more moderate wing of Jefferson’s Republican Party) but also to the Federalist Party of the 1790s – Jefferson’s political opponents – as its political ancestors.  (The Republican Party of the early 20th century had embraced nationalism rather than “state rights” or federalism as a key political principle, and Republican politicians like Henry Cabot Lodge actually regarded Hamilton rather than Jefferson as the founder of their party – one of the important topics discussed by Merrill Peterson in his classic book The Jefferson Image in the American Mind (Oxford University Press, 1962).)  

    So-called “Progressives” in the early 20th century, whether Republican (like Teddy Roosevelt) or Democratic (like Woodrow Wilson), similarly denounced Jefferson and his limited-government philosophy as antiquated, associated with (as Wilson asserted in his New Freedom (1912)) “those early days of simplicity which marked the beginnings of our government,” before the Industrial Revolution had transformed the United States from an agrarian nation into an industrial “super-power.”  (For more on the “Progressives” – and why the term is a misnomer, for they actually were reactionary paternalists – see my essay “Reactionary `Progressives’” (March 16, 2006).  “Progressives” like Wilson also believed in a broader myth, which I call the myth of complexity, which assumed that as society became more complex – as the country moved from a simpler agrarian economy to a more complicated industrial economy – the role of government, of the coercive power of the law, needed to expand.  Jefferson and America’s Founders understood, on the contrary, that as society advanced, it needed fewer laws; they understood, as modern libertarians do, that free markets can provide better order in society than can government dictates.  To Jefferson, as to modern libertarians, true “progress” in society meant more freedom and less government.) 

                Although the “Progressive” activists were adamantly opposed to Jefferson’s constitutional principles (of limited government, strict interpretation of federal powers, broad interpretation of individual rights, federalism, and so on), they did embrace “democracy” – or, more precisely, they embraced the “modern” administrative state: government by so-called “experts” in governmental administrative agencies, who supposedly acted in the “public interest,” or for the “common good.”  Since FDR’s so-called “New Deal Revolution” of the 1930s, Democratic politicians have continued to regard Jefferson as their party’s founder (even more so than Andrew Jackson, who truly was their founder) and thus have emphasized the “democratic” part of Jefferson’s public image.  Republicans continued to waive their equally good claim to Jefferson as founder of their party and began using the term “Democratic-Republican” (the derogatory term first used by Federalists in the 1790s) to identify Jefferson’s Republican Party.  Social and religious conservatives within the Republican Party, alienated by Jefferson’s libertarianism (especially his views on religious freedom and the “wall of separation between church and state,” the phrase he popularized during his presidency), similarly continued to repudiate Jefferson.  Only a minority of limited-government conservatives and libertarians kept alive Jefferson’s real political philosophy, which (as I discuss below) took liberty, not democracy, as its highest political value.  To most Americans, of either major political party, Jefferson had become the caricature presented in the Jefferson Memorial in Washington, D.C., which was designed during FDR’s presidency as part of his “New Deal” propaganda – with FDR symbolically wrapping himself in the mantle of Thomas Jefferson (as Peterson characterized it in The Jefferson Image in the American Mind).   

    Thus was born the myth of Thomas Jefferson, “father of democracy” – the myth that so dominated in 1993, the year of the celebration of Jefferson’s 250th birthday, that it became the theme of the celebration, at least at the University of Virginia.  This was the context of the opening lines of my talk (reproduced in the final section below), when I observed that “modern Americans seem to have done a better job preserving what Thomas Jefferson has left us in bricks and mortar than we have done in preserving his ideas.”  Jefferson’s real political principles (his advocacy of limited-government constitutional republicanism, of federalism, of separation of powers, and so on) are, frankly, embarrassing to many modern American “elites,” whether in academia, journalism, politics, law, or business, who (like the early 20th-century “Progressives”) reject individualism in favor of paternalism.  To the paternalist elites of all political stripes (not only the left-liberals or “progressives,” but also the so-called “social conservatives” or “neo-conservatives”), Jefferson’s essentially libertarian political philosophy is anathema because of its emphasis on limits on governmental power.  His political philosophy is the antithesis of the modern national regulatory/welfare state.  So, to make Jefferson “relevant” to modern American politics, modern intellectuals extol him as “father of (American) democracy.”  It would be more faithful to the true Jefferson – to the man and (especially) to his ideas – to consider him the “father” of, say, the “Tea Party” movement, or the Republican Liberty Caucus, or even the Libertarian Party.  

    Beginning in the late 1990s, a new myth about Jefferson – focused on his private life – became dominant:  the myth that he fathered one or more of the children of his slave Sally Hemings.  Like the “father of democracy” myth, this one is a new twist on an old allegation. 

    The Sally Hemings myth also had its origins among Jefferson’s political enemies in the 19th century: chiefly, a hatchet journalist and disgruntled office-seeker named James T. Callender (who invented the story in an 1802 Richmond newspaper piece) and a Republican newspaper in rural Ohio that allegedly interviewed Madison Hemings, one of Sally’s sons, in 1873 (an interview that essentially repeated the story told by Callender).  The story was kept alive by Jefferson’s political enemies – during his lifetime by some Federalist political opponents and after his death by some abolitionist propagandists – and as an oral tradition among some (but not all) of the Hemings descendants.  In the late 20th century the myth was given new life in a bestselling biography of Jefferson published in the 1970s, Fawn M. Brodie’s Thomas Jefferson: An Intimate History.  Brodie’s book was widely criticized by reputable Jefferson scholars as merely speculative “psycho-history.”  (Brodie claimed, for example, that when Jefferson, traveling through France, described the soil in certain regions as being “mulatto” in color, he was revealing his guilt over miscegenation.)  Gullible members of the reading public – including people who read celebrity gossip magazines and who delight in salacious stories about famous people, whether living or dead – took Brodie’s fictionalized biography seriously, leading to such works as the Merchant-Ivory film Jefferson in Paris (1995), in which Nick Nolte ludicrously portrayed Jefferson as the American diplomat who began his affair with Sally Hemings in Paris in the 1780s.  (The problem – as even modern believers in the Sally Hemings myth must concede – is that Sally Hemings at the time was not the sensual young woman portrayed in the film, or in the Brodie book; she was a rather immature young girl who accompanied one of Jefferson’s daughters as her personal servant.  The alleged oldest child of Hemings, a son named Thomas, whom Jefferson supposedly fathered while in Paris, never existed; he was the fictional creation of James T. Callender, kept alive in pop culture by the Brodie book and the Jefferson in Paris film.) 

    The Sally Hemings myth was given renewed life in the 1990s thanks to a group of true believers at Monticello (particularly Lucia Stanton and Dianne Swann-Wright), who were engaged in the “Getting Word” oral history project involving the Hemings descendants, and to a black law professor in New York, Annette Gordon-Reed, whose first book, Thomas Jefferson and Sally Hemings: An American Controversy, was published in 1997.  Both the Monticello staffers and Professor Gordon-Reed have used the Hemings myth to push an agenda: to validate the oral history told by the Hemings descendants (which in turn seems to have been based on the sources noted above – the original Callender newspaper piece in 1802, the Madison Hemings “interview” in 1873, and the speculations in the Fawn Brodie book); and in Gordon-Reed’s case, to push a broader agenda about the problem of racism in America.   

    After DNA testing was done on the blood of the descendants of Thomas Jefferson and of one of Sally Hemings’s younger children, her son Eston, Nature magazine in the fall of 1998 published an article that falsely claimed the DNA tests “proved” that Jefferson fathered all of Sally Hemings’s children.  The tests proved only that one of Sally Hemings’s children, Eston Hemings, was fathered by a male Jefferson (someone whose DNA matched the male side of Jefferson’s family); the DNA testing could just as well have “proved” that Sally Hemings’s child was fathered by Jefferson’s brother, Randolph Jefferson, or by one of Randolph’s sons – for whom the circumstantial evidence of paternity is at least as strong as (if not stronger than) that for Thomas Jefferson.  The timing of the Nature magazine article was critical – coming in the fall of 1998, as the U.S. House of Representatives was about to impeach Bill Clinton – for it gave some scholars a reason to jump on the bandwagon, using the Hemings myth to show that all U.S. presidents – even Mr. Jefferson – have had sexual affairs.  Thus has the Hemings myth been driven by politics, and various political agendas, throughout U.S. history. 

                It was in the wake of the Nature article (which popularized the misrepresentations about the DNA testing), in the spring of 2000, that I was asked to join an interdisciplinary group of distinguished scholars charged with the task of objectively evaluating all the relevant evidence regarding the allegation that Thomas Jefferson fathered the children (either all or some of them) of Sally Hemings.  Our Scholars Commission Report was issued in April 2001, and we were nearly unanimous (12 out of 13 members concurred with the final report, with just one member writing a mild dissent) in concluding that the paternity claim was not supported by reliable evidence.  As summarized by the commission’s chairman, Professor Robert F. Turner of the University of Virginia, in an op-ed published shortly after the report’s release:  “With but a single mild dissent, the scholars’ conclusions ranged from `strong skepticism’ about the allegation to a conviction that the charge was ‘almost certainly false.’  They demonstrated that most of the `evidence’ cited to establish the relationship was either factually false . . . or was explained on other grounds” (“The Truth About Jefferson,” Wall Street Journal, July 3, 2001).   

    As a member of the Scholars Commission, I wrote my own “concurring opinion” in which I discussed the reasons for my strong skepticism about the paternity thesis, which I frankly have called a myth.  In my separate report, “The Thomas Jefferson – Sally Hemings Myth and the Politicization of American History,” I concluded that the myth has been given new life in recent years because of two unfortunate trends in academia (“political correctness” and radical multiculturalism) that have undermined objective standards of historical scholarship.  I particularly criticized the shoddy scholarship by a biased Monticello committee and by the leading proponent of the Jefferson – Hemings thesis, Professor Gordon-Reed, who (as noted above) has advanced the paternity thesis as part of her broader race-conscious political agenda.   

    I concluded that there was no credible evidence supporting the claim that Jefferson fathered any of Sally Hemings’s children; nor is there any credible evidence supporting the claim that Jefferson had any sort of special relationship – let alone a “common-law marriage,” as some paternity thesis believers have imagined it – with Sally Hemings.  The evidence strongly suggests, on the contrary, that the Hemings children were fathered by multiple male members of Jefferson’s family – probably the elder children were fathered by his nephews, Peter and Samuel Carr (who admitted their paternity), and the younger children by his brother, Randolph Jefferson, and possibly one or more of Randolph’s sons.  The evidence suggests that Jefferson himself was celibate after the death of his wife, Martha, in 1782.  (The closest Jefferson came to a sexual relationship after 1782 was with Maria Cosway, a married woman and a talented artist, with whom Jefferson had a romantic friendship in France in the 1780s.)  To Jefferson, sadly, Sally Hemings was just one of the many domestic slaves at Monticello.  (The Sally Hemings imagined by Annette Gordon-Reed did not exist; she is a fictional character created to advance Professor Gordon-Reed’s agenda.  Even the claim that Jefferson treated Sally specially because she was the half-sister of his wife, Martha – the bastard child of Martha’s father, John Wayles, and his slave Betty Hemings – is suspect, for it has an even more dubious source than the story of Sally’s children.  For more on the origins of “the Sally story,” see the excellent book by Rebecca and James McMurry, Anatomy of a Scandal: Thomas Jefferson and the Sally Story (Shippensburg, Pa.: White Mane Books, 2002), the foreword of which I was honored to write.) 

    A slightly abridged version of my report is reproduced in the blog essay I posted in the spring of 2012, “The Truth About Thomas Jefferson” (April 12, 2012).  As I noted in that essay, the Scholars Commission report at long last has been published in book form:  The Thomas Jefferson – Sally Hemings Controversy: Report of the Scholars Commission, edited by Robert F. Turner (Durham, N.C.: Carolina Academic Press, 2011).  The book can be ordered directly from the publisher, Carolina Academic Press (hardcover list price, $45.00), or from Amazon.com (hardcover edition at a slight discount, about $35, and digital edition, for the Kindle reader, for $25).  The book contains our original commission report from 2001, complete with all the individual reports – including my individual report and (the lengthiest portion of the book) the separate report by Robert Turner, the chairman of the Scholars Commission, which he has expanded and revised to include a “postscript” summarizing the reactions to the Scholars Commission report over the past decade.  Anyone seriously interested in the Sally Hemings issue ought to consult the Scholars Commission report and particularly the individual views of Professor Turner, who has reviewed all the relevant evidence more thoroughly than any other scholar thus far has done. 

                Another book that has shattered the Sally Hemings myth has been written by William G. Hyland, Jr. – a former prosecutor and a trial lawyer from Virginia – who applied more than 25 years of litigation experience to the question of the Jefferson–Hemings paternity claim.  His book, In Defense of Thomas Jefferson: The Sally Hemings Sex Scandal (New York: Thomas Dunne Books/St. Martin’s Press, 2009), weighs the relevant historical evidence as if it were presented in court in a paternity suit.  Hyland concludes that the paternity thesis is an unsubstantiated charge, based on the accumulation of salacious rumors and irresponsible scholarship over several generations – “much of it inspired by political grudges, academic opportunism, and the trend of historical revisionism that seeks to drag the reputation of the Founding Fathers through the mud,” as the book jacket description states.  Hyland draws upon the Scholars Commission report – including my own individual report – in supporting his analysis.  

                As I noted in my 2012 essay, “The Truth About Thomas Jefferson,” Hyland also presents a devastating critique of Annette Gordon-Reed’s second book, The Hemingses of Monticello: An American Family (2008), which he aptly characterizes as a book of “creative historical imagination” that fails as either biography or history.  As he notes, Gordon-Reed’s second book is “an acidic, eight hundred-page regurgitation of her flawed 1997 book with the Jefferson–Hemings sexual allegation at the center of her social commentary on slavery.”  Her “highly speculative tome” is even more flawed by her techniques of “psychohistory,” by her misinterpretation of clear evidence (as in the earlier book), and by her speculation concerning Jefferson and Sally’s “thoughts.”  Hyland is particularly critical of Gordon-Reed’s charge that Jefferson “raped” Sally, an allegation she makes repeatedly throughout her book, yet “she fails to offer a shred of documentary evidence, eyewitness, footnote, or source to support her venal opinion.”  Gordon-Reed’s book consists largely, as she admits, of “imagination” – which is another way of saying it is mere speculation, a work of fiction, masquerading as history or biography.  (The fact that her book received many awards from historical organizations only shows how far the field of history today has declined in its standards of professionalism.)  

    In addition to the “father of American democracy” myth and the Sally Hemings myth, other important myths about Jefferson – many of them also originating among Jefferson’s political enemies and perpetuated by some modern scholars – have arisen in recent decades, including the myths that Jefferson was an “atheist,” an “anti-capitalist” (an opponent of free-market capitalism), a “hypocrite” (who in power during his presidency betrayed his limited-government principles of the 1790s), and so on.  I describe these various myths and separate them from the real Jefferson, the historic Jefferson, in my essay “Thomas Jefferson, Man versus Myth” (April 13, 2006), which has been published in pamphlet form by the Center for Ethics and Entrepreneurship at Rockford College in Illinois.  (The pamphlet can still be purchased for $1.99 from Amazon.com.)

                One of these additional myths – that Jefferson was an “atheist” – deserves some further comments, for in his zeal to counter that myth the author of yet another new Jefferson book – David Barton – has created a new myth of his own, that Jefferson was a “Christian,” in the conventional sense of the word.  In August 2012, the publishing company Thomas Nelson took the unusual step of recalling all copies of Barton’s book The Jefferson Lies: Exposing the Myths You’ve Always Believed About Thomas Jefferson, which had been released in April 2012.  The publisher explained its extraordinary action by citing a number of factual inaccuracies and historical misinterpretations that had been brought to its attention.  In a statement, Nelson said, “[We were] contacted by a number of people expressing concerns about The Jefferson Lies.  We took all of those concerns seriously, tried to sort out matters of opinion or interpretation, and in the course of our review learned that there were some historical details included in the book that were not adequately supported.”  In the wake of these accusations, Nelson recalled copies of the book in retail stores, asked online retailers to stop selling it, and suspended printing and distribution. 

                It’s not surprising that the book contained factual inaccuracies and misinterpretations – in fact, it’s riddled with them:  the author, David Barton, is not a trained historian but a religious conservative who founded an organization called WallBuilders, which is devoted (ironically) to tearing down the “wall of separation between church and state” and instead advancing the notion that America was founded as “a Christian nation.”  Thomas Jefferson not only popularized the “wall of separation” metaphor (which he used in a famous letter he wrote to a Baptist congregation in Danbury, Conn., early in his presidency) but also has been widely understood, by most historians and Jefferson scholars, as a champion of religious freedom and a critic of “organized” Christian religion in his own era.  (Indeed, the context for the famous quotation found inside the Jefferson Memorial in Washington, D.C. – “I have sworn on the altar of God eternal hostility against every form of tyranny over the mind of man” – was in a letter Jefferson wrote to a political supporter, making the point that his Federalist political opponents in New England “rightly” feared that he, as president, would stand in the way of their agenda to use the coercive power of government to impose their version of Christianity on the American people.)  

                It’s true that Jefferson was no “atheist,” as his political enemies (in his time and even in the modern era) have alleged; but Barton truly distorts Jefferson when he tries to show he was an orthodox Christian.  Rather, Jefferson is best described as primarily a “deist,” one who believed not in a supernatural God who actively intervenes in human affairs but rather in a Creator-God who acts through the laws of nature.  He called himself a “Christian” because he admired Jesus’ moral teachings, as he understood them, but he denied Jesus’ divinity – which surely must disqualify Jefferson from being considered a “Christian” in the theological sense.  He did believe in an afterlife – or at least in a sort of “heaven,” where individuals would be reunited with their deceased loved ones – but did not imagine a “hell” (except here on earth) and denied the existence of “miracles.”  To a great extent, Jefferson indeed was – as he was characterized in an op-ed by one of Barton’s critics, Boston University religion professor Stephen Prothero – “our least Christian president” (USA Today, July 3, 2012). 

                In trying to disprove what seems obvious to most scholars who have seriously studied Jefferson’s thought, Barton engages in what I call “lawyer’s history”:  picking and choosing from historical data only those facts that support his conclusions, while ignoring any data that suggest otherwise.  Because his “proof” relies on extensive quotation from historical documents – often misinterpreted by being taken out of context or quoted superficially – Barton fools a number of his readers, including the libertarian/conservative political commentator Glenn Beck, who wrote an introduction for Barton’s book and who had Barton as a guest frequently on his former TV show on Fox News.  Barton, whom Beck calls his favorite “historian” and guide to America’s Founders, has completely bamboozled Beck, who fails to understand what a charlatan Mr. Barton truly is. 

                Chief among Barton’s critics have been Warren Throckmorton and Michael Coulter, professors at conservative Grove City College and authors of Getting Jefferson Right: Fact Checking Claims About Our Third President (published by Salem Grove Press in 2012 and available from Amazon both as a paperback and as an e-book).  Throckmorton and Coulter are effective in rebutting Barton’s claims because they are religious conservatives and hence do not fit the “secularist” stereotype that Barton creates to explain away the criticisms of his book.  The two professors have written a devastating, point-by-point rebuttal of most of the outrageous claims Barton makes in his book, thus illustrating (ironically) how aptly titled The Jefferson Lies really is.  Unlike Barton, Throckmorton and Coulter are careful scholars who appreciate the importance of historical context.  Their discussion of many of the particulars about Jefferson’s own religious views and his political record regarding religion – for example, their discussion of the famous “wall of separation” letter to the Danbury Baptists – is one of the best and most thorough I’ve seen in print.  As a correction to Mr. Barton’s “lies,” I highly recommend the Throckmorton and Coulter book.  It’s among the few recently-published books about Jefferson that I would recommend.  

                (There is one chapter in Barton’s book, however, that Throckmorton and Coulter do not rebut – because it warrants no rebuttal.  Like a stopped clock that tells the correct time twice a day, Barton does get one thing “right” about Jefferson – and thus does correctly bash one of the “lies,” or myths, that have been created about Jefferson:  the story that he fathered one or more of the children of his slave Sally Hemings.  That part of Barton’s book (Chapter 1) doesn’t have any original ideas; it merely summarizes the conclusions of the Scholars Commission – and it quotes extensively from the Scholars Commission report, including my own concurring report, “The Thomas Jefferson – Sally Hemings Myth and the Politicization of American History,” all of which, as I’ve noted above, has been published in the book The Thomas Jefferson – Sally Hemings Controversy: Report of the Scholars Commission, edited by Robert F. Turner (Durham, N.C.: Carolina Academic Press, 2011).  I should add one more piece of evidence showing how poor a historian – really merely a pseudo-historian – Barton is: the passages he cites from my report are the least important passages in it.  Thus, he fails to cite the historical works that really do support the conclusions he desires to reach.  Even as an author of “lawyer’s history,” Barton isn’t very competent.) 

                During the past twenty years, dozens of new books about Jefferson have been published – nearly as many new books as appear about Abraham Lincoln, the most popular U.S. president – but, sadly, most of the new and recently-published Jefferson books are absolute nonsense.  They generally offer no fresh insights on Jefferson, his life or (what is most important) his ideas, but instead merely rehash what other writers previously have said about him – including the myths discussed above.  Thus, recent Jefferson “scholarship” has tended to perpetuate the “democracy” myth, the Sally Hemings myth, and all the rest – masking the real Thomas Jefferson, the historic figure, with regard to both his life and his ideas.  Ironically, with all the dozens of recently-published books about him – more Jefferson studies than ever – Americans probably understand Jefferson less today than they have in any other period in U.S. history.  The most important of America’s Founders has become the most elusive.  It’s not because Jefferson, the real Jefferson, is difficult to understand – he isn’t – but rather it’s because the recent scholarship has so distorted and misrepresented him. 

     

     

    The Art of Distortion: A Review of

    Jon Meacham’s The Art of Power 

     

                A case in point:  the best-selling book, Thomas Jefferson: The Art of Power (Random House, 2012), by Jon Meacham, a former Newsweek editor who is currently an executive editor and executive vice president at Random House.  Although not a historian, Meacham won a Pulitzer Prize for American Lion, his 2008 biography of Andrew Jackson, which was also a best-seller.  (He is also the author of two other New York Times bestsellers, Franklin and Winston and American Gospel.)   

                The lengthy (750-page) narrative of Jefferson’s life compounds the basic mistake made by many recent authors of Jefferson biographies:  rather than viewing Jefferson honestly in the context of his own time and culture, Meacham views him through the lens of a 21st-century left-liberal American, trying to force him into the model of a modern “liberal.”  Hence he concludes, “The real Jefferson was like so many of us: a bundle of contradictions, competing passions, flaws, sins and virtues that can never be neatly smoothed out into a tidy whole.  The closest thing to a constant in his life was his need for power and control” (p. 500).  This emphasis on “power” (as Meacham understands the term) is the fatal flaw of the book, for it completely distorts Jefferson, turning on its head Jefferson’s real political philosophy.    

                Aesthetically, the book is impressive, with expensive “production value” – as one might expect from the fact that the author is also an executive at the publishing company, Random House.  (As noted above, Meacham is an editor – his title, in fact, is “executive editor” – as well as executive vice president of Random House.)  Richly illustrated, the book has beautiful drawings that grace the endpapers, the main title page, and the title pages for each of its nine parts.  It also has three sections of pictures, one in color – which includes several of the paintings from Jefferson’s collection at Monticello, as well as the portraits of the three men in English history whom Jefferson most admired (Francis Bacon, John Locke, and Isaac Newton).   

                Meacham’s book is written in a non-scholarly, populist style that underscores the superficial treatment he gives his subject.  Paragraphs are short and choppy, with many paragraphs lacking topic sentences as the narrative jumps from one point to another.  The style resembles that of a lengthy newspaper article, rather than a book.  Although there are no note numbers in the main text, the book does contain 175 pages of endnotes, citing sources for the quotations that appear in the book, many of them (to the author’s credit) from Jefferson’s writings, both published and unpublished.  The bibliography is quite comprehensive, listing as “books consulted” virtually all the major books about Jefferson and his time – including my own book, The Constitutional Thought of Thomas Jefferson.  However, a close examination of the works cited as sources in the endnotes reveals that Meacham has relied mostly on just a few of these secondary sources, and not the best ones.  (He may have “consulted” my book on Jefferson’s constitutional thought, but apparently he learned nothing from it; he fails to cite my book where he should, when he is discussing such critical topics as the Declaration of Independence, Jefferson’s 1791 opinion on the bank bill, the Kentucky Resolutions of 1798, or the Louisiana Purchase – and so, as noted below, he gets those topics wrong.) 

                It is not just in its aesthetics or its writing style that The Art of Power is superficial.  Substantively, the book is grounded in Meacham’s superficial understanding of the intellectual and political worlds in which Jefferson lived.  His narrative is broad but not deep: it covers many topics, giving just enough detail (including just enough quotation from Jefferson’s writings) to create the impression of being thoroughly researched.  Yet Meacham does not prioritize the subject matter, for he gives equal attention to the significant and the trivial.   

                For example, discussing the background to the American Revolution, Meacham mentions (and even briefly summarizes) the multivolume history of England by Paul de Rapin-Thoyras that was in Peter Jefferson’s library and which Jefferson inherited after his father’s death.  Rapin’s history was indeed an important part of Jefferson’s early education, but it was one of several important works that shaped his thought in the decades prior to the Revolution.  Quoting the 19th-century Jefferson biographer Henry Randall, Meacham calls Peter Jefferson “a staunch Whig” and suggests that the “Whig” version of English history shaped the worldview of Jefferson and his fellow Patriots in the American Revolution.  Yet Meacham does not mention any of the other important writers – a group of thinkers, not just historians but also political philosophers and authors of legal treatises – whom scholars generally call the English “radical Whigs,” or “Real Whigs,” of the 17th and 18th centuries, who truly did influence Jefferson and the other Founders, not only to support the American cause but to formulate a uniquely American constitutionalism.  Without even acknowledging the radical Whig tradition, Meacham writes that Jefferson and his fellow American Revolutionaries fought against monarchical “tyranny” and sought (as he simplistically puts it) “a government they judged fair and representative.”  When he finally gets around to discussing Jefferson’s authorship of the Declaration of Independence, Meacham writes that Jefferson’s influences were “manifold” and included “Locke, Montesquieu, and the philosophers of the Scottish Enlightenment.”  His sources, cited in the endnotes, are three flawed and/or outdated secondary sources:  Pauline Maier’s American Scripture (1997), Garry Wills’ Inventing America (1978), and Carl Becker’s Declaration of Independence (1970).  
He doesn’t cite the most serious (and insightful) studies of the Declaration, including Hans Eicholz’s Harmonizing Sentiments (Peter Lang, 2001) or Chapter 2 of my Constitutional Thought of Thomas Jefferson.    

                Although Meacham is an editor and a writer, not a lawyer, his book epitomizes what historians contemptuously call “lawyer’s history,” cherry-picking the evidence and citing only those facts that support his thesis while ignoring those that do not.  And what exactly is Meacham’s thesis?  The title of the book suggests it, and Meacham states it rather directly in the book’s prologue: 

    “He had a defining vision, a compelling goal – the survival and success of popular government in America. . . . In pursuit of his ends, Jefferson sought, acquired, and wielded power, which is the bending of the world to one’s will, the remaking of reality in one’s own image. . . .

     

    “More than any of the other early presidents – more than Washington, more than Adams – Jefferson believed in the possibilities of humanity.  He dreamed big but understood that dreams become reality only when their champions are strong enough and wily enough to bend history to their own purposes.  Broadly put, philosophers think: politicians maneuver.  Jefferson’s genius was that he was both and could do both, often simultaneously.  Such is the art of power. . . .

     

    “A master of emotional and political manipulation, sensitive to criticism, obsessed with his reputation, and devoted to America, he was drawn to the world beyond Monticello . . . .  As a planter, lawyer, legislator, governor, diplomat, secretary of state, vice president, and president, Jefferson spent much of his life seeking control over himself and power over the lives and destinies of others.  For Jefferson, politics was not a dispiriting distraction but an undertaking that made everything else possible.”

     

    Thus, as portrayed by Meacham, Jefferson was a power-hungry politician, not much different from ambitious politicians in modern America.  He was, writes Meacham, “a breathing human being who was subject to the passion and prejudice and pride and love and ambition and hope and fear that drive most other breathing human beings,” particularly those who have spent much of their lives in politics.    

                Meacham rejects as mere political dissembling Jefferson’s often-expressed disdain for the world of politics, his well-documented lifelong desire to retreat to his “family, farm, and books” at Monticello (and in his later years, when Monticello became too crowded, his retreat home at Poplar Forest).  Meacham’s Jefferson is a “patriarch,” who wields “power” to achieve “control,” not only at Monticello but in the political communities of his state and nation.  Because of his distorted view of Jefferson’s basic character, Meacham is blind to the real reason why Jefferson acted contrary to his own desires and repeatedly came out of retirement – the “holy cause” (as Jefferson described it): “the holy cause of freedom.”  Freedom, or liberty: the opposite of “power,” in the political sense. 

                Meacham’s peculiar definition of power (“the bending of the world to one’s will, the remaking of reality in one’s own image”) profoundly influences his analysis.  Like other modern left-liberals, Meacham fails to distinguish political power – control exercised by government, through the coercive force of the laws – from private order, which is achieved by wise management of one’s own affairs and voluntary interactions with other people, for mutual benefit.  Jefferson certainly was the “patriarch” of Monticello, the head of his family (and the owner of several slave families) on his “little mountain” in central Virginia, where he meticulously planned everything in his house and gardens.  But he never envisioned himself as a patriarch in a political community – whether his local community in Albemarle County, his state (or “country,” as he called it), Virginia, or in the new nation he helped create, the United States of America.  He regarded public service as a kind of duty for someone of his class – a rather old-fashioned attitude grounded in the 18th-century world of “deference” politics in which he grew up – but (as I discuss more fully in my talk on his libertarian legacy, below) Jefferson was drawn to the world of politics, despite his personal aversion to it, because he was moved not only by the sense of duty but also by devotion to a cause.  That “holy cause,” as he repeatedly identified it, was the cause of “freedom.”  Rather than trying to control other people’s lives – to be a political patriarch, a legal paternalist – Jefferson devoted his public career to the struggle to free Americans from the tyranny of patriarchy or paternalism.  He sought to limit, not to expand, the use of governmental power, allowing individuals – his fellow Virginians, his fellow Americans – to exercise the freedom to order their own lives.   

                In discussing the political conflicts of the 1790s, Meacham overlooks the reason why Jefferson and his colleague, Madison, led their opposition party – why they called it the “Republican” Party and why Jefferson, first as secretary of state under Washington’s administration and later as vice president during Adams’s administration, took the stands that he did.  For example, Meacham discusses Jefferson’s opposition to Alexander Hamilton’s plan to charter the Bank of the United States as if they disagreed only on policy grounds (“Hamilton’s vision”).  Although he acknowledges that Jefferson’s opinion against the constitutionality of the bank bill was based on “an argument for strict construction,” Meacham does not take seriously Jefferson’s constitutional argument, maintaining that he was “not doctrinaire” and indeed labeling Jefferson “an improviser and a nationalist” (p. 250).  Thus Meacham ignores one of the key differences in constitutional interpretation between the Jeffersonian Republicans and the Federalists – the Jeffersonians’ insistence on a strict interpretation of federal powers according to the enumerated-powers scheme of the Constitution.  (If Jefferson gave Washington an excuse to sign the bill by further advising him to defer to the Congress if he was in doubt about the bill’s constitutionality, it wasn’t because Jefferson was “pragmatic,” as Meacham claims, but because Jefferson was equally concerned about Washington abusing his veto power.)   

                Meacham similarly gives scant attention to Jefferson’s constitutional arguments in the other major disputes of the 1790s: the conflict over Washington’s Neutrality Proclamation (Meacham fails to mention how Jefferson encouraged Madison to write his “Helvidius” essays in response to Hamilton’s “Pacificus”), and the opposition to the Alien and Sedition Acts of 1798. With regard to the latter, Meacham does discuss Jefferson’s Kentucky Resolutions, one of Jefferson’s most important public papers, but in doing so he distorts Jefferson’s arguments, claiming that he took contradictory positions as both a “nationalist” and a “nullifier.”  Meacham then makes the following extraordinary claim about Jefferson “acting in character,” a passage that shows how completely wrong Meacham is about Jefferson – and how far Meacham’s thesis about “power” has led him to distort Jefferson’s political (and constitutional) philosophy: 

    “He [Jefferson] was always in favor of whatever means would improve the chances of his cause of the hour.  When he was a member of the Confederation Congress, he wanted the Confederation Congress to be respected.  When he was a governor, he wanted strong gubernatorial powers.  Now that he disagreed with the federal government (though an officer of that government), he wanted the states to have the ability to exert control and bring about the end he favored.  He was not intellectually consistent, but a consistent theme did run through his politics and statecraft.  He would do what it took, within reason, to rearrange the world as he wanted it to be.”

     

    (pp. 318-19).  To fully understand how completely wrong Meacham is about Jefferson in this passage, one needs to read my book The Constitutional Thought of Thomas Jefferson, which shows how “intellectually consistent” Jefferson really was.  (It’s a shame that Meacham, who cited my book in his bibliography – as noted above – apparently did not read it or, if he did, he did not understand it.  Maybe he deliberately ignored it – as he does so many historical facts and historical interpretations – because it wouldn’t fit his specious thesis.) 

    Not surprisingly, Meacham also distorts Jefferson’s record as president, failing to understand the important ways in which Jefferson departed from the precedents set by his predecessors, Washington and Adams, by restoring the constitutional limits on the presidency.  Instead, again pushing his thesis to ridiculous extremes, Meacham generally claims that the story of Jefferson’s two terms as president “is one of a lifelong student of control and power bringing all of his virtues and vices to the largest possible stage” (p. 351).  Meacham discusses the informality that Jefferson brought to the office but suggests it resulted merely from a difference in style – not (as it really was) a deliberate effort to strip from the office the monarchical trappings that his predecessors gave it.  For example, Meacham overlooks the importance of Jefferson’s change in the mode of presenting his annual message to Congress – submitting a written message instead of delivering a “state of the union” address in person – just as Meacham also misses the important subjects of some of these messages (particularly the first, in December 1801, when Jefferson recommended that Congress authorize offensive actions in the Barbary War).  Regarding the Louisiana Purchase, Meacham commits the same error made by other commentators – going all the way back to Henry Adams’s history of Jefferson’s presidency – in dismissing the seriousness of Jefferson’s constitutional scruples and instead claiming that it shows Jefferson’s hypocrisy (as Meacham presents it, a contradiction between “the philosophical Jefferson” and “the political Jefferson”).  And with regard to the disastrous embargo – the enforcement of which Jefferson found excruciatingly embarrassing because it so contradicted his libertarian impulses – Meacham finds it “not out of character for Jefferson,” for “it put him in control” (p. 432).  He keeps hammering away at his thesis, again and again. 

    Last, but not least among the glaring defects of the book, Meacham unquestioningly accepts as proven historical fact – or, more precisely, as gospel truth – the Sally Hemings myth in its fullest form, the thesis that Jefferson not only fathered all the children of his slave Sally Hemings but that he and Sally had some sort of long-term relationship.  He writes, 

    “Jefferson maintained a decades-long liaison with Sally Hemings, his late wife’s enslaved half sister who tended to his personal quarters at Monticello.  They produced six children (four of whom lived) and gave rise to two centuries of speculations about the true nature of the affair.  Was it about love?  Power?  Both?  And if both, how much was affection, how much coercion?  Jefferson’s connection with Sally Hemings lasted from about 1787 to Jefferson’s death in 1826 – almost forty years.”

     

    For this remarkable assertion, Meacham cites primarily two sources: Annette Gordon-Reed’s two books (particularly her “monumental” second book The Hemingses of Monticello) and the biased Monticello (Thomas Jefferson Foundation) “Report of the Research Committee on Thomas Jefferson and Sally Hemings.”  In a lengthy endnote (on pp. 522-24), Meacham acknowledges that one member of the Monticello committee dissented and wrote a minority report.  (The dissenter, Dr. White McKenzie (Ken) Wallenborn, also has written a devastating critique of the committee’s bias and shoddy scholarship – “A Committee Insider’s Viewpoint,” published in The Jefferson – Hemings Myth: An American Travesty, edited by Eyler Robert Coates, Jr. (Charlottesville, Va., 2001) – but, not surprisingly, Meacham does not cite that critique nor even refer to Dr. Wallenborn by name.)  Meacham cites only two works “for a contrary view”: William G. Hyland Jr.’s book, In Defense of Thomas Jefferson, and (remarkably!) David Barton’s flawed book, The Jefferson Lies.  He does not cite the Scholars Commission report, or the published volume edited by Professor Robert F. Turner, The Thomas Jefferson – Sally Hemings Controversy: Report of the Scholars Commission (Carolina Academic Press, 2011), which (as I’ve discussed above) is the most thorough study of the Jefferson–Hemings paternity question.  Needless to say, Meacham does not even acknowledge that several distinguished scholars are skeptical of the paternity thesis; like so many other recent biographers of Jefferson, he considers the Sally Hemings story to be not a myth but a proven historical fact, accepted by a supposed near-unanimous consensus of scholars.

    Why does Meacham so unquestioningly accept Professor Gordon-Reed’s thesis?  He refers to her as his “friend,” acknowledges an “incalculable” scholarly debt to her, and indeed relies almost entirely on her “masterwork” whenever discussing either Sally Hemings or the Hemings children.  But, in the lengthy endnote on pp. 522-24, Meacham frankly admits the real reason why he accepts the Sally Hemings myth – because it fits so well with his own thesis:  “In my view,” he writes, “there is convincing biographical evidence that Jefferson was a man of appetite who appreciated order, and that the ability to carry on a long-term liaison with his late wife’s enslaved half sister under circumstances he could largely control would have suited him” (p. 522).   

    Indeed, in the final analysis one is left with the impression that Meacham wrote his Jefferson book not only to advance his specious thesis about “power,” but also to help make the Sally Hemings myth seem more plausible.  And, of course, to make a shitload of money for himself and for Random House.  Anyone seriously interested in the real Thomas Jefferson should ignore this deceptive book. 

                Needless to say, therefore, I do not recommend Thomas Jefferson: The Art of Power.   Nor do I recommend any of the other books about Jefferson, particularly Jefferson biographies, published in the past twenty years or more.  (One exception – not a biography but a specialized study of Jefferson’s thought – is Jean M. Yarbrough’s American Virtues: Thomas Jefferson on the Character of a Free People (University Press of Kansas, 1998).  Although I do not fully agree with Professor Yarbrough’s thesis, her discussion of Jefferson’s concern with “character” – the concept as understood in the classical sense – is both a fresh take on Jefferson’s ideas and also an interpretation fairly faithful to those ideas themselves.  Yarbrough’s approach to Jefferson – like my own in The Constitutional Thought of Thomas Jefferson – is to take his ideas seriously, and to take them on his own terms, something that virtually all the recently published biographies have failed to do.) 

                What Jefferson book(s), then, would I recommend?  The definitive study of Jefferson’s life – still definitive and likely to remain so for the foreseeable future – is the monumental six-volume series by Dumas Malone, Jefferson and His Time, published over a 33-year period (1948–1981) by Little, Brown & Company:  Jefferson the Virginian (1948), Jefferson and the Rights of Man (1951), Jefferson and the Ordeal of Liberty (1962), Jefferson the President: First Term, 1801–1805 (1970), Jefferson the President: Second Term, 1805–1809 (1974), and The Sage of Monticello (1981).  Malone’s volumes are the indispensable reference books and the necessary starting points for any serious scholar of Jefferson and his life.  Reading Jefferson in his own words is also critically important; the best one-volume collection is Thomas Jefferson: Writings, published by the Library of America in 1984 (and edited by Merrill Peterson, who also edited the slightly more compact paperback compilation, The Portable Thomas Jefferson, first published by Viking Press in 1975 and reprinted by Penguin Books).  And, of course, for anyone who wants to understand Jefferson’s ideas about government, I recommend my own book, The Constitutional Thought of Thomas Jefferson (University Press of Virginia, 1994, paperback ed. 1995). 

                What about a one-volume biography of Jefferson?  (The question I’m most frequently asked about Jefferson – other than, unfortunately, the question about the paternity of Sally Hemings’ children – is whether there’s a good one-volume biography that I’d recommend.)  I recommend two.  Merrill D. Peterson’s Thomas Jefferson and the New Nation (Oxford University Press, 1970), at over 1000 pages, is a lengthy tome that fully explains Jefferson’s life in the context of his times; it is not merely a splendid biography of Jefferson but also a textbook of American history during “the age of Jefferson.”  Finally, the more compact one-volume biography that I still recommend as the best is Noble E. Cunningham, Jr., In Pursuit of Reason: The Life of Thomas Jefferson (Louisiana State University Press, 1987).  Professor Cunningham, who is also the author of one of the best books about Jefferson’s presidency (The Process of Government Under Jefferson), in his biography presents a nicely balanced account of Jefferson’s life, solidly researched and insightful, with just the right amount of detail.

      

                Now, without any further ado, here is the revised text of the talk I gave at the University of Virginia on April 13, 1993, in celebration of the 250th anniversary of Jefferson’s birth.  (It has been slightly updated, and footnotes have been omitted.)

       

     

    The Libertarian Legacy of Thomas Jefferson 

     

                Sadly, modern Americans seem to have done a better job preserving what Thomas Jefferson has left us in bricks and mortar than we have done in preserving his ideas. Tourists visiting Charlottesville, Virginia, can witness firsthand the ongoing efforts to preserve Jefferson's home at Monticello as well as his splendid little "Academical Village," the Lawn, which is still a vital center of student life at the University of Virginia.  Further down the road, near Lynchburg, Virginia, preservationists have been restoring Poplar Forest, Jefferson's retreat home (which in many respects is more revealing than Monticello of Jefferson both as an architect and as a homeowner).   

                Notwithstanding these efforts to maintain Jefferson's architectural legacy, however, scholars have been less successful in keeping alive his philosophy, particularly his ideas about government – despite the copious record he has left in his writings, both public papers and private letters.  Perhaps the problem is that modern scholars, living in the post-New Deal era of big government, find it difficult to comprehend a different kind of society, like that of the Founders' generation, where government intervened far less extensively in the day-to-day lives of individuals.   

                Another reason why modern Americans fail to understand fully Jefferson's ideas about government is that those ideas have been so often misinterpreted.  The 1993 celebrations of the 250th anniversary of Jefferson's birth generally championed his reputation as "father of American democracy."  For example, Chief Justice William Rehnquist, speaking at the University of Virginia, echoed the views of many Jefferson scholars that "the permanence of Jefferson resided not in his specific theories or acts of government, but in his democratic faith." 

                Although it is true that Jefferson was a leading proponent of representative democracy – Alexis de Tocqueville in his classic work, Democracy in America, called Jefferson "the most powerful advocate democracy has ever sent forth" – his devotion to democracy was not absolute or unqualified.  Indeed, Tocqueville thought it significant that Jefferson once warned James Madison that "the tyranny of the legislature" was "the danger most to be feared" in American government.  To Jefferson, democracy and its associated principles – majority rule, equal rights, direct representation of the people in government – were valuable, not as ends in themselves, but as essential means to a greater end, the maximization of individual freedom in civil society.  Liberty was Jefferson's highest value; he dedicated his life to what he once called "the holy cause of freedom."  

                What repeatedly drew him away from his tranquil domestic life at Monticello and back into the political fray was a higher value, "the holy cause of freedom," to which Jefferson felt duty-bound whenever he saw liberty threatened by a powerful central government, whether it was the British government under King George III or the United States government under Federalist administrations.  His passion for this cause was reflected in the very language that he used in his political writings.  Jefferson, the zealous defender of religious freedom, tended to use words such as holy, orthodox, or catholic when discussing political, not religious, principles; and he reserved words such as heretic or apostate to denounce politicians whom he regarded as the enemies of liberty.  He summed up his life's work in a letter he wrote relatively early in his public career, in 1790, soon after his return to the United States following his ambassadorship to France.  "[T]he ground of liberty is to be gained by inches, . . . we must be contented to secure what we can get from time to time, and eternally press forward for what is yet to get.  It takes time to persuade men to do even what is for their own good." 

                Jefferson's philosophy of government, accordingly, stressed the perpetual need to limit government's powers.  As he once wrote, "The natural progress of things is for liberty to yield and government to gain ground."  The notion that government was inevitably threatening to liberty was part of the radical Whig tradition in which Jefferson's early intellectual life was steeped.  Like John Locke, Algernon Sidney, and other lesser-known English radical Whig political philosophers, Jefferson understood, paradoxically, that it was government, which was created to "secure" individual rights, that posed the greatest danger to those rights through the abuse of its legitimate powers.  Hence Jefferson, like other good "Whigs" of his time – and like the classical liberals of the nineteenth century – was profoundly distrustful of concentrated political power and intensely devoted to the ideals of limited government and the rule of law. 

                To Jefferson, the significance of the American Revolution was the opportunity it gave the American people to create a republican form of government – that is, a government not only founded in theory upon the consent of the governed, but one that was continually responsible to the will of the people – "the only form of government which is not eternally at open or secret war with the rights of mankind," he maintained.  He understood the American constitutions, state and federal, to implement in practice the theory of government he so eloquently presented in his original draft of the Declaration of Independence, where he stated the "self-evident" truths that all men were created "equal & independent," that from that equal creation they derived "rights inherent & inalienable, among which are the preservation of life, & liberty & the pursuit of happiness," and that "to secure these ends, governments are instituted among men, deriving their just powers from the consent of the governed." 

                The creation of republican governments alone, however, was not sufficient to guard against abuses of power.  Jefferson also understood the value of such devices as written constitutions, the division and separation of powers, and the people's power to amend constitutions.  As I have shown in my book, The Constitutional Thought of Thomas Jefferson, the fundamental principle of his constitutionalism was most cogently expressed in a paragraph that appeared in his draft of the Kentucky Resolutions in 1798, where he wrote:  

    [C]onfidence is everywhere the parent of despotism – free government is founded in jealousy, and not in confidence; it is jealousy and not confidence which prescribes limited constitutions, to bind down those whom we are obliged to trust with power. . . . In questions of power, then, let no more be heard of confidence in man, but bind him down from mischief by the chains of the Constitution.  

    Zealously guarding liberty, Jefferson was suspicious of the use of governmental power and so cautioned "jealousy" and scrupulous adherence to "the chains of the Constitution," in order to "bind down those whom we are obliged to trust with power."  He feared that without the rule of higher law, the achievement of the American Revolution would be lost.  The governments in Europe "have divided their nations into two classes, wolves and sheep."  If the people of America once become "inattentive to the public affairs," he warned, "you and I, and Congress, and Assemblies, judges and governors shall become wolves.  It seems to be the law of our general nature, in spite of individual exceptions."  

                Like Thomas Paine, who in Common Sense had distinguished government and society, Jefferson understood that the realm of politics was quite limited; outside it, individuals should be free to fashion their lives as they saw fit, through voluntary social relationships.  The "essence of a republic," he wrote, was a system in which individuals "reserve to themselves personally the exercise of all rightful powers to which they are competent," delegating others to their "representatives, chosen immediately, and removable by themselves."  This "proximate choice and power of removal," he believed to be "the best security which experience has sanctioned for ensuring an honest conduct in the functionaries of society" – in other words, for preventing those in power from becoming "wolves." 

                What were those things to which individuals were "competent" to govern themselves?  Among them were natural rights.  The Declaration of Independence listed three such natural, or "inalienable," rights: life, liberty, and the pursuit of happiness.  Elsewhere in his writings Jefferson referred to others: expatriation, religious freedom, freedom of trade, even the right to hold property.  All these various rights might be understood as particular manifestations of one basic natural right, liberty, which Jefferson regarded as sacrosanct as life itself: as he wrote in his 1774 essay, A Summary View of the Rights of British America, "The god who gave us life, gave us liberty at the same time; the hand of force may destroy, but cannot disjoin them." 

                Jefferson regarded as a basic principle of good government the guarantee to all of the enjoyment of these rights.  In 1816, discussing the "rightful limits" of legislators' power, he maintained that "their true office is to declare and enforce only our natural rights and duties, and to take none of them from us": "No man has a natural right to commit aggression on the equal rights of another; and this is all from which the laws ought to restrain him; every man is under the natural duty of contributing to the necessities of society; and this is all the laws should enforce on him; and, no man having a natural right to be the judge between himself and another, it is his natural duty to submit to the umpirage of an impartial third."  He added, "when the laws have declared and enforced all this, they have fulfilled their functions, and the idea is quite unfounded, that on entering into society we give up any natural right."   Two years later, in a report which he prepared as chairman of the Commissioners for the University of Virginia, Jefferson included in his syllabus of the basic principles of government, "a sound spirit of legislation, which, banishing all arbitrary and unnecessary restraint on individual action, shall leave us free to do whatever does not violate the equal rights of others." 

                Fundamental to Jefferson's political philosophy, then, was the idea that no government could legitimately transgress natural rights.  In order for law to be binding, it must not only proceed from the will of properly authorized legislators, but it must also be "reasonable, that is, not violative of first principles, natural rights, and the dictates of the sense of justice."  In the final paragraph of his Virginia Statute for Religious Freedom, for example, Jefferson added a declaration that the rights therein asserted were "the natural rights of mankind," and that although the legislature which enacted the Bill had no constitutional power to restrain subsequent legislatures, any future act repealing it or narrowing its operation would be "an infringement of natural right."  And undoubtedly the institution of slavery was so troubling to Jefferson, throughout his life, because he realized that it violated the natural rights of an entire race of people. 

                To Jefferson, religion was a matter of conscience, a private matter that ought not concern government.  For that reason, he joined his friend and collaborator, James Madison, in calling for both a wide latitude for the free exercise of religious beliefs and a strict avoidance of government "establishment" of religion.  "The opinions of men are not the object of civil government, nor under its jurisdiction," his original text declared.  As he explained the purpose of the Statute in his Notes on the State of Virginia, "Our rulers can have authority over such natural rights only as we have submitted to them," noting that "the rights of conscience we never submitted, we could not submit" because men are answerable for them to God only.  "The legitimate powers of government extend to such acts only as are injurious to others.  But it does me no injury for my neighbour to say there are twenty gods, or no god.  It neither picks my pocket nor breaks my leg." 

                When Jefferson wrote to Madison late in 1787, expressing his great disappointment that the new federal Constitution included no explicit guarantee of rights, the first such right that he listed was freedom of religion.  He surely had in mind the kind of broad statement of "natural right" expressed in his Virginia bill.  Although the language finally adopted by Congress in proposing what would become part of the First Amendment – stating that "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof" – was far less explicit than the language of the Virginia statute, Jefferson interpreted it to be just as comprehensive a guarantee.  In other words, he understood the First Amendment freedom of religion clause, like the Virginia statute, to leave the formation of religious opinions solely to "the reason of man." 

                As president, Jefferson faithfully adhered to this principle and to his broad view of the rights guaranteed by the First Amendment.  He departed from the precedent set by his predecessors, Washington and Adams, by refusing to recommend or designate any day for national prayer, fasting, or thanksgiving.  As he explained his policy, in a letter made public early in his presidency, he noted that since Congress was prohibited by the First Amendment from acts respecting religion, and the president was authorized only to execute its acts, he had refrained from prescribing "even occasional performances of devotion."  In famous words, he declared that the First Amendment mandated a "wall of separation between Church and State."

                Collaborating again with James Madison in 1798, Jefferson opposed as unconstitutional the Sedition Act, which had made it a criminal offense to criticize either President John Adams or the Federalist-controlled Congress.  If Jefferson was – as some critics have charged, both in his time and today – less than fully libertarian in his defense of freedom of the press in the years that followed his election in 1800, it was because he was deeply troubled by what he perceived as the "licentiousness" of the press of his time.  During his presidency he expressed concern that his Federalist opponents were "pushing its [the press's] licentiousness and its lying to such a degree of prostitution as to deprive it of all credit."  This was, he had noted, "a dangerous state of things" because "even the least informed of the people have learnt that nothing in a newspaper is to be believed."  To another correspondent he bemoaned the fact that "nothing can now be believed which is seen in a newspaper.  Truth itself becomes suspicious by being put into that polluted vehicle." 

                Despite his belief in the efficacy of state laws against false and defamatory publications, it is important to note that, as president, Jefferson consistently followed a "hands-off" policy, as required by the First Amendment.  In his Second Inaugural Address, he explained his administration's policy as an "experiment" that had been "fairly and fully made" to determine "whether freedom of discussion, unaided by power, is not sufficient for the propagation and protection of truth."  The press, "confined to truth, needs no other legal restraint," he maintained.  "The public judgment will correct false reasonings and opinions, on a full hearing of all parties; and no other definite line can be drawn between the inestimable liberty of the press, and its demoralizing licentiousness.  If there be still improprieties which this rule would not restrain, its supplement must be sought in the censorship of public opinion."  The Second Inaugural, then, did more than reiterate Jefferson's steadfast denial of federal authority over freedom of the press: it revealed that, when pressed to draw a line between "the inestimable liberty" and the "demoralizing licentiousness" of the press, Jefferson came down on the libertarian side.  He would leave to the marketplace of ideas, and ultimately to "the censorship of public opinion," the restraint of falsehoods. 

                Jefferson took very seriously the "chains of the Constitution."  These included not only the enumeration of powers in the main text of the Constitution and the specific limitations on powers found in the Bill of Rights, but also two other devices to keep powers restrained by dividing them: federalism, which divided powers between the states and federal government; and the separation of powers, which divided federal powers among the three branches, legislative, executive, and judicial. 

                Federalism was, to Jefferson, the "true theory of our constitution"; and in a classic statement, made shortly before he was elected president, he described it thus:  

    The true theory of our Constitution is surely the wisest and best, that the States are independent as to everything within themselves, and united as to everything respecting foreign nations.  Let the general government be reduced to foreign concerns only, and let our affairs be disentangled from those of all other nations, except as to commerce, which the merchants will manage the better the more they are left free to manage for themselves, and our general government may be reduced to a very simple organization and a very unexpensive one – a few plain duties to be performed by a few servants.

    Under Jefferson's view, the whole field of government in the United States was divided into two departments, "domestic" and "foreign," each department having "distinct directories, coordinate and equally independent and supreme, in its own sphere of action."  To the state governments were reserved "all legislation and administration, in affairs which concern their citizens only"; to the federal government was given "whatever concerns foreigners, or the citizens of the other states."  The "foreign," or federal, sphere, moreover, was strictly limited to the few functions enumerated in the Constitution.  

                Nothing better illustrates Jefferson's strict interpretation of federal powers under the Constitution than his 1791 opinion on the constitutionality of a bill to establish the Bank of the United States.  Citing the language of the Tenth Amendment, that "all powers not delegated to the U.S. by the Constitution, not prohibited by it to the states, are reserved to the states or to the people," Jefferson considered this provision to be "the foundation of the Constitution."  It reiterated the general principle of federal powers expressed by the language of Article I: that the legislative powers of the federal government, vested in the Congress of the United States, were limited to those "herein granted" in the Constitution.  "To take a single step beyond the boundaries thus specifically drawn around the powers of Congress, is to take possession of a boundless field of power, no longer susceptible of any definition." 

                The rest of Jefferson's opinion shows what he regarded those "boundaries drawn about the powers of Congress" to be: they were the words of Article I, the enumerations of Congressional power, construed (as Jefferson would later put it) "according to the plain and ordinary meaning of its language, to the common intendment of the time and those who framed it."  (Thus Jefferson was one of the first “originalists,” who interpret the provisions of the Constitution – and particularly its power-granting clauses – according to their original meaning.)  "The incorporation of a bank, and other powers assumed by this bill, have not . . . been delegated to the U.S. by the Constitution," Jefferson concluded, arguing that they were neither "among the powers specially enumerated" nor "within either of the general phrases" of Article I, the "general welfare" and "necessary and proper" clauses.  He understood the "general welfare" phrase to be a statement of the purpose for which the specific power of laying taxes was to be exercised, not a grant to Congress of "a distinct and independent power to do any act they please, which might be for the good of the Union."  To interpret it as the latter, Jefferson observed, "would render all the preceding and subsequent enumerations of power completely useless" as it would, in effect, "reduce the whole instrument to a single phrase," of empowering Congress to do whatever it pleased.  Similarly, he took quite literally the word necessary in the "necessary and proper" clause.  The Constitution, he argued, restrained Congress "to the necessary means, that is to say, to those means without which the grant of the power would be nugatory"; otherwise, the "necessary and proper" clause also "would swallow up all the delegated powers, and reduce the whole to one phrase."

                Jefferson's opinion on the constitutionality of the bank bill thus presented a theory of strict interpretation of the Constitution.  To say that Jefferson was a literalist or a strict constructionist, however, is insufficient to describe his theory of interpretation.  First and foremost, he followed what I call a “contextual” theory of interpretation:  it was the context of a particular constitutional provision within the overall purpose of the federal Constitution, and not the text of the provision alone, that mattered to Jefferson.  He indeed was a "strict constructionist" with regard to most of the powers granted Congress in Article I, section 8, especially where federal powers could preempt state law.  Nevertheless, he could interpret federal powers under the Constitution quite liberally in matters involving foreign affairs, which he regarded as an exclusive responsibility of the national government since the time of the Articles of Confederation.  (Hence, in his second term as president, he enforced one of the most draconian laws ever passed by Congress – at least prior to the Civil War – the Embargo, which curtailed virtually all foreign trade in a failed attempt to keep the United States out of the war between Britain and France.)  He also could be quite liberal in interpreting power-restraining or rights-guaranteeing provisions of the Constitution, as his interpretation of the First Amendment religion clause demonstrates. 

                Upon becoming president in 1801, Jefferson reiterated his ideal of a federal government limited to its legitimate powers assigned by the Constitution: a government reduced to "a few plain duties performed by a few servants."  His Inaugural Address declared his general support for the idea of "a wise and frugal government, which shall restrain men from injuring one another, [but] which shall leave them otherwise free to regulate their own pursuits of industry and improvement, and shall not take from the mouth of labor the bread it has earned."  More specifically, in his first annual message, in December 1801, he declared that it was his administration's policy "to reduce expenses to what is necessary for the useful purposes of government," and he described those concerns that he considered appropriate for the federal government.  "When we consider that this government is charged with the external and mutual relations only of these states; that the states themselves have principal care of our persons, our property, and our reputation, constituting the great field of human concerns, we may well doubt whether our organization is not too complicated, too expensive; whether offices and officers have not been multiplied unnecessarily, and sometimes injuriously to the service they were meant to promote." 

                Jefferson's administration pursued a policy of economy in government, drastically reducing the size of the federal payroll while simultaneously repealing all internal taxes, including Alexander Hamilton's hated excise on whiskey.  Abolition of internal taxes made possible the elimination of the internal revenue service employed to collect them; this resulted in a significant reduction in the size of the Department of the Treasury, by far the largest of the executive departments.  Jefferson also recommended reductions in the army, the navy, and the diplomatic corps. 

                In addition to the repeal of internal taxes and drastic reductions in federal expenditures, Jefferson also endorsed enthusiastically the plan prepared by his treasury secretary, Albert Gallatin, to pay off the entire national debt – some $83 million – within sixteen years by annual appropriations of $7,300,000.  Believing it wrong for the present generation to saddle future generations with a huge national debt, Jefferson sought to establish the principle of "pay-as-you-go" in the federal budget.  During the eight years of Jefferson's administration the debt actually was reduced by almost a third; extraordinary expenses not foreseen at the beginning of his presidency – chiefly, the increased naval costs associated with the Barbary Wars and the $15 million Louisiana Purchase – forced the modification of Gallatin's plan.  Nevertheless, the plan to extinguish the debt was largely successful because of the large increase in revenue from import duties that accompanied the growth in American commerce during this period.  Indeed, the increased revenues actually created a surplus later in the administration, prompting Jefferson to recommend a constitutional amendment permitting expenditures for roads and other improvement projects, as noted below.  After his retirement from the presidency, Jefferson urged continued effort to pay off the debt by reducing federal expenditures, noting that increased public debt would bring increased taxation "and in its train wretchedness and oppression." 

                As president, Jefferson thus sought to accomplish the objective he had stated in his First Inaugural Address and reiterated elsewhere in his writings at the start of his presidency:  to restore the constitutional equilibrium between the states and federal government, by keeping the latter "a wise and frugal government" limited to its sphere.  Although later in his presidency he recommended that Congress appropriate money for such projects as roads, canals, river and harbor improvements, and a national university, Jefferson recognized that a constitutional amendment was necessary for Congress to do so because such purposes were not among the enumerated powers of the federal government.  Indeed, Jefferson's strict interpretation of the Constitution almost jeopardized the Louisiana Purchase; he gave up his efforts on behalf of a constitutional amendment permitting the Purchase only after his closest advisers urged that it would cause the French government to reconsider the deal. 

                Critics of Jefferson, both past and present, have cited the Louisiana Purchase as an example of Jefferson's failure, as president, to consistently adhere to his doctrine of strict interpretation of federal powers.  Rather than showing his hypocrisy, however, the entire episode of the Louisiana Purchase illustrates the seriousness of Jefferson's constitutional scruples.  Jefferson understood the importance of the Purchase: it secured New Orleans and control of the Mississippi and was therefore vital to the interests of the United States.  Although Albert Gallatin presented Jefferson with arguments supporting the constitutionality of the Purchase, Jefferson remained sufficiently troubled to draft a constitutional amendment explicitly making the Louisiana territory part of the United States.  No important adviser or supporter of Jefferson apparently urged either the necessity or the practicality of such a constitutional procedure, however.  Indeed, Jefferson's close friend Senator Wilson Cary Nicholas argued strongly against it, saying that a declaration from Jefferson that the treaty exceeded constitutional authority would lead to its rejection by the Senate or at least to the charge of his willful breach of the Constitution.   

                Jefferson's reply to Nicholas's letter, stating in particularly striking terms his lingering constitutional scruples, has been one of the most often quoted of Jefferson's writings on constitutional matters: 

    When an instrument admits two constructions, the one safe, the other dangerous, the one precise, the other indefinite, I prefer that which is safe & precise.  I had rather ask an enlargement of power from the nation where it is found necessary, than to assume it by a construction which would make our powers boundless.  Our peculiar security is in possession of a written Constitution.  Let us not make it a blank paper by construction.

    Conceding the likelihood that the framers' enumeration of powers was "defective" – for "this is the ordinary case of all human works" – he urged, "Let us go on then perfecting it, by adding by way of amendment to the constitution, those powers which time & trial show are still wanting."  In the present case, he concluded, it was "important . . . to set an example against broad construction by appealing for new power to the people." 

                When Jefferson finally dropped the matter and acquiesced in the Louisiana Purchase despite the lack of a constitutional amendment, he did so not because he had given up strict construction but because he was following his advisers' recommendation not to press the constitutional problem, realizing that it could jeopardize a treaty so vital to the nation's security.  "What is practicable must often control what is pure theory; and the habits of the governed determine in a great degree what is practicable," he noted.  Jefferson took solace in what he regarded as the "good sense" of the people, not to permit this one precedent to destroy the whole edifice of enumerated powers upon which constitutional limitations on the federal government rested.  Indeed, a common-sense resolution of his constitutional qualms was suggested by Thomas Paine, who reassured Jefferson that "the cession makes no alteration in the Constitution; it only extends the principles over a larger territory, and this certainly is within the morality of the Constitution, and not contrary to, nor beyond, the expression of intention of any of its articles."  If a new power had been added by construction to those powers assigned by the Constitution to the federal sphere, it was only the power to add to the domain of what Jefferson aptly called the "empire for liberty." 

                The fact that, despite these assurances, Jefferson remained troubled about his constitutional scruples – for years after his presidency – only underscores the degree of his scrupulous regard for the "chains of the Constitution."  Unable to square the acquisition of Louisiana and its incorporation into the Union with his theory of federal powers, Jefferson came to regard it as an extraordinary action of executive prerogative:  he, as president, going beyond the strict limits of the law, for the good of the country.  Even then, he still hoped for an "act of indemnity" by the nation, one that "will confirm & not weaken the Constitution, by more strongly marking out its lines."  The "act of indemnity" to which he referred was, of course, an amendment to the Constitution.  

                With regard to the proper allocation of federal powers, Jefferson took equally seriously the principle of separation of powers.  It is a mistake to try to label Jefferson's presidency as either "strong" or "weak."  Where the Constitution assigned powers exclusively to the president, Jefferson vigorously exercised them; where powers were assigned to or shared with other branches, however, Jefferson both preached and exercised restraint, quite strictly.  He regarded the president, like the federal government generally and all its other parts, as "bound by the chains of the Constitution." 

                Unlike modern presidents, who assert the power as commander in chief to send U.S. armed forces anywhere in the world without the consent of Congress, Jefferson as president was respectful of Congress's war power.  When U.S. Navy ships fought against pirates in the Mediterranean, Jefferson – recognizing that the Constitution gave Congress alone the power to declare war – ordered the Navy to engage in defensive actions only until Congress authorized offensive measures.  The position that he took publicly, in his message to Congress – which modern commentators consider one of the most restrictive interpretations of executive war powers ever uttered by an American president – showed that he wished the decision committing American navy forces to hostilities in the Mediterranean to be not a unilateral one, but one in which Congress shared. 

                Jefferson also held a quite narrow view of the executive power, strictly speaking.  On one occasion he wrote, "I am but a machine erected by the constitution for the performance of certain acts according to the laws of action laid down for me."  For example, as noted above, when Jefferson as president refused to designate a day of national prayer, fasting or thanksgiving, he explained his position by noting that Congress was prohibited by the First Amendment from acts respecting religion and that the president was authorized only to execute their acts.  Thus his view of executive power saw it limited in its exercise both by constitutional restraints and by law.   

                As president he sought to keep his constitutional distance from the Congress.  He could hardly have done otherwise without opening himself to charges of hypocrisy (by his enemies) or backsliding (from his friends and followers), for the Republicans in the 1790s had been sharply critical of what they perceived as Federalist attempts to institute an English monarchical and ministerial system.  Consequently, early in his administration, Jefferson declared that he would abandon "all those public forms and ceremonies which tended to familiarize the public idea to the harbingers of another form of government."  These included the annual speech to Congress, which to Jefferson was too reminiscent of the king's opening of Parliament.  In sending a written message rather than delivering it in person, he broke with the precedent that George Washington had set and started a tradition that lasted more than a century.  Not until Woodrow Wilson did presidents deliver their state of the union addresses in person.  The modern spectacle – with both houses of Congress assembled in the House chamber waiting on the president, whose presence is loudly announced and greeted with two separate standing ovations – no doubt would have appalled Jefferson. 

                In at least one area, however, Jefferson was a "strong" president: in his assertion of his equal power – equal with the other two branches of the federal government, particularly the Supreme Court (dominated at the time by Federalists) – to interpret the Constitution.  The constitutional theory that scholars have called Jefferson's "tripartite" doctrine was fully developed in Jefferson's mind by the time of his presidency.  He explained his doctrine in a letter written to Abigail Adams in 1804, defending his actions in discontinuing prosecutions and pardoning offenders under the Sedition Act: 

    You seem to think it devolved on the judges to decide on the validity of the sedition law.  But nothing in the constitution has given them a right to decide for the executive, more than to the Executive to decide for them.  Both magistracies are equally independent in the sphere of action assigned to them.  The judges, believing the law constitutional, had a right to pass a sentence of fine and imprisonment, because that power was placed in their hands by the constitution.  But the Executive, believing the law to be unconstitutional, was bound to remit the execution of it; because that power has been confided to him by the constitution.

    The Constitution, he concluded, "meant that its co-ordinate branches should be checks on each other" and that, accordingly, to give the judiciary the right to decide questions of constitutionality "not only for themselves in their own sphere of action, but for the legislative and executive also in their spheres, would make the judiciary a despotic branch."  (Thus Jefferson was also one of the earliest opponents of the courts’ abuse of their judicial review power – what is today criticized as “judicial activism.”) 

                Jefferson seemed not at all troubled by the fear of conflicts arising from the departments' divergent interpretations of the Constitution.  In part, this may have been due to the fact that, in Jefferson's day, for all practical purposes, the legislature and the executive continued to determine for themselves whether or not they were acting within the bounds of the Constitution.  If a truly difficult conflict arose between two or more branches, it could be resolved by the only truly ultimate arbiter of constitutional questions – the people, acting in their elective capacity.  By their periodic choosing of officers for two of the three departments of national government, the people, Jefferson believed, have an opportunity to "reintegrate" the Constitution, by demonstrating their approval or disapproval of those branches' interpretation of it. 

                Jefferson, though not an advocate of "frequent and untried changes in laws and constitutions," nevertheless denied that he was a man who looked at constitutions with "sanctimonious reverence . . . like the ark of the covenant, too sacred to be touched."  Accordingly, he favored revisions of laws and constitutions, as the needs arose.  His view was clearly distinct from that of Chief Justice John Marshall, who in his famous opinion in McCulloch v. Maryland argued that the Constitution was "intended to endure for ages to come" as a rationalization for the expansion of federal powers by judicial interpretation.  Jefferson, with his Whig heritage of distrust of law and government, looked to the people rather than to the courts when he thought of adapting the Constitution, or of determining the application of its provisions, to new circumstances.  Always suspicious of men in power, Jefferson was particularly reluctant to entrust so important a role as the interpretation of the federal Constitution to any one body of men – especially to a Supreme Court dominated, as it then was, by John Marshall.  Hence he preferred that constitutional difficulties remain unresolved, or that the mode of resolving them remain awkward and uncertain, rather than have mutual jealousies give way to confidence in the government at Washington. 

                In the early 1820s, during the Virginia campaign against the claim that the United States Supreme Court was the ultimate arbiter of constitutional questions, Jefferson again emphasized that the ultimate arbiter was the people themselves.  As he wrote one correspondent in 1820, "I know no safe depository of the ultimate powers of the society but the people themselves; and if we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion by education.  This is the true corrective of abuses of constitutional power." 

                The notion that the control by the people over their government, according to their own "wholesome discretion," informed by education, constituted the "true corrective" of abuses of power is distinctively Jeffersonian.  Indeed, the emphasis that Jefferson placed on popular participation and control – making the people themselves a vital element in constitutionalism – was the preeminent hallmark of Jefferson's constitutional thought.  None of his contemporaries, with perhaps the exception of John Taylor of Caroline (a fellow Virginian and a radical Jeffersonian Republican), quite so emphasized this element.  It would in fact underlie many of the other aspects of his constitutional thought.  As I have shown in my study of Jefferson's constitutional thought, both the pure theory of separation of powers as well as the theory of federalism that Jefferson espoused were ultimately derived from his thoroughgoing republicanism:  with each branch of the federal government, and with each state in the Union, determining constitutional questions, potentially in conflict with one another, some common ground was necessary; and that common ground – in effect, the glue that held Jefferson's constitutional system in place – was in fact the active participation of the people in constitutional questions. 

                This explains Jefferson's lifelong emphasis on the importance of education as well as his support for a system of public schools.  The purpose for his "Bill for the More General Diffusion of Knowledge," as he explained it in Notes on Virginia, was that of "rendering the people the safe, as they are the ultimate guardians of their own liberty."  "Every government degenerates when trusted to the rulers of the people alone.  The people themselves therefore are its only safe depositories.  And to render even them safe their minds must be improved to a certain degree."  Jefferson's Bill sought to do this by giving all citizens a basic schooling in reading, writing, and history.  The emphasis on historical education was quite deliberate, Jefferson explained:   

    History by apprising them of the past will enable them to judge of the future; it will avail them of the experience of other times and other nations; it will qualify them as judges of the actions and designs of men; it will enable them to know ambition under every disguise it may assume; and knowing it, to defeat its views.

    Beyond this basic schooling, the best students – the "natural aristocracy," determined by merit, or "genius" – would receive advanced training at the institution to which he devoted the final years of his life, the University of Virginia, where he hoped the "vestal flame" of republicanism would be kept alive. 

                In later years Jefferson coupled education with one other proposal, which he considered equally necessary to the preservation of republicanism:  his proposed system of local government by "little republics," or wards.  His proposal was to divide the counties into wards of such size that every citizen could attend, when called on, and act in person.  "What has destroyed liberty and the rights of man in every government which has ever existed under the sun?  The generalizing & concentrating all cares and powers into one body."  The "secret" of maintaining freedom, he suggested, was to make the individual alone "the depository of the powers respecting himself, so far as he is competent to them, and delegating only what is beyond his competence by a synthetical process, to higher & higher orders of functionaries, so as to trust fewer and fewer powers, in proportion as the trustees become more and more oligarchical."  The system of republics thus described would accomplish this and thereby itself become a vital element of constitutionalism.  "Where every man is a sharer in the direction of his ward-republic, or of some of the higher ones, and feels that he is a participator in the government of affairs, not merely at an election one day in the year, but every day; when there shall not be a man in the State who will not be a member of some one of its councils, great or small, he will let the heart be torn out of his body sooner than his power be wrested from him by a Caesar or a Bonaparte," he also observed.  

                Jefferson thus envisioned, as a vital element of constitutionalism – indeed, as the most effective check on the abuse of governmental power – the active involvement of citizens in the government itself.  An educated, actively involved citizenry would be simultaneously self-reliant, managing directly those affairs to which individuals were alone competent, and vigilant, keeping a close watch over their elected officials to whom they had entrusted all other affairs, and making certain that those officials did not turn into "wolves" (and that the people themselves not turn into “sheep”). 

                Jefferson's proposed ward system also gives added meaning to his support for the principle of "rotation in office" – what today is called “term limits.”  As is evident in the modern debate over various proposals to limit the terms of state legislators and members of Congress, one of the goals of rotation in office is to increase the level of popular participation in government by mandating turnover.  Thus, as the proponents of term limitations argue, the virtual monopoly that incumbent, professional politicians hold on some offices may be broken, and the way created for a return to the "citizen-politician" model of the 19th century.  The appeal of term limitations to modern-day Jeffersonians is exactly the same as its appeal to Jefferson himself: it enhances the possibility that each citizen may become, in his words, "a participator in the government of affairs, not merely at an election one day in the year, but every day." 

                A full understanding of Jefferson's ideas regarding constitutional change – and indeed, of his constitutional thought generally – must take into account Jefferson's dual emphasis on education and participation.  The essentially negative view of politics that Jefferson held thus ultimately influenced his constitutional thought in a profound way. 

                Jefferson regarded as truly modest the achievements of his generation, believing that subsequent generations, learning from additional experience, would improve on the Founders' handiwork, with the problem of maintaining a free government becoming far simpler as subsequent generations hit upon better and better solutions.  Hence he recommended that every generation create anew their constitutions – a recommendation that reveals both his assumptions that constitution-making was a relatively simple matter and that the people, as a whole, were fully competent to the task.  Although a preeminent member of what Dumas Malone has called the "great generation," Jefferson disclaimed its greatness.  

                Throughout his life Jefferson deliberately downplayed his public service.  For example, in 1800 he drafted a list of his services that emphasized his role in introducing olive trees and upland rice into South Carolina, noting that "the greatest service which can be rendered any country is, to add a useful plant to its culture."      

                Perhaps Jefferson's greatest political legacy, therefore, is the extent to which he devalued politics.  During nearly half a century of public service, Jefferson held many high political offices: third President of the United States, Vice-President of the United States, Secretary of State, U. S. Ambassador to France, Member of Congress, Governor of Virginia.  Nevertheless, he asked to be remembered in his epitaph for only three accomplishments:  author of the Declaration of Independence, author of the Virginia Statute for Religious Freedom, and father of the University of Virginia.  Liberty and knowledge, not political power, were his highest values. 

                The author of the Declaration of Independence died on July 4, 1826, the fiftieth anniversary of the adoption of the Declaration, the date Americans have chosen for the celebration of the nation's birthday.  Like his fellow Patriot of '76, John Adams, who also died on that momentous day, Jefferson was fully aware of the symbolism; his final words, reportedly, were, "Is it the Fourth?"  Significantly, he wrote in his last letter of the libertarian meaning of American independence: "May it be to the world, what I believe it will be, (to some parts sooner, to others later, but finally to all,) the signal of arousing men to burst the chains under which monkish ignorance and superstition had persuaded them to bind themselves, and to assume the blessings and security of self-government." 

      

     | Link to this Entry | Posted Thursday, April 11, 2013.  Copyright © David N. Mayer.


    Spring Briefs 2013 - March 23, 2013

     

     

    Spring Briefs 2013

     

     It’s that time of year again – time for another series of “Spring Briefs” on MayerBlog.  As the spring weather heats up (throughout the USA except, apparently, here in Ohio), so too do controversies in the worlds of politics and popular culture.  Here are my comments on some current developments:

      

     

    n The Relevance of the Constitution

                 Maybe it’s just a coincidence that as B.O. begins his second term in the White House, a number of scholars and political commentators – mostly on the left side of the political spectrum – have been echoing the Occupier-in-Chief’s disdain for the U.S. Constitution.  Among them is Georgetown University law professor Louis Michael Seidman, who wrote a provocative op-ed piece in the New York Times at the end of 2012 entitled “Let’s Give Up on the Constitution.”  Professor Seidman, who (amazingly) teaches constitutional law at Georgetown, argues for the wholesale abandonment of the U.S. Constitution, decrying “all its archaic, idiosyncratic and downright evil provisions.”  In an interview on CBS TV, Professor Seidman explained: 

    “This is our country.  We live in it, and we have a right to the kind of country we want.  We would not allow the French or the United Nations to rule us, and neither should we allow people who died over two centuries ago and knew nothing of our country as it exists today.  If we are to take back our own country, we have to start making decisions for ourselves, and stop deferring to an ancient and outdated document.”

     

    He added that he didn’t mean we should abandon the whole document – just that we should not let it get in the way of what we want to do. 

    The question that should have been asked of Professor Seidman by that CBS interviewer is: “Who is the `we’ you’re talking about?”  “We the People of the United States,” as the Constitution’s preamble states?  Surely, though, the professor ought to understand that Americans are not all of one mind; historically, as a “free” society that regards freedom of thought and expression as one of our most cherished freedoms, we are a people who have profound disagreements about what’s best for us.  It’s to protect each individual from being harmed by other people’s violations of his rights that we form governments in the first place.  But – a fundamental fact about government that every law professor ought to understand – government is dangerous because it is the only entity in society entitled legitimately to use force, or coercion, to accomplish its purposes.  So we create written constitutions, to establish and enforce rules that limit government in the exercise of its powers.  As a law professor, Seidman also ought to understand why the text of all written instruments, including (and especially) constitutions, matters:  the text memorializes the powers that the “sovereign” people of the United States have given to their national government – including the limits imposed on those powers and (the flip side) guarantees of the rights that government cannot abridge – all in order to control its exercise of power, to prevent it from destroying the very rights it was created to protect.  Because the Constitution has a formal mechanism for amending its provisions, the text of the document today doesn’t memorialize the “dead hand of the past”; it reaffirms the relevance of those words to Americans today, who by choosing not to exercise their sovereign power to change the words, are in effect reaffirming them.  (This is the sense – the only sense – in which the theory argued by left-liberal constitutionalists, of a “living Constitution,” is true.)  

    The “we” to whom Professor Seidman referred in his interview, if he would honestly acknowledge it, are the left-liberal elites who share his view of government – people who want to transform the United States from a free nation based on individual rights into a collectivist, paternalistic “people’s state,” like the welfare states of Europe.  Thankfully (I hope) he speaks for a small minority of Americans.  Who needs the Constitution?  We (meaning all Americans) do – to protect us from the likes of Professor Seidman.

      

    n “Second-Raters” for a Second Term 

                Former V.P. Dick Cheney has described B.O.’s second-term Cabinet picks as, collectively, a bunch of “second-rate people.”   He’s actually giving them too much credit.  B.O.’s first-term Cabinet were a bunch of mediocrities, and his picks to replace departing Cabinet members in his second term are even worse – not “second-raters,” but third-, or fourth-, or fifth- (and so on) raters.  They’re just a bunch of “yes”-men and -women, people who have risen far above their levels of incompetence (according to the “Peter Principle”), the kind of people our second-rate, narcissistic president surrounds himself with, just to feel good about himself. 

    The following sentence states a general truth that applies to virtually every second-term nominee: “__________, B.O.’s nominee to replace __________ as __________, is even less competent than the person he/she is replacing.”  It works for John Kerry (replacing Hillary Clinton as Secretary of State), Jack Lew (replacing Tim Geithner as Secretary of Treasury), Chuck Hagel (replacing Leon Panetta as Secretary of Defense), John Brennan (replacing David Petraeus as CIA director), and so on.   

    I’ve already written (in “2013: Prospects for Liberty, Part I” (Jan. 17)) about the top three named picks.  All three are further to the left than the person they replace, confirming that B.O. is moving even further to the left in his second term.  Hagel is nominally a Republican – but is really a RINO (Republican in Name Only), because his foreign-policy views are far to the left of most Republicans in the Senate.  Among other things, he favors massive Pentagon down-sizing, seems to be rabidly anti-Israel, and is soft on Iran.  Kerry is a left-wing Democrat, former U.S. Senator from Massachusetts, and is further to the left than his predecessor, Hillary Clinton.  As described by conservative commentator Geoffrey P. Hunt in American Thinker, Kerry is a “speciously decorated” Vietnam veteran, “disgracing his service and his uniform with dubious testimony before the Senate Foreign Relations Committee in 1971 [comparing the U.S. military with “Genghis Khan”], now the champion for the Afghan quagmire, and third in succession to the White House.”  Moreover, “Kerry’s top treaty-making priority is defeating global warming” (“Second-Rate Appointments from a Third-Rate President,” February 13).  And Lew, the former White House budget director and chief of staff, was summarized by Lawrence Kudlow as someone who “has no qualifications, standing or experience in the financial world or international sphere,” as well as someone who’s to the left of his predecessor, Tim Geithner.  (Geithner was bad enough, but at least he had some Wall Street experience; Lew has virtually none, having worked only for a three-year stint at Citigroup, from which he took home a $1 million "bonus" shortly after his bank received $45 billion in TARP funds, just before he joined the White House in 2009.)  
Lew is, in Kudlow’s words, “a left-liberal Obama spear-carrier, whose very appointment signals a sharp confrontation with the Republican House over key issues such as the debt ceiling, the spending sequester, next year’s budgets and taxes.”  As treasury secretary, a leftist like Lew will push for “trillions of dollars in new tax hikes, absolutely minimal spending restraint, and no serious entitlement reform.”   

    Two other recent picks for B.O.’s second term deserve special mention:  John Brennan, to replace David Petraeus as CIA director, who was only recently confirmed (following Senator Rand Paul’s filibuster, discussed below); and Thomas Perez, to replace Hilda Solis as Secretary of Labor.  Both also fit the profile of being further to the left than their predecessors.  Brennan, who was B.O.’s chief counter-terrorism advisor, had served as deputy CIA director during the Bush administration, when he helped run the post-9/11 terrorist rendition program, which he called “an absolutely vital tool.”  Then, when it became politically unfashionable in 2005, he declared that enhanced interrogation “goes beyond the bounds of what a civilized society should employ”; yet he supports the use of unmanned drones to kill people targeted as terrorists by the government.  Thomas Perez has been described by the editors of Investor’s Business Daily as “so radical he makes the last labor secretary, Hilda Solis, look like Ayn Rand.”  As Assistant U.S. Attorney General, Perez has run the Department of Justice’s troubled civil rights division since 2009.  Among other actions, he dropped the voter intimidation case against the Black Panthers; sued Florida to try to stop the state from purging its voter rolls of 182,000 non-citizens; sued municipalities to force them to scrap written tests for police and firefighters, replacing them with “affirmative action,” race-based hiring; and brought a civil rights harassment suit against Arizona Sheriff Joe Arpaio, whose only “crime” was trying to enforce existing U.S. immigration law.  As the editors conclude, “These acts show the work not of a dedicated public servant but a rabid activist who sees advancing the leftist agenda as the goal, and the law as the obstacle” (“Thomas Perez: Unfit for Office,” March 15). 

    One possible exception to the general rule is the director of the Environmental Protection Agency (EPA).  B.O.’s appointee during his first term, Lisa Jackson, was not only a radical environmentalist but also was politically corrupt.  She resigned from office after having been investigated by the EPA’s inspector general for improperly using a bogus secondary e-mail account for official business.  B.O.’s pick for Ms. Jackson’s successor, Gina McCarthy, is just another radical environmentalist.  She headed the agency’s “clean air” efforts during B.O.’s first term, and as EPA director is expected to push B.O.’s radical anti-carbon agenda (including efforts to reduce carbon dioxide emissions – despite the fact that CO2 isn’t really a pollutant).  She’s also a radical leftist but, as far as I know, is honest.

     

     n Droning On – To a Point 

            Senator Rand Paul (R.–Ky.) recently earned the praise of a broad array of Americans – not only conservatives and libertarians but also many left-liberals concerned about civil liberties – by doing something that his Senate Republican colleagues usually are roundly criticized for:  filibustering.  In Senator Paul’s case, however, the filibuster – an old-fashioned filibuster in which Paul, with “a little help from his friends” (not just Republicans but also Democrat Senator Ron Wyden), spoke continuously for nearly 13 hours on March 6-7 – was a success, in forcing the B.O. regime to make a commitment to follow the rule of law.  It was a reminder to all Americans why the Senate’s peculiar rule permitting unlimited debate – the filibuster rule, which as practiced in recent years amounts to a super-majority rule requiring 60 votes to end debate – actually does help protect Americans’ constitutional rights, by making the Senate a more “deliberative” (and deliberate) body that, in theory, ought not to pass legislation – or exercise the Senate’s other unique powers, to confirm presidential appointments or to ratify treaties – in haste.

    Senator Paul filibustered against the nomination of John Brennan as CIA director, in order to call attention to the B.O. regime’s use of unmanned drone aircraft to do “targeted killings” of suspected terrorists.  Paul’s effort started out as a simple question he posed to Attorney General Eric Holder:  Does B.O. have “the power to authorize lethal force, such as a drone strike, against a U.S. citizen on U.S. soil, and without trial?”  The plain, simple answer ought to have been “No,” since such action clearly violates the U.S. Constitution’s Fifth Amendment due-process protection.  But instead Mr. Holder gave a rather long, convoluted answer filled with legalese giving the government “wiggle room”:  “It is possible, I suppose,” wrote Holder, “to imagine an extraordinary circumstance in which it would be necessary and appropriate under the Constitution and applicable laws of the United States for the President to authorize the military to use lethal force within the territory of the United States.”  (One such “extraordinary circumstance” might be another Civil War, as Charles Krauthammer has observed in a recent column:  U.S. citizens, members of the armed forces of the seceded Confederate States of America, engaged in an actual armed rebellion against the U.S. government on American soil.  That’s the only valid example I can think of.)

    Unsatisfied by Holder’s response, Paul – looking a lot like Jefferson Smith, the idealistic senator played by James Stewart in the classic movie Mr. Smith Goes to Washington – did his old-fashioned, 13-hour filibuster, assisted by a few Republican colleagues plus one Democrat, Senator Wyden (as noted above), which made the effort officially “bipartisan.”  (Just as he had done in voting to confirm Chuck Hagel as defense secretary – a controversial vote, for which Paul took much heat from conservatives – Paul was underscoring that his opposition wasn’t merely partisan but was principled.)  What was the point of his filibuster?  In the senator’s own words, it was to make this simple yet important point: “No American should ever be killed in their house without warrant and some kind of aggressive behavior by them. . . . I will not sit quietly and let [B.O.] shred the Constitution.  No person will be deprived of life, liberty, or property without due process.” 

    Paul was forced to stop speaking, after about 13 hours, because of a “call of nature,” but he subsequently declared the effort a success.  On March 7, Attorney General Holder wrote Paul:  “It has come to my attention that you have now asked an additional question: `Does the President have the authority to use a weaponized drone to kill an American not engaged in combat on American soil?’ The answer to that question is no.”  So Paul finally succeeded in getting a definite “no” from the B.O. regime to at least one version of the basic question he was asking.  (The Senate then confirmed Brennan, by a 63-34 vote.  Most of the 34 votes against Brennan were from Republicans, joined by two Democrats, Patrick Leahy of Vermont and Jeff Merkley of Oregon, plus socialist “independent” Bernie Sanders of Vermont.  Interestingly, two senators who helped Paul filibuster – Republican Marco Rubio and Democrat Ron Wyden – joined the 63-senator majority to confirm.  Only two of Paul’s Senate colleagues openly criticized his effort – two Republicans (or maybe RINOs would be a better term), John McCain and Lindsey Graham, who called Paul’s rhetoric “alarmist,” claiming that B.O. wouldn’t kill anyone with a drone.  Maybe McCain and Graham are getting soft on B.O.; maybe they were just jealous of Paul’s popularity.)

    Declaring Senator Paul’s March 6 filibuster to be a political, as well as legal, success – saying it makes him “probably now the 2016 front-runner for president” – Jonathan Moseley, writing in American Thinker, enumerated several reasons why “Rand Paul Shifts [the] Political Orbit” (March 8).  Moseley explains how Paul’s filibuster serves as a “teachable moment,” a model for Republicans to take similar stands in favor of limited-government principles, and against the B.O. regime, and to get popular support for their doing so. 

    Remarkably, among the political commentators praising Paul’s filibuster was left-liberal Washington Post columnist Eugene Robinson, writing the “On the Left” column in Investor’s Business Daily.  Robinson generally regards the Republican senator from Kentucky as (in his words) “an archconservative kook” (which is how many leftists see libertarians like Paul), yet he began his op-ed by writing, “Rand Paul was right.”  Praising Paul for calling attention to American citizens’ guarantee of due process rights, he added, “I cannot argue with the basic point Paul was making:  There must be greater clarity about how and where our government believes it has the authority to use drones as instruments of assassination, especially when U.S. citizens are in the cross hairs.”  He rhetorically asked, 

    “Imagine that drone technology had existed at the time of the 1995 Oklahoma City bombing.  Imagine the government somehow got wind of Timothy McVeigh’s plans in advance and tracked him to the compound where he and Terry Nichols were building their bomb.  Should the president have had the power to order a drone-fired missile strike, killing McVeigh, Nichols and whoever else might be in the vicinity?”

     

    Robinson is satisfied that Holder’s reply to Paul “closes that door.”  (I have less confidence in the B.O. regime, particularly Holder – who, like his chief, is an unabashed liar and hypocrite.)  Yet, as he concedes, even harder questions remain about the use of drones in other countries.  “The way we use drones as killing machines has to be consistent with our freedoms and our values.  For grabbing us by the lapels, Rand Paul deserves praise,” concludes Robinson – adding “Yikes, I said it again” (“Sen. Paul Was Justified in Droning On,” I.B.D., March 8).  I feel the same way about approvingly quoting Mr. Robinson!

      

    n National Community Organizer-in-Chief

                   In his infamous “open mic” comments to the Russian president a year ago – promising he’d be more “flexible” on missile defense in his second term – B.O. said that 2012 was his last political campaign.  Why then is his campaign organization still raising and spending money?  Could it be because after four years in office, B.O. still hasn’t learned how to govern (to exercise the executive powers – and duties – of his office, under the Constitution) and remains stuck in permanent “campaign mode”?  Or maybe he cannot function except as the radical leftist “community organizer” he was during his Chicago years? 

                   B.O.’s reelection campaign group, euphemistically called “Organizing for America,” has been converted into a tax-exempt nonprofit group, renamed “Organizing for Action” (OFA), supposedly to push his second-term legislative agenda.  OFA’s leaders – B.O.’s former White House and campaign aides – say they plan to raise about $50 million to build what the group says will be a “grass-roots” effort to rally public and congressional support for such issues as “climate change,” gun control, and immigration reform.  Jim Messina, who ran B.O.’s reelection campaign and now chairs OFA, claims that more than one million people have undertaken at least one “volunteer action” since the group started in late January (“Leaders of Obama advocacy group defend plan,” USA Today, March 14). 

    The acronym OFA is apt, because it’s truly an “ofa” idea.  (In case you don’t get the pun, “ofa” is how one pronounces “awful” if one’s mouth is full of shit.)  “Organizing for Action” should be renamed “Organizing for Access” because that’s what it provides:  anyone who writes a check for $500,000 sits on OFA’s “national advisory board” and gets to have face-to-face meetings with B.O. four times a year.  As the Investor’s Business Daily editors observe, “That’s a lot of access from a president who refuses to answer questions from the press about his actions and rarely ever holds press conferences.”  It also shows the blatant hypocrisy in B.O.’s claim, during his 2008 campaign, that he’d never take money from lobbyists.  As the editors note, “Cash for access has pretty much been a hallmark of the Obama years and it’s just getting worse. . . . He’s made his White House a revolving door of jobs for lobbyists.  And then there are the crony capitalists,” such as Solyndra, the failed “green energy” company that got $500 million in U.S. taxpayer funds, and the Southern Company, which seems to have bought an $8 billion federal loan guarantee for a plant in exchange for a donation to the B.O. Inaugural Committee.  Even Bob Edgar, president of the leftist Common Cause, says “It just smells” (“Organizing for Access,” February 27).

    Modern presidents of both major political parties, since the time of Teddy Roosevelt, have been abusing the power of their office by transforming it into what TR called a “bully pulpit,” as I discussed in my essay last month on “The Unconstitutional Presidency.”  But with B.O. in the White House, the bully pulpit has been further transformed – into the bullshit pulpit.  Or maybe just the shit pulpit.

      

    n Sequester, Shmequester

             What’s all the fuss about “draconian” federal government spending “cuts”?  Allegedly they result from the so-called “sequester” – or sequestration, the across-the-board federal budget cuts mandated by the August 2011 debt-ceiling deal – which began to be implemented on March 1.  The automatic “cuts” in spending total $1.2 trillion over 10 years, half from domestic (discretionary) programs and half from defense.  For this year, however, the “cuts” total just $85 billion – which is truly minuscule compared to a nearly $3.6 trillion budget.  Moreover, because the $85 billion is actually budget authority, not budget outlays, the actual reduction in budget outlays for this year is only $44 billion, or one quarter of 1 percent of GDP (GDP is $15.8 trillion) or only about 1.2% of the $3.6 trillion government budget.  And the “cuts” aren’t really cuts at all; rather, they’re reductions in the automatic increases in spending that are built into the federal budget thanks to a gimmick called “baseline budgeting.”  With sequestration, federal spending as a share of GDP will still be 22.2%, well above the post-World War II average of less than 20%.  As the editors of Investor’s Business Daily have noted, the reductions don’t even get spending back to pre-B.O. levels.  “In fact, under the sequester the government will spend about $60 billion more than it did in 2008 just on what are called domestic discretionary programs – like education, law enforcement, highways and the environment.  That’s an increase of more than 10%.  And that 2008 spending level itself was hugely inflated – the result of a 60% increase in the domestic discretionary budget over the previous eight years” (“Automatic Cuts Are No Tragedy,” February 12).  I.B.D. cartoonist Michael Ramirez has drawn a “pie chart” which graphically depicts the sequester cuts for what they are – mere crumbs – compared to total federal spending.
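    For readers who want to check the arithmetic themselves, the percentages above follow directly from the dollar figures cited.  Here is a quick back-of-the-envelope sketch (the variable names are mine; the dollar amounts are the ones quoted in the paragraph above):

```python
# Back-of-the-envelope check of the sequester figures cited above.
# All amounts are in billions of dollars, as quoted in the text.
outlay_reduction = 44.0       # actual FY2013 reduction in outlays
gdp = 15_800.0                # U.S. GDP ($15.8 trillion)
federal_budget = 3_600.0      # total federal budget ($3.6 trillion)

share_of_gdp = outlay_reduction / gdp * 100
share_of_budget = outlay_reduction / federal_budget * 100

print(f"{share_of_gdp:.2f}% of GDP")             # prints "0.28% of GDP"
print(f"{share_of_budget:.2f}% of the budget")   # prints "1.22% of the budget"
```

In other words, roughly a quarter of one percent of GDP, and barely over one percent of the budget – crumbs, as the Ramirez cartoon depicts.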

    Notwithstanding the sequester’s relatively minuscule impact on federal spending, B.O. and his regime – along with the Democrats in Congress and their lapdogs in the news media – have been engaging in ridiculous hyperbole, giving apocalyptic warnings about the impact of the so-called “cuts.”  In a series of campaign-style events, B.O. warned that the budget cuts will “gut critical investments,” “weaken America’s economic recovery,” and “weaken our military readiness.”  He said the sequester’s “meat-cleaver approach” of “severe,” “arbitrary,” and “brutal” cuts will “eviscerate” education, energy, and medical-research spending.  And he added that tens of thousands of parents will scramble to find child care, and hundreds of thousands will “lose access to primary care and preventative care.”  His education secretary, Arne Duncan, went on Sunday TV talk shows to suggest that 40,000 teachers would lose their jobs; his transportation secretary, Ray LaHood, and his homeland security secretary, Janet Incompetano, claimed air travelers would suffer huge delays.  But the prize for hyperbole must go to Rep. Maxine Waters (D. –Calif.), who on March 1 claimed that “over 170 million jobs” could be lost due to sequestration.  (As noted below, that would wipe out the entire U.S. workforce – and then some.)  Later, Rep. Waters corrected herself, saying that it was “750,000 jobs” that could be lost. 

            B.O. and his regime sound like “Chicken Little” in the famous children’s story.  Or maybe (better yet) “the boy who cried `wolf’!”  Either way, it seems that B.O.’s regime is trying to manipulate the sequester cuts for their partisan advantage – in an effort to rouse public opinion against the sequester and blame it on Republicans.  But the strategy doesn’t seem to be working.  The American people generally seem to accept sequestration – public opinion polls show a majority of Americans support reductions in government spending – and they’re not so foolish as to believe B.O.’s lie that sequestration is all the Republicans’ fault.  The sequester indeed did originate in the B.O. White House – specifically, the automatic cuts were the idea of then-Budget Director Jack Lew (now B.O.’s new Secretary of the Treasury) and White House Legislative Affairs Director Rob Nabors, as Bob Woodward has revealed.  (See the discussion of “Woodward-gate,” below.)  The White House intended to make the sequester a “poison pill” that Congressional Republicans would never accept – that’s why half the cuts were to come from national defense, which they assumed Republicans couldn’t stomach – but that scheme failed.  It seems that most Republicans in Congress care more about cutting spending and reducing the deficit than they do about preserving current levels of defense spending.  (Maybe that’s because GOP leadership has shifted from neocons like John McCain to real limited-government conservatives or libertarians like Rand Paul.)  With neither the Republicans in Congress nor the American people terribly upset about the sequester, B.O. and his minions indeed do seem like a bunch of either “Chicken Littles” or boys who cried “wolf”!

            One telling example of how the B.O. regime has botched its attempt to politically manipulate the sequester has been the unpopular decision to end public tours at the White House.  In a recent interview, B.O. denied any responsibility for the decision – Harry Truman famously had a sign on his Oval Office desk reading “The buck stops here,” but it seems B.O.’s motto ought to be “The buck-passing starts here” – as the White House claimed the decision was made by the Secret Service, which employs over 30 agents to provide security during the tours, at the cost of $74,000 a week.  Why eliminate the White House tours, unless the decision was purposely made to maximize public outrage?  Yet as the public learns more facts about the Secret Service and White House budgets, people aren’t blaming Congressional Republicans (who shrewdly managed to ensure the sequester wouldn’t interfere with public tours of the U.S. Capitol); rather, the blame has been falling squarely on B.O. and his regime.  Instead of canceling White House tours, they could have saved much more money by curtailing B.O.’s traveling:  operating Air Force One alone costs almost $180,000 an hour; eliminating just one of B.O.’s golf trips – such as the recent round he played with Tiger Woods in Florida, at a cost to U.S. taxpayers of over $1 million – would save enough money to pay for three months’ worth of White House tours (Michael Ramirez, “White House Tours Canceled, but Golf with Tiger Is a Must,” I.B.D., March 12).

                What’s really going on here?  In a recent column, Charles Krauthammer quoted a leading Democrat lobbyist who told the Washington Post that “the worst-case scenario for us” would be “the sequester hits and nothing bad really happens.”  Is sequestration “dumb”?  Sure, across-the-board cuts are dumb because we need to discriminate, to set priorities; that’s why Congress is supposed to adopt a budget.  “Except that the Democratic Senate hasn’t passed one in four years.  And the White House, which proposed the sequester in the first place, has 18 months to set rational priorities among accounts – and did nothing.  When the GOP House passed an alternative that cut where the real money is – entitlement spending – [B.O.] threatened a veto.  Meaning, he would have insisted that the sequester go into effect – the very same sequester he now tells us will bring on Armageddon.  Good grief.”  Krauthammer then provides the most succinct explanation of sequester politics that I’ve read: 

    “The entire sequester would have reduced last year’s deficit from $1.33 trillion to $1.24 trillion.  A fraction of a fraction.  Nevertheless, insists [B.O.], such a cut is intolerable.  It has to be `balanced’ – i.e., largely replaced – by yet more taxes.  Which demonstrates that, for [B.O.], this is not about deficit reduction, which interests him not at all.  The purpose is purely political: to complete his Election Day victory by breaking the Republican opposition.  . . . In the past two years, House Republicans stopped cold [B.O.’s] left-liberal agenda.  Break them now and the road is open to resume enactment of the expansive, entitlement-state liberalism that [B.O.] proclaimed in his second inaugural address.  But he can’t win if `nothing bad really happens.’  Instead, he’d look both foolish and cynical for crying wolf.”

     

    (“Dems Hoping Sequestration Yields Disaster,” I.B.D., March 1). 

                Krauthammer also succinctly summarizes the aversion that B.O. and the Democrats have to genuine cuts in government spending:  

    “For reactionary [left-]liberalism, . . . whatever sum our ever-inflating government happens to spend today (now double what Bill Clinton spent in his last year) is the Platonic ideal – the reduction of which, however miniscule, is a national calamity.  Or damn well should be.  Otherwise, people might get the idea that we can shrink government, and live on.”

     

    Could we cut $100 billion or more annually from federal discretionary spending?  Of course, Krauthammer observes, citing a 2011 GAO report that “gave a sampling of the vastness of what could be cut, consolidated and rationalized in Washington:  44 overlapping job training programs, 18 for nutrition assistance, 82 (!) on teacher quality, 56 dealing with financial literacy, more than 20 for homelessness, etc.  Total annual cost: $100 billion to $200 billion, about two to five times the entire domestic sequester.  Are these on the chopping block?  No sir.  It’s firemen first.”  “Firemen first” was a phrase coined in 1976 by Washington Monthly editor Charlie Peters to describe how local government functionaries beat budget cuts by putting the fire department first on the chopping block.  A similar tactic is used by local school district officials who, whenever voters defeat a millage, would put popular sports programs (or other extracurricular activities) first on the chopping block, essentially blackmailing voters into supporting increased taxes.   

                The sequester is nothing more, or less, than a good start – a good start on what must be done to reduce federal spending, cut the deficit, and end the hemorrhaging of the national debt.  The controversy over it highlights the mental illness that afflicts B.O. and the Democrats – what I’ve called their “Deficit-Attention Disorder” (D.A.D.): their stubborn refusal to recognize that the basic problem in Washington, D.C. today is out-of-control government spending, which in turn causes obscenely high deficits (averaging $1 trillion each year that B.O. has been in office) and the dangerously-high national debt.  D.A.D. in turn is a disorder that results from the Democrats’ underlying disease:  paternalism (their paternalistic philosophy of government).  It’s time that all Americans (not just some Republicans in Congress and some conservative and libertarian political commentators) begin calling them out on it.

       

    n Shall We Call It “Woodward-Gate”?

                Bob Woodward has long been respected as one of the greatest investigative reporters in modern U.S. history – mainly because of the work he did with his Washington Post colleague Carl Bernstein in breaking the story of the Watergate scandal, and thus helping to bring down Richard Nixon’s presidency, in the early 1970s.  Now Woodward is the target of vicious personal attacks from the political left (including many of the so-called journalists who once so idolized him), because of his recent revelations about B.O.’s lies concerning the sequester.  Not only did Woodward reveal in his book, The Price of Politics, that sequestration was an idea that originated in B.O.’s White House (as noted above), but Woodward also dared to criticize B.O.’s fear-mongering about the sequester.  Appearing on MSNBC, the left’s favorite cable channel, Woodward pointed out that neither Ronald Reagan nor George W. Bush ever would have claimed, as B.O. has, that a budget sequester would prevent the commander-in-chief from protecting the country militarily.  Specifically, Woodward challenged B.O.’s claim that the mere threat of these cuts “forced” the Navy to delay the deployment of an aircraft carrier to the Persian Gulf.  He called B.O.’s assertion “a kind of madness that I haven’t seen in a long time” – a fairly unambiguous reference to Nixon’s behavior during the Watergate crisis.   

    For simply reporting the truth about the B.O. regime, Woodward has been chastised by the leftist news media.  Let’s call it “WoodwardGate.”  Former Los Angeles Times reporter Steve Weinstein called Woodward “senile.”  Salon’s Alex Pareene ludicrously accused Woodward of laziness – “kind of like accusing Hugh Hefner of chastity,” write the editors of Investor’s Business Daily (which is definitely not part of the leftist news media).  Pareene added that he hopes WoodwardGate will mean that in the future “no one talks to Woodward because since he’s lost it, let’s all stop indulging him.”  And the Huffington Post’s Eric Boehlert “was apparently ready to issue an excommunication for Woodward’s mortal sin of appearing on Fox News” (“Cowed Media Turn on Woodward,” I.B.D., March 4).

    In the eyes of the left-wing news media – who are still caught up in what media critic Bernard Goldberg has called their “slobbering love affair” with B.O. (the “Emperor’s New Clothes” phenomenon I’ve so often discussed here on MayerBlog) – Woodward has committed an even greater sin, by disclosing how he has been targeted with threats from Gene Sperling, B.O.’s top economic adviser, who yelled at Woodward for a half hour and then emailed him (with regard to his reporting about the origins of the sequester), “I think you will regret staking out that claim.”  FrontPageMag.com’s Arnold Ahlert observed that Woodward is “getting a taste of what happens to those who challenge the Obama-Democrat-media machine.”  The B.O. White House attacks anyone who criticizes it, then B.O.’s allies in the news media “immediately joined the feeding frenzy before any objective evidence was available – a chilling warning to anyone who would dare defy the power structure in Washington,” from a press that has “abandoned truth-telling for two things they consider far more important: advocacy and access.”  AP reporter Ron Fournier in the National Journal noted that because of WoodwardGate, he “broke ties with a senior White House official” who had been a longtime source for him, adding that “my only regret is I didn’t do it sooner.”  Fournier reports that, over a long period, he had received “emails and telephone calls from this White House official filled with vulgarity, abusive language” and a threat identical to the one Woodward got from Sperling.  It all confirms what the I.B.D. editors call “an undeniable pattern of systemized Obama bullying of the press – extending even to Bill Clinton’s White House counsel Lanny Davis,” who now works as a columnist for the conservative Washington Times.
    “In eagerly devouring one of their own, the media attacking Bob Woodward are choosing the power structure in Washington over future generations of journalists, whose job is supposedly to scrutinize that power.”

    People on the political left like to say that they regard “speaking truth to power” as a virtue, even an act of heroism.  But apparently that only applies when those wielding power are Republican politicians.

      

    n B.O.’s Anti-Growth Agenda

             Among the dire warnings – the parade of horribles – that B.O. has claimed will result from the sequester is an increase in the nation’s unemployment rate.  How stupid does he think the American people are?  Increased unemployment is the inevitable result of B.O.’s own anti-growth, anti-jobs agenda – policies that include the misleadingly-named “Affordable Care Act” (aka “ObamaCare”).  For a nation that used to have a real unemployment rate of 4–5% (or less), the “new normal” under B.O.’s regime seems to be about twice that rate – in other words, more like the nearly-bankrupt European welfare states than like the traditional rate for the USA.  B.O.’s redistributionist agenda has caused not only a rise in unemployment – with 23 million Americans still unable to find regular work – but also a decrease in wages (average wages, adjusted for inflation, have dropped for 21 of the last 23 months), higher rates of poverty, ongoing $1 trillion annual federal budget deficits, and the gross domestic product (GDP) actually falling during the last quarter (Ralph Reiland, “Obama’s Anti-Growth Agenda,” I.B.D., February 15).

    Contrary to most media reports, the government’s February jobs numbers did not signal an economic recovery.  The White House was eager to tout the monthly jobs report as evidence that the nearly four-year-old “recovery” was finally “gaining traction,” saying that 236,000 jobs were added to the economy and the official unemployment rate dropped to 7.7%, the lowest it’s been since December 2008.  What the White House didn’t say is that the official rate dropped because even more people gave up on looking for a job.  While the country gained 236,000 jobs, the ranks of those not in the labor force – people who don’t have a job and stopped looking – swelled by 296,000.  That continues a trend throughout the “recovery” of the past four years (which I’ve called “B.O.’s recession,” not truly a recovery at all):  the non-workforce has climbed almost twice as fast as people with jobs.  Put another way, only 58.6% of Americans work today, down from 60.6% when B.O. took office.  The average over the previous two decades was 63%.  The number of long-term unemployed is still higher than it was three and a half years ago – and it jumped more than 90,000 in February.  There’s also the fact that 3.7 million workers have gone on the Social Security disability program since mid-June 2009, the fastest enrollment pace ever (“Jobs Report Not So Hot After All,” I.B.D., March 9). 
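    The arithmetic behind a falling “official” rate amid a shrinking workforce is worth spelling out:  the rate is simply the number of unemployed divided by the labor force, so when discouraged workers stop looking and drop out of the labor force, the rate falls even if nobody finds a job.  A toy illustration (the round numbers below are hypothetical, chosen only to show the effect – they are not actual BLS figures):

```python
# Toy illustration: official unemployment rate = unemployed / labor force.
# The rate can FALL when discouraged workers quit looking, with zero new jobs.
employed = 140.0     # millions (hypothetical)
unemployed = 12.0    # millions (hypothetical)

rate_before = unemployed / (employed + unemployed) * 100

# Suppose 1 million of the unemployed give up and leave the labor force:
unemployed_after = unemployed - 1.0
rate_after = unemployed_after / (employed + unemployed_after) * 100

print(f"{rate_before:.1f}% -> {rate_after:.1f}%")  # prints "7.9% -> 7.3%"
```

No one in this toy example was hired, yet the headline rate “improved” by six-tenths of a point – which is exactly the dynamic the February jobs report reflects.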

    The basic reason why the American economy isn’t growing is that B.O.’s tax and regulatory policies have had a devastating dampening effect on American businesses.  As the editors of Investor’s Business Daily observed in a recent editorial, B.O. “bad-mouths businesses and punishes entrepreneurs by taking the profits they’d otherwise plow back into their operations.  And he’s strangling [businesses] with reams of new red tape,” particularly with the uncertainties and costs created by “ObamaCare” and “Dodd-Frankenstein.”  “Companies have put off hiring and investment in new plants and equipment to deal with regulations that now dwarf the New Deal in scope and complexity.”  The editors quote Dallas Federal Reserve President Richard Fisher, who in a recent speech singled out the Dodd-Frank law, maintaining that new banking rules “have exacerbated the weakness in economic growth by increasing regulatory uncertainty in key sections of the U.S. economy.”  He added, “Against this backdrop, I am not surprised by the reaction of businesses. . . . Private-sector job creators are in a defensive crouch” (“It’s the Economic Growth, Stupid!”  March 11).

    As long as B.O. and Democrats in Congress stubbornly insist on higher taxes and more regulations – not just refusing to remove the burdens they’ve imposed on the economy, but actually “doubling down” on their misguided, disastrous anti-business policies – “growth won’t spring back to normal,” the I.B.D. editors conclude.  “But if we just doubled [the] anemic sub-2% real GDP growth [under B.O.] to President Reagan’s 4% average clip, we’d see some $5 trillion in deficit reduction over the next decade.”  In other words, if our tax and regulatory policies were pro-growth instead of anti-growth, we wouldn’t need gimmicks like sequestration to balance the federal budget.

      

    n Karl Rove Is Right!  . . .

                 According to many Democrat politicians and their allies in the news media, the Republican Party nationally is in disarray.  It’s become customary, in modern politics, for a political party that “loses” a major election to engage in some post-electoral “soul-searching.”  But following the 2012 elections – which the GOP only partially lost (as noted below) – it seems that Republicans are not just suffering from an identity crisis but have also become quite dispirited – even depressed, one might say – and are dividing into various camps, each with its own “solution” to the party’s difficulties.  Needless to say, Democrats and their media allies are gleeful (indeed, almost giddy) about the Republicans’ problems – which tends to depress and to divide Republicans even more.  Politics, though, is a game based largely on perceptions of reality, however false; and the perception that the GOP is in deep trouble is indeed false. 

    Many Republicans have become their own worst enemies, by buying into the false perceptions of their colleagues – and their political opponents.  Among those false perceptions is one being propagated by some “conservative” commentators, particularly talk-radio star Rush Limbaugh, who simplistically divides politically-active and aware people into two camps, “liberal” and “conservative,” and then in turn divides the Republican Party into two camps, “establishment” Republicans and true “conservatives.”  According to Limbaugh’s narrative, the GOP “lost” the 2012 election because the party’s presidential candidate, Mitt Romney, was not a true conservative but rather a “Massachusetts moderate” (echoing Newt Gingrich’s accusation during the GOP primary campaign) – which to Limbaugh is the same as a “liberal” – and the candidate picked by the hated “establishment.”  Limbaugh’s overly-simplistic (and just plain wrong) analysis overlooks several important facts, among them:  (1) the GOP may have lost the presidential race (and failed to recapture control of the U.S. Senate), but it “won” in the U.S. House (where it retains its majority) and in most statehouses around the USA; (2) Romney won the party nomination by winning most of the state primary elections, not by getting just party “establishment” support; (3) self-identified “conservatives” do outnumber self-identified “liberals” in most polls, but “independents” – not conservatives – hold the plurality; (4) “conservatives” are not a monolithic group but can be divided into several different camps (social conservatives, limited-government or libertarian conservatives, “neo-conservatives,” and so on); and (5) Mitt Romney truly was more of a limited-government conservative than his other GOP primary rivals, with the exception of Ron Paul (a point that I repeatedly made here on MayerBlog both before and after the election).  
It is Limbaugh’s ignorance of the last two points that mostly explains why his analysis of the 2012 election is virtually worthless – except as a kind of self-fulfilling prophecy.  As I warned in my “Spring Briefs” blog last year (Mar. 15, 2012), in the section on “Rush Limbaugh’s Near-Fatal Mistake,” Rush’s criticisms of Romney dampened enthusiasm for the GOP nominee among many “core” GOP constituents (self-identified “conservatives”  who comprise Rush’s audience) – and thus actually helped the Democrats and particularly B.O.’s reelection campaign.  Rush’s “near-fatal” mistake – despite his positive comments about Romney after the GOP Convention had secured him the nomination – became a fatal mistake (fatal to Romney’s election chances); his lack of support for Romney – along with a similar lack of support by most Tea Party groups – was probably the chief reason why Romney lost the presidential election.  

    Given all the misinformation Rush is spreading among his regular listeners, it’s not surprising that a broad array of similarly self-identified “conservatives” are waging a veritable civil war within the GOP, one that supposedly pits “establishment” Republicans against “grass-roots activists.”  Nor is it surprising that many of these self-identified “conservatives” – Tea Party groups, talk-radio hosts, and other activists – have attacked Republican strategist Karl Rove, whom former president George W. Bush called “the architect” of his presidential campaigns.  Rove’s American Crossroads super-PAC had at best mixed success in the 2012 elections, but Rove on February 4 announced plans to create a new PAC, called Conservative Victory Project, to spend money to promote “electable” candidates in GOP congressional primaries in 2014.  Many conservatives have reacted furiously.  For example, radio talk host Mark Levin asked, “Who died and made Karl Rove queen for a day?” 

    Rove and Crossroads President Steven Law (who’s heading the new project) have made the media rounds, from Fox News Channel’s Sean Hannity to MSNBC’s Chuck Todd, arguing that they’re not trying to thwart the will of conservative primary voters, or to promote moderates, or to protect incumbents.  They have pointed out that Crossroads groups spent $30 million in the past two elections to help Tea Party favorites such as Senators Marco Rubio, Rand Paul, and Pat Toomey – as well as unsuccessful candidates including Indiana’s Richard Mourdock.  But as Rove realizes – and naïve “conservatives” like Rush Limbaugh fail to realize – many Republican candidates lost in 2012 because they alienated voters with their extreme social conservatism and (for want of a better word) all-around kookiness.  Two prime examples are Mourdock and Missouri’s Todd Akin – candidates for U.S. Senate seats that should have been won by the GOP but weren’t, because of their extreme anti-abortion views and their outrageous comments about rape.  Social conservatives, because their positions on such issues as abortion and same-sex marriage are so out of line with evolving American values, are turning many younger voters (who are more libertarian) away from the GOP.  (Romney’s efforts to court social-conservative support, by taking strong stands against abortion and groups like Planned Parenthood and agreeing with the GOP platform’s opposition to same-sex marriage, were double-edged swords:  they failed to convince social conservatives that he was truly a “severe” conservative, as he once called himself with a poor choice of words, and at the same time they alienated more libertarian GOP and independent voters.) 

    Rove is not saying that the GOP should abandon its principles by becoming a “big tent” that will appeal to moderates or independents.  But he is saying that the GOP should not insist on “conservative” purity in its candidates.  The Conservative Victory Project says that its aim is to institutionalize William F. Buckley’s rule:  Support the most conservative candidate who is electable.  As David Harsanyi, columnist and senior reporter at Human Events, observed in a recent op-ed: “The most electable conservative candidate in the Northeast isn’t going to be a social conservative.  It’s that simple.”  He quotes Rove saying: “If . . . people think the best we can do is Todd Akin and Richard Mourdock, they’re wrong.  We need to do better if we hope to take over the United States Senate.  We need to get better conservative candidates and win” (“Rove has a point: focus on high-quality candidates,” Columbus Dispatch, February 8). 

    To these comments from Harsanyi and Rove, I say “Amen.”  And I’ll add that the “best” or most “electable” type of conservative generally is not going to be a social conservative or a neo-con; it will be a true “limited-government” conservative or a libertarian.  That’s the future of the GOP.

      

    n . . .  But Rand Paul and Charles Murray Are Even More Right!   

                Democrats and their allies in the news media criticized the recent Conservative Political Action Conference (CPAC) for failing to invite “popular” Republican governors, like New Jersey’s Chris Christie, to speak.  But CPAC isn’t a Republican conference; it’s a conservative conference, and Chris Christie isn’t a true “conservative,” no matter how one defines it.  (His speech at the 2012 Republican National Convention, billed as a “keynote” speech, failed to do what a keynote speech ought to do – support the party’s presidential nominee – and seemed instead to serve only Christie’s own future political ambitions.  And his shameless courting of federal disaster funds following Superstorm Sandy last fall seemed to play directly into B.O.’s hands, helping his reelection campaign – making Christie seem more like a RINO, if not a traitor to the GOP.)  But considering how many different types of conservatism there are in America today (as noted above), it’s not surprising that many of the speakers at CPAC failed to fit the stereotype of either conservatism or the GOP that left-liberals are trying to create.  Two cases in point are Senator Rand Paul and libertarian political scientist Charles Murray.  

    Rand Paul’s splendid speech not only criticized the B.O. regime and Democrats in Congress with many hard-hitting zingers but also made a convincing case about the direction in which the GOP must change, if it expects to win future elections.  It’s the same point I’ve been making here in MayerBlog, in my postings both before and after the 2012 elections:  that the GOP must become more consistently the party of limited government and of individual freedom.  Among other things, Senator Paul observed: 

    “The Republican Party has to change – by going forward to the classical and timeless ideas enshrined in our Constitution.  When we understand that power corrupts and absolute power corrupts absolutely, then we will become the dominant national party again.

     

    “It is time for us to revive Reagan’s law: For liberty to expand, government must now contract.  For the economy to grow, government must get out of the way. . . .

     

    “Our party is encumbered by an inconsistent approach to freedom.  The new GOP, the GOP that will win again, will need to embrace liberty in both the economic and personal sphere.

     

    “If we are going to have a Republican Party that can win, liberty needs to be the backbone of the GOP.

     

    “We must have a message that is broad.  Our vision must be broad.  And that vision must be based on freedom.” 

     

    Getting down to specifics, Charles Murray’s speech at CPAC took the attendees by surprise by focusing on one important “liberty” issue:  same-sex marriage.  Murray ditched his prepared remarks on “America Coming Apart” and instead delivered an impromptu admonition, on the question, “How can conservatives make their case after the election?” drawn from his experience with his own four children, who range in age (he said) from 23 to 43.  While they share many of his views on limiting the size of government, and supporting free enterprise, he said, “Not one of them thought of voting for a Republican President” in the last election.  Their disenchantment with the Republican Party was not specifically because of Mitt Romney, he added, but because “they consider the Party to be run by anti-abortion, anti-gay, religious nuts.”  With regard to same-sex marriage, which he called “gay marriage,” he went on, “I think the train has left the station.” 

    Writing about Murray’s speech in The New Yorker, left-liberal reporter Jane Mayer seemed gleeful herself when she observed, “Certainly the locomotive power of the issue seemed hard to miss on a day when the top political news was Ohio Republican Senator Rob Portman’s announcement that he, too, supports gay marriage.”  Like Senator Portman – whose position on the marriage issue changed after he found out his 21-year-old son is gay – Murray says his “change of heart” has come after he’s acquired “a number of gay and lesbian friends.”  He also has been influenced by the pro-same-sex marriage arguments made by Jonathan Rauch, an openly gay writer for National Journal and The Atlantic.  As I observed in my essay “Marriage, American Style” (May 19, 2004), same-sex marriage can be properly seen as a “conservative” issue, if one sees truly principled conservatism in terms of consistently arguing for more limited government and for maximizing individual freedom.  Ms. Mayer then reports that the “disquiet” at CPAC “grew further, as Murray suggested that abortion, too, was an issue better left, for the most part, to `moral suasion’ rather than criminalization.”  Bravo, again!

      

    n The Occupier-Who-Must-Not-Be-Named 

                It is forbidden to refer to the current “occupier” of the White House by name on Glenn Beck’s radio show, a policy that the host – a former conservative who now calls himself “libertarian” – instituted beginning in January.  On Beck’s show, he’s sometimes referred to as “He-who-must-not-be-named” (as the evil Lord Voldemort was called in the Harry Potter novels – which in turn may have been inspired by “she-who-must-be-obeyed,” as English barrister Horace Rumpole called his wife, in John Mortimer’s Rumpole of the Bailey stories).  More often, he’s called simply “That Guy.”  Offenders (including Beck himself) must pay a fine; the money thus collected will be donated to charity.   

                I can find no fault with Beck’s policy because, as regular readers of MayerBlog know, it’s been my practice to refer to “that guy” in the White House as “B.O.” since October 2008.  Except when I’m quoting from someone else, I never use B.O.’s full name.  As I’ve previously explained, referring to him by his initials makes sense and seems especially apt because his record and his policies, in a word, stink.  (As I’ve also frequently written, “that continually rising stench emanating from Washington, D.C. is a result of America’s B.O. problem.”)  I do have one quibble with Beck, however.  He still refers to B.O. sometimes as “the President” – something I refrain from doing, for the obvious reason that B.O. (who, as I’ve often maintained here, is not only the worst but also the most lawless president in U.S. history) does not deserve the title.  I have too much respect for the office of President of the United States to taint it by associating it with B.O., whom I frequently call by other titles he more truly deserves:  Occupier-in-Chief, Bullshitter-in-Chief, Liar-in-Chief, etc.   

      

    n Dorner “Unchained” 

                Coverage of so-called “gun violence” (the euphemism for government control of guns that I discussed in Part III of this year’s “Prospects for Liberty” essay) by the left-wing “lamestream” news media took a strange turn last month, in the media’s coverage of the murderous rampage of Chris Dorner, the disgruntled former L.A. cop who was burned to death in a shootout with his former police colleagues, at a Big Bear, California cabin on February 12.  What was so strange about the story was the way certain politicians and media sources seemed to be sympathetic to Dorner, romanticizing him and his murderous rampage, as if he were the real-life counterpart of the “hero” in Quentin Tarantino’s violent film, Django Unchained – the black man who kills “whitey” in retribution for supposed racial crimes.  (Dorner was a supporter of B.O. whose Facebook “manifesto” argues in favor of the resumption of an assault-weapons ban.)  For example, on CNN’s Reliable Sources, host Howard Kurtz spoke with George Washington University professor Steve Roberts (husband of ABC’s Cokie Roberts and a former reporter for Newsweek).  Roberts thought the manifesto Dorner wrote was “interesting” because of what he wrote about the LAPD, which has had (in Roberts’ words) “a long history of racism.”  He thought Dorner’s “historical sensibility” in the manifesto was “worth paying attention to despite the murders.”  (Dorner’s murderous rampage included the killings of the innocent daughter of one of the supposed “racist” L.A. cops he was targeting for retribution, along with her fiancé.)  Even more favorable comments have been unearthed by conservative commentator Michelle Malkin, on her Twitchy website, where she aggregated tweets from many leftists to whom Dorner has become a kind of cult hero.  “Oh, yeah, I love this guy!  He’s the modern-day, real-life Django,” one Twitter user wrote. 

    There’s more than just the left’s “politically correct” racism at work here.  The media’s coverage (or lack of coverage) of the Dorner case reveals the fallacy at the heart of the gun-control fanatics’ fantasy world, where Dorner is one of the “Good Guys”: “the controlled, heavily trained few within federal, state, and local governments who are entitled to own and handle high-performance firearms as Americans’ protectors,” as the editors of Investor’s Business Daily observed.  “To the predictable response from the left that the government fired Dorner – in the form of the Los Angeles Police Department sacking him for issuing false statements [alleging police brutality and/or racism in the department] – it should be pointed out that someone capable of Dorner’s apparent crimes could well have done so while still wearing a law enforcement or military uniform, and for rationales other than those he chose.”  The editors conclude, “Armed police officers turning into serial killers might be rare, but the Dorner nightmare is further proof that the problem is not guns, but criminality; not the tools of crime but its perpetrators” (“Government Gun Holders Go Crazy, Too,” February 9). 

    What the Dorner case really teaches us about “gun violence” is that the right protected by the Second Amendment is indeed precious to “ordinary,” law-abiding Americans.  It matters not only as protection against criminals, but also as protection against government and its agents when they act like criminals.  The core principle behind the Second Amendment is that government ought not to have a monopoly on firearms; Dorner’s murderous rampage reminds us why.

       

    n  “Dearth Hour” 

                For 60 minutes (starting at 8:30 p.m. Eastern time) on Saturday night, March 23, radical environmentalists and other gullible people will observe “Earth Hour” by killing their lights, to increase awareness of “climate change.”  Earth Hour, organized by the World Wildlife Fund, began in Sydney, Australia in 2007 and has since spread to 152 countries, including the USA – most of them prosperous countries in the “developed” industrial world.  It persists because radical environmentalists are still perpetrating the “climate change” scam – the myth that man-made “greenhouse gases” from the burning of carbon-based “fossil fuels” cause “global warming” and other types of “climate change,” endangering the earth.  Despite abundant evidence that the theory is in fact a myth, based on “junk” pseudo-science, radical environmentalists still manage to intimidate many gullible people, who feel self-righteous when they do something symbolic for the cause – such as turning off their lights for an hour on a Saturday night. 

                Besides being silly, Earth Hour is counter-productive, for it actually does harm to its stated goal, as noted by Bjorn Lomborg (author of The Skeptical Environmentalist and Cool It: The Skeptical Environmentalist’s Guide to Global Warming).  “It may inspire virtuous feelings, but its vain symbolism reveals exactly what is wrong with today’s feel-good environmentalism.”  Writing in Project Syndicate, a website devoted to “thought-provoking commentaries,” Lomborg points out that during Earth Hour any reduction in CO2 emissions, resulting from a drop in electricity demand during the hour, will be offset by the surge from firing up coal- or gas-fired power stations to restore electricity supplies afterward.  Moreover, he notes, the candles that many participants will light during Earth Hour “are still fossil fuels – and almost 100 times less efficient than incandescent bulbs.  Using one candle for each switched-off bulb cancels out even the theoretical CO2 reduction; using two candles means you emit more CO2.” 

                In an aptly-titled editorial, Investor’s Business Daily reminds us: “Electricity is not a curse.  It is a blessing.  It’s given us refrigeration that keeps food from rotting.  It heats us, cools us, and pumps clean water into our homes.  It powers vital medical equipment and modern conveniences from televisions to toasters to cellphones.”  The editors add:  “Rather than turning lights off Saturday night, we suggest not only leaving them on, but turning on those that aren’t needed.  As the Competitive Enterprise Institute suggests, celebrate human achievement” (“Dearth Hour,” March 22).

      

    n Draft Them Gals! 

                With the U.S. military now OKing women for combat, there’s no rational basis for limiting conscription (the military draft) to men.  Current federal law compels only men between ages 18 and 25 to register for a military draft.  Never before has the country drafted women into military service, and neither the B.O. regime nor Congress seems eager to make them register.  But, constitutionally speaking, they may have no other choice – unless they repeal the draft registration law entirely.  (That’s what really ought to be done, for military conscription is not only inconsistent with America’s libertarian founding principles but also a direct violation of the Thirteenth Amendment prohibition of involuntary servitude, even though the Supreme Court failed to recognize the validity of that argument in its infamous World War I-era Selective Service Cases.)    

    More than three decades ago the Supreme Court ruled that it was constitutional to register only men for a draft.  The Court’s rationale was that registration creates a pool of potential combat troops in case a national emergency – like a world war – required a rapid increase in the size of the military.  At the time women were excluded from serving in battlefield jobs, so there was no reason to register them for possible conscription into the armed forces, the Court held.  Now that front-line infantry, armor, artillery, and special-operations jobs are open to female volunteers who can meet the physical requirements, “it will be difficult for anyone to make a persuasive argument that women should continue to be exempt from registration,” said Diane Mazur, a law professor at the University of Florida and a former Air Force officer quoted in a recent AP news article (“Might draft registration expand to women?”  Columbus Dispatch, February 26). 

    So-called feminists – such as the National Organization for Women (NOW – what Rush Limbaugh has called the “National Association of Gals,” or NAG) – often complain about perceived inequalities between women and men, particularly the denial of “equal rights” for women.  Shouldn’t that include equal responsibilities as well?

       

    n Saint Hillary? 

                Many Democrats have been deluding themselves, thinking that the heir apparent to the White House is Hillary Rodham Clinton (HRC – otherwise known as Her Royal Clintonness).  After resigning as secretary of state, Mrs. Clinton is now in retirement; but her absence from the Washington political scene has not quelled speculation about 2016.  (After all, absence makes the heart grow fonder.)  B.O., the “Anointed One,” himself apparently anointed Hillary as his successor, in that now-infamous joint interview on CBS’s 60 Minutes on January 27, where Steve Kroft asked such powder-puff questions that calling them “softball” would be too harsh – and an insult to the game.  (Fox News’s Greg Gutfeld, on The Five, aptly characterized it this way: “That wasn’t an interview; it was a ménage à trois!”)  Newsweek magazine (more justly called Newsweak) proclaimed Mrs. Clinton “the most powerful woman in U.S. history” on an early February cover.  That phony smile on the face of V.P. “Smiley” Joe Biden, who no doubt fancies himself as heir apparent, must be even more phony these days! 

                One fairly sure sign that HRC is still politically ambitious – that she remains (to put it bluntly) a power-hungry bitch, with her eyes again on the White House – is her recent action endorsing same-sex marriage.  “I support it personally and as a matter of policy and law, embedded in a broader effort to advance equality and opportunity for LGBT Americans and all Americans,” Mrs. Clinton said in a video released March 18 by the advocacy group Human Rights Campaign.  Why would the now-retired secretary of state make such an announcement unless she plans to run again for political office?  Note her use of the “politically-correct” (P.C.) buzzwords, “LGBT Americans” – referring, of course, to “lesbian, gay, bisexual, and transgender” Americans – in a blatant attempt to appeal to those non-heterosexuals who identify themselves with those labels.  Never mind that, as secretary of state, Hillary did nothing to try to stop persecution of homosexual persons in radical Muslim countries or leftist dictatorships.  For her, apparently, words speak louder than actions – or inaction.  In any event, it looks as though Mrs. Clinton is positioning herself for the 2016 presidential election. 

    But the pebbles strewn along Hillary Clinton’s path back to the White House are more like boulders: the inconvenient facts about her past record as U.S. secretary of state, U.S. senator, and first lady of both the U.S. and Arkansas – a past that ought to disqualify her from higher office, except in a political party that chose B.O. as its standard-bearer.  Mrs. Clinton’s tenure as secretary of state can be fairly described as disastrous: over the past four years, militant Islamic terrorists have seized more control of the Middle East, as the murder of the U.S. ambassador and three other Americans at the U.S. consulate in Benghazi, Libya, so pointedly illustrated.  Hillary was a key conspirator in the B.O. regime’s cover-up of that foreign-policy debacle, helping to perpetuate the lie that the Benghazi terrorist attack was motivated by an anti-Mohammed video broadcast on the Internet.  And Hillary’s much-touted “reset” of U.S.–Russian relations has yielded only B.O.’s capitulation to Vlad Putin on such issues as missile defense.  Even more embarrassing to Hillary (if she had any shame) is her record as Slick Willy’s betrayed wife and criminal co-conspirator.  (Remember those infamous billing records from Hillary’s Rose law firm that mysteriously appeared in the Clinton White House living quarters – documents that were at the heart of the Whitewater land fraud?  All of Hillary’s whining about a “vast right-wing conspiracy” out to get her and Bill will never fully lay that story to rest.)  Hillary’s political past may now be in the history books (most of which are written by people with a leftist bias that’s favorable to Hillary); nevertheless, that past will come back to haunt her, if she should be so foolhardy as to reenter political life. 

      

    n Too Pooped To Pope 

                Pope Benedict XVI shocked the world on February 11 when he announced his resignation, effective February 28, citing his age and declining health.  The former Cardinal Joseph Ratzinger, elected pope in April 2005, had served for almost eight years but decided to step down because, at age 85, he was becoming increasingly physically infirm.  The Pope Emeritus Benedict (as he is now designated) leaves the Roman Catholic Church with a mixed legacy.  Although he helped make the Church seem relevant to the modern world by using new technologies – he tweeted from an iPad, issued benedictions from a Facebook page, and distributed Vatican news from a YouTube channel – he was a staunch theological conservative, reaffirming the Church’s traditional teachings on such matters as abortion, contraception, marriage, and a celibate, male-only priesthood.  Although he denounced the scandal of child abuse by pedophile priests – calling it a “scourge” – Benedict XVI was criticized for not doing enough to expose the crimes.  And under his papacy, administrative problems within the Church festered, such as the incompetence, financial corruption, and possible organized-crime ties within the Vatican Bank.      

                Although Pope Benedict’s action was unprecedented in the modern era (the last time a pope resigned was 600 years ago, in 1415) – and therefore quite unsettling to the tradition-bound Church – it’s likely to become more common in the future, as modern medical technology allows people to survive well into their 80s, 90s, and beyond – despite physical or mental infirmity.  What will the Church do when a future pope develops Alzheimer’s disease?  It can only hope that such a pope emulates Benedict XVI in acting responsibly and resigning.  As Mary Johnson wrote in a recent USA Today op-ed, Benedict’s “most valuable legacy” could be his resignation.  “Is not this sort of responsible behavior, so void of heroics, so laudably sane, exactly what one might expect from a responsible steward, a good shepherd, a faithful servant?” she asks, adding – and she meant this as a compliment – that by resigning instead of staying until his death, “he treated the papacy not like a sacrament or a monarchy, but a job” (“Pope Benedict: An Unexpected Revolutionary,” February 18).   

                Meanwhile, the Church’s cardinals recently (on March 13) selected a new pope – and by their action, seemed only to “kick the can” proverbially down the road.  Any Catholics who hoped that the new pope would be a true “reformer” were no doubt disappointed by the choice of Cardinal Jorge Mario Bergoglio, archbishop of Buenos Aires, Argentina, who has taken the name Francis (after Saint Francis of Assisi).  At age 76 and in questionable health (reportedly he has only one functioning lung), he is unlikely to have a long tenure.  And although Pope Francis represents many “firsts” for the Church – among them, the first pope from the Americas, the first pope from South America, the first Jesuit pope – he is known for standing firmly for core doctrine, like his predecessors John Paul II and Benedict XVI.  The main way he’s different?  By all accounts a genuinely humble man – in Buenos Aires, he took public transportation around the city and eschewed the archbishop’s palace, choosing instead to live in a modest room, where he cooked his own meals – the new pope has a personal style that is the antithesis of Vatican splendor.   

                Wisely, the new pope has fought against the Marxist “liberation theology” movement that swept Latin America in the 1980s; he also has been an outspoken critic of Argentine President Cristina Fernandez’s leftist government, using his homilies to criticize economic and social conditions as well as political corruption.  Some commentators – such as the editors of Investor’s Business Daily – have idealistically predicted that Pope Francis might oppose leftist dictatorships in South America (including Venezuela, Ecuador, and Bolivia) the same way Pope John Paul II opposed communist dictatorships in eastern Europe (“Pope and Change in the Vatican,” March 15).  Nevertheless, Francis sees himself as a champion of the poor and outcast; and he shares with leftist “liberation” theologians the same perverted notion of “social justice” rooted in a moral philosophy of altruism, or self-sacrifice, which, along with the Church’s adamant stance against contraception, is responsible for most of the poverty in the “Third World.”  Like his predecessors, Pope Francis is unlikely to embrace the only true hope for happiness and prosperity in the world:  free-market capitalism.

      

    n  Hugo Chavez Is Still Dead 

    “Generalissimo Francisco Franco is still dead” was a catchphrase that originated in 1975 during the first season of NBC’s Saturday Night Live – a regular feature of Chevy Chase’s “Weekend Update” news segment – and which mocked the weeks-long media reports of the Spanish military dictator’s impending death.  Today, 38 years later, after weeks of media reports of the impending death of another dictator, Venezuela’s socialist “president” Hugo Chavez, on March 5 it finally was announced that Chavez was indeed dead from cancer.   

    Don’t expect SNL or any other mock “news” show (such as Jon Stewart’s Daily Show on Comedy Central) to repeat that joke from 1975: to today’s left-wing news media, the death of a socialist dictator like Chavez isn’t as amusing as the death of a fascist dictator like Franco.  (Some leftist journalists and Hollywood celebrities like Oliver Stone and Sean Penn have even lionized Chavez as a “champion of the poor.”  Never mind that he was a thug whose 14-year rule made his country even poorer and more crime-ridden.)  To the individuals who lived under the oppressive rule of either of these dictators, however, there’s not a bit of difference (socialism and fascism are simply different sides of the same tyrannical collectivist coin), nor is there anything either funny or sad about the end of their horrible lives.  

    Anyone who doubts former President George W. Bush’s warning that there’s an “axis of evil” in the world – a global conspiracy of rogue states who are rabidly anti-American because the USA represents the freedom and individualism they’re plotting to destroy – need only look at the guest list for Chavez’s funeral in Caracas.  It included Cuba’s Raul Castro, Iran’s Mahmoud Ahmadinejad, and the left-wing dictators of Bolivia, Ecuador, and Nicaragua.  The U.S. delegation was led by Rep. Gregory Meeks (D.–N.Y.) and former Rep. Bill Delahunt (D.–Mass.), who were “friends” of the dictator because of the cash and influence Chavez showered on them.  (Delahunt is the “broker” of Chavez’s gambit to buy influence in the U.S. through his cheap heating oil program, doled out through Joe Kennedy’s Citizens Energy.  Meeks sought Chavez’s help to improve the court chances of a convicted Ponzi schemer.) “Taking Chavez’s penny, both remain in his debt,” even after death (“Sorry Show in Caracas,” I.B.D., March 9).

      

    n The Once and Future Queen 

                The big news among fans of the British royal family is that Prince William and his wife, Kate (the former Kate Middleton, now Duchess of Cambridge), are expecting a baby, the heir-apparent to the British throne.  Of course, the current queen – Elizabeth II – following her triumphant 60th-anniversary Jubilee, is as popular as ever.  But the two men who are next in line – her son, Prince Charles, and her grandson, Prince William – are barely noticed these days.  Charles never was very popular; nor is he terribly bright – with his radical environmentalist nonsense and his self-conceit that he’s an expert on architecture.  And William, once the leading royal heart-throb and hence media darling, has lost some of his allure, after marriage and his growing bald spot.  (Thankfully for him, his wife’s fashion sense – and her increasingly obvious “baby bump” – seem to get most of the attention.)

    Anticipating that the future heir-apparent may be a girl, Great Britain and the other countries that are members of the Commonwealth of Nations (who all consider the reigning British monarch to be their monarch, too) in 2011 began the process of changing their centuries-old rules of succession that put sons on the throne ahead of any older sisters.  (Since the medieval period, England generally has followed the rules of primogeniture, which in determining the line of succession to the hereditary throne gave priority to the oldest male in the direct line of descent.  A woman would become monarch only if she had no brothers, either younger or older than she – which is partly why both Victoria and Elizabeth II became queen.)  The new rules would apply only to future heirs and would have no impact on the current line of succession.  (Next in line to the throne after Elizabeth II is Prince Charles, who is the queen’s firstborn child.  Charles’ sister, Anne, is lower in the line of succession than her younger brothers Andrew and Edward by virtue of their male gender.  William is second in line to the throne after his father, but since Charles had only sons – William and his brother, Prince Harry – the issue was never raised until William married.)

    The change in the rule of succession will require approval by the legislatures of all 16 Commonwealth nations that have Queen Elizabeth II as their head of state, including not only Britain itself but also such countries as Canada, Australia, and New Zealand.  But representatives of all the Commonwealth nations – meeting in Perth, Australia in late October 2011 – have agreed to the change.  (They also agreed to lift another centuries-old rule, a ban on British monarchs marrying Roman Catholics.)  Most observers believe the change is non-controversial, given the degree to which the laws in Commonwealth countries have recognized the legal and political equality of women since the early 20th century (when women were given the right to vote).  No doubt the long and mostly successful reign of Elizabeth II also has permanently changed the attitudes of the people of Britain and the Commonwealth countries, most of whose population cannot remember having a male monarch on the throne.  (It’s been over 60 years, after all, since Elizabeth’s father, George VI, was king.)  So, once the new rules are implemented, the next in line to the throne after Charles and William will be the baby that Kate is now carrying.  If she’s a daughter, she’ll be the first British princess to beat out any younger brothers and accede to the throne – assuming that there will still be a monarch sitting on a throne at that future time.

      

    n Mischievous Young ‘Un 

                North Korea’s 30-year-old communist dictator, Kim Jong Un (the son and grandson of dead communist dictators Kim Jong Il and Kim Il-sung, respectively), has been up to more mischief – and not just by “wining and dining” former NBA star Dennis Rodman (who called Kim “an awesome kid,” in what has to be the weirdest bromance of the year).  Rodman’s “basketball diplomacy” apparently has failed.  Angrily reacting to the U.N. Security Council’s recent unanimous decision to tighten sanctions against his rogue country, Kim’s regime announced that it was nullifying all nonaggression agreements with South Korea, declaring invalid the armistice agreement that ended the Korean War in 1953.  And it’s not just South Koreans who are now nervous.  As Pyongyang marches relentlessly toward deliverable nuclear weapons and the long-range missiles to carry them, one top North Korean general has claimed that his country has nuclear-tipped intercontinental ballistic missiles (which could reach the United States, not only Alaska but also the Northwest coast) ready to blast off.

    In a recent editorial, Investor’s Business Daily argues that North Korea’s recent actions – both the tearing up of the 1953 armistice and the threat of a nuclear strike on U.S. territory – may be more of a warning to China than to us: 

    “China has long been caught in a bit of a conundrum regarding North Korea.  Supporting Pyongyang has kept alive the threat of another war on the Korean peninsula, which has perpetuated a strategic problem for the U.S.  Beijing fears an Asian repeat of the reunification of Germany after the collapse of the Soviet Union.  A reunified Korea would give the U.S. a firm foothold on its border that does not sit well with China’s regional, expansionist goals.

     

    “Yet North Korea’s relentless pursuit of nuclear weapons and the missiles to carry them has created another big problem – renewed war on the Korean peninsula, including strikes on Japanese and U.S. territory, would not be good for China either.  Their dilemma has been how to keep North Korea as a useful nuisance but restrain it from being a threat.”

     

    (“Don’t Ignore Petulant Pyongyang,” I.B.D., March 12). 

    Now even B.O., who has been a critic of missile defense since long before his 2008 campaign (and who has been very cooperative with Russia in scrapping missile defense in eastern Europe), sees the merit in having some missile defense to protect the U.S. homeland from rogue regimes like Kim’s in North Korea.  “I can tell you that the United States is fully capable of defending against any North Korean ballistic missile attack,” White House spokesman Jay Carney recently said.  Let’s hope he’s right.

     

    n Going Postal – Going, Going, Gone 

                The U.S. Postal Service (USPS) has announced plans to stop Saturday home delivery (except for parcels), starting in August.  The USPS says the cost-cutting move is necessary if it is to remain viable.  Already, though, a chorus of howls is erupting in Congress.  Senator Susan Collins (R.–Maine) has called the plan “inconsistent with current law” mandating six-day-a-week delivery, saying it “threatens to further jeopardize [USPS’s] customer base.”  Now the media is reporting that the plan might be scrapped, recalling what happened a few years ago when the USPS proposed shutting 3,200 money-losing post offices.  The list was whittled down to just 162 after lawmakers started complaining.

    Something has to be done.  Last year, the USPS posted a record $15.9 billion loss, and Postmaster General Patrick Donahoe says it’s hemorrhaging about $25 million a day, thanks to massive bloat and a cratering core business (as more and more Americans turn to electronic forms of communication rather than “snail mail”).  All but 7,000 of its 32,000 post offices lost money, and a recent government audit found the USPS has almost twice as many mail facilities as it needs, 35,000 excess workers, tens of thousands of unneeded machines, and hugely wasteful travel costs.  The fact that 85% of its workforce is unionized doesn’t help, either.  Meanwhile, first-class mail volume is plunging.  It fell 25% over the past decade and will likely drop another 46% over the next.

    Donahoe complains that Congress wants USPS to run like a business, while refusing to let it do so.  “He’s right,” observe the editors of Investor’s Business Daily.  USPS has been trying for years to restructure its operations in light of its crumbling core business, only to be “stymied by Congress, which routinely blocks needed reforms either to protect special interests or spare districts from losing a precious post office.”  That’s why, the editors conclude, “the only real solution isn’t to tinker around the margins with phony congressional fixes, but to privatize the USPS entirely.”  That would liberate the agency to cut costs, improve efficiencies, innovate and grow new lines of business.  “Other countries have done so, and consistently found improved quality and lower cost.  Yet this common-sense idea keeps getting lost in the mail here” (“A Private Post Office?” February 8).    

      

    n Minimum Wage, Maximum Folly 

                In his 2013 “State of the Union” address, among the typical laundry list of asinine ideas, B.O. proposed that Congress raise the federal minimum wage from $7.25 to $9.00 an hour.  And a number of states have raised, or are in the process of raising, their government-mandated minimum wages – thus perpetuating one of the worst public policies of the past century.

    I wrote about this issue in a MayerBlog essay several years ago:  “Minimum Wage, Maximum Folly” (Oct. 20, 2006).  All the arguments against minimum-wage laws remain just as true today as they were then.  Indeed, they’re even more poignantly true today, with our sky-high unemployment rates, particularly among the groups whom minimum-wage laws hurt the most:  young, unskilled, and minority workers.  For example, teen unemployment has been above 20% for four years – and in California (one of the states considering a minimum-wage increase) the teen unemployment rate is nearly 35%.  “Fewer job opportunities is the last thing these teens need,” argues Michael Saltsman (research director at the Employment Policies Institute) in a recent op-ed.  He cites a new scholarly report – coauthored by UC-Irvine minimum-wage expert David Neumark – that exposes the flaws in other academic studies claiming that increasing the minimum wage has no negative effect on entry-level job opportunities.  That claim not only defies common sense and the realities of the business world but has been rejected by most sound (and unbiased) economists for years.  There’s now a “robust academic consensus that says the minimum wage will worsen the outlook” for young workers (“Minimum Wage Does Damage,” I.B.D., January 24).  And economist Walter Williams – whose splendid op-ed provided the name for my 2006 essay – continues to hammer home the critical point that minimum-wage laws historically have harmed the very groups they are supposed to help, particularly minority workers (“Minimum-Wage Hike Is Poison to Minorities,” I.B.D., March 6).  By the way, economics professor Frank Stephenson has posted a series of short essays on the “Division of Labour” blog, smashing to bits the illogical rationales offered by supporters of minimum-wage laws.  See, for example, his post “Some Questions for Mr. Ross” (February 12).

    It’s not just libertarian or conservative advocates of free-market economics who have pointed out the folly of increasing the minimum wage; some left-liberal economists have jumped on the bandwagon, too.  For example, Christina Romer – the former chair of B.O.’s Council of Economic Advisors, who is now back in academia teaching economics at Berkeley – wrote an op-ed published in the New York Times earlier this month which (in the words of the I.B.D. editors) “delicately but systematically dismantled [B.O.’s] call for a 24% hike in the federal minimum wage over the next two years.”  First, she questioned the need for a minimum wage at all, noting the essential truth that “robust competition is a powerful force helping to ensure that workers are paid” decent wages.  Then she stripped bare the argument that a minimum wage is needed on “fairness” grounds, pointing out that many minimum-wage workers aren’t poor but are supplementing a family income or are teens just entering the job market.  Plus, to the extent that businesses pass on the cost of a higher minimum wage to consumers, that could “harm the very people whom a minimum wage is supposed to help.”  Ms. Romer also demolished the claim that a higher minimum wage will boost economic growth by giving workers more spending power.  At best it would translate into less than $20 billion in new spending – “not much in a $15 trillion economy,” she observed.  No wonder Ms. Romer is no longer part of the regime.  “It’s too bad Obama isn’t listening to [her] these days.  If he ever did” (“Minimum Sense,” March 5).

      

    n “Outrageous” Nanny-Statism 

                The Transportation Security Administration (TSA) – an agency that ought not exist – has issued new rules relaxing the draconian restrictions that have been in place on airline passengers since the 2001 militant Islamic terrorist attacks.  Earlier this month TSA Administrator John Pistole announced that, effective April 25, travelers will be allowed to carry (or to have in their carry-on luggage) such items as small knives (knives without a molded grip and with blades less than 6 centimeters, or 2.36 inches, in length – like Swiss army knives, for example) and certain sports equipment, such as billiard cues, ski poles, hockey and lacrosse sticks, and two (but only two!) golf clubs.  Razor blades and box cutters, such as those the 9/11 hijackers used, would still be prohibited.  The move follows TSA actions that earlier eased restrictions on cigarette lighters and fingernail clippers, which were also formerly banned.  And the allowances are more closely in line with standards set by the International Civil Aviation Organization, the TSA said.  In other words, it’s a long-overdue, common-sense change in U.S. travel regulations.

    Nevertheless, the change set off an immediate outcry, especially from flight attendants.  The Flight Attendants Union Coalition, representing nearly 90,000 flight attendants at U.S. airlines nationwide, called it “a poor and shortsighted decision by the TSA.”  “We believe that these proposed changes will further endanger the lives of all flight attendants and the passengers we work so hard to keep safe and secure,” the group said.  Stacy Martin, president of Southwest Airlines’ flight-attendants union, Transport Workers Union of America Local 556, called the decision “outrageous” (“TSA to allow pocketknives on planes,” USA Today, March 6). 

    What’s really “outrageous” here are the misguided “security” measures the federal government has taken – including the creation of the TSA itself – since 9/11.  As I maintained in my essay observing the tenth anniversary of the Sept. 11, 2001 attacks – “Remembering 9/11: The Tenth Anniversary” (Sept. 15, 2011) – “the U.S. government (both the Bush administration and the Congress) did overreact, with mostly misguided policies – policies that have been followed (despite rhetoric to the contrary) by the B.O. regime and by both the 111th and 112th Congresses – that, unfortunately, have not really made Americans safer from terrorist attacks while, at the same time, have eroded Americans’ civil liberties.”  Rather than helping Americans be better prepared to defend themselves from future terrorist attacks, these policies have transformed us into a nation of sheep, even more dependent on the government for our security.  Nothing better illustrates this than the pathetic whining of some Americans about such a minor, common-sense change in these nonsensical rules.

       

    n “Big Brother” Bloomberg Strikes Again 

    New York City was once the greatest city in the world – both its financial capital and a leading cultural center.  Today, sadly, it’s more like the laughing-stock of the world, thanks to Mayor Michael “Big Brother” Bloomberg and the truly “outrageous” nanny state he’s turning the city into.  (Maybe we should call it a “nanny city-state,” or a “ninny state.”)  As I noted in “Summer 2012 in Review” (Sept. 12), Bloomberg “can’t seem to stop imitating ‘Big Brother,’ the paternalistic dictator of George Orwell’s dystopian novel 1984.”  First Bloomberg banned smoking in public places – not only indoor workplaces but even outdoor spaces, such as all the city’s parks, beaches, swimming pools, and pedestrian plazas.  (Never mind that the scientific studies purporting to show health dangers from “secondhand smoke” are not only dubious with regard to indoor smoking but totally non-existent when it comes to outdoor smoking.)  Then Bloomberg banned trans-fats from restaurant food.  (Never mind that trans-fat amounts to a tiny fraction of total fat consumption and, according to many nutritionists, is actually better for our health than saturated fats, which Bloomberg’s ban doesn’t touch.)

    Last summer Bloomberg proposed a ban on soft drinks sold in containers larger than 16 ounces, supposedly to force New Yorkers to reduce their sugar consumption.  His proposed rule would prohibit restaurants, mobile food carts, delis, and concessions at movie theaters, stadiums, or arenas from selling sugary drinks in cups or containers larger than 16 ounces.  The ban would apply to soda pop (beverages with more than 25 calories per 8 ounces) but not to 100% juice drinks or beverages with more than 40% milk.  Contrary to popular belief, however, the ban would not apply to convenience stores, such as 7-Elevens, or supermarkets, both of which are regulated by the state government.  (So customers can still get their “Big Gulps.”)  However, Bloomberg recently said that he’d urge the state to move forward with a ban that matches the city’s new regulations.

    The ban on soft drinks in containers larger than 16 ounces was approved on Sept. 13 by the city Board of Health, whose members were appointed by the mayor, and was to take effect at the beginning of March.  Thankfully, however, a judge of the New York Supreme Court (the name the Empire State gives its trial courts) – Judge Milton Tingling – struck down the infamous ban, on the brink of its implementation.  He declared Bloomberg’s soda diktat to be an unconstitutional “arbitrary and capricious” action that violates the doctrine of separation of powers.  Indeed, Judge Tingling ruled that “it would eviscerate,” not just violate, that fundamental doctrine of American government, and that “such an evisceration has the potential to be more troubling than sugar sweetened beverages.”  The judge also ruled that the city health board didn’t have the authority to limit or ban a legal item (like sugary soft drinks) under the guise of “controlling a chronic disease” (presumably, obesity), because “imminent danger” to public health by such a “disease” was not proven in this case.  (Thus, the judge correctly recognized that the ban was not a valid exercise of government’s “police power” because it did not concern public health.)  The judge imposed a permanent restraining order against Bloomberg’s ban, which the mayor immediately vowed to appeal to a higher court.  Various groups that had opposed the ban – including the American Beverage Association, which was one of the plaintiffs challenging it in court, as well as the National Association of Theater Owners – cheered the court’s decision. 

    Like a fascist Energizer bunny, however, “Big Brother” Bloomberg “keeps going and going” – constantly proposing new paternalistic government measures.  Immediately following last summer’s movie-theater massacre in Colorado – even before all the victims’ bodies were removed from the theater – Bloomberg politically exploited the horrid crime by calling for more government restrictions on guns and suggesting that police go on an illegal strike in order to coerce public support.  Following the Newtown, Connecticut school massacre early this year, he similarly exploited the tragedy to call for more gun controls, including a federal ban on so-called “assault weapons” and restrictions on ammunition.  And as if his wars on cigarettes, trans-fats, soft drinks, and guns weren’t enough, Bloomberg recently has taken aim at three other targets:  baby formula, Styrofoam containers, and tobacco products.  As part of the city’s campaign to urge mothers to breast-feed their infants – a program awkwardly named “Latch On NYC” – Bloomberg last summer announced new regulations on city hospitals, forcing them to keep baby formula hidden behind locked doors, to cease giving out samples of formula to new mothers, and to document a medical reason every time the hospital gives a baby a bottle.  Then in his mid-February “State of the City” address, “Big Brother” Bloomberg called for a ban on plastic-foam (such as Styrofoam brand) food packaging, arguing that it’s filling up landfills (another myth).  Most recently, he proposed to ban all stores from publicly displaying tobacco products.  Under his proposal, tobacco products would have to be kept out of public view – under counters, in drawers, or behind curtains – supposedly to discourage people from smoking.  “Even one new smoker is one too many,” Big Brother Bloomberg pompously said.

                Bloomberg is a megalomaniac who is depriving New Yorkers of their personal freedom – and of their responsibility for their own lives.  The important underlying issue here isn’t health or safety; rather it’s power – political power, the coercive power of government.  As Lord Acton stated in his famous maxim, “Power tends to corrupt, and absolute power corrupts absolutely.”  New Yorkers ought to examine their city charter and consider whether it vests far too much power in just one person, the mayor.

      

    n Do They Give a Merit Badge for Homophobia? 

                In early February the Boy Scouts of America (BSA) were “in the news again, for the only thing they ever seem to be in the news for anymore: their attitudes toward homosexuality.”  So reported Nick Gillespie, editor of Reason.com and a former Boy Scout himself, in a Wall Street Journal op-ed (“A Lesson from the Scouts’ Own Book,” February 1).  The BSA had planned to hold a vote on whether to end the national organization’s longstanding ban on openly “gay,” or homosexual, scouts and leaders.  Although many commentators expected the Scouts to adopt a compromise position – ending the national blanket ban but still permitting individual chartering groups (many of which are churches) to decide for themselves whether to permit open homosexuals to join and help run their troops – the BSA decided on February 6 to delay the controversial vote.  The organization declared it needed more time for “a more deliberate review of its membership policy” and would take action on the issue in May during its National Annual Meeting.

    As a matter of constitutional law, of course the BSA, as a private membership organization, is entitled to determine its own “values” and to decide for itself whom it will accept (or reject) as members; that is a fundamental freedom – actually, a part of both the freedom of speech and the freedom of “association” guaranteed by the First Amendment.  But the organization is under tremendous pressure, both from outside and from within (including present and former Scouts, both homosexual and heterosexual), to change its policy – as the attitudes of Americans generally are changing, becoming much more accepting of homosexuality (as the recent trend in favor of same-sex marriage indicates).  Nevertheless, perhaps because of those changing attitudes in the general population, many conservatives are adamantly opposed to a change in membership policy.  About 70% of troops are chartered to faith-based groups, including the Catholic Church, which condemns homosexuality as a sin.  Forced to welcome openly gay Scouts and leaders, some groups would end their BSA affiliation, seriously weakening an organization already declining in numbers and influence.

    Legal and political considerations aside, the issue indeed is one of moral principles, as conservatives maintain.  But the values they claim to uphold – the traditional values on which the BSA was formed a century ago – are seriously flawed and out of touch, not only with contemporary society but also with human nature.   Science teaches us that homosexuality is as “natural” as heterosexuality; that sexual orientation is determined by a variety of factors, both genetic and environmental, too complex to fully understand.  But the fact that only a small minority of people have a homosexual orientation doesn’t mean they’re not “normal” human beings – just as the fact that only a small minority of people have blood type AB doesn’t mean they’re not “normal” or “natural” either.  Traditionalists within the Boy Scouts claim that they’re upholding the Scout Oath, which requires members to do their best to be “morally straight” at all times.  What’s so “morally straight” about bigotry directed against those who are different – bigotry rooted in the very real phenomenon known as “homophobia” (heterosexuals’ irrational fear and hatred of homosexuality)?  Or, for that matter, why is it “morally straight” to expect boys and men who have homosexual orientation but who otherwise cherish the Scouts to repress their own feelings, just to belong? 

    In contrast, Girl Scouts USA – which celebrated its centennial last spring – has adopted a policy not to discriminate on the basis of sexual orientation.  The Girl Scouts not only have great cookies but also a great deal of common sense. 

     

    n More Idol Speculations – From a Survivor Fan 

                I had planned to never again mention Fox’s American Idol TV show, just as I had planned to skip watching the 12th season of the singing competition.  It’s all because of the changes in the judging panel.  Not that I was a fan of last season’s new judges, Jennifer Lopez and Steven Tyler, but I dislike their replacements even more.  Australian country singer Keith Urban is a nice enough guy (for a non-American singing in a genre that ought to be exclusively American), but his fellow newcomers, Mariah Carey and Nicki Minaj – when they’re not playing “dueling divas” – have egos not matched by their talent.  (To me, they seem to scream rather than sing their songs.)  Nicki Minaj is particularly annoying in her unpredictable oddity – aptly described by Robert Bianco (TV critic for USA Today) as a blend of “Paula Abdul’s mothering, Steven Tyler’s sometimes-inappropriate quirkiness, and, at her best, Simon Cowell’s directness.”  The veteran judge, Randy Jackson, on the other hand, is still annoyingly predictable, with all his “dawg” talk and his recurring complaint that the singing is “pitchy.”

                But like a motorist who can’t help but stop and gawk at a car wreck, I found myself watching, first the auditions, then the Las Vegas rounds (where the judges whittled the contestants down to the Top 10), and now the live shows from Hollywood.  “America” now gets to vote – that is, those Americans who are such fans of the show that they text, tweet, or phone their votes for their favorite.  (Everyone gets up to something like 50 votes, which pretty much guarantees that whoever has the most fanatic fans will win, regardless of whether they’re the most talented singer.)  Last year, I successfully predicted the winner, Phillip Phillips – whose unique singing style impressed me the first time I saw him in the auditions.  This year, I won’t press my luck by sticking out my neck again to predict a winner:  there’s no one who seems to me to be a standout (although the show’s producers seem to have done everything they can to stack the deck so the next winner won’t be another white male).  But I do have a favorite:  Lazaro Arbos, the 21-year-old from Naples, Florida, who stutters so badly he can barely speak but who sings beautifully – another Mel Tillis!  (Google that name if you don’t know who he is.)

                Meanwhile, I’m again enjoying another new season (amazingly, the 26th season) of CBS’s Survivor, the grand-daddy of TV “reality” competition shows.  As I noted in “Tricks and Treats 2012” (Oct. 25), the success of the show can be attributed to several factors, among them, an effective host (Jeff Probst, who now has his own talk show on daytime TV) and an essentially sound basic format.  The producers of the show have managed to keep that format intact (having “survivors” in a harsh wilderness setting, competing with one another in various games, and in their social as well as physical survival skills, to “outwit,” “outplay,” and “outlast” their competitors, not necessarily in that order) while adding enough new “twists,” or differences, each season to keep it interesting.

                Set in Caramoan, an island group in the Philippines, this season pits a team of “Fans” against a team of returning “Favorites.”  So far the veteran players have been winning most of the competitions – a result not only of their experience but also of the foolish strategy of the Fans, who’ve created a six-person alliance targeting the four youngest (and best-looking, and strongest) players for elimination.  But the Favorites have had some rough spots too, mostly due to personality conflicts (the divisive play of ex-“federal agent” Phillip and the volatility of Brandon, whose recent meltdown resulted in the team forfeiting so they could immediately vote him off).  Again, I won’t stick out my neck to make a prediction about who will win, but I will note my favorite player, one of the Favorites:  Malcolm Freberg, a 26-year-old bartender from Hermosa Beach, California.  A veteran of last season’s “Survivor: Philippines,” Malcolm is strong (he played college football for Dartmouth), fairly smart, and has a pleasant personality.  In his previous outing in the Philippines, he nearly “outwitted/outplayed/outlasted” his competitors, being the 15th player voted off (and the 8th member of the jury).  Malcolm says he’s “slightly less cocky” than the last time he did this, so perhaps he’s learned from his mistakes and this time will make it all the way to the finale.

      

    n  An Amazing Disg-race 

                Another “reality” competition show, The Amazing Race (also on CBS), is in hot water because of an outrageous recent episode.  Set in Hanoi, Vietnam, the March 17 episode had contestants trying to memorize a song performed for them by children in front of a portrait of North Vietnamese communist leader Ho Chi Minh, with subtitled lyrics that included “Vietnam Communist Party is glorious. The light is guiding us to victory.”  The contestants then had to go to a B-52 Memorial, which is the wreckage of an American bomber plane shot down during the Vietnam War, to find the next clue in their televised round-the-world journey.  In the episode, the twisted metal of the downed plane is treated like any other prop, with a bright “Amazing Race” “Double-U-Turn” sign planted in front of it, signifying to contestants the next phase of their scavenger hunt.

    The episode has aroused the ire of a broad spectrum of viewers, including (not surprisingly) Vietnam War veterans, the VFW, and the American Legion.  And it’s not just veterans’ groups or conservatives who are outraged:  Bob Beckel, the left-liberal commentator on Fox News’ The Five, was one of the first media figures to condemn the program.  So far both CBS network executives and the show’s executive producer, Jerry Bruckheimer, have refused to apologize – or even to acknowledge the controversy.   With a Marxist sympathizer in the White House, apparently folks in the media no longer consider it outrageous for American game-show contestants to participate in Communist propaganda, even when it’s done in a former enemy nation that was responsible for the deaths of tens of thousands of young Americans killed in war.

                    UPDATE:  At the beginning of the March 24 episode of Amazing Race, CBS made the following statement: “Parts of last Sunday’s episode, filmed in Vietnam, were insensitive to a group that is very important to us – our nation’s veterans.  We want to apologize to veterans, particularly those who served in Vietnam, as well as to their families and any viewers who were offended by the broadcast.”  (Notice that CBS does not admit it was wrong to promote Communist propaganda – just “insensitive” to Vietnam veterans.)  Producer Jerry Bruckheimer issued a similar apology via his company’s Facebook site. 

      

    n Some More BFDs

                 Another (occasional) tradition on MayerBlog is the entry discussing “BFDs” – issues in politics and/or pop culture that have caused much needless hand-wringing, for only in the sarcastic sense are they really (in the words of V.P. “Smiley” Joe Biden) “Big Fucking Deals.” Here are some more BFDs that have wasted a lot of time and effort this spring:

    • Sip-of-Water-gate:  Senator Marco Rubio (R.–Fla.) became the object of much derision from the leftist news media because of his awkward pause, to take a sip of water, as he delivered the official Republican Party response to B.O.’s State of the Union address on February 12.  Commentators on CNN (which might stand for “communist news network”) went so far as to declare that Rubio’s political future – including a possible run for the presidency in 2016 – was finished!  It seems that leftist criticism of Senator Rubio has intensified as he takes a leading role in negotiating immigration reform on Capitol Hill – and (ironically) as the predominantly leftist news media has begun touting him as the “savior” of the GOP (as Time magazine did on an early February cover).  But Rubio’s supposed gaffe is nothing compared to the embarrassing faux pas committed by B.O. when visiting Britain a couple years ago.  He rose to toast Queen Elizabeth II – beginning with the words, “Ladies and gentlemen, to the Queen!” – apparently unaware (didn’t any of his handlers tell him?) that those words should be spoken at the end of the toast, for they would signal the band to begin playing “God Save the Queen.”  Yet B.O., being his usual narcissistic self – oblivious to everything but his own imagined greatness – insulted the Queen and all of Great Britain by continuing to deliver the rest of his toast as the band played the British national anthem, which B.O. seemed to assume was merely background music for his own voice!  (Thanks to Thomas Lifson, who reminds us of B.O.’s “cringe-inducing gaffe involving a sip” and posts the video in his entry, “A Little Perspective on Rubio’s Sip of Water,” on the American Thinker blog, February 14.)

    • Michelle, Ma Belle – NOT!  Michelle Obama is the most overrated First Lady in U.S. history.  Like her husband, Mrs. B.O. benefits from what I’ve called “The Emperor’s New Clothes” phenomenon (in her case, the “Empress’s New Clothes”): many people in the news media, not wanting to be called racist for criticizing the first black First Lady in U.S. history, give her false adulation – oohing and aahing over her supposed great fashion sense, or her latest change in hair style – while ignoring the negative side of Michelle.  In truth, she’s not very fashionable, isn’t very attractive (with those silly Moe Howard bangs and that unsightly under-bite), and by many accounts (on those few occasions when the truth seeps through the media hype) she’s a shrew!  (The only sympathy I have for B.O. is because he’s married to this woman.  Little wonder that recently, while she and the girls flew to a posh resort in Colorado for a ski weekend, B.O. instead flew to Florida to play golf with Tiger Woods.)  More importantly, she hen-pecks not only her husband but also all Americans, with her crusade against what she regards as “unhealthy” foods.  Like her husband, she’s a paternalist who wants to dictate how other people should live their lives.  And, with all the grossly expensive travel junkets she’s taken since her husband began his occupation of the White House, she’s also probably the most profligate spender of money ever to be First Lady (another way in which she and B.O. make the “perfect couple” who deserve each other).  Mrs. B.O. is truly the “Marie Antoinette” of modern American politics – only instead of saying to the peasants “Let them eat cake,” she’s saying “Let them eat tofu”!

    • Not Exactly a Carnival Triumph:  Last month more than 3,000 passengers on the Carnival cruise ship, Carnival Triumph, experienced “cruise ship hell” when the ship broke down in the middle of the Gulf of Mexico.  The 900-foot, 14-story vessel was returning to Galveston, Texas from Cozumel, Mexico, on the third day of a four-day cruise when an engine-room fire knocked out power and plumbing on most of the ship.  No one’s life was endangered, and by virtually all accounts the crew of over 1,000 acted valiantly, doing their best to help the passengers cope with the lack of electricity, running water – and working toilets.  Toilets and drainpipes overflowed, soaking many cabins and interior passages in sewage, and filling the ship with an overpowering stench.  How could this happen?  Because, by several accounts, many of the passengers behaved like animals.  Passenger Jacob Combs, 30, an Austin, Texas-based sales executive with a health-care and hospice company, told Reuters, “Just imagine the filth.  People were doing crazy things and going to the bathroom in sinks and showers.  It was inhuman.”  Combs had nothing but praise for the crew members, saying they had gone through “hell” cleaning up after the irresponsible passengers.  “They were constantly cleaning,” he said, adding that after he left the vessel, “The thing I’m looking forward to most is having a working toilet and not having to breathe in the smell of fecal matter.”  No shit!  (Meanwhile, Carnival’s multiple apologies – and offers of full reimbursement, travel expenses, future cruise credits, and a $500 additional payment to help compensate for the ordeal – have failed to satisfy many disgruntled passengers, who vow to sue the cruise line for millions of dollars in damages.  Apparently they can’t accept that, despite the cruise industry’s generally remarkable safety record, from time to time, shit does happen.) 

    • Zombies and Witches, Oh, My!:  Vampires are so passé; the hottest supernatural genre in pop culture today seems to be zombies.  They’re the subject of a recent movie, Warm Bodies, as well as a hit cable TV series, The Walking Dead, now in its third season on AMC.  I hate to make this political (yeah, sure!), but the phenomenon seems apt today.  Aren’t zombies supposed to feed on living human brains?  That may explain all the “low-information voters” who reelected B.O.  There’s another fad in pop culture (the supernatural fantasy department) – witches – as illustrated by some current films (Beautiful Creatures, Oz the Great and Powerful, and Hansel & Gretel: Witch Hunters) and TV series.  (Even the CW’s Vampire Diaries seems to have been taken over by witches, overshadowing the vampire title characters.)  Why all this interest in witches?  No doubt it has something to do with certain women being touted as female “role models,” including former First Lady H.R.C. and the current First Lady, Mrs. B.O. 

    • The Devil’s in the Details:  History, the cable TV channel, has generated a mini-controversy with its hit series The Bible.  It seems that the actor cast to portray Satan bears a striking facial resemblance to B.O., according to many viewers – including Glenn Beck, who tweeted, “Anyone else think the Devil in #TheBible Sunday on History Channel looks exactly like That Guy?” (as noted above, Beck’s euphemism for B.O.).  “This is utter nonsense,” executive producers Mark Burnett (creator of Survivor) and his wife, actress Roma Downey, said in a joint statement.  “The actor who played Satan” – Mohamen Mehdi Ouazanni – is “a highly acclaimed Moroccan actor,” who previously played parts (including Satanic characters) in several Biblical epics, Burnett noted.  Downey, a devout Roman Catholic, added, “Both Mark and I have nothing but respect and love [for] our President, who is a fellow Christian.”  (She neglected to mention that B.O. is the son of a Muslim – and a Marxist – from Africa and that, by his own admission, he inherited the “dreams” of his father.)  But the real scandal here is that History – which used to be known as The History Channel – has so little real history in its programming.  It seems that “History” today consists largely of myths, UFO stories, and redneck reality shows like American Pickers, Pawn Stars, and Swamp People.  

    • Wrestling with the IOC:  The International Olympic Committee (IOC) has announced that it plans to eliminate wrestling.  At its recent meeting in Switzerland, the 15-member executive board recommended that wrestling not be on the list of 25 core sports proposed for the 2020 Summer Games.  Jim Scherr, former CEO of the U.S. Olympic Committee and an Olympic wrestler in 1988, said the decision-makers behind the move do not represent countries where wrestling is popular and successful – namely, the USA (which has won 124 Olympic wrestling medals) or Russia (which along with the former Soviet Union has 167 wrestling medals).  Scherr added that he thinks the decision is “a reflection of the Eurocentric nature of the IOC board, the IOC membership as a whole.”  USA Today sports writer Christine Brennan went further, maintaining that the decision reflects the IOC’s anti-U.S. bias.  “The USA is to the IOC what Notre Dame is to college football.  People love to hate it” (“IOC’s anti-U.S. bias no surprise,” February 14).  For all those young athletes (and their families) whose dreams of an Olympic medal in wrestling may be crushed, the IOC’s decision really is a B.F.D.  And for everyone who appreciates the history of wrestling as a sport – one of the world’s oldest sports, and one of the original events at the ancient Olympic games – the IOC’s decision is a travesty.  But it illustrates a phenomenon that’s becoming all-too-predictable: an international bias against the West.  That such a bias has infected the Olympics – one of the greatest legacies left to the world by the birthplace of the West, ancient Greece – is simply sad.

      

     | Link to this Entry | Posted Saturday, March 23, 2013.  Copyright © David N. Mayer.


    The Unconstitutional Presidency - February 21, 2013

     

     

    The Unconstitutional Presidency

     

     

    This essay is in part based on a two-part talk on “Restoring the Constitutional Presidency” I presented at the Atlas Society’s Summit in Washington, D.C. on June 30 & July 1, 2012.  Videos of my talk are posted on The Atlas Society’s website: Part I and Part II.  (Each part is approximately 60 minutes in length, including Q&A.)

       

    In the fall of 1787, when Thomas Jefferson (then serving as U.S. ambassador to France) received a copy of the new Constitution of the United States proposed by the Constitutional Convention, he had concerns about two chief “defects” he perceived in the document.  One was the absence of a bill of rights – a concern Jefferson shared with the Antifederalist opponents of the Constitution and, probably, a majority of the American people.  That defect almost derailed ratification of the Constitution, but it was corrected by 1791 with the addition of the first ten amendments – the Bill of Rights – to the Constitution.   

    It was the second “defect” that Jefferson perceived that relates to the topic of this essay: the perpetual eligibility of the president for re-election.  Recognizing that the Constitution created a strong and independent Chief Executive, Jefferson feared that without “rotation in office” – what we today call “term limits” – the president would be perpetually re-elected, making him an officer for life, essentially an elective monarch.  Besides being an evil in itself, that would be just one step away from a hereditary monarch – which would undo the American Revolution, substituting an American king for the British monarch (George III) whom the Declaration of Independence had pronounced a “tyrant.” 

    Most of the American people in 1787-88, as the Constitution was being ratified, did not share Jefferson’s concern about the lack of term limits on the presidency.  Jefferson himself acquiesced in Americans’ confidence in George Washington – whom everyone expected to be elected the first president – and Washington earned that confidence by his decision to voluntarily retire after two terms.  Jefferson applauded Washington’s decision – which created a tradition that Washington’s successors (including Jefferson) would follow, until the 20th century.  But Jefferson nevertheless predicted that it would be necessary to amend the Constitution to limit presidents to two terms – making explicit and part of the text the tradition that Washington originated – because “the natural progress of things is for liberty to yield and government to gain ground.”  But, Jefferson added, amendment of the Constitution to correct this flaw would have to wait until “inferior characters” succeeded Washington and “awakened us to the danger which his merit has led us into.” 

    The “inferior character” who finally awakened the American people to the flaw in the Constitution that Jefferson had perceived was elected president in 1932: Franklin D. Roosevelt (FDR), who was reelected in 1936 and then reelected not only to an unprecedented third term in 1940 but even to an unprecedented fourth term in 1944.  As the president who greatly expanded federal government powers through his “New Deal” programs as well as the president who served as Commander-in-Chief during World War II, FDR wielded far more power than any of his predecessors and, in many ways, embodied the greatest fears of Jefferson, Madison, and other Founders who imagined that the great powers of the office might indeed transform the presidency into a monarchy.  Not until after FDR, by his death in 1945, had become, effectively, a president reelected for life (precisely what Jefferson had feared), did the American people finally amend the Constitution by adding the Twenty-second Amendment in 1951, limiting the president to two terms. 

    Although the Twenty-second Amendment has corrected the “defect” in the Constitution that so concerned Jefferson in 1787-88, the presidency still remains a dangerously powerful office.  FDR’s successors from both major political parties, both Democrat and Republican, have continued to expand the powers of their office – raising the specter of “an imperial presidency,” in effect, an elective monarch, who wields far more power than did the British “tyrant,” George III. 

    As I’ll argue in this essay, the modern American presidency has departed significantly from the model created by the framers at the Constitutional Convention and implemented by the early American presidents.  Unless something is done to restore the limits that the Constitution puts on the office, the modern presidency will continue to degenerate into a tyranny, thus fulfilling the worst nightmares of Jefferson and other Founders.   

    That’s especially true today, as the current “Occupier” of the White House, B.O., begins his second term.  B.O. has been, as I described him in my essay “Rating the U.S. Presidents 2012” (Feb. 22, 2012), not only “the worst president in U.S. history” but also “the most lawless president in U.S. history.”  B.O. himself asserted (in comments he made last March to the Russian president – confidential comments that were accidentally picked up on an open microphone) that in his second term, he could be “flexible” – meaning he feels free to ignore not just the constitutional restraints on his exercise of power but also political restraints (including public opinion).  A “lame duck” president can be an irresponsibly dangerous president.  Thankfully, as discussed in the final section below, the framers of the Constitution provided a remedy – impeachment and removal from office – one that I predict will have to be used, to put an end to B.O.’s tyrannical abuse of presidential power.

      

      

    The Constitutional Presidency:

    Restrained by the “Chains of the Constitution”

      

    During the 1790s Thomas Jefferson (whom Washington chose as his first Secretary of State) and James Madison (a leader in the U.S. House of Representatives) became the leaders of the first true national opposition political party.  They called theirs the “Republican” Party because they feared that the establishment Federalist Party was undermining the limited, republican form of government created by the Constitution.  A major part of their concern was the broad power wielded by Washington, particularly in the realm of foreign policy.  Jefferson and Madison did not blame Washington, whom they both admired, but they were concerned that Washington was following too much the advice of his Secretary of the Treasury, Alexander Hamilton – a man who openly admired the British monarchical system of government and who wished the American presidency would be more like the British monarchy. 

    One of the major issues that divided the Republican and Federalist parties – and which pitted Jefferson and Madison against Hamilton, in the arena of public-policy debates – was Washington’s decision to issue a Neutrality Proclamation in 1793, reminding Americans that the United States was neutral with regard to the European war between Britain and France.  It wasn’t the policy of neutrality per se that concerned Jefferson and Madison – indeed, they supported the policy – but rather the president’s unilateral action in issuing the proclamation without first consulting Congress.  They argued that by acting on his own authority as president, Washington had undercut Congress’s power to declare war, one of its enumerated powers in Article I, Section 8 of the Constitution. 

    The parties’ conflict over foreign policy served as the backdrop yet again to another constitutional controversy that arose during the presidency of Washington’s successor, John Adams.  The Federalists (especially the hawkish Hamiltonian wing of the party) were agitating for the United States to enter the European war on the side of Britain – in other words, to go to war against France.  As part of their attempted build-up to war (what some historians have called the “pseudo-war”) during the summer of 1798, the Federalists who controlled both houses of Congress passed into law a series of measures called the Alien and Sedition Acts.  Of these, the Sedition Act was most ominous: it criminalized criticism of the president or the Congress, in a blatant attempt to silence the Republican party opposition.  Jefferson and Madison responded by anonymously drafting resolutions, which were passed by the legislatures of Kentucky and Virginia, respectively, protesting the unconstitutionality of the laws. 

    In his Kentucky Resolutions – in the eighth resolution that he drafted – Jefferson declared a principle that I regard (in my book The Constitutional Thought of Thomas Jefferson) to be the touchstone of his philosophy regarding the Constitution.  He wrote: 

    “Free government is founded in jealousy, not in confidence.  It is jealousy, and not confidence, which prescribes limited constitutions, to bind down those whom we are obliged to trust with power.”

     

    After further noting that the Constitution had “fixed the limits” of power, he concluded: 

    “In questions of power, then, let no more be heard of confidence in man, but bind him down from mischief by the chains of the Constitution.”

     

    Jefferson’s metaphor is quite apt: the purpose of written constitutions is to limit governmental power, to prevent its abuse.  Hence, if we are to take seriously the Constitution, we also must take seriously its chief structural devices for limiting power, what Jefferson called “the chains of the Constitution,” including separation of powers and its corollary principle, checks and balances.  Jefferson took quite seriously these structural devices for limiting power; indeed, more than most of the Founders – and certainly more than any of the other U.S. presidents – he particularly took seriously the constitutional separation of powers, including the critically important limitations it put on the power of the President of the United States.               

    The success of Jefferson’s and Madison’s Republican Party in the election of 1800 – in which Jefferson was elected president and the Republicans won majorities in both houses of Congress – was considered by Jefferson to be a second American Revolution.  “The Revolution of 1800,” he called it, for it represented the American people’s validation of the party’s constitutional principles and a reaffirmation of the Constitution as it was understood at the time of its ratification.  The Federalists in various ways had departed from that original meaning of the Constitution – as noted above, the Hamiltonian wing of the Federalist Party had tried to transform American republicanism into a system reminiscent of the aristocratic, monarchical English constitutional system – and the Republican victory in 1800 also marked the beginning of the end of the Federalist Party.  A permanent minority party at the national level after 1800, the Federalists eventually disappeared as a party after 1815.  (Their opposition to the War of 1812, which they denounced as “Mr. Madison’s War,” essentially marked the death knell of their party.)  During the 1820s, the “Era of Good Feelings” that prevailed during James Monroe’s presidency, there was no clear partisan division, although schisms within the Jeffersonian Republicans (which first appeared during Jefferson’s second term) anticipated the next major party system, the “Second Party System” (as political historians call it) – the Jacksonian Democrats versus the Whigs – the precursor of our modern party system, of Democrats versus Republicans. 

    As president, Jefferson departed in significant ways from the practices of his predecessors, Washington and Adams – certain practices that he regarded as unconstitutional, for they stretched the powers of the office too far.  Some of these changes were more symbolic than substantive, such as Jefferson’s practice of sending his annual reports in writing to the Congress rather than delivering in person a “State of the Union” address (discussed in the next section below).  Others were both symbolic and substantive, such as Jefferson’s refusal to issue presidential proclamations for days of national prayer or thanksgiving – which he viewed as barred by the First Amendment religion clause (which he wrote, in famous words, “erected a wall of separation” between church and state).  More substantively, Jefferson sought to nullify the bad precedents that Washington and Adams set by making new precedents, more respectful of the constitutional limits on the presidency.  He refused to continue prosecutions that had begun under the Sedition Act, and he used his pardoning power to release from prison persons who had been convicted under the unconstitutional law.  When the United States became involved in its first foreign war, the “Barbary War” in the Mediterranean against North African Muslim pirates, Jefferson – mindful of his criticism of Washington’s Neutrality Proclamation – was careful to get Congress’s authorization of offensive measures (as noted in the section on Commander-in-Chief powers, below).  With regard to the most audacious act of his presidency, the Louisiana Purchase, Jefferson was so troubled by the precedent it might set that he drafted a constitutional amendment authorizing it, “to set an example against broad construction.”  

    In general, Jefferson recognized that the Constitution separated the national government’s legislative and executive powers, vesting the former in Congress and the latter in the president.  (And in addition to the general law-making power, the Constitution also vested in Congress the power over “the purse,” to collect taxes and to appropriate money, as well as important war powers, discussed in the section on the Commander-in-Chief power, below.)   In exercising his most important power as president, the “Executive Power” vested in the president by Article II of the Constitution, Jefferson understood his basic role was to enforce the laws made by Congress:  “I am but a machine, erected by the constitution for the performance of certain acts, according to the laws of action laid down for me,” he wrote in 1805.  This was not merely a presidential power, but a duty that the Constitution (by a provision in Article II, Section 3) imposed on the president: “he shall take care that the laws be faithfully executed.” 

    Jefferson transformed the presidency, restoring the original limits that the Constitution placed on the office, and his more-restrained model of the Chief Executive, “bound by the chains of the Constitution,” set the precedents that virtually all his successors followed throughout the 19th century.  Even Andrew Jackson, one of the “strongest” presidents of the 19th century, followed Jefferson in respecting separation of powers, vis-à-vis the Congress (although Jackson famously defied the U.S. Supreme Court); coming from the more radical wing of Jeffersonian Republicanism, Jackson also followed Jefferson in respecting the Constitution’s system of strictly enumerated federal powers.  (In one of his most famous acts, vetoing the bill that re-chartered the Second Bank of the United States, Jackson used the veto power precisely as Jefferson understood it, as a check against unconstitutional legislation – as the section on the veto power, below, more fully discusses.)  Within the limits of his office, however, Jackson did fully exercise his powers – prompting an opposition party, the Whigs (who derived their name from the English opposition party of the late 17th century), to criticize Jackson for acting more like a monarch, “King Andrew,” than like a president.  Before the Civil War, presidents of both parties, Democrat and Whig, respected the limits the Constitution placed on their office. 

    Even Abraham Lincoln, who as president during the Civil War stretched the powers of the national government to their limits (or even beyond their limits, as his critics have maintained to the present day), was mindful of the constitutional separation of powers.  He broadly used his presidential war powers (especially his power as Commander-in-Chief), ordering a blockade of Southern ports, authorizing the arrest and imprisonment of civilians without due process of law (suspending the writ of habeas corpus), and freeing slaves in “rebel” territories through his Emancipation Proclamation.  Yet Lincoln was deferential to Congress on matters of legislative policy (except for Reconstruction, where he used a “pocket veto” to nullify the Wade-Davis Bill).  The apparent paradox has been explained by historian David Donald, who calls Lincoln “a Whig in the White House,” citing his political background as a Whig (before the Republican Party was created in the mid-1850s) and the Whig criticisms of “King Andrew” Jackson. 

    Lincoln’s successor, Andrew Johnson, was the first president to be impeached and nearly removed from office.  As noted in the section on impeachment, below, Johnson was impeached in part because he was openly critical of Congress, using his office as a “bully pulpit” (as Teddy Roosevelt later called it), and thereby undermining Congress’s law- and policy-making prerogatives. 

    The fact that Johnson was impeached for using the presidential office essentially the same way his 20th-century successor, Teddy Roosevelt, has been praised for using it – as a “bully pulpit” – highlights how significantly the office of president was transformed in the early 20th century, during the so-called “Progressive Era.”  As I have frequently noted here on MayerBlog, the “Progressive” activists sought a more powerful, paternalistic government and had nothing but disdain for anything that stood in their way – not just the American tradition of individualism but also the American constitutional system, with all its structural devices (including separation of powers, checks and balances, and federalism) for limiting the power of government and safeguarding the rights of individuals.  It’s not surprising, then, that the two “Progressive” presidents of the early 20th century – the Republican, Teddy Roosevelt (TR), and the Democrat, Woodrow Wilson – departed in significant ways from the precedents set by Jefferson and their other 19th-century predecessors, as well as from the powers of the presidency as enumerated in the text of the Constitution. 

    TR famously used the presidential office as a “bully pulpit,” appealing directly to the American people, and thus by-passing Congress (and its legitimate power to not only make the laws but also to determine public policy).  More ominously, TR (in his autobiography) created a new model for the presidency – emphasizing the role of the president as “steward” of the American people: 

    “[O]ccasionally great national crises arise which call for immediate and vigorous executive action. . . . [and] in such cases, it is the duty of the President to act upon the theory that he is the steward of the people, and that the proper attitude for him to take is that he is bound to assume that he has the legal right to do whatever the needs of the people demand, unless the Constitution or the laws explicitly forbid him to do it.”

     

    What is so dangerous – frightening, in fact – about TR’s new model of the presidency is that it turns on its head the fundamental principle of the Constitution, the enumeration of powers.  Under the Constitution, the president has only those powers granted him, the powers enumerated in the text (most of them in Article II).  If the president exercises any powers other than those enumerated, he’s not being a “steward”; he’s acting unconstitutionally, and the word for that is “tyrant” or “dictator.” 

    Woodrow Wilson, a former university president who had written a doctoral dissertation that praised the British parliamentary system of government, had similar disdain for the limits the Constitution imposed on the presidency.  In fact, as president he fancied himself to be more like a British prime minister.  Little wonder, then, that he changed the Jeffersonian tradition of submitting written annual messages to Congress and instead started the tradition of modern presidents personally speaking before a joint session of Congress, in their annual “State of the Union” address (discussed below).  Among other things, Wilson led the United States into World War I – a European war in which the United States had no business getting involved – but he failed in his plans for U.S. participation in the League of Nations when the Senate refused to ratify the treaty.  It was a deliberate rebuff to Wilson – and a sign of the American people’s desire to return to constitutional “normalcy” after the war.    

                Unfortunately, two of Wilson's successors – the Republican, Herbert Hoover, and the Democrat, Franklin D. Roosevelt (FDR) – exploited the economic crisis of the Great Depression to expand not only the powers of the national government, generally, but also the powers of the president, vis-à-vis the other two branches of government, Congress and the Supreme Court.  The paternalistic national "welfare state" programs known as the "New Deal" actually began under Hoover, mostly under a model of government cooperating with business.  Under FDR the programs expanded greatly, following a more coercive model of government regulating, or controlling, business – as epitomized by FDR's exercise of unprecedented regulatory power under the National Industrial Recovery Act, which the Supreme Court (correctly) held to be unconstitutional.  Then FDR turned on the Court itself, with his infamous "Court-packing" plan of 1937 – a plan he never needed to implement because he was able to transform the Court, with pro-New Deal justices, through the normal process of attrition, as older justices died or retired.  By the time the United States entered World War II, all but one of the justices had been nominated by FDR; and the "Roosevelt Court" upheld virtually all of FDR's unconstitutional acts, including (in the Court's infamous Korematsu decision in 1944) his despicable executive order authorizing the confinement of Japanese-Americans in internment camps on the West Coast. 

                It was FDR who transformed the presidency – not only through his unprecedented election to four terms, but also through his exercise of powers not granted the president by the Constitution.  FDR’s “stewardship” of the office really was more like a tyranny or a dictatorship – and it emboldened his successors, both Democrat and Republican, to abuse power in additional ways.

       

     

    The Unconstitutional State of the Union Address

                

    Earlier this year the current "occupier" of the White House delivered the annual speech to a joint session of Congress known as the "State of the Union" address.  Setting aside the substance of B.O.'s speech – the policies he's pushing as he begins his second term, which I've generally discussed in my previous "2013: Prospects for Liberty" essays – I'm focusing here instead on the format of the "State of the Union" address, as it has evolved over the past century (since the presidency of Woodrow Wilson).   

    The modern “State of the Union” address, I argue, is unconstitutional – it is an abuse of the power legitimately granted the president to propose measures to Congress, and it is a perversion of the fundamental constitutional principle of separation of powers.  This particular abuse of presidential power has been committed by all presidents, of either party – Republicans and Democrats – since Wilson’s time.  (Indeed, my discussion in this section is based on one of the earliest essays I posted to my blog, “The Unconstitutional State of the Union Address” (Jan. 20, 2004), criticizing the State of the Union address as delivered by B.O.’s predecessor, President George W. Bush.)   

    Following the practice of recent years, the address is a grand spectacle of modern American political theater.  Announced loudly by the Sergeant-at-Arms ("Mr. Speaker, the President of the United States!"), the current occupant of the White House strides down the aisle of the House chamber – slowly (as he shakes the hands of politicians from both parties) – and then, following two standing ovations, begins delivering his speech to an audience consisting of members of both Houses of Congress, the Cabinet, the justices of the Supreme Court, and (in the balcony) notables, including the First Lady and some "ordinary citizens" whose stories the President will cite in his speech, to give it the personal touch (a practice artfully employed by that master of political theater, Ronald Reagan, and copied by his successors).  The President's speech consists largely of a laundry list of proposed new programs (the President's "_____-initiative," as each is usually described), peppered with political one-liners and greeted by raucous applause and additional standing ovations, at least from those members of Congress in the President's political party.   

    Thomas Jefferson would be appalled.   As president, he broke the precedents set by his predecessors, Washington and Adams, and instead of personally addressing Congress, he sent his annual message in writing, usually in December of each year of his presidency, when it was read (no doubt in a dry monotone voice) by the clerk of each House.  Although some cynical Jefferson biographers claim that Jefferson did this because he was not comfortable speaking publicly, his deliberate decision not to address Congress personally was one of several ways by which he sought to limit the presidency (in its symbolic trappings as well as in the actual exercises of power) to the duties enumerated in the Constitution.

                To Jefferson, the spectacle of the President personally delivering an annual address to Congress resembled too closely the British monarch’s opening of a new session of Parliament.  He strongly believed that such ceremony had no place in a republican form of government.  Indeed, as noted above, Jefferson always feared that the American presidency might become a sort of elective kingship and thus sought to dispense with all ceremonies or practices reminiscent of  European-style monarchies.  For the same reason, he stopped the practice of holding “levees,” or formal receptions, in the executive mansion and instead hosted dinner parties where the guests were seated “pell-mell” rather than according to protocol.  The British ambassador to the U.S. was offended at Jefferson’s informality – which probably was a sure sign that Jefferson had successfully “republicanized” the forms of American government. 

                With regard to the substance of government and the actual exercise of powers, Jefferson also took quite seriously the constitutional doctrine of separation of powers, which is a key structural feature of the Constitution.   He recognized that Article I vests the legislative power in Congress and that Article II vests the executive power in the President; Congress makes policy, and the President implements it.  The Constitution does authorize the President to “from time to time give to the Congress information of the state of the Union, and recommend to their consideration such measures as he shall judge necessary and expedient” – the constitutional basis for a State of the Union address.  But, as he typically did with regard to the power-granting clauses of the Constitution, Jefferson interpreted this provision quite strictly.  Hence, he sent written messages to the Congress, and his annual messages consisted of recommendations couched in deferential language, respectful of Congress’s role as the law-making branch of the national government.  As Jefferson described the role of the President as chief executive, “I am but a machine erected by the Constitution for the performance of certain acts according to the laws of action laid down for me.” 

                Jefferson's successors as President in the 19th century also would be appalled at the spectacle of the modern State of the Union address.  Regardless of party (whether Jeffersonian Republican, Whig, Democrat, or Republican), they all followed Jefferson's practice of sending a written annual message to Congress.  Even Abraham Lincoln, at the height of the Civil War, sent written messages to Congress, including his July 4, 1861 message to the special session of Congress he had called to deal with the crisis of the Southern states' secession from the Union.  To Lincoln, as to his fellow 19th-century presidents, it would have been unthinkable to appear personally in Congress to give a speech.  Typically, the only public speech an American president would make in the 19th century was the address he gave at his inauguration.  (Lincoln's Gettysburg Address – his brief remarks at the dedication of the military cemetery at Gettysburg, Pa. – was noteworthy not only as a great speech but as an exception to this rule.)  Indeed (as the final section below will more fully discuss), part of the reason President Andrew Johnson was impeached and almost removed from office was that he breached presidential etiquette by giving public speeches in which he criticized Congress ("in a loud voice" and with "intemperate, inflammatory, and scandalous harangues," as the impeachment articles put it). 

                The Jeffersonian practice ended in the early 20th century with Woodrow Wilson, who began the modern practice of presidents delivering their annual message in person to a joint session of Congress.  It’s not surprising that Wilson broke with the Jeffersonian practice, for in various ways Wilson rejected the American constitutional doctrine of separation of powers and instead favored the British parliamentary system, fancying his own role as President to be akin to that of the British prime minister.  As noted above, it was also consistent with the political atmosphere of the early 20th century – the so-called “Progressive” era in American politics – for policymakers to forget the importance of separation of powers, preferring “efficiency” to limits on power.  In this era, Congress began creating the so-called “fourth branch” of government, the independent regulatory agencies, which combined legislative, executive, and judicial powers in the same bureaucrats’ hands, thereby throwing separation-of-powers doctrine to the wind.  Not surprisingly, both the power of the President and the power of the federal government generally have grown over the past century, to extremes that America’s Founders could not have imagined in their wildest nightmares. 

                Modern presidents do not merely recommend measures to Congress.  Following the Budget and Accounting Act of 1921, the President has been given the initiative in submitting budgets and, effectively, in legislating, notwithstanding the constitutional separation of powers.   The only part of the lawmaking function assigned by the Constitution to the President – the veto power – was meant by the Framers as a shield against unconstitutional legislation, but modern presidents have turned it into a political tool to force their policy preferences on Congress (as the next section more fully discusses).  

                In modern State of the Union addresses there’s also something especially unseemly about the presence of Supreme Court justices, who are supposed to be independent of the other two “political” branches of the government (and also, at least in theory, insulated from shifting public opinions).  After all, many of the “initiatives” the president proposes might very well – if the system works as it should – end up being challenged in a case or controversy brought before the Court.  Justices have no business attending this political event.  

                B.O. has politicized the State of the Union address even more than his recent predecessors.  In his 2010 address, he abused the occasion by openly criticizing the Supreme Court's decision in the Citizens United campaign financing case – and in so doing, slandered the justices who were in attendance.  As I noted in my "Spring Briefs 2010" (Mar. 18, 2010), some two months after the event, Chief Justice John Roberts finally broke his silence about B.O.'s tactless remarks unfairly criticizing the Supreme Court – most of whose justices were sitting in the front row of the U.S. House chamber, as the Congress's distinguished guests.  Referring to the Court's recent decision in Citizens United v. Federal Election Commission, B.O. claimed that the Court "reversed a century of law to open the floodgates for special interests – including foreign corporations – to spend without limit in our elections."  Justice Samuel Alito spontaneously responded to B.O.'s comments by mouthing the words, "That's not true," and as many knowledgeable commentators – including my friend and colleague, Capital University law professor Brad Smith (former member and chairman of the FEC) – have noted, Alito was right.  B.O. had blatantly lied about the Court's decision (certainly not the first time the Bullshitter-in-Chief, a Harvard Law graduate and former law professor, was wrong about the law).  Chief Justice Roberts, speaking at the University of Alabama the first week of March 2010, answered a law student's question about the president's criticism of the Court as follows: 

    “The image of having the members of one branch of government standing up – literally surrounding the Supreme Court [justices], cheering and hollering – while the Court, according to the requirements of protocol, has to sit there expressionless, I think, is very troubling.”

     

    He then questioned why the justices attend the annual event.  "To the extent it has degenerated into a political pep rally, I'm not sure why we're there."  Yet most of the justices (including Chief Justice Roberts) continue to attend – though not all of them: to his credit, Justice Clarence Thomas – my favorite justice on the Court today – has steadfastly refused to attend.  Just as he did in his decision in the "ObamaCare" case in the summer of 2012, Roberts wimped out.  To quote the anagram aptly devised by my good friend Rod Evans (author of the word-play book, Tyrannosaurus Lex), "Fierce Justice Botches" – yet again. 

     

      

    Abuse of the Veto Power

      

                The power to veto legislation is generally regarded as a presidential power, but it is not even mentioned in Article II among the powers vested in the president by the Constitution.  That is because the veto is a prime example of a constitutional "check and balance," properly understood – that is, an exception to the general constitutional principle of separation of powers.  Thus, because it concerns legislative power, the veto is found in Article I, specifically in the so-called Presentment Clause (Article I, Section 7), which outlines the procedures by which bills become laws.  This provision specifies that if the president approves of a bill passed by both houses of Congress, "he shall sign it"; but "if not, he shall return it with his objections" to Congress, where the bill may still become law if approved by a two-thirds vote in each house.  The veto thus gives the chief executive a share in the legislative power, which under the Constitution principally rests with Congress.  Indeed, because Article I, Section 1 provides that legislative powers are "vested" in the Congress, the power given the president in Section 7 is an exception to this general rule.  Accordingly, it ought to be construed strictly – not to give the president a broad, discretionary "share" in the legislative power but rather to give him an opportunity to check Congress's abuse of the legislative power.    

                Why did the framers of the Constitution give this power to the president?  Remarkably, Alexander Hamilton and Thomas Jefferson – leaders of the two major political parties of the 1790s, who fundamentally disagreed about the Constitution and particularly about presidential power – both agreed that the purpose of the veto was to guard against unconstitutional legislation.  As explained by Hamilton in the Federalist Papers, the veto was intended to give the president a "shield" against legislation that unconstitutionally interfered with his powers.  Without the veto, the president "would be absolutely unable to defend himself against the depredations" of Congress, Hamilton argued.  Jefferson similarly described the veto as "the shield provided by the constitution" to protect not just the president but also the other branch of the national government, the judiciary, as well as the states, against the "invasions" by Congress of their "rights" under the Constitution.  Jefferson felt so strongly that the proper use of the veto was limited to constitutional grounds that he even undercut his own argument against Hamilton's bank bill by advising President Washington to defer to Congress unless he was certain of the bill's unconstitutionality.     

                As the framers intended, the veto gave a preliminary line of defense against unconstitutional laws before they were put into effect; it made it unnecessary to wait for an actual case or controversy to bring such legislation before the Supreme Court for that body to exercise its power of judicial review to nullify unconstitutional laws.  The veto power explains in part why presidents take an oath "to preserve, protect, and defend" the Constitution. 

                The early presidents understood that the limited purpose of the veto was to guard against unconstitutional legislation and, accordingly, they used the veto power very sparingly.  Our first president, George Washington, vetoed only two bills.  Washington followed the advice of his secretary of state, Thomas Jefferson, who advised him to veto a bill passed by Congress only if his mind were "tolerably clear that it is unauthorized by the Constitution."  Otherwise, Jefferson believed, the president should defer to Congress.  "If the pro and con hang so even as to balance" the president's judgment, "a just respect for the wisdom of the legislature would naturally decide the balance in favor of their opinion" (as Jefferson wrote at the end of his opinion on the constitutionality of the bank bill).  In his two terms as president, Jefferson cast no vetoes at all; his predecessor, John Adams, and Adams's son, John Quincy Adams, also cast no vetoes; James Monroe cast only one veto.  The only early president to exercise the veto power with any frequency was James Madison, who vetoed seven bills, all on constitutional grounds.  Thus in the forty-year period (1789–1829) covered by the first six presidential administrations there were a total of only ten vetoes. 

                During the rest of the 19th century and the early decades of the 20th century, presidents generally adhered to the tradition of limited use of the veto.  During his eight years in office, Andrew Jackson vetoed twelve bills – more than all his predecessors combined – thereby earning the nickname "King Andrew," given him by his political opponents, the Whigs (as noted above).  Although Jackson's famous veto of the bill rechartering the Second Bank of the United States was the first veto to cite policy considerations, his principal objection was that the Bank was unconstitutional.  Even Abraham Lincoln, a strong president who exercised extraordinary powers during the Civil War, cast only two regular vetoes.  "As a rule I think the Congress should originate as well as perfect its measures without external bias," Lincoln wrote, showing that he, like the early presidents, was reluctant to substitute his judgment on policy matters for that of Congress.  After the Civil War, the number of presidential vetoes increased dramatically, but the majority of vetoes were cast against private bills granting pensions to Civil War veterans who claimed service-related disabilities but whose claims had been rejected by the Pension Bureau.  Ulysses S. Grant, believing many of these claims to be fraudulent, vetoed forty private bills; Grover Cleveland vetoed 482.  In his book The American Presidency, historian Forrest McDonald notes that Cleveland's "veritable orgy" of vetoes was striking – Cleveland used the veto power more than twice as much as his twenty-one predecessors combined – but concludes that almost two-thirds of the bills vetoed during the Constitution's first two centuries were private bills, not public.  These vetoes were premised on constitutional grounds, for private bills generally violated the long-standing but unwritten constitutional principle that it was wrong for government to tax some citizens for the benefit of others. 

                In the early twentieth century, use of the veto power fell back to a lower level until the presidency of Franklin Roosevelt, when the power underwent another radical change.  William McKinley vetoed only two public bills; Theodore Roosevelt, fifteen; William Howard Taft and Woodrow Wilson, each more than twenty public bills.  Franklin Roosevelt, in his twelve years in office, vetoed over 600 bills – despite the fact that he had huge Democratic majorities in Congress who usually gave him what he wanted.  As McDonald observes, Roosevelt used the veto "as a method of cracking the whip, to keep Congress subordinate if not subservient."  Thus Roosevelt initiated the modern presidents' practice of using the veto as a political tool.  This practice was followed by FDR's successor, Harry Truman, who, when faced with a Republican Congress from 1947 to 1949, disapproved over 175 bills, using his veto messages as a means of calling attention to his policies while simultaneously attacking what he called "the do-nothing Eightieth Congress."   

                When modern presidents use the veto in so blatantly political a manner, they do more than simply engage in partisanship: they also seriously undermine the Constitution itself.  A president who vetoes every piece of legislation that he dislikes on policy grounds forces Congress to accede to his wishes, unless legislators are able to muster the two-thirds majority in both houses they need to override the vetoes.  Such use of the veto indeed constitutes blackmail.  It thwarts not only the will of the people, as manifested in the most recent congressional elections, but also the design of the framers of the Constitution, who intended that the legislative power be vested primarily in the people's representatives in the Congress.  Profligate use of the veto transforms the simple majority vote required by Article I into a two-thirds majority requirement – in effect, working a change in the constitutional procedures for enacting legislation.   

                B.O. not only has followed in the footsteps of post-FDR presidents who abuse the veto power, but he has carried that abuse – using the veto as a political tool to force his policy choices on Congress – to dangerous new heights, resulting in the further concentration of power in the White House.  Consider, for example, how B.O. has used the veto, or the threat of a veto, to force his tax-and-spend policies on Congress and to attempt to negate the role of the House of Representatives (the house that directly represents "the people" and that has the exclusive constitutional power of initiating tax legislation), simply because the House has a Republican majority that has rejected B.O.'s "soak-the-rich" tax policy.  As I discussed in the section on the so-called "fiscal cliff" in Part II of my "2013: Prospects for Liberty" essay (January 31), it was B.O. himself, and not Congress, who was responsible for creating such a fiscal "crisis," through his abuse of the veto power.  

     

       

    Abuse of the “War Power” (Commander-in-Chief)

      

    One of the most important ways in which the framers of the Constitution implemented the principle of separation of powers was to deliberately divide what some commentators have called “the war power” – but which is more properly understood as a bundle of discrete powers, “the war powers” – between the President and Congress.  The framers sought to chain “the Dog of war” (as Jefferson put it) by granting Congress the power to declare war, which is properly viewed as the power to determine whether or not the military forces of the United States should be used.  Other important powers concerning military policy also are vested in the Congress, including the powers to “grant letters of marque and reprisal” (authorizing private vessels to engage in war); to “make rules concerning captures on land and water”; to “raise and support armies”; to “provide and maintain a navy”; to “provide for organizing, arming, and disciplining the militia”; and even to “make rules for the government and regulation of the land and naval forces” (all powers of Congress enumerated in Article I, Section 8). 

    Yet, since the end of World War II, one of the chief ways in which modern presidents have abused the powers of their office – a practice exercised, unfortunately, by presidents of both major parties, Democrat and Republican – has been abuse of the president's power to be "Commander in Chief of the Army and Navy of the United States, and of the Militia of the several States, when called into the actual Service of the United States" (as specified in Article II, Section 2).  Modern presidents, claiming to act under authorization of this clause, have usurped Congress's power to declare war by deciding, on their own authority (without consulting Congress), to commit U.S. military forces to various "conflicts" or "police actions" (euphemisms for "wars") around the globe. 

    Harry Truman began this unconstitutional practice by sending troops to the Korean peninsula, to fight in the Korean War – claiming that his unilateral action (without any explicit Congressional authorization) was pursuant to U.S. treaty obligations, because the “police action” was authorized by the United Nations, after communist North Korea had attacked the free South Korea.  

    Defending the actions of Truman (and similar actions taken by his successors), some foreign-policy scholars have argued that U.S. ratification of the United Nations treaty has created a “new world order” in which the 18th-century rules about war and peace have become obsolete, thereby making the Article I, Section 8 clause granting Congress the sole power to declare war a kind of constitutional dead-letter.  That argument fails, for at least two important reasons.  First, the argument mischaracterizes the effect of U.S. ratification of the UN treaty:  no treaty can effectively amend the Constitution – which can be amended legitimately only in one way, pursuant to the procedures provided in Article V.  Second, the argument ignores the other provisions in Article I, Section 8 (noted above) that give Congress most of the other “war powers.”   

    Considered in the context of these provisions, as well as Congress's power to declare war, it is clear that, in so dividing the war powers, the framers of the Constitution meant the president's power as Commander-in-Chief to be quite limited:  it is the power to wage war, to actually command – to direct how the military forces of the United States are to be used – but not the power to determine whether U.S. military forces are to be used.  That important policy determination – the question whether to initiate war, to determine when and where U.S. military forces should be used – was left to the determination of Congress.  (Why?  Because Congress represents the people, and it is the people – through their tax dollars and their blood – who must pay the full costs of a war.)  Only after Congress has authorized war, by a formal declaration (the purpose of which is to clearly demonstrate that war has begun), may the president, as Commander-in-Chief, determine how the war is to be waged.  The exception is where the United States are attacked by a foreign power – where a state of war has been created by another nation's declaration of war against the U.S. – in which case some of the Founders, particularly Alexander Hamilton and his fellow Federalist war-hawks, believed that a Congressional declaration of war was unnecessary.  Other Founders, however – namely, Jefferson, Madison, and their Republican party – took the view that Congress must authorize all offensive uses of the U.S. military, even when a foreign power had declared war against or attacked the United States.  In such a situation, they believed, the president could respond with defensive military force only, until Congress authorized the use of offensive force.  (That was the position Jefferson took – for which he was roundly criticized by the Hamiltonian Federalists – during the first term of his presidency, with regard to the "Barbary War" in the Mediterranean.)    

    The Supreme Court has steered clear of the "war powers" controversy, regarding it as a nonjusticiable "political question" between the two "political" branches, the president and Congress – which is why the constitutionality of U.S. involvement in particular wars or conflicts has not been resolved by the courts.  From time to time, however, cases arise that indirectly concern the war powers, particularly when presidents' abuse of their Commander-in-Chief power adversely affects the rights of private persons.  For example, during the Korean War, President Truman attempted to seize control of the nation's steel mills, to avert a possible strike, arguing that the seizure fell within his powers as Commander-in-Chief because it concerned production of necessary war materials.  (He issued an executive order, which also raised the issue whether the president was abusing his executive authority by usurping the law-making powers of Congress.)  The Supreme Court rejected Truman's argument, finding that his actions were unconstitutional – specifically, a violation of the steel companies' property rights as well as an abuse of presidential powers.  Writing the opinion of the Court, Justice Black held that the president's executive order "cannot properly be sustained as an exercise of the President's military power as Commander in Chief of the Armed Forces" – because that power cannot constitutionally give the president the power "to take possession of private property in order to keep labor disputes from stopping production."  If a labor strike at steel mills would jeopardize national security and the "war effort," that's "a job for the Nation's lawmakers, not for its military authorities."  Justice Black added that Congress had given the president a lawful way to avert a strike in the steel mills – by enforcing a provision of the Taft-Hartley Act authorizing 80-day injunctions of strikes in emergencies – but that Truman failed to follow the law.  Instead of doing his duty to see that the laws passed by Congress are faithfully executed, Truman unconstitutionally attempted to be a "lawmaker" himself (Youngstown Sheet & Tube Co. v. Sawyer (1952)). 

    Presidents John F. Kennedy (JFK), Lyndon Baines Johnson (LBJ), and Richard Nixon, again abusing their powers as Commander-in-Chief, committed U.S. military forces in Vietnam, when communist North Vietnam invaded the free South Vietnam.  Again the presidents justified their actions by claiming to act under treaty authority (not only the UN but also SEATO, the Southeast Asia Treaty Organization, in which the U.S. had promised to help defend South Vietnam from attack).  During LBJ's presidency, Congress did pass the Gulf of Tonkin Resolution, authorizing the president to use military force to protect the troops he had already committed to the Vietnam War – a resolution that some scholars interpret as a de facto declaration of war, giving the president essentially a blank check to continue U.S. involvement in the conflict.  But as the war continued, and even escalated, during Nixon's presidency – after nearly a decade in which thousands of young men were drafted, compelled to fight in Vietnam, and either died or were wounded – both Congress and the American public tired of the conflict.  

    In this war-weary atmosphere of the early 1970s, Congress passed, over Nixon's veto, the joint resolution known as the War Powers Act (1973), which requires the president to consult Congress and obtain formal congressional approval (or some similar formal action by Congress) after U.S. military forces are committed in a foreign theater of war where hostilities exist or are imminent.  

    Starting with Nixon, however, no president – whether Republican or Democrat – has followed the letter of the law (the War Powers Act), because it has been the constant position of the White House (regardless of which party is in control) to regard the Act as an unconstitutional intrusion on presidential powers, including the Commander-in-Chief power.  In fact, however, the War Powers Act represents a minimal effort on the part of Congress to reassert its war power – its Article I, Section 8 authority to declare war – which Congress had, effectively, abdicated to U.S. presidents after 1945.  It’s not just the U.S. presidents who should be blamed for usurping Congress’s war power; it’s also Congress itself, which has for most of the past 65 or so years acquiesced in this unconstitutional power grab by the presidents. 

    For example, President George H.W. Bush (“Bush the elder,” as I call him) committed U.S. troops to the defense of the Kuwaiti monarchy, from attack by Saddam Hussein’s regime in Iraq, in the first Gulf War.  Bush the elder sent the troops to the Gulf starting late in the summer of 1990; by January 1991, when Congress eventually passed resolutions authorizing him to use military force offensively – literally on the eve of the air attack on Baghdad – U.S. military forces had been so committed (by the president’s own actions as Commander-in-Chief) that Congress had little choice, lest it be accused of not “supporting our troops.”  (Thus, Bush the elder essentially blackmailed Congress into authorizing the Gulf War.)  Bill Clinton, acting on his presidential authority alone and without prior authorization by Congress, committed U.S. troops to various conflicts around the globe, including the war in Bosnia (in the former Yugoslavia).  George W. Bush (“Bush the younger,” I call him) committed U.S. military forces to two wars, in both Iraq and Afghanistan, but at least he had Congressional authorization – a shameful resolution passed by Congress, in the wake of the 9/11/2001 Islamic terrorist attacks on the United States, giving Bush the younger essentially a blank check to fight wars against whomever he regarded as responsible for the terrorist attacks. 

    Following in the unfortunate (and unconstitutional) tradition of his predecessors, B.O. similarly has committed U.S. military forces to wars – particularly, the NATO operation in Libya – without authorization of Congress, as the next section will discuss.  But it’s actually among the least dangerous ways in which B.O. has abused the powers of his office.

      

     

    Abuse of Other Presidential Powers:

    Why B.O. Is the Most Lawless President in U.S. History

     

                 As regular readers of this blog know, since the 2008 elections it has been my policy to refer to the 44th president of the United States, Barack Obama, by his initials, “B.O.,” for the obvious reason.  As I’ve explained, “other modern presidents have been known by their initials – TR, FDR, JFK, LBJ – and using this president’s initials in lieu of his name seems appropriate because B.O. as president, in a word, stinks.”  (For a full explanation, see the first entry in my “Fall-deral 2008” essay (Nov. 6, 2008)).  

    B.O. came into office on a wave of popularity – intense popularity, verging upon adulation among many of his supporters – that perhaps was unprecedented in modern American history.  Much of that popularity came from his status, as popularly perceived, as “the first African-American” president (although technically Obama is biracial – his mother was white and his father was a black African, a Muslim, in fact – he identifies himself as black).  Much of the popularity also came from the image created by his campaign managers, claiming he was bringing not only “Change” and “Hope” to Washington but also transforming American politics and government.  B.O. was, to many people, not only a “post-modern” president but a “post-racial,” or “trans-racial,” leader who would bridge not only racial divisions in America but also partisan divisions, between Democrats and Republicans or left-liberals and conservatives – a “uniter,” not a “divider,” even a kind of “new (political) Messiah,” as some supporters claimed. 

                Behind this image of the 44th president is a man who in reality, frankly, doesn’t come close to living up to the hype.  B.O. personifies the naked “Emperor” in the famous children’s story by Hans Christian Andersen, “The Emperor’s New Clothes” – as I also have written here, since well before the 2008 elections.  (See my blog essay “The Emperor Is Naked!” Oct. 16, 2008.)   In that essay I maintained that B.O. is a master purveyor of bullshit – a true bullshit artist.  He got his party’s nomination and was elected president because of a campaign based almost entirely on bullshit.  And he was reelected to a second term because he has continued to be a master purveyor of bullshit – “the Bullshitter-in-Chief,” as I’ve called him – keeping up the “Emperor’s New Clothes” pretense, as I have discussed in Part I of this year’s “Prospects for Liberty” essay (January 17).  

                B.O.’s image portrayed him as a “uniter,” but in fact he has employed the standard Democrat demagoguery based on race and class, dividing Americans even more sharply along racial, class, and partisan lines.  (As Pat Caddell and Doug Schoen, moderate Democratic political consultants, have observed, B.O. is the most divisive president in modern history; he’s “tearing the country apart.”)  B.O.’s image portrayed him as a “centrist,” but in fact he has proved himself to be one of the most left-wing politicians ever to hold the presidency.  It’s not simply a case of substance failing to match style, for even in his much-vaunted style, B.O. has failed to live up to his hype.  The supposedly eloquent speaker has been exposed as an inarticulate moron overly dependent on a teleprompter.       

                The biggest load of bullshit associated with B.O. and his presidency, however, is the notion that he’d bring “change” to Washington.  All that B.O.’s regime has brought to Washington has been more of the same – more of the same old, tired, semi-socialist, paternalist policies that the federal government has been tinkering with, under both Democrat and Republican administrations, for the past century or so, since the beginning of the 20th-century regulatory/welfare state.  All B.O. really has attempted to do is to expand the welfare state – to increase government and its controls over Americans’ lives, making the U.S. even more of a socialist (or, more properly speaking, a fascist) country – and that’s not any kind of real “change” in public policy, at all.  No wonder the economy is such a “mess,” to use the term B.O. likes to use, as he still tries to blame on his predecessor, George W. Bush, the recession that B.O.’s policies have prolonged and worsened. 

                Thus, B.O. is, as I described him in my essay “Rating the U.S. Presidents 2012” (Feb. 22, 2012), “the worst president in U.S. history.”  In that essay, I summarized B.O.’s record as president – and explained why I rank him as a “failure,” at the bottom of the list:   

    “The least qualified person ever to hold the office of Chief Executive, he is unquestionably the worst president in American history.  B.O. was elected president by promising to bring `change’ to Washington, D.C.; instead, he brought more of the same old semi-socialist, welfare/regulatory state, paternalistic policies that the worst of his predecessors, whether Democrat or Republican, brought to Washington – except that B.O. did it at unprecedented levels, resulting in spiraling federal budget deficits and . . . adding more to the national debt than all previous presidents – from George Washington to George W. Bush – combined.  . . .  In comparison, B.O. makes Jimmy Carter seem competent, Bill Clinton seem moral, and FDR seem faithful to the Constitution.”

     

    I also gave my own “Top Ten” list, of the ten reasons why I regard B.O. as the worst president.  The most important – the No. 1 reason – was his “Contempt for the Constitution.”  I noted that, sadly, what I predicted in my 2009 “Prospects for Liberty” essay (Jan. 15, 2009) about B.O.’s presidency has come true: 


    “Given his politics, he lacks the requisite understanding of, and respect for, constitutional limits on the powers of government – which means that on January 20 when he takes the oath of office to `preserve, protect, and defend the Constitution of the United States,’ he will be lying.  Virtually everything he’ll do as president will undermine the Constitution, the limits it places on the powers of the executive branch and the federal government generally as well as its protections for the rights of individuals.”

     

    During the four years he has thus far been in office, B.O. has clearly shown nothing but contempt for the rule of law generally and particularly for the higher-law limitations that the Constitution imposes on the powers of the national government and the president.  

    To those reasons why I regard B.O. as the worst president in U.S. history, I added another – really an elaboration on the topic discussed above, B.O.’s contempt for the Constitution.  That additional reason is that B.O. is also “the most lawless president in U.S. history.”  He blithely violates not only the law of the Constitution – the higher law that limits the government – but also the broader limitations on the exercise of governmental power known as “the rule of law.”  By calling B.O. “lawless,” I mean that he not only acts in unlawful ways but also acts as though he’s not subject to, or controlled by, the law.   

    To show that B.O. is the most lawless president in U.S. history, I created another “Top Ten” list, of ten ways (not necessarily in order of importance) in which B.O. – in his first term in office – has demonstrated his contempt not only for the Constitution but also for the rule of law, generally.  What follows is a summary of that “Top Ten” list, slightly revised, to take into account recent political developments.  To paraphrase Thomas Jefferson’s introduction of the charges against King George III in the Declaration of Independence, “let these facts be submitted to a candid world”: 

     

    10.  Not the Rule of Law, But the Rule of Alinsky:  B.O. not only holds far-left political views (as shown by his constant attempts to use federal taxes to redistribute wealth), but also may be considered a disciple of the 1960s neo-Marxist anarchist/nihilist Saul Alinsky (1909–1972), author of the leftist strategy book, Rules for Radicals.  B.O. and his minions frequently practice the methods of Alinsky’s Rules, particularly rule #13 (which arguably is B.O.’s favorite): “Pick the target, freeze it, personalize it, and polarize it.”  One can see that in his attacks on “fat cat bankers,” “greedy health insurers,” and “millionaires and billionaires.”  That also explains why B.O.’s claim to be “a uniter, not a divider” has spectacularly backfired.

      

     9.  Orwellian Language:  B.O.’s lawlessness is reflected even in the way he speaks.  His major public speeches frequently use words in a manner that is best described as “Orwellian,” after George Orwell’s famous dystopian novel 1984, in which a totalitarian regime controlled people through the perversion of words.  Engraved on the wall of the “Ministry of Truth” in Orwell’s fictional Oceania were the slogans “War is Peace,” “Freedom is Slavery,” and “Ignorance is Strength.”  To those may be added a new slogan, suggested by B.O.’s speeches – “Spending is Investment” – as B.O. continually refers to increased federal government spending for such things as education, energy, and infrastructure as “investments.”  Related to B.O.’s Orwellian use of language is his equally dangerous practice of telling lies.  B.O. is a shameless and compulsive liar.  His regime’s explanation of the Islamic terrorist attacks in Benghazi, Libya was based on a big lie (as discussed below).  His recent State of the Union address contained several major lies (as the editors of Investor’s Business Daily recently observed).  And commentator Steve McCann, in a provocative piece posted on American Thinker, called B.O. “the most dishonest, deceitful, and mendacious person in a position of power” he’d ever witnessed (“The Mendacity of Barack Obama,” April 15, 2011). 

      

    8.  Initiating War:  As noted in the section above, like many of his predecessors, both Democrat and Republican, in the post-World War II era, B.O. has violated the exclusive power of Congress to declare war, initiating war through unilateral presidential action and abusing his Commander-in-Chief power.  In B.O.’s case, this unconstitutional exercise of the “war power” has been particularly egregious because he not only has continued the two wars begun during George W. Bush’s presidency, in Iraq and Afghanistan (both of which were at least authorized by Congressional resolutions), but also has committed U.S. military force to Libya, in the war that resulted in the downfall of Libyan dictator Muammar Gaddafi’s regime.  (In response to B.O.’s apologists, who deny that U.S. military intervention in Libya was “war” because U.S. air forces alone provided support for NATO operations there, the obvious answer is that war does not require ground troops:  whenever the U.S. uses its military force offensively against another nation, it is at war – and the constitutional requirement of a Congressional declaration of war cannot be ignored.)  B.O.’s Libyan venture is doubly egregious because the vital interests of the United States were not at all involved; rather, it primarily concerned the economic interests of Europe, as western European countries are the major buyers of Libyan oil.  So, B.O. abused U.S. military power to benefit our commercial rivals in Europe – yet another instance of B.O.’s occupation of the White House being more beneficial to other countries than to the United States.

      

    7.  Assassin-in-Chief:  Among the unconstitutional or extra-constitutional powers that B.O. has exercised, there is one in particular that ought to concern genuine civil libertarians.  As Jack Curtis observed in a provocative piece posted last year on American Thinker, B.O. has “order[ed] the killing of selected individuals (and any others nearby) located in foreign countries at will, using drone aircraft.”  Citing a lawsuit filed by the ACLU and the Center for Constitutional Rights that challenged this practice – and the Center’s press release saying that the B.O. regime has claimed it has the power to kill, without review, any American it decides is a threat – Curtis writes that this claim is “absolutely scary,” and explains: “Killing citizens at presidential will without review seems pretty far from constitutionally limited government. . . . The fact that it originated with killings mostly located in the backwoods [or mountains] of Pakistan offers no assurance of future locations” (“The Powers of This President,” March 7, 2012).   

    In a recent piece also posted on American Thinker, Herb Titus and William J. Olson write about the killing of civilians – including three U.S. citizens – on B.O.’s order in drone strikes in Yemen in 2011.  “As the worldwide drone program ramps up, there have been increasing calls for the president to reveal the basis for his claimed authority.”  Congress only recently has begun investigating the matter, as the Senate gathers information for hearings on John Brennan’s confirmation as CIA director.  NBC News on February 4 released a leaked U.S. Justice Department “white paper” that explains the regime’s criteria for ordering the “targeted killing” of American citizens off the battlefield on foreign soil.  This is a power that “no prior president ever thought he possessed – a power that no prior president is known to have exercised”; in effect, it is “the power to kill citizens without judicial process – a power that has been unknown in the English-speaking world for at least 370 years” (“Assassin in Chief?” February 7).

      

    6.  The B.O. White House “Enemies List”:  One of the charges against former President Richard M. Nixon, in the articles of impeachment drawn up by the House Judiciary Committee in 1974 (discussed in the next section below), was that the Nixon White House had maintained an “enemies list” and was using the powers of government to punish individuals on that list.  It turned out that the allegation was false – the Nixon “enemies list” amounted to nothing more than a list of persons who were not to be invited to White House social events.  Nevertheless, the specter of an administration compiling an enemies list and using the awesome powers of government to punish persons on that list has, sadly, come true in the B.O. White House.   

    There’s the infamous example of Gerald Walpin, former inspector general for AmeriCorps, who blew the whistle on political corruption in that government program – and who thus became the target of a campaign of character assassination orchestrated by the B.O. regime.  There’s also the case involving Charles and David Koch, the businessmen brothers who help bankroll various nonprofit organizations that promote personal liberty and free enterprise – and who were selected as a “political punching bag” by the president’s re-election team.  In an op-ed last year in The Wall Street Journal, the Koch brothers’ attorney, Ted Olson (former solicitor general of the United States), describes how the B.O. regime has demonized his clients – among other ways, by having the White House engage in derogatory speculative innuendo about the integrity of the [Kochs’] tax returns; by having a leading Democratic member of Congress demand their appearance before a congressional committee to be interrogated about the Keystone XL oil pipeline project (in which the Kochs have no involvement, other than publicly criticizing B.O. for killing it); and, more generally, by having the president’s surrogates and allies in the media regularly attack them, sully their reputation, and question their integrity. 

    As Olson notes, “when Joseph McCarthy engaged in comparable bullying, oppression, and slander from his powerful position in the Senate, he was censured by his colleagues and died in disgrace. . . . In this country, we regard the use of official power to oppress or intimidate private citizens as a despicable abuse of authority and entirely alien to our system of a government of laws. . . . That is why it is exceedingly important for all Americans to respond with outrage to what the president and his allies are doing to demonize and stigmatize David and Charles Koch,” who have been the targets of a “multiyear, carefully orchestrated campaign of vituperation and assault,” one that has been “choreographed from the very top” (“Obama’s Enemies List,” Feb. 1, 2012). 

    The B.O. regime’s attacks on Mr. Walpin and the Koch brothers are not isolated instances; they are routine practices of a regime that implements Alinsky’s Rules for Radicals and particularly B.O.’s favorite rule, “Pick the target, freeze it, personalize it, and polarize it.”  Other Americans who have been targeted as “enemies” of the regime are documented in Chapter 7 (“We’re Gonna Punish Our Enemies”) of David Freddoso’s book Gangster Government.  They include Rush Limbaugh, the U.S. Chamber of Commerce, FOX News, and even the justices of the Supreme Court, whom B.O. directly attacked (by mischaracterizing the Court’s Citizens United decision, as noted above) during his 2010 State of the Union address, delivered with six justices seated immediately in front of him!  

     

    5.  Taking Over the “Fourth Branch” of Government: Administrative agencies – agencies such as the Environmental Protection Agency (EPA), the Federal Communications Commission (FCC), the National Labor Relations Board (NLRB), and so on – are regarded as “independent” executive agencies because they were created by acts of Congress and are staffed by commissioners with staggered terms, so no one president can control them through his appointments.  They also have been called the “fourth branch” of government because these agencies combine legislative (rule-making), executive (enforcement), and judicial (adjudicating disputes in administrative courts) functions, a dangerous blending of the three essential functions of government that the framers of the Constitution meant to keep separate.   

    Not only are these agencies dangerous in themselves to Americans’ liberties, but they have become particularly dangerous during B.O.’s regime because of his efforts to control them (through his appointees) and to use them illegitimately to make laws – abusing the agencies’ regulatory powers to by-pass Congress, by enacting “rules” that actually amount to new legislation.  When Congress failed to pass the “cap-and-trade” bill that B.O. was pushing, supposedly to control so-called “greenhouse gas” emissions (essentially carbon dioxide, from the burning of “fossil fuels” – coal, oil, and natural gas), B.O. had his handpicked EPA administrator, Lisa P. Jackson, assume regulatory authority over greenhouse gases, even though Congress had not empowered the agency to do so.  Similarly, when Congress refused to enact “card check” legislation doing away with secret ballots in union elections, B.O.’s appointees to the NLRB imposed the change by administrative decree.  With their threatened action against Boeing, to block the company’s plan to build a facility in South Carolina (a right-to-work state), the pro-union NLRB even attempted to dictate where a business may locate.  And B.O.’s appointees to the FCC have assumed, or attempted to assume, regulatory authority over the internet, even though Congress has not given the agency such power.   

     

    4.  King of the “Czars” and Abuser of the Recess Appointment Power:  Modern presidents’ practice of appointing so-called “czars” – presidential appointees not subject to Senate confirmation – is an abuse of the appointment power, as these “czars” function not merely as presidential advisors but as actual bureaucrats, exercising executive powers.  B.O. has outdone all his predecessors in this unconstitutional practice, appointing – in just his first few months in office – “more czars than the Romanovs.” 

                Related to his appointment of unconfirmed “czars” is another abuse of the presidential appointment power that B.O. has employed.  The Constitution provides (in Article II, Section 2, clause 3) that the president may fill “vacancies that may happen during the recess of the Senate.”  Previous presidents have exercised this power only when the Senate was legitimately in recess (under the Constitution, the Senate may not recess for more than three days without the consent of the House); the shortest prior period for a recess appointment was a break of ten days.  B.O. has abused this recess appointment power by making appointments while the Senate was not officially in recess – when in fact the Senate was holding pro forma sessions to stay in business – as he did in January 2012, with the appointment of former Ohio attorney general Richard Cordray as director of the new Consumer Financial Protection Bureau (CFPB), a self-funded independent agency created by the Dodd-Frank Act in 2010, as well as the appointment of three new members (all of them pro-union) to the National Labor Relations Board (NLRB).   

                In both cases, B.O. made the illegitimate appointments to avoid contentious Senate confirmation battles; and in Cordray’s case, the appointment was not only unconstitutional but also illegal, as the law creating the CFPB specifies that the director has authority to act only after being confirmed by the Senate.  These appointments, made in a deliberate attempt to circumvent the Senate’s constitutional exercise of its “advice and consent” power, thus put under a cloud of illegality each and every decision made by the CFPB and the NLRB.  As I noted in Part III of my “2013: Prospects for Liberty” essay (February 7), a recent decision by the U.S. Court of Appeals for the District of Columbia Circuit held that B.O.’s so-called “interim” appointments to the NLRB were unconstitutional (Noel Canning v. NLRB, Jan. 25).  The ruling apparently invalidates all decisions made by the NLRB in which these illegitimate appointees took part, and it also calls into question not only Cordray’s appointment as director of the CFPB but also all the actions he has taken (and rules he has announced) as director.  It remains to be seen whether the Supreme Court will affirm the Court of Appeals decision – and thus uphold the Constitution against B.O.’s unlawful use of the appointment power. 

     

    3.  King of the Waivers:  The rule of law requires that government act – exercising its power to use force – through general, objective laws that apply equally to all individuals.  (That’s essentially what America’s founders meant by “a government of laws,” not “the rule of men” – the sine qua non of republican government.)  This aspect of the rule of law is also reflected in the constitutional guarantee to individuals of “the equal protection of the laws.”  When a president does not uniformly enforce federal laws, applying them equally to all Americans, but instead selectively enforces the laws – exempting certain favored individuals or groups from the burdens that the laws impose on others – he thus violates both the constitutional guarantee of equal protection and the general principle of the rule of law.  

    Rather than faithfully execute the laws, the duty imposed on him by the Constitution and the presidential oath, B.O. has instead selectively enforced federal laws, assuming a discretionary power to “waive” the law for certain favored individuals or groups.  Because the 2010 federal health-insurance law (so-called “ObamaCare”) imposes heavy costs on virtually all employers, the B.O. regime has sought to soften its impact, making it more politically palatable to the American people, as well as to reward its friends, by granting thousands of waivers from the law’s onerous mandates.  Thus, as of last year, thousands of businesses, state and local governments, labor unions, and insurers, covering over three million individuals or families, have been granted a waiver from ObamaCare by Secretary of Health and Human Services Kathleen Sebelius.  Michael Barone has noted that more than half of those three million persons participate in plans run by labor unions (even though union members are only 12% of all U.S. employees, they have received 50.3% of ObamaCare waivers).  He also noted that in 2011 Sebelius granted 38 waivers to restaurants, nightclubs, spas and hotels in former Speaker Nancy Pelosi’s San Francisco congressional district (“Waiver Grants Latest Example of Chicago Way,” Investor’s Business Daily, May 26, 2011).  

    The practice of selective issuance of waivers has continued with the White House’s announcement that ten states would be given waivers from the Bush-era federal education law known as the “No Child Left Behind” law.  (Like ObamaCare, that law unconstitutionally injects the federal government into matters reserved to the states under the Constitution.  Consistent with his oath of office, the president may “waive” enforcement of such unconstitutional laws in all fifty states.  But to waive it in some states while enforcing it in others is exactly the kind of favoritism or cronyism that the principles of equal protection and the rule of law forbid.) 

     

    2.  Crony Fascism:  As I’ve previously noted here on MayerBlog, B.O.’s agenda to expand the federal regulatory/welfare state may be more properly called “fascist” rather than “socialist” because, rather than having outright government ownership of major industries, his policy leaves businesses in private hands but subjects them to pervasive government regulatory control – much as Hitler did in Germany and Mussolini did in Italy in the 1930s and 1940s.  Indeed, Sheldon Richman in the Concise Encyclopedia of Economics defines fascism as “socialism with a capitalist veneer.”  Citing this definition in an American Thinker essay, Steve McCann has identified the B.O. regime’s use of cronyism – what some call “crony capitalism” but which is actually crony fascism – as a key element of what he calls B.O.’s “fascist economy” (“Obama’s Fascist Economy,” Sept. 21, 2011).  The Orwellian-named Patient Protection and Affordable Care Act of 2010 (popularly known as “ObamaCare”) is not only fascist but, with the thousands of waivers now being selectively granted by B.O.’s secretary of health and human services (discussed above), is also a perfect example of how the B.O. regime’s fascist policies also lead to political corruption in the form of cronyism. 

                Two other key examples of B.O.’s cronyism aptly illustrate how his policies have not only undermined the American economy but also violated the rule of law.  The infamous bailouts of two Detroit auto companies, Chrysler and GM – for which B.O. claimed credit during his 2012 reelection campaign, boasting that he had “saved” the U.S. auto industry – were in essence government takeovers of the two companies in order to pay off a major campaign donor to B.O. and other Democrat politicians (the United Auto Workers union), at the expense of bondholders of the two companies and in blatant disregard of long-established rules of bankruptcy law.  As noted above, when President Truman tried to nationalize the steel industry, in the face of a looming national strike that he claimed might endanger the Korean war effort, the Supreme Court found his actions to be unconstitutional, in the landmark Youngstown Sheet & Tube Company v. Sawyer decision in 1952.  And as law professor Todd Zywicki has pointed out, “by restructuring through a politicized bailout process,” GM and Chrysler “were left in a weaker competitive position than they would have been had they simply gone through a traditional Chapter 11 [bankruptcy] process” (“Romney’s Big Opening on Bailout,” Investor’s Business Daily, Feb. 22, 2012). 

                Similarly, B.O.’s “green” energy program is a textbook example of corrupt cronyism, involving a series of “green,” or alternative-energy, companies with ties to the White House, which the B.O. regime promoted and heavily subsidized with taxpayer funds – funds that now seem to be going down the drain as the companies fail.  The most visible part of the scandal is Solyndra, the failed solar-panel maker that squandered $535 million of “stimulus” money.  But many more companies are involved in the scandal, as an investigation by reporters for the usually leftist Washington Post has revealed.  Post reporters “found that $3.9 billion in federal grants and financing flowed to 21 companies backed by firms with connections to five Obama administration staffers and advisers” (“Influence for Sale,” Investor’s Business Daily, Feb. 17, 2012).    

     

    1.  Abuse of Executive Orders:  B.O. has followed the unfortunate precedents set by many of his predecessors in abusing his power to issue “executive orders,” not to clarify how the executive branch will enforce the laws passed by Congress but instead to make new laws, thereby usurping the legislative power that Article I of the Constitution vests in the Congress.  He has done so more audaciously than any of his predecessors, actually bragging about how he has by-passed Congress.   

                For example, on a three-day Western trip in late October 2011, B.O. announced that he would use executive orders to implement three initiatives: programs that he claimed would help 1.6 million college students repay their federal loans, 1 million homeowners meet their mortgage payments, and 8,000 veterans find jobs – all without any legislation being passed by Congress.  In a speech to students at the University of Colorado – Denver, he infamously declared, “We can’t wait for Congress to do its job.  So where they won’t act, I will. . . . We’re going to look every single day to figure out what we can do without Congress.”  (It was the opening act of B.O.’s reelection bid, employing a strategy of trying to imitate Harry S. Truman’s campaign against a supposed “do-nothing” Republican Congress, but ignoring the fact that B.O.’s own Democratic Party still controls the Senate.)   

                B.O. began aggressively using his executive-order power almost from the moment he took office.  The day after he was inaugurated, he revoked one of George W. Bush’s executive orders limiting access to presidential records.  The very next day, he signed an executive order calling for the U.S. military detention facility in Guantanamo Bay, Cuba, to be closed within a year – an order that has not been implemented, as “Gitmo” remains open today.  He continues to abuse the power, with his “can’t wait for Congress” initiatives, including his implementation last summer of parts of the so-called DREAM Act (providing government benefits for the children of illegal aliens), when Congress refused to pass the legislation.  (Commenting on B.O.’s order, issued through a Homeland Security Department memo which effectively means that immigration laws no longer apply to some 800,000 people, columnist Charles Krauthammer has called it “a fundamental rewriting of the law,” by presidential fiat.  “The Immigration Bombshell: Obama’s Naked Lawlessness,” Investor’s Business Daily, June 22, 2012.)  Most recently, B.O. has said he’d implement over 20 different gun-control measures by executive order if Congress fails to act as he demands, to “reduce gun violence.”  Never mind that the president lacks such authority under federal law – or that, even if he had the authority, it would violate Americans’ Second Amendment rights, as I discuss in Part III of my “2013: Prospects for Liberty” essay (February 7). 

                Bill Clinton’s abuse of the executive-order power (ably chronicled in chapter 3 of the book The Rule of Law in the Wake of Clinton (2000), edited by the Cato Institute’s Roger Pilon) led former White House aide Paul Begala to quip in The New York Times:  “Stroke of the pen, law of the land.  Kind of cool.”  That nicely sums up the cavalier attitude that B.O. and his regime have for the rule of law.  It’s not “cool” to abuse the power of the presidency; it’s unconstitutional. 

     

                To this rather lengthy list may be added two more recent instances of abuse of power:  the cover-up of the Justice Department’s “Fast and Furious” Mexican gun-running scheme, and the cover-up of the real story behind the Sept. 11 Islamist attack on the U.S. consulate in Benghazi, Libya. 

    “Operation Fast and Furious” was the illegal Mexican gunrunning operation in which agents of the Bureau of Alcohol, Tobacco, Firearms, and Explosives pressured U.S. gun dealers to sell weapons to Mexican cartels, all in an apparent effort to raise public support for gun control in the U.S.  Despite efforts by Congressional investigators to find out how the ill-fated operation began – especially to see if a “smoking gun” could be found, tracing the origin of the illegal operation to the office of the Attorney General or even to the White House – Attorney General Eric Holder apparently has successfully evaded responsibility for the operation, targeting instead some low-level officials in the Justice Department and its subsidiary Bureau.  (See “Holder’s Cover-Up Gets Criminal,” Investor’s Business Daily, Jan. 31, 2012.)   The White House assisted Holder in the cover-up by asserting the doctrine of “executive privilege” – the same doctrine that former President Nixon asserted in an attempt to block congressional investigation of the Watergate scandal (noted below). 

    It is interesting that notwithstanding the wholesale turnover of members of B.O.’s cabinet – like rats leaving a sinking ship – one prominent member of B.O.’s first-term cabinet is apparently staying on for the second term:  Attorney General Eric Holder.  That’s ironic, because Holder appears to be either one of the most corrupt or one of the most incompetent attorneys general in modern U.S. history, as revealed by Congressional investigations into “Operation Fast and Furious.”  In previous presidencies (the Nixon administration, noted below, and most famously, that of Harry S. Truman – who even had a sign on his Oval Office desk, “The buck stops here”), the occupant of the White House was held responsible for the illegal actions of his subordinates, even if he had no actual prior knowledge of their activities.  Apparently that’s no longer the case, with the current “occupier” of the White House.  

    Another troubling cover-up – so troubling because the B.O. regime apparently again has succeeded, first, by deflecting Congressional Republican attempts to get to the truth and, second, by distracting the “lamestream” news media from reporting the story – is the regime’s cover-up of the September 11, 2012 violent attack on the U.S. consulate in Benghazi, Libya.  

    The September 11 attack resulted in, among other things, the brutal murder of four Americans (including U.S. Ambassador Christopher Stevens) in Libya.  As I discussed in the first section (“The `Arab Spring’ Turns into an Islamo-Fascist Autumn”) of my “Tricks and Treats 2012” blog entry (October 25), the B.O. regime did more than merely cover up the real nature of the attack – and its causes – for it concocted a phony “cover story,” which claimed that the attack on the U.S. consulate in Benghazi was motivated by “spontaneous” Muslim outrage at an anti-Mohammed video broadcast on the Internet.  That phony story was a deliberate lie, fed to the news media by the highest-ranking foreign-policy officials in the B.O. regime:  U.N. Ambassador Susan Rice, former Secretary of State Hillary Clinton, and B.O. himself.  

    What were they trying to cover up?  Principally, the fact that the attack was a deliberate act, planned in advance for the 11th anniversary of the 9/11/2001 Islamist terrorist attacks on the U.S., by a militant Islamist group associated with al-Qaeda; and secondarily, the fact that the U.S. government (the State Department and the military) failed to protect Ambassador Stevens and his staff, either by negligence or by deliberate malfeasance.  As the editors of the Wall Street Journal succinctly put it, “Four Americans lost their lives in Benghazi in a terrorist attack that evidence suggests should have been anticipated and might have been stopped.  Rather than accept responsibility, the Administration has tried to stonewall and blame others” (“The Libya Debacle,” Sept. 27, 2012).        

    Why were they trying to cover it up?  Not only because the killing of one of our ambassadors is embarrassing in itself, but also because the incident exposes the fecklessness of B.O. in dealing with the threat of militant Islam. It also exposes, as I wrote on October 25, “the overall failure of B.O.’s foreign policy, which can be described only as disastrous”: 

    “These events, and their aftermath, demonstrate B.O.’s incompetence to be president – just as the parallel event 32 years ago – the violent seizure of the U.S. embassy in Iran by Islamist revolutionaries who held Americans hostage for over a year – demonstrated the incompetence of B.O.’s predecessor, Jimmy Carter.  It’s much more than merely B.O.’s `Jimmy Carter moment,’ however.  The difference between B.O. and Carter is that B.O. is not merely incompetent or feckless: he’s also dangerous, following policies that border on being treasonous.  (He’s weakened the ability of the United States to defend its own interests, undermining our credibility abroad, and giving aid and encouragement to our enemies in the Islamic world.)” 

      

    I added:  

    “Ironically, perhaps the main (if not the sole) reason why B.O. has been rated high on foreign policy in many American polls has been the killing of former al-Qaeda head Osama bin Laden – an act for which B.O. has unjustly claimed credit, showing his shameless narcissism as well as hypocrisy.  (See my discussion of “Laden with Hypocrisy,” in my “Thoughts for Summer 2011” (May 11, 2011).)  By claiming credit for the killing of bin Laden so shamelessly – by in effect `spiking the ball’ – at the Democratic National Convention, which concluded in Charlotte, N.C. on September 6, barely a week before the 9/11 anniversary, it appears that it was B.O. himself and his fanatic supporters who motivated these recent attacks in the Arab world.  Demonstrators in Tunisia, for example, chanted, `Obama, Obama, we are all Osamas!’” 

     

                B.O.’s foreign-policy failures are even more ominous if one considers the theory suggested by Dinesh D’Souza’s books and film 2016: Obama’s America – that, because of the “anti-colonialist” ideology that he inherited from his father, B.O. is deliberately pursuing an agenda designed to weaken the United States economically and militarily.  His covert support for militant Islamists and his attempts to appease Russian dictator Vladimir Putin – going so far as to promise Putin’s representative more “flexibility” in meeting Russia’s demands for the downsizing of the U.S. nuclear missile arsenal – may in fact amount to treason.  (As I discuss in the final section below, B.O. may be another first in the history of U.S. presidents – the first president to commit acts that would justify his impeachment on the grounds of treason.) 

    It is not just conservatives or libertarians who are seriously concerned about B.O.’s abuse of the powers of his office.  Jonathan Turley, a constitutional law professor at George Washington University Law School (and a contributor to USA Today) has said that B.O. “is using executive power to do things Congress has refused to do, and that does fit a disturbing pattern of expansion of executive power.”  Professor Turley adds: “In many ways, [B.O.] has fulfilled the dream of an imperial presidency that Richard Nixon strived for.  On everything from [DOMA] to the gaming laws, this is a president who is now functioning as a super-legislator.  He is effectively negating parts of the criminal code because he disagrees with them.  That does go beyond the pale.”  Quoting Professor Turley – who in many respects is a typical left-liberal law professor – Steve Friess, writing in Politico, comments that the Nixon analogy “may be apt.”  Friess’s article also cites John Eastman, a conservative constitutional law professor at Chapman University School of Law, who also draws parallels between B.O.’s version of the “imperial presidency” and Nixon’s action of not spending, or impounding, funds appropriated by Congress.  (Nixon’s impeachable conduct and the supposed “imperial presidency” model are discussed in the next section, below.) 

    In light of the many ways B.O. has abused the powers of his office, violating both the Constitution and the general rule of law, he most certainly did not deserve to be elected to a second term.  Indeed, he deserves to be impeached and removed from office – before he does any further damage to the Constitution and to the United States of America.

      

     

    Impeachment: Enema of the State

     

                   What is there to do when the American people foolishly reelect to a second term such a lawless “occupier” of the White House?   Anticipating a future time when the U.S. president might act as a scoundrel and even a would-be dictator, the framers of the Constitution wisely provided a solution – a means by which a corrupt president could be, in effect, “flushed away,” removed, put out of office.  We might consider it “the enema of the state.”  It’s called impeachment. 

                   The House of Representatives, representing “the people,” is granted (in the final clause of Article I, Section 2) “the sole Power of Impeachment.”  Impeachment – the direct accusation and arraignment of an individual for misconduct – was a power that had been won by the English House of Commons by the late medieval period.  It had proved to be a useful check against the king and his ministers, a device by which a minister of the Crown could be held directly responsible to Parliament for his official acts.  As granted to the House by the Constitution, the power of impeachment similarly provides a check against the abuse of power by officers in the other two branches of government, including the president and federal judges.   

                   The Senate is granted (by the sixth clause in Article I, Section 3) “the sole Power to try all Impeachments,” sitting essentially as a court with each member “on Oath or Affirmation” to render a true verdict.  That role for the Senate parallels that of the English House of Lords, which since early medieval times had retained the judicial functions of the king’s great council and remained the highest court in the land.  As one English constitutional historian has summed it up, “The House of Commons, by acting as a grand or accusing jury, could present ministers or other servants of the king before the House of Lords for trials for serious offenses, such as treason or felony.  If the upper house found the accused guilty of the charges against him the penalty might be death.”  The Constitution (in the final clause of Article I, Section 3) prescribes no such serious penalty; rather, it limits the consequences of conviction by the Senate (which requires a two-thirds vote) to removal from office and disqualification to hold any federal office.   

                   The grounds for impeachment are specified in Article II, Section 4, which provides: “The President, Vice President and all civil Officers of the United States, shall be removed from Office on Impeachment for, and Conviction of, Treason, Bribery, or other high Crimes and Misdemeanors.”  Treason and bribery have fairly straightforward meanings.  (Indeed, the crime of treason against the United States is specifically defined in Article III, Section 3.)  But “high Crimes and Misdemeanors” has no precise legal meaning; it was meant to embrace, as Alexander Hamilton maintained in Federalist Papers No. 65, “those offenses which proceed from the misconduct of public men, or, in other words, from the abuse or violation of some public trust.”  

    The history of impeachments provides further evidence of the meaning of “high Crimes and Misdemeanors” – those offenses proceeding from presidents’ “misconduct” or their “abuse or violation” of their “public trust.”  During the past 225 years, since ratification of the Constitution, serious efforts have been made to impeach a U.S. president only three times – during the presidencies of Andrew Johnson, Richard Nixon, and Bill Clinton – and only two of these presidents (Johnson and Clinton) actually have been impeached by the House and tried by the Senate.  (Nixon was nearly impeached by the House, but he resigned before articles of impeachment came up for a full House vote.) 

    Andrew Johnson’s impeachment and trial in 1868 has generally been criticized as a “political” act – the basic interpretation given by historian Michael Les Benedict, in his book The Impeachment and Trial of Andrew Johnson (1973), and by former Supreme Court Chief Justice William H. Rehnquist, in his book Grand Inquests (1992).  (Rehnquist was only the second Chief Justice to preside over a Senate trial of a president, of Bill Clinton in 1999; and arguably, he was the best-informed Chief Justice to do so, because of his authorship of this book which compared the historic impeachments of Justice Samuel Chase and President Johnson, both of which Rehnquist regarded as politically-motivated.)   

    The articles of impeachment against Johnson charged that he – “unmindful of the high duties of his office, of his oath of office, and of the requirement of the Constitution that he should take care that the laws be faithfully executed” – had violated the law by removing Edwin Stanton as Secretary of War and replacing Stanton with an interim appointee, Lorenzo Thomas.  The articles identified the law Johnson allegedly violated – the 1867 Tenure of Office Act, which required Senate consent for the removal or replacement of a confirmed Cabinet member.  That law itself was of dubious constitutionality: the Constitution requires Senate confirmation (its “advice and consent” power) only for the appointment of Cabinet members and other U.S. officials; and it had become accepted practice since Washington’s time for the president to fire Cabinet members without Senate approval (as well as to make valid interim appointments, when the Senate was not in session).  This charge against Johnson looked like a kind of political “entrapment” – Congress passed (over Johnson’s veto) an unconstitutional law, dared Johnson to violate it, and then charged him with “a high misdemeanor” for violating that law – and hence confirms the traditional view that Johnson’s impeachment was politically motivated. 

    Another article of impeachment against Johnson – the tenth – charged that he also had committed “a high misdemeanor” in office by publicly criticizing Congress.  The article alleged that Johnson:  

    “did attempt to bring into disgrace, ridicule, hatred, contempt, and reproach the Congress of the United States . . ., to impair and destroy the regard and respect of all the good people of the United States for the Congress and legislative power thereof . . ., and to excite the odium and resentment of all the good people of the United States against Congress and the laws by it duly and constitutionally enacted.”

     

    How did he do this?  By “openly and publicly,” in various public speeches, delivering “in a loud voice certain intemperate, inflammatory, and scandalous harangues,” doing so “amid the cries, jeers, and laughter of the multitudes then assembled and within hearing.”  In other words, he not only criticized Congress publicly but, apparently, did so with great political effect! 

    The traditional view by most scholars is that this article of impeachment underscored the “political” motives behind Johnson’s impeachment, but one scholar (Walter Berns, writing a 1994 op-ed in the Wall Street Journal) has suggested that this might have been a serious charge that Johnson misused the power of his office.  Berns has pointed out that in early American history, following the text of the Constitution, the president’s legislative role was limited to providing “information on the State of the Union,” by a written annual message to Congress – the tradition begun by Thomas Jefferson (which I discuss above, in the section on “The Unconstitutional State of the Union”).  Before Teddy Roosevelt changed the presidential office into a “bully pulpit” – a change institutionalized by Woodrow Wilson, who sought to refashion American government along the lines of the British parliamentary model (as also noted above) – U.S. presidents rarely spoke in public.  “Aside from issuing proclamations, or responding to addresses or `serenades,’ presidents spoke directly to the people mostly on special occasions: upon assuming office [or upon leaving it]; or when dedicating, say, . . . the cemetery at Gettysburg.”  And “[b]efore the coming of radio and TV, even presidential candidates kept aloof from the public, issuing no statements on their own behalf and making no speeches”: they “stood for office,” but did not campaign (“The Prattling Presidency,” Wall Street Journal, Oct. 31, 1994).    

    Thus, going beyond his veto of what he regarded as unconstitutional legislation, Johnson actively sought to undermine Congress’s Reconstruction policy by taking his case directly to the people.  What today is considered a normal practice of presidents was, in the 19th century, justly regarded as an abuse of presidential power.  Johnson’s impeachment was no more “political” than was his acquittal by the Senate.  The Constitution requires a two-thirds “guilty” vote of the Senate for a president to be removed from office.  In their balloting at the end of Johnson’s Senate trial in 1868, 35 senators found him guilty but 19 senators (including 7 Republicans) found him not guilty – just one vote shy of the two-thirds needed to convict and remove him.  Although constitutional scholars and popular authors (most famously, JFK in his book Profiles in Courage) have applauded the 7 Republican senators who voted to acquit Johnson for their supposed courage (or integrity), the reality is that they too were influenced by political motives:  their abhorrence of their colleague, Benjamin Wade, who as president pro tempore of the Senate would have become president if Johnson were convicted.  As politically difficult as Johnson had been in attempting to thwart Congressional Reconstruction, the Radical Republicans who controlled Congress apparently regarded him as less of a threat than the bellicose Senator Wade – especially since the Republicans enjoyed a two-thirds “veto-proof” majority in both houses of Congress, which effectively nullified the rest of Johnson’s presidency. 

                Long before the Watergate scandal began – with an innocuous story in the Washington Post reporting a mysterious break-in at the Democratic National Committee’s offices in the Watergate office complex on June 17, 1972 – Richard Nixon’s political enemies (and his critics in the academic world) had been calling for his impeachment.  In fact, the New York Times printed a series of articles in the early spring of 1973 (just before the famous Senate Watergate committee began its public hearings), arguing that Nixon was creating a dangerous “imperial presidency.”  (I read the articles as they were reprinted in The Detroit Free Press, March 11-15, 1973.)  In one of these articles, under the headline “Nixon Is Fighting for the Strongest Presidency since FDR,” distinguished constitutional historian Henry Steele Commager was quoted saying, “In so many ways, I think Mr. Nixon has gone far beyond any previous president in history.”  Another presidential scholar, Thomas E. Cronin, was quoted saying that Nixon “has systematically gone about trying to strengthen the presidency in a number of ways, frequently by circumventing the Constitution or expanding on past practices that were ambiguous or questionable.”  These practices included Nixon’s “impounding” of funds appropriated by Congress for domestic programs and his use of the war power in Vietnam.  One of the articles accused Nixon of “stealing power from his own Cabinet”; another maintained that he was “dismantling 40 years of Democratic policy” through his “New Federalism,” which turned back authority – and money – for various federal programs to the states and local governments.  In another article, under a headline that asked the provocative question, “Is Congress Helpless Against Nixon Power Grab?,” Commager was again quoted, saying:  “One answer would be impeachment if the Congress had any guts, but it doesn’t.” 

    Only after the media began obsessing over the Watergate break-in – and the American public’s attention was stirred by the Senate Watergate committee hearings in the spring of 1973 – did Congress begin to seriously consider impeachment.  The Watergate affair is simply too complex to discuss here, but I did discuss it previously – in a MayerBlog essay posted on the 33rd anniversary of the break-in, “The Legacy of Watergate” (June 17, 2005).  I’ll reiterate just a few salient points, focusing on the articles of impeachment drawn up against Nixon in 1974.  

    Nixon’s impeachable actions concerned not the break-in itself – that was merely a “third-rate burglary,” as some commentators have called it, part of the “dirty tricks” of the 1972 campaign – but his abuse of the powers of his office in an attempt to “cover up” the break-in.  (To this day, there has been no evidence showing that Nixon approved of, or even knew about, the break-in in advance; it was committed by a secret unit called “the plumbers,” acting on behalf of Nixon’s reelection campaign, the Committee to Re-elect the President, with the unfortunate acronym CREEP.  But the “smoking gun” sought by investigators and members of Congress – which linked Nixon directly to the cover-up – was his statement, recorded on one of the infamous “Watergate tapes,” secret recordings of Oval Office conversations, in which Nixon discussed paying “hush money” to a particular person.)  

    The articles of impeachment drawn up against Nixon by the House Judiciary Committee in the summer of 1974 echoed the language of the articles drawn up against Johnson in 1868.  They alleged that Nixon violated his office and his oath to follow the Constitution, and his duty to see that the laws are faithfully executed, by (1) “using the powers of his office, engaged personally and through his subordinates and agents,” in a course of conduct or plan “designed to delay, impede, and obstruct” investigation into the Watergate break-in, and to “cover up, conceal, and protect those responsible” and “conceal the existence and scope of other unlawful covert activities”; (2) “violating the constitutional rights of citizens” by misusing executive agencies including the IRS and the FBI; and (3) “refusing to produce [those] papers and things” subpoenaed by the House Judiciary Committee.  The third article accused Nixon of disobeying congressional subpoenas – something that many other presidents have done throughout U.S. history, particularly when they assert “executive privilege."  The second article, as noted above, consisted of bare allegations of misuse of various agencies, without any evidence supporting the allegations (at least as they pertained to Nixon himself).  But it was the first article, alleging obstruction of justice through the attempted cover-up, that was supported by evidence – including the above-mentioned “smoking gun” tying Nixon personally to the cover-up.   

    With only the Johnson impeachment as a precedent – and given the traditional criticism of that impeachment as politically-motivated – even the most partisan Democrats on the House Judiciary Committee in 1974 were reluctant to impeach Nixon unless he could be personally tied to some criminal activity.  Some on the committee (including a young staffer, a recent graduate of Yale Law School, Hillary Rodham Clinton) maintained that any sort of malfeasance in office might constitute “high crimes and misdemeanors” (an opinion she’d later regret when her husband faced impeachment 24 years later); but most members of the House committee followed a narrower standard requiring actual criminal conduct.  Note, however, that both the first and second articles of impeachment held Nixon responsible not just for the actions he personally undertook, but also the actions undertaken “through his subordinates and agents.”  In other words, the House Judiciary Committee, while following a fairly narrow standard of what constitutes impeachable offenses, regarded the “buck” as stopping at the president’s desk. 

    Nixon, showing both a patriotism and a sense of shame which were conspicuously absent in Bill Clinton, resigned the presidency in August 1974, rather than put the United States through the agony of another presidential impeachment trial.  Indeed, he decided to resign even before the House voted formal charges of impeachment, after a bipartisan delegation of members of Congress visited him at the White House and told him, frankly, that he’d be impeached by the House and most likely convicted by the Senate.  The fact that the effort to impeach Nixon was truly bipartisan – that Republican leaders joined Democrats in urging Nixon to resign – made the formalities of impeachment and trial unnecessary (along with, as I’ve noted, Nixon’s patriotism – putting the good of the country, and of the presidency, ahead of his own ego, as well as his own sense of shame). 

                Over sixteen years ago, in October 1996, just before the elections that year, I wrote an op-ed that was published in my local newspaper, The Columbus Dispatch, arguing that rather than being elected to a second term, Bill Clinton ought to have been impeached and removed from office.  (That was long before anyone had heard of Monica Lewinsky, the White House intern with whom Clinton was having a sexual relationship – an affair he tried to cover up by committing perjury before a grand jury and by obstructing justice by suborning others to commit perjury – the crimes for which he was impeached in 1998, although he ultimately was found not guilty by the Senate in a politicized trial in 1999.)  In 1996 the “high crimes and misdemeanors” Clinton appeared to have committed involved the scandals known as Whitewater, “Travelgate,” and “Filegate” – matters involving allegations of abuse of power far more serious than the Lewinsky cover-up (and also far more serious than the Watergate affair that ended Richard Nixon’s presidency), but for which special prosecutor Kenneth Starr did not recommend impeachment because he could not discover evidence personally linking those crimes to Clinton.   

                   As I noted in my “Legacy of Watergate” essay and again in my essay “The Worst U.S. President” (Feb. 15, 2008) – needless to say, written at a time when I couldn’t imagine an even worse (or more lawless) president than Clinton – Ken Starr did Clinton a huge favor by deciding to report to Congress only those actual crimes for which he found evidence personally linking Clinton – a narrower standard of impeachable offenses than the House Judiciary Committee had adopted with regard to Nixon in 1974.  That is why two articles of impeachment approved by the House late in 1998 charged Clinton only with (1) obstructing justice by providing “perjurious, false, and misleading testimony” to a federal grand jury and (2) obstructing justice by, among other things, concealing evidence and suborning perjury by others.  Notwithstanding clear evidence of his guilt for both these crimes, Clinton was acquitted by the Senate – voting 45 guilty, 55 not guilty on the first (perjury) count and voting 50–50 on the second (obstruction of justice) count.  As the House’s chief investigative counsel, David P. Schippers (a Democrat and former prosecutor from Chicago) explained in his excellent book, Sell Out: The Inside Story of Clinton’s Impeachment (2000), the senators who voted to acquit Clinton did not follow the evidence; instead, they followed “the path of political expediency,” in which high public officials would be answerable only to "politics, polls, and propaganda."   Clinton remained in office because he was popular and he had the support of his party, who dismissed his unlawful actions as “merely” about “sex.” 

                   Republicans, by focusing on the sexual nature of the Lewinsky affair rather than the crimes of perjury committed in furtherance of the cover-up, botched their prosecution of Clinton, allowing the Democrats to dismiss it as merely about “sex.”  Judging from the role he played in last year’s presidential election, at the national convention and on the campaign trail, Clinton is regarded by Democrats as a kind of senior statesman in the party, rather than as a serial abuser of women and an unconvicted felon who was stripped of his license to practice law.  How short political memories are! 

                   Virtually any of the lawless actions or abuses of power that B.O. has committed thus far during his occupation of the White House would constitute impeachable offenses.   Indeed, in my “Election 2012 Postmortem” essay (Nov. 10, 2012), I predicted that B.O. would be impeached and removed from office before he completes his second term:

                “The abuses of power committed by B.O. over the past four years – abuses that justify my calling him `the most lawless president in U.S. history’ . . .  ought to have disqualified him from being elected to a second term.  Now that he has been reelected, however, I seriously doubt whether he will complete a second term.  I am confident that his disregard for the Constitution and the rule of law will prompt him to continue to commit offenses – not merely `high crimes and misdemeanors,’ but also possibly treason (making him the first traitor to hold the office of president) – that will justify his impeachment and removal from office before his second term ends in 2017, in other words, sometime during the next four years.”

     

    I added that he might be the first president to be impeached for treason, given the ominous incident that some commentators have called “Mic-Gate”: an incident that shows how far B.O. is willing to go to appease Russia.  In a private conversation captured accidentally on a “hot mic,” B.O. told Russian President Dmitri Medvedev on March 26, 2012:  “On all these issues, but particularly missile defense, this can be solved, but it’s important for him [Putin] to give me space. . . . This is my last election.  After my election, I have more flexibility.”  So B.O. wants the Kremlin to know that more “flexibility” is on the way, but since American public opinion will oppose it, he needs to wait until after his presumed reelection.  That’s practically an admission that B.O. intends to act against the interests of the United States, to give aid and comfort to our enemy, Russia (which remains the enemy of the U.S. even with the “Cold War” over) – in other words, to commit treason. 

               But as I noted in my “Legacy of Watergate” essay, the lesson of both the Nixon and Clinton presidencies – of the nearly-completed effort to impeach Nixon and of the failed effort to convict Clinton – is that a successful impeachment effort must be bipartisan; it cannot be begun by the Republicans alone, unless they hold commanding majorities in both the House and the Senate.  (Perhaps some particular abuse of power – such as the drone strikes discussed in the “Assassin-in-Chief” section above – will receive attention from the news media and arouse concern among left-liberals as well as conservatives and libertarians.)  Hence, as I discussed in the concluding section of my “Election 2012 Postmortem” essay, the Republicans face a major challenge – both to educate the American people and to make them care: to make them care whether the man in the Oval Office is abusing the awesome powers of his office.  As I wrote: 

    “More generally, Republicans must educate the American people about the basic laws of economics, the fundamentals of our free-market capitalist system, and our federal, republican system of limited government powers – and the vital role played by the U.S. Constitution, if scrupulously adhered to, in both limiting government power and safeguarding individual rights (including such precious rights as economic freedom).  They cannot allow the Democrats and their allies in the left-wing news media to misrepresent Republicans and their policies, to tell lies and to demagogue the American people.

     

    “Indeed, Republicans today also should take the lead in educating the American people about the importance of the rule of law – and the vital importance of all public officials adhering to the law (particularly the president, who at his inauguration takes an oath to `preserve, protect and defend’ the Constitution, which also imposes on him the duty to `take care that the laws be faithfully executed’).  Republicans should emulate one of the founders of their party in the 19th century, Abraham Lincoln, who in his famous Address before the Young Men’s Lyceum of Springfield, Illinois, declared, in part, `Let every American, every lover of liberty, every well wisher to his posterity, swear by the blood of the Revolution, never to violate in the least particular, the laws of the country; and never to tolerate their violation by others. . . . In short, let it [respect for the rule of law] become the political religion of the nation . . . .’

     

    “And, [of course], Republicans must take the lead in exposing the various ways B.O. is abusing the powers of the presidency, violating both the Constitution and the rule of law.  The House Republicans should have the guts to be prepared to initiate impeachment proceedings against B.O. – but only after they have first paved the way by getting popular support (primarily by getting the American people to care about abuses of governmental power by seeing how it threatens their individual rights).  They should learn a lesson from the unsuccessful impeachment of Bill Clinton – when the Republicans failed to make the case why he had violated the law and ought to be removed from office, and so they failed to acquire bipartisan support from the Senate Democrats they needed to convict Clinton and remove him.  Republicans instead should learn a lesson from the near-impeachment of Richard Nixon in 1974: how Nixon’s political enemies (within the Democratic Party, academia, and journalism) started building the case for his impeachment well before the Watergate scandal caught the public’s attention; so, by the time the House was ready to vote for articles of impeachment, a bipartisan group of members of Congress advised Nixon he must go – and thus he resigned, for the good of the country.”   

     

    I added, “I don’t expect anyone as arrogant and narcissistic as B.O. to do a similar thing [as Nixon did], when faced with the real possibility of impeachment by the House and a trial in the Senate – which is why it is so vitally important for Republicans to educate the public and, frankly, to not only court but also shape public opinion.”  The Democrats’ success in doing that helps explain why they retained both the White House and control of the Senate in the 2012 elections.  The Republicans will need to be equally successful, to retain control of the House and perhaps regain control of the Senate in the 2014 midterm elections – if we are to hope for a restoration of a constitutional presidency, before B.O.’s lawlessness destroys both the office and the Constitution.

     

    Posted Thursday, February 21, 2013.  Copyright © David N. Mayer.


    2013: Prospects for Liberty (Part III) - February 7, 2013

     

    2013: Prospects for Liberty

     

    Part III

     

      

    Continuing my sometimes-annual January blog essay on “The Prospects for Liberty” in the coming year, I’m emphasizing again this year what I have called “the tyranny of bullshit.”  Part I discussed the single greatest threat to individual freedom today – government paternalism, through the expanding national regulatory/welfare state – and the policies being pushed by B.O. and his party, the Democrats, the party that advocates further expansion of the welfare state.  Their policies are based on nothing more than bullshit:  bullshit rationalizations offered by paternalistic politicians, bullshit believed by a majority of the gullible public, the fools who voted for the politicians “whose sole qualification to rule [us] was their capacity to spout the fraudulent generalizations that got them elected to the privilege of enforcing their wishes at the point of a gun," to quote Ayn Rand’s apt definition of politicians.  Unlike other forms of tyranny in the past, the tyranny of bullshit is not based on the use of force against an unwilling people:  it’s based instead on fraud, fraud committed by the politicians who spout the bullshit theories that persuade a gullible majority of the people to put the collars around their own necks.   

    Part I focused on the reelection of B.O., the current “Occupier” of the White House and the Bullshitter-in-Chief, whose policies pose the most tangible threat to Americans’ freedoms in the near future (and are the most tangible manifestation of “the tyranny of bullshit”).  I discussed the reasons why B.O. was reelected to a second term and, in general, the challenges faced by those of us who oppose his policies and the hope we have of defeating, or at least thwarting, his efforts to implement them.   

                In Part II, I began discussing particular aspects of B.O.’s “tyranny of bullshit,” focusing on two of the most important: “`Fairness’ and `Social Justice’ Bullshit” (threats to liberty from redistributionist taxation) and “Keynesian `Stimulus’ Bullshit” (threats to liberty from profligate government spending).  The section on taxation included a discussion of the new tax law, passed at the very end of the 112th Congress, intended to resolve the tax part of the “fiscal cliff” crisis.  The second section included discussion of the unresolved parts of the “fiscal cliff,” the critical problems of out-of-control federal deficit spending and the burgeoning national debt. 

                In this third and final part of the 2013 “Prospects for Liberty” essay, I’ll discuss some additional types of bullshit propagated by B.O. and the Democrats that threaten Americans’ freedom – and the nation’s well-being.  Within each category, I’ll also discuss the challenges that those of us who oppose B.O.’s policies face – and also the hope we have in successfully meeting those challenges and thus defending Americans’ liberty.

       

     

     

    “Health Care Reform” Bullshit:

    The Threat to One of the Most Precious Aspects of Liberty

    (Our Freedom to Own Our Own Lives)

       

                Last summer, on June 28, the U.S. Supreme Court surprised most legal commentators by its decision in National Federation of Independent Business v. Sebelius, the case challenging the constitutionality of the 2010 federal health-care regulatory law, which is officially titled “The Patient Protection and Affordable Care Act” but is popularly known as “ObamaCare,” because it is the signature legislative “achievement” of B.O.’s presidency.  (The official title of the law is quite disingenuous, for it provides genuine “protection” for no one and it makes health care more expensive, not more “affordable.”  Interestingly, advocates of the law, including B.O. himself, lately have been calling it simply “the Affordable Care Act” – dropping the first lie but perpetuating the second.) 

                Whatever one calls it – whether the PPACA, the ACA, “ObamaCare,” etc. – the 2010 federal law was a monstrosity, arguably the worst piece of legislation ever to pass Congress.  To put it another way, one might call it the biggest Mongolian clusterfuck of them all.  (The Urban Dictionary defines Mongolian clusterfuck as “a generally futile attempt to solve a problem by throwing more people at it rather than more expertise,” or more generally as something “that is spinning or has already spun out of control with disastrous results.”  The term, which originated in military slang, seems an apt description of the 2010 legislation.)  The massive, 2,801-page law – with its 159 new bureaucracies, $2.6 trillion in new spending, 1,968 new federal powers, and 13,000 pages of regulations (so far) – was rammed through the Democrat-controlled 111th Congress in March 2010, despite strong opposition by the Republican minorities in both houses as well as strong opposition by the American people, according to most opinion polls.  Even supporters of the law acknowledged that it was too complex for them to either understand or explain exactly what it would do.  (Remember the infamous comment by former House Speaker Nancy Pelosi that Congress would first have to pass the bill before we know exactly what it provided?) 

                Since its enactment, the law has continued to remain unpopular.  In fact, the more Americans find out about the law and its actual effects on the nation’s health-insurance market, the less they like it.  They have discovered, among other things, that the law imposes 18 new taxes on Americans, including levies on medical device makers and small businesses that decide not to cover their employees – taxes totaling at least $800 billion, “a massive tax hike that hurts businesses and hampers economic growth,” as the editors of Investor’s Business Daily have concluded.  “ObamaCare” also increases the Medicare tax by nearly 1% – then counts the revenues from that tax hike twice, pretending the whole amount goes both to fund Medicare and to increase general budget revenues at the same time.  Worse, the law carves more than $570 billion out of Medicare – meaning that financially untenable program will go bankrupt much sooner.  Perhaps worst of all, the new Independent Payment Advisory Board – an unelected body, over which Congress has no control – will make sweeping decisions about what is covered.  “In short, the government will replace patients and doctors in deciding who gets care – and what care they get” (“Five Challenges in 2013 that Dwarf `Fiscal Cliff’,” January 2).  

    While state governments decide whether or not to create the insurance exchanges mandated by the law (discussed below), it has become more and more apparent that “ObamaCare” will fail to achieve the two principal goals stated in its official title:  it will not “protect” patients by guaranteeing coverage to Americans who are currently uninsured, nor will it make health care more “affordable.”  In fact, most Americans are finding that health-care costs – particularly the premiums that they or their employers pay for health insurance – are beginning to skyrocket, as the law starts to be implemented.  And many Americans are realizing that they will lose their insurance coverage, as employers change their business practices to adjust to the law’s “employer mandate.”  That provision of the federal law requires companies with over 50 employees to provide insurance for anyone working 30 or more hours a week, or face fines.  As Sally Pipes (CEO of the Pacific Research Institute) explains, the law creates a strong incentive for companies to reduce their workers’ hours to part-time (for example, Wal-Mart recently announced it will not offer health insurance to new employees who work less than 30 hours a week) – or for small businesses not to grow beyond 50 full-time workers.  Either way, the law will result in “job cuts [that] will hurt the working poor most” (Pipes, “Under ObamaCare, Many Will Lose Their Coverage,” I.B.D., December 27). 

                “ObamaCare” may not be fully “socialized medicine” – just a step toward socialized medicine – but, in a very real sense, it may be called “fascist medicine.”  Whole Foods CEO John Mackey caused a stir in 2009 when, in a Wall Street Journal op-ed, he compared the law to socialism.  In a January 16 interview on NPR, Mackey caused even more controversy when he was asked a follow-up question about the law.  After all, the interviewer noted, the so-called “public option” was never adopted; the law is a mishmash of mandates, regulations and price controls, but falls short of an outright nationalization of the insurance industry, so how could Mackey compare it to socialism?  Mackey set off another firestorm with his response – saying, “Technically speaking, it’s more like fascism,” explaining: “Socialism is where the government owns the means of production.  In fascism, the government doesn’t own the means of production, but they do control it, and that’s what’s happening with our health care programs and these reforms.”  Although Mackey has since walked back his statement – apologizing for his “bad choice of language” – he was quite right:  “ObamaCare” is indeed fascist.  More precisely, it may be labeled “corporatism,” the economic theory that undergirded Benito Mussolini’s fascist regime in Italy in the 1920s and 1930s.   

                The brainchild of Alfredo Rocco, a key figure in Mussolini’s regime, corporatism “called for the organization of the economy into corporate sectors that would cooperate with the government in implementing state policy.”  Rocco aimed at “uniting workers, entrepreneurs, and government officials in economic activities, carried out in the public interest or the interests of the state.”  As he put it, “For Fascism, society is the end, individuals the means, and its whole life consists in using individuals as instruments for its social ends.”  As Robert Romano, senior editor of Americans for Limited Government, observes: “Certainly, that’s what ObamaCare – with its individual and employer mandates to respectively purchase and provide health insurance – seeks to accomplish.  It guarantees customers to large companies, in this case insurance providers that supported passage of the legislation, and in the process cartelizes the system” (Romano, “ObamaCare as `Fascism’? If Ideological Shoe Fits . . .,” I.B.D., January 23). 

    In truth, the law creates a corporatist or fascist system that (as its chief supporters wish, although they generally aren’t honest enough to say so) will eventually lead to fully socialized medicine, under a government monopoly.  Under the guise of extending health-care coverage to millions of uninsured Americans, the law really aims at nationalizing the health-care industry – imposing on private health insurance companies mandates that are fiscally unsound, which will eventually drive them out of business, destroying the market for private health insurance – and thus ultimately creating a “single-payer,” “universal,” national system of socialized medicine, jeopardizing not just the freedom but the health and even the lives of all Americans.  (B.O. himself, in a speech to supporters prior to its enactment, called the law “a step” toward universal coverage, or fully socialized medicine.) 

                Having the government “guarantee” so-called universal health care – which really is a euphemism for having the government totally control the provision of health-care services through a government monopoly, giving the government literally life-or-death control over the lives of its individual citizens – has been a dream of so-called “progressive” political activists since the early 20th century.  It was a key component of the national “welfare state” advocated by political activists in the so-called “Progressive” movement of the early 1900s – yet it was the one component omitted from FDR’s “New Deal” programs (which did include other key components, such as old-age pensions and disability payments through “Social Security” as well as unemployment compensation) and only partially enacted in Lyndon B. Johnson’s so-called “Great Society” programs, as expanded by Richard M. Nixon (namely, Medicare and Medicaid).   

                To call such programs – or the political activists who promote them – “progressive” really is a misnomer, a perversion of terms, for they are anything but progressive, in the true sense of the word.  As I discussed in my essay “Reactionary `Progressives’” (Mar. 16, 2006), there’s nothing truly “progressive” about the 20th-century “welfare state” model:  it’s actually reactionary, based on a centuries-old paternalistic model of government that had been revived by German Chancellor Otto von Bismarck in the late 19th century, in part to provide an “opiate for the masses” that might discourage Communist revolution.  This paternalistic model shows contempt not only for the free market but also (and most fundamentally) for the ability of ordinary citizens to be in control of their lives, by being free to make their own choices and to act on those choices; instead, “progressive” reformers ironically have unlimited confidence in the ability of government “experts” – the legislators who pass the laws, and the bureaucrats who make regulations to implement them – to make the choices and to impose their choices on individual citizens.  

    The so-called “right to health care” that left-liberal “progressive” activists have been pushing for the past century or so is really a perversion of the concept of individual rights, properly considered.  What it really means is not a “right,” properly speaking, at all, but rather the loss of a genuine right that all Americans have – an aspect of their fundamental natural right to liberty (which includes liberty of contract) – the right best described as “health care freedom”: the right of individuals to be in control of their own health and lives, including their freedom to enter into contracts for health-care products and services, such as health insurance.  Yet it’s that genuine, legitimate right to health-care freedom that the 2010 law eventually will destroy, as I wrote in my essay “Health Care `Rights’ and Wrongs” (Oct. 16, 2009). 

                And although it has been touted by its proponents as major “health care reform,” the 2010 law is not really a “reform,” in the true sense of the word.  It moves the U.S. health-care system in entirely the wrong direction, away from the kind of common-sense true reform that’s really needed.  As I discuss in my 2009 “Health Care `Rights’ and Wrongs” essay, the real problem with the high cost of the U.S. health-care system is its misuse of insurance to cover routine health-care costs rather than extraordinary costs like hospitalization (which is what insurance ought to be for).  The private-insurance part of the U.S. health-care system is its strength – giving Americans access to the best-quality health care in the world – even despite its problems (not only cost but the problem of “portability” because it’s employer-based), while the part that has been socialized by government (Medicare and Medicaid, primarily) has inflated overall costs while reducing quality of care to those people (generally the elderly and the poor) who are limited in their choices to these government-monopolized programs.  Real reform would focus on contracting the socialized part of the nation’s health-care system (gradually privatizing both Medicare and Medicaid) while strengthening the market for private insurance, making it more competitive, and giving individuals greater freedom of choice in shopping around for the insurance plans best suited to their individual circumstances.  Real reform would focus on the basic problem of high costs rather than the chimera of expanding coverage to people who lack insurance (for whatever reasons). 

    The legal challenge to the 2010 law, in the case of National Federation of Independent Business v. Sebelius, focused on the most controversial – and the key – provision of the law, the so-called “individual mandate” (the requirement that Americans purchase health insurance as mandated by the federal government).  The pivotal opinion was written by Chief Justice John Roberts, who sided with the Court’s four more conservative justices in holding that the mandate was not a constitutional exercise of the powers of Congress under either the Commerce Clause or the Necessary and Proper Clause.  Roberts also joined the four conservatives and, remarkably, two of the more “liberal” justices on the Court, in holding unconstitutional another controversial part of the law, mandating that the states expand Medicaid to provide coverage for uninsured Americans.  Yet in the crucial part of Roberts’ opinion, the Chief Justice joined the four left-liberal justices in upholding the individual mandate, calling it a tax – in other words, a constitutional exercise of Congress’s power to levy taxes.  That’s the part of the decision that made all the headlines, as the news media touted it as a “victory” for B.O. and his regime.  (Interestingly, the media virtually ignored the fact that two “liberal” justices – including Justice Kagan, B.O.’s own former Solicitor General – had joined Roberts and the four conservatives in holding the Medicaid mandate on the states to be unconstitutional.  In that respect, at least, the decision was a huge political defeat for B.O.) 

    As I wrote in my special post discussing the decision (“Supreme Folly 2012: The Supreme Court’s `ObamaCare’ Decision,” July 5), Roberts’ peculiar (and ridiculous) decision is best explained by regarding the Chief Justice as a “wimp,” who was intimidated by threats of criticism by left-liberal commentators that a decision overturning the law would have been “activist.”  In two especially telling passages in his opinion, Roberts declared a need to “adopt any reasonable construction in order to preserve the statute from being ruled unconstitutional” and then stated:  “It is not our job to protect people from the consequences of their political choices.”  So, Roberts used the spurious rationale that the individual mandate was not a penalty but rather a “tax” – a rationale that not even the B.O. regime took seriously, when it tried to defend the constitutionality of the law during oral arguments before the Court.  Ironically, in order to avoid the false impression that the Court was being “activist,” deciding the case on political grounds, Roberts really did engage in illegitimate judicial activism, in fashioning the “tax” rationale – and thus in effect rewriting the law – to avoid a confrontation with the B.O. regime and leftist political commentators.  In other words, Chief Justice Roberts became “Fierce Jurist Botches,” the apt anagram created by my good friend Rod Evans, philosophy professor and author of the word-play book, Tyrannosaurus Lex. 

                Libertarian law professor Randy Barnett, the intellectual leader of the constitutional challenge to the federal law, made a valiant effort to put a positive “spin” on the Court’s decision, in a Washington Post op-ed (which I discussed in “Supreme Folly”) and in a subsequent interview published in Reason magazine (“`We Won in Our Effort To Preserve the Constitution’,” October 2012).  Barnett wrote that although challengers to the law lost the case, they saved the Constitution as a limitation on the powers of the national government by prevailing on all their most significant arguments of constitutional law.  He also found solace in the hope that the Court’s decision would make the 2010 health-care law a central issue in the 2012 elections.  As Barnett concluded in his op-ed, “Those [of us] who value our republican system of limited federal powers should . . . get to work to achieve politically the complete victory that the chief justice denied us.”  

                Unfortunately, Professor Barnett was proved wrong in his hope that “ObamaCare” would be a central issue in the 2012 elections.  It was barely mentioned: B.O. and the Democrats did not defend the unpopular law; and although Mitt Romney and the Republicans promised to repeal it, they did not highlight the issue.  And some pundits believed that Romney’s record as governor of Massachusetts (so-called “RomneyCare,” the state law he supported that also mandated purchase of health insurance) blurred the clear lines dividing the two parties on this issue.  (That argument is unconvincing, for several reasons; among them, it ignores federalism – how a law that’s unconstitutional and unworkable at the national level might be otherwise at the state level – as well as Romney’s clear promise to void the federal law.)  A better explanation is that, notwithstanding the unpopularity of “ObamaCare,” the majority of voters put a priority on other issues.  As I noted in Part I of this essay, B.O.’s reelection can be attributed to a variety of factors – but it is clear that support for “ObamaCare” wasn’t one of them.  The Supreme Court’s decision, allowing the keystone of the law to stand as a “tax,” thus ironically took it off the public’s radar.  

    This does not mean, however, that “ObamaCare” is a “done deal” and that the American people should consider their right to health-care freedom yet another fundamental right that they’ve lost to a fascist “Nanny State.”  Because key provisions of the law do not begin to go into effect until next year, 2014 – and because the federal law requires the cooperation of the states to fully implement it – it is still possible to block it and thus to help preserve Americans’ health-care freedom.   

    The most important way that state governments can block implementation of “ObamaCare” is to refuse to set up an insurance exchange.  An “exchange,” as defined under the law, is “a mechanism for organizing the health insurance marketplace to help consumers and small businesses shop for coverage.”  Thus described, it appears to be a kind of clearinghouse where consumers without employer-based coverage can obtain supposedly “affordable” health insurance (to comply with the law’s individual mandate); in practice, however, it is the chief means by which the federal government – acting through the broad discretionary power of the Secretary of Health and Human Services to issue directives (“as the Secretary shall decide” is the most repeated phrase in the “ObamaCare” law) – will control the health insurance market.  That’s because the only health insurance policies included in the exchanges are those that comply with the federal (H.H.S.) regulations.  All exchanges are supposed to be operational by January 1, 2014, but the states are not required to create state-run exchanges; they may decide to “opt out” and instead either allow the federal government to create an exchange for their residents or do a “state-federal partnership” exchange.  (Under the last type of exchange, a state-federal partnership, states can oversee insurance plans and assist consumers, but the federal government will handle duties such as enrollment and determining eligibility.  Under a purely federal exchange, the federal government will call all the shots.  It also will – as the H.H.S. Secretary announced in early December – impose a 3.5% fee on insurance plans sold via a federally-run exchange, presumably to help offset the exchange’s costs.  These costs will be passed on to consumers, further eroding the law’s promise to make health care more “affordable.”)  

                So far, only 19 states, most of them (all but five) with Democrat governors, have decided to set up state-run exchanges.  That means that the majority of states – 25 so far, most of them with Republican governors – have decided to opt out, either completely or by doing the state-federal partnership.  States are opting out because of the high costs and uncertainty about creating state-run exchanges, which are estimated to cost taxpayers in each state, at a minimum, between $10 million and $100 million a year – but that’s just a rough guess, because the full contours of the exchange have yet to be specifically delineated by the H.H.S. Secretary.  (The state of Nebraska, for example, did a thorough analysis, estimating that running an exchange would cost the state about $646 million over eight years, fiscal 2013–20.  “It is simply too expensive to do a state insurance exchange,” Nebraska Gov. Dave Heineman said.)   

                States are also opting out because they have justifiable concerns about the loss of state autonomy – in other words, they fear becoming merely functionaries of the federal government.  In a letter to H.H.S. Secretary Kathleen Sebelius, Texas Governor Rick Perry explained why he wasn’t playing along with the pretense of so-called state-run exchanges.  Governor Perry bluntly wrote: “It is clear there is no such thing as a state exchange.  This is a federally mandated exchange with rules dictated by Washington.”  And Wisconsin Governor Scott Walker noted, “No matter which option is chosen, Wisconsin taxpayers will not have meaningful control over the health care policies and services sold to Wisconsin residents.”  As Sally Pipes observes, “Walker is right.”  The federal H.H.S. department dictates that all policies sold on the exchanges must meet one of four classifications – “platinum,” “gold,” “silver,” or “bronze,” depending on the percentage of health costs a plan covers – but given the caps on deductibles imposed on all plans, even the least expensive “bronze” plans, the mandates prevent insurers from offering low-cost products that may best fit a family’s budget.  And because “ObamaCare doesn’t just set the rules – it also tasks states with enforcing them . . . running a [state] exchange could therefore get pricey” (Pipes, “ObamaCare Exchanges Turning into Disasters,” I.B.D., January 30).     

                Some states – those with state constitutional provisions or statutes guaranteeing their citizens’ health-care freedom – are constitutionally or legally prohibited from implementing an “ObamaCare” exchange.  Here in Ohio, for example, an amendment to the Ohio Constitution – the “Health Care Freedom Amendment,” approved overwhelmingly by voters in November 2011 – prohibits state and local governments from compelling “either directly or indirectly, any person, employer, or health-care provider to participate in a health care system.”  Under the constitutional provision, “compel” is defined to include “the levying of penalties or fines.”  Maurice Thompson, director of the 1851 Center for Constitutional Law (and original draftsman of the Ohio amendment), has authored a paper explaining why Ohio constitutionally cannot implement an exchange.  Cato Institute analyst Michael Cannon has noted that 13 states “have passed statutes or constitutional amendments . . . that bar state employees from carrying out [the] essential functions of an ObamaCare exchange.”     

                Another way states may block full implementation of “ObamaCare” is to refuse to expand Medicaid coverage.  A provision in the federal law requires the states – as a condition for receiving federal funding – to expand Medicaid by covering everyone earning up to 138% of the federal poverty threshold ($32,000 for a family of four).  (This is the chief way in which the federal law aims to cover millions of low-income uninsured Americans.)  As briefly noted above, in an often-overlooked (yet critical) part of last summer’s decision, the Supreme Court held that the states could not be forced to expand Medicaid as the federal government dictates.  Some states (most with Republican governors) have decided not to expand their Medicaid coverage, but other states (including, disappointingly, some with Republican governors, such as John Kasich of Ohio and Jan Brewer of Arizona) have decided to embrace Medicaid expansion because they cannot resist the lure of federal funding.  (The federal government has promised to provide a 100% spending match on the newly eligible for the next three years, phasing that down to 90% by 2020.)  Governor Kasich, in his recent budget proposal, said it was “the right decision for Ohio” to expand Medicaid eligibility – adding some 275,000 Ohioans to the welfare rolls within the first year, while prompting another 250,000 who are currently eligible to sign up, he estimates – just to receive an estimated $13 billion in federal funds over the next seven years.  (In other words, an offer he just couldn’t refuse.  Judas had his 30 pieces of silver; Kasich, $13 billion in federal money.)  

                Constitutional challenges to the law are still being considered by the courts.  One case, begun by a suit filed by Liberty University in Lynchburg, Va. on the day the bill was signed into law in 2010, challenges the constitutionality of the employer mandate and specifically asks whether the federal law’s forced funding of abortion and contraception is unconstitutional under the First Amendment religion clause (protecting the free exercise of religion) and the federal Religious Freedom Restoration Act.  The Liberty University lawsuit had been dismissed by a panel of the U.S. Court of Appeals for the Fourth Circuit, in Richmond, Va., but the Supreme Court ordered the appeals court to reconsider it.  A similar lawsuit, filed by North Carolina’s Belmont Abbey College (a Roman Catholic school), is also being considered by the federal appellate court; it is among the 40 or so lawsuits (mostly from church-affiliated schools and hospitals opposed to abortion) that are challenging the contraception mandate on religious grounds.  Nevertheless, these cases might be rendered moot by the recently-announced changes in the employer mandate that will exempt some religious institutions from covering certain medical services (like abortion or contraception) against which they have conscientious religious scruples. 

                Another case – considered by some commentators to be the best vehicle for tearing apart the “Affordable Care Act” – is a suit filed by the Oklahoma Attorney General, Scott Pruitt, challenging the constitutionality of the employer mandate generally.  Pruitt argues the IRS oversteps its legal authority if it taxes Oklahoma businesses to subsidize the federal health insurance exchanges that the law creates.  Taxing Oklahoma businesses to pay for a federal exchange established by Washington in his state – which will happen, as Oklahoma is declining to set up its own exchange – is not authorized under the law, Pruitt contends.  

                If the federal courts (and ultimately the U.S. Supreme Court) were to side with the Oklahoma Attorney General, the very existence of the federal law would become vulnerable.  Although only the employer mandate would be struck down directly, the law lacks so-called severability language – meaning that if any part of it is found unconstitutional, the entire law arguably should be thrown out.  (That didn’t happen last summer, however, when the Court invalidated the Medicaid expansion provisions.)  John Goodman, president of the National Center for Policy Analysis, has said that invalidation of the employer mandate could be a “fatal blow” for the law.  Nevertheless, given the Supreme Court’s record – not only John Roberts’ horrible decision last summer but also the Court’s record in enforcing federal constitutional limits, in its entirety at best a mixed record – sadly, Americans cannot count on the federal courts to protect their rights. 

                Ironically, the Supreme Court’s decision upholding the individual mandate as a “tax” (and allowing states to opt out of the expanded Medicaid program) opens up the law to another constitutional challenge.  “If the mandate is an indirect tax, as the Supreme Court held, then the Constitution’s `Uniformity Clause’ (Article I, Section 8, Clause 1) requires the tax to `be uniform throughout the United States,’” write Washington, D.C. attorneys David B. Rivkin, Jr. and Lee A. Casey in a provocative op-ed in the Wall Street Journal.  The “ObamaCare” tax fails to meet this standard because low-income taxpayers who can discharge their mandate-tax obligation by enrolling in the new, expanded Medicaid program (the functional equivalent of a tax credit) can do so only in those states that have not opted out of the Medicaid expansion.  The Court would face a “tough choice” if this challenge reaches it, Rivkin and Casey conclude:  

    “Having earlier reinterpreted the mandate as a tax, [the justices] would be hard-pressed to approve the geographic disparity created when states opt out of the Medicaid expansion.  But that possibility is inherent in a scheme that imposes a nominally uniform tax liability accompanied by the practical equivalent of a fully off-setting tax credit available only to those living in certain states.  To uphold such a taxing scheme would eliminate any meaningful uniformity requirement – a result that the Constitution does not permit.”

     

    (“The Opening for a Fresh ObamaCare Challenge,” December 5). 

                Two other cases also offer fairly plausible grounds for the federal courts to strike down “ObamaCare” in whole or in part.  Attorneys for the Goldwater Institute, a libertarian research and legal advocacy organization in Arizona, argue that the law’s Independent Payment Advisory Board (the appointed board which is supposed to hold down Medicare costs) violates separation of powers, because it is not subject to congressional oversight or even meaningful agency review.  The institute also contends that, by compelling individuals to disclose medical information to insurers and the federal government, the law places an undue burden on individuals’ right to privacy.  And a lawsuit filed by the Pacific Legal Foundation, another libertarian advocacy group, also turns on the Supreme Court’s decision upholding the individual mandate as a “tax,” arguing that if that is so, then the bill was unconstitutionally passed by Congress.  The Constitution requires that all revenue bills originate in the House of Representatives, but the bill that eventually became the “ObamaCare” law was a substitute bill that initially passed the Senate, not the House. 

                Ultimately, there is only one sure way to save Americans’ real “right” to health care – their right to health-care freedom: the freedom to decide for themselves what kind of health insurance, if any, they need, and the freedom to choose that insurance in a nationwide free market, which is what real health care “reform” ought to be aimed at.  That one sure way is to repeal the noxious 2010 law, a law that never should have been passed.  As noted above, the more Americans find out about “ObamaCare,” the less they like it – and the more they want to see it repealed.  (A recent Rasmussen poll showed 56% of Americans favor repealing it.)  Unfortunately, to do so would require a political “revolution” in the United States that cannot occur until after B.O. leaves office: Republican victories in the 2016 general elections, assuring the GOP control of both houses of Congress and the White House.  Until then, as the law is implemented, Americans should continue opposing it and resisting it, as much as they can – and hoping that over the next four years it doesn’t irreparably destroy the system that has provided the best health care in the world.

      

      

    “Green” Bullshit:

    Threats to Liberty from “Climate Change” Paranoia

    and the B.O. Regime’s War on Carbon-Based Fuels

      

    As I wrote in the second part of my 2009 “Prospects for Liberty” essay,  

    Perhaps the greatest threat to the freedom and prosperity of the industrialized Western world today is the threat posed by radical environmental activists and the politicians who follow their bullshit, pseudo-scientific theories.  And no theory better epitomizes this “green bullshit,” as I call it, than the theory of “global warming” – or “climate change,” as it’s now euphemistically called – in other words, the theory that the average global temperatures are increasing, and that the Earth’s warming will lead to cataclysmic disasters such as massive flooding of coastal areas as the polar ice cap melts and the world’s oceans rise to dangerously high levels, etc., etc., and that this dangerous global warming is caused by human activity, namely, by man-made carbon dioxide (a so-called “greenhouse gas”) created by the burning of carbon-based “fossil fuels” such as coal, oil (and other petroleum products), and natural gas.

     

    I added that “the global-warming thesis is a theory that, despite the propaganda of global-warming alarmists, is far from being scientifically proven.  Indeed, it is a flawed theory that fails to fit the facts”; in other words, it’s a theory based on faulty “junk” science.   

                Over the past few years, it has become even more clear that the “climate change” or “global warming” theory is nothing but a scam -- indeed it’s been called “the greatest scam in history,” by John Coleman, meteorologist and the founder of The Weather Channel.  The revelations from what has been called “Climate-gate” – e-mail exchanges and other documents hacked from computers at the Hadley Climate Research Unit at the University of East Anglia in Great Britain – reveal that there has been a conspiracy among some in the science community to spread alarmist views of global warming and to intimidate, if not silence, those who disagree (the skeptics of global warming theory, called “denialists” by the true believers, whom I call “warm-mongers”).  In other words, what Michael Crichton warned about in the fictional story in his novel State of Fear – a global hoax by radical environmentalist terrorists – has practically come to pass.   

                Fortunately, however, more and more Americans are realizing that the radical environmentalists’ theory really is nothing more than a massive scam.  Public opinion polls show that concern about alleged “global warming” or “climate change” has drastically declined – particularly as a new “revolution” in carbon-based energy resources (discussed below) has the promise of re-energizing the United States and the moribund American economy.  Notwithstanding media reports about 2012 being the “hottest year on record” for the continental United States – and the global warm-mongers’ attempt to exploit public misinformation about this story to continue to promote their “climate change” scam – most people now understand that climate is distinct from weather.  Whatever weather phenomena (such as an unusually northward jet stream) might explain last year’s summer heat and drought in the continental United States, they provide no evidence for global warming.  Indeed, 2012 was a colder than normal year in other parts of the world, including Alaska and most of Europe.    

                Most recently, a group of more than 20 retired NASA scientists and engineers, who call themselves “The Right Climate Stuff,” has issued a report that decisively shatters the global warming myth.  After reviewing, studying and debating the “available data and scientific reports regarding many factors that affect temperature variations of the earth’s surface and atmosphere,” they’ve found (among other things):  “the science that predicts the extent of anthropogenic [man-caused] global warming is not settled science”; “there is no convincing physical evidence of catastrophic anthropogenic global warming”; and “because there is no immediate threat of global warming requiring swift corrective action, we have time to study global climate changes and improve our prediction accuracy.”  They conclude that Washington is “over-reacting” on global warming and suggest that a “wider range of solution options should be studied . . . .” (“Facts About Climate,” I.B.D., January 28). 

                The recently-announced news about the hypocrisy of one of the leading propagandists for the “climate change” scam, former V.P. Al Gore, only underscores how phony the whole campaign against so-called “fossil fuels” has been.  (In case you haven’t heard, Al Gore recently made an estimated $100 million, or one-fifth of the proceeds, from the $500 million sale of his failed cable TV news channel, Current TV, to the radical Muslim, anti-Semitic broadcaster Al-Jazeera.  Al-Jazeera “is literally bought and paid for by the monarchy of the oil-and-gas-rich Mideast nation of Qatar,” note the editors of Investor’s Business Daily.  Hence, it’s “a three-fer in greedy hypocrisy for