MayerBlog: The Web Log of
David N. Mayer



    Thoughts for Summer 2013, and Beyond (Part II) - May 30, 2013



    Thoughts for Summer 2013 – and Beyond


    Part II


    As I noted in my introduction to Part I, MayerBlog will be on a lengthy hiatus – not only for the summer but also continuing during the 2013–14 academic year, when I will be on sabbatical leave, completing the manuscript of my next book, Freedom’s Constitution: A Contextual Interpretation of the Constitution of the United States.  So, this will be the last essay posted to MayerBlog until Fall 2014 (probably Labor Day weekend in 2014). 

    Before going on hiatus, however, I could not resist the temptation to comment on a number of important issues in public policy and popular culture – issues that are in the news today and are likely to remain in the news throughout the summer and beyond.  (If you scroll all the way down to the end of Part I, you’ll also see my summer reading recommendations and my annual preview of summer movies.)  



    ■ The Emperor Is (Still) Naked!  America’s B.O. Problem (Update)  

    The perfect metaphor for B.O.’s presidency is the famous children’s story by Hans Christian Andersen, “The Emperor’s New Clothes” – as I have written here, since well before the 2008 elections.  (See my blog essay “The Emperor Is Naked!” Oct. 16, 2008.)  The story tells the tale of a vain monarch and his gullible followers who were hoodwinked by some clever con men masquerading as tailors.  They dressed the Emperor in a supposedly magnificent set of clothes but really in nothing at all; they said that only intelligent persons would be able to see the supposedly invisible wardrobe, so the Emperor and all his followers, fearing they’d brand themselves as stupid, were too afraid to admit they saw nothing.  That is, until one day when the Emperor was parading down the street in his “new clothes,” and one outspoken little boy – with the honesty of a young child – shouted, “But he’s naked!”  That brought an end to the charade.  

                In this metaphor, the real villains are the tailors who perpetrate the fraud that deceives the people.  Who are the tailors in the real-life story of the naked Emperor, B.O.?  Clearly, it’s the so-called “mainstream” news media (called the “lamestream” media by former CBS newsman and media critic Bernard Goldberg), who by and large have continued their “slobbering love affair” with B.O. (as Goldberg so aptly put it in the title of his recent book).  

                Predominantly leftist in their political orientation, media “journalists” are not only ideological kinsmen of B.O. but also elitists who take a kind of perverse pleasure in hoodwinking the American people.  Taking their lead from B.O. himself and his political handlers, they have emphasized his racial identity as “black” – another part of his charade because he’s really of mixed racial heritage (his father was a black African but his mother was white, and he was raised by his white maternal grandparents) – and so, as the tailors in the media constantly tell Americans, B.O.’s presidency is “historic.”  The implication is that anyone who failed to vote for him in 2008 and 2012 must be “racist.”  In other words, B.O.’s media lapdogs are engaged in what some people call “reverse racism” (bigotry in favor of someone because of their racial identity); but it ought to be considered for what it is – blatant racism, as racism properly ought to be defined (treating persons not as individuals but as members of a racial group).  For more on this, see my discussion “Let’s Talk Frankly About Race” in last year’s “Thoughts for Summer” (May 25, 2012).  As I wrote there – explaining why I’ve called B.O. the “affirmative action president”: 

    B.O. is perhaps the least qualified person ever elected president – someone who would not have been seriously considered a contender for the office of Chief Executive, if it were not for his race and for his exploitation of racism.  His campaign deftly played the “race card,” cashing in on the dregs of America’s past problems with racism:  black group-identity politics and white liberal guilt.  If he had not racially identified himself as a black man and cashed in on racism, he would not have been nominated by the Democrats’ party, let alone elected president.


    B.O.’s apologists continue to “play the race card,” claiming that anyone who criticizes B.O. is “racist.”  That’s absurd, for the vast majority of B.O.’s political opponents and critics oppose him because of his policies, or his actions, not because of his race (or his racial identity).  Indeed, as I’ve frequently commented here on MayerBlog, the assertion that it’s “racist” to criticize B.O. is itself racist:  just as it would be racist to dislike B.O. or to oppose his policies simply because of his race, it’s equally racist, by the true meaning of the word, to give B.O. a pass – to blind oneself to his many flaws as president, as his supporters do – also because of his race.  Or as Fox News’ Greg Gutfeld has pointed out:

    “The fact is, if you’re a liberal Democrat, you’d vote for Obama whether he was black or white (and, get this: He’s both).  But if you’re a conservative or a right-leaning libertarian, you wouldn’t vote for Obama whether he was black, white, or chartreuse.  So when the left says the reasons behind your choice are racial instead of intellectual or ideological, it’s way beyond arrogant – it’s offensive.  And it deserves a punch in the kisser.  Fact is, I don’t dislike Obama; I dislike Obama’s beliefs.  And I disliked them when they were Hillary Clinton’s.  And Bill Clinton’s.  And Teddy Kennedy’s. And Jimmy Carter’s.  But that’s the beauty of stupidity – it knows no color.”


    (The Bible of Unspeakable Truths, 2010, p. 116). 

    It’s not just racism, however, that explains the “slobbering love affair” the media have had with B.O.  It’s also the leftist political philosophy (whether you call it “liberal,” “progressive,” “statist,” “socialist,” “semi-socialist,” or even “fascist” or “collectivist,” it’s the philosophy on which the 20th-century regulatory “welfare state,” or “Big Government Nanny State,” has been founded), which the overwhelming majority of journalists, academics, and other supposed “intellectuals” share with B.O.  Because he’s also the most far-left president in U.S. history, the news media have been “in the tank” for him throughout his first term and did all they could do, through their biased reporting of the news, to help assure his reelection in 2012.  (The fact that B.O. also racially identifies as “black” is just the icing on the cake, which gives leftists even more smug satisfaction – patting themselves on the back for being p.c. and for “transcending” race – in not only supporting but also practically worshipping him, as a new kind of political “messiah.”)  B.O.’s narcissism and arrogance, rather than turning them off, have made the media even more enthusiastic in their support.  

    It’s a symbiotic (mutually supportive) relationship, as libertarian columnist Larry Elder observed in a recent op-ed: 

     “[B.O.’s] arrogance flows from our fawning, gushing, Bush-hating ‘news’ media, which shirk their responsibility to fairly report the news.  The media’s fecklessness creates overconfidence.  With good reason, [B.O.] expects his media cheerleaders to look the other way, accept excuses without much challenge and turn the president’s critics and whistleblowers into enemies.”


    Among the many examples Elder cites are:  the refusal of black Democrat politicians (like Rep. Maxine Waters of California) to blame a 15.9% black unemployment rate (which Rep. Waters called “unconscionable”) on B.O.’s job-killing policies; the refusal of the media generally to challenge outrageous claims that B.O. has made, such as the 3.5 million jobs he’s claimed he “saved or created,” or his assertion that “ObamaCare” would cause the “cost curve” for health insurance to “bend down” (when all the evidence is that it’s causing steeply rising health-care costs), or even the false story B.O. told about his own mother (that as she lay dying from cancer in a Hawaii hospital, she fought with her insurance carriers over her medical bills) – an absolutely false narrative the press allowed B.O. to use to personalize his fight for “ObamaCare.”  Or, to cite another example, the claim that then-Senator B.O. made with a straight face – that, contrary to the policies of former President Bush, he would oppose any military intervention unauthorized by Congress unless the country faced imminent risk of attack.  But as president, B.O. joined with France and Britain in bombing Libya, “a country that posed no imminent threat to America,” whose leader, Moammar Gadhafi, had long before surrendered his weapons of mass destruction to the Bush administration.  “President Obama paid no political price for what Senator Obama would have opposed,” notes Elder (“In Defiant Obama, Media Face Monster They Made,” I.B.D., May 24). 

    With a news media that fails to call B.O. on his blatant, outrageous lies, “why shouldn’t [he] feel that he operates under different, special rules, and can do so without risking loss of support?” Elder asks.  “By refusing to hold [B.O.] to the same standard they would hold any garden-variety Republican, the media now face the monster they created.”  That’s why the media now seem so “shocked – shocked” (to quote Claude Rains’ character in the classic movie Casablanca) by the recent scandals discussed below. 

    But now that B.O. is in his second term – thankfully, he’s term-limited and so is a “lame duck” president – he’ll lose his charm because, by January 2017 (if not sooner), he’ll lose his power and become merely an ex-president.  That’s the general reason why most two-term presidents, throughout U.S. history, have had problems – including various types of political scandals – during their second terms.  B.O. is no exception; moreover, given his style of governance (discussed more fully in the next section), it’s virtually inevitable that scandals will emerge involving him and his regime (which, as I also discuss in the next section, I’ve called “the most lawless in U.S. history”). 

    Already, just four months into his second term, B.O. is facing what conservative columnist George Will has called a “trifecta” of scandals:  Benghazi (the militant Islamic terrorist attack on the U.S. consulate in Benghazi, Libya last September 11, and the subsequent effort by B.O.’s regime to cover up the facts concerning the attack), “IRS-Gate” (as I called it, in Part I, the targeting of certain conservative and libertarian groups by the IRS), and the Justice Department’s surveillance of AP news reporters (and other journalists).  As Will observes, these three scandals are really just the tip of the iceberg, for there are other scandals – some of which already broke into the news during B.O.’s first term, like the “Fast and Furious” illegal gun-running operation in Mexico, involving Eric Holder’s Justice Department and the FBI – and others that are breaking into the news right now.  A possible fourth scandal involves “power being wielded by executive branch officials (at the National Labor Relations Board and the Consumer Financial Protection Bureau) illegally installed in office by presidential recess appointments made when the Senate was not in recess.”  A fifth scandal “may be Secretary of Health and Human Services Kathleen Sebelius’ soliciting, from companies in industries that HHS regulates, funds to replace what Congress refused to appropriate,” to “educate” Americans about the “ObamaCare” mandates.  “Because [B.O.’s] entire agenda involves enlarging government’s role in allocating wealth and opportunity, the agenda now depends on convincing Americans to trust him, not their lying eyes,” concludes Will.  “In the fourth month of his second term, it is already too late for that” (“Scandals Debase the Currency of Gov’t Trust,” I.B.D., May 17). 

    These recent scandals – and particularly the scandal involving the AP and other journalists – have alarmed not only the news media and many self-described civil libertarians (because it concerns real threats to freedom of the press) but also many of B.O.’s most fervent supporters and apologists.  They’ve become, metaphorically, just like the little boy in Hans Christian Andersen’s story – the boy who sees the Emperor for what he really is, who by shouting “But he’s naked!” opens the eyes of the people generally. 

    For MSNBC’s Chris Matthews, apparently the “thrill” is gone:  the Hardball host (who in 2008 famously declared he felt “this thrill going up my leg” every time B.O. speaks) recently observed that B.O. “obviously likes giving speeches more than he does running the executive branch.”  Politico – another friend of B.O. – described Matthews’ remarks as “a rare, unforgiving grilling of the president as severe as anything that might appear on Fox News.”  Matthews grumbled that B.O. “doesn’t like dealing with other politicians . . . his own cabinet . . . members of the Congress, either party . . . doesn’t like giving orders or giving somebody the power to give orders.  He doesn’t seem to like being an executive.”  Instead, “he likes to write the speeches . . . likes going on the road, campaigning.”  Gee, how perceptive.  Matthews suddenly realizes B.O. isn’t fit to be president (“The Thrill Is Gone,” May 17). 

    Matthews isn’t alone; other prominent members of the “lamestream” media are finally seeing the light – seeing the naked “Emperor” for what he really is.  NBC’s Andrea Mitchell has accused B.O. of “the most outrageous excesses I’ve seen” in her years of journalism, going back before Watergate.  Thanks to the scandal involving the Justice Department’s surveillance of AP reporters, the Washington Post’s Dana Milbank has accused B.O. of “a full frontal assault on the First Amendment.”  And the Post’s “Fact Checker” – which hands out little “Pinocchios” to politicians who stretch the truth – recently gave B.O. “Four Pinocchios” (the worst rating on its lie-ometer) for his repeated claim that “the day after [the Benghazi attack], I acknowledged that this was an act of terrorism” (“Now the Lapdog Media Tell Us,” I.B.D., May 15). 

    I’ve previously written here in MayerBlog that “the deodorant America needed to deal with its B.O. problem” was the “Tea Party” movement.  That was certainly true in 2010, when the Tea Party movement played a critical role in energizing Republicans, leading to Republican control of the U.S. House of Representatives following the 2010 midterm elections.  Unfortunately, the Tea Party movement failed to help the GOP in the 2012 elections – not only because it was being harassed by the IRS, but also because the Tea Party movement actually hurt the GOP in 2012.  By supporting questionable GOP candidates, the Tea Party allowed the Democrats to retain control of the U.S. Senate and, worse, by failing to support Mitt Romney, Tea Party activists allowed B.O. to be re-elected president with fewer popular votes than he had in 2008.  Now, the “deodorant” America needs to really get rid of its “B.O. problem” is . . .  his impeachment and removal from office.



    ■ Countdown to Impeachment  

            In my essay “The Unconstitutional Presidency” (February 21), I discussed the various ways in which modern U.S. presidents – both Democrat and Republican – have abused the powers of their office, thereby breaking the “chains of the Constitution” by which the Framers sought to restrain the most powerful political office in the world.  I also discussed how the current “occupier” of the White House, B.O., is the most dangerous of the lot – not only the worst president in U.S. history but also, as I’ve called him (giving my top ten reasons), “the most lawless president in U.S. history.”  (See, in addition to my February essay, my previous post, “Rating the U.S. Presidents: 2012” (Feb. 22, 2012).) 

    The stench emanating from Washington, D.C. these past four and a half years is indeed America’s “B.O. problem.”  Specifically, it comes from the “culture of corruption” that B.O. and his regime have brought to the nation’s capital – what several commentators have described as the “Chicago style” of dirty politics, which in B.O.’s case (given his background as a “community activist” in the “Windy City”) means a combination of general thuggery with the dirty tactics expounded by radical leftist Saul Alinsky, in his handbook, Rules for Radicals.  (For more on this, read the excellent books by Michelle Malkin and David Freddoso, respectively, Culture of Corruption (2010) and Gangster Government (2011).)  B.O.’s “thugocracy” (as David Freddoso has called it) can be described as a combination of misfeasance, malfeasance, and nonfeasance – a veritable smorgasbord of all the types of abuse of power that can be committed by a U.S. president and his lawless regime.   

    That’s why it’s especially ironic that in his recent speech at The Ohio State University’s spring graduation exercises, B.O. advised graduates to put all their trust in government and reject those shrill “voices” that say it’s the source of our problems.  Ignore those limited-government types, he told the class of 2013, who warn “tyranny lurks just around the corner.”  (Not coincidentally, those limited-government types were the groups that the IRS targeted for special scrutiny.)  The irony is that B.O. himself “has proved our fears are well-founded.  Government, particularly governance by this rogue regime, needs more checks, not fewer; more skepticism, not less.  Tyranny isn’t lurking around the corner.  It’s now upon us, manifest in the pattern of misuse and abuse of government power by this presidency” (“A Scary Pattern of Power Abuse,” I.B.D., May 16). 

    It’s also ironic – and more than just a little hypocritical – for B.O. and his apologists to now say, in response to news of the recent “trifecta” of scandals (more fully discussed below), that he was ignorant of these abuses of power, claiming that he learned about them at the same time the public did, when he read accounts in the news media (the same news media that his Justice Department has targeted for surveillance, in another one of the recent scandals).  He’s “not responsible,” the White House claims, because the federal government has become too big and complex. (Never mind that B.O.’s basic agenda since 2009 has been to make the government even bigger – to be in charge not only of every American’s health care but also, among other things, to police every consumer credit transaction, whether it’s a mortgage, or a student loan, etc., etc.)  As B.O.’s chief political adviser, David Axelrod, put it, “part of being president is there’s so much underneath you that you can’t know because the government is so vast.”  As the editors of Investor’s Business Daily observe, “The president can’t have it both ways.  The genius who could direct an expanded government to solve our problems can’t now claim ignorance as scandal explodes” (“Having It Both Ways,” May 16). 

    In the last section of my February essay, I discussed the solution that the framers of the Constitution provided for the problem of presidents who abused their power – the means by which a corrupt president could be, in effect, “flushed away,” removed, put out of office.  We might consider it “the enema of the state.”  It’s called impeachment. 

    As I observed, virtually any of the lawless actions or abuses of power that B.O. has committed thus far during his occupation of the White House would constitute impeachable offenses.   Indeed, in my “Election 2012 Postmortem” essay (Nov. 10, 2012), I predicted that B.O. would be impeached and removed from office before he completes his second term:

            “The abuses of power committed by B.O. over the past four years – abuses that justify my calling him ‘the most lawless president in U.S. history’ . . .  ought to have disqualified him from being elected to a second term.  Now that he has been reelected, however, I seriously doubt whether he will complete a second term.  I am confident that his disregard for the Constitution and the rule of law will prompt him to continue to commit offenses – not merely ‘high crimes and misdemeanors,’ but also possibly treason (making him the first traitor to hold the office of president) – that will justify his impeachment and removal from office before his second term ends in 2017, in other words, sometime during the next four years.”


    Reviewing the history of past presidential impeachments – the cases of Andrew Johnson, who in 1868 was impeached, tried in the Senate, and acquitted; of Richard Nixon, who was nearly impeached (the House Judiciary Committee approved articles of impeachment) in 1974 but who resigned before proceedings went further; and of Bill Clinton, who in 1998-99 was also, like Johnson, impeached, tried in the Senate, and acquitted – I then considered the prospects for impeaching B.O. and successfully removing him from office.  (He would be the first president to be successfully impeached and removed.)   Drawing lessons from both Nixon’s and Clinton’s presidencies – of the successful attempt to impeach Nixon and of the failed effort to convict Clinton – I concluded that a successful impeachment effort must be bipartisan; it cannot be begun by the Republicans alone, unless they hold commanding majorities in both the House and the Senate.  (I did add, however: “Perhaps some particular abuse of power – such as the drone strikes discussed in the ‘Assassin-in-Chief’ section [in the February essay] – will receive attention from the news media and arouse concern among left-liberals as well as conservatives and libertarians.”)  Hence, as I discussed in the concluding section of my “Election 2012 Postmortem” essay, the Republicans face a major challenge – both to educate the American people and to make them care, to make them care whether the man in the Oval Office is abusing the awesome powers of his office.  

    Some conservative commentators have cautioned that, politically, it would be nearly impossible to impeach B.O.  Michael Medved, writing an op-ed in USA Today, reminds us that all three of the serious presidential impeachment drives (against Johnson in 1868, Nixon in 1974, and Clinton in 1998-99) occurred when the president’s opponents controlled both houses of Congress by hefty margins (“No, Obama Won’t Be Impeached,” May 20).  And Rush Limbaugh, commenting recently on his radio show, has bluntly claimed that it’s impossible for B.O. to be impeached, simply because he’s black.  “There’s no way Congress will impeach the first ‘African-American president’,” Rush asserts. 

    The recent “trifecta” of scandals may be game-changers, however.  First were the Congressional hearings that exposed the B.O. regime’s attempts to cover up the deplorable actions in Benghazi, Libya (discussed in the section below, “The Ghosts of Benghazi”).  Next came the revelation that the IRS had been targeting “Tea Party” groups and other conservative and libertarian groups (generally groups that advocated smaller government) for extra scrutiny – a matter so serious that even B.O. had to try to distance himself from the IRS actions, calling them “outrageous” and feigning ignorance.  (Why is “IRS-Gate” so serious?  Because it closely parallels one of the impeachable offenses charged against Richard Nixon, as I’ve discussed in Part I.)  Then came the news that Eric Holder’s Department of Justice had seized two months of phone records at the Associated Press to track down leaks that led to a 2012 story about a foiled al-Qaida bombing plot.  The AP scandal was quickly followed by the revelation that the Department of Justice (maybe we should call it In-Justice) not only spied on Fox News reporter James Rosen but also labeled him an unindicted “co-conspirator and/or aider and abettor” in an espionage case involving a State Department whistle-blower.  His “crime” was persuading the State Department source to give him information about North Korean missile tests – in other words, his “crime” was simply to report the news. 

    As I’ve noted above, in the previous section, by this point some of B.O.’s most fervent apologists are wondering if he’s really the “fierce defender of the First Amendment” that his spokesman Jay Carney says he is.  The New York Times declared that the administration “has moved beyond protecting government secrets to threatening fundamental freedoms of the press to gather news.”  As the editors of Investor’s Business Daily observe, that’s “strong stuff, considering the source.”  Even stronger is the criticism of James C. Goodale, the lawyer who represented the Times in the Pentagon Papers case.  Goodale says the Rosen case “is a further example of how [B.O.] will surely pass President Richard Nixon as the worst president ever on issues of national security and press freedom.”  As the I.B.D. editors conclude, “Goodale gets what many liberals still can’t bring themselves to believe – [B.O.] really has no love for the First Amendment.  His administration treats freedom of the press as a reward for good behavior, not a sacred right.  Up to now, the media have mostly behaved, with a few exceptions such as Fox News.  They’re now finding out that, when they do take their role as public watchdogs seriously, the administration bares its teeth” (“Creeping Along Tyranny’s Road,” May 28).  Indeed, the front-page story in a recent issue of USA Today had the headline, “Media fear [B.O. is] out to ‘criminalize journalism’” (May 28). 

    Another commentator who has compared B.O. to Richard Nixon is Jonathan Turley, a constitutional law professor at George Washington University Law School (and a contributor to USA Today), who wrote – last year – that B.O. “is using executive power to do things Congress has refused to do, and that does fit a disturbing pattern of expansion of executive power.”  Professor Turley adds: “In many ways, [B.O.] has fulfilled the dream of an imperial presidency that Richard Nixon strived for.  On everything from [DOMA] to the gaming laws, this is a president who is now functioning as a super-legislator.  He is effectively negating parts of the criminal code because he disagrees with them.  That does go beyond the pale.”  More recently, in response to the scandal involving Justice Department surveillance of AP reporters, Turley has a column in USA Today arguing that Attorney General Eric Holder – who, it is becoming increasingly apparent, lied to congressional investigators about his knowledge of the surveillance operations – ought to be fired, despite Holder’s role thus far as B.O.’s “sin eater,” a high-ranking associate who shields the president from responsibility for his actions.  Turley notes that Nixon had H.R. Haldeman and John Ehrlichman; Ronald Reagan had Oliver North and Robert “Bud” McFarlane.  He should have added that although North and McFarlane successfully shielded President Reagan from the “Iran-Contra” scandal, Haldeman and Ehrlichman did not shield Nixon from culpability in Watergate.  Indeed, their fall was just a prelude to Nixon’s (“Attorney General Must Go,” May 29). 

    It’s telling that my local newspaper, The Columbus Dispatch – a paper that can fairly be described as middle-of-the-road in its political views – in a recent editorial has suggested that B.O.’s methods have “set the stage” for the current scandals.  “Despite the president’s denials, defenses, and dodges, there is a thread running through the multiple scandals that now beset him and the underlings who serve him in the executive branch: the arrogance of power.”  The editors observe (as I have, in my discussion of “IRS-Gate” in Part I), that none of the recent “trifecta” of scandals (the IRS, the Justice Department and the AP, and Benghazi) can be blamed merely on “low-level” employees in the executive branch, for B.O. as president bears the responsibility.  Not just because of his position at the top of the executive-branch flow chart, but also “because the leadership of the organization sets the example and tone for those underlings who carry out the administration’s policies and who apply them directly to the nation’s citizens.”  Not only the recent trifecta of scandals but also other examples of abuse of power permeate B.O.’s presidency: among other things, “the high-handed ‘recess appointments’ of members of the National Labor Relations Board, actions that two courts have now ruled unconstitutional because the Senate was not in recess when the appointments were made”; “the president’s invocation of executive privilege to stymie congressional investigation of the U.S. Justice Department’s disastrous Fast and Furious program”; “his refusal to open up fully to Congress and the public about his unchecked use of drones to kill terrorist suspects, including American citizens, overseas”; and so on.  
“If, in word and deed, the leadership runs roughshod over the rule of law, denies the legitimacy of alternative political viewpoints and regards election as a carte blanche to do as one pleases,” not surprisingly “the rank and file will misbehave accordingly” (“Leading by example,” May 19).

    As I observed in the final paragraph of my February essay, “I don’t expect anyone as arrogant and narcissistic as B.O. to do a similar thing [as Nixon did, to resign from the presidency] when faced with the real possibility of impeachment by the House and a trial in the Senate.”  That’s why I concluded that “it’s vitally important for Republicans to educate the public and, frankly, to not only court but also shape public opinion.”  The Democrats’ success in doing that helps explain why they retained both the White House and control of the Senate in the 2012 elections.  The Republicans will need to be equally successful, to retain control of the House and perhaps regain control of the Senate in the 2014 midterm elections – if we are to hope for a restoration of a constitutional presidency, before B.O.’s lawlessness destroys both the office and the Constitution.



    ■ Politics 2013: It’s All About 2014 (and 2014’s About 2016) 

                It’s a sign of the intense partisanship in Washington over the past several years (especially since the hyper-partisan B.O. became president) that most political pundits see the current political battles in the nation’s capital as merely efforts by politicians of both political parties to better position themselves for the next election – in other words, the “mid-term” congressional elections in 2014 and the general election (including the next presidential election) in 2016.  And, to a great extent, that’s true.   Let me stick my neck out here, and make a few bold predictions about these upcoming elections, based on today’s politics (and especially the developing scandals surrounding the B.O. regime). 

    Mid-term elections generally favor the party that doesn’t control the White House, so Republicans should be able to comfortably hold their majority in the U.S. House and also have a good chance to regain control of the Senate.  Republicans need a net gain of only six seats to have a majority in the Senate; Democrats not only have more seats to defend (21, compared to the GOP’s 14) but several Democrat incumbents – many of them in so-called “red” states, which typically tilt Republican – have announced their retirement.  Most recently, it was Senator Max Baucus (D.–Montana) who announced he would not seek re-election. 

    Even if the scandals currently enveloping the B.O. regime will not (realistically) result in a serious (or successful) effort to impeach B.O. and remove him from office, they should help the Republicans in the congressional elections, because American voters would prefer to see “divided government,” or “gridlock,” rather than continued tyranny from an unbridled executive branch.  Moreover, as the federal health-care control law known as “ObamaCare” continues to be implemented, in 2013 and 2014, it will only add to B.O.’s unpopularity.  This is one “train wreck” (to borrow retiring Senator Baucus’s famous phrase describing the law) that B.O. cannot blame on his predecessor, as so-called “ObamaCare” is the signature legislative “achievement” of B.O.’s presidency (as I discussed in Part I of this essay, in the section “The Disease Known as ‘ObamaCare’”).  The more Americans learn about “ObamaCare” – especially when they see it in actual operation, steeply escalating the costs of health care while also limiting Americans’ choices – the less they’ll like not only the law but also B.O. and the Democrats, the party that used political chicanery to push it through Congress in 2010.  Republican politicians just have to remember to keep clearly reminding the American people who’s responsible for this horrid law – and to keep pledging to reform or repeal it.  (That’s why the recent vote by the Republican majority in the House to repeal the law – although “dead on arrival” in the Democrat Senate and in the Oval Office, because of B.O.’s veto threat – was so important, not just symbolically but also as a reminder to American voters of the GOP’s unity in opposition to “ObamaCare.”) 

    So, I predict big GOP wins in the 2014 congressional elections.  What about 2016? 

    In this year’s “Spring Briefs” essay (March 23), in the section entitled “Saint Hillary?” I observed, in the wake of revelations about the Benghazi fiasco (discussed more fully below, in the section “The Ghosts of Benghazi”), that the Democrats’ “heir-apparent” to the White House, Hillary Rodham Clinton, may face an uphill battle, both to get her party’s nomination (as she tried to do against B.O. in 2008) and to get elected as the nation’s first female president in 2016: 

    [T]he pebbles strewn along Hillary Clinton’s path back to the White House are more like boulders, the inconvenient facts about her past record as U.S. secretary of state, U.S. senator, and first lady of both the U.S. and Arkansas – a past that ought to disqualify her from higher office, except in a political party that chose B.O. as its standard-bearer.  Mrs. Clinton’s tenure as secretary of state can be fairly described as disastrous: over the past four years, militant Islamic terrorists have seized more control of the Middle East, as the murder of the U.S. ambassador and three other Americans at the U.S. consulate in Benghazi, Libya, so pointedly illustrated.  Hillary was a key conspirator in the B.O. regime’s cover-up of that foreign-policy debacle, helping to perpetuate the lie that the Benghazi terrorist attack was motivated by an anti-Mohammed video broadcast on the Internet.  And Hillary’s much-touted “reset” of U.S.–Russian relations has yielded only B.O.’s capitulation to Vlad Putin on such issues as missile defense.  Even more embarrassing to Hillary (if she had any shame) is her record as Slick Willy’s cuckolded wife and criminal co-conspirator.  (Remember those infamous billing records from Hillary’s Rose law firm that mysteriously appeared in the Clinton White House living quarters – documents that were at the heart of the Whitewater land fraud?  All of Hillary’s whining about a “vast right-wing conspiracy” out to get her and Bill will never fully lay that story to rest.)  Hillary’s political past may now be in the history books (most of which are written by people with a leftist bias that’s favorable to Hillary); nevertheless, that past will come back to haunt her, if she should be so foolhardy as to reenter political life.


    Maybe, like a cockroach that keeps coming back the more one tries to kill it, Mrs. Clinton will return to political life.  (Like her husband, “Slick Willy,” she may truly be a “Teflon” politician.)  And maybe no other viable candidate will emerge among the Democrats, in the wake of the huge power vacuum left by B.O.

    At this point, I see no viable presidential candidate emerging for the Democrats in 2016.  On the other hand, as I briefly discussed in my “Election 2012 Post-mortem” (Nov. 10, 2012), there is a whole stable of attractive, energetic, and relatively young Republican members of Congress or governors who could be viable presidential candidates in 2016.  (Forget about New Jersey Gov. Chris Christie, however.  Despite his recent gastric-band weight-loss surgery – which clearly shows he’s as interested in running for president as he is in staying alive – Christie’s moderate policies and his blatant pandering to B.O. in the wake of Hurricane Sandy ought to disqualify him from the GOP primaries.)  If I were a betting man, I’d put my money either on Senator Marco Rubio (R.–Fla.), especially if the immigration-reform bill he’s co-sponsoring passes Congress, or on Senator Rand Paul (R.–Ky.), to be the next president of the United States.

    As a libertarian Republican, obviously I like Rand Paul.  As I commented here on MayerBlog during the 2012 primary season, it’s a shame that it wasn’t Rand Paul who ran for president in 2012, rather than the Senator’s father, former Congressman Ron Paul.  Although the elder Paul has justly been credited for injecting libertarian principles into the mainstream of the GOP and hence into American major party politics (the so-called “Ron Paul Revolution,” nicely discussed in Brian Doherty’s recent book of the same title), the elder Paul had difficulty communicating his message to a wide range of voters.  He seemed like a “Johnny-one-note” in his obsessive criticism of Mideast military intervention by both the Bush and B.O. presidencies; and in general Ron Paul’s devotion to principle came across as a crazy kind of “extremism.”  As a politician, the younger Paul – though less experienced in years – is more mature than his father.  Rather than being a self-conscious rebel against the GOP establishment, he’s been working within the party.  And he earned the respect of a wide range of Americans (from across the full political spectrum) by his principled stance against the B.O. regime’s use of unmanned drones to assassinate civilians, in his recent filibuster (a genuine old-fashioned Senate filibuster), as I discussed in “Spring Briefs 2013” (in the section “Droning On – To a Point”).  Earlier this year, Senator Paul spoke at the GOP Lincoln Day dinner in Cedar Rapids, Iowa – a traditional launchpad for presidential campaigns – where he ripped his potential Democratic rival, Hillary Clinton, and continued to urge the Republican Party to broaden its appeal (see my discussion “Rand Paul and Charles Murray Are Even More Right,” also in “Spring Briefs 2013”).  As one news story put it in its headline, “Sen. Paul’s Iowa trip has feel of 2016 run.”

    And so, I’m looking forward to President Rand Paul’s inauguration in January 2017!



    ■ Hey, Big Spenders!

                As I have previously commented here on MayerBlog, B.O. and the Democrats suffer from an ailment I call “Deficit-Attention Disorder” (D.A.D.).  As I explained in the section “Debt’s the Difference: The Democrats’ Deficit-Attention Disorder” in last year’s “Thoughts for Summer” (May 25, 2012), one of the biggest differences between the parties is their response to the problem of obscenely high federal deficit spending (with deficits averaging $1 trillion each year that B.O. has been in office) and the resulting hemorrhaging of the U.S. national debt to dangerously high levels (approaching 100% of the nation’s GDP).  I observed: 

    “B.O. and the Democrats refuse to even recognize that there’s a problem; in other words, they suffer from what might be called a ‘deficit-attention disorder.’  The Republicans, in contrast, are serious about dealing with the problem in the only truly effective way – by cutting government spending – but find themselves targets of Democrat demagoguery for simply trying to do so.”


    And as I further noted in the section “Sequester, Schmequester” in this year’s “Spring Briefs” essay (March 23), “D.A.D. in turn is a disorder that results from the Democrats’ underlying disease:  paternalism (their paternalistic philosophy of government).”  Moreover, the Democrats’ stubborn refusal to recognize both their D.A.D. and the paternalism in which it is rooted is, in fact, the basic problem in Washington, D.C. today. 

    This key difference between the two parties – and particularly the Democrats’ D.A.D. – was quite evident last year in the debate over so-called “sequestration,” the compulsory, across-the-board budget cuts that have resulted from the failure to achieve a real solution to the national-debt-ceiling impasse in 2011.  The compromise tax legislation that B.O. signed into law at the beginning of 2013 – generally retaining the “Bush tax cuts,” the lower rates on individual federal income taxes except for Americans earning more than $400,000 – was considered a defeat by some conservatives but ought to be considered the first of several victories by Republicans (or defeats for B.O. and the Democrats) thus far this year.  When B.O.’s White House (particularly former chief of staff Jack Lew, who’s now the treasury secretary) first proposed sequestration as a kind of poison pill during the 2011 budget negotiations, it did not imagine that Republicans would be willing to stomach slashing the Pentagon’s budget in return for “cuts” (or rather, reductions in the rate of growth) in domestic programs and a more realistic opportunity to bring federal deficits – and hence the mushrooming national debt – under control.

    Meanwhile, the B.O. regime has continued its crass political scheme to blame Republicans for the sequester by inflicting pain on the American people – allegedly caused by the sequester but really caused by executive-branch mismanagement of the budget.  The cancellation of public tours at the White House and the delayed opening of certain national parks were just the tip of the iceberg.  More serious were the flight delays resulting from FAA furloughs of air-traffic controllers.  But B.O.’s scheme failed: the American people clearly saw who was responsible for the cancellation of White House tours and the like, and the flight delays caused by the furloughs proved so unpopular – not just with the traveling public but with members of Congress – that Congress quickly passed, and B.O. was forced to sign into law, a bill providing sufficient funding for the FAA.  Insightful political commentators (like the editors of Investor’s Business Daily) opined that the whole fiasco involving FAA furloughs only proves that we ought to privatize air traffic control operations, as many countries (including Canada) have done, with much success (“Air Traffic Control: A Private Affair,” May 1).  Of course, such real reform cannot be accomplished so long as we have a socialist in the White House.  

    The deficit/debt issue will again heat up by Labor Day, when it is expected that the level of the nation’s debt will hit the $16.7 trillion ceiling and the Treasury Department will again warn that the United States may be in danger of defaulting on its obligations.  At that point, as Republicans line up against B.O. and the Democrats in their game of fiscal “chicken,” congressional Republicans ought to call the Dems’ bluff by passing a bill that prioritizes revenues so that certain obligations (not only interest payments on the national debt but also “entitlement” payments such as Social Security checks) will be met.  That would take the specter of “default” off the table, as both sides try to negotiate yet another compromise between the responsible budget proposed by the House Republicans (another modest effort by House Budget Chairman Paul Ryan’s committee, which at least makes a serious effort to reduce spending) and the irresponsible budgets proposed by the “big spenders” in the White House and the Democrat-controlled Senate.



    ■ B.O.’s Recession: A Status Report

    Here on MayerBlog I’ve frequently described the lousy U.S. economy – commonly called the worst “recovery” since the Great Depression – as “B.O.’s recession,” because it’s the direct result of his failed, misguided economic policies.  (See, for example, my discussion of “Keynesian ‘Stimulus’ Bullshit,” in Part II of this year’s “Prospects for Liberty” (January 31), supplemented by my discussion of “B.O.’s Anti-Growth Agenda,” in this year’s “Spring Briefs” (March 23).)  Although the economy technically fails to fit most economists’ definition of recession (two or more successive quarters of negative economic growth), the continuing high unemployment rates and the sluggish annual growth (averaging less than 2%) in GDP really make the economy look more like a recession than a recovery.

    The best that can be said about B.O.’s recession is:  It could be worse.  That’s so, in a couple of different senses – one that is hopeful; one that is ominous. 

    The hopeful sense in which “It could be worse” is this:  It’s unlikely that even more disastrous policies will be implemented during the next four years.  Why?  Simply because B.O. is a failure not just in his policies (in the substance of the policies he’s been pushing) but also as a leader (in actually implementing those policies).  He’s the “Naked Emperor,” from the famous Hans Christian Andersen story (as noted above) – a fraud, whose presidency is based on bullshit.  But, thankfully, B.O. has a lousy work ethic (which is a polite way of saying he’s lazy); he’s a narcissist who’d rather just be POTUS (President of the United States) than do the job of POTUS.  The Government Accountability Institute mined the records of the White House calendar and discovered that B.O. “has spent a total of [only] 474.4 hours (or 47.4 10-hour workdays) in economic meetings or briefings of any kind throughout his presidency” thus far.  In comparison, he “has played 115 total rounds of golf and spent 86 days on vacation, for an estimated combined total of 976 hours.”  Lest we forget, B.O. was the president who in 2009 began his occupancy of the White House by declaring, “My economic agenda . . . begins with jobs,” and who has since repeatedly declared that his “No. 1 focus” would be “jobs, jobs, jobs” (“A Presidency Driven Into the Rough,” I.B.D., April 30).  Just imagine how bad the economy would be if B.O. really were focused, “laser-like,” on it.

    The other sense in which B.O.’s recession could be worse, unfortunately, is ominous:  it’s indeed likely to become worse.  Even though an effective Republican opposition can prevent the enactment of even more disastrous legislation being pushed by B.O. and the Democrats (simply through the GOP’s control of the U.S. House and continuing gridlock), the major pieces of legislation that were passed during the first two years of B.O.’s presidency, when both houses of Congress were controlled by Democrats – “ObamaCare,” the Dodd-Frank law, and “stimulus” spending – will continue to hamstring the U.S. economy.  So too will the massive increase in federal government regulations that has taken place during the past four years.  (See, for example, my discussion of “Regulation Nation” in last year’s “Thoughts for Summer” (May 25, 2012).)  And, as the next section discusses, the Dodd-Frank “reform” legislation (what I call the “Frank-n-Dodd monster”) not only failed to fix the underlying problems that led to the 2008 fiscal crisis but is likely to hasten yet another fiscal crisis. 



    ■ Déjà Vu: The Coming (Returning) Fiscal Crisis

                Second only to the so-called “ObamaCare” law as the worst piece of legislation passed by Congress (at least in recent years) is the monstrous Dodd-Frank law (named after its chief sponsors, former Senator Chris Dodd and former Congressman Barney Frank, both Democrats), which also passed Congress in 2010 by a mostly party-line vote when Democrats controlled both houses of Congress.  The law was supposed to “solve” the problems that led to the financial crisis in 2008, but it did nothing to cure the basic underlying problem of the “housing bubble,” and it has actually made conditions worse for banks (especially small banks, which are being increasingly hurt – and, in many cases, driven out of business – by the steep regulatory costs imposed by Dodd-Frank).  (For more on all of this, see especially the superb book by former BB&T chairman – and now president of the libertarian Cato Institute – John A. Allison, The Financial Crisis and the Free Market Cure: Why Pure Capitalism Is the World Economy’s Only Hope (McGraw-Hill, 2013).)

    Most insightful analysts of the 2008 financial crisis (like Mr. Allison) have concluded that it was triggered by the bursting of the housing bubble, which in turn was caused by irresponsible government regulations pushing “affordable housing” – which in practice meant a relaxing of mortgage standards for buyers with weaker credit – policies that trace back to the Community Reinvestment Act, passed originally during Jimmy Carter’s presidency and then broadened under Bill Clinton, together with similar initiatives pushed by the FHA and mortgage giants Fannie Mae and Freddie Mac.  Now, perversely, the B.O. regime is pushing those same policies “that got us into this mess in the first place” (to quote B.O.’s favorite mantra when he tries to blame economic problems on his predecessor, George W. Bush – though this problem cannot in any way be blamed on Bush, who tried to reform Fannie Mae and Freddie Mac, only to be rebuffed by those entities’ defenders in Congress, Democrat politicians led by Congressman Frank and Senator Dodd!).  As the editors of Investor’s Business Daily recently warned, the B.O. regime “wants banks to lower lending standards, and Fannie and Freddie are back in the black.  The stage is set for a replay of some very unpleasant history” (“Look Out, Taxpayers: It’s Bubble Time,” April 4).

    And in another ominous development, B.O. has named Democratic Congressman Mel Watt (former chairman of the Congressional Black Caucus and one of the leading “affordable housing” zealots who caused the housing bubble) as the new executive overseeing Fannie Mae and Freddie Mac, the quasi-private entities that back up nearly 90% of U.S. mortgage loans.  Will the insanity never end?



    ■ Gun-Control Fascists: Their Ammo Is Bullshit

                Anti-gun fascists – activists pushing for more government control over guns (and hence further erosion of Americans’ Second Amendment rights) – have been trying to disguise their agenda by claiming they’re concerned about “gun violence” and are really calling just for so-called “common-sense reforms” to help assure “gun safety.”  Those claims are simply bullshit, as I maintained in Part III of my “2013: Prospects for Liberty” essay (February 7, in the section on “‘Gun Violence’ Bullshit”).  It’s not guns, but the misuse of guns by criminals, that jeopardizes Americans’ rights; efforts by the government to expand its control over firearms, in the name of reducing “gun violence” or promoting “gun safety,” only make law-abiding Americans more vulnerable to criminals, less able to protect themselves, and more dependent on government for their safety.  As I noted in the February essay, “gun control” really means government control over the ownership, possession, and use of firearms by individuals – in other words, government control over firearms owners – and the most extreme advocates of gun control (the most extreme advocates of “solutions” to the “gun violence” problem) really want (as their ultimate objective) a government monopoly over firearms.  A government monopoly means the prohibition of private gun ownership, including the confiscation of all firearms possessed by individuals not authorized by the government to possess them.

    Following a key dogma in the leftists’ playbook (to “never let a crisis go to waste,” as B.O.’s former White House chief of staff (and now mayor of Chicago) Rahm Emanuel has put it), gun-control fascists have been trying shamelessly to exploit the tragic school shootings at a Newtown, Connecticut elementary school earlier this year – in which 20 children and six adult staff were murdered – as a kind of emotional ammunition, to push their agenda.   

    B.O. in particular has been exploiting the Newtown tragedy in order to try to ram his gun-control initiatives through Congress.  In an obvious abuse of presidential power, B.O. flew some of the Newtown victims’ families on Air Force One to Washington, D.C., to lobby members of Congress.  And in a speech on April 8 at the University of Hartford, accompanied by some Newtown parents, B.O. claimed that his push for gun control was apolitical.  “Why, then, was he yelling?” asked the editors of Investor’s Business Daily.  B.O. “was thundering over and over that ‘this is not about politics.’ [But] in the course of his speech, he spoke of ‘politics in Washington,’ of ‘political stunts to prevent votes,’ of ‘political victory or defeat for me,’ and of how he ‘did not believe the country was as divided as our politics would suggest.’”  “That’s a lot of talk about politics on something that has nothing to do with politics,” the editors observed, adding: “Of course, what [B.O.] is up to has everything to do with politics – and nothing to do with saving children from bullets.”  B.O., the “most ideological president in U.S. history,” as well as one of the most divisive and one of the worst narcissists ever to occupy the White House, sees in gun-control politics “another opportunity to body-slam a component of the Republican Party’s base, those bitter non-urban GOP voters who ‘cling to guns or religion,’ as [B.O.] has said of them” (“Politicizing a Mass Killing of Kids,” I.B.D., April 10).

    Yet, as a USA Today editorial on April 4 acknowledged, the momentum on “gun safety” (meaning gun control) legislation after the Newtown shooting has slowed.  The editors (themselves members of the gun-control fascist club) attribute that to the supposed power of the “gun lobby.”  What they fail to mention is that all the measures proposed by gun-control zealots – including the new draconian gun controls passed into law by the Democrat-controlled legislatures of Maryland and Colorado – would not have prevented the Newtown massacre.  (One of the Newtown victims’ parents who refused to play along with B.O.’s attempt to politicize the tragedy was Mark Mattioli, who in a Fox News interview asked, “Why should I be hampered in protecting myself when someone can come into my home and outgun me?”  He called for increased enforcement of existing laws and more mandatory sentencing, pointing out that criminals won’t follow new laws any more than they do existing ones.  And he has praised the detailed “School Shield” proposal the National Rifle Association has put forth since the massacre to train armed guards to protect schoolchildren from mass killers.)

    When the battle was taken up in the Congress this spring – in what might be called “a gunfight at the Not-so-OK Corral” – B.O. and the other advocates of gun control suffered a major political defeat.  In the Democrat-controlled Senate, gun-control backers came up far short of the 60 votes they needed for passage (with several Democrats joining the Republicans in opposition).  The compromise background-check bill, co-sponsored by Senators Joe Manchin (D.–W.Va.) and Pat Toomey (R.–Pa.), got 54 votes.  The amendment that would have renewed a federal ban on so-called “assault weapons” (the bill pushed by the author of the original Clinton-era ban, Senator Dianne Feinstein (D.–Calif.)) got only 40 votes, and the effort to put federal limits on the size of ammunition clips got only 46 votes.

    The much-vilified “gun lobby” that the gun-control fascists like to demonize – the National Rifle Association and other gun-rights groups, including the even more strident Gun Owners of America – consists not only of effective grass-roots lobbying organizations but, in fact, of the nation’s largest (and most effective) civil rights organizations, for they are defending not only their members’, but all Americans’, fundamental freedom: their Second Amendment rights.



    ■ Reality Check: The Boston Marathon Bombing Massacre (BM2)

                The bombing at the Boston Marathon on April 15 – which killed three persons (including an 8-year-old boy) and injured or maimed over 170 people – was, depending on one’s interpretation of the facts as reported in the news media, either: (A) another one of a series of recent mass killings perpetrated by deranged young white males, or (B) the most significant militant Islamic jihadist terrorist attack on U.S. soil since the horrid al-Qaeda attacks on September 11, 2001, or (C) something else.  The correct answer is C, and the “something else” I’m referring to is actually a combination of A and B.  (In other words, A and B are narratives based on a false dichotomy resulting from confusion over the definition of the words terrorist and terrorism – and a failure to recognize the significance of the global Muslim jihadist threat.)

    As news stories over the past month and a half have revealed, the perpetrators of the bombing were two brothers, Tamerlan Tsarnaev, 26 (who was killed in a gunfight with police), and his younger brother, Dzhokhar, 19 (who was apprehended in a nearby neighborhood, just outside the area where Boston police conducted house-to-house searches, and is now in custody, awaiting trial for the murders).  The Tsarnaev brothers are Muslim immigrants from Russia – specifically, their family origins are in Chechnya, the rebellious Russian region near the Caucasus Mountains, which has become a breeding ground for terrorists (making the brothers, quite literally, Caucasian Muslims).  By all accounts, the brothers were fully Americanized, permanent residents attending U.S. schools, but as such, they were perfect “sleepers,” terrorists in waiting, ripe for recruiting.  And from what media accounts tell us thus far, it was the older brother, Tamerlan, who became most disenchanted with life in America and so traveled back to Russia, where he received training in militant Islamic camps near his homeland.  When Tamerlan returned to the United States, he in turn “radicalized” his younger brother, Dzhokhar.  Statements made by Tamerlan on his website before the bombing and by Dzhokhar after the bombing clearly show they regarded themselves as Islamic jihadists, killing American civilians in retaliation for the killing of civilians by U.S. forces in Afghanistan.  Despite abundant warning signs (particularly with regard to Tamerlan’s activities), U.S. government authorities failed to anticipate the danger.  (Once again, fears of “politically incorrect” ethnic or religious profiling apparently prevented federal authorities from either raising red flags, or acting on them, concerning the Tsarnaev brothers’ plot.  Thus far, thankfully, it seems that other than some friends who aided or abetted Dzhokhar’s attempt to escape, no other jihadists were involved in the bombing itself.)

    That all said, nevertheless, those conservatives who called for the surviving Tsarnaev brother to be treated as an “enemy combatant” – and thus to be denied those procedural protections that American law affords to other accused criminals – are also wrong, in their irrational overreaction to the militant Islamic jihadist threat.  

    The Tsarnaev brothers were indeed terrorists – specifically, militant Islamic terrorists – but that does not make them enemy combatants.  Terrorism is generally understood today as the use of criminal acts calculated to provoke a sense of terror in the general public, a group of persons, or particular persons for political purposes.  That’s the general definition of terrorism that the U.N. General Assembly has used since 1994 in various resolutions condemning terrorist acts, as the Wikipedia article on “Terrorism” notes, adding that in neither U.S. law nor international law is the term terrorism precisely defined.  Since the September 11, 2001 al-Qaeda attacks on the United States, militant Islamic terrorists essentially have declared war on the United States, and the U.S. government has responded by waging war in the Middle East (primarily in Iraq and Afghanistan), though without a formal declaration by Congress because (from the U.S. perspective) it is a defensive war.  Militant Islamic terrorists killed or captured in a foreign theater of war are indeed enemy combatants; but “homegrown terrorists” like the Tsarnaevs – legal residents and/or citizens of the U.S., acting within the United States – are, legally, common criminals.  Terrorism explains their motive; it does not define their criminal acts – which in this case, are murder and attempted murder.

    In this case, thankfully, given the circumstances of the Boston attack (the placement of the bombs in a very public place, near the Marathon’s finish line, where a multitude of cameras, both public and private, captured the suspects’ images), modern technology enabled the police to identify them.  And the usual operation of the American criminal justice system – both the police and FBI in apprehending suspected criminals, and the courts in putting them on trial – seems quite adequate to deal with this particular criminal, according to the rule of (civilian) law.

    According to a military expert recently interviewed by a USA Today reporter, however, the Boston bombing is not an “anomaly,” and we can expect more such cases of “home-grown” militant Islamic terrorism challenging our criminal justice system.  Michael Barbero, a retired Army lieutenant general who led the Joint IED Defeat Organization (JIEDDO), has warned that the threat from homemade bombs – the top killer (and maimer) of U.S. troops in Afghanistan and Iraq – is “here to stay,” will persist for decades, and is likely to become a more prevalent menace domestically (“Military expert: Boston bombing ‘not an anomaly,’” May 20).  The recent brutal knife attack, resulting in the death of a British soldier on the streets of south London – with the militant Muslim perpetrators caught literally “red-handed,” with blood-stained hands and still wielding the knife with which they committed the murder, brazenly invoking jihadist rhetoric on video – not only reminds us that militant Islamic terrorist attacks can and do occur in the West, but also that firearms are not required to commit jihadist murder.

    The failure of the U.S. government to fully recognize the dangers of militant Islamic terrorism – and to deal with the problem appropriately, according to the Constitution and U.S. law – may be traced back to the George W. Bush administration, which bowed to political intimidation (mostly from the left, pushing the agenda of so-called “political correctness” to avoid religious discrimination against Muslims) and used the euphemistic term terrorism as a blanket description when what it really meant was “militant Islamic terrorism.”

    But the B.O. regime – in its zeal to “reset” American foreign policy, to use the term coined by B.O. himself and his former secretary of state, Hillary Clinton – has taken political correctness to ridiculous new depths, resulting in a U.S. government that seems powerless to deal with the real threats to national security – and to the rights, and the lives, of American citizens – that militant Islamic terrorism poses.  One result (as noted in the section below) is the regime’s attempt to cover up the facts about the militant Islamic terrorist attack on the U.S. consulate in Benghazi, Libya, on September 11, 2012.  Another striking case in point: Nidal Hasan, the Fort Hood killer (who committed mass murder – shooting and killing 13 people and injuring over 30 – at Fort Hood, near Killeen, Texas, on November 5, 2009).  Many political commentators have reasonably called the killings an act of (militant Islamic) terrorism (Hasan allegedly invoked the name of Allah as he shot, targeting U.S. soldiers in uniform).  Yet the B.O. regime stubbornly refuses to recognize the obvious; the Department of Defense and federal law enforcement authorities have classified the incident as an act of “workplace violence” (similar to a U.S. Postal Service worker going nuts – or going “postal,” literally – and shooting his or her fellow employees).  Because Hasan is a member of the U.S. armed forces – a U.S. Army major serving as a psychiatrist (ironically!) – he’s being prosecuted (appropriately) in a military court.  So far he’s been charged with 13 counts of premeditated murder (for the persons he killed) and 32 counts of attempted murder (for the persons he wounded) under the Uniform Code of Military Justice; additional charges may be added at his court-martial, and if convicted, he may be sentenced to death (as he should be).  Outrageously, however, it was recently reported that he is still collecting his salary (nearly $300,000 annually) as he awaits trial!

    When will the B.O. regime pull its head out of the sand and acknowledge the real threat of militant Islamic terrorism?     



    n The Ghosts of Benghazi 

                No wonder “Benghazi” – the militant Islamic attack on the U.S. consulate in Benghazi, Libya, on September 11, 2012, which resulted in the murder of the U.S. ambassador, Christopher Stevens, and three other Americans – has been associated with “IRS-Gate” and the Justice Department’s surveillance of news reporters, in the so-called “trifecta” of scandals that currently plague the B.O. regime.  Many commentators (among them, the usually insightful and politically astute Charles Krauthammer) regard Benghazi as the most serious scandal, the one that might actually bring down B.O.’s presidency.  The parallels to Richard Nixon and the Watergate scandal that brought down Nixon’s presidency are quite clear – particularly in the way both administrations have tried to “cover up” an underlying act of malfeasance (in Watergate, it was a petty crime, a “third-rate burglary,” as it’s been aptly characterized; in Benghazi, it’s the fecklessness of B.O.’s foreign policy, generally, and especially his mismanagement of the war against militant Islamic terrorism).  (For more on this, see my discussion “The `Arab Spring’ Turns Into an Islamo-Fascist Autumn,” at the beginning of last fall’s “Tricks and Treats” (Oct. 25, 2012).)  And, as many conservatives have pointed out in arguing that Benghazi is even more serious than Watergate, no one died in Watergate. 

    That Benghazi may become B.O.’s Watergate is even more apparent when we consider the regime’s attempt to “stonewall” ongoing congressional investigations.  The Senate Armed Services Committee hearing – which included testimony by two men with key roles in the Libya debacle, departing Defense Secretary Leon Panetta and Gen. Martin Dempsey, chairman of the Joint Chiefs of Staff – painted a picture of a disengaged president, particularly when, under questioning by Sen. Lindsey Graham, Panetta revealed that B.O. talked with him only once during the Benghazi assault and never called him back for updates.  More recently, three State Department “whistleblowers” testified before the House Oversight and Government Reform Committee: Gregory Hicks, former deputy chief of mission in Libya (Ambassador Stevens’ second-in-command, the No. 2 diplomat in Libya during the Benghazi attack); Mark Thompson, acting deputy assistant secretary for counter-terrorism; and Eric Nordstrom, the former regional security officer in Libya.  Hicks’ testimony was particularly explosive: after an emotional retelling of the night of the attack, including the death of Ambassador Stevens, Hicks revealed that he was rebuffed by Washington when he pushed for a stronger military response to the attack, particularly during the critical hours that elapsed before the last two Americans were killed at the consulate.  Hicks testified that a four-man team of military special operations forces was in Tripoli, geared up and ready to drive to an aircraft to come to Benghazi, to help those still trapped in the consulate, when its commander was ordered to stop by his superiors (at Special Operations Command Africa).  Hicks said the commander told him: “I have never been so embarrassed in my life that a State Department officer has bigger balls than somebody in the military.”  Taken together, the testimony of the three State Department officials reveals that the B.O. regime – including both the president and former Secretary of State Hillary Clinton – was aware in the first hour that it was a terrorist attack, knew it was no spontaneous protest caused by a video, refused to send what help it could, and then deliberately lied about it.  (See “Hillary Lied, Four Died in Benghazi,” I.B.D., April 25; “Whistle-Blower: We Saw Them Die,” I.B.D., May 7; and “Whistle-Blowers: Yes, `It Matters’,” I.B.D., May 9.) 

    Even more explosive than his account of the night of the attack was Mr. Hicks’ testimony about the B.O. regime’s subsequent efforts to cover it up.  Hicks testified that he was appalled – that it was “jaw dropping” to him – when he heard the regime’s narrative that the attack resulted from a spontaneous demonstration prompted by an anti-Mohammed video posted on YouTube, the narrative that U.N. Ambassador Susan Rice repeated on five Sunday talk shows.  He also said that Beth Jones, acting assistant secretary of state for Near Eastern affairs, dressed him down shortly after he criticized the regime’s narrative; she also ordered him not to talk to congressional investigators.  After he did talk with them, he received an angry reprimand from Cheryl Mills, chief of staff to former Secretary of State Hillary Clinton (“The Intimidation of Gregory Hicks,” I.B.D., May 10).  And as the congressional hearings have revealed, the infamous Benghazi “talking points” – essentially the script that regime officials used to push their false narrative about the YouTube video – were rewritten at least a dozen times, removing all references to al-Qaida and prior attacks (references that were contained in the original CIA text), at the direction of the secretary of state’s office (“A Lie Hillary Agreed To Peddle,” I.B.D., May 13). 

    Despite all that congressional investigators have learned thus far, several critical questions still remain unanswered.  Why did the B.O. regime begin the false narrative that the video sparked the attack?  Who changed the talking points, and why did they want to mislead the American people?  What U.S. military assets were readily available within striking distance of Benghazi after the attack had begun?  Why were they not made ready, or if they were ready, who gave the order to “stand down” – and why?  What intelligence was available before the attack, and how specific was that intelligence?  Embassy personnel had been requesting additional security for months; why were those requests denied?  What was Ambassador Stevens doing in Benghazi?  What happened to the terrorists who were captured by an American team inside the CIA Annex and then turned over to a Libyan rescue team?  Who, if anyone, is being held accountable for the attack?  What efforts are being made to identify and apprehend the terrorist attackers?  (Meanwhile, the producer of the video is still sitting in jail, arrested on an unrelated parole violation.)  The FBI interviewed survivors of the attack on Sept. 15 and 16; why has the B.O. regime denied access to those survivors, and when will that information be made available to Congress and the public?  And finally, what was B.O. doing during the critical hours of the attack?  As Senator Ron Johnson (R.–Wis.) noted in an op-ed earlier this spring, “A complete timeline of what the president was doing that night, like we saw during the operation against Osama bin Laden, has never been provided.  Did he ever assemble a team in the situation room to monitor events?  Or did he simply check out, rest up, and then fly to a campaign fundraising event in Las Vegas only a few hours after those brave Americans had died?” (“5 Months After Benghazi, Justice Remains Elusive,” I.B.D., March 4). 

    Democrats have asserted that congressional investigations into the Benghazi tragedy/fiasco, especially in the Republican-majority U.S. House of Representatives, are a “witch hunt.”  In a sense, they’re right – because the witch (or maybe another word, one that rhymes with witch, is more appropriate) is named Hillary Rodham Clinton.  Contrary to the campaign ads she ran (opposing B.O. in the Democrat primaries in 2008), claiming that she’d be the one to call when there’s an international emergency at 3:00 a.m., Mrs. Clinton seems to have failed spectacularly in her job as Secretary of State.  She ignored requests for additional security at the consulate, in response to intelligence showing Muslim terrorist threats, and she neglected to act to help save the lives of American personnel during the critical hours between the initial attack and the time when four Americans, including the U.S. ambassador, were killed.  Worse, she actively participated in the B.O. regime’s false story that the Benghazi attack was motivated by spontaneous outrage at an anti-Mohammed video posted on the Internet – a story providing political cover for B.O.’s foreign-policy failures, a story that she knew was false, even when she looked the father of one of the dead Americans in the eye and promised him that the government would apprehend the video maker.  As I noted in the section on “2013 Politics,” above, Hillary’s malfeasance in connection with the Benghazi matter, by itself, ought to disqualify her from ever again holding any high political office. 

    In response to the infamous rhetorical question that she asked during her congressional testimony, when a visibly angry (or exasperated) Mrs. Clinton almost shouted, “What difference, at this point, does it make?” we should respond:  Mrs. Clinton, it makes a great deal of difference to the families of Ambassador Stevens and the other murdered Americans – and to the American people, who deserve to know the full truth about the murders, including the answers to all the “Who? What? and Why?” questions noted above. 

    The “ghosts of Benghazi” will continue to haunt B.O. and his regime (including his former secretary of state) until the truth finally and fully comes out.



    n Assad, Abbas, Hamas – What the Hezbollah! 

                The Middle East today is a powder keg, primed and ready to explode into another major war during the next few years, while B.O. continues to occupy the White House and U.S. foreign policy seems feckless in dealing with the global threat of militant Islamic extremism.  (Indeed, as I briefly discussed at the beginning of last fall’s “Tricks and Treats” essay (Oct. 25, 2012), B.O.’s support for the so-called “Arab spring” seems to have resulted in more militant Muslim groups seizing control in such countries as Egypt and Libya.)   

    The only question is what kind of war it will be:  Will the Muslim regimes in the region again unite in an attack on Israel, as the leaders of Iran have been threatening to do for years?  (In addition to continuing to develop nuclear weapons and long-range missiles that would deliver them to Israel, the Iranians are also giving support to the terrorist organizations Hamas and Hezbollah, in their bases in Gaza and Lebanon, respectively – groups which have pledged to destroy Israel and replace it with a Muslim Palestinian state.)  Or will the civil war currently raging in Syria, between the brutal regime of dictator Bashar Assad and the Syrian rebels, spread into other neighboring nations, leading to a regional war between the two major strands of Islam – Sunnis and Shiites?  As a recent news article in USA Today reports, Sunni Muslims are the majority in Syria, and Sunnis – including the royal families of the Persian Gulf oil sheikdoms as well as the leadership of al-Qaeda – have sent arms and troops to support the Syrian rebels; while Assad has appealed to the Shiites, and has received military support from the Shiite theocracy of Iran as well as its associated terror group Hezbollah in Lebanon.  Already the fighting has bled across Syria’s border with Lebanon, and rockets have been fired across Syria’s borders with Turkey and Iraq (“Syria’s wounds bleed into other nations,” May 30). 

    So far B.O. has limited U.S. involvement in the Syrian civil war to non-weaponry aid (mostly food and medical supplies) to the rebels.  But will he get the United States involved in yet another Mideast war?  Earlier he had declared that Syria’s use of chemical weapons against its people crosses a “red line,” and in late April White House press secretary Jay Carney told reporters that B.O. “retains all options to respond” to Syria, including military strikes on the country.  Whether or not intervening militarily is wise (and I think it’s clear that the American people will not tolerate yet another Mideast war in which U.S. servicemen and women are killed and maimed for no well-defined purpose), has B.O.’s announced “red line” boxed the U.S. into a position where failure to intervene might cause us to further lose world standing? (“Taking Away U.S. Credibility,” I.B.D., April 30). 

    Apart from our alliance with Israel, the United States no longer has any vital national interests at stake in the Middle East.   Thanks to the energy “revolution” – technological advances that have made it now feasible to extract the abundant fossil-fuel resources in North America – the United States is now on the verge of being self-sufficient, no longer dependent on Mideast oil for its energy needs (even in spite of B.O.’s war on carbon energy, as I’ve discussed in Part I of this essay).  As Victor Davis Hanson has maintained in a provocative recent op-ed, “the Middle East is becoming irrelevant” – once again, a global backwater, as it was for centuries before the discovery and exploitation of its vast oil reserves.  He adds that B.O. “senses there is no support for U.S. intervention in the Middle East,” and that even his idea of “leading from behind” in Libya “led to the loss of American personnel in Benghazi.”  Hanson predicts that after Iraq, “the U.S. will not nation-build in Syria.  Apparently, Americans would rather  be hated for doing nothing than be despised for spending trillions of dollars and thousands of lives to build Middle East societies” (“The Monotonous Middle East,” I.B.D., May 3).  I hope he’s right.   



    n The Korean War II? 

                In this year’s “Spring Briefs” essay (March 23), I wrote about the “Mischievous Young `Un,” North Korea’s 30-year-old communist dictator, Kim Jong Un (the son and grandson of dead communist dictators Kim Jong Il and Kim Il-sung, respectively), and the threat his lawless regime poses to world peace.  Angrily reacting to the U.N. Security Council’s recent unanimous decision to tighten sanctions against his rogue country, Kim’s regime announced that it was nullifying all nonaggression agreements with South Korea, declaring invalid the armistice agreement that ended the Korean War in 1953.  So, as far as North Korean rhetoric is concerned, the chapter of the Cold War known as the Korean War has again heated up.  Is it Korean War II? 

    As I also noted in March, it’s not just South Koreans who are nervous.  “As Pyongyang marches relentlessly toward deliverable nuclear weapons and the long-range missiles to carry them, one top North Korean general has claimed that his country has nuclear-tipped intercontinental ballistic missiles (which could reach the United States, not only Alaska but also the Northwest coast) ready to blast off.”  The editors of Investor’s Business Daily warned that the three-stage missile North Korea launched last December also orbited a “package,” which experts say could be a test to orbit a nuclear weapon that then could be de-orbited on command anywhere over the U.S. and exploded at a high altitude, releasing an electromagnetic pulse (EMP).  That would fry electronic circuitry and collapse the electric power grid and other critical infrastructures – communication, transportation, banking and finance, food and water – that sustain modern civilization and the lives of 300 million Americans (“Can North Korea Destroy U.S.?” April 5). 

    I concluded my discussion in March by observing that even B.O., who has been a critic of missile defense since long before his 2008 campaign (and who has been very cooperative with Russia in scrapping missile defense in eastern Europe), finally sees the merit in having some missile defense to protect the U.S. homeland from rogue regimes like Kim’s in North Korea.  “I can tell you that the United States is fully capable of defending against any North Korean ballistic missile attack,” White House spokesman Jay Carney has said.  Given all the lies that Carney has recently told on behalf of B.O. (especially with regard to the “trifecta” of scandals involving Benghazi, the IRS, and the Justice Department), let’s hope that this time he’s telling the truth.



    n The EU: eeyeou! (Lessons from Mrs. Thatcher) 

                As Greece and several other European countries wrestle with fiscal crises generated by their mushrooming national debts (largely due to out-of-control national government “welfare-state” spending), many astute observers have begun questioning whether the much-vaunted European Union (EU) was a good idea.  That’s especially so when EU authorities insist that the solution to member nations’ bankruptcy problems is a policy of “austerity,” defined more in terms of tax increases than cuts in government spending. 

    One leader who questioned the value of the EU was Britain’s former prime minister Margaret Thatcher.  Mrs. Thatcher, who completed her life on April 8 (at age 87), was the first woman to become prime minister of Britain and the first to lead a major Western power in modern times.  But she was best known as the “Iron Lady” who reversed Britain’s economic decline by dismantling socialism and substituting free-market policies.  When her Conservative Party came to power in May 1979, the British economy was in shambles, with inflation running at 20%, ballooning budget deficits, and the highest unemployment rate since the Great Depression.  Thatcher’s solution was to promote the free-market views of Milton Friedman and Friedrich Hayek, emphasizing the privatization of the inefficient state-owned enterprises that had caused the nation’s economic stagnation.  “Privatization was fundamental to improving Britain’s economic performance,” she wrote in her memoirs.  But it was more than that: it was “one of the central means of reversing the corrosive and corrupting effects of socialism.”  (In another one of my favorite quotations, Thatcher said, “The problem with socialism is that you eventually run out of other people’s money.”)  And so she privatized the big, state-owned industries that were losing money, including utilities, British Airways, shipping, railways, and auto manufacturing; standing up to the powerful mineworkers union in 1984, she also privatized the mines.  (Comparing Thatcher to her good friend and ally, her partner in freedom and free markets, President Ronald Reagan, the editors of Investor’s Business Daily note that her government’s victory over trade union might – epitomized by the miners’ surrender in 1985 after more than a year of illegal striking – made Reagan’s 1981 firing of PATCO air traffic controllers “look like small ball” (“Margaret Thatcher’s Legacy of Leadership,” April 9).) 

    Thatcher’s government also brought inflation under control by lowering taxes and cutting spending – essentially the same “supply-side,” pro-growth economic policies that President Reagan pursued in the United States.  As she told her ministers, “Gentlemen, if we don’t cut spending we will be bankrupt.  Yes, the medicine is harsh, but the patient requires it in order to live. . . . We did not seek election and win in order to manage the decline of a great nation.”  By restoring economic prosperity and British national pride, Thatcher and her Conservative Party won a second election in 1983 and a third in 1987.  But the tough economic medicine that Thatcher administered made her unpopular (especially among leftists); and in her final year in office, she even faced a revolt among her own cabinet ministers, in part over the question of further integration with Europe.  Thatcher resisted the loss of British autonomy in a European Union, becoming what some people have called a “Euro-skeptic.”  “We have not successfully rolled back the frontiers of the state in Britain, only to see them reimposed at a European level, with a European super-state exercising a new dominance from Brussels,” Thatcher declared.  In response to the then-president of the European Commission, Jacques Delors – who had proposed that the European Parliament become the popular assembly of the EU, with the Commission to be the executive and the Council of Ministers to be the senate – she famously shouted, “No! No! No!” in the House of Commons. 

    Thatcher’s nearly 12 years in power – from May 1979 until November 1990 – was the longest tenure of any British prime minister since the early 19th century (since the Earl of Liverpool, who governed from 1812 to 1827).  Yet even her political enemies have to acknowledge that Thatcher turned the British economy around; hence, when Tony Blair’s Labour Party came back to power, it moderated its socialism because of the economic lessons that Thatcher taught – lessons that, unfortunately, have not been learned by the socialist politicians who control the EU. 

    The “austerity” policies that EU authorities have imposed on Greece as a condition for their fiscal bailout are virtually the opposite of the lesson Mrs. Thatcher’s legacy has taught, as the recipe for economic success.  They’re anti-growth, not pro-growth.  Thankfully, another European country provides a model for a better way forward: Estonia. 

    Estonia “took its medicine” as soon as the global financial crisis broke.  It cut government spending dramatically relative to its pre-crisis level – by 2.8% in 2009 and 9.5% in 2010 – and is now one of Europe’s fastest-growing economies.  “Moreover, Estonia’s central bank refused to prop up banks that shipwrecked on the rocks of a real estate bubble.”  Its economic recovery has been impressive: unemployment is now below the eurozone average, and the country had made up its total economic losses by 2012 (Matthew Melchiorre, “U.S. Should Look to Estonia, Where Austerity Is a Success,” I.B.D., April 25). 

    If Europe won’t follow Mrs. Thatcher’s words of caution about misguided EU economic policies, maybe it will follow the present-day model of Estonia.  Maybe U.S. policymakers should follow the Estonian example, too.



    n National Service: Bah, Humbug! 

                From time to time certain political commentators (almost always from the left side of the political spectrum) call for some form of compulsory “national service.”  For example, DeWayne Wickham, a leftist political commentator for USA Today, in a recent column (published the day after Memorial Day) bemoaned the fact that “since the start of the post-9/11 wars [in Iraq and Afghanistan], less than one-half of 1% of the population has been in the armed services.”  He contrasted that figure with military service in World War II – when there was a national draft – during which nearly 9% of Americans served in the military.  He also noted a 2011 Pentagon survey which found that 57% of the servicemen and women on active duty were the children of current or former members of the military, adding that “the composition of today’s armed forces looks more like a family business than a military force that is drawn widely from the nation’s population,” as he claimed it was during the era of national military conscription (between World War II and 1973, when the national military draft ended, during Nixon’s presidency – although compulsory registration for the draft has continued since Jimmy Carter’s presidency).  “The ending of conscription has narrowed the population base from which this nation’s military is drawn and made military service a less democratic ideal.” 

    Further bemoaning that “patriotism” has become “for too many people . . . [simply] wearing a lapel pin, hanging a flag on their front porch, or covering their heart during the singing of [the national anthem],” Wickham urges that the United States “should demand more of its citizens”: 

    “It should require all able-bodied Americans to perform some sort of national service.  If not in the military, then in a national organization such as the Corporation for National & Community Service, a federal agency that uses volunteers to provide services to the poor, the elderly, and disaster victims.”


    Only with such compulsory volunteerism – a truly Orwellian concept, a contradiction in terms – would America have meaningful, and “democratic,” patriotism, Wickham asserts (“Reinvigorate patriotism with national service,” USA Today, May 28). 

    Another leftist who has called for compulsory national service, in the form of reinstating the military draft, is Charlie Rangel, the Democratic congressman from New York, who had disgraced himself because of his political corruption (taking bribes and other violations of House ethics rules, for which he was censured but not ousted from Congress).  Rep. Rangel earlier this year – after the Pentagon announced that it would end the ban on women serving in combat – called for a reinstatement of the national military draft, for both women and men.  The bill he has proposed, the “Uniform National Service Act,” would require all men and women between the ages of 18 and 25 to give two years of service “in any capacity that promotes our national defense.”  Like Wickham, Rangel bemoans the fact that “the burden of defending our nation is carried by less than 1 percent of the population,” arguing that reinstating the draft is the only way to ensure “a more equal military.”  (What Rangel didn’t argue explicitly in his announcement was the notion that in the current all-volunteer armed forces, the “burden” of military service falls disproportionately on members of “minority” groups, black and Hispanic Americans – a bit of race-baiting demagoguery that no doubt helped him get reelected, in his overwhelmingly black Harlem district, in 2012, notwithstanding his ethics problems.) 

    Rangel is partly right.  As I maintained in my “Spring Briefs” essay (March 23), in the section that I provocatively titled “Draft Them Gals!,” now that the Pentagon has approved women for combat roles in the military, there is no rational basis for the government to require only men over the age of 18 to register for the draft; the Supreme Court, if faced with the question, ought to hold that limiting draft registration to males is a violation of the constitutional principle of equal protection of the laws.  

    However, the problem with the draft, constitutionally speaking, goes beyond its application only to males, as I also noted in the March essay.   Military conscription – and compulsory service to the government, in all its forms – is a direct violation of the Thirteenth Amendment, which prohibits not only slavery but also “involuntary servitude,” in the United States.  (Even Abraham Lincoln, who supported a national military draft law during the Civil War – before the Thirteenth Amendment was added to the Constitution – acknowledged that conscription was a form of involuntary servitude.  When faced with a constitutional challenge to the World War I draft law, the U.S. Supreme Court upheld the law, with hardly any justification; it certainly wasn’t the first or only time the Court has erred in its interpretation of the Constitution.) 

    Military conscription and all other forms of compulsory national service are not only unconstitutional but also, quite literally, un-American.  They are ideas whose time came – and went – centuries ago, in medieval England.  Bear with me, as I briefly discuss some English legal history here.  In Anglo-Saxon England all able-bodied men were expected to serve in the national militia, or fyrd, whenever called upon by the king to serve, which usually meant whenever the kingdom was at war.  This principle of universal allegiance (universal duty to serve the king, militarily) was continued after the Norman Conquest by William the Conqueror and his successors; King Henry II, in his Assize of Arms, ordered not only that all able-bodied free men provide military service to the king, but also that they equip themselves with the appropriate weaponry, according to their social status.  (Hence, long before the “right” to bear arms was affirmed, at least for Protestant subjects of the king, in the 1689 English Bill of Rights, there was – in a real sense – a duty to bear arms, in medieval England.) 

    By the early modern era, however, as medieval feudal society was transformed into modern capitalist society, economically speaking, a similar revolution took place in the law – and especially in the scope of monarchical power.  The king did not have unlimited powers; his authority was constrained by the law – a principle affirmed in Magna Carta (1215) but made real by a series of revolutions in 17th-century England, particularly the English Revolution (or Civil War) of the 1640s and the “Glorious Revolution” of 1689.  These latter two events coincided with the founding of most of the original English colonies in America.  While in England the aftermath of the Glorious Revolution meant that it was not the king but Parliament that was “sovereign” (and which therefore held the ultimate political power in society), in America it meant that it was not government at all, but the people, who were “sovereign.” 

    The state and national governments of the United States of America were established during the intellectual movement known as the Enlightenment (a movement that, for the first time in human history, regarded all individuals, regardless of their station in life, as equally possessing fundamental rights as human beings); and they were founded on the political principles of another philosophical movement known as “liberalism” (in the classic sense, or “classical liberalism,” as modern libertarians call it).  Those principles, stated so eloquently in the Declaration of Independence, include the fundamental ideas that all human beings inherently (and naturally) hold certain individual rights, including the rights to life, liberty, and the pursuit of happiness; that government is established for the purpose of better “securing,” or safeguarding, these individual rights; and that the legitimate powers of government derive from the “consent of the governed.”  (The latter principle is often called “popular sovereignty,” but it does not mean that “the people” have unlimited power, as English kings claimed to have, centuries ago.  Politically, the people possess – and may grant to government – only those powers necessary to safeguard individual rights, powers that themselves must not violate individual rights.  That’s the fundamental constraint put on the power of government by the American constitutional system.) 

    Under the principles of America’s founding, the government exists for the sole legitimate purpose of protecting the fundamental rights of the individual.  For government to call upon individual citizens to sacrifice their liberty – and even their lives – in the name of “patriotism” or in the name of a universal duty to provide “community service” (that is, to compel individuals to serve some collective) is to betray the very principles on which this nation was founded.  True “patriotism” literally means “love of [one’s] country”; in a nation explicitly founded on the primacy of individual rights, it means that no individual can be called upon to sacrifice his life or liberty for the sake of others – and that no one who truly loves America and the ideals on which it was founded would call for such a sacrifice.  And for those who regard service to others (whether to one’s family, one’s local community, or one’s state or national government) as a moral ideal, it must be remembered that the essence of morality is free will, or voluntary choice.  Compulsory service is not “moral” – just as compulsory “charity” is not moral, either.  (Indeed, as many conservatives have argued, the modern “welfare state,” by compulsory redistribution of wealth from some members of society to others, has in fact undermined private charity and its moral foundations.) 

    On Memorial Day (and again on Veterans Day, in November), Americans appropriately honor those exceptional individuals who, by volunteering to serve in the U.S. military, have not only shown their “patriotism” (in the true sense of the word, by showing how much they love, or value, their country) but also have truly been of “service” to their fellow Americans (in the moral sense), helping to preserve all Americans’ freedoms.  That’s why we should honor veterans: they chose to serve in the military, rather than being forced to follow some unconstitutional (and misguided) law.



    n “Marriage, American Style” Revisited 

                Nine years ago I posted my essay “Marriage, American Style” (May 19, 2004), giving truly “liberal” (that is, classical liberal or libertarian) arguments for legal recognition of same-sex marriage.  (Note that I’m using the more accurate term same-sex marriage – not “gay marriage” – because no one really ought to oppose the latter.  All marriages ideally should be “gay,” in the historic sense of the term as “happy,” in other words, that they’re emotionally rewarding for the married couple.)  I discussed why it would be consistent with the American constitutional tradition (with its emphasis on the sovereignty of the individual) to redefine marriage to include same-sex relationships – and why such a reform would be truly “progressive,” in the proper sense of the word.  (As Thomas Jefferson wrote, “Laws and institutions must go hand in hand with the progress of the human mind.  As that becomes more developed, more enlightened, as new discoveries are made, new truths disclosed, and manners and opinions change with the change of circumstances, institutions must advance also, and keep pace with the times.”)  

                In the nine years since I posted that essay, much progress (true “progress,” as Thomas Jefferson used the word) has been made, both in the United States and around the world.  In 2004 only one U.S. state recognized same-sex marriages, Massachusetts, as the result of a controversial state supreme court decision that conservatives decried as “judicial activism.”  And the United States was only the fourth nation in the world to recognize same-sex marriages.  In the United States today, 12 states and the District of Columbia have legalized same-sex marriages.  And a total of 14 nations have similarly done so – most recently, France (despite its being a heavily Roman Catholic country).   

    Meanwhile, the U.S. Supreme Court will soon decide two important cases relating to same-sex marriage.  In the case involving a constitutional challenge to California’s Proposition 8, a ballot initiative banning recognition of same-sex marriage in the state, I predict the Court will uphold the initiative, viewing the issue of same-sex marriage as a policy question that the people of each state ought to decide for themselves.  (Despite some persuasive arguments by libertarians and even some conservatives maintaining that same-sex marriage is a fundamental right that states cannot deny, without running afoul of the Fourteenth Amendment’s due process and/or equal protection clauses, I think the conservative majority on the Court will uphold federalism as the leading constitutional principle.)  In the other major case, involving a challenge to the federal Defense of Marriage Act (DOMA), which prohibited the U.S. government from recognizing same-sex marriages, even in states where they had been legalized, I predict that the Court will hold for the challengers and strike down the federal law.  Why?  The basic answer, again, is federalism:  the definition of marriage is a matter that the Tenth Amendment reserves to the states.  Legal recognition of same-sex marriage ought to come not from the judiciary (critics would continue assailing so-called “activist” judges who attempt to “legislate” from the bench), but from the people of the state (or at least their representatives in the state legislatures), as they become “more enlightened.”   

    True progress cannot happen overnight, nor can it be imposed by the government on the people. 



    n Aborting Extremism 

                    On most issues concerning individual rights, I agree with Barry Goldwater: "Extremism in the defense of liberty is no vice, and moderation in the pursuit of justice is no virtue."  What many people (usually those who self-identify as “moderates” or “centrists”) condemn as “extremist,” I regard as a principled defense of freedom, in all its aspects:  for example, I fully support First Amendment freedom of speech, the Second Amendment right to keep and bear arms, and economic freedom and property rights – and hence I oppose, respectively, all legal restrictions on campaign financing or on so-called “indecent” speech, all gun-control measures (including background checks), and virtually all legal restrictions on business owners (including minimum-wage laws, compulsory unionization laws, smoking bans, antitrust laws, even zoning requirements). 

    There is one issue, however, on which I take a position that might be called “moderate”; that issue is in today’s politics the other important “social” issue (besides same-sex marriage) – abortion (and specifically, legal restrictions on a pregnant woman’s freedom to obtain an abortion).  The abortion issue is unique because it does not involve a clear dichotomy between individual rights on the one hand and government controls on the other.  Rather, it involves an unavoidable conflict between two kinds of individual rights, both equally fundamental (the pregnant woman’s right to liberty and self-ownership and the unborn child’s right to life).  Because of this conflict, abortion is the one issue where a “moderate,” or “centrist” position, rather than being a compromise with evil (as it is on most other issues), is the only rational solution – and where the “extremist” positions, on both sides of the issue, are truly irrational “vices.”  

                    What is the moderate, or centrist, position?  It turns on the point in pregnancy when a fetus becomes viable – that is, when it is capable of living outside the mother’s body.  By that point, historically (as a matter of customary Anglo-American common law, dating back centuries), the baby is recognized under the law as a human being holding the fundamental right to life.  Prior to that point, in the early stages of pregnancy, the mother’s right to liberty (her ownership of her body) is paramount, for the fetus is only a potential human life.  Thus, any legal restrictions on a woman’s freedom to obtain an abortion in the early stages of pregnancy, for any reason, are infringements of her fundamental rights.  After viability, of course, the baby is not merely potential human life but actually a human being; its killing, through so-called “late-term” abortion procedures, is indeed murder and thus ought to be prohibited by the law.  This balancing of fundamental rights was what the Supreme Court tried to do in Roe v. Wade (1973); it makes sense under the law and also fits with majority public opinion (although the precise legal definition of when human life begins, under our federal constitutional system, is a matter reserved by the Tenth Amendment for the states).

    Since the 2012 election (when Mitt Romney’s negative comments about Planned Parenthood and its use of taxpayer funds to provide abortions might have alienated some female voters, as the left-liberal political narrative about the Republicans’ alleged “war on women” claimed), the leftist, “pro-choice” news media has been publicizing horror stories about “extreme” anti-abortion laws passed by legislatures and signed into law in “red” states (such as a newly proposed North Dakota law defining human life as beginning at conception, or an Arkansas law – currently being challenged in court – that bans most abortions after 12 weeks of pregnancy).  But the anti-abortion, “pro-life” movement may have received a boost, in public opinion, after the media finally started to cover the recent criminal case involving Dr. Kermit Gosnell, the Pennsylvania abortion doctor who performed late-term abortions at his run-down clinic in west Philadelphia.  Earlier this month, Dr. Gosnell was convicted of three counts of first-degree murder in killing three viable babies (born alive) and one count of involuntary manslaughter, for allowing a pregnant woman to die following an abortion.  He escaped the death penalty by agreeing to waive his right to appeal his murder conviction; he will serve life in prison without parole. 

    Even the most fervent “pro-choice” activists were embarrassed by Dr. Gosnell’s case.  The atrocities revealed during his trial were truly horrendous.  Although a spokesperson for NARAL Pro-Choice America tried to argue that “peeking into Gosnell’s clinic is like peeking into U.S. history pre-Roe v. Wade,” most pundits agreed with Fox News analyst Kirsten Powers (a left-liberal) that Gosnell’s atrocities were no “aberration” and raised troubling questions about abortion clinics elsewhere across America.  Polls taken after the Gosnell case show a shift in public views, away from support of abortion (especially the type of late-term abortions that Dr. Gosnell performed).  The only thing on which pundits have agreed is that the case has changed the debate over abortion.  I predict that the “moderate,” or “centrist,” view will eventually win.



    n The Rebirth of Freedom of Labor  

                     Membership in labor unions plummeted in 2012 to its lowest level since the 1930s, according to government figures released by the Bureau of Labor Statistics earlier this year.  As John Hinderaker reported on the Power Line blog, overall union membership declined from 11.8 percent to 11.3 percent of the workforce, or 14.4 million workers (a decline of about 400,000).  More than half the loss, about 234,000 members, came from government workers, including teachers, firefighters, and public administrators.  “But unions also saw losses in the private sector even as the national economy created 1.8 million new jobs in 2012,” the AP wire story reported.  “That membership rate fell from 6.9 percent to 6.6 percent, a troubling sign for the future of organized labor, as job growth generally has taken place at nonunion companies.”  Hinderaker adds that it’s not coincidental:  “Unions kill jobs, and eventually the job losses take their toll on union membership.  One is left wondering why such a spent movement is able to play a disproportionate role in our political life” (“Unions’ Decline Continued in 2012,” January 23).  

    As I’ve previously noted here on MayerBlog, the decline of “Big Labor” in the United States – both in terms of membership and its political clout – is good news for American workers and taxpayers.  (See my essay “Summer 2012 in Review” (Sept. 12, 2012), the sections on “Those Damn Labor Unions!” and “Victories in Wisconsin and Indiana.”)  

    Labor unions may have served an important purpose, in protecting the interests of workers in the private sector, in the early 20th century (at a time when American courts, following English common-law precedents, regarded most labor unions as criminal organizations that illegally restrained trade).  But since the New Deal era, when Congress unconstitutionally exercised its power to “regulate commerce among the states” by passing the first federal labor laws, the government has artificially protected labor unions, making them the favored creature of the law, giving them an unfair advantage in negotiating with employers – and depriving individual workers who chose not to join a labor union of their fundamental rights.  (Earlier in the 20th century, at a time when the U.S. Supreme Court protected “liberty of contract” as a fundamental constitutional right – see my book Liberty of Contract: Rediscovering a Lost Constitutional Right (2011), discussed in my essay “Rediscovering Liberty of Contract” (Jan. 14, 2011) – workers were not required to join labor unions as a condition of their employment.  But the federal labor laws enacted by Congress in the 1930s, 1940s, and 1950s not only mandated many union-contract standards – including minimum wages and compulsory overtime pay – in most employment contracts but also gave unions unfair advantages in organizing and bargaining with employers.  Among other things, federal labor laws provide that a majority of workers, by voting to bargain collectively through a labor union, may bind the minority who choose not to join.  One big exception in federal labor laws, discussed below, is the provision permitting states to enact “right-to-work” laws, which allow workers to opt out of joining a labor union.) 

    Especially since the end of World War II – the time when labor union membership was at its peak in U.S. history – labor unions have contributed to the economic decline of the United States, particularly in manufacturing industries (consider especially the U.S. auto industry), chiefly through the inflation of labor costs (not just salaries but also health-care and pension benefits, the costs of which have been exploding in recent years).  And public-sector unions (that is, the unionization of government workers, including teachers, firefighters, and police officers) have practically bankrupted many state and local governments (again largely because of the exorbitant, hemorrhaging costs of health care and pensions, as well as “sweetheart” contracts that provide for workers to contribute only a small fraction of their salaries to these benefit plans).  The only jobs that labor unions protect are union jobs – resulting in excessive labor costs that percolate throughout the economy, which in turn results in fewer jobs for the majority of Americans who are not union members as well as higher taxes for local, state, and federal taxpayers (who are footing the bill for the benefits provided unionized government workers).   

    Little wonder, then, that state governments – generally, states with Republican majorities in their legislatures as well as Republican governors (because labor unions, and especially government-worker unions, remain core constituencies of the Democratic Party) – have begun to reform their labor laws, particularly regarding government-worker unions.  As I noted in my September 2012 essay, it was a huge victory – again for both taxpayers and for workers – when Wisconsin (the birthplace of “progressivism,” the first state to allow government workers to unionize) in 2011 enacted a much-needed bill to reform government-worker unions which, among other things, raises state employees’ share of pension and health costs, limits union negotiations of wages, and prohibits unions from automatically deducting union dues from workers’ paychecks – all common-sense (and actually rather modest) reforms that were sorely needed in order to help reduce the skyrocketing costs of government workers’ pensions and health-care plans.  (For more on the Wisconsin reforms, see my 2011 “Spring Briefs” essay (Mar. 18, 2011) – and scroll down to the entry titled “Real Cheeseheads: Wisconsin Union Thugs and Their Democrat Allies”).  Efforts by Big Labor and the Democratic Party to overturn the law in the courts failed – as did their effort to recall Governor Scott Walker.   

    The success in Wisconsin has inspired other states – especially in the Great Lakes region, part of the nation’s “Rust Belt” where manufacturing industries have been in steep decline due to rising labor costs – to follow with their own reforms.  Indiana, for example, became a “right-to-work” state – one that allows workers the freedom to decide for themselves whether or not to join a labor union.  (Indiana was the 23rd state to pass a “right-to-work” law.  Most states with such laws are in the South, and they have been experiencing economic growth despite the national recession.  As reported by economist Richard Vedder in a 2010 Cato Journal article, data show a 23% higher rate of per capita growth in right-to-work states.  And according to a recent report by the U.S. Census Bureau, between April 1, 2010 and July 1, 2012 a net total of nearly 809,000 Americans moved into one of the then-22 right-to-work states from elsewhere in the U.S. – a continuation of the massive exodus of employees and their families from forced-unionism states that the Census Bureau has documented ever since it began tracking state-to-state domestic migration during the 1990s.)  

    Perhaps the most striking development in labor law at the state level has occurred in my original home state, Michigan.  As I reported in my “Election 2012 Postmortem” (Nov. 10, 2012), Michigan voters on Election Day 2012 rejected two proposed constitutional amendments that were pushed by Big Labor: Proposal 2 would have guaranteed collective bargaining, prohibiting the legislature from passing any new laws that would have restricted unionization, such as a “right-to-work” law; and Proposal 4 would have mandated minimum wages and unionization of home health-care workers.   

    In the wake of these major defeats by Big Labor, Republican Governor Rick Snyder (a moderate Republican businessman) and the Republican-controlled legislature passed a law in mid-December making Michigan the 24th “right-to-work” state.  Again, “right-to-work” laws give workers the right not to join unions and prohibit the coercive collection of dues from those who choose not to join.  The time was right to break a long-standing tacit truce in Michigan politics on union rules: Michigan had the nation’s sixth-highest state jobless rate at 9.1%, and it had one of the lowest rates of personal income growth between 1977 and 2011.  As the editors of the Wall Street Journal put it, “it could be the best thing to happen to Michigan’s economy since the internal combustion engine” (“Worker Liberation in Michigan,” Dec. 11, 2012).  Meanwhile, Indiana now has a record number of businesses choosing to expand or set up in the state (220 companies, including Amazon and Toyota, which will invest $3.6 billion and create some 21,000 new jobs, according to the Indiana Economic Development Corporation).   

    The success of neighboring states in passing right-to-work laws has renewed the political pressure on the Republican-majority legislature (and Republican Governor John Kasich) here in Ohio to follow suit.  The Buckeye Institute for Public Policy Solutions, the state’s free-market think-tank, has been trying to educate Ohio politicians about the economic benefits that a “right-to-work” law would bring to all Ohioans.  (See “Ohio Right-to-Work: How the Economic Freedom of Workers Enhances Prosperity,” a March 2012 study posted on the Buckeye Institute website.)  Not surprisingly, in response, Big Labor has already begun its campaign of misinformation, with TV ads and billboards ludicrously claiming that right-to-work, or workplace freedom, is “Communist”!  

    Finally, there are other ways in which unions limit all Americans’ freedom – and stand in the way of a dynamic, prosperous economy.  Among them are minimum-wage laws, which unions typically support (even though their members, whose labor is priced well above the minimum, do not directly benefit), because they limit competition in the labor market and thus have the economic effect of keeping labor costs inflated (which does benefit union members).  Democrat politicians, eager to do the bidding of their labor-union allies, keep pushing for increases in the minimum wage, at both the state and federal levels; B.O. in his “State of the Union” address earlier this year called for an increase in the federal minimum wage from $7.25 to $9.00 an hour.  That’s a truly asinine idea, as I discussed in “Spring Briefs 2013” (March 23), the section on “Minimum Wage, Maximum Folly,” as well as my earlier essay entitled “Minimum Wage, Maximum Folly” (Oct. 20, 2006).  

    The problem with minimum-wage laws, from the perspective of individual rights, is that they deprive workers whose labor is not worth the minimum price arbitrarily set by government (usually younger, less skilled workers) of their freedom to work – in other words, their freedom to enter into a contract with a willing employer at a price less than the minimum set by government.  Such laws thus abridge a fundamental right: the right to earn a living.  And from a utilitarian, or policy, perspective, such laws hurt most those individuals they are supposed to help:  younger, unskilled workers (who also are often members of minority racial groups, which is why the black free-market economist Walter Williams condemned minimum-wage laws in his classic book, The State Against Blacks (1982)). 

    Fortunately, B.O.’s proposed increase in the federal minimum wage – like so many other initiatives he has proposed – is doomed to fail in the Congress, not just because of Republican opposition.  It’s also doomed to fail, like his other policies, because it defies economic reality.



    n Finally Getting Up to Speed in Ohio 

                Earlier this spring Ohio lawmakers passed a measure to increase speed limits on many miles of interstate highways in the state to 70 mph.  The new speed limits, which will go into effect in July, apply mostly to rural areas; speed limits on interstate highways in urban areas – such as I-270, the beltway around Columbus (the highway I drive most frequently) – will remain at 65 (or even 55 in some urban areas). 

                It’s about time.  Most of Ohio’s neighboring states (including Michigan, Indiana, and West Virginia) already raised speed limits on many interstate highways to 70 mph several years ago.  I can’t even say that “finally Ohio’s entering the 21st century” because the last time speed limits in Ohio were set at 70 mph was 1963.  Then, like nearly all other states, Ohio was coerced by the federal government (which threatened to withhold federal funding for highways) into reducing speed limits to 55 mph during the “energy crisis” of the 1970s (a crisis created by bad federal government policies), supposedly to conserve fuel.  Not until the Reagan era, in the late 1980s, did 65 mph speed limits reappear on Ohio highways.  (That’s not progress but regress:  instead of traffic moving faster and more efficiently, it’s back to where it was in the 1960s!) 

                This modest change in highway speed limits raises some questions.  First, why was Ohio slower to act than its neighboring states?  The answer seems quite clear to me: it’s because Ohio’s not only centrally located (in the U.S. east of the Mississippi); it’s also centered on the political spectrum – a state where both major political parties are boringly “moderate,” in the mushy middle, with little difference between Democrats and Republicans and little desire on the part of politicians in either party to challenge the statist status quo, the “Nanny State.” 

                That unfortunate reality of Ohio politics also suggests the prospects are rather bleak for real reform in other areas.  Ohio’s not likely to be a trend-setter but rather a follower.  Still, on what other issues might there be prospects for reform?   One, as suggested in the previous section, is labor-law reform, particularly the adoption of right-to-work laws, another issue on which Ohio’s neighboring states have taken the lead.  To compete effectively with Michigan and Indiana in attracting businesses to Ohio, the state’s lawmakers may have to act, sooner rather than later. 

                Legal recognition of same-sex marriage is another subject of possible reform, even though Ohio is one of many states that have constitutionally declared the definition of marriage to be limited to a man and a woman.  It’s significant that U.S. Senator Rob Portman, a conservative Republican, now supports same-sex marriage, a “change of heart” (in Portman’s words) after he learned that his son Will is gay.  So far it seems that Portman’s acceptance of same-sex marriage has not damaged him politically – perhaps because, as noted in the “Marriage, American Style” update above, other conservative Republicans have changed their position on the issue.  Ohioans in general may have had a change of heart:  according to a recent (March 2013) poll by The Columbus Dispatch, 54% of the adults surveyed favor a proposed amendment to the Ohio Constitution permitting two consenting adults to marry, regardless of their sex – a significant shift in Ohioans’ sentiments since voters overwhelmingly supported the 2004 ban on same-sex marriage.  FreedomOhio, the group advocating the proposed constitutional amendment, is encouraged and plans to put it on this year’s Nov. 5 ballot. 

                Another important issue where Ohio lags behind other states (including neighboring state – and rival – Michigan) is the liberalization of laws criminalizing marijuana.  As I noted in my “Election 2012 Postmortem” (Nov. 10, 2012), the trend at the state level is definitely in favor of adults’ right to use and possess marijuana (particularly for medical uses).  In the fall 2012 elections, proposals to legalize the recreational use of marijuana were approved by voters in two states (Colorado and Washington) and a proposal to legalize the medical use of marijuana was approved by voters in Massachusetts.  So far, 18 states and the District of Columbia have adopted medical marijuana laws, and two more – Illinois and New Hampshire – are expected to enact them within the next few months.  The group pushing for marijuana reform in the state recently received from the Ohio Ballot Board the go-ahead to begin collecting signatures on petitions for a proposed constitutional amendment.  The amendment (named the Ohio Cannabis Rights Act) would allow people 18 or older with a “debilitating medical condition” to “use, possess, acquire, and produce” marijuana, under the regulation of an Ohio Commission of Cannabis Control; it also would legalize the growing of hemp – declassifying it as a drug (hemp plants are related to cannabis and, although a valuable crop in early American history, have been illegal under the modern “war on drugs”) and allowing it to be grown as a crop with oversight by the Ohio Department of Agriculture.  (How typical of “Nanny State” Ohio that something cannot be legalized without subjecting it to government regulation!)  The measure will likely be on the ballot in fall 2014. 

                There yet may be hope for Ohio, as it lurches forward into the 21st century.



    n Scouting for a Solution 

                In this year’s “Spring Briefs” essay (March 23) – in the section provocatively titled, “Do They Give a Merit Badge for Homophobia?” – I discussed the Boy Scouts of America (BSA) as the organization was considering a change in its policy barring openly “gay,” or homosexual, members.  (It was often described in media reports as a “long-standing policy,” but it had been in place only since 1991.)  At the time it appeared the Scouts were headed toward a compromise – ending the national ban but still permitting individual chartering groups (many of which are religious organizations) to set membership restrictions for themselves.  But when the Boy Scouts took a vote at the organization’s National Annual Meeting on May 23, they adopted a different compromise: ending the ban on gay youth, allowing them to participate in scouting, but retaining the ban on gay adult Scout leaders.  (61% of the more than 1,400 voting members of the Scouts’ national council voted in favor of the proposal, to become effective January 1, 2014.)  In other words, an openly gay boy (presumably) would be welcome to join – or to stay – in the Scouts, but when he turns 18, he’ll have to cease his association with the group; and the gay male and lesbian parents of Boy Scouts won’t be able to help out their sons’ “pack.”  (The national council didn’t vote affirmatively to ban gay adults from participating, but its vote to rescind the ban on membership was limited to kids.) 

    It’s “a half-step forward,” declared the editors of USA Today on May 24.  “Cheers should surely be muted because the Scouts couldn’t bring themselves to take the next obvious step and welcome gay leaders.  But . . . it’s a remarkable shift in a very short time.  Just last summer, Scouting leaders reaffirmed their total ban on gays, a policy they had defended all the way to the Supreme Court – and won, because the First Amendment guarantees private groups the right to choose their members.”   

    As I noted in my “Spring Briefs” essay, the BSA was under tremendous pressure, both from outside and from within (including present and former Scouts, both homosexual and heterosexual), to change its policy – as the attitudes of Americans generally are changing, becoming much more accepting of homosexuality (as the recent trend in favor of same-sex marriage indicates).  Nevertheless, perhaps because of those changing attitudes in the general population, many conservatives are adamantly opposed to a change in membership policy.  About 70% of troops are chartered to faith-based groups, including the Catholic Church, which condemns homosexuality as a sin; many of these religious groups do not consider homosexuality as compatible with the Scouts’ value of being “morally straight” (no pun intended).  Forced to welcome openly gay Scouts and leaders, some groups would end their BSA affiliation, seriously weakening an organization already declining in numbers and influence.  Yet, as the USA Today editors also observed, it’s a positive sign that the Mormon Church, the single largest sponsor of Scout troops, has said the change won’t affect its support for Scouting.  (The Catholic Church said it would need to study the issue, “but at least it didn’t threaten to quit,” as some conservative Christian groups have done.) 

                Progress comes slowly when it requires change in people’s attitudes and beliefs – especially when it comes to breaking down prejudices and bigotry.  (“Homophobia,” the irrational fear and/or hatred of homosexuality, is a real phenomenon, rooted in Americans’ historic culture of repression when it comes to sexuality generally, as I discussed in my essay “In Defense of Sex” (May 16, 2005).)  Still, BSA’s decision to change its membership policy regarding sexual orientation is indeed a positive development: whether a full step or only a half step, it is movement toward greater acceptance not just of homosexuality but of the full complexity of human nature. 

                The next challenge for the Boy Scouts – other than deciding whether to continue the ban on gay leaders – will be the organization’s requirement that members believe in God, which effectively excludes agnostic and atheist kids (and their parents).  Already some commentators are calling for the Scouts to open membership to non-believers (see, for example, Tom Krattenmaker’s column, “Good Boy Scouts Don’t Need God,” USA Today, May 13).   

                Just as homosexual Scouts can be as “morally straight” as heterosexuals, so too can agnostic or atheist Scouts.  As comedian/magician Penn Jillette maintains in his recent book Every Day Is an Atheist Holiday! (Blue Rider Press/Penguin, 2012), “[his 7-year-old] son’s morality does not come from God.”  Like Thomas Jefferson, Jillette believes that morality is grounded in human nature, which teaches us the basic principle, “Don’t cause pain”: 

    “[A]ll normal children understand the moral code at an early age.  Children are cruel, children are violent, children have no patience, children are moody, and children can’t seem to understand that sneaking under Daddy’s desk when he’s lost in thought while writing his book and grabbing his feet is going to scare the living shit out of him.  Maybe they do understand that last one.  Children have a lot of work to do on impulse control, but morality takes its place early on.  There are studies about normal children knowing the moral difference between a teacher saying it’s okay to stand up during circle time and a teacher saying it’s okay to lie and hit.  They know the teacher can’t turn something that is morally wrong into something right by just saying it.  They understand that right and wrong are separate from authority.”


    Other atheists – namely, Objectivists and neo-Objectivists like myself – maintain that although a moral sense isn’t innate in human beings, it is grounded in reason; and that a moral code should recognize certain truths about human nature and be based on rational self-interest, combined with a respect for the equal rights of others.  The bottom line is that one need not believe in a god in order to be moral, or morally upright, as the Scouts profess to be.  Let’s hope that the progress they’re making with regard to sexual orientation might also lead them to understand this key point. 


     | Link to this Entry | Posted Thursday, May 30, 2013.  Copyright © David N. Mayer.

    Thoughts for Summer 2013, and Beyond (Part I) - May 23, 2013



    Thoughts for Summer 2013 – and Beyond


    Part I


    It’s time for another annual tradition:  once again, MayerBlog will be on hiatus, while I continue writing the manuscript of my next book, Freedom’s Constitution: A Contextual Interpretation of the Constitution of the United States.  But this year the hiatus will extend beyond the summer season and continue through the 2013–14 academic year, as I will be on sabbatical leave, so I can (at last) finish writing the book.  In other words, this will be the last essay posted to MayerBlog until Fall 2014 (probably Labor Day weekend in 2014). 

    Before going on hiatus, however, I could not resist the temptation to comment on a number of important issues in public policy and popular culture – issues that are in the news today and are likely to remain in the news throughout the summer and beyond.  Because there are so many topics (over 30!), I’ll be posting this essay in two parts, Part I today (before the Memorial Day holiday weekend) and Part II next week (after the holiday weekend).  If you scroll all the way down to the end of this entry, you’ll also see my summer reading recommendations and my annual preview of summer movies.



    n B.O.’s Fuelish War on Carbon Energy 

                Yet another annual tradition – and the traditional starting point of my “Thoughts for Summer” blog entries – unfortunately, has been the topic of high gasoline prices, a perennial issue because gas prices almost always rise every summer, simply as a result of higher demand.  But the problem has become much worse since B.O. began his occupation of the White House.  The day before B.O.’s first inaugural in January 2009, the average price of a gallon of gasoline was a mere $1.83.  Today, gas prices average at least $3.50 nationally and, according to some analysts, will slowly rise to a “new normal” of $4 or even $5 a gallon.  Thus, gas prices have nearly doubled during B.O.’s presidency – one of the few promises that he actually has kept.  As I wrote in my entry “Spring Briefs 2011” (March 18, 2011): 

                “High gas prices are the direct result of the policies of B.O.’s regime in Washington – indeed, they’re the deliberate policy of the regime, whose energy policy is best described as an anti-energy policy, designed to create an artificial shortage in carbon-based fuels, that is, oil, natural gas, and coal – a veritable war on America’s domestic oil, gas, and coal industry.  It’s a deliberate program of restricting domestic energy to make so-called `green energy’ (which cannot pay for itself without massive government subsidies) more attractive and necessary, all in order to fulfill B.O.’s campaign promise that energy prices would “necessarily skyrocket” on his energy agenda.  . . . [Former] Energy Secretary Steven Chu (a physicist with no experience in the energy industry, who’s also a true believer in radical environmentalists’ global warming, or `climate change,’ theories), before he was appointed energy secretary, had expressed a fondness for high European gas prices as a means of reducing consumption of fossil fuels.  In September 2008, Chu told the Wall Street Journal, `Somehow we have to figure out how to boost the price of gasoline to the levels in Europe’—which at that time averaged about $8 a gallon.”


    And as I added, “Virtually all the decisions made by the B.O. regime not only document its hostility to fossil fuels but have had the effect of raising energy prices.”   

                The “green” policies of the B.O. regime have resulted in recent steep increases in the price of gasoline.  Oil refining capacity continues to be limited – no new oil refinery has opened in the United States since Gerald Ford was president – so that any problems at existing refineries (such as outages or extended maintenance, which recently occurred at several oil refineries that serve the Midwest) will, by limiting supply, drive up gasoline costs.  Moreover, this spring the Environmental Protection Agency proposed strict new “clean fuel” standards that will further increase the cost of fuel.  (The EPA said the so-called Tier 3 rule would cut emissions of smog-forming pollutants as well as toxic emissions like benzene.  What the EPA didn’t say was that levels of these pollutants have been steadily falling for years, and would continue to fall even without the new rule, which the oil industry says will cost tens of billions of dollars (“Pollution Levels Have Plunged, But EPA Plans Costly New Rules,” Investor’s Business Daily, April 22).) 

    Why are B.O. and his regime so hostile to carbon-based energy (the so-called fossil fuels of coal, natural gas, and oil)?  As I discussed in Part III of my “2013: Prospects for Liberty” essay (February 7, in the section on “`Green’ Bullshit”), there are two principal reasons.  First is B.O.’s adherence to the bullshit radical environmentalist theory of “global warming” or “climate change” (discussed in the next section, below).  Second, B.O. deliberately aims to weaken the United States economically, to transfer America’s wealth to countries in the so-called third world, to promote what he regards as global economic “justice.”  That’s the goal of the radical leftist anti-colonialist ideology he inherited from his father, Barack Hussein Obama, Sr., as documented by Dinesh D’Souza, in his two books, The Roots of Obama’s Rage (2010) and Obama’s America: Unmaking the American Dream (2012), and his documentary film, 2016: Obama’s America.  As I observed in my “Tricks and Treats” essay (Oct. 25, 2012), D’Souza’s analysis provides perhaps the best explanation for B.O.’s double standard on energy:  he has waged war on carbon-based energy – using his Interior Department and EPA to impede development of America’s vast natural reserves of oil, natural gas, and coal (just when technological advances such as “fracking” have made exploitation of these reserves economical) – but instead has helped bankroll deep oil drilling in Brazil, Colombia, and Mexico.  As D’Souza explains, the anti-colonial theory “predicts that Obama should want to enrich the previously colonized countries at the expense of previous and current colonizers.  Anti-colonialists insist that since the West grew rich by looting the resources and raw materials of the colonies, it is time for global payback.  The West must have less and the rest must have more.” 

                Whether it’s “green” bullshit or anti-colonialist ideology (another kind of bullshit) that’s behind B.O.’s agenda, it’s clear that during the past four years the B.O. regime has been pushing an anti-carbon agenda and is likely to continue pushing this agenda in the coming years, during B.O.’s second term (when, as a “lame duck” president, he’ll be even less accountable to public opinion than he was during his first term).  That means, among other things: 

    n  The imposition of even more stringent fuel-economy (CAFE) mandates on auto and truck manufacturers – which will force Americans to drive smaller, lighter, less safe, and less convenient vehicles.  (As I noted in my February 7 essay, that’s also part of B.O.’s ultimate goal: to make America more like Europe, with both its sky-high gasoline prices and its tiny cars.  And it’s yet another important example of how the B.O. regime has deprived American consumers of their freedom of choice.) 

    n  Continued restrictions on drilling for oil and natural gas off the shores of the United States.  B.O.’s former Interior Secretary Ken Salazar imposed a moratorium on oil drilling in the Gulf of Mexico, in the wake of BP’s Deepwater Horizon disaster – a moratorium which, as I’ve previously noted, was far more destructive to the Gulf economy than the spill itself.  (Sally Jewell, B.O.’s new Interior Secretary, may be less of a radical environmentalist than Salazar – she was CEO of outdoor retailer Recreational Equipment, Inc. – but may be expected to continue implementing B.O.’s “green” agenda.)  During his tenure at the Department of Interior, Salazar removed major swaths of energy-rich U.S. land from production and exploration, costing the U.S. hundreds of billions of dollars.  As a federal report last year showed, oil output on federal lands fell 11% from 2010 to 2011, while natural gas dropped 6%.  (During the same time, thanks to the energy “revolution” discussed below, oil and natural gas output on private land surged, respectively, 14% and 12%.)  Today, the U.S. government leases less than 2.2% of federal offshore areas for oil and gas development, and just 6% of its lands.  All told, B.O./Salazar energy policies have taken an estimated 1.2 trillion barrels of oil and 21 trillion cubic feet of natural gas off the nation’s table – enough to last the U.S. hundreds of years (“Master of Disaster,” I.B.D., January 18). 

    n  Continued refusal to green-light the Keystone XL pipeline (more on this below). 

    n  Continued anti-carbon regulations implemented by B.O.’s Environmental Protection Agency.  (B.O.’s appointee as EPA director during his first term, Lisa Jackson, was not only a radical environmentalist but also was politically corrupt.  B.O.’s pick for Ms. Jackson’s successor during his second term, Gina McCarthy, is just another radical environmentalist.  She headed the agency’s “clean air” efforts during B.O.’s first term, and as EPA director is expected to push B.O.’s radical anti-carbon agenda (including efforts to reduce carbon dioxide emissions – despite the fact that CO2 isn’t really a pollutant).)  As noted above, the EPA’s strict new “clean fuel” standards will significantly increase the price of gasoline.  Moreover, the EPA has announced new carbon-dioxide emission rules that will force new power plants to install expensive new equipment – equipment that doesn’t even yet exist – to capture emissions and bury them underground.  Because the new EPA rules are so draconian, they effectively ban new coal-fired power plants by making them uneconomical to build.  Although the EPA recently announced it would delay implementation of the rule (after the electrical-power industry objected on legal grounds), when the rule finally is implemented, it will result in higher electricity prices for all Americans, as coal today provides about 40% of electricity generation in the United States (“EPA’s War on Energy Continues,” “EPA Effectively Bans New Coal Plants,” I.B.D., March 28, 2012; “EPA delays tough rule on carbon emissions,” Columbus Dispatch, April 13). 

    n  More “green” initiatives which, rather than producing alternative energy, have produced more political scandals – for example, the case of Solyndra, the bankrupt company that has become the poster child for this regime’s crony capitalism (or, more properly, crony socialism or fascism).  And rather than creating any real American jobs, they have resulted in more “outsourcing” of energy-industry jobs overseas – for example, by the use of stimulus dollars to build electric cars in Finland, or the support of the U.S. Export-Import Bank’s plan to loan Brazil’s state-run oil company, Petrobras, $2 billion to do deep-water oil drilling in the Atlantic. 


    What’s especially perverse about the anti-carbon energy policy of the B.O. regime is that it’s happening at the same time that the United States is about to experience a true revolution in energy, thanks to North America’s abundant natural resources in so-called fossil fuels.  Technological innovation (particularly hydraulic fracturing, or “fracking”) has now made it economically feasible to extract carbon fuels from deposits embedded deep in shale rock.  This revolution in carbon energy production at last is making realistic the dream that the United States could be fully self-reliant, no longer dependent on Middle East countries – a dream of U.S. policymakers since the 1970s.  Indeed, the U.S. is poised to become “the new Saudi Arabia.” 

    As I observed in my February 7 essay, the estimated supply of carbon fuels in North America is truly awesome:  with regard to oil, some 400 billion barrels of crude that could be recovered using existing drilling technologies, plus at least 1.4 trillion barrels of recoverable oil embedded in shale rock – enough to meet all U.S. oil needs for about the next 200 years, without any imports; with regard to natural gas, a 40% increase in U.S. production (which since 2002 has increased from less than 2% to some 34% of global production); and with regard to coal, some 1.7 trillion tons of identified coal resources and 3.1 trillion tons of total coal resources – enough recoverable coal reserves to last 239 years, at present rates of coal use.  With regard to oil output alone, the International Energy Agency recently predicted that the new supplies of North American crude oil will be a true game-changer, revolutionizing the global energy market, “as transformative to the market over the next five years as was the rise of Chinese demand over the last 15 years” (“New U.S. Oil Output Is a Game-Changer for Energy Industry,” I.B.D., May 15). 

                Yet, as I also noted in my February 7 essay, the good news is that B.O.’s anti-energy policies may be the one part of his second-term agenda that’s likely to meet determined congressional opposition, not only from Republicans but also from moderate Democrats from states rich in carbon-fuel resources (such as Senator Joe Manchin of West Virginia or the new senator from North Dakota, Heidi Heitkamp).  Thus, although radical environmentalists are encouraged by the lip-service B.O. paid to “climate change” in his Second Inaugural, some energy leaders are also optimistic that political pressures may force B.O. to adopt an “all of the above” energy policy. 

    The critical test will be B.O.’s decision regarding the Keystone XL pipeline from Canada, a critically important project that his regime thus far has been blocking.  The pipeline would carry nearly a million barrels of oil per day, bringing supplies from the oil sands of Alberta, Canada, as well as U.S. crude from the Bakken oil fields of North Dakota and Montana, some 1,700 miles to Texas refineries, creating thousands of American jobs while also essentially eliminating U.S. dependence on Middle East oil.  Although Nebraska’s Department of Environmental Quality recently gave the revised Keystone XL pipeline route through that state a thumbs up, the U.S. State Department still hasn’t given its approval (which is necessary because the pipeline would cross an international border).  The State Department started its review of the original Keystone XL permit application over four years ago and has since publicly released over 15,500 pages of documents relating to its analysis – showing that the pipeline would have minimal environmental risks and would meet pipeline safety standards.  Moreover, the State Department’s most recent supplemental environmental impact statement found that the project would support over 42,000 annual jobs over its construction period.  Many labor unions, outraged over the B.O. regime’s apparent decision to put “green” politics over jobs, have rallied together to express their support for the project and a desire to move forward.  And many members of Congress, both Democrats and Republicans, are tired of waiting for B.O. to decide which road he will choose – and thus may take matters into their own hands, with a House bill (H.R. 3, the Northern Route Approval Act) that would end the regulatory delays by moving Keystone XL beyond the regulatory process and finally allow construction of the landmark jobs and energy project to begin, as Rep. Marsha Blackburn (R.–Tenn.), vice chairman of the House Energy and Commerce Committee, noted in a recent op-ed (“It’s Time To Act on Keystone,” I.B.D., May 21).     

    As Rep. Blackburn concluded, “The inescapable truth is that with or without the Keystone XL pipeline, Canada’s oil sands will continue to be developed.  The only question is whether this valuable oil supply will be refined in the U.S. or shipped overseas to places like China.”



    n “Green Bullshit” Gradually Dissipates 

    For over 40 years radical environmentalists have been pushing a political agenda based on a series of beliefs that I have called “`green’ bullshit” (see Part III of my “2013: Prospects for Liberty” essay (February 7)): the theory of “global warming” – or “climate change,” as it’s now euphemistically called – in other words, the theory that average global temperatures are increasing; that the Earth’s warming will lead to cataclysmic disasters such as massive flooding of coastal areas as the polar ice cap melts and the world’s oceans rise to dangerously high levels, etc., etc.; and that this dangerous global warming is caused by human activity, namely, by man-made carbon dioxide (a so-called “greenhouse gas”) created by the burning of carbon-based “fossil fuels” such as coal, oil (and other petroleum products), and natural gas. 

    The global-warming thesis is a theory that, despite the propaganda of global-warming alarmists, is far from being scientifically proven.  Indeed, it is a flawed theory that fails to fit the facts; in other words, it’s a theory based on faulty “junk” science – to which radical “green” true believers adhere based mostly on faith, not reason.  To them, it’s a kind of religion that has at its core a fanatical hatred for all things man-made and for the human beings who make them.  Greg Gutfeld, a commentator on Fox News (libertarian panelist on The Five and host of Red Eye), summed it up quite well in his first book, The Bible of Unspeakable Truths (Grand Central Publishing, 2010, p. 27): 

    “Sadly, this antihuman bullshit keeps flowing, unfiltered and unquestioned.  Global warming theology and the journalism that soaks it up avoid focusing on fact, and instead build a story – one that is easily understood by the arrogant, simplistic minds that prefer to think humans (or rather, capitalistic humans) are to blame for everything wrong in the world.  It’s not only dishonest, it’s murderous.  We are now scaring a populace into thinking that the biggest threat to mankind is man’s role in climate change, instead of ideologies that promote death and violence.”


                Over the past few years, it has become even more clear that the “climate change” or “global warming” theory is nothing but a scam – indeed, it’s been called “the greatest scam in history” by John Coleman, meteorologist and the founder of The Weather Channel.  The revelations from what has been called “Climate-gate” – e-mail exchanges and other documents hacked from computers at the Climatic Research Unit at the University of East Anglia in Great Britain – reveal that there has been a conspiracy among some in the science community to spread alarmist views of global warming and to intimidate, if not silence, those who disagree (the skeptics of global warming theory, called “denialists” by the true believers, whom I call “warm-mongers”).  In other words, what Michael Crichton warned about in the fictional story in his novel State of Fear – a global hoax by radical environmentalist terrorists – has practically come to pass.   

                Fortunately, however, more and more Americans are realizing that the radical environmentalists’ theory really is nothing more than a massive scam.  Public opinion polls show that concern about alleged “global warming” or “climate change” has drastically declined – particularly as a new “revolution” in carbon-based energy resources (discussed above) has the promise of re-energizing the United States and the moribund American economy.   

    As I noted in my February 7 essay, earlier this year a group of more than 20 retired NASA scientists and engineers, who call themselves “The Right Climate Stuff,” issued a report that decisively shatters the global warming myth.  After reviewing, studying, and debating the “available data and scientific reports regarding many factors that affect temperature variations of the earth’s surface and atmosphere,” they’ve found (among other things):  “the science that predicts the extent of anthropogenic [man-caused] global warming is not settled science”; “there is no convincing physical evidence of catastrophic anthropogenic global warming”; and “because there is no immediate threat of global warming requiring swift corrective action, we have time to study global climate changes and improve our prediction accuracy.”  They conclude that Washington is “over-reacting” on global warming and suggest that a “wider range of solution options should be studied . . . .” (“Facts About Climate,” I.B.D., January 28). 

                The unraveling of the “global warming” scare continues, with a number of recent important pieces of science news, as the editors of Investor’s Business Daily have noted.  Among them:  Scientific American reports that the Arctic “used to be a lot warmer,” a good 8 degrees Celsius warmer, than today (that was 3.6 million years ago – well before man-made carbon dioxide!); a BBC report that over the coming decades “global average temperatures will warm about 20% more slowly than expected”; predictions of sea-level rise by the UN’s Intergovernmental Panel on Climate Change have fallen to only 7–23 inches (not the 80-inch figure touted by some alarmists, including UN Secretary-General Ban Ki-moon); new evidence from the Australian Broadcasting Corporation confirming a 2010 report that rather than shrinking, many low-lying Pacific islands are actually growing; and a New York Times story reporting that, according to a new study, CO2 concentrations (at record high levels, last seen about 2 million years ago, according to recent media reports) would have to triple or quadruple before any of the global warming catastrophes that the alarmists have predicted would actually occur (“Global Warming Cools Off,” May 21).  (For more myth-bashing, see some of the writings of Patrick J. Michaels, director of the Center for the Study of Science at the Cato Institute, particularly his recent essay, published in Forbes, “The Climate Change Horror Picture Show” (April 18).) 

    I’m old enough to remember the late 1970s – just a few years after the modern radical environmentalist movement began, with “Earth Day” in 1970 – when the news media was warning about global cooling.  (Remember the Newsweek cover story, “The Coming Ice Age”?)  It seems that “climate change” science hasn’t improved much over the past 35-40 years.  The I.B.D. editors suggest that “maybe the alarmists [should] apologize for the decades of public disservice they have provided.”  But I wouldn’t hold my breath.



    n The Disease Known as “ObamaCare” 

    In its shameful decision in National Federation of Independent Business v. Sebelius last summer (at the end of June 2012), the U.S. Supreme Court upheld the constitutionality of the 2010 federal health-care control law, officially (but disingenuously) titled “The Patient Protection and Affordable Care Act” (PPACA) but popularly known as “ObamaCare,” because it is the signature legislative “achievement” of B.O.’s presidency.  (The pivotal opinion in the case, written by Chief Justice John Roberts, ridiculously upheld the individual mandate – the keystone of the massive bureaucratic edifice created by the law – as a “tax,” thus providing the crucial 5th vote for the Court’s overall decision upholding the law, 5–4.  As I discussed in my special blog post last summer, “Supreme Folly 2012: The Supreme Court’s `ObamaCare’ Decision” (July 5, 2012), Roberts thus earned the acronym name that my good friend Rod Evans coined for him: “Fierce Jurist Botches”!)  

    The massive, 2,800-page “ObamaCare” law was passed by Congress in March 2010, at a time when both houses of Congress were controlled by the Democrats.  It passed by a strict party-line vote, without a single Republican (in either house of Congress) voting in favor of it, and also by some parliamentary chicanery (including evasion of the Constitution’s requirement that revenue bills – which the law ought to be considered, especially given the Court’s decision – must originate in the U.S. House, not the Senate, where the bill that ultimately passed Congress and was signed into law by B.O. really had originated).  As I have frequently noted here on MayerBlog, the law was truly “historical,” but not in the positive sense:  it was the most important law ever passed by Congress on such a strictly partisan basis, and it is also (arguably) the worst – the most monstrous – piece of legislation ever passed by Congress.  In Part III of my “2013: Prospects for Liberty” essay (February 7), I called it “the biggest Mongolian clusterfuck of them all,” with an explanation why that slang term is particularly apt in describing the PPACA – not just a horrid law but also, arguably, the worst disease afflicting the American health-care system (a supposed “cure” that’s far worse than the problems it was supposedly designed to solve).  

    Public opinion polls at the time the law was pushed through Congress showed that the majority of Americans were opposed to it; and over the past two and a half years, the more Americans learn about the law, the more unpopular it has become.  (According to a recent Kaiser Family Foundation poll, only 35% of Americans view “ObamaCare” favorably, while 53% support efforts to continue to block and repeal it.)           

    Nevertheless, the Court’s disappointing decision, coupled with the results of the 2012 elections – the election of B.O. to a second term as “occupier-in-chief” of the White House, and the Democrats’ continuing majority in the U.S. Senate – means that it is highly unlikely, politically, that the law will be repealed (as most Republicans have pledged to do) before its full implementation effectively begins in January 2014.  Indeed, as long as B.O. remains in the White House and as long as he adamantly insists on vetoing any repeal legislation, it seems that the American people are stuck with his horrid “ObamaCare” law.  Those of us who are adamantly opposed to the law on constitutional, policy, and philosophical grounds – most importantly, as an abridgement of Americans’ fundamental right to health-care freedom (see my discussion in “Unhealthy, Unconstitutional `Reform’,” April 8, 2010) – must rest our hopes for getting rid of the law on such thin reeds as:  additional court cases challenging the constitutionality of the law on other grounds (which I discussed in Part III of my “2013: Prospects for Liberty” essay (February 7)); efforts on the part of state governments to “nullify” the federal law (which are even more controversial than “ObamaCare” itself) or to block its implementation (particularly with regard to the proposed expansion of Medicaid); and political change, namely big GOP gains in the 2014 Congressional elections – resulting in Republican majorities in both houses of Congress that, with some Democrat support, might be able either to block implementation of the law or to repeal “ObamaCare” (and then override B.O.’s veto of the repeal legislation). 

    The last option isn’t as far-fetched as it sounds, for not only Democrats in Congress but many of their “core” constituencies (such as labor unions) are becoming increasingly worried about the law and its growing unpopularity.  That’s because, even before it’s fully implemented, “ObamaCare” is increasingly showing signs that it will collapse under its own weight.  Indeed, one prominent Democrat who was one of the law’s chief architects (called a “co-author” of the law by some commentators) – Senator Max Baucus – recently called the law “a train wreck.”     

                Every promise B.O. made about the law already has been broken, particularly the two big promises implied by the law’s disingenuous official title: it has not “protected” patients but instead has caused millions of Americans to lose their health insurance coverage; and rather than making health care more “affordable,” it has caused costs to skyrocket.  In February, the Congressional Budget Office said that 7 million people likely will lose their employer coverage thanks to the higher costs imposed on employers by ObamaCare – nearly twice the CBO’s previous estimate.  The House Energy and Commerce Committee recently discovered internal cost estimates from 17 of the nation’s largest insurance companies indicating that “ObamaCare” will increase premiums on average by nearly 100%.  Young adults may face premium increases that will soar as high as 400%.  And despite the law being hailed by B.O. as a deficit reducer, the GAO reports that “ObamaCare” will increase federal deficits by $6.2 trillion over the next several years. 

    “ObamaCare” is also a job-killer – and one of the chief causes of the nation’s continuing high unemployment rates.  Implementing the law’s regulations alone has killed 30,000 jobs, according to a study by the American Action Forum.  A Federal Reserve report listed “ObamaCare” as a leading deterrent for businesses hiring new workers.  (As many economists have noted, that’s the predictable result of the law’s mandates on businesses.  Because they generally apply to businesses with 50 or more full-time employees, with “full-time” defined as working 30 hours or more per week, many businesses simply aren’t expanding beyond 50 employees or are shifting more workers to part-time status.  Franchisees of Taco Bell, Dunkin’ Donuts, and Wendy’s are only some of the businesses that are in the process of reducing their employees’ hours below 30 per week to avoid the costs of covering their health care.  But the law’s negative impact on small-business hiring is even worse.  Because it gives a substantial tax credit to businesses that employ 25 or fewer people, owners of small enterprises have an incentive not to expand their workforce.  Indeed, since the full tax credit is available for those businesses with 10 or fewer workers, there is even less incentive to hire more than 10 employees.) 

                (Among the few jobs that the “ObamaCare” law arguably did help create are those of authors and publishers of books about how Americans might cope with the new law – a veritable cottage-industry that has sprung up in the past year or so, as Americans dread the coming implementation of the law.  One of the best-selling books, the ObamaCare Survival Guide by Nick J. Tate, is quite readable and informative.  As for books critiquing the health care law, I recommend The Truth About ObamaCare by Sally Pipes, president of the free-market Pacific Research Institute.  And Ms. Pipes is also the author of one of the best books about what should be done to repeal ObamaCare and replace it with real free-market-oriented health-care reform, The Pipes Plan, which is available from Amazon in e-book format for the Kindle reader.) 

    One of B.O.’s main selling points for “ObamaCare” was that it would help the most vulnerable people who have been denied insurance coverage because of “pre-existing conditions.”  But the program established under the law to cover them is refusing new applicants and is already out of money, even though less than 1 percent of those with pre-existing conditions have received coverage.  And another part of the law – the CLASS program, which was supposed to provide long-term care through a government-run assisted-living insurance program – already has died (officially suspended by the B.O. regime).  The White House decided in October 2011 to pull the plug on the program, admitting that the plan could not financially sustain itself over the long term through enrollee contributions. 

                It’s doubtful whether “ObamaCare” will be fully implemented by January 1, 2014 – the date the individual mandate officially kicks in – because neither the expansion of Medicaid nor the establishment of “health insurance exchanges,” both of which are mandates that the law imposes on the states, will be in place.   Indeed, as Michael F. Cannon observes in a white paper published by the Cato Institute, 50 Vetoes: How States Can Stop the Obama Health Care Law (Mar. 21, 2013), “By refusing to create Exchanges or expand Medicaid, states can block many of the PPACA’s worst provisions and push Congress to reconsider the entire law.”  

    In its infamous “ObamaCare” decision last summer, the Supreme Court did hold, 7–2 (with Chief Justice Roberts and two of the Court’s “liberal” justices, including B.O. appointee Elena Kagan, joining the four conservative justices), that states cannot be compelled to expand Medicaid, the chief way in which the law aims to cover millions of low-income uninsured Americans.   Realizing that Medicaid already is a failed program, a number of states (most with Republican governors) have decided not to expand their Medicaid coverage.  (Texas Governor Rick Perry – calling Medicaid “a system of inflexible mandates, one-size-fits-all requirements, and wasteful bureaucratic inefficiencies” – said that to expand this program was “not unlike adding a thousand people to the Titanic.”)  Other states with Republican governors (including, unfortunately, Ohio’s John Kasich), unable to resist the temptation of promised federal funding for the first three years, have announced plans to expand Medicaid as dictated by the “ObamaCare” law; however, in some of those states (including Ohio), Republican-majority legislatures may thwart the governor’s proposal by refusing to play along with this unconstitutional power-grab by the national government.   

    As I discussed more fully in my February 7 essay, a health-insurance “exchange,” as defined under the law, is “a mechanism for organizing the health insurance marketplace to help consumers and small businesses shop for coverage.”  Thus described, it appears to be a kind of clearinghouse where consumers without employer-based coverage can obtain supposedly “affordable” health insurance (to comply with the law’s individual mandate); in practice, however, it is the chief means by which the federal government – acting through the broad discretionary power of the Secretary of Health and Human Services to issue directives (“as the Secretary shall decide” is the most repeated phrase in the “ObamaCare” law) – will control the health insurance market.  That’s because the only health insurance policies included in the exchanges are those that comply with the federal (H.H.S.) regulations.  All exchanges are supposed to be operational by January 1, 2014, but the states are not required to create state-run exchanges; they may decide to “opt out” and instead either allow the federal government to create an exchange for their residents or do a “state-federal partnership” exchange.   

    So far 33 states have refused to create exchanges – an astounding number, considering that HHS Secretary Sebelius in early 2012 predicted that only 15 to 30 states might decline to create exchanges.  Creating an exchange is not mandatory (for the Supreme Court has recognized that the Tenth Amendment bars the national government from “commandeering” states in such a manner), and most states have realized that creating their own exchange will neither save them money nor spare them from federal control.  Again, Texas Governor Perry summed it up nicely with his pithy comment, “It is clear there is no such thing as a state exchange.  This is a federally mandated exchange with rules dictated by Washington.”  And Wisconsin Governor Scott Walker noted, “No matter which option is chosen, Wisconsin taxpayers will not have meaningful control over the health care policies and services sold to Wisconsin residents.”  As Sally Pipes observes, “Walker is right.”  The federal H.H.S. department dictates that all policies sold on the exchanges must meet one of four classifications – “platinum,” “gold,” “silver,” or “bronze,” depending on the percentage of health costs a plan covers – but given the caps on deductibles imposed on all plans, even the least expensive “bronze” plans, the mandates prevent insurers from offering low-cost products that may best fit a family’s budget.  And because “ObamaCare doesn’t just set the rules – it also tasks states with enforcing them . . . running a [state] exchange could therefore get pricey” (Pipes, “ObamaCare Exchanges Turning into Disasters,” I.B.D., January 30).   Moreover, some states (including Ohio) – those with state constitutional provisions or statutes guaranteeing their citizens’ health-care freedom – are constitutionally or legally prohibited from implementing an “ObamaCare” exchange.   

    With even the leftist rag The New York Times reporting that creating and operating federal exchanges “will be a herculean task that federal officials never expected to perform,” it is doubtful whether the exchanges will be operational by the stipulated date, January 1, 2014.  And even if exchanges were set up by that date in all fifty states – whether they’re state-created exchanges, federal exchanges, or state-federal partnership exchanges – it’s even more doubtful that they will work as intended, to give Americans more choices as health-insurance consumers and to lower health-care insurance premiums.  That’s partly because, in many states, there are virtual monopolies, with in-state markets dominated by a single insurance company.  (That’s why a much simpler – and far more workable – solution to the problem of rising health-insurance costs would have been for Congress to pass a simple law allowing Americans to buy insurance across state lines – in other words, ending the states’ ability to monopolize their health-insurance markets.  But such an act would be a market-oriented reform anathema to the schemes of B.O. and Democrat “progressives” to have the national government capture control of the health-care industry.) 

    Three recent developments have raised even more alarm bells about “ObamaCare.”  One is the developing scandal that I call “IRS-Gate” (discussed below), the outrageous news that the IRS targeted pro-growth, conservative and Tea Party groups due to their political beliefs.  It’s especially alarming because the IRS is the key agency tasked with enforcing ObamaCare, both the individual mandate (the provision upheld by the Supreme Court) and the employer mandate.  Even more alarming, the current head of the IRS health-care office, Sarah Hall Ingram, was in charge of the tax-exempt division when agents first started improperly scrutinizing conservative groups over their applications for tax-exempt status.  (According to the IRS, Ingram was reassigned to help the agency implement the health-care law in December 2010, about six months before a Treasury inspector general’s report said her subordinate, the director of exempt organizations, learned about the targeting.  “There isn’t any evidence that Sarah Ingram had any inkling of the problems,” said Rep. Sander Levin (D-Mich.), ranking Democrat on the Ways and Means Committee, which oversees the IRS.  But Ms. Ingram was the one responsible – thus leaving the question why she was reassigned, as well as the broader, more troubling question, whether the IRS can be trusted to enforce the law impartially.)       

    A second troubling development, also involving the IRS, has been the agency’s illegal attempt to enforce penalties on employers in states that do not create health-insurance exchanges.  Under the “ObamaCare” law, certain employers are subject to penalties if they fail to provide “minimum value” health benefits (as dictated by the law) for their employees: namely, companies with over 50 employees in states with their own state-run exchanges, if their employees obtain insurance subsidies through a state-created exchange.  Under the letter of the law, employers in states that have refused to create their own exchanges – as noted above, the 33 states that have decided not to create their own state-run exchanges but instead have opted for federal exchanges or state-federal partnerships – are not subject to this penalty (probably an oversight as Congress rushed to write this hastily-drafted, poorly-crafted law, which the then-Speaker of the House, Nancy Pelosi, said Congress would first have to pass before anyone could fully know what’s in it).  But the IRS, in a blatant power-grab, has decided to enforce the provision on all employers, even those in states without state-run exchanges.  (A lawsuit filed on behalf of some such employers, in five states, is currently pending in the federal courts.  One of the plaintiffs, Dr. Charles Willey, CEO of Innovare Health Advocates, a medical practice group in St. Louis, recently wrote an op-ed in Investor’s Business Daily describing this outrageous development, noting that the illegal action by the IRS forces him and other employers and their families “to purchase costly insurance that they cannot afford, do not need, or do not want” (“IRS Illegally Usurps Congress with New Law in ObamaCare,” I.B.D., May 22).) 

    Finally, in yet another scandal about the B.O. regime – one that has not received much media coverage, except on Fox News and in the pages of I.B.D. – as funding for “ObamaCare” implementation has dried up, HHS Secretary Kathleen Sebelius has been soliciting funds from private healthcare companies to implement the law.  It’s yet another politicized move (typical of the “Chicago-style” thuggery of the B.O. regime), essentially a “shake-down.”  

    Let’s hope that some way will be found within the next year or so to block or repeal this horrid law before it’s too late – before it not only deprives Americans of their freedom but also seriously endangers their health (and their lives), destroying the best health care in the world.



    n IRS-Gate: More Than Just a Taxing Situation            

    England’s King Henry II was a great reformer; his reign during the 12th century (1154–1189) marks the formative period of the English “common law” system, which continues to serve as the basis of the law in both England and America.  Yet King Henry is perhaps most famous for his confrontation with his former friend, Thomas Becket, whom he appointed Archbishop of Canterbury, head of the Church in England.  Becket championed the autonomy of the Church, opposing Henry’s (ultimately) successful attempts to put clergymen accused of crimes under the jurisdiction of the royal common-law courts instead of the Church’s own ecclesiastical courts.  As dramatized in the classic 1964 film Becket (co-starring Richard Burton in the title role and Peter O’Toole as Henry II), at one fateful moment the King voices his exasperation with Becket – “Will no one rid me of this meddlesome priest?” – and a group of knights, hearing the King’s angry outburst and interpreting it as a command, proceed to Canterbury Cathedral, where they brutally murder Becket while he is presiding over the early evening (vespers) services.  Becket became a martyr (he was named a saint by the Church and his tomb at Canterbury became a destination for pilgrims for the next 350 years); and King Henry II did penance (submitting himself to the lash, as the Becket filmmakers imagined it), to absolve himself of guilt and to atone for his role in the crime. 

    Why am I telling this story here?  Because B.O. is facing his own Becket moment, and it involves a rapidly-unfolding story that many political commentators believe might cause the downfall of his presidency:  the illegal targeting of, harassment of, and discrimination against conservative and libertarian groups by the Internal Revenue Service (IRS), as it reviewed the organizations’ requests for tax-exempt status – the scandal I call “IRS-Gate.”  Parallels to Richard Nixon’s presidency and the Watergate scandal that was its undoing are obvious.  And, just as with Nixon 40 years ago, the critical question seems to be:  Is the president (the current “occupier” of the White House, as I like to refer to B.O.) responsible for this blatant abuse of power by a federal agency?  (If so, then the related question will be:  Is this an impeachable offense?  I’ll discuss that question more fully in Part II of this essay.) 

    As of the time I’m writing this, it has been less than two weeks since the news story first broke (on Friday, May 10) that the IRS had been targeting certain conservative/libertarian groups (initially reported as just “Tea Party groups”), as a result of the release of an IRS inspector general’s report.  Since then, investigative reporting by some of the news media (most notably by USA Today) and the ongoing investigations by Congress have revealed more of the story.  But investigations are still being conducted, especially after the IRS official at the center of the scandal – Lois Lerner, head of the Tax Exempt Division – recently invoked the Fifth Amendment, refusing to answer questions (after making a brief opening statement denying culpability) in hearings by the House Oversight Committee on May 20.  Here’s a summary of what we know so far.  

    Sometime in the early spring of 2010, still-unidentified IRS workers in Cincinnati, Ohio (the main processing center for tax-exempt organizations) began singling out applications for tax-exempt status from certain groups (initially reported as groups with names containing “tea party,” “patriot” or other buzzwords).  As USA Today recently summarized it, “Applications from political groups warranted extra attention, but such attention should have been scrupulously neutral and non-partisan.  It wasn’t.  After singling out conservative groups, workers sent letters with intrusive questions to some.  Meanwhile, some liberal groups sailed through the process” (“IRS inquiry rests on four key questions,” USA Today editorial, May 21).  According to the newspaper’s breaking news story about the IRS scandal (“IRS gave a pass to liberals,” May 15), a review of IRS data shows that for a 27-month period, from March 2010 through May 2012, the IRS granted no Tea Party group tax-exempt status.  During that same time, however, the IRS approved applications from similar left-liberal groups; indeed, some groups with obviously liberal names (including words like “Progress” or “Progressive”) were approved in as little as nine months.    

    Besides targeting conservative groups for extra scrutiny – refusing or delaying their applications for tax-exempt status – the IRS also harassed such groups, making unusual document requests, asking for massive amounts of information the agency couldn’t possibly need to determine tax-exempt status.  For example, groups were asked to provide:  donor names, blog posts, transcripts of radio interviews, resumes of top officers, board minutes and summaries of materials passed out at meetings.  Some groups were asked about connections to other conservative groups or individuals; for example, the IRS demanded that the Center for Constitutional Law “explain in detail your organization’s involvement with the Tea Party.”  Even worse, IRS employees engaged in selective leaks, providing information gleaned from its special scrutiny of conservative groups to certain left-liberal groups.  ProPublica, a left-liberal-leaning nonprofit journalism organization, has revealed that the IRS had leaked to it nearly a dozen pending applications, including one submitted by Karl Rove’s Crossroads GPS (“Did IRS Try To Swing `12 Election?” I.B.D., May 16).  And law professor John Eastman has maintained, in a recent op-ed in USA Today, that the IRS disclosed to a “gay rights group,” the Human Rights Campaign (HRC, which favors same-sex marriage), the confidential tax returns of the conservative group he’s associated with, the National Organization for Marriage (NOM, which is opposed to same-sex marriage), and that this information was immediately republished by The Huffington Post and other left-liberal news media outlets.  He adds that the HRC and NOM are “the leading national groups on opposing sides in the fight over gay marriage,” and hence that the IRS leak of confidential information – including the identities of major donors – “fed directly into an ongoing political battle” (“IRS committed political sabotage,” May 16).  

    It should be noted that most groups applying for status as tax-exempt organizations under sections 501(c)(3) or 501(c)(4) of the Internal Revenue Code rely heavily on contributions from individuals for their funding, and having status as a tax-exempt organization is critically important for them to receive contributions (because it encourages contributors, who can then claim their donations as deductible expenses from their own taxes).  That’s especially true of “grass roots” organizations, composed of “ordinary” Americans, like most groups involved in the Tea Party movement.  The timing of the IRS “crack-down” on such groups is especially interesting, because these groups were generally credited for being successful in getting Republicans elected in the 2010 Congressional elections, allowing the GOP to regain control of the U.S. House of Representatives.  And they were expected to play an equally important role in the 2012 general elections.  (Although they might jeopardize their tax-exempt status by campaigning for particular candidates, they are entitled to educate voters and to inform them about the candidates – just as left-wing tax-exempt groups are.)  As the I.B.D. editors noted in their May 16 editorial, “Did IRS Try to Swing `12 Election?” during a hotly-contested election, the IRS “managed to put its thumb on the political scale by squelching political activity on the right.”  Some groups report that they curtailed their get-out-the-vote efforts, spending piles of money on legal fees or disbanding altogether in the face of IRS inquisitions.  Thus, the effort to delay or impede certain groups’ tax-exempt status had the effect of killing them, suppressing them, or at least minimizing their impact during the 2010 and 2012 election years. 

    As the editors of Investor’s Business Daily have noted, the IRS targeting of Tea Party groups is something that ought to concern all Americans:   

    “The formation in 2009 of hundreds of Tea Party groups, [all over the U.S.] . . . was a spontaneous reaction by ordinary American citizens to an overspending, oversized government whose overreach called into question its constitutionality.


    “Yet it was these very Tea Parties that were specially targeted by the IRS in its outrageously intrusive efforts to learn their every secret, to leak those secrets to a hostile, slanderous media, and to delay beyond justification their approvals as tax-free institutions in an effort to suppress them – while granting those permissions easily to leftist ones.”


    Noting that such groups were the “civil society” groups, the voluntary associations, that Alexis de Tocqueville lauded in his classic book Democracy in America, as groups that helped mitigate the dangers of majoritarian tyranny, the editors conclude: “The fact that the IRS tried and may have succeeded in destroying some Tea Party and other groups is a sign that the broader agenda wasn’t just suppression of political dissent, but imposition of the kind of tyranny found in places like Cuba, where law-abiding civil society has been replaced by fanatical government-directed mobs.”  That’s what Tocqueville warned about, and that’s why “these IRS transgressions ought to be treated as what they are – a fundamental attack on our democracy” (“An Attack on America’s Civil Society,” May 17).  And as the editors also noted in a previous editorial – referring to the IRS leaks of information from conservative groups to the left-liberal news operation ProPublica – “Democracy is threatened when a presidential campaign co-chair [referring to HRC’s president Joe Solmonese] can use IRS documents to attack a political opponent and a campaign donor gets to write the rules” [noting that heavyweight Obama donor and fundraiser Jeffrey Katzenberg, CEO of DreamWorks Animation, is a key supporter of ProPublica] (“Obama’s Internal Re-Election Service,” May 16). 

    It wasn’t just “Tea Party” groups that were targeted for special attention by the IRS, moreover.  The Washington Post (hardly a conservative paper) reports that of nearly 300 groups selected for special scrutiny, 72 had “tea party” in their title, 13 had “patriot,” and 11 had “9/12.”  According to documents obtained by the Post, IRS exempt organizations chief Lois G. Lerner – who “apologized” for the agency’s actions, before she decided to plead the Fifth before the House Oversight Committee – objected in a June 29, 2011 meeting with IRS staffers when they described giving special attention to groups where “statements in the case file criticize how the country is being run.”  Yet only six months later, on January 15, 2012, the agency decided to look at “political-action type organizations involved in limiting/expanding government, educating on the Constitution and Bill of Rights, social economic reform movement,” according to an appendix in the Inspector General’s report.  In 2010, the pro-Israel organization Z STREET filed a lawsuit against the IRS, claiming it had been told by an IRS agent that because the organization was “connected to Israel,” its application for tax-exempt status would receive additional scrutiny.  Reportedly, an IRS agent told a Z STREET representative that the applications of other Israel-related organizations had been assigned to “a special unit in the D.C. office” to determine whether the organizations’ activities “contradict the administration’s public policies” (“ObamaCare, IRS, and Tax Power,” I.B.D., May 14).  

    And here in Ohio, Maurice Thompson, the executive director of the 1851 Center for Constitutional Law – a non-profit organization that provides legal representation to Ohioans whose constitutional rights have been violated (including but certainly not limited to tea party groups or their members) – has revealed that in processing the 1851 Center’s application for tax-exempt status, which ultimately was granted, the IRS in its May 20, 2010 letter demanded that the Center must “explain in detail your organization’s involvement with the Tea Party.”  (The Center applied for status as an educational and/or civil public policy charity under section 501(c)(3); it is a public interest law firm that litigates civil rights cases without engaging in politics.) (“IRS Targeting of 1851 Center in May of 2010 Demonstrates Broader Corruption,” 1851 Center press release dated May 16). 

    Congressional committees obviously ought to continue their investigations into this especially serious matter.  Eventually (sooner rather than later, I hope), an independent special counsel with subpoena power will be appointed.  It is “the only possible solution,” argues Lawrence Kudlow in a recent op-ed (“Revenuers Saw Tea Party as Political Threat,” I.B.D., May 20).  Kudlow notes that B.O. “has charged Treasury Secretary Jack Lew with straightening this out,” but adds that Jack Lew is “an Obama political operative.”  That’s putting it mildly: as Kudlow himself observed in an op-ed earlier this year (questioning Lew’s credentials to be Treasury secretary), Lew not only was formerly B.O.’s chief of staff but also an intensely partisan operative, far to the left of his predecessor, Tim Geithner – in Kudlow’s words, “a left-liberal Obama spear-carrier, whose very appointment signals a sharp confrontation with the Republican House over key issues such as the debt ceiling, the spending sequester, next year’s budgets and taxes.”  Only an independent special counsel can investigate “any possible connections with senior Treasury officials, connections that could lead to the Oval Office,” Kudlow concludes. 

    As noted above, there are strong parallels between the B.O. regime’s “IRS-Gate” scandal and President Nixon’s Watergate scandal, not only because they both involved alleged White House participation in illegal (or unconstitutional) activities by federal agencies (and, more importantly, abuse of power in the “cover-up” of those activities) – but also because some of the allegations against Nixon in 1974 also specifically involved the IRS.  The first clause in Article II of the articles of impeachment against Nixon approved by the House Judiciary Committee at the end of July 1974 alleged: 

    “He has, acting personally and through his subordinates and agents, endeavored to obtain from the Internal Revenue Service, in violation of the constitutional rights of citizens, confidential information contained in income tax returns for purposes not authorized by law, and to cause, in violation of the constitutional rights of citizens, income tax audits or other income tax investigations to be initiated or conducted in a discriminatory manner.”


    This allegation was followed by other clauses in the second Article of impeachment, accusing Nixon – again either “acting personally” or “through his subordinates and agents” – of misusing other federal agencies, including the FBI, the CIA, and the Department of Justice, to violate individuals’ constitutional rights.  Because Nixon resigned even before the articles of impeachment were voted on by the House of Representatives, his impeachment case was never tried in the Senate.  As far as I know (and I’ve read several books about the Nixon impeachment and the Watergate affair), there has been no credible evidence proving this allegation.  The so-called “White House enemies list” maintained by the Nixon administration – allegedly the source of names of persons to be subjected to harassment by the IRS and other federal agencies – has never been proven to be anything other than a “do-not-invite” list for White House social functions. 

    Thus, in a sense, the potential allegations against B.O. arising from “IRS-Gate” could be far more serious than this impeachment article against Nixon.  From what Americans (including the members of Congress currently holding hearings investigating the IRS abuses) already know, the IRS did abuse its power, collecting information on certain individuals and groups and auditing or otherwise threatening them in an unlawful, discriminatory manner – and presumably did so in a way that not only deprived the groups involved of their constitutional rights but also perhaps affected the results of the 2012 elections. 

    There was no “smoking gun” linking Nixon to the IRS allegations, as there was linking him personally to the allegations contained in Article I of the articles of impeachment – the main allegations, about obstruction of justice and abuse of power in covering up the break-in at the Democratic National Committee offices in the Watergate office building on June 17, 1972.  (For more on Nixon and the Watergate affair, see my essay “The Legacy of Watergate,” June 17, 2005 – posted on the 33rd anniversary of the Watergate break-in.)  But, in approving the articles of impeachment against Nixon, the House Judiciary Committee adhered to a broad standard of what constitutes impeachable offenses – a standard that held the president responsible for acts done “through his subordinates and agents.”  That standard is reasonable, given the office of the presidency as created by the Constitution, what some scholars call the “unitary” Chief Executive.  It also comports with Americans’ popular understanding of the office – “the buck stops here,” as the sign on President Harry Truman’s desk in the Oval Office famously read.  (Ironically, this broad standard of impeachment was also urged by a young Democratic staffer of the House Judiciary Committee, a recent graduate of Yale Law School named Hillary Rodham Clinton!) 

    Based on the precedents set by the Nixon near-impeachment, B.O. certainly could be held accountable for the illegal acts committed by the IRS during his watch – even if there’s no evidence of his personal involvement.  Morally and politically, if not legally, he ought to be held responsible for the actions of the IRS because of the way he has conducted his administration – which I have called a “regime” here on MayerBlog because it has been so lawless.  (Indeed, I have discussed why B.O. is “the most lawless president in U.S. history,” in both my “Rating the U.S. Presidents – 2012” (Feb. 22, 2012) and my more recent essay, “The Unconstitutional Presidency” (February 21).)  Besides bringing his “Chicago style” of gangster government to Washington, D.C., and thus generally encouraging a spirit of lawlessness (of disrespect for the rule of law) throughout the federal government, B.O. himself has taken the lead in demonizing conservatives and libertarians – particularly groups involved in the Tea Party movement – and has encouraged his fellow Democrats as well as his lapdogs in the news media to demonize them, precisely because of their opposition to his policies and their concern about the direction in which the United States is headed.  

    Given the highly-charged partisan atmosphere that B.O. has cultivated in Washington, D.C., is it at all surprising that someone at the IRS – whether they were “minions” or top-level administrators – took B.O.’s angry outbursts about Tea Party groups, Republicans, conservatives, libertarians, etc., as if they were an order to target them for special treatment?  That’s what I meant above, when I referred to “IRS-Gate” as possibly B.O.’s “Becket moment.”  Just as Henry II encouraged the murder of Thomas Becket, B.O. has encouraged the violation of certain Americans’ rights.  

    But there may even be a “smoking gun” – evidence of B.O.’s personal involvement in the abuse of IRS powers.  According to a special report by Jeffrey Lord just published in the American Spectator, evidence of that smoking gun might be found in the White House visitors’ logs, which show that on March 31, 2010 someone named Colleen Kelley – who happens to be president of the National Treasury Employees Union, an anti-Tea Party group that includes the unionized employees of the IRS – had an appointment with “POTUS,” that is, with B.O., the President of the United States.  The very next day after her White House meeting with the president, according to the Treasury Department’s Inspector General’s Report, IRS employees – the very same employees who belong to the NTEU – set to work in earnest targeting the Tea Party and conservative groups around the U.S. (“Obama and the IRS: The Smoking Gun?”, May 20).



    n All the News That’s Fit To Control – Or To Quash            

    “IRS-Gate” is one of three major scandals – called a “trifecta” of scandals by conservative political columnist George Will – that currently are dogging B.O. and his regime.  The other two concern Benghazi (the militant Islamist attack on the U.S. consulate in Benghazi, Libya last September 11, and the B.O. regime’s subsequent cover-up, which I will discuss in Part II of this essay) and the seizure of Associated Press phone records by the Justice Department.  Perhaps we could call this third scandal “AP-Gate”? 

    Like the IRS scandal, “AP-Gate” involves serious allegations of abuse of power, this time by the B.O. regime’s Department of Justice (DOJ, which maybe should be called the “Injustice Department”), headed by Attorney General Eric Holder.  On Monday, May 13, just a few days after the IRS scandal broke (which, as noted above, was on Friday, May 10), the Associated Press learned that the Justice Department had conducted a sweeping seizure of its phone records, spanning a two-month period.  The subpoenas under which the DOJ seized the phone records were, as USA Today describes them in an editorial, “so broad they covered outgoing calls from the AP’s general numbers in New York, Washington, and Hartford, Conn., and for reporters who cover Congress.  The intrusion didn’t stop there.  Records of office, home and cellphones of some individual reporters were also seized as part of an investigation into leaks about a failed al-Qaeda plot last year” (“Quest to plug leaks yields torrent of abuse,” May 15).   

    The Justice Department insists that the subpoenas fell within lawful guidelines; and in recent testimony before a House panel, Holder defended the action as a necessary response to a grave national-security leak (“This is among the top two or three [most] serious leaks that I’ve ever seen,” said Holder.  “It put the American people at risk,” he asserted, without elaboration.)  But AP CEO Gary Pruitt said “there can be no possible justification” for the action, considering its wide-ranging scope.  And House Judiciary Committee chairman Rep. Bob Goodlatte (R-Va.) said the Justice Department requests “appeared to be very broad and intersect important First Amendment freedoms.”  He added, “Any abridgment of the First Amendment right to the freedom of the press is very concerning.” 

    It’s ironic that the AP was the target of the DOJ, considering that the AP generally has a left-liberal bias, so much so that it may be considered a lapdog for the B.O. regime.  (I jokingly say “AP” stands for “administration propaganda.”)  But this time, when its own ox is being gored – when its own news reporters are being investigated – the AP suddenly becomes aware of potential abuse of power by the B.O. regime, and the violation of its staffers’ constitutional rights.  Little wonder that, of the recent “trifecta” of major scandals, it’s this one that has received the most attention from the news media.  As the editors of Investor’s Business Daily recently noted, “The AP flap has drawn a properly outraged response from the news agency, because the White House’s obsessive efforts to find leaks cast such a broad, indiscriminate net against reporters just doing their jobs.”

    And just as the IRS scandal has been shown to involve more than just “Tea Party” groups, the Justice Department’s abuse of its investigative powers apparently involves more media persons than AP reporters.  In an even more disturbing news story, it has been revealed that the Justice Department investigated Fox News reporter James Rosen and two other newsmen, and even threatened to treat Rosen as an unindicted “co-conspirator,” in another matter involving an alleged leak of national security information.  As the I.B.D. editors note, “it was a tiny leak of a matter – North Korea’s plan to launch more nuclear tests if United Nations sanctions went through in 2009 – that would have changed nothing, whether reported or not.”  Yet it somehow merited a case against Rosen, “for the kind of reporting that Obama’s allies in the New York Times and Washington Post conduct all the time.”  “And yes,” they add, “we think the White House did it because Fox News is a dissident news organization” – to which one might add, a news organization that B.O. and the Democrats have routinely demonized.  “For a while, it looked like the White House wanted just to control `the narrative.’  But its seizure of AP phone records and surveillance of Fox employees now show its real aim: to control the news” (“The Obama Objective: To Control the News,” May 21).   

    One of the questions yet to be answered in this scandal is whether the B.O. White House and the Holder Injustice Department will stonewall Congressional efforts to investigate it, by claiming not only “national security” considerations but also “executive privilege,” as they have done with efforts to investigate the Department’s “Fast and Furious” project, involving illegal gun-running in Mexico.



    Francis, the Socialist Pope?

                In “Too Pooped To Pope,” the section on the retirement of Pope Benedict XVI in this year’s “Spring Briefs” (March 23), I briefly discussed the new pope, the former Cardinal Jorge Mario Bergoglio, archbishop of Buenos Aires, Argentina, who has taken the name of Francis (after Saint Francis of Assisi).  As the first pope to come from the Americas, as well as the first Jesuit pope, Francis is “historic.”  Both his Jesuit background and his experience in Argentina will shape his character as pope – for good or ill.  In my March entry, I noted some reasons to be both cautious and optimistic about Francis: 

    Wisely, the new pope has fought against the Marxist “liberation theology” movement that swept Latin America in the 1980s; he also has been an outspoken critic of Argentine President Cristina Fernandez’s leftist government, using his homilies to criticize economic and social conditions as well as political corruption.  Some commentators – such as the editors of Investor’s Business Daily – have idealistically predicted that Pope Francis might be an opponent of leftist dictatorships in South America (including Venezuela, Ecuador, and Bolivia) the same way Pope John Paul II opposed communist dictatorships in eastern Europe (“Pope and Change in the Vatican,” March 15).  Nevertheless, Francis sees himself as a champion of the poor and outcast; and he shares with leftist “liberation” theologians the same perverted notion of “social justice” that is rooted in a moral philosophy of altruism, or self-sacrifice, which, along with the Church’s adamant stance against contraception, is responsible for most of the poverty in the “Third World.”  Like his predecessors, Pope Francis is unlikely to embrace the only true hope for happiness and prosperity in the world:  free-market capitalism.


                It seems that the I.B.D. editors were overly optimistic, however – and I was rather prescient.  The editors recently reported that Francis has spoken about “the tyranny of money” and called for countries to impose more control over their economies to prevent “absolute autonomy” (in other words, individual freedom and responsibility) and foster the “common good” (in other words, government tyranny).  As the editors noted, it appears that Pope Francis “has been infected by the local economic pathologies of his homeland, Argentina, and its liberation theology among the Jesuits.” 

                The editors added that the pope’s policy prescription “has already been tried in Argentina,” where it “has driven millions of Argentines into poverty.”  It’s also been tried in socialist Venezuela (under the dictator Hugo Chavez, who is still dead), “where expanded control over the economy has brought capital controls, and predictable-as-nightfall shortages, the latest of which is of toilet paper, joining corn, flour, sugar, salt and every foodstuff the government has a hand in.”  Reminding readers that the pope has professed his goal of ensuring “dignity” for the poor, the editors ask: “Is it dignified for the poor to live without toilet paper as a result of government control?”  By contrast, “everywhere free enterprise is practiced, prosperity follows, reaching even to the poorest people.”  A better model is provided by Argentina’s successful neighbor, Chile, which in 1974 adopted “the most far-reaching series of free-market reforms in history, driving a once-poor, socialist-ravaged country to the ranks of the first world within 30 years”: 

    “Trade was freed, bureaucracy cut, its currency stabilized, its debts paid, and pensions were privatized, turning workers into capitalists and creating a vast pool of capital to develop the country.  As a result, Chile has Latin America’s highest educational level, its highest rate of charity-giving, its lowest infant mortality, lowest corruption, and the highest economic freedom on the continent.”


    “The hard fact here is that free markets free people to do and be their best.”  If Pope Francis really wants to help the poor – to make them less poor – “he should take a second look at what capitalist societies have achieved” (“Toilet Paper Lesson,” May 20).



    The “Most Dangerous Man in America” 

                Cass Sunstein, former University of Chicago and Harvard law professor, was called “the most dangerous man in America” by political commentator Glenn Beck, when Sunstein served as the B.O. regime’s “regulatory czar” (administrator of the Office of Information and Regulatory Affairs).  Beck had a valid point, for Sunstein epitomizes the elitist “philosopher-king” ideology of the so-called “Progressive” movement.  Now that Sunstein has returned to academia, however, Beck has a new candidate for the title of “most dangerous man in America.”   He’s another “progressive” reformer who poses a major threat to Americans’ liberties:  Michael Bloomberg, the billionaire who is mayor of New York City. 

    I’ve previously written about “Big Brother” Bloomberg, as I call him because he can’t seem to stop imitating “Big Brother,” the paternalistic dictator of George Orwell’s dystopian novel 1984 (see “Summer 2012 in Review” (Sept. 12, 2012) and this year’s “Spring Briefs” (March 23)).  Bloomberg’s efforts to control the lives of New Yorkers have included a ban on smoking in public places, a ban on trans-fats in restaurant food, a ban on sales of soft drinks in cups or containers larger than 16 ounces, and restrictions on hospitals’ distribution of infant formula to new mothers.  Not only have these measures deprived New Yorkers of their freedom, but they have inspired other municipal governments across the United States to imitate Bloomberg’s paternalism, by enacting similar “nanny state” measures of their own.   

    More recently, Bloomberg has been one of the most vocal anti-gun fascists in the United States.  Since the Newtown, Connecticut school massacre, he has demagogued the issue of “gun violence,” and in his role as leader of the organization “Mayors Against Illegal Guns” he has been pushing for various measures – including a new federal ban on so-called “assault weapons” and restrictions on ammunition – increasing government control over firearms.  (Thus, besides being an enemy of individual freedom and responsibility generally, Bloomberg is also an enemy of Americans’ Second Amendment rights.)   

    He has spent a small fortune – about $12 million, by some estimates, a small fraction of his personal fortune – to air across the USA a series of TV ads promoting gun control.  (One of these ads, ironically titled “Responsibility,” features a self-proclaimed “gun owner” – or an actor hired to portray a Midwestern gun owner, dressed up like a “bubba,” consistent with left-liberal elitists’ stereotype of rednecks who “cling to their guns” – who broke the NRA’s first two rules of gun safety: (1) ALWAYS keep the gun pointed in a safe direction, and (2) ALWAYS keep your finger off the trigger until ready to shoot.  As one blogger rhetorically asked, “Is the actor in the video simply an irresponsible gun owner or does he just play one on TV?” (“Michael Bloomberg’s Anti-Gun Ad Flops,” March 27).)  

    In “Spring Briefs” I called Bloomberg a “megalomaniac,” but perhaps a better term is “petty tyrant” – a petty tyrant who has been abusing his power as mayor to force his own half-baked ideas about personal health on the people unfortunate enough to live in his city.  He’s abusing the legitimate “police power” that state and local governments possess (which concerns only public health – preventing epidemics of infectious disease and the like – not the private health of individuals).  Bloomberg’s disdain for individual freedom and responsibility extends even to the Constitution that is designed to protect individual rights – the Constitution that he, as a government official, took an oath to defend.  After the Boston Marathon bombing, Bloomberg pompously declared, “We live in a complex world where you’re going to have to have a level of security greater” than we’ve had.  “Our laws and our interpretation of the Constitution, I think will have to change.” 

    By himself, Bloomberg isn’t “the most dangerous man in America,” after all.  But he does epitomize a type of politician who truly has been most dangerous, since the so-called “Progressive” movement of the early 20th century:  a so-called “progressive” reformer, who is really a reactionary – harkening back to a centuries-old paternalistic tradition in government (as I discuss in my essay “Reactionary `Progressives’,” Mar. 16, 2006).  Merely corrupt politicians aim at enriching themselves at the public expense.  But “progressive” reformers are elitists who aim at controlling other people’s lives.  (Politicians hungry for power are always more dangerous than politicians merely hungry for material wealth.)   

    Thankfully, “Big Brother” Bloomberg’s tenure as NYC mayor will soon end because he is term-limited (further evidence of the value of term limits!).  Among those who have announced their candidacy to succeed him is former Congressman Anthony Weiner, who resigned from Congress in 2011 after a scandal involving his over-exposure on social media (more precisely, after tweeting “lewd” photos of his crotch, both covered and uncovered).  How appropriate, to have one Weiner succeed another!



    A Lesson from Neal Boortz 

                I’ve been reading Neal Boortz’s autobiographical book, Maybe I Should Just Shut Up, published earlier this year in e-book format, to mark Boortz’s retirement from talk radio.  (Boortz retired on January 18, after 42 years on the radio.  Thanks to Justin Gruver for this splendid video clip marking the occasion.)  The book is chock full of entertaining and informative stories about Boortz’s long career.  Among them is the chapter on “The Public’s Airwaves,” a devastating critique of the myth that underlies government regulation of radio and TV broadcasting.  

    There’s also the following story – a kind of parable, or a thought experiment – from the chapter “Why Liberals Suck at Talk Radio.”  Boortz writes: 

    “Here’s the simple, basic, easy-to-understand scenario I love to present to liberals, just to see if they can successfully defend a basic tenet of liberalism – government-enforced charity.


    “I’ve gone through this routine countless times on the air with liberal callers, and not one of them – not one – has ever successfully withstood the challenge.


    “I’ll be the compassionate, caring, and loving liberal for this scenario, and you can be the selfish, stingy, greedy conservative.  As we walk along a downtown street, discussing the efficacy of a degree in gender studies in an increasingly technological society, we encounter a panhandler.


    “This poor guy doesn’t have any arms or legs, yet he’s playing 26 musical instruments simultaneously.  There’s a bucket in front of him with a few dollars in it.  Behind the bucket is a sign reading, `Tips please.  Thank you.  Stump the Band.’


    “Being the compassionate liberal, I pull out a 5 and put it in Stump’s bucket.  Then I turn to you and tell you to put $5 in the bucket, too.  You say that you just don’t have the money to spare.  There are bills to pay and mouths to feed.


    “OK, here’s the question:  At that point, would it be OK if I were to pull a gun out of my pocket, put that gun to your head and tell you that you don’t have a choice?  Can I simply use deadly force to make you put a 5-dollar bill in that bucket?  I doubt that you’re going to say, `Sure!  Go ahead.  I don’t mind!’


    “Your likely answer here is that I have no right to force you to give.


    “As I said, I’ve never had a liberal tell me that it would be OK for me to pull a gun out of my pocket and force them to give to the panhandler.  Yet, this is what they do, using the police power of government to seize and transfer (or redistribute, to use Obama’s favorite word) wealth, day after day.


    “OK, fine.  Here’s my follow-up question:  If government derives its power from the consent of the governed, and if I, part of the governed, can’t use deadly force to make you give money to someone else, how then can I ask the government to do that for me?”


    It’s a splendid parable that gets right to the heart of the perverse morality of the “welfare state,” of paternalistic government.  Of course, when Boortz uses the term liberal, he means left-liberals – not “liberals” in the classical sense of the term, synonymous with “libertarians” (like Boortz himself – or me).   

    What distinguishes libertarians from non-libertarians (whether they’re leftist “liberals” or conservatives) is that libertarians understand that government is force – potentially lethal force – and that whenever someone says, “There ought to be a law” to do X, they’re really advocating the use of force, or threat of force, to compel someone to do something (or to prohibit them from doing something) against their will.  That’s why libertarians support minimizing governmental power:  they want to reduce to a minimum the use of force, the coercive power of the law, to control people’s lives and instead want to maximize individual freedom and responsibility, allowing individuals to voluntarily enter into contracts with others, for their mutual benefit.  In other words, to replace coercion with cooperation. 

    It’s sad that Boortz is retiring at just the time when America needs most an entertaining, informative, and principled libertarian commentator on talk radio.  Rush Limbaugh, Sean Hannity, and Glenn Beck – each of whom is more conservative than libertarian – are not the consistent advocates of individual liberty, in all its aspects, that Neal Boortz was.  (Even Boortz had a few blind spots – issues where he was more conservative than libertarian:  his advocacy of government control over immigration, for example.)  Let’s hope that someday soon the nation’s airwaves will have a worthy successor to Boortz – someone who, like him, will rise from obscurity to national prominence, if only given the opportunity.



    The End of an Era

               “Tonight, tonight,

                Won’t be just any night,

    Tonight there will be no morning star . . . .”


            -- “Tonight,” lyrics by Stephen Sondheim, from West Side Story (1957) 


                In early April, NBC announced that by spring 2014, Jay Leno will be retiring as host of The Tonight Show, to be replaced by Jimmy Fallon, currently host of NBC’s Late Night.  It’s a “generational shift,” reported USA Today, noting that NBC’s iconic Tonight Show has had only five hosts thus far in its 59-year history:  Steve Allen, Jack Paar, Johnny Carson, Jay Leno, and (briefly, before Leno’s return) Conan O’Brien (“`Tonight Show’ revs up for a new generation,” April 4).   

                Robert Bianco, USA Today’s TV critic, praised NBC’s announcement, calling the shift from Leno to Fallon “a safe, easy, logical transition.”  He notes that NBC acted appropriately in making “public and official what it had clumsily allowed to become public and unofficial, via rumor mill, in March.”  This time, there were “no on-air battles à la Leno and Conan O’Brien” (referring to NBC’s embarrassing experiment in 2009-10, when the network had given the Tonight Show briefly to O’Brien, only to pull the plug in less than a year when Leno’s ill-fated prime-time show tanked and O’Brien’s ratings faltered).  Nor were there protracted media debates – “late-night wars” – over who should get the job “à la Leno and David Letterman” (which had prompted Letterman’s move from NBC to CBS).  Still, Bianco adds, when the transition is done in 2014, “The Tonight Show that once was – that dominant late-night program that launched stars and set the cultural conversation – will finally be gone for good” (“`Tonight’ takes the safe path,” April 4). 

                Truth is, The Tonight Show started to go downhill when Johnny Carson retired in 1992, after hosting the show for a record 30 years.  Carson truly was the “king of late night.”  He was a masterful comedian (with especially good comedic timing) and yet also could be a serious interviewer (really serious – with such guests as philosopher/novelist Ayn Rand).   

                Nevertheless, it’s not just luck that explains Jay Leno’s 22 years as Tonight host (1992 – 2009, 2010 – present).  Leno is a funny, likeable guy.  Although not as skillful as Carson, he’s a fairly good interviewer and has some regular segments on the show that are hilarious.  (I especially like his “Headlines” segments, but also enjoy his “Jaywalking” quizzes and the occasional “Battle of the Jaywalk All-Stars,” even though it can sometimes be depressing to see how dumb many “average” Americans are.)  And Leno is nonpartisan in his political humor (as Carson was), poking fun at politicians from both major political parties, and showing a great deal of common sense in his monologues, as he makes humorous observations about current events in politics as well as pop culture. 

                Leno is, above all, a good sport.  He’s been treated abominably by NBC, even though he continues to hold the lead in ratings for the 11:35 p.m. time slot (regularly beating David Letterman on CBS and even his new competition on ABC, Jimmy Kimmel).  Leno even helped smooth the way for Fallon’s succession, appearing with Fallon in a video duet parodying the song “Tonight” from West Side Story.  He even graciously bowed out in an official announcement on his show.  “Congratulations, Jimmy,” he said.  “I hope you’re as lucky as me and hold on to the job until you’re the old guy.” 

                Jimmy Fallon is not a worthy successor – certainly not to the great Tonight Show hosts from the past (Allen, Paar, and Carson), nor even to Leno.  He’s relatively young (38), with a “boyish enthusiasm and natural skills as a performer,” according to NBC publicists.  But Fallon’s sense of humor strikes me as smarmy, even smug – reminiscent of his stint as the fake newsman on Saturday Night Live.  Worse, he has an overtly leftist political bias; he has used his late-night show not only to regularly poke fun at Republicans but also as a campaign platform for B.O. during the 2012 election season.  (Remember when B.O., the “Preezy” of the United “Steezy,” slow-jammed on Fallon’s late-night show?)  When Fallon takes over in 2014, the Tonight Show will return from L.A. to New York City, in a shiny new studio in Rockefeller Center, where it will be produced by SNL’s Lorne Michaels.  It will become, essentially, a weak week-night version of SNL, followed by a Late Night hosted by Jimmy Fallon’s successor, another SNL alum, Seth Meyers.  (Seth is the less-talented older brother of Josh Meyers, one of the cast members of MADtv and, briefly, That ’70s Show.) 

                Until Leno’s departure, I’ll still try to watch Jay Leno’s “Headlines” segments on Monday nights (usually).  On other nights when I’m still awake after 11:30, I’ll be more frequently watching the Jimmy Kimmel show on ABC – a show that’s far more entertaining than either The Tonight Show or the mean-spirited David Letterman show.  That Jimmy, like Leno, is non-partisan in his humor, poking fun at Democrats as well as Republicans.  (In other words, he’s an “equal opportunity” offender when he jokes about politics.  Not surprising, as Kimmel started his TV career co-hosting with Adam Carolla the hilariously politically-incorrect Man Show and then moved on to hosting Win Ben Stein’s Money.)  Indeed, I usually try to stay awake on Thursday nights (or occasionally on Friday nights), to watch my favorite segment on Jimmy Kimmel’s show:  “This Week in Unnecessary Censorship” – his truly hilarious “tribute” to the f*c*ing FCC!  (See my essay “Abolish the F*CCing FCC,” Feb. 8, 2006.)



    (Recommended) Summer Reading  

                As I noted last year, the “lazy, hazy, crazy days” of summer are perfect times for reading my favorite genre of pleasure-reading books:  historical murder mysteries.  I’ve recently updated my “Guide to Historical Mysteries,” to include the latest novels by some of my favorite authors.  Among these are two new mysteries set in ancient Rome – Robert Colton’s Pompeii and Alan Scribner’s Mars the Avenger – as well as a second mystery by Bruce MacBain involving Pliny the Younger (The Bull Slayer) and another book in Ruth Downie’s “Medicus” series (Semper Fidelis), concerning a doctor with the Twentieth legion stationed in Roman-occupied Britain.  Lindsey Davis, author of the splendid Marcus Didius Falco series, has a new ancient Roman novel coming out in mid-June, The Ides of April, with a female protagonist, Flavia Albia, Falco’s adopted daughter.  (It’s the first book in a promising new series, described by one reviewer as “Falco: The Next Generation.”)  Gary Corby’s Sacred Games is the third book in his critically-acclaimed series set in classical Athens.  There are also two new mystery novels set in Victorian England:  Gyles Brandreth’s Oscar Wilde and the Murders at Reading Gaol (the newest title in Brandreth’s Oscar Wilde mystery series), and (coming out in mid-July) Peril in the Royal Train, the tenth title in Edward Marston’s “Railway Detective” series. 

                Another great series of books perfect for summer reading is Patrick O’Brian’s Aubrey/Maturin novels.  Set mostly aboard ships of the British Royal Navy during the era of the Napoleonic wars, the 20-volume series realistically depicts the naval warfare of the period, along with other adventures in exotic locales all over the world.  But, most of all, it focuses on the compelling story of the friendship between Captain Jack Aubrey and Stephen Maturin, ship’s surgeon and also spy for the British government.  Indeed, it was the story of the Aubrey/Maturin friendship that was the centerpiece of the splendid 2003 film based on the first and tenth novels in the series, Master and Commander: The Far Side of the World, starring Russell Crowe and Paul Bettany, respectively, as Aubrey and Maturin.  The movie got me interested in the first book in the series; reading that book, along with the next several books, got me hooked on the series.     

                And with the planned Fall 2014 release of the movie Atlas Shrugged Part 3 (the third and final installment), this summer is also the perfect time for people who haven’t yet read Ayn Rand’s magnificent novel, Atlas Shrugged  (especially people who saw Parts 1 and 2 of the movie and were intrigued about the book on which they were based) to experience it for the first time – and for people who have read the novel to read it again.  Rand’s philosophic novel, although written over 50 years ago, brilliantly discusses the critical issue of our time:  freedom vs. statism, or individualism vs. collectivism.  Rand herself acknowledged the book’s relevance to then-current cultural, economic, political, and legal issues in her 1964 talk “Is Atlas Shrugging?” (subsequently printed as an essay in her book Capitalism: The Unknown Ideal), and a video produced by LibertyPen and posted on YouTube, narrating excerpts from Rand’s essay juxtaposed with stories from today’s news headlines, underscores the fact that the book is even more relevant (or prescient) today.  

                As for non-fiction books (in biography, history, politics, and public policy), there are several that I recommend.   

    Judge Andrew P. Napolitano’s recent book, Theodore and Woodrow: How Two American Presidents Destroyed Constitutional Freedom (Thomas Nelson, 2012) discusses the destructive impact, on American government and constitutional law, of the two so-called “progressive” U.S. presidents of the early 20th century, Theodore Roosevelt and Woodrow Wilson.  TR and Wilson engineered the greatest shift in power in American history, from individuals and the states to the national government.  Among their disastrous legacies are the “progressive” income tax, the Federal Reserve, presidentially-initiated wars and other foreign military interventions, a bloated federal bureaucracy including administrative agencies that dictate virtually all aspects of Americans’ lives, and a national government that constantly encroaches on business, coddling special interests and discouraging true marketplace competition.  The book also contains an interesting postscript, added by Judge Napolitano as the book was going to press, dealing with Chief Justice John Roberts and his disappointing opinion on “ObamaCare.”  (Napolitano compares Roberts with his overrated early-19th-century predecessor, John Marshall, who also was a political hack who expanded the scope of federal power.) 

    John Allison, president and CEO of the libertarian Cato Institute (and retired chairman and CEO of BB&T), draws upon his considerable experience as a banker/businessman and his knowledge of (and appreciation for) free-market capitalism in his book The Financial Crisis and the Free Market Cure (McGraw-Hill, 2013), discussed in the section on “Déjà Vu,” in Part II of this essay.  He shows how bad government regulatory policies caused America’s financial “meltdown” in 2008, the Great Recession and the subsequent weak “recovery.”  More than an astute diagnosis, however, the book also explains what is needed to “cure” these problems – not more misguided government interventions (which have only made matters worse) but, rather, a true free market in financial services based on sound philosophical principles (the principles that once made America great). 

    Two other new books on public policy expose the disastrous policies of B.O., the current “occupier” of the White House, as he begins his second term in power.  First, David Harsanyi, the libertarian-conservative columnist (and author of Nanny State), in his book Obama’s Four Horsemen (Regnery, 2013) discusses (as the subtitle of the book says) “the disasters unleashed by Obama’s reelection” – the four great catastrophes that threaten to destroy the United States of America we know and love:  debt (federal deficits and a national debt that has hemorrhaged out of control), dependency (a paternalistic federal bureaucracy that controls our lives), surrender (a foreign policy that weakens America’s ability to defend itself and its allies), and death (modern Democrats’ promotion of abortion as a positive good to be subsidized at taxpayer expense).  And second, economist John Lott (author of the best-selling books More Guns, Less Crime and Freedomnomics), in his new book At the Brink (Regnery, 2013) provides a stunning assessment of what B.O. has done thus far – and what he intends to do in his second term – as he continues his “transformation” of America.  He documents the various ways in which the United States today is “at the brink” of catastrophe, economically and socially:  among other things, how the “stimulus” was the most expensive economic failure in history, with consequences that could last for generations; how the continually-growing national debt will have a devastating impact on Americans’ economic security; how “ObamaCare” will result in ever-soaring health care costs and will destroy the best health-care system in the world; and, of course (given Lott’s expertise on the subject of gun rights), how the B.O. regime’s gun-control agenda would in fact make crime worse.  I especially like Chapter Four (on B.O.’s “war on business”) and Chapter Five (on taxes, which exposes, among other things, the bullshit of so-called “fair” tax rates). 

    Amity Shlaes’ biography of the 30th president of the United States, Coolidge (HarperCollins, 2013), is another must-read book for anyone who is concerned about the current state of our national government and who wants to understand the myths on which the 20th-century regulatory “welfare state” has been built.  Coolidge was unfairly vilified by the “progressive” and New Deal historians, who have wrongfully portrayed him as a “do-nothing” Republican president during the frivolous decade of the “Roaring” 1920s, which led to the Great Depression of the 1930s, “the inevitable hangover, for which FDR administered the cure,” as columnist Michael Barone has aptly characterized it, in his positive review of Shlaes’ eloquent biography (Barone, “Calvin Coolidge Finally Getting Credit He’s Due,” I.B.D., March 4).  Thus this biography of Coolidge may be considered yet another splendid addition to the growing body of scholarship (including Shlaes’ earlier book, The Forgotten Man) that has provided a fresh interpretation of the Great Depression, showing how it was caused by wrong-headed government fiscal policies and made worse by FDR’s New Deal, which perpetuated and expanded those policies.  

    It’s no accident that Ronald Reagan admired Coolidge so much that one of his first acts as president was to direct the White House curator to hang Coolidge’s portrait in the Cabinet Room.  In my rating of the U.S. presidents, I rank Coolidge as the “greatest president of the 20th century because he did the least damage to the Constitution” (see “Rating the U.S. Presidents 2012,” Feb. 22, 2012), and Shlaes’ book nicely provides the details that justify my assessment.  Coolidge is known as “Silent Cal,” a man of few words.  But it’s his policy of economy in government that most merits Americans’ admiration today.  Shlaes calls him “our great refrainer.”  

               When he assumed the office of president after Warren G. Harding suddenly died in August 1923, Coolidge made the important decision of continuing Harding’s program of cutting taxes, tariffs, and government expenditures.  “I am for economy.  And after that I am for more economy,” the 30th president said.  He met regularly with the director of the new Bureau of the Budget, paring spending any way he could; and he vetoed spending bills – including veterans’ bonuses and farm subsidies – popular with Republican members of Congress.  At the same time, he pressed Congress for tax cuts.  He followed the advice of his Treasury Secretary, Andrew Mellon, who advocated “scientific taxation,” which anticipated supply-side economics theory by prescribing lower tax rates, to stimulate economic growth and increase revenues.  After Coolidge won a full term in 1924, the top income-tax rate was reduced from the wartime 70 percent to 25 percent.  As Michael Barone succinctly summarizes the results:  “An economy that lurched from inflation to recession between 1918 and 1922 suddenly burst into robust economic growth,” allowing Coolidge to achieve budget surpluses every year – “surpluses that he used to pay down the national debt.”  In a fateful decision, in the summer of 1927 while vacationing in the Black Hills of South Dakota, Coolidge announced, simply, “I do not choose to run for president in 1928.”  Barone notes that all indicators showed that Coolidge would have won a second full term – meaning he would have been in office when the stock market crashed in October 1929.  Instead of the perverse economic policies of his successor, the “progressive” Republican Herbert Hoover (who initiated the New Deal policies made even worse by his successor, FDR), we might have had the “great refrainer,” whose policies of fiscal restraint could have avoided, or at least lessened, the Great Depression. 

                Finally, there is an important new book on a topic I’ve written about frequently here on MayerBlog: the claim that Thomas Jefferson fathered the children of his slave Sally Hemings, what I’ve called “the Sally Hemings myth.”  (See my previous essays “Jefferson’s Legacy of Liberty” (April 11, 2013) and “The Truth About Thomas Jefferson” (April 12, 2012).)  M. Andrew Holowchak’s Framing a Legend: Exposing the Distorted History of Thomas Jefferson and Sally Hemings (Prometheus Books, 2013), is a worthy addition to a growing body of scholarship that should be called “corrective” history (rather than “revisionist” history) because it corrects the erroneous interpretation made by proponents of the Jefferson–Hemings paternity thesis and therefore helps shatter the Sally Hemings myth. 

                Holowchak is a professor of philosophy, and in the first half of his book (“Three Prominent Spins”) he applies a philosopher’s training to his devastating critique of books by proponents of the paternity thesis (Fawn Brodie, Annette Gordon-Reed, and Andrew Burstein), exposing such logical defects in their arguments as selective use of evidence, ungrounded speculation, and over-psychologizing.  In the second half of the book (“Unframing a Legend”), Holowchak focuses on Jefferson’s philosophic thinking, showing how evidence of Jefferson’s character and racial attitudes would make a liaison with Sally Hemings quite unlikely.  Although Holowchak does not cite the 2001 Scholars Commission report in the main text of his book, he adds an addendum summarizing its findings – and has a foreword written by the Scholars Commission chairman, Professor Robert Turner.  And throughout the book he frequently cites my own concurring opinion, from the Scholars Commission report, particularly with regard to my views on how the Hemings myth has “politicized” history, due to the influences of political correctness, radical multiculturalism, and postmodernism.

                Anyone who wants to know the truth about Thomas Jefferson’s life ought to read Holowchak’s splendid book.



    n Summer 2013 Movies: A Preview  

                Summer movies tend to follow one or more grand themes, and this summer the grand theme seems to be big-budget “action” films, particularly in the superhero and sci-fi genres, with an emphasis on sequels and prequels.  In other words, it will be another summer in which moviegoers can expect little originality – just more of the same “tried and true” models for summer box-office success.  Here’s my summary of the movies I’ll be watching for – films I anticipate either seeing or (in some cases) avoiding. 

                Already playing in the theaters are the two films most likely to be the big blockbuster hits of the summer.  Iron Man 3 (which premiered May 3) grossed $175 million in just its first weekend.  It’s a sequel, not to the forgettable Iron Man 2 but rather to last summer’s hit The Avengers, for it picks up soon after that adventure (involving the whole Marvel Comics superhero crew defending against an alien attack on New York City).  Robert Downey Jr. is back in the title role, with Gwyneth Paltrow as his assistant/love interest.  This installment in the franchise isn’t just all special effects, for it has a compelling story as well as an impressive performance by Ben Kingsley as the villain.  (Or, as I should say, the supposed villain – SPOILER ALERT! – for it is Guy Pearce as the real villain who gives the most impressive performance in the movie.)  The second movie, Star Trek: Into Darkness (which premiered May 17), took in $71 million in its first weekend (somewhat less than the projected $90–100 million).  The sequel to director J.J. Abrams’ “reimagined” Star Trek (2009), it brings back the younger cast (headed by Chris Pine as Kirk and Zachary Quinto as Spock), in a story that continues in many ways to challenge the original Star Trek canon.  It has mind-blowing, explosive special effects (especially mind-blowing for those viewers reckless enough to watch it in 3-D), and plenty of inside jokes for veteran Trekkers (not “Trekkies,” as non-fans erroneously call them).  At times, with its fast-paced chase sequences, the movie felt to me more like a Bruce Willis Die Hard film than a Star Trek movie.  And as in Iron Man 3, the standout performance in Star Trek: Into Darkness comes from the villain, played brilliantly by British actor Benedict Cumberbatch, known to American audiences from his performance as a modern-day Sherlock Holmes in the BBC series Sherlock.  (He plays – sorry, another SPOILER ALERT! – the new Khan, for the movie as a whole is a kind of homage, a pastiche or even a parody, of the original Star Trek II: The Wrath of Khan.)

                The other big sci-fi film of May is After Earth (May 31), directed by M. Night Shyamalan.  Set 1,000 years in the future, it’s about a father and son (Will and Jaden Smith) stranded after crash-landing on a savage, inhospitable planet . . . called Earth.  The premise may be just crazy enough to make it work – helped out by the real chemistry between father and son. 

                Other movies premiering in May (and therefore either already playing or soon to play in movie theaters) include several films that I plan to avoid because I have no desire to waste my money seeing them (even during a discount weekday matinee showing).  The Great Gatsby (May 10) stars Leonardo DiCaprio and Tobey Maguire in an unnecessary remake by Baz Luhrmann (flamboyant director of Moulin Rouge! and Romeo + Juliet).  Instead of authentic 1920s jazz music, the movie tries to reimagine the Jazz Age with hip-hop and pop music from the likes of Jay-Z and Beyoncé.  Even worse, it was filmed in 3-D.  It’s an insult to the classic novel by F. Scott Fitzgerald, which was faithfully adapted for the screen in 1974’s The Great Gatsby starring Robert Redford (which didn’t need all the gimmickry).  The Hangover Part III (May 24) is the final installment (we hope!) in the men-behaving-badly comedy franchise starring Bradley Cooper, Ed Helms, and Zach Galifianakis.  The original Hangover movie in 2009 was quite funny, but its sequel was much less funny; if the first sequel was uncalled for, this second sequel is really uncalled for.  (Some comedy films are so classic that one cannot imagine any sequel doing them justice.  Imagine, for example, The Graduate II, or Animal House II.  The Hangover should have been in that category.)  In Before Midnight (May 24), Ethan Hawke and Julie Delpy reprise the roles they played in the previous “Before” movies: 1995’s Before Sunrise and 2004’s Before Sunset.  Although I like both Hawke and Delpy individually, for some reason I find their pairing annoying, and so I’ve avoided these movies, which only work if one cares about the characters (and I don’t).  Fast and Furious 6 (May 24) is the latest installment of this franchise based on the street-racing life in L.A.; its fans don’t include me.  
Finally, the sci-fi/horror film The Purge (May 31) stars Ethan Hawke in a film filled with gratuitous violence and based on the despicable premise that one night a year citizens can commit any crime without punishment. 

                June’s big film – and another potential blockbuster hit of the summer – is Man of Steel (June 14), the Superman remake directed by Zack Snyder and starring Henry Cavill in the title role.  With the DC Comics hero turning 75 this year, one might ask:  Why another reboot of the Superman story – especially so soon after 2006’s Superman Returns?  The answer (according to Entertainment Weekly) lies in Warner Brothers’ plan to use a revamped Superman “to lay the groundwork for a planned Justice League film that would team up many DC characters and possibly launch several new franchises.”  Unlike the 2006 film, which was essentially a sequel to the classic 1978 Superman movie that starred Christopher Reeve (and launched a multi-film franchise of its own), Man of Steel promises to be a complete reboot.  The classic origin story is kept largely intact, with Jor-El (Russell Crowe) saving his infant son by launching him away from the imperiled planet Krypton on an interstellar journey toward Earth.  The baby’s craft lands on the Kansas farm of the Kents (Kevin Costner and Diane Lane), who adopt the boy and teach him to hide his super-powers.  The familiar story continues in Metropolis, where intrepid Daily Planet reporter Lois Lane (Amy Adams) provides not only the love interest but also the main threat to the Man of Steel’s secret identity.  The real appeal of Man of Steel is the star: hunky 29-year-old British actor Henry Cavill, who was the best thing about Showtime’s highly flawed series The Tudors.  His performance alone may make it worthwhile to sit through yet another (arguably unnecessary) remake.   

                June has another big-budget sci-fi action film – World War Z (June 21), Brad Pitt’s zombie apocalypse movie – but the other noteworthy films in the month are all comedies.  The Internship (June 7) reunites the comedic duo of Vince Vaughn and Owen Wilson (who were so good in their smash 2005 comedy, Wedding Crashers) in a story about two out-of-work 40-something salesmen who try to reboot their careers by taking a brutally competitive internship at Google.  This Is the End (June 12) reunites friends (and co-stars of Judd Apatow’s TV show, Freaks and Geeks) James Franco, Jason Segel, and Seth Rogen (who co-directs), in an apocalypse comedy set in a Hollywood home ostensibly owned by Franco (and filled with outrageous art, including some paintings done by the actor himself).  Much of the comedy in the film derives from inside jokes made at the expense of the actors’ professional and private lives (whether it’s rumors of Franco’s homosexuality or Rogen’s much-panned performance in The Green Hornet).  And The Heat (June 28) pairs Sandra Bullock and Melissa McCarthy as, respectively, an FBI agent and a Boston cop, in a comedy-thriller directed by the same director who made McCarthy a star in Bridesmaids. 

                July begins and ends with two more big action films.  The Lone Ranger (July 3) is the long-awaited pairing of Armie Hammer in the title role and Johnny Depp in another strikingly original performance as Tonto, in a fast-paced remake of the story of the legendary Western hero, directed by Pirates of the Caribbean veteran Gore Verbinski.  (Why do I say remake?  Most people have forgotten the 1981 film The Legend of the Lone Ranger, starring the still-obscure actor Klinton Spilsbury in the title role, with Jason Robards as President Ulysses S. Grant and Christopher Lloyd in an amazing performance as the villain – amazing because that was the time when Lloyd was most known to American audiences for his performance as Jim, the burned-out cab driver in the classic TV comedy Taxi.  How’s that for a bit of trivia?)  And near the end of the month, Hugh Jackman reprises his starring role as the steel-clawed X-Man in The Wolverine (July 26), another installment in the Marvel Comics franchise and an adaptation of the character’s most famous comic-book story, set in Japan. 

                Finally, August also has two action films (both sequels) worthy of note.  300: Rise of an Empire (August 2) is the sequel to the 2007 hit 300, based on the graphic novel (comic book) story featuring digital animation and ripped abs, in an ultraviolent retelling of the story of the doomed 300 Spartans who resisted the Persian invasion of Greece at Thermopylae.  (300 was interesting, but its comedic parody, Meet the Spartans, was hilarious.)  Rise of an Empire, based on the yet-unreleased graphic novel Xerxes, tells a parallel story focused on the naval battles between the Persians and the Athenians.  And Percy Jackson: Sea of Monsters (August 7), a sequel to 2010’s Percy Jackson & the Olympians: The Lightning Thief, has its young star Logan Lerman returning as Percy – the half-human son of sea god Poseidon, who leads his fellow myth-based pals into the Sea of Monsters in search of the Golden Fleece.  (The film is based on the second book in Rick Riordan’s young-adult series.  Another film based on a best-selling young-adult fantasy series will be released later in the month (August 23): The Mortal Instruments: City of Bones.  What’s with all these complicated titles?)   


     | Link to this Entry | Posted Thursday, May 23, 2013.  Copyright © David N. Mayer.

    Jefferson's Legacy of Liberty - April 11, 2013


    Jefferson’s Legacy of Liberty



                Twenty years ago, on April 13, 1993, I was fortunate to participate in the celebration of Thomas Jefferson’s 250th birthday at the University of Virginia in Charlottesville.  The University typically observes Jefferson’s birthday as “Founder’s Day,” for he was its founder.  (Indeed, his efforts to establish U.Va. – which included designing the buildings that constitute the original “academical village” and selecting its first faculty members – were not only the great project of the final years of his life but also one of only three major accomplishments for which he wished to be remembered in his epitaph, as I note below.)  And April 13, 1993, marking the 250th anniversary of Jefferson’s birth, was a very special Founder’s Day.  

                I was among several speakers (who may be fairly described as a motley crew) who spoke on the U.Va. grounds that day.  (“Grounds” is the word used in Charlottesville to identify the University’s campus; it is one of several unique terms traditionally used at U.Va., where undergraduate students are called “first-year,” “second-year,” and so on, rather than “freshmen” or “sophomores,” etc., and where faculty are addressed as “Mr.” or “Ms.” rather than as “Professor” or “Doctor,” supposedly because of “Mr. Jefferson’s” abhorrence of titles.  Charlottesville and the University are often described as places where people speak of Mr. Jefferson as if he were in the next room.  Notwithstanding these traditions, however, I discovered during my years as a graduate student there in the early 1980s – I earned my M.A. in History in 1982 and my Ph.D. in History in 1988 – that much of the reverence about Jefferson at U.Va. is mere lip service.  Surprisingly, at the time I was the only history graduate student who was writing a dissertation about Jefferson – “The Constitutional Thought of Thomas Jefferson,” which was published in book form by the University Press of Virginia (now known as the University of Virginia Press) in 1994 – and it often seemed that I was the only student in my department who genuinely respected Jefferson.  The attitude toward the “Founder” that was held generally by the whole University community – not just by students but also by faculty and administrators – seemed a bit like that of rebellious teenage children who do not think it’s “cool” to respect their parents.  More than a few U.Va. faculty members at that time – and no doubt many more since – have expressed open contempt for Jefferson and his legacy.)

                Back to April 13, 1993 and the “motley crew” who spoke on the U.Va. grounds that day, most under the direct auspices of the University but some (like myself) sponsored by student organizations.  It was a group that included, besides me, some real celebrities:  William Rehnquist, former Chief Justice of the United States; George Will, conservative political columnist; Katie Couric, at that time a host of NBC’s Today show and a U.Va. alumna; and former Soviet dictator Mikhail Gorbachev!  (The wholly inappropriate choice of Gorbachev as a speaker by itself shows how little respect and understanding the University administration and faculty had for Jefferson and his legacy; they were simply enamored of Gorbachev’s celebrity and held the naïve belief that his supposedly liberal “reforms” had brought about the end of the Soviet Union, when it was really the steadfastness of Ronald Reagan and the inherent weakness of the Communist regime that did so.  Having Gorbachev as a guest speaker was not only inappropriate, but also rather sad.  Jefferson’s purpose in establishing the University of Virginia was to create a university where, as he wrote to James Madison in 1826, the “vestal flame” of republicanism – the libertarian principles of the American Revolution – would be kept alive, “to spread anew” over Virginia and her sister states in the United States of America.  Sadly, U.Va. has become just like other “elite” American universities, with an administration and a faculty who are predominantly leftist in their political orientation, ignorant of Jeffersonian republican principles, and who are more enamored of celebrities like Gorbachev and Couric than of real American heroes.) 

                Twenty years ago on April 13 my observance of the special day began with a sunrise program at Shadwell, the site of Jefferson’s birth in 1743, where some brief remarks were made by Merrill D. Peterson, the Thomas Jefferson Memorial Foundation Professor Emeritus at U.Va., the greatest living Jefferson scholar (at that time) and my dissertation director.  (Because Merrill had agreed to direct my dissertation after he had retired and been given emeritus status, I was literally his last Ph.D. student.  I like to think that he agreed to do so because he saw real merit in my study of Jefferson’s constitutional thought, which was the culmination of many years of research, beginning when I was an undergraduate student at the University of Michigan in the early 1970s, continuing through my three years of study as a law student at Michigan in the late 1970s, my four years of study as a graduate student at U. Va. in the early 1980s, my three years practicing law in Washington, D.C. in the mid-1980s, followed by a year as a postdoctoral fellow – even though I really was still predoctoral – at the Institute for Humane Studies in 1987-88.  During those years of study – totaling about 13 years, before I completed my dissertation in August 1988 – I had read not only virtually all the scholarly books written about Jefferson, including several by Peterson, but also had read virtually all of Jefferson’s own writings, both published and unpublished, including the largest collection of Jefferson’s manuscripts, at the Library of Congress.  I earned the title of “Jefferson scholar,” as Merrill Peterson had done before me.) 

                My talk took place in the early evening of April 13, sandwiched in between the late-afternoon talk by George Will and the nighttime fireworks that marked the culmination of the University’s observance.  Two student groups sponsored my talk:  Students for Individual Liberty (S.I.L.) and the Honor Committee.  S.I.L. was the leading libertarian student group at U.Va.; it was the successor organization to the libertarian student group that I had co-founded in 1983, the Libertarian Student Association (LSA).  By inviting me to come to Charlottesville and be their guest speaker on that special day, the leaders of S.I.L. were doubly honoring me:  first, by giving me an opportunity to observe Jefferson’s 250th birthday; and second, by honoring me, as the libertarian group’s own “founder,” on the tenth anniversary (or so) of its founding.  (I was pleased to learn that LSA, which began as a little libertarian/Objectivist group that met mostly to talk about philosophy, had morphed into the more politically active libertarian group, S.I.L., and that group was one of several libertarian groups that were thriving “on grounds” in the early 1990s.)  The Honor Committee, the student group that administers the University’s Honor Code (a student-run honor system is another Jeffersonian tradition at U.Va.), had decided to co-sponsor my talk (which itself was another honor, quite literally, for me – and more than a little ironic, for it meant that my talk was sponsored by the most tradition-bound student group and by perhaps the most “radical” student group at U.Va.!  Mr. Jefferson would be proud!) 

                The text of my April 13, 1993 talk, “The Libertarian Legacy of Thomas Jefferson,” is reprinted below, with a few minor revisions.  It is important, not only as a 20-year personal reminiscence, but also as a kind of antidote against the poisons (not too harsh a term for the outrageous myths) that many people have been propagating about Jefferson during the past two decades. 

                    Two major myths about Jefferson, one about his ideas and the other about his life, have dominated in recent years.   

    Twenty years ago, the chief myth about Jefferson’s ideas was the notion first circulated by his political enemies, the Hamiltonian Federalists, in the 1790s: that Jefferson was a “democrat,” a champion of democracy, or majority rule.  Although he more fully embraced the principle of majority rule than most of his contemporaries, Jefferson shared with other Americans of the Founding generation an abhorrence of democracy, in the pure sense.  Democracy, like aristocracy and monarchy, was one of the “pure” forms of government that – according to centuries-old political thought going all the way back to ancient Greece and Rome (including Aristotle, in his Politics) – would degenerate into either tyranny (what Alexis de Tocqueville called “the tyranny of the majority,” in his Democracy in America) or anarchy (or “mob rule”).  Like the other Founders, Jefferson championed not democracy but “republicanism,” which he defined generally as a form of government representative of the will of “the people.”  The United States of America is not a democracy but a republic – a constitutional republic; that is, a republic with a constitutionally-limited government.  (For more on this, see my essay “A Republic, Not a Democracy” (June 6, 2005).) 

    Jefferson and his good friend (and political collaborator), James Madison, called the opposition political party that they co-founded in the 1790s the “Republican” party because they believed their political opponents, the Federalists (and particularly the Hamiltonian wing of the Federalist party) were trying to undermine republican government in the United States, by attempting to transform it into an aristocratic, monarchical system like Great Britain.  That is why the core principles of the Jeffersonian Republican party emphasized the constitutional limits on the powers of government, particularly the national government (a government of limited powers enumerated in the U.S. Constitution) and – on the other side of the coin – the maximization of individual freedom and responsibility.  When Jefferson and Madison spoke of “self-government,” they were not talking about democracy – the “people” governing society, which means (in practice) the majority riding roughshod over the rights of the minority – but instead were talking about self-government, quite literally: the freedom of individuals to govern themselves, to “own” their own lives, without interference from government.  Jefferson always tempered his comments in favor of majority rule with an even greater reverence for the rights of “minorities” – and, of course, the rights of the individual, the most vulnerable “minority” of them all.  For example, in his First Inaugural Address (March 4, 1801), Jefferson announced the “sacred principle,” that “although the will of the majority is in all cases to prevail, that will to be rightful must be reasonable; that the minority possess their equal rights, which equal law must protect, and to violate would be oppression.”   

    The terms democrat and democracy did not begin to have positive connotations in American political culture until the 1830s – just after Jefferson’s death – when the supporters of Andrew Jackson, who came from the radical faction of Jefferson’s Republican Party, began calling themselves “Democrats.”  (It was in this era, during Jackson’s presidency, that the French aristocrat Tocqueville visited the United States and wrote his book Democracy in America.)  Jefferson and the radical wing of his Republican Party – which became the Democratic Party in the 19th century – became associated in the “public mind” (especially in the North) with Southern “state rights” doctrines, because of Jefferson’s emphasis on federalism and his “strict construction” of federal powers.  In the decades following the Civil War, Jefferson – in part because he was a Virginian as well as a slave-owner, and in part because some of his ideas inspired Southern secession – was associated with the “lost cause” of the Confederacy.  Even though Lincoln’s Republican Party got its name from Jefferson’s Republican Party (and Republican leaders like Abraham Lincoln himself claimed to have derived their fundamental principles from Jefferson), by the early 20th century the Republican Party looked not only to its forerunners in the American Whig Party of the 1830s and 1840s (the Whigs had evolved out of the more moderate wing of Jefferson’s Republican Party) but also to the Federalist Party of the 1790s – Jefferson’s political opponents – as its political ancestors.  (The Republican Party of the early 20th century had embraced nationalism rather than “state rights” or federalism as a key political principle, and Republican politicians like Henry Cabot Lodge actually regarded Hamilton rather than Jefferson as founder of their party – one of the important topics discussed by Merrill Peterson in his classic book The Jefferson Image in the American Mind (Oxford University Press, 1962).)  

    So-called “Progressives” in the early 20th century, whether Republican (like Teddy Roosevelt) or Democratic (like Woodrow Wilson), similarly denounced Jefferson and his limited-government philosophy as antiquated, associated with (as Wilson asserted in his New Freedom (1912)) “those early days of simplicity which marked the beginnings of our government,” before the Industrial Revolution had transformed the United States from an agrarian nation into an industrial “super-power.”  (For more on the “Progressives” – and why the term is a misnomer, for they actually were reactionary paternalists – see my essay “Reactionary ‘Progressives’” (March 16, 2006).  “Progressives” like Wilson also believed in a broader myth, which I call the myth of complexity, which assumed that as society became more complex – as the country moved from a simpler agrarian economy to a more complicated industrial economy – the role of government, of the coercive power of the law, needed to expand.  Jefferson and America’s Founders understood, on the contrary, that as society advanced, it needed fewer laws; they understood, as modern libertarians do, that free markets can provide better order in society than can government dictates.  To Jefferson, as to modern libertarians, true “progress” in society meant more freedom and less government.) 

                Although the “Progressive” activists were adamantly opposed to Jefferson’s constitutional principles (of limited government, strict interpretation of federal powers, broad interpretation of individual rights, federalism, and so on), they did embrace “democracy” – or, more precisely, they embraced the “modern” administrative state (government by so-called “experts” in governmental administrative agencies, who supposedly acted in the “public interest,” or for the “common good”).  Since FDR’s so-called “New Deal Revolution” of the 1930s, Democratic politicians have continued to regard Jefferson as their party’s founder (even more so than Andrew Jackson, who truly was their founder) and thus have emphasized the “democratic” part of Jefferson’s public image.  Republicans continued to waive their equally good claim to Jefferson as founder of their party and began using the term “Democratic-Republican” (the derogatory term first used by Federalists in the 1790s) to identify Jefferson’s Republican Party.  Social and religious conservatives within the Republican Party, alienated by Jefferson’s libertarianism (especially his views on religious freedom and the “wall of separation between church and state,” the phrase he popularized during his presidency), similarly continued to repudiate Jefferson.  Only a minority of limited-government conservatives and libertarians kept alive Jefferson’s real political philosophy, which (as I discuss below) took liberty, not democracy, as its highest political value.  To most Americans, of either major political party, Jefferson had become the caricature that is presented in the Jefferson Memorial in Washington, D.C., which was designed during FDR’s presidency as a part of his “New Deal” propaganda – with FDR symbolically wrapping himself in the mantle of Thomas Jefferson (as Peterson characterized it in The Jefferson Image in the American Mind).   

    Thus was born the myth of Thomas Jefferson, “father of democracy” – the myth that so dominated in 1993, the year celebrating Jefferson’s 250th anniversary, that it became the theme used for the celebration, at least at the University of Virginia.  This was the context of the opening lines of my talk (reproduced in the final section below), when I observed that “modern Americans seem to have done a better job preserving what Thomas Jefferson has left us in bricks and mortar than we have done in preserving his ideas.”  Jefferson’s real political principles (his advocacy of limited-government, constitutional republicanism, of federalism, separation of powers, and so on) are, frankly, embarrassing to many modern American “elites,” whether in academia, journalism, politics, law, or business, who (like the early 20th-century “Progressives”) reject individualism in favor of paternalism.  To the paternalist elites of all political stripes (not only the left-liberals or “progressives,” but also the so-called “social conservatives” or “neo-conservatives”), Jefferson’s essentially libertarian political philosophy is anathema because of its emphasis on limits on governmental power.  His political philosophy is the antithesis of the modern national regulatory/welfare state.  So, to make Jefferson “relevant” to modern American politics, modern intellectuals extol him as “father of (American) democracy.”  It would be more faithful to the true Jefferson – to the man and (especially) to his ideas – to consider him as “father” of, say, the “Tea Party” movement, or the Republican Liberty Caucus, or even the Libertarian Party.  

    Beginning in the late 1990s, a new myth about Jefferson – focused on his private life – became dominant:  the myth that he fathered one or more of the children of his slave Sally Hemings.  Like the “father of democracy” myth, this one is a new twist on an old allegation. 

    The Sally Hemings myth also had its origins among Jefferson’s political enemies in the 19th century: chiefly, a hatchet journalist and disgruntled office-seeker named James T. Callender (who invented the story in an 1802 Richmond newspaper piece) and a Republican newspaper in rural Ohio that allegedly interviewed Madison Hemings, one of Sally’s sons, in 1873 (who essentially repeated the story told by Callender).  The story was kept alive by Jefferson’s political enemies – during his lifetime by some Federalist political opponents and after his death by some abolitionist propagandists – and as an oral tradition among some (but not all) of the Hemings descendants.  In the late 20th century the myth was given new life in a bestselling biography of Jefferson published in the 1970s, Fawn M. Brodie’s Thomas Jefferson: An Intimate History.  Brodie’s book was widely criticized by reputable Jefferson scholars as merely speculative “psycho-history.”  (Brodie claimed, for example, that when Jefferson, traveling through France, described the soil in certain regions as being “mulatto” in color, he was revealing his guilt over miscegenation.)  Gullible members of the reading public – including people who read celebrity gossip magazines and who delight in salacious stories about famous people, whether living or dead – took Brodie’s fictionalized biography seriously, leading to such works as the Merchant-Ivory film Jefferson in Paris (1995), in which Nick Nolte ludicrously portrayed Jefferson as the American diplomat who began his affair with Sally Hemings in Paris in the 1780s.  (The problem – as even modern believers in the Sally Hemings myth must concede – is that Sally Hemings at the time was not the sensual young woman portrayed in the film, or in the Brodie book; she was a rather immature young girl who accompanied one of Jefferson’s daughters as her personal servant.  
The alleged oldest child of Hemings, a son named Thomas, whom Jefferson supposedly fathered while in Paris, never existed; he was the fictional creation of James T. Callender, kept alive in pop culture by the Brodie book and the Jefferson in Paris film.) 

    The Sally Hemings myth was given renewed life in the 1990s thanks to a group of true believers at Monticello (particularly Lucia Stanton and Dianne Swann-Wright), who were engaged in the “Getting Word” oral history project involving the Hemings descendants, and to a black law professor in New York, Annette Gordon-Reed, whose first book, Thomas Jefferson and Sally Hemings: An American Controversy, was published in 1997.  Both the Monticello staffers and Professor Gordon-Reed have used the Hemings myth to push an agenda: to validate the oral history told by the Hemings descendants (which in turn seems to have been based on the sources noted above – the original Callender newspaper piece in 1802, the Madison Hemings “interview” in 1873, and the speculations in the Fawn Brodie book); and in Gordon-Reed’s case, to push a broader agenda about the problem of racism in America.   

    After DNA testing was done on the blood of the descendants of Thomas Jefferson and one of Sally Hemings’s younger children, her son Eston, Nature magazine in the fall of 1998 published an article that falsely claimed the DNA tests “proved” that Jefferson fathered all of Sally Hemings’s children.  The tests proved only that one of Sally Hemings’s children, Eston Hemings, was fathered by a male Jefferson (someone whose DNA matched the male side of Jefferson’s family); the DNA testing could just as well have “proved” that Sally Hemings’s child was fathered by Jefferson’s brother, Randolph Jefferson, or one of Randolph’s sons – for whom the circumstantial evidence of paternity is at least as strong (if not stronger) than for Thomas Jefferson.  The timing of the Nature magazine article was critical – coming in the fall of 1998, as the U.S. House of Representatives was about to impeach Bill Clinton – for it gave some scholars a reason to jump on the bandwagon, using the Hemings myth to show that all U.S. presidents – even Mr. Jefferson – have had sexual affairs.  Thus has the Hemings myth been driven by politics, and various political agendas, throughout U.S. history. 

                It was in the wake of the Nature article (which popularized the misrepresentations about the DNA testing), in the spring of 2000, that I was asked to join an interdisciplinary group of distinguished scholars charged with the task of objectively evaluating all the relevant evidence regarding the allegation that Thomas Jefferson fathered the children (either all or some of them) of Sally Hemings.  Our Scholars Commission Report was issued in April 2001, and we were nearly unanimous (12 out of 13 members concurred with the final report, with just one member writing a mild dissent) in concluding that the paternity claim was not supported by reliable evidence.  As summarized by the commission’s chairman, Professor Robert F. Turner of the University of Virginia, in an op-ed published shortly after the report’s release:  “With but a single mild dissent, the scholars’ conclusions ranged from ‘strong skepticism’ about the allegation to a conviction that the charge was ‘almost certainly false.’  They demonstrated that most of the ‘evidence’ cited to establish the relationship was either factually false . . . or was explained on other grounds” (“The Truth About Jefferson,” Wall Street Journal, July 3, 2001).   

    As a member of the Scholars Commission, I wrote my own “concurring opinion” in which I discussed the reasons for my strong skepticism about the paternity thesis, which I frankly have called a myth.  In my separate report, “The Thomas Jefferson – Sally Hemings Myth and the Politicization of American History,” I concluded that the myth has been given new life in recent years because of two unfortunate trends in academia (“political correctness” and radical multiculturalism) that have undermined objective standards of historical scholarship.  I particularly criticized the shoddy scholarship by a biased Monticello committee and by the leading proponent of the Jefferson – Hemings thesis, Professor Gordon-Reed, who (as noted above) has advanced the paternity thesis as part of her broader race-conscious political agenda.   

    I concluded that there was no credible evidence supporting the claim that Jefferson fathered any of Sally Hemings’s children; nor is there any credible evidence supporting the claim that Jefferson had any sort of special relationship – let alone a “common-law marriage,” as some paternity thesis believers have imagined it – with Sally Hemings.  The evidence strongly suggests, on the contrary, that the Hemings children were fathered by multiple male members of Jefferson’s family – probably the elder children were fathered by his nephews, Peter and Samuel Carr (who admitted their paternity), and the younger children by his brother, Randolph Jefferson, and possibly one or more of Randolph’s sons.  The evidence suggests that Jefferson himself was celibate after the death of his wife, Martha, in 1782.  (The closest Jefferson came to a sexual relationship after 1782 was with Maria Cosway, a married woman and a talented artist, with whom Jefferson had a romantic friendship in France in the 1780s.)  To Jefferson, sadly, Sally Hemings was just one of the many domestic slaves at Monticello.  (The Sally Hemings imagined by Annette Gordon-Reed did not exist; she is a fictional character created to advance Professor Gordon-Reed’s agenda.  Even the claim that Jefferson treated Sally specially because she was the half-sister of his wife, Martha – the bastard child of Martha’s father, John Wayles, and his slave Betty Hemings – is suspect, for it has an even more dubious source than the story of Sally’s children.  For more on the origins of “the Sally story,” see the excellent book by Rebecca and James McMurry, Anatomy of a Scandal: Thomas Jefferson and the Sally Story (Shippensburg, Pa.: White Mane Books, 2002), the foreword of which I was honored to write.) 

    A slightly abridged version of my report is reproduced in the blog essay I posted last spring, “The Truth About Thomas Jefferson” (April 12, 2012).  As I noted in that essay, the Scholars Commission report at long last has been published in book form:  The Thomas Jefferson – Sally Hemings Controversy: Report of the Scholars Commission, edited by Robert F. Turner (Durham, N.C.: Carolina Academic Press, 2011).  The book can be ordered directly from the publisher, Carolina Academic Press (hardcover list price, $45.00), or from online retailers (hardcover edition at a slight discount, about $35, and digital edition, for the KINDLE reader, for $25).  The book contains our original commission report from 2001, complete with all the individual reports – including my individual report and (the lengthiest portion of the book) the separate report by Robert Turner, the chairman of the Scholars Commission, which he has expanded and revised to include a “postscript” summarizing the reactions to the Scholars Commission report over the past decade.  Anyone seriously interested in the Sally Hemings issue ought to consult the Scholars Commission report and particularly the individual views of Professor Turner, who has reviewed all the relevant evidence more thoroughly than any other scholar thus far has done. 

                Another book that has shattered the Sally Hemings myth has been written by William G. Hyland, Jr. – a former prosecutor and a trial lawyer from Virginia – who applied more than 25 years of litigation experience to the question of the Jefferson–Hemings paternity claim.  His book, In Defense of Thomas Jefferson: The Sally Hemings Sex Scandal (New York: Thomas Dunne Books/St. Martin’s Press, 2009), weighs the relevant historical evidence as if it were presented in court in a paternity suit.  Hyland concludes that the paternity thesis is an unsubstantiated charge, based on the accumulation of salacious rumors and irresponsible scholarship over several generations – “much of it inspired by political grudges, academic opportunism, and the trend of historical revisionism that seeks to drag the reputation of the Founding Fathers through the mud,” as the book jacket description states.  Hyland draws upon the Scholars Commission report – including my own individual report – in supporting his analysis.  

                As I noted in my 2012 essay, “The Truth About Thomas Jefferson,” Hyland also presents a devastating critique of Annette Gordon-Reed’s second book, The Hemingses of Monticello: An American Family (2008), which he aptly characterizes as a book of “creative historical imagination” that fails as either biography or history.  As he notes, Gordon-Reed’s second book is “an acidic, eight hundred-page regurgitation of her flawed 1997 book with the Jefferson–Hemings sexual allegation at the center of her social commentary on slavery.”  Her “highly speculative tome” is even more flawed by her techniques of “psychohistory,” misinterpreting clear evidence (as she did in the earlier book) and by her speculation concerning Jefferson and Sally’s “thoughts.”  Hyland is particularly critical of Gordon-Reed’s charge that Jefferson “raped” Sally, an allegation she makes repeatedly throughout her book, yet “she fails to offer a shred of documentary evidence, eyewitness, footnote, or source to support her venal opinion.”  Gordon-Reed’s book consists largely, as she admits, of “imagination” – which is another way of saying it is mere speculation, a work of fiction, masquerading as history or biography.  (The fact that her book received many awards from historical organizations only shows how far the field of history today has declined in its standards of professionalism.)  

    In addition to the “father of American democracy” myth and the Sally Hemings myth, other important myths about Jefferson – many of them also originating among Jefferson’s political enemies and perpetuated by some modern scholars – have arisen in recent decades, including the myths that Jefferson was an “atheist,” an “anti-capitalist” (an opponent of free-market capitalism), a “hypocrite” (who in power during his presidency betrayed his limited-government principles of the 1790s), and so on.  I describe these various myths and separate them from the real Jefferson, the historic Jefferson, in my essay “Thomas Jefferson, Man versus Myth” (April 13, 2006), which has been published in pamphlet form by the Center for Ethics and Entrepreneurship at Rockford College in Illinois.  (The pamphlet can still be purchased for $1.99.)

                One of these additional myths – that Jefferson was an “atheist” – deserves some further comments, for in his zeal to counter that myth the author of yet another new Jefferson book – David Barton – has created a new myth of his own, that Jefferson was a “Christian,” in the conventional sense of the word.  Last August, publishing company Thomas Nelson took the unusual step of recalling all copies of Barton’s book The Jefferson Lies: Exposing the Myths You’ve Always Believed About Thomas Jefferson, which had been released in April 2012.  The publisher explained its extraordinary action by citing a number of factual inaccuracies and historical misinterpretations that had been brought to its attention.  In a statement, Nelson said, “[We were] contacted by a number of people expressing concerns about The Jefferson Lies.  We took all of those concerns seriously, tried to sort out matters of opinion or interpretation, and in the course of our review learned that there were some historical details included in the book that were not adequately supported.”  In the wake of these accusations, Nelson recalled copies of the book in retail stores, asked online retailers to stop selling it, and suspended printing and distribution. 

                It’s not surprising that the book contained factual inaccuracies and misinterpretations – in fact, it’s riddled with them:  the author, David Barton, is not a trained historian but a religious conservative who founded an organization called WallBuilders, which is devoted (ironically) to tearing down the “wall of separation between church and state” and instead advancing the notion that America was founded as “a Christian nation.”  Thomas Jefferson not only popularized the “wall of separation” metaphor (which he used in a famous letter he wrote to a Baptist congregation in Danbury, Conn., early in his presidency) but also has been widely understood, by most historians and Jefferson scholars, as a champion of religious freedom and a critic of “organized” Christian religion in his own era.  (Indeed, the context for the famous quotation found inside the Jefferson Memorial in Washington, D.C. – “I have sworn on the altar of God eternal hostility against every form of tyranny over the mind of man” – was in a letter Jefferson wrote to a political supporter, making the point that his Federalist political opponents in New England “rightly” feared that he, as president, would stand in the way of their agenda to use the coercive power of government to impose their version of Christianity on the American people.)  

                It’s true that Jefferson was no “atheist,” as his political enemies (in his time and even in the modern era) have alleged; but Barton truly distorts Jefferson when he tries to show he was an orthodox Christian.  Rather, Jefferson is best described as primarily a “deist,” one who believed not in a supernatural God who actively intervenes in human affairs but rather in a Creator-God who acts through the laws of nature.  He called himself a “Christian” because he admired Jesus’ moral teachings, as he understood them, but he denied Jesus’ divinity – which surely must disqualify Jefferson from being considered a “Christian” in the theological sense.  He did believe in an afterlife – or at least in a sort of “heaven,” where individuals would be reunited with their deceased loved ones – but did not imagine a “hell” (except here on earth) and denied the existence of “miracles.”  To a great extent, Jefferson indeed was – as he was characterized in an op-ed by one of Barton’s critics, Boston University religion professor Stephen Prothero – “our least Christian president” (USA Today, July 3, 2012). 

                In trying to disprove what seems obvious to most scholars who have seriously studied Jefferson’s thought, Barton engages in what I call “lawyer’s history”:  picking and choosing from historical data only those facts that support his conclusions, while ignoring any data that suggest otherwise.  Because his “proof” relies on extensive quotation from historical documents – often misinterpreted by being taken out of context or quoted superficially – Barton fools a number of his readers, including the libertarian/conservative political commentator Glenn Beck, who wrote an introduction for Barton’s book and who frequently had Barton as a guest on his former TV show on Fox News.  Barton, whom Beck calls his favorite “historian” and guide to America’s Founders, has completely bamboozled Beck, who fails to understand what a charlatan Mr. Barton truly is. 

                Chief among Barton’s critics have been Warren Throckmorton and Michael Coulter, professors at conservative Grove City College and authors of Getting Jefferson Right: Fact Checking Claims About Our Third President (published by Salem Grove Press in 2012 and available from Amazon both as a paperback and as an e-book).  Throckmorton and Coulter are effective in rebutting Barton’s claims because they are religious conservatives and hence do not fit the “secularist” stereotype that Barton creates to explain away the criticisms of his book.  The two professors have written a devastating, point-by-point rebuttal of most of the outrageous claims Barton makes in his book, thus illustrating (ironically) how aptly titled The Jefferson Lies really is.  Unlike Barton, Throckmorton and Coulter are careful scholars who appreciate the importance of historical context.  Their discussion of many of the particulars about Jefferson’s own religious views and his political record regarding religion – for example, their discussion of the famous “wall of separation” letter to the Danbury Baptists – is one of the best and most thorough I’ve seen in print.  As a correction to Mr. Barton’s “lies,” I highly recommend the Throckmorton and Coulter book.  It’s among the few recently-published books about Jefferson that I would recommend.  

                (There is one chapter in Barton’s book, however, that Throckmorton and Coulter do not rebut – because it warrants no rebuttal.  Like a stopped clock that tells the correct time twice a day, Barton does get one thing “right” about Jefferson – and thus does correctly bash one of the “lies,” or myths, that have been created about Jefferson:  the story that he fathered one or more of the children of his slave Sally Hemings.  That part of Barton’s book (Chapter 1) doesn’t have any original ideas; it merely summarizes the conclusions of the Scholars Commission – and it quotes extensively from the Scholars Commission report, including my own concurring report, “The Thomas Jefferson – Sally Hemings Myth and the Politicization of American History,” all of which, as I’ve noted above, has been published in the book The Thomas Jefferson – Sally Hemings Controversy: Report of the Scholars Commission, edited by Robert F. Turner (Durham, N.C.: Carolina Academic Press, 2011).  I should add one more piece of evidence of how poor a historian – really a mere pseudo-historian – Barton is:  the passages he cites from my report are the least important passages in it.  Thus, he fails to cite the historical works that really do support the conclusions he desires to reach.  Even as an author of “lawyer’s history,” Barton isn’t very competent.) 

                During the past twenty years, dozens of new books about Jefferson have been published – nearly as many new books as appear about Abraham Lincoln, the most popular U.S. president – but, sadly, most of the new and recently-published Jefferson books are absolute nonsense.  They generally offer no fresh insights on Jefferson, his life or (what is most important) his ideas, but instead merely rehash what other writers previously have said about him – including the myths discussed above.  Thus, recent Jefferson “scholarship” has tended to perpetuate the “democracy” myth, the Sally Hemings myth, and all the rest – masking the real Thomas Jefferson, the historic figure, with regard to both his life and his ideas.  Ironically, with all the dozens of recently-published books about him – more Jefferson studies than ever – Americans probably understand Jefferson less today than they have in any other period in U.S. history.  The most important of America’s Founders has become the most elusive.  It’s not because Jefferson, the real Jefferson, is difficult to understand – he isn’t – but rather it’s because the recent scholarship has so distorted and misrepresented him. 



    The Art of Distortion: A Review of Jon Meacham’s The Art of Power 


                A case in point:  the best-selling book, Thomas Jefferson: The Art of Power (Random House, 2012), by Jon Meacham, a former editor of Newsweek who is now an editor and executive vice president at Random House.  Although not a historian, Meacham won a Pulitzer Prize for American Lion, his 2008 biography of Andrew Jackson, which was also a best-seller.  (He is also the author of two other New York Times bestsellers, Franklin and Winston and American Gospel.)   

                The lengthy (750-page) narrative of Jefferson’s life compounds the basic mistake made by many recent authors of Jefferson biographies:  rather than viewing Jefferson honestly in the context of his own time and culture, Meacham views him through the lens of a 21st-century left-liberal American, trying to force him into the model of a modern “liberal.”  Hence he concludes, “The real Jefferson was like so many of us: a bundle of contradictions, competing passions, flaws, sins and virtues that can never be neatly smoothed out into a tidy whole.  The closest thing to a constant in his life was his need for power and control” (p. 500).  This emphasis on “power” (as Meacham understands the term) is the fatal flaw of the book, for it completely distorts Jefferson, turning on its head Jefferson’s real political philosophy.    

                Aesthetically, the book is impressive, with expensive “production value” – as one might expect from the fact that the author is also an executive at the publishing company, Random House.  (As noted above, Meacham is an editor – his title, in fact, is “executive editor” – as well as executive vice president of Random House.)  Richly illustrated, the book has beautiful drawings that grace the endpapers, the main title page, and the title pages for each of its nine parts.  It also has three sections of pictures, one in color – which includes several of the paintings from Jefferson’s collection at Monticello, as well as the portraits of the three men in English history whom Jefferson most admired (Francis Bacon, John Locke, and Isaac Newton).   

                Meacham’s book is written in a non-scholarly, populist style that underscores the superficial treatment he gives his subject.  Paragraphs are short and choppy, with many paragraphs lacking topic sentences as the narrative jumps from one point to another.  The style resembles that of a lengthy newspaper article, rather than a book.  Although there are no note numbers in the main text, the book does contain 175 pages of endnotes, citing sources for the quotations that appear in the book, many of them (to the author’s credit) from Jefferson’s writings, both published and unpublished.  The bibliography is quite comprehensive, listing as “books consulted” virtually all the major books about Jefferson and his time – including my own book, The Constitutional Thought of Thomas Jefferson.  However, a close examination of the works cited as sources in the endnotes reveals that Meacham has relied mostly on just a few of these secondary sources, and not the best ones.  (He may have “consulted” my book on Jefferson’s constitutional thought, but apparently he learned nothing from it; he fails to cite my book where he should, when he is discussing such critical topics as the Declaration of Independence, Jefferson’s 1791 opinion on the bank bill, the Kentucky Resolutions of 1798, or the Louisiana Purchase – and so, as noted below, he gets those topics wrong.) 

                It is not just in its aesthetics or its writing style that The Art of Power is superficial.  Substantively, the book is grounded in Meacham’s superficial understanding of the intellectual and political worlds in which Jefferson lived.  His narrative is broad but not deep: it covers many topics, giving just enough detail (including just enough quotation from Jefferson’s writings) to create the impression of being thoroughly researched.  Yet Meacham does not prioritize the subject matter, for he gives equal attention to the significant and the trivial.   

                For example, discussing the background to the American Revolution, Meacham mentions (and even briefly summarizes) the multivolume history of England by Paul de Rapin-Thoyras that was in Peter Jefferson’s library and which Jefferson inherited after his father’s death.  Rapin’s history was indeed an important part of Jefferson’s early education, but it was one of several important works that shaped his thought in the decades prior to the Revolution.  Quoting the 19th-century Jefferson biographer Henry Randall, Meacham calls Peter Jefferson “a staunch Whig” and suggests that the “Whig” version of English history shaped the worldview of Jefferson and his fellow Patriots in the American Revolution.  Yet Meacham does not mention any of the other important writers – a group of thinkers, not just historians but also political philosophers and authors of legal treatises – whom scholars generally call the English “radical Whigs,” or “Real Whigs,” of the 17th and 18th centuries, who truly did influence Jefferson and the other Founders, not only to support the American cause but to formulate a uniquely American constitutionalism.  Without even acknowledging the radical Whig tradition, Meacham writes that Jefferson and his fellow American Revolutionaries fought against monarchical “tyranny” and sought (as he simplistically puts it) “a government they judged fair and representative.”  When he finally gets around to discussing Jefferson’s authorship of the Declaration of Independence, Meacham writes that Jefferson’s influences were “manifold” and included “Locke, Montesquieu, and the philosophers of the Scottish Enlightenment.”  His sources, cited in the endnotes, are three flawed and/or outdated secondary sources:  Pauline Maier’s American Scripture (1997), Garry Wills’ Inventing America (1978), and Carl Becker’s Declaration of Independence (1970).  
He doesn’t cite the most serious (and insightful) studies of the Declaration, including Hans Eicholz’s Harmonizing Sentiments (Peter Lang, 2001) or Chapter 2 of my Constitutional Thought of Thomas Jefferson.    

                Although Meacham is an editor and a writer, not a lawyer, his book epitomizes what historians contemptuously call “lawyer’s history,” cherry-picking the evidence – citing only those facts that support his thesis while ignoring the facts that do not.  And what exactly is Meacham’s thesis?  The title of the book suggests it, and Meacham states it rather directly in the book’s prologue: 

    “He had a defining vision, a compelling goal – the survival and success of popular government in America. . . . In pursuit of his ends, Jefferson sought, acquired, and wielded power, which is the bending of the world to one’s will, the remaking of reality in one’s own image. . . .


    “More than any of the other early presidents – more than Washington, more than Adams – Jefferson believed in the possibilities of humanity.  He dreamed big but understood that dreams become reality only when their champions are strong enough and wily enough to bend history to their own purposes.  Broadly put, philosophers think: politicians maneuver.  Jefferson’s genius was that he was both and could do both, often simultaneously.  Such is the art of power. . . .


    “A master of emotional and political manipulation, sensitive to criticism, obsessed with his reputation, and devoted to America, he was drawn to the world beyond Monticello . . . .  As a planter, lawyer, legislator, governor, diplomat, secretary of state, vice president, and president, Jefferson spent much of his life seeking control over himself and power over the lives and destinies of others.  For Jefferson, politics was not a dispiriting distraction but an undertaking that made everything else possible.”


    Thus, as portrayed by Meacham, Jefferson was a power-hungry politician, not much different from ambitious politicians in modern America.  He was, writes Meacham, “a breathing human being who was subject to the passion and prejudice and pride and love and ambition and hope and fear that drive most other breathing human beings,” particularly those who have spent much of their lives in politics.    

                Meacham rejects as mere political dissembling Jefferson’s often-expressed disdain for the world of politics, his well-documented lifelong desire to retreat to his “family, farm, and books” at Monticello (and in his later years, when Monticello became too crowded, his retreat home at Poplar Forest).  Meacham’s Jefferson is a “patriarch,” who wields “power” to achieve “control,” not only at Monticello but in the political communities of his state and nation.  Because of his distorted view of Jefferson’s basic character, Meacham is blind to the real reason why Jefferson acted contrary to his own desires and repeatedly came out of retirement – the “holy cause” (as Jefferson described it): “the holy cause of freedom.”  Freedom, or liberty: the opposite of “power,” in the political sense. 

                Meacham’s peculiar definition of power (“the bending of the world to one’s will, the remaking of reality in one’s own image”) profoundly influences his analysis.  Like other modern left-liberals, Meacham fails to distinguish political power – control exercised by government, through the coercive force of the laws – from private order, which is achieved by wise management of one’s own affairs and voluntary interactions with other people, for mutual benefit.  Jefferson certainly was the “patriarch” of Monticello, the head of his family (and the owner of several slave families) on his “little mountain” in central Virginia, where he meticulously planned everything in his house and gardens.  But he never envisioned himself as a patriarch in a political community – whether his local community in Albemarle County, his state (or “country,” as he called it), Virginia, or in the new nation he helped create, the United States of America.  He regarded public service as a kind of duty for someone of his class – a rather old-fashioned attitude grounded in the 18th-century world of “deference” politics in which he grew up – but (as I discuss more fully in my talk on his libertarian legacy, below) Jefferson was drawn to the world of politics, despite his personal aversion to it, because he was moved not only by the sense of duty but also by devotion to a cause.  That “holy cause,” as he repeatedly identified it, was the cause of “freedom.”  Rather than trying to control other people’s lives – to be a political patriarch, a legal paternalist – Jefferson devoted his public career to the struggle to free Americans from the tyranny of patriarchy or paternalism.  He sought to limit, not to expand, the use of governmental power, allowing individuals – his fellow Virginians, his fellow Americans – to exercise the freedom to order their own lives.   

                In discussing the political conflicts of the 1790s, Meacham overlooks the reason why Jefferson and his colleague, Madison, led their opposition party – why they called it the “Republican” Party and why Jefferson, first as secretary of state under Washington’s administration and later as vice president during Adams’s administration, took the stands that he did.  For example, Meacham discusses Jefferson’s opposition to Alexander Hamilton’s plan to charter the Bank of the United States as if they disagreed only on policy grounds (“Hamilton’s vision”).  Although he acknowledges that Jefferson’s opinion against the constitutionality of the bank bill was based on “an argument for strict construction,” Meacham does not take seriously Jefferson’s constitutional argument, maintaining that he was “not doctrinaire” and indeed labeling Jefferson “an improviser and a nationalist” (p. 250).  Thus Meacham ignores one of the key differences in constitutional interpretation between the Jeffersonian Republicans and the Federalists – the Jeffersonians’ insistence on a strict interpretation of federal powers according to the enumerated-powers scheme of the Constitution.  (If Jefferson gave Washington an excuse to sign the bill by further advising him to defer to the Congress if he was in doubt about the bill’s constitutionality, it wasn’t because Jefferson was “pragmatic,” as Meacham claims, but because Jefferson was equally concerned about Washington abusing his veto power.)   

                Meacham similarly gives scant attention to Jefferson’s constitutional arguments in the other major disputes of the 1790s: the conflict over Washington’s Neutrality Proclamation (Meacham fails to mention how Jefferson encouraged Madison to write his “Helvidius” essays in response to Hamilton’s “Pacificus”), and the opposition to the Alien and Sedition Acts of 1798. With regard to the latter, Meacham does discuss Jefferson’s Kentucky Resolutions, one of Jefferson’s most important public papers, but in doing so he distorts Jefferson’s arguments, claiming that he took contradictory positions as both a “nationalist” and a “nullifier.”  Meacham then makes the following extraordinary claim about Jefferson “acting in character,” a passage that shows how completely wrong Meacham is about Jefferson – and how far Meacham’s thesis about “power” has led him to distort Jefferson’s political (and constitutional) philosophy: 

    “He [Jefferson] was always in favor of whatever means would improve the chances of his cause of the hour.  When he was a member of the Confederation Congress, he wanted the Confederation Congress to be respected.  When he was a governor, he wanted strong gubernatorial powers.  Now that he disagreed with the federal government (though an officer of that government), he wanted the states to have the ability to exert control and bring about the end he favored.  He was not intellectually consistent, but a consistent theme did run through his politics and statecraft.  He would do what it took, within reason, to rearrange the world as he wanted it to be.”


    (pp. 318-19).  To fully understand how completely wrong Meacham is about Jefferson in this passage, one needs to read my book The Constitutional Thought of Thomas Jefferson, which shows how “intellectually consistent” Jefferson really was.  (It’s a shame that Meacham, who cited my book in his bibliography – as noted above – apparently did not read it or, if he did, he did not understand it.  Maybe he deliberately ignored it – as he does so many historical facts and historical interpretations – because it wouldn’t fit his specious thesis.) 

    Not surprisingly, Meacham also distorts Jefferson’s record as president, failing to understand the important ways in which Jefferson departed from the precedents set by his predecessors, Washington and Adams, by restoring the constitutional limits on the presidency.  Instead, again pushing his thesis to ridiculous extremes, Meacham generally claims that the story of Jefferson’s two terms as president “is one of a lifelong student of control and power bringing all of his virtues and vices to the largest possible stage” (p. 351).  Meacham discusses the informality that Jefferson brought to the office but suggests it resulted merely from a difference in style – not (as it really was) a deliberate effort to strip from the office the monarchical trappings that his predecessors gave it.  For example, Meacham overlooks the importance of Jefferson’s change in the mode of presenting his annual message to Congress – submitting a written message instead of delivering a “state of the union” address in person – just as Meacham also misses the important subjects of some of these messages (particularly the first, in December 1801, when Jefferson recommended that Congress authorize offensive actions in the Barbary War).  Regarding the Louisiana Purchase, Meacham commits the same error made by other commentators – going all the way back to Henry Adams’s history of Jefferson’s presidency – in dismissing the seriousness of Jefferson’s constitutional scruples and instead claiming that it shows Jefferson’s hypocrisy (as Meacham presents it, a contradiction between “the philosophical Jefferson” and “the political Jefferson”).  And with regard to the disastrous embargo – the enforcement of which Jefferson found excruciatingly embarrassing because it so contradicted his libertarian impulses – Meacham finds it “not out of character for Jefferson,” for “it put him in control” (p. 432).  He keeps hammering away at his thesis, again and again. 

    Last, but not least among the glaring defects of the book, Meacham accepts unquestioningly as proven historical fact – or, more precisely, as gospel truth – the Sally Hemings myth in its fullest form, the thesis that Jefferson not only fathered all the children of his slave Sally Hemings but that he and Sally had some sort of long-term relationship.  He writes, 

    “Jefferson maintained a decades-long liaison with Sally Hemings, his late wife’s enslaved half sister who tended to his personal quarters at Monticello.  They produced six children (four of whom lived) and gave rise to two centuries of speculations about the true nature of the affair.  Was it about love?  Power?  Both?  And if both, how much was affection, how much coercion?  Jefferson’s connection with Sally Hemings lasted from about 1787 to Jefferson’s death in 1826 – almost forty years.”


    For this remarkable assertion, Meacham cites primarily two sources: Annette Gordon-Reed’s two books (particularly her “monumental” second book The Hemingses of Monticello) and the biased Monticello (Thomas Jefferson Foundation) “Report of the Research Committee on Thomas Jefferson and Sally Hemings.”  In a lengthy endnote (on pp. 522-24), Meacham acknowledges that one member of the Monticello committee dissented and wrote a minority report.  (The dissenter, Dr. White McKenzie (Ken) Wallenborn, also has written a devastating critique of the committee’s bias and shoddy scholarship – “A Committee Insider’s Viewpoint,” published in The Jefferson – Hemings Myth: An American Travesty, edited by Eyler Robert Coates, Jr. (Charlottesville, Va., 2001) – but, not surprisingly, Meacham does not cite that critique nor even refer to Dr. Wallenborn by name.)  Meacham cites only two works “for a contrary view”: William G. Hyland Jr.’s book, In Defense of Thomas Jefferson, and (remarkably!) David Barton’s flawed book, The Jefferson Lies.  He does not cite the Scholars Commission report, or the published volume edited by Professor Robert F. Turner, The Thomas Jefferson – Sally Hemings Controversy: Report of the Scholars Commission (Carolina Academic Press, 2011), which (as I’ve discussed above) is the most thorough study of the Jefferson–Hemings paternity question.  Needless to say, Meacham does not even acknowledge that several distinguished scholars are skeptical of the paternity thesis; like so many other recent biographers of Jefferson, he considers the Sally Hemings story to be not a myth but a proven historical fact, accepted by a supposed near-unanimous consensus of scholars.     

    Why does Meacham so unquestioningly accept Professor Gordon-Reed’s thesis?  He refers to her as his “friend,” acknowledges an “incalculable” scholarly debt to her, and indeed relies almost entirely on her “masterwork” whenever discussing either Sally Hemings or the Hemings children.  But, in the lengthy endnote on pp. 522-24, Meacham frankly admits the real reason why he accepts the Sally Hemings myth – because it fits so well with his own thesis:  “In my view,” he writes, “there is convincing biographical evidence that Jefferson was a man of appetite who appreciated order, and that the ability to carry on a long-term liaison with his late wife’s enslaved half sister under circumstances he could largely control would have suited him” (p. 522).   

    Indeed, in the final analysis one is left with the impression that Meacham wrote his Jefferson book not only to advance his specious thesis about “power,” but also to help make the Sally Hemings myth seem more plausible.  And, of course, to make a shitload of money for himself and for Random House.  Anyone seriously interested in the real Thomas Jefferson should ignore this deceptive book. 

                Needless to say, therefore, I do not recommend Thomas Jefferson: The Art of Power.   Nor do I recommend any of the other books about Jefferson, particularly Jefferson biographies, published in the past twenty years or more.  (One exception – not a biography but a specialized study of Jefferson’s thought – is Jean M. Yarbrough’s American Virtues: Thomas Jefferson on the Character of a Free People (University Press of Kansas, 1998).  Although I do not fully agree with Professor Yarbrough’s thesis, her discussion of Jefferson’s concern with “character” – the concept as understood in the classical sense – is both a fresh take on Jefferson’s ideas and also an interpretation fairly faithful to those ideas themselves.  Yarbrough’s approach to Jefferson – like my own in The Constitutional Thought of Thomas Jefferson – is to take his ideas seriously, and to take them on his own terms, something that virtually all the recently published biographies have failed to do.) 

                What Jefferson book(s), then, would I recommend?  The definitive study of Jefferson’s life – still definitive and likely to remain so for the foreseeable future – is the monumental six-volume series by Dumas Malone, Jefferson and His Time, published over a 33-year period (1948–1981) by Little, Brown & Company:  Jefferson the Virginian (1948), Jefferson and the Rights of Man (1951), Jefferson and the Ordeal of Liberty (1962), Jefferson the President: First Term, 1801–1805 (1970), Jefferson the President: Second Term, 1805–1809 (1974), and The Sage of Monticello (1981).  Malone’s volumes are the indispensable reference books and the necessary starting points for any serious scholar of Jefferson and his life.  Reading Jefferson in his own words is also critically important; the best one-volume collection is Thomas Jefferson: Writings, published by the Library of America in 1984 (and edited by Merrill Peterson, who also edited the slightly more compact paperback compilation, The Portable Thomas Jefferson, first published by Viking Press in 1975 and reprinted by Penguin Books).  And, of course, for anyone who wants to understand Jefferson’s ideas about government, I recommend my own book, The Constitutional Thought of Thomas Jefferson (University Press of Virginia, 1994, paperback ed. 1995). 

                What about a one-volume biography of Jefferson?  (The question I’m most frequently asked about Jefferson – other than, unfortunately, the question about the paternity of Sally Hemings’ children – is whether there’s a good one-volume biography that I’d recommend.)  I recommend two.  Merrill D. Peterson’s Thomas Jefferson and the New Nation (Oxford University Press, 1970), at over 1000 pages, is a lengthy tome that fully explains Jefferson’s life in the context of his times; it is not merely a splendid biography of Jefferson but also a textbook of American history during “the age of Jefferson.”  Finally, the more compact one-volume biography that I still recommend as the best is Noble E. Cunningham, Jr., In Pursuit of Reason: The Life of Thomas Jefferson (Louisiana State University Press, 1987).  Professor Cunningham, who is also the author of one of the best books about Jefferson’s presidency (The Process of Government Under Jefferson), in his biography presents a nicely balanced account of Jefferson’s life, solidly researched and insightful, with just the right amount of detail.


                Now, without any further ado, here is the revised text of the talk I gave at the University of Virginia on April 13, 1993, in celebration of the 250th anniversary of Jefferson’s birth.  (It has been slightly updated, and footnotes have been omitted.)



    The Libertarian Legacy of Thomas Jefferson 


                Sadly, modern Americans seem to have done a better job preserving what Thomas Jefferson has left us in bricks and mortar than we have done in preserving his ideas. Tourists visiting Charlottesville, Virginia, can witness firsthand the ongoing efforts to preserve Jefferson's home at Monticello as well as his splendid little "Academical Village," the Lawn, which is still a vital center of student life at the University of Virginia.  Further down the road, near Lynchburg, Virginia, preservationists have been restoring Poplar Forest, Jefferson's retreat home (which in many respects is more revealing than Monticello of Jefferson both as an architect and as a homeowner).   

                Notwithstanding these efforts to maintain Jefferson's architectural legacy, however, scholars have been less successful in keeping alive his philosophy, particularly his ideas about government – despite the copious record he has left in his writings, both public papers and private letters.  Perhaps the problem is that modern scholars, living in the post-New Deal era of big government, find it difficult to comprehend a different kind of society, like that of the Founders' generation, where government intervened far less extensively in the day-to-day lives of individuals.   

                Another reason why modern Americans fail to understand fully Jefferson's ideas about government is that those ideas have been so often misinterpreted.  The 1993 celebrations of the 250th anniversary of Jefferson's birth generally championed his reputation as "father of American democracy."  For example, then-Chief Justice William Rehnquist, speaking at the University of Virginia, echoed the views of many Jefferson scholars that "the permanence of Jefferson resided not in his specific theories or acts of government, but in his democratic faith." 

                Although it is true that Jefferson was a leading proponent of representative democracy – Alexis de Tocqueville in his classic work, Democracy in America, called Jefferson "the most powerful advocate democracy has ever sent forth" – his devotion to democracy was not absolute or unqualified.  Indeed, Tocqueville thought it significant that Jefferson once warned James Madison that "the tyranny of the legislature" was "the danger most to be feared" in American government.  To Jefferson, democracy and its associated principles – majority rule, equal rights, direct representation of the people in government – were valuable, not as ends in themselves, but as essential means to a greater end, the maximization of individual freedom in civil society.  Liberty was Jefferson's highest value; he dedicated his life to what he once called "the holy cause of freedom."  

                What repeatedly drew him away from his tranquil domestic life at Monticello and back into the political fray was a higher value, "the holy cause of freedom," to which Jefferson felt duty-bound whenever he saw liberty threatened by a powerful central government, whether it was the British government under King George III or the United States government under Federalist administrations.  His passion for this cause was reflected in the very language that he used in his political writings.  Jefferson, the zealous defender of religious freedom, tended to use words such as holy, orthodox, or catholic when discussing political, not religious, principles; and he reserved words such as heretic or apostate to denounce politicians whom he regarded as the enemies of liberty.  He summed up his life's work in a letter he wrote relatively early in his public career, in 1790, soon after his return to the United States following his ambassadorship to France.  "[T]he ground of liberty is to be gained by inches, . . . we must be contented to secure what we can get from time to time, and eternally press forward for what is yet to get.  It takes time to persuade men to do even what is for their own good." 

                Jefferson's philosophy of government, accordingly, stressed the perpetual need to limit government's powers.  As he once wrote, "The natural progress of things is for liberty to yield and government to gain ground."  The notion that government was inevitably threatening to liberty was part of the radical Whig tradition in which Jefferson's early intellectual life was steeped.  Like John Locke, Algernon Sidney, and other lesser-known English radical Whig political philosophers, Jefferson understood, paradoxically, that it was government, which was created to "secure" individual rights, that posed the greatest danger to those rights through the abuse of its legitimate powers.  Hence Jefferson, like other good "Whigs" of his time – and like the classical liberals of the nineteenth century – was profoundly distrustful of concentrated political power and intensely devoted to the ideals of limited government and the rule of law. 

                To Jefferson, the significance of the American Revolution was the opportunity it gave the American people to create a republican form of government – that is, a government not only founded in theory upon the consent of the governed, but one that was continually responsible to the will of the people – "the only form of government which is not eternally at open or secret war with the rights of mankind," he maintained.  He understood the American constitutions, state and federal, to implement in practice the theory of government he so eloquently presented in his original draft of the Declaration of Independence, where he stated the "self-evident" truths that all men were created "equal & independent," that from that equal creation they derived "rights inherent & inalienable, among which are the preservation of life, & liberty & the pursuit of happiness," and that "to secure these ends, governments are instituted among men, deriving their just powers from the consent of the governed." 

                The creation of republican governments alone, however, was not sufficient to guard against abuses of power.  Jefferson also understood the value of such devices as written constitutions, the division and separation of powers, and the people's power to amend constitutions.  As I have shown in my book, The Constitutional Thought of Thomas Jefferson, the fundamental principle of his constitutionalism was most cogently expressed in a paragraph that appeared in his draft of the Kentucky Resolutions in 1798, where he wrote:  

    [C]onfidence is everywhere the parent of despotism – free government is founded in jealousy, and not in confidence; it is jealousy and not confidence which prescribes limited constitutions, to bind down those whom we are obliged to trust with power. . . . In questions of power, then, let no more be heard of confidence in man, but bind him down from mischief by the chains of the Constitution.  

    Zealously guarding liberty, Jefferson was suspicious of the use of governmental power and so cautioned "jealousy" and scrupulous adherence to "the chains of the Constitution," in order to "bind down those whom we are obliged to trust with power."  He feared that without the rule of higher law, the achievement of the American Revolution would be lost.  The governments in Europe "have divided their nations into two classes, wolves and sheep."  If the people of America once become "inattentive to the public affairs," he warned, "you and I, and Congress, and Assemblies, judges and governors shall become wolves.  It seems to be the law of our general nature, in spite of individual exceptions."  

                Like Thomas Paine, who in Common Sense had distinguished government and society, Jefferson understood that the realm of politics was quite limited; outside it, individuals should be free to fashion their lives as they saw fit, through voluntary social relationships.  The "essence of a republic," he wrote, was a system in which individuals "reserve to themselves personally the exercise of all rightful powers to which they are competent," delegating others to their "representatives, chosen immediately, and removable by themselves."  This "proximate choice and power of removal," he believed to be "the best security which experience has sanctioned for ensuring an honest conduct in the functionaries of society" – in other words, for preventing those in power from becoming "wolves." 

                What were those things to which individuals were "competent" to govern themselves?  Among them were natural rights.  The Declaration of Independence listed three such natural, or "inalienable," rights: life, liberty, and the pursuit of happiness.  Elsewhere in his writings Jefferson referred to others: expatriation, religious freedom, freedom of trade, even the right to hold property.  All these various rights might be understood as particular manifestations of one basic natural right, liberty, which Jefferson regarded as sacrosanct as life itself: as he wrote in his 1774 essay, A Summary View of the Rights of British America, "The god who gave us life, gave us liberty at the same time; the hand of force may destroy, but cannot disjoin them." 

                Jefferson regarded as a basic principle of good government the guarantee to all of the enjoyment of these rights.  In 1816, discussing the "rightful limits" of legislators' power, he maintained that "their true office is to declare and enforce only our natural rights and duties, and to take none of them from us": "No man has a natural right to commit aggression on the equal rights of another; and this is all from which the laws ought to restrain him; every man is under the natural duty of contributing to the necessities of society; and this is all the laws should enforce on him; and, no man having a natural right to be the judge between himself and another, it is his natural duty to submit to the umpirage of an impartial third."  He added, "when the laws have declared and enforced all this, they have fulfilled their functions, and the idea is quite unfounded, that on entering into society we give up any natural right."   Two years later, in a report which he prepared as chairman of the Commissioners for the University of Virginia, Jefferson included in his syllabus of the basic principles of government, "a sound spirit of legislation, which, banishing all arbitrary and unnecessary restraint on individual action, shall leave us free to do whatever does not violate the equal rights of others." 

                Fundamental to Jefferson's political philosophy, then, was the idea that no government could legitimately transgress natural rights.  In order for law to be binding, it must not only proceed from the will of properly authorized legislators, but it must also be "reasonable, that is, not violative of first principles, natural rights, and the dictates of the sense of justice."  In the final paragraph of his Virginia Statute for Religious Freedom, for example, Jefferson added a declaration that the rights therein asserted were "the natural rights of mankind," and that although the legislature which enacted the Bill had no constitutional power to restrain subsequent legislatures, any future act repealing it or narrowing its operation would be "an infringement of natural right."  And undoubtedly the institution of slavery was so troubling to Jefferson, throughout his life, because he realized that it violated the natural rights of an entire race of people. 

                To Jefferson, religion was a matter of conscience, a private matter that ought not concern government.  For that reason, he joined his friend and collaborator, James Madison, in calling for both a wide latitude for the free exercise of religious beliefs and a strict avoidance of government "establishment" of religion.  "The opinions of men are not the object of civil government, nor under its jurisdiction," his original text declared.  As he explained the purpose of the Statute in his Notes on the State of Virginia, "Our rulers can have authority over such natural rights only as we have submitted to them," noting that "the rights of conscience we never submitted, we could not submit" because men are answerable for them to God only.  "The legitimate powers of government extend to such acts only as are injurious to others.  But it does me no injury for my neighbour to say there are twenty gods, or no god.  It neither picks my pocket nor breaks my leg." 

                When Jefferson wrote to Madison late in 1787, expressing his great disappointment that the new federal Constitution included no explicit guarantee of rights, the first such right that he listed was freedom of religion.  He surely had in mind the kind of broad statement of "natural right" expressed in his Virginia bill.  Although the language finally adopted by Congress in proposing what would become part of the First Amendment – stating that "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof" – was far less explicit than the language of the Virginia statute, Jefferson interpreted it to be just as comprehensive a guarantee.  In other words, he understood the First Amendment freedom of religion clause, like the Virginia statute, to leave the formation of religious opinions solely to "the reason of man." 

                As president, Jefferson faithfully adhered to this principle and to his broad view of the rights guaranteed by the First Amendment.  He departed from the precedent set by his predecessors, Washington and Adams, by refusing to recommend or designate any day for national prayer, fasting, or thanksgiving.  As he explained his policy, in a letter made public early in his presidency, he noted that since Congress was prohibited by the First Amendment from acts respecting religion, and the president was authorized only to execute its acts, he had refrained from prescribing "even occasional performances of devotion."  In famous words, he declared that the First Amendment mandated a "wall of separation between Church and State."

                Collaborating again with James Madison in 1798, Jefferson opposed as unconstitutional the Sedition Act, which had made it a criminal offense to criticize either President John Adams or the Federalist-controlled Congress.  If Jefferson was – as some critics have charged, both in his time and today – less than fully libertarian in his defense of freedom of the press in the years that followed his election in 1800, it was because he was deeply troubled by what he perceived as the "licentiousness" of the press of his time.  During his presidency he expressed concern that his Federalist opponents were "pushing its [the press's] licentiousness and its lying to such a degree of prostitution as to deprive it of all credit."  This was, he had noted, "a dangerous state of things" because "even the least informed of the people have learnt that nothing in a newspaper is to be believed."  To another correspondent he bemoaned the fact that "nothing can now be believed which is seen in a newspaper.  Truth itself becomes suspicious by being put into that polluted vehicle." 

                Despite his belief in the efficacy of state laws against false and defamatory publications, it is important to note that, as president, Jefferson consistently followed a "hands-off" policy, as required by the First Amendment.  In his Second Inaugural Address, he explained his administration's policy as an "experiment" that had been "fairly and fully made" to determine "whether freedom of discussion, unaided by power, is not sufficient for the propagation and protection of truth."  The press, "confined to truth, needs no other legal restraint," he maintained.  "The public judgment will correct false reasonings and opinions, on a full hearing of all parties; and no other definite line can be drawn between the inestimable liberty of the press, and its demoralizing licentiousness.  If there be still improprieties which this rule would not restrain, its supplement must be sought in the censorship of public opinion."  The Second Inaugural, then, did more than reiterate Jefferson's steadfast denial of federal authority over freedom of the press: it revealed that, when pressed to draw a line between "the inestimable liberty" and the "demoralizing licentiousness" of the press, Jefferson came down on the libertarian side.  He would leave to the marketplace of ideas, and ultimately to "the censorship of public opinion," the restraint of falsehoods. 

                Jefferson took very seriously the "chains of the Constitution."  These included not only the enumeration of powers in the main text of the Constitution and the specific limitations on powers found in the Bill of Rights, but also two other devices to keep powers restrained by dividing them: federalism, which divided powers between the states and federal government; and the separation of powers, which divided federal powers among the three branches, legislative, executive, and judicial. 

                Federalism was, to Jefferson, the "true theory of our constitution"; and in a classic statement, made shortly before he was elected president, he described it thus:  

    The true theory of our Constitution is surely the wisest and best, that the States are independent as to everything within themselves, and united as to everything respecting foreign nations.  Let the general government be reduced to foreign concerns only, and let our affairs be disentangled from those of all other nations, except as to commerce, which the merchants will manage the better the more they are left free to manage for themselves, and our general government may be reduced to a very simple organization and a very unexpensive one – a few plain duties to be performed by a few servants.

    Under Jefferson's view, the whole field of government in the United States was divided into two departments, "domestic" and "foreign," each department having "distinct directories, coordinate and equally independent and supreme, in its own sphere of action."  To the state governments were reserved "all legislation and administration, in affairs which concern their citizens only"; to the federal government was given "whatever concerns foreigners, or the citizens of the other states."  The "foreign," or federal, sphere, moreover, was strictly limited to the few functions enumerated in the Constitution.  

                Nothing better illustrates Jefferson's strict interpretation of federal powers under the Constitution than his 1791 opinion on the constitutionality of a bill to establish the Bank of the United States.  Citing the language of the Tenth Amendment, that "all powers not delegated to the U.S. by the Constitution, not prohibited by it to the states, are reserved to the states or to the people," Jefferson considered this provision to be "the foundation of the Constitution."  It reiterated the general principle of federal powers expressed by the language of Article I: that the legislative powers of the federal government, vested in the Congress of the United States, were limited to those "herein granted" in the Constitution.  "To take a single step beyond the boundaries thus specifically drawn around the powers of Congress, is to take possession of a boundless field of power, no longer susceptible of any definition." 

                The rest of Jefferson's opinion shows what he regarded those "boundaries drawn about the powers of Congress" to be: they were the words of Article I, the enumerations of Congressional power, construed (as Jefferson would later put it) "according to the plain and ordinary meaning of its language, to the common intendment of the time and those who framed it."  (Thus Jefferson was one of the first “originalists,” who interpret the provisions of the Constitution – and particularly its power-granting clauses – according to their original meaning.)  "The incorporation of a bank, and other powers assumed by this bill, have not . . . been delegated to the U.S. by the Constitution," Jefferson concluded, arguing that they were neither "among the powers specially enumerated" nor "within either of the general phrases" of Article I, the "general welfare" and "necessary and proper" clauses.  He understood the "general welfare" phrase to be a statement of the purpose for which the specific power of laying taxes was to be exercised, not a grant to Congress of "a distinct and independent power to do any act they please, which might be for the good of the Union."  To interpret it as the latter, Jefferson observed, "would render all the preceding and subsequent enumerations of power completely useless" as it would, in effect, "reduce the whole instrument to a single phrase," of empowering Congress to do whatever it pleased.  Similarly, he took quite literally the word necessary in the "necessary and proper" clause.  The Constitution, he argued, restrained Congress "to the necessary means, that is to say, to those means without which the grant of the power would be nugatory"; otherwise, the "necessary and proper" clause also "would swallow up all the delegated powers, and reduce the whole to one phrase."            

                Jefferson's opinion on the constitutionality of the bank bill thus presented a theory of strict interpretation of the Constitution.  To say that Jefferson was a literalist or a strict constructionist, however, is insufficient to describe his theory of interpretation.  First and foremost, he followed what I call a “contextual” theory of interpretation:  it was the context of a particular constitutional provision within the overall purpose of the federal Constitution, and not the text of the provision alone, that mattered to Jefferson.  He indeed was a "strict constructionist" with regard to most of the powers granted Congress in Article I, section 8, especially where federal powers could preempt state law.  Nevertheless, he could interpret federal powers under the Constitution quite liberally in matters involving foreign affairs, which he regarded as an exclusive responsibility of the national government since the time of the Articles of Confederation.  (Hence, in his second term as president, he enforced one of the most draconian laws ever passed by Congress – at least prior to the Civil War – the Embargo, which curtailed virtually all foreign trade in a failed attempt to keep the United States out of the war between Britain and France.)  He also could be quite liberal in interpreting power-restraining or rights-guaranteeing provisions of the Constitution, as his interpretation of the First Amendment religion clause demonstrates. 

                Upon becoming president in 1801, Jefferson reiterated his ideal of a federal government limited to its legitimate powers assigned by the Constitution: a government reduced to "a few plain duties performed by a few servants."  His Inaugural Address declared his general support for the idea of "a wise and frugal government, which shall restrain men from injuring one another, [but] which shall leave them otherwise free to regulate their own pursuits of industry and improvement, and shall not take from the mouth of labor the bread it has earned."  More specifically, in his first annual message, in December 1801, he declared that it was his administration's policy "to reduce expenses to what is necessary for the useful purposes of government," and he described those concerns that he considered appropriate for the federal government.  "When we consider that this government is charged with the external and mutual relations only of these states; that the states themselves have principal care of our persons, our property, and our reputation, constituting the great field of human concerns, we may well doubt whether our organization is not too complicated, too expensive; whether offices and officers have not been multiplied unnecessarily, and sometimes injuriously to the service they were meant to promote." 

                Jefferson's administration pursued a policy of economy in government, drastically reducing the size of the federal payroll while simultaneously repealing all internal taxes, including Alexander Hamilton's hated excise on whiskey.  Abolition of internal taxes made possible the elimination of the internal revenue service employed to collect them; this resulted in a significant reduction in the Department of the Treasury, by far the largest of the executive departments.  Jefferson also recommended reductions in the army, the navy, and the diplomatic corps. 

                In addition to the repeal of internal taxes and drastic reductions in federal expenditures, Jefferson also endorsed enthusiastically the plan prepared by his treasury secretary, Albert Gallatin, to pay off the entire national debt – some $83 million – within sixteen years by annual appropriations of $7,300,000.  Believing it wrong for the present generation to saddle future generations with a huge national debt, Jefferson sought to establish the principle of "pay-as-you-go" in the federal budget.  During the eight years of Jefferson's administration the debt actually was reduced by almost a third; extraordinary expenses not foreseen at the beginning of his presidency – chiefly, the increased naval costs associated with the Barbary Wars and the $15 million Louisiana Purchase – forced the modification of Gallatin's plan.  Nevertheless, the plan to extinguish the debt was largely successful because of the large increase in revenue from import duties that accompanied the growth in American commerce during this period.  Indeed, the increased revenues actually created a surplus later in the administration, prompting Jefferson to recommend a constitutional amendment permitting expenditures for roads and other improvement projects, as noted below.  After his retirement from the presidency, Jefferson urged continued effort to pay off the debt by reducing federal expenditures, noting that increased public debt would bring increased taxation "and in its train wretchedness and oppression." 

                As president, Jefferson thus sought to accomplish the objective he had stated in his First Inaugural Address and reiterated elsewhere in his writings at the start of his presidency:  to restore the constitutional equilibrium between the states and federal government, by keeping the latter "a wise and frugal government" limited to its sphere.  Although later in his presidency he recommended that Congress appropriate money for such projects as roads, canals, river and harbor improvements, and a national university, Jefferson recognized that a constitutional amendment was necessary for Congress to do so because such purposes were not among the enumerated powers of the federal government.  Indeed, Jefferson's strict interpretation of the Constitution almost jeopardized the Louisiana Purchase; he gave up his efforts on behalf of a constitutional amendment permitting the Purchase only after his closest advisers urged that it would cause the French government to reconsider the deal. 

                Critics of Jefferson, both past and present, have cited the Louisiana Purchase as an example of Jefferson's failure, as president, to consistently adhere to his doctrine of strict interpretation of federal powers.  Rather than showing his hypocrisy, however, the entire episode of the Louisiana Purchase illustrates the seriousness of Jefferson's constitutional scruples.  Jefferson understood the importance of the Purchase: it secured New Orleans and control of the Mississippi and was therefore vital to the interests of the United States.  Although Albert Gallatin presented Jefferson with arguments supporting the constitutionality of the Purchase, Jefferson remained sufficiently troubled to draft a constitutional amendment explicitly making the Louisiana territory part of the United States.  No important adviser or supporter of Jefferson apparently urged either the necessity or the practicality of such a constitutional procedure, however.  Indeed, Jefferson's close friend Senator Wilson Cary Nicholas argued strongly against it, saying that a declaration from Jefferson that the treaty exceeded constitutional authority would lead to its rejection by the Senate or at least to the charge of his willful breach of the Constitution.   

                Jefferson's reply to Nicholas's letter, stating in particularly striking terms his lingering constitutional scruples, has been one of the most often quoted of Jefferson's writings on constitutional matters: 

    When an instrument admits two constructions, the one safe, the other dangerous, the one precise, the other indefinite, I prefer that which is safe & precise.  I had rather ask an enlargement of power from the nation where it is found necessary, than to assume it by a construction which would make our powers boundless.  Our peculiar security is in possession of a written Constitution.  Let us not make it a blank paper by construction.

    Conceding the likelihood that the framers' enumeration of powers was "defective" – for "this is the ordinary case of all human works" – he urged, "Let us go on then perfecting it, by adding by way of amendment to the constitution, those powers which time & trial show are still wanting."  In the present case, he concluded, it was "important . . . to set an example against broad construction by appealing for new power to the people." 

                When Jefferson finally dropped the matter and acquiesced in the Louisiana Purchase despite the lack of a constitutional amendment, he did so not because he had given up strict construction but because he was following his advisers' recommendation not to press the constitutional problem, realizing that it could jeopardize a treaty so vital to the nation's security.  "What is practicable must often control what is pure theory; and the habits of the governed determine in a great degree what is practicable," he noted.  Jefferson took solace in what he regarded as the "good sense" of the people, not to permit this one precedent to destroy the whole edifice of enumerated powers upon which constitutional limitations on the federal government rested.  Indeed, a common-sense resolution of his constitutional qualms was suggested by Thomas Paine, who reassured Jefferson that "the cession makes no alteration in the Constitution; it only extends the principles over a larger territory, and this certainly is within the morality of the Constitution, and not contrary to, nor beyond, the expression of intention of any of its articles."  If a new power had been added by construction to those powers assigned by the Constitution to the federal sphere, it was only the power to add to the domain of what Jefferson aptly called the "empire for liberty." 

                The fact that, despite these assurances, Jefferson remained troubled about his constitutional scruples – for years after his presidency – only underscores the degree of his scrupulous regard for the "chains of the Constitution."  Unable to square the acquisition of Louisiana and its incorporation into the Union with his theory of federal powers, Jefferson came to regard it as an extraordinary action of executive prerogative:  he, as president, going beyond the strict limits of the law, for the good of the country.  Even then, he still hoped for an "act of indemnity" by the nation, one that "will confirm & not weaken the Constitution, by more strongly marking out its lines."  The "act of indemnity" to which he referred was, of course, an amendment to the Constitution.  

                With regard to the proper allocation of federal powers, Jefferson took equally seriously the principle of separation of powers.  It is a mistake to try to label Jefferson's presidency as either "strong" or "weak."  Where the Constitution assigned powers exclusively to the president, Jefferson vigorously exercised them; where powers were assigned to or shared with other branches, however, Jefferson both preached and exercised restraint, quite strictly.  He regarded the president, like the federal government generally and all its other parts, as "bound by the chains of the Constitution." 

                Unlike modern presidents, who assert the power as commander in chief to send U.S. armed forces anywhere in the world without the consent of Congress, Jefferson as president was respectful of Congress's war power.  When U.S. Navy ships fought against pirates in the Mediterranean, Jefferson – recognizing that the Constitution gave Congress alone the power to declare war – ordered the Navy to engage in defensive actions only until Congress authorized offensive measures.  The position that he took publicly, in his message to Congress – which modern commentators consider one of the most restrictive interpretations of executive war powers ever uttered by an American president – showed that he wished the decision committing American naval forces to hostilities in the Mediterranean to be not a unilateral one, but one in which Congress shared. 

                Jefferson also held a quite narrow view of the executive power, strictly speaking.  On one occasion he wrote, "I am but a machine erected by the constitution for the performance of certain acts according to the laws of action laid down for me."  For example, as noted above, when Jefferson as president refused to designate a day of national prayer, fasting or thanksgiving, he explained his position by noting that Congress was prohibited by the First Amendment from acts respecting religion and that the president was authorized only to execute their acts.  Thus his view of executive power saw it limited in its exercise both by constitutional restraints and by law.   

                As president he sought to keep his constitutional distance from the Congress.  He could hardly have done otherwise without opening himself to charges of hypocrisy (by his enemies) or backsliding (from his friends and followers), for the Republicans in the 1790s had been sharply critical of what they perceived as Federalist attempts to institute an English monarchical and ministerial system.  Consequently, early in his administration, Jefferson declared that he would abandon "all those public forms and ceremonies which tended to familiarize the public idea to the harbingers of another form of government."  These included the annual speech to Congress, which to Jefferson was too reminiscent of the king's opening of Parliament.  In sending a written message rather than delivering it in person, he broke with the precedent that George Washington had set and started a tradition that lasted more than a century.  Not until Woodrow Wilson did presidents deliver their State of the Union addresses in person.  The modern spectacle – with both houses of Congress assembled in the House chamber to wait upon the president, whose presence is loudly announced and greeted with two separate standing ovations – no doubt would have appalled Jefferson. 

                In at least one area, however, Jefferson was a "strong" president: in his assertion of his equal power – equal with the other two branches of the federal government, particularly the Supreme Court (dominated at the time by Federalists) – to interpret the Constitution.  The constitutional theory that scholars have called Jefferson's "tripartite" doctrine was fully developed in Jefferson's mind by the time of his presidency.  He explained his doctrine in a letter written to Abigail Adams in 1804, defending his actions in discontinuing prosecutions and pardoning offenders under the Sedition Act: 

    You seem to think it devolved on the judges to decide on the validity of the sedition law.  But nothing in the constitution has given them a right to decide for the executive, more than to the Executive to decide for them.  Both magistracies are equally independent in the sphere of action assigned to them.  The judges, believing the law constitutional, had a right to pass a sentence of fine and imprisonment, because that power was placed in their hands by the constitution.  But the Executive, believing the law to be unconstitutional, was bound to remit the execution of it; because that power has been confided to him by the constitution.

    The Constitution, he concluded, "meant that its co-ordinate branches should be checks on each other" and that, accordingly, to give the judiciary the right to decide questions of constitutionality "not only for themselves in their own sphere of action, but for the legislative and executive also in their spheres, would make the judiciary a despotic branch."  (Thus Jefferson was also one of the earliest opponents of the courts’ abuse of their judicial review power – what is today criticized as “judicial activism.”) 

                Jefferson seemed not at all troubled by the fear of conflicts arising from the departments' divergent interpretations of the Constitution.  In part, this may have been due to the fact that, in Jefferson's day, for all practical purposes, the legislature and the executive continued to determine for themselves whether or not they were acting within the bounds of the Constitution.  If a truly difficult conflict arose between two or more branches, it could be resolved by the only truly ultimate arbiter of constitutional questions – the people, acting in their elective capacity.  By their periodic choosing of officers for two of the three departments of national government, the people, Jefferson believed, had an opportunity to "reintegrate" the Constitution, by demonstrating their approval or disapproval of those branches' interpretation of it. 

                Jefferson, though not an advocate of "frequent and untried changes in laws and constitutions," nevertheless denied that he was a man who looked at constitutions with "sanctimonious reverence . . . like the ark of the covenant, too sacred to be touched."  Accordingly, he favored revisions of laws and constitutions, as the needs arose.  His view was clearly distinct from that of Chief Justice John Marshall, who in his famous opinion in McCulloch v. Maryland argued that the Constitution was "intended to endure for ages to come" as a rationalization for the expansion of federal powers by judicial interpretation.  Jefferson, with his Whig heritage of distrust of law and government, looked to the people rather than to the courts when he thought of adapting the Constitution, or of determining the application of its provisions, to new circumstances.  Always suspicious of men in power, Jefferson was particularly reluctant to entrust so important a role as the interpretation of the federal Constitution to any one body of men – especially to a Supreme Court dominated, as it then was, by John Marshall.  Hence he preferred that constitutional difficulties remain unresolved, or that the mode of resolving them remain awkward and uncertain, rather than have mutual jealousies give way to confidence in the government at Washington. 

                In the early 1820s, during the Virginia campaign against the claim that the United States Supreme Court was the ultimate arbiter of constitutional questions, Jefferson again emphasized that the ultimate arbiter was the people themselves.  As he wrote one correspondent in 1820, "I know no safe depository of the ultimate powers of the society but the people themselves; and if we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion by education.  This is the true corrective of abuses of constitutional power." 

                The notion that the control by the people over their government, according to their own "wholesome discretion," informed by education, constituted the "true corrective" of abuses of power is distinctively Jeffersonian.  Indeed, the emphasis that Jefferson placed on popular participation and control – making the people themselves a vital element in constitutionalism – was the preeminent hallmark of Jefferson's constitutional thought.  None of his contemporaries, with perhaps the exception of John Taylor of Caroline (a fellow Virginian and a radical Jeffersonian Republican), quite so emphasized this element.  It would in fact underlie many of the other aspects of his constitutional thought.  As I have shown in my study of Jefferson's constitutional thought, both the pure theory of separation of powers as well as the theory of federalism that Jefferson espoused were ultimately derived from his thoroughgoing republicanism:  with each branch of the federal government, and with each state in the Union, determining constitutional questions, potentially in conflict with one another, some common ground was necessary; and that common ground – in effect, the glue that held Jefferson's constitutional system in place – was in fact the active participation of the people in constitutional questions. 

                This explains Jefferson's lifelong emphasis on the importance of education as well as his support for a system of public schools.  The purpose of his "Bill for the More General Diffusion of Knowledge," as he explained it in Notes on Virginia, was that of "rendering the people the safe, as they are the ultimate guardians of their own liberty."  "Every government degenerates when trusted to the rulers of the people alone.  The people themselves therefore are its only safe depositories.  And to render even them safe their minds must be improved to a certain degree."  Jefferson's Bill sought to do this by giving all citizens a basic schooling in reading, writing, and history.  The emphasis on historical education was quite deliberate, Jefferson explained:   

    History by apprising them of the past will enable them to judge of the future; it will avail them of the experience of other times and other nations; it will qualify them as judges of the actions and designs of men; it will enable them to know ambition under every disguise it may assume; and knowing it, to defeat its views.

    Beyond this basic schooling, the best students – the "natural aristocracy," determined by merit, or "genius" – would receive advanced training at the institution to which he devoted the final years of his life, the University of Virginia, where he hoped the "vestal flame" of republicanism would be kept alive. 

                In later years Jefferson coupled education with one other proposal, which he considered equally necessary to the preservation of republicanism:  his proposed system of local government by "little republics," or wards.  His proposal was to divide the counties into wards of such size that every citizen could attend, when called on, and act in person.  "What has destroyed liberty and the rights of man in every government which has ever existed under the sun?  The generalizing & concentrating all cares and powers into one body."  The "secret" of maintaining freedom, he suggested, was to make the individual alone "the depository of the powers respecting himself, so far as he is competent to them, and delegating only what is beyond his competence by a synthetical process, to higher & higher orders of functionaries, so as to trust fewer and fewer powers, in proportion as the trustees become more and more oligarchical."  The system of republics thus described would accomplish this and thereby itself become a vital element of constitutionalism.  "Where every man is a sharer in the direction of his ward-republic, or of some of the higher ones, and feels that he is a participator in the government of affairs, not merely at an election one day in the year, but every day; when there shall not be a man in the State who will not be a member of some one of its councils, great or small, he will let the heart be torn out of his body sooner than his power be wrested from him by a Caesar or a Bonaparte," he also observed.  

                Jefferson thus envisioned, as a vital element of constitutionalism – indeed, as the most effective check on the abuse of governmental power – the active involvement of citizens in the government itself.  An educated, actively involved citizenry would be simultaneously self-reliant, managing directly those affairs to which individuals were alone competent, and vigilant, keeping a close watch over their elected officials to whom they had entrusted all other affairs, and making certain that those officials did not turn into "wolves" (and that the people themselves not turn into “sheep”). 

                Jefferson's proposed ward system also gives added meaning to his support for the principle of "rotation in office" – what today is called “term limits.”  As is evident in the modern debate over various proposals to limit the terms of state legislators and members of Congress, one of the goals of rotation in office is to increase the level of popular participation in government by mandating turnover.  Thus, as the proponents of term limitations argue, the virtual monopoly that incumbent, professional politicians hold on some offices may be broken, and the way created for a return to the "citizen-politician" model of the 19th century.  The appeal of term limitations to modern-day Jeffersonians is exactly the same as its appeal to Jefferson himself: it enhances the possibility that each citizen may become, in his words, "a participator in the government of affairs, not merely at an election one day in the year, but every day." 

                A full understanding of Jefferson's ideas regarding constitutional change – and indeed, of his constitutional thought generally – must take into account Jefferson's dual emphasis on education and participation.  The essentially negative view of politics that Jefferson held thus ultimately influenced his constitutional thought in a profound way. 

                Jefferson regarded as truly modest the achievements of his generation, believing that subsequent generations, learning from additional experience, would improve on the Founders' handiwork, with the problem of maintaining a free government becoming far simpler as subsequent generations hit upon better and better solutions.  Hence he recommended that every generation create anew their constitutions – a recommendation that reveals both his assumptions that constitution-making was a relatively simple matter and that the people, as a whole, were fully competent to the task.  Although a preeminent member of what Dumas Malone has called the "great generation," Jefferson disclaimed its greatness.  

                Throughout his life Jefferson deliberately downplayed his public service.  For example, in 1800 he drafted a list of his services that emphasized his role in introducing olive trees and upland rice into South Carolina, noting that "the greatest service which can be rendered any country is, to add a useful plant to its culture."      

                Perhaps Jefferson's greatest political legacy, therefore, is the extent to which he devalued politics.  During nearly half a century of public service, Jefferson held many high political offices: third President of the United States, Vice-President of the United States, Secretary of State, U.S. Ambassador to France, Member of Congress, Governor of Virginia.  Nevertheless, he asked to be remembered in his epitaph for only three accomplishments:  author of the Declaration of Independence, author of the Virginia Statute for Religious Freedom, and father of the University of Virginia.  Liberty and knowledge, not political power, were his highest values. 

                The author of the Declaration of Independence died on July 4, 1826, the fiftieth anniversary of the adoption of the Declaration, the date Americans have chosen for the celebration of the nation's birthday.  Like his fellow Patriot of '76, John Adams, who also died on that momentous day, Jefferson was fully aware of the symbolism; his final words, reportedly, were, "Is it the Fourth?"  Significantly, he wrote in his last letter of the libertarian meaning of American independence: "May it be to the world, what I believe it will be, (to some parts sooner, to others later, but finally to all,) the signal of arousing men to burst the chains under which monkish ignorance and superstition had persuaded them to bind themselves, and to assume the blessings and security of self-government." 


     | Link to this Entry | Posted Thursday, April 11, 2013.  Copyright © David N. Mayer.

    Spring Briefs 2013 - March 23, 2013



    Spring Briefs 2013


     It’s that time of year again – time for another series of “Spring Briefs” on MayerBlog.  As the spring weather heats up (throughout the USA except, apparently, here in Ohio), so too do controversies in the worlds of politics and popular culture.  Here are my comments on some current developments:



    n The Relevance of the Constitution

                 Maybe it’s just a coincidence that as B.O. begins his second term in the White House, a number of scholars and political commentators – mostly on the left side of the political spectrum – have been echoing the Occupier-in-Chief’s disdain for the U.S. Constitution.  Among them is Georgetown University law professor Louis Michael Seidman, who wrote a provocative op-ed piece in the New York Times at the end of 2012 entitled “Let’s Give Up on the Constitution.”  Professor Seidman, who (amazingly) teaches constitutional law at Georgetown, argues for the wholesale abandonment of the U.S. Constitution, decrying “all its archaic, idiosyncratic and downright evil provisions.”  In an interview on CBS TV, Professor Seidman explained: 

    “This is our country.  We live in it, and we have a right to the kind of country we want.  We would not allow the French or the United Nations to rule us, and neither should we allow people who died over two centuries ago and knew nothing of our country as it exists today.  If we are to take back our own country, we have to start making decisions for ourselves, and stop deferring to an ancient and outdated document.”


    He added that he didn’t mean we should abandon the whole document – just that we should not let it get in the way of what we want to do. 

    The question that should have been asked of Professor Seidman by that CBS interviewer is: “Who is the ‘we’ you’re talking about?”  “We the People of the United States,” as the Constitution’s preamble states?  Surely, though, the professor ought to understand that Americans are not all of one mind; historically, as a “free” society that regards freedom of thought and expression as one of our most cherished freedoms, we are a people who have profound disagreements about what’s best for us.  It’s to protect each individual from being harmed by other people’s violations of his rights that we form governments in the first place.  But – a fundamental fact about government that every law professor ought to understand – government is dangerous because it is the only entity in society entitled legitimately to use force, or coercion, to accomplish its purposes.  So we create written constitutions, to establish and enforce rules that limit government in the exercise of its powers.  As a law professor, Seidman also ought to understand why the text of all written instruments, including (and especially) constitutions, matters:  the text memorializes the powers that the “sovereign” people of the United States have given to their national government – including the limits imposed on those powers and (the flip side) guarantees of the rights that government cannot abridge – all in order to control its exercise of power, to prevent it from destroying the very rights it was created to protect.  Because the Constitution has a formal mechanism for amending its provisions, the text of the document today doesn’t memorialize the “dead hand of the past”; it reaffirms the relevance of those words to Americans today, who by choosing not to exercise their sovereign power to change the words, are in effect reaffirming them.  (This is the sense – the only sense – in which the theory argued by left-liberal constitutionalists, of a “living Constitution,” is true.)  

    The “we” to whom Professor Seidman referred in his interview, if he would honestly acknowledge it, are the left-liberal elites who share his view of government – people who want to transform the United States from a free nation based on individual rights into a collectivist, paternalistic “people’s state,” like the welfare states of Europe.  Thankfully (I hope) he speaks for a small minority of Americans.  Who needs the Constitution?  We (meaning all Americans) do – to protect us from the likes of Professor Seidman.


    n “Second-Raters” for a Second Term 

                Former V.P. Dick Cheney has described B.O.’s second-term Cabinet picks as, collectively, a bunch of “second-rate people.”   He’s actually giving them too much credit.  B.O.’s first-term Cabinet was a bunch of mediocrities, and his picks for replacement of departing Cabinet members in his second term are even worse – not “second-raters,” but third-, or fourth-, or fifth- (and so on) raters.  They’re just a bunch of “yes”-men and -women, people who have risen far above their levels of competence (according to the “Peter Principle”), the kind of people our second-rate, narcissistic president surrounds himself with, just to feel good about himself. 

    The following sentence states a general truth that applies to virtually every second-term nominee: “__________, B.O.’s nominee to replace __________ as __________, is even less competent than the person he/she is replacing.”  It works for John Kerry (replacing Hillary Clinton as Secretary of State), Jack Lew (replacing Tim Geithner as Secretary of Treasury), Chuck Hagel (replacing Leon Panetta as Secretary of Defense), John Brennan (replacing David Petraeus as CIA director), and so on.   

    I’ve already written (in “2013: Prospects for Liberty, Part I” (Jan. 17)) about the top three named picks.  All three are further to the left than the person they replace, confirming that B.O. is moving even further to the left in his second term.  Hagel is nominally a Republican – but is really a RINO (Republican in Name Only), because his foreign-policy views are far to the left of most Republicans in the Senate.  Among other things, he favors massive Pentagon down-sizing, seems to be rabidly anti-Israel, and is soft on Iran.  Kerry is a left-wing Democrat, former U.S. Senator from Massachusetts, and is further to the left than his predecessor, Hillary Clinton.  As described by conservative commentator Geoffrey P. Hunt in American Thinker, Kerry is a “speciously decorated” Vietnam veteran, “disgracing his service and his uniform with dubious testimony before the Senate Foreign Relations Committee in 1971 [comparing the U.S. military with “Genghis Khan”], now the champion for the Afghan quagmire, and third in succession to the White House.”  Moreover, “Kerry’s top treaty-making priority is defeating global warming” (“Second-Rate Appointments from a Third-Rate President,” February 13).  And Lew, the former White House budget director and chief of staff, was summarized by Lawrence Kudlow as someone who “has no qualifications, standing or experience in the financial world or international sphere,” as well as someone who’s to the left of his predecessor, Tim Geithner.  (Geithner was bad enough, but at least he had some Wall Street experience; Lew has virtually none, having worked only for a three-year stint at Citigroup, from which he took home a $1 million "bonus" shortly after his bank received $45 billion in TARP funds, just before he joined the White House in 2009.)  
Lew is, in Kudlow’s words, “a left-liberal Obama spear-carrier, whose very appointment signals a sharp confrontation with the Republican House over key issues such as the debt ceiling, the spending sequester, next year’s budgets and taxes.”  As treasury secretary, a leftist like Lew will push for “trillions of dollars in new tax hikes, absolutely minimal spending restraint, and no serious entitlement reform.”   

    Two other recent picks for B.O.’s second term deserve special mention:  John Brennan, to replace David Petraeus as CIA director, who was only recently confirmed (following Senator Rand Paul’s filibuster, discussed below); and Thomas Perez, to replace Hilda Solis as Secretary of Labor.  Both also fit the profile of being further to the left than their predecessors.  Brennan, who was B.O.’s chief counter-terrorism advisor, had served as deputy CIA director during the Bush administration, when he helped run the post-9/11 terrorist rendition program, which he called “an absolutely vital tool.”  Then, when it became politically unfashionable in 2005, he declared that enhanced interrogation “goes beyond the bounds of what a civilized society should employ”; yet he supports the use of unmanned drones to kill people targeted as terrorists by the government.  Thomas Perez has been described by the editors of Investor’s Business Daily as “so radical he makes the last labor secretary, Hilda Solis, look like Ayn Rand.”  As Assistant U.S. Attorney General, Perez has run the Department of Justice’s troubled civil rights division since 2009.  Among other actions, he dropped the voter intimidation case against the Black Panthers; sued Florida to try to stop the state from purging its voter rolls of 182,000 non-citizens; sued municipalities to force them to scrap written tests for police and firefighters, replacing them with “affirmative action,” race-based hiring; and brought a civil rights harassment suit against Arizona Sheriff Joe Arpaio, whose only “crime” was trying to enforce existing U.S. immigration law.  As the editors conclude, “These acts show the work not of a dedicated public servant but a rabid activist who sees advancing the leftist agenda as the goal, and the law as the obstacle” (“Thomas Perez: Unfit for Office,” March 15). 

    One possible exception to the general rule is the director of the Environmental Protection Agency (EPA).  B.O.’s appointee during his first term, Lisa Jackson, was not only a radical environmentalist but also was politically corrupt.  She resigned from office after having been investigated by the EPA’s inspector general for improperly using a bogus secondary e-mail account for official business.  B.O.’s pick for Ms. Jackson’s successor, Gina McCarthy, is just another radical environmentalist.  She headed the agency’s “clean air” efforts during B.O.’s first term, and as EPA director is expected to push B.O.’s radical anti-carbon agenda (including efforts to reduce carbon dioxide emissions – despite the fact that CO2 isn’t really a pollutant).  She’s also a radical leftist but, as far as I know, is honest.


     n Droning On – To a Point 

                Senator Rand Paul (R.–Ky.) recently earned the praise of a broad array of Americans – not only conservatives and libertarians but also many left-liberals concerned about civil liberties – by doing something that his Senate Republican colleagues usually are roundly criticized for:  filibustering.  In Senator Paul’s case, however, the filibuster – an old-fashioned filibuster in which Paul, with “a little help from his friends” (not just Republicans but also Democrat Senator Ron Wyden), spoke continuously for nearly 13 hours on March 6-7 – was a success, in forcing the B.O. regime to make a commitment to follow the rule of law.  It was a reminder to all Americans why the Senate’s peculiar rule permitting unlimited debate – the filibuster rule, which as practiced in recent years amounts to a super-majority rule requiring 60 votes to end debate – actually does help protect Americans’ constitutional rights, by making the Senate a more “deliberative” (and deliberate) body that, in theory, ought not to pass legislation – or exercise the Senate’s other unique powers, to confirm presidential appointments or to ratify treaties – in haste.    

    Senator Paul filibustered against the nomination of John Brennan as CIA director, in order to call attention to the B.O. regime’s use of unmanned drone aircraft to do “targeted killings” of suspected terrorists.  Paul’s effort started out as a simple question he posed to Attorney General Eric Holder:  Does B.O. have “the power to authorize lethal force, such as a drone strike, against a U.S. citizen on U.S. soil, and without trial?”  The plain, simple answer ought to have been “No,” since such action clearly violates the U.S. Constitution’s Fifth Amendment due-process protection.  But instead Mr. Holder gave a rather long, convoluted answer filled with legalese giving the government “wiggle room”:  “It is possible, I suppose,” wrote Holder, “to imagine an extraordinary circumstance in which it would be necessary and appropriate under the Constitution and applicable laws of the United States for the President to authorize the military to use lethal force within the territory of the United States.”  (One such “extraordinary circumstance” might be another Civil War, as Charles Krauthammer has observed in a recent column:  U.S. citizens, members of the armed forces of the seceded Confederate States of America, engaged in an actual armed rebellion against the U.S. government on American soil.  That’s the only valid example I can think of.) 

    Unsatisfied by Holder’s response, Paul – looking a lot like Jefferson Smith, the idealistic senator played by James Stewart in the classic movie Mr. Smith Goes to Washington – did his old-fashioned, 13-hour filibuster, assisted by a few Republican colleagues plus one Democrat, Senator Wyden (as noted above), which made the effort officially “bipartisan.”  (Just as he had done in voting to confirm Chuck Hagel as defense secretary – a controversial vote, for which Paul took much heat from conservatives – Paul was underscoring that his opposition wasn’t merely partisan but was principled.)  What was the point of his filibuster?  In the senator’s own words, it was to make this simple yet important point: “No American should ever be killed in their house without warrant and some kind of aggressive behavior by them. . . . I will not sit quietly and let [B.O.] shred the Constitution.  No person will be deprived of life, liberty, or property without due process.” 

    Paul was forced to stop speaking, after about 13 hours, because of a “call of nature,” but he subsequently declared the effort a success.  On March 7, Attorney General Holder wrote Paul:  “It has come to my attention that you have now asked an additional question: `Does the President have the authority to use a weaponized drone to kill an American not engaged in combat on American soil?’ The answer to that question is no.”  So Paul finally succeeded in getting a definite “no” from the B.O. regime to at least one version of the basic question he was asking.  (The Senate then confirmed Brennan, by a 63-34 vote.  Most of the 34 votes against Brennan were from Republicans, joined by two Democrats, Patrick Leahy of Vermont and Jeff Merkley of Oregon, plus socialist “independent” Bernie Sanders of Vermont.  Interestingly, two senators who helped Paul filibuster – Republican Marco Rubio and Democrat Ron Wyden – joined the 63-senator majority to confirm.  Only two of Paul’s Senate colleagues openly criticized his effort – two Republicans (or maybe RINOs would be a better term), John McCain and Lindsey Graham, who called Paul’s rhetoric “alarmist,” claiming that B.O. wouldn’t kill anyone with a drone.  Maybe McCain and Graham are getting soft on B.O.; maybe they were just jealous of Paul’s popularity.) 

    Declaring Senator Paul’s March 6 filibuster to be a political, as well as legal, success – saying it makes him “probably now the 2016 front-runner for president” – Jonathan Moseley, writing in American Thinker, enumerated several reasons why “Rand Paul Shifts [the] Political Orbit” (March 8).  Moseley explains how Paul’s filibuster serves as a “teachable moment,” a model for Republicans to take similar stands in favor of limited-government principles, and against the B.O. regime, and to get popular support for their doing so. 

    Remarkably, among the political commentators praising Paul’s filibuster was left-liberal Washington Post columnist Eugene Robinson, writing the “On the Left” column in Investor’s Business Daily.  Robinson generally regards the Republican senator from Kentucky as (in his words) “an archconservative kook” (which is how many leftists see libertarians like Paul), yet he began his op-ed by writing, “Rand Paul was right.”  Praising Paul for calling attention to American citizens’ guarantee of due process rights, he added, “I cannot argue with the basic point Paul was making:  There must be greater clarity about how and where our government believes it has the authority to use drones as instruments of assassination, especially when U.S. citizens are in the cross hairs.”  He rhetorically asked, 

    “Imagine that drone technology had existed at the time of the 1995 Oklahoma City bombing.  Imagine the government somehow got wind of Timothy McVeigh’s plans in advance and tracked him to the compound where he and Terry Nichols were building their bomb.  Should the president have had the power to order a drone-fired missile strike, killing McVeigh, Nichols and whoever else might be in the vicinity?”


    Robinson is satisfied that Holder’s reply to Paul “closes that door.”  (I have less confidence in the B.O. regime, particularly Holder – who, like his chief, is an unabashed liar and hypocrite.)  Yet, as he concedes, even harder questions remain about the use of drones in other countries.  “The way we use drones as killing machines has to be consistent with our freedoms and our values.  For grabbing us by the lapels, Rand Paul deserves praise,” concludes Robinson – adding “Yikes, I said it again” (“Sen. Paul Was Justified in Droning On,” I.B.D., March 8).  I feel the same way about approvingly quoting Mr. Robinson!


    n National Community Organizer-in-Chief

                   In his infamous “open mic” comments to the Russian president a year ago – promising he’d be more “flexible” on missile defense in his second term – B.O. said that 2012 was his last political campaign.  Why then is his campaign organization still raising and spending money?  Could it be because after four years in office, B.O. still hasn’t learned how to govern (to exercise the executive powers – and duties – of his office, under the Constitution) and remains stuck in permanent “campaign mode”?  Or maybe he cannot function except as the radical leftist “community organizer” he was during his Chicago years? 

                   B.O.’s reelection campaign group, euphemistically called “Organizing for America,” has been converted into a tax-exempt nonprofit group, renamed “Organizing for Action” (OFA), supposedly to push his second-term legislative agenda.  OFA’s leaders – B.O.’s former White House and campaign aides – say they plan to raise about $50 million to build what the group says will be a “grass-roots” effort to rally public and congressional support for such issues as “climate change,” gun control, and immigration reform.  Jim Messina, who ran B.O.’s reelection campaign and now chairs OFA, claims that more than one million people have undertaken at least one “volunteer action” since the group started in late January (“Leaders of Obama advocacy group defend plan,” USA Today, March 14). 

    The acronym OFA is apt, because it’s truly an “ofa” idea.  (In case you don’t get the pun, “ofa” is how one pronounces “awful” if one’s mouth is full of shit.)  “Organizing for Action” should be renamed “Organizing for Access” because that’s what it provides:  anyone who writes a check for $500,000 sits on OFA’s “national advisory board” and gets to have face-to-face meetings with B.O. four times a year.  As the Investor’s Business Daily editors observe, “That’s a lot of access from a president who refuses to answer questions from the press about his actions and rarely ever holds press conferences.”  It also shows the blatant hypocrisy in B.O.’s claim, during his 2008 campaign, that he’d never take money from lobbyists.  As the editors note, “Cash for access has pretty much been a hallmark of the Obama years and it’s just getting worse. . . . He’s made his White House a revolving door of jobs for lobbyists.  And then there are the crony capitalists,” such as Solyndra, the failed “green energy” company that got $500 million in U.S. taxpayer funds, and the Southern Company, which seems to have bought an $8 billion federal loan guarantee for a plant in exchange for a donation to the B.O. Inaugural Committee.  Even Bob Edgar, president of the leftist Common Cause, says “It just smells” (“Organizing for Access,” February 27). 

    Modern presidents of both major political parties, since the time of Teddy Roosevelt, have been abusing the power of their office by transforming it into what TR called a “bully pulpit,” as I discussed in my essay last month on “The Unconstitutional Presidency.”  But with B.O. in the White House, the bully pulpit has been further transformed – into the bullshit pulpit.  Or maybe just the shit pulpit.


    n Sequester, Shmequester

                 What’s all the fuss about “draconian” federal government spending “cuts”?  Allegedly they result from the so-called “sequester” – or sequestration, the across-the-board federal budget cuts mandated by the August 2011 debt-ceiling deal – which began to be implemented on March 1.  The automatic “cuts” in spending total $1.2 trillion over 10 years, half from domestic (discretionary) programs and half from defense.  For this year, however, the “cuts” total just $85 billion – which is truly minuscule compared to a nearly $3.6 trillion budget.  Moreover, because the $85 billion is actually budget authority, not budget outlays, the actual reduction in budget outlays for this year is only $44 billion, or one quarter of 1 percent of GDP (GDP is $15.8 trillion) or only 1.25% of the $3.6 trillion government budget.  And the “cuts” aren’t really cuts at all; rather, they’re reductions in the automatic increases in spending that are built into the federal budget thanks to a gimmick called “baseline budgeting.”  With sequestration, federal spending as a share of GDP will still be 22.2%, well above the post-World War II average of less than 20%.  As the editors of Investor’s Business Daily have noted, the reductions don’t even get spending back to pre-B.O. levels.  “In fact, under the sequester the government will spend about $60 billion more than it did in 2008 just on what are called domestic discretionary programs – like education, law enforcement, highways and the environment.  That’s an increase of more than 10%.  And that 2008 spending level itself was hugely inflated – the result of a 60% increase in the domestic discretionary budget over the previous eight years” (“Automatic Cuts Are No Tragedy,” February 12).  I.B.D. cartoonist Michael Ramirez has drawn a “pie chart” which graphically depicts the sequester cuts for what they are – mere crumbs – compared to total federal spending.  
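The percentages above are easy to verify.  A minimal sketch (using only the round figures quoted in the paragraph – the $44 billion outlay reduction, the roughly $3.6 trillion budget, and roughly $15.8 trillion GDP) reproduces the arithmetic:

```python
# Checking the sequester arithmetic with the round figures quoted above.
# All dollar amounts are in billions.
outlay_cut = 44      # actual reduction in this year's budget outlays
budget = 3_600       # total federal budget, ~$3.6 trillion
gdp = 15_800         # gross domestic product, ~$15.8 trillion

share_of_gdp = outlay_cut / gdp * 100        # ~0.28% -- "one quarter of 1 percent"
share_of_budget = outlay_cut / budget * 100  # ~1.22% of total federal spending

print(f"{share_of_gdp:.2f}% of GDP, {share_of_budget:.2f}% of the budget")
```

Note that $44 billion against a $3.6 trillion budget works out to about 1.22%, which the text rounds to 1.25%; either way, the point stands – the cut is a sliver of total spending.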

    Notwithstanding the sequester’s relatively minuscule impact on federal spending, B.O. and his regime – along with the Democrats in Congress and their lapdogs in the news media – have been engaging in ridiculous hyperbole, giving apocalyptic warnings about the impact of the so-called “cuts.”  In a series of campaign-style events, B.O. warned that the budget cuts will “gut critical investments,” “weaken America’s economic recovery,” and “weaken our military readiness.”  He said the sequester’s “meat-cleaver approach” of “severe,” “arbitrary,” and “brutal” cuts will “eviscerate” education, energy, and medical-research spending.  And he added that tens of thousands of parents will scramble to find child care, and hundreds of thousands will “lose access to primary care and preventative care.”  His education secretary, Arne Duncan, went on Sunday TV talk shows to suggest that 40,000 teachers would lose their jobs; his transportation secretary, Ray LaHood, and his homeland security secretary, Janet Incompetano, claimed air travelers would suffer huge delays.  But the prize for hyperbole must go to Rep. Maxine Waters (D.–Calif.), who on March 1 claimed that “over 170 million jobs” could be lost due to sequestration.  (As noted below, that would wipe out the entire U.S. workforce – and then some.)  Later, Rep. Waters corrected herself, saying that it was “750,000 jobs” that could be lost. 

                B.O. and his regime sound like “Chicken Little” in the famous children’s story.  Or maybe (better yet) “the boy who cried `wolf’!”  Either way, it seems that B.O.’s regime is trying to manipulate the sequester cuts for their partisan advantage – in an effort to rouse public opinion against the sequester and blame it on Republicans.  But the strategy doesn’t seem to be working.  The American people generally seem to accept sequestration – public opinion polls show a majority of Americans support reductions in government spending – and they’re not so foolish as to believe B.O.’s lie that sequestration is all the Republicans’ fault.  The sequester indeed did originate in the B.O. White House – specifically, the automatic cuts were the idea of then-Budget Director Jack Lew (now B.O.’s new Secretary of the Treasury) and White House Legislative Affairs Director Rob Nabors, as Bob Woodward has revealed.  (See the discussion of “Woodward-gate,” below.)   The White House intended to make the sequester a “poison pill” that Congressional Republicans would never accept – that’s why half the cuts were to come from national defense, which they assumed Republicans couldn’t stomach – but that scheme failed.  It seems that most Republicans in Congress care more about cutting spending and reducing the deficit than they do about preserving current levels of defense spending.  (Maybe that’s because GOP leadership has shifted from neocons like John McCain to real limited-government conservatives or libertarians like Rand Paul.)  With neither the Republicans in Congress nor the American people terribly upset about the sequester, B.O. and his minions indeed do seem like a bunch of either “Chicken Littles” or boys who cried “wolf”!   

                One telling example of how the B.O. regime has botched its attempt to politically manipulate the sequester has been the unpopular decision to end public tours at the White House.  In a recent interview, B.O. denied any responsibility for the decision – Harry Truman famously had a sign on his Oval Office desk reading “The buck stops here,” but it seems B.O.’s motto ought to be “The buck-passing starts here” – as the White House claimed the decision was made by the Secret Service, which employs over 30 agents to provide security during the tours, at a cost of $74,000 a week.  Why eliminate the White House tours, unless the decision was purposely made to maximize public outrage?  Yet as the public learns more facts about the Secret Service and White House budgets, people aren’t blaming Congressional Republicans (who shrewdly managed to ensure the sequester wouldn’t interfere with public tours of the U.S. Capitol); rather, the blame has been falling squarely on B.O. and his regime.  Instead of canceling White House tours, they could have saved much more money by curtailing B.O.’s traveling:  operating Air Force One alone costs almost $180,000 an hour; eliminating just one of B.O.’s golf trips – such as the recent round he played with Tiger Woods in Florida, at a cost to U.S. taxpayers of over $1 million – would save enough money to pay for three months’ worth of White House tours (Michael Ramirez, “White House Tours Canceled, but Golf with Tiger Is a Must,” I.B.D., March 12). 

                What’s really going on here?  In a recent column, Charles Krauthammer quoted a leading Democrat lobbyist who told the Washington Post that “the worst-case scenario for us” would be “the sequester hits and nothing bad really happens.”  Is sequestration “dumb”?  Sure, across-the-board cuts are dumb because we need to discriminate, to set priorities; that’s why Congress is supposed to adopt a budget.  “Except that the Democratic Senate hasn’t passed one in four years.  And the White House, which proposed the sequester in the first place, had 18 months to set rational priorities among accounts – and did nothing.  When the GOP House passed an alternative that cut where the real money is – entitlement spending – [B.O.] threatened a veto.  Meaning, he would have insisted that the sequester go into effect – the very same sequester he now tells us will bring on Armageddon.  Good grief.”  Krauthammer then provides the most succinct explanation of sequester politics that I’ve read: 

    “The entire sequester would have reduced last year’s deficit from $1.33 trillion to $1.24 trillion.  A fraction of a fraction.  Nevertheless, insists [B.O.], such a cut is intolerable.  It has to be `balanced’ – i.e., largely replaced – by yet more taxes.  Which demonstrates that, for [B.O.], this is not about deficit reduction, which interests him not at all.  The purpose is purely political: to complete his Election Day victory by breaking the Republican opposition.  . . . In the past two years, House Republicans stopped cold [B.O.’s] left-liberal agenda.  Break them now and the road is open to resume enactment of the expansive, entitlement-state liberalism that [B.O.] proclaimed in his second inaugural address.  But he can’t win if `nothing bad really happens.’  Instead, he’d look both foolish and cynical for crying wolf.”


    (“Dems Hoping Sequestration Yields Disaster,” I.B.D., March 1). 

                Krauthammer also succinctly summarizes the aversion that B.O. and the Democrats have to genuine cuts in government spending:  

    “For reactionary [left-]liberalism, . . . whatever sum our ever-inflating government happens to spend today (now double what Bill Clinton spent in his last year) is the Platonic ideal – the reduction of which, however minuscule, is a national calamity.  Or damn well should be.  Otherwise, people might get the idea that we can shrink government, and live on.”


    Could we cut $100 billion or more annually from federal discretionary spending?  Of course, Krauthammer observes, citing a 2011 GAO report that “gave a sampling of the vastness of what could be cut, consolidated and rationalized in Washington:  44 overlapping job training programs, 18 for nutrition assistance, 82 (!) on teacher quality, 56 dealing with financial literacy, more than 20 for homelessness, etc.  Total annual cost: $100 billion to $200 billion, about two to five times the entire domestic sequester.  Are these on the chopping block?  No sir.  It’s firemen first.”  “Firemen first” was a phrase coined in 1976 by Washington Monthly editor Charlie Peters to describe how local government functionaries beat budget cuts by putting the fire department first on the chopping block.  A similar tactic is used by local school district officials who, whenever voters defeat a millage, put popular sports programs (or other extracurricular activities) first on the chopping block, essentially blackmailing voters into supporting increased taxes.   

                The sequester is nothing more, or less, than a good start – a good start on what must be done to reduce federal spending, cut the deficit, and end the hemorrhaging of the national debt.  The controversy over it highlights the mental illness that afflicts B.O. and the Democrats – what I’ve called their “Deficit-Attention Disorder” (D.A.D.): their stubborn refusal to recognize that the basic problem in Washington, D.C. today is out-of-control government spending, which in turn causes obscenely high deficits (averaging $1 trillion each year that B.O. has been in office) and the dangerously-high national debt.  D.A.D. in turn is a disorder that results from the Democrats’ underlying disease:  paternalism (their paternalistic philosophy of government).  It’s time that all Americans (not just some Republicans in Congress and some conservative and libertarian political commentators) begin calling them out on it.


    n Shall We Call It “Woodward-Gate”?

                Bob Woodward has long been respected as one of the greatest investigative reporters in modern U.S. history – mainly because of the work he did with his Washington Post colleague Carl Bernstein in breaking the story of the Watergate scandal, and thus helping to bring down Richard Nixon’s presidency, in the early 1970s.  Now Woodward is the target of vicious personal attacks from the political left (including many of the so-called journalists who once so idolized him), because of his recent revelations about B.O.’s lies concerning the sequester.  Not only did Woodward reveal in his book, The Price of Politics, that sequestration was an idea that originated in B.O.’s White House (as noted above), but Woodward also dared to criticize B.O.’s fear-mongering about the sequester.  Appearing on MSNBC, the left’s favorite cable channel, Woodward pointed out that neither Ronald Reagan nor George W. Bush ever would have claimed, as B.O. has, that a budget sequester would prevent the commander-in-chief from protecting the country militarily.  Specifically, Woodward challenged B.O.’s claim that the mere threat of these cuts “forced” the Navy to delay the deployment of an aircraft carrier to the Persian Gulf.  He called B.O.’s assertion “a kind of madness that I haven’t seen in a long time” – a fairly unambiguous reference to Nixon’s behavior during the Watergate crisis.   

    For simply reporting the truth about the B.O. regime, Woodward has been chastised by the leftist news media.  Let’s call it “WoodwardGate.”  Former Los Angeles Times reporter Steve Weinstein called Woodward “senile.”  Salon’s Alex Pareene ludicrously accused Woodward of laziness – “kind of like accusing Hugh Hefner of chastity,” write the editors of Investor’s Business Daily (which is definitely not part of the leftist news media).  Pareene added that he hopes WoodwardGate will mean that in the future “no one talks to Woodward because since he’s lost it, let’s all stop indulging him.”  And the Huffington Post’s Eric Boehlert “was apparently ready to issue an excommunication for Woodward’s mortal sin of appearing on Fox News” (“Cowed Media Turn on Woodward,” I.B.D., March 4). 

    In the eyes of the left-wing news media – who are still enthralled in what media critic Bernard Goldberg has called their “slobbering love affair” with B.O. (the “Emperor’s New Clothes” phenomenon I’ve so often discussed here on MayerBlog) – Woodward has committed an even greater sin, by disclosing how he has been targeted with threats from Gene Sperling, B.O.’s top economic adviser, who yelled at Woodward for a half hour and then emailed him (with regard to his reporting about the origins of the sequester), “I think you will regret staking out that claim.”  Columnist Arnold Ahlert observed that Woodward is “getting a taste of what happens to those who challenge the Obama-Democrat-media machine.”  The B.O. White House attacks anyone who criticizes it, then B.O.’s allies in the news media “immediately joined the feeding frenzy before any objective evidence was available – a chilling warning to anyone who would dare defy the power structure in Washington,” from a press that has “abandoned truth-telling for two things they consider far more important: advocacy and access.”  AP reporter Ron Fournier in the National Journal noted that because of WoodwardGate, he “broke ties with a senior White House official” who had been a longtime source for him, adding that “my only regret is I didn’t do it sooner.”  Over a long period, Fournier reports that he had received “emails and telephone calls from this White House official filled with vulgarity, abusive language” and a threat identical to the one Woodward got from Sperling.  It all confirms what the I.B.D. editors call “an undeniable pattern of systemized Obama bullying of the press – extending even to Bill Clinton’s White House counsel Lanny Davis,” who now works as a columnist for the conservative Washington Times.  “In eagerly devouring one of their own, the media attacking Bob Woodward are choosing the power structure in Washington over future generations of journalists, whose job is supposedly to scrutinize that power.” 

    People on the political left like to say that they regard “speaking truth to power” as a virtue, even an act of heroism.  But apparently that only applies when those wielding power are Republican politicians.


    B.O.’s Anti-Growth Agenda

                 Among the dire warnings – the parade of horribles – that B.O. has claimed will result from the sequester is an increase in the nation’s unemployment rate.   How stupid does he think the American people are?  Increased unemployment is the inevitable result of B.O.’s own anti-growth, anti-jobs agenda – policies that include the misleadingly-named “Affordable Care Act” (aka “ObamaCare”).  For a nation that used to have a real unemployment rate of 4–5% (or less), the “new normal” under B.O.’s regime seems to be about twice that rate – in other words, more like the nearly-bankrupt European welfare states than like the traditional rate for the USA.  B.O.’s redistributionist agenda has caused not only a rise in unemployment – with 23 million Americans still unable to find regular work – but also a decrease in wages (average wages, adjusted for inflation, have dropped for 21 of the last 23 months), higher rates of poverty, ongoing $1 trillion annual federal budget deficits, and the gross domestic product (GDP) actually falling during the last quarter (Ralph Reiland, “Obama’s Anti-Growth Agenda,” I.B.D., February 15). 

    Contrary to most media reports, the government’s February jobs numbers did not signal an economic recovery.  The White House was eager to tout the monthly jobs report as evidence that the nearly four-year-old “recovery” was finally “gaining traction,” saying that 236,000 jobs were added to the economy and the official unemployment rate dropped to 7.7%, the lowest it’s been since December 2008.  What the White House didn’t say is that the official rate dropped because even more people gave up on looking for a job.  While the country gained 236,000 jobs, the ranks of those not in the labor force – people who don’t have a job and stopped looking – swelled by 296,000.  That continues a trend throughout the “recovery” of the past four years (which I’ve called “B.O.’s recession,” not truly a recovery at all):  the non-workforce has climbed almost twice as fast as people with jobs.  Put another way, only 58.6% of Americans work today, down from 60.6% when B.O. took office.  The average over the previous two decades was 63%.  The number of long-term unemployed is still higher than it was three and a half years ago – and it jumped more than 90,000 in February.  There’s also the fact that 3.7 million workers have gone on the Social Security disability program since mid-June 2009, the fastest enrollment pace ever (“Jobs Report Not So Hot After All,” I.B.D., March 9). 

    The basic reason why the American economy isn’t growing is that B.O.’s tax and regulatory policies have had a devastating dampening effect on American businesses.  As the editors of Investor’s Business Daily observed in a recent editorial, B.O. “bad-mouths businesses and punishes entrepreneurs by taking the profits they’d otherwise plow back into their operations.  And he’s strangling [businesses] with reams of new red tape,” particularly with the uncertainties and costs created by “ObamaCare” and “Dodd-Frankenstein.”  “Companies have put off hiring and investment in new plants and equipment to deal with regulations that now dwarf the New Deal in scope and complexity.”  The editors quote Dallas Federal Reserve President Richard Fisher, who in a recent speech singled out the Dodd-Frank law, maintaining that new banking rules “have exacerbated the weakness in economic growth by increasing regulatory uncertainty in key sections of the U.S. economy.”  He added, “Against this backdrop, I am not surprised by the reaction of businesses. . . . Private-sector job creators are in a defensive crouch” (“It’s the Economic Growth, Stupid!”  March 11). 

    As long as B.O. and Democrats in Congress stubbornly insist on higher taxes and more regulations – not just refusing to remove the burdens they’ve imposed on the economy, but actually “doubling down” on their misguided, disastrous anti-business policies – “growth won’t spring back to normal,” the I.B.D. editors conclude.  “But if we just doubled [the] anemic sub-2% real GDP growth [under B.O.] to President Reagan’s 4% average clip, we’d see some $5 trillion in deficit reduction over the next decade.”  In other words, if our tax and regulatory policies were pro-growth instead of anti-growth, we wouldn’t need gimmicks like sequestration to balance the federal budget.


    Karl Rove Is Right!  . . .

                 According to many Democrat politicians and their allies in the news media, the Republican Party nationally is in disarray.  It’s become customary, in modern politics, for a political party that “loses” a major election to engage in some post-electoral “soul-searching.”  But following the 2012 elections – which the GOP only partially lost (as noted below) – it seems that Republicans are not just suffering from an identity crisis but have also become quite dispirited – even depressed, one might say – and are dividing into various camps, each with its own “solution” to the party’s difficulties.  Needless to say, Democrats and their media allies are gleeful (indeed, almost giddy) about the Republicans’ problems – which tends to depress and to divide Republicans even more.  Politics, though, is a game based largely on perceptions of reality, however false; and the perception that the GOP is in deep trouble is indeed false. 

    Many Republicans have become their own worst enemies, by buying into the false perceptions of their colleagues – and their political opponents.  Among those false perceptions is one being propagated by some “conservative” commentators, particularly talk-radio star Rush Limbaugh, who simplistically divides politically-active and aware people into two camps, “liberal” and “conservative,” and then in turn divides the Republican Party into two camps, “establishment” Republicans and true “conservatives.”  According to Limbaugh’s narrative, the GOP “lost” the 2012 election because the party’s presidential candidate, Mitt Romney, was not a true conservative but rather a “Massachusetts moderate” (echoing Newt Gingrich’s accusation during the GOP primary campaign) – which to Limbaugh is the same as a “liberal” – and the candidate picked by the hated “establishment.”  Limbaugh’s overly-simplistic (and just plain wrong) analysis overlooks several important facts, among them:  (1) the GOP may have lost the presidential race (and failed to recapture control of the U.S. Senate), but it “won” in the U.S. House (where it retains its majority) and in most statehouses around the USA; (2) Romney won the party nomination by winning most of the state primary elections, not by getting just party “establishment” support; (3) self-identified “conservatives” do outnumber self-identified “liberals” in most polls, but “independents” – not conservatives – hold the plurality; (4) “conservatives” are not a monolithic group but can be divided into several different camps (social conservatives, limited-government or libertarian conservatives, “neo-conservatives,” and so on); and (5) Mitt Romney truly was more of a limited-government conservative than his other GOP primary rivals, with the exception of Ron Paul (a point that I repeatedly made here on MayerBlog both before and after the election).  
It is Limbaugh’s ignorance of the last two points that mostly explains why his analysis of the 2012 election is virtually worthless – except as a kind of self-fulfilling prophecy.  As I warned in my “Spring Briefs” blog last year (Mar. 15, 2012), in the section on “Rush Limbaugh’s Near-Fatal Mistake,” Rush’s criticisms of Romney dampened enthusiasm for the GOP nominee among many “core” GOP constituents (self-identified “conservatives”  who comprise Rush’s audience) – and thus actually helped the Democrats and particularly B.O.’s reelection campaign.  Rush’s “near-fatal” mistake – despite his positive comments about Romney after the GOP Convention had secured him the nomination – became a fatal mistake (fatal to Romney’s election chances); his lack of support for Romney – along with a similar lack of support by most Tea Party groups – was probably the chief reason why Romney lost the presidential election.  

    Given all the misinformation Rush is spreading among his regular listeners, it’s not surprising that a broad array of similarly self-identified “conservatives” are waging a veritable civil war within the GOP, one that supposedly pits “establishment” Republicans against “grass-roots activists.”  Nor is it surprising that many of these self-identified “conservatives” – Tea Party groups, talk-radio hosts, and other activists – have attacked Republican strategist Karl Rove, whom former president George W. Bush called “the architect” of his presidential campaigns.  Rove’s American Crossroads super-PAC had at best mixed success in the 2012 elections, but Rove on February 4 announced plans to create a new PAC, called Conservative Victory Project, to spend money to promote “electable” candidates in GOP congressional primaries in 2014.  Many conservatives have reacted furiously.  For example, radio talk host Mark Levin asked, “Who died and made Karl Rove queen for a day?” 

    Rove and Crossroads President Steven Law (who’s heading the new project) have made the media rounds, from Fox News Channel’s Sean Hannity to MSNBC’s Chuck Todd, arguing that they’re not trying to thwart the will of conservative primary voters, or to promote moderates, or to protect incumbents.  They have pointed out that Crossroads groups spent $30 million in the past two elections to help Tea Party favorites such as Senators Marco Rubio, Rand Paul, and Pat Toomey – as well as unsuccessful candidates including Indiana’s Richard Mourdock.  But as Rove realizes – and naïve “conservatives” like Rush Limbaugh fail to realize – many of the Republican candidates lost in 2012 because they alienated voters with their extreme social conservatism and (for want of a better word) all-around kookiness.  Two prime examples are Mourdock and Missouri’s Todd Akin – candidates for U.S. Senate seats that should have been won by the GOP but weren’t because of their extreme anti-abortion views and their outrageous comments about rape.  Social conservatives, because their positions on such issues as abortion and same-sex marriage are so out of line with evolving American values, are turning many younger voters (who are more libertarian) away from the GOP.  (Romney’s efforts to court social conservative support, by taking strong stands against abortion and groups like Planned Parenthood and agreeing with the GOP platform’s opposition to same-sex marriage, were double-edged swords:  they failed to convince social conservatives that he was truly a “severe” conservative, as he once called himself with a poor choice of words, and at the same time they alienated more libertarian GOP and independent voters.) 

    Rove is not saying that the GOP should abandon its principles by becoming a “big tent” that will appeal to moderates or independents.  But he is saying that the GOP should not insist on “conservative” purity in its candidates.  The Conservative Victory Project says that its aim is to institutionalize William F. Buckley’s rule:  Support the most conservative candidate who is electable.  As David Harsanyi, columnist and senior reporter at Human Events, observed in a recent op-ed: “The most electable conservative candidate in the Northeast isn’t going to be a social conservative.  It’s that simple.”  He quotes Rove saying: “If . . . people think the best we can do is Todd Akin and Richard Mourdock, they’re wrong.  We need to do better if we hope to take over the United States Senate.  We need to get better conservative candidates and win” (“Rove has a point: focus on high-quality candidates,” Columbus Dispatch, February 8). 

    To these comments from Harsanyi and Rove, I say “Amen.”  And I’ll add that the “best” or most “electable” type of conservative generally is not going to be a social conservative or a neo-con; it will be a true “limited-government” conservative or a libertarian.  That’s the future of the GOP.


    . . .  But Rand Paul and Charles Murray Are Even More Right!

                Democrats and their allies in the news media criticized the recent Conservative Political Action Conference (CPAC) for failing to invite “popular” Republican governors, like New Jersey’s Chris Christie, to speak.  But CPAC isn’t a Republican conference; it’s a conservative conference, and Chris Christie isn’t a true “conservative,” no matter how one defines it.  (His speech at the 2012 Republican National Convention, billed as a “keynote” speech, failed to do what a keynote speech ought to do – to support the party’s presidential nominee – and seemed instead to support only Christie’s own future political ambitions.  And his shameless courting of federal disaster funds following Tropical Storm Sandy last fall seemed to play directly into B.O.’s hands, helping his reelection campaign – making Christie seem more like a RINO, if not a traitor to the GOP.)  But considering how many different types of conservatism there are in America today (as noted above), it’s not surprising that many of the speakers at CPAC failed to fit the stereotype of either conservatism or the GOP that left-liberals are trying to create.  Two cases in point are Senator Rand Paul and libertarian political scientist Charles Murray.  

    Rand Paul’s splendid speech not only criticized the B.O. regime and Democrats in Congress with many hard-hitting zingers but also made a convincing case about the direction in which the GOP must change, if it expects to win future elections.  It’s the same point I’ve been making here in MayerBlog, in my postings both before and after the 2012 elections:  that the GOP must become more consistently the party of limited government and of individual freedom.  Among other things, Senator Paul observed: 

    “The Republican Party has to change – by going forward to the classical and timeless ideas enshrined in our Constitution.  When we understand that power corrupts and absolute power corrupts absolutely, then we will become the dominant national party again.


    “It is time for us to revive Reagan’s law: For liberty to expand, government must now contract.  For the economy to grow, government must get out of the way. . . .


    “Our party is encumbered by an inconsistent approach to freedom.  The new GOP, the GOP that will win again, will need to embrace liberty in both the economic and personal sphere.


    “If we are going to have a Republican Party that can win, liberty needs to be the backbone of the GOP.


    “We must have a message that is broad.  Our vision must be broad.  And that vision must be based on freedom.” 


    Getting down to specifics, Charles Murray’s speech at CPAC took the attendees by surprise by focusing on one important “liberty” issue:  same-sex marriage.  Murray ditched his prepared remarks on “America Coming Apart” and instead delivered an impromptu admonition, on the question, “How can conservatives make their case after the election?” drawn from his experience with his own four children, who range in age (he said) from 23 to 43.  While they share many of his views on limiting the size of government, and supporting free enterprise, he said, “Not one of them thought of voting for a Republican President” in the last election.  Their disenchantment with the Republican Party was not specifically because of Mitt Romney, he added, but because “they consider the Party to be run by anti-abortion, anti-gay, religious nuts.”  With regard to same-sex marriage, which he called “gay marriage,” he went on, “I think the train has left the station.” 

    Writing about Murray’s speech in the New Yorker, left-liberal reporter Jane Mayer seemed gleeful herself when she observed, “Certainly the locomotive power of the issue seemed hard to miss on a day when the top political news was Ohio Republican Senator Rob Portman’s announcement that he, too, supports gay marriage.”  Like Senator Portman – whose position on the marriage issue changed after he found out his 21-year-old son is gay – Murray says his “change of heart” has come after he’s acquired “a number of gay and lesbian friends.”  He also has been influenced by the pro-same-sex marriage arguments made by Jonathan Rauch, an openly gay writer for the National Journal and the Atlantic.  As I observed in my essay “Marriage, American Style” (May 19, 2004), same-sex marriage can be properly seen as a “conservative” issue, if one sees truly principled conservatism in terms of consistently arguing for more limited government and for maximizing individual freedom.  Ms. Mayer then reports that the “disquiet” at CPAC “grew further, as Murray suggested that abortion, too, was an issue better left, for the most part, to ‘moral suasion’ rather than criminalization.”  Bravo, again!


    The Occupier-Who-Must-Not-Be-Named

                It is forbidden to refer to the current “occupier” of the White House by name on Glenn Beck’s radio show, a policy that the host – a former conservative who now calls himself “libertarian” – instituted beginning in January.  On Beck’s show, he’s sometimes referred to as “He-who-must-not-be-named” (as the evil Lord Voldemort was called in the Harry Potter novels – which in turn may have been inspired by “she-who-must-be-obeyed,” as English barrister Horace Rumpole called his wife, in John Mortimer’s Rumpole of the Bailey stories).  More often, he’s called simply “That Guy.”  Offenders (including Beck himself) must pay a fine; the money thus collected will be donated to charity.   

                I can find no fault with Beck’s policy because, as regular readers of MayerBlog know, it’s been my practice to refer to “that guy” in the White House as “B.O.” since October 2008.  Except when I’m quoting from someone else, I never use B.O.’s full name.  As I’ve previously explained, referring to him by his initials makes sense and seems especially apt because his record and his policies, in a word, stink.  (As I’ve also frequently written, “that continually rising stench emanating from Washington, D.C. is a result of America’s B.O. problem.”)  I do have one quibble with Beck, however.  He still refers to B.O. sometimes as “the President” – something I refrain from doing, for the obvious reason that B.O. (who, as I’ve often maintained here, is not only the worst but also the most lawless president in U.S. history) does not deserve the title.  I have too much respect for the office of President of the United States to taint it by associating it with B.O., whom I frequently call by other titles he more truly deserves:  Occupier-in-Chief, Bullshitter-in-Chief, Liar-in-Chief, etc.   


    Dorner “Unchained”

                Coverage of so-called “gun violence” (the euphemism for government control of guns that I discussed in Part III of this year’s “Prospects for Liberty” essay) by the left-wing “lamestream” news media took a strange turn last month, in the media’s coverage of the murderous rampage of Chris Dorner, the disgruntled former L.A. cop who was burned to death in a shootout with his former police colleagues, at a Big Bear, California cabin on February 12.  What was so strange about the story was the way certain politicians and media sources seemed to be sympathetic to Dorner, romanticizing him and his murderous rampage, as if he were the real-life counterpart of the “hero” in Quentin Tarantino’s violent film, Django Unchained – the black man who kills “whitey” in retribution for supposed racial crimes.  (Dorner was a supporter of B.O. whose Facebook “manifesto” argues in favor of the resumption of an assault-weapons ban.)  For example, on CNN’s Reliable Sources, host Howard Kurtz spoke with George Washington University professor Steve Roberts (husband of ABC’s Cokie Roberts and a former reporter for Newsweek).  Roberts thought that the manifesto Dorner wrote was “interesting” because of what he wrote about the LAPD, which has had (in Roberts’ words) “a long history of racism.”  He thought Dorner’s “historical sensibility” in the manifesto was “worth paying attention to despite the murders.”  (Dorner’s murderous rampage included the killings of the innocent child and girlfriend of one of the supposed “racist” L.A. cops that he was targeting for retribution.)  Even more favorable comments have been unearthed by conservative commentator Michelle Malkin, on her Twitchy website, where she aggregated tweets from many leftists to whom Dorner has become a kind of cult hero.  “Oh, yeah, I love this guy!  He’s the modern-day, real-life Django,” one Twitter user wrote. 

    There’s more than just the left’s “politically correct” racism at work here.  The media’s coverage (or lack of coverage) of the Dorner case reveals the fallacy at the heart of the gun-control fanatics’ fantasy world, where Dorner is one of the “Good Guys”: “the controlled, heavily trained few within federal, state, and local governments who are entitled to own and handle high-performance firearms as Americans’ protectors,” as the editors of Investor’s Business Daily observed.  “To the predictable response from the left that the government fired Dorner – in the form of the Los Angeles Police Department sacking him for issuing false statements [alleging police brutality and/or racism in the department] – it should be pointed out that someone capable of Dorner’s apparent crimes could well have done so while still wearing a law enforcement or military uniform, and for rationales other than those he chose.”  The editors conclude, “Armed police officers turning into serial killers might be rare, but the Dorner nightmare is further proof that the problem is not guns, but criminality; not the tools of crime but its perpetrators” (“Government Gun Holders Go Crazy, Too,” February 9). 

    What the Dorner case really teaches us about “gun violence” is that the right protected by the Second Amendment is indeed precious to “ordinary,” law-abiding Americans.  It’s important not only to protect them against criminals, but also against government and its agents when they act like criminals, too.  The core principle behind the Second Amendment is that government ought not to have a monopoly on firearms; Dorner’s murderous rampage reminds us why.


    “Dearth Hour”

                For 60 minutes (starting at 8:30 p.m. Eastern time) on Saturday night, March 23, radical environmentalists and other gullible people will observe “Earth Hour” by killing their lights, to increase awareness of “climate change.”  Earth Hour, organized by the World Wildlife Fund, began in Sydney, Australia in 2007 and has since spread to 152 countries, including the USA – most of them prosperous countries in the “developed” industrial world.  It persists because radical environmentalists are still perpetrating the “climate change” scam – the myth that man-made “greenhouse gases” from the burning of carbon-based “fossil fuels” cause “global warming” and other types of “climate change,” endangering the earth.  Despite abundant evidence that the theory is in fact a myth, based on “junk” pseudo-science, radical environmentalists still manage to intimidate many gullible people, who feel self-righteous when they do something symbolic for the cause – such as turning off their lights for an hour on a Saturday night. 

                Besides being silly, Earth Hour is counter-productive, for it actually does harm to its stated goal, as noted by Bjorn Lomborg (author of The Skeptical Environmentalist and Cool It: The Skeptical Environmentalist’s Guide to Global Warming).  “It may inspire virtuous feelings, but its vain symbolism reveals exactly what is wrong with today’s feel-good environmentalism.”  Writing in Project Syndicate, a website devoted to “thought-provoking commentaries,” Lomborg points out that during Earth Hour any reduction in CO2 emissions, resulting from a drop in electricity demand during the hour, will be offset by the surge from firing up coal- or gas-fired power stations to restore electricity supplies afterward.  Moreover, he notes, the candles that many participants will light during Earth Hour “are still fossil fuels – and almost 100 times less efficient than incandescent bulbs.  Using one candle for each switched-off bulb cancels out even the theoretical CO2 reduction; using two candles means you emit more CO2.” 

                In an aptly-titled editorial, Investor’s Business Daily reminds us: “Electricity is not a curse.  It is a blessing.  It’s given us refrigeration that keeps food from rotting.  It heats us, cools us, and pumps clean water into our homes.  It powers vital medical equipment and modern conveniences from televisions to toasters to cellphones.”  The editors add:  “Rather than turning lights off Saturday night, we suggest not only leaving them on, but turning on those that aren’t needed.  As the Competitive Enterprise Institute suggests, celebrate human achievement” (“Dearth Hour,” March 22).


    Draft Them Gals!

                With the U.S. military now OKing women for combat, there’s no rational basis for limiting conscription (the military draft) to men.  Current federal law compels only men between ages 18 and 25 to register for a military draft.  Never before has the country drafted women into military service, and neither the B.O. regime nor Congress seems eager to make them register.  But, constitutionally speaking, they may have no other choice – unless they repeal the draft registration law entirely.  (That’s what really ought to be done, for military conscription is not only inconsistent with America’s libertarian founding principles but also a direct violation of the Thirteenth Amendment prohibition of involuntary servitude, even though the Supreme Court failed to recognize the validity of that argument in its infamous World War I era Selective Draft Law Cases.)    

    More than three decades ago the Supreme Court ruled that it was constitutional to register only men for a draft.  The Court’s rationale was that registration creates a pool of potential combat troops should a national emergency – like a world war – require a rapid increase in the size of the military.  At the time, women were excluded from serving in battlefield jobs, so there was no reason to register them for possible conscription into the armed forces, the Court held.  Now that front-line infantry, armor, artillery, and special-operations jobs are open to female volunteers who can meet the physical requirements, “it will be difficult for anyone to make a persuasive argument that women should continue to be exempt from registration,” said Diane Mazur, a law professor at the University of Florida and a former Air Force officer quoted in a recent AP news article (“Might draft registration expand to women?”  Columbus Dispatch, February 26). 

    So-called feminists – such as the National Organization for Women (NOW – what Rush Limbaugh has called the “National Association of Gals,” or NAG) – often complain about perceived inequalities between women and men, particularly the denial of “equal rights” for women.  Shouldn’t that include equal responsibilities as well?


    Saint Hillary?

                Many Democrats have been deluding themselves, thinking that the heir apparent to the White House is Hillary Rodham Clinton (HRC – otherwise known as Her Royal Clintonness).  After resigning as secretary of state, Mrs. Clinton is now in retirement; but her absence from the Washington political scene has not quelled speculation about 2016.  (After all, absence makes the heart grow fonder.)  B.O., the “Anointed One,” himself apparently anointed Hillary as his successor, in that now-infamous joint interview on CBS’s 60 Minutes on January 27, where Steve Kroft asked such powder-puff questions that calling them “softball” would be too harsh – and an insult to the game.  (Fox News’s Greg Gutfeld, on The Five, aptly characterized it this way: “That wasn’t an interview; it was a ménage à trois!”)  Newsweek magazine (more justly called Newsweak) proclaimed Mrs. Clinton “the most powerful woman in U.S. history” on an early February cover.  That phony smile on the face of V.P. “Smiley” Joe Biden, who no doubt fancies himself as heir apparent, must be even more phony these days! 

                One fairly sure sign that HRC is still politically ambitious – that she remains (to put it bluntly) a power-hungry bitch, with her eyes again on the White House – is her recent action endorsing same-sex marriage.  “I support it personally and as a matter of policy and law, embedded in a broader effort to advance equality and opportunity for LGBT Americans and all Americans,” Mrs. Clinton said in a video released March 18 by the advocacy group Human Rights Campaign.  Why would the now-retired secretary of state make such an announcement unless she plans to run again for political office?  Note her use of the “politically-correct” (P.C.) buzzwords, “LGBT Americans” – referring, of course, to “lesbian, gay, bisexual, and transgender” Americans – in a blatant attempt to appeal to those non-heterosexuals who identify themselves with those labels.  Never mind that, as secretary of state, Hillary did nothing to try to stop persecution of homosexual persons in radical Muslim countries or leftist dictatorships.  For her, apparently words speak louder than actions – or inaction.  In any event, it looks as though Mrs. Clinton is positioning herself for the 2016 presidential election. 

    But the pebbles strewn along Hillary Clinton’s path back to the White House are more like boulders, the inconvenient facts about her past record as U.S. secretary of state, U.S. senator, and first lady of both the U.S. and Arkansas – a past that ought to disqualify her from higher office, except in a political party that chose B.O. as its standard-bearer.  Mrs. Clinton’s tenure as secretary of state can be fairly described as disastrous: over the past four years, militant Islamic terrorists have seized more control of the Middle East, as the murder of the U.S. ambassador and three other Americans at the U.S. consulate in Benghazi, Libya, so pointedly illustrated.  Hillary was a key conspirator in the B.O. regime’s cover-up of that foreign-policy debacle, helping to perpetuate the lie that the Benghazi terrorist attack was motivated by an anti-Mohammed video broadcast on the Internet.  And Hillary’s much-touted “reset” of U.S.–Russian relations has yielded only B.O.’s capitulation to Vlad Putin on such issues as missile defense.  Even more embarrassing to Hillary (if she had any shame) is her record as Slick Willy’s cuckolded wife and criminal co-conspirator.  (Remember those infamous billing records from Hillary’s Rose law firm that mysteriously appeared in the Clinton White House living quarters – documents that were at the heart of the Whitewater land fraud?  All of Hillary’s whining about a “vast right-wing conspiracy” out to get her and Bill will never fully lay that story to rest.)  Hillary’s political past may now be in the history books (most of which are written by people with a leftist bias that’s favorable to Hillary); nevertheless, that past will come back to haunt her, if she should be so foolhardy as to reenter political life. 


    Too Pooped To Pope

                Pope Benedict XVI shocked the world on February 11 when he announced his resignation, effective February 28, citing his age and declining health.  The former Cardinal Joseph Ratzinger, elected pope in April 2005, had served for almost eight years but decided to step down because, at age 85, he was becoming increasingly infirm, physically.  The Pope Emeritus Benedict (as he is now designated) leaves the Roman Catholic Church with a mixed legacy.  Although he helped make the Church seem relevant to the modern world by using new technologies – he tweeted from an iPad, issued benedictions from a Facebook page, and distributed Vatican news from a YouTube channel – he was a staunch conservative, theologically, reaffirming the Church’s traditional teachings on such matters as abortion, contraception, marriage, and a celibate, male-only priesthood hierarchy.  Although he denounced the scandal of child abuse by pedophile priests – calling it a “scourge” – Benedict XVI was criticized for not doing enough to expose the crimes.  And under his papacy, administrative problems within the Church festered, such as the incompetence, financial corruption, and possible organized-crime ties within the Vatican Bank.      

                Although Pope Benedict’s action was unprecedented in the modern era (the last time a pope resigned was 600 years ago, in 1415) – and therefore quite unsettling to the tradition-bound Church – it’s likely to become more common in the future, as modern medical technology allows people to survive well into their 80s, 90s, even beyond – despite physical or mental infirmity.  What will the Church do when a future pope develops Alzheimer’s disease?  It can only hope that such a pope emulates Benedict XVI in acting responsibly and resigning.  As Mary Johnson wrote in a recent USA Today op-ed, Benedict’s “most valuable legacy” could be his resignation.  “Is not this sort of responsible behavior, so void of heroics, so laudably sane, exactly what one might expect from a responsible steward, a good shepherd, a faithful servant?” she asks, adding – and she meant this as a compliment – that by resigning instead of staying until his death, “he treated the papacy not like a sacrament or a monarchy, but a job” (“Pope Benedict: An Unexpected Revolutionary,” February 18).   

                Meanwhile, the Church’s cardinals recently (on March 13) selected a new pope – and by their action, seemed only to kick the proverbial can down the road.  Any Catholics who hoped that the new pope would be a true “reformer” were no doubt disappointed by the choice of Cardinal Jorge Mario Bergoglio, archbishop of Buenos Aires, Argentina, who has taken the name Francis (after Saint Francis of Assisi).  At age 76 and in questionable health (reportedly he has only one functioning lung), he is unlikely to have a long tenure.  And although Pope Francis represents many “firsts” for the Church – among them, the first pope from the Americas, the first pope from South America, the first Jesuit pope – he is known for standing firmly for core doctrine, like his predecessors John Paul II and Benedict XVI.  The main way he’s different?  By all accounts a genuinely humble man – in Buenos Aires, he took public transportation around the city and eschewed the archbishop’s palace, choosing instead to live in a modest room, where he cooked his own meals – the new pope has a personal style that is the antithesis of Vatican splendor.   

                Wisely, the new pope has fought against the Marxist “liberation theology” movement that swept Latin America in the 1980s; he also has been an outspoken critic of Argentine President Cristina Fernandez’s leftist government, using his homilies to criticize economic and social conditions as well as political corruption.  Some commentators – such as the editors of Investor’s Business Daily – have idealistically predicted that Pope Francis might be an opponent of leftist dictatorships in South America (including Venezuela, Ecuador, and Bolivia) the same way Pope John Paul II opposed communist dictatorships in eastern Europe (“Pope and Change in the Vatican,” March 15).  Nevertheless, Francis sees himself as a champion of the poor and outcast; and he shares with leftist “liberation” theologians the same perverted notion of “social justice” that is rooted in a moral philosophy of altruism, or self-sacrifice, which, along with the Church’s adamant stance against contraception, is responsible for most of the poverty in the “Third World.”  Like his predecessors, Pope Francis is unlikely to embrace the only true hope for happiness and prosperity in the world:  free-market capitalism.


    n  Hugo Chavez Is Still Dead 

    “Generalissimo Francisco Franco is still dead” was a catchphrase that originated in 1975 during the first season of NBC’s Saturday Night Live – a regular feature of Chevy Chase’s “Weekend Update” news segment – and which mocked the weeks-long media reports of the Spanish military dictator’s impending death.  Today, 38 years later – after weeks of media reports of the impending death of another dictator, Venezuela’s socialist “president” Hugo Chavez – it was finally announced on March 5 that Chavez was indeed dead, of cancer.   

    Don’t expect SNL or any other mock “news” show (such as Jon Stewart’s Daily Show on Comedy Central) to repeat that joke from 1975: to today’s left-wing news media, the death of a socialist dictator like Chavez isn’t as amusing as the death of a fascist dictator like Franco.  (Some leftist journalists and Hollywood celebrities like Oliver Stone and Sean Penn have even lionized Chavez as a “champion of the poor.”  Never mind that he’s a thug whose 14-year rule has made his country even poorer and more crime-ridden.)  To the individuals who lived under the oppressive rule of either of these dictators, however, there’s not a bit of difference – socialism and fascism are simply different sides of the same tyrannical collectivist coin – nor is there anything either funny or sad about the end of their horrible lives.  

    Anyone who doubts former President George W. Bush’s warning that there’s an “axis of evil” in the world – a global conspiracy of rogue states that are rabidly anti-American because the USA represents the freedom and individualism they’re plotting to destroy – need only look at the guest list for Chavez’s funeral in Caracas.  It included Cuba’s Raul Castro, Iran’s Mahmoud Ahmadinejad, and the left-wing dictators of Bolivia, Ecuador, and Nicaragua.  The U.S. delegation was led by Rep. Gregory Meeks (D.–N.Y.) and former Rep. Bill Delahunt (D.–Mass.), who were “friends” of the dictator because of the cash and influence Chavez showered on them.  (Delahunt is the “broker” of Chavez’s gambit to buy influence in the U.S. through his cheap heating oil program, doled out through Joe Kennedy’s Citizens Energy.  Meeks sought Chavez’s help to improve the court chances of a convicted Ponzi schemer.) “Taking Chavez’s penny, both remain in his debt,” even after death (“Sorry Show in Caracas,” I.B.D., March 9).


    n The Once and Future Queen 

                The big news among fans of the British royal family is that Prince William and his wife, Kate (the former Kate Middleton, now Duchess of Cambridge), are expecting a baby, the heir-apparent to the British throne.  Of course, the current queen – Elizabeth II – following her triumphant 60th-anniversary Jubilee, is as popular as ever.  But the two men who are next in line – her son, Prince Charles, and her grandson, Prince William – are barely noticed these days.  Charles never was very popular; nor is he terribly bright – with his radical environmentalist nonsense and his self-conceit that he’s an expert on architecture.  And William, once the leading royal heart-throb and hence media darling, has lost some of his allure, what with marriage and a growing bald spot.  (Thankfully for him, his wife’s fashion sense – and her increasingly obvious “baby bump” – seem to get most of the attention.)

    Anticipating that the future heir-apparent may be a girl, Great Britain and the other countries that are members of the Commonwealth of Nations (who all consider the reigning British monarch to be their monarch, too) last fall began the process of changing their centuries-old rules of succession that put sons on the throne ahead of any older sisters.  (Since the medieval period, England generally has followed the rules of primogeniture, which in determining the line of succession to the hereditary throne gave priority to the oldest male in the direct line of descent.  A woman would become monarch only if she had no brothers, either younger or older than she – which is partly why both Victoria and Elizabeth II became queen.)  The new rules would apply only to future heirs and would have no impact on the current line of succession.  (Next in line to the throne after Elizabeth II is Prince Charles, who is the queen’s firstborn child.  Charles’ sister, Anne, is lower in the line of succession than her younger brothers Andrew and Edward by virtue of their male gender.  William is second in line to the throne after his father, but since Charles had only sons – William and his brother, Prince Harry – the issue was never raised until William married.)   

    The change in the rule of succession will require approval by the legislatures of all 16 Commonwealth nations that have Queen Elizabeth II as their head of state, including not only Britain itself but also such countries as Canada, Australia, and New Zealand.  But representatives of all the Commonwealth nations – meeting in Perth, Australia, in late October 2011 – agreed to the change.  (They also agreed to lift another centuries-old rule, a ban on British monarchs marrying Roman Catholics.)  Most observers believe the change is non-controversial, given the degree to which the laws in Commonwealth countries have recognized the legal and political equality of women, since the early 20th century (when women were given the right to vote).  No doubt the long and mostly successful reign of Elizabeth II also has permanently changed the attitudes of the people of Britain and the Commonwealth countries, most of whose population cannot remember having a male monarch on the throne.  (It’s been over 60 years, after all, since Elizabeth’s father, George VI, was king.)  So, once the new rules are implemented, the next in line to the throne after Charles and William will be the baby that Kate is now carrying.  If she’s a daughter, she’ll be the first British princess to beat out any younger brothers and accede to the throne – assuming that there will still be a monarch sitting on a throne at that future time.


    n Mischievous Young ‘Un 

                North Korea’s 30-year-old communist dictator, Kim Jong Un (the son and grandson of dead communist dictators Kim Jong Il and Kim Il-sung, respectively), has been up to more mischief – and not just by “wining and dining” former NBA star Dennis Rodman (who called Kim “an awesome kid,” in what has to be the weirdest bromance of the year).  Rodman’s “basketball diplomacy” apparently has failed.  Angrily reacting to the U.N. Security Council’s recent unanimous decision to tighten sanctions against his rogue country, Kim’s regime announced that it was nullifying all nonaggression agreements with South Korea, declaring invalid the armistice agreement that ended the Korean War in 1953.  And it’s not just South Koreans who are now nervous.  As Pyongyang marches relentlessly toward deliverable nuclear weapons and the long-range missiles to carry them, one top North Korean general has claimed that his country has nuclear-tipped intercontinental ballistic missiles (which could reach the United States, not only Alaska but also the Northwest coast) ready to blast off.  

    In a recent editorial, Investor’s Business Daily argues that North Korea’s recent actions – both the tearing up of the 1953 armistice and the threat of a nuclear strike on U.S. territory – may be more of a warning to China than to us: 

    “China has long been caught in a bit of a conundrum regarding North Korea.  Supporting Pyongyang has kept alive the threat of another war on the Korean peninsula, which has perpetuated a strategic problem for the U.S.  Beijing fears an Asian repeat of the reunification of Germany after the collapse of the Soviet Union.  A reunified Korea would give the U.S. a firm foothold on its border that does not sit well with China’s regional, expansionist goals.


    “Yet North Korea’s relentless pursuit of nuclear weapons and the missiles to carry them has created another big problem – renewed war on the Korean peninsula, including strikes on Japanese and U.S. territory, would not be good for China either.  Their dilemma has been how to keep North Korea as a useful nuisance but restrain it from being a threat.”


    (“Don’t Ignore Petulant Pyongyang,” I.B.D., March 12). 

    Now even B.O., who has been a critic of missile defense since long before his 2008 campaign (and who has been very cooperative with Russia in scrapping missile defense in eastern Europe), sees the merit in having some missile defense to protect the U.S. homeland from rogue regimes like Kim’s in North Korea.  “I can tell you that the United States is fully capable of defending against any North Korean ballistic missile attack,” White House spokesman Jay Carney recently said.  Let’s hope he’s right.


    n Going Postal – Going, Going, Gone 

                The U.S. Postal Service (USPS) has announced plans to stop Saturday home delivery (except for parcels), starting in August.  The USPS says the cost-cutting move is necessary if it is to remain viable.  Already, though, a chorus of howls is erupting in Congress.  Senator Susan Collins (R.–Maine) has called the plan “inconsistent with current law” mandating six-day-a-week delivery, warning that it “threatens to further jeopardize [USPS’s] customer base.”  Now the media is reporting that the plan might be scrapped, recalling what happened a few years ago when the USPS proposed shutting 3,200 money-losing post offices.  The list was whittled down to just 162 after lawmakers started complaining. 

    Something has to be done.  Last year, the USPS posted a record $15.9 billion loss, and Postmaster General Patrick Donahoe says it’s hemorrhaging about $25 million a day, thanks to massive bloat and a cratering core business (as more and more Americans turn to electronic forms of communication rather than “snail mail”).  All but 7,000 of its 32,000 post offices lost money, and a recent government audit found the USPS has almost twice as many mail facilities as it needs, 35,000 excess workers, tens of thousands of unneeded machines, and hugely wasteful travel costs.  The fact that 85% of its workforce is unionized doesn’t help, either.  Meanwhile, first-class mail volume is plunging.  It fell 25% over the past decade and will likely drop another 46% over the next. 

    Donahoe complains that Congress wants USPS to run like a business, while refusing to let it do so.  “He’s right,” observe the editors of Investor’s Business Daily.  USPS has been trying for years to restructure its operations in light of its crumbling core business, only to be “stymied by Congress, which routinely blocks needed reforms either to protect special interests or spare districts from losing a precious post office.”  That’s why, the editors conclude, “the only real solution isn’t to tinker around the margins with phony congressional fixes, but to privatize the USPS entirely.”  That would liberate the agency to cut costs, improve efficiencies, innovate and grow new lines of business.  “Other countries have done so, and consistently found improved quality and lower cost.  Yet this common-sense idea keeps getting lost in the mail here” (“A Private Post Office?” February 8).    


    n Minimum Wage, Maximum Folly 

                In his 2013 “State of the Union” address, among the typical laundry list of asinine ideas, B.O. proposed that Congress raise the federal minimum wage from $7.25 to $9.00 an hour.  And a number of states have raised, or are in the process of raising, their government-mandated minimum wages – thus perpetuating one of the worst public policies of the past century. 

    I wrote about this issue in a MayerBlog essay several years ago:  “Minimum Wage, Maximum Folly” (Oct. 20, 2006).  All the arguments against minimum-wage laws remain just as true today as they were then.   Indeed, they’re even more poignantly true today, with our sky-high unemployment rates, particularly among the groups whom minimum-wage laws hurt the most:  young, unskilled, and minority workers.  For example, teen unemployment has been above 20% for four years – and in California (one of the states considering a minimum-wage increase) the teen unemployment rate is nearly 35%.  “Fewer job opportunities is the last thing these teens need,” argues Michael Saltsman (research director at the Employment Policies Institute) in a recent op-ed.  He cites a new scholarly report – coauthored by UC-Irvine minimum-wage expert David Neumark – that exposes the flaws in other academic studies claiming that increasing the minimum wage has no negative effect on entry-level job opportunities.  That claim not only defies common sense and the realities of the business world but has been rejected by most sound (and unbiased) economists for years.  There’s now a “robust academic consensus that says the minimum wage will worsen the outlook” for young workers (“Minimum Wage Does Damage,” I.B.D., January 24).  And economist Walter Williams – whose splendid op-ed provided the name for my 2006 essay – continues to hammer home the critical point that minimum-wage laws historically have harmed the very groups they are supposed to help, particularly minority workers (“Minimum-Wage Hike Is Poison to Minorities,” I.B.D., March 6).   By the way, economics professor Frank Stephenson has posted a series of short essays on the “Division of Labour” blog, smashing to bits the illogical rationales offered by supporters of minimum-wage laws.  See, for example, his post “Some Questions for Mr. Ross” (February 12). 

    It’s not just libertarian or conservative advocates of free-market economics who have pointed out the folly of increasing the minimum wage; some left-liberal economists have jumped on the bandwagon, too.  For example, Christina Romer – the former chair of B.O.’s Council of Economic Advisers, who is now back in academia teaching economics at Berkeley – wrote an op-ed, published in the New York Times earlier this month, which (in the words of the I.B.D. editors) “delicately but systematically dismantled [B.O.’s] call for a 24% hike in the federal minimum wage over the next two years.”  First, she questioned the need for a minimum wage at all, noting the essential truth that “robust competition is a powerful force helping to ensure that workers are paid” decent wages.  Then she stripped bare the argument that a minimum wage is needed on “fairness” grounds, pointing out that many minimum-wage workers aren’t poor but are supplementing a family income or are teens just entering the job market.  Plus, to the extent that businesses pass on the cost of a higher minimum wage to consumers, that could “harm the very people whom a minimum wage is supposed to help.”  Ms. Romer also demolishes the claim that a higher minimum wage will boost economic growth by giving workers more spending power.  At best it will translate into less than $20 billion in new spending – “not much in a $15 trillion economy,” she observed.  No wonder Ms. Romer is no longer part of the regime.  “It’s too bad Obama isn’t listening to [her] these days.  If he ever did” (“Minimum Sense,” March 5). 


    n “Outrageous” Nanny-Statism 

                The Transportation Security Administration (TSA) – an agency that ought not exist – has issued new rules relaxing the draconian restrictions that have been in place on airline passengers since the 2001 militant Islamic terrorist attacks.  Earlier this month TSA Administrator John Pistole announced that, effective April 25, travelers will be allowed to carry (or to have in their carry-on luggage) such items as small knives (knives without a molded grip and with blades less than 6 centimeters, or 2.36 inches, in length – like Swiss army knives, for example); and certain sports equipment, such as billiard cues, ski poles, hockey and lacrosse sticks, and two (but only two!) golf clubs.  Razor blades and box cutters, such as those the 9/11 hijackers used, would still be prohibited.  The move follows TSA actions that earlier eased restrictions on cigarette lighters and fingernail clippers, which were also formerly banned.  And the allowances are more closely in line with standards set by the International Civil Aviation Organization, the TSA said.  In other words, it’s a long-overdue, common-sense change in U.S. travel regulations. 

    Nevertheless, the change set off an immediate outcry, especially from flight attendants.  The Flight Attendants Union Coalition, representing nearly 90,000 flight attendants at U.S. airlines nationwide, called it “a poor and shortsighted decision by the TSA.”  “We believe that these proposed changes will further endanger the lives of all flight attendants and the passengers we work so hard to keep safe and secure,” the group said.  Stacy Martin, president of Southwest Airlines’ flight-attendants union, Transport Workers Union of America Local 556, called the decision “outrageous” (“TSA to allow pocketknives on planes,” USA Today, March 6). 

    What’s really “outrageous” here are the misguided “security” measures the federal government has taken – including the creation of the TSA itself – since 9/11.  As I maintained in my essay observing the tenth anniversary of the Sept. 11, 2001 attacks – “Remembering 9/11: The Tenth Anniversary” (Sept. 15, 2011) – “the U.S. government (both the Bush administration and the Congress) did overreact, with mostly misguided policies – policies that have been followed (despite rhetoric to the contrary) by the B.O. regime and by both the 111th and 112th Congresses – that, unfortunately, have not really made Americans safer from terrorist attacks while, at the same time, have eroded Americans’ civil liberties.”  Rather than helping Americans be better prepared to defend themselves from future terrorist attacks, these policies have transformed us into a nation of sheep, even more dependent on the government for our security.  Nothing better illustrates this than the pathetic whining of some Americans about such a minor, common-sense change in these nonsensical rules.


    n “Big Brother” Bloomberg Strikes Again 

    New York City was once the greatest city in the world – both its financial capital and a leading cultural center.  Today, sadly, it’s more like the laughing-stock of the world, thanks to Mayor Michael “Big Brother” Bloomberg and the truly “outrageous” nanny state he’s turning the city into.  (Maybe we should call it a “nanny city-state,” or a “ninny state.”)  As I noted in “Summer 2012 in Review” (Sept. 12), Bloomberg “can’t seem to stop imitating `Big Brother,’ the paternalistic dictator of George Orwell’s dystopian novel 1984.”  First Bloomberg banned smoking in public places – not only indoor workplaces but even outdoor spaces, such as all the city’s parks, beaches, swimming pools and pedestrian plazas.  (Never mind that the scientific studies purporting to show health dangers from “secondhand smoke” are not only dubious with regard to indoor smoking but totally non-existent when it comes to outdoor smoking.)  Then Bloomberg banned trans-fats from restaurant food.  (Never mind that trans-fat amounts to a tiny fraction of total fat consumption and, according to many nutritionists, is actually better for our health than saturated fats, which Bloomberg’s ban doesn’t touch.)  

    Last summer Bloomberg proposed a ban on soft drinks sold in containers 16 ounces or more, supposedly to force New Yorkers to reduce their sugar consumption.   His proposed rule would prohibit restaurants, mobile food carts, delis and concessions at movie theaters, stadiums or arenas from selling sugary drinks in cups or containers larger than 16 ounces.  The ban would apply to soda pop (beverages with more than 25 calories per 8 ounces) but not to 100% juice drinks or beverages with more than 40% milk.  Contrary to popular belief, however, the ban does not include convenience stores, such as 7-Elevens, and supermarkets, both of which are regulated by state government.  (So customers can still get their “Big Gulps.”)  However, Bloomberg recently said that he’d urge the state to move forward with a ban that matches the city’s new regulations.  

    The ban on soft drinks in containers larger than 16 ounces was approved on Sept. 13 by the city Board of Health, whose members were appointed by the mayor, and was to take effect at the beginning of March.  Thankfully, however, a judge of the New York Supreme Court (the name the Empire State gives its trial courts) – Judge Milton Tingling – struck down the infamous ban, on the brink of its implementation.  He declared Bloomberg’s soda diktat to be an unconstitutional “arbitrary and capricious” action that violates the doctrine of separation of powers.  Indeed, Judge Tingling ruled that “it would eviscerate,” not just violate, that fundamental doctrine of American government, and that “such an evisceration has the potential to be more troubling than sugar sweetened beverages.”  The judge also ruled that the city health board didn’t have the authority to limit or ban a legal item (like sugary soft drinks) under the guise of “controlling a chronic disease” (presumably, obesity), because “imminent danger” to public health by such a “disease” was not proven in this case.  (Thus, the judge correctly recognized that the ban was not a valid exercise of government’s “police power” because it did not concern public health.)  The judge imposed a permanent restraining order against Bloomberg’s ban, which the mayor immediately vowed to appeal to a higher court.  Various groups that had opposed the ban – including the American Beverage Association, which was one of the plaintiffs challenging it in court, as well as the National Association of Theater Owners – cheered the court’s decision. 

    Like a fascist Energizer bunny, however, “Big Brother” Bloomberg “keeps going and going” – constantly proposing new paternalistic government measures.  Immediately following last summer’s movie-theater massacre in Colorado – even before all the victims’ bodies were removed from the theater – Bloomberg politically exploited the horrid crime by calling for more government restrictions on guns and suggesting that police go on an illegal strike in order to coerce public support.  Following the Newtown, Connecticut school massacre early this year, he similarly exploited the tragedy to call for more gun controls, including a federal ban on so-called “assault weapons” and restrictions on ammunition.  And as if his wars on cigarettes, trans-fats, soft drinks, and guns weren’t enough, Bloomberg recently has taken aim at three other targets:  baby formulas, Styrofoam containers, and tobacco products.  As part of the city’s campaign to urge mothers to breast-feed their infant babies – a program awkwardly named “Latch On NYC” – Bloomberg last summer announced new regulations on city hospitals, forcing them to keep baby formulas hidden behind locked doors, to cease giving out samples of formula to new mothers, and to document a medical reason every time the hospital gives a baby a bottle.  Then in his mid-February “State of the City” address, “Big Brother” Bloomberg called for a ban on plastic-foam (such as Styrofoam brand) food packaging, arguing that it’s filling up landfills (another myth).  Most recently, he proposed to ban all stores from publicly displaying tobacco products.  Under his proposal, tobacco products would have to be kept out of public view – under counters, in drawers, or behind curtains – supposedly to discourage people from smoking.  “Even one new smoker is one too many,” Big Brother Bloomberg pompously said. 

                Bloomberg is a megalomaniac who is depriving New Yorkers of their personal freedom – and of their responsibility for their own lives.  The important underlying issue here isn’t health or safety; rather it’s power – political power, the coercive power of government.  As Lord Acton stated in his famous maxim, “Power tends to corrupt, and absolute power corrupts absolutely.”  New Yorkers ought to examine their city charter and consider whether it vests far too much power in just one person, the mayor.


    n Do They Give a Merit Badge for Homophobia? 

                In early February the Boy Scouts of America (BSA) were “in the news again, for the only thing they ever seem to be in the news for anymore: their attitudes toward homosexuality.”  So reported Nick Gillespie, editor of Reason.com and a former Boy Scout himself, in a Wall Street Journal op-ed (“A Lesson from the Scouts’ Own Book,” February 1).  The BSA had planned to hold a vote on whether to end the national organization’s longstanding ban on openly “gay,” or homosexual, scouts and leaders.  Although many commentators expected the Scouts to adopt a compromise position – ending the national blanket ban but still permitting individual chartering groups (many of which are churches) to decide for themselves whether to permit open homosexuals to join and help run their troops – the BSA decided on February 6 to delay the controversial vote.  The organization declared it needed more time for “a more deliberate review of its membership policy” and would take action on the issue in May during its National Annual Meeting. 

    As a matter of constitutional law, of course the BSA, as a private membership organization, is entitled to determine its own “values” and to decide for itself whom it will accept (or reject) as members; that is a fundamental freedom – actually, a part of both the freedom of speech and freedom of “association” guaranteed by the First Amendment.  But the organization is under tremendous pressure, both from outside and from within (including present and former Scouts, both homosexual and heterosexual), to change its policy – as the attitudes of Americans generally are changing, becoming much more accepting of homosexuality (as the recent trend in favor of same-sex marriage indicates).  Nevertheless, perhaps because of those changing attitudes in the general population, many conservatives are adamantly opposed to a change in membership policy.  About 70% of troops are chartered to faith-based groups, including the Catholic Church, which condemns homosexuality as a sin.  Forced to welcome openly gay Scouts and leaders, some groups would end their BSA affiliation, seriously weakening an organization already declining in numbers and influence.   

    Legal and political considerations aside, the issue indeed is one of moral principles, as conservatives maintain.  But the values they claim to uphold – the traditional values on which the BSA was formed a century ago – are seriously flawed and out of touch, not only with contemporary society but also with human nature.   Science teaches us that homosexuality is as “natural” as heterosexuality; that sexual orientation is determined by a variety of factors, both genetic and environmental, too complex to fully understand.  But the fact that only a small minority of people have a homosexual orientation doesn’t mean they’re not “normal” human beings – just as the fact that only a small minority of people have blood type AB doesn’t mean they’re not “normal” or “natural” either.  Traditionalists within the Boy Scouts claim that they’re upholding the Scout Oath, which requires members to do their best to be “morally straight” at all times.  What’s so “morally straight” about bigotry directed against those who are different – bigotry rooted in the very real phenomenon known as “homophobia” (heterosexuals’ irrational fear and hatred of homosexuality)?  Or, for that matter, why is it “morally straight” to expect boys and men who have homosexual orientation but who otherwise cherish the Scouts to repress their own feelings, just to belong? 

    In contrast, Girl Scouts USA – which celebrated its centennial last spring – has adopted a policy not to discriminate on the basis of sexual orientation.  The Girl Scouts not only have great cookies but also a great deal of common sense. 


    n More Idol Speculations – From a Survivor Fan 

                I had planned to never again mention Fox’s American Idol TV show, just as I had planned to skip watching the 12th season of the singing competition.  It’s all because of the changes in the judging panel.  Not that I was a fan of last season’s new judges, Jennifer Lopez and Steven Tyler, but I dislike their replacements even more.  Australian country singer Keith Urban is a nice enough guy (for a non-American singing in a genre that ought to be exclusively American), but his fellow newcomers, Mariah Carey and Nicki Minaj – when they’re not playing “dueling divas” – have egos not matched by their talent.  (To me, they seem to scream rather than sing their songs.)  Nicki Minaj is particularly annoying in her unpredictable oddity – aptly described by Robert Bianco (TV critic for USA Today) as a blend of “Paula Abdul’s mothering, Steven Tyler’s sometimes-inappropriate quirkiness, and, at her best, Simon Cowell’s directness.” The veteran judge, Randy Jackson, on the other hand, is still annoyingly predictable, with all his “dawg” talk and his recurring complaint that the singing is “pitchy.”  

                But like a motorist who can’t help but stop and gawk at a car wreck, I found myself watching, first the auditions, then the Las Vegas rounds (where the judges whittled the contestants down to the Top 10), and now the live shows from Hollywood.  “America” now gets to vote – that is, those Americans who are such fans of the show that they text, tweet, or phone their votes for their favorite.  (Everyone gets up to something like 50 votes, which pretty much guarantees that whoever has the most fanatical fans will win, regardless of whether they’re the most talented singer.)  Last year, I successfully predicted the winner, Phillip Phillips – whose unique singing style impressed me the first time I saw him in the auditions.  This year, I won’t press my luck by sticking out my neck again to predict a winner:  there’s no one who seems to me to be a standout (although the show’s producers seem to have done everything they can to stack the deck so the next winner won’t be another white male).  But I do have a favorite:  Lazaro Arbos, the 21-year-old from Naples, Florida, who stutters so badly he can barely speak but who sings beautifully – another Mel Tillis!  (Google that name if you don’t know who he is.) 

                Meanwhile, I’m again enjoying another new season (amazingly, the 26th season) of CBS’s Survivor, the grand-daddy of TV “reality” competition shows.  As I noted in “Tricks and Treats 2012” (Oct. 25), the success of the show can be attributed to several factors, among them an effective host (Jeff Probst, who now has his own talk show on daytime TV) and an essentially sound basic format.  The producers of the show have managed to keep that format intact – having “survivors” in a harsh wilderness setting, competing with one another in various games and in their social as well as physical survival skills, to “outwit,” “outplay,” and “outlast” their competitors (not necessarily in that order) – while adding enough new “twists,” or differences, each season to keep it interesting.   

                Set in Caramoan, an island group in the Philippines, this season pits a team of “Fans” against a team of returning “Favorites.”  So far the veteran players have been winning most of the competitions – a result not only of their experience but also of the foolish strategy of the Fans, who’ve created a six-person alliance targeting the four youngest (and best-looking, and strongest) players for elimination.  But the Favorites have had some rough spots too, mostly due to personality conflicts (the divisive play of ex-"federal agent" Phillip and the volatility of Brandon, whose recent meltdown resulted in the team forfeiting a challenge so they could immediately vote him off).  Again, I won’t stick out my neck to predict who will win, but I will note my favorite player, one of the Favorites:  Malcolm Freberg, a 26-year-old bartender from Hermosa Beach, California.  A veteran of last season’s “Survivor: Philippines,” Malcolm is strong (he played college football for Dartmouth), fairly smart, and has a pleasant personality.  In his previous outing in the Philippines, he nearly “outwitted/outplayed/outlasted” his competitors, being the 15th player voted off (and the 8th member of the jury).  Malcolm says he’s “slightly less cocky” than the last time he did this, so perhaps he’s learned from his mistakes and this time will make it all the way to the finale. 


    An Amazing Disg-race 

                Another “reality” competition show, The Amazing Race (also on CBS), is in hot water because of an outrageous recent episode.  Set in Hanoi, Vietnam, the March 17 episode had contestants trying to memorize a song performed for them by children in front of a portrait of North Vietnamese communist leader Ho Chi Minh, with subtitled lyrics that included “Vietnam Communist Party is glorious. The light is guiding us to victory.”  The contestants then had to go to a B-52 Memorial – the wreckage of an American bomber plane shot down during the Vietnam War – to find the next clue in their televised round-the-world journey.  In the episode, the twisted metal of the downed plane is treated like any other prop, with a bright “Amazing Race” “Double-U-Turn” sign planted in front of it, signifying to contestants the next phase of their scavenger hunt. 

    The episode has aroused the ire of a broad spectrum of viewers, including (not surprisingly) Vietnam War veterans, the VFW, and the American Legion.  And it’s not just veterans’ groups or conservatives who are outraged:  Bob Beckel, the left-liberal commentator on Fox News’ The Five, was one of the first media figures to condemn the program.  So far both CBS network executives and the show’s executive producer, Jerry Bruckheimer, have refused to apologize – or even to acknowledge the controversy.  With a Marxist sympathizer in the White House, apparently folks in the media no longer consider it outrageous for American game-show contestants to participate in Communist propaganda, even when it’s done in a former enemy nation that was responsible for the deaths of tens of thousands of young Americans in war.

                    UPDATE:  At the beginning of the March 24 episode of Amazing Race, CBS made the following statement: “Parts of last Sunday’s episode, filmed in Vietnam, were insensitive to a group that is very important to us – our nation’s veterans.  We want to apologize to veterans, particularly those who served in Vietnam, as well as to their families and any viewers who were offended by the broadcast.”  (Notice that CBS does not admit it was wrong to promote Communist propaganda – just “insensitive” to Vietnam veterans.)  Producer Jerry Bruckheimer issued a similar apology via his company’s Facebook site. 


    Some More BFDs

                 Another (occasional) tradition on MayerBlog is the posting of entries discussing “BFDs” – issues in politics and/or pop culture that have caused much needless hand-wringing, for only in the sarcastic sense are they really (in the words of V.P. “Smiley” Joe Biden) “Big Fucking Deals.”  Here are some more BFDs that have wasted a lot of time and effort this spring:

    • Sip-of-Water-gate:  Senator Marco Rubio (R.–Fla.) became the object of much derision from the leftist news media because of his awkward pause, to take a sip of water, as he delivered the official Republican Party response to B.O.’s State of the Union address on February 12.  Commentators on CNN (which might stand for “communist news network”) went so far as to declare that Rubio’s political future – including a possible run for the presidency in 2016 – was finished!  It seems that leftist criticism of Senator Rubio has intensified as he takes a leading role in negotiating immigration reform on Capitol Hill – and (ironically) as the predominantly leftist news media has begun touting him as the “savior” of the GOP (as Time magazine did on an early February cover).  But Rubio’s supposed gaffe is nothing compared to the embarrassing faux pas committed by B.O. when visiting Britain a couple years ago.  He rose to toast Queen Elizabeth II – beginning with the words, “Ladies and gentlemen, to the Queen!” – apparently unaware (didn’t any of his handlers tell him?) that those words should be spoken at the end of the toast, for they would signal the band to begin playing “God Save the Queen.”  Yet B.O., being his usual narcissistic self – oblivious to everything but his own imagined greatness – insulted the Queen and all of Great Britain by continuing to deliver the rest of his toast as the band played the British national anthem, which B.O. seemed to assume was merely background music for his own voice!  (Thanks to Thomas Lifson, who reminds us of B.O.’s “cringe-inducing gaffe involving a sip” and posts the video in his entry, “A Little Perspective on Rubio’s Sip of Water,” on the American Thinker blog, February 14.)

    • Michelle, Ma Belle – NOT!  Michelle Obama is the most overrated First Lady in U.S. history.  Like her husband, Mrs. B.O. benefits from what I’ve called “The Emperor’s New Clothes” phenomenon (in her case, the “Empress’s New Clothes”): many people in the news media, not wanting to be called racist for criticizing the first black First Lady in U.S. history, give her false adulation – oohing and aahing over her supposed great fashion sense, or her latest change in hair style – while ignoring the negative side of Michelle.  In truth, she’s not very fashionable, isn’t very attractive (with those silly Moe Howard bangs and that unsightly under-bite), and by many accounts (on those few occasions when the truth seeps through the media hype) she’s a shrew!  (The only sympathy I have for B.O. is because he’s married to this woman.  Little wonder that recently, while she and the girls flew to a posh resort in Colorado for a ski weekend, B.O. instead flew to Florida to play golf with Tiger Woods.)  More importantly, she hen-pecks not only her husband but also all Americans, with her crusade against what she regards as “unhealthy” foods.  Like her husband, she’s a paternalist who wants to dictate how other people should live their lives.  And, with all the grossly expensive travel junkets she’s taken since her husband began his occupation of the White House, she’s also probably the most profligate spender of money ever to be First Lady (another way in which she and B.O. make the “perfect couple” who deserve each other).  Mrs. B.O. is truly the “Marie Antoinette” of modern American politics – only instead of saying to the peasants “Let them eat cake,” she’s saying “Let them eat tofu”!

    • Not Exactly a Carnival Triumph:  Last month more than 3,000 passengers on the Carnival cruise ship, Carnival Triumph, experienced “cruise ship hell” when the ship broke down in the middle of the Gulf of Mexico.  The 900-foot, 14-story vessel was returning to Galveston, Texas, from Cozumel, Mexico, on the third day of a four-day cruise when an engine-room fire knocked out power and plumbing on most of the ship.  No one’s life was endangered, and by virtually all accounts the crew of over 1,000 acted valiantly, doing their best to help the passengers cope with the lack of electricity, running water – and working toilets.  Toilets and drainpipes overflowed, soaking many cabins and interior passages in sewage, and filling the ship with an overpowering stench.  How could this happen?  Because, by several accounts, many of the passengers behaved like animals.  Passenger Jacob Combs, 30, an Austin, Texas-based sales executive with a health-care and hospice company, told Reuters, “Just imagine the filth.  People were doing crazy things and going to the bathroom in sinks and showers.  It was inhuman.”  Combs had nothing but praise for the crew members, saying they had gone through “hell” cleaning up after the irresponsible passengers.  “They were constantly cleaning,” he said, adding that after he left the vessel, “The thing I’m looking forward to most is having a working toilet and not having to breathe in the smell of fecal matter.”  No shit!  (Meanwhile, Carnival’s multiple apologies – and offers of full reimbursement, travel expenses, future cruise credits, and a $500 additional payment to help compensate for the ordeal – have failed to satisfy many disgruntled passengers, who vow to sue the cruise line for millions of dollars in damages.  Apparently they can’t accept that, despite the cruise industry’s generally remarkable safety record, from time to time, shit does happen.) 

    • Zombies and Witches, Oh, My!:  Vampires are so passé; the hottest supernatural genre in pop culture today seems to be zombies.  They’re the subject of a recent movie, Warm Bodies, as well as a hit cable TV series, The Walking Dead, now in its third season on AMC.  I hate to make this political (yeah, sure!), but the phenomenon seems apt today.  Aren’t zombies supposed to feed on living human brains?  That may explain all the “low-information voters” who reelected B.O.  There’s another fad in pop culture (the supernatural fantasy department) – witches – as illustrated by some current films (Beautiful Creatures, Oz the Great and Powerful, and Hansel & Gretel: Witch Hunters) and TV series.  (Even the CW’s Vampire Diaries seems to have been taken over by witches, overshadowing the vampire title characters.)  Why all this interest in witches?  No doubt it has something to do with certain women being touted as female “role models,” including former First Lady H.R.C. and the current First Lady, Mrs. B.O. 

    • The Devil’s in the Details:  History, the cable TV channel, has generated a mini-controversy with its hit series The Bible.  It seems that the actor cast to portray Satan bears a striking facial resemblance to B.O., according to many viewers – including Glenn Beck, who tweeted, “Anyone else think the Devil in #TheBible Sunday on History Channel looks exactly like That Guy?” (as noted above, Beck’s euphemism for B.O.).  “This is utter nonsense,” executive producers Mark Burnett (creator of Survivor) and his wife, actress Roma Downey, said in a joint statement.  “The actor who played Satan” – Mohamen Mehdi Ouazanni – is “a highly acclaimed Moroccan actor,” who previously played parts (including Satanic characters) in several Biblical epics, Burnett noted.  Downey, a devout Roman Catholic, added, “Both Mark and I have nothing but respect and love [for] our President, who is a fellow Christian.”  (She neglected to mention that B.O. is the son of a Muslim – and a Marxist – from Africa and that, by his own admission, he inherited the “dreams” of his father.)  But the real scandal here is that History – which used to be known as The History Channel – has so little real history in its programming.  It seems that “History” today consists largely of myths, UFO stories, and redneck reality shows like American Pickers, Pawn Stars, and Swamp People.  

    • Wrestling with the IOC:  The International Olympic Committee (IOC) has announced that it plans to eliminate wrestling.  At its recent meeting in Switzerland, the 15-member executive board recommended that wrestling not be on the list of 25 core sports proposed for the 2020 Summer Games.  Jim Scherr, former CEO of the U.S. Olympic Committee and an Olympic wrestler in 1988, said the decision-makers behind the move do not represent countries where wrestling is popular and successful – namely, the USA (which has won 124 Olympic wrestling medals) or Russia (which along with the former Soviet Union has 167 wrestling medals).  Scherr added that he thinks the decision is “a reflection of the Eurocentric nature of the IOC board, the IOC membership as a whole.”  USA Today sports writer Christine Brennan went further, maintaining that the decision reflects the IOC’s anti-U.S. bias.  “The USA is to the IOC what Notre Dame is to college football.  People love to hate it” (“IOC’s anti-U.S. bias no surprise,” February 14).  For all those young athletes (and their families) whose dreams of an Olympic medal in wrestling may be crushed, the IOC’s decision really is a B.F.D.  And for everyone who appreciates the history of wrestling as a sport – one of the world’s oldest sports, and one of the original events at the ancient Olympic games – the IOC’s decision is a travesty.  But it illustrates a phenomenon that’s becoming all-too-predictable: an international bias against the West.  That such a bias has infected the Olympics – one of the greatest legacies left to the world by the birthplace of the West, ancient Greece – is simply sad.


     | Link to this Entry | Posted Saturday, March 23, 2013.  Copyright © David N. Mayer.

    The Unconstitutional Presidency - February 21, 2013



    The Unconstitutional Presidency



    This essay is in part based on a two-part talk on “Restoring the Constitutional Presidency” I presented at the Atlas Society’s Summit in Washington, D.C. on June 30 & July 1, 2012.  Videos of my talk are posted on The Atlas Society’s website: Part I and Part II.  (Each part is approximately 60 minutes in length, including Q&A.)


    In the fall of 1787, when Thomas Jefferson (then serving as U.S. ambassador to France) received a copy of the new Constitution of the United States proposed by the Constitutional Convention, he had concerns about two chief “defects” he perceived in the document.  One was the absence of a bill of rights – a concern Jefferson shared with the Antifederalist opponents of the Constitution and, probably, a majority of the American people.  That defect almost derailed ratification of the Constitution, but it was corrected by 1791 with the addition of the first ten amendments – the Bill of Rights – to the Constitution.   

    It was the second “defect” that Jefferson perceived that relates to the topic of this essay: the perpetual eligibility of the president for re-election.  Recognizing that the Constitution created a strong and independent Chief Executive, Jefferson feared that without “rotation in office” – what we today call “term limits” – the president would be perpetually re-elected, making him an officer for life, essentially an elective monarch.  Besides being an evil in itself, that would be just one step away from a hereditary monarch – which would undo the American Revolution, substituting an American king for the British monarch (George III) whom the Declaration of Independence had pronounced a “tyrant.” 

    Most of the American people in 1787-88, as the Constitution was being ratified, did not share Jefferson’s concern about the lack of term limits on the presidency.  Jefferson himself acquiesced in Americans’ confidence in George Washington – whom everyone expected to be elected the first president – and Washington earned that confidence by his decision to voluntarily retire after two terms.  Jefferson applauded Washington’s decision – which created a tradition that Washington’s successors (including Jefferson) would follow, until the 20th century.  But Jefferson nevertheless predicted that it would be necessary to amend the Constitution to limit presidents to two terms – making explicit and part of the text the tradition that Washington originated – because “the natural progress of things is for liberty to yield and government to gain ground.”  But, Jefferson added, amendment of the Constitution to correct this flaw would have to wait until “inferior characters” succeeded Washington and “awakened us to the danger which his merit has led us into.” 

    The “inferior character” who awakened the American people to the flaw in the Constitution that Jefferson had perceived was finally elected president in 1932: Franklin D. Roosevelt (FDR), who was reelected in 1936 and then reelected not only to an unprecedented third term in 1940 but even to an unprecedented fourth term in 1944.  As the president who greatly expanded federal government powers through his “New Deal” programs as well as the president who served as Commander-in-Chief during World War II, FDR wielded far more power than any of his predecessors and, in many ways, embodied the greatest fears of Jefferson, Madison, and other Founders who imagined that the great powers of the office might indeed transform the presidency into a monarchy.  Not until FDR’s death in office in 1945 had made him, effectively, a president reelected for life (precisely what Jefferson had feared) did the American people finally amend the Constitution, adding the Twenty-second Amendment in 1951 and limiting the president to two terms. 

    Although the Twenty-second Amendment has corrected the “defect” in the Constitution that so concerned Jefferson in 1787-88, the presidency still remains a dangerously powerful office.  FDR’s successors from both major political parties, both Democrat and Republican, have continued to expand the powers of their office – raising the specter of “an imperial presidency,” in effect, an elective monarch, who wields far more power than did the British “tyrant,” George III. 

    As I’ll argue in this essay, the modern American presidency has departed significantly from the model created by the framers at the Constitutional Convention and implemented by the early American presidents.  Unless something is done to restore the limits that the Constitution puts on the office, the modern presidency will continue to degenerate into a tyranny, thus fulfilling the worst nightmares of Jefferson and other Founders.   

    That’s especially true today, as the current “Occupier” of the White House, B.O., begins his second term.  B.O. has been, as I described him in my essay “Rating the U.S. Presidents 2012” (Feb. 22, 2012), not only “the worst president in U.S. history” but also “the most lawless president in U.S. history.”  B.O. himself asserted (in comments he made last March to the Russian president – confidential comments that were accidentally picked up on an open microphone) that in his second term, he could be “flexible” – meaning he feels free to ignore not just the constitutional restraints on his exercise of power but also political restraints (including public opinion).  A “lame duck” president can be an irresponsibly dangerous president.  Thankfully, as discussed in the final section below, the framers of the Constitution provided a remedy – impeachment and removal from office – one that I predict will have to be used, to put an end to B.O.’s tyrannical abuse of presidential power.



    The Constitutional Presidency:

    Restrained by the “Chains of the Constitution”


    During the 1790s Thomas Jefferson (whom Washington chose as his first Secretary of State) and James Madison (a leader in the U.S. House of Representatives) became the leaders of the first true national opposition political party.  They called theirs the “Republican” Party because they feared that the establishment Federalist Party was undermining the limited, republican form of government created by the Constitution.  A major part of their concern was the broad power wielded by Washington, particularly in the realm of foreign policy.  Jefferson and Madison did not blame Washington, whom they both admired, but they were concerned that Washington was following too much the advice of his Secretary of the Treasury, Alexander Hamilton – a man who openly admired the British monarchical system of government and who wished the American presidency would be more like the British monarchy. 

    One of the major issues that divided the Republican and Federalist parties – and which pitted Jefferson and Madison against Hamilton, in the arena of public-policy debates – was Washington’s decision to issue a Neutrality Proclamation in 1793, reminding Americans that the United States was neutral with regard to the European war between Britain and France.  It wasn’t the policy of neutrality per se that concerned Jefferson and Madison – indeed, they supported the policy – but rather the president’s unilateral action in issuing the proclamation without first consulting Congress.  They argued that by acting on his own authority as president, Washington had undercut Congress’s power to declare war, one of its enumerated powers in Article I, Section 8 of the Constitution. 

    The parties’ conflict over foreign policy served as the backdrop to yet another constitutional controversy, which arose during the presidency of Washington’s successor, John Adams.  The Federalists (especially the hawkish Hamiltonian wing of the party) were agitating for the United States to enter the European war on the side of Britain – in other words, to go to war against France.  As part of their attempted build-up to war (what historians have called the “Quasi-War”) during the summer of 1798, the Federalists who controlled both houses of Congress passed into law a series of measures called the Alien and Sedition Acts.  Of these, the Sedition Act was most ominous: it criminalized criticism of the president or the Congress, in a blatant attempt to silence the Republican party opposition.  Jefferson and Madison responded by anonymously drafting resolutions, which were passed by the legislatures of Kentucky and Virginia, respectively, protesting the unconstitutionality of the laws. 

    In his Kentucky Resolutions – in the eighth resolution that he drafted – Jefferson declared a principle that I regard (in my book The Constitutional Thought of Thomas Jefferson) to be the touchstone of his philosophy regarding the Constitution.  He wrote: 

    “Free government is founded in jealousy, not in confidence.  It is jealousy, and not confidence, which prescribes limited constitutions, to bind down those whom we are obliged to trust with power.”


    After further noting that the Constitution had “fixed the limits” of power, he concluded: 

    “In questions of power, then, let no more be heard of confidence in  man, but bind him down from mischief by the chains of the Constitution.”


    Jefferson’s metaphor is quite apt: the purpose of written constitutions is to limit governmental power, to prevent its abuse.  Hence, if we are to take seriously the Constitution, we also must take seriously its chief structural devices for limiting power – what Jefferson called “the chains of the Constitution” – including separation of powers and its corollary principle, checks and balances.  Jefferson took these structural devices more seriously than most of the Founders – and certainly more than any of the other U.S. presidents – particularly the constitutional separation of powers and the critically important limitations it put on the power of the President of the United States.               

    The success of Jefferson’s and Madison’s Republican Party in the election of 1800 – in which Jefferson was elected president and the Republicans won majorities in both houses of Congress – was considered by Jefferson to be a second American Revolution.  “The Revolution of 1800,” he called it, for it represented the American people’s validation of the party’s constitutional principles and a reaffirmation of the Constitution as it was understood at the time of its ratification.  The Federalists in various ways had departed from that original meaning of the Constitution – as noted above, the Hamiltonian wing of the Federalist Party had tried to transform American republicanism into a system reminiscent of the aristocratic, monarchical English constitutional system – and the Republican victory in 1800 also marked the beginning of the end of the Federalist Party.  A permanent minority party at the national level after 1800, the Federalists eventually disappeared as a party after 1815.  (Their opposition to the War of 1812, which they denounced as “Mr. Madison’s War,” essentially marked the death knell of their party.)  During the 1820s, the “Era of Good Feelings” that prevailed during James Monroe’s presidency, there was no clear partisan division, although schisms within the Jeffersonian Republicans (which first appeared during Jefferson’s second term) anticipated the next major party system, the “Second Party System” (as political historians call it) – the Jacksonian Democrats versus the Whigs – the precursor of our modern party system, of Democrats versus Republicans. 

    As president, Jefferson departed in significant ways from the practices of his predecessors, Washington and Adams – certain practices that he regarded as unconstitutional, for they stretched the powers of the office too far.  Some of these changes were more symbolic than substantive, such as Jefferson’s practice of sending his annual reports in writing to the Congress rather than delivering in person a “State of the Union” address (discussed in the next section below).  Others were both symbolic and substantive, such as Jefferson’s refusal to issue presidential proclamations for days of national prayer or thanksgiving – which he viewed as barred by the First Amendment religion clause (which he wrote, in famous words, “erected a wall of separation” between church and state).  More substantively, Jefferson sought to nullify the bad precedents that Washington and Adams set by making new precedents, more respectful of the constitutional limits on the presidency.  He refused to continue prosecutions that had begun under the Sedition Act, and he used his pardoning power to release from prison persons who had been convicted under the unconstitutional law.  When the United States became involved in its first foreign war, the “Barbary War” in the Mediterranean against North African Muslim pirates, Jefferson – mindful of his criticism of Washington’s Neutrality Proclamation – was careful to get Congress’s authorization of offensive measures (as noted in the section on Commander-in-Chief powers, below).  With regard to the most audacious act of his presidency, the Louisiana Purchase, Jefferson was so troubled by the precedent it might set that he drafted a constitutional amendment authorizing it, “to set an example against broad construction.”  

    In general, Jefferson recognized that the Constitution separated the national government’s legislative and executive powers, vesting the former in Congress and the latter in the president.  (And in addition to the general law-making power, the Constitution also vested in Congress the power over “the purse,” to collect taxes and to appropriate money, as well as important war powers, discussed in the section on the Commander-in-Chief power, below.)   In exercising his most important power as president, the “Executive Power” vested in the president by Article II of the Constitution, Jefferson understood his basic role was to enforce the laws made by Congress:  “I am but a machine, erected by the constitution for the performance of certain acts, according to the laws of action laid down for me,” he wrote in 1805.  This was not merely a presidential power, but a duty that the Constitution (by a provision in Article II, Section 3) imposed on the president: “he shall take care that the laws be faithfully executed.” 

    Jefferson transformed the presidency, restoring the original limits that the Constitution placed on the office, and his more-restrained model of the Chief Executive, “bound by the chains of the Constitution,” set the precedents that virtually all his successors followed throughout the 19th century.  Even Andrew Jackson, one of the “strongest” presidents of the 19th century, followed Jefferson in respecting separation of powers, vis-à-vis the Congress (although Jackson famously defied the U.S. Supreme Court); coming from the more radical wing of Jeffersonian Republicanism, Jackson also followed Jefferson in respecting the Constitution’s system of strictly enumerated federal powers.  (In one of his most famous acts, vetoing the bill that re-chartered the Second Bank of the United States, Jackson used the veto power precisely as Jefferson understood it, as a check against unconstitutional legislation – as the section on the veto power, below, more fully discusses.)  Within the limits of his office, however, Jackson did fully exercise his powers – prompting an opposition party, the Whigs (who derived their name from the English opposition party of the late 17th century), to criticize Jackson for acting more like a monarch, “King Andrew,” than like a president.  Before the Civil War, presidents of both parties, Democrat and Whig, respected the limits the Constitution placed on their office. 

    Even Abraham Lincoln, who as president during the Civil War stretched the powers of the national government to their limits (or even beyond their limits, as his critics have maintained to the present day), was mindful of the constitutional separation of powers.  He broadly used his presidential war powers (especially his power as Commander-in-Chief), ordering a blockade of Southern ports, authorizing the arrest and imprisonment of civilians without due process of law (suspending the writ of habeas corpus), and freeing slaves in “rebel” territories through his Emancipation Proclamation.  Yet Lincoln was deferential to Congress on matters of legislative policy (except for Reconstruction, where he used a “pocket veto” to nullify the Wade-Davis Bill).  The apparent paradox has been explained by historian David Donald, who calls Lincoln “a Whig in the White House,” citing his political background as a Whig (before the Republican Party was created in the mid-1850s) and the Whig criticisms of “King Andrew” Jackson. 

    Lincoln’s successor, Andrew Johnson, was the first president to be impeached and nearly removed from office.  As noted in the section on impeachment, below, Johnson was impeached in part because he was openly critical of Congress, using his office as a “bully pulpit” (as Teddy Roosevelt later called it), and thereby undermining Congress’s law- and policy-making prerogatives. 

    The fact that Johnson was impeached for using the presidential office essentially the same way his 20th-century successor, Teddy Roosevelt, has been praised for using it – as a “bully pulpit” – highlights how significantly the office of president was transformed in the early 20th century, during the so-called “Progressive Era.”  As I have frequently noted here on MayerBlog, the “Progressive” activists sought a more powerful, paternalistic government and had nothing but disdain for anything that stood in their way – not just the American tradition of individualism but also the American constitutional system, with all its structural devices (including separation of powers, checks and balances, and federalism) for limiting the power of government and safeguarding the rights of individuals.  It’s not surprising, then, that the two “Progressive” presidents of the early 20th century – the Republican, Teddy Roosevelt (TR), and the Democrat, Woodrow Wilson – departed in significant ways from the precedents set by Jefferson and their other 19th-century predecessors, as well as from the powers of the presidency as enumerated in the text of the Constitution. 

    TR famously used the presidential office as a “bully pulpit,” appealing directly to the American people, and thus bypassing Congress (and its legitimate power not only to make the laws but also to determine public policy).  More ominously, TR (in his autobiography) created a new model for the presidency – emphasizing the role of the president as “steward” of the American people: 

    “[O]ccasionally great national crises arise which call for immediate and vigorous executive action. . . . [and] in such cases, it is the duty of the President to act upon the theory that he is the steward of the people, and that the proper attitude for him to take is that he is bound to assume that he has the legal right to do whatever the needs of the people demand, unless the Constitution or the laws explicitly forbid him to do it.”


    What is so dangerous – frightening, in fact – about TR’s new model of the presidency is that it turns on its head the fundamental principle of the Constitution, the enumeration of powers.  Under the Constitution, the president has only those powers granted him, the powers enumerated in the text (most of them in Article II).  If the president exercises any powers other than those enumerated, he’s not being a “steward”; he’s acting unconstitutionally, and the word for that is “tyrant” or “dictator.” 

    Woodrow Wilson, a former university president who had written a doctoral dissertation that praised the British parliamentary system of government, had similar disdain for the limits the Constitution imposed on the presidency.  In fact, as president he fancied himself to be more like a British prime minister.  Little wonder, then, that he changed the Jeffersonian tradition of submitting written annual messages to Congress and instead started the tradition of modern presidents personally speaking before a joint session of Congress, in their annual “State of the Union” address (discussed below).  Among other things, Wilson led the United States into World War I – a European war in which the United States had no business getting involved – but he failed in his plans for U.S. participation in the League of Nations when the Senate refused to ratify the treaty.  It was a deliberate rebuff to Wilson – and a sign of the American people’s desire to return to constitutional “normalcy” after the war.    

                Unfortunately, two of Wilson’s successors – the Republican, Herbert Hoover, and the Democrat, Franklin D. Roosevelt (FDR) – exploited the economic crisis of the Great Depression to expand not only the powers of the national government, generally, but the powers of the president, vis-à-vis the other two branches of government, Congress and the Supreme Court.  The paternalistic national “welfare state” programs known as the “New Deal” actually began under Hoover, mostly under a model of government cooperating with business.  Under FDR the programs expanded greatly, following a more coercive model of government regulating, or controlling, business – as epitomized by FDR’s exercise of unprecedented regulatory power under the National Industrial Recovery Act, which the Supreme Court (correctly) held to be unconstitutional.  Then FDR turned on the Court itself, with his infamous “Court-packing” plan of 1937 – a plan he never needed to implement because he was able to transform the Court, with pro-New Deal justices, through the normal process of attrition, as older justices died or retired.  By the time the United States entered World War II, all but one of the justices had been nominated by FDR; and the “Roosevelt Court” upheld virtually all of FDR’s unconstitutional acts, including (in the Court’s infamous Korematsu decision in 1944) his despicable executive order authorizing the confinement of Japanese-Americans in internment camps on the West Coast. 

                It was FDR who transformed the presidency – not only through his unprecedented election to four terms, but also through his exercise of powers not granted the president by the Constitution.  FDR’s “stewardship” of the office really was more like a tyranny or a dictatorship – and it emboldened his successors, both Democrat and Republican, to abuse power in additional ways.



    The Unconstitutional State of the Union Address


    Earlier this year the current “occupier” of the White House delivered the annual speech to a joint session of Congress known as the “State of the Union” address.  Setting aside the substance of B.O.’s speech – the policies he’s pushing as he begins his second term, which I’ve generally discussed in my previous “2013: Prospects for Liberty” essays – I’m focusing here instead on the format of the “State of the Union” address, as it has evolved over the past century (since the presidency of Woodrow Wilson).   

    The modern “State of the Union” address, I argue, is unconstitutional – it is an abuse of the power legitimately granted the president to propose measures to Congress, and it is a perversion of the fundamental constitutional principle of separation of powers.  This particular abuse of presidential power has been committed by all presidents, of either party – Republicans and Democrats – since Wilson’s time.  (Indeed, my discussion in this section is based on one of the earliest essays I posted to my blog, “The Unconstitutional State of the Union Address” (Jan. 20, 2004), criticizing the State of the Union address as delivered by B.O.’s predecessor, President George W. Bush.)   

    Following the practice of recent years, the address is a grand spectacle of modern American political theater.  Announced loudly by the Sergeant-at-Arms (“Mr. Speaker, the President of the United States!”), the current occupant of the White House strides through the aisle of the House chamber – slowly (as he shakes the hands of politicians from both parties) – and then, following two standing ovations, begins delivering his speech to an audience consisting of members of both Houses of Congress, the Cabinet, the justices of the Supreme Court, and (in the balcony) notables, including the First Lady and some “ordinary citizens” whose stories the President will cite in his speech, to give it the personal touch (a practice artfully employed by that master of political theater, Ronald Reagan, and copied by his successors).  The President’s speech consists largely of a laundry list of proposed new programs (the President’s “_____-initiative,” as each is usually described), peppered with political one-liners, greeted by raucous applause and additional standing ovations, at least by those members of Congress in the President’s political party.   

    Thomas Jefferson would be appalled.   As president, he broke the precedents set by his predecessors, Washington and Adams, and instead of personally addressing Congress, he sent his annual message in writing, usually in December of each year of his presidency, when it was read (no doubt in a dry monotone voice) by the clerk of each House.  Although some cynical Jefferson biographers claim that Jefferson did this because he was not comfortable speaking publicly, his deliberate decision not to address Congress personally was one of several ways by which he sought to limit the presidency (in its symbolic trappings as well as in the actual exercises of power) to the duties enumerated in the Constitution.

                To Jefferson, the spectacle of the President personally delivering an annual address to Congress resembled too closely the British monarch’s opening of a new session of Parliament.  He strongly believed that such ceremony had no place in a republican form of government.  Indeed, as noted above, Jefferson always feared that the American presidency might become a sort of elective kingship and thus sought to dispense with all ceremonies or practices reminiscent of European-style monarchies.  For the same reason, he stopped the practice of holding “levees,” or formal receptions, in the executive mansion and instead hosted dinner parties where the guests were seated “pell-mell” rather than according to protocol.  The British ambassador to the U.S. was offended at Jefferson’s informality – which was surely a sign that Jefferson had successfully “republicanized” the forms of American government. 

                With regard to the substance of government and the actual exercise of powers, Jefferson also took quite seriously the constitutional doctrine of separation of powers, which is a key structural feature of the Constitution.   He recognized that Article I vests the legislative power in Congress and that Article II vests the executive power in the President; Congress makes policy, and the President implements it.  The Constitution does authorize the President to “from time to time give to the Congress information of the state of the Union, and recommend to their consideration such measures as he shall judge necessary and expedient” – the constitutional basis for a State of the Union address.  But, as he typically did with regard to the power-granting clauses of the Constitution, Jefferson interpreted this provision quite strictly.  Hence, he sent written messages to the Congress, and his annual messages consisted of recommendations couched in deferential language, respectful of Congress’s role as the law-making branch of the national government.  As Jefferson described the role of the President as chief executive, “I am but a machine erected by the Constitution for the performance of certain acts according to the laws of action laid down for me.” 

                Jefferson’s successors as President in the 19th century also would be appalled at the spectacle of the modern State of the Union address.  Regardless of party (whether Jeffersonian Republican, Whig, Democrat, or Republican), they all followed Jefferson’s practice of sending a written annual message to Congress.  Even Abraham Lincoln, at the height of the Civil War, sent written messages to Congress, including his July 4, 1861 message to a special session of Congress called by Lincoln to deal with the crisis of Southern states’ secession from the Union.  To Lincoln, as to his fellow 19th-century presidents, it would have been unthinkable to appear personally in Congress to give a speech.  Typically, the only public speech an American president would make in the 19th century was his inaugural address.  (Lincoln’s Gettysburg Address – his brief remarks at the dedication of the military cemetery at Gettysburg, Pa. – was noteworthy not only as a great speech but as an exception to this rule.)  Indeed (as the final section below will more fully discuss), part of the reason President Andrew Johnson was impeached and almost removed from office was that he breached presidential etiquette by giving public speeches in which he criticized Congress (“in a loud voice” and with “intemperate, inflammatory, and scandalous harangues,” as the impeachment articles put it). 

                The Jeffersonian practice ended in the early 20th century with Woodrow Wilson, who began the modern practice of presidents delivering their annual message in person to a joint session of Congress.  It’s not surprising that Wilson broke with the Jeffersonian practice, for in various ways Wilson rejected the American constitutional doctrine of separation of powers and instead favored the British parliamentary system, fancying his own role as President to be akin to that of the British prime minister.  As noted above, it was also consistent with the political atmosphere of the early 20th century – the so-called “Progressive” era in American politics – for policymakers to forget the importance of separation of powers, preferring “efficiency” to limits on power.  In this era, Congress began creating the so-called “fourth branch” of government, the independent regulatory agencies, which combined legislative, executive, and judicial powers in the same bureaucrats’ hands, thereby throwing separation-of-powers doctrine to the wind.  Not surprisingly, both the power of the President and the power of the federal government generally have grown over the past century, to extremes that America’s Founders could not have imagined in their wildest nightmares. 

                Modern presidents do not merely recommend measures to Congress.  Following the Budget and Accounting Act of 1921, the President has been given the initiative in submitting budgets and, effectively, in legislating, notwithstanding the constitutional separation of powers.   The only part of the lawmaking function assigned by the Constitution to the President – the veto power – was meant by the Framers as a shield against unconstitutional legislation, but modern presidents have turned it into a political tool to force their policy preferences on Congress (as the next section more fully discusses).  

                In modern State of the Union addresses there’s also something especially unseemly about the presence of Supreme Court justices, who are supposed to be independent of the other two “political” branches of the government (and also, at least in theory, insulated from shifting public opinions).  After all, many of the “initiatives” the president proposes might very well – if the system works as it should – end up being challenged in a case or controversy brought before the Court.  Justices have no business attending this political event.  

                B.O. has politicized the State of the Union address even more than his recent predecessors.  In his 2010 address, he abused the occasion by openly criticizing the Supreme Court’s decision in the Citizens United campaign financing case – and in so doing, slandered the justices who were in attendance.  As I noted in my “Spring Briefs 2010” (Mar. 18, 2010), some two months after the event, Chief Justice John Roberts finally broke his silence about B.O.’s tactless remarks unfairly criticizing the Supreme Court – most of whose justices were sitting in the front row of the U.S. House chamber, as the Congress’s distinguished guests.  Referring to the Court’s recent decision in Citizens United v. Federal Election Commission, B.O. claimed that the Court “reversed a century of law to open the floodgates for special interests – including foreign corporations – to spend without limit in our elections.”  Justice Samuel Alito spontaneously responded to B.O.’s comments by mouthing the words, “That’s not true,” and as many knowledgeable commentators – including my friend and colleague, Capital University law professor Brad Smith (former member and chairman of the FEC) – have noted, Alito was right.  B.O. had blatantly lied about the Court’s decision (certainly not the first time the Bullshitter-in-Chief, a Harvard Law graduate and former law professor, was wrong about the law).  Chief Justice Roberts, speaking at the University of Alabama the first week of March 2010, answered a law student’s question about the president’s criticism of the Court as follows: 

    “The image of having the members of one branch of government standing up – literally surrounding the Supreme Court [justices], cheering and hollering – while the Court, according to the requirements of protocol, has to sit there expressionless, I think, is very troubling.”


    He then questioned why the justices attend the annual event.  “To the extent it has degenerated into a political pep rally, I’m not sure why we’re there.”  Yet most of the justices (including Chief Justice Roberts) continue to attend (to his credit, Justice Clarence Thomas – my favorite justice on the Court today – has steadfastly refused to attend).  Just as he did in his decision in the “ObamaCare” case last summer, Roberts wimped out.  To quote the anagram aptly devised by my good friend Rod Evans (author of the word-play book, Tyrannosaurus Lex), “Fierce Justice Botches” – yet again. 



    Abuse of the Veto Power


                The power to veto legislation is generally regarded as a presidential power, but it is not even mentioned in Article II among the powers vested in the president by the Constitution.  That is because the veto is a prime example of a constitutional “check and balance,” properly understood – that is, an exception to the general constitutional principle of separation of powers.  Thus, because it concerns legislative power, the veto is found in Article I, specifically in the so-called Presentment Clause (Article I, Section 7), which outlines the procedures by which bills become laws.  This provision specifies that if the president approves of a bill passed by both houses of Congress, “he shall sign it”; but “if not, he shall return it with his objections” to Congress, where the bill may still become law if approved by a two-thirds vote in each house.  The veto thus gives the Chief Executive a share in the legislative power, which under the Constitution principally rests with Congress.  Indeed, because Article I, Section 1 provides that legislative powers are “vested” in the Congress, the power given the president in Section 7 is an exception to this general rule.  Accordingly, it ought to be construed strictly – not to give the president a broad, discretionary “share” in the legislative power but rather to give the president an opportunity to check Congress’s abuse of the legislative power.    

                Why did the framers of the Constitution give this power to the president?  Remarkably, Alexander Hamilton and Thomas Jefferson – leaders of the two major political parties of the 1790s, who fundamentally disagreed about the Constitution and particularly about presidential power – both agreed that the purpose of the veto was to guard against unconstitutional legislation.  As explained by Hamilton in the Federalist Papers, the veto was intended to give the president a “shield” against legislation that unconstitutionally interfered with his powers.  Without the veto, the president “would be absolutely unable to defend himself against the depredations” of Congress, Hamilton argued.  Jefferson similarly described the veto as “the shield provided by the constitution” to protect not just the president but also the other branch of the national government, the judiciary, as well as the states, against the “invasions” by Congress of their “rights” under the Constitution.  Jefferson felt so strongly that the proper use of the veto was limited to constitutional grounds that he actually undercut his argument against Hamilton’s bank bill by advising President Washington to defer to Congress unless he was certain of the bill’s unconstitutionality.     

                As the framers intended, the veto gave a preliminary line of defense against unconstitutional laws before they were put into effect; it made it unnecessary to wait for an actual case or controversy to bring such legislation before the Supreme Court for that body to exercise its power of judicial review to nullify unconstitutional laws.  The veto power explains in part why presidents take an oath "to preserve, protect, and defend" the Constitution. 

                The early presidents understood that the limited purpose of the veto was to guard against unconstitutional legislation and, accordingly, they used the veto power very sparingly.  Our first president, George Washington, vetoed only two bills.  Washington followed the advice of his secretary of state, Thomas Jefferson, who advised him to veto a bill passed by Congress only if his mind were “tolerably clear that it is unauthorized by the Constitution.”  Otherwise, Jefferson believed, the president should defer to Congress.  “If the pro and con hang so even as to balance” the president’s judgment, “a just respect for the wisdom of the legislature would naturally decide the balance in favor of their opinion” (as Jefferson wrote at the end of his opinion on the constitutionality of the bank bill).  In his two terms as president, Jefferson cast no vetoes at all; his predecessor, John Adams, and Adams’s son, John Quincy Adams, also cast no vetoes; James Monroe cast only one veto.  The only early president to exercise the veto power with any frequency was James Madison, who vetoed seven bills, all on constitutional grounds.  Thus in the forty-year period (1789–1829) covered by the first six presidential administrations there were a total of only ten vetoes. 

                During the rest of the 19th century and the early decades of the 20th century, presidents generally adhered to the tradition of limited use of the veto.  During Andrew Jackson’s eight years in office, he vetoed twelve bills – more than all his predecessors combined – thereby earning the nickname “King Andrew,” given him by his political opponents, the Whigs (as noted above).  Although Jackson’s famous veto of the bill rechartering the Second Bank of the United States was the first veto to cite policy considerations, his principal objection was that the Bank was unconstitutional.  Even Abraham Lincoln, a strong president who exercised extraordinary powers during the Civil War, cast only two regular vetoes.  “As a rule I think the Congress should originate as well as perfect its measures without external bias,” Lincoln wrote, showing that he, like the early presidents, was reluctant to substitute his judgment on policy matters for that of Congress.  After the Civil War, the number of presidential vetoes increased dramatically, but the majority of vetoes were cast against private bills granting pensions to Civil War veterans who claimed service-related disabilities but whose claims had been rejected by the Pension Bureau.  Ulysses S. Grant, believing many of these claims to be fraudulent, vetoed forty private bills; Grover Cleveland vetoed 482.  In his book The American Presidency, historian Forrest McDonald notes that Cleveland’s “veritable orgy” of vetoes was striking – Cleveland used the veto power more than twice as much as his twenty-one predecessors combined – but concludes that almost two-thirds of the bills vetoed during the Constitution’s first two centuries were private bills, not public.  These vetoes were premised on constitutional grounds, for private bills generally violated the long-standing but unwritten constitutional principle that it was wrong for government to tax some citizens for the benefit of others. 

                In the twentieth century, use of the veto power fell back to lower levels until the presidency of Franklin Roosevelt, when the power again underwent a radical change.  William McKinley vetoed only two public bills; Theodore Roosevelt, fifteen; William Howard Taft and Woodrow Wilson, each more than twenty public bills.  Franklin Roosevelt, in his twelve years in office, vetoed over 600 bills – despite the fact that he had huge Democratic majorities in Congress who usually gave him what he wanted.  As McDonald observes, Roosevelt used the veto “as a method of cracking the whip, to keep Congress subordinate if not subservient.”  Thus Roosevelt initiated the modern presidents’ practice of using the veto as a political tool.  This practice was followed by FDR’s successor, Harry Truman, who, when faced with a Republican Congress from 1947 to 1949, disapproved over 175 bills, using his veto messages as a means of calling attention to his policies while simultaneously attacking what he called “the do-nothing Eightieth Congress.”   

                When modern presidents use the veto in so blatantly political a manner, they do more than simply engage in partisanship: they also seriously undermine the Constitution itself.  A president who vetoes every piece of legislation that he dislikes on policy grounds forces Congress to accede to his wishes, unless legislators are able to muster the two-thirds majority in both houses they need to override the vetoes.  Such use of the veto indeed constitutes blackmail.  It thwarts not only the will of the people, as manifested in the most recent congressional elections, but also the design of the framers of the Constitution, who intended that the legislative power be vested primarily in the people’s representatives in the Congress.  Profligate use of the veto transforms the simple majority vote required by Article I into a two-thirds majority requirement – in effect, working a change in the constitutional procedures for enacting legislation.   

                B.O. not only has followed in the footsteps of post-FDR presidents who abuse the veto power, but he has carried that abuse – using the veto as a political tool to force his policy choices on Congress – to dangerous new heights, resulting in the further concentration of power in the White House.  Consider, for example, how B.O. has used the veto, or the threat of a veto, to force his tax-and-spend policies on Congress and to attempt to negate the role of the House of Representatives (the house that directly represents “the people” and which has the exclusive constitutional power of initiating tax legislation), simply because the House has a Republican majority that has rejected B.O.’s “soak-the-rich” tax policy.  As I discussed in the section on the so-called “fiscal cliff” in Part II of my “2013: Prospects for Liberty” essay (January 31), it was B.O. himself, and not Congress, who was responsible for creating such a fiscal “crisis,” through his abuse of the veto power.  



    Abuse of the “War Power” (Commander-in-Chief)


    One of the most important ways in which the framers of the Constitution implemented the principle of separation of powers was to deliberately divide what some commentators have called “the war power” – but which is more properly understood as a bundle of discrete powers, “the war powers” – between the President and Congress.  The framers sought to chain “the Dog of war” (as Jefferson put it) by granting Congress the power to declare war, which is properly viewed as the power to determine whether or not the military forces of the United States should be used.  Other important powers concerning military policy also are vested in the Congress, including the powers to “grant letters of marque and reprisal” (authorizing private vessels to engage in war); to “make rules concerning captures on land and water”; to “raise and support armies”; to “provide and maintain a navy”; to “provide for organizing, arming, and disciplining the militia”; and even to “make rules for the government and regulation of the land and naval forces” (all powers of Congress enumerated in Article I, Section 8). 

    Yet, since the end of World War II, one of the chief ways in which modern presidents have abused the powers of their office – a practice exercised, unfortunately, by presidents of both major parties, Democrat and Republican – has been abuse of the president’s power to be “Commander in Chief of the Army and Navy of the United States, and of the Militia of the several States, when called into the actual Service of the United States” (as specified in Article II, Section 2).  Modern presidents, claiming to act under authorization of this clause, have usurped Congress’s power to declare war by deciding, on their own authority (without consulting Congress), to commit U.S. military forces to various “conflicts” or “police actions” (euphemisms for “wars”) around the globe. 

    Harry Truman began this unconstitutional practice by sending troops to the Korean peninsula, to fight in the Korean War – claiming that his unilateral action (without any explicit Congressional authorization) was pursuant to U.S. treaty obligations, because the “police action” was authorized by the United Nations after communist North Korea attacked free South Korea.  

    Defending the actions of Truman (and similar actions taken by his successors), some foreign-policy scholars have argued that U.S. ratification of the United Nations treaty has created a “new world order” in which the 18th-century rules about war and peace have become obsolete, thereby making the Article I, Section 8 clause granting Congress the sole power to declare war a kind of constitutional dead-letter.  That argument fails, for at least two important reasons.  First, the argument mischaracterizes the effect of U.S. ratification of the UN treaty:  no treaty can effectively amend the Constitution – which can be amended legitimately only in one way, pursuant to the procedures provided in Article V.  Second, the argument ignores the other provisions in Article I, Section 8 (noted above) that give Congress most of the other “war powers.”   

    Considered in the context of these provisions, as well as Congress’s power to declare war, it is clear that, in so dividing the war powers, the framers of the Constitution meant the president’s power as Commander-in-Chief to be quite limited:  it is the power to wage war, to actually command – to direct how the military forces of the United States are to be used – but not the power to determine whether U.S. military forces are to be used.  That important policy determination – the question whether to initiate war, to determine when and where U.S. military forces should be used – was left to the determination of Congress.  (Why?  Because Congress represents the people, and it is the people – through their tax dollars and their blood – who must pay the full costs of a war.)  Only after Congress has authorized war, by a formal declaration (the purpose of which is to clearly demonstrate that war has begun), may the president, as Commander-in-Chief, determine how the war is to be waged.  The exception is where the United States are attacked by a foreign power – where a state of war has been created by another nation’s declaration of war against the U.S. – in which case some of the Founders (particularly Alexander Hamilton and his fellow Federalist war-hawks) believed that a Congressional declaration of war was unnecessary.  Other Founders, however (namely, Jefferson, Madison, and their Republican party), took the view that Congress must authorize all offensive uses of the U.S. military, even when a foreign power has declared war against or has attacked the United States; in such a situation, they believed, the president could respond with defensive military force only, until Congress authorizes the use of offensive force.  (That was the position Jefferson took – for which he was roundly criticized by the Hamiltonian Federalists – during the first term of his presidency, with regard to the “Barbary War” in the Mediterranean.)    

    The Supreme Court has steered clear of the “war powers” controversy, regarding it as a nonjusticiable “political question” between the two “political” branches, the president and Congress – which is why the constitutionality of U.S. involvement in particular wars or conflicts has not been resolved by the courts.  From time to time, however, cases arise that indirectly concern the war powers, particularly when presidents’ abuse of their Commander-in-Chief power adversely affects the rights of private persons.  For example, during the Korean War, President Truman attempted to seize control of the nation’s steel mills, to avert a possible strike, arguing that the seizure fell within his powers as Commander-in-Chief because it concerned production of necessary war materials.  (He issued an executive order, which also raised the issue whether the president was abusing his executive authority by usurping the law-making powers of Congress.)  The Supreme Court rejected Truman’s argument, finding that his actions were unconstitutional – specifically, a violation of the steel companies’ property rights as well as an abuse of presidential powers.  Writing the opinion of the Court, Justice Black held that the president’s executive order “cannot properly be sustained as an exercise of the President’s military power as Commander in Chief of the Armed Forces” – because that power cannot constitutionally give the president the power “to take possession of private property in order to keep labor disputes from stopping production.”  If a labor strike at steel mills would jeopardize national security and the “war effort,” that’s “a job for the Nation’s lawmakers, not for its military authorities.”  Justice Black added that Congress had given the president a lawful way to avert a strike in the steel mills – by enforcing a provision of the Taft-Hartley Act authorizing 60-day injunctions of strikes in emergencies – but that Truman failed to follow the law.  
Instead of doing his duty to see that the laws passed by Congress are faithfully executed, Truman unconstitutionally attempted to be a “lawmaker” himself (Youngstown Sheet & Tube Co. v. Sawyer (1952)). 

    Presidents John F. Kennedy (JFK), Lyndon Baines Johnson (LBJ), and Richard Nixon, again abusing their powers as Commander-in-Chief, committed U.S. military forces in Vietnam, when communist North Vietnam invaded free South Vietnam.  Again, these presidents justified their actions by claiming to act under treaty authority (not only the UN but also SEATO, the Southeast Asia Treaty Organization, through which the U.S. had promised to help defend South Vietnam from attack).  During LBJ’s presidency, Congress did pass the Gulf of Tonkin Resolution, authorizing the president to use military force to protect the troops he had already committed to the Vietnam War – a resolution that some scholars interpret as a de facto declaration of war, giving the president essentially a blank check to continue U.S. involvement in the conflict.  But as the war continued, and even escalated, during Nixon’s presidency – after nearly a decade in which thousands of young men were drafted, compelled to fight in Vietnam, and either died or were wounded – both Congress and the American public tired of the conflict.  

    In this war-weary atmosphere of the early 1970s, Congress passed, over Nixon’s veto, the joint resolution known as the War Powers Act (1973), which requires the president to consult Congress and obtain formal congressional approval (or some similar formal action by Congress) after U.S. military forces are committed in a foreign theater of war, where hostilities exist or are imminent.  

    Starting with Nixon, however, no president – whether Republican or Democrat – has followed the letter of the law (the War Powers Act), because it has been the constant position of the White House (regardless of which party is in control) to regard the Act as an unconstitutional intrusion on presidential powers, including the Commander-in-Chief power.  In fact, the War Powers Act represents a minimal effort on the part of Congress to reassert its war power – its Article I, Section 8 authority to declare war – which Congress had, effectively, abdicated to U.S. presidents after 1945.  It’s not just the U.S. presidents who should be blamed for usurping Congress’s war power; it’s also Congress itself, which has for most of the past 65 or so years acquiesced in this unconstitutional power grab by the presidents. 

    For example, President George H.W. Bush (“Bush the elder,” as I call him) committed U.S. troops to the defense of the Kuwaiti monarchy, from attack by Saddam Hussein’s regime in Iraq, in the first Gulf War.  Bush the elder sent the troops to the Gulf starting late in the summer of 1990; by January 1991, when Congress eventually passed resolutions authorizing him to use military force offensively – literally on the eve of the attack on Baghdad – U.S. military forces had been so committed (by the president’s own actions as Commander-in-Chief) that Congress had little choice, lest it be accused of not “supporting our troops.”  (Thus, Bush the elder essentially blackmailed Congress into authorizing the Gulf War.)  Bill Clinton, acting on his presidential authority alone and without prior authorization by Congress, committed U.S. troops to various conflicts around the globe, including the war in Bosnia (in the former Yugoslavia).  George W. Bush (“Bush the younger,” as I call him) committed U.S. military forces to two wars, in Iraq and Afghanistan, but at least he had Congressional authorization – a shameful resolution passed by Congress, in the wake of the 9/11/2001 Islamic terrorist attacks on the United States, giving Bush the younger essentially a blank check to fight wars against whomever he regarded as responsible for the terrorist attacks. 

    Following in the unfortunate (and unconstitutional) tradition of his predecessors, B.O. similarly has committed U.S. military forces to wars – particularly, the NATO operation in Libya – without authorization of Congress, as the next section will discuss.  But it’s actually among the least dangerous ways in which B.O. has abused the powers of his office.



    Abuse of Other Presidential Powers:

    Why B.O. Is the Most Lawless President in U.S. History


                 As regular readers of this blog know, since the 2008 elections it has been my policy to refer to the 44th president of the United States, Barack Obama, by his initials, “B.O.,” for the obvious reason.  As I’ve explained, “other modern presidents have been known by their initials – TR, FDR, JFK, LBJ – and using this president’s initials in lieu of his name seems appropriate because B.O. as president, in a word, stinks.”  (For a full explanation, see the first entry in my “Fall-deral 2008” essay (Nov. 6, 2008)).  

    B.O. came into office on a wave of popularity – intense popularity, verging upon adulation among many of his supporters – that was perhaps unprecedented in modern American history.  Much of that popularity came from his status as “the first African-American president,” as he is popularly regarded (although, technically, Obama is biracial – his mother was white; his father was a black African Muslim – he identifies himself as black).  Much of the popularity also came from the image created by his campaign managers, who claimed he was not only bringing “Change” and “Hope” to Washington but also transforming American politics and government.  B.O. was, to many people, not only a “post-modern” president but a “post-racial,” or “trans-racial,” leader, who would bridge not only racial divisions in America but also partisan divisions, between Democrats and Republicans or left-liberals and conservatives – a “uniter,” not a “divider,” even a kind of “new (political) Messiah,” as some supporters claimed. 

                Behind this image of the 44th president is a man who in reality, frankly, doesn’t come close to living up to the hype.  B.O. personifies the naked “Emperor” in the famous children’s story by Hans Christian Andersen, “The Emperor’s New Clothes” – as I also have written here, since well before the 2008 elections.  (See my blog essay “The Emperor Is Naked!” Oct. 16, 2008.)   In that essay I maintained that B.O. is a master purveyor of bullshit – a true bullshit artist.  He got his party’s nomination and was elected president because of a campaign based almost entirely on bullshit.  And he was reelected to a second term because he has continued to be a master purveyor of bullshit – “the Bullshitter-in-Chief,” as I’ve called him – keeping up the “Emperor’s New Clothes” pretense, as I have discussed in Part I of this year’s “Prospects for Liberty” essay (January 17).  

                B.O.’s image portrayed him as a “uniter,” but in fact he has employed the standard Democrat demagoguery based on race and class, dividing Americans even more sharply along racial, class, and partisan lines.  (As Pat Caddell and Doug Schoen, moderate Democratic political consultants, have observed, B.O. is the most divisive president in modern history; he’s “tearing the country apart.”)  B.O.’s image portrayed him as a “centrist,” but in fact he has proved himself to be one of the most left-wing politicians ever to hold the presidency.  It’s not simply a case of substance failing to match style, for even in his much-vaunted style, B.O. has failed to live up to his hype.  The supposedly eloquent speaker has been exposed as an inarticulate moron overly dependent on a teleprompter.       

                The biggest load of bullshit associated with B.O. and his presidency, however, is the notion that he’d bring “change” to Washington.  All that B.O.’s regime has brought to Washington has been more of the same – more of the same old, tired, semi-socialist, paternalist policies that the federal government has been tinkering with, under both Democrat and Republican administrations, for the past century or so, since the beginning of the 20th-century regulatory/welfare state.  All B.O. really has attempted to do is to expand the welfare state – to increase government and its controls over Americans’ lives, making the U.S. even more of a socialist (or, more properly speaking, a fascist) country – and that’s not any kind of real “change” in public policy, at all.  No wonder the economy is such a “mess,” to use the term B.O. likes to use, even as he still tries to blame his predecessor, George W. Bush, for the recession that his own policies have prolonged and worsened. 

                Thus, B.O. is, as I described him in my essay “Rating the U.S. Presidents 2012” (Feb. 22, 2012), “the worst president in U.S. history.”  In that essay, I summarized B.O.’s record as president – and explained why I rank him as a “failure,” at the bottom of the list:   

    “The least qualified person ever to hold the office of Chief Executive, he is unquestionably the worst president in American history.  B.O. was elected president by promising to bring `change’ to Washington, D.C.; instead, he brought more of the same old semi-socialist, welfare/regulatory state, paternalistic policies that the worst of his predecessors, whether Democrat or Republican, brought to Washington – except that B.O. did it at unprecedented levels, resulting in spiraling federal budget deficits and . . . adding more to the national debt than all previous presidents – from George Washington to George W. Bush – combined.  . . .  In comparison, B.O. makes Jimmy Carter seem competent, Bill Clinton seem moral, and FDR seem faithful to the Constitution.”


    I also gave my own “Top Ten” list, of the ten reasons why I regard B.O. as the worst president.  The most important – the No. 1 reason – was his “Contempt for the Constitution.”  I noted that, sadly, what I predicted in my 2009 “Prospects for Liberty” essay (Jan. 15, 2009) about B.O.’s presidency has come true: 

    “Given his politics, he lacks the requisite understanding of, and respect for, constitutional limits on the powers of government – which means that on January 20 when he takes the oath of office to `preserve, protect, and defend the Constitution of the United States,’ he will be lying.  Virtually everything he’ll do as president will undermine the Constitution, the limits it places on the powers of the executive branch and the federal government generally as well as its protections for the rights of individuals.”


    During the four years he has thus far been in office, B.O. has clearly shown nothing but contempt for the rule of law generally, and particularly for the higher-law limitations that the Constitution imposes on the powers of the national government and the president.  

    To those reasons why I regard B.O. as the worst president in U.S. history, I added another – really an elaboration on the topic discussed above, B.O.’s contempt for the Constitution.  That additional reason is that B.O. is also “the most lawless president in U.S. history.”  He blithely violates not only the law of the Constitution – the higher law that limits the government – but also the broader limitations on the exercise of governmental power known as “the rule of law.”  By calling B.O. “lawless,” I mean that he not only acts in unlawful ways but also acts as though he’s not subject to, or controlled by, the law.   

    To show that B.O. is the most lawless president in U.S. history, I created another “Top Ten” list, of ten ways (not necessarily in order of importance) in which B.O. – in his first term in office – has demonstrated his contempt not only for the Constitution but also for the rule of law, generally.  What follows is a summary of that “Top Ten” list, slightly revised, to take into account recent political developments.  To paraphrase Thomas Jefferson’s introduction of the charges against King George III in the Declaration of Independence, “let these facts be submitted to a candid world”: 


    10.  Not the Rule of Law, But the Rule of Alinsky:  B.O. not only holds far-left political views (as shown by his constant attempts to use federal taxes to redistribute wealth), but also may be considered a disciple of the 1960s neo-Marxist anarchist/nihilist Saul Alinsky (1909–1972), author of the leftist strategy book, Rules for Radicals.  B.O. and his minions frequently practice the methods of Alinsky’s Rules, particularly rule #13 (which arguably is B.O.’s favorite): “Pick the target, freeze it, personalize it, and polarize it.”  One can see that in his attacks on “fat cat bankers,” “greedy health insurers,” and “millionaires and billionaires.”  That also explains why B.O.’s claim to be “a uniter, not a divider” has spectacularly backfired.


     9.  Orwellian Language:  B.O.’s lawlessness is reflected even in the way he speaks.  His major public speeches frequently use words in a manner that is best described as “Orwellian,” from George Orwell’s famous dystopian novel 1984, in which a totalitarian regime controlled people through the perversion of words.  Engraved on the wall of the “Ministry of Truth” in Orwell’s fictional Oceania were the slogans “War is Peace,” “Freedom is Slavery,” and “Ignorance is Strength.”  To those may be added a new slogan, suggested by B.O.’s speeches – “Spending is Investment” – as B.O. continually refers to increased federal government spending for such things as education, energy, and infrastructure as “investments.”  Related to B.O.’s Orwellian use of language is his equally dangerous practice of telling lies.  B.O. is a shameless and compulsive liar.  His regime’s explanation of the Islamic terrorist attacks in Benghazi, Libya, was based on a big lie (as discussed below).  His recent State of the Union address contained several major lies (as the editors of Investor’s Business Daily recently observed).  And commentator Steve McCann, in a provocative piece posted on American Thinker, called B.O. “the most dishonest, deceitful, and mendacious person in a position of power” he’d ever witnessed (“The Mendacity of Barack Obama,” April 15, 2011). 


    8.  Initiating War:  As noted in the section above, like many of his predecessors, both Democrat and Republican, in the post-World War II era, B.O. has violated the exclusive power of Congress to declare war, initiating war through unilateral presidential action and abuse of his Commander-in-Chief power.  In B.O.’s case, this unconstitutional exercise of the “war power” has been particularly egregious because he not only has continued the two wars begun during George W. Bush’s presidency, in Iraq and Afghanistan (both of which were at least authorized by Congressional resolutions), but also has committed U.S. military force to Libya, in the war that resulted in the downfall of Libyan dictator Muammar Gaddafi’s regime.  (In response to B.O.’s apologists, who deny that U.S. military intervention in Libya was “war” because U.S. air forces alone provided support for NATO operations there, the obvious answer is that war does not require ground troops:  whenever the U.S. uses its military force offensively against another nation, it is at war – and the constitutional requirement of a Congressional declaration of war cannot be ignored.)  B.O.’s Libyan venture is doubly egregious because the vital interests of the United States were not at all involved; rather, it primarily concerned the economic interests of Europe, as western European countries are the major buyers of Libyan oil.  So, B.O. abused U.S. military power to benefit our commercial rivals in Europe – yet another instance of B.O.’s occupation of the White House being more beneficial to other countries than to the United States.


    7.  Assassin-in-Chief:  Among the unconstitutional or extra-constitutional powers that B.O. has exercised, there is one in particular that ought to concern genuine civil libertarians.  As Jack Curtis observed in a provocative piece posted last year on American Thinker, B.O. has “order[ed] the killing of selected individuals (and any others nearby) located in foreign countries at will, using drone aircraft.”  Citing a lawsuit filed by the ACLU and the Center for Constitutional Rights that challenged this practice – and the Center’s press release saying that the B.O. regime has claimed it has the power to kill, without review, any American it decides is a threat – Curtis writes that this claim is “absolutely scary,” and explains: “Killing citizens at presidential will without review seems pretty far from constitutionally limited government. . . . The fact that it originated with killings mostly located in the backwoods [or mountains] of Pakistan offers no assurance of future locations” (“The Powers of This President,” March 7, 2012).   

    In a recent piece also posted on American Thinker, Herb Titus and William J. Olson write about the killing of civilians – including three U.S. citizens – on B.O.’s order in drone strikes in Yemen in 2011.  “As the worldwide drone program ramps up, there have been increasing calls for the president to reveal the basis for his claimed authority.”  Congress only recently has begun investigating the matter, as the Senate gathers information for hearings on John Brennan’s confirmation as CIA director.  NBC News on February 4 released a leaked U.S. Justice Department “white paper” that explains the regime’s criteria for ordering the “targeted killing” of American citizens off the battlefield on foreign soil.  This is a power that “no prior president ever thought he possessed – a power that no prior president is known to have exercised”; in effect, it is “the power to kill citizens without judicial process – a power that has been unknown in the English-speaking world for at least 370 years” (“Assassin in Chief?” February 7).


    6.  The B.O. White House “Enemies List”:  One of the charges against former President Richard M. Nixon, in the articles of impeachment drawn up by the House Judiciary Committee in 1974 (discussed in the next section below), was that the Nixon White House had maintained an “enemies list” and was using the powers of government to punish individuals on that list.  It turned out that the allegation was false – the Nixon “enemies list” amounted to nothing more than a list of persons who were not to be invited to White House social events.  Nevertheless, the specter of an administration compiling an enemies list and using the awesome powers of government to punish persons on that list has, sadly, come true in the B.O. White House.   

    There’s the infamous example of Gerald Walpin, former inspector general for AmeriCorps, who blew the whistle on political corruption in that government program – and who thus became the target of a campaign of character assassination orchestrated by the B.O. regime.  There’s also the case involving Charles and David Koch, the businessmen brothers who help bankroll various nonprofit organizations that promote personal liberty and free enterprise – and who were selected as a “political punching bag” by the president’s re-election team.  In an op-ed last year in The Wall Street Journal, the Koch brothers’ attorney, Ted Olson (former solicitor general of the United States), describes how the B.O. regime has demonized his clients – among other ways, by having the White House engage in derogatory speculative innuendo about the integrity of the Kochs’ tax returns, by having a leading Democratic member of Congress demand their appearance before a congressional committee to be interrogated about the Keystone XL oil pipeline project (in which the Kochs have no involvement, other than publicly criticizing B.O. for killing it), and, more generally, by having the president’s surrogates and allies in the media regularly attack them, sullying their reputation and questioning their integrity. 

    As Olson notes, “when Joseph McCarthy engaged in comparable bullying, oppression, and slander from his powerful position in the Senate, he was censured by his colleagues and died in disgrace. . . . In this country, we regard the use of official power to oppress or intimidate private citizens as a despicable abuse of authority and entirely alien to our system of a government of laws. . . . That is why it is exceedingly important for all Americans to respond with outrage to what the president and his allies are doing to demonize and stigmatize David and Charles Koch,” who have been the targets of a “multiyear, carefully orchestrated campaign of vituperation and assault,” one that has been “choreographed from the very top” (“Obama’s Enemies List,” Feb. 1, 2012). 

    The B.O. regime’s attacks on Mr. Walpin and the Koch brothers are not isolated instances; they are routine practices of a regime that implements Alinsky’s Rules for Radicals and particularly B.O.’s favorite rule, “Pick the target, freeze it, personalize it, and polarize it.”  Other Americans who have been targeted as “enemies” of the regime are documented in Chapter 7 (“We’re Gonna Punish Our Enemies”) of David Freddoso’s book Gangster Government.  They include Rush Limbaugh, the U.S. Chamber of Commerce, FOX News, and even the justices of the Supreme Court, whom B.O. directly attacked (by mischaracterizing the Court’s Citizens United decision, as noted above) during his 2010 State of the Union address, delivered with six justices seated immediately in front of him!  


    5.  Taking Over the “Fourth Branch” of Government: Administrative agencies – agencies such as the Environmental Protection Agency (EPA), the Federal Communications Commission (FCC), the National Labor Relations Board (NLRB), and so on – are regarded as “independent” executive agencies because they were created by acts of Congress and are staffed by commissioners with staggered terms, so no one president can control them through his appointments.  They also have been called the “fourth branch” of government because these agencies combine legislative (rule-making), executive (enforcement), and judicial (adjudicating disputes in administrative courts) functions, a dangerous blending of the three essential functions of government that the framers of the Constitution meant to keep separate.   

    Not only are these agencies dangerous in themselves to Americans’ liberties, but they have become particularly dangerous during B.O.’s regime because of his efforts to control them (through his appointees) and to use them illegitimately to make laws – abusing the agencies’ regulatory powers to by-pass Congress, by enacting “rules” that actually amount to new legislation.  When Congress failed to pass the “cap-and-trade” bill that B.O. was pushing, supposedly to control so-called “greenhouse gas” emissions (essentially carbon dioxide, from the burning of “fossil fuels” – coal, oil, and natural gas), B.O. had his handpicked EPA administrator, Lisa P. Jackson, assume regulatory authority over greenhouse gases, even though Congress had not empowered the agency to do so.  Similarly, when Congress refused to enact “card check” legislation doing away with secret ballots in union elections, B.O.’s appointees to the NLRB imposed the change by administrative decree.  With their threatened action against Boeing, to block the company’s plan to build a facility in South Carolina (a right-to-work state), the pro-union NLRB even attempted to dictate where a business may locate.  And B.O.’s appointees to the FCC have assumed, or attempted to assume, regulatory authority over the internet, even though Congress has not given the agency such power.   


    4.  King of the “Czars” and Abuser of the Recess Appointment Power:  Modern presidents’ practice of appointing so-called “czars” – presidential appointees not subject to Senate confirmation – is an abuse of the appointment power, as these “czars” function not merely as presidential advisors but as actual bureaucrats, exercising executive powers.  B.O. has outdone all his predecessors in this unconstitutional practice, appointing – in just his first few months in office – “more czars than the Romanovs.” 

                Related to his appointment of unconfirmed “czars” is another abuse of the presidential appointment power that B.O. has employed.  The Constitution provides (in Article II, Section 2, clause 3) that the president may fill “vacancies that may happen during the recess of the Senate.”  Previous presidents have exercised this power only when the Senate was legitimately in recess (which, according to the Constitution, requires House approval if the Senate recesses for more than three days); the shortest prior period for a recess appointment was a break of ten days.  B.O. has abused this recess appointment power by making appointments while the Senate was not officially in recess – when in fact the Senate was holding pro forma sessions to stay in business – as he did in January 2012, with the appointment of former Ohio attorney general Richard Cordray as director of the new Consumer Financial Protection Bureau (CFPB), a self-funded independent agency created by the Dodd-Frank Act in 2010, as well as the appointment of three new members (all of them pro-union) to the National Labor Relations Board (NLRB).   

                In both cases, B.O. made the illegitimate appointments to avoid contentious Senate confirmation battles; and in Cordray’s case, the appointment was not only unconstitutional but also illegal, as the law creating the CFPB specifies that the director has authority to act only after being confirmed by the Senate.  These appointments, made in a deliberate attempt to circumvent the Senate’s constitutional exercise of its “advice and consent” power, thus put under a cloud of illegality each and every decision made by the CFPB and the NLRB.  As I noted in Part III of my “2013: Prospects for Liberty” essay (February 7), a recent decision by the U.S. Court of Appeals for the District of Columbia held that B.O.’s so-called “interim” appointments to the NLRB were unconstitutional (Noel Canning v. NLRB, Jan. 25).  The ruling apparently invalidates all decisions made by the NLRB in which these illegitimate appointees took part, and it also calls into question not only Cordray’s appointment as director of the CFPB but also all the actions he has taken (and rules he has announced) as director.  It remains to be seen whether the Supreme Court will affirm the Court of Appeals decision – and thus uphold the Constitution against B.O.’s unlawful use of the appointment power. 


    3.  King of the Waivers:  The rule of law requires that government act – exercising its power to use force – through general, objective laws that apply equally to all individuals.  (That’s essentially what America’s founders meant by “a government of laws,” not “the rule of men” – the sine qua non of republican government.)  This aspect of the rule of law is also reflected in the constitutional guarantee to individuals of “the equal protection of the laws.”  When a president does not uniformly enforce federal laws, applying them equally to all Americans, but instead selectively enforces the laws – exempting certain favored individuals or groups from the burdens that the laws impose on others – he thus violates both the constitutional guarantee of equal protection and the general principle of the rule of law.  

    Rather than faithfully execute the laws, the duty imposed on him by the Constitution and the presidential oath, B.O. has instead selectively enforced federal laws, assuming a discretionary power to “waive” the law for certain favored individuals or groups.  Because the 2010 federal health-insurance law (so-called “ObamaCare”) imposes heavy costs on virtually all employers, the B.O. regime has sought to soften its impact, making it more politically palatable to the American people, as well as to reward its friends, by granting thousands of waivers from the law’s onerous mandates.  Thus, as of last year, thousands of businesses, state and local governments, labor unions, and insurers, covering over three million individuals or families, have been granted a waiver from ObamaCare by Secretary of Health and Human Services Kathleen Sebelius.  Michael Barone has noted that more than half of those three million persons participate in plans run by labor unions (even though union members are only 12% of all U.S. employees, they have received 50.3% of ObamaCare waivers).  He also noted that in 2011 Sebelius granted 38 waivers to restaurants, nightclubs, spas and hotels in former Speaker Nancy Pelosi’s San Francisco congressional district (“Waiver Grants Latest Example of Chicago Way,” Investor’s Business Daily, May 26, 2011).  

    The practice of selective issuance of waivers has continued with the White House’s announcement that ten states would be given waivers from the Bush-era federal education law known as the “No Child Left Behind” law.  (Like ObamaCare, that law unconstitutionally injects the federal government into matters reserved to the states under the Constitution.  Consistent with his oath of office, the president may “waive” enforcement of such unconstitutional laws in all fifty states.  But to waive it in some states while enforcing it in others is exactly the kind of favoritism or cronyism that the principles of equal protection and the rule of law forbid.) 


    2.  Crony Fascism:  As I’ve previously noted here on MayerBlog, B.O.’s agenda to expand the federal regulatory/welfare state may be more properly called “fascist” rather than “socialist” because, rather than having outright government ownership of major industries, his policy leaves businesses in private hands but subjects them to pervasive government regulatory control – much as Hitler did in Germany and Mussolini did in Italy in the 1930s and 1940s.  Indeed, Sheldon Richman in the Concise Encyclopedia of Economics defines fascism as “socialism with a capitalist veneer.”  Citing this definition in an American Thinker essay, Steve McCann has identified the B.O. regime’s use of cronyism – what some call “crony capitalism” but which is actually crony fascism – as a key element of what he calls B.O.’s “fascist economy” (“Obama’s Fascist Economy,” Sept. 21, 2011).  The Orwellian-named Patient Protection and Affordable Care Act of 2010 (popularly known as “ObamaCare”) is not only fascist but, with the thousands of waivers now being selectively granted by B.O.’s secretary of health and human services (discussed above), is also a perfect example of how the B.O. regime’s fascist policies lead to political corruption in the form of cronyism. 

                Two other key examples of B.O.’s cronyism aptly illustrate how his policies have not only undermined the American economy but also violated the rule of law.  The infamous bailouts of two Detroit auto companies, Chrysler and GM – bailouts for which B.O. claimed credit during his 2012 reelection campaign, boasting that he had “saved” the U.S. auto industry – were in essence government takeovers of the two companies in order to pay off a major campaign donor to B.O. and other Democrat politicians (the United Auto Workers union), at the expense of bondholders of the two companies and in blatant disregard of long-established rules of bankruptcy law.  As noted above, when President Truman seized the steel industry, in the face of a looming national strike that he claimed might endanger the Korean war effort, the Supreme Court found his actions to be unconstitutional, in the landmark Youngstown Sheet & Tube Company v. Sawyer decision in 1952.  And as law professor Todd Zywicki has pointed out, “by restructuring through a politicized bailout process,” GM and Chrysler “were left in a weaker competitive position than they would have been had they simply gone through a traditional Chapter 11 [bankruptcy] process” (“Romney’s Big Opening on Bailout,” Investor’s Business Daily, Feb. 22, 2012). 

                Similarly, B.O.’s “green” energy program is a textbook example of corrupt cronyism, involving a series of “green,” or alternative-energy, companies with ties to the White House, which the B.O. regime promoted and heavily subsidized with taxpayer funds and which now seem to be going down the drain as the companies fail.  The most visible part of the scandal is Solyndra, the failed solar-panel maker that squandered $535 million of “stimulus” money.  But many more companies are involved in the scandal, as an investigation by reporters for the usually leftist Washington Post has revealed.  Post reporters “found that $3.9 billion in federal grants and financing flowed to 21 companies backed by firms with connections to five Obama administration staffers and advisers” (“Influence for Sale,” Investor’s Business Daily, Feb. 17, 2012).    


    1.  Abuse of Executive Orders:  B.O. has followed the unfortunate precedents set by many of his predecessors in abusing his power to issue “executive orders,” not to clarify how the executive branch will enforce the laws passed by Congress but instead to make new laws, thereby usurping the legislative power that Article I of the Constitution vests in the Congress.  He has done so more audaciously than any of his predecessors, actually bragging about how he has by-passed Congress.   

                For example, on a three-day Western trip in late October 2011, B.O. announced that he would use executive orders to implement three initiatives: programs that he claimed would help 1.6 million college students repay their federal loans, 1 million homeowners meet their mortgage payments, and 8,000 veterans find jobs – all without any legislation being passed by Congress.  In a speech to students at the University of Colorado – Denver, he infamously declared, “We can’t wait for Congress to do its job.  So where they won’t act, I will. . . . We’re going to look every single day to figure out what we can do without Congress.”  (It was the opening act of B.O.’s reelection bid, employing a strategy of trying to imitate Harry S. Truman’s campaign against a supposed “do-nothing” Republican Congress, but ignoring the fact that B.O.’s own Democratic Party still controlled the Senate.)   

                B.O. began aggressively using his executive-order power almost from the moment he took office.  The day after he was inaugurated, he revoked one of George W. Bush’s executive orders limiting access to presidential records.  The very next day, he signed an executive order calling for the U.S. military detention facility in Guantanamo Bay, Cuba to be closed within a year – an order that has not been implemented, as “Gitmo” remains open today.  He continues to abuse the power, with his “can’t wait for Congress” initiatives, including his implementation last summer of parts of the so-called DREAM Act (providing government benefits for the children of illegal aliens), when Congress refused to pass the legislation.  (Commenting on B.O.’s order, issued through a Homeland Security Department memo which effectively means that immigration laws no longer apply to some 800,000 people, columnist Charles Krauthammer has called it “a fundamental rewriting of the law,” by presidential fiat.  “The Immigration Bombshell: Obama’s Naked Lawlessness,” Investor’s Business Daily, June 22, 2012.)  Most recently, B.O. has said he’d implement over 20 different gun-control measures by executive order if Congress fails to act as he demands, to “reduce gun violence.”  Never mind that the president lacks any such authority under federal law – or that, even if he had the authority, it would violate Americans’ Second Amendment rights, as I discuss in Part III of my “2013: Prospects for Liberty” essay (February 7). 

                Bill Clinton’s abuse of the executive-order power (ably chronicled in chapter 3 of the book The Rule of Law in the Wake of Clinton (2000), edited by the Cato Institute’s Roger Pilon) led former White House aide Paul Begala to quip in The New York Times:  “Stroke of the pen, law of the land.  Kind of cool.”  That nicely sums up the cavalier attitude that B.O. and his regime have for the rule of law.  It’s not “cool” to abuse the power of the presidency; it’s unconstitutional. 


                To this rather lengthy list may be added two more recent instances of abuse of power:  the cover-up of the Justice Department’s “Fast and Furious” Mexican gun-running scheme, and the cover-up of the real story behind the Sept. 11 Islamist attack on the U.S. consulate in Benghazi, Libya. 

    “Operation Fast and Furious” was the illegal Mexican gunrunning operation in which agents of the Bureau of Alcohol, Tobacco, Firearms, and Explosives pressured U.S. gun dealers to sell weapons to Mexican cartels, all in an apparent effort to raise public support for gun control in the U.S.  Despite efforts by Congressional investigators to find out how the ill-fated operation began – especially to see if a “smoking gun” can be found, tracing the origin of the illegal operation to the office of the Attorney General or even to the White House – Attorney General Eric Holder apparently has successfully evaded responsibility for the operation, targeting instead some low-level officials in the Justice Department and its subsidiary Bureau.  (See “Holder’s Cover-Up Gets Criminal,” Investor’s Business Daily, Jan. 31, 2012.)   The White House assisted Holder in the cover-up by asserting the doctrine of “executive privilege” – the same doctrine that former President Nixon asserted in an attempt to block congressional investigation of the Watergate scandal (noted below). 

    It is interesting that notwithstanding the wholesale turnover of members of B.O.’s cabinet – like rats leaving a sinking ship – one prominent member of B.O.’s first-term cabinet is apparently staying on for the second term:  Attorney General Eric Holder.  That’s ironic, because Holder appears to be either one of the most corrupt or one of the most incompetent attorneys general in modern U.S. history, as revealed by Congressional investigations into “Operation Fast and Furious.”  In previous presidencies (the Nixon administration, noted below, and most famously, that of Harry S. Truman – who even had a sign on his Oval Office desk, “The buck stops here”), the occupant of the White House has been held responsible for the illegal actions of his subordinates, even if he had no actual prior knowledge of their activities.  Apparently that’s no longer the case, with the current “occupier” of the White House.  

    Another troubling cover-up – so troubling because the B.O. regime apparently again has succeeded, first, by deflecting Congressional Republican attempts to get to the truth and, second, by distracting the “lamestream” news media from reporting the story – is the regime’s cover-up of the September 11, 2012 violent attack on the U.S. consulate in Benghazi, Libya.  

    The September 11 attack resulted in, among other things, the brutal murder of four Americans (including U.S. Ambassador Christopher Stevens) in Libya.  As I discussed in the first section (“The `Arab Spring’ Turns into an Islamo-Fascist Autumn”) of my “Tricks and Treats 2012” blog entry (October 25), the B.O. regime did more than merely cover up the real nature of the attack – and its causes – for they concocted a phony “cover story,” which claimed that the attack on the U.S. consulate in Benghazi was motivated by “spontaneous” Muslim outrage at an anti-Mohammed video broadcast on the Internet.  That phony story was a deliberate lie, fed to the news media by the highest-ranking foreign-policy officials in the B.O. regime:  U.N. Ambassador Susan Rice, former Secretary of State Hillary Clinton, and B.O. himself.  

    What were they trying to cover up?  Principally, the fact that the attack was a deliberate act, planned in advance for the 11th anniversary of the 9/11/2001 Islamist terrorist attacks on the U.S., by a militant Islamist group associated with al-Qaeda; and secondarily, the fact that the U.S. government (the State Department and the military) failed to protect Ambassador Stevens and his staff, either by negligence or by deliberate malfeasance.  As the editors of the Wall Street Journal succinctly put it, “Four Americans lost their lives in Benghazi in a terrorist attack that evidence suggests should have been anticipated and might have been stopped.  Rather than accept responsibility, the Administration has tried to stonewall and blame others” (“The Libya Debacle,” Sept. 27, 2012).        

    Why were they trying to cover it up?  Not only because the killing of one of our ambassadors is embarrassing in itself, but also because the incident exposes the fecklessness of B.O. in dealing with the threat of militant Islam. It also exposes, as I wrote on October 25, “the overall failure of B.O.’s foreign policy, which can be described only as disastrous”: 

    “These events, and their aftermath, demonstrate B.O.’s incompetence to be president – just as the parallel event 32 years ago, the violent seizure of the U.S. embassy in Iran by Islamist revolutionaries who held Americans hostage for over a year – demonstrated the incompetence of B.O.’s predecessor, Jimmy Carter.  It’s much more than merely B.O.’s `Jimmy Carter moment,’ however.  The difference between B.O. and Carter is that B.O. is not merely incompetent or feckless: he’s also dangerous, following policies that border on being treasonous.  (He’s weakened the ability of the United States to defend its own interests, undermining our credibility abroad, and giving aid and encouragement to our enemies in the Islamic world.)” 


    I added:  

    “Ironically, perhaps the main (if not the sole) reason why B.O. has been rated high on foreign policy in many American polls has been the killing of former al-Qaeda head Osama bin Laden – an act for which B.O. has unjustly claimed credit, showing his shameless narcissism as well as hypocrisy.  (See my discussion of “Laden with Hypocrisy,” in my “Thoughts for Summer 2011” (May 11, 2011).)  By claiming credit for the killing of bin Laden so shamelessly – by in effect `spiking the ball’ – at the Democratic National Convention, which concluded in Charlotte, N.C. on September 6, barely a week before the 9/11 anniversary, it appears that it was B.O. himself and his fanatic supporters who motivated these recent attacks in the Arab world.  Demonstrators in Tunisia, for example, chanted, `Obama, Obama, we are all Osamas!’” 


                B.O.’s foreign-policy failures are even more ominous if one considers the theory suggested by Dinesh D’Souza’s books and film 2016: Obama’s America – that, because of the “anti-colonialist” ideology that he inherited from his father, B.O. is deliberately pursuing an agenda designed to weaken the United States economically and militarily.  His covert support for militant Islamists and his attempts to appease Russian dictator Vladimir Putin – going so far as to promise Putin’s representative more “flexibility” in meeting Russia’s demands for the downsizing of the U.S. nuclear missile arsenal – may in fact amount to treason.  (As I discuss in the final section below, B.O. may be another first in the history of U.S. presidents – the first president to commit acts that would justify his impeachment on the grounds of treason.) 

    It is not just conservatives or libertarians who are seriously concerned about B.O.’s abuse of the powers of his office.  Jonathan Turley, a constitutional law professor at George Washington University Law School (and a contributor to USA Today) has said that B.O. “is using executive power to do things Congress has refused to do, and that does fit a disturbing pattern of expansion of executive power.”  Professor Turley adds: “In many ways, [B.O.] has fulfilled the dream of an imperial presidency that Richard Nixon strived for.  On everything from [DOMA] to the gaming laws, this is a president who is now functioning as a super-legislator.  He is effectively negating parts of the criminal code because he disagrees with them.  That does go beyond the pale.”  Quoting Professor Turley – who in many respects is a typical left-liberal law professor – Steve Friess, writing in Politico, comments that the Nixon analogy “may be apt.”  Friess’s article also cites John Eastman, a conservative constitutional law professor at Chapman University School of Law, who also draws parallels between B.O.’s version of the “imperial presidency” and Nixon’s action of not spending, or impounding, funds appropriated by Congress.  (Nixon’s impeachable conduct and the supposed “imperial presidency” model are discussed in the next section, below.) 

    In light of the many ways B.O. has abused the powers of his office, violating both the Constitution and the general rule of law, he most certainly did not deserve to be elected to a second term.  Indeed, he deserves to be impeached and removed from office – before he does any further damage to the Constitution and to the United States of America.



    Impeachment: Enema of the State


                   What is there to do when the American people foolishly re-elect to a second term such a lawless “occupier” of the White House?   Anticipating a future time when the U.S. president might act as a scoundrel and even a would-be dictator, the framers of the Constitution wisely provided a solution – a means by which a corrupt president could be, in effect, “flushed away,” removed, put out of office.  We might consider it “the enema of the state.”  It’s called impeachment. 

                   The House of Representatives, representing “the people,” is granted (in the final clause of Article I, Section 2) “the sole Power of Impeachment.”  Impeachment – the direct accusation and arraignment of an individual for misconduct – was a power that had been won by the English House of Commons by the late medieval period.  It had proved to be a useful check against the king and his ministers, a device by which a minister of the Crown could be held directly responsible to Parliament for his official acts.  As granted to the House by the Constitution, the power of impeachment similarly provides a check against the abuse of power by officers in the other two branches of government, including the president and federal judges.   

                   The Senate is granted (by the sixth clause in Article I, Section 3) “the sole Power to try all Impeachments,” sitting essentially as a court with each member “on Oath or Affirmation” to render a true verdict.  That role for the Senate parallels that of the English House of Lords, which since early medieval times had retained the judicial functions of the king’s great council and remained the highest court in the land.  As one English constitutional historian has summed it up, “The House of Commons, by acting as a grand or accusing jury, could present ministers or other servants of the king before the House of Lords for trials for serious offenses, such as treason or felony.  If the upper house found the accused guilty of the charges against him the penalty might be death.”  The Constitution (in the final clause of Article I, Section 3) prescribes no such serious penalty; rather, it limits the consequences of conviction by the Senate (which requires a two-thirds vote) to removal from office and disqualification to hold any federal office.   

                   The grounds for impeachment are specified in Article II, Section 4, which provides: “The President, Vice President and all civil Officers of the United States, shall be removed from Office on Impeachment for, and Conviction of, Treason, Bribery, or other high Crimes and Misdemeanors.”  Treason and bribery have fairly straightforward meanings.  (Indeed, the crime of treason against the United States is specifically defined in Article III, Section 3.)  But “high Crimes and Misdemeanors” has no precise legal meaning; it was meant to embrace, as Alexander Hamilton maintained in The Federalist No. 65, “those offenses which proceed from the misconduct of public men, or, in other words, from the abuse or violation of some public trust.”  

    The history of impeachments provides further evidence of the meaning of “high Crimes and Misdemeanors” – those offenses proceeding from presidents’ “misconduct” or their “abuse or violation” of their “public trust.”  During the past 225 years, since ratification of the Constitution, serious efforts have been made to impeach a U.S. president only three times – during the presidencies of Andrew Johnson, Richard Nixon, and Bill Clinton – and only two of these presidents (Johnson and Clinton) actually have been impeached by the House and tried by the Senate.  (Nixon was nearly impeached by the House, but he resigned before articles of impeachment came up for a full House vote.) 

    Andrew Johnson’s impeachment and trial in 1868 has generally been criticized as a “political” act – the basic interpretation given by historian Michael Les Benedict, in his book The Impeachment and Trial of Andrew Johnson (1973), and by former Supreme Court Chief Justice William H. Rehnquist, in his book Grand Inquests (1992).  (Rehnquist was only the second Chief Justice to preside over a Senate trial of a president, of Bill Clinton in 1999; and arguably, he was the best-informed Chief Justice to do so, because of his authorship of this book which compared the historic impeachments of Justice Samuel Chase and President Johnson, both of which Rehnquist regarded as politically-motivated.)   

    The articles of impeachment against Johnson charged that he – “unmindful of the high duties of his office, of his oath of office, and of the requirement of the Constitution that he should take care that the laws be faithfully executed” – had violated the law by removing Edwin Stanton as Secretary of War and replacing Stanton with an interim appointee, Lorenzo Thomas.  The articles identified the law Johnson allegedly violated – the 1867 Tenure of Office Act, which required Senate consent for the removal or replacement of a confirmed Cabinet member.  That law itself was of dubious constitutionality: the Constitution requires Senate confirmation (its “advice and consent” power) only for the appointment of Cabinet members and other U.S. officials; and it had become accepted practice since Washington’s time for the president to fire Cabinet members without Senate approval (as well as to make valid interim appointments, when the Senate was not in session).  This charge against Johnson looked like a kind of political “entrapment” – Congress passed (over Johnson’s veto) an unconstitutional law, dared Johnson to violate it, and then charged him with “a high misdemeanor” for violating that law – and hence confirms the traditional view that Johnson’s impeachment was politically motivated. 

    Another article of impeachment against Johnson – the tenth – charged that he also had committed “a high misdemeanor” in office by publicly criticizing Congress.  The article alleged that Johnson:  

    “did attempt to bring into disgrace, ridicule, hatred, contempt, and reproach the Congress of the United States . . ., to impair and destroy the regard and respect of all the good people of the United States for the Congress and legislative power thereof . . ., and to excite the odium and resentment of all the good people of the United States against Congress and the laws by it duly and constitutionally enacted.”


    How did he do this?  By “openly and publicly,” in various public speeches, delivering “in a loud voice certain intemperate, inflammatory, and scandalous harangues,” and that he did so “amid the cries, jeers, and laughter of the multitudes then assembled and within hearing.”  In other words, he not only criticized Congress publicly but, apparently, did so with great political effect! 

    The traditional view by most scholars is that this article of impeachment underscored the “political” motives behind Johnson’s impeachment, but one scholar (Walter Berns, writing a 1994 op-ed in the Wall Street Journal) has suggested that this might have been a serious charge that Johnson misused the power of his office.  Berns has pointed out that in early American history, following the text of the Constitution, the president’s legislative role was limited to providing “information on the State of the Union,” by a written annual message to Congress – the tradition begun by Thomas Jefferson (which I discuss above, in the section on “The Unconstitutional State of the Union”).  Before Teddy Roosevelt changed the presidential office into a “bully pulpit” – a change institutionalized by Woodrow Wilson, who sought to refashion American government along the lines of the British parliamentary model (as also noted above) – U.S. presidents rarely spoke in public.  “Aside from issuing proclamations, or responding to addresses or `serenades,’ presidents spoke directly to the people mostly on special occasions: upon assuming office [or upon leaving it]; or when dedicating, say, . . . the cemetery at Gettysburg.”  And “[b]efore the coming of radio and TV, even presidential candidates kept aloof from the public, issuing no statements on their own behalf and making no speeches”: they “stood for office,” but did not campaign (“The Prattling Presidency,” Wall Street Journal, Oct. 31, 1994).    

    Thus, going beyond his veto of what he regarded as unconstitutional legislation, Johnson actively sought to undermine Congress’s Reconstruction policy, by taking his case directly to the people.  What today is considered a normal practice of presidents was, in the 19th century, justly regarded as an abuse of presidential power.  Johnson’s impeachment was no more “political” than was his acquittal by the Senate.  The Constitution requires a two-thirds “guilty” vote of the Senate, for a president to be removed from office.  In their balloting at the end of Johnson’s Senate trial in 1868, 35 senators found him guilty but 19 senators (including 7 Republicans) found him not guilty – just one vote shy of the two-thirds needed to convict and remove him.  Although constitutional scholars and popular authors (most famously, JFK in his book Profiles in Courage) have applauded the 7 Republican senators who voted to acquit Johnson for their supposed courage (or integrity), the reality is that they too were influenced by political motives:  their abhorrence of their colleague, Benjamin Wade, who as president pro tempore of the Senate would have become president if Johnson were convicted.  As politically difficult as Johnson had been in attempting to thwart Congressional Reconstruction, the Radical Republicans who controlled Congress apparently regarded him as less of a threat than the bellicose Senator Wade, especially since the Republicans enjoyed a two-thirds “veto-proof” majority in both houses of Congress, thereby effectively nullifying the rest of Johnson’s presidency. 

                Long before the Watergate scandal began – with an innocuous story in the Washington Post reporting a mysterious break-in at the Democratic National Committee’s offices in the Watergate office complex on June 17, 1972 – Richard Nixon’s political enemies (and his critics in the academic world) had been calling for his impeachment.  In fact, the New York Times printed a series of articles in the early spring of 1973 (just before the famous Senate Watergate committee had begun its public hearings), arguing that Nixon was creating a dangerous “imperial presidency.”  (I read the articles as they were reprinted in The Detroit Free Press, March 11-15, 1973.)  In one of these articles, under the headline “Nixon Is Fighting for the Strongest Presidency since FDR,” distinguished constitutional historian Henry Steele Commager was quoted saying, “In so many ways, I think Mr. Nixon has gone far beyond any previous president in history.”  Another presidential scholar, Thomas E. Cronin, was quoted saying that Nixon “has systematically gone about trying to strengthen the presidency in a number of ways, frequently by circumventing the Constitution or expanding on past practices that were ambiguous or questionable.”  These practices included Nixon’s “impounding” of funds appropriated by Congress for domestic programs and his use of the war power in Vietnam.  One of the articles accused Nixon of “stealing power from his own Cabinet”; another maintained that he was “dismantling 40 years of Democratic policy” through his “New Federalism,” which turned back authority – and money – for various federal programs to the states and local governments.  In another article, under a headline that asked the provocative question, “Is Congress Helpless Against Nixon Power Grab,” Commager is again quoted, saying:  “One answer would be impeachment if the Congress had any guts, but it doesn’t.” 

    Only after the media began obsessing over the Watergate break-in – and the American public’s attention was stirred by the Senate Watergate committee hearings in the spring of 1973 – did Congress begin to seriously consider impeachment.  The Watergate affair is simply too complex to discuss here, but I did discuss it previously – in a MayerBlog essay posted on the 33rd anniversary of the break-in, “The Legacy of Watergate” (June 17, 2005).  I’ll reiterate just a few salient points, focusing on the articles of impeachment drawn up against Nixon in 1974.  

    Nixon’s impeachable actions concerned not the break-in itself – that was merely a “third-rate burglary,” as some commentators have called it, part of the “dirty tricks” of the 1972 campaign – but his abuse of the powers of his office in an attempt to “cover up” the break-in.  (To this day, there has been no evidence showing that Nixon approved of the break-in in advance, or even knew about it; it was committed by a secret unit called “the plumbers,” acting on behalf of Nixon’s reelection campaign, the Committee to Re-elect the President, with the unfortunate acronym CREEP.  But the “smoking gun” sought by investigators and members of Congress – which linked Nixon directly to the cover-up – was his statement, recorded on one of the infamous “Watergate tapes,” secret recordings of Oval Office conversations, in which Nixon discussed paying “hush money” to a particular person.)  

    The articles of impeachment drawn up against Nixon by the House Judiciary Committee in the summer of 1974 echoed the language of the articles drawn up against Johnson in 1868.  They alleged that Nixon violated his office and his oath to follow the Constitution, and his duty to see that the laws are faithfully executed, by (1) “using the powers of his office, engaged personally and through his subordinates and agents,” in a course of conduct or plan “designed to delay, impede, and obstruct” investigation into the Watergate break-in, and to “cover up, conceal, and protect those responsible” and “conceal the existence and scope of other unlawful covert activities”; (2) “violating the constitutional rights of citizens” by misusing executive agencies including the IRS and the FBI; and (3) “refusing to produce [those] papers and things” subpoenaed by the House Judiciary Committee.  The third article accused Nixon of disobeying congressional subpoenas – something that many other presidents have done throughout U.S. history, particularly when they assert “executive privilege."  The second article, as noted above, consisted of bare allegations of misuse of various agencies, without any evidence supporting the allegations (at least as they pertained to Nixon himself).  But it was the first article, alleging obstruction of justice through the attempted cover-up, that was supported by evidence – including the above-mentioned “smoking gun” tying Nixon personally to the cover-up.   

    With only the Johnson impeachment as a precedent – and given the traditional criticism of that impeachment as politically-motivated – even the most partisan Democrats on the House Judiciary Committee in 1974 were reluctant to impeach Nixon unless he could be personally tied to some criminal activity.  Some on the committee (including a young staffer, a recent graduate of Yale Law School, Hillary Rodham Clinton) maintained that any sort of malfeasance in office might constitute “high crimes and misdemeanors” (an opinion she’d later regret when her husband faced impeachment 24 years later); but most members of the House committee followed a narrower standard requiring actual criminal conduct.  Note, however, that both the first and second articles of impeachment held Nixon responsible not just for the actions he personally undertook, but also the actions undertaken “through his subordinates and agents.”  In other words, the House Judiciary Committee, while following a fairly narrow standard of what constitutes impeachable offenses, regarded the “buck” as stopping at the president’s desk. 

    Nixon, showing both a patriotism and a sense of shame which were conspicuously absent in Bill Clinton, resigned the presidency in August 1974, rather than put the United States through the agony of another presidential impeachment trial.  Indeed, he decided to resign even before the House voted formal charges of impeachment, after a bipartisan delegation of members of Congress visited him at the White House and told him, frankly, that he’d be impeached by the House and most likely convicted by the Senate.  The fact that the effort to impeach Nixon was truly bipartisan – that Republican leaders joined Democrats in urging Nixon to resign – made the formalities of impeachment and trial unnecessary (along with, as I’ve noted, Nixon’s patriotism – putting the good of the country, and of the presidency, ahead of his own ego, as well as his own sense of shame). 

                Over sixteen years ago, in October 1996 just before the elections that year, I wrote an op-ed that was published in my local newspaper, The Columbus Dispatch, arguing that rather than being elected to a second term, Bill Clinton ought to have been impeached and removed from office.  (That was long before anyone had heard of Monica Lewinsky, the White House intern with whom Clinton was having a sexual relationship – an affair he tried to cover up by committing perjury before a grand jury and by obstructing justice by suborning others to commit perjury – the crimes for which he was impeached in 1998, although he ultimately was found not guilty by the Senate in a politicized trial in 1999.)  In 1996 the “high crimes and misdemeanors” Clinton appeared to have committed involved the scandals known as Whitewater, “Travelgate,” and “Filegate” – matters involving allegations of abuse of power far more serious than the Lewinsky cover-up (and also far more serious than the Watergate affair that ended Richard Nixon’s presidency), but for which special prosecutor Kenneth Starr did not recommend impeachment because he could not discover evidence personally linking those crimes to Clinton.   

                   As I noted in my “Legacy of Watergate” essay and again in my essay “The Worst U.S. President” (Feb. 15, 2008) – needless to say, written at a time when I couldn’t imagine an even worse (or more lawless) president than Clinton – Ken Starr did Clinton a huge favor by deciding to report to Congress only those actual crimes for which he found evidence personally linking Clinton – a narrower standard of impeachable offenses than the House Judiciary Committee had adopted with regard to Nixon in 1974.  That is why two articles of impeachment approved by the House late in 1998 charged Clinton only with (1) obstructing justice by providing “perjurious, false, and misleading testimony” to a federal grand jury and (2) obstructing justice by, among other things, concealing evidence and suborning perjury by others.  Notwithstanding clear evidence of his guilt for both these crimes, Clinton was acquitted by the Senate – voting 45 guilty, 55 not guilty on the first (perjury) count and voting 50–50 on the second (obstruction of justice) count.  As the House’s chief investigative counsel, David P. Schippers (a Democrat and former prosecutor from Chicago) explained in his excellent book, Sell Out: The Inside Story of Clinton’s Impeachment (2000), the senators who voted to acquit Clinton did not follow the evidence; instead, they followed “the path of political expediency,” in which high public officials would be answerable only to “politics, polls, and propaganda.”  Clinton remained in office because he was popular and he had the support of his party, which dismissed his unlawful actions as “merely” about “sex.” 

                   Republicans, by focusing on the sexual nature of the Lewinsky affair rather than the crimes of perjury committed in furtherance of the cover-up, botched their prosecution of Clinton, allowing the Democrats to dismiss it as merely about “sex.”  Judging from the role he played in last year’s presidential election, at the national convention and on the campaign trail, Clinton is regarded by Democrats as a kind of senior statesman in the party, rather than as a serial abuser of women and an unconvicted felon who was stripped of his license to practice law.  How short political memories are! 

                   Virtually any of the lawless actions or abuses of power that B.O. has committed thus far during his occupation of the White House would constitute impeachable offenses.   Indeed, in my “Election 2012 Postmortem” essay (Nov. 10, 2012), I predicted that B.O. would be impeached and removed from office before he completes his second term:

                “The abuses of power committed by B.O. over the past four years – abuses that justify my calling him `the most lawless president in U.S. history’ . . .  ought to have disqualified him from being elected to a second term.  Now that he has been reelected, however, I seriously doubt whether he will complete a second term.  I am confident that his disregard for the Constitution and the rule of law will prompt him to continue to commit offenses – not merely `high crimes and misdemeanors,’ but also possibly treason (making him the first traitor to hold the office of president) – that will justify his impeachment and removal from office before his second term ends in 2017, in other words, sometime during the next four years.”


    I added that he might be the first president to be impeached for treason, given the ominous incident that some commentators have called “Mic-Gate”: an incident that shows how far B.O. is willing to go to appease Russia.  In a private conversation captured accidentally on a “hot mic,” B.O. told Russian President Dmitri Medvedev on March 26, 2012:  “On all these issues, but particularly missile defense, this can be solved, but it’s important for him [Putin] to give me space. . . . This is my last election.  After my election, I have more flexibility.”  So B.O. wants the Kremlin to know that more “flexibility” is on the way, but since American public opinion will oppose it, he needs to wait until after his presumed reelection.  That’s practically an admission that B.O. intends to act against the interests of the United States, to give aid and comfort to our enemy, Russia (which remains the enemy of the U.S. even with the “Cold War” over) – in other words, to commit treason. 

                   But as I noted in my “Legacy of Watergate” essay, the lessons of both Nixon’s and Clinton’s presidencies – of the successful attempt to impeach Nixon and of the failed effort to convict Clinton – make clear that a successful impeachment effort must be bipartisan; it cannot be begun by the Republicans alone, unless they hold commanding majorities in both the House and the Senate.  (Perhaps some particular abuse of power – such as the drone strikes discussed in the “Assassin-in-Chief” section above – will receive attention from the news media and arouse concern among left-liberals as well as conservatives and libertarians.)  Hence, as I discussed in the concluding section of my “Election 2012 Postmortem” essay, the Republicans face a major challenge – both to educate the American people and to make them care, to make them care whether the man in the Oval Office is abusing the awesome powers of his office.  As I wrote: 

    “More generally, Republicans must educate the American people about the basic laws of economics, the fundamentals of our free-market capitalist system, and our federal, republican system of limited government powers – and the vital role played by the U.S. Constitution, if scrupulously adhered to, in both limiting government power and safeguarding individual rights (including such precious rights as economic freedom).  They cannot allow the Democrats and their allies in the left-wing news media to misrepresent Republicans and their policies, to tell lies and to demagogue with the American people.


    “Indeed, Republicans today also should take the lead in educating the American people about the importance of the rule of law – and the vital importance of all public officials adhering to the law (particularly the president, who at his inaugural takes an oath to `support and defend’ the Constitution, which also imposes on him the duty `to see that the laws are faithfully executed’).  Republicans should emulate one of the founders of their party in the 19th century, Abraham Lincoln, who in his famous Address before the Young Men’s Lyceum of Springfield, Illinois, declared, in part, `Let every American, every lover of liberty, every well wisher to his prosperity, swear by the blood of the Revolution, never to violate in the least particular, the laws of the country; and never to tolerate their violation by others. . . . In short, let it [respect for the rule of law] become the political religion of the nation . . . .’


    “And, [of course], Republicans must take the lead in exposing the various ways B.O. is abusing the powers of the presidency, violating both the Constitution and the rule of law.  The House Republicans should have the guts to be prepared to initiate impeachment proceedings against B.O. – but only after they have first paved the way by getting popular support (primarily by getting the American people to care about abuses of governmental power by seeing how it threatens their individual rights).  They should learn a lesson from the unsuccessful impeachment of Bill Clinton – when the Republicans failed to make the case why he had violated the law and ought to be removed from office, and so they failed to acquire bipartisan support from the Senate Democrats they needed to convict Clinton and remove him.  Republicans instead should learn a lesson from the near-impeachment of Richard Nixon in 1974: how Nixon’s political enemies (within the Democratic Party, academia, and journalism) started building the case for his impeachment well before the Watergate scandal caught the public’s attention; so, by the time the House was ready to vote for articles of impeachment, a bipartisan group of members of Congress advised Nixon he must go – and thus he resigned, for the good of the country.”   


    I added, “I don’t expect anyone as arrogant and narcissistic as B.O. to do a similar thing [as Nixon did], when faced with the real possibility of impeachment by the House and a trial in the Senate – which is why it is so vitally important for Republicans to educate the public and, frankly, to not only court but also shape public opinion.”  The Democrats’ success in doing that helps explain why they retained both the White House and control of the Senate in the 2012 elections.  The Republicans will need to be equally successful, to retain control of the House and perhaps regain control of the Senate in the 2014 midterm elections – if we are to hope for a restoration of a constitutional presidency, before B.O.’s lawlessness destroys both the office and the Constitution.


     | Link to this Entry | Posted Thursday, February 21, 2013.  Copyright © David N. Mayer.

    2013: Prospects for Liberty (Part III) - February 7, 2013


    2013: Prospects for Liberty


    Part III



    Continuing my sometimes-annual January blog essay on “The Prospects for Liberty” in the coming year, I’m emphasizing again this year what I have called “the tyranny of bullshit.”  Part I discussed the single greatest threat to individual freedom today – government paternalism, through the expanding national regulatory/welfare state – and the policies being pushed by B.O. and his party, the Democrats, the party that advocates further expansion of the welfare state.  Their policies are based on nothing more than bullshit:  bullshit rationalizations offered by paternalistic politicians, bullshit believed by a majority of the gullible public, the fools who voted for the politicians “whose sole qualification to rule [us] was their capacity to spout the fraudulent generalizations that got them elected to the privilege of enforcing their wishes at the point of a gun,” to quote Ayn Rand’s apt definition of politicians.  Unlike other forms of tyranny in the past, the tyranny of bullshit is not based on the use of force against an unwilling people:  it’s based instead on fraud, fraud committed by the politicians who spout the bullshit theories that persuade a gullible majority of the people to put the collars around their own necks.   

    Part I focused on the reelection of B.O., the current “Occupier” of the White House and the Bullshitter-in-Chief, whose policies pose the most tangible threat to Americans’ freedoms in the near future (and are the most tangible manifestation of “the tyranny of bullshit”).  I discussed the reasons why B.O. was reelected to a second term and, in general, the challenges facing those of us who oppose his policies and the hope we have to defeat, or at least to thwart, his efforts to implement them.   

                In Part II, I began discussing particular aspects of B.O.’s “tyranny of bullshit,” focusing on two of the most important: “`Fairness’ and `Social Justice’ Bullshit” (threats to liberty from redistributionist taxation) and “Keynesian `Stimulus’ Bullshit” (threats to liberty from profligate government spending).  The section on taxation included a discussion of the new tax law, passed at the very end of the 112th Congress, intended to resolve the tax part of the “fiscal cliff” crisis.  The second section included discussion of the unresolved parts of the “fiscal cliff,” the critical problems of out-of-control federal deficit spending and the burgeoning national debt. 

                In this third and final part of the 2013 “Prospects for Liberty” essay, I’ll discuss some additional types of bullshit propagated by B.O. and the Democrats that threaten Americans’ freedom – and the nation’s well-being.  Within each category, I’ll also discuss the challenges that those of us who oppose B.O.’s policies face – and also the hope we have in successfully meeting those challenges and thus defending Americans’ liberty.




    “Health Care Reform” Bullshit:

    The Threat to One of the Most Precious Aspects of Liberty

    (Our Freedom to Own Our Own Lives)


                Last summer, on June 28, the U.S. Supreme Court surprised most legal commentators by its decision in National Federation of Independent Business v. Sebelius, the case challenging the constitutionality of the 2010 federal health-care regulatory law, which is officially titled “The Patient Protection and Affordable Care Act” but is popularly known as “ObamaCare,” because it is the signature legislative “achievement” of B.O.’s presidency.  (The official title of the law is quite disingenuous, for it provides genuine “protection” for no one and it makes health care more expensive, not more “affordable.”  Interestingly, advocates of the law, including B.O. himself, lately have been calling it simply “the Affordable Care Act” – dropping the first lie but perpetuating the second.) 

                Whatever one calls it – whether the PPACA, the ACA, “ObamaCare,” etc. – the 2010 federal law was a monstrosity, arguably the worst piece of legislation ever to pass Congress.  To put it another way, one might call it the biggest Mongolian clusterfuck of them all.  (The Urban Dictionary defines Mongolian clusterfuck as “a generally futile attempt to solve a problem by throwing more people at it rather than more expertise,” or more generally as something “that is spinning or has already spun out of control with disastrous results.”  The term, which originated in military slang, seems an apt description of the 2010 legislation.)  The massive, 2,801-page law – with its 159 new bureaucracies, $2.6 trillion in new spending, 1,968 new federal powers, and 13,000 pages of regulations (so far) – was rammed through the Democrat-controlled 111th Congress, despite strong opposition by the Republican minorities in both houses as well as strong opposition by the American people, according to most opinion polls.  Even supporters of the law acknowledged that it was too complex for them to either understand or explain exactly what it would do.  (Remember the infamous comment by former House Speaker Nancy Pelosi that Congress would first have to pass the bill before we would know exactly what it provided?) 

                Since its enactment, the law has remained unpopular.  In fact, the more Americans find out about the law and its actual effects on the nation’s health-insurance market, the less they like it.  They have discovered, among other things, that the law imposes 18 new taxes on Americans, including levies on medical device makers and small businesses that decide not to cover their employees – taxes totaling at least $800 billion, “a massive tax hike that hurts businesses and hampers economic growth,” as the editors of Investor’s Business Daily have concluded.  “ObamaCare” also increases the Medicare tax by nearly 1% – then counts the revenues from that tax hike twice, pretending the whole amount goes to fund Medicare and increase general budget revenues at the same time.  Worse, the law carves more than $570 billion out of Medicare – meaning that financially untenable program will go bankrupt much sooner.  Perhaps worst of all, the new Independent Payment Advisory Board – an unelected body, over which Congress has no control – will make sweeping decisions about what is covered.  “In short, the government will replace patients and doctors in deciding who gets care – and what care they get” (“Five Challenges in 2013 that Dwarf `Fiscal Cliff’,” January 2).  

    While state governments decide whether or not to create the insurance exchanges mandated by the law (discussed below), it has become more and more apparent that “ObamaCare” will fail to achieve the two principal goals stated in its official title:  it will not “protect” patients by guaranteeing coverage to Americans who are currently uninsured, nor will it make health care more “affordable.”  In fact, most Americans are finding that health-care costs – particularly the premiums that they or their employers pay for health insurance – are beginning to skyrocket, as the law starts to be implemented.  And many Americans are realizing that they will lose their insurance coverage, as employers change their business practices to adjust to the law’s “employer mandate.”  That provision of the federal law requires companies with over 50 employees to provide insurance for anyone working 30 or more hours a week, or face fines.  As Sally Pipes (CEO of the Pacific Research Institute) explains, the law creates a strong incentive for companies to reduce their workers’ hours to part-time (for example, Wal-Mart recently announced it will not offer health insurance to new employees who work less than 30 hours a week) – or for small businesses not to grow beyond 50 full-time workers.  Either way, the law will result in “job cuts [that] will hurt the working poor most” (Pipes, “Under ObamaCare, Many Will Lose Their Coverage,” I.B.D., December 27). 

                “ObamaCare” may not be fully “socialized medicine” – just a step toward socialized medicine – but, in a very real sense, it may be called “fascist medicine.”  Whole Foods CEO John Mackey caused a stir in 2009 when, in a Wall Street Journal op-ed, he compared the law to socialism.  In a January 16 interview on NPR, Mackey caused even more controversy when he was asked a follow-up question about the law.  After all, the interviewer noted, the so-called “public option” was never adopted; the law is a mishmash of mandates, regulations and price controls, but falls short of an outright nationalization of the insurance industry, so how could Mackey compare it to socialism?  Mackey set off another firestorm with his response – saying, “Technically speaking, it’s more like fascism,” explaining: “Socialism is where the government owns the means of production.  In fascism, the government doesn’t own the means of production, but they do control it, and that’s what’s happening with our health care programs and these reforms.”  Mackey has since walked back his statement – apologizing for his “bad choice of language” – but he was quite right:  “ObamaCare” is indeed fascist.  More precisely, it may be labeled “corporatism,” the economic theory that undergirded Benito Mussolini’s fascist regime in Italy in the 1920s and 1930s.   

                The brainchild of Alfredo Rocco, a key figure in Mussolini’s regime, corporatism “called for the organization of the economy into corporate sectors that would cooperate with the government in implementing state policy.”  Rocco aimed at “uniting workers, entrepreneurs, and government officials in economic activities, carried out in the public interest or the interests of the state.”  As he put it, “For Fascism, society is the end, individuals the means, and its whole life consists in using individuals as instruments for its social ends.”  As Robert Romano, senior editor of Americans for Limited Government, observes: “Certainly, that’s what ObamaCare – with its individual and employer mandates to respectively purchase and provide health insurance – seeks to accomplish.  It guarantees customers to large companies, in this case insurance providers that supported passage of the legislation, and in the process cartelizes the system” (Romano, “ObamaCare as `Fascism’? If Ideological Shoe Fits . . .,” I.B.D., January 23). 

    In truth, the law creates a corporatist or fascist system that (as its chief supporters wish, although they generally aren’t honest enough to say so) will eventually lead to fully socialized medicine, under a government monopoly.  Under the guise of extending health-care coverage to millions of uninsured Americans, the law really aims at nationalizing the health-care industry – imposing on private health insurance companies mandates that are fiscally unsound, which will eventually drive them out of business, destroying the market for private health insurance – and thus ultimately creating a “single-payer,” “universal,” national system of socialized medicine, jeopardizing not just the freedom but the health and even the lives of all Americans.  (B.O. himself, in a speech to supporters prior to its enactment, called the law “a step” toward universal coverage, or fully socialized medicine.) 

                Having the government “guarantee” so-called universal health care – which really is a euphemism for having the government totally control the providing of health-care services through a government monopoly, giving the government literally life-or-death control over the lives of its individual citizens – has been a dream of so-called “progressive” political activists since the early 20th century.  It was a key component of the national “welfare state” advocated by political activists in the so-called “Progressive” movement of the early 1900s – yet it was the one component omitted from FDR’s “New Deal” programs (which did include other key components, such as old-age pensions and disability payments through “Social Security” as well as unemployment compensation) and only partially enacted in Lyndon B. Johnson’s so-called “Great Society” programs, as expanded by Richard M. Nixon (namely, Medicare and Medicaid).   

                To call such programs – or the political activists who promote them – “progressive” really is a misnomer, a perversion of terms, for they are anything but progressive, in the true sense of the word.  As I discussed in my essay “Reactionary `Progressives’” (Mar. 16, 2006), there’s nothing truly “progressive” about the 20th-century “welfare state” model:  it’s actually reactionary, based on a centuries-old paternalistic model of government that had been revived by German Chancellor Otto von Bismarck in the late 19th century, in part to provide an “opiate for the masses” that might discourage Communist revolution.  This paternalistic model shows contempt not only for the free market but also (and most fundamentally) for the ability of ordinary citizens to be in control of their lives, by being free to make their own choices and to act on those choices; instead, “progressive” reformers ironically have unlimited confidence in the ability of government “experts” – the legislators who pass the laws, and the bureaucrats who make regulations to implement them – to make the choices and to impose their choices on individual citizens.  

    The so-called “right to health care” that left-liberal “progressive” activists have been pushing for the past century or so is really a perversion of the concept of individual rights, properly considered.  What it really means is not a “right,” properly speaking, at all, but rather the loss of the genuine right that all Americans have – an aspect of their fundamental natural right to liberty, which includes liberty of contract – the right best described as “health care freedom,” the right of individuals to be in control of their own health and lives, including their freedom to enter into contracts for health care products and services, such as health insurance.  Yet it’s that genuine, legitimate right to health-care freedom that the 2010 law eventually will destroy, as I wrote in my essay “Health Care `Rights’ and Wrongs” (Oct. 16, 2009). 

                And although it has been touted by its proponents as major “health care reform,” the 2010 law is not really a “reform,” in the true sense of the word.  It moves the U.S. health-care system in entirely the wrong direction, away from the kind of common-sense true reform that’s really needed.  As I discuss in my 2009 “Health Care `Rights’ and Wrongs” essay, the real problem with the high cost of the U.S. health-care system is its misuse of insurance to cover routine health-care costs rather than extraordinary costs like hospitalization (which is what insurance ought to be for).  The private-insurance part of the U.S. health-care system is its strength – giving Americans access to the best-quality health care in the world – even despite its problems (not only cost but the problem of “portability” because it’s employer-based), while the part that has been socialized by government (Medicare and Medicaid, primarily) has inflated overall costs while reducing quality of care to those people (generally the elderly and the poor) who are limited in their choices to these government-monopolized programs.  Real reform would focus on contracting the socialized part of the nation’s health-care system (gradually privatizing both Medicare and Medicaid) while strengthening the market for private insurance, making it more competitive, and giving individuals greater freedom of choice in shopping around for the insurance plans best suited to their individual circumstances.  Real reform would focus on the basic problem of high costs rather than the chimera of expanding coverage to people who lack insurance (for whatever reasons). 

    The legal challenge to the 2010 law, in the case of National Federation of Independent Business v. Sebelius, focused on the most controversial – and the key – provision of the law, the so-called “individual mandate” (the requirement that Americans purchase health insurance as mandated by the federal government).  The pivotal opinion was written by Chief Justice John Roberts, who sided with the Court’s four more conservative justices in holding that the mandate was not a constitutional exercise of the powers of Congress under either the Commerce Clause or the Necessary and Proper Clause.  Roberts also joined the four conservatives and, remarkably, two of the more “liberal” justices on the Court, in holding unconstitutional another controversial part of the law, mandating that the states expand Medicaid to provide coverage for uninsured Americans.  Yet in the crucial part of Roberts’ opinion, the Chief Justice joined the four left-liberal justices in upholding the individual mandate, calling it a tax – in other words, a constitutional exercise of Congress’s power to levy taxes.  That’s the part of the decision that made all the headlines, as the news media touted it as a “victory” for B.O. and his regime.  (Interestingly, the media virtually ignored the fact that two “liberal” justices – including Justice Kagan, B.O.’s own former Solicitor General – had joined Roberts and the four conservatives in holding the Medicaid mandate on the states to be unconstitutional.  In that respect, at least, the decision was a huge political defeat for B.O.) 

    As I wrote in my special post discussing the decision (“Supreme Folly 2012: The Supreme Court’s `ObamaCare’ Decision,” July 5), Roberts’ peculiar (and ridiculous) decision is best explained by regarding the Chief Justice as a “wimp,” who was intimidated by threats of criticism by left-liberal commentators that a decision overturning the law would have been “activist.”  In two especially telling passages in his opinion, Roberts declared a need to “adopt any reasonable construction in order to preserve the statute from being ruled unconstitutional” and then stated:  “It is not our job to protect people from the consequences of their political choices.”  So, Roberts used the spurious rationale that the individual mandate was not a penalty but rather a “tax” – a rationale that not even the B.O. regime took seriously, when it tried to defend the constitutionality of the law during oral arguments before the Court.  Ironically, in order to avoid the false impression that the Court was being “activist,” deciding the case on political grounds, Roberts really did engage in illegitimate judicial activism, in fashioning the “tax” rationale – and thus in effect rewriting the law – to avoid a confrontation with the B.O. regime and leftist political commentators.  In other words, Chief Justice Roberts became “Fierce Jurist Botches,” the apt anagram created by my good friend Rod Evans, philosophy professor and author of the word-play book, Tyrannosaurus Lex. 

                Libertarian law professor Randy Barnett, the intellectual leader of the constitutional challenge to the federal law, made a valiant effort to put a positive “spin” on the Court’s decision, in a Washington Post op-ed (which I discussed in “Supreme Folly”) and in a subsequent interview published in Reason magazine (“`We Won in Our Effort To Preserve the Constitution’,” October 2012).  Barnett wrote that although challengers to the law lost the case, they saved the Constitution as a limitation on the powers of the national government by prevailing on all their most significant arguments of constitutional law.  He also found solace in the hope that the Court’s decision would make the 2010 health-care law a central issue in the 2012 elections.  As Barnett concluded in his op-ed, “Those [of us] who value our republican system of limited federal powers should . . . get to work to achieve politically the complete victory that the chief justice denied us.”  

                Unfortunately, Professor Barnett was proved wrong in his hope that “ObamaCare” would be a central issue in the 2012 elections.  It was barely mentioned: B.O. and the Democrats did not defend the unpopular law; and although Mitt Romney and the Republicans promised to repeal it, they did not highlight the issue.  And some pundits believed that Romney’s record as governor of Massachusetts (so-called “RomneyCare,” the state law he supported that also mandated purchase of health insurance) blurred the clear lines dividing the two parties on this issue.  (That argument is unconvincing, for several reasons; among them, it ignores federalism – how a law that’s unconstitutional and unworkable at the national level might be otherwise at the state level – as well as Romney’s clear promise to void the federal law.)  A better explanation is that, notwithstanding the unpopularity of “ObamaCare,” the majority of voters put a priority on other issues.  As I noted in Part I of this essay, B.O.’s reelection can be attributed to a variety of factors – but it is clear that support for “ObamaCare” wasn’t one of them.  The Supreme Court’s decision, allowing the keystone of the law to stand as a “tax,” thus ironically took it off the public’s radar.  

    This does not mean, however, that “ObamaCare” is a “done deal” and that the American people should consider their right to health-care freedom yet another fundamental right that they’ve lost to a fascist “Nanny State.”  Because key provisions of the law do not begin to go into effect until next year, 2014 – and because the federal law requires the cooperation of the states to fully implement it – it is still possible to block it and thus to help preserve Americans’ health-care freedom.   

    The most important way that state governments can block implementation of “ObamaCare” is to refuse to set up an insurance exchange.  An “exchange,” as defined under the law, is “a mechanism for organizing the health insurance marketplace to help consumers and small businesses shop for coverage.”  Thus described, it appears to be a kind of clearinghouse where consumers without employer-based coverage can obtain supposedly “affordable” health insurance (to comply with the law’s individual mandate); in practice, however, it is the chief means by which the federal government – acting through the broad discretionary power of the Secretary of Health and Human Services to issue directives (“as the Secretary shall decide” is the most repeated phrase in the “ObamaCare” law) – will control the health insurance market.  That’s because the only health insurance policies included in the exchanges are those that comply with the federal (H.H.S.) regulations.  All exchanges are supposed to be operational by January 1, 2014, but the states are not required to create state-run exchanges; they may decide to “opt out” and instead either allow the federal government to create an exchange for their residents or do a “state-federal partnership” exchange.  (Under the last type of exchange, a state-federal partnership, states can oversee insurance plans and assist consumers, but the federal government will handle duties such as enrollment and determining eligibility.  Under a purely federal exchange, the federal government will call all the shots.  It also will – as the H.H.S. Secretary announced in early December – impose a 3.5% fee on insurance plans sold via a federally-run exchange, presumably to help offset the exchange’s costs.  These costs will be passed on to consumers, further eroding the law’s promise to make health care more “affordable.”)  

                So far, only 19 states, most of them (all but five) with Democrat governors, have decided to set up state-run exchanges.  That means that a majority of states – 25 so far, most of them with Republican governors – have decided to opt out, either completely or by doing the state-federal partnership.  States are opting out because of the high costs and uncertainty of creating state-run exchanges, which are estimated to cost taxpayers in each state between $10 million and $100 million a year at a minimum – and even that is only a rough guess, because the full contours of the exchanges have yet to be specifically delineated by the H.H.S. Secretary.  (The state of Nebraska, for example, did a thorough analysis, estimating that running an exchange would cost the state about $646 million over eight years, fiscal 2013–20.  “It is simply too expensive to do a state insurance exchange,” Nebraska Gov. Dave Heineman said.)   

                States are also opting out because they have justifiable concerns about the loss of state autonomy – in other words, they fear becoming mere functionaries of the federal government.  In a letter to H.H.S. Secretary Kathleen Sebelius, Texas Governor Rick Perry explained why he wasn’t playing along with the pretense of so-called state-run exchanges.  Governor Perry bluntly wrote: “It is clear there is no such thing as a state exchange.  This is a federally mandated exchange with rules dictated by Washington.”  And Wisconsin Governor Scott Walker noted, “No matter which option is chosen, Wisconsin taxpayers will not have meaningful control over the health care policies and services sold to Wisconsin residents.”  As Sally Pipes observes, “Walker is right.”  The federal H.H.S. department dictates that all policies sold on the exchanges must meet one of four classifications – “platinum,” “gold,” “silver,” or “bronze,” depending on the percentage of health costs a plan covers – but given the caps on deductibles imposed on all plans, even the least expensive “bronze” plans, the mandates prevent insurers from offering low-cost products that may best fit a family’s budget.  And because “ObamaCare doesn’t just set the rules – it also tasks states with enforcing them . . . running a [state] exchange could therefore get pricey” (Pipes, “ObamaCare Exchanges Turning into Disasters,” I.B.D., January 30).     

                Some states – those with state constitutional provisions or statutes guaranteeing their citizens’ health-care freedom – are constitutionally or legally prohibited from implementing an “ObamaCare” exchange.  Here in Ohio, for example, an amendment to the Ohio Constitution – the “Health Care Freedom Amendment,” approved overwhelmingly by voters in November 2011 – prohibits state and local governments from compelling, “either directly or indirectly, any person, employer, or health-care provider to participate in a health care system.”  Under the constitutional provision, “compel” is defined to include “the levying of penalties or fines.”  Maurice Thompson, director of the 1851 Center for Constitutional Law (and original draftsman of the Ohio amendment), has authored a paper explaining why Ohio constitutionally cannot implement an exchange.  Cato Institute analyst Michael Cannon has noted that 13 states “have passed statutes or constitutional amendments . . . that bar state employees from carrying out [the] essential functions of an ObamaCare exchange.”     

                Another way states may block full implementation of “ObamaCare” is to refuse to expand Medicaid coverage.  A provision in the federal law requires the states – as a condition for receiving federal funding – to expand Medicaid by covering everyone earning up to 138% of the federal poverty threshold ($32,000 for a family of four).  (This is the chief way in which the federal law aims to cover millions of low-income uninsured Americans.)  As briefly noted above, in an often-overlooked (yet critical) part of last summer’s decision, the Supreme Court held that the states could not be forced to expand Medicaid as the federal government dictates.  Some states (most with Republican governors) have decided not to expand their Medicaid coverage, but other states (including, disappointingly, some with Republican governors, such as John Kasich of Ohio and Jan Brewer of Arizona) have decided to embrace Medicaid expansion because they cannot resist the lure of federal funding.  (The federal government has promised to provide a 100% spending match on the newly eligible for the next three years, phasing that down to 90% in 2020.)  Governor Kasich, in his recent budget proposal, said it was “the right decision for Ohio” to expand Medicaid eligibility – adding some 275,000 Ohioans to the welfare rolls within the first year, while prompting another 250,000 who are currently eligible to sign up, he estimates – just to receive an estimated $13 billion in federal funds over the next seven years.  (In other words, an offer he just couldn’t refuse.  Judas had his 30 pieces of silver; Kasich, $13 billion in federal money.)  

                Constitutional challenges to the law are still being considered by the courts.  One case, begun by a suit filed by Liberty University in Lynchburg, Va. on the day the bill was signed into law in 2010, challenges the constitutionality of the employer mandate – specifically, whether the federal law’s forced funding of abortion and contraception is unconstitutional under the First Amendment religion clause (protecting the free exercise of religion) and the federal Religious Freedom Restoration Act.  The Liberty University lawsuit had been dismissed by a panel of the U.S. Court of Appeals for the Fourth Circuit, in Richmond, Va., but the Supreme Court ordered the appeals court to reconsider it.  A similar lawsuit, filed by North Carolina’s Belmont Abbey College (a Roman Catholic school), is also being considered by the federal appellate court; it is among the 40 or so lawsuits (mostly from church-affiliated schools and hospitals opposed to abortion) that are challenging the contraception mandate on religious grounds.  These cases might be rendered moot, however, by the recently announced changes in the employer mandate that will exempt some religious institutions from covering certain medical services (like abortion or contraception) against which they have conscientious religious scruples. 

                Another case – considered by some commentators to be the best vehicle for tearing apart the “Affordable Care Act” – is a suit filed by the Oklahoma Attorney General, Scott Pruitt, challenging the constitutionality of the employer mandate generally.  Pruitt argues that the IRS oversteps its legal authority if it taxes Oklahoma businesses to subsidize the federal health-insurance exchanges that the law creates.  Taxing Oklahoma businesses to pay for a federal exchange established by Washington in his state – which will happen, since Oklahoma is declining to set up its own exchange – is not authorized under the law, Pruitt contends.  

                If the federal courts (and ultimately the U.S. Supreme Court) were to side with the Oklahoma Attorney General, the very existence of the federal law would become vulnerable.  Although most of the law could arguably operate without the employer mandate, the law lacks so-called severability language – meaning that if any part of it is found unconstitutional, a court could throw out the entire law.  (That didn’t happen last summer, however, when the Court invalidated the Medicaid expansion provisions.)  John Goodman, president of the National Center for Policy Analysis, has said that invalidation of the employer mandate could be a “fatal blow” for the law.  Nevertheless, given the Supreme Court’s record – not only John Roberts’ horrible decision last summer but also the Court’s at best mixed record in enforcing federal constitutional limits – sadly, Americans cannot count on the federal courts to protect their rights. 

                Ironically, the Supreme Court’s decision upholding the individual mandate as a “tax” (and allowing states to opt out of the expanded Medicaid program) opens up the law to another constitutional challenge.  “If the mandate is an indirect tax, as the Supreme Court held, then the Constitution’s ‘Uniformity Clause’ (Article I, Section 8, Clause 1) requires the tax to ‘be uniform throughout the United States,’” write Washington, D.C. attorneys David B. Rivkin, Jr. and Lee A. Casey in a provocative op-ed in the Wall Street Journal.  The “ObamaCare” tax fails to meet this standard because low-income taxpayers who can discharge their mandate-tax obligation by enrolling in the new, expanded Medicaid program (the functional equivalent of a tax credit) can do so only in those states that have not opted out of the Medicaid expansion.  The Court would face a “tough choice” if this challenge reaches it, Rivkin and Casey conclude:  

    “Having earlier reinterpreted the mandate as a tax, [the justices] would be hard-pressed to approve the geographic disparity created when states opt out of the Medicaid expansion.  But that possibility is inherent in a scheme that imposes a nominally uniform tax liability accompanied by the practical equivalent of a fully off-setting tax credit available only to those living in certain states.  To uphold such a taxing scheme would eliminate any meaningful uniformity requirement – a result that the Constitution does not permit.”


    (“The Opening for a Fresh ObamaCare Challenge,” December 5). 

                Two other cases also offer fairly plausible grounds for the federal courts to strike down “ObamaCare” in whole or in part.  Attorneys for the Goldwater Institute, a libertarian research and legal advocacy organization in Arizona, argue that the law’s Independent Payment Advisory Board (the appointed board that is supposed to hold down Medicare costs) violates separation of powers, because it is not subject to congressional oversight or even meaningful agency review.  The institute also contends that, by compelling individuals to disclose medical information to insurers and the federal government, the law places an undue burden on individuals’ right to privacy.  And a lawsuit filed by the Pacific Legal Foundation, another libertarian advocacy group, also turns on the Supreme Court’s decision upholding the individual mandate as a “tax,” arguing that if that is so, then the bill was unconstitutionally passed by Congress: the Constitution requires that all revenue bills originate in the House of Representatives, but the bill that eventually became the “ObamaCare” law was a substitute bill that initially passed the Senate, not the House. 

                Ultimately, there is only one sure way to save Americans’ real “right” to health care – their right to health-care freedom: the freedom to decide for themselves what kind of health insurance, if any, they need, and the freedom to choose that insurance in a nationwide free market, which is what real health-care “reform” ought to be aimed at.  That way is to repeal the noxious 2010 law, a law that never should have been passed.  As noted above, the more Americans find out about “ObamaCare,” the less they like it – and the more they want to see it repealed.  (A recent Rasmussen poll showed 56% of Americans favor repealing it.)  Unfortunately, repeal would require a political “revolution” in the United States that cannot occur until after B.O. leaves office: Republican victories in the 2016 general elections, assuring the GOP control of both houses of Congress and the White House.  Until then, as the law is implemented, Americans should continue opposing and resisting it as much as they can – and hoping that over the next four years it doesn’t irreparably destroy the system that has provided the best health care in the world.



    “Green” Bullshit:

    Threats to Liberty from “Climate Change” Paranoia

    and the B.O. Regime’s War on Carbon-Based Fuels


    As I wrote in the second part of my 2009 “Prospects for Liberty” essay,  

    Perhaps the greatest threat to the freedom and prosperity of the industrialized Western world today is the threat posed by radical environmental activists and the politicians who follow their bullshit, pseudo-scientific theories.  And no theory better epitomizes this “green bullshit,” as I call it, than the theory of “global warming” – or “climate change,” as it’s now euphemistically called – in other words, the theory that the average global temperatures are increasing, and that the Earth’s warming will lead to cataclysmic disasters such as massive flooding of coastal areas as the polar ice cap melts and the world’s oceans rise to dangerously high levels, etc., etc., and that this dangerous global warming is caused by human activity, namely, by man-made carbon dioxide (a so-called “greenhouse gas”) created by the burning of carbon-based “fossil fuels” such as coal, oil (and other petroleum products), and natural gas.


    I added that “the global-warming thesis is a theory that, despite the propaganda of global-warming alarmists, is far from being scientifically proven.  Indeed, it is a flawed theory that fails to fit the facts”; in other words, it’s a theory based on faulty “junk” science.   

                Over the past few years, it has become even clearer that the “climate change” or “global warming” theory is nothing but a scam – indeed, it has been called “the greatest scam in history” by John Coleman, meteorologist and founder of The Weather Channel.  The revelations of what has been called “Climate-gate” – e-mail exchanges and other documents hacked from computers at the Hadley Climate Research Unit at the University of East Anglia in Great Britain – show that there has been a conspiracy among some in the scientific community to spread alarmist views of global warming and to intimidate, if not silence, those who disagree (the skeptics of global-warming theory, called “denialists” by the true believers, whom I call “warm-mongers”).  In other words, what Michael Crichton warned about in his novel State of Fear – a global hoax by radical environmentalist terrorists – has practically come to pass.   

                Fortunately, however, more and more Americans are realizing that the radical environmentalists’ theory really is nothing more than a massive scam.  Public opinion polls show that concern about alleged “global warming” or “climate change” has drastically declined – particularly as a new “revolution” in carbon-based energy resources (discussed below) promises to re-energize the United States and the moribund American economy.  Notwithstanding media reports about 2012 being the “hottest year on record” for the continental United States – and the global warm-mongers’ attempts to exploit public misinformation about this story to continue promoting their “climate change” scam – most people now understand that climate is distinct from weather.  Whatever weather phenomena might explain last year’s summer heat (and drought) in the continental United States (such as an unusually northward jet stream), they provide no evidence for global warming.  Indeed, 2012 was a colder-than-normal year in other parts of the world, including Alaska and most of Europe.    

                Most recently, a group of more than 20 retired NASA scientists and engineers, who call themselves “The Right Climate Stuff,” has issued a report that decisively shatters the global warming myth.  After reviewing, studying, and debating the “available data and scientific reports regarding many factors that affect temperature variations of the earth’s surface and atmosphere,” they’ve found (among other things):  “the science that predicts the extent of anthropogenic [man-caused] global warming is not settled science”; “there is no convincing physical evidence of catastrophic anthropogenic global warming”; and “because there is no immediate threat of global warming requiring swift corrective action, we have time to study global climate changes and improve our prediction accuracy.”  They conclude that Washington is “over-reacting” on global warming and suggest that a “wider range of solution options should be studied . . . .” (“Facts About Climate,” I.B.D., January 28). 

                The recently announced news about the hypocrisy of one of the leading propagandists for the “climate change” scam, former V.P. Al Gore, only underscores how phony the whole campaign against so-called “fossil fuels” has been.  (In case you haven’t heard, Al Gore recently made an estimated $100 million, or one-fifth of the proceeds, from the $500 million sale of his failed cable TV news channel, Current TV, to the radical Muslim, anti-Semitic broadcaster Al-Jazeera.  Al-Jazeera “is literally bought and paid for by the monarchy of the oil-and-gas-rich Mideast nation of Qatar,” note the editors of Investor’s Business Daily.  Hence, it’s “a three-fer in greedy hypocrisy for Gore”: “He gives anti-Semitic Islamists more influence in America, takes $100 million in cash from Big Oil (after making a lucrative living by claiming that it’s destroying the planet), and gives those same greenhouse gasbags a foothold in the U.S. media” (“Oil Gore Gets His,” January 4).)  

    B.O.’s explicit reference to “climate change” in his second inaugural address shows his determination to continue pushing his regime’s anti-carbon agenda.  But it’s not just the bullshit radical environmentalist theory of “global warming” or “climate change” that’s behind B.O.’s war on coal, oil, and natural gas.  B.O. deliberately aims to weaken the United States economically, to transfer America’s wealth to countries in the so-called third world, in order to promote what he regards as global economic “justice.”  That’s the goal of the radical leftist anti-colonialist ideology he inherited from his father, Barack Hussein Obama, Sr., as documented by Dinesh D’Souza in his two books, The Roots of Obama’s Rage (2010) and Obama’s America: Unmaking the American Dream (2012), and his documentary film, 2016: Obama’s America.  As I observed in my “Tricks and Treats” essay (Oct. 25, 2012), D’Souza’s analysis provides perhaps the best explanation for B.O.’s double standard on energy:  he has waged war on carbon-based energy – using his Interior Department and EPA to impede development of America’s vast natural reserves of oil, natural gas, and coal (just when technological advances such as “fracking” have made exploitation of these reserves economical) – but has helped bankroll deep oil drilling in Brazil, Colombia, and Mexico.  As D’Souza explains, the anti-colonial theory “predicts that Obama should want to enrich the previously colonized countries at the expense of previous and current colonizers.  Anti-colonialists insist that since the West grew rich by looting the resources and raw materials of the colonies, it is time for global payback.  The West must have less and the rest must have more.” 

                Whether it’s “green” bullshit or anti-colonialist ideology (another kind of bullshit) that’s behind B.O.’s agenda, it’s clear that during the past four years the B.O. regime has been pushing an anti-carbon agenda (opposed to America’s exploitation of its abundant resources in fossil fuels – coal, oil, and natural gas) and that it’s likely to continue pushing this agenda in the coming years.

    In January 2009, the day before B.O. was first inaugurated, the average price of a gallon of gasoline was a mere $1.83.  Today, gas prices average at least $3.50 nationally and, according to some analysts, will slowly rise to a “new normal” of $4 or even $5 a gallon.  Thus, gas prices have nearly doubled during B.O.’s presidency – one of the few promises that he actually has kept.  As I wrote in my entry “Spring Briefs 2011” (March 18, 2011): 

                “High gas prices are the direct result of the policies of B.O.’s regime in Washington – indeed, they’re the deliberate policy of the regime, whose energy policy is best described as an anti-energy policy, designed to create an artificial shortage in carbon-based fuels, that is, oil, natural gas, and coal – a veritable war on America’s domestic oil, gas, and coal industry.  It’s a deliberate program of restricting domestic energy to make so-called ‘green energy’ (which cannot pay for itself without massive government subsidies) more attractive and necessary, all in order to fulfill B.O.’s campaign promise that energy prices would ‘necessarily skyrocket’ under his energy agenda.  . . . Energy Secretary Steven Chu (a physicist with no experience in the energy industry, who’s also a true believer in radical environmentalists’ global warming, or ‘climate change,’ theories) had, before he was appointed energy secretary, expressed a fondness for high European gas prices as a means of reducing consumption of fossil fuels.  In September 2008, Chu told the Wall Street Journal, ‘Somehow we have to figure out how to boost the price of gasoline to the levels in Europe’ – which at that time averaged about $8 a gallon.”


    And as I added, “Virtually all the decisions made by the B.O. regime not only document its hostility to fossil fuels but have had the effect of raising energy prices.”   

                Higher gasoline prices have had a ripple effect throughout the economy, causing the prices of other commodities – especially food – also to rise.  (Federal mandates requiring ethanol and subsidizing its production – which divert the nation’s corn crop from food to energy – are also an important factor.  Unfortunately, a bipartisan coalition of members of Congress from “Corn Belt” states shares responsibility for this foolish – or should I say “fuelish” – policy.) 

                The only solution B.O. has offered for higher gas prices is to impose even more stringent fuel-economy (CAFE) mandates on auto and truck manufacturers – which may help Americans use less gas but which will force them to drive smaller, lighter, less safe, and less convenient vehicles.  That’s also part of B.O.’s ultimate goal: to make America more like Europe, with both its sky-high gasoline prices and its tiny cars.  And it’s yet another important example of how the B.O. regime has deprived American consumers of their freedom of choice. 

                The B.O. regime’s anti-carbon (and hence anti-energy) policies also include the moratorium on oil drilling in the Gulf of Mexico that departing Interior Secretary Ken Salazar imposed in the wake of BP’s Deepwater Horizon disaster – a moratorium that, as I’ve previously noted, was far more destructive to the Gulf economy than the spill itself.  Energy economist Nick Loris of the Heritage Foundation estimated that Salazar’s unnecessary slowdown in permits and leasing after the Gulf spill led companies to slash investments by $18.3 billion and cost the Gulf region 162,000 jobs.  During his tenure at the Department of the Interior, Salazar removed major swaths of energy-rich U.S. land from production and exploration, costing the U.S. hundreds of billions of dollars.  Through last year, the number of yearly energy leases granted under B.O. was 51% below what it was under Clinton – and 36% below George W. Bush’s level.  As a federal report last year showed, oil output on federal lands fell 11% from 2010 to 2011, while natural gas dropped 6%.  (During the same time, thanks to the energy “revolution” discussed below, oil and natural gas output on private land surged, respectively, 14% and 12%.)  Today, the U.S. government leases less than 2.2% of federal offshore areas for oil and gas development, and just 6% of its lands.  All told, B.O./Salazar energy policies have taken an estimated 1.2 trillion barrels of oil and 21 trillion cubic feet of natural gas off the nation’s table – enough to last the U.S. hundreds of years (“Master of Disaster,” I.B.D., January 18). 

    B.O.’s decision to block the Keystone XL pipeline from Canada is another significant example of his regime’s anti-carbon, anti-energy policies.  The pipeline would have carried oil 1,700 miles from Alberta, Canada, to Texas refineries, creating thousands of American jobs while also increasing our energy independence by pumping 700,000 barrels of oil per day into the United States.  With B.O. having effectively killed the trans-USA pipeline, Canada probably will build it to British Columbia – and then sell the oil to China instead.  Noting that B.O.’s decision to kill the pipeline was meant to appease radical environmentalists – a critical voting bloc that B.O. counted on for his reelection – columnist Charles Krauthammer has concluded, “It’s hard to think of a more clear-cut case of putting politics over nation” (“Pipeline Sellout Placed Politics Above Country,” Investor’s Business Daily, Nov. 18, 2011).  Although Nebraska’s Department of Environmental Quality recently gave the revised Keystone XL pipeline route through that state a thumbs up, the U.S. State Department still hasn’t given its approval (which is necessary because the pipeline would cross an international border).  Notwithstanding a preliminary review showing the pipeline would have minimal environmental risks and would meet pipeline safety standards, the editors of Investor’s Business Daily predict that the State Department’s new review “will be a thumbs down since, in the minds of environmentalists, approval would contradict [B.O.’s] commitment to fight the climate change bogeyman in his second term,” as he pledged in his second inaugural address (“Now It’s Keystone vs. ‘Gang Green’,” January 24).      

    B.O.’s Environmental Protection Agency has announced new carbon-dioxide emission rules that force new power plants to install expensive new equipment – equipment that doesn’t even exist yet – to capture and bury emissions underground.  Because the new EPA rules are so draconian, they effectively ban new coal-fired power plants by making them uneconomical to build.  The actions show the B.O. regime following through on an earlier promise to crack down on the coal industry via regulation after the “cap and trade” carbon bill stalled in Congress in 2010.  The result will be higher electricity prices for all Americans, as the nation’s vast coal reserves remain untapped (“EPA’s War on Energy Continues,” “EPA Effectively Bans New Coal Plants,” I.B.D., March 28, 2012).   

                Ironically, just as the B.O. regime persists in its anti-carbon agenda, the world is about to experience a true revolution in energy, centered on the abundant natural resources (in so-called “fossil fuels” – coal, natural gas, and oil) here in the United States.  Technological innovations (particularly hydraulic fracturing, or “fracking”) have now made it economically feasible to extract carbon fuels from deposits embedded deep in shale rock.  This revolution in carbon energy production is at last making realistic a dream of U.S. policymakers since the 1970s: that the United States could be fully self-reliant, no longer dependent on Middle East countries. 

    Victor Davis Hanson has written a column noting how “the world was reinvented in the 1970s” by soaring oil prices and massive transfers of national wealth to the oil-rich Middle East; in a similar way, he predicts, a new revolution in world energy supplies is now taking place, centered on North America.  “The Canadian tar sands, deepwater exploration in the Gulf of Mexico, horizontal drilling off the eastern and western American coastlines, fracking in once-untapped sites in North Dakota and new pipelines from Alaska and Canada could within a decade double North American gas and oil production.  Given that North America in general and the United States in particular might soon be completely autonomous in natural gas production and within a decade without much need of imported oil, life as we have known it for nearly the last half-century would change radically.”  For one thing, the prospect of war in the Middle East no longer would disrupt U.S. energy supplies and hence no longer adversely affect our economy or national security (“World-Changing Oil Reserves,” I.B.D., March 30). 

    With regard to oil: “All told, the U.S. has access to 400 billion barrels of crude that could be recovered using existing drilling technologies, according to a 2006 Energy Department report.  When you include oil shale, the U.S. has 1.4 trillion barrels of technically recoverable oil, according to the Institute for Energy Research, enough to meet all U.S. oil needs for about the next 200 years, without any imports.”  And if we add to those numbers estimates based on new discoveries or new technologies that make oil more economically recoverable, the true total may be many times this amount (“U.S. Crude Supply 60 Times Greater than Obama Claims,” I.B.D., March 15).  Indeed, according to a recent report by the International Energy Agency, the shale-oil boom will help the U.S. overtake Saudi Arabia as the world’s largest oil producer by 2020 (“U.S. Redraws World Oil Map,” Wall Street Journal, Nov. 13, 2012). 

    Shale rock in the U.S. also contains abundant deposits of natural gas, now economically feasible to extract thanks to advances in “fracking” and horizontal drilling technologies – thus creating the shale-gas boom, which liberal economist Robert J. Samuelson has called “the most important energy event in decades.”  From 2000 to 2012, the U.S. share of global gas production zoomed from less than 2% to 34%.  By 2040 the Energy Information Administration expects overall gas production to increase by nearly 40% (with shale gas’s share rising to about half).  By one study, the gas boom – which has occurred mostly on private lands, outside the reach of the U.S. Interior Department to stop (as noted above) – has created nearly 500,000 jobs for producers and their suppliers; and in 2012 residential gas bills (which also cover transportation and distribution costs) were down 21% from 2008 (Samuelson, “Don’t Kill Most Important Energy Event in Decades,” I.B.D., December 25). 

    Besides natural gas, there’s another “fossil fuel” in which the United States has rich reserves, but which the B.O. regime has targeted in its war on carbon-based energy: coal.  With more reserves than any other nation on earth, the United States has been described as the Saudi Arabia of coal.  The U.S. Energy Information Administration (EIA) says there are 1.7 trillion tons of identified coal resources and 3.1 trillion tons of total coal resources.  “Based on U.S. coal production for 2010,” says the EIA, “the U.S. recoverable coal reserves represent enough coal to last 239 years” (“Fossil Fuels: Energy of the Future,” I.B.D., April 17, 2012).   

    B.O. likes to talk about oil, gas, and coal as “fuels of the past” and to portray his “green” energy policies as “forward-looking,” but really everything he’s proposing is “just a retread of Jimmy Carter’s failed energy policies from the 1970s” (“That 70s Energy Show,” I.B.D., March 27).  And so-called “green” energy sources aren’t really new technologies at all:  the first electric cars in the early 20th century actually predated those powered by gasoline; the first wind generator was built in the U.S. in 1888, and wind generators were common on farms by the 1930s; and the first solar collector was invented in 1908, with Bell Labs scientists building the first solar cell able to power everyday electrical equipment in 1954.  Yet each of these technologies – electric cars, wind, and solar – has proved to be far less efficient and economical than the burning of fossil fuels.  That’s why the government’s own Energy Information Administration estimates that solar and wind, combined, will account for less than 3% of U.S. energy production as far out as 2035 (“Back to the Future on Energy,” I.B.D., April 5, 2012). 

                The B.O. regime’s “green” policies, rather than producing alternative energy, instead have produced more political scandals – consider, for example, the case of Solyndra, the bankrupt company that has become the poster child for this regime’s crony capitalism (or, more properly, crony socialism or fascism).  And rather than creating any real American jobs, they have resulted in more “outsourcing” of energy-industry jobs overseas – for example, by the use of stimulus dollars to build electric cars in Finland, or support for the U.S. Export-Import Bank’s plan to lend Brazil’s state-run oil company, Petrobras, $2 billion to do deep-water oil drilling in the Atlantic.  As the editors of Investor’s Business Daily recently noted, “Obama’s forced dependence on foreign oil – while we leave vast reserves in the ground – has resulted in the acceleration of the greatest transfer of job-creating wealth overseas in history to the tune of hundreds of billions annually.  Why we can’t be our own best energy company, keeping jobs and cash here, is the fault of a president beholden to groups opposed to any American oil drilling” (“The Great Job Outsourcer,” May 3). 

    The editors of Investor’s Business Daily, concluding a several-week series of editorials about energy economics, observed last spring: “The administration has tried for three years to kick-start a ‘green economy.’  But fossil fuels still dominate for lots of good reasons.  It’s time for a return to reality. . . . The Obama administration has consistently refused to learn the lessons of the market.  Nowhere is this clearer than in its quixotic, expensive effort to create a new economy that no longer depends on oil.  That quest is a failure, though we doubt if the president will admit as much” (“Why Obama Lost His War on Oil,” April 25, 2012). 

                The good news is that B.O.’s anti-energy policies may be the one part of his second-term agenda that’s likely to meet determined congressional opposition, not only from Republicans but also from moderate Democrats from states rich in carbon-fuel resources (such as Senator Joe Manchin of West Virginia or the new senator from North Dakota, Heidi Heitkamp).  Thus, although radical environmentalists are encouraged by the lip service B.O. paid to “climate change” in his second inaugural (“What we expect is the president to deliver on climate, roll up his sleeves and build on the modest success of what he’s done so far,” said Michael Brune, Sierra Club executive director), energy industry leaders are also optimistic.  Jack Gerard, president of the American Petroleum Institute and a sharp critic of B.O., predicts that politics may force B.O. to articulate more of an “all of the above” energy policy (“Fight over greenhouse gases may be slowed,” Washington Post article reprinted in The Columbus Dispatch, Nov. 8, 2012).  



    “Gun Violence” Bullshit:

    The Threat to Second Amendment Rights


                 During the weeks following the 2012 elections, there were significant increases in the number of Americans purchasing firearms or applying for permits to carry concealed weapons.  Why?  Because many people feared that their Second Amendment rights would be jeopardized by B.O.’s regime, particularly in his second term when he’ll have the “flexibility” to be his true leftist/collectivist self.  Those fears have been realized by the reaction to the December 14 massacre at an elementary school in Newtown, Connecticut – a shameless political exploitation of the tragedy in order to push the leftists’ anti-gun agenda, a blatant example of the B.O. regime’s principal modus operandi to “never let a crisis go to waste” (in the infamous words of former White House chief of staff – and now Chicago mayor – Rahm Emanuel).  

                The mass murder of twenty-six persons – twenty children in first grade, all six- or seven-year-olds, plus six adults – at Sandy Hook Elementary School in Newtown, Connecticut on December 14, 2012 was a horrific crime.  But the tragedy was compounded by its shameless exploitation by gun-control fascists. 

                Advocates of gun control – whether politicians or their lapdogs in the news media – lately have been using the euphemism of “gun violence” (they’re concerned about the problem of “gun violence,” they want government to “do something,” to “take action” to solve the “gun violence” crisis, etc.), instead of honestly acknowledging that they’re pushing for more gun control.  In other words, they’re advocating more laws and government action (that is, using the coercive power of government, which is itself backed up by firearms) to “control” guns, by limiting law-abiding Americans’ freedom to own and possess firearms.  “Gun control” really means government control over the ownership, possession, and use of firearms by individuals – in other words, government control over firearms owners – and the most extreme advocates of gun control (the most extreme advocates of “solutions” to the “gun violence” problem) really want, as their ultimate objective, a government monopoly over firearms.  A government monopoly means the prohibition of private gun ownership, including the confiscation of all firearms possessed by individuals not authorized by the government to possess them. 

                Many (if not most) gun-control activists identify with left-liberal “progressivism,” the leftist or “progressive” wing of the Democratic Party, because their ideology – like that of the early 20th-century “Progressive” movement activists – is essentially paternalistic.  They have little if any faith in the ability of ordinary persons – “common” citizens – to be self-sufficient in any way, including defending themselves against criminals.  They do not see America as a society composed of competent, largely self-reliant individuals, with the majority of adult citizens being law-abiding and responsible owners of firearms, which they use not only for recreation but also for self-defense, against the minority of persons who are criminals (and therefore by definition are not law-abiding).  Rather, they have complete trust and confidence in government “experts,” including the legislators who make the laws and the “professional” police who enforce the laws, to keep people safe from the criminals.   

                As with other issues being pushed by the “progressive” activists (who are really reactionary, as I’ve discussed above, because they hark back to a centuries-old, paternalistic theory of government), the gun-control agenda is really an agenda to make individual citizens helpless, to paternalistically treat adults as if they were children, incapable of effectively protecting themselves, and thus completely dependent on the government (on the police and the military) for their protection against criminals – and utterly helpless to defend themselves against the government when its agents (the police or the military) become criminal by abusing their powers.  (To anyone who’ll say this wouldn’t happen in the United States, I can cite a number of examples throughout U.S. history, including Ku Klux Klan lynch mobs in the South after the Civil War.  For more recent examples, I’ll cite “Waco” and “Ruby Ridge” – two well-publicized shoot-outs resulting in the murder of innocent Americans by armed federal law-enforcement officials, from the FBI, DEA, and BATF, in the early 1990s.) 

                I call gun-control activists “fascists” because they are fascists, or statists:  they extol the state (the term fascism derives from the ancient Roman fasces, the bundle of rods and an ax that represented the power of the ancient Roman state); they subordinate the rights and liberties of the individual to the supposed “common good,” the interest of the state (or society, the collective).  Like the “Progressive” movement activists of the early 20th century, today’s gun-control fascists are profoundly anti-individualist.  Like B.O., they may pay lip service to Second Amendment rights – particularly after the Supreme Court, by its decision in Heller (2008), affirmed that the Second Amendment protects an individual’s right to keep and bear arms – but they don’t really like it.  If they could have their way, they’d have the Supreme Court reinterpret the Second Amendment, making it effectively useless, as the Court has done with so many other provisions of the Constitution.  Or, quite frankly, they’d have the Second Amendment repealed, expunged from the text of the Constitution (even though, as noted below, its core value is a natural right that ought not to depend on the text of a written constitution for its guarantee).  

                Gun-control activists also generally share what I call “gun-phobia”: an irrational fear of, and hatred for, guns, which seems to be rooted in their near-complete ignorance about firearms.  Consider, for example, the ban on so-called “assault weapons” enacted by Congress during the Clinton administration and in effect for ten years, from 1994 until 2004.  The term assault weapons has no objective meaning and is, in fact, practically meaningless; the law, which was based on emotion rather than reason, prohibited a series of specific semi-automatic firearms that politicians (such as Sen. Dianne Feinstein, the principal author of the law) considered “military” weapons because of their cosmetic resemblance to firearms used by the military, such as the M-16, a fully automatic machine gun.  But none of the weapons banned under the 1994 law were true “military” weapons (they were semi-automatic, not automatic).  Most importantly, two studies by criminology professors Chris Koper and Jeff Roth for the National Institute of Justice (one in 1997 and the other in 2004) found that the so-called “assault weapons” ban had no meaningful effect on the reduction of crime.  In their follow-up study, Koper and Roth concluded, “we cannot clearly credit the ban with any of the nation’s recent drop in gun violence.  And, indeed, there has been no discernible reduction in the lethality and injuriousness of gun violence.” 

    The firearm used by the killer in Newtown was a Bushmaster .223, which is a semi-automatic firearm frequently used by hunters:  it uses essentially the same sorts of bullets as small-game hunting rifles, fires at the same rate (one bullet per pull of the trigger), and does the same damage.  Gun-control activists frequently ask the rhetorical question, “Why do people need a semiautomatic Bushmaster to go out and kill deer?”  The simple answer, John Lott has noted, is that it is a hunting rifle – just one that has been made to look like a military weapon.  But, as he adds, “the point isn’t to help hunters.  Semiautomatic weapons also protect people and save lives.  Single-shot rifles that require you to physically reload the gun may not do people a lot of good when they are faced with multiple criminals or when their first shot misses or fails to stop an attacker” (“The Facts About Assault Weapons and Crime,” Wall Street Journal, January 18). 

    Facts are indeed “stubborn things,” as the saying goes – and the gun-control activists base their rhetoric on emotions, on feelings (mostly fear), rather than facts.  For example, USA Today ran a December 19 page-1A story – a so-called “data analysis” – about the “mass killing crisis,” with a headline reading, in big letters, “Virginia Tech, Fort Hood, Aurora, Sandy Hook . . . .”  The article reported that, using news accounts and FBI records from 2006 through 2010, the paper identified 156 mass killings (resulting in the deaths of 774 people) that met the FBI definition of a mass killing: one in which four or more people are killed by the attacker.  The story bragged that the paper’s review “offers perhaps the most current and complete picture yet of a crime that is frighteningly common and not widely understood.”  But only those readers who read all the way to the bottom of the page discovered the remarkable concluding paragraph: “For all the attention they receive, mass killings still accounted for only about 1% of all murders over those five years.  More died from migraines and falls from chairs than mass murders, according to death records kept by the Centers for Disease Control and Prevention.” 
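That buried “about 1%” figure is easy to verify with back-of-envelope arithmetic.  The 774 mass-killing deaths come from the USA Today analysis; the total-murder figure below is my own assumed ballpark of roughly 80,000 U.S. murders over 2006–2010 (FBI Uniform Crime Reports ran on the order of 15,000–17,000 per year in that span):

```python
# Back-of-envelope check of USA Today's "about 1% of all murders" figure.
mass_killing_deaths = 774          # from the USA Today analysis (2006-2010)
total_murders_2006_2010 = 80_000   # ASSUMED ballpark, not from the article

share = mass_killing_deaths / total_murders_2006_2010
print(f"{share:.1%}")  # -> 1.0%
```

Even under a generous range of assumptions for the total-murder count, the share stays right around one percent – exactly the point the paper conceded only in its final paragraph.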

                As the USA Today story illustrates, the leftist-dominated news media also suffers from, and perpetuates, gun-phobia.  Notice generally, for example, how news stories about crimes committed with firearms do not call the perpetrators “criminals,” “killers,” or “murderers”; rather, they call them “shooters” or “gunmen” – as if it were the firearm, and not the criminal who uses it, that’s responsible.  Like the gun-control activists whom they parrot, the news media contemptuously refers to responsible gun owners as the “gun culture” and to civil rights organizations that defend Second Amendment rights (such as the National Rifle Association (NRA) or Gun Owners of America (GOA)) as the “gun lobby.”  Interestingly, they never refer to the special-interest groups on the other side, groups that advocate gun control such as the Brady Center, as the “anti-gun lobby,” although that’s a fair description of them. 

                Quite predictably, even before the identities of the Sandy Hook School massacre victims were revealed, anti-gun activists (the anti-gun lobby, the politicians, and their allies in the news media) began exploiting the killing to push their agenda, calling for more government control over guns.  Immediately after the attack, Congressman Jerry Nadler (D.–N.Y.) declared war on the NRA, saying “We have to go to war against the people who enable the gun violence, the people who stop us from keeping guns out of the hands of mentally unstable people, of felons, and that means the NRA leadership.”  New York City Mayor Michael “Big Brother” Bloomberg appeared on NBC’s Meet the Press to propose a massive package of gun-control measures, declaring “We don’t need people carrying guns in public places.”  Senator Dianne Feinstein (D.–Calif.), principal author of the Clinton-era ban on “assault weapons,” announced she’d introduce a new assault-weapons ban in Congress; and Senate Judiciary Committee chair Charles “Chuck You” Schumer (D.–N.Y.) mulled limiting clips to 10 rounds and acting to keep guns from the mentally unstable.  And New York Governor Andrew Cuomo yelled and hyperventilated that “we must stop the madness,” as he proposed a series of gun control measures including “the toughest assault-weapon ban in the nation, period”; background checks even when one private party sells a gun to another; an Instant Criminal Background Check System for all ammunition purchases; a ban on Internet sales of ammunition; and a ban on high-capacity magazines (actually any clip with more than 10 rounds, which may be barely enough to save one’s family from a home invasion by multiple criminals). 

    A suburban New York newspaper, The Journal News (of Westchester County), posted an online database, drawn from public records (which the paper accessed through a Freedom of Information request), that identified all handgun permit holders in three New York counties.  (The newspaper’s stated rationale for posting the information again illustrates the news media’s “gun phobia”: the editors claimed they were performing a “public service” by warning residents about who in their neighborhoods had guns and therefore, presumably, posed a threat to public safety.)  By doing so, the paper provoked a firestorm of controversy – and it wasn’t just gun rights advocates who expressed their ire at the paper’s outrageous action.  Not surprisingly, many of the gun owners who were thus “outed” were justifiably concerned about their or their families’ safety (among the accessible names and addresses were those of prison guards, retired cops, and other individuals for whom the possession of a firearm is not only a constitutional right but a necessity, for they’re the people on whom criminals would like to get revenge for locking them up).  Or maybe the homes of those who were “outed” as gun owners – or the homes of their non-gun-owning neighbors – would become targets for thieves.  Walter T. Shaw, a former burglar and jewel thief and author of License To Steal (a book about his criminal career), called the newspaper’s piece “the most asinine article [he’d] ever seen.”  “Having a list of who has a gun is like gold,” he told Fox News.  “Why rob that house when you can hit the one next door, where there are no guns?  Or it could tell criminals in need of a gun to use or sell where one might be found when the owner is away” (“Gun Owner Lists Recklessly Endanger,” I.B.D., January 10). 

    Besides highlighting an apparent conflict between First and Second Amendment rights (discussed more fully below), the controversy also revealed the real hazards – to individuals’ right to privacy as well as their own personal safety – of such supposedly “reasonable” government regulation of firearms as compulsory registration laws.  When gun owners are forced to register their firearms with the government, they’re not only forced to disclose personal information to the government (which is dangerous enough in itself) but also, because of freedom-of-information laws, forced to disclose that private information to the whole world. 

                The editors of USA Today joined the anti-gun bandwagon with a December 17 editorial calling the Newtown massacre “an imperative for action.”  In response (in the newspaper’s “opposing view” column), Larry Pratt, executive director of Gun Owners of America (the group that makes the NRA seem like moderate wimps in comparison), wrote a courageous op-ed that focused on the principal indirect cause of massacres like that at the Newtown, Connecticut school.  (I say “indirect” because, of course, the direct cause was the criminal act of the murderer – an evil and/or mentally ill young man.  And I say “courageous” because at the time Mr. Pratt and the GOA spoke out, the NRA was being silent, still trying to decide what public position it should take.)  Mr. Pratt argued that “blood is on the hands of members of Congress and Connecticut legislators who voted to ban guns from all schools in Connecticut (and most other states).  They are the ones who made it illegal to defend oneself with a gun in a school when that is the only effective way of resisting a gunman.”  He concluded, “What a lethal, false security are the ‘gun-free zones,’” calling them “a lethal insanity.” 

                Mr. Pratt is quite right, and his proposed solution – a repeal of legislation mandating “gun-free” zones – is the only truly rational response to the Newtown massacre.  The problem isn’t the availability of firearms but rather existing gun prohibitions, particularly the existence of “gun-free” zones in schools, college campuses, movie theaters (like the Aurora, Colorado theater where last summer’s mass killing occurred), and other public places.  Virtually all the mass killings with guns that have made newspaper headlines in recent years occurred in places where either the government or the property owners had banned firearms.  In fact, as John Lott (author of the important book More Guns, Less Crime (2d ed. 2000)) has pointed out, with just a single exception (the attack in Tucson last year in which former Congresswoman Gabrielle Giffords was seriously wounded), “every public gun shooting in the U.S. in which more than three people have been killed since at least 1950 has occurred in a place where citizens are not allowed to carry their own firearms.”  

                As John Lott has shown in his book, firearms are used to prevent crimes (by armed, law-abiding citizens who refuse to be made victims) far more often than they are used by criminals. And as the editors of Investor’s Business Daily observed (in one of the best editorials published in the wake of the Newtown tragedy), paraphrasing the thesis of Lott’s book, “more guns in the hands of potential victims means less crime.”  The editors cite another shooting at a public place in Oregon that took place prior to the Newtown massacre but which had a far less tragic outcome because it did not take place in a “gun-free zone”: 

    “Before the tragedy in Connecticut, a shooter at an Oregon shopping mall was stopped by an armed citizen with a concealed carry permit who refused to be a victim, preventing another mass tragedy.  In the target-rich environment of the Clackamas Town Center two weeks before Christmas, the shooter managed to kill only two people before killing himself.  A far worse tragedy was prevented when he was confronted by a hero named Nick Meli.  As the shooter was having difficulty with his weapon, Meli pulled his and took aim, reluctant to fire lest an innocent bystander be hit.  But he didn’t have to pull the trigger.  The shooter fled when confronted, ending his own life before it could be done for him.  We will never know how many lives were saved by an armed citizen that day.”


    Other mass shootings – from a 1997 shooting at a high school in Pearl, Miss., and a 2002 shooting at Appalachian School of Law in Grundy, Va., to the New Life Church shooting in Colorado Springs which took place shortly before the massacre in Aurora – have been cut short when someone retrieved a gun from a car or elsewhere and confronted the shooter.  In contrast, like the Newtown school shooting, the 2007 massacre on the Virginia Tech campus (which resulted in the deaths of 32 people) and the shooting last summer at the Aurora, Colo. movie theater both occurred in a “gun-free” zone.  “John Fund, writing in National Review, notes that the Aurora shooter had a choice of seven movie theaters within a 20-mile drive of his home that were showing the Batman movie he was obsessed with.  The Cinemark Theater he chose wasn’t the closest, but it was the only one that banned customers from carrying their guns inside, otherwise allowed under Colorado law.”  And, of course, some of the most restrictive prohibitions of firearms in the world, in “gun-free” Norway in 2011, did not prevent a lone gunman from shooting 68 unarmed people after setting off a car bomb in the heart of Oslo that killed nine others (“A Lesson of Sandy Hook: Give Up Gun-Free Zones,” I.B.D., December 18). 

                The B.O. regime, of course, did not hesitate to exploit the Newtown tragedy to build political momentum for its long-desired gun-control agenda.  (Remember the regime’s motto: “Never let a crisis go to waste.”)  With much fanfare, V.P. “Smiley” Joe Biden headed up a task force on “gun violence” which held phony hearings and then sent its report to B.O., who then announced the measures the regime probably had predetermined to push as its gun-control agenda.  These included the expected call for Congress to reenact a ban on “assault weapons” plus other measures such as “universal” background checks, limiting gun magazines to 10 rounds, spending $4 billion to pay for training more local police officers, a resumption of federal research on gun violence by the Centers for Disease Control (with more funding in an unspecified amount), and spending $150 million so local schools could hire school resource officers and counselors to bolster school safety.  Many of these actions require legislation by Congress, but B.O. said he’d implement some actions by executive order, especially if Congress failed to act – proposing a total of 23 different executive orders, to be precise.  (One wonders how the White House came up with the magic number of 23 – why not 19? Or 26?  No doubt some public relations “experts” recommended that precise number.)  B.O. announced his proposals at a White House news conference, where he appeared behind the presidential podium, backed up by children who had supposedly written him about their concern about “gun violence” in the wake of the Newtown massacre. 

                It’s not surprising that B.O. would use children as props – as tools for his regime’s propaganda – for such abuse of children (and it really is a form of child abuse) has been used by other tyrants throughout history (including Stalin, Hitler, and Mao), as a series of pictures posted online – “Other Tyrants Who Have Used Children as Props” (January 16) – so effectively shows. 

                Gun-control fascists – B.O., “Smiley” Joe, other Democrat politicians, and their allies in the news media – also clearly are engaged in a massive propaganda campaign to demonize the NRA, the most visible organization within the gun-rights movement (and the nation’s largest civil-rights organization, in the true sense of the term civil rights).  When anti-gun politicians and the media contemptuously refer to groups like the NRA as the “gun lobby” (as I noted above), they’re really paying tribute to their effectiveness as a lobbying group – which is to say, an association of Americans who share common values and who join such associations in order to exercise their First Amendment right to petition the government.  The NRA reports that its membership has increased by some 250,000 – a quarter of a million Americans – in the month after the Newtown, Connecticut school massacre.  

                Already there is significant “pushback” – political opposition – to the B.O. regime’s agenda, from politicians (mostly but not exclusively Republicans) both in Congress and at the state and local level.  Senator Rand Paul (R.– Ky.) introduced a bill, called “The Separation of Powers Restoration and Second Amendment Protection Act of 2013,” to condemn the use of executive orders which undermine the powers constitutionally reserved for Congress and/or the right protected by the Second Amendment.  A number of sheriffs across the nation (most from rural areas in the South or West – including 28 of Utah’s 29 sheriffs) have announced that they will not enforce any new unconstitutional federal restrictions on firearms.  (After all, as officers of government, they all take an oath to support the U.S. Constitution and thus, when asked to enforce unconstitutional federal laws or executive orders, they have a duty not to do so – in effect, to “nullify” unconstitutional federal actions.)  And eight states – Alaska, Arizona, Idaho, Montana, South Dakota, Tennessee, Utah, and Wyoming – have adopted laws in recent years that would exempt guns made in the state from federal regulation as long as they remain in the state.  (That’s because federal regulation of firearms is premised on abuse of Congress’s power to “regulate commerce among the states”; firearms not in interstate commerce presumably are beyond the reach of federal authority.)  Twenty-one other states are considering similar legislation, according to a recent news article (“More states try to bypass federal gun regulations,” USA Today, January 11). 

                The Second Amendment, in its straightforward language, is an absolute prohibition on government action: “. . . the right of the people to keep and bear Arms shall not be infringed.”  When James Madison drafted the words that eventually became the Second Amendment, he placed the provision near the top of the list of proposed Bill of Rights amendments, second only to the First Amendment.  Like the Second Amendment, the First Amendment is also an absolute prohibition on government action:  “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press, or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”  The two amendments top the list of the rights enumerated in the Bill of Rights because they both protect natural rights (rights that exist prior to the formation of a political society): the First Amendment protects the natural rights of religious freedom (freedom of conscience) and freedom of expression, while the Second Amendment protects the natural right of self-defense. 

                The natural right of self-defense protected by the Second Amendment – and even more explicitly by similar provisions in the bills of rights of the constitutions of many states (including Ohio) – is a right that exists independently of government (and of the guarantees of a written constitution); that’s what the term natural right means – that the right derives not from government or society but is inherent in human nature.  (That is why believers in God call it a “god-given” right.)  Although in theory government is created to protect, or more effectively “secure,” individuals’ natural rights (deriving its just powers from the consent of the governed, as the Declaration of Independence proclaims and as English philosopher John Locke explained in his Second Treatise of Government), individuals do not surrender their right of self-defense to the government.  They retain the right, just as they retain other fundamental natural rights, such as those affirmed by the First Amendment.  

                There are several reasons why self-defense is a right retained by individuals and therefore a fundamental right that cannot be legitimately denied or abridged by the government.  One reason is that, as a practical matter, government cannot be relied upon to protect individuals from harm.  Even with today’s professional police forces and all the advantages of modern technology, government is not perfect, especially in a free society.  (Maybe it would be possible to create a “Big Brother”-type police state that so effectively controls the actions of all its citizens that it could, in theory, prevent crimes from ever taking place.  But anyone who values individual freedom – anyone who values human beings’ right to live as beings with free will, not as sheep to be herded by a government shepherd or as automatons programmed by a government planner – would not want to live in such an omnipotent police state.)  Should a citizen be powerless to defend herself, her family, her property, from armed criminals who invade her home, as she waits, minute by precious minute, for some government police force to respond to her 9-1-1 call? 

                More important, government cannot be relied upon to protect individuals from harm because often it is the government itself (or its armed agents) who are doing the harm.  Arguably, the most important aspect of the natural right of an individual to defend himself against criminals is the right to defend himself against government when it acts criminally, or tyrannically – when instead of following its legitimate function of protecting individual rights, government itself invades those rights, abusing its powers.  It happens, inevitably, because government (by definition) has a monopoly on the legitimate use of force (with the notable exception of the right of individuals to use force in their own self-defense); that monopoly power can easily be abused, because (as Lord Acton’s maxim famously declared), “Power tends to corrupt, and absolute power corrupts absolutely.” 

                Indeed, one might say that the right to keep and bear arms – the right guaranteed by the Second Amendment – is not primarily about self-defense in the ordinary sense.  Its real importance is self-defense in this broader sense (including defense against the government when the government acts criminally or tyrannically) – in other words, as a final deterrent against government violation of rights.  Historically, tyrannical governments always have tried to disarm the people.  It happened in England during the reign of James II, a Roman Catholic who aimed at restoring Catholicism in England by force.  His efforts to disarm Protestants led to his own downfall, in the “Glorious Revolution” – and ultimately, to the provision in the English Bill of Rights of 1689 guaranteeing the right to keep arms, which is the direct antecedent of our federal Bill of Rights’ Second Amendment (except ours protects the individual right of all the people to be armed, not just Protestants).  More recently, it happened in the former Soviet Union, whose Communist government in the late 1980s sought to disarm people in the Baltic states (Estonia, Latvia, and Lithuania) to prevent them from declaring their independence. 

                What “arms” are protected by the Second Amendment?  (Or by the Fourteenth Amendment, which was meant to apply against the states and local governments all the individual rights guaranteed against the federal government by the Bill of Rights, including the right to keep and bear arms?)  Given the amendment’s absolute language, the obvious answer is ALL arms – including those types of firearms typically used by the military.  Indeed, given the core principle that lies at the heart of the Second Amendment – that government must not have a monopoly on arms, for a “free society” requires that arms be possessed by “the people” – one may argue that so-called “assault weapons” (a term that is misleading, as noted above) should never be banned by government, for they are the most important category of arms that citizens ought to have, to make the right to keep and bear arms truly meaningful.  A ban on semi-automatic “assault” weapons, therefore, is unconstitutional – a clear violation of the Second Amendment.  For the same reason, a ban on automatic firearms – which has existed in federal law for over 70 years – is also blatantly unconstitutional.  (As I’ve discussed above, such bans not only contradict the plain language of the Second Amendment but also fall outside the legitimate powers of the federal government.)  “Reasonable” regulations of firearms – that is, those that can pass constitutional muster – may be legislated, if at all, only by the states, pursuant to their so-called “police power” (to protect public safety), which the Tenth Amendment reserves to the states. 

                Does this mean that government is absolutely prohibited from banning any type of weapon?  Of course not.  The “right to keep and bear arms,” by its own terms, is limited to arms – weapons, including all types of firearms, that may be used for self-defense against armed criminals (or tyrannical government).  It does not include what are today called “weapons of mass destruction” – biological, chemical, and nuclear weapons, which are inherently dangerous and destructive of human life – and which accordingly may be absolutely prohibited by the states, under legitimate exercise of their police powers, or may be banned from entry into the United States, under the legitimate power of Congress to regulate foreign commerce.   

                In a provocative recent op-ed, Victor Davis Hanson notes how the debate over gun control today seems to be pitting not only liberals against conservatives but also two provisions of the Bill of Rights – the Second Amendment and the First Amendment freedom of speech clause – against each other.  In response to left-liberals’ call for stricter gun-control laws, some conservatives are calling for equally unconstitutional restrictions on First Amendment rights, in order to “crack down” on the violent motion pictures or video games that may have motivated young killers like the Newtown, Connecticut murderer, Adam Lanza.  Hanson rhetorically asks,  

    “Just as semi-automatic weapons mark a technological sea change from the flintlock muskets of the Founders’ era, computer-simulated video dismemberment is a world away from the spirited political pamphleteering of the 18th century.  If we talk of restricting the Second Amendment to protect us against modern technological breakthroughs, why not curtail the First Amendment as well?  How about an executive order to Hollywood to stop its graphic depictions of mass killings, perhaps limiting the nature and rationing the number of shootings that can appear in any one film?  Can’t we ban violent video games altogether in the same way we forbid child pornography?


    “Isn’t it past time for an executive order to curtail some of the rights of the mentally unstable – given that the gunmen in mass killings usually have a history of psychic disorders and often use mood-altering drugs?”


    To ACLU liberals on the left who would react to such proposals with abhorrence, arguing they would unconstitutionally abridge First Amendment rights or other rights (such as due process, the general right to liberty, or the right to privacy), Hanson replies that Second Amendment rights are just as fundamental (or perhaps even more fundamental, for they enable citizens to protect themselves against encroachments on these other rights):  

    “To the Founders, the notion that individual citizens had recourse to weapons comparable to those of federal authorities was a strong deterrent to government infringing upon constitutionally protected freedoms – rights that cannot simply be hacked away by presidential executive orders.”


    (“A Constitutional Split over Controlling Guns,” I.B.D., January 18). 

                Rather than having Congress enact new, unconstitutional gun-control laws – or allowing B.O. to by-pass Congress by abusing his executive powers to impose his gun-control agenda – we ought to repeal the unconstitutional laws that are already on the books and to hold B.O. accountable for his abuse of power.



    “Consumer Protection” Bullshit:

    The Threats to Liberty from the Administrative State


    As I noted above, in the section on “health care reform” bullshit, so-called “progressive” reformers are really reactionaries; their idea of “reform” is to hark back to the centuries-old policy of paternalistic government – a model of government that sees its essential function not as protecting individuals’ rights (especially their right to be free to live as they choose, to “own” themselves) but rather as “protecting” them, in many cases from themselves, from the consequences of their own actions, their own choices.  Thus does the “progressive” model of government undermine both individual freedom and individual responsibility.  (Freedom and responsibility go hand in hand.  An individual who is truly free also owns his own life: he is free to act according to his own choices, and he is responsible for the consequences of his choices and his actions – he reaps the benefits of his good choices/actions but pays the costs of his bad choices/actions.  When government uses its coercive power to limit persons’ freedom to act, it also allows them to evade responsibility for their own lives.  By doing so, it undermines their humanity, for the essence of being human is to be able to make choices and to be free to act according to one’s choices – and to take responsibility for them.)  

    The “welfare state” that began to be created in the United States by the “Progressive” activists of the early 20th century has been called “the Nanny State” by some libertarian critics because it treats adult citizens as if they were children, incapable of deciding for themselves how they should live their lives.  That’s because, as noted above, “Progressives” had a low opinion of ordinary persons and their ability to govern themselves, but they had a high opinion – an idealism out-of-touch with the reality of human nature – of government “experts,” who would be supposedly nonpartisan and concerned only with “the public interest.”  One may ask:  Exactly what is “the public interest”?  Does it even exist?  Or are there in reality only many, many different private interests – and what the “progressives” call “the public interest” is really only the interest of certain favored individuals or groups?  (For more on the dangers to individual freedom from the “Nanny State,” see my essay “`A Nation of Sheep’ Under the `Nanny State’” (April 30, 2008).) 

    In addition to their notion of government by “experts,” early-20th-century Progressives (like their counterparts today) were contemptuous of American constitutionalism and key constitutional principles like federalism and the separation of powers.  (Their dislike of the Constitution was mainly rooted in the fact that it stood in the way of their basic goal of increasing the power of government – essentially the same reason why many modern leftists also hold the Constitution in contempt; following the Constitution and the limits it imposes on governmental power thwarts many of the leftists’ pet schemes.)  The “Progressive” reformers created the 20th-century “administrative state”: the rise, at both the state and national levels, of independent regulatory agencies staffed by “expert” commissioners with the powers to make law (rules or regulations), to enforce the law, and to adjudicate disputes in administrative tribunals.  Starting with the Interstate Commerce Commission (ICC), created in 1887 to regulate the railroad industry, a series of “alphabet agencies” (so-called because of Washington’s propensity to use acronym names) were created by Congress – several important ones during the “Progressive Era” of the early 1900s, including the Food and Drug Administration (FDA) and the Federal Trade Commission (FTC), and even more as part of FDR’s “New Deal” program of the 1930s: the Federal Power Commission (FPC, later renamed the Federal Energy Regulatory Commission, or FERC), the Federal Communications Commission (FCC), the Securities and Exchange Commission (SEC), the National Labor Relations Board (NLRB), and so on.  Still more administrative agencies have been created by Congress in recent decades, including the Occupational Safety and Health Administration (OSHA) in 1970, the Environmental Protection Agency (EPA), and the Federal Election Commission (FEC). 

    The so-called “fourth branch of government” dangerously blends legislative, executive, and judicial powers – and thus violates the fundamental constitutional principle of separation of powers.  More ominously, the unchecked power exercised by these supposedly “expert” bureaucrats threatens Americans’ freedom in countless ways.  (The only real check on the power of administrative agencies is provided by the courts, which are supposed to oversee the agencies’ exercise of discretion and strike down rules or regulations that exceed the powers given them by law – an oversight function in which the courts often fail because of their deference to the agencies’ supposed expertise and, more fundamentally, because the courts for several decades now have permitted the unconstitutional delegation of legislative powers.)  Unlike traditional executive agencies – part of the executive branch of government, within the bureaucracy of the Cabinet departments and ultimately accountable to the president – administrative agencies are called “independent” because they are not directly accountable to the president, for their members are appointed (nominated by the president and confirmed by the Senate) for long terms. 

    At the federal level, when one adds to the regulations promulgated by traditional executive agencies (enforcing the laws passed by Congress) the myriad of new rules or regulations promulgated by dozens of administrative agencies (supposedly carrying out their “mission” as defined in the laws that created them), one finds that the freedom of all Americans, in their daily lives, has been significantly curtailed in virtually countless ways.  In the name of “protecting” Americans from themselves (or in the name of some supposed “public interest,” such as to “save energy,” “preserve water” and other natural resources, “reduce greenhouse gases” (carbon emissions), etc.), government bureaucrats have denied individuals their freedom of choice.  To pick just a few examples, the government has mandated what kinds of light bulbs Americans may use in their homes (effectively banning Edison’s incandescent bulb and instead requiring “energy-efficient” bulbs, such as compact fluorescents), what kinds of toilets they may use (“water-saving” toilets that do not effectively flush), and what kinds of washing machines they may use (front-loading machines that do not effectively clean clothes, again supposedly to conserve water).  And, as noted in the section on “green” bullshit, above, fuel-economy (CAFE) standards – especially as they’ve increased to absurd levels under B.O. – have forced auto manufacturers to make smaller, lighter, more fuel-efficient cars that are less comfortable and far less safe in crashes, thus not only denying Americans who prefer larger, more comfortable “gas-guzzling” vehicles their choice but also perhaps jeopardizing their (and their families’) lives.  

    In many cases, by limiting Americans’ choices – in the name of protecting them or advancing some fictitious “public interest” – these agencies not only limit their freedom but also cause them harm, sometimes even killing them, all in the name of “protecting” consumers.  Critics of federal auto “safety” standards have argued that airbags and other devices mandated by government, supposedly to protect lives, actually have cost more lives than they’ve saved.  Similarly, critics of FDA regulation of foods, drugs, and medical devices have argued that the agency not only has failed to effectively protect Americans from unsafe foods and drugs but also that the complicated, expensive, and time-consuming FDA approval process has denied many Americans life-saving drugs or medical devices.  Government regulation thus not only may fail to “protect” consumers but may actually be responsible for killing them. 

    Effectively functioning free, competitive markets provide the best protection for consumers, especially in this Internet age, when the average consumer has access (literally at his or her fingertips) to a broad array of information (about all sorts of topics, including information for consumers about defective products or unscrupulous businesses).  Private testing agencies (like Underwriters’ Laboratories or Consumer Reports) – as well as businesses’ regard for their own reputations – do a far more effective job of “protecting” consumers than do government bureaucrats whose “watchdog” activities are enforced at gunpoint. 

    Under B.O.’s regime, the number of government regulations has skyrocketed, prompting some commentators to refer to America during the past four years as a “regulation nation” – a true “Obama-nation.”  A report last year from the conservative Heritage Foundation found that during the first three years of B.O.’s presidency, federal agencies imposed 106 new major rules costing more than $46 billion a year, with nearly $11 billion in one-time implementation costs – five times the cost of regulations during the Bush White House’s first three years.  There were, among other things, six major rules from the Securities and Exchange Commission, five from the Commodity Futures Trading Commission, one from the Federal Reserve, and hundreds of new rules authorized under the Dodd-Frank financial regulatory law; plus five new major regulations from the Environmental Protection Agency, which will cost more than $4 billion a year (“Regulation Nation,” Investor’s Business Daily, Mar. 14, 2012).  As the editors of I.B.D. also have noted, “the total regulatory toll for small businesses today is $11,000 per employee.  Since small businesses account for 80% to 85% of all new hiring and with the regulatory burden growing fast, it’s not hard to see why job growth has slowed to a crawl” (“Five Challenges in 2013 that Dwarf `Fiscal Cliff’,” January 2). 

    “The annual average number of regulations is higher during Obama’s first three years in office than during either George W. Bush’s or Bill Clinton’s administration,” reported Grover Norquist and John Lott in chapter 5 (“Regulatory Thuggery”) of their book, Debacle: Obama’s War on Jobs and Growth, and What We Can Do Now To Regain Our Future (2012).  Most importantly, there has been a “dramatic increase” – a 40% increase compared to the last term of the Bush administration, a 55% increase compared to the last Clinton term – in so-called “economically significant” regulations (those which the government thinks will cost businesses $100 million or more each year).  Just 75 of those new regulations under the B.O. regime will cost companies an average of $38 billion a year over the coming decade (pp. 126-27).  And as the authors report, “it will be years before we see the full regulatory impact of all the laws that were passed during Obama’s first two years in office” (p. 129).  The 2010 federal health-insurance regulatory law (so-called “ObamaCare”) created 159 new agencies, commissions, panels, and other governmental bodies; one six-page-long section of the law has turned into 429 pages of regulations issued in 2011 by the Department of Health and Human Services.  The Dodd-Frank financial regulatory law (what I’ve called the “Frank-n-Dodd monster”) has 447 new rules that are either required or suggested, including the infamous “Volcker rule” (which has yet to be written).  Because the Dodd-Frank law imposes its costs on banks and other financial institutions – whose loan activities are a chief source of capital for small businesses – the costly toll of the law and its regulatory rules percolates throughout the economy.  

    Consider one particularly egregious example:  The Consumer Financial Protection Bureau (CFPB, or simply “the Bureau”), a creature of the monstrous Dodd-Frank legislation (the “Frank-n-Dodd monster”), which is arguably the most dangerous administrative agency ever created.  As columnist George Will has observed, in several significant ways the CFPB violates the Constitution and the rule of law.  The Dodd-Frank law delegates effectively unbounded power to the Bureau and couples that power with provisions insulating it against meaningful checks by other branches of government, violating the principle of separation of powers and its corollary, checks and balances.  Unique among administrative agencies, the Bureau is not subject to congressional financial oversight – in other words, it is exempt from Congress’s “power of the purse” – because its funding comes not from congressional appropriations but is “determined by the director” and comes from the Federal Reserve.  (Will notes that one Bureau request for $94 million in Federal Reserve funds was made on a single sheet of paper and that its 2012 budget estimated $130 million for expenses identified simply as “other services.”  Little wonder the Bureau has been “hiring promiscuously and paying its hires lavishly”: as of a few months ago, about 60 percent of its then 958 employees were making more than $100,000 a year, with 5 percent making $200,000 or more.  In comparison, a cabinet secretary makes $199,700.)   

    The Bureau’s mission is to prevent practices that are “unfair, deceptive, or abusive,” and it is given the broad discretionary power to “declare” what those practices are.  “Unfair” and “deceptive” practices already are regulated by federal law – chiefly by the Federal Trade Commission, created during Woodrow Wilson’s presidency – and have proven to be sometimes-vague concepts, subject to the administrative agency’s discretion (and thus often unpredictable).  “Abusive” practices are not well defined – “abusive” is a novel legal term – and so the Bureau’s director essentially is empowered to make new law, as he determines whether a practice may be “abusive” and therefore illegal, because it “interferes with” a consumer’s ability to “understand” a financial product, or takes “unreasonable” advantage of a consumer’s lack of understanding, or exploits “the inability of the consumer to protect” his or her interests regarding a financial product.  “This fog of indeterminate liabilities is causing some banks to exit the consumer mortgage business,” Will observes (“Consumer bureau should be dismantled,” Columbus Dispatch, Nov. 18, 2012).   

    Unlike other federal administrative agencies, the CFPB is headed, not by a board or a commission consisting of several members with staggered terms, but by a single director, appointed by the president for a five-year term.  Richard Cordray, the former attorney general of Ohio, is currently the Bureau’s director.  I should note that I know Rich (he’s the husband of one of my faculty colleagues here at Capital University Law School): he’s a nice guy, a fairly bright guy (he was a champion on the TV quiz show Jeopardy!), someone who (I believe) sincerely wants to help people.  But he’s also politically ambitious; and as a Democrat, he shares with other members of his party a strong bias against business people and a tendency to demagogue (as his record as attorney general demonstrates).  I suspect that Cordray identifies intellectually with the early 20th-century “Progressive” movement; thus, it seems to me, he also shares with other modern “progressives” an elitist, arrogant certainty that business people are the “bad guys” and that government bureaucrats are the “good guys,” because they’re looking out for the “little guy,” the average consumer, who is considered incompetent to protect himself.  In short, I suspect that he’s a paternalist, one who sees himself as a kind of “philosopher-king.”  I’m not criticizing Cordray personally – he’ll do as well in this position as anyone – but I am criticizing the office he holds, the office of Bureau director itself.  No one person should have the broad discretionary power the director has – the power, among other things, to dictate the standards for all mortgage loans in the United States.  No one should have that power because no one is truly a “philosopher-king.”  That’s the conceit of the “progressives,” what libertarian economist and philosopher F.A. Hayek called “the fatal conceit”: the belief that a government bureaucrat can do a better job of planning an economy than can a free market, with the spontaneous order that a market provides (which is both far more efficient and far fairer than government planning). 

    There’s an additional problem with Cordray’s appointment as CFPB director.  He was one of the infamous “recess” appointments that B.O. made on January 4, 2012 – on a day on which the U.S. Senate was in pro forma session – in order to avoid a Senate confirmation fight (and possible filibuster).  The appointment therefore was an illegitimate, unconstitutional appointment.  I’ve previously argued that B.O. violated the Constitution in making these illegitimate “recess” appointments; it was one of the reasons I called B.O. “the most lawless president in U.S. history,” in my most recent “Ratings of the U.S. Presidents” essay (Feb. 22, 2012).  I’m pleased to report that a federal court, the U.S. Court of Appeals for the District of Columbia Circuit, has agreed with me, in finding B.O.’s supposed “recess” appointments to be unconstitutional.  In its decision in Canning v. National Labor Relations Board (decided January 25), the court held that the three pro-union members of the NLRB whom B.O. also appointed on January 4, 2012 were not legitimately appointed (they were appointed on a day on which the Senate was not in recess and therefore were not properly confirmed by the Senate, as the Constitution requires).  Without those new appointees, the Board lacked a quorum to make any decisions; hence, the court’s decision vacates the Board’s order challenged in the Canning case and effectively invalidates all Board orders or rulings made since December 2011. 

    Although the Canning decision directly applies only to the NLRB, its reasoning also would invalidate Cordray’s appointment – and all actions he has taken as CFPB director, including the controversial new mortgage “rule” (an 800-page set of rules regulating home loan lending) made by Cordray and announced earlier this month.  (The rule was meant to “protect” consumers from lending practices that preceded the housing bust, when many borrowers took on risky loans they didn’t understand and couldn’t afford.  Cordray’s rule has been criticized from both sides in the debate over mortgage policy:  some consumer groups say it “invites abusive lending” by giving lenders too much protection and low-income borrowers not enough; while business analysts say it will have the effect of forcing lenders to approve prime loans to borrowers who normally would qualify only for subprime loans, thereby perpetuating the loose credit rules that precipitated the housing bust.  Compare “Mortgage Rule to Protect Buyers,” USA Today, Jan. 10, with Paul Sperry, “Will New Loan Rules Seed the Next Crisis?” I.B.D., Jan. 15.) 

    The Supreme Court eventually may rule on this “recess” appointment question, and as I’ve noted above, both in this section and in the section on “ObamaCare,” we cannot rely on the Court to consistently or fully uphold the Constitution by curbing abuse of power by either the president or administrative agencies.  But in the D.C. Circuit’s decision in Canning, we can be encouraged by at least one federal court upholding the Constitution – and rebuking B.O. for his unconstitutional over-reaching, in at least this one instance.



    America’s “Last, Best Hope”:

     Federalism and the States as Bastions for Liberty


    As I observed in Part I of this year’s “Prospects for Liberty” essay, the United States of America is a unique kind of constitutional republic – a union of states (each with a republican government limited in its powers by each state’s constitution as well as the U.S. Constitution), united for certain limited purposes in a national government whose powers are limited by the U.S. Constitution.  Among the most important ways in which the Constitution limits the powers of the national government, in all its “branches” (including the executive branch, or the presidency), is through the principle of separation of powers and its corollary, checks and balances.  These “chains of the Constitution” (as Thomas Jefferson called them) limit the powers of the president – in terms of both the scope of his powers and the means by which he can exercise them – and thus pose the greatest threat to the tyrannical schemes of B.O. and the Democrats (or, to put it another way, provide the greatest protection Americans have for their individual liberties).   

    Just as separation of powers limits the powers of each of the three branches of the national government, so does federalism – the other fundamental principle of the U.S. Constitution – limit the powers of the national government generally.  Federalism is the principle that divides powers between the national government and the states:  the national government, which under the Constitution is a government of limited, enumerated powers, has only those powers explicitly granted it by the people of the United States, as memorialized in the written text of the Constitution; all other powers are reserved to the states or retained by the people, as the Tenth Amendment affirms.   James Madison nicely explained the federal division of powers this way, in Federalist Papers No. 45: 

    “The powers delegated by the proposed Constitution to the Federal Government, are few and defined.  Those which are to remain in the State Governments are numerous and indefinite.  The former will be exercised principally on external objects, as war, peace, negotiation, and foreign commerce; with which last the power of taxation will for the most part be connected.  The powers reserved to the several States will extend to all the objects, which, in the ordinary course of affairs, concern the lives, liberties, and properties of the people; and the internal order, improvement and prosperity of the State.”


    Madison also understood that federalism is a key structural check on the powers of the national government, keeping it confined to its legitimate, constitutional limits.  In advocating amendments that became the federal Bill of Rights, Madison maintained that these rights provisions not only allowed the courts to be “the guardians of these rights” (through exercise of their power of judicial review) but also authorized the states to act as a check against unconstitutional exercise of federal power: 

    “[T]he State Legislatures will jealously and closely watch the operations of this Government, and be able to resist with more effect every assumption of power, than any other power on earth can do; and the greatest opponents to a Federal Government admit the State Legislatures to be sure guardians of the people’s liberty.”


    (Madison, Speech in the U.S. House of Representatives, June 8, 1789).  In their famous Virginia and Kentucky Resolutions of 1798, Madison and his good friend and political collaborator, Thomas Jefferson, utilized the Republican state legislatures of Virginia and Kentucky to protest against the unconstitutional Alien and Sedition Acts, passed by a Federalist-dominated Congress (and enforced by the Federalist administration of John Adams and by a federal-court system also dominated by Federalists).  In his Kentucky Resolutions, Jefferson went a step further and declared that the states could “nullify” federal laws they regarded as unconstitutional. 

                    Today, the states still serve as a vital check on the powers of the national government – and hence as bastions in defense of individuals’ liberty, against the abuse of federal powers.  Given the current political situation, the states (where the Republican Party still remains effective, either as a majority party or in opposition) may be the most important checks on abuse of power by the B.O. regime and the Democrats in Congress. 

                    With all the media attention devoted to the 2012 elections focused on B.O.’s reelection and the continuation of divided government at the national level (Democrats retaining control of the U.S. Senate and Republicans retaining control of the U.S. House), Republican success at the state level was perhaps the most under-reported political story of 2012.  As I discussed in my essay “Election 2012 Postmortem” (Nov. 10), Republicans retained control of the governorships of most states as well as most state legislatures.  Thirty states now have reform-minded, limited-government Republican governors, the most since 2000.  (Just a few years ago Republicans held only 22 governorships.)  In a majority of states, Republicans control one or both houses of the state legislatures.  In addition, the South is now nearly purely Republican, completing the historic switch of parties, as ten Deep South states have changed from total Democratic control (not many years ago) to total Republican control. 

    One-party dominance has grown at the state level, with 46 states having one-party control of their legislatures, the highest number since World War II, USA Today reported in a mid-December news article.  In 38 of those states, the ruling party also controls the governor’s mansion, as the article reports (and – a point not made explicitly in the article – a majority of those states are controlled by Republicans).  Moreover, in half of the states (25 of them), one-party control is strong enough to override their governor’s vetoes: in other words, 25 state legislatures have veto-proof majorities.  The article does report that a majority of those 25 states (16) are controlled by Republicans (“One-party dominance grows in states,” December 14).  

    As I also noted in my November 10 “Election 2012 Postmortem” essay (quoting Mike Schrimpf, a spokesman for the Republican Governors Association), Republican governors can act as “a firewall” against B.O.’s policies – an important function as states make major decisions on the implementation of the federal health-insurance control law (as noted in the section above).  Republican governors and other Republican government officials at the state and local level also can serve as a check against attempts by the B.O. regime to impose unconstitutional gun-control measures, in violation of citizens’ Second Amendment rights (as also noted above). 

    Many important, free-market-oriented reforms that are not yet politically feasible at the national level (because of Democrat opposition) have begun to be implemented at the state level.  Chief among these are labor-law reforms, particularly laws limiting the power of “public sector” (government worker) unions, whose rising health-care and pension costs are bankrupting the states.  Ironically, it was in Wisconsin – the birthplace of “progressivism,” the first state to allow government workers to unionize – that the most important recent reform occurred:  in 2011, the Republican-majority state legislature passed, and Republican Governor Scott Walker signed into law, a much-needed bill to reform government-worker unions which, among other things, raises state employees’ share of pension and health costs, limits union negotiations of wages, and prohibits unions from automatically deducting union dues from workers’ paychecks – all common-sense (and actually rather modest) reforms that are sorely needed in order to help reduce the skyrocketing costs of government workers’ pensions and health-care plans.  (For more on the Wisconsin reforms, see my 2011 “Spring Briefs” essay (Mar. 18, 2011) – and scroll down to the entry titled “Real Cheeseheads: Wisconsin Union Thugs and Their Democrat Allies.”)  Efforts by Big Labor and the Democratic Party to overturn the law in the courts failed – as did their effort to recall Governor Walker.   

    The success in Wisconsin has inspired other states – especially in the Great Lakes region, part of the nation’s “Rust Belt,” where manufacturing industries have been in steep decline due to rising labor costs – to follow with their own reforms.  Indiana, for example, became a “right to work” state – one that allows workers the freedom to decide for themselves whether or not to join a labor union.  (Indiana was the 23rd state to pass a “right to work” law.  Most states with such laws are in the South, and they have been experiencing economic growth despite the national recession.  As reported by economist Richard Vedder in a 2010 Cato Journal article, the data show a 23% higher rate of per capita income growth in right-to-work states.  And according to a recent report by the U.S. Census Bureau, between April 1, 2010 and July 1, 2012 a net total of nearly 809,000 Americans moved into one of the 22 right-to-work states from elsewhere in the U.S. – a continuation of the massive exodus of employees and their families from forced-unionism states that the Census Bureau has documented ever since it began tracking state-to-state domestic migration during the 1990s.)  

    Perhaps the most striking development in labor law at the state level has occurred in my original home state, Michigan.  As I reported in my “Election 2012 Postmortem,” Michigan voters on Election Day 2012 rejected two proposed constitutional amendments that were pushed by Big Labor: Proposal 2 would have guaranteed collective bargaining, prohibiting the legislature from passing any new laws that would have restricted unionization, such as a “right-to-work” law; and Proposal 4 would have mandated minimum wages and unionization of home health-care workers.  In the wake of these major defeats for Big Labor, Republican Governor Rick Snyder (a moderate Republican businessman) and the Republican-controlled legislature passed a law in mid-December making Michigan the 24th “right to work” state.  The time was right to break a long-standing tacit truce in Michigan politics on union rules: Michigan had the nation’s sixth-highest state jobless rate, at 9.1%, and one of the lowest rates of personal income growth between 1977 and 2011; meanwhile, Indiana now has a record number of businesses choosing to expand or set up in the state (220 companies, including Amazon and Toyota, which will invest $3.6 billion and create some 21,000 new jobs, according to the Indiana Economic Development Corporation).  As the editors of the Wall Street Journal put it, “it could be the best thing to happen to Michigan’s economy since the internal combustion engine” (“Worker Liberation in Michigan,” December 11).  (As a Michigan resident – a “Michigander” – for the first 25 years or so of my life, I am impressed by how radical a change this development is.  Labor unions, particularly the autoworkers’ union (the UAW), have controlled Michigan politics for the past fifty years or more – in other words, for most of my lifetime – until now!) 

    Republican governors and state legislators are also behind other important reforms at the state level – among them, tax reform.  Proposals to abolish income taxes, business taxes, and other unpopular levies (which also impede economic growth) – often offset by boosts in sales taxes – are under consideration in several states.  For example, Louisiana Gov. Bobby Jindal has proposed eliminating the state income tax (which has a top rate of 6% for incomes over $100,000), offset by an increase of 3 percentage points in the state sales tax, which would give Louisiana the nation’s highest sales tax rate (about 12% when state and local taxes are combined) (“GOP Govs push to reshape taxes,” USA Today, January 14).  Such a shift, from income taxation to a consumption tax, has been inspired by the experience of several states (mostly in the West or South, including Texas and Florida) that have no income tax and have had far more economic growth than states with income taxes.  In addition to promoting growth, such a change in tax policy also promotes fairness, by distributing the tax burden more evenly among all citizens. 

    Other important pro-liberty reforms that are occurring at the state level – because they are not possible at the federal level, either constitutionally or politically – are the gradual liberalization of drug prohibition laws (particularly the easing of marijuana prohibition) and the growing recognition of same-sex marriage.  Among the referenda voters approved on Election Day 2012 were proposals to legalize the recreational use of marijuana in two states (Colorado and Washington), as well as a proposal to legalize the medical use of marijuana in one more state (Massachusetts) – a growing trend (no pun intended) among the states in recent years.  Voters in 2012 also approved proposals to legalize same-sex marriage in three states (Maine, Maryland, and Washington), and voters in another state (Minnesota) defeated a proposal to bar same-sex marriage (by defining marriage as between a man and a woman).  Although a majority of states still bar same-sex marriage, the trend in recent years is for states to expand the definition of marriage to include same-sex couples.  Most of this reform has occurred by court order (state courts exercising their power of judicial review and enforcing state constitutional mandates for the equal protection of law) or by legislative action; the developing trend of having same-sex marriage approved by referenda signals an important sea-change in public opinion on this issue.  (For more on same-sex marriage – and why it’s an issue that not only libertarians but conservatives ought to embrace – see my essay “Marriage, American Style” (May 19, 2004).) 

    In fairness, I should note that these reforms are not being pushed by Republicans; indeed, Republicans generally are opposed to both developments.  But it should be noted that most Democrat politicians also are opposed to drug decriminalization and that many Democrat politicians are just as opposed to (or as reluctant to embrace) the legalization of same-sex marriage as are Republicans.  Both issues involve expanding individual liberty: the freedom to ingest into one’s own body whatever substances one chooses, and the freedom to marry the partner of one’s choice (whether of the opposite sex or the same sex).  As I argued in the final section of my “Election 2012 Postmortem” essay, the Republican Party ought to more fully embrace both same-sex marriage and drug-law reform – to be fully consistent with its historical commitment to individual freedom and equality under the law for all persons. 

    In general, I believe Republicans should be more consistent, not less, in their principles of limited government and in their defense of individual freedom and responsibility, across the broad range of issues (“social” as well as “economic” – really a false dichotomy, as freedom has an unlimited number of aspects).  To distinguish themselves as the party of freedom and responsibility, the party of truly limited government, Republicans ought to become more libertarian (and less socially conservative).  Republicans should unabashedly champion the freedom and responsibility of individuals – across the board, on all major issues.  Consider, for example, efforts by the Nanny State-ists (a subset of whom I call the “health fascists”) to ban smoking and the consumption of “unhealthy” foods that allegedly promote obesity and ailments such as heart disease.  (A striking example is the set of policies being pushed by New York City Mayor Michael “Big Brother” Bloomberg, the most recent of which is a directive prohibiting stores, restaurants, and other business establishments in the city from selling sugary soft drinks in cup sizes larger than 16 ounces.)  Republicans should take the lead in opposing such proposals; let the Democrats be the party of paternalistic government.       

                At the beginning of this section I mentioned Madison’s and Jefferson’s Virginia and Kentucky Resolutions of 1798.  The historical context of those documents reveals close parallels between the predicament in which the United States found themselves in 1798 and the current political situation.  A political party (the Federalists), with views out-of-step with the libertarian principles of the American Revolution (the Federalists’ “Old World” aristocratic paternalism) as well as the libertarian sentiments of the majority of the American people, was in control of all three branches of the national government (the presidency, both houses of Congress, and virtually the entire federal judiciary).  Under the pretext of a national “crisis” (the pseudo-war with France, in the summer of 1798), the controlling political party pushed through Congress blatantly unconstitutional legislation (the Alien and Sedition Acts), designed to silence the opposition political party (the Jeffersonian Republicans).  While some within the opposition party considered open rebellion or even secession, the leader of the party (Thomas Jefferson) cautioned them to be patient.  “A little patience, and we shall see the reign of witches pass over, their spells dissolved, and the people recovering their true sight, restoring their government to its true principles,” he wrote.  (Letter from Jefferson to John Taylor of Caroline, June 1, 1798).  And in the 1800 elections, Jefferson’s Republican Party won control of both houses of Congress as well as the presidency, inaugurating what Jefferson called the “Republican revolution” of 1800. 

                Today, the “reign of witches” – left-liberal Democrat “progressives” – also will pass, as it must, if the United States is to continue to be the beacon for individual freedom in the world.  Abraham Lincoln, whose Republican Party in the 1850s derived its name from Jefferson’s Republican Party of the 1790s (and who once said that all his political principles derived from those of the author of the Declaration of Independence), observed in a similar time of crisis:  “We shall nobly save, or meanly lose, the last, best hope of earth” (Lincoln’s message to Congress, December 1, 1862).  That’s the fundamental issue the American people face in 2013 and beyond.


     | Link to this Entry | Posted Thursday, February 7, 2013.  Copyright © David N. Mayer.

    2013: Prospects for Liberty (Part II) - January 31, 2013


    2013: Prospects for Liberty


    Part II


    Continuing my sometimes-annual January blog essay on “The Prospects for Liberty” in the coming year, I’m emphasizing again this year what I have called “the tyranny of bullshit.”  As discussed in Part I, the single greatest threat to individual freedom today is paternalism – the expanding 20th-century regulatory/ welfare state, which deprives individuals of the freedom (and the responsibility) to govern their own lives.  Here in the United States, given the results of the 2012 elections (with the party advocating expansion of the welfare state, the Democrats, continuing their control over the presidency and the U.S. Senate), further expansion of the welfare state – and with it, the loss of more and more individual freedom – will be a real threat for the next several years.  Americans’ best hope lies with the Republican opposition, centered not only in the U.S. House (where, as a result of the 2012 elections the GOP maintained the majority it had won in 2010) but also in state governments (where the GOP actually gained control of more governorships and more state legislatures) – and in the ability of Republicans to adhere to their limited-government principles and to unite in opposition to the schemes of B.O. and the Democrats. 

    The policies that will result in the expansion of the “Nanny State” are based on nothing more than bullshit:  bullshit rationalizations offered by paternalistic politicians, bullshit believed by a majority of the gullible public, the fools who voted for the politicians “whose sole qualification to rule [us] was their capacity to spout the fraudulent generalizations that got them elected to the privilege of enforcing their wishes at the point of a gun,” to quote Ayn Rand’s apt definition of politicians.  Unlike other forms of tyranny in the past, the tyranny of bullshit is not based on the use of force against an unwilling people:  it’s based instead on fraud, fraud committed by the politicians who spout the bullshit theories that persuade a gullible majority of the people to put the collars around their own necks.  

    Part I focused on the reelection of B.O., the current “Occupier” of the White House and the Bullshitter-in-Chief, whose policies pose the most tangible threat to Americans’ freedoms in the near future (and the most tangible manifestation of “the tyranny of bullshit”).  I discussed the reasons why B.O. was reelected to a second term and, in general, the challenges facing those of us who oppose his policies – and the hope we have of defeating, or at least thwarting, his efforts to implement them.   

    Since I posted Part I, Americans have seen the spectacle of B.O.’s second inaugural – which in its style or symbolism illustrates the monarchy into which B.O. and his supporters are trying to “transform” the American presidency, and which also in its substance shows B.O.’s truly radical leftist (socialist/fascist collectivist) agenda, couched in traditionalist rhetoric.  (Let’s put it this way: Beyoncé’s lip-synched rendition of the national anthem was just the tip of the iceberg with regard to Inauguration Day phoniness.)  B.O.’s speech plainly showed his Orwellian determination to be “flexible” in his second term – meaning NOT that he’ll be willing to compromise with his political opponents but rather that he’ll feel free to push his agenda unrestrained by the “chains of the Constitution,” by political opposition (even within his own party), and even by public opinion.  That’s why it is critically important that the Republicans in Congress (and in statehouses around the U.S.) not only oppose him but also effectively and successfully oppose him and his policies, getting public opinion on their side.  (It’s also important that Republicans be more consistent limited-government conservatives, or libertarians, especially at the state and local level, as I will discuss in the final section of Part III.) 

                In this second part of the 2013 New Year’s essay (and in the third and final part, to be posted next week), I’ll discuss some of the various types of bullshit propagated by B.O. and the Democrats that threaten Americans’ freedom – and the nation’s well-being.  Within each category, I’ll also discuss the challenges that those of us who oppose B.O.’s policies face – and also the hope we have in successfully meeting those challenges and thus defending Americans’ liberty.




    “Fairness” and “Social Justice” Bullshit:

    Threats to Liberty from Redistributionist Taxation


    One of the worst – and, unfortunately, one of the easiest – ways in which government may deprive persons of their liberty is to levy taxes.  Historically, abuse of the taxation power has prefaced tyrannical government.  King Charles I’s so-called “Eleven Years’ Tyranny,” from 1629 to 1640, when he ruled without Parliament, imposing taxes on the English people without their consent, led to the English Civil War of the 1640s and the Revolution of 1649.  Similarly, the complaint about “no taxation without representation” – the levying of taxes on colonists in America by the British Parliament, where Americans were not represented and thus could not give their consent – was among the leading issues that led to the American Revolution. 

                “Taxation is theft!” is a popular mantra among some libertarian activists, and it rings true mostly when applied to the federal income tax.  As I observed in my essay “Those Damn Taxes” (April 15, 2004), 

    In a real sense . . . the federal income tax IS a form of theft:  it forcibly takes money from those who earn it and then spends that money, not on the legitimate functions of the national government as enumerated in the U.S. Constitution, but on a variety of programs most of which are unconstitutional and illegitimate.  Indeed, given that the majority of federal government spending today is on so-called “entitlement programs,” it is fair to say that most of the money Uncle Sam takes out of Americans’ paychecks is theft:  rather than being spent to protect their rights, it is being redistributed to others – to finance programs that benefit others, or as direct payments to others.  Taking wealth by force from those who have earned it and transferring it to others who have not earned it is precisely the definition of theft or robbery.  Seen in this light, Uncle Sam is no better than an ordinary highway robber, except he has the aura of legitimacy about him because his taxing authority ultimately derives from “the consent of the governed,” those whose hard-earned wealth he takes, who have consented to the robbery through the tax laws passed by their representatives in Congress.


    If the Republicans in Congress fail to thwart B.O.’s schemes, the abuse of the federal taxing power – theft by taxation and redistribution of wealth – will not only further undermine Americans’ freedom but also seriously jeopardize the nation’s prosperity.  

                Since well before his election in 2008, B.O. has made no secret of his desire to raise taxes on those Americans he calls “the richest” or “the wealthiest” (which he has defined variously, sometimes simply as “millionaires” or “billionaires,” sometimes as any individual who annually earns more than $200,000, or couples who earn more than $250,000).  And he also has made no secret of the reason why he wants to tax upper-income Americans more: to produce what he considers “fairness,” “economic justice,” or “social justice,” by forcibly confiscating the wealth produced by the more successful individuals and redistributing it to the less successful, or unsuccessful – in other words, to redistribute wealth from the producers to the looters, from the “makers” to the “takers” – and, in the process, to make everyone more dependent on the government.  During the Democrats’ 2008 primary contest, in a debate with Hillary Clinton moderated by ABC’s Charlie Gibson, B.O. was asked why he supported an increase in the capital-gains tax even though it would decrease government revenues.  (As history has shown, cuts in capital-gains rates always have raised revenues, while increases in rates have caused revenues to decline.)  B.O.’s remarkable answer (documented in a YouTube video clip) was that he’d raise the tax rate anyway, for the sake of “fairness,” because hedge fund managers (in his opinion) make too much money.  And on the 2008 campaign trail, in a famous exchange he had with an Ohio small businessman, Joe Wurzelbacher (“Joe the Plumber”) (also documented on a YouTube video), B.O. admitted that the goal of his economic policy was to “spread the wealth around.”  To B.O., the federal government’s power to levy taxes should not be confined to its legitimate, constitutional authority to raise the revenues necessary to fund the (legitimate) functions of the U.S. government; rather, that awesome power should be used to control individuals’ behavior and ultimately to transform society itself. 

                Although B.O.’s apologists deny that he’s a “Marxist” or “socialist,” it’s undeniable that he uses Marxist or socialist rhetoric and that his political views have been shaped throughout his life by true Marxists – starting with his father, Barack Hussein Obama, Sr. (as documented in B.O.’s own autobiography, Dreams from My Father), continuing through his formal education (in high school, college, and law school) by a number of Marxist teachers and professors, as well as his training by radical leftist activists in Chicago.  Peter Ferrara, budget policy director for the free-market Heartland Institute, says only one thing explains B.O.’s obsession with soaking the rich.  “He’s a Marxist – that is the only logical explanation of [B.O.’s] rhetoric.  And it is 100% consistent with his own published background” – including, one might add, the statements B.O. has made on the campaign trail.  Indeed, as I’ve previously observed, the very slogan the B.O.-Biden reelection campaign chose, “Forward!,” is a term with strong Marxist (Soviet Communist, to be precise) connotations. 

                The insightful columnist Michael Barone – who coined the term “Gangster Government” to describe the Chicago-style “thugocracy” that has come to Washington with B.O.’s regime – has contended that in pursuing his goal of wealth redistribution, B.O. really doesn’t care if raising taxes and imposing more regulations on business tips the economy back into recession.  “Economic growth produces things Obama doesn’t like,” Barone maintains.  “Some people get rich.  Obama prefers a more equal income redistribution.”  Through a series of incremental steps and complicated, painstaking rules – “think ObamaCare” – B.O. is injecting into the American economy what Barone calls “soft despotism,” borrowing the term from Alexis de Tocqueville, the young French aristocrat who visited the United States in the early 1830s and published his profound observations about American society in his classic book, Democracy in America.  Quoting Tocqueville, Barone notes that under soft despotism, government increasingly takes control over all aspects of individuals’ lives: 

    “. . . an immense tutelary power is elevated, which alone takes charge of assuring their enjoyments and watching over their fate. It is absolute, detailed, rigid, far-seeing and mild. It would resemble paternal power if, like that, it had for its object to prepare men for manhood; but on the contrary, it seeks only to keep them fixed irrevocably in childhood; it likes citizens to enjoy themselves. It willingly works for their happiness; but it wants to be the unique agent and sole arbiter of that.”


    Ultimately such soft despotism reduces the nation to “nothing more than a herd of timid and industrious animals of which government is the shepherd” (“Reversing Obama’s Soft Despotism,” National Review Online, Feb. 20, 2012). 

                As I have frequently told my students, the term social justice is a leftist code word for what, in fact, amounts to socialist injustice:  penalizing the successful simply because they’re successful, taking from them the wealth they’ve earned and using the coercive power of government to redistribute it to those who haven’t earned it.  To call such socialist injustice “justice” – whether “social justice” or “economic justice” – is to turn the concept of justice on its head.  It’s another example of the use of “Orwellian” language by B.O. and his regime. 

                Similarly, to call for additional taxes on “the rich” in the name of promoting “fairness” is also Orwellian:  it truly distorts the concept of fairness for, in fact, it is quite unfair.  As I discussed in my 2004 essay on “Those Damn Taxes,” the current federal income tax system is “fundamentally unfair to high-income Americans, who are paying far more than their ‘fair’ share, under any measure of fairness”: 

    Under the tax code’s so-called “progressive” rate structure, a minority of taxpayers in the upper income brackets are forced to pay the lion's share of federal income taxes.  The wealthiest 1% of taxpayers constitute 21% of the total income earned and pay 37% of federal income taxes; the top 5% constitute 35% of total income and pay a majority of federal income taxes, 56.5%.  In contrast, those in the bottom 50% pay a mere 4% of all federal income taxes (IRS, Statistics of Income Division, September 2002). 


    The famous (or infamous, depending on one’s perspective) “Bush tax cuts” – that is, the reductions in individual federal income tax rates legislated by Congress in 2001 and 2003, during George W. Bush’s presidency – actually exacerbated the disparity.  Taking into account the changes in exemptions, deductions, and credits which disproportionately benefited lower-income taxpayers, wealthier Americans are now paying an even higher percentage of total federal income taxes.  According to IRS data for calendar year 2009, the top 1% earned 17% of the nation’s income but paid nearly 37% of federal income taxes; the top 5%, 32% of income and nearly 59% of taxes.  In contrast, the bottom 50% earned 13.5% of the nation’s income but paid only a bit over 2% of federal income taxes; and nearly half of all Americans do not pay any federal income taxes at all.  What’s so “fair” about that? 
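A reader who wants to check the arithmetic behind these share comparisons can do so directly.  The following minimal sketch (a hypothetical illustration of mine, using only the IRS calendar-year-2009 figures quoted above; the group labels and rounding are my own) computes each group’s ratio of federal income taxes paid to income earned – a ratio above 1.0 means a group pays a disproportionately large share of the tax:

```python
# Illustrative arithmetic only: the IRS 2009 figures quoted in the text.
# "income_share" is each group's share of the nation's income (percent);
# "tax_share" is its share of total federal income taxes paid (percent).
groups = {
    "top 1%":     {"income_share": 17.0, "tax_share": 37.0},
    "top 5%":     {"income_share": 32.0, "tax_share": 59.0},
    "bottom 50%": {"income_share": 13.5, "tax_share": 2.0},
}

for name, g in groups.items():
    # A ratio above 1.0 means the group pays proportionally more in
    # taxes than its share of national income; below 1.0, less.
    ratio = g["tax_share"] / g["income_share"]
    print(f"{name}: pays {ratio:.2f}x its share of national income")
```

Under a perfectly proportional system, every ratio would equal 1.0; the spread here (roughly 2.18 for the top 1% versus 0.15 for the bottom 50%) is the disparity the text describes.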

                As the Wall Street Journal noted in an editorial on November 20, 2002, “The Non-Taxpaying Class,” the progressive income tax has created “a nation of two different tax-paying classes: those who pay a lot and those who pay very little.  And as fewer and fewer persons are responsible for paying more and more of all taxes, the constituency for tax cutting, much less for tax reform, is eroding.  Workers who pay little or no taxes can hardly be expected to care about tax relief for everybody else.”  To that statement ought to be added the additional fact, as noted above, that the income tax also has created a nation of two different classes of Americans:  those who pay NO taxes at all (which is fast approaching half of the population), and then, among the taxpayers, the two classes identified in the Journal editorial.  Again, what’s “fair” about that? 

                The unfairness of the “progressive” income tax, in its rate structure, is further compounded by the ways in which federal revenues are spent.  The national “welfare state” that has grown steadily since FDR’s “New Deal” of the late 1930s – and which has grown by leaps and bounds, to an unprecedented, dangerously high (and unsustainable) level under B.O. (as the next section will more fully discuss) – transfers wealth earned by productive Americans to the “beneficiaries” of government welfare programs (not only “poor” individuals but also, increasingly, middle-class individuals and even business entities, the “beneficiaries” of “corporate welfare,” the cronyism that also has grown under B.O.’s regime).   

                The massive redistribution of wealth that results from federal tax-and-spend policies is not only unfair or unjust but also unconstitutional: it violates the basic principle of equal protection of the laws, which was enforced in early American constitutional law (from the 18th century until the first third of the 20th century) by the unwritten constitutional prohibition against “class legislation” – laws that confer special benefits or impose special burdens on one class of citizens as opposed to others.  Since the growth of “Big Government” at the federal level in the last two-thirds of the 20th century – since the income tax came to be the chief source of federal government revenues – the courts have failed to enforce this prohibition of class legislation, or constitutional guarantee of equal protection (as it was originally understood).  “A law that transfers property from A and redistributes it to B” – in other words, from one class of citizens to another – which, historically, epitomized the worst kind of unconstitutional law, contrary to the fundamental principles of a republican form of government, has now become the most common type of federal law.  The Sixteenth Amendment (authorizing Congress to tax incomes), coupled with federal redistributionist “welfare state” spending, made it all possible. 

                Rather than supporting true tax reform, which would simplify the bloated federal income tax code by moving toward a “flat” (or at least a flatter) rate system (which even Russia has implemented, to promote both fairness and efficiency), B.O. would increase the complexity of the tax code by using it to more fully “spread the wealth” – that is, to further compound its existing unfairness or injustice, penalizing those who are successful simply because they’re successful.  And – it cannot be emphasized enough – he’s willing to do so even though it means a decrease in federal revenues and a further dampening of the economy, because of his perverted understanding of the concept of “fairness.”   

                Perhaps nothing better illustrates B.O.’s perversion of the concept of fairness (or justice) – as well as his disregard for the Constitution and the limits it imposes on federal powers, including his own powers as president – than does the role he played late in the 2012 campaign season and during the month of December, with regard to the so-called “fiscal cliff.” 

                The term fiscal cliff was coined by someone – whether a political pundit, a politician, or a journalist, probably a Democrat – to describe the supposed “crisis” the United States would face on December 31, 2012 when, by law, the “Bush tax cuts” (that is, the lower rates for all income brackets that were enacted during George W. Bush’s presidency) would expire.  I say it was a “supposed crisis” because the “crisis,” such as it was, was not an economic or fiscal crisis but entirely a political crisis, caused by Washington politicians – originally caused, specifically, by Democrat members of Congress who in 2003 refused to make the Bush tax cuts permanent.  Ever since, Democrats “have used the threat of tax hikes on a broad swath of Americans as a political bludgeon against the Republicans,” as the editors of Investor’s Business Daily noted in their New Year’s editorial (“Five Challenges in 2013 that Dwarf ‘Fiscal Cliff’,” January 2). 

                B.O. played a critical role in creating the phony “fiscal cliff” crisis by his abuse of the veto power:  his threat to veto any bill that would continue the “Bush tax cuts” (that is, the lower Bush-era tax rates) on “wealthy” Americans, which he defined as individuals earning $200,000 or more annually (or couples earning $250,000 or more) – in keeping with his dual campaign promises, not to impose additional taxes on the “middle class” (a promise that B.O. had broken, long before the 2012 election, as noted below) and to raise taxes on the “wealthy,” to make them pay their “fair share” (which is to say, to make them pay a share that’s even more unfair, as noted above).  This threat was an abuse of the veto power because, as originally meant by the framers and ratifiers of the Constitution, the veto was to be used by presidents only to prevent the enactment of unconstitutional legislation.  That limited purpose of the veto was the way it was used for the first 150 years or so of the presidency.  Starting with FDR, however, modern presidents of both major political parties have used (or threatened to use) the veto for political purposes, to force their policy choices on Congress (because both houses of Congress would have to muster two-thirds votes to override a presidential veto).  Hence, B.O.’s threatened use of the veto power was the means by which he attempted to dictate tax policy to Congress – and particularly to the Republican-controlled House of Representatives which, as the Constitution specifies, is given the exclusive power to initiate tax legislation (because it’s the branch of government that directly represents “the people,” including the taxpayers). 

                The so-called “fiscal cliff” over which the United States would fall, if Congress were to fail to reenact the Bush-era tax rates, would have been the single largest tax increase in U.S. history.  It would hit all income brackets, including the “wealthy,” who, as owners of small businesses, are the most important job creators in the national economy.  The result would surely have been an end to the so-called “recovery”: the United States would again plunge deep into economic recession, with massive hikes in unemployment.  As all intelligent economists agreed, to impose such a massive tax hike at a time when the economy was still sluggish would be sheer folly – tantamount to a deliberate act to ruin the American economy. 

                And yet – it cannot be emphasized enough – that’s what B.O. was willing to hazard, through abuse (or threatened abuse) of the veto power, in his obstinate insistence on the expiration of Bush-era rates (“tax cuts”) for taxpayers above the $200,000/$250,000 level.  B.O. was utterly inflexible on this point, as were the Democrats in Congress (the minority in the House, led by former Speaker Nancy Pelosi, and the majority in the Senate, led by Harry Reid).  The Republican majority in the House, earlier in 2012, had passed a bill continuing all the Bush “tax cuts” – that is, continuing and making permanent the lower Bush-era tax rates for all income brackets – but that bill was declared dead on arrival in the Senate (in other words, it would never come up for a floor vote) by Harry Reid.  And even if the Senate were somehow (say, through a coalition of the Republican minority and Democrat “moderates”) to consider and pass the House bill, that was the measure B.O. adamantly insisted he’d veto.  B.O. and the Democrats in Congress were engaging in economic extortion: quite literally, they held hostage the “tax cuts” (continuation of the Bush-era reductions in tax rates) for all “middle-class” Americans (taxpayers earning below the $200,000/$250,000 level), at the price of tax increases on the “wealthy” (the upper 2 percent of taxpayers).  In other words, the Democrats were willing to hold hostage the tax liability of 98 percent of Americans, in order to impose higher taxes on the “wealthy” – the most productive 2 percent – all in the name of supposed “fairness.” 

                The true cost of B.O.’s obstinacy over tax rates was not just the higher taxes that all Americans would face but also the economic uncertainty that businesses have faced for the past year or more.  Not knowing at what rate their incomes would be taxed after December 31, 2012, small business owners understandably have not risked adding new employees – and this dampening of business growth has been a major reason why unemployment rates have remained high.  (As noted in the section on health care, below, the added costs imposed on small businesses by “ObamaCare” were another important factor.)  As Treasury Department data show, about 4.8 million American small businesses employ workers, but only 1.2 million of them earn the vast majority (91%) of all the small business income.  These are B.O.’s “rich”: they make up just 3% of all small businesses, but they employ a stunning 54% of the total private U.S. workforce.  They are the nation’s job creators – and the victims targeted by B.O.’s tax policy (“Avoid Tax Hikes on Job Creators,” I.B.D., Nov. 11, 2012). 

                When B.O. said, “We are just asking the rich to pay a little more” – one of the catchwords or phrases he used repeatedly on the campaign trail in 2012 – he insulted the intelligence of the American people, as economist Thomas Sowell noted in an op-ed exposing many of the “lies” atop the “fiscal cliff.”  “The government doesn’t `ask’ anybody to pay anything; it orders you to pay the taxes it imposes and you can go to prison if you don’t.”  Another huge lie is that tax increases on “the rich” would help solve the federal deficit problem.  Raising the tax rates on the top 2% – forcing America’s top earners to pay more taxes – “will not get enough additional revenue to run the government for 10 days,” Sowell notes.  “Taxing `the rich’ will produce a drop in the bucket when compared with the staggering and unprecedented deficits of the Obama administration.  No previous administration in the entire history of the nation ever finished the year with a trillion-dollar deficit.  The Obama administration has done so every single year.  Yet political and media discussions of the financial crisis have been focused overwhelmingly on how to get more tax revenue to pay for past and future spending.”  And perhaps B.O.’s biggest lie, “repeated over and over, largely unanswered,” is that President Bush’s “tax cuts for the rich” cost the government so much lost tax revenue that this added to the budget deficit – “so that the government cannot afford to allow the cost of letting the Bush tax rates continue for `the rich’.”  That argument is completely false.  The truth is that tax revenues went up – not down – after tax rates were cut during the Bush administration, and the budget deficit declined, year after year, after the Bush tax cuts.  What caused the decreasing budget deficits after the Bush tax cuts to suddenly reverse and start increasing was the mortgage crisis of 2008.  
“So it is sheer hogwash that `tax cuts for the rich’ caused the government to lose tax revenues” (“Sifting Through Lies Atop the Fiscal Cliff,” I.B.D., December 4).   

                Nevertheless, the Democrats’ extortion – use of “the threat of tax hikes on a broad swath of Americans as a political bludgeon against the Republicans,” as the I.B.D. editors aptly put it – unfortunately, “worked,” thanks in part to the role played by the leftist mainstream (or “lamestream”) news media, in their role as cheerleaders for B.O.’s reelection.  “Despite being an entirely Democratic contrivance, the fiscal cliff is blamed by a majority of Americans on the GOP – kind of like blaming the firemen who try to put out a fire for starting the blaze in the first place” (“Five Challenges in 2013 that Dwarf `Fiscal Cliff’,” January 2).   

                Democrats and their media lapdogs similarly demonized Grover Norquist, president of Americans for Tax Reform, who has encouraged Republican congressmen and senators – more than 90% of Republican members of the House and Senate – to stick to the Taxpayer Protection Pledge they made to their constituents: a pledge to (1) “oppose any and all efforts to increase the marginal income tax rates for individuals and/or businesses” and (2) “oppose any net reduction or elimination of deductions and credits, unless matched dollar for dollar by further reducing tax rates.”  Created as part of the effort to pass Ronald Reagan’s Tax Reform Act of 1986 (which reduced marginal tax rates and pared back deductions and credits so that tax reform was pro-growth and revenue neutral), the pledge “protected tax reform from becoming a Trojan Horse for tax increases.”  George H.W. Bush signed the pledge and won the 1988 election with his promise, “Read my lips: no new taxes.”  Bush broke his word and made a 1990 deal to raise taxes – a deal on which Democrats reneged by breaking their promise to cut spending $2 for every $1 of tax hikes – and Bush was defeated for reelection, for breaking his pledge to the American people.  Since 1990 no Republican in Congress has voted for an income-tax increase.  Between the Clinton tax hike of 1993, which passed with all Democrat votes, and the “ObamaCare” tax hikes of 2010, which also passed with only Democrat votes, there was a 17-year period with no income-tax increases, the longest such period in American history.  As Mr. Norquist observed, B.O. and the Democrats in Congress have been so critical of the pledge because it frustrates their scheme to “get Republican fingerprints on the tax hikes they need to transform America into a European welfare state with federal spending closer to 30% or 40% of GDP than the 21% average of the past 30 years” (“Kept Promises Keep Away Welfare State,” I.B.D., November 30).   

                As anyone who followed the political drama in December knows, B.O.’s meetings with House Speaker John Boehner, to “negotiate” a “compromise” to avert falling over the “fiscal cliff,” failed.  According to the Democrats’ talking points, as reiterated in the news media, those negotiations failed because of Boehner’s intransigence, which was blamed on those radical “extremist” “Tea Party” Republican members of the GOP caucus.  In fact, Boehner was willing to compromise – at one point, offering a $4.6 trillion deficit-reduction proposal that included $800 billion in new revenue to be achieved through closing “loopholes” and capping deductions for upper-income taxpayers, an offer that was quickly rejected by the White House, whose spokesman (communications director Dan Pfeiffer) complained (incredibly) that the GOP proposal “does not meet the test of balance” (meaning that it didn’t sufficiently soak “the rich”).  Boehner also considered bringing to a House vote his so-called “Plan B,” which would have increased tax rates only on those earning $1 million or more (essentially, calling B.O.’s bluff, because on the campaign trail in 2012 B.O. often described the “wealthy” not as those earning over $200,000 but rather as “millionaires or billionaires”) – an alternative that failed when Boehner realized he didn’t have sufficient votes among GOP members.  

                In his willingness to compromise with B.O. and the Democrats, Boehner was jeopardizing Republican unity in the House – and breaking that unity, many conservative commentators believe, was B.O.’s ultimate political goal.  In reality, it was B.O.’s intransigence – his obstinate insistence on the $200,000/$250,000 cut-off point for reduced tax rates – that doomed his talks with Boehner.  Compromise with B.O. proved to be impossible – not only because the so-called “negotiations” were not conducted in good faith by B.O. but also because, as many conservative commentators justifiably believe, B.O. is incapable of truly making a compromise.  He “doesn’t know how to do a political compromise,” noted Daniel Henninger in the Wall Street Journal.  “Where in his career did [B.O.] ever learn the art of the political deal?  Nowhere” (“Obama’s Ruinous Course,” December 6).  

                Only when B.O. and Boehner broke off their talks, and the negotiations were resumed by Senate Minority Leader Mitch McConnell (R.–Ky.) and his former Senate colleague, V.P. “Smiley” Joe Biden, was the impasse broken and a political deal – a real compromise – struck, to end the tax part of the “fiscal cliff” crisis.  As conservative commentator Jonah Goldberg has observed, “Perhaps the surest indication that Washington operates on the kooky side of the looking glass now is the sudden, even disturbing revelation that the sanest, most grown-up and responsible person in the White House” is Biden, who “has proved more useful, realistic, and constructive in a budget crisis than his boss” (“So Begins the Biden Presidency,” USA Today, January 3).  (Incidentally, I used to refer to Biden by the nickname “Loose Lips” because of his propensity to commit verbal gaffes.  But as I explained in “2012 Tricks and Treats” (Oct. 25), I’ve changed his nickname to “Smiley” because of his boorish behavior in the V.P. candidates’ debate against Paul Ryan.)   

                As Jonah Goldberg added, it seemed that B.O. even tried to sabotage the Biden-McConnell negotiations with the speech he gave at the White House, “something like a campaign event in front of a cheering audience of supporters,” in which he mocked Congress during the most delicate moments of his own administration’s negotiations.  B.O. even gloated that if Republicans accepted the deal, it would amount to a surrender on taxes – making it politically harder for them to do so.  Fortunately, “Republicans didn’t take the bait, but it was clear [B.O.] was baiting them” – making it seem “as if the stunt was designed to sabotage a deal he allegedly wanted.”  Goldberg is right – especially in using the word allegedly.  (B.O. attempted to pull the same trick this week:  just after a bipartisan group of U.S. senators announced a tentative plan for immigration reform, B.O. jetted off to Las Vegas, where he delivered a speech ostensibly seizing the initiative but really intended to sabotage any bipartisan deal.)     

                Notwithstanding the conventional wisdom, from left-liberals and conservatives alike, that B.O. “won” the fiscal cliff tax battle, his supposed victory was not complete.  B.O. failed to achieve his stated goal of raising income tax rates on “the richest” Americans, defined as those earning over $200,000 as individuals or $250,000 as couples: the choice of $400,000/$450,000 as the cutoff was a true compromise (one that B.O., for the first time in his life, was forced to accept, because of political circumstances).  More importantly, B.O. failed to achieve his real goal – his secret goal – of having all the Bush-era tax rates expire, so that tax rates would increase on all taxpayers (including the middle class), which he’d blame on the Republicans.  By accepting the compromise negotiated by Biden and McConnell, Congressional Republicans successfully thwarted B.O.’s secret scheme – which is why he didn’t stay in Washington to sign the bill into law but instead resumed his family vacation in Hawaii, signing the bill remotely by directing staff members to use the White House autopen. 

                Nevertheless, the “fiscal cliff” tax deal was really a win for Big Government – and hence a loss for all Americans.  The ironically-titled “American Taxpayer Relief Act” hikes taxes on the most productive Americans, exacerbating the injustice of the federal income tax and practically guaranteeing continued economic stagnation and higher unemployment.  Economist Ralph Reiland cogently summed up the Act in a recent op-ed in Investor’s Business Daily.  First, as noted above, the top personal income tax rate increases from 35% to 39.6% for taxpayers with more than $400,000 (or couples filing jointly with more than $450,000) in taxable yearly income.  The federal estate tax jumps from 35% to 40% on estates over $5 million (another minor disappointment for B.O., who wanted a 10-point hike in the estate tax, from 35% to 45%).  “At a time when business investment is weak and trillions are sitting on the sidelines,” the tax on capital gains increases from 15% to 23.8% for individuals making more than $200,000 a year and couples earning more than $250,000 – a new 20% statutory rate plus the new 3.8% ObamaCare surtax on investment income – making it the highest tax rate on capital gains in 17 years.  “Adding further disincentives to new investment at a time when unemployment is high, the rate of poverty is up and the economy is sluggish,” the tax on dividends increases from 15% to 23.8% for high earners (again counting the new 3.8% ObamaCare surtax on investment income). 

                Because part of the deal also included expiration of the temporary two-percentage-point cut in the Social Security payroll tax, income earners who aren’t among “the rich” also get their taxes raised.  Households with earnings between $40,000 and $50,000 will see an average tax increase of $579, according to the nonpartisan Tax Policy Center in Washington.  Households with earnings between $50,000 and $75,000 will face an average yearly tax hike of $822.  As Professor Reiland observes, “calling all this the American Taxpayer Relief Act is Orwellian.”  Citing George Orwell’s novel 1984 (a dystopia whose state-enslaved citizens were indoctrinated to believe the mantra “War Is Peace, Freedom Is Slavery, Ignorance Is Strength”), he explains: “In our time, it’s `Tax Hikes Are Tax Relief.’”  

                B.O. asserted that the deal “protects 98% of Americans and 97% of small business owners,” but that assertion is doubly misleading.  First, the “98%” overlooks the hikes in the payroll tax and fails to take into account the job losses likely to result from an estimated $620 billion reallocation of wealth from the income-producing private sector to the government over the next decade.  Second, B.O.’s oft-repeated line of protecting “97% of small business owners” fails to mention that the other 3% of small business owners who are targeted for higher taxes employ 54% of the total private workforce in the U.S. (Ralph R. Reiland, “Fiscal Cliff Deal Was a Win for Big Government,” I.B.D., January 9).  

                If B.O. and the Democrats in Washington get their way, the tax hikes won’t stop here. As the “fiscal cliff” tax deal was being finalized, B.O. told a White House rally that “revenues have to be part of the equation” in ongoing budget talks, bluntly noting: “If Republicans think that I will finish the job of deficit reduction through spending cuts alone, then they’ve got another thing coming.”  Then, after the deal was made, he talked about how “cutting spending has to go hand-in-hand with further reforms to our tax code” that take more money from “the wealthiest corporations and individuals.”  So, as the editors of Investor’s Business Daily note, “after getting a deal that includes only tax hikes and no spending cuts, [B.O.] will demand that any future spending cuts come with still more new taxes.”   Some Democrats in Congress are talking not only about still higher tax rates on “the rich” (notwithstanding the fiscal cliff deal), but also about eliminating certain credits and deductions, taxing employer-provided health coverage, and such new federal taxes as a “carbon tax” and a value-added tax (VAT), the favorite ploy to raise revenues in European welfare states.  

                “On policy grounds, it makes no sense,” the editors observe.  “The long-term deficit problem is entirely caused by out-of-control spending, not inadequate taxes” – as the next section in this essay will show.  “And, as the fiscal cliff deal shows, there aren’t enough rich people to finance [B.O.’s] expensive spending ambitions” (another matter more fully discussed in the next section).  Politically, he’ll find it harder to get any new tax hikes through Congress “and he’s lost his best political leverage – the threat that taxes on the middle class would go up if he didn’t get his way – as soon as he agreed to make [most of] the Bush tax cuts permanent.”  But none of that seems to matter to B.O., “whose unquenchable thirst for higher taxes has finally been exposed” (“A Tax Appetite Barely Whetted,” January 3). 

                Still, B.O.’s fiscal cliff “victory” has the hallmarks of a classic Pyrrhic victory.  Just what did he achieve with his $620 billion tax hike on the wealthy and the boost in the payroll tax rate?  First, the higher taxes will hurt the economy, further slowing the already sluggish growth.  Moody’s Analytics chief economist Mark Zandi says that the higher taxes from the “fiscal cliff” deal will cut close to 1 point off GDP growth this year and result in 600,000 fewer jobs.  Pantheon Macroeconomic Advisors chief economist Ian Shepherdson figures the deal will cut GDP by 1.5 points.  Second (as the next section more fully discusses), the tax hikes will do nothing to fix the debt crisis.  Like other tax hikes in recent U.S. history, they won’t produce as much revenue as expected; and worse, by slowing economic growth, they will make tackling the nation’s debt crisis all the harder.  And finally, as noted above, B.O.’s “victory” may have come at a steep political price: 

    “By making the Bush tax cuts [or most of them] permanent, [B.O.] lost the best leverage he had in forcing the Republicans to accede to tax hikes.  That hasn’t stopped him from saying he wants to raise taxes on the rich even more.  But what makes him think he’ll have a better chance of getting tax hikes through the GOP-controlled House now?  After all, Republicans can correctly claim they’ve already given the president what he wanted on taxes, and that it’s now [his] turn to cough up all those `balanced plan’ spending cuts he’s been promising.”


    (“Pyrrhic Victory on Fiscal Cliff,” I.B.D., January 4).



    Keynesian “Stimulus” Bullshit:

    Threats to Liberty from Profligate Government Spending


                Anyone who followed the political drama in Washington, D.C. at the end of the 112th Congress, as 2012 ended and 2013 began, knows that the U.S. financial crisis was not resolved by the “fiscal cliff” deal on federal tax rates.  Republicans, thinking wishfully, may believe that new taxes are no longer on the bargaining table, but as I’ve noted in the previous section, B.O. and the Democrats will continue pushing for more taxes, and not just on “the rich.” 

                Expiration of the Bush-era tax cuts was only part of the “fiscal cliff” that the U.S. government faced on December 31, 2012.  The other unresolved (and arguably far more troublesome) parts of the crisis are, first, the $109 billion in automatic across-the-board spending cuts – so-called “sequestration,” with half the cuts hitting the Defense Department – that were part of the August 2011 deal ending the deadlock over the last increase in the U.S. national debt ceiling; and second, the fast-approaching deadline for the U.S. once again to hit that debt ceiling.  Part of the January 2 “fiscal cliff” deal “kicked the can” of spending cuts further down the road, delaying for two months (until March) resolution of the “sequestration” issue.  And although the U.S. national debt has now officially passed the $16.4 trillion debt ceiling, Treasury Secretary Timothy Geithner has bought more time for the politicians by shifting around U.S. assets so that the debt-ceiling deadline similarly may be postponed by at least a couple of months.   

                Given the way the B.O. regime has been pushing its gun-control agenda (see the section in Part III), one would think that the nation’s fiscal crisis isn’t urgent.  That’s what B.O. and his acolytes in the news media would like Americans to believe, to prevent Congressional Republicans from seizing the initiative in what is really the greatest crisis the United States will face in B.O.’s second term: the combined crisis of the federal budget deficit and the national debt – a crisis that B.O. cannot blame on his predecessor, for it has been caused by the profligate spending policies and irresponsible recklessness of the B.O. regime and Democrats in Congress.  (Many Republicans in Congress aren’t much better than the Democrats when it comes to spending, but at least a fair number of Republicans in both houses of Congress understand the importance of the spending problem and have voiced support for spending curbs.)     

    During the first half of B.O.’s first term, in the 111th Congress (2009-10), when both houses of Congress were controlled by Democrats, Congress enacted three major pieces of legislation that were monstrous both in their size and complexity and in their disastrous effects on the U.S. economy.  One was the 2010 federal health-insurance regulatory law (so-called “ObamaCare”), which is discussed in the next section of this essay.  (For purposes of this section, it’s sufficient to note the devastating economic impact of “ObamaCare”: in addition to raising the costs of health insurance for individuals, it has added significantly to employers’ business costs, scaring them away from adding new jobs out of fear of getting hit with these exorbitant new costs.)  The second was the nearly $800 billion spending bill – the so-called “economic stimulus” legislation – that passed both houses of the 111th Congress (by a partisan, almost entirely Democrat vote) and was signed into law by B.O. in spring 2009.  The third was the “Dodd-Frank Wall Street Reform and Consumer Protection Act” (named after its two chief Democrat sponsors, former Senator Chris Dodd and former Congressman Barney Frank; commonly called the “Dodd-Frank” law, though I’ve nicknamed it “Frank-n-Dodd,” because it is indeed monstrous), signed into law by B.O. in July 2010. 

    These laws have proved to be massively expensive failures.  Rather than “stimulating” the economy, the distortions they introduced into the nation’s finances have impeded economic recovery – causing what I have called “B.O.’s recession.”  More importantly, continued profligate spending – with annual deficits over $1 trillion in each year of B.O.’s presidency – has pushed the U.S. national debt to truly dangerous levels over the past four years, making the fiscal soundness of the United States untenable.  No wonder the economy is in such a “mess” (to use the word frequently invoked by B.O., who has yet to acknowledge any responsibility for it). 

    Unemployment during B.O.’s first term rose to 10.2%, the highest jobless rate since early 1983 and only the second time since World War II that unemployment reached double digits.  Over the past year – during the supposed “recovery” – the official jobless rate has declined somewhat, but it still remains at nearly 8% (7.9%, higher than when B.O. took office).  That official rate, as calculated by the Bureau of Labor Statistics (BLS), obscures the fact that millions have given up looking for jobs and so aren’t being counted as unemployed.  If you account for the unprecedented drop in labor-force participation under B.O., the real unemployment rate today is 10.7%.  (Moreover, some economists argue that if “under-employed” persons and persons who have stopped looking for jobs were included in the unemployment numbers, the figure would rise to over 16%.)  Meanwhile, the pool of long-term unemployed was a staggering 4.8 million in December, which is 2 million more than when B.O. took office.  The average length of unemployment was 38 weeks – almost 20 weeks longer than four years ago and 15 weeks longer than when the recession supposedly “ended” in June 2009.   

    At the same time, real average weekly earnings have dropped about 1% over the past two years, according to the BLS.  Worse, low-paying jobs have been replacing the higher-paying ones lost in the recession, leaving median household income 7% below where it was when B.O. took office. 

    Most ominously, new data from the Bureau of Economic Analysis show that the U.S. economy – which had experienced meager growth for most of the past four years (for example, only 1.8% growth in 2011 and 2.2% in 2012) – actually has begun to shrink, experiencing negative economic growth (a decline of about 0.1 percent) in the fourth quarter of 2012. 

    The B.O. administration’s absurd claim that the “stimulus” law “saved or created” 600,000 to 1 million jobs has also been proven false – another example of outrageous B.O. bullshit.  (Not only was the administration’s theory bogus, but it turned out that many of the jobs claimed by the administration came from phony congressional districts.  Such fraudulent claims, perhaps, could be expected, given that the person who runs B.O.’s economic recovery program is actually named “G. Edward DeSeve” – an apt name for the “stimulus czar,” since his job seems to be to try to de-ceive the American people.) 

    Why has government “stimulus” spending been such a massive failure?  As I wrote four years ago (in my January 2009 essay, “2009: Prospects for Liberty,” and in my earlier essay, “The Myth of Market Failure,” Oct. 2, 2008), both Democratic and Republican politicians in Washington – not only the members of Congress and both the Bush and B.O. administrations, but also the regulators at the Federal Reserve and other agencies – reacted to the economic crisis of 2008–09 with misguided legislation and regulations that only made the economy worse.  That’s because public policy has been based on another bullshit theory – the myth that some sort of “market failure” (that is, a failure in the capitalist, free-market system) requires government intervention to “save” the economy.  The bullshit behind both the Bush “bailout” (the TARP program) and the B.O. “stimulus” is the notion – originally propounded by socialist British economist John Maynard Keynes – that governments can, by spending vast amounts of their taxpayers’ money, somehow legislate prosperity.  That was the fallacious notion behind FDR’s “New Deal” programs of the 1930s – which scholars now know did not help the economy at all but rather simply prolonged and worsened the Great Depression.  (See the books I cited in my “Market Failure” essay:  Gene Smiley’s Rethinking the Great Depression (2002), Jim Powell’s FDR’s Folly: How Roosevelt and His New Deal Prolonged the Great Depression (2003), and Amity Shlaes’ The Forgotten Man: A New History of the Great Depression (2007).)   

    The economic history of both the United States and the world in the decades since the end of World War II has proven that Keynesian economics is untenable – nothing more than bullshit, to put it bluntly – and yet the policy-makers in Washington seem determined to repeat these failed public policies of the past.  It proves the truth behind the saying that those who are ignorant of history are doomed to repeat it.  Similarly, those who are ignorant of the real history of the New Deal (not the propaganda of “progressive” activists and scholars, but the story of what FDR’s programs really did to the nation’s legal and economic system) seem doomed to repeat it – and to create a second Great Depression, as a sort of self-fulfilling prophecy. 

    As I noted in “The Myth of Market Failure,” it was not capitalism, or the free-market system, that caused the so-called Great Recession of 2008-09:  rather, it was misguided public policy, laws passed by the Congress and enforced by government regulators, that so distorted the private credit market that it first created a “bubble” in the housing market, which (when the bubble burst) has negatively affected credit markets generally.  (On the causes of the 2008 financial crisis, see the excellent new books The Financial Crisis and the Free Market Cure by John Allison  (McGraw Hill, 2013) and After the Fall: Saving Capitalism from Wall Street – and Washington by Nicole Gelinas (Encounter Books, 2011).  Other good books explaining the origins of the crisis have been published in recent years, including: Johan Norberg’s Financial Fiasco (Cato Institute, rev. ed. 2012); Thomas E. Woods, Jr.’s Meltdown (Regnery Publishing, 2009); and John B. Taylor’s Getting Off Track (Hoover Institution Press, 2009).)  As John Allison explains it,  

    “The financial crisis and the ensuing Great Recession were primarily caused by the Federal Reserve and by Freddie Mac and Fannie Mae.  The Federal Reserve created a massive misinvestment (bubble) in our economy by over-expanding the money supply to try to keep us from experiencing the normal short-term downside corrections that occur in a free market.


    “The bubble (misinvestment) primarily affected the housing market because of Freddie Mac and Fannie Mae’s affordable-housing lending, which was driven by a political goal to raise homeownership above the natural market rate.”


    (The Financial Crisis and the Free Market Cure, p. 65).   

    Neither Congress nor the B.O. regime has done anything to address the root causes of the housing bubble – particularly Fannie Mae and Freddie Mac, those Frankenstein monsters created by Congress, which politicians have allowed to grow so big that they now own or guarantee about half of the United States’ $12 trillion mortgage market.  Rather than reforming Freddie and Fannie, Congress has not only allowed them to continue making bad mortgage loans but has even used part of the Wall Street bailout program to, in effect, further subsidize risky loans by providing more guarantees against mortgage foreclosures.  Indeed, the B.O. regime has “doubled down” on the misguided housing policies that helped trigger the subprime mortgage crisis – principally through enforcement of the Carter-era Community Reinvestment Act, as expanded under the Clinton administration (Paul Sperry, “Obama Doubles Down on Clinton’s Mistake,” I.B.D., January 25).  

    Moreover, among other misguided policies, the Dodd-Frank law has enshrined into U.S. financial regulation the concept of “too big to fail” – that is, “crony capitalism,” or socialist/fascist government bailouts, an egregious form of “corporate welfare.”  Simultaneously, by imposing steep new regulatory costs on banks and other financial institutions, the legislation (and the rules still being written by regulators to implement it) will tend to drive smaller banks out of the marketplace, in effect cartelizing the banking/financial industry, concentrating it in an oligopoly of a few huge firms that are deemed “too big to fail.”  The result?  As John Allison has cogently summarized it, Dodd-Frank, “[when] fully implemented, will make U.S. financial firms less competitive, drive resources out of the United States, reduce job creation, and lower our standard of living.  Ironically, it will also increase risk, as unregulated firms headquartered in foreign locations will have a larger market share” (The Financial Crisis and the Free Market Cure, p. 130).  

    Just as the “New Dealers” did in the 1930s, today’s policymakers have failed to address the root causes of the economic downturn – the misguided governmental policies that distorted financial markets – and instead have continued, and even expanded, those misguided policies.  What Congress has done in perpetuating and even worsening the problems it caused in the housing market, it’s done to an even more dangerous degree in its efforts to “bail out” particular industries such as banking, insurance giant AIG, and two of the “Big 3” Detroit auto companies, Chrysler and GM.  (By in effect nationalizing General Motors, the U.S. government has turned GM into, quite literally, “Government Motors,” as I observed in my “Summer 2009 in Review” essay.  Part of the explanation for B.O.’s reelection – a key reason why he narrowly won states like Ohio with large numbers of unionized auto workers – was his claim that he “saved” GM.  By eschewing the normal legal bankruptcy process, however, B.O.’s politicized bailout of GM actually saved only some union jobs, and only in the short run.  GM’s continuing structural problems mean that it will again face bankruptcy in the not-too-distant future.  Already, GM has again lost its position as the world’s biggest automaker to Toyota; and the U.S. government’s announcement in mid-December that it would sell off its stake in GM means that, at current prices, taxpayers will lose about 61%, or nearly $15 billion, on their “investment.”  See “GM Bailout Will Cost U.S. Taxpayers Billions,” I.B.D., December 20.) 

    What’s most disturbing about all these “bailouts” are the precedents they set – precedents that will result not only in demands by other industries and institutions for more U.S. government “relief,” but also in the further erosion of constitutional limited-government and even of morality in the United States, as I discussed in my essay “2009: Prospects for Liberty, Part II” (Jan. 26, 2009).   Like pigs at the trough (or, perhaps a more apt metaphor, piglets suckling at a sow’s teats), all the looters are competing with one another for their “share” of the loot, U.S. taxpayers’ money – the wealth produced by others.  Such unsightly and, frankly, sickening competition over whose “needs” are greatest is, sadly, the inevitable consequence of public policies based on the Marxist principle “From each according to his ability, to each according to his needs.”  Thanks to the “bailout” philosophy, American society as a whole is being destroyed by the same vicious principle that destroyed the Twentieth Century Motor Company, as dramatized in Ayn Rand’s amazingly prescient novel, Atlas Shrugged, over fifty years ago.  And in the policies being pursued by the B.O. regime – with special favors being dispensed to certain politically-connected industries (like the so-called “green energy” boondoggle) – we also see an illustration of the phenomenon described in Rand’s novel as “the aristocracy of pull.” 

    As I also observed four years ago, “[t]he fallacy underlying massive government spending programs – the Keynesian myth that governments can legislate prosperity by spending money – was exposed long ago by the great 19th-century French economist Frédéric Bastiat, who in his famous essay on the “broken-window fallacy” wrote about “what is not seen” whenever government spends money”:   

                “By exposing the fallacy of the broken window, Bastiat called attention to a fundamental fact about government – a fact that is frequently ignored by modern policy-makers: that government cannot legislate prosperity.  Government itself creates no wealth; it can only forcibly seize (through its taxation powers) wealth that individuals have created or forcibly compel (through its other powers) individuals to act differently from the ways they freely would have chosen to act.  Every time and every way government acts, through the coercive power of law, it interferes with the spontaneous order that would have resulted from the free market; that is, from the free choices of individuals in society.  . . .


    “Every dollar spent by the government is one less dollar that could have been used for truly productive purposes – by private businesses, to invest more capital in their enterprises.  Every make-work “job” supposedly created by government takes the place of the only source of real, productive jobs in our economy – those created by private enterprise.  Governments do not “make jobs”; they only destroy them.  Governments do nothing truly productive; they only forcibly take the wealth created by productive persons in society and reallocate it to other, non-productive persons, or looters.”


    To that, I’d add that it’s not surprising that the current regime has followed a policy of looting:  after all (as I noted in the previous section),  it was B.O. on the campaign trail in 2008, in his infamous reply to “Joe the Plumber,” who said he favored a redistributionist “share-the-wealth” policy.  Those who advocate “sharing” wealth really mean using the coercive power of government to forcibly take away the wealth earned by the productive work of some individuals and to redistribute it to other individuals, the non-productive looters who are the politicians’ constituents.   

    As I concluded in 2009,  

    “The one and only way that government truly can stimulate the economy is to get out of the way – to repeal the taxes, spending programs, and regulations that distort the market – and thus to set the market system free to function as it should.  Economists who understand and appreciate the free-market system – those of the so-called “Austrian” and “Chicago” schools of economics – also understand that it’s natural for there to be a business cycle, with periods of “boom” and “bust,” and that recessions – even deep recessions, as the present situation seems to be – do not constitute a “crisis” requiring government intervention to “solve.”  Indeed, they also understand that government efforts to “solve” the so-called “crisis” are apt to cause a real crisis – just like that which the politicians in Washington are about to create, if and when they enact more misguided federal government “stimulus” programs.”


    The economic policies pursued by the B.O. regime, the Democrats who controlled both houses of the 111th Congress and have controlled the U.S. Senate in the 112th and the new (113th) Congress, and the regulators at the Federal Reserve, taken together, have done great damage to the economy.  A prime reason is the tremendous amount of uncertainty – and hence the poor business climate – these policies have created. 

    In addition to dampening the businesses that are the engines of economic growth, massive federal spending under the current regime has raised federal deficits and the national debt to crisis levels.  There is a real crisis, and it has two dimensions – one economic, and the other political.  Economically speaking, the fiscal crisis can be summed up quite simply:  current levels of spending, which are driving both the continuing federal deficits and the skyrocketing national debt, are unsustainable.  Politically speaking, the crisis also can be summed up simply: B.O. and most of the Democrats in Congress are addicted to increased spending and are unwilling even to acknowledge that there’s a problem.  And, sadly, many Republicans seem to be just as addicted to spending – or just as unwilling to seriously attempt to bring it under control – as the Democrats.   

    Federal spending has been at unprecedented high levels for the past four years.  Congress had its real spending binge in 2009, when Democrats controlled both houses and B.O. asked for – and got – the massive “stimulus” law.  Total outlays soared from $2.98 trillion in 2008 to $3.53 trillion in 2009 and $3.46 trillion in 2010 (these are fiscal years ending Sept. 30).  Spending in 2011 peaked at $3.60 trillion, but declined slightly to an estimated $3.54 trillion in 2012.  This means the federal government is now spending about $1 trillion more each year than it did four years ago, at the start of B.O.’s regime (“Obama Pretends Debts Aren’t His,” I.B.D., Jan. 15).   

    Because federal revenues have fallen far short of this unprecedented level of spending, federal deficits similarly have been at unprecedented high levels, averaging well over $1 trillion each year during B.O.’s first four years in the White House.  The budget deficit for B.O.’s first year, 2009, alone – $1.4 trillion – was more than three times the most red ink ever amassed in a single year; it’s more than the total national debt for the first 200 years of the Republic and, as a percentage of U.S. economic output, it’s the biggest since World War II.  From 2009 to 2012, federal deficit spending totaled $5.095 trillion.     

    In addition to ruining the nation’s credit, the massive increase in federal deficit spending also threatens to hit Americans with the hidden tax of inflation.  That’s because the Federal Reserve has been paying for these deficits with more borrowing (keeping interest rates close to zero percent) and by what is euphemistically called “quantitative easing” (which essentially means creating more money, injecting additional dollars into the money supply).  There are already signs that the “stagflation” of the 1970s – high inflation coupled with high unemployment – is on the verge of returning.  As many people realize, the spending power of U.S. households has been squeezed, with median household income falling 7% since B.O. took office four years ago. 

    The unprecedented high deficits of the past four years have created a national debt in the staggering amount of over $16 trillion (the debt recently passed the official debt ceiling of $16.4 trillion).  That’s scary because it’s approaching the debt levels (national debt compared with a nation’s GDP) of Greece and other European nations that are experiencing economic and social turmoil because of their unsustainable indebtedness. 

    And the $16+ trillion figure is just the official national public debt – which is really only the tip of the iceberg as far as the true, full indebtedness of the U.S. government.  The U.S. Treasury’s “balance sheet” lists liabilities such as public debt but does not include the massive unfunded liabilities of Social Security, Medicare, and other federal future obligations.  A conservative estimate of Washington’s unfunded liabilities for the fiscal year that ended in 2011 is $87 trillion – more than 500% of our 2011 GDP of $15 trillion.  Former Congressmen Chris Cox and Bill Archer, in their article “Why $16 Trillion Only Hints at the True U.S. Debt” (Wall Street Journal, Nov. 26, 2012), point out the true extent of our dire economic straits:  “When the accrued expenses of the government’s entitlement programs are counted, it becomes clear that to collect enough tax revenue just to avoid going deeper into debt would require over $8 trillion in tax collections annually.  That is the total of the average annual accrued liabilities of just the two largest entitlement programs, plus the annual cash deficit.” 
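The ratio behind the “more than 500% of GDP” claim can be checked with simple arithmetic.  A minimal sketch, using the two round figures from the Cox–Archer article quoted above (these are the article’s estimates, not official budget data):

```python
# Back-of-the-envelope check of the unfunded-liabilities figures cited
# in the Cox-Archer article (approximations, in dollars).

unfunded_liabilities = 87e12   # conservative estimate for FY2011
gdp_2011 = 15e12               # approximate 2011 U.S. GDP

ratio = unfunded_liabilities / gdp_2011
print(f"{ratio:.0%}")          # 580% -- "more than 500% of our 2011 GDP"
```

At 580%, the unfunded obligations dwarf the official $16 trillion public debt, which is the article’s central point.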

    In other words, “Washington would have to collect $8 trillion in tax revenue, not to pay off our national debt and have reserves against unfunded liabilities, but just to avoid accumulating more debt,” as economist Walter Williams has observed.  We can’t solve the problem by raising taxes on “the rich” because “there are not enough rich people to satisfy Congress’s appetite.”  In 2011 Congress spent $3.7 trillion, or about $10 billion per day.  As noted above (in the section on taxation), if B.O. had gotten his tax increase on households earning more than $250,000 (the top 2% of U.S. taxpayers), that would have brought in additional revenue to run the government for only 10 days.  As Professor Williams notes, if Congress imposed a 100% tax on all earnings above $250,000 per year, it would bring in about $1.9 trillion – which would keep the government running for only 190 days.  Even if Congress confiscated the entire incomes of taxpayers earning more than $66,000 a year (a total adjusted gross income of $5.1 trillion) and all corporate profits (in 2011, about $1.6 trillion), that would still not be enough to cover the $8 trillion per year growth of U.S. liabilities (“Massive Deficits Can’t Be Shrunk By Taxing the Rich,” I.B.D., December 5). 
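Professor Williams’s “days of spending” arithmetic can be reproduced directly.  A minimal sketch, using the round figures from his op-ed as quoted above (all values are approximations from the cited column, not official data):

```python
# Reproducing the "tax the rich" arithmetic from the Williams op-ed
# cited in the text (all figures are the column's round numbers).

DAILY_SPENDING = 3.7e12 / 365        # 2011 outlays of $3.7T, about $10B/day

def days_funded(revenue: float) -> float:
    """How many days of federal spending a given amount of revenue covers."""
    return revenue / DAILY_SPENDING

# A 100% tax on all earnings above $250,000/yr yields about $1.9T
print(round(days_funded(1.9e12)))    # 187 -- the op-ed's "only 190 days"

# Even confiscating all AGI above $66,000 ($5.1T) plus all corporate
# profits ($1.6T) falls short of the ~$8T annual growth in liabilities
total_confiscation = 5.1e12 + 1.6e12
print(total_confiscation < 8e12)     # True: still not enough
```

The exact quotient is closer to 187 days than 190; the op-ed’s rounding doesn’t change the conclusion.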

    The fundamental problem is easily explained, and obviously is a problem of too much spending rather than not enough taxation.  Federal tax revenue has remained relatively constant for the past 50 years, despite many changes in the tax rates, averaging about 18% of gross domestic product.  (That suggests that we’ve reached the upper level of potential taxation; raising taxes even higher would yield diminishing returns, as it would further dampen economic growth.)  During that same 50-year interval, federal spending has risen from less than 20% (its post-World War II high) to more than 25% of GDP.  To put it another way, Congress takes in about $200 billion in revenue each month yet spends $360 billion per month – meaning that roughly 40 cents of every federal dollar spent has to be borrowed.  
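The “40 cents of every dollar” figure follows from the monthly numbers just given.  A minimal sketch, using the text’s round figures (the true share works out slightly higher, about 44 cents):

```python
# The borrowed-share arithmetic behind "roughly 40 cents of every
# federal dollar" (monthly figures are the text's round numbers).

monthly_revenue = 200e9    # ~$200 billion taken in per month
monthly_spending = 360e9   # ~$360 billion spent per month

borrowed = monthly_spending - monthly_revenue
borrowed_per_dollar = borrowed / monthly_spending

print(f"{borrowed_per_dollar:.2f}")  # 0.44 -- "roughly 40 cents" per dollar
```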

    Yet, in spite of this abundant evidence that the nation is in fiscal crisis, B.O. and the Democrats continue to be addicted to increased federal government spending and are in complete denial that there’s even a problem.  Every year he’s been in office, B.O. has submitted to Congress a budget with over $1 trillion in deficit spending; and the Democrat-controlled Senate, which by law is obligated to pass a budget bill every year, has failed to do so for the past several years.  At the same time, Senate Majority Leader Harry Reid has refused to bring to a Senate floor vote any of the budget bills passed by the Republican-controlled House (including the moderate budget plan drafted by Congressman Paul Ryan’s House Budget Committee last year).  B.O. and the members of his party in Congress seem to suffer from the ailment I’ve called the Democrats’ “deficit-attention disorder” (see that section in my “Thoughts for Summer 2012” essay (May 25, 2012)).  

    During the December “fiscal cliff” negotiations, B.O. told House Speaker John Boehner, “We don’t have a spending problem.”  That’s news to B.O.’s own debt commission, the Simpson–Bowles Commission, which in its final report made clear that spending is the driving force behind the nation’s debt crisis.  “Even after the economy recovers,” the report said, “federal spending is projected to increase faster than revenues, so the government will have to continue borrowing money to spend. . . . Over the long run, as baby boomers retire and health costs continue to grow, the situation will become far worse.”  It recommended, “We should cut all excess spending – including defense, domestic programs, entitlement spending, and spending in the tax code.”  The commission was “hardly breaking new ground here,” as the editors of Investor’s Business Daily have noted.  “Anyone who has looked at the federal budget can quickly see that out-of-control spending, not insufficient revenues, is the problem.”  Indeed, a GAO report concluded that spending “is on an unsustainable long-term fiscal path” and blamed entitlements.  And countless Congressional Budget Office reports have documented how, left unchecked, federal entitlement programs will soon swamp the entire budget.  “Apparently [B.O.] didn’t read any of those, either” (“No Spending Problem?” January 8).  

    Besides being irresponsible, B.O.’s also a shameless hypocrite.  During the 2008 campaign, he criticized President George W. Bush for adding $4 trillion to the national debt (raising it from $5 trillion to $9 trillion) during Bush’s eight years as president, calling Bush’s addition to the national debt not only “irresponsible” but also “unpatriotic,” as this classic YouTube video shows.  Moreover, as a U.S. senator during Bush’s presidency in 2006, B.O. was one of several Democrats who voted against a debt-ceiling increase.  In comparison, B.O. as president has received six debt-ceiling increases, has added about 50% more ($6 trillion) to the national debt in only four years in office (so far) than Bush did in eight years, and in fact has added more debt than the first 42 presidents (from Washington to Bush the elder) combined. 

    B.O. already is maneuvering to use the next debt-ceiling deadline, in the third stage of the “fiscal cliff” crisis (which will hit sometime in the spring), to hurt House Republicans politically – apparently the chief (and perhaps the sole) goal of his fiscal policy.  He has warned that Social Security checks will be delayed if Congress fails to increase the government’s borrowing authority by raising the debt ceiling – which is an outright lie.  

    According to the 2012 Social Security trustees report, assets in Social Security’s trust funds totaled $2.7 trillion, and Social Security expenditures totaled $773 billion.  Although it’s a myth that those S.S. funds are held “in trust” for recipients – because Social Security is a pay-as-you-go system (meaning that taxes paid by today’s workers are immediately sent out as payment to today’s retirees), the trust fund contains nothing more than IOUs, worthless bookkeeping entries – the fact that the system is now taking in more revenue than it is spending means that it need not default on payments.  As economist Walter Williams argues, House Republicans ought simply to call B.O.’s bluff, passing a bill prioritizing how federal tax revenues will be spent – mandating that payments to Social Security recipients (along with interest on the national debt) be the first priorities – and then send the measure to the Senate and to B.O. for concurrence, daring him to veto it (Williams, “Lies Obscure Real Cause of Our Fiscal Mess,” I.B.D., January 30).     

    Following B.O.’s lead, other Democrats in Congress also have started the political maneuvering designed to pressure House Republicans to surrender in the debt-ceiling battle.  They warn that unless the debt ceiling is raised, the consequences of hitting the ceiling would be catastrophic:  the U.S. government will have to shut down; not only Social Security recipients, but also active-duty military personnel and veterans will not get their monthly checks; and worse, the U.S. will default on its debt, no doubt triggering another lowering of the nation’s credit rating.   

    This demagogic ploy, raising the specter of “default,” is also based on a myth: even if we don’t raise the debt ceiling, enough revenue will still come in to the U.S. Treasury to pay interest on the national debt, bonds, and essential services.  The federal government takes in about $200 billion in revenues each month.  In comparison, interest on the national debt is about $30 billion each month; Social Security costs roughly $50 billion; Medicare and Medicaid cost about $50 billion; active-duty military personnel pay costs about $2.9 billion, as do Veterans Affairs programs.  Thus, the U.S. could still meet core obligations, without defaulting on its debt or leaving granny without her monthly Social Security check.  Congress could simply pass legislation prioritizing federal spending, as Professor Williams suggested in the op-ed cited above (“The Mythical Threat of Default,” I.B.D., January 22). 
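Adding up the monthly obligations just listed shows why default is not forced by the debt ceiling.  A minimal sketch, using the article’s approximate monthly figures (in billions of dollars):

```python
# Rough check of the "no default" arithmetic: can monthly revenue cover
# the core obligations listed in the text?  (All figures are the
# article's approximations, in billions of dollars per month.)

monthly_revenue = 200.0

core_obligations = {
    "interest on the debt": 30.0,
    "Social Security": 50.0,
    "Medicare/Medicaid": 50.0,
    "active-duty military pay": 2.9,
    "Veterans Affairs programs": 2.9,
}

total = sum(core_obligations.values())
print(round(total, 1))          # 135.8 -- well under the ~200 coming in
print(total <= monthly_revenue) # True: core obligations covered
```

On these figures, roughly $64 billion a month would remain after the core obligations, which is the basis for the prioritization proposal.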

    Other Democrats and leftist political commentators have been even more absurd in proposing other “solutions” to the debt crisis.  The radical left’s favorite economist, Paul Krugman, has proposed (a tongue-in-cheek proposal that even he calls “silly”) that the U.S. Treasury mint a $1 trillion platinum coin in order to circumvent the debt ceiling.  And some Democrats have argued that a provision in the Fourteenth Amendment (a provision meant to secure U.S. debt following the Civil War) gives the president the authority to raise the $16.4 trillion debt ceiling unilaterally.  Besides being “silly,” too – it interprets the Fourteenth Amendment provision completely out of context and contrary to its original meaning – that proposal reveals many Democrats’ contempt for the Constitution.  Besides being a mechanism to hold U.S. government borrowing within some bounds, the debt ceiling is a necessary means to protect the exclusive power of Congress to borrow money.  (By “exclusive,” I mean that Article I, Section 8 of the Constitution vests the power to borrow money in Congress – not the president or the secretary of the treasury.  It’s a part of Congress’s “power of the purse,” one aspect of which – the power to introduce revenue bills – Article I vests particularly in the House of Representatives, as the house representing “the people,” the taxpayers.  For that reason, if we’re truly faithful to the Constitution, the House ought to take the lead not only with regard to tax legislation but also all budgetary, or spending, matters.) 

                The spending problem must be seriously dealt with.  All forms of federal government spending – both “discretionary” and “non-discretionary” – are out of control, and these unprecedented high levels of spending are at the root of the fiscal problem, driving the deficits and hence the debt.  Moreover, real cuts in federal government spending must be accompanied by real reforms in the so-called “entitlement programs,” Medicare and Social Security, whose unfunded liabilities will really hemorrhage out of control during the next decade or so, as the huge “baby boomer” generation reaches retirement age.  (Left-liberals like to blame national defense for the growth in federal spending, but in 1962, national defense expenditures were 50% of the federal budget; today they are only 19%.  What accounts for most federal spending today are the so-called “entitlement” programs.  In 1962, entitlement spending was 31% of the federal budget; today it is 62%.  Medicare, Medicaid, and Social Security alone take up 44% of the federal budget, and that amount will skyrocket in coming years, crowding out all the so-called “discretionary” spending.) 

    Senator Rand Paul (R.–Ky.) – the person whom I hope will become the intellectual leader of congressional Republicans – in an op-ed written during the “fiscal cliff” talks in December, put his finger on the essential problem when he noted that the federal government is now spending $1 trillion more each year than it did when B.O. took office four years ago.  He then drew some common-sense conclusions about the problem:  

    “Those who argue that we can’t cut spending are basically saying that our federal government was far too small when [B.O.] entered the White House and that now we can survive only if government continues to spend at its current level.  I know few if any Americans who honestly believe this, Republican or Democrat.  It’s also hard to imagine reasonable people actually believing that our government spending this obscene amount of money is somehow what makes our economy tick.” 


    (“The Real Fiscal Cliff: Spending, Not Taxes,” I.B.D., December 19). 

    Senator Paul has also identified what Congress needs to do to help rein in federal spending, the deficit, and ultimately the national debt.  He suggests, among other things, that we cut spending to 2008 levels.  That, combined with 2012 revenues, would give us a deficit next year of as little as $89 billion – less than 1% of GDP.  Besides reforming entitlements, we need to “examine any and every way to significantly cut spending,” Sen. Paul concludes.   

    Senator Paul is also critical of the House GOP leadership, who he says “seems to want Republicans to be the party that raises taxes just a little less than Democrats.”  “This will not do,” he adds:  

    “Republicans are supposed to be the party of limited government and low taxes.  These are our most core and basic principles.  I don’t think it’s time to change who we are or what we stand for.  It will not help our economy.  It will also defeat the purpose of even having a Republican Party.”


    (“The Real Fiscal Cliff,” December 19). 

    Rather than following Senator Paul’s bold suggestions, however, House Republicans so far have decided to further “kick the can” down the road.  They recently passed, 285–144, a bill to suspend the “debt ceiling” – the $16.4 trillion limit on the U.S. government’s borrowing authority – for four months, through May 18.  The bill aims both to buy time for the two parties to engage in broader budget negotiations and to assuage market fears of a potential U.S. default.  Senate Majority Leader Harry Reid said the Senate would pass the bill, and the White House said that B.O. would sign it.  Although the legislation temporarily takes the debt-ceiling issue off the table, Congress faces two upcoming budget deadlines:  automatic spending cuts (the so-called “sequestration” mandated by the August 2011 debt-ceiling agreement) kick in on March 1, and current government funding runs out March 27. 

    House Budget Chairman Paul Ryan will release his committee’s budget after B.O.’s Feb. 12 State of the Union speech (after which B.O. is also expected to release his proposed budget).  Senate Democrats, who have not passed a budget since 2009, intend to pass one this year, according to Sen. Charles “Chuck You” Schumer (D.–N.Y.).  In an effort to pressure the Senate, House Republicans included in their legislation a provision to suspend lawmakers’ salaries if their respective chamber does not pass a budget by April 15.  Their salaries would be held in escrow until a budget is adopted, or until the 113th Congress ends in two years.  The “no budget, no pay” idea is popular among outside reform advocates, such as the non-partisan “moderate” group No Labels. 

    In adopting the stopgap legislation, House Republicans might have been following the advice of syndicated columnist Charles Krauthammer, who in a recent column urged the House GOP to “go small and simple.”  He proposed that the House GOP “offer to extend the debt ceiling through, say, May 1, in exchange for the Senate delivering a budget by that date” – virtually the same process the House adopted.  Republicans control only one chamber, the House; with B.O. in the White House and Democrats currently in control of the Senate (a situation that might change after the 2014 midterm elections), there’s little the GOP can do, Krauthammer warns.  “Forget about forcing tax reform or entitlement cuts or anything major.  If Obama wants to recklessly expand government, well, as he says, he won the election.”  What does Krauthammer mean by “go small and simple”?  He explains:  “Republicans should simply block what they can, further tax hikes for example.  The general rule is: From a single house of Congress you can resist but you cannot impose.”  If Republicans want “to save the Republic” from B.O. and the Democrats, they must “win the next election.”  “If your conservative philosophy is indeed right, winning will come.”  He concludes by quoting Margaret Thatcher, who said serenely of the Labor Party socialists she later overthrew: “They always run out of other people’s money” (“Split GOP Can’t Govern from Single Chamber,” I.B.D., January 18). 

    Although Krauthammer’s “go small and simple” tactics may sound attractive to the wimpish leadership of the House GOP, I think Senator Paul – and many of my libertarian and libertarian Republican friends – would ask, with some justification:  If that’s all the Republican House is good for, why should American voters choose them over Democrats in the 2014 congressional elections?  To paraphrase Senator Paul:  Republicans should stand for something, to truly distinguish themselves from the Democrats.  Just as they shouldn’t be merely “the party that raises taxes just a little less than the Democrats,” neither should they be the party that just spends a little less than the Democrats.


    To Be Continued . . . .

     | Link to this Entry | Posted Thursday, January 31, 2013.  Copyright © David N. Mayer.

    2013: Prospects for Liberty (Part I) - January 17, 2013


    2013: Prospects for Liberty


    Part I


    Once again, I begin a new year with an essay on “The Prospects for Liberty” in the coming year, 2013.  This year, again, the prospects can be summed up in three words – not very good.  The reason, quite simply, is the re-election to a second term of B.O., the Occupier-in-Chief, and all that his reelection portends.  Because of the 2012 elections, Americans face the greatest threat to their liberties in the 225 years since the ratification of the Constitution of the United States – a threat both to the limitations the document puts on the powers of the national government and to the rights it guarantees to individuals, both of which are endangered by B.O. and his political party, the Democrats.    

    The political situation today is even more perilous than it was three years ago, in 2010, when I had also concluded that the prospects for liberty were “not very good.”  That was just a year after B.O. had begun his occupation of the White House.  At that time I had written: 

    “[T]wo of the three branches of the United States government are controlled by a political party that has as its agenda the destruction of constitutional limits on federal government powers; and the national collectivist (fascistic/ socialistic/ paternalistic) state the Democrats are trying to create poses unprecedented threats to individual freedom and responsibility in the USA, the country that until recently . . . has been the bastion of individualism in the world.  All is not yet lost, however, for 2010 is an election year – and the Congressional elections later this year, coupled with a momentous opportunity for the Republican Party to rediscover the principles of limited government on which it was founded, provide some grounds for hope in a revitalization of liberty in America.”


    The 2010 elections indeed did give some grounds for hope, for Republicans regained control of the U.S. House of Representatives as well as many governorships and state legislatures, thus providing a check against the Democrats’ disastrous agenda.  The major legislative “achievements” of the first two years of B.O.’s regime – nearly $1 trillion in federal “stimulus” spending, the “ObamaCare” health-insurance control law, and the Dodd-Frank financial regulatory law, all passed by Democrat majorities in the 111th Congress – provided the high-water mark of B.O.’s scheme to more fully “transform” the United States into a European-style socialist welfare state, by legislation.  Thanks to Republican control of the House, B.O. could not push his agenda through the 112th Congress; he had to resort to abuse of executive powers (through unconstitutional issuances of executive orders and through misuse of federal regulatory agencies like the EPA), to continue his schemes.  His radicalism was tempered, however, by his quest to get reelected – and by the disastrous economy, “B.O.’s recession” (as I’ve called it), which resulted from his wrong-headed economic policies and which he disingenuously tried to blame on his predecessor, President George W. Bush.    

                The divided federal government set the stage for the 2012 elections.  As I wrote in my New Year’s essay last year (“2012: A Time For Choosing,” Jan. 19), this year’s general elections were the most important of my lifetime – the most important of the past fifty-plus years – for they offered the American electorate the clearest choice between the two major political parties, two starkly different visions of the future of America.  And as I continued to discuss in my “Election 2012 Postmortem” (Nov. 10), the themes chosen by the competing presidential tickets of the two major American political parties epitomized the choice.  The B.O.–Biden ticket and the Democrats promised to move “Forward” with their policies, designed to destroy the constitutional foundations of the United States, while the Romney–Ryan ticket and the Republicans appealed to voters to “Believe in America,” offering to restore those institutions that have made the nation great.   

                Thus the 2012 election involved primarily a moral choice – a choice between two conflicting visions for America, and the values on which those visions are based.  Unfortunately, both sides in the campaign appealed more to Americans’ emotions than to reason – sadly, perhaps, the inevitable result of how populist our political system has become.  The Democrats’ campaign was based on negative emotions – stirring up Americans’ fears (their fear of being in control of their own lives) and their feelings of envy (envy of others’ success) and resentment (resentment against achievement).  Their misleading appeals to “fairness” masked their real message, which offered Americans the proverbial (and hypothetical) “free lunch,” and was aptly summed up by the infamous comment B.O. made during the campaign, “You didn’t build that.”  (The context of B.O.’s statement made clear his contempt for business – and for capitalism, the system of free markets and limited government under which business thrives.)  B.O. and the Democrats favor statism, using force – the coercive power of government – to control the production of wealth and to redistribute it from those who’ve earned it (the producers, or the makers) to those who haven’t, the takers or the “looters” (as Ayn Rand called them in her magnificently prescient novel, Atlas Shrugged).

                The Republicans’ campaign was based on more positive emotions, chiefly confidence (confidence in one’s self and one’s abilities) and hope (hope in the future – hope for a better life, for one’s self and one’s children and grandchildren).  They appealed to those Americans who still value freedom and individual responsibility – people who want to hold their destiny in their own hands, rather than having it determined by some bureaucrat in Washington.  With their presidential candidate, Mitt Romney, a successful businessman who understands (better than any other candidate in modern American history) how wealth really is created – that government itself cannot “create jobs” but can only help ensure the legal conditions under which free markets create jobs – the Republicans supported the producers – the entrepreneurs who create wealth in society, by freely trading their achievements with others, for their mutual benefit.  They defended the makers of wealth – and their right to keep and use the fruits of their creative activities – as epitomized by their response to B.O. at the Republican National Convention, honoring successful businessmen and businesswomen by proudly affirming, “You did build that!”     

                In short, the 2012 election offered Americans a clear choice between these two opposing values, these two opposing sides:  statism versus freedom, collectivism versus individualism, the looters versus the producers, the “takers” versus the “makers,” Democrats versus Republicans.  It was not just a referendum on B.O.’s presidency, as I’ve frequently noted here; rather, it was a referendum on the American people themselves.   

    Sadly, as I noted in my November 10 “postmortem,” the American electorate largely flunked the test.  The results were mixed, however.  Although (most ominously) B.O. was elected to a second term and the Democrats retained control of the U.S. Senate, the Republicans retained control of the U.S. House – thus essentially preserving the existing political alignment (the status quo) in Washington, D.C.  And at the state level, Republicans continue to hold most governorships and were largely successful in retaining control of a majority of the state legislatures.  Hence, American government, at both the national and state levels, continues to be divided between the two major political parties – either of which may plausibly claim to represent a majority of the American electorate. 

    B.O. may claim to have a “mandate” to pursue his anti-capitalist, anti-individualist (and unconstitutional) Marxist/fascist redistributionist agenda, but the 2012 election really gave him no such mandate at all.  The victories of Republican candidates for a majority of the U.S. House seats – like the victories of Republican candidates at the state level – show that the American people prefer to have Republicans making the laws and policies that B.O. (as Chief Executive of the U.S. government) has a constitutional duty to “faithfully execute” – not to have B.O. assume extraordinary (and unconstitutional) law-making powers as, in effect, a dictator would have.   

    The United States of America remains a constitutional republic – a union of states (each with a republican government limited in its powers by each state’s constitution as well as the U.S. Constitution), united for certain limited purposes in a national government, whose powers are limited by the U.S. Constitution.  Among the most important ways in which the Constitution limits the powers of the national government, in all its “branches” (including the executive branch, or the presidency), is through the principle of separation of powers and its corollary, checks and balances.  These “chains of the Constitution” (as Thomas Jefferson called them) limit the powers of the president – in terms of both the scope of his powers and the means by which he can exercise them – and thus pose the greatest threat to the tyrannical schemes of B.O. and the Democrats (or, to put it another way, the greatest protection Americans have for their individual liberties).  (Incidentally, that’s why so many left-liberals, so-called “progressives,” hold in disdain the U.S. Constitution, claiming it’s no longer “relevant” to modern society – because faithful adherence to the Constitution would thwart their scheme to transform the United States into a paternalistic national regulatory/welfare state, with the president as a virtual dictator.)

    Why was B.O. reelected, despite having one of the worst records (if not the worst record) of any president in U.S. history?  (See my discussion of B.O.’s failed presidency – my discussion of why I have ranked him as the greatest failure in U.S. history – in my “Rating the U.S. Presidents 2012,” Feb. 22, 2012.)  The reasons for B.O.’s reelection in 2012 are closely related to the reasons for his election four years ago, in 2008. 

    Political pundits have described B.O.’s election in 2008 as “historic” because he was the first African-American to be elected president.  As I’ve discussed in many previous essays here on MayerBlog, B.O. was elected for “racist” reasons: because he’s black – or, more precisely, because he identified himself racially as black (when he’s really of mixed racial heritage, with a white mother and a black African father) and politically exploited that racial identity – he appealed not only to black Americans but also to many white Americans who suffer from white “liberal” guilt.  (Black political commentator Shelby Steele maintained in an insightful op-ed in 2008 that B.O.’s presidential candidacy was “based more on the manipulation of white guilt than on substance.”  Steele called B.O. a “bargainer,” someone who “flatters whites, grants them racial innocence, and hopes to ascend on the back of their gratitude” (“The Obama Bargain,” Wall Street Journal, March 18, 2008.)  Interviewed by Dinesh D’Souza for the documentary film 2016: Obama’s America, Steele elaborated on these comments, calling B.O. a “redeemer” of white guilt, and concluding that “the reason he’s in the White House is because of his race, his blackness.”) 

    B.O.’s race – and his shameless political exploitation of racism (which has made America even more polarized by race now than it was four years ago, before his election) – is only part of the reason why he was first elected president in 2008, however.  The fundamental reason, as I have repeatedly maintained here on MayerBlog, is that B.O. (with the assistance of his political handlers, such as his chief political advisor, David Axelrod) is a master purveyor of bullshit (which I discuss more fully in the next section, below).  It is bullshit that has “clothed” B.O. with the image that he and his political handlers have so artfully devised – in what I’ve described as the “Emperor’s New Clothes” phenomenon.

                As I noted in my November 10 essay, B.O. has personified the naked “Emperor” in the famous children’s story by Hans Christian Andersen, “The Emperor’s New Clothes” – as I have written here, since well before the 2008 elections.  (See my blog essay “The Emperor Is Naked!” Oct. 16, 2008.)  Like the “emperor” in that story, B.O. has been clothed in a supposedly magnificent set of clothes, which happen to be invisible, because they don’t really exist:  the tailors in the story are scam artists, much like Axelrod and the other political “handlers” of B.O., who have managed to deceive and/or intimidate people into saying that the “emperor,” B.O., who really is naked, is instead wearing those remarkable clothes.  In the 2008 election, Axelrod and the other handlers clothed B.O. in the image of a “centrist” (when in fact he was one of the most left-wing members of the Senate and, before that, the Illinois legislature), a decisive “leader” (when he totally lacked any executive experience in government and was instead a mere “community organizer,” or political agitator), a “pragmatist” (when in fact he’s an extreme left-wing ideologue, essentially a Marxist communist), and a “uniter, not a divider” (when in fact he’s bitterly partisan, employs divisive racist and class-warfare rhetoric, and advocates policies that have tended to exacerbate partisan, class, race, and sex divisions in American society).  He was also portrayed as intellectually bright, glib and articulate – when in fact he’s a moron who, when he speaks publicly, is overly dependent on a teleprompter.   

                In my essay “Tricks and Treats 2012” (Oct. 25) (among the “tricks,” in a section on “The B.O.-Maniacs’ Cult – and the Naked Emperor”), I expressed the hope that, by Election Day, a majority of American voters would recognize that the “emperor,” B.O., truly is naked.  Although there’s abundant evidence that millions of Americans at last do recognize that fact – including many people who voted for B.O. in 2008 and who supported him until, for whatever reason, they became disillusioned with him – B.O.’s reelection shows that many more millions of voters are still being deceived (or are, consciously or unconsciously, deceiving themselves) – despite the clear record of B.O.’s failed presidency.  (As I noted on November 10, B.O.’s reelection, with the nation in such wretched economic shape, has defied history and reason.  No previous president, since FDR, had been reelected with such an abysmal economic record.)  

    And in my November 10 essay, after briefly summarizing the record of B.O.’s presidency – not only the abysmal economy that has resulted from his disastrous policies, but also his failure to adhere to constitutional limits and to the rule of law (which has prompted me to rank him “the most lawless president in U.S. history”) – I asked the rhetorical questions, “Why was B.O. reelected when instead, in a just nation that scrupulously adhered to the rule of law, he really should have been impeached and removed from office?  And why was B.O. reelected by millions of Americans who are suffering from the economic hardships created by his disastrous policies?”  My response was as follows:

    The simple, basic answer is that a majority of the American electorate continue to fail to see the real B.O. – the real Barack Hussein Obama, Jr. (“mmh, mmh, mmh”) – but instead see a phony image of B.O. created by his own bullshit:  a new kind of politician, a sort of political “messiah,” a “Santa Claus” who dispenses “free stuff.”  That’s the image of B.O. that he and his political handlers have so carefully crafted since he first began campaigning for the presidency (in 2007, if not earlier) – not the man who has sat in the White House (when he wasn’t out on the golf links or campaigning for his reelection), amassing a dismal record and failing to follow the law.  The sad truth is that B.O. was reelected by Americans who don’t care who the real B.O. is – who don’t care about the Constitution and the rule of law – and who are in denial of reality – because they’re too ignorant to know better, or because they voted for him just to feel good about themselves (for a variety of irrational considerations).  In other words, his reelection defied reason because a majority of voters, in casting their votes for B.O., acted irrationally.


    To put the question another way:  Why has B.O. been so successful in perpetuating his “Emperor’s New Clothes” myth – in effect, in covering up his own nakedness?  Again, why was he re-elected, despite all the objective evidence that, historically and rationally, had predicted that he would be doomed to be a one-term president?  The simple answer suggested above – that his reelection simply defied history and reason – isn’t really an answer at all; it’s just circular reasoning.  Similarly, to say that a majority of Americans (or more precisely, of the American electorate – of those Americans who chose to vote) are irrational, is just too simplistic.  In what ways, precisely, are they irrational?  If it wasn’t reason, what was it that motivated them – what irrational beliefs or opinions, or raw emotions, do they hold or feel?

    In my November 10 essay, I discussed many possible explanations for B.O.’s reelection – among them, B.O.’s appeal to the “looters” (Mitt Romney’s infamous “47 percent”), those who believe in the proverbial “free lunch” and who are willing to sacrifice individual freedom and responsibility in exchange for collective “security” (that is, dependence on government); the negative campaign ads run by B.O.’s reelection organization and Democrat super-PACs, which demonized Mitt Romney (and did it so early and so well that the Romney campaign and the Republican Party could not effectively counter it); the relative success of Democrats, at the state and local level, in “getting out the vote” among their core constituents (and the relative failure of the Republican Party and the Tea Party movement to mobilize their voters); and the critical role played by the “mainstream” (or, as Bernie Goldberg calls them, the “lamestream”) media, who were totally in the tank for B.O.’s reelection – acting virtually as the P.R. arm of the Democratic Party – even to the extent of using their supposed “fact checks” of campaign ads and political speeches to perpetuate B.O.’s lies and his bullshit.  This last factor was particularly important.  The media helped propagate B.O.’s lies – both about his own policies (his denial of responsibility for worsening the economic crisis and his pathetic ploy of blaming his predecessor, George W. Bush), as well as his opponent, Romney (especially all the lies told about Romney’s record as CEO of Bain Capital, to demonize Romney and portray him as a rich white guy out of touch with the problems of “average” Americans).  The media also bought into other aspects of B.O.’s bullshit – for example, propagating the bullshit cover-up story for the Benghazi, Libya consulate attack (blaming it on an obscure anti-Islamic video rather than on the failures of the B.O. regime’s foreign policy), or the “green” bullshit that B.O. used to rationalize his anti-carbon energy policy and his “green” corporate cronyism.

    To these factors may be added some other plausible explanations offered by some conservative pundits and political commentators since the November elections.  Rush Limbaugh, on his radio show the past several weeks, has attributed B.O.’s win to the ballots cast by the Americans he calls “low-information voters” (LIVs):  voters who are basically ignorant (not only about politics but also about the basics of American government as well as basic principles of economics) – in other words, morons!  The LIVs were easily deceived by the bullshit propagated by B.O. and his political handlers – by the false image in which he was “clothed.”  To them, he seemed like a nice guy – not the arrogant, elitist and narcissistic prick that he really is.  All those appearances B.O. made on late-night entertainment shows (Leno, Letterman, Jimmy Fallon) or on daytime talk shows (like The View) – which were dismissed by many conservatives as demeaning to the office of the president – actually may have given him a decided edge over Romney (who generally declined to make such appearances), with the LIVs.  As some pundits have suggested, even B.O.’s lackluster performance in the first presidential debate – so lackluster that it shocked and profoundly disappointed many Democrats while it made many Republicans over-confident – may have helped cement B.O.’s appeal to the LIVs (who saw him as an “ordinary” guy, not the slick politician he really is). 

    The usually astute political commentator Michael Barone, in a recent op-ed, maintains that the 2012 election most resembled the election of 2004, which resulted in George W. Bush’s reelection to a second term:

    “Both of the elections involved incumbent presidents with approval ratings hovering around 50% facing challengers who were rich men from Massachusetts (though one made his money and the other married it).  In both cases, the challenger and his campaign seemed confident he was going to win – and had reasonable grounds to believe so.  In both elections, the incumbent started running a barrage of negative ads defining the challenger in the spring.  And in both elections, the incumbent had at least one spotty debate performance.  In both elections, each candidate concentrated on a more or less fixed list of target states, and in both elections the challenger depended heavily on outside groups’ spending that failed to achieve optimal results.  The popular vote margins were similar – 51% to 48% for George W. Bush in 2004, 51% to 47% for Barack Obama in 2012.”


    The “one enormous difference,” Barone adds, was turnout.  Turnout is a measure both of party organization and of voter enthusiasm.  In 2004, John Kerry got 16% more popular votes than Al Gore had four years before – but he lost because George W. Bush got 23% more popular votes than he had four years before.  “Turnout between the 2000 and 2004 elections rose from 105 million to 122 million – plus 16%.”  Kerry voters “were motivated more by negative feelings for Bush than by positive feelings for their candidate”; however, “Bush voters were more positively motivated” and thus gave Bush a big win.  

                In contrast, turnout between the 2008 and 2012 elections “fell from 131 million to 128 million – minus 2%.”  Both campaigns fell short of their turnout goals:  “Obama got 6% fewer popular votes than he had gotten in 2008,” and “Romney got only 1% more popular votes than John McCain had four years before.”  But, as both the election returns and exit polls indicate, the B.O. reelection campaign “turned out voters where it really needed them” – enabling him to carry Florida by 1 percentage point, Ohio by 3, Virginia by 4, and Colorado and Pennsylvania by 5.  “Without those states, he would have gotten only 243 electoral votes and would now be planning his presidential library.”  (Barone adds that “if Romney had gotten 16% more popular votes than his predecessor, as Kerry did, he would have led Obama by 4 million votes and won the popular vote 51% to 48%.”) 

    “Romney, like Kerry, depended on voters’ distaste for the incumbent,” maintains Barone.  The critical difference between 2004 and 2012, however, was that Romney “could not hope to inspire the devotion Bush enjoyed in 2004 and that Obama had from a diminished number in 2008.”  Although he gives no further explanation for Romney’s failure to motivate voters to vote for him (rather than against B.O.), Barone suggests this was the critical factor (“Romney Failed, as Kerry Did, To Move Voters,” Investor’s Business Daily, December 27). 

    Unlike some libertarians and limited-government conservatives (including some leaders in the Tea Party movement), however, I do not blame Mitt Romney for the Republicans’ loss in the presidential election.  Romney would have been a superb president.  With his business background (and the knowledge that gave him about capitalism and free-market economics), he would have been better qualified than any other president in U.S. history to deal with the economic crisis the nation now faces.  His chosen running-mate, Paul Ryan, similarly would have been one of the best V.P.s in U.S. history – combining the right philosophy of government with an impressive knowledge (as House Budget Committee chairman) of the federal budget.  As I have repeatedly discussed here on MayerBlog, I consider Romney the best candidate the Republicans had in 2012 – a true limited-government conservative with an appeal to independent voters (not a mushy “Massachusetts moderate,” as he was unfairly caricatured by rivals like Newt Gingrich or Rick Santorum and by “social conservatives”).  If Romney can be justly faulted for anything, it’s for being too nice a guy – for failing to attack B.O. and the Dems as forcefully as they had attacked Romney, and thus allowing them to get away with all the grossly unfair attacks, distortions, and outright lies that B.O.’s campaign (and its super-PAC surrogates and allies in the news media) committed during the campaign.  

    Although no single factor fully explains why B.O. was reelected, I’ve become more and more convinced that the critical factor – the single most important reason explaining the result of the 2012 election, if one were forced to identify just one factor – might very well be the overall low voter turnout.  In other words, the best explanation for the results of the 2012 presidential election that I’ve heard is that offered by Michael Barone (above), although not precisely for the same reasons suggested by Barone (the edge the Democrats had over Republicans in their “get-out-the-vote” effort in critical battleground states). 

    B.O.’s reelection in 2012, like his election in 2008, also has been described by political pundits as “historic.”  Why?  There are as many possible explanations as there are pundits, but the real reason why his reelection in 2012 was historic – the reason why it was truly historic – is that no president in U.S. history has ever won a second term with fewer votes than he got the first time around.  As John Merline has reported in Investor’s Business Daily, B.O. won reelection “despite losing millions of supporters.”  Turnout among eligible voters in 2012 was just 57.5%, according to a report released soon after Election Day by the Center for the Study of the American Electorate.  That was well below the 62.3% turnout in 2008 and the 60.4% turnout in 2004.  “That translates into about 5 million lost votes, some estimate.”  B.O. won with some 8 million fewer votes than he got in 2008.  By comparison, George W. Bush won reelection in 2004 with 11.6 million more votes than he got in 2000; Ronald Reagan won reelection in 1984 with 10.6 million more votes than he got in 1980; Richard Nixon in 1972, with 15.4 million more than in 1968.  Bill Clinton added 681,000 in his reelection bid in 1996.  “Even some one-term presidents boosted their vote counts the second time around, despite losing their reelection bids” (“Did 5 Million Non-Voters Win It for President?” November 11).

    (Who were these non-voters?  According to the I.B.D. article, based on an IBD analysis of exit polls and a projected turnout number provided by Real Clear Politics, most of them were so-called moderates, or independents: “it appears that while Obama turned out his base, many moderates decided to sit this one out.”  The article also cites a report by Real Clear Politics senior elections analyst Sean Trende, which found that B.O.’s win “had less to do with demographic shifts and more to do with the fact that white voters stayed home in droves.”  Trende suggests it was B.O.’s relentlessly negative ad campaign against Romney that, while it failed to win back many of B.O.’s supporters from 2008, “may have turned them off to the Republican nominee as well.”) 

    Because so many Americans on both sides, for whatever reason, chose not to vote in the 2012 election, it is impossible to say with any real confidence why B.O. won reelection, or why (on the other hand) Republican candidates were successful in maintaining their majority in the U.S. House and in maintaining (and in many cases even expanding) their control of so many statehouses.  The salient facts are that those millions of Americans who did vote are, overall, about equally divided in their allegiance to the two major parties that represent both sides in the modern American political divide – and that the views of many millions more, who (for whatever reason) did not vote, have yet to be determined.  We simply do not know what (if anything) is going on inside the heads of a majority of Americans – let alone know what values they may hold inside their “hearts.”

    It’s understandable, given B.O.’s disastrous record as president, that so many libertarians and conservatives are depressed about the results of the 2012 election.  It’s tempting to believe that “we” are outnumbered (“we,” in this case, being Americans who still “believe in America” – in limited, constitutional government, in capitalism and the free-market economy, and in individual freedom and responsibility) – that the “makers,” or producers, have lost because the “takers,” or looters, have seized control – that Romney’s “47 percent” are now in the majority (the “tyranny of the majority” that James Madison and Alexis de Tocqueville so feared).  But with the election as close as it was, resulting in a continuation of “divided government,” we simply do not know. 

    Thus, there’s still room for hope – in the 2014 Congressional elections and in the 2016 general elections – for the Republican Party, if it can continue to rediscover and maintain its fundamental limited-government principles and if Republicans can remain united in opposition to the dangerous policies being pushed by B.O. and the Democrats, for the next four years.  Those are two big ifs on which the prospects for liberty in 2013 – and in the years beyond – depend.



    This Year’s Theme, Once Again: “The Tyranny of Bullshit”


    In a previous New Year’s essay (“2009: Prospects for Liberty”), I discussed what I called “the tyranny of bullshit.”  My dictionary defines bullshit as “nonsense, lies, or exaggeration.”  And it’s precisely that – a mass of nonsense, lies, and exaggerations, which scholars might call either myths or legends, but which ordinary people call bullshit – on which the 20th-century regulatory/welfare state has been founded.  Moreover, it’s also bullshit on which the arguments for expanding the welfare state (as spouted mostly by Democrat politicians today) rest.  I call it “the tyranny of bullshit” because it’s a peculiar form of tyranny, one based on the gullibility of people who vote for the politicians who are taking away their liberties because they’re misled by the bullshit rationalizations the politicians espouse.  (Recall again Ayn Rand’s definition of politicians, which is especially apt in this context: “men whose sole qualification to rule me was their capacity to spout the fraudulent generalizations that got them elected to the privilege of enforcing their wishes at the point of a gun.”)

    In his four years as president, B.O. has proven himself to be the leading purveyor of bullshit in the United States, the bullshit-artist-in-chief, the chief executive of a regime based on bullshit.  As I observed in 2009, 

                “B.O. got his party’s nomination and was elected president because of his campaign – which I have likened to the Hans Christian Andersen fairy tale of ‘The Emperor’s New Clothes’ (see my Oct. 16 [2008] essay ‘The Emperor Is Naked!’).  It was a campaign based almost entirely on bullshit:  the image portrayed of B.O. as a ‘uniter’ (when in fact he employs the standard Democrat demagoguery based on race and class divisiveness), his image as a ‘centrist’ (when in fact, he’s one of the most left-wing politicians ever elected president) – all this is bullshit.


                “The biggest load of bullshit associated with B.O. and his campaign for the presidency, however, is the notion that he’ll bring ‘change’ to Washington.  The constant references to ‘Change,’ or ‘Change You Can Trust,’ etcetera, are nothing but pure bullshit because all B.O.’s administration will bring to Washington is more of the same – more of the same old, tired semi-socialist, paternalist policies that the federal government has been tinkering with, under both Democrat and Republican administrations, for the past century or so, since the beginning of the 20th-century regulatory/welfare state.  All B.O. really promises