MayerBlog: The Web Log of
David N. Mayer

 

2006: Prospects for Liberty - January 4, 2006

 


 

  

            I begin the New Year with my traditional annual essay on “The prospects for liberty.”  As in the past two years, 2006 offers mixed prospects for those of us who are radical individualists.

            (“Radical individualist” is how I prefer to identify myself, and all other persons who are like-minded, politically speaking.  The term embraces libertarians, both “small-l” adherents to the modern libertarian political movement and “capital-L” members of the Libertarian Party; Objectivists, Randians, and neo- or post-Randians; classical liberals – that is, liberals in the true, classical sense of the term; limited-government conservatives; Goldwater Republicans; independents who hold liberty – the freedom of the individual, in thought and action – as their highest political value; and all others who support the principle described by English philosopher John Stuart Mill in his 1859 essay On Liberty as “the sovereignty of the individual”:

“The sole end for which mankind are warranted, individually or collectively, in interfering with the liberty of action of any of their number, is . . . to prevent harm to others.  His own good, either physical or moral, is not a sufficient reason. . . . The only part of the conduct of any one, for which he is amenable to society, is that which concerns others.  In the part which merely concerns himself, his independence is, of right, absolute.  Over himself, over his own body and mind, the individual is sovereign.” 

 

Anyone who supports this principle, fully and consistently, with respect to all aspects of liberty, is a radical individualist; anyone who does not, to the extent of the difference, is a collectivist.) 

            Although I remain optimistic in believing in the eventual triumph of liberty – that is, in the eventual full flourishing of human nature – my optimism is tempered by the realization that there’s a long, hard battle to fight.  The coming year brings great challenges, on virtually all fronts.    

 

 

■    The Welfare-State Mindset

            The single greatest threat to liberty in the United States and in the Western world continues to be, as it has been for the past century or more, what I call the “welfare-state mindset”:  the set of assumptions (all based on premises that are demonstrably false) that helps sustain political support for the “welfare state” and regulatory state that arose in the early 20th century.    

            (Among the false premises:  what I call the “myth of complexity,” the belief that as society becomes more complex, more laws are needed to protect individuals from harm; and the belief that supposed “market failure” justifies government intervention in order to give individuals “security” and to foster the “common good.”  As classical liberal and libertarian scholars have persuasively shown, the myth of complexity turns truth on its head – for, as society becomes more complex, the free market expands opportunities for individuals to exercise their freedom, and the need to use the coercive power of law to protect their rights therefore lessens.  As society becomes more advanced, it needs fewer – not more – laws, exactly the opposite of what has happened in the past century.  Moreover, as these scholars also have persuasively shown, the notion of “market failure” also is mythical, for it is always bad government policies (which distort market processes), and not any “failure” in the free market system, that account for problems like the Great Depression, the “energy crisis” of the 1970s, or the “health care crisis” of the 1990s.  Subordination of individual rights to some supposed “common good” – in other words, rejection of Mill’s principle of “the sovereignty of the individual” – has been the philosophical root of the 20th-century regulatory/welfare state.  This basic assumption, too, is derived from false premises:  there is no such thing as the “common good,” or social welfare, apart from the sum total of the various interests of the individuals who compose society.  And, in a free society, some “insecurity” is inevitable, as the price we pay for our liberty; those who believe otherwise are ignorant of the truth stated in Benjamin Franklin’s famous aphorism, “They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.”
The truth is that capitalism – the free market system – has yet to be tried, fully, in the United States or anywhere else in the world.  On the other hand, socialism has proven to be a failure, in all respects, everywhere it has been tried – including the semi-socialism of “mixed economies” like those of the USA.) 

            The growth of government, especially at the federal level, has been truly ominous:  before 1930, government spending at all levels (federal, state, and local combined) was about 10% of gross national product, with two-thirds of it at the state and local levels; by the 1990s, combined government spending was about 43% of GNP, with two-thirds of it at the federal level.  And an increasingly large share of money spent at the federal level is consumed by so-called “entitlement” programs – direct spending to individuals – that is, money earned by some Americans, forcibly taken from them according to the tax laws passed by Congress and then redistributed to other Americans, the “beneficiaries” of other laws passed by Congress.  (I refer to them as “so-called ‘entitlement’ programs” because recipients are not truly “entitled” to their benefits, in either the moral or legal sense:  they did not earn them but rather receive them entirely at the whim of Congress, which is determined solely by political considerations.)  The three largest of these “entitlement” programs – Social Security, Medicare, and Medicaid – cost $1 trillion, nearly one-half of the $2.2 trillion that was the total amount of federal government spending in the 2004 budget year.  And everyone predicts that the costs of these programs will rise astronomically in future years.

            The costs of the federal welfare state go far beyond these dollar figures, however.  As horrible as these economic costs are – the drain the welfare state puts on the nation’s capital – the real price we pay for it is its legal, political, and moral costs.  What are those costs?  The welfare state has all but destroyed the effectiveness of the U.S. Constitution in limiting the powers of government, for it has increased the powers of the federal government far beyond those legitimate powers enumerated in the text of the Constitution.  This disregard for the fundamental purpose of the Constitution has been sanctioned by the courts – meaning that judges have joined the politicians in the other two branches of government in violating their oaths to support and defend the Constitution.  Politically, it has meant that the American people have ceased being citizen-sovereigns and instead become clients of the welfare state; we are a nation of special-interest groups, each (like a pig at the feeding trough) competing with others for an increased share of the money forcibly taken from U.S. taxpayers and doled out to the interest groups by the Congress.  An especially telling sign of how power has become concentrated in Washington, D.C. as a result of the federal welfare state is that even state and local governments have become just another one of the lobbying groups in the nation’s capital.

            Most ominously, the welfare state has undermined morality, eroding the fundamental moral values on which the well-being, prosperity, and happiness of Americans depend.  What are those fundamental values?  Individual freedom and responsibility:  the right and duty of each individual to be in charge of his or her own life, reaping the rewards of one’s wise choices and paying the price for one’s bad choices – in a word, self-responsibility.  And the virtues on which self-responsibility depends:  among them, ambition, confidence, discipline, frugality, hard work, patience, pride, productivity, prudence, and rationality.  The creation of the “cradle to grave,” or “womb to tomb,” welfare state – a government that promises to provide “security” for its citizens, “protecting” them from all the vicissitudes of life, even those resulting from their own folly or irresponsibility – has meant that rather than acting like self-responsible adults, Americans act more like children, helplessly dependent on the “nanny state.”  And no wonder:  if the policies of the government are based on the assumption that all persons are children, they will behave like children.  And if they are punished for their virtues – being made to feel guilty for their “selfishness” – and if they are rewarded for their vices – given more attention the more they cry for help – like children, they will respond accordingly, becoming profligate, lazy, needy, etc.  (For more on the moral costs of the welfare state, see David Kelley’s excellent book A Life of One’s Own: Individual Rights and the Welfare State, published in 1998 by the Cato Institute.) 

            For one particularly striking example of how the welfare state has de-humanized Americans – turning us from a nation of self-reliant individuals into a nation of sheep needing to be herded – consider what happened on September 11, 2001.  The United States was vulnerable to the 9/11 attacks because the policies of our government – meaning the policies enacted into law by our elected representatives, in response to their constituents’ demands – have made Americans utterly dependent on government for their own safety and therefore unable to defend themselves, particularly against such unexpected threats as suicide bombers using hijacked airplanes as their bombs.  It was the “nanny state” that made American airline passengers – on three of the four planes hijacked by Islamist terrorists on 9/11 – helpless against thugs armed only with box cutters.  It’s also a telling illustration of how the “nanny state” mindset has distorted our understanding of reality that most policymakers consider the cause of our vulnerability to the 9/11 attacks to have been the “security breach” that allowed the terrorists to board the planes carrying box cutters – and not the inability of the other airline passengers (passengers who had been disarmed by government “security” regulations) to defend themselves against those thugs.  It’s equally telling that our policies in reaction to 9/11 generally have sought not to make Americans more capable of defending themselves against terrorists, but rather to make us even more dependent on government for our security.

            Welfare-state programs are but one part of the modern “nanny state”; no less egregious – and just as costly in terms of economics, law, politics, and morality – is the other part, which I’ll call the “regulatory state”:  the vast network of laws and regulations, often administered by the so-called “fourth branch of government,” regulatory or administrative agencies, at both the federal and state levels, which affect virtually every aspect of the average American’s life.  Government regulators have dictated, among other things:  the wages and hours at which individuals are able to work (minimum-wage and maximum-hours laws); those employees whom employers can and cannot hire or fire (anti-discrimination and other labor laws); the environment of business establishments, including the personal behavior of employees and/or customers (OSHA regulations, anti-sexual harassment laws, and smoking bans); how much money persons can contribute to political campaigns (campaign-finance laws); whether or not property owners can remodel their homes or businesses (zoning and other land-use regulations); the design of persons’ automobiles (CAFE, or fuel-economy, standards and other mandates) and many other consumer products (including toilets that do not flush properly, washing machines that load from the front rather than the top, and door handles instead of door knobs).  Persons do not even own their own bodies:  it is illegal for them to ingest various substances, ranging from alcohol (for adults under 21), marijuana (even for medicinal use), and narcotic drugs, to all sorts of medications not approved by government regulators; it’s also illegal, except in a few counties of Nevada, for persons to engage in prostitution.  
In most states, notwithstanding the constitutionally guaranteed right “to keep and bear arms,” individuals are not free to possess or carry guns for their self-defense unless they comply with various government regulations (which range from registration to licensing to outright bans, in many places).  Persons who are not heterosexual are denied the right to marry a partner of their choice.  And in most states, government itself monopolizes gambling (except for government-sanctioned casinos, only state lotteries can legally profit from gaming operations); and, through licensing laws and regulations, government restricts entry into many professions, shielding established firms from competition by depriving newcomers of their freedom to compete.

            Writing about one aspect of this phenomenon – the taxicab monopoly that exists in most major American cities – George F. Will observed in a 1993 op-ed, still relevant today, “The world is divided into two kinds of people:  those who want to prosper by competing, and those who want to prosper by getting government to cripple their competitors” – in other words, “genuine entrepreneurs” and “those whose entrepreneurship consists of turning government into a dispenser of privilege and injustice.”  Will’s comment underscores how government regulations, as much as welfare-state “entitlements,” have undermined morality.  Just as welfare-state programs penalize the virtuous for their virtues and reward the profligate for their vices (as discussed above), government regulations – including, ironically, the antitrust laws and other laws supposedly designed to foster competition – also turn ethical principles on their heads.  As Will concluded in that op-ed, “activist, regulating, subsidizing government is generally a servant of the strong and entrenched against the weak and aspiring.”

            By interfering with the freedom of the marketplace – and thus distorting the market and undermining, if not destroying, its ability to adjust to new conditions – the 20th-century regulatory regime has severely undermined American capitalism, the source of our prosperity and freedom.   The real value of a free market is that it coordinates the choices of millions and millions of individuals, each pursuing his or her own interests by voluntarily entering into agreements with others for their mutual benefit – and thus does a free market help maximize individual freedom of choice.  A free society is one in which its individual members truly have freedom to choose, in all aspects of their lives.  Government – because of its inherent nature, its power legitimately to use force or compulsion to achieve its ends – always distorts the market by limiting freedom of choice.  The most dangerous phrase in the English language today is “There ought to be a law . . .” – for that phrase always prefaces rationalizations of attempts by some people to interfere with the free marketplace and instead accomplish their goals by using the coercive power of law.  In other words, attempts by some people to use governmental power to force their view of what’s “good” for society on the rest of us. 

            Abortion-rights activists like to talk about “a woman’s right to choose” – meaning only a pregnant woman’s right to terminate her pregnancy by destroying the life of an unborn child – which, of course, is a peculiarly limited notion of “choice.”  Most of the people who talk so passionately about “freedom to choose” in this narrow context are, apparently, oblivious to the almost-infinite number of ways in which the modern regulatory/welfare state deprives all individuals, male and female, of their freedom of choice:  for example, the freedom of any individual to work for less than the government-established minimum wage, to donate money to a political campaign above the government-established maximum, to smoke a marijuana cigarette anywhere, to smoke tobacco in places prohibited by government edict, to hire a prostitute (or to sell oneself as a prostitute), to utter on the airwaves a word that FCC regulators deem “indecent,” to disconnect the air bag on one’s car, to drive a car without buckling the seatbelt or to ride a motorcycle without wearing a helmet, to carry a concealed gun for self-defense (unless one has a permit from the government to do so), to open a beauty shop and braid customers’ hair (unless one has a full cosmetology license, in many states), to use a life-saving drug or medical device that hasn’t been approved by the FDA, to divert the 12% of one’s salary consumed by Social Security taxes and to invest it as one chooses, or to marry a partner of one’s choice if that person happens to be of the same sex.  (It’s truly shameful that the United States, the nation founded on premises of individual rights, is beginning to lag behind the rest of the world on this issue.
We deny this basic right to our homosexual citizens while Canada and many European countries have begun to recognize it – Belgium, Canada, Holland, and Spain have expanded the definition of marriage to include same-sex couples; Britain, France, Germany, and Switzerland have legalized same-sex civil unions.  The stubbornness with which the majority of the American people cling to the traditional definition of marriage, as the union of a man and a woman – a definition that limits it, of course, to monogamous heterosexual couples – can be explained only partly by homophobia; it really exemplifies the unfortunate predilection of Americans, whose minds have been warped by the welfare-state mindset, to “legislate morality,” whether it’s the moral code of the “religious right” or the socialist left.)

            Sadly, the prospects for reversing the trend of the past century – the trend towards an ever-bigger, more expensive and pervasive regulatory/welfare state – are not good for the coming year, or even for the coming several years.  For radical individualists, the greatest challenge will be to fight against this trend, on all fronts:  to oppose all attempts to erode even further an individual’s freedom of choice, in all ways.  To do so successfully, we need to fight against the “nanny state” mentality that treats all persons like children; we need to insist that persons be treated instead like the free, self-responsible adults that they are.  (And we should leave the care of those who are truly children to the only individuals who are truly responsible for them, their parents.)  Anyone who argues “there ought to be a law” to do X – unless it’s the repeal of some existing program or regulation – ought to be challenged to explain why it’s absolutely necessary to use force to accomplish whatever they claim the law will accomplish.  Our goal should be to make the free operation of the marketplace what it used to be in America:  not the exception, but the rule.   

  

 

■    The Disappointing Bush Presidency

            A year ago there was reason to be optimistic about the reelection of President Bush – reason to think that a second Bush term would be more than simply the lesser of two evils, compared to the alternative (a Kerry-Edwards administration [ugh!]), and would in fact advance liberty on many fronts.  Yet the presidency of George W. Bush, about to enter its sixth year, has proven to be a major disappointment.  The Bush administration has only one positive achievement:  the federal tax cuts, which have yet to be made permanent and to be deepened, as they need to be, to help counter the massive injustice of federal taxation.  (That injustice comes from reliance on income taxes that put the majority of the burden of funding the federal government on a small minority of Americans, the most productive citizens, in effect penalizing them for their ability and rewarding non-productive Americans who freeload off the system.  For more on this, see my 2001 essay, “Wealthy Americans Deserve Real Tax Relief.”)  

            Mr. Bush has failed to accomplish his greatest promise:  reform of the horrible Social Security program, which he proposed quite modestly to begin privatizing – the one truly good idea Mr. Bush had, which justified his election in 2000 and his reelection in 2004.  (On the need for real Social Security reform, see my February 15, 2005 essay on “Socialist Insecurity.”)  Lacking support from his own political party – because weak-kneed Republicans, thinking too much like Democrats, are afraid of privatization (see “Demopublican/Replicrat Monopoly,” below) – Mr. Bush stopped providing leadership on this issue and apparently has abandoned any plans to meaningfully reform Social Security – and thus has destroyed a golden opportunity to leave the presidency with a positive legacy.

            Rather, Mr. Bush apparently has risked all on a huge gamble – U.S. military intervention in Iraq – that’s not likely to pay off (although for reasons different from those imagined by his Democratic critics and many of my libertarian friends, as discussed in the next section, below).   A handful of other positive legacies of his presidency thus far (including a Justice Department that at least gives lip service to the importance of Second Amendment rights and several wise judicial appointments, including that of John Roberts as chief justice) pale in comparison with the negative record of the Bush administration.  That negative record includes:  the “No Child Left Behind” law (which exacerbates the federal government’s unconstitutional control over education in the United States); the McCain-Feingold campaign finance regulatory law (which further unconstitutionally abridges freedom of political speech); the Medicare prescription-drug program (the greatest expansion of the welfare state since Nixon’s presidency); the USA PATRIOT Act (which abridges constitutional liberties needlessly in the name of fighting terrorism); the expansion of federal bureaucracy through creation of the Department of Homeland Security and the TSA (which has all but ruined air travel in the U.S.); the mushrooming of federal spending, including such outrageous pork-laden legislation as the “farm bill” of Mr. 
Bush’s first term, the “energy bill” of the past year, and the planned federal bailout of the costs of rebuilding New Orleans; the continuation of many bad policies of the Clinton administration, such as those of the Clinton “Injustice Department” (particularly the Department’s antitrust division) and domestic surveillance by the NSA (the Clinton administration’s “Echelon” project, which served as the foundation for the current administration’s recently-exposed eavesdropping program); the unconstitutional Congressional attempt to intervene in the Terri Schiavo case in Florida courts; and Mr. Bush’s disastrous choice of Harriet Miers, an incompetent crony, as his first nominee to succeed Sandra Day O’Connor as Supreme Court justice. 

            It’s a truly disappointing record.  Thankfully, Mr. Bush is not eligible for reelection.  Unfortunately, the prospects for 2006 (and the following two years) are not good:  Bush will be viewed, more and more, as a “lame duck,” and the partisan wrangling over the policies of his administration will shape the federal elections of 2006 and 2008, eventually ushering in a new “Demopublican/Replicrat” administration which (as discussed below) is likely to be even worse than Bush’s. 

 

 

■    The Islamic Threat

Perhaps the most visible or obvious threat to liberty on a global scale is Islamist terrorism, and behind that threat – however un-PC it may be to say so – is the Islamic religion, a religion that is the enemy of individualism and that preaches violence among its essential tenets.   Author Sam Harris convincingly makes the case for this proposition in his insightful and provocative book, The End of Faith: Religion, Terror, and the Future of Reason (2004).  As Harris shows in Chapter 4, “The Problem with Islam,” what makes Islam irreconcilable with Western liberalism is the “irrescindable militancy” of Islam, which is “undeniably a religion of conquest.”  Islam today is where Christianity was in the 14th century; it has yet to experience anything like the Protestant Reformation – let alone the Enlightenment.  Thus, notes Harris, most Muslims are “‘fundamentalist’ in the Western sense of the word – in that even ‘moderate’ approaches to Islam generally consider the Koran to be the literal and inerrant word of the one true God. . . . The only future devout Muslims can envisage – as Muslims – is one in which all infidels have been converted to Islam, subjugated, or killed.  The tenets of Islam simply do not admit of anything but a temporary sharing of power with the ‘enemies of God.’”  Quoting extensively from the Koran, Harris persuasively makes the case that “[o]n almost every page,” the book “instructs observant Muslims to despise non-believers.”  He maintains that “Islam, more than any other religion human beings have devised, has all the makings of a thoroughgoing cult of death.”

Now, certainly not all Muslims are “Islamists,” those extremists “who believe that Islam must inform every dimension of human existence, including politics and law”; and like most other religions, Islam has suffered a variety of schisms, chiefly the split between the Sunni and the Shia, as well as divisions within each of these sects and even within the ranks of the Islamists.  As Harris notes, this mitigates the threat that Islam poses to the West, but the threat remains given the nature of Islam itself:  “Moderate Islam – really moderate, really critical of Muslim irrationality – scarcely seems to exist.  If it does, it is doing as good a job at hiding as moderate Christianity did in the fourteenth century (and for similar reasons).”  Harris concludes, ominously, that “we should not ignore the fact that we must now confront whole societies whose moral and political development – in their treatment of women and children, in their prosecution of war, in their approach to criminal justice, and in their very intuitions about what constitutes cruelty – lags behind our own.”   

            Notwithstanding these observations, however, it’s also true that, ultimately, Muslims themselves must determine the content of their religion; in other words, that they can make choices as to how to interpret their sacred text and how to apply its teachings to their lives.  As David Kelley observes in the Fall 2005 issue of The New Individualist (journal of The Objectivist Center),  

“As a religion, Islam rests on faith in the existence of a supernatural being and in a story of how his commands were revealed to mankind; its whole superstructure of systemization rests on a foundation of arbitrary belief.  It therefore has no rational way to exclude Islamist fanaticism from the spectrum of belief by proving it false, any more than mainstream Christians can prove that fundamentalists misrepresent Christianity.  By the same token, nothing prevents Muslims from moving in the direction of a more liberal doctrine that is more open to reason, science, progress, individualism, and a secular society.”

 

Kelley also notes that the Islamic principle of jihad (literally, “struggle,” but usually translated into English as “holy war”) need not be understood by Muslims generally the way the Islamist fanatics interpret it – citing not only the different interpretations given jihad by most scholars but also evidence suggesting that leftist ideas imported from the West also have influenced Islamists.  (20th-century totalitarian movements, especially Marxism, have influenced Islamists’ hostility to capitalism, their belief in an Islamic revolution led by a “vanguard,” and their portrayal of poverty and stagnation in Muslim countries as the product of victimization, Kelley notes.  Thus, he characterizes the Islamist ideology as “actually a modern one,” which has “more in common with fundamentalist movements in other religions, and with secular totalitarian ideologies like Marxism, than with any historic school of Islamic thought.”)

            Practically speaking (and on principle), we cannot wage war against the entire Islamic world.  The challenges that Islam poses to individualism and peace must be fought, ultimately, with the only ammunition appropriate to a war of ideas:  reason.  It is only those fanatic Muslims – the Islamist terrorists – who support the use of force or violence to advance their religion who are our enemies.  And of those, the ones against whom we wage war must be those who pose imminent threats to the United States; under our Constitution and in keeping with our nation’s founding principles, we have no legitimate role in functioning as the world’s policeman.  Use of our military resources must be limited to our own defense.

            Whether Iraq will prove to be “the wrong war” has yet to be determined by history; it’s still too soon to characterize U.S. military intervention in that country, to depose the dictatorship of Saddam Hussein, as a “mistake.”  (And it’s certainly hypocritical of Democrats in Congress to try to blame the war on President Bush alone, when he acted with the support of Congress, which passed resolutions abdicating its responsibility to declare war and instead giving the president essentially a blank check to use U.S. military force wherever he determined it expedient.  And at the time of the U.S. invasion of Iraq nearly three years ago, most of the evidence showed that Hussein’s regime, because of its sheer lawlessness, posed at least a strong potential of an imminent threat to the U.S.)  But there is mounting evidence that, even under the most favorable of likely scenarios, U.S. military intervention in Iraq will be no panacea to the problems posed by Islamist terrorism.  That’s largely because U.S. policymakers – not only in the Bush administration, but also in the leadership of Congress – mistakenly regard “democracy” as a solution.  As Iraq’s “democratic” neighbor, Iran, so vividly demonstrates, democracy – which means nothing more than rule by the majority and which therefore has the dangerous tendency of leading to majority tyranny – can be just as conducive to Islamic terrorism as dictatorships can.  What’s really needed in Iraq, to prevent the situation there from degenerating into full-blown civil war between the various religious and ethnic sects of that country, is not “democracy,” but rather a representative system of government with real constitutional limits on the powers of the national government and real checks to prevent its powers from being abused – in other words, a constitutional republic.
Unfortunately, it’s still an open question whether the constitution ratified by a popular vote in Iraq last year will guarantee such a system of government, or whether the government recently elected (in elections whose validity is still questioned by many) will be viable, providing the country the stability it so desperately needs.  (And the stability that’s necessary as a prerequisite for the withdrawal of U.S. troops.) 

            In an insightful essay, “Understanding the Global Crisis,” published in the May-June 2003 issue of The Free Radical, Chris Matthew Sciabarra has written persuasively about the reasons to be wary of any long-term U.S. expansion in the region.   As he has noted, “The lunacy of nation-building and of imposed political settlements – which have been tried over and over again in the Middle East with no long-term success – does not mean that there is no hope for the Arab world.”  Citing evidence suggesting a rising revolt against theocracy, especially among a younger generation of Iranians who “eat American foods, wear American jeans, and watch American TV shows” and thus are fed up with oppressive government, he adds, “I don’t see how a U.S. occupation in any part of the region will nourish this kind of revolt.  If anything, the United States may be perceived as a new colonial administrator.  Such a perception may only give impetus to the theocrats who may seek to preserve their rule by deflecting the dissatisfaction in their midst toward the ‘infidel occupiers.’  I can think of no better ad campaign for the recruitment of future Islamic terrorists.”  Sadly, the story of the U.S. occupation of Iraq seems to have proved Sciabarra’s prediction to be right.

            The United States and the rest of the Western world must use military force, as appropriate, to defend themselves against the threat posed by fanatical Islamists.  Our past policies of appeasement toward Islamic terrorism have proven to be failures, but we should not adopt policies of overreaction that would be failures in the opposite direction.  Of course, we are right to strike back against those who initiate force, and even to strike preemptively or unilaterally against imminent threats to American security, as Chris Sciabarra notes.  Nevertheless, I also find persuasive his argument that “America’s only practical long-term course of action is strategic disengagement from the region,” meaning the entire Middle East.  Like Sciabarra, “in the long term, I stand with those American Founding Fathers who advocated free trade with all, entangling political alliances with none.  If that advice was good for a simpler world, it is even more appropriate for a world of immense complexity, in which no one power can control for all the myriad unintended consequences of human action.  The central planners of socialism learned this lesson some time ago; the central planners of a projected U.S. colonialism have yet to learn it.”

•    The Christian Threat 

Lest readers think that, in his book The End of Faith, he’s critical of irrationality only in the Islamic faith, Sam Harris also takes aim at the dominant religion of the Western world, Christianity.  Of course, compared with the theocracies of the Islamic world, the influence of religion in the West “seems rather benign,” he concedes.  Yet in Chapter 5, “West of Eden,” Harris also argues that “the degree to which religious ideas [and he’s talking chiefly about fundamentalist Christianity] still determine government policies – especially those of the United States – presents a grave danger to everyone.”  In this chapter, Harris reveals much about his own left-liberal political biases, for he confines his criticisms largely to Republican-sponsored policies (such as the restrictions on federal funding for embryonic stem-cell research) and what other commentators demonize as the “Religious Right,” ignoring the equally pernicious influence of the religious left on American and Western politics.  The most persuasive section of this chapter is his discussion of “The War on Sin,” the criminalization, in the U.S. and in much of the rest of the world, of such “victimless crimes” as drug use, prostitution, sodomy, and the viewing of “obscene” materials.  Making such behaviors criminal “is nothing more than a judicial reprise of the Christian notion of sin,” he argues.   

To forbid something simply because it’s “immoral” – and to equate immorality with selfishly pleasurable activities – is to let religious faith, not reason, determine public policy.  Consider, for example, our drug laws and particularly the criminalization of marijuana.  Harris writes persuasively about the irrationality of those laws: 

“The fact that people are being prosecuted and imprisoned for using marijuana, while alcohol remains a staple commodity, is surely the reductio ad absurdum of any notion that our drug laws are designed to keep people from harming themselves or others.  Alcohol is by any measure the more dangerous substance.  It has no approved medical use, and its lethal dose is rather easily achieved.  Its role in causing automobile accidents is beyond dispute.  The manner in which alcohol relieves people of their inhibitions contributes to human violence, personal injury, unplanned pregnancy, and the spread of sexual disease.  Alcohol is well known to be addictive.  When consumed in large quantities over many years, it can lead to devastating neurological impairments, to cirrhosis of the liver, and to death.  . . . None of these charges can be leveled at marijuana.  As a drug, marijuana is nearly unique in having several medical applications and no known lethal dosage. . . . Marijuana kills no one.  Its role as a ‘gateway drug’ now seems less plausible than ever (and it was never plausible).  In fact, nearly everything human beings do – driving cars, flying planes, hitting golf balls – is more dangerous than smoking marijuana in the privacy of one’s own home. . . . And yet, we are so far from the shady groves of reason now that people are still receiving life sentences without the possibility of parole for growing, selling, possessing, or buying what is, in fact, a naturally occurring plant.  Cancer patients and paraplegics have been sentenced to decades in prison for marijuana possession.  Owners of garden-supply stores have received similar sentences because some of their customers were caught growing marijuana.  What explains this astonishing wastage of human life and material resources?  The only explanation is that our discourse on this subject has never been obliged to function within the bounds of rationality.”

He concludes that “because we are a people of faith, taught to concern ourselves with the sinfulness of our neighbors, we have grown tolerant of irrational uses of state power.” 

            Harris is essentially right; if anything, he has understated the pernicious influence that Christian faith – particularly the blind faith of Christian fundamentalism – has had over American public policy.  Thankfully, we have a constitutional separation of church and state in this country.  But that principle of separation has been understood by the courts and by policymakers to preclude government “establishment” of religion only in a rather formal sense; it does not preclude religious faith from influencing public policy, particularly with regard to the use of government “police powers” to enforce religiously based codes of morality.   

Many of the prohibitive laws enforced by the modern regulatory/welfare state, discussed in the section above, rest on this rationale for “legislating morality.”  Why do we criminalize gambling or prostitution?  Why do we forbid same-sex couples from marrying?  Only because some people (perhaps a majority in our society, but certainly not everyone) believe those activities to be “immoral,” and their moral belief is in turn rooted in some version of the Christian faith (in many cases, heavily influenced by the anti-sexuality biases of early Christian theologians like the apostle Paul).  Even if those beliefs were right – and I’m persuaded that they are not, that Christian morality as well as theology is out of touch with the reality of human nature – it would not follow that “sin” ought to be criminalized.  In a truly free society – a society that truly values, and under law truly protects, the full range of individual rights – the mere fact that something is considered “harmful” to oneself, or to that amorphous entity people call “society,” is not a sufficient ground for criminalizing it.  As John Stuart Mill stated in his explication of the self-sovereignty principle, the only legitimate use of political power is “to prevent harm to others”; a person’s “own good, either physical or moral, is not a sufficient reason.” 

Fortunately, Christianity has among its teachings Jesus’ famous maxim, “Render unto Caesar the things that are Caesar’s and unto God the things that are God’s.”  Sincere Christians who follow this principle ought to agree with radical individualists that it’s wrong to use political power to try to force people to be moral.  Indeed, it’s doubly wrong: it not only violates libertarian principles about the legitimate use of government’s power but also – as many conservative commentators have realized – undermines religion and morality themselves, which ought to rely not on political power (i.e., force) but rather on persuasion, on the voluntary actions of individuals, to make people “moral.” 

•    The Radical Environmentalist Threat 

Yet another “religion,” of sorts – because it’s based on irrational faith and propagates itself by playing on persons’ negative emotions, especially fear and guilt – is the radical environmentalist movement.  Activists in this movement, who include animal rights fanatics (like the PETA nuts) and so-called “eco-terrorists” (thugs who use force or violence to accomplish their aims), hold premises that are essentially anti-human:  they worship an imaginary “natural” world that excludes human beings from nature (thereby ignoring the facts that humans are a part of nature and that it’s natural for humans to manipulate their environment); they put the supposed interests of this imaginary world above the real interests of human beings. 

The influence of this “religion” on public policy can be found in, for example, legislation “protecting” endangered species and wetlands – legislation which violates the rights of property owners and which, by distorting market forces, is actually counterproductive.  And the perniciousness of its influence on American policymakers is illustrated by Congress’s continuing failure to pass legislation allowing oil drilling in Alaska’s Arctic National Wildlife Refuge – sad proof of how effective the radical environmentalist lobby continues to be in thwarting common-sense policies, and liberty itself. 

•    The Demopublican/Replicrat Monopoly 

As I wrote in my November 17 entry, “Demopublicans and Replicrats,” there’s no essential difference between Democrats and Republicans, between left-liberals and conservatives, in the United States today.  Both major American political parties, which together hold a monopoly on public offices in the USA, adhere to the policies of the regulatory/welfare state and differ only in the form of collectivism they embrace; both major parties are anathema to radical individualists, those of us who truly value individual rights. 

The prospects for breaking the Demopublican/Replicrat monopoly are not very good, in the short run.  As I discussed in my Nov. 17 entry, neither of the major political parties in America today is a welcome place for those of us who believe in freedom and in limited government.   Absent a major realignment of the parties, that’s a sad reality that’s unlikely to change. 

For us, the 2006 midterm Congressional elections will be essentially irrelevant:  whether Republicans retain their nominal control of both houses of Congress or Democrats gain more seats (or perhaps even regain control of one or both houses), the legislation likely to emerge from the next Congress will be more of the same old, to put it bluntly, statist shit.  Consider one member of the U.S. Senate who’s up for reelection this year, Mike DeWine of Ohio.  DeWine is one of the RINOs (the “Republicans in Name Only”) who, unfortunately, constitute the ideological center of the Demopublican/Replicrat monopoly.  Although nominally a Republican, he has voted, for example, against opening ANWR to oil drilling and against cutting Medicaid and federal student-loan programs.  If DeWine is replaced by a Democrat, it will mean little change in how the occupant of that Senate seat votes. 

Looking ahead to the 2008 presidential election, if the pundits are right in their prognostications, the Demopublican/Replicrat race will be between Hillary Clinton and John McCain – in other words, a power-hungry bitch versus a megalomaniacal bastard, each of them equally an enemy of liberty.  Such a contest, too, will be irrelevant to individualists, one that will arouse little emotion other than nausea. 

Unless and until the Libertarian Party stops acting like a philosophical club and instead begins acting like a real political party – opening up its membership to individualists of all types, including limited-government conservatives – and thus helps pave the way for the much-needed realignment of American political parties, partisan politics is not worth our time, resources, or energy.

•    The Collectivist Bias of Intellectual “Elites” 

American “intellectuals” – those who have the conceit of “expertise” on public-policy questions and who, for better or worse, actually do help shape the dominant opinions in society – unfortunately continue to be, as they’ve been for the past 70 or so years, largely collectivist.  Overwhelmingly, intellectuals in the most influential professions (law, journalism, higher education, science, and the arts) have a left-wing collectivist political orientation.  In recent years – thanks largely to the challenges that the “new media” (the Internet, talk radio, cable and satellite channels) have posed to the “old media” of TV news and newspapers – the left-wing monopoly over media has been challenged; but the most effective challengers (for example, Rush Limbaugh on radio and various commentators on the Fox News Channel) are also collectivists, albeit on the “right,” or conservative, side.  Rather than really challenging the anti-individualist premises behind left-liberal politics, they simply offer a competing version of collectivism that is just as anti-individualistic.   

            As I noted last year, however, this is the one area where I foresee real progress in the coming year – and, ultimately, hope for progress in the other areas.  That’s because the monopoly of the “old media” is being broken, and the “new media” are beginning to offer some real diversity of views, including those of radical individualists.  The Internet, and particularly the “blogosphere,” has been especially effective not only as a watchdog over Big Media (as, for example, with the bloggers who exposed the forged documents on which CBS News relied in its 2004 60 Minutes story on President Bush’s service in the Texas Air National Guard) but, even more importantly, in giving voice to viewpoints outside the collectivist “mainstream.”  From such genuine diversity, the “marketplace of ideas” can do wonders in advancing freedom. 

 | Link to this Entry | Posted Wednesday, January 4, 2006 | Copyright © David N. Mayer