The Kennedy-Reagan Truth vs. the Obama Delusion

In his book The New Reagan Revolution, Michael Reagan examined six great economic crossroads of the 20th and 21st centuries. These six critical junctures in the history of the United States serve as economic laboratories to test two contrasting economic theories. One theory consistently produced economic expansion and sustained growth. The other theory invariably produced failure and misery. Here are Michael Reagan’s findings:

1. The “Forgotten Depression” of January 1920. During the last year of Woodrow Wilson’s presidency, the economy nosedived. GNP fell 17 percent; unemployment soared from 4 to almost 12 percent. This was the “Forgotten Depression” of 1920. Wilson’s successor, Warren G. Harding, came into office and immediately cut tax rates for all income brackets, slashed federal spending, and balanced the budget. Long before the world ever heard of Ronald Reagan, Harding practiced “Reaganomics.”

“President Harding applied the principles of Reaganomics,” Michael Reagan observed, “even though Ronald Reagan was at that time a nine-year-old boy living in Dixon, Illinois. Harding was not following an economic theory. He was following common sense. He treated the federal budget as you would treat the family budget: When times are tough, cut spending and stay out of debt. Harding also treated his fellow citizens with commonsense compassion: If folks are going through tough times, government should ease their burden and cut their taxes.”

The Harding recovery was astonishingly rapid, beginning just half a year into his presidency. Unemployment fell to 6.7 percent by 1922, and to 2.4 percent by 1923. Harding’s successor, Calvin Coolidge, maintained Harding’s program of low tax rates, balanced budgets, and limited government. The Harding-Coolidge era of prosperity became known as “the Roaring Twenties”—a time of soaring prosperity, stable prices, and boundless optimism.

Obvious conclusion based on the evidence: Reaganomics works.

2. The Great Depression. Coolidge was succeeded by Herbert Hoover. In the eighth month of Hoover’s presidency, the stock market crashed—the infamous Crash of 1929. Many factors led to the Great Depression, but the Crash was the precipitating event. Hoover had failed to learn the lessons of the Harding-Coolidge years, so he responded by raising taxes (hiking the top marginal rate from 25 to 63 percent), imposing protectionism (the Smoot-Hawley Tariff Act), and boosting government spending by 47 percent, driving America deep into debt. Hoover’s actions worsened the Depression. A defeated Herbert Hoover bequeathed a ruined economy to Franklin Delano Roosevelt.

FDR took office at a time when 25 percent of the nation’s workforce was unemployed. He, too, ignored the lessons of the “Forgotten Depression,” and doubled down on Hoover’s failed tax-and-spend policies, applying the economic theory known as Keynesianism (after British economist John Maynard Keynes). The Keynes-FDR approach involved deficit spending, soak-the-rich tax policies, and big-government make-work programs (the New Deal). FDR and a compliant Congress hiked personal and corporate income tax rates, estate taxes, and excise taxes.

Michael Reagan wrote, “From 1937 to 1939, the stock market lost almost half its value, car sales fell by one-third, and business failures increased by one-half. From 1932 to 1939, the U.S. racked up more debt than in all the preceding 150 years of America’s existence. By early 1939, as the Great Depression was in its tenth year, unemployment again climbed past the 20 percent mark.”

Many Americans credit FDR with “getting America through the Depression.” In reality, FDR’s policies prolonged the Depression. In a time of catastrophic unemployment, Roosevelt made it prohibitively expensive to hire people, making a terrible human tragedy even worse. While thousands of U.S. banks failed under FDR’s policies, across the border in Canada, not one bank failed—because Canadian banks were not hamstrung by FDR’s foolish over-regulation. In FDR’s Folly, historian Jim Powell raises disturbing questions about the FDR legacy:

Why did New Dealers make it more expensive for employers to hire people? Why did FDR’s Justice Department file some 150 lawsuits threatening big employers? Why did New Deal policies discourage private investment without which private employment was unlikely to revive? Why so many policies to push up the cost of living? Why did New Dealers destroy food while people went hungry? To what extent did New Deal labor laws penalize blacks? Why did New Dealers break up the strongest banks? . . . Why didn’t New Deal public works projects bring about a recovery? Why was so much New Deal relief spending channeled away from the poorest people?

In May 1939, a demoralized and defeated Henry Morgenthau, FDR’s treasury secretary, told the House Ways and Means Committee, “We are spending more than we have ever spent before and it does not work. . . . I want to see people get a job. I want to see people get enough to eat. We have never made good on our promises. . . . After eight years of this administration we have just as much unemployment as when we started. . . . And an enormous debt to boot!”

Many people mistakenly believe that World War II lifted America out of the Great Depression. Not true. What WWII did was take 12 million men out of the workforce and send them into war, which ended unemployment. But all the other signs of a damaged economy remained during the war: low stock prices, depressed private investment, and depressed consumer demand.

Roosevelt and his successor, Harry Truman, planned to impose an even bigger Second New Deal after the war. Fortunately, Congress refused, and chose instead to cut taxes and cut spending—the same commonsense “Reaganomics” approach that had produced prosperity during the 1920s. The result: a post-war economic boom from the late 1940s through the 1950s. Had FDR and Truman gotten their way, the country would have slipped right back into recession, if not a second Great Depression.

Obvious conclusion based on the evidence: Keynesianism fails, prolonging economic hardship and misery, while Reaganomics works again.

3. The Recession of 1960 and 1961. When John F. Kennedy came into office, he faced a jobless figure of 7.1 percent. Wanting the economy to keep up with the growing workforce, JFK addressed the Economic Club of New York in December 1962 and proposed a bold notion: “It is a paradoxical truth that tax rates are too high today and tax revenues are too low and the soundest way to raise the revenues in the long run is to cut the rates now. . . . The purpose of cutting taxes now is not to incur a budget deficit, but to achieve the more prosperous, expanding economy which can bring a budget surplus.”

Those are the words of John F. Kennedy—and he was preaching Reaganomics. Kennedy was assassinated less than a year later, but his successor, Lyndon Johnson, lobbied hard for the JFK tax cuts, and he signed them into law in 1964. As a result of JFK’s Reaganesque economic plan, the economy expanded by a dramatic 5 percent and personal income increased by 7 percent. Gross national product grew from $628 billion to $672 billion, corporate profits surged by an explosive 21 percent, auto production rose by 22 percent, steel production grew by 6 percent, and unemployment plummeted to 4.2 percent—an eight-year low. The Kennedy-Johnson tax rate cuts produced a sustained economic expansion lasting nearly a decade.

Obvious conclusion based on the evidence: Reaganomics works again.

4. The Recession of the 1970s. This recession began in November 1973 under Nixon and ended (technically) in March 1975 under Gerald Ford—a 16-month recession. According to the economists’ graphs and charts, real GDP was on the rise by the spring of 1975, yet unemployment and inflation remained painfully high throughout the rest of the 1970s. Americans continued to suffer joblessness amid spiraling prices after the recession officially ended.

In 1976, Ronald Reagan narrowly lost the primary race against Gerald Ford. Reagan was convinced that he knew how to solve the long and painful recession of the 1970s, but he was forced to watch from the sidelines as Gerald Ford and Jimmy Carter—two befuddled, clueless Keynesians!—battled each other for the White House. On October 8, 1976, at the height of the presidential race between Carter and Ford, Reagan outlined the principles of Reaganomics in a syndicated newspaper column entitled “Tax Cuts and Increased Revenue.” He wrote:

Warren Harding did it. John Kennedy did it. But Jimmy Carter and President Ford aren’t talking about it. The ‘it’ that Harding and Kennedy had in common was to cut the income tax. In both cases, federal revenues went up instead of down. . . . Since the idea worked under both Democratic and Republican administrations before, who’s to say it couldn’t work again?

Reagan had majored in economics at Eureka College and had spent years studying the great free market economists such as Adam Smith (The Wealth of Nations), Friedrich Hayek (The Road to Serfdom), and Milton Friedman (Capitalism and Freedom). While Reagan’s opponents ignorantly wrote him off as an “amiable dunce,” it is clear that Reagan correctly and insightfully diagnosed the ailing economy of the 1970s. Unfortunately, Reagan would have to wait more than four years for the opportunity to put his prescription into practice.

Obvious conclusion based on the evidence: Keynesianism fails again.

5. The Jimmy Carter Stagflation Recession of 1980. After Jimmy Carter was inaugurated in January 1977, he inflicted the failed FDR-style Keynesian approach on the country—an approach which says the federal government can spend its way to prosperity. The result of Carter’s policies was an economic disaster called “stagflation”—slow economic growth coupled with the misery of rampant inflation and high unemployment.

By the 1980 election, America under Carter was in a full-blown recession. The American people had suffered years of double-digit interest rates, double-digit inflation, and double-digit unemployment, plus blocks-long lines at the gas station. Ronald Reagan defeated Carter in a landslide. Newsweek observed: “When Ronald Reagan steps into the White House . . . he will inherit the most dangerous economic crisis since Franklin Roosevelt took office 48 years ago.”

Reagan moved confidently and quickly to slash tax rates and domestic spending. Under his leadership, the top marginal tax rate dropped from 70 percent to 28 percent. Michael Reagan described the results:

Tax cuts generated 4 million jobs in 1983 alone and 16 million jobs over the course of Ronald Reagan’s presidency. Unemployment among African-Americans dropped dramatically, from 19.5 percent in 1983 to 11.4 percent in 1989. . . .

The inflation rate fell from 13.5 percent in 1980 . . . to 3.2 percent in 1983. . . .

The Reagan tax cuts nearly doubled federal revenue. After his 25 percent across-the-board tax rate cuts went into effect, receipts from both individual and corporate income taxes rose dramatically. According to the White House Office of Management and Budget, revenue from individual income taxes went from $244.1 billion in 1980 to $445.7 billion in 1989, an increase of over 82 percent. Revenue from corporate income taxes went from $64.6 billion to $103.3 billion, a 60 percent jump.

This was the fulfillment of the “paradoxical truth” that John F. Kennedy spoke of in his 1962 speech: “Cutting taxes now . . . can bring a budget surplus.” Both JFK and Ronald Reagan predicted that lower tax rates would generate more revenue. This “paradoxical truth” worked exactly as predicted.
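For the skeptical reader, the percentage increases quoted above follow directly from the OMB dollar figures. Here is a minimal sketch in Python (the dollar figures are the ones cited in the quote; the code itself is merely illustrative):

```python
# Check the percentage increases implied by the OMB receipts figures
# quoted above (individual and corporate income taxes, in $billions).
receipts = {
    "individual income taxes": (244.1, 445.7),  # 1980 -> 1989
    "corporate income taxes": (64.6, 103.3),    # 1980 -> 1989
}

for label, (start, end) in receipts.items():
    print(f"{label}: {(end / start - 1) * 100:.1f}% increase")

# individual income taxes: 82.6% increase
# corporate income taxes: 59.9% increase
```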

At a White House press conference in 1981, President Reagan took reporters to school, explaining that the principles of Reaganomics have been known for centuries. Lower tax rates invariably bring more money into the treasury, he explained, “because of the almost instant stimulant to the economy.” This principle, Reagan added, “goes back at least, I know, as far as the fourteenth century, when a Moslem philosopher named Ibn Khaldun said, ‘In the beginning of the dynasty, great tax revenues were gained from small assessments. At the end of the dynasty, small tax revenues were gained from large assessments.’”

The principles of Reaganomics have been proved true—and Keynesian theory has been exposed as a fraud once more.

6. The Obama Recession. To be fair, what I call “The Obama Recession” actually began under George W. Bush, triggered by the collapse of the housing bubble. I think it’s fair to call it The Obama Recession because, when Barack Obama took office, he threw $814 billion of stimulus money at the recession (plus billions more in corporate bailouts, “Cash for Clunkers,” Solyndra-style green energy boondoggles, and other prime-the-pump schemes). He promised to jump-start the economy and hold unemployment below 8 percent. This was weapons-grade Keynesianism, practiced on a scale never before witnessed in human history. After spending so much money on the “cure,” Obama now owns that recession.

If Keynesian theory works at all, the Obama stimulus plan should have completely turned the economy around. But the stimulus plan—officially known as the American Recovery and Reinvestment Act of 2009—not only failed to make a splash, it didn’t make a ripple. Even after the government pumped nearly a trillion dollars of borrowed money into the economy, unemployment nudged up toward the 10 percent mark. Today, unemployment is officially below 9 percent—but the actual jobless rate is much higher.

In 2010, the Population Reference Bureau calculated the workforce to be at just over 157 million people. The Bureau of Labor Statistics reports that there are 131 million jobs in America. That would leave 26 million people jobless—or about 16 percent of the total workforce. But it gets worse: Many of those jobs are just part-time jobs, and many people hold two or more of those jobs, so the actual jobless number is certainly far higher than 16 percent—maybe 20 percent or higher.
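The arithmetic behind that estimate is easy to reproduce. Here is a minimal sketch, using the two figures just cited; note that it ignores the part-time and multiple-job caveats, which would push the rate higher:

```python
# Back-of-the-envelope jobless estimate from the two figures cited above.
workforce = 157_000_000  # Population Reference Bureau estimate, 2010
jobs = 131_000_000       # Bureau of Labor Statistics jobs count

jobless = workforce - jobs
rate = jobless / workforce

print(f"jobless: {jobless:,} people, or {rate:.1%} of the workforce")
# jobless: 26,000,000 people, or 16.6% of the workforce
```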

Obvious conclusion based on the evidence: Keynesianism fails catastrophically.

Unfortunately, the high priests of the Keynesian religion refuse to see the light. President Obama clings to his delusional Keynesian faith, insisting that all we have to do is throw more money at the economy with another stimulus bill! That is economic insanity. Former Reagan aide Peter Ferrara wrote in the Wall Street Journal:

The fallacies of Keynesian economics were exposed decades ago by Friedrich Hayek and Milton Friedman. Keynesian thinking was then discredited in practice in the 1970s, when the Keynesians could neither explain nor cure the double-digit inflation, interest rates, and unemployment that resulted from their policies. Ronald Reagan’s decision to dump Keynesianism in favor of supply-side policies—which emphasize incentives for investment—produced a 25-year economic boom. That boom ended as the Bush administration abandoned every component of Reaganomics one by one, culminating in Treasury Secretary Henry Paulson’s throwback Keynesian stimulus in early 2008.

Mr. Obama showed up in early 2009 with the dismissive certitude that none of this history ever happened, and suddenly national economic policy was back in the 1930s. Instead of the change voters thought they were getting, Mr. Obama quintupled down on Mr. Bush’s 2008 Keynesianism.

Keynesian theory is every bit as superstitious as believing in astrology or a flat Earth or the good-luck powers of a rabbit’s foot. The facts of history are beyond dispute. The old Keynesian superstition has failed every time it was tried. But Keynesian fundamentalists like Barack Obama continue to live in a state of denial.

We know what works. Nearly a century of economic history proves it. Now we need a president and a Congress with the common sense to apply the lessons of history to the economic crisis of today.

“Nothing New Here”

After posting my previous entry, “Who Made God?,” I went to Twitter and tweeted about the blog (I’m @AnswersAuthor, and there’s a “follow” button at the bottom of this page). Here’s a typical message I tweeted: “#Atheists like #ChristopherHitchens ask, ‘If God made the universe, who made God?’ Find the answer to that question at https://thetruthwillmakeyoumad.wordpress.com.”

I got a wide range of responses, both complimentary and otherwise. The uncomplimentary tweets included: “Claptrap. Self-devolving prose.” “What a pathetic specimen you are, clinging to your superstition for dear life.” “I feel ever so slightly dumber after reading some of that.”

To the twitterer who felt “ever so slightly dumber,” I replied, “Sorry my blog made you feel dumb. That was not my intent. Reread two more times—I’m sure you’ll feel smarter.” He tweeted back, “I’m afraid if I read more the result will [be] irreversible.” To which I replied, “Then, by all means, avoid exposure to new ideas and information. I wish you well.” Ah, but we weren’t quite done. He tweeted back: “Nothing in your writing was new.”

At that point, I knew exactly how this thing would play out. I’ve spent the past 25 years studying the evidence and assembling my own case for God. I know for an absolute fact that I’ve put together a case (especially the “Who Made God?” argument) that is not in print anywhere else. I know how groundbreaking these ideas are. So for this twitterer to say there’s nothing new here is so obviously false that I knew he was bluffing. He either hadn’t read the blog, or he didn’t understand the blog, or he was pretending to have knowledge he just didn’t have.

Well, it was time for him to put up or shut up, so I tweeted back: “Excellent. You can cite for me which ideas in the article you’ve seen before and where you read them?” And, as I knew he would, he tweeted back: “Or I could waste no more of my time on you.” To which I replied, “That’s fine. As I said a few tweets ago, I wish you well.”

And I meant it. I do wish him well. I wish nothing but the best for all of my critics on Twitter and elsewhere. I hope they find the truth they are so strenuously, belligerently trying to avoid and suppress.

For some reason, my atheist critics on Twitter are usually angry and hostile, and their attacks are disproportionately personal and vindictive. I don’t know why that is. Is it the atheist mindset itself that makes people so hostile? Or is it something about Twitter, and its 140-character limitations, that makes people behave badly? I really don’t know.

One twitterer attacked my Twitter profile bio, saying, “Even his bio is a self-aggrandizing word salad.” My bio reads: “Skeptical believer, Christian anthropicist, Hayek-Friedman-Reagan small-gummint classical liberal, post-partisan author.” A word salad is defined as a string of incomprehensible words having no apparent connection to one another. But my Twitter bio is a highly succinct and accurate summation of who I am. It describes me.

So I replied (in a series of tweets), “You are kidding me! Attacking my bio, dude? Really? A rational response would be: Examine my sources, confront any faulty logic, and show me the error of my ways. I don’t know why my humble little blog is so threatening to you, but feel free to simply avoid new ideas and reject new information. Ad hominem attack is so weak and anti-rational.”

The twitterer replied, “But so apropos in this case and so enjoyable, Skippy!”

Now, here’s a weird thing I’ve noticed: For some reason, atheists on Twitter like to call their opponents “Skippy.” I’ve encountered that multiple times. I replied (over several tweets): “Atheists’ Handbook, p. 37: ‘When out of intellectual ammo, call the other guy Skippy.’ You’re the third atheist to call me that. Weak, irrational ad hominem attack is never logically apropos, but when that’s all you’ve got . . .”

I didn’t hear back.

Another atheist looked at my blog and tweeted, “An ignorant response which fails horribly. The atheist Hitchens’ question still stands, even though you word-play. Pathetic.”

So I responded, “Know what’s really pathetic? Asserting that something ‘fails horribly’ or is ignorant wordplay without backing up the assertion. Christopher Hitchens said, ‘What can be asserted without evidence can be dismissed without evidence.’ Where’s your evidence? #Weak”

The atheist replied, “What do #atheists need evidence for? When Hitchens said that, he was speaking of theists and their assertions. Pay attention.”

Well, of course, Hitchens was speaking of theists and their assertions. But the Hitchens principle cuts both ways. If a theist makes an assertion without evidence, it can be dismissed without evidence. And if an atheist or anti-theist makes an assertion, it too can be dismissed on the same basis.

My atheist friend on Twitter asserted that my blog was failed, ignorant wordplay. Okay, that’s an assertion. Now, back up your assertion with facts. What did I write that demonstrates ignorance? Where does my logic fail? Where does my evidence fail? If you just flatly assert that I’m wrong, yet you can’t tell me why I’m wrong and where I went wrong (especially when everything I’ve written is sourced and footnoted), then frankly, you’re the one who looks pathetic.

So I replied: “Hitchens was stating a broad principle: If you make a claim, back it up with fact. And yes, atheism makes assertions.”

The atheist tweeted back, “#Atheism doesn’t make assertions. You seem confused.”

I replied, “Atheism is your dogma. It blinds you to new information and new ideas.”

The atheist replied: “Why are you confused over the definition of #atheism? It’s very clear. There is no mistake. I can help you if you want. #Atheism is the position where one lacks belief in a god. Therefore, it’s not dogma. To say it’s dogma makes you look ignorant.”

Rather than reply within the 140-character restraints of Twitter, I decided to write this blog entry. I understand why my atheist friend thinks only theists need to provide evidence. I understand why he thinks that atheism makes no assertions. I understand why he denies that atheism is dogma. And I can explain why he’s wrong.

Atheist philosopher Antony Flew (who, late in life, converted to theism) divided the atheist community into two camps, “strong atheism” and “weak atheism.” Strong atheism asserts that no deities exist. Weak atheism is lack of belief in a deity without an explicit assertion that no deities exist. So my atheist friend on Twitter claims to be (by Flew’s definition) a “weak atheist.”

An assertion that is common to both strong and weak atheism is the assertion of materialism. This assertion states that the entire universe consists of nothing but matter and energy, and all phenomena in the universe, including human consciousness, result from material interactions. Science fiction writer Isaac Asimov typified the materialist view when he wrote:

The molecules of my body, after my conception, added other molecules and arranged the whole into more and more complex form. . . . In the process, I developed, little by little, into a conscious something I call “I” that exists only as the arrangement. When the arrangement is lost forever, as it will be when I die, the “I” will be lost forever, too.

And that suits me fine. No concept I have ever heard, of either a Hell or of a Heaven, has seemed to me to be suitable for a civilized rational mind to inhabit, and I would rather have the nothingness.

In my blog entry, “Who Made God?,” I present what I consider to be a compelling case that this atheist assertion is FALSE. The evidence shows that there is more to the universe than materialism, and that Mind is the ground of all reality. Any fair-minded, objective reader would have to agree that I have presented ideas and evidence that are AT LEAST worthy of consideration.

If, however, you are blinded by your dogma, if you are closed to new ideas and new information and your mind is set in stone, you will not give my ideas fair consideration. You’ll dismiss those ideas in knee-jerk fashion as “claptrap” and “ignorant wordplay.” You’ll mock the author of those ideas as “a pathetic specimen clinging to superstition.” You’ll claim that reading it actually makes you dumber. You’ll say it’s nothing new.

The one thing you will not do is actually examine those ideas and consider the evidence. You won’t even try to challenge the author’s reasoning, because to actually think about these ideas would threaten your dogma. It would mean honestly and objectively asking yourself, “What if the author is right?”

Many people assume the word dogma applies only to religious belief and doctrine. Not true. A dogma is a set of opinions or beliefs that are held with such tenacity that one becomes closed to new ideas and new information. If you find yourself feeling angry or annoyed by the ideas I presented in “Who Made God?,” there’s a good chance you are blinded by your dogma. A non-dogmatic person might disagree and calmly challenge those ideas. Or a non-dogmatic person might simply shrug and walk away. But only a dogmatist becomes hostile and insulting in response to a reasonably expressed viewpoint.

And these comments aren’t directed only at atheists. I have found that there are two groups of people who are hostile to the scientific evidence for God. One group, of course, is dogmatic atheists. The other group is dogmatic Christians. For some reason, extremely dogmatic Christians tend to hate the idea that the existence of God might be provable. They seem to think there is something noble about “blind faith,” belief without evidence.

But without evidence, how can you know what to believe?

Elton Trueblood said, “Faith is not belief without proof, but trust without reservation.” I agree. And once you’ve seen the evidence, once you’ve experienced the proof, then you can trust unreservedly. Whether believer or atheist, we must have the courage to follow the evidence. Bart D. Ehrman put it this way: “The search for truth takes you where the evidence leads you, even if, at first, you don’t want to go there.”

Dogmatic people invariably get mad when the truth pokes holes in their dogma. That’s why this blog is called “The Truth Will Make You Mad.” Instead of getting mad, set yourself free. If you really want to know the truth, you owe it to yourself to open your mind and examine the evidence.

Who knows? If you actually THINK about my ideas and evidence, you just might find a way to prove me wrong.

______________________________________________

Postscript, September 3, 2012:

The atheist twitterer responded to my blog entry about as I expected. I’ll take the liberty of translating Twitterspeak to English—for example, changing “u” to “you,” “ur” to “your,” and so forth—for the sake of clarity. He tweeted:

“Your blog fails because you continue to be confused over what atheism means. Strong/weak are not real subcategories either.”

“An atheist is one without belief in a god. Strong/weak merely define what view atheists have in addition to atheism.”

“I refer you to my blog in response to your ignorance about atheism.”

His blog delves into the origin of the word atheism to explain the difference between “without belief in a god” versus “a belief that there is no god.” Yeah, I get that. And I explicitly acknowledged that distinction above.

As to whether strong/weak atheism (also called positive/negative atheism) are real subcategories, his argument is not with me but with atheist scholars like Antony Flew and Michael Martin. In the glossary to The Cambridge Companion to Atheism (New York: Cambridge University Press, 2007, pages xvii and xviii), Martin writes:

Negative atheism: absence of belief in any god or gods. More narrowly conceived, it is the absence of belief in the theistic God. Cf. positive atheism. . . .

Positive atheism: disbelief in any god or gods. More narrowly conceived, it is disbelief in the theistic God. Cf. negative atheism.

Okay, enough hair-splitting. My atheist friend’s next tweet:

“Until you can come up with actual evidence for a god, you will continue to have the burden of proof, and we will sit, point and laugh at you.”

That burden began to shift as far back as September 1973 when physicist Brandon Carter presented a paper (“Large Number Coincidences and the Anthropic Principle in Cosmology”) at the Copernicus symposium in Kraków, Poland. Carter described some of the odd coincidences in the universe—a multitude of seemingly unrelated laws of physics that appear to be coordinated and fine-tuned to produce life. Carter called this concept “the anthropic principle,” also known as the “fine-tuned universe” concept. I address it in greater detail in “Is Our Universe ‘the Ultimate Artifact’?”

In the years since Brandon Carter delivered that paper at the Kraków symposium, the evidence has been steadily growing that the universe seems to have been deliberately fine-tuned to produce life, and that Mind is essential to the existence of the universe. That is the foundation of the case I have assembled in my blog entries, “Is Our Universe ‘the Ultimate Artifact’?” and “Who Made God?” 

Is the fine-tuned universe proof of the existence of God? Some scientists find it convincing. Others do not. Those who are convinced include theoretical physicist Freeman Dyson, physicist Frank Tipler, astronomer Allan Sandage, and Francis Collins, former head of the Human Genome Project and President Obama’s head of the National Institutes of Health.

Even scientists who are unconvinced recognize that the anthropic evidence is powerful and at least gives the unmistakable appearance of pointing to God. Atheist physicist George Greenstein wrote:

As we survey all the evidence, the thought insistently arises that some supernatural agency—or, rather, Agency—must be involved. Is it possible that suddenly, without intending to, we have stumbled upon scientific proof of the existence of a Supreme Being? Was it God who stepped in and so providentially crafted the cosmos for our benefit? …

It is a matter of taste how one deals with that notion. Those who wish are free to accept it, and I have no way to prove them wrong. But I know where I stand. . . . I reject it utterly.

[George Greenstein, The Symbiotic Universe (New York: William Morrow, 1988), pp. 27 and 87.]

So Greenstein clearly states that the anthropic evidence appears to point to God, though he himself rejects that notion. The evidence Greenstein refers to is essentially the evidence I present in “Is Our Universe ‘the Ultimate Artifact’?” I take those ideas even further in “Who Made God?”

Those two blog entries contain about 4800 words of rational scientific evidence, yet they form just a brief introduction to the mountain of evidence that exists. Even so, they dismantle the ignorant atheist canard that there’s “no evidence” for God.

If my atheist friend is correct and the burden of evidence is on me, then hey, no problem, I have delivered the goods. It’s there in those blogs. He and his fellow atheist twitterers are either unwilling or unable to deal with that evidence, because over the past few days, not one of them has challenged or refuted a single word in those blogs.

My atheist friend can continue splitting hairs about the definition of atheism if he likes, and he can “sit, point and laugh” at the evidence and the truth. But the burden is now on my atheist friend to put up or shut up—and to come up with some facts and intelligent reasoning to counter what I have presented.

The atheist twitterer concludes:

“There is no ‘scientific evidence’ for your god. Atheists appear hostile to your irrational beliefs, not your invisible evidence.”

You, the reader, can judge for yourself if these blogs begin to build a case for a Cosmic Designer, as I claim—or if they are nothing but “irrational beliefs” and “invisible evidence,” as my atheist friend claims.

Oh, and one more thing: Christopher Hitchens, author of God is Not Great, has acknowledged that the fine-tuned universe evidence is “intriguing” and “not trivial.” You can hear it from Hitchens’ own lips at “Christopher Hitchens Makes a Startling Admission.” Here’s the essential part of Hitchens’ statement [note: when Hitchens says “we,” he means leading atheists such as Richard Dawkins, Sam Harris, and himself]:

At some point, certainly, we are all asked which is the best argument you come up against from the other side. I think every one of us picks the fine-tuning one as the most intriguing. . . . Even though it doesn’t prove design, doesn’t prove a Designer . . . you have to spend time thinking about it, working on it. It’s not a trivial [argument]. We all say that.

If Christopher Hitchens, the atheists’ atheist, acknowledged that the fine-tuning evidence is “not trivial,” that it is “most intriguing,” that “you have to spend time thinking about it, working on it,” then anyone who says there is “no scientific evidence” for God is either intellectually dishonest or ignorant.

______________________________________________

Post-postscript:

The atheist twitterer in question has asked that I give out his Twitter username (@TedTheAtheist) and the link to his blog reply. Done.

“A person with a fixed idea will always find some way
of convincing himself in the end that he is right.”

Mathematician Atle Selberg

Who Made God?

Here’s an excerpt from my book God and Soul: The Truth and the Proof, which presents the rational, scientific case for the existence of God and the human soul. This section addresses a question that is invariably posed by the New Atheists (Christopher Hitchens, Richard Dawkins, Sam Harris, Daniel Dennett, and Michael Shermer): “If God made the universe, who made God?” I think you’ll find that this is an answer you’ve never encountered before. The following excerpt from God and Soul is copyright 2012 by Jim Denney, and may not be reproduced without permission.

Excerpt:

There is a question that all of the New Atheists ask in their books, their speeches, and their public debates. It’s a question intended to stump the believers, end the debate, and expose the theistic fallacy once and for all. It’s the simple question, “If God made the universe, who made God?”

Michael Shermer, in his book The Believing Brain, frames the question this way: “Who created God? God is he who needs not be created. Why can’t the universe be ‘that which needs not be created’?”32 Daniel Dennett puts it this way in Darwin’s Dangerous Idea: “If God created all these wonderful things, who created God? Supergod? And who created Supergod? Superdupergod? Or did God create himself?”33 Christopher Hitchens, in God is Not Great, wrote, “The postulate of a designer or creator only raises the unanswerable question of who designed the designer or created the creator.”34 Likewise Sam Harris in Letter to a Christian Nation: “The notion of a creator poses an immediate problem of an infinite regress. If God created the universe, what created God?”35 Finally, in The God Delusion, Richard Dawkins makes it unanimous:

The whole argument turns on the familiar question, “Who made God?”, which most thinking people discover for themselves. A designer God cannot be used to explain organized complexity because any God capable of designing anything would have to be complex enough to demand the same kind of explanation in his own right. God presents an infinite regress from which he cannot help us escape.36

The question “Who made God?” is actually a question many children ask. Because it’s a childlike question, we should first make sure the question does not contain an underlying fallacy, such as a category mistake. A category mistake is a semantic or logical error in which objects of one kind or category are mistakenly presented as if they belong to another kind or category. For example, the question “What does red taste like?” is a category mistake because “red” belongs to the category of colors, not tastes. Something that is red may taste like raspberries or like blood, because “red” is not a taste.

The question “Who made God?” may be a similar category mistake because God may not belong to the category of created things, but to a separate category, such as “ground of reality” or “ground of being.” The anthropic principle [or “fine-tuned universe”] strongly suggests that the Cosmic Designer, being the Architect and Originator of the Big Bang, may not belong to the category of created things. If that is true, if God is the ground of reality, then Dawkins is mistaken and God does not present us with “an infinite regress from which he cannot help us escape.”

The Abrahamic religions assert that God does not belong to the category of created things, and that is why most theistic writers answer the “Who made God?” question in a dogmatic way. Here’s a typical theistic answer to that question:

Who made God? No one did. He was not made. He has always existed. Only things that had a beginning — like the world — need a maker. God had no beginning, so God did not need to be made.37

Of course, this “answer” doesn’t answer anything. It’s simply a dogmatic statement that erects a mental firewall against further inquiry. If the question “Who made God?” makes our brains hurt, then let’s just say, “God had no beginning,” and stop thinking about such questions.

I prefer to keep thinking.

The question “Who made God?” is a useful and interesting way to prod further thought and discussion. Unfortunately, the New Atheists try to use this question to end the discussion.

In order to honestly grapple with the question “Who made God?,” we need to have our consciousness raised — twice. Richard Dawkins has called Darwin’s theory of evolution by natural selection “the ultimate scientific consciousness-raiser.” And it’s true — we do need to have our consciousness raised by the principle of natural selection. But we mustn’t stop there. We must also have our consciousness raised by the anthropic principle.

The problem with Dawkins and his fellow New Atheists is that they have only had their consciousness raised once. If they would raise their consciousness a second time by opening their minds to the anthropic principle, they might discover where the “Who made God?” question actually leads us.

If there is a Cosmic Designer who created a universe with the purpose of bringing forth intelligent life (as the anthropic evidence clearly, overwhelmingly suggests), then the Cosmic Designer would certainly welcome our intelligent inquiry. After all, we human beings are the “children” of the Cosmic Designer, and the raison d’être of the anthropic principle. The universe was called into being for the express purpose of bringing thinking beings into existence — so it seems to me that the Cosmic Designer would be pleased to know that the conscious, reasoning creatures of the universe have begun to look back and think deeply about such questions.

One place to begin thinking about the question “Who made God?” is to remember that time began at the moment of the Big Bang. I know this is an impossible concept to fully grasp, but it’s true: There was no such thing as time “prior to” the Big Bang. In fact, the phrase “before the Big Bang” is about as meaningless an expression as can ever be put into words. Time did not exist until the instant of the Big Bang, which physicists express as “t = 0.” The first moment of time, the first micro-tick of the cosmic clock, occurred approximately 13.7 billion years ago. Adolf Grünbaum (b. 1923), the founding Director of the University of Pittsburgh’s Center for Philosophy of Science, explained it this way (all emphasis is in the original):

[The Big Bang instant t = 0] … had no temporal predecessor. In this case, t = 0 was a singular, temporally first event of the physical space-time to which all of the world-lines of the universe converge. This means that there simply did not exist any instants of time before t = 0! But it would be (potentially) misleading to describe this state of affairs by saying that “time began” at t = 0. This description makes it sound as if time began in the same sense in which, say, a musical concert began. And that is misleading precisely because the concert was actually preceded by actual instants of time, when it had not yet begun. But, in the Big Bang model … there were no such earlier instants before t = 0 and hence no instants when the Big Bang had not yet occurred. [Astronomer Sir Alfred Charles Bernard Lovell] … is quite unaware of these facts when he speaks mistakenly of a “metaphysical scheme before the beginning of time and space.” Similarly, there is no basis for [cosmologist Jayant Vishnu Narlikar’s] … lament that “scientists are not in the habit of discussing … the situation prior to [the Big Bang].”38

There was nothing before the Big Bang. There was no space, no time, no matter, no energy, no gravity, no “before.” At t = 0, all of the life-giving, fine-tuned laws, constants, and forces of the universe were “baked in.” If there was no space and time “before” t = 0, then what “caused” the “effect” we know as the Big Bang? Who or what designed this amazing, delicately calibrated universe that gives us life?

Answer: A Mind — a conscious, purposeful, willful Designer.

Because we live within a reality that consists of three dimensions of space and one dimension of time, we assume that the ultimate ground of reality is space-time. But space-time can’t be the ultimate ground of reality because space-time is a mere 13.7 billion years old. Space-time did not exist until the Big Bang happened.

The universe is trying to tell us something: The universe is not primarily about space, time, matter, energy, and gravity. Those things are real, but they are not the most basic feature of the universe. At its most fundamental level, the universe is all about Mind.

(When I capitalize the word “Mind,” I’m not suggesting that “Mind” means “Supernatural Deity.” I’m trying to convey the fact that Mind is an entity distinct from the space-time universe of matter. The mind of God would be Mind, but the minds of human beings and other conscious observers also partake in this collective property I call “Mind” with a capital M.)

Before you dismiss these ideas as a lot of New Age tripe, like auras and spiritual vibrations, I want to state clearly that I don’t deal in mysticism. The integral role of the conscious mind in quantum physics has been an accepted scientific concept as far back as the 1920s, when Niels Bohr and Werner Heisenberg were noodling around with wavefunction mathematics.

Great scientists have considered the role of Mind in the structure of the universe at least since the days of astronomer Johannes Kepler (1571-1630). When he began to understand the laws of planetary motion that bear his name, Kepler exclaimed, “O God! I think thy thoughts after Thee!”39 The universe, Kepler realized, was designed by conscious, rational, purposeful thought.

Three centuries later, Stephen Hawking made a similar statement at the end of his book A Brief History of Time. Hawking concluded that if we could discover a complete “theory of everything” and find the answer to why we and the universe exist, “it would be the ultimate triumph of human reason — for then we should know the mind of God.” Hawking, an agnostic, used the term “mind of God” in a metaphoric sense — but his statement may be more literally true than even he intended.

Countless physics experiments clearly show that the workings of the universe are entangled with the workings of Mind — the minds of conscious human observers at least, and perhaps the mind of God. One of the fathers of quantum theory, Austrian physicist Erwin Schrödinger (1887-1961), expressed this view when he wrote, “The overall number of minds is just one. I venture to call [mind] indestructible since it has a peculiar timetable, namely mind is always now.”40 In other words, Mind is an indivisible unity, it cannot be destroyed, and it is timeless. Only a mind of the kind Schrödinger describes would be capable of formulating, coordinating, and fine-tuning all of the life-giving laws, constants, and forces of the universe at the moment of t = 0.

Schrödinger goes on to speak of the conscious mind that each of us thinks of as “I” or “myself.” He writes: “We do not belong to this material world that science constructs for us. We are not in it; we are outside. We are only spectators. The reason why we believe we are in it, that we belong to the picture, is that our bodies are in the picture.”41

Here, Schrödinger describes a picture of reality that is almost religious in nature — yet this picture of reality is derived from quantum physics, not some religious text or tradition. In Schrödinger’s description, Mind interacts with the material world but is not part of the material world. Mind is outside of the material world — a “spectator.” A mind housed in a human body tends to mistake the material body for the “I” or the “self” that is the mind. But while the body belongs to the world of matter, in Schrödinger’s view, the mind is separate from the material world.

This view parallels that of Australian neurophysiologist Sir John Carew Eccles (1903-1997), who won the 1963 Nobel Prize in Medicine for his pioneering work on brain synapses and neurotransmitters. Eccles came to the conclusion that consciousness and thought occur when the non-material mind acts upon the quantum “microsites” within the synapses of the cerebral cortex of the brain. He suggested that the non-material mind interacts with the material brain by means of quantum mental units called “psychons.” These psychons control the quantum jumps within synapses, causing them to emit neurotransmitters which account for such brain activity as thought, decision-making, and body movement. In Eccles’ view, the brain doesn’t give rise to the mind; rather, the mind is separate from the brain, and it activates the brain in order to control the body.

Eccles authored or co-authored several books with the intent to “challenge and negate materialism and to reinstate the spiritual self as the controller of the brain.”42 In How the Self Controls Its Brain, Eccles even went so far as to say, “In some mysterious way, God is the Creator of all the living forms in the evolutionary process, and particularly in hominid evolution of human persons, each with the conscious selfhood of an immortal soul. … Biological evolution transcends itself in providing the material basis, the human brain, for self-conscious beings whose very nature is to seek for hope and to enquire for meaning in the quest for love, truth, and beauty.”43

American physicist Nick Herbert, the author of Quantum Reality, has worked as a senior physicist in industry (Memorex, Smith-Corona Marchant) and in pure research (Lawrence Berkeley National Laboratory, Xerox PARC). Herbert is a strong proponent of the view that Mind is a more pervasive aspect of reality than matter and energy. While the standard view of reality is that the universe evolved consciousness (in the form of conscious beings like us), Herbert says that consciousness comes first, and that consciousness creates reality. He writes:

The first person to suggest that quantum theory implies that reality is created by human consciousness was not some crank on the fringes of physics but the eminent mathematician John von Neumann. In his quantum bible [Mathematische Grundlagen der Quantenmechanik or The Mathematical Foundations of Quantum Mechanics] …, the most influential book on quantum theory ever written, von Neumann concludes that, from a strictly logical point of view, only the presence of consciousness can solve the measurement problem. As a professional mathematician, von Neumann was accustomed to boldly following a logical argument wherever it might lead. … His logic leads to a particularly unpalatable conclusion: that the world is not objectively real but depends on the mind of the observer.44

(Personal note: I lean toward a view which holds that the world is objectively real, but that Mind interacts with and shapes objective reality in more powerful ways than we normally suppose.)

Nick Herbert goes on to compare von Neumann’s view, rooted in mathematics and experimental physics, to the intuitive insights of George Berkeley (1685-1753), Bishop of Cloyne, Ireland. Describing Berkeley’s views, Herbert wrote:

Berkeley argued that mind is not a form of matter but quite the opposite: matter does not even exist except as the perception of some mind. Absolute existence belongs to minds alone — the mind of God, the minds of humans and other spiritual beings. All other forms of being, including matter, light, the Earth, and stars, exist only by virtue of some mind’s being aware of them. … Esse est percipi (To be is to be perceived) was the Irish bishop’s motto concerning matter: “All those bodies which compose the mighty frame of the world have no subsistence without a mind.”45

So let’s bring this discussion back to the original question: “Who made God?” At this point, you may see where I’m heading. Nick Herbert’s suggestion (derived from von Neumann) that “reality is created by human consciousness” is a step in the right direction, but it doesn’t account for all the facts. The universe has existed for 13.7 billion years. Conscious human beings (in the form of genus Homo) have existed for the tiniest fraction of that span of time, roughly 2.4 million years. Our own species, Homo sapiens, has existed for less than 200,000 years — a mere twinkle in the eye of the cosmos.

For the better part of 13.7 billion years, there were no conscious human minds in existence to observe reality and make reality real — but does that mean there was no conscious Mind at all in the universe? No. Mind was immanent throughout the universe from the instant of t = 0. As physicist Freeman J. Dyson has said, “God is what Mind becomes when it has passed beyond the scale of our comprehension.”46

So what sort of conscious Mind existed during all those billions of years before human beings evolved? What sort of Mind directed the life-giving purpose of the universe at the moment of the Big Bang? What sort of Mind selected, balanced, and fine-tuned the laws, constants, and forces of the universe at the instant of t = 0?

Everything that exists within the space-time universe is subject to the principle of causality. A cause always precedes its effect, and causes and effects always take place within the framework of space and time. But if Mind exists outside of the space-time universe, Mind is not subject to the principle of causality. If Mind is not an effect produced by some other cause, then Mind itself is the cause — and the universe is the effect.

If Mind is the ground of existence, and therefore not subject to the law of cause and effect, then the question “Who created God?” (in effect, “Who created Mind?”) can be seen as a nonsense question. It’s like asking “How big is blue?” or “What does seven taste like?”

To say that Mind is the ground of reality is not to say that space and time, matter and energy, are not real. They are definitely real. But it is Mind — the mind of the Cosmic Designer, the mind of conscious beings like ourselves — that makes reality real. To quote Freeman Dyson once more, “I do not claim that the architecture of the universe proves the existence of God. I claim only that the architecture of the universe is consistent with the hypothesis that mind plays an essential role in its functioning.”47

As the English mathematician-astronomer Sir James Jeans (1877-1946) concluded, “The universe appears less and less like a great machine and more and more like a great thought.”

End of excerpt.

For more information on the anthropic (fine-tuned universe) evidence for God (the Cosmic Designer), see my previous blog post, “Is Our Universe ‘the Ultimate Artifact’?”

Is Our Universe “the Ultimate Artifact”?

I first encountered the scientific case for the existence of God in the April 1987 issue of Analog Science Fiction / Science Fact. Sandwiched among the science fiction stories was a fact article by Richard D. Meisner with the intriguing title “Universe—the Ultimate Artifact?” I began reading—and what I read was startling. Meisner gave a guided tour of a number of remarkable cosmic coincidences.

Meisner’s conclusion: The universe appears to be an artifact—an object designed by an intelligent entity for a specific purpose. Meisner went on to quote cosmologist Paul Davies: “It is hard to resist the impression that the present structure of the universe, apparently so sensitive to minor alterations in the numbers, has been rather carefully thought out.” Then Meisner offered his own impression:

One may feel inclined to apply the word “God” in this context. This is justifiable, although I tend to avoid the word simply because I’ve found almost without exception that it triggers an immediate positive or negative emotional response in the listener—most inconducive to good scientific thinking. Naturally, the artifact hypothesis is most attractive when stripped of its unfortunate historical trappings of superstition and dogma. . . . Personally, if the artifact inference proved true, I would be most interested not in how the universe was fabricated, but why.

A year after I encountered Meisner’s article in Analog, I discovered a book by Dr. George Greenstein with the intriguing title The Symbiotic Universe. It’s a book-length treatment of the cosmological case for God. It explores the body of evidence Meisner wrote about, but in much greater depth and detail.

Dr. Greenstein is a Yale-educated astrophysicist who currently teaches at Amherst College in Massachusetts. In the early 1980s, Greenstein became fascinated by the scientific case for God, and he began examining the list of “cosmic coincidences” purely as a matter of personal amusement. As the list of “coincidences” kept growing, Greenstein found the results disturbing.

“The more I read,” Greenstein wrote, “the more I became convinced that such ‘coincidences’ could hardly have happened by chance.” Why did he find the “cosmic coincidences” disturbing? Because they appeared to be evidence for a Cosmic Designer—that is, evidence for God—and Greenstein was a confirmed atheist.

The possibility that God or a Godlike super-intelligence might have actually designed the universe made Greenstein almost physically sick. He recalls experiencing “an intense revulsion, and at times it was almost physical in nature. I would positively squirm with discomfort. … I found it difficult to entertain the notion without grimacing in disgust, and well-nigh impossible to mention it to friends without apology.”

What is the scientific evidence that caused Dr. Greenstein to “squirm with discomfort”? It is often referred to as the evidence for a “fine-tuned universe.” The universe, we now know, is incredibly precision-balanced (or “fine-tuned”) to produce life. Take, for example, the Big Bang.

At the moment the Big Bang began, everything that exists—matter, energy, the three dimensions of space, and the fourth dimension of time—emerged from a single geometric point, expanding at the speed of light. The Big Bang actually created space and time.

Scientists are amazed that the explosive violence of the creation event was as delicately balanced as it was. Cosmologist Paul Davies observes:

Had the Big Bang been weaker, the cosmos would have soon fallen back on itself in a big crunch. On the other hand, had it been stronger, the cosmic material would have dispersed so rapidly that galaxies would not have formed. … Had the explosion differed in strength at the outset by only one part in 10⁶⁰, the universe we now perceive would not exist. To give some meaning to these numbers, suppose you wanted to fire a bullet at a one-inch target on the other side of the observable universe, twenty billion light-years away. Your aim would have to be accurate to that same part in 10⁶⁰…. Channeling the explosive violence into such a regular and organized pattern of motion seems like a miracle.

If the explosive force of the Big Bang had not been perfectly balanced and incredibly fine-tuned, life would be impossible and you and I could not exist.

At first, the laws and constants of the universe were simply accepted as a matter of fact—no one wondered why this or that force or constant of physics was not slightly stronger or weaker than it is. Eventually, physicists began to realize (as George Greenstein observes in The Symbiotic Universe) that the “laws of nature could have been laid down only in the very instant of the creation of the universe, if not before.”

Paul Davies recalls that when he was a student, the question of where the laws of physics come from was off-limits. A scientist was supposed to simply apply those laws, not inquire into their origin. The standard answer was, “There’s no reason the laws of physics are what they are—they just are.” Davies concluded, “The idea that the laws exist reasonlessly is deeply anti-rational. … It makes a mockery of science.”

As it became clear that the laws of nature might have been different from what they are—that they appeared to have been deliberately selected to produce life—scientists began to look at these forces, laws, and constants with a new sense of awe. The entire universe seemed to be constructed out of an incredibly unlikely series of cosmic coincidences. Some examples:

There are four forces governing the structure and behavior of subatomic particles—the electromagnetic force, the gravitational force, the strong nuclear force, and the weak nuclear force. These forces determine everything from how an electron orbits the nucleus of an atom to how stars and galaxies are formed. Each force has a specific mathematical value called a constant (because its value never varies).

The gravitational force constant is finely tuned to permit life. Slightly greater, and stars would burn too hot, too quickly, and too unevenly to produce life-giving elements. Slightly smaller, and stars would be too cool, so that nuclear fusion could not take place and there would be no life-giving heavier elements.

The electromagnetic force is also fine-tuned. If its constant were slightly larger or smaller, the chemical bonding required for making living things could not take place.

There is a fine-tuned balance between the gravitational and electromagnetic forces. If the ratio of these two constants were larger, there would be no stars smaller than 1.4 solar masses, and the lifetime of stars would be too short to generate life-giving elements. If the ratio were smaller, there would be no stars larger than 0.8 solar masses—and again, no production of life-giving heavier elements.

If the strong nuclear force constant were slightly larger, there would be no hydrogen in the universe and no stars. If this constant were smaller, the universe would consist of nothing but hydrogen.

If the weak force constant were larger, most of the hydrogen in the universe would have converted to helium during the Big Bang. If it were smaller, there’d be too little hydrogen converted to helium—a roadblock to the production of life-giving heavier elements such as carbon and oxygen.

The proton-to-electron mass ratio: A proton is 1,836 times more massive than an electron; if this ratio varied slightly in either direction, molecules could not form and life could not exist. The ratio of the number of protons to the number of electrons is also finely balanced to permit the electromagnetic force to dominate the gravitational force, allowing the formation of galaxies, stars, and planets.
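Two of the numbers just listed are easy to check for yourself. Here is a minimal back-of-the-envelope sketch in Python (my own illustration, not anything from Greenstein’s book), using standard published values for the physical constants:

```python
# Back-of-the-envelope check of two ratios mentioned above, using
# standard published (CODATA-style) values for the constants.
# Illustrative sketch only; the variable names are my own.

G   = 6.674e-11    # gravitational constant, N*m^2/kg^2
k_e = 8.988e9      # Coulomb constant, N*m^2/C^2
e   = 1.602e-19    # elementary charge, C
m_p = 1.6726e-27   # proton mass, kg
m_e = 9.1094e-31   # electron mass, kg

# The proton-to-electron mass ratio cited above:
print(m_p / m_e)                        # ~1836

# Electromagnetic vs. gravitational attraction between a proton and
# an electron; the separation r cancels out of the ratio, because
# both forces fall off as 1/r^2:
print((k_e * e**2) / (G * m_p * m_e))   # ~2.3e39
```

The second number illustrates why the balance between protons and electrons matters so much: electromagnetism outmuscles gravity by roughly thirty-nine orders of magnitude, so even a slight excess of charge would swamp the comparatively feeble gravity that assembles galaxies, stars, and planets.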

The unusual properties of water are also a fine-tuned condition for life. Water plays an essential role in almost every biological function. It is necessary for photosynthesis, the foundation of the food chain. In photosynthesis, plants use sunlight to convert carbon dioxide and water into sugar, giving off oxygen as a “waste product.”

Water is one of the few liquids that expands when it freezes. Most substances contract and become more dense when they freeze, but frozen water is actually 9 percent less dense than liquid water. This is because, at freezing temperatures, the hydrogen bonds that connect water molecules make an adjustment to keep negatively charged oxygen atoms apart. This adjustment creates the crystal lattice that enables ice to float in liquid water.

If water didn’t have this extraordinary property, ice would sink, which would cause lakes and rivers to freeze solid. If ice did not float, observes George Greenstein, life on Earth “would be confined to a narrow strip lying close to the equator.”

And the list goes on: the proton decay rate, the neutron-proton mass difference, the matter-antimatter ratio, and on and on—it’s as if dozens of completely unrelated laws of nature plotted together in a vast cosmic conspiracy to produce life. As Paul Davies observes:

It is tempting to believe, therefore, that a complex universe will emerge only if the laws of physics are very close to what they are. … The laws, which enable the universe to come into being spontaneously, seem themselves to be the product of exceedingly ingenious design. If physics is the product of design, the universe must have a purpose, and the evidence of modern physics suggests strongly to me that the purpose includes us.

And physicist Fred Hoyle adds, “I do not believe that any scientist who examines the evidence would fail to draw the inference that the laws of nuclear physics have been deliberately designed.”

Is our life-giving universe the result of an inconceivably improbable series of cosmic accidents? Or is it the product of calculated, deliberate design?

Is the universe evidence—even proof—of the existence of God? Is our universe “the Ultimate Artifact” of the mind and hand of an intelligent Creator?

An Unmitigated Disaster for America

The SCOTUS decision on Obamacare is an unmitigated disaster for America. Here’s why:

1. IT INVENTS A NEW, EXPANDED DEFINITION OF THE POWER TO TAX. Roberts firewalled the expansion of the Commerce Clause, but redefined a mandate as a “tax.” Now the Left no longer needs the Commerce Clause to do anything it wants. Leftist social engineers can run (and ruin) our lives via the Tax Clause.

The constitutional power to tax has never before been used to control private behavior, only to fund functions of government. Roberts INVENTED a huge new cudgel that the government can use to oppress and bully the people. Government WILL use it against us in ways we do not now imagine.

Libertarian attorney Jacob Hornberger rightly called the Constitution “a barbed-wire entanglement designed to interfere with, restrict, and impede government officials in the exercise of political power.” That is the Founding Fathers’ view. By contrast, Roberts took it upon himself to EXPAND federal power in a previously unheard-of direction.

John Yoo of the U.C. Berkeley School of Law has a great Wall Street Journal piece called “Chief Justice Roberts and His Apologists.” Here’s an excerpt:

Justice Roberts’s opinion provides a constitutional road map for architects of the next great expansion of the welfare state. Congress may not be able to directly force us to buy electric cars, eat organic kale, or replace oil heaters with solar panels. But if it enforces the mandates with a financial penalty then suddenly, thanks to Justice Roberts’s tortured reasoning . . . the mandate is transformed into a constitutional exercise of Congress’s power to tax. . . . Justice Roberts may have sacrificed the Constitution’s last remaining limits on federal power for . . . a little peace and quiet from attacks during a presidential election year.

2. REPEAL AND REPLACE IS A LONG SHOT. A friend of mine confidently told me, “No worries. We’ll win the election, and Obamacare will be repealed and replaced by the next administration.”

First, I’m not confident Mitt Romney will win. He has a limitless capacity for unforced campaign errors.

Second, even if he wins, it’s unlikely Obamacare will be dismantled. No government program, once established, has ever been dismantled in the history of the republic. Ronald Reagan couldn’t fulfill his promise to dismantle the Dept. of Education, even though it had been established just a year earlier by Jimmy Carter. The forces against repeal will be brutal. I don’t think Romney and Boehner really believe they will “repeal and replace” Obamacare, but it does make great election-year rhetoric.

Our best chance of dismantling this unconstitutional, oppressive socialist scheme was in the Supreme Court. Now that chance is gone.

3. ROBERTS’ RATIONALIZATION TWISTS THE CONSTITUTION. Some apologists for Chief Justice Roberts suggest that he crafted this tortured decision in order to safeguard the reputation and stature of SCOTUS. If so, then he protected SCOTUS at the expense of the nation and the Constitution. The best way to safeguard SCOTUS is to safeguard the Constitution. By concocting a transparently phony rationale that a mandate is a tax, Roberts got friendly media coverage, but did violence to the Constitution. If Roberts crafted this rationale in order to improve the reputation of SCOTUS, he’s lost his perspective on why SCOTUS exists.

4. WE CAN’T RELY ON ANTI-TAX SENTIMENT. Some have suggested that if people don’t like being taxed to pay for Obamacare, they can simply vote to change Congress.

Problem: We’ve reached the tipping point where anti-tax sentiment in America is a minority position. Most Americans pay no income taxes, and have every reason and incentive to vote increased taxes on those who do pay. Voters won’t vote to change Congress if they like getting freebies from the government at the expense of fellow taxpayers.

What about people who can’t pay the Obamacare mandate “tax”? The rest of us will pay it for them. The middle class will get soaked, as usual.

5. THE FOUR DISSENTERS ON THE COURT MAKE A COMPELLING CASE THAT THIS IS A DISASTROUS DECISION. Justices Scalia, Kennedy, Thomas, and Alito wrote:

To say that the Individual Mandate merely imposes a tax is not to interpret the statute but to rewrite it. Judicial tax-writing is particularly troubling. . . . The Constitution requires tax increases to originate in the House of Representatives . . . the legislative body most accountable to the people. . . . We have no doubt that Congress knew precisely what it was doing when it rejected an earlier version of this legislation that imposed a tax instead of a requirement-with-penalty. . . . Imposing a tax through judicial legislation inverts the constitutional scheme, and places the power to tax in the branch of government least accountable to the citizenry.

The devastating logic of the dissenters trumps the slippery reasoning of Chief Justice Roberts. This decision is a complete disaster, and I have very little confidence it can ever be undone.

Darwin’s Holocaust? (Part 3 of 3)

Go to Part 1.

Continued from Part 2.

Christopher Hitchens wrote a book called God Is Not Great: How Religion Poisons Everything. As the title suggests, Hitchens blamed much of the evil in the world on religion. (For more insight into Hitchens’ views and where his thinking went wrong, see Lament for an Atheist—Part I and Lament for an Atheist—Part II. See also the video at Christopher Hitchens Makes a Startling Admission.)

Evolutionary biologist Richard Dawkins makes a similar case in The God Delusion. And it’s true that many atrocities, savageries, and cruelties have been committed in the name of religion: The Crusades, the Spanish Inquisition, the persecution of Galileo, the execution of Giordano Bruno, the Albigensian Crusade, Martin Luther’s rabidly anti-Semitic treatise On the Jews and Their Lies, the Salem Witch Trials, the 1066 Granada Massacre and other pogroms, the “Troubles” in Northern Ireland, the Lebanese Civil War, the Israeli-Palestinian conflict, Jonestown, India versus Pakistan, ethnic cleansing in Bosnia-Herzegovina, Jihad, 9/11, and on and on.

But does religion really poison everything? Well, it depends on how you define “religion.”

If, by “religions,” we mean the tribalistic societies that organize themselves around certain beliefs, rules, rituals, and traditions, and that often defend their beliefs through figurative or literal “holy wars,” then yes, I agree, that sort of religion has a distinctly poisonous history. (And by tribalism, I mean any social structure — including a religion or denomination — that prizes cultural conformity within the group and practices hostility toward those outside the group.)

But if, by “religion,” we mean a commitment to live according to the teachings of, say, the Sermon on the Mount — teachings that cut across the grain of our tribal instincts by commanding us to love our enemies, forgive those who sin against us, and pray for those who persecute us — then Christopher Hitchens was simply wrong. That kind of rational, selfless religion has never poisoned anything. In fact, in The God Delusion, Richard Dawkins writes: “Jesus, if he existed … was surely one of the great ethical innovators of history. The Sermon on the Mount is way ahead of its time. His ‘turn the other cheek’ anticipated Gandhi and Martin Luther King by two thousand years.”24

The Sermon on the Mount is the sort of religion that even Richard Dawkins can endorse. The compassionate, forgiving, anti-tribalist Christianity of the Sermon on the Mount really does exist, and is often found right alongside the corrupt, institutional religiosity that Jesus of Nazareth confronted and condemned throughout the gospel accounts.

Jesus seemed to know in advance that some of his so-called “followers” would corrupt and betray his message. He predicted that a day would come when many supposed “Christians” would say to him, “Lord, Lord, did we not prophesy in your name, and in your name drive out demons and perform many miracles?” And he said his reply to them would be blunt: “I never knew you. Away from me, you evildoers!”25

Hypatia, a woman of Alexandria in Roman Egypt, was one of the leading scholars of the classical age. She was famed as a mathematician, astronomer, and public speaker, and she taught at the Great Library of Alexandria. Unfortunately for Hypatia, she also threatened the political power of Cyril, the corrupt Christian archbishop of Alexandria. In AD 415, Cyril sent his aide, known as Peter the Reader, to recruit a mob of monks to assassinate Hypatia. The monks ambushed her in her chariot, stripped her naked, and dragged her through the streets to the Caesareum church, where they killed her, defiling their own house of worship with her murder. They tore her to pieces, then burned the body parts outside of the city. The stated rationale for Hypatia’s grisly murder was that “she beguiled many people through Satanic wiles” — but the true motive was Cyril’s lust for power.26 No doubt, Jesus would say to Cyril, Peter the Reader, and the murderous monks, “I never knew you. Away from me, you evildoers!”

The Crusaders swept through Europe and into the Holy Land, slaughtered Jews and Muslims, pillaged and burned entire villages, raped women and put infants to the sword. They exhibited the heads of their slain enemies on stakes. Wild tales of miracles circulated among the Crusaders, bolstering their morale as they committed horrific atrocities under the banner of the cross. It’s no wonder that radical Muslims to this day identify all Christians as “Crusaders.”

Tomás de Torquemada was the first Grand Inquisitor of the Spanish Inquisition in the fifteenth century. Known as “the hammer of heretics,” Torquemada “enthusiastically supported the use of torture during interrogations,”27 and reportedly sent at least 2,000 supposed “heretics” to be burned at the stake. Yet if you compare Torquemada’s “enthusiastic” actions with the teachings of Jesus, you have to wonder: Who is the true heretic — Torquemada’s victims, or Torquemada himself?

Corrupt, tribalist, institutional religion is rife with human evil. To all those who torture, kill, rape, molest, seduce, and steal under the cloak of religion, the one who preached the Sermon on the Mount undoubtedly says, “I never knew you. Away from me, you evildoers!”

But the religion of the Sermon on the Mount, the religion endorsed by Richard Dawkins, is another thing altogether. That religion has produced some of the finest achievements of our civilization.

Take, for example, our healthcare system. Compassionate religion has blessed the world with the creation of hospitals. In the Middle Ages, religious orders of monks and nuns ran the first hospitals in Europe. In medieval France, a hospital was called a hôtel-Dieu, a hotel of God.

Or, consider how religion has promoted education. Priests and monks preserved civilization and learning through the Dark Ages. The church also invented institutions of higher learning. In medieval times, writes historian Lowrie J. Daly, “there were no great state-supported educational systems, nor even solitary schools. Practically speaking, the Church was the only institution in Europe that showed consistent interest in the preservation and cultivation of knowledge.”28

The first university in the world was at Bologna, Italy, founded around AD 1088 or earlier; next was Oxford, founded around 1096; the University of Salamanca, Spain, founded circa 1130; the University of Paris, circa 1150; and Cambridge, circa 1209. We don’t know the exact date any of these great universities were founded because they all had modest, unheralded beginnings as cathedral schools, taught by clerics.

Religion practically invented science as we know it. Medieval church clerics studied empirical phenomena and catalogued their findings. The study of science came naturally to the religious mind, because the early clerics believed that a rational God had created an orderly world that could be comprehended by human reason. Here are a few of those early clergy-scientists:

• Thierry of Chartres (died c. 1150) wrote and taught at the cathedral school at Chartres, France. In his Hexaemeron, he proposed a cosmology with similarities to the Big Bang and features of cosmic evolution.

• Robert Grosseteste (c. 1175-1253) was the Bishop of Lincoln and an Oxford scholar credited as the first mathematician and physicist of the Medieval era. Science historian Alistair Crombie called him “the real founder of the tradition of scientific thought in medieval Oxford.”29

• Albertus Magnus (c. 1200-1280) was a German Dominican friar who advocated the teaching of reason and science in the church. He catalogued thousands of insights and observations in logic, medicine, chemistry, and astronomy.

• Roger Bacon (c. 1214-1294) was a Franciscan friar known as Doctor Mirabilis (“wonderful teacher”) because he advocated the study of nature through the empirical method.

• Thomas Aquinas (1225-1274) advocated “natural theology” and “natural law,” rooted in reason as well as biblical revelation.

• French priest Jean Buridan (c. 1300-c. 1358) was one of the world’s earliest true physicists, recording observations that led to a modern understanding of inertia and momentum. In De Caelo et Mundo, he proposed an early version of the Copernican model of the cosmos — 200 years before Copernicus.

As the Middle Ages gave way to the Renaissance, great scientific minds continued to seek out the laws by which a rational God had designed an orderly universe. Johannes Kepler envisioned God as the Great Mathematician, and he went on to systematize the laws of planetary motion that bear his name. Michael Faraday saw God as the Great Physicist, who laid down laws for Faraday to discover in the fields of electricity and electromagnetism. Isaac Newton saw God as the Cosmic Engineer, and his faith in a rational God drove him to discover the laws of gravitation, motion, and mechanics.

As Paul Davies observes, “The very notion of physical law is a theological one in the first place, a fact that makes many scientists squirm. Isaac Newton first got the idea of absolute, universal, perfect, immutable laws from the Christian doctrine that God created the world and ordered it in a rational way.”30 Faith in a rational God and a well-ordered creation brought modern science into existence.

And then there’s the field of social justice — and particularly the abolition of slavery. While it’s true that many slaveholders rationalized their cruel trade from the Bible, it’s also true that religion founded on the Sermon on the Mount helped bring slavery to an end. The slave trade in England was abolished largely due to the efforts of a prominent evangelical, William Wilberforce. The American abolition movement was led by the Quakers and such evangelicals as Charles Finney.

So Hitchens’ blanket statement that “religion poisons everything” couldn’t be more wrong. The Crusades and the Inquisition and the pogroms weren’t caused by the Sermon on the Mount or anything else said by Jesus of Nazareth — just as Charles Darwin was not the instigator of the Holocaust or the Holodomor.

Theoretical physicist Freeman Dyson explains the seeming paradox that, down through the centuries, religion has inspired human beings to commit acts of both incredible evil and amazing good. He writes:

We have seen terrible wars and terrible persecutions conducted in the name of religion. We have also seen large numbers of people inspired by religion to lives of heroic virtue, bringing education and medical care to the poor, helping to abolish slavery and spread peace among nations. Religion amplifies the good and evil tendencies of individual souls.31

When evil people want to do evil things — when they want to commit acts of murder, genocide, sadism, oppression, theft, or terror — they will grab any rationale to make their evil seem “good.” If it weren’t some twisted pretense of religion or a pseudo-scientific rationale, it would be some other excuse. But the evil would happen in any case.

There is no evil in the words of Jesus. There is no evil in the theory of evolution. The evil is in people — in human nature itself.

That’s what poisons everything.

Notes:

This is an excerpt from God and Soul: The Truth and the Proof by Jim Denney, copyright 2012, available as an ebook at Amazon.com. For permission to quote from this excerpt, contact the author in care of this blogsite.

24. Richard Dawkins, The God Delusion (Boston: Houghton Mifflin, 2006), 283.

25. Matthew 7:22-23, Holy Bible, New International Version®. Copyright © 1973, 1978, 1984 Biblica. Used by permission of Zondervan. All rights reserved.

26. Sandy Donovan, Hypatia: Mathematician, Inventor, and Philosopher (Minneapolis: Compass Point, 2008), 75.

27. Michael C. Thomsett, The Inquisition: A History (Jefferson, NC: McFarland, 2010), 158.

28. Lowrie John Daly, The Medieval University, 1200-1400 (New York: Sheed and Ward, 1961), 4.

29. Alistair Cameron Crombie, The History of Science from Augustine to Galileo (Mineola, NY: Dover, 1995), 27.

30. Paul Davies, “Taking Science on Faith,” New York Times, November 24, 2007, http://www.nytimes.com/2007/11/24/opinion/24davies.html?_r=3.

31. Frankenberry, 379.

Darwin’s Holocaust? (Part 2 of 3)

Continued from Part 1.

As early as 1939, philosopher Judah Rumney wrote about Darwin’s influence on both Hitler and Mussolini. In an article called “Biology and War,” which Rumney wrote shortly before Germany’s September 1939 invasion of Poland (and the start of World War II), he noted that the German and Italian dictators were both influenced by the philosophy of social Darwinism:

Both Mussolini and Hitler avow their adherence to this philosophy of war. Hitler in Mein Kampf argues that the world must be ruled according to the natural law of the survival of the fittest: “In constant war mankind has become great — in eternal peace it must perish.”12

Rumney added that Hitler saw war, first, as a “biological necessity,” part of a Darwinian “struggle for existence,” and second, as a means of natural selection, in which the weak and inferior would perish and the strong and superior would be selected for survival. Rumney went on to say that Darwin’s biological theories “are mistakenly applied to social phenomena [by Hitler and other social Darwinists], and animal evolution is equated with social evolution. This dubious procedure is sustained furthermore … by false assumptions and misrepresentations of Darwin’s ideas.”13

One such misrepresentation of Darwin’s ideas is the Hitlerian interpretation of natural selection as a struggle for existence by eliminating all neighbors, competitors, and “inferiors” — the essence of the Nazi “final solution.” Even before the Holocaust began, Rumney saw where Hitler’s misapplication of Darwinian evolution was headed. He wrote:

In biology [natural selection] refers to a struggle for life between organisms consequent on a change in the environment, or a too rapid increase in their numbers which impels each organism to strive forward at the expense of its neighbors. To Darwin this struggle was primarily a process of adaptation which may or may not involve elimination. The term struggle he used in a metaphorical sense, but to the biologists of war [i.e., Hitler and other militant social Darwinists], the struggle for life is a struggle against life; it means elimination, fighting, bloodshed. They ignore the fact that animals do not generally eat or attack those of their own species.14

One evolutionary scientist, Sir Arthur Keith (1866-1955), was horrified to see Hitler pervert Darwin’s theory into a weapon of mass destruction. Shortly after the end of World War II, Keith wrote, “The German Führer, as I have consistently maintained, is an evolutionist; he has consciously sought to make the practice of Germany conform to the theory of evolution.” Hitler failed, Keith concluded, not because the theory of evolution is false, but because Hitler misunderstood evolutionary theory and misapplied it in the realm of power and politics.15

Hannah Arendt agreed that a perversion of Darwinism was at the heart of Hitler’s crimes against humanity. She wrote, “Underlying the Nazis’ belief in race laws as the expression of the law of nature in man is Darwin’s idea of man as the product of a natural development which does not necessarily stop with the present species of human beings.”16

The most convincing evidence of the influence of social Darwinism in Nazi Germany comes from the Wannsee Conference, a meeting of senior Nazi officials in the Berlin suburb of Wannsee in January 1942. The meeting was called to inform top Nazi officials of how the “final solution to the Jewish question” would be carried out. Minutes of the meeting were taken by Adolf Eichmann, one of the architects of the Holocaust. That document, which became known as “Eichmann’s Protocol,” includes this statement (note the phrase I’ve italicized):

In pursuance of the final solution, special administrative and executive measures will apply to the conscription of Jews for labor in the eastern territories. Large labor gangs of those fit to work will be formed, with the sexes separated, which will be directed to those areas for road construction and undoubtedly a large part of them will fall out through natural elimination. Those who remain alive — and they will certainly be those with the greatest powers of endurance — will be treated accordingly. If released they would, being a natural selection of the fittest, form a new cell from which the Jewish race could again develop.17

Evolutionary biologist Stephen Jay Gould (1941-2002) recalled his dismay when he read that statement and discovered that the essential mechanism of Darwinian evolution had been twisted into a rationale for Nazi genocide. Gould wrote:

I can rattle off lists of such misuses [of evolutionary theory], collectively called “social Darwinism.” … But until the fiftieth anniversary of the Wannsee Conference piqued my curiosity and led me to read Eichmann’s Protocol for the first time, I had not known about the absolute ultimate in all conceivable misappropriation — and the discovery hit me as a sudden, visceral haymaker, especially since I had steeled myself to supposed unshockability before reading the document. Natürliche Auslese is the standard German translation of Darwin’s “natural selection.” To think that the key phrase of my professional world lies so perversely violated in the very heart of the chief operative paragraph of the most evil document ever written!18

It’s clear that Hitler and the Nazis viewed war, conquest, and genocide as a biological necessity, as a means of evolutionary struggle and natural selection. Hitler absorbed biological Darwinism at the very least from his early education, and also through such secondary sources as German Darwinian biologist Ernst Haeckel (1834-1919). As Stephen Jay Gould wrote:

Haeckel’s greatest influence was, ultimately, in another tragic direction — National Socialism. His evolutionary racism; his call to the German people for racial purity and unflinching devotion to a “just” state; his belief that harsh, inexorable laws of evolution ruled human civilization and nature alike, conferring upon favored races the right to dominate others; the irrational mysticism that had always stood in strange communion with his grave words about objective science — all contributed to the rise of Nazism. The Monist League that [Haeckel] founded and led … made a comfortable transition to active support for Hitler.19

Hitler also absorbed the social Darwinist militarism of Prussian General Friedrich von Bernhardi (1849-1930). In Germany and the Next War (1911), Bernhardi advocates ruthless German aggression and expansionism, while rationalizing slaughter and conquest in the name of “natural law” and “the law of struggle.” As anthropologist Ashley Montagu notes, Bernhardi invokes “such Darwinian notions as ‘the struggle for existence,’ ‘natural selection,’ and ‘survival of the fittest.'”20 Bernhardi adapted Darwinian natural selection to the realm of conflict between nations, claiming that “struggle is a creator” (that is, a creative force) because “it eliminates” nations and cultures that are weak and inferior. Bernhardi wrote:

Struggle is, therefore, a universal law of Nature, and the instinct of self-preservation which leads to struggle is acknowledged to be a natural condition of existence.

Strong, healthy, and flourishing nations increase in numbers. … They require a continual expansion of their frontiers, they require new territory for the accommodation of their surplus population. … The right of conquest is universally acknowledged. … The instinct of self-preservation leads inevitably to war, and the conquest of foreign soil. It is not the possessor, but the victor, who then has the right.21

These words, drenched in social Darwinism, helped to propel the German Wehrmacht into Poland, France, Belgium, and the Netherlands.

I can understand Professor Richards’ eagerness to delink Darwin from Hitler and the Holocaust, but the good professor has arrived at the wrong answer. We have to follow the evidence where it leads. Hitler was without question a social and biological Darwinian who rationalized Nazism on grounds of natural selection and biological necessity.

We also have to acknowledge that Darwin himself was a racist. The full title of his 1859 book was On the Origin of Species by Means of Natural Selection, or, the Preservation of Favoured Races in the Struggle for Life. In his follow-up, The Descent of Man (first published in 1871), Darwin predicted, “At some future period, not very distant as measured by centuries, the civilised races of man will almost certainly exterminate, and replace, the savage races throughout the world.”22 Candidly, I consider some of Darwin’s opinions of certain races to be unprintable.

At the same time, I want to make clear that while Darwin predicted the extermination of “the savage races,” he didn’t advocate their extermination. Darwin was a racist who divided humanity into “higher” and “lower” races, and he believed that natural selection would eliminate the “lower” races.

Darwin was not himself a social Darwinist and didn’t advocate applying his biological theories to social, political, and economic settings. He opposed slavery, and was appalled that some people misapplied his theories, using them as a rationale for social injustice. In The Descent of Man, he wrote that it is our “instinct of sympathy” that truly elevates us as human beings, and if we lose our ability to sympathize with the weak, the helpless, and the suffering, the result will be a “deterioration in the noblest part of our nature.”23

But Darwin’s views were used by others as a rationale for war and mass murder. Does this mean that Darwin bears moral responsibility for the crimes of Hitler? Does Hitler’s genocidal misapplication of a biological theory undermine the validity of that theory? Absolutely not.

As a scientific theory, evolution has been highly successful and well-verified — just as the reality of the Cosmic Designer is well-verified by the anthropic principle. Darwin’s theory of evolution by natural selection rises or falls on its own scientific merits, regardless of how it was later twisted and misused by the social Darwinists, by Hitler and the Nazis, by Karl Marx and the Communists, or by the Columbine killers.

And the same principle applies to religion.

To be concluded in Part 3. 

Notes:

This is an excerpt from God and Soul: The Truth and the Proof by Jim Denney, copyright 2012, available as an ebook at Amazon.com. For permission to quote from this excerpt, contact the author in care of this blogsite.

12. Judah Rumney, “Biology and War,” Journal of Social Philosophy, Volume 4, Number 4, 1939, 329.

13. Ibid.

14. Ibid., emphasis added.

15. Arthur Keith, Evolution and Ethics (New York: G. P. Putnam’s Sons, 1947), 230.

16. Arendt, 161.

17. Helmut Krausnick and Martin Broszat, Anatomy of the SS State (London: Paladin, 1970), 101; emphasis added.

18. Stephen Jay Gould, Dinosaur in a Haystack: Reflections in Natural History (New York: Harmony, 1996), 315.

19. Stephen Jay Gould, Ontogeny and Phylogeny (Cambridge, MA: Harvard University Press, 1977), 78.

20. Ashley Montagu, Man’s Most Dangerous Myth: The Fallacy of Race (New York: Columbia University Press, 1945), 157.

21. Friedrich von Bernhardi, Germany and the Next War, translated by Allen H. Powles (Deutschland und der Nächste Krieg, Berlin: J. G. Cotta, 1912), http://www.gutenberg.org/cache/epub/11352/pg11352.html.

22. Charles Darwin, The Descent of Man, and Selection in Relation to Sex, 2nd edition (London: John Murray, 1882), 156.

23. Darwin, 134.

Darwin’s Holocaust? (Part 1 of 3)

On Tuesday, April 20, 1999, two students went on a killing rampage at Columbine High School in Colorado. They killed twelve fellow students and one teacher, and injured more than twenty others. Then they turned their guns on themselves.

Why did they do it?

In part, the two teenagers killed because they saw themselves as agents of a Darwinian ethos. One of the killers fantasized in his journal about crashing a plane into a New York City building (almost three years before 9/11), and described how he and his co-conspirator planned to “kick natural selection up a few notches” at Columbine. On the day of the mass killings, this young killer wore a black T-shirt with the words Natural Selection lettered in red across the chest.4

If students see themselves as evolved animals in a concrete jungle, they will behave like predators. This is fact, not conjecture. It happened at Columbine. And it has happened on a vastly larger scale. It’s an indisputable fact that Darwin’s theory of evolution by natural selection has been used as a rationale for slaughtering tens of millions of people.

(Please note: I didn’t say Darwinism caused mass slaughter. I said it was used as a rationale for mass slaughter. Big difference.)

Darwin’s On the Origin of Species was published in 1859. By the 1870s, Darwin’s concept of evolution by natural selection had become so widely popularized that it had spawned a notion called “social Darwinism.” Social Darwinism is the belief that natural selection entitles the strongest in our society to exploit (and even exterminate) the weak. Darwinian capitalists justified the exploitation of workers as “survival of the fittest.” Darwinian progressives and “scientific racists” such as Charles Davenport, Havelock Ellis, Margaret Sanger, and George Bernard Shaw used the same doctrine to justify eugenics programs.

And then there were the totalitarian Darwinists — the Marxists and Nazis.

Among Darwin’s most prominent admirers were Karl Marx and Friedrich Engels, the co-fathers of communist economic theory. In fact, Karl Marx wanted to dedicate Das Kapital to Darwin, but Darwin declined the honor. German-American political theorist Hannah Arendt (1906-1975) observed that people often forget “the great and positive interest Marx took in Darwin’s theories; Engels could not think of a greater compliment to Marx’s scholarly achievements than to call him the ‘Darwin of history.'” Arendt added that Marx and Engels saw the survival of the fittest as an analogy to “Marx’s law of the survival of the most progressive class.” And Engels, in his funeral speech after the death of Marx, said, “Just as Darwin discovered the law of development of organic life, so Marx discovered the law of development of human history.”5

Marxism is a profoundly Darwinian political system, and the Marxist government of the old Soviet Union is neck-and-neck with Maoist China for the title of most murderous regime in human history. Lenin’s forced collectivism and political purges killed more than 4 million people by 1922. The Holodomor, Stalin’s deliberate “plague of hunger” in the Ukraine in the early 1930s, killed between 8 and 12 million Ukrainians. By the time of Stalin’s death in 1953, the USSR had deliberately murdered at least 40 million of its own people.

And Marxist-Maoist China killed an estimated 20 to 40 million Chinese during the “Great Leap Forward” under Mao Zedong. Now add to this the 2 million or more Cambodians who died under the Marxist warlord Pol Pot and the million-plus Ethiopians who were deliberately starved to death under Ethiopia’s Marxist regime. Why are Marxist regimes so bent on killing their own people? Answer: Marxism is pure distilled social Darwinism. When the weak are exterminated and the strong survive, that’s Darwinian natural selection at work in human society.

And what about the Nazis? It’s no accident that the title of Adolf Hitler’s Mein Kampf (“My Struggle”) echoes the Darwinian struggle for survival. By the time Mein Kampf was published in 1925, the broad doctrines of social Darwinism — struggle, the dominance of the will, total amorality, and survival of the fittest — had largely shaped the Zeitgeist of Europe and America.

The social Darwinist phrase “survival of the fittest” was not originally Darwin’s; it was coined by the sociologist Herbert Spencer, and Darwin adopted it only in later editions of his work. But the idea that only the strong should survive became a maxim of the decades from the 1870s through the end of World War II. Social Darwinism formed the basis of Hitler’s thinking about struggle, power, the use of force, and racial purity. In Mein Kampf, Hitler wrote about Entwicklung (meaning “evolution” or “development”), and referred to racially impure human beings as “monsters that are a mixture of man and ape.”6

Robert J. Richards, professor of science and medical history at the University of Chicago, has written a scholarly paper that rejects the claim that Hitler was influenced by Darwin. In the paper, titled “Was Hitler a Darwinian?,” Professor Richards expresses concern that certain “scholars and many religiously conservative thinkers” want to charge Darwin with “moral responsibility for the crimes of Hitler” in order to undermine the theory of evolution.7

As Professor Richards rightly points out, we can trace Hitler’s racist and genocidal views to many influences, including the anti-Semitic views of composer Richard Wagner, and the racial theories of Joseph Arthur Comte de Gobineau (1816-1882) and Houston Stewart Chamberlain (1855-1927). Yet the evidence also shows that Hitler was well acquainted with Darwinism, both as a biological theory and in its corrupted and popularized form, social Darwinism. In Mein Kampf in 1925, Hitler wrote:

If Nature does not wish that weaker individuals should mate with the stronger, she wishes even less that a superior race should intermingle with an inferior one; because in such a case all her efforts, throughout hundreds of thousands of years, to establish an evolutionary higher stage of being, may thus be rendered futile.8

The Darwinian influence on that statement is irrefutable. And in his Nuremberg speech in September 1933, Hitler said:

The differences between the individual races … can be quite enormous and in fact are so. The gulf between the lowest creature which can still be styled man and our highest races is greater than that between the lowest type of man and the highest ape.9

Professor Richards writes that, in the 1940s, Hitler made statements rejecting “the origin of human beings from ape-like ancestors.”10 Maybe Hitler did change his views on Darwinian biology from the 1920s to the 1940s. But we do know that Hitler spoke of being influenced by Darwinian biology during his early education — and the Darwinian influence, he said, alienated him from the Christian religion. In a private conversation on October 24, 1941, taken down verbatim by a stenographer, Hitler said:

The present system of teaching in schools permits the following absurdity: at 10 a.m. the pupils attend a lesson in the catechism, at which the creation of the world is presented to them in accordance with the teachings of the Bible; and at 11 a.m. they attend a lesson in natural science, at which they are taught the theory of evolution. Yet the two doctrines are in complete contradiction. As a child, I suffered from this contradiction, and ran my head against a wall. Often I complained to one or another of my teachers against what I had been taught an hour before — and I remember I drove them to despair.

The Christian religion tries to get out of it by explaining that one must attach a symbolic value to the images of Holy Writ. Any man who made the same claim four hundred years ago would have ended his career at the stake, with an accompaniment of Hosannas.11

I’m not accusing Professor Richards of deliberate revisionism, but it’s clear that his claim that Hitler rejected Darwinism is simply untrue.

Continued in Part 2

Notes:

This is an excerpt from God and Soul: The Truth and the Proof by Jim Denney, copyright 2012, available as an ebook at Amazon.com. For permission to quote from this excerpt, contact the author in care of this blogsite.

4. CNN, “Columbine Killer Envisioned Crashing Plane in NYC,” CNN.com, December 6, 2001, http://archives.cnn.com/2001/US/12/05/columbine.diary/; Peter Langman, Ph.D., “Columbine, Bullying, and the Mind of Eric Harris,” PsychologyToday.com, May 20, 2009, http://www.psychologytoday.com/blog/keeping-kids-safe/200905/columbine-bullying-and-the-mind-eric-harris.

5. Hannah Arendt, Totalitarianism: Part Three of The Origins of Totalitarianism (New York: Harcourt, 1976), 161.

6. Adolf Hitler, Mein Kampf, translated by James Murphy, http://gutenberg.net.au/ebooks02/0200601.txt.

7. Robert J. Richards, “Was Hitler a Darwinian?,” undated document, http://home.uchicago.edu/~rjr6/articles/Was%20Hitler%20a%20Darwinian.pdf.

8. Hitler, ibid.

9. Alvin Z. Rubinstein and Garold Wesley Thumm, The Challenge of Politics: Ideas and Issues (Englewood Cliffs, NJ: Prentice-Hall, 1970), 57.

10. Richards, ibid.

11. Adolf Hitler, Hitler’s Secret Conversations, 1941-1944 (New York: Octagon Books, 1972), 69.

Christopher Hitchens Makes a Startling Admission

Here is an incredible two-minute video clip from the end of the documentary Collision, featuring Christopher Hitchens (author of God is Not Great) and Reformed pastor Douglas James Wilson (Christ Church, Moscow, Idaho). The video was recorded during their promotional tour for the book Is Christianity Good for the World?, based on their series of debates.

In my previous posts about Christopher Hitchens (“Lament for an Atheist, Part I” and “Part II”), I made note of the strange fact that Hitchens, in God is Not Great, devotes an entire chapter to “Arguments from Design,” yet he doesn’t make even the slightest reference to the “fine-tuning” or “anthropic” evidence.

(For a thorough presentation of that evidence, see my book God and Soul: The Truth and the Proof; for a brief introduction, see my blog piece “Is Our Universe ‘the Ultimate Artifact’?”)

Ever since reading God is Not Great, I’ve wondered if Hitchens was completely unaware of the fine-tuning evidence or if he simply avoided the subject because it posed an insoluble problem for him. Here’s what I wrote:

Though Chapter 6 of God is Not Great is entitled “Arguments from Design,” he doesn’t devote even one word to the cosmological case for God. The evidence is hardly new or difficult to research. This concept has been around since 1973, when physicist Brandon Carter introduced an idea he called “the anthropic principle.” It has been explored extensively by such writers as Paul Davies, John Barrow, Frank Tipler, John Gribbin, Martin Rees, and others. I devoted an extensive section of my 2001 book Answers to Satisfy the Soul to the subject.

Why, then, does Hitchens completely ignore the subject in God is Not Great? As I read Hitchens and his fellow “New Atheists,” I’m struck by the fact that they don’t seem merely unpersuaded by the evidence. They seem to either misunderstand the evidence—or worse, they seem altogether ignorant of it. Writing a chapter called “Arguments from Design” without even one mention of the cosmological evidence is like writing a book on the history of Apple Computers without any mention of Steve Jobs. It’s downright bizarre.

Well, now we know that Hitchens did know about the fine-tuning argument—and what he says about fine-tuning in this video stunned me. It will shock anyone who truly groks the implications of Hitchens’ statement. Click “play” and hear it for yourself:

Here’s a transcript of the first part of the conversation between Hitchens and Wilson:

Hitchens: At some point, certainly, we are all asked which is the best argument you come up against from the other side. I think every one of us picks the fine-tuning one as the most intriguing.

Wilson: The Goldilocks idea. Yeah, okay.

Hitchens: Yeah. The fine-tuning, that one degree, well, one degree, one hair different of nothing—that even though it doesn’t prove design, doesn’t prove a Designer, [the fine-tuning] could have all happened without [God]— You have to spend time thinking about it, working on it. It’s not a trivial [argument]. We all say that.

(By the way, when Hitchens says, “We all say that,” he refers to himself, to Richard Dawkins, and to the rest of the New Atheists. And Wilson’s reference to “the Goldilocks idea” refers to the fact that our fine-tuned universe is “just right” for life.)

In this brief clip, Christopher Hitchens has given us all—theists, skeptics, agnostics, atheists, and anti-theists—a lot to think about. And the biggest question on my mind is this: If Hitchens and the other New Atheists know that fine-tuning is not a trivial argument, that you have to spend time thinking about it, why do they omit it or misrepresent it in their books? What are they afraid of?

____________________________

Addendum — Sunday, October 14, 2012 — “NO PROOF!”

Yesterday on Twitter, I sent out some tweets regarding the anthropic (fine-tuned universe) case for God. An atheist tweeted back two words in all caps: “NO PROOF!” I looked up the tweeter’s profile and found that it consisted of a single quotation by Christopher Hitchens: “What can be asserted without evidence can also be dismissed without evidence” (from page 150 of God is Not Great).

Perfect! I love that quote, because (a) it cuts both ways, applying with equal force to atheist assertions; (b) the anthropic case for the theistic worldview consists of a mountain of irrefutable evidence; and (c) Hitchens HID that mountain of evidence from his readers when he wrote God is Not Great.

So I replied to my atheist friend (in a multi-part tweet):

Hi. Your profile quotes Hitchens, “What can be asserted without evidence can also be dismissed without evidence.” But Hitchens acknowledged that there IS evidence for the existence of God, that the evidence is “not trivial” and cannot be dismissed. See the Hitchens video at: [LINK].

This morning, I checked Twitter to see if my atheist friend had replied. In a way, he had. He had BLOCKED me.

Clearly, some atheists can’t handle the truth.

Cosmic Fine-Tuning in Science Fiction

In his science fiction short story “What Continues, What Fails…,” space scientist and Hugo/Nebula-winning author David Brin delves into the deep questions surrounding the mystery of cosmic fine-tuning (the anthropic principle):

The universal rules of Isola’s home cosmos were rife with such fine-tuning. Numbers which, had they been different by even one part in a trillion, would not have allowed subtleties like planets or seas, sunsets and trees.

Some called this evidence of design. Master craftsmanship. Creativity. Creator.

Others handled the coincidence facilely. “If things were different,” they claimed, “there would be no observers to note the difference. So it’s no surprise that we, who exist, observe around us the precise conditions needed for existence!

“Besides, countless other natural constants seem to have nothing special about their values. Perhaps it’s just a matter of who is doing the calculating!”

Hand-waving, all hand-waving. Neither answer satisfied Isola when she delved into true origins. Creationists, Anthropicists, they all missed the point.

Everything has to come from somewhere. Even a creator. Even coincidence.