An Atheist’s Admiration for Jesus

I’m convinced that if everyone in the world practiced the Sermon on the Mount (found in Matthew chapters 5 through 7), 95 percent of the world’s problems would be solved. It might surprise you to know that even atheist extraordinaire Richard Dawkins shares my admiration for the Sermon on the Mount. In The God Delusion, Dawkins writes:

“Jesus, if he existed . . . was surely one of the great ethical innovators of history. The Sermon on the Mount is way ahead of its time. His ‘turn the other cheek’ anticipated Gandhi and Martin Luther King by two thousand years.”

Richard Dawkins, The God Delusion (New York: Houghton Mifflin, 2008), 283.


Addendum, September 30, 2012:

I recently discovered a blog post by Richard Dawkins entitled “Atheists for Jesus” (April 10, 2006). At the top of the page is a photo of Dawkins wearing a T-shirt that reads “Atheists for Jesus.” In the article, Dawkins explains in greater depth his admiration for Jesus as an ethical teacher, while dismissing the theistic worldview of Jesus. Here’s an excerpt:

Of course Jesus was a theist, but that is the least interesting thing about him. He was a theist because, in his time, everybody was. Atheism was not an option, even for so radical a thinker as Jesus. What was interesting and remarkable about Jesus was not the obvious fact that he believed in the God of his Jewish religion, but that he rebelled against many aspects of Yahweh’s vengeful nastiness. At least in the teachings that are attributed to him, he publicly advocated niceness and was one of the first to do so. To those steeped in the Sharia-like cruelties of Leviticus and Deuteronomy; to those brought up to fear the vindictive, Ayatollah-like God of Abraham and Isaac, a charismatic young preacher who advocated generous forgiveness must have seemed radical to the point of subversion. No wonder they nailed him.

“Ye have heard that it hath been said, An eye for an eye, and a tooth for a tooth: But I say unto you, That ye resist not evil: but whosoever shall smite thee on thy right cheek, turn to him the other also. And if any man will sue thee at the law, and take away thy coat, let him have thy cloke also. And whosoever shall compel thee to go a mile, go with him twain. Give to him that asketh thee, and from him that would borrow of thee turn not thou away. Ye have heard that it hath been said, Thou shalt love thy neighbour, and hate thine enemy. But I say unto you, Love your enemies, bless them that curse you, do good to them that hate you, and pray for them which despitefully use you, and persecute you.” . . .

I am no memetic engineer, and I have very little idea how to increase the numbers of the super nice and spread their memes through the meme pool. The best I can offer is what I hope may be a catchy slogan. “Atheists for Jesus” would grace a T-shirt. There is no strong reason to choose Jesus as icon, rather than some other role model from the ranks of the super nice such as Mahatma Gandhi (not the odiously self-righteous Mother Teresa, heavens no). I think we owe Jesus the honour of separating his genuinely original and radical ethics from the supernatural nonsense which he inevitably espoused as a man of his time. And perhaps the oxymoronic impact of “Atheists for Jesus” might be just what is needed to kick-start the meme of super niceness in a post-Christian society. If we play our cards right—could we lead society away from the nether regions of its Darwinian origins into kinder and more compassionate uplands of post-singularity enlightenment?

I think a reborn Jesus would wear the T-shirt. It has become a commonplace that, were he to return today, he would be appalled at what is being done in his name, by Christians ranging from the Catholic Church to the fundamentalist Religious Right. Less obviously but still plausibly, in the light of modern scientific knowledge I think he would see through supernaturalist obscurantism. But of course, modesty would compel him to turn his T-shirt around: Jesus for Atheists.

Dawkins is wrong, of course, when he claims that a “reborn Jesus” would not be a theist. Jesus would know all about the anthropic, fine-tuned universe—a body of evidence that Dawkins actively misleads his readers about in The God Delusion. In fact, I think it is likely that Jesus, being the absolute exemplar of intellectual honesty, would connect his ethical teachings to the evidence for a Cosmic Designer that permeates our growing understanding of cosmology and quantum mechanics.

But I do agree with Dawkins on this: Jesus might well wear a “Jesus for Atheists” T-shirt, because Jesus is for all people, weak and strong, young and old, male and female, believer and nonbeliever. The one who said “Love your enemies,” the one who forgave those who crucified him, would certainly be for atheists. He would not be for atheism, of course, because atheism doesn’t square with reality. He would want everyone to know the truth.

But Jesus welcomed the Samaritan woman at the well, the Roman centurion, the woman caught in adultery, the tax collector, the rich and the poor, the drunks and prostitutes. So why wouldn’t he welcome an atheist as well?

Read Richard Dawkins’ “Atheists for Jesus” in its entirety online.

Fifty Years of Wonder from ‘A Wrinkle in Time’

In September 1962—fifty years ago this month—my third-grade class filed into the school library in search of adventure. I found mine almost immediately—a book called A Wrinkle in Time by Madeleine L’Engle. In my new opinion piece, I recall the profound impact this one book had on my life and career. I hope you’ll read it and let me know what you think. —Jim Denney

The Kennedy-Reagan Truth vs. the Obama Delusion

In his book The New Reagan Revolution, Michael Reagan examined six great economic crossroads of the 20th and 21st centuries. These six critical junctures in the history of the United States serve as economic laboratories for testing two contrasting economic theories. One theory consistently produced economic expansion and sustained growth. The other invariably produced failure and misery. Here are Michael Reagan’s findings:

1. The “Forgotten Depression” of January 1920. During the last year of Woodrow Wilson’s presidency, the economy nosedived. GNP fell 17 percent; unemployment soared from 4 to almost 12 percent. This was the “Forgotten Depression” of 1920. Wilson’s successor, Warren G. Harding, came into office and immediately cut tax rates for all income brackets, slashed federal spending, and balanced the budget. Long before the world ever heard of Ronald Reagan, Harding practiced “Reaganomics.”

“President Harding applied the principles of Reaganomics,” Michael Reagan observed, “even though Ronald Reagan was at that time a nine-year-old boy living in Dixon, Illinois. Harding was not following an economic theory. He was following common sense. He treated the federal budget as you would treat the family budget: When times are tough, cut spending and stay out of debt. Harding also treated his fellow citizens with commonsense compassion: If folks are going through tough times, government should ease their burden and cut their taxes.”

The Harding recovery was astonishingly rapid, beginning just half a year into his presidency. Unemployment fell to 6.7 percent by 1922, and to 2.4 percent by 1923. Harding’s successor, Calvin Coolidge, maintained Harding’s program of low tax rates, balanced budgets, and limited government. The Harding-Coolidge era of prosperity became known as “the Roaring Twenties”—a time of soaring prosperity, stable prices, and boundless optimism.

Obvious conclusion based on the evidence: Reaganomics works.

2. The Great Depression. Coolidge was succeeded by Herbert Hoover. In the eighth month of Hoover’s presidency, the stock market crashed—the infamous Crash of 1929. Many factors led to the Great Depression, but the Crash was the precipitating event. Hoover had failed to learn the lessons of the Harding-Coolidge years, so he responded by raising taxes (hiking the top marginal rate from 25 to 63 percent), imposing protectionism (the Smoot-Hawley Tariff Act), and boosting government spending by 47 percent, driving America deep into debt. Hoover’s actions worsened the Depression. A defeated Herbert Hoover bequeathed a ruined economy to Franklin Delano Roosevelt.

FDR took office at a time when 25 percent of the nation’s workforce was unemployed. He, too, ignored the lessons of the “Forgotten Depression,” and doubled down on Hoover’s failed tax-and-spend policies, applying the economic theory known as Keynesianism (after British economist John Maynard Keynes). The Keynes-FDR approach involved deficit spending, soak-the-rich tax policies, and big-government make-work programs (the New Deal). FDR and a compliant Congress hiked personal and corporate income tax rates, estate taxes, and excise taxes.

Michael Reagan wrote, “From 1937 to 1939, the stock market lost almost half its value, car sales fell by one-third, and business failures increased by one-half. From 1932 to 1939, the U.S. racked up more debt than in all the preceding 150 years of America’s existence. By early 1939, as the Great Depression was in its tenth year, unemployment again climbed past the 20 percent mark.”

Many Americans credit FDR with “getting America through the Depression.” In reality, FDR’s policies prolonged the Depression. In a time of catastrophic unemployment, Roosevelt made it prohibitively expensive to hire people, making a terrible human tragedy even worse. While thousands of U.S. banks failed under FDR’s policies, across the border in Canada not one bank failed—because Canadian banks were not hamstrung by FDR’s foolish over-regulation. In FDR’s Folly, historian Jim Powell questions the disturbing FDR legacy:

Why did New Dealers make it more expensive for employers to hire people? Why did FDR’s Justice Department file some 150 lawsuits threatening big employers? Why did New Deal policies discourage private investment without which private employment was unlikely to revive? Why so many policies to push up the cost of living? Why did New Dealers destroy food while people went hungry? To what extent did New Deal labor laws penalize blacks? Why did New Dealers break up the strongest banks? . . . Why didn’t New Deal public works projects bring about a recovery? Why was so much New Deal relief spending channeled away from the poorest people?

In May 1939, a demoralized and defeated Henry Morgenthau, FDR’s treasury secretary, told the House Ways and Means Committee, “We are spending more than we have ever spent before and it does not work. . . . I want to see people get a job. I want to see people get enough to eat. We have never made good on our promises. . . . After eight years of this administration we have just as much unemployment as when we started. . . . And an enormous debt to boot!”

Many people mistakenly believe that World War II lifted America out of the Great Depression. Not true. What WWII did was take 12 million men out of the workforce and send them into war, which ended unemployment. But all the other signs of a damaged economy remained during the war: low stock prices, depressed private investment, and depressed consumer demand.

Roosevelt and his successor, Harry Truman, planned to impose an even bigger Second New Deal after the war. Fortunately, Congress refused, and chose instead to cut taxes and cut spending—the same commonsense “Reaganomics” approach that had produced prosperity during the 1920s. The result: a post-war economic boom from the late 1940s through the 1950s. Had FDR and Truman gotten their way, the country would have slipped right back into recession, if not a second Great Depression.

Obvious conclusion based on the evidence: Keynesianomics fails, prolonging economic hardship and misery, while Reaganomics works again.

3. The Recession of 1960 and 1961. When John F. Kennedy came into office, he faced a jobless figure of 7.1 percent. Wanting the economy to keep up with the growing workforce, JFK addressed the Economic Club of New York in December 1962 and proposed a bold notion: “It is a paradoxical truth that tax rates are too high today and tax revenues are too low and the soundest way to raise the revenues in the long run is to cut the rates now. . . . The purpose of cutting taxes now is not to incur a budget deficit, but to achieve the more prosperous, expanding economy which can bring a budget surplus.”

Those are the words of John F. Kennedy—and he was preaching Reaganomics. Kennedy was assassinated less than a year later, but his successor, Lyndon Johnson, lobbied hard for the JFK tax cuts, and he signed them into law in 1964. As a result of JFK’s Reaganesque economic plan, the economy expanded by a dramatic 5 percent and personal income increased by 7 percent. Gross national product grew from $628 billion to $672 billion, corporate profits rose by an explosive 21 percent, auto production by 22 percent, and steel production by 6 percent, while unemployment plummeted to 4.2 percent—an eight-year low. The Kennedy-Johnson tax rate cuts produced a sustained economic expansion lasting nearly a decade.

Obvious conclusion based on the evidence: Reaganomics works again.

4. The Recession of the 1970s. This recession began in November 1973 under Nixon and ended (technically) in March 1975 under Gerald Ford—a 16-month recession. According to the economists’ graphs and charts, real GDP was on the rise by the spring of 1975, yet unemployment and inflation remained painfully high throughout the rest of the 1970s. Americans continued to suffer joblessness amid spiraling prices after the recession officially ended.

In 1976, Ronald Reagan narrowly lost the primary race against Gerald Ford. Reagan was convinced that he knew how to solve the long and painful recession of the 1970s, but he was forced to watch from the sidelines as Gerald Ford and Jimmy Carter—two befuddled, clueless Keynesians!—battled each other for the White House. On October 8, 1976, at the height of the presidential race between Carter and Ford, Reagan outlined the principles of Reaganomics in a syndicated newspaper column entitled “Tax Cuts and Increased Revenue.” He wrote:

Warren Harding did it. John Kennedy did it. But Jimmy Carter and President Ford aren’t talking about it. The ‘it’ that Harding and Kennedy had in common was to cut the income tax. In both cases, federal revenues went up instead of down. . . . Since the idea worked under both Democratic and Republican administrations before, who’s to say it couldn’t work again?

Reagan had majored in economics at Eureka College and had spent years studying the great free market economists such as Adam Smith (The Wealth of Nations), Friedrich Hayek (The Road to Serfdom), and Milton Friedman (Capitalism and Freedom). While Reagan’s opponents ignorantly wrote him off as an “amiable dunce,” it is clear that Reagan correctly and insightfully diagnosed the ailing economy of the 1970s. Unfortunately, Reagan would have to wait more than four years for the opportunity to put his prescription into practice.

Obvious conclusion based on the evidence: Keynesianism fails again.

5. The Jimmy Carter Stagflation Recession of 1980. After Jimmy Carter was inaugurated in January 1977, he inflicted the failed FDR-style Keynesian approach on the country—an approach which says the federal government can spend its way to prosperity. The result of Carter’s policies was an economic disaster called “stagflation”—slow economic growth coupled with the misery of rampant inflation and high unemployment.

By the 1980 election, America under Carter was in a full-blown recession. The American people had suffered years of double-digit interest rates, double-digit inflation, and double-digit unemployment, plus blocks-long lines at the gas station. Ronald Reagan defeated Carter in a landslide. Newsweek observed: “When Ronald Reagan steps into the White House . . . he will inherit the most dangerous economic crisis since Franklin Roosevelt took office 48 years ago.”

Reagan moved confidently and quickly to slash tax rates and domestic spending. Under his leadership, the top marginal tax rate dropped from 70 percent to 28 percent. Michael Reagan described the results:

Tax cuts generated 4 million jobs in 1983 alone and 16 million jobs over the course of Ronald Reagan’s presidency. Unemployment among African-Americans dropped dramatically, from 19.5 percent in 1983 to 11.4 percent in 1989. . . .

The inflation rate fell from 13.5 percent in 1980 . . . to 3.2 percent in 1983. . . .

The Reagan tax cuts nearly doubled federal revenue. After his 25 percent across-the-board tax rate cuts went into effect, receipts from both individual and corporate income taxes rose dramatically. According to the White House Office of Management and Budget, revenue from individual income taxes went from $244.1 billion in 1980 to $445.7 billion in 1989, an increase of over 82 percent. Revenue from corporate income taxes went from $64.6 billion to $103.3 billion, a 60 percent jump.

This was the fulfillment of the “paradoxical truth” that John F. Kennedy spoke of in his 1962 speech: “Cutting taxes now . . . can bring a budget surplus.” Both JFK and Ronald Reagan predicted that lower tax rates would generate more revenue. This “paradoxical truth” worked exactly as predicted.
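For readers who want to check the percentage claims in the OMB revenue figures quoted above, the arithmetic is straightforward. Here is a minimal sketch in Python (the dollar figures, in billions, are the ones cited; the function name is just illustrative):

```python
# Check the percentage increases implied by the OMB revenue
# figures quoted above (billions of dollars, 1980 vs. 1989).
def pct_increase(before: float, after: float) -> float:
    """Percentage growth from 'before' to 'after'."""
    return (after - before) / before * 100

individual = pct_increase(244.1, 445.7)  # individual income tax receipts
corporate = pct_increase(64.6, 103.3)    # corporate income tax receipts

print(f"Individual income tax revenue: up {individual:.1f} percent")
print(f"Corporate income tax revenue: up {corporate:.1f} percent")
```

The results come out to roughly 82.6 percent and 59.9 percent, consistent with the “over 82 percent” and “60 percent jump” figures in the passage above.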

At a White House press conference in 1981, President Reagan took reporters to school, explaining that the principles of Reaganomics have been known for centuries. Lower tax rates invariably bring more money into the treasury, he explained, “because of the almost instant stimulant to the economy.” This principle, Reagan added, “goes back at least, I know, as far as the fourteenth century, when a Moslem philosopher named Ibn Khaldun said, ‘In the beginning of the dynasty, great tax revenues were gained from small assessments. At the end of the dynasty, small tax revenues were gained from large assessments.'”

The principles of Reaganomics have been proved true—and Keynesian theory has been exposed as a fraud once more.

6. The Obama Recession. To be fair, what I call “The Obama Recession” actually began under George W. Bush, triggered by the collapse of the housing bubble. I think it’s fair to call it The Obama Recession because, when Barack Obama took office, he threw $814 billion of stimulus money at the recession (plus billions more in corporate bailouts, “Cash for Clunkers,” Solyndra-style green energy boondoggles, and other prime-the-pump schemes). He promised to jump-start the economy and hold unemployment below 8 percent. This was weapons-grade Keynesianism, practiced on a scale never before witnessed in human history. After spending so much money on the “cure,” Obama now owns that recession.

If Keynesian theory works at all, the Obama stimulus plan should have completely turned the economy around. But the stimulus plan—officially known as the American Recovery and Reinvestment Act of 2009—not only failed to make a splash, it didn’t make a ripple. Even after the government pumped nearly a trillion dollars of borrowed money into the economy, unemployment nudged up toward the 10 percent mark. Today, unemployment is officially below 9 percent—but the actual jobless rate is much higher.

In 2010, the Population Reference Bureau calculated the U.S. workforce at just over 157 million people. The Bureau of Labor Statistics reports that there are 131 million jobs in America. That leaves 26 million people jobless—about 16 percent of the total workforce. But it gets worse: many of those jobs are part-time, and many people hold two or more of them, so the actual jobless number is certainly far higher than 16 percent—perhaps 20 percent or more.
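The back-of-the-envelope estimate above is easy to reproduce; a quick sketch using the same round figures:

```python
# Back-of-the-envelope jobless estimate from the figures cited above:
# ~157 million in the workforce (Population Reference Bureau, 2010)
# vs. ~131 million jobs (Bureau of Labor Statistics).
workforce = 157_000_000
jobs = 131_000_000

jobless = workforce - jobs
rate = jobless / workforce * 100  # percent of the workforce without a job

# prints: 26,000,000 jobless, about 16.6 percent of the workforce
print(f"{jobless:,} jobless, about {rate:.1f} percent of the workforce")
```

Note that this simple subtraction ignores part-time positions and multiple jobholders, which is exactly why the passage argues the true rate is higher still.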

Obvious conclusion based on the evidence: Keynesianomics fails catastrophically.

Unfortunately, the high priests of the Keynesian religion refuse to see the light. President Obama clings to his delusional Keynesian faith, insisting that all we have to do is throw more money at the economy with another stimulus bill! That is economic insanity. Former Reagan aide Peter Ferrara wrote in the Wall Street Journal:

The fallacies of Keynesian economics were exposed decades ago by Friedrich Hayek and Milton Friedman. Keynesian thinking was then discredited in practice in the 1970s, when the Keynesians could neither explain nor cure the double-digit inflation, interest rates, and unemployment that resulted from their policies. Ronald Reagan’s decision to dump Keynesianism in favor of supply-side policies—which emphasize incentives for investment—produced a 25-year economic boom. That boom ended as the Bush administration abandoned every component of Reaganomics one by one, culminating in Treasury Secretary Henry Paulson’s throwback Keynesian stimulus in early 2008.

Mr. Obama showed up in early 2009 with the dismissive certitude that none of this history ever happened, and suddenly national economic policy was back in the 1930s. Instead of the change voters thought they were getting, Mr. Obama quintupled down on Mr. Bush’s 2008 Keynesianism.

Keynesian theory is every bit as superstitious as believing in astrology or a flat Earth or the good-luck powers of a rabbit’s foot. The facts of history are beyond dispute. The old Keynesian superstition has failed every time it was tried. But Keynesian fundamentalists like Barack Obama continue to live in a state of denial.

We know what works. Nearly a century of economic history proves it. Now we need a president and a Congress with the common sense to apply the lessons of history to the economic crisis of today.

“Nothing New Here”

After posting my previous entry, “Who Made God?,” I went to Twitter and tweeted about the blog (I’m @AnswersAuthor, and there’s a “follow” button at the bottom of this page). Here’s a typical message I tweeted: “#Atheists like #ChristopherHitchens ask, ‘If God made the universe, who made God?’ Find the answer to that question at”

I got a wide range of responses, both complimentary and otherwise. The uncomplimentary tweets included: “Claptrap. Self-devolving prose.” “What a pathetic specimen you are, clinging to your superstition for dear life.” “I feel ever so slightly dumber after reading some of that.”

To the twitterer who felt “ever so slightly dumber,” I replied, “Sorry my blog made you feel dumb. That was not my intent. Reread two more times—I’m sure you’ll feel smarter.” He tweeted back, “I’m afraid if I read more the result will irreversible.” To which I replied, “Then, by all means, avoid exposure to new ideas and information. I wish you well.” Ah, but we weren’t quite done. He tweeted back: “Nothing in your writing was new.”

At that point, I knew exactly how this thing would play out. I’ve spent the past 25 years studying the evidence and assembling my own case for God. I know for an absolute fact that I’ve put together a case (especially the “Who Made God?” argument) that is not in print anywhere else. I know how groundbreaking these ideas are. So for this twitterer to say there’s nothing new here is so obviously false that I knew he was bluffing. He either hadn’t read the blog, or he didn’t understand the blog, or he was pretending to have knowledge he just didn’t have.

Well, it was time for him to put up or shut up, so I tweeted back: “Excellent. You can cite for me which ideas in the article you’ve seen before and where you read them?” And, as I knew he would, he tweeted back: “Or I could waste no more of my time on you.” To which I replied, “That’s fine. As I said a few tweets ago, I wish you well.”

And I meant it. I do wish him well. I wish nothing but the best for all of my critics on Twitter and elsewhere. I hope they find the truth they are so strenuously, belligerently trying to avoid and suppress.

For some reason, my atheist critics on Twitter are usually angry and hostile, and their attacks are disproportionately personal and vindictive. I don’t know why that is. Is it the atheist mindset itself that makes people so hostile? Or is it something about Twitter, and its 140-character limitations, that makes people behave badly? I really don’t know.

One twitterer attacked my Twitter profile bio, saying, “Even his bio is a self-aggrandizing word salad.” My bio reads: “Skeptical believer, Christian anthropicist, Hayek-Friedman-Reagan small-gummint classical liberal, post-partisan author.” A word salad is defined as a string of incomprehensible words having no apparent connection to one another. But my Twitter bio is a highly succinct and accurate summation of who I am. It describes me.

So I replied (in a series of tweets), “You are kidding me! Attacking my bio, dude? Really? A rational response would be: Examine my sources, confront any faulty logic, and show me the error of my ways. I don’t know why my humble little blog is so threatening to you, but feel free to simply avoid new ideas and reject new information. Ad hominem attack is so weak and anti-rational.”

The twitterer replied, “But so apropos in this case and so enjoyable, Skippy!”

Now, here’s a weird thing I’ve noticed: For some reason, atheists on Twitter like to call their opponents “Skippy.” I’ve encountered that multiple times. I replied (over several tweets): “Atheists’ Handbook, p. 37: ‘When out of intellectual ammo, call the other guy Skippy.’ You’re the third atheist to call me that. Weak, irrational ad hominem attack is never logically apropos, but when that’s all you’ve got . . .”

I didn’t hear back.

Another atheist looked at my blog and tweeted, “An ignorant response which fails horribly. The atheist Hitchens’ question still stands, even though you word-play. Pathetic.”

So I responded, “Know what’s really pathetic? Asserting that something ‘fails horribly’ or is ignorant wordplay without backing up the assertion. Christopher Hitchens said, ‘What can be asserted without evidence can be dismissed without evidence.’ Where’s your evidence? #Weak”

The atheist replied, “What do #atheists need evidence for? When Hitchens said that, he was speaking of theists and their assertions. Pay attention.”

Well, of course, Hitchens was speaking of theists and their assertions. But the Hitchens principle cuts both ways. If a theist makes an assertion without evidence, it can be dismissed without evidence. And if an atheist or anti-theist makes an assertion, it too can be dismissed on the same basis.

My atheist friend on Twitter asserted that my blog was failed, ignorant wordplay. Okay, that’s an assertion. Now, back up your assertion with facts. What did I write that demonstrates ignorance? Where does my logic fail? Where does my evidence fail? If you just flatly assert that I’m wrong, yet you can’t tell me why I’m wrong and where I went wrong (especially when everything I’ve written is sourced and footnoted), then frankly, you’re the one who looks pathetic.

So I replied: “Hitchens was stating a broad principle: If you make a claim, back it up with fact. And yes, atheism makes assertions.”

The atheist tweeted back, “#Atheism doesn’t make assertions. You seem confused.”

I replied, “Atheism is your dogma. It blinds you to new information and new ideas.”

The atheist replied: “Why are you confused over the definition of #atheism? It’s very clear. There is no mistake. I can help you if you want. #Atheism is the position where one lacks belief in a god. Therefore, it’s not dogma. To say it’s dogma makes you look ignorant.”

Rather than reply within the 140-character restraints of Twitter, I decided to write this blog entry. I understand why my atheist friend thinks only theists need to provide evidence. I understand why he thinks that atheism makes no assertions. I understand why he denies that atheism is dogma. And I can explain why he’s wrong.

Atheist philosopher Antony Flew (who, late in life, converted to theism) divided the atheist community into two camps, “strong atheism” and “weak atheism.” Strong atheism asserts that no deities exist. Weak atheism is lack of belief in a deity without an explicit assertion that no deities exist. So my atheist friend on Twitter claims to be (by Flew’s definition) a “weak atheist.”

An assertion that is common to both strong and weak atheism is the assertion of materialism. This assertion states that the entire universe consists of nothing but matter and energy, and all phenomena in the universe, including human consciousness, result from material interactions. Science fiction writer Isaac Asimov typified the materialist view when he wrote:

The molecules of my body, after my conception, added other molecules and arranged the whole into more and more complex form. . . . In the process, I developed, little by little, into a conscious something I call “I” that exists only as the arrangement. When the arrangement is lost forever, as it will be when I die, the “I” will be lost forever, too.

And that suits me fine. No concept I have ever heard, of either a Hell or of a Heaven, has seemed to me to be suitable for a civilized rational mind to inhabit, and I would rather have the nothingness.

In my blog entry, “Who Made God?,” I present what I consider to be a compelling case that this atheist assertion is FALSE. The evidence shows that there is more to the universe than materialism, and that Mind is the ground of all reality. Any fair-minded, objective reader would have to agree that I have presented ideas and evidence that are AT LEAST worthy of consideration.

If, however, you are blinded by your dogma, if you are closed to new ideas and new information and your mind is set in stone, you will not give my ideas fair consideration. You’ll dismiss those ideas in knee-jerk fashion as “claptrap” and “ignorant wordplay.” You’ll mock the author of those ideas as “a pathetic specimen clinging to superstition.” You’ll claim that reading it actually makes you dumber. You’ll say it’s nothing new.

The one thing you will not do is actually examine those ideas and consider the evidence. You won’t even try to challenge the author’s reasoning, because to actually think about these ideas would threaten your dogma. It would mean honestly and objectively asking yourself, “What if the author is right?”

Many people assume the word dogma applies only to religious belief and doctrine. Not true. A dogma is a set of opinions or beliefs that are held with such tenacity that one becomes closed to new ideas and new information. If you find yourself feeling angry or annoyed by the ideas I presented in “Who Made God?,” there’s a good chance you are blinded by your dogma. A non-dogmatic person might disagree and calmly challenge those ideas. Or a non-dogmatic person might simply shrug and walk away. But only a dogmatist becomes hostile and insulting in response to a reasonably expressed viewpoint.

And these comments aren’t directed only at atheists. I have found that there are two groups of people who are hostile to the scientific evidence for God. One group, of course, is dogmatic atheists. The other group is dogmatic Christians. For some reason, extremely dogmatic Christians tend to hate the idea that the existence of God might be provable. They seem to think there is something noble about “blind faith,” belief without evidence.

But without evidence, how can you know what to believe?

Elton Trueblood said, “Faith is not belief without proof, but trust without reservation.” I agree. And once you’ve seen the evidence, once you’ve experienced the proof, then you can trust unreservedly. Whether believer or atheist, we must have the courage to follow the evidence. Bart D. Ehrman put it this way: “The search for truth takes you where the evidence leads you, even if, at first, you don’t want to go there.”

Dogmatic people invariably get mad when the truth pokes holes in their dogma. That’s why this blog is called, “The Truth Will Make You Mad.” Instead of getting mad, set yourself free. If you really want to know the truth, you owe it to yourself to open your mind and examine the evidence.

Who knows? If you actually THINK about my ideas and evidence, you just might find a way to prove me wrong.


Postscript, September 3, 2012:

The atheist twitterer responded to my blog entry about as I expected. I’ll take the liberty of translating Twitterspeak to English—for example, changing “u” to “you,” “ur” to “your,” and so forth—for the sake of clarity. He tweeted:

“Your blog fails because you continue to be confused over what atheism means. Strong/weak are not real subcategories either.”

“An atheist is one without belief in a god. Strong/weak merely define what view atheists have in addition to atheism.”

“I refer you to my blog in response to your ignorance about atheism.”

His blog delves into the origin of the word atheism to explain the difference between “without belief in a god” versus “a belief that there is no god.” Yeah, I get that. And I explicitly acknowledged that distinction above.

As to whether strong/weak atheism (also called positive/negative atheism) are real subcategories, his argument is not with me but with atheist scholars like Antony Flew and Michael Martin. In the glossary to The Cambridge Companion to Atheism (New York: Cambridge University Press, 2007, pages xvii and xviii), Martin writes:

Negative atheism: absence of belief in any god or gods. More narrowly conceived, it is the absence of belief in the theistic God. Cf. positive atheism. . . .

Positive atheism: disbelief in any god or gods. More narrowly conceived, it is disbelief in the theistic God. Cf. negative atheism.

Okay, enough hair-splitting. My atheist friend’s next tweet:

“Until you can come up with actual evidence for a god, you will continue to have the burden of proof, and we will sit, point and laugh at you.”

That burden began to shift as far back as September 1973 when physicist Brandon Carter presented a paper (“Large Number Coincidences and the Anthropic Principle in Cosmology”) at the Copernicus symposium in Kraków, Poland. Carter described some of the odd coincidences in the universe—a multitude of seemingly unrelated laws of physics that appear to be coordinated and fine-tuned to produce life. Carter called this concept “the anthropic principle,” also known as the “fine-tuned universe” concept. I address it in greater detail in “Is Our Universe ‘the Ultimate Artifact’?”

In the years since Brandon Carter delivered that paper at the Kraków symposium, the evidence has been steadily growing that the universe seems to have been deliberately fine-tuned to produce life, and that Mind is essential to the existence of the universe. That is the foundation of the case I have assembled in my blog entries, “Is Our Universe ‘the Ultimate Artifact’?” and “Who Made God?” 

Is the fine-tuned universe proof of the existence of God? Some scientists find it convincing. Others do not. Those who are convinced include theoretical physicist Freeman Dyson, physicist Frank Tipler, astronomer Allan Sandage, and Francis Collins, former head of the Human Genome Project and director of the National Institutes of Health under President Obama.

Even scientists who are unconvinced recognize that the anthropic evidence is powerful and at least gives the unmistakable appearance of pointing to God. Atheist physicist George Greenstein wrote:

As we survey all the evidence, the thought insistently arises that some supernatural agency—or, rather, Agency—must be involved. Is it possible that suddenly, without intending to, we have stumbled upon scientific proof of the existence of a Supreme Being? Was it God who stepped in and so providentially crafted the cosmos for our benefit? …

It is a matter of taste how one deals with that notion. Those who wish are free to accept it, and I have no way to prove them wrong. But I know where I stand. . . . I reject it utterly.

[George Greenstein, The Symbiotic Universe (New York: William Morrow, 1988), pp. 27 and 87.]

So Greenstein clearly states that the anthropic evidence appears to point to God, though he himself rejects that notion. The evidence Greenstein refers to is essentially the evidence I present in “Is Our Universe ‘the Ultimate Artifact’?” I take those ideas even further in “Who Made God?”

Those two blog entries contain about 4800 words of rational scientific evidence, yet they form just a brief introduction to the mountain of evidence that exists. Even so, they dismantle the ignorant atheist canard that there’s “no evidence” for God.

If my atheist friend is correct and the burden of evidence is on me, then hey, no problem, I have delivered the goods. It’s there in those blogs. He and his fellow atheist twitterers are either unwilling or unable to deal with that evidence, because over the past few days, not one of them has challenged or refuted a single word in those blogs.

My atheist friend can continue splitting hairs about the definition of atheism if he likes, and he can “sit, point and laugh” at the evidence and the truth. But the burden is now on my atheist friend to put up or shut up—and to come up with some facts and intelligent reasoning to counter what I have presented.

The atheist twitterer concludes:

“There is no ‘scientific evidence’ for your god. Atheists appear hostile to your irrational beliefs, not your invisible evidence.”

You, the reader, can judge for yourself if these blogs begin to build a case for a Cosmic Designer, as I claim—or if they are nothing but “irrational beliefs” and “invisible evidence,” as my atheist friend claims.

Oh, and one more thing: Christopher Hitchens, author of God Is Not Great, has acknowledged that the fine-tuning evidence is “intriguing” and “not trivial.” You can hear it from Hitchens’ own lips at “Christopher Hitchens Makes a Startling Admission.” Here’s the essential part of Hitchens’ statement [note: when Hitchens says “we,” he means leading atheists such as Richard Dawkins, Sam Harris, and himself]:

At some point, certainly, we are all asked which is the best argument you come up against from the other side. I think every one of us picks the fine-tuning one as the most intriguing. . . . Even though it doesn’t prove design, doesn’t prove a Designer . . . you have to spend time thinking about it, working on it. It’s not a trivial [argument]. We all say that.

If Christopher Hitchens, the atheists’ atheist, acknowledged that the fine-tuning evidence is “not trivial,” that it is “most intriguing,” that “you have to spend time thinking about it, working on it,” then anyone who says there is “no scientific evidence” for God is either intellectually dishonest or ignorant.



The atheist twitterer in question has asked that I give out his Twitter username (@TedTheAtheist) and the link to his blog reply. Done.

“A person with a fixed idea will always find some way
of convincing himself in the end that he is right.”

Mathematician Atle Selberg