Doubting Christian Statistics
You’ve probably heard various statistics about Christians, such as the claim that Christians have slightly higher divorce rates than non-Christians. Such statistics should be treated with a certain level of skepticism. If the data were based on a random sample of regenerate Christians and accurately reflected the reality of those polled, then it would be reasonable to accept the results (within the margin of error), but reality is not like that. There are at least three reasons to be careful about the conclusions we draw from such data:
First, in polls (as in real life) it is simply not possible to determine who is a regenerate Christian. For instance, according to a page on the Barna website summarising their statistics on born again Christians,
In Barna Research Group studies, born again Christians are not defined on the basis of characterizing themselves as “born again” but based upon their answers to two questions. The first is “have you ever made a personal commitment to Jesus Christ that is still important in your life today?” If the respondent says “yes,” then they are asked a follow-up question about life after death. One of the seven perspectives a respondent may choose is “when I die, I will go to Heaven because I have confessed my sins and have accepted Jesus Christ as my savior.” Individuals who answer “yes” to the first question and select this statement as their belief about their own salvation are then categorized as “born again.”
Yet even with this careful methodology, 40% of American adults come out as born again Christians, which seems too high a figure for the number of regenerate Christians. Other results from the webpage cast further doubt. For instance, “About one-third of born agains (33%) believe that if a person is good enough they can earn a place in Heaven.” Admittedly, some who respond yes to this could believe that nobody is actually good enough, but 28% also believe Jesus sinned, and only 32% believe in moral absolutes. These inconsistent beliefs ought to make us suspicious of other statistics about this group.
If an individual tells us that they are a Christian, but their beliefs and/or actions are inconsistent with that claim, then we may rightly wonder whether they are actually regenerate. We should apply the same skepticism to polls.
Second, people misreport their own behaviour. An article on the difference between actual church attendance and church attendance reported in polls describes how researchers found that, when asked whether they had attended church in the previous week, twice as many people claimed to have attended as actually did.
As the article says:
Researchers who study how people answer survey questions have long known that responses to behavioral questions represent more (or less) than “just the facts.” When asked how many times they ate out last week, how frequently they have sex, and whether or not they voted in the last election, most people report what they usually do, what they would like to do or what they think someone like them ought to do. The question that Gallup asks, “Did you, yourself, happen to attend church or synagogue in the last seven days?” provokes similar, often less than factual responses.
Christians may, on average, be more inclined to believe that they ought to do the things they are polled about, but they also ought to tell the truth if they did not actually do them, so it is unclear how misreporting affects statistics for Christians, especially in comparison with the statistics for non-Christians.
Third, statistics are easily misinterpreted. Does a statistic have the implication that you draw from it? For example, in her book, Getting Serious About Getting Married, Debbie Maken asks, “Why are [Christians’] lives as marred as non-Christians’? Why have two-thirds of Christian singles thrown away their virginity?” The reference she gives is to an article by Julia Duin called No One Wants to Talk About It, subtitled, Why are evangelical singles sleeping around?, which says:
My research turned up a few rough figures. In their 1991 book, Single Adult Passages: Uncharted Territories, Carolyn Koons and Michael Anthony had surveyed 1,500 single Christians. They found significant levels of sexual activity. Of the women surveyed, 39 percent were virgins. I also got hold of two similar surveys, one a singles survey from Peachtree Presbyterian Church in Atlanta and the other a survey of single Southern Baptists. Both revealed only a third of the respondents had abstained from sex.
I do not intend to attempt to explain away the apparent problem of pre-marital sex amongst Christians, but I do wish to illustrate how easily statistics can be misinterpreted. The proportion of Christian singles who have abstained from sex since their conversion will be higher than the proportion who have abstained from sex completely, and if this distinction is overlooked the problem of Christians engaging in pre-marital sex will be overblown. Also, do the surveys distinguish those who had engaged in sex only within marriage but had since become single? Probably not, and if such people are counted as sexually active then the statistics make things look worse than they are.
There is great potential for misinterpretation of statistics, for instance by wrongly assuming that correlation implies causation. Both sides of an argument may use statistics to reach contradictory conclusions, and we should be particularly careful when a conclusion aligns with our preferences. I hope to expand on these ideas later.
I am not saying that we cannot learn from statistics about Christians, but we should treat them with care; they are not as straightforward as they appear. How much does the statistic that a third of the Peachtree Presbyterian Church singles had abstained from sex tell us about chastity there, or about regenerate church attendance, or about the ages and backgrounds of their people at conversion? Can the result be assumed to hold for another Presbyterian church, or for Christians more generally, or might differences in teaching, support, culture and so on make the comparison unrealistic?
The Holy Spirit has a powerful sanctifying effect on the behaviour of regenerate Christians, yet polls tend to show little difference between the behaviour of Christians and non-Christians; sometimes Christians come out worse. The reality is surely not that bad, though it is still far worse than it should be. In the end, what the polls say does not matter for us: we as Christians are each called to live righteous lives, to be salt and light. Though in the polls Christians may look the same as the world, we ought to show those who see us that the same is not true of us. And if we are acting like the world, we should ask ourselves why.