As we all know, atheism is on the rise in America and the cultural influence
of the Church is clearly waning.
I thought I would put together a quick list of why I believe this is so.
1) The Sexual Revolution. Once sexual mores went out the window,
so did religion. Religion has regulations about sex, and if you want to live
in a society that has few limits on sex, you must live in a religion-free
society. Ross Douthat writes about this in his book Bad Religion.
2) Family Breakdown. Mary Eberstadt argues in How the West Really
Lost God that "In a way that hasn’t been well understood so far, it appears
that the great jigsaw puzzle of secularization has been missing a critical piece.
Religious vibrancy and family vibrancy go hand in hand."
Funny how this is related to number 1. The sexual revolution said you don't
have to be married to have sex and you can sleep with any consenting adult.
Well, the problem is that you might end up having a child without that stable
family structure.
3) Feminization of American culture. If you step into church today
most likely the pews are filled with women. I've noticed that many
churches are geared to women. They have sentimental and
emotional sermons. Preachers often cry. I can see why churches
would gear their churches to women, since more and more women
are the head of their households. (See number 2).
The problem is that it's alienating the other 50 percent of the population! You are
going to lose influence when half the population can't
relate to you.
4) Materialism. Americans like their new iPhones, cars, and big screen
televisions. Well, that takes money. So that means you might need
to take a job that has long or odd hours... like working Sunday morning.
I don't know how often I've heard, "Yeah, I can't go to church
or Bible study because of work." Work is noble, but not when it is
getting in the way of your spiritual life.
These are a few of the things eroding the influence of
Christianity in America. Tell me what you think. Do you have
any other theories?