I've heard more times than I care to remember folks saying things like, "Jesus? Yes! But church? No. Not interested."
I understand why folks might feel that way.
Popular media often casts the church in a negative light. The church is presented as opposing progress, enforcing archaic moral values, and teaching quaint but outdated fairy tales. Sometimes we are our own worst enemies. Church leaders do things that bring disgrace to the church; the news media is not hesitant to report cases of clergy sexual abuse, financial mismanagement, or any of a myriad of other indiscretions. Add to that the plethora of other religious options -- e.g., eastern mysticism, moderate and radical Islam, Buddhism, Hinduism -- that have come with immigration trends. What right do Christians have to tell others that what they believe is wrong (and what we believe is right), that they should cease being Hindus or Jews and become Christians, if we can't get our own act together?
Newsweek magazine (April 2009) went even further when it announced the end of Christian America. Al Mohler, a seminary president and defender of the Christian faith, was quoted in the article saying, "The most basic contours of American culture have been radically altered. The so-called Judeo-Christian consensus of the last millennium has given way to a post-modern, post-Christian, post-Western cultural crisis which threatens the very heart of our culture."
Is it true? Is America turning away from Christianity? Is the church outdated, irrelevant … a social appendix, a vestigial organ that now serves no useful purpose?
I wondered. Being a nerd (who spends most of his time trying not to let folks know that's who I really am), I began to wonder what the world would be like if Jesus hadn't come, if the church didn't exist. Would the church be missed? Has Christianity really made a difference in our world?
This is an especially relevant question for me since UBA exists to help churches spread the good news of Christ, to encourage people to become Christ followers, to help churches make a difference in their communities.
So I began to do one of the things I do best -- read, study, research.
And the conclusion I came to is this: the Christian faith and the church have been foundational and fundamental to our Western way of life -- to many areas related to human rights, including the treatment of women, care for children, the abolition of slavery, and care for the sick -- to capitalism, to our scientific worldview, even to our current educational system.
Over the next few months I'll unpack some of what I've learned in my research. I'll show you what a radical difference Christianity has made. And I hope you'll end with the sense that I have now -- that we can be proud of who we are and what we believe, that the world would be a much different place without Christians and that sharing our faith is one of the best things we can do for both the eternal and temporal destiny of folks.
That doesn't mean I'm blind to the shortcomings of the church. The church isn't perfect. We have our critics, and we need to listen to and learn from them. Church members and church leaders are not always well-behaved. There are times when we should be ashamed and apologetic. But we need not be ashamed of the gospel nor the difference it has made in our world!
Next time I'll start showing you just how.