Bethesda Temple Apostolic Church, Dayton, Ohio

I see a striking parallel between the decline of America as a nation and the decline of American Christianity. America was founded on a document, the Constitution, which recognized that all rights are granted by God. But over the past 50 years the Constitution has been discounted and rewritten by progressives who find it outdated.

Christians have the Bible as our anchor for truth.  But over the same period, many Christian leaders have attempted to discount or re-interpret it away from the absolute truth of God to more of a “guiding document” that we bend and shape to fit our desires and culture.

Sadly, these days an increasing number of Americans believe the government owes them prosperity. We now believe it is our “right” to have a steady job, a home, and retirement income. More and more Christians feel God owes them a life of prosperity just because we believe in Him. In reality, the only thing Jesus promised us is that the world would hate us and persecute us because we follow Him. But that’s just not good enough for a growing number of Christians these days. We demand health, wealth, and prosperity: the easy life. Just who do we think we are that we can demand anything from God? Instead we should simply be thankful that He has rescued us from what we truly deserve: eternal damnation.

But instead, American Christianity is mirroring the decaying culture of America. We want everything now, and we feel no responsibility in return for the gift of salvation and eternal life.

The church either affects the culture, or the culture infects the church. Which do we see happening today? Is the church being a beacon of truth, shining light into the darkness of secular culture? Or is the culture infecting American Christianity with its selfish attitude of entitlement?

If we find our attitudes and beliefs mirroring those of the secular culture around us, we’d better wake up and ask God to closely examine and expose our hearts. The truth is the church is looking more and more like secular culture every day. We are becoming more politically correct, refusing to call out sin. We are adopting the ways of the world to spur church growth, equating larger numbers with success. As a whole, we are no better than the secular culture we are called to affect: we feel entitled to the desires of our hearts, without any sense of responsibility to God for the gift He has given us. We view our salvation as the end, when we should be viewing it as the means to pursue holiness and righteousness.

The church either affects the culture—or the culture infects the church.  Is there any doubt which is happening in American Christianity these days?