Let’s face it: the culture in America has dramatically changed. At one time, we considered ourselves one nation “under God.” Today, our nation barely acknowledges God, and the church has lost most, if not all, of its influence in our society. We are increasingly marginalized. What is our next step? How should we move forward in 21st-century America?