Is America a Christian nation? There are lots of arguments, blah blah blah... No, it is not. I claim this based on the most common usage of the English language: America is not a country composed entirely of Christians, not to mention the "separation of church and state" that currently exists, whether you like it or not. Whatever. I don't care to argue or address that particular issue. America isn't perfect... so what.
Well, what if there were such a thing as a "Christian Nation"? And to be admitted, you had to be a believing, professing Christ follower (assuming we could even really tell). This is one of the worst/scariest ideas I have ever come up with. First of all, how would the justice system work? After all, we are still depraved beings who will sin constantly. Should we just extend grace to everyone, every time, because "mercy triumphs over judgment"? And who would want to go to this place? You can't affect anyone! How can God use you as an instrument to bring about His kingdom on this earth when you are isolated from every non-believer?
I know this post is a bit scrambled and poorly written, but in the comments, post what you think would be the scariest/worst thing about living in this "Christian Nation."