All of my Christian friends say that the world is getting much worse than it used to be. They say that morality is decreasing and evil is increasing and that we are in the last days before the end. Why is this?
I used to share this pessimistic outlook too when I was a Christian, but now I completely disagree and think the world is becoming a much better place to live than it used to be.
What do you think? Is the world becoming a better place or a worse one, and how do you justify your opinion?