Of course, I am using the word "truth" in the loosest possible sense here. But which of the various Christian denominations holds the "truth" about the Christian belief?
I see Christians on this forum bashing each other over the head with Bible verses and demanding interpretations from their opponents, only to be bashed back again and again with the same or similar material.
In my experience, the evangelicals do not consider other Christians "saved" and in fact despise other Christians who are not "born again" or evangelical. So who really holds the Christian truth?