Is Christianity an evil in and of itself?
It seems that a lot of the time these days, people argue that Christianity is, and always was, evil in and of itself. Even if that were true, why would Christians themselves have formed an ideology that, at various points in history, was used as an excuse for evil deeds committed in the name of power and the like?
And did they ever really think of themselves as evil, or not?