Christians are so proud of themselves for having "faith" in their god. Atheists tend to attack faith and reject it outright. Maybe instead we should acknowledge a faith of our own... faith in PEOPLE.
Christians claim that all people are sinners, and as a result they tend to assume the worst in people. They feel the need to "save" and "protect" us from ourselves.
I usually give people the benefit of the doubt unless given a reason not to.
What do you guys think about this kind of faith?