Are Americans becoming a nation of assholes?
I cannot accurately answer my own OP, as I have lived most of my life in my home state of Texas, so my perspective is limited. My general sense is that regardless of geographical location, racial background, gender, socio-economic status, or any other variable, most of my fellow citizens have become selfish assholes. I just don't trust them anymore.
American business ethics are exemplified by the ruthlessness of the Enron scandal, or by the outsourcing of jobs to Third World countries while qualified Americans are passed over entirely and left to ponder the real-world value of their hard-earned education... and other examples abound.
Social graces (like saying "thank you" or showing respect for the elderly) that I once took for granted in my youth are now so rare as to be nonexistent. I see daily examples of ruthlessness that extend even to animal cruelty. In my mind, America has become a nation of backstabbing, all-about-me, profit-at-any-cost sociopaths.
"Humans are a disease, a cancer of this planet." — Agent Smith