Kevin, it has been noted that: A) Americans believe they are entitled, perhaps even obliged, to provide moral instruction to the rest of the world; B) Americans tend to view themselves as the underdog in every situation; C) many Americans view America as a gift from God to the rest of the world.
Having been educated in America during the 1950s and 1960s, I can assure you that these ideas were not only encouraged; people who expressed any doubt were often severely ostracized, frequently by people who talked a lot like "Real Americans."