Seems that there is a fair amount of criticism about the good ol' U.S.A. from those abroad. Just out of curiosity, what changes would you like to see in our country to make all you fine folks less critical of America? Banning firearms here? Getting more involved, or less involved, in the affairs of other nations? Having us all turn into warm fuzzy bunnies?