Hey,
I've never been to the USA (though that will probably change soon), but I've heard many opinions about Americans' attitude toward foreigners.
How do Americans perceive foreigners?
Do they treat them as "someone lesser"?
Are they tolerant of foreigners' cultures?