Like I said, it just feels arrogant. It can tick people off: America is a continent, not a country. It feels arrogant because it's a way of dismissing the rest of the continent, as if "America" were only that piece of land in the north.
Not that something like that would get you the kind of hate you're talking about. Since I don't hate you guys, I don't really know much about it.
I also remembered one thing that tends to annoy a lot of people: the U.S. sometimes acting as the "world police," sticking its nose in other countries' business. True or not, I've heard that a lot. If you asked me for a specific situation where that happened, I couldn't point one out, but I've heard it plenty of times.
Doctor Whoof wrote:I don't know where you went. But I love our food.
Don't know, man. I've been to the U.S. countless times, and I've only liked maybe 2 or 3 restaurants. You do have some awesome ones, like I said; I remember one with very nice beef, potatoes, etc. One of the best I've ever been to.
But most of the time it's just... not to my liking. I remember asking for a lemonade once, and they brought me a soda. I was like, "Do you have real lemonade?" "What do you mean, real?" "... you know, with REAL lemons?" "Ohhhh!!! Hahaha, I wish!" Fun stuff.