Why travel to the US if you hate it?
I’m not talking about immigrants; I’m talking about rich snobby Europeans who will take every chance they get to shit on the US, but then book their yearly trip to Miami or LA once July rolls around.
No other country gets this treatment. When people visit other countries, they (usually) either want to learn more about the country’s culture and sights or want to relax.
But for whatever reason, other Westerners will travel to the US just to report on how awful the people, the social security system, the grocery stores (?), the schools, the infrastructure and so on are.
And the worst part is when they deliberately disrespect American culture and cultural norms. Not tipping because it’s not common in your country, being rude and condescending towards everyone. Like I said, nobody treats other countries’ cultures as inferior like this.. (mostly Australians who do this, anecdotally 😂)
I don’t get it man. If you hate America as much as you claim and everything there sucks, stop visiting it. Nobody is forcing you.. or maybe you don’t actually hate America as much as you say 🤷‍♂️
The biggest culprits of this are the exchange students. Some of them will literally spend a year in the United States trying to find flaws that they can feel superior about. Genuinely wtf man 😂