What do Japanese really think of the USA/Americans?

I'm sorry if this is a common question here (though I haven't seen it asked much tbh), but what do Japanese people in general ACTUALLY think of the USA and its people? I've been to 9 countries so far, and Japan is by far my favorite. In my experience the people are very nice, it's safe, the food is delicious and healthy, etc. Based on polling and my personal experience, I'd say most Americans like and respect Japan. But as you probably know if you pay attention to international news, the US isn't exactly in great shape at the moment. We have a lot of problems with crime and other things that need to be fixed. Do the news and the things people see in the media affect how Japanese people think of us? Just curious.