How does Japan currently view the West after the Epstein list and the whole Jewish thing came out?

Historically, especially after World War II, Japan adopted a posture closely aligned with the U.S. Under the new prime minister, this alignment has grown even stronger.

We’re now starting to see what might be a global elite with bad intentions operating all over the world, seemingly linked to the U.S. How do Japanese people view this? Do they still want such a close relationship with America?

by Dangerous-Ant-9019