Censorship, taking away guns, forced vaccination, lockdowns that do more damage than good, brainwashing children with gender nonsense, removing historic statues.
None of that belongs in the West. So I think they definitely have a point when they say the American way of living is under attack.
Why are you so mad at white men? Do you think other races are perfect?