I truly cannot believe how normalized and socially accepted anti-white racism has become in this country
It's an idea that has had no pushback. Anyone can say they don't like white people out loud and proudly, in public, on TV, on podcasts, or in interviews, and no one blinks a fucking eye. No reaction whatsoever. Just nodding and agreeing. I fully support freedom of speech, but let's not pretend that if black people were talked down to this way, it wouldn't make international news and the person doing it wouldn't have their life ruined forever.

It has almost become the default majority opinion that all white people are special, privileged, and have nothing to complain about. We are seen as the top, better-off race that reigns over and shits on other races. We take advantage of everyone else and reap the benefits, according to much of the mainstream flow of information. "Your home got robbed by a group of black gang members? Oh, you'll be fine. You're white. Who cares." "Your wife cheated on you and took most of your money? You're white. You're fine."
This is the new social norm, and no one seems to give two flying shits. Of course there are a few influencers out there speaking out against this, but I really thought more people would. People are too scared to do what's right. They would rather fit in socially than speak out when they see something blatantly wrong happening. It's beyond sad and pathetic.