Hollywood has a cancer. That cancer is WOKE.
I’m starting to see posts from wokists bitching about people who call out wokeness, or who quite reasonably ask how woke a film is.
Let’s get something clear: wokeness is a sick, racist, sexist ideology that has infected numerous institutions in the West and is destroying Western Civilisation, replacing actual justice with ‘social justice’.
It has its creepy tentacles all over and inside Hollywood, and has effectively killed the industry, with precious few films actually able to escape the obnoxious stench of woke.
I suggest the woke bitches festering around here and calling for censorship fuck off to Reddit, where they’ll find plenty of their own kind, and all the fun-policing and censorship they could ever hope for.