Do Americans still believe their government and corporations?
Serious Question
Do they still believe those institutions have their best interests at heart?
I gotta say, living in Southern Africa is not all that bad... I can be openly unpatriotic without being called a traitor or a terrorist, so we are actually allowed to expose things without fearing for our lives. You should have the same freedom.
You also have to ask yourself: wtf is going on in the USA? PTSD, CTE, DID, etc. — the country seems to be a breeding ground for mental illness.
It is a great country, but greed is destroying it.
Anyway, this movie speaks volumes about how people are being lied to daily. I can't believe it even made it to the public.