I HATE Hollywood portrayals of the South
It's been downhill ever since The Green Mile, another serious stinker. It's so cheesy, so obvious, so f--king manipulative. Whenever Hollywood portrays the South, I just skip the film. It used to be mildly amusing, now it just irritates me.