I'd like to see a film about slavery that tells the truth
Because the truth has almost nothing to do with the story that Hollywood, and not just Hollywood, keeps trying to sell.
People, the African slave market has been functioning for centuries and is still selling people TODAY, right now, as you are reading this. It has mainly sold, and still sells, to the Arab world, and the people in this business apparently have no qualms about selling others into slavery.
Now the 'white man', who is so often portrayed as the most evil force on earth, is cast as the great instigator of black slavery. This is wholly, completely and absolutely untrue. We tapped into a market that had been going for ages and is still going today. As participants in this market we were the briefest players; we left the whole sordid business after a few centuries because we realised the degradation and inhumanity of the whole thing.
Black people were, and still are, sold by other black people, to Arabs and Africans and, for a while, to white people. Now I am not saying that we were angels who treated our slaves like princes, but could we please try to see things in perspective, with a bit of regard for historical fact? Virtually all the peoples of this earth, throughout its history, have engaged in slavery at some point, and I am not trying to excuse that. But the simple fact remains that white Christian society, in particular, took part in this awful business for the shortest time of all the players that made money out of it. Like I said: black people are still selling black people today.