so... basically the US were worse than the Third Reich?
Considering how slavery was treated as a rather casual matter in the US, with everybody knowing about it and many people openly advocating or endorsing it, while the Nazis did their best to keep their crimes hidden, it seems to be a defensible thesis after all.
Even today, modern "Nazis" deny the Holocaust, though one might assume it's something they would supposedly be proud of...