Books about US history and the horrible things it's done.
Looking for books that cover US history, specifically the horrible things the US has done that don't get covered in schools, like sundown towns. I want to get a hold of physical copies of these books while they can still be found. Thanks in advance!