A Book: Hitler barred Hollywood from depicting Nazi Germany in a negative light
August 02, 2013
The Collaboration: Hollywood's Pact with Hitler