Black Cinema in the USA: A Brief History
When anti-racism demonstrations erupted last summer in the US and beyond, following the killing of George Floyd by the police, Hollywood was quick to offer its support. It soon faced a moral dilemma, however, when demonstrators demanded the removal of symbols and traces of racism from America's history and culture, including films that featured racial stereotypes. Warner Bros.