Only six Black women have received Oscars for acting. Since their first appearances on film, Black women have been pigeonholed into hyper-sexualized, "Mammy," "maid," servile, and urban stereotypes. When will media featuring independent, strong, and intelligent racial minorities become commonplace? At the very least, when will accurate portrayals of America's racist past become the standard? Most movies depicting slavery or racial injustice are either historically inaccurate or cast White main characters as the saviors of their Black supporting characters. "The Help" is a prime example.
Contributed by Empriś Durden