How Women Directors Are Changing Hollywood and Independent Cinema
Women directors are transforming Hollywood and independent cinema by telling authentic, emotionally rich stories that resonate globally. Their rise is changing who gets to make films, and how those films are made.