LOS ANGELES — In Hollywood, director jobs are no longer automatically filled by white men. Television writers’ rooms have made diversity and inclusion top priorities. Human resources departments at major media corporations are more responsive when complaints are filed. Intimacy coordinators, who introduce physical consent considerations into the artistic process, are now normal on productions featuring sexual content.

It has been nearly two and a half years since the sexual misconduct allegations against Harvey Weinstein burst into public view, and much is different in Hollywood.

But the entertainment industry has been doing things a certain way for decades, and not every aspect of it has been quick to change. Even as Mr. Weinstein was found guilty on Monday of two felony sex crimes, Hollywood largely remains a man’s world.

Take the Oscars, moviedom’s ultimate show of power and prestige. For the ninth time in 10 years, the Academy of Motion Picture Arts and Sciences did not nominate a woman for best director in 2020. Only one of the 20 acting nominations went to a person of color. And with the exception of “Parasite” and “Little Women,” the films honored by the Academy — “The Irishman,” “Ford v Ferrari,” “Once Upon a Time … in Hollywood” and “Joker” — were largely portraits of white men directed by prominent white auteurs.