While most forms of storytelling have remained unchanged over the years—painters still use oil, writers still use words—motion pictures have undergone frequent and radical transformations. Bit by bit, we went from static camera shots of canoodling couples to Avatar, a film that likely would’ve driven 1920s audiences into a mental institution.

Milestones like the introduction of sound, color, and CGI characters have been well-documented; here are eight lesser-known but equally significant evolutions in the art and business of movies.

1. Close Encounters—the Director’s Cut

Though the 1980 “Special Edition” re-release of Close Encounters of the Third Kind wasn’t explicitly labeled “his” version, Steven Spielberg agreed to shoot new footage and restore excised content for it. In on-the-nose ad campaigns typical of the era, posters promised “there is more” and that Spielberg “has filmed additional scenes”—marking the first time a film was promoted on the strength of being a revised work.

Columbia Pictures squeezed another $15 million out of Close Encounters with minimal effort, and the lesson stuck: By the 1990s, home video consumers were flooded with “Director’s Cut” versions of popular films (Blade Runner, Aliens) that allowed filmmakers more leeway—and gave studios the opportunity to charge cinephiles for the same movie twice.

2. Marathon Man Runs—and the Camera Follows

Prior to Garrett Brown’s invention of the Steadicam—a body-mounted stabilizing rig that let a handheld camera move as quickly and fluidly as the person operating it—roaming shots were accomplished using clunky dolly tracks or cranes. Brown wanted more mobility. One of the first films to use his device, Marathon Man, allowed Dustin Hoffman to sprint through city streets while the camera followed, adding a new sense of intimacy and realism to scenes. By the time the Steadicam chased a jogging Sylvester Stallone through Philadelphia in Rocky that same year, filmmakers knew they could take their audiences anywhere.

3. Billy Jack Goes National

Movies today open on thousands of screens simultaneously, but film distribution used to be markedly different: Even large “event” offerings would debut in major cities before slowly rolling out to other parts of the country. It might be months before a family in San Francisco saw what New Yorkers had already experienced.

That strategy annoyed Tom Laughlin, director and star of a series of independent features about a pacifist named Billy Jack who occasionally finds it necessary to kick people in the sternum. For 1974’s The Trial of Billy Jack, Laughlin leveraged the popularity of the earlier installments and insisted the film open in 1,500 theaters in a single day.

Despite poor reviews, the film raked in millions and major studios took notice. Jaws, opening the following year, rolled out wide and ushered in the concept of the summer blockbuster. Though Jaws is often credited with instituting the practice, it’s Laughlin who should get the credit—or blame—for vacuuming up revenue on opening weekend.

4. The Addams Family Merges the Small and Big Screens

The 1954 film Dragnet, a big-screen adaptation of the television series starring Jack Webb as stoic Sergeant Joe Friday, was a curiosity: Why should audiences pay for a premise they could see for free on television?

The Star Trek films turned steady profits in the ’80s thanks to pent-up fan demand, but television adaptations were still few and far between until 1991’s The Addams Family, a kitschy homage to the 1960s series about a brood of eccentrics, made a tidy $113 million. In short order, movies based on The Fugitive, Twin Peaks, The Brady Bunch, and dozens of others hit multiplexes, ready to cash in on nostalgia and brand recognition. (Trek, however, remains the king: the 2009 reboot is the highest-grossing TV-to-movie adaptation to date.)

5. Taking Independent Film Out of the Shadows

Forget Kickstarter: Television actor John Cassavetes had to round up improvisational actors and use checks culled from series guest spots to mount Shadows, an independent feature released in 1959 that explored taboo topics like race and sexuality. What he lacked in polish he gained in complete autonomy from the studio system. That do-it-yourself mentality later fed the 1990s emergence of filmmakers like Richard Linklater, Kevin Smith, and Spike Lee—knowingly or not, all of them informed by Cassavetes and his urge to express himself without a filter.

6. Snow White’s Creative License

The 1937 movie Snow White and the Seven Dwarfs is mostly remembered for being both the first animated feature and the birth of Walt Disney as an entertainment powerhouse. Lost in the shuffle was the film’s foreshadowing of Hollywood’s savvy approach to merchandising tie-ins. A coordinated push, timed to the release, offered a line of Snow-centric goods like hats (modeled by a young Lucille Ball), bath powders, and even a soundtrack. If you get a kick out of your Darth Vader coin bank, you have Sneezy, Grumpy, and the rest to thank for it.

7. Seeing More in The Robe

Movie theater attendance declined sharply beginning in the 1950s. Taking a bite out of the box office was the advent of television, which had grown from being virtually non-existent during the late 1940s to being in 33 percent of homes by 1952.

In order to maintain its business, Hollywood decided to expand horizontally. Though widescreen—a projected image roughly twice as wide as it is tall, in any of several aspect ratios—had been invented decades prior, it wasn’t until the release of 1953’s biblical epic The Robe that filmgoers noticed how a screen filling their peripheral vision could be more immersive.

Fox marketed the process as CinemaScope; unlike earlier, more expensive attempts, it required only an anamorphic lens to achieve the effect. Viewers reacted positively, and widescreen is now the industry standard. (Fox wasn’t so confident, though: The studio also shot the movie in a standard ratio, just in case.)

8. Indiana Jones and the Target Demographic

When Jack Valenti took over the Motion Picture Association of America in 1968, he recognized an emerging maturity in film, with sex, language, and violence no longer prohibited by the puritanical Hays Code of the 1930s. By 1984, the MPAA’s system had morphed to include G, PG, R, and X—a spectrum that deemed movies suitable for children, general audiences, or people in trenchcoats.

There was a considerable gulf, however, between the innocuous PG (Parental Guidance) label and the violence and sexual content of an R film. That middle ground was on gory display in Indiana Jones and the Temple of Doom, Steven Spielberg’s sequel to his blockbuster Raiders of the Lost Ark. In Temple, Harrison Ford contends with monkey brain appetizers, whipped children, and a cult priest ripping the still-beating heart from a hapless human sacrifice. It received a PG rating. So did that year’s Footloose. Something was very wrong.

Spielberg suggested to Valenti that a new advisory be created to bridge the gap between family fare and ultraviolence. The result was PG-13, which gave parents a cue to reconsider how appropriate a movie might be for their teenagers. It was too late for Jones, though: 1984’s Russian invasion flick Red Dawn became the first movie to sport the rating.