The starting pitcher has done his job. He has given his team six quality innings with just a few runs allowed, and he leaves with his team in the lead. The manager now turns to the bullpen. A supporting cast of relievers filling defined and specialized roles will, he hopes, take the team to the ninth inning when the closer will be called upon to complete the victory.

That supporting cast may include set-up men in the seventh and eighth innings, a left-handed specialist to face the opposing team’s dangerous left-handed hitter in one of those innings, and still other mid-inning changes to seek favorable matchups as circumstances require. Incidentally, the game slows to a crawl as this unfolds. But the belief–now more an unchallenged conviction–is that this is the best and only way to protect a lead from the moment the starter departs until the closer gets the final out.

This strategy is what we may call the La Russa Orthodoxy, named after Tony La Russa, the Hall of Fame manager. La Russa was one of the winningest managers in baseball history, guiding the Chicago White Sox, Oakland Athletics, and St. Louis Cardinals to numerous first-place finishes, playoff wins, and World Series titles. As his profile on Baseball-Reference.com explains, “LaRussa was a pioneer in the modern usage of relief pitchers, moving from the old ‘fireman’ model where one top relief ace would be used in all tight situations, sometimes for multiple innings” to the system described above.

His profile also notes that La Russa first put his system in place with the great Oakland teams of the late 1980s that featured, among others, Dennis Eckersley as closer. Nothing succeeds like success in baseball, and by the end of the 1990s, this system had become “the standard approach” and was being used by “almost every manager.”

As a fan of the Washington Nationals whose cardiovascular system has suffered through many late-inning meltdowns, especially this year, I have wondered if there isn’t a better way than the La Russa Orthodoxy. La Russa’s system works well when a team has consistent, quality relief pitchers to call upon. A cohort of fresh, power-armed pitchers is plainly better than asking an increasingly tired starter to go through an opposing team’s lineup a third or fourth time. The relievers’ advantage can be magnified with well-chosen matchups.

But what if the team’s closer isn’t a future Hall of Famer like Eckersley, and the relievers being asked to get from starter to closer are a changing cast of middling talents? Is there, in fact, a lurking flaw in the La Russa model? By using three or four relief pitchers almost every game, haven’t you actually increased the likelihood that at least one of them–even a good one–will have an off night or outright fail, putting his team behind as the remaining innings and outs dwindle?
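The arithmetic behind that worry can be sketched quickly. Purely as an illustration (the 90% clean-outing rate and the independence of outings are assumptions chosen for the example, not measured figures), here is the chance that a multi-reliever bridge springs at least one leak:

```python
# Illustrative sketch only: assumes each reliever has the same
# probability p of a clean outing and that outings are independent.
# Both assumptions are hypothetical, not taken from real data.
def p_at_least_one_failure(p: float, n: int) -> float:
    """Chance that at least one of n relievers has a bad night."""
    return 1 - p ** n

# One reliever vs. a three-man bridge to the closer:
print(round(p_at_least_one_failure(0.9, 1), 3))  # 0.1
print(round(p_at_least_one_failure(0.9, 3), 3))  # 0.271
```

Even with three good relievers, the chance that at least one of them falters on a given night is noticeably higher than with one, which is the intuition behind the question above.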

This series was prompted by those questions and doubts. In Part 1 today, we’ll look at the evolution of relief pitching and see what might be learned from other historical approaches. An article like this would not be complete without turning to the godfather of baseball analysis, Bill James. Over 20 years ago, as the La Russa Orthodoxy had taken hold, he wrote a typically trenchant piece on the modern bullpen in his guide to baseball managers. There, he noted that “the standard use of relief pitchers has never arrived at a static equilibrium.”

There have been several periods when relief strategy was locked into an approach governed by conventional assumptions. Then a successful innovation would expose the limitations of those assumptions, and baseball’s resistance to change would be overcome. A new approach would take hold with its own unchallengeable assumptions.

The La Russa Orthodoxy, I submit, falls into this pattern. It is just another and fairly recent phase that itself had to overcome resistance to settled practices. There is no reason to suppose it cannot be supplanted just as other notions of relief strategy have. As James said, “[t]he plates will move again.”

In Part 2 next week, we will dig deeper into performance under the La Russa Orthodoxy and ask whether it really represents the last and best approach to relief strategy. We will take a stab at trying to answer that question by examining some data and trends both before and during the La Russa Orthodoxy. And we’ll look at the current state of relief pitching and find severe cracks in it. We will close with some ideas about what the next potential evolutionary step in relief strategy might be.

The Evolution of Relief Pitching–Origins to World War II

The first orthodoxy in the use of relief pitchers was that it was wrong to use them at all. Indeed, the early rules of baseball allowed pitchers and other players to be replaced only when they were injured or the other team consented. When the rules were changed, the first great manager, Harry Wright, began to use relief pitchers with some frequency when he saw his starters faltering. But when Wright did so, it was exceptional; the standard view was that starting pitchers should finish the game whenever possible. If the starter did not, he had failed to do his job. It was almost a question of moral fortitude.

This view retained strength into the 20th century but steadily weakened, and so the next phase of relief pitcher usage was to use the team’s best pitchers–assumed to be starters–in relief of starters who faltered. In fact, there was no sharp division between starters and relievers. The best pitchers on the team were both. They started and were expected to come in as relievers when necessary.

The other pitchers on the team tended to be young pitchers trying to establish themselves as starters. When they had some success in occasional appearances as relievers, they could become starters–the opposite of the current practice of turning failed starters into relievers. Connie Mack, one of the great managers in the generation after Wright, fit the paradigm: stay with the starter as long as possible and use your top starting pitchers to finish out the other starters’ wins.

Some early experimentation occasionally occurred. Another of the great early 20th century managers, John McGraw, had the first true relief specialist in Otis “Doc” Crandall. How Crandall’s role as a reliever came about is instructive. McGraw had great starting pitching in the immortal Christy Mathewson and Joe “Iron Man” McGinnity and followed the prevailing pattern in their usage. The Giants, however, famously lost the pennant in 1908 to the Cubs. While baseball history has focused on Merkle’s Boner, McGraw’s starting pitchers, especially Mathewson, were overworked and faltered when the season was on the line. Learning from this experience, McGraw turned to Crandall in future seasons.


Crandall’s role, however, was generally not to come in when the game was close, still less to record “saves.” Rather, it was to absorb some of the pitching load during “garbage time” to keep Mathewson and the other starters fresh for crucial games later in the year. In subsequent years, after failing again to win the pennant, McGraw turned to relievers to protect and hold leads. His relative flexibility, for his era, may have given him an edge and contributed to his teams’ great success.

Another stratagem during the 1920s that would appear more familiar to the current understanding of relief pitching was Bucky Harris’ use of Fred “Firpo” Marberry in the years when the Washington Senators won two pennants and their only World Series. Harris increasingly turned to Marberry in relief to protect leads and nail down victories during this time.

In 1924, Marberry was a hard-throwing young pitcher, appearing in 35 games in relief, winning 11, and saving 15. In 1925, now exclusively used as a relief pitcher, Marberry made 55 appearances and finished 39 games. He also projected the image of the intimidating, confident, shut-down reliever.

Marberry’s performance and contribution to Washington’s success should have been a revelation and led to a shift in relief strategy. Yet it did not–demonstrating how resistant baseball can be to the established conventional wisdom. Even Harris balked. In the seventh game of the 1925 Series, he started Walter Johnson and kept him in as the Pirates rallied to victory late in the game. He did not call on Marberry. And he reverted to the starter/starter-as-reliever convention for the next 20 years of his managerial career–and had a subpar record in close games.

The other harbinger of the future before World War II was Joe McCarthy, manager of the Yankees in the 1930s when the team became a true dynasty. He became the first manager explicitly to divide his pitching staff between starters and relievers. Johnny Murphy became baseball’s first career relief ace, i.e., a pitcher who made relieving the focus of his career. Red Ruffing, one of the Yankees’ best starting pitchers, never was used in relief. The timing of McCarthy’s decision, 1936, is also suggestive. He had been the Yankees’ manager at that point for five years and won only one pennant. Like John McGraw, McCarthy was willing to try something different–specialization–in the face of disappointment.

The Evolution of Relief Pitching–New Models

Crandall, Marberry, Murphy, and their managers–McGraw, Harris, and McCarthy–were seeds that would bear fruit, and World War II proved to be when they did. During–and especially after–the war, using pitchers exclusively and often in relief became prevalent. Among others, Leo Durocher began to use several pitchers per game routinely and to seek righty-lefty matchups with the Dodgers, and Casey Stengel became known as the quickest hook in baseball with the Yankees. Both used pitchers like Hugh Casey and Joe Page as full-time bullpen aces. Complete games as a percentage of total games fell below 50 percent and steadily declined.

The exemplars of the relief ace in the 1950s and 1960s were Elroy Face and Hoyt Wilhelm. They and others like them were used in late innings of any close game, regardless of whether their team was ahead or behind, and would pitch multiple innings as necessary. In the 1960s, relief aces, working around 75 games and 120 innings, became an important part of almost every team, and their responsibilities would grow further in the 1970s and 1980s. Thus, a new orthodoxy took hold: Starters and relievers were distinct components of pitching strategy, and strong relief pitching, led by a bullpen ace who would pitch multiple innings in close games, was vital.

But once again seeds were being planted that would shift the paradigm. One of them came not from a forward-thinking manager or a standout pitcher, but from a journalist. Jerome Holtzman, a Chicago newspaperman, created the concept of the save in the 1960s. It was meant to be a descriptive statistic, capturing something important a relief pitcher is called upon to do in a way pitching wins and losses did not. To some limited extent, as modified, it accounts for relief pitching performance in higher-leverage situations. It is doubtful it was meant to guide relief pitching strategy. But now there was a measure of relievers’ success, and the stage was set, as Bill James put it, for language to drive thought–or, put another way, for bullpens to be molded around the stat.
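To make concrete what Holtzman’s statistic eventually became, here is a rough sketch of the modern save rule (Official Baseball Rules, Rule 9.19). The field names and simplified conditions are my own illustration, not the official text, which has additional wrinkles such as requiring a pitcher to pitch “effectively” over a three-inning stint:

```python
# Simplified illustration of the modern save rule; not the full
# official definition (Official Baseball Rules, Rule 9.19).
def earns_save(finished_game: bool, team_won: bool, is_winning_pitcher: bool,
               lead_when_entered: int, innings_pitched: float,
               tying_run_on_base_at_bat_or_on_deck: bool) -> bool:
    # A save requires finishing a win without being the winning pitcher...
    if not (finished_game and team_won and not is_winning_pitcher):
        return False
    # ...plus one of three qualifying conditions.
    return (
        (1 <= lead_when_entered <= 3 and innings_pitched >= 1)
        or tying_run_on_base_at_bat_or_on_deck
        or innings_pitched >= 3
    )

# Classic ninth-inning scenario: enter with a two-run lead, finish the win.
print(earns_save(True, True, False, 2, 1.0, False))  # True
```

Note how the first qualifying condition–a lead of three runs or fewer, one inning to get–describes exactly the ninth-inning closer role the article discusses: a statistic written as description ended up reading like a job description.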

The other seed planted in the late 1970s and 1980s was the overuse of relief aces. The complete game percentage fell below 30% in the 1960s and had reached 20% by 1980. The number of games and innings pitched by top relievers climbed to 100 games and 150 innings per season. Another indication was that managers initially known for their bullpen reliance–for example, Walter Alston and Sparky Anderson–came to be viewed as more reliant on starting pitching, not because they changed but because other managers’ use of relievers surpassed theirs. As the relief aces’ workload increased, their effectiveness fell. It’s Hegelian in a way: every successful strategy contains within it the seeds of its own destruction.

And so the stage was set for the La Russa Orthodoxy. Naming it as such seems a bit unfair, since it did not begin as an orthodoxy. Like other relief strategy changes, it can be seen as driven by a need and an insight. First, the need: one way to avoid overusing the relief ace is to use him only in the most important situations. And how to identify those situations?

The save statistic pointed to the answer–the ninth inning, with your team holding a fairly narrow lead: the Closer. With starting pitchers completing fewer than 20% of games, that meant more innings for other relievers to cover. And second, the insight: assemble a set of power-armed pitchers, probably failed starters, give them defined roles, watch for good matchups, and mix and match your way to the Closer.

When La Russa instituted it with the A’s in the late 1980s, it was a great success. But over time, the mindless copying of success begets dogma. This, I contend, is where we now are–a dogmatic belief that the La Russa Orthodoxy is the best and only way to use relief pitching. But is it? We’ll examine this question and offer answers, as well as ideas on the next evolutionary leap, in Part 2 next Wednesday.
