The Atlanta Braves were pretty good last year! They finished in the top 10 in wRC+ and team defense (both UZR and DRS), and just barely outside the top 10 (12th overall) in terms of rotation production. If there was one aspect of the team that wasn't "pretty good," it was the bullpen. By fWAR, the Braves' relief corps finished 17th. By RA9-WAR, the placement was still 17th. By WPA, it was 19th. Broadly speaking, if there was one place for the Braves to target obvious improvements between this year and last year, the bullpen should be it.

It's not always easy to pinpoint exactly why certain swaths of a sports fandom seize on certain narratives, but the above probably serves as well as anything else as the underlying rationale for a refrain that's been heard more than once this season: "the Braves need to spend some money on the bullpen." The syllogism makes sense: bullpen bad, improving bullpen necessary, team with better bullpen good. But, unfortunately, it's not that simple. Not even close. The reality is that "spending money on the bullpen" has very little bearing on, or relationship to, bullpen quality. If you're already aware of this, you don't need to read on. This post is just going to be charts and graphs to that end, all more or less hammering on the same point. Even if you're not aware of it, but don't feel like encountering the same conclusion over and over, you could just take a look at this Craig Edwards analysis on Fangraphs from this past June, which tells you everything you need to know via its title: "Offseason Spending on Relievers Isn't Working Out."

In order to figure out the deal with dealing with relievers, I pulled together a variety of data. Most of this post is going to be visual with just a bit of commentary, because I think the visuals are illustrative enough on their own. If you're convinced, right now, that spending payroll on relievers makes sense, maybe the visuals will change your mind. Maybe they won't. Either way, they're here for you.

Analysis 1: Scatterplots, salary versus production

One really basic way to examine the effectiveness, or lack thereof, of spending on relievers is just to plot the salary a reliever receives in free agency against his production. For all relievers signed from the pre-2014 offseason through the previous offseason (i.e., the one before 2018), here are those scatterplots for fWAR, RA9-WAR, and WPA. Relievers who signed for closer to league minimum, a minor league deal, or a split contract aren't included here; these aren't really free agents on whom money is spent, after all.
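(If you want to put these scatterplots together yourself, here's a minimal sketch of how it could be done, assuming you've already assembled one row per free-agent-reliever season. The file name and column names are placeholders I made up, not real data sources.)

```python
# Minimal sketch: salary (AAV) versus single-season production for free agent relievers.
# Assumes a table with one row per free-agent-reliever season and hypothetical columns
# 'aav_millions', 'fwar', 'ra9_war', and 'wpa'.
import pandas as pd
import matplotlib.pyplot as plt

fa_relievers = pd.read_csv("fa_reliever_seasons_2014_2018.csv")  # hypothetical file

# Screen out the near-minimum, minor league, and split-contract deals, as noted above.
fa_relievers = fa_relievers[fa_relievers["aav_millions"] >= 0.9]

fig, axes = plt.subplots(1, 3, figsize=(15, 4), sharex=True)
for ax, metric in zip(axes, ["fwar", "ra9_war", "wpa"]):
    ax.scatter(fa_relievers["aav_millions"], fa_relievers[metric], alpha=0.5)
    ax.set_xlabel("AAV ($MM)")
    ax.set_ylabel(metric)
    # A quick correlation coefficient as a read on how (un)related salary and production are.
    ax.set_title(f"r = {fa_relievers['aav_millions'].corr(fa_relievers[metric]):.2f}")
plt.tight_layout()
plt.show()
```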

There's not really much of a relationship. There's still a bit of an upward-sloping trend (at least it's not downward-sloping...), and unsurprisingly, it's stronger with fWAR than with RA9-WAR and WPA. (This makes sense: relievers in this day and age are presumably paid for their peripherals more so than for what actually happens while they're on the mound.) But the returns are meager at best, and the situation looks even worse when you consider the actual outcomes being presented. Sticking with the most generous of the three relationships, fWAR, here's what we see:

The vast majority of relievers are in this giant blob that runs from a performance level of physically painful to pretty good, with salaries ranging from around $0.9 million to $11 million. There's no huge pattern here; for essentially every annual salary figure in this range, there are below-replacement level pitchers and above-replacement level pitchers. The blue circle reflects a tiny subset of relievers who are much better than generic, but there's also no real relationship there: sure, some took in around $9 million annually, but a couple took in $2 to $4 million and did just as well. Even among the ones closer to $9 million, there's a pretty big gap in performance not differentiated by salary, and of course, let's not forget that for every dot in the blue circle, there's at least one worse-performing dot in the red circle with a similar or higher salary. The purple circle is your expensive relievers, making double-digit millions. These guys do appear to have a higher baseline than your not-expensive relievers, but even so, around a third of them were closer to replacement level. At least no one got stung paying eight figures to a sub-replacement level reliever, though.

The green circle highlights one lonesome data point, that of Kenley Jansen's 2017. Jansen earned a salary (AAV basis) of $16 million, and threw down a fiery 3.6 fWAR. Of course, the not-so-hidden subtext is that Jansen followed up that campaign by getting paid $16 million (still AAV) to put up 0.4 fWAR the following year. Oh, and that orange dot? I just wanted to give a shoutout to Jesse Chavez, the only reliever paid less than $2 million over this sample that managed to put up more than 1.0 fWAR. Way to go, Jesse Chavez. You did better that year than 193 of the 218 free agent reliever seasons in the sample. (No, I'm not kidding, Jesse Chavez's 1.2 fWAR season was in the top fifth of all free agent reliever seasons since 2014.)

Analysis 2: Tabular summary of outcomes by reliever salary outlay

Rather than attempting to draw a line through a messy dataset, we can also think about things probabilistically. I’ve drawn some very arbitrary (and partially overlapping) distinctions in the tables below, but they’re just another look at the same issue.

I’ve defined three categories of reliever performance. The trashcan outcomes are defined as the ones that failed to record a positive value in a statistic. The awesomesauce outcomes are ones where the metric cleared a value of 1.7. Meanwhile, I think of a generic relief human as someone able to put up around 0.4 fWAR or more, so the final table shows the proportion of free agent relief seasons that are either worthy of a generic relief human label or better. Note that I did not feel like drawing even more arbitrary distinctions about whether a reliever making $2 million should be in the first row of each table or the second row, so they’re just in both. The results don’t really change either way. Note that WPA isn’t actually on the same scale as fWAR or RA9-WAR (WPA is relative to average; WAR is relative to replacement level), but the patterns end up being the same so I’m comfortable presenting them side-by-side.
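(For the curious, a rough sketch of the bucketing logic follows, using the same hypothetical table as the scatterplot sketch above. This version uses clean, non-overlapping salary bins for simplicity, whereas the tables below deliberately let the boundary cases sit in both adjacent rows.)

```python
# Sketch of the outcome-rate tables: trashcan / awesomesauce / generic-or-better rates
# by AAV bucket. Column names ('aav_millions', 'fwar') are hypothetical.
import pandas as pd

fa_relievers = pd.read_csv("fa_reliever_seasons_2014_2018.csv")  # hypothetical file

fa_relievers["bucket"] = pd.cut(
    fa_relievers["aav_millions"],
    bins=[0, 1, 2, 5, 10, float("inf")],
    labels=["<$1M", "$1M-$2M", "$2M-$5M", "$5M-$10M", "$10M+"],
)

def outcome_rates(group: pd.DataFrame, metric: str = "fwar") -> pd.Series:
    return pd.Series({
        "trashcan": (group[metric] <= 0).mean(),             # no positive value recorded
        "awesomesauce": (group[metric] >= 1.7).mean(),        # cleared 1.7
        "generic_or_better": (group[metric] >= 0.4).mean(),  # generic relief human or better
    })

print(fa_relievers.groupby("bucket").apply(outcome_rates).round(2))
```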

When considering the bust rate, it's not that the trashcan frequency doesn't change as you pay more (we saw this in the scatterplot). It's just that it doesn't change much. There's functionally no difference between paying a reliever between $2 million and $5 million and paying him between $5 million and $10 million. Sure, once you get into eight-figure salaries, the bust rate drops a lot. But based just on historical experience, your free agent reliever making $10 million or more still has a greater-than-one-third chance of ending up with a WPA that's zero or negative. Can you imagine giving a position player or starting pitcher $10 million while being cognizant of the fact that you have nearly a two-in-five chance of him not helping you win any game at all ever? Probably not. Welcome to relievers. Of course, no one advocates spending big money on relievers just to preclude trashcan performances. They want something better.

That something better, at the high end, is awesomesauce. We see a broad trend in awesomesauce rates, too. If you're not paying at least $5 million, your chance of snagging an awesome free agent relief season is basically negligible. But even if you ratchet that up to eight figures, it's not that much higher. Still, playing at the top of the reliever market only gets you a one-in-four chance of getting whatever it is that you think you're receiving from paying top dollar.

The most telling table is the third one. The percentages here are kind of the rates of "getting a minimally acceptable reliever guy for a contender" for your payroll dollars. You can see that the relationship between these probabilities and spending is complicated. First, there's no appreciable difference between spending $1 million and double that. Then, spending between $2 million and $10 million tends to double your success rate in this regard relative to spending $2 million or below, but how much you actually spend within that range doesn't seem to matter. If you could have one $8 million reliever or four $2 million relievers... hmm. Once again, the priciest relievers are less of a risk, but they're still a risk. Can you imagine paying $10 million to a player who only has a 70 percent chance of being average at his role? What about the other 30 percent?

Here are a few other “fun,” non-visual facts from these data, about multiyear reliever deals. Buyer beware. Be very ware.

Of the 20 relievers that signed three-year deals or longer in the period of analysis, seven (35 percent) failed to put up positive fWAR in the first year of their deal.

Of the 17 relievers that signed three-year deals that have played two years on their deals to date, four (24 percent) failed to put up positive fWAR in Year 2. Cumulatively, across Years 1 and 2, seven of 20 (35 percent) had put up aggregate negative fWAR in those two years. Yes, that’s a greater than one-third chance of double-trouble-awfulness, for the relievers that somehow finagled three-year deals out of the market.

Of the 12 relievers that played through three years of their three-year-or-longer deals, four (33 percent) failed to put up any positive fWAR in Year 3. Cumulatively, across Years 1 through 3, five of 20 (25 percent) had put up aggregate negative fWAR across three years.

Of the 50 relievers that signed a multiyear deal, 15 (30 percent) failed to put up any positive fWAR in Year 1, and 18 (36 percent) failed to put up a positive fWAR across Years 1 and 2 combined. I'm not sure I can even describe just how bad this is, so I'll leave it at that.
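(If you want to run this kind of check yourself, a rough sketch of the cumulative-fWAR accounting might look like the following; the file and column names are again placeholders.)

```python
# Sketch of the multiyear-deal checks: for relievers on three-year-or-longer free agent
# deals, what share had cumulative fWAR at zero or below through each contract year?
# Assumes hypothetical columns 'player', 'deal_length', 'deal_year', and 'fwar'.
import pandas as pd

deal_years = pd.read_csv("fa_reliever_deal_years.csv")  # hypothetical file

long_deals = deal_years[deal_years["deal_length"] >= 3].sort_values(["player", "deal_year"])
long_deals["cumulative_fwar"] = long_deals.groupby("player")["fwar"].cumsum()

for year in (1, 2, 3):
    through_year = long_deals[long_deals["deal_year"] == year]
    bust_rate = (through_year["cumulative_fwar"] <= 0).mean()
    print(f"Through Year {year}: {bust_rate:.0%} at or below zero cumulative fWAR")
```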

Analysis 3: Reliever anatomy

The analyses above focused specifically on what free agent relievers do (and don’t do) relative to their salaries. But we can also back into it another way: what’s the composition of actual good relievers in baseball, in terms of how they were acquired? This is actually more important than it might initially seem. For example, we know that pitching prospects are also very risky, but the reality is that most good starting pitchers were highly-touted pitching prospects anyway. So, even if free agent reliever investments are overly risky, are they the only avenue for getting actual passable or good relievers?

To answer this question, I collected data on every reliever with 10 or more innings, as well as their salary and method of acquisition. Note that the latter is a little wonky, because it doesn’t separately identify extensions, salary dump trades, and the like. It’s basically just, “How did the team come by this player?” This is helpful for identifying and separating free agent acquisitions, but a little weirder for thinking purely about the question of “throwing money at the bullpen: trashy or trendy?”

The actual stacked bars aren't that interesting; it's really just about the percentages in the total bar. You can see that the total landscape of relievers is basically split into thirds between free agency, trades, and internal development and promotion, with a little bit taken up by waiver claims. If you look only at trashcan seasons, the share of those seasons made up of free agent signings actually doesn't change. In other words, free agency in and of itself isn't a great avenue towards avoiding a terrible relief outcome. Nor is it a silver bullet for getting a good relief outcome. In the set of awesomesauce reliever seasons, free agent acquisitions are actually underrepresented, while internally-developed relievers are overrepresented. No real surprise there. For just "okay" to "good" seasons, there's still no real preference for free agency.
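(Here's a sketch of how that composition breakdown could be computed, again with made-up file and column names; 'acquired_via' is assumed to hold free agency / trade / internal development / waivers.)

```python
# Sketch of the acquisition-mode composition: within each outcome tier, what fraction
# of reliever seasons came from each acquisition pathway?
import pandas as pd

all_relievers = pd.read_csv("all_reliever_seasons_2014_2018.csv")  # hypothetical file

def tier(fwar: float) -> str:
    if fwar <= 0:
        return "trashcan"
    if fwar >= 1.7:
        return "awesomesauce"
    if fwar >= 0.4:
        return "generic"
    return "in-between"

all_relievers["tier"] = all_relievers["fwar"].apply(tier)

# Column-normalized crosstab: each column (tier) sums to 1 across acquisition modes.
composition = pd.crosstab(
    all_relievers["acquired_via"], all_relievers["tier"], normalize="columns"
)
print(composition.round(2))
```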

I won’t post the charts, but the findings for WPA (in case you care about that more) aren’t any different. The split of total | trashcan | awesomesauce | generic composition by free agents is 30% | 30% | 25% | 31%. Same story, with the same takeaway that free agency is actually slightly less likely to result in a great relief season.

None of this controls for spending. So, let’s talk about spending, setting free agency aside for a minute. Using the same dollar figure breakouts from Analysis 2, but this time preventing double-counting just for ease of calculation, we get the following:

All reliever seasons are split 67% | 10% | 13% | 8% | 2% across the five annual salary (AAV basis) categories (less than $1M, $1M to $2M, $2M to $5M, $5M to $10M, greater than $10M). Basically, most relievers are really cheap.

For trashcan seasons, the breakdown is 73% | 10% | 11% | 6% | 1%. It’s kind of skewed towards cheaper relievers being worse overall, but it’s not that dramatic.

For awesomesauce seasons, the breakdown is 39% | 8% | 24% | 21% | 8%. You’ve gotta pay to play, but you don’t have to pay that much: the most common salary for a fantastic reliever season is still below $1 million, and most of the other seasons are taken up by players making between $2 million and $10 million, rather than really expensive relief arms.

For your generic-reliever-but-not-awesomesauce seasons, the breakdown is 58% | 12% | 18% | 11% | 2%.

Or, going at it from the other direction, in terms of what rate of each type of salary produces what type of season:

There's basically no marked difference in your odds of getting an average-or-better reliever whether you spend $2 million or over $10 million. Fun. Except not fun.

(Skip this next part if you don’t want way too much detail.)

We can also combine salary and acquisition mode. The set of tables below captures two things: the left table shows the overall breakdown of relief fWAR accrued, by salary and acquisition pathway; the right table shows the same breakdown, but by the count of relief seasons. Basically, if, for a given row, the number in the "Grand Total" column in the left table is higher than the corresponding number in the same column in the right table, that group is accruing a disproportionate share of WAR relative to its population. If the opposite is happening, the group is underperforming its expected WAR basis.
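(A sketch of how those two tables could be built, using the same hypothetical columns as before, might look like this; the real tables also carry the row-level detail and "Grand Total" column described above.)

```python
# Sketch of the share-of-WAR versus share-of-seasons comparison, by salary bucket and
# acquisition pathway. File and column names are hypothetical.
import pandas as pd

all_relievers = pd.read_csv("all_reliever_seasons_2014_2018.csv")  # hypothetical file
all_relievers["bucket"] = pd.cut(
    all_relievers["aav_millions"],
    bins=[0, 1, 2, 5, 10, float("inf")],
    labels=["<$1M", "$1M-$2M", "$2M-$5M", "$5M-$10M", "$10M+"],
)

# Left table: within each salary bucket, each pathway's share of the accrued fWAR.
war_share = all_relievers.pivot_table(
    index="bucket", columns="acquired_via", values="fwar", aggfunc="sum"
)
war_share = war_share.div(war_share.sum(axis=1), axis=0)

# Right table: within each salary bucket, each pathway's share of the reliever seasons.
season_share = all_relievers.pivot_table(
    index="bucket", columns="acquired_via", values="fwar", aggfunc="count"
)
season_share = season_share.div(season_share.sum(axis=1), axis=0)

# Positive values mean a pathway is pulling more than its weight in that bucket.
print((war_share - season_share).round(2))
```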

The takeaway is actually really boring but predictable: within every salary bucket, the group that consistently underperforms its WAR basis is free agents. For example, for the $1M to $2M group, free agent relievers made up about 34 percent of the reliever seasons, but accrued just 18 percent of the actual fWAR. There are similar gaps indicating similar underperformance through the table, with the only exception being the $10 million and above tranche, where free agent relievers made up 49 percent of the seasons and got 50 percent of the fWAR.

(Okay, you can come back again if you skipped over the large table.)

Analysis 4: Team spending results

This is actually a really brief detour, but I think it’s illustrative in a way that other things are not. Talking about deals on a player-by-player level is one thing, but what about talking about deals as a lever of team strategy? Could it be that even if spending on relievers isn’t a great idea, it’s still a way for teams to actually improve their bullpens at high cost? Now, this isn’t a rigorous analysis, but don’t be shocked by my answer: no.

The table below is very simple: it shows two teams for each year, 2014 through 2018. The "big spending team" is always the team that spent the most on relievers the prior offseason; the "little spending team" is not necessarily the lowest spender that offseason, but a team that spent barely anything. (The actual selection criterion is the team that spent the least without being tied with any other team. I suppose it could be interesting to actually do year-over-year changes as their own full analysis in this vein, so go ahead and do that if it seems like something you want to take on!)

Let’s start with 2014. Or, rather, 2013. In 2013, the Dodgers had an average-y bullpen. The Mets had a pretty bad bullpen. The Dodgers then went and committed over $19 million in AAV to relievers; the Mets committed $1.2 million. The Dodgers were rewarded for their efforts by sliding from average to below average; the Mets were rewarded for not doing much of anything by slightly improving relative to other teams, and staying below average.

This isn't the pattern every year, but it's not far off. Pre-2015, the White Sox committed $16.5 million AAV and went from a terrible bullpen to where the Dodgers were in 2013; the Orioles were already good, spent nothing, and either stayed the same or got better. Pre-2016, it was the Athletics' turn to dump money to try and transform their horrid bullpen, and they kinda succeeded (except by WPA); the Angels, meanwhile, fell from average to really bad. Pre-2017, it was the Dodgers once again, trying to improve on an already-great relief corps via the Kenley Jansen commitment. It didn't really move the needle; meanwhile, the Rays leapfrogged a lot like the 2016 Athletics did, without really spending anything. And that brings us to the Rockies, whose absolute rollicking disaster with high-dollar relief contracts basically saw them end up the same as the Blue Jays, another good relief team in 2017 that similarly suffered a bullpen backslide... but without any spending.

In short, for the “big spending teams,” the average team fWAR rank change was a loss of 3 spots; the average team WPA rank change was zero. For the “little spending teams,” the average team fWAR rank change was zero, and the average WPA rank change was plus-3 spots. So, why should teams spend on relievers? I have no idea.

These two charts have a lot of issues (namely that they can't account for a team that had awful relief and then committed a bunch of money), but they're also just basically sparse, patternless clumps. (Also, that sad dot towards the bottom right is the sad, sad, sad Rockies.) Basically, as a team, spending money on relievers hasn't actually helped the relief corps perform better. You probably already figured that given all of the above... but just in case.
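(If you want to redo the rank-change math yourself, here's a rough sketch; it simplifies the "little spender" selection to the literal lowest spender, and all file and column names are made up.)

```python
# Sketch of the rank-change comparison: bullpen fWAR rank before versus after each
# offseason, for the biggest and smallest reliever spenders. Assumes hypothetical
# columns 'season', 'team', 'bullpen_fwar', and 'offseason_reliever_aav'.
import pandas as pd

team_pens = pd.read_csv("team_bullpen_seasons_2013_2018.csv")  # hypothetical file

# Rank bullpens within each season (1 = most fWAR).
team_pens["fwar_rank"] = team_pens.groupby("season")["bullpen_fwar"].rank(ascending=False)

changes = []
for season in range(2014, 2019):
    year = team_pens[team_pens["season"] == season]
    prior = team_pens[team_pens["season"] == season - 1].set_index("team")["fwar_rank"]
    big = year.loc[year["offseason_reliever_aav"].idxmax(), "team"]
    little = year.loc[year["offseason_reliever_aav"].idxmin(), "team"]
    for label, team in [("big spender", big), ("little spender", little)]:
        new_rank = year.loc[year["team"] == team, "fwar_rank"].iloc[0]
        # Positive = climbed the rankings; negative = slid down them.
        changes.append({"season": season, "spender": label,
                        "rank_change": prior[team] - new_rank})

print(pd.DataFrame(changes).groupby("spender")["rank_change"].mean())
```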

Analysis 5: The IIS Principle — “It’s Inefficient, Stupid”

No need for a chart here, just some basic arithmetic. In the pre-2018 offseason, teams committed $170 million in AAV to relievers. For their efforts, they got all of 11.8 fWAR, an effective price of $14.4 million per win. Meanwhile, in this same offseason, teams committed $410 million in AAV to non-relievers. For their efforts, they got 46.1 fWAR, an effective price of $8.9 million per win. Last offseason was a depressed market, yet relievers still fetched a markup of over 60 percent, after the fact, in terms of the price of their production. Note that this isn’t some kind of “pitcher markup,” either: starting pitchers by this same calculus only cost $9.7 million per win. It was the relievers that were inflated.

But, okay, 2018 had the Rockies going Hungry Hungry Hippos on relievers for some reason, and prices for many major league regulars were unexpectedly low. What about 2017, a more normal market? That year, relievers commanded a similar sum total of $174 million in AAV; they produced, as a reward, 12.3 fWAR, or a similar $14.1 million per win figure. Meanwhile, non-relievers secured $467 million in commitments that offseason, and produced 49.4 fWAR immediately thereafter, for a $/win mark of $9.5 million. (Starting pitchers had a mark of $7.2 million per win that year.)
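(The underlying arithmetic, if you want to sanity-check it, is about as basic as it gets: total AAV committed in the offseason, divided by the fWAR those signings produced the following season.)

```python
# Dollars per win, using the figures quoted above (AAV in $MM, production in fWAR).
def cost_per_win(total_aav_mm: float, total_fwar: float) -> float:
    return total_aav_mm / total_fwar

print(cost_per_win(170, 11.8))  # pre-2018 relievers:      ~$14.4MM per win
print(cost_per_win(410, 46.1))  # pre-2018 non-relievers:  ~$8.9MM per win
print(cost_per_win(174, 12.3))  # pre-2017 relievers:      ~$14.1MM per win
print(cost_per_win(467, 49.4))  # pre-2017 non-relievers:  ~$9.5MM per win
```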

The Space Pope is known for advising his followers not to date robots. I think he should add another tenet to his dogma: don’t sign relievers to big money deals. It’s not quite as catchy, but the logic is sound.

Mitigating Factors

The conclusions you draw from the above are your business. Still want to spend big bucks on relievers? Well, that’s on you now. For me, the real takeaway is that reliever spending should be smarter, rather than bigger. That’s not a very interesting or groundbreaking conclusion, but when you pay a high price for a 3 WAR position player, you’re getting a 3 WAR position player (with some variation around that). When you pay for a reliever, that variation pretty much defines whatever you’re getting, no matter how much you pay.

With that said, though, I can see two mitigating factors that add some nuance to this topic. The first is that sometimes you just don't have stuff to spend money on. I know that sounds like kind of a wacky concept, given that front offices are in the habit of trying to do more with less (as opposed to more with more), but if you have an operational budget of $X, and you've essentially filled your entire roster with quality pieces but your bullpen is a bunch of small question marks arrayed in the shape of a bigger question mark, you may as well spend the cash to try to up your odds of avoiding bad outcomes and forcing good ones a bit. But this is all pretty much in the context of a last resort. If there are competing avenues for you to spend your cash, don't prioritize the bullpen as one of those avenues. Only when there's nothing else left to buy should you sink the added cash into marginal upgrades for the relief corps.

The other factor is a good point brought up by loganutk in discussions around this topic. For much of the above, the unit of measure was fWAR. One of the reasons why fWAR is great is because it correlates particularly well to team wins (and if it didn't, the focus would probably shift to some other metric that did). But fWAR for relievers is tricky; it relies on a leverage adjustment and is generally not quite as straightforward in terms of being about context-neutral run prevention as fWAR for other players. So, it's possible that it's not that "reliever spending" is bad, but rather that the current way in which we evaluate players is at odds with capturing whatever value relievers provide to their teams. However, I'm skeptical that some alternative paradigm would reveal that reliever spending was actually some kind of covert panacea. For one, much of the analysis in this post included WPA, and it wasn't clear that there was any relevant WPA effect. Now, WPA is imperfect as well (and pitchers have to absorb everything their defenses are responsible for in the measure), but it's not clear that adjusting for context makes reliever spending look any better.

Further, looking across the array of teams that have had success over this same period, it's not clear that there's some kind of silver bullet about reliever spending that is critical. The 10 best teams in baseball (outcome-based, i.e., looking at wins) in the 2014-2018 period have been, in order, the Dodgers, Cubs, Indians, Nationals, Cardinals, Yankees, Astros, Red Sox, Pirates, and Angels. Of these, some have definitely spent a fair bit on their bullpens in one way or another (Dodgers, Yankees, Astros, Indians, Cubs), but some haven't: the Nationals have won lots of games while constantly having bullpen issues; I honestly can't remember a big Angels bullpen acquisition in recent history other than the generically-named and generically-pitching Joe Smith; the Pirates have generally had good relievers depart their organization due to baseball's annoying economics; and the Red Sox are kind of interesting in that they've gone very stars-and-scrubs with their relief corps but haven't really fared much better or worse than other teams that build deeper bullpens. I think this is still an open question, but I'm not convinced that the lesson here is "no, spending on relievers is awesome, we just don't know how to measure reliever contributions" to the exclusion of "nah, spending on relievers is generally a bad idea." Definitely an area for further investigation, though.

Anyway, there you have it. This holiday season, make your wish that the Braves use their brains to augment their bullpen, rather than their dollar bills.