Counting Sand

Conflated Magnitudes, the Fiction of Risk, and the Myth of Restraint’s Disincentive

Let’s say I hired you to do a job for me eight hours a day for five days. Nothing unethical or shady, just some organizational work and decision-making of moderate complexity. The work would be somewhat stressful with meaningful stakes involved—if you made a bad choice, it could have lasting repercussions for dozens of people.

Here’s the catch: as payment, you would receive between $100 and $200, depending on the results of a random number generator.

Obviously you wouldn’t do it — at best that gives you $5 an hour, less than minimum wage.


How about this: between $500 and $1,000, again depending on a random variable.

This shows greater promise, but still—with bad luck you’re stuck doing a potentially stressful job for only around $12.50/hour, and at best it’s $25/hour.

Let’s make the range between $5,000 and $10,000. This is far more appealing—even if you don’t get the full $250/hour, getting $125/hour (equivalent to a $260,000/year salary) is certainly nothing to sneeze at.

If that’s not compelling enough, let’s try a few more scenarios:

Payment between $10,000 and $1,000,000 — $250/hour to $25,000/hour. Would you turn down the job simply because at the end you might make “only” $10,000 for one week’s work? Is the chance of receiving $1,000,000 more important than the actual sum you receive?

Payment between $1,000,000 and $2,000,000 — $25,000/hour to $50,000/hour. Would you take the job, or would you turn down one million dollars just because you’re not guaranteed to make two million?

Payment between $1,000,000 and $10,000,000 — $25,000/hour to $250,000/hour. Does increasing the potential ceiling make you any less likely to do a week’s work for a million dollars?
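These hourly figures are easy to slip on, so here’s a quick sanity check in Python, assuming the 40-hour week (8 hours a day for 5 days) from the original setup:

```python
# Convert each payment range in the thought experiment to hourly rates,
# assuming one 40-hour week (8 hours/day for 5 days).
HOURS = 40  # 8 hours/day * 5 days

ranges = [
    (100, 200),
    (500, 1_000),
    (5_000, 10_000),
    (10_000, 1_000_000),
    (1_000_000, 2_000_000),
    (1_000_000, 10_000_000),
]

for low, high in ranges:
    print(f"${low:>9,} to ${high:>10,}  ->  "
          f"${low / HOURS:,.2f}/hr to ${high / HOURS:,.2f}/hr")
```

Note that $10,000 for the week works out to $250/hour, and the top range peaks at $250,000/hour—a thousandfold spread for the same forty hours of work.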

I have a broader point with all this that I’ll get to in a moment, but first I want you to think about your current yearly income and imagine what you could do, how your life would change, if it was increased to $300,000. (I’m assuming most of you make less than that per year, since only 1% of the entire population of the United States makes that much or more.)

The average household income in the United States is $72,641, though that’s misleading because a wealthy few skew that number significantly higher: each year, the top 1% (only around 3,230,000 people) takes over 20% of our collective yearly income. The other 320,000,000 of us share the remaining 80%. A more accurate measure is the median income of $59,000—a difference of nearly $14,000. In other words, if you effectively factor out only the top 1%, our average yearly income drops by nearly 20%. (And this is just income. The top 1% owns over 50% of the wealth, which means that if you factor out just that 1%, the average American net worth drops by half.)

Wealth Inequality in America: Remove the top 1% and we don’t have a whole lot left among us

If we factor in taxes, the take-home pay on $300,000 is about $186,000 and the take-home pay on $59,000 is about $46,000, for a difference of roughly $140,000.

The average house in the United States costs $189,000. If you took home an additional $140,000 a year on top of what you already do (assuming you make the average), you could buy a home and pay it off within 2 years and still have around $90,000 extra left over at the end (a little less, accounting for interest on the mortgage, though you could always just stash away the required amount over the course of the next 1.5 years and buy the house in cash to avoid having to pay any interest at all).
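Using the article’s round figures (take-home pay of $186,000 versus $46,000, and a $189,000 average house, interest ignored), the arithmetic behind that claim is straightforward:

```python
# Extra take-home pay from a $300,000 salary versus the median income,
# applied to the average U.S. house price. Interest is ignored, per the
# "save up and buy in cash" scenario in the text.
extra_per_year = 186_000 - 46_000   # $140,000 extra take-home per year
house_price = 189_000

# Years needed to save the full price in cash (~1.35 years)
years_to_buy = house_price / extra_per_year

# Leftover after paying the house off out of 2 years of extra income (~$91,000)
leftover_after_2_years = 2 * extra_per_year - house_price

print(f"{years_to_buy:.2f} years to save the full price in cash")
print(f"${leftover_after_2_years:,} left over after 2 years")
```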

Imagine everything you could do making $300,000 a year. You wouldn’t have to worry about most expenses. Forget fretting over things like unexpected car repairs—you could buy a brand new car every year if you wanted to. Your life would be absolutely transformed. But $300,000 is just the beginning—that’s the very lowest end of the top 1%.

Let’s multiply it by, say, ten—$3,000,000 (which is still trivial compared to the highest end of the 1%; I’ll show you why shortly). This would likely equate to around $1,500,000 after taxes, but that’s still a difference of $1,454,000 over the average American’s net income. It’s so much money that differences like that barely even register—our brains treat $1,500,000 and $1,454,000 as roughly equal even though the gap between them is an entire yearly take-home pay. And that’s how they get you.


Underlying all this is the human brain’s difficulty understanding and organizing numbers, a difficulty that grows with the size of the number. There’s a certain range we’re able to firmly grasp—I’d posit the ceiling is somewhere just over a million dollars for most of us—but numbers beyond that are so far outside our realm of familiarity that we just don’t have the ability to process them in the same way. They become nebulous abstract concepts, like the number of cells in our bodies, or the difference in size between our own sun and VY Canis Majoris, one of the largest known stars.

You can see it in the way we group millionaires and multimillionaires and billionaires into the same category—a millionaire seems pretty rich, but a billionaire like Jeff Bezos owns 98,000 times as much as a millionaire. Imagine having $98,000—now multiply that by an entire million. Think of how long it takes just to count to a million, and now imagine getting paid $98,000 for each number you name—that’s how much money Jeff Bezos has. And yet we tend to describe all these people in the same way, using the same terms, as the same group. “Millionaires and billionaires.”
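To make that multiplication concrete (and to estimate the counting, assuming roughly one number per second—an understatement, since large numbers take longer to say):

```python
# A millionaire's fortune versus Jeff Bezos's reported $98,000,000,000.
millionaire = 1_000_000
bezos = 98_000_000_000

multiple = bezos // millionaire      # 98,000 times a millionaire's wealth
payout_total = 98_000 * 1_000_000    # $98,000 per number counted, up to a million

# Counting to a million nonstop at one number per second (~11.6 days)
days_to_count = 1_000_000 / (60 * 60 * 24)

print(f"{multiple:,}x a millionaire")
print(f"${payout_total:,} total payout")
print(f"~{days_to_count:.1f} days just to count to a million")
```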

Even resistance movements against the concentration of wealth (e.g. Occupy Wall Street) tend to frame things in ways that reveal an underlying difficulty grasping scale. “The top 1%” means those making around $300,000 or more, but the top 0.000125% (the wealthiest 400 Americans, per 2015 statistics) makes an average of $335,700,000/year — over 1,100 times more than the lowest earner among the top 1%. It seems silly to conflate $300,000 with something 1,100 times bigger, even though the framing makes a kind of sense — the problem isn’t the people making $300,000, it’s the people making $3,000,000, or $30,000,000, or $300,000,000.

We’re frequently told in the U.S. that if the wealthy have any limitations at all on their income or have to pay taxes that reduce their net income, they will have no incentive to work. Does this really make sense when you break down the numbers?

Similar to our difficulty processing magnitudes, we have problems discerning thresholds as well, perhaps most succinctly illustrated by the Sorites Paradox: say you have a heap of sand, and you keep removing grains from the heap until only a single grain remains. At which point did it cease being a heap? We can translate this to thresholds in incentive-based motivation: say you started out with $100,000,000/year and removed $1,000,000 from it at a time — at which point would a job no longer be worth it to you?

How many seconds in eternity?

Let’s return to our thought experiment. If I told you I’d pay you a salary that would net you, after taxes, anywhere between $2,000,000/year and $20,000,000/year, would you turn down the job just because you made only $2,000,000? You would have to be out of your mind — two million dollars per year could provide you just about anything you wanted, give you the life of your dreams. If that seems too meager, there are plenty of hardworking and capable Americans who would gladly do the job for that exorbitant salary.

Clearly there can’t be too much difference in generated incentive between $2,000,000 and $20,000,000, despite the latter being ten times the former. And yet we’re told to believe that given the choice between $2,000,000/year net income out of $20,000,000 gross income and not working at all, top earners would choose the latter. Does that make sense to you? Is that really a valid threat?

At the other end of the income spectrum—paradoxically, espoused by many of the same people making the previous argument—we’re told no one should think themselves above working minimum-wage jobs, and that we don’t need to raise the (deeply sub-poverty) minimum wage to increase incentive. The scales involved make this a complete inversion of reason: we’re expected to believe that someone making $2,000,000 would miss $1,000,000 so much that they’d be discouraged from working at all, but that even, say, $5,000 more wouldn’t make a difference in the life of someone making $15,000. (Advocates of a flat tax hold a similar (willful?) blind spot for magnitude in their belief that losing $10,000 of a $30,000 income would have the same impact on someone’s life as losing $10,000,000 of a $30,000,000 income.)

Let’s actually consider the scale, though: a minimum-wage full-time job yields $15,080 in gross yearly income — around $13,600 net, after taxes. (Someone netting $2,000,000/year would take that home in fewer than 2 working days.) Multiplying this income by 10 (one order of magnitude) brings it to only $136,000. Keep that number in mind for a moment.

Taking it in a different direction: if we asked the person taking home $13,600 to keep only 1/10th of their income, they’d be left with a mere $1,360, whereas if we asked the same of the person taking home $2,000,000, they would still have $200,000—far more than the aforementioned $136,000.

So to recap, a person making minimum wage could multiply their take-home income by 10 times and still not have as much as if we asked a person taking home two million dollars to keep only 1/10th of their income. Magnitudes matter.
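The recap in numbers, using the net figures from the text:

```python
# Multiply a minimum-wage earner's net income by 10, then reduce the
# $2,000,000 earner to only 1/10th of theirs, and compare.
min_wage_net = 13_600   # net income from a full-time minimum-wage job
high_net = 2_000_000    # net income of the high earner

times_ten = min_wage_net * 10   # 136,000 -- one order of magnitude up
one_tenth = high_net // 10      # 200,000 -- one order of magnitude down

assert one_tenth > times_ten    # the reduced fortune still wins
print(f"10x minimum-wage net:  ${times_ten:,}")
print(f"1/10th of $2M net:     ${one_tenth:,}")
```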


Wealth apologism runs rampant in American society, with exploited employees eagerly working amateur public relations campaigns to defend the people exploiting them. The laziest investor will see profits from a business long before — and often instead of — the hardest worker, but ask many conservative workers if this seems right and they’ll tell you all about how the investor’s risk somehow warrants their indefinite and unmitigated siphoning of profits. Somehow an investor who had $500 to contribute in 1988 deserves to continue to yield returns on that investment in 2018, before any worker gets their share of the profits their ongoing efforts generate.

“Risk” is the go-to justification for siphoning wealth. Executives shoulder the burden of risk that somehow warrants their making hundreds or even thousands of times as much as the average employee. Investors take a risk by lending their money to a business in the hope of seeing a return on that investment equal to or greater than what they put in. The executive could make a bad decision that costs them their job. The investor could put their money into a business that fails. Risk.

But let’s examine that premise of risk for a moment. Executives at large companies make bad decisions all the time and never seem to face any negative repercussions for it. If they get fired, they’re guaranteed millions and millions of dollars in severance. The bankers responsible for the housing crisis that nearly destroyed our economy all still received bonuses — some of them with our taxpayer-funded bailout money — despite their terrible and devastating decisions. Lower-level employees will lose their jobs as the result of bad decisions long before the executives who made those bad calls. So where’s the risk? (Not to mention that the Peter Principle dictates that people often climb the ladder to a position just above their highest capability, leaving many at the highest levels in roles in which they’re largely incompetent.)


As for investors, the wealthy who wisely diversify their holdings will likely only ever make money even if some of their investments completely tank. Let’s imagine an investor who has $2,000,000,000, who invests $1,000,000,000 and puts the remaining $1,000,000,000 into a savings account yielding 1.5% interest. Over the course of a year, their savings account will earn them $15,000,000, meaning they could invest fifteen million dollars and lose it all and still be no worse off than when they started. Imagine being able to lose even a single million dollars without feeling it.

Beyond that, firms analyze investments and sort them by risk, so it should be easy for someone with enough money to put a sufficient amount in safer investments and hedge funds to provide stable — albeit slow — growth to offset any less confident ventures. So again, where’s the risk?

At a certain magnitude of wealth, even losses aren’t real losses. Jeff Bezos is worth $98,000,000,000. He could give away an entire billion dollars a year for the rest of his life and still not even come close to running out of money. The Walton Family is worth $148,000,000,000. Theoretically, if they were to simply put all their money into a savings account with 1.5% interest, they too could give away an entire billion dollars a year and never actually lose any money — in fact, be left with even more than when they started. Imagine being able to give away a billion dollars and not even lose it. (And yet, they don’t even contribute much to their own foundation and are barely charitable at all.)
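Here’s that interest-only giving scenario worked out, using the article’s figures (a $148 billion fortune parked at 1.5%, giving away $1 billion a year):

```python
# Park the entire fortune in a savings account at 1.5% and give away
# $1,000,000,000 every year -- the fortune still grows.
walton_wealth = 148_000_000_000
rate = 0.015
giveaway = 1_000_000_000

interest_per_year = walton_wealth * rate       # ~$2.22 billion in interest
net_change = interest_per_year - giveaway      # still ~$1.22 billion richer

print(f"Interest earned per year:    ${interest_per_year:,.0f}")
print(f"Net gain after giving $1B:   ${net_change:,.0f}")
```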

The cost is higher than you realize.

These are the people we’re told time and again need more money, need to be allowed to take home more of their yearly income. The idea of taxing them or taking away some of that wealth is decried as theft, as unfair, as cruelty. Instead of guaranteeing free healthcare to every American, our priority is to ensure that those who could give away 98% of their net worth and still be the wealthiest people in our country can keep every penny they have.

If you’ve had trouble keeping track of all the numbers in this piece, don’t feel bad. There were times while editing when I had to go back and re-read a paragraph more carefully because I mistook a million for a billion—it’s easy to lose track of how many zeroes there are at a glance. But that’s ultimately my point: everyone struggles with this. It’s easy for us to distinguish differences of magnitudes like 1 vs 10 or 100 or 1,000 or 10,000, but when we get into the millions, billions, and trillions, it’s easy to lose track.

They’ve bewildered you with scale, banking — literally — on the surety that you won’t look closely at things too large to easily fathom. They’ve gotten even their opponents to frame the discussion in ways that conflate conceivable income — $300,000 — with the inconceivable, magnitudes-larger incomes that they themselves have, in an effort to get you to identify with them and defend them as they continue to siphon all the prosperity out of our economy.

It doesn’t have to be this way — the only reason it does is that we continue to let it happen. The growth we create belongs to all of us. It’s up to you — to us — to demand it back.