Reduce is the Swiss-army knife of array iterators. It’s really powerful. So powerful, you can build most of the other array iterator methods with it, like .map(), .filter() and .flatMap(). And in this article we’ll look at some more amazing things you can do with it. But, if you’re new to array iterator methods, .reduce() can be confusing at first.
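To give a taste of that power before we dig in, here’s a quick sketch of how .map() and .filter() could be rebuilt on top of .reduce(). (The names mapWithReduce() and filterWithReduce() are my own, made up for illustration.)

```javascript
// Rebuild .map() with .reduce(): start from an empty array
// and append the transformed element on each pass.
function mapWithReduce(f, arr) {
  return arr.reduce((acc, item) => acc.concat(f(item)), []);
}

// Rebuild .filter() with .reduce(): append the element
// only when the predicate says to keep it.
function filterWithReduce(predicate, arr) {
  return arr.reduce(
    (acc, item) => (predicate(item) ? acc.concat(item) : acc),
    []
  );
}

console.log(mapWithReduce(x => x * 2, [1, 2, 3]));
// ⦘ [2, 4, 6]
console.log(filterWithReduce(x => x % 2 === 0, [1, 2, 3, 4]));
// ⦘ [2, 4]
```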

Reduce is one of the most versatile functions that was ever discovered — Eric Elliott

People often run into trouble as soon as they step beyond the basic examples. Simple things like addition and multiplication are fine. But as soon as you try it with something more complicated, it breaks. Using it with anything other than numbers starts to get really confusing.

Why does reduce() cause people so much trouble?

I have a theory about this. I think there are two main reasons. The first is that we tend to teach people .map() and .filter() before we teach .reduce(). But the signature for .reduce() is different. Getting used to the idea of an initial value is a non-trivial step. And then the reducer function also has a different signature: it takes an accumulator value as well as the current array element. So learning .reduce() can be tricky because it’s so different from .map() and .filter(). And there’s no avoiding this. But I think there’s another factor at work.

The second reason relates to how we teach people about .reduce(). It’s not uncommon to see tutorials that give examples like this:

function add(a, b) {
    return a + b;
}

function multiply(a, b) {
    return a * b;
}

const sampleArray = [1, 2, 3, 4];

const sum = sampleArray.reduce(add, 0);
console.log('The sum total is:', sum);
// ⦘ The sum total is: 10

const product = sampleArray.reduce(multiply, 1);
console.log('The product total is:', product);
// ⦘ The product total is: 24

Now, I’m not saying this to shame anyone. The MDN docs use this kind of example. And heck, I’ve even done it myself. There’s a good reason why we do this: functions like add() and multiply() are nice and simple to understand. But unfortunately, they’re a little too simple. With add(), it doesn’t matter whether you add b + a or a + b. And the same goes for multiply(): a * b is the same as b * a. This is all as you would expect. But the trouble is, it makes it harder to see what’s going on in the reducer function.

The reducer function is the first parameter we pass to .reduce(). It has a signature that looks something like this:

function myReducer(accumulator, arrayElement) {
    // Code to do something goes here
}

The accumulator represents a ‘carry’ value: it contains whatever was returned the last time the reducer function was called. If the reducer function hasn’t been called yet, then it contains the initial value. So, when we pass add() in as the reducer, the accumulator maps to the a part of a + b. And a just so happens to contain the running total of all the previous items. The same goes for multiply(): the a parameter in a * b contains the running multiplication total. And there’s nothing wrong with showing people this. But, it masks one of the most interesting features of .reduce().
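One way to see that ‘carry’ in action is to log each call to the reducer. Here’s a small sketch (loggingAdd() is a made-up wrapper, just for tracing):

```javascript
function add(a, b) {
  return a + b;
}

// Wrap add() so we can watch the accumulator change on each call.
function loggingAdd(accumulator, element) {
  console.log(`accumulator: ${accumulator}, element: ${element}`);
  return add(accumulator, element);
}

const sum = [1, 2, 3, 4].reduce(loggingAdd, 0);
// ⦘ accumulator: 0, element: 1
// ⦘ accumulator: 1, element: 2
// ⦘ accumulator: 3, element: 3
// ⦘ accumulator: 6, element: 4

console.log(sum);
// ⦘ 10
```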

The great power of .reduce() comes from the fact that accumulator and arrayElement don’t have to be the same type. For add and multiply, both a and b are numbers. They’re the same type. But we don’t have to make our reducers like that. The accumulator can be something completely different from the array elements.

For example, our accumulator might be a string, while our array contains numbers:

function fizzBuzzReducer(acc, element) {
    if (element % 15 === 0) return `${acc}Fizz Buzz\n`;
    if (element % 5 === 0) return `${acc}Buzz\n`;
    if (element % 3 === 0) return `${acc}Fizz\n`;
    return `${acc}${element}\n`;
}

const nums = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15];

console.log(nums.reduce(fizzBuzzReducer, ''));

Now, this is just an example to make the point. If we’re working with strings, we could achieve the same thing with a .map() and .join() combo. But .reduce() is useful for more than just strings. The accumulator value doesn’t have to be a simple type (like numbers or strings). It can be a structured type like an array or a plain ol' JavaScript object (POJO). This lets us do some really interesting things, as we’ll see in a moment.

Some interesting things we can do with reduce

So, what interesting things can we do then? I’ve listed five here that don’t involve adding numbers together:

1. Convert an array to an object;
2. Unfold to a larger array;
3. Make two calculations in one traversal;
4. Combine mapping and filtering into one pass; and
5. Run asynchronous functions in sequence.

Convert an array to an object

We can use .reduce() to convert an array to a POJO. This can be handy if you need to do lookups of some sort. For example, imagine if we had a list of people:

const peopleArr = [
    {
        username: 'glestrade',
        displayname: 'Inspector Lestrade',
        email: 'glestrade@met.police.uk',
        authHash: 'bdbf9920f42242defd9a7f76451f4f1d',
        lastSeen: '2019-05-13T11:07:22+00:00',
    },
    {
        username: 'mholmes',
        displayname: 'Mycroft Holmes',
        email: 'mholmes@gov.uk',
        authHash: 'b4d04ad5c4c6483cfea030ff4e7c70bc',
        lastSeen: '2019-05-10T11:21:36+00:00',
    },
    {
        username: 'iadler',
        displayname: 'Irene Adler',
        email: null,
        authHash: '319d55944f13760af0a07bf24bd1de28',
        lastSeen: '2019-05-17T11:12:12+00:00',
    },
];

In some circumstances, it might be convenient to look up user details by their username. To make that easier, we can convert our array to an object. It might look something like this:

function keyByUsernameReducer(acc, person) {
    return {...acc, [person.username]: person};
}

const peopleObj = peopleArr.reduce(keyByUsernameReducer, {});
console.log(peopleObj);
// ⦘ {
//     "glestrade": {
//         "username": "glestrade",
//         "displayname": "Inspector Lestrade",
//         "email": "glestrade@met.police.uk",
//         "authHash": "bdbf9920f42242defd9a7f76451f4f1d",
//         "lastSeen": "2019-05-13T11:07:22+00:00"
//     },
//     "mholmes": {
//         "username": "mholmes",
//         "displayname": "Mycroft Holmes",
//         "email": "mholmes@gov.uk",
//         "authHash": "b4d04ad5c4c6483cfea030ff4e7c70bc",
//         "lastSeen": "2019-05-10T11:21:36+00:00"
//     },
//     "iadler": {
//         "username": "iadler",
//         "displayname": "Irene Adler",
//         "email": null,
//         "authHash": "319d55944f13760af0a07bf24bd1de28",
//         "lastSeen": "2019-05-17T11:12:12+00:00"
//     }
// }

In this version, I’ve left the username as part of the object. But with a small tweak you can remove it (if you need to).
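For example, one way to make that tweak is with rest destructuring in the reducer’s parameters. (This is a sketch of my own; the reducer name is made up.)

```javascript
// Pull username out of each person object, and use everything
// that's left over (the "rest") as the value.
function keyByUsernameDropKeyReducer(acc, {username, ...rest}) {
  return {...acc, [username]: rest};
}

// A cut-down sample entry, just to show the shape of the result.
const peopleArr = [
  {username: 'iadler', displayname: 'Irene Adler'},
];

console.log(peopleArr.reduce(keyByUsernameDropKeyReducer, {}));
// ⦘ { iadler: { displayname: 'Irene Adler' } }
```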

Unfold a small array to a larger array

Normally, we think about .reduce() as taking a list of many things and reducing it down to a single value. But there’s no reason that single value can’t be an array. And there’s also no rule saying the array has to be shorter than the original. So, we can use .reduce() to transform short arrays into longer ones.

This can be handy if you’re reading data from a text file. Here’s an example. Imagine we’ve read a bunch of plain text lines into an array. We’d like to split each line by commas, and have one big list of names.

const fileLines = [
    'Inspector Algar,Inspector Bardle,Mr. Barker,Inspector Barton',
    'Inspector Baynes,Inspector Bradstreet,Inspector Sam Brown',
    'Monsieur Dubugue,Birdy Edwards,Inspector Forbes,Inspector Forrester',
    'Inspector Gregory,Inspector Tobias Gregson,Inspector Hill',
    'Inspector Stanley Hopkins,Inspector Athelney Jones'
];

function splitLineReducer(acc, line) {
    return acc.concat(line.split(/,/g));
}

const investigators = fileLines.reduce(splitLineReducer, []);
console.log(investigators);
// ⦘ [
//     "Inspector Algar",
//     "Inspector Bardle",
//     "Mr. Barker",
//     "Inspector Barton",
//     "Inspector Baynes",
//     "Inspector Bradstreet",
//     "Inspector Sam Brown",
//     "Monsieur Dubugue",
//     "Birdy Edwards",
//     "Inspector Forbes",
//     "Inspector Forrester",
//     "Inspector Gregory",
//     "Inspector Tobias Gregson",
//     "Inspector Hill",
//     "Inspector Stanley Hopkins",
//     "Inspector Athelney Jones"
// ]

We start with an array of length five, and then end up with an array of length sixteen.

Now, you may have come across my Civilised Guide to JavaScript Array Methods. And if you’re paying attention, you may have noticed that I recommend .flatMap() for this kind of scenario. So, perhaps this one doesn’t really count. But, you may also have noticed that .flatMap() isn’t available in Internet Explorer or Edge. So, we can use .reduce() to create our own flatMap() function.

function flatMap(f, arr) {
    const reducer = (acc, item) => acc.concat(f(item));
    return arr.reduce(reducer, []);
}

const investigators = flatMap(x => x.split(','), fileLines);
console.log(investigators);

So, .reduce() can help us make longer arrays out of short ones. But it can also cover for missing array methods that aren’t available.
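The same trick covers for plain .flat() as well, in case that method is also missing. Here’s a sketch of a one-level flatten (the flatten() name is my own):

```javascript
// Flatten one level of nesting: concat() spreads each sub-array
// into the accumulator as it goes.
function flatten(arr) {
  return arr.reduce((acc, item) => acc.concat(item), []);
}

console.log(flatten([[1, 2], [3], [4, 5]]));
// ⦘ [1, 2, 3, 4, 5]
```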

Make two calculations in one traversal

Sometimes we need to make two calculations based on a single array. For example, we might want to calculate the maximum and the minimum for a list of numbers. We could do this with two passes like so:

const readings = [0.3, 1.2, 3.4, 0.2, 3.2, 5.5, 0.4];

// Start the maximum at -Infinity and the minimum at Infinity,
// so the result is correct even if every reading is negative.
const maxReading = readings.reduce((x, y) => Math.max(x, y), -Infinity);
const minReading = readings.reduce((x, y) => Math.min(x, y), Infinity);

console.log({minReading, maxReading});
// ⦘ {minReading: 0.2, maxReading: 5.5}

This requires traversing our array twice. But, there may be times when we don’t want to do that. Since .reduce() lets us return any type we want, we don’t have to return a number. We can encode two values into an object. Then we can do two calculations on each iteration and only traverse the array once:

const readings = [0.3, 1.2, 3.4, 0.2, 3.2, 5.5, 0.4];

function minMaxReducer(acc, reading) {
    return {
        minReading: Math.min(acc.minReading, reading),
        maxReading: Math.max(acc.maxReading, reading),
    };
}

const initMinMax = {
    minReading: Infinity,
    maxReading: -Infinity,
};

const minMax = readings.reduce(minMaxReducer, initMinMax);
console.log(minMax);
// ⦘ {minReading: 0.2, maxReading: 5.5}

The trouble with this particular example is that we don’t really get a performance boost here. We still end up performing the same number of calculations. But, there are cases where it might make a genuine difference. For example, if we’re combining .map() and .filter() operations…

Combine mapping and filtering into one pass

Imagine we have the same peopleArr from before. We’d like to find the most recent login, excluding people without an email address. One way to do this would be with three separate operations:

1. Filter out entries without an email; then
2. Extract the lastSeen property; and finally
3. Find the maximum value.

Putting that all together might look something like so:

function notEmptyEmail(x) {
    return (x.email !== null) && (x.email !== undefined);
}

function getLastSeen(x) {
    return x.lastSeen;
}

function greater(a, b) {
    return (a > b) ? a : b;
}

const peopleWithEmail = peopleArr.filter(notEmptyEmail);
const lastSeenDates = peopleWithEmail.map(getLastSeen);
const mostRecent = lastSeenDates.reduce(greater, '');

console.log(mostRecent);
// ⦘ 2019-05-13T11:07:22+00:00

Now, this code is perfectly readable and it works. For the sample data, it’s just fine. But if we had an enormous array, then there’s a chance we might start running into memory issues. This is because we use a variable to store each intermediate array. If we modify our reducer callback, then we can do everything in one pass:

function notEmptyEmail(x) {
    return (x.email !== null) && (x.email !== undefined);
}

function greater(a, b) {
    return (a > b) ? a : b;
}

function notEmptyMostRecent(currentRecent, person) {
    return (notEmptyEmail(person))
        ? greater(currentRecent, person.lastSeen)
        : currentRecent;
}

const mostRecent = peopleArr.reduce(notEmptyMostRecent, '');

console.log(mostRecent);
// ⦘ 2019-05-13T11:07:22+00:00

In this version we traverse the array just once. But it may not be an improvement if the list of people is always small. My recommendation would be to stick with .filter() and .map() by default. If you identify memory-usage or performance issues, then look at alternatives like this.

Run asynchronous functions in sequence

Another thing we can do with .reduce() is to run promises in sequence (as opposed to parallel). This can be handy if you have a rate limit on API requests or if you need to pass the result of each promise to the next one. To give an example, imagine we wanted to fetch messages for each person in our peopleArr array.

function fetchMessages(username) {
    return fetch(`https://example.com/api/messages/${username}`)
        .then(response => response.json());
}

function getUsername(person) {
    return person.username;
}

async function chainedFetchMessages(p, username) {
    // In this function, p is a promise. We wait for it to finish,
    // then run fetchMessages().
    const obj = await p;
    const data = await fetchMessages(username);
    return { ...obj, [username]: data };
}

const msgObj = peopleArr
    .map(getUsername)
    .reduce(chainedFetchMessages, Promise.resolve({}))
    .then(console.log);
// ⦘ {glestrade: [ … ], mholmes: [ … ], iadler: [ … ]}

Notice that for this to work, we have to pass in a Promise as the initial value using Promise.resolve() . It will resolve immediately (that’s what Promise.resolve() does). Then our first API call will run straight away.
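To see the sequencing without a live API, here’s a runnable sketch that swaps the network call for a simulated one. (delay() and fakeFetch() are made up for illustration; they stand in for the real fetchMessages().)

```javascript
// Resolve with a given value after ms milliseconds.
function delay(ms, value) {
  return new Promise(resolve => setTimeout(() => resolve(value), ms));
}

// A stand-in for a real API call. Logs when each "request" starts,
// so we can see that they run one after another.
function fakeFetch(username) {
  console.log(`start: ${username}`);
  return delay(50, [`message for ${username}`]);
}

async function chained(p, username) {
  const obj = await p;                    // Wait for the previous requests…
  const data = await fakeFetch(username); // …before starting this one.
  return {...obj, [username]: data};
}

['glestrade', 'mholmes']
  .reduce(chained, Promise.resolve({}))
  .then(result => console.log(result));
// ⦘ start: glestrade
// ⦘ start: mholmes
// ⦘ { glestrade: [ 'message for glestrade' ], mholmes: [ 'message for mholmes' ] }
```

Note that “start: mholmes” only appears after the first request has finished, which is exactly the sequencing we wanted.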

Why don’t we see reduce more often then?

So, we’ve seen a bunch of interesting things you can do with .reduce(). Hopefully they will spark some ideas on how you can use it in your own projects. But, if .reduce() is so powerful and flexible, then why don’t we see it more often? Ironically, its flexibility and power sometimes work against it. Because you can do so many different things with .reduce(), seeing it tells the reader less. Methods like .map(), .filter() and .flatMap() are more specific and less flexible, but they tell us more about the author’s intent. We say that this makes them more expressive. So it’s usually better to use a more expressive method, rather than reach for .reduce() for everything.

Over to you, my friend