Tags: arrays, javascript, split

Split array into chunks

853

Let's say that I have a JavaScript array that looks like the following:

["Element 1","Element 2","Element 3",...]; // with close to a hundred elements.

What approach would be appropriate to chunk (split) the array into many smaller arrays with, let's say, 10 elements at most?

5

1034

The array.slice() method can extract a slice from the beginning, middle, or end of an array for whatever purposes you require, without changing the original array.

const chunkSize = 10;
for (let i = 0; i < array.length; i += chunkSize) {
    const chunk = array.slice(i, i + chunkSize);
    // do whatever
}

The last chunk may be smaller than chunkSize. For example, when given an array of 12 elements, the first chunk will have 10 elements and the second only 2.

Note that a chunkSize of 0 will cause an infinite loop.
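As a quick illustration (a sketch building on the loop above, not part of the original answer), a hypothetical chunkArray helper can collect the slices into a result array and guard against a non-positive chunkSize:

// Sketch only: wraps the slice loop above in a helper (chunkArray is a made-up name)
function chunkArray(array, chunkSize) {
  if (chunkSize <= 0) throw new Error('chunkSize must be a positive integer');
  const chunks = [];
  for (let i = 0; i < array.length; i += chunkSize) {
    chunks.push(array.slice(i, i + chunkSize));
  }
  return chunks;
}

// 12 elements with chunkSize 10: the last chunk holds the remaining 2 elements
console.log(chunkArray([...Array(12).keys()], 10).map(c => c.length)); // [10, 2]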

17

  • 30

Remember, if this is a util function, to assert against chunk being 0 (infinite loop).

    – Steven Lu

    Jan 25, 2014 at 0:51


  • 28

    Nope, the last chunk should just be smaller than the others.

    Jul 22, 2014 at 23:27

  • 10

    @Blazemonger, indeed! Next time I will actually try it myself before jumping to conclusions. I assumed (incorrectly) that passing an input into array.slice that exceeded the bounds of the array would be a problem, but it works just perfect!

    – rysqui

    Jul 23, 2014 at 18:48


  • 118

    For one-liners (chain-lovers): const array_chunks = (array, chunk_size) => Array(Math.ceil(array.length / chunk_size)).fill().map((_, index) => index * chunk_size).map(begin => array.slice(begin, begin + chunk_size));.

    Feb 4, 2018 at 12:39


  • 17

Why do you need j? At first I thought it was an optimisation, but it is actually slower than for(i=0;i<array.length;i++){}

    – Alex

    Mar 16, 2018 at 12:22

251

Here's an ES6 version using reduce:

const perChunk = 2 // items per chunk    

const inputArray = ['a','b','c','d','e']

const result = inputArray.reduce((resultArray, item, index) => { 
  const chunkIndex = Math.floor(index/perChunk)

  if(!resultArray[chunkIndex]) {
    resultArray[chunkIndex] = [] // start a new chunk
  }

  resultArray[chunkIndex].push(item)

  return resultArray
}, [])

console.log(result); // result: [['a','b'], ['c','d'], ['e']]

And you're ready to chain further map/reduce transformations. Your input array is left intact.
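For instance (a quick sketch, not from the original answer), you can map straight over the chunked result while inputArray stays untouched:

// Chaining a further map onto the chunked result (sketch)
const joined = result.map(chunk => chunk.join('+'));
console.log(joined);     // ['a+b', 'c+d', 'e']
console.log(inputArray); // ['a', 'b', 'c', 'd', 'e'] (unchanged)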


If you prefer a shorter but less readable version, you can sprinkle some concat into the mix for the same end result:

inputArray.reduce((all,one,i) => {
   const ch = Math.floor(i/perChunk); 
   all[ch] = [].concat((all[ch]||[]),one); 
   return all
}, [])

You can use the remainder operator to put consecutive items into different chunks:

const ch = (i % perChunk); 
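With the same five-element input and perChunk = 2, that variant fills the chunks round-robin instead of consecutively (a quick sketch, not from the original answer):

inputArray.reduce((all, one, i) => {
  const ch = i % perChunk; // remainder picks the chunk, so consecutive items split up
  all[ch] = [].concat((all[ch] || []), one);
  return all
}, [])
// [['a', 'c', 'e'], ['b', 'd']]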

10

  • 1

This seems like the most condensed solution. What is chunkIndex = Math.floor(index/perChunk) getting? Is it the average?

    – me-me

    Jun 5, 2018 at 3:32

  • 1

5/2 = 2.5 and Math.floor(2.5) = 2, so the item with index 5 will be placed in bucket 2

    – Andrei R

    Jun 7, 2018 at 3:34


  • 10

    I like your use of all and one here – makes reduce easier to read to my brain than other examples I’ve seen & used.

    Sep 30, 2020 at 19:24

  • 2

Hot take from someone who loves functional programming: a for loop is more readable than reducing into a new array

    Sep 27, 2021 at 16:19

  • 4

    Reading solutions like this I really wonder if people ever consider the space/time complexity of their algorithms anymore. concat() clones arrays, which means that not only does this algorithm iterate every element as @JPdelaTorre notices but it does so per every other element. With one million items (which is really not that weird for any real use-case) this algorithm takes nearly 22 seconds to run on my PC, while the accepted answer takes 8 milliseconds. Go team FP!

    – SFG

    Feb 8 at 13:52

169

Modified from an answer by dbaseman: https://stackoverflow.com/a/10456344/711085

Object.defineProperty(Array.prototype, 'chunk_inefficient', {
  value: function(chunkSize) {
    var array = this;
    return [].concat.apply([],
      array.map(function(elem, i) {
        return i % chunkSize ? [] : [array.slice(i, i + chunkSize)];
      })
    );
  }
});

console.log(
  [1, 2, 3, 4, 5, 6, 7].chunk_inefficient(3)
)
// [[1, 2, 3], [4, 5, 6], [7]]

minor addendum:

I should point out that the above is a not-that-elegant (in my mind) workaround to use Array.map. It basically does the following, where ~ is concatenation:

[[1,2,3]]~[]~[]~[] ~ [[4,5,6]]~[]~[]~[] ~ [[7]]

It has the same asymptotic running time as the method below, but perhaps a worse constant factor due to building empty lists. One could rewrite this as follows (mostly the same as Blazemonger’s method, which is why I did not originally submit this answer):

More efficient method:

// refresh page if experimenting and you already defined Array.prototype.chunk

Object.defineProperty(Array.prototype, 'chunk', {
  value: function(chunkSize) {
    var R = [];
    for (var i = 0; i < this.length; i += chunkSize)
      R.push(this.slice(i, i + chunkSize));
    return R;
  }
});

console.log(
  [1, 2, 3, 4, 5, 6, 7].chunk(3)
)

My preferred way nowadays is the above, or one of the following:

Array.range = function(n) {
  // Array.range(5) --> [0,1,2,3,4]
  return Array.apply(null,Array(n)).map((x,i) => i)
};

Object.defineProperty(Array.prototype, 'chunk', {
  value: function(n) {

    // ACTUAL CODE FOR CHUNKING ARRAY:
    return Array.range(Math.ceil(this.length/n)).map((x,i) => this.slice(i*n,i*n+n));

  }
});

Demo:

> JSON.stringify( Array.range(10).chunk(3) );
[[0,1,2],[3,4,5],[6,7,8],[9]]

Or if you don’t want an Array.range function, it’s actually just a one-liner (excluding the fluff):

var ceil = Math.ceil;

Object.defineProperty(Array.prototype, 'chunk', {value: function(n) {
    return Array(ceil(this.length/n)).fill().map((_,i) => this.slice(i*n,i*n+n));
}});

or

Object.defineProperty(Array.prototype, 'chunk', {value: function(n) {
    return Array.from(Array(ceil(this.length/n)), (_,i)=>this.slice(i*n,i*n+n));
}});

9

  • 64

    Eh, I’d avoid messing with the prototype as the feeling of coolness you get from calling the chunk function on the array doesn’t really outweigh the extra complexity you’re adding and the subtle bugs that messing with built-in prototypes can cause.

    Jul 4, 2012 at 1:19

  • 16

He's not messing with them; he's extending them for Arrays. I understand never touching Object.prototype because that would bubble to all objects (everything), but for this Array-specific function I don't see any issues.

    – rlemon

    Jul 24, 2012 at 19:45

  • 5

Based on the compatibility chart on the mozilla dev site, Array.map is only supported in IE9+. Be careful.

    – Maikel D

    Jun 4, 2013 at 6:13

  • 11

    @rlemon Here you go, here’s the issues this causes. Please NEVER modify native prototypes, especially without vendor prefix: developers.google.com/web/updates/2018/03/smooshgate It’s fine if you add array.myCompanyFlatten, but please don’t add array.flatten and pray that it’ll never cause issues. As you can see, mootools’ decision years ago now influences TC39 standards.

    Apr 23, 2020 at 15:39


  • 1

    @rlemon It’s still dangerous because it may clash with what someone else has done, or with a future method. To be safe, just create a method that takes the array and avoid problems altogether. humanwhocodes.com/blog/2010/03/02/…

    May 19 at 15:28