Categories: async-await, ecmascript-2017, javascript, node.js, promise

Using async/await with a forEach loop

2541

Are there any issues with using async/await in a forEach loop? I’m trying to loop through an array of files and await on the contents of each file.

import fs from 'fs-promise'

async function printFiles () {
  const files = await getFilePaths() // Assume this works fine

  files.forEach(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
}

printFiles()

This code does work, but could something go wrong with it? Someone told me that you’re not supposed to use async/await in a higher-order function like this, so I wanted to ask whether there is any issue with doing that.


  • 22

    @KernelMode The forEach method is the higher-order function here

    – Bergi

    Aug 9, 2020 at 18:58

4610

Sure, the code does work, but I’m pretty sure it doesn’t do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles function returns immediately after that.
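To make that concrete, here is a minimal sketch reusing the question’s code (getFilePaths and the fs-promise import are assumed to work as in the question), showing that the returned promise settles before any file contents are logged:

import fs from 'fs-promise'

async function printFiles () {
  const files = await getFilePaths()

  files.forEach(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
  // Nothing above was awaited, so execution falls through immediately.
}

printFiles().then(() => {
  // Logs before any file contents appear: forEach ignored the promises
  // returned by its async callback.
  console.log('printFiles already resolved')
})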

Reading in sequence

If you want to read the files in sequence, you indeed cannot use forEach. Just use a modern for … of loop instead, in which await will work as expected:

async function printFiles () {
  const files = await getFilePaths();

  for (const file of files) {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }
}

Reading in parallel

If you want to read the files in parallel, you indeed cannot use forEach. Each of the async callback function calls does return a promise, but you’re throwing them away instead of awaiting them. Just use map instead, and you can await the array of promises that you’ll get with Promise.all:

async function printFiles () {
  const files = await getFilePaths();

  await Promise.all(files.map(async (file) => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }));
}
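A hedged follow-up on the parallel version: Promise.all rejects as soon as any single read fails, and the remaining results are discarded. If every file should still be attempted, Promise.allSettled (ES2020, Node 12.9+) collects both successes and failures. A minimal sketch, again assuming getFilePaths and fs from the question:

async function printFiles () {
  const files = await getFilePaths();

  // allSettled never rejects; each entry reports its own outcome.
  const results = await Promise.allSettled(
    files.map(file => fs.readFile(file, 'utf8'))
  );

  for (const result of results) {
    if (result.status === 'fulfilled') {
      console.log(result.value);
    } else {
      console.error('read failed:', result.reason);
    }
  }
}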


  • 78

Could you please explain why for ... of ... works?

    – Demonbane

    Aug 15, 2016 at 18:04

  • 196

OK, I know why… Babel transforms async/await into a generator function, and using forEach means that each iteration gets its own individual generator function, which has nothing to do with the others, so they are executed independently and share no next() context with one another. Actually, a simple for() loop also works because the iterations all live inside one single generator function. (See the sketch after these comments for a rough illustration.)

    – Demonbane

    Aug 15, 2016 at 19:21


  • 42

    @Demonbane: In short, because it was designed to work 🙂 await suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await).

    – Bergi

    Aug 15, 2016 at 23:28

  • 5

    @arve0 Not really, an async function is quite different from a Promise executor callback, but yes the map callback returns a promise in both cases.

    – Bergi

    Mar 29, 2017 at 16:25

  • 5

@Taurus If you don’t intend to await them, then for…of would work just the same as forEach. No, I really mean that paragraph to emphasise that there is no place for .forEach in modern JS code.

    – Bergi

    Mar 20, 2018 at 13:24
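To illustrate the comments above about the generator transform: a rough, hand-written approximation (not actual Babel output) of how an async function can be modelled as a single generator driven by a helper that pumps it with next(). getFilePaths and fs are again assumed from the question.

// Trampoline that drives a generator: each yielded promise is awaited,
// and its resolved value is fed back in via next(). Error handling
// (gen.throw) is omitted to keep the sketch short.
function run (makeGenerator) {
  const gen = makeGenerator()
  function step (value) {
    const result = gen.next(value)
    if (result.done) return Promise.resolve(result.value)
    return Promise.resolve(result.value).then(step)
  }
  return step()
}

// Roughly what `async function printFiles` with a for…of loop becomes:
// one generator, so every `yield` (i.e. `await`) suspends the same
// function body, including the loop around it.
function printFiles () {
  return run(function* () {
    const files = yield getFilePaths()
    for (const file of files) {
      const contents = yield fs.readFile(file, 'utf8')
      console.log(contents)
    }
  })
}

With forEach, by contrast, each callback would become its own small generator, which is why the outer function cannot wait for them.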

471

With ES2018, you are able to greatly simplify all of the above answers to:

async function printFiles () {
  const files = await getFilePaths()

  for await (const contents of files.map(file => fs.readFile(file, 'utf8'))) {
    console.log(contents)
  }
}

See spec: proposal-async-iteration

Simplified:

  for await (const results of array) {
    await longRunningTask()
  }
  console.log('I will wait')

2018-09-10: This answer has been getting a lot of attention recently; please see Axel Rauschmayer’s blog post for further information about asynchronous iteration.


  • 9

I don’t think this answer addresses the initial question. for-await-of with a synchronous iterable (an array in our case) doesn’t cover the case of concurrently iterating over an array with asynchronous operations in each iteration. If I’m not mistaken, using for-await-of with a synchronous iterable over non-promise values is the same as using a plain for-of.

    Jan 9, 2019 at 10:30

  • 2

How do we pass the files array to fs.readFile here? Is it taken from the iterable?

    Jan 17, 2019 at 13:34


  • 1

With this solution each iteration awaits the previous one, so if an operation does some long calculation or reads a long file, it blocks execution of the next one, as opposed to mapping all the operations to promises and waiting for them all to complete.

    Sep 11, 2019 at 1:07

  • 3

    This answer has the same issue as the OP: It accesses all files in parallel. The serialized printing of results merely hides it.

    – jib

    Feb 18, 2021 at 13:52

  • 5

This answer is wrong. files.map() returns an array of promises, not an asynchronous iterator, which is what for await was made for! It will cause unhandled-rejection crashes! (See the sketch after these comments for the kind of async iterable for await is meant to consume.)

    – Bergi

    Dec 13, 2021 at 15:57
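To illustrate the last comment: for await is designed to consume an async iterable, such as one produced by an async generator. A minimal sketch of that pattern (sequential reads; getFilePaths and fs assumed from the question):

// Each file is read only when the consumer asks for the next value,
// so no promise is created ahead of time and left unhandled.
async function* readFiles (files) {
  for (const file of files) {
    yield await fs.readFile(file, 'utf8')
  }
}

async function printFiles () {
  const files = await getFilePaths()

  for await (const contents of readFiles(files)) {
    console.log(contents)
  }
}

Note that this reads the files sequentially; for parallel reads, the map plus Promise.all version above remains the appropriate tool.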

165

Instead of Promise.all in conjunction with Array.prototype.map (which does not guarantee the order in which the Promises are resolved), I use Array.prototype.reduce, starting with a resolved Promise:

async function printFiles () {
  const files = await getFilePaths();

  await files.reduce(async (promise, file) => {
    // This line will wait for the last async function to finish.
    // The first iteration uses an already resolved Promise
    // so, it will immediately continue.
    await promise;
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }, Promise.resolve());
}


  • 1

    This works perfectly, thank you so much. Could you explain what is happening here with Promise.resolve() and await promise;?

    – parrker9

    Mar 28, 2018 at 20:48

  • 2

    This is pretty cool. Am I right in thinking the files will be read in order and not all at once?

    – GollyJer

    Jun 9, 2018 at 0:24

  • 5

    @parrker9 Promise.resolve() returns an already resolved Promise object, so that reduce has a Promise to start with. await promise; will wait for the last Promise in the chain to resolve. @GollyJer The files will be processed sequentially, one at a time.

    Jun 17, 2018 at 15:00


  • 2

@Shay, you mean sequential, not synchronous. This is still asynchronous – if other things are scheduled, they will run in between the iterations here.

    May 30, 2019 at 16:51


  • 4

If you need the async processes to finish as quickly as possible and you don’t care about them being completed sequentially, try one of the well-upvoted solutions that use Promise.all. Example: Promise.all(files.map(async (file) => { /* code */ }));

    Jan 31, 2020 at 16:03