Beware of Promise.all
Jeremy Dorn
Posted on October 28, 2021
In JavaScript, Promise.all lets you run a bunch of Promises in parallel and get an array of results back.
const responses = await Promise.all([
  fetch("/api/1"),
  fetch("/api/2")
])
Pretty straightforward. However, if you were to do the above with 100 fetch calls instead, you might accidentally take down your server in a self-inflicted Denial of Service attack. Even if you protect against this in the API with rate-limiting, you're still going to see a lot of errors from failed requests as you scale up.
And APIs are the exception here. Most other types of external calls - filesystem operations, system calls, etc. - have no concept of rate-limiting at all.
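To make that failure mode concrete, the risky version is the exact same pattern applied to an unbounded list (urls here is just a placeholder for whatever array you happen to be iterating over):

// Fires every request at the same instant - nothing caps how many are in flight
const responses = await Promise.all(
  urls.map(url => fetch(url))
)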
For example, in Node.js you can spawn new shells to call out to other programs on the computer. I use this in my open source A/B testing platform GrowthBook to call a Python script. Something like this:
const results = await Promise.all(
  metrics.map(m => callPython(m))
)
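Here callPython stands in for a small wrapper around Node's child_process module. A minimal sketch of what such a wrapper might look like - the script name, arguments, and output format are made up for illustration, not GrowthBook's actual code:

import { execFile } from 'child_process'
import { promisify } from 'util'

const execFileAsync = promisify(execFile)

// Every call immediately spawns a brand new Python process -
// nothing in here limits how many run at once
async function callPython(metric) {
  const { stdout } = await execFileAsync('python3', [
    'analyze.py',            // assumed script name
    JSON.stringify(metric)   // assumed argument format
  ])
  return JSON.parse(stdout)
}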
Given a large array, the Promise.all above will happily spawn hundreds of Python shells and start executing them all in parallel. My dev machine is pretty powerful, so I didn't notice during testing that all 8 CPU cores would spike to 100% for a couple of seconds. When I deployed the code to a Docker container on AWS, though, I definitely noticed when it started crashing and restarting all the time.
The solution is to add rate-limiting or concurrency limits to your Promise.all
calls. There are a few ways to do this.
For API calls where you want to limit the number of calls per second, you can use the simple p-throttle library:
import pThrottle from 'p-throttle'

// Limit to 2 calls per second
const throttle = pThrottle({
  limit: 2,
  interval: 1000
})

// throttle() wraps a function and returns a rate-limited version of it
const throttledFetch = throttle(url => fetch(url))

const responses = await Promise.all([
  throttledFetch("/api/1"),
  throttledFetch("/api/2"),
  // ...
])
For system calls where you want to limit the number of parallel executions, no matter how long they take, there is the simple p-limit library:
import pLimit from 'p-limit'

// Only 5 promises will run at a time
const limit = pLimit(5)

const results = await Promise.all(
  metrics.map(
    m => limit(() => callPython(m))
  )
)
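If you're curious what a concurrency limit like that is actually doing, the core idea fits in a few lines. This is a rough sketch, not a substitute for p-limit, which handles errors, ordering, and edge cases more carefully:

// Run fn over items with at most `concurrency` promises in flight at a time
async function mapWithConcurrency(items, concurrency, fn) {
  const results = new Array(items.length)
  let nextIndex = 0

  // Each worker repeatedly grabs the next unprocessed index and awaits it
  async function worker() {
    while (nextIndex < items.length) {
      const index = nextIndex++
      results[index] = await fn(items[index])
    }
  }

  const workers = Array.from(
    { length: Math.min(concurrency, items.length) },
    () => worker()
  )
  await Promise.all(workers)
  return results
}

// Usage: const results = await mapWithConcurrency(metrics, 5, callPython)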
For more advanced use cases, you might want to look into a full-featured job queue instead, like bree, bull, or agenda.
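With bull, for example, the setup might look something like this - a rough sketch that assumes a Redis server running locally; the queue name and job shape are made up:

import Queue from 'bull'

// A persistent, Redis-backed queue (assumed local Redis instance)
const pythonQueue = new Queue('python-analysis', 'redis://127.0.0.1:6379')

// Worker side: process at most 5 jobs at a time
pythonQueue.process(5, async job => callPython(job.data.metric))

// Producer side: enqueue one job per metric instead of spawning everything at once
await Promise.all(metrics.map(m => pythonQueue.add({ metric: m })))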
As developers, we spend a lot of time worrying about external attacks and not enough time protecting our apps from naive internal code. I hope this helps others avoid the same CPU-crashing bugs in production that I had to work through. Good luck out there!