Low-effort, often-overlooked code changes to improve your API latency
1. Unrelated async operations can be batched into a single Promise.all
Before:
// The requests are run one after another
const fruits = await getFruits(); // 0.3s
const drinks = await getDrinks(); // 0.3s
const bill = await getBill(); // 0.4s

return {
  fruits,
  drinks,
  bill,
};

In the example above, getDrinks has to wait until getFruits finishes loading, and getBill has to do the same for getDrinks. The total blocked time is approximately 0.3 + 0.3 + 0.4 = 1s.

After:
// The 3 requests are run concurrently
const [fruits, drinks, bill] = await Promise.all([
  getFruits(),
  getDrinks(),
  getBill(),
]);

return {
  fruits,
  drinks,
  bill,
};

With this change, we only have to wait for the longest-running task, getBill, at about 0.4s. A whopping 60% reduction in response time!
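One caveat worth keeping in mind: Promise.all rejects as soon as any of its promises rejects. If you still want the results that did succeed, Promise.allSettled is an option. A minimal sketch, reusing the hypothetical getFruits/getDrinks/getBill helpers from above:

// Promise.allSettled never rejects; each entry reports its own outcome.
const results = await Promise.allSettled([
  getFruits(),
  getDrinks(),
  getBill(),
]);

// Each entry is { status: "fulfilled", value } or { status: "rejected", reason },
// so one failing request no longer discards the others.
const [fruits, drinks, bill] = results.map((r) =>
  r.status === "fulfilled" ? r.value : null
);

Whether fail-fast or best-effort is the right behavior depends on whether a partial response is still useful to your client.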

2. Requested data should be returned ASAP

Let's look at this hypothetical POST endpoint, which lets a user create an item in the database and responds with the newly created contentId:

const { content } = request;

const contentId = await saveToDatabase(content);

await log("content added: ", contentId);
await saveContentIdToAnotherTable(contentId);

return response({
  success: true,
  contentId,
});

Anything wrong with the code above? Well, contentId should be returned to the user immediately, without being blocked by log() and saveContentIdToAnotherTable(), which can run afterwards. So the code above can be changed to this:

const { content } = request;

const contentId = await saveToDatabase(content);

response({
  success: true,
  contentId,
});

await Promise.all([
  log("content added: ", contentId),
  saveContentIdToAnotherTable(contentId),
]);
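How exactly you send the response early depends on your framework. As a minimal sketch, assuming an Express-style handler and the same hypothetical saveToDatabase/log/saveContentIdToAnotherTable helpers, it could look like this:

// Assumed Express-style route; the helpers are placeholders from the example above.
app.post("/content", async (req, res) => {
  const contentId = await saveToDatabase(req.body.content);

  // Respond as soon as the essential work is done...
  res.json({ success: true, contentId });

  // ...then finish the follow-up work after the response has been sent.
  // The .catch keeps an unhandled rejection from crashing the process.
  await Promise.all([
    log("content added: ", contentId),
    saveContentIdToAnotherTable(contentId),
  ]).catch((err) => console.error("post-response work failed", err));
});

Be aware that some serverless platforms can freeze execution as soon as the response is sent, which is exactly the case the next section covers.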
2.1 Same idea, but for Vercel Edge Functions

If you are using Vercel Edge Functions, waitUntil is exactly what you need to achieve the same effect described above.

See: waitUntil
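As a minimal sketch, assuming an Edge Function handler that receives a context object exposing waitUntil (check the linked docs for the exact signature in your setup), with the same hypothetical helpers as before:

// Marks this route as an Edge Function on Vercel (one common convention).
export const config = { runtime: "edge" };

export default async function handler(request, context) {
  const { content } = await request.json();
  const contentId = await saveToDatabase(content);

  // waitUntil keeps the function alive until the promise settles,
  // without delaying the response returned below.
  context.waitUntil(
    Promise.all([
      log("content added: ", contentId),
      saveContentIdToAnotherTable(contentId),
    ])
  );

  return new Response(JSON.stringify({ success: true, contentId }), {
    headers: { "content-type": "application/json" },
  });
}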

3. Enable HTTP/2 for your server

HTTP/2 has been out for years, and upgrading from HTTP/1.1 to HTTP/2 is relatively safe and non-breaking. (From personal experience, upgrading an nginx server to HTTP/2 literally took seconds.)
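Behind a proxy like nginx, it usually comes down to enabling the http2 option on the listening socket. If your API is served directly by Node instead, here is a minimal sketch using Node's built-in http2 module (the certificate paths are placeholders; browsers only speak HTTP/2 over TLS):

// Minimal sketch of an HTTP/2 server with Node's built-in http2 module.
const http2 = require("node:http2");
const fs = require("node:fs");

const server = http2.createSecureServer({
  key: fs.readFileSync("./certs/server-key.pem"),   // placeholder path
  cert: fs.readFileSync("./certs/server-cert.pem"), // placeholder path
  allowHTTP1: true, // fall back to HTTP/1.1 for clients that need it
});

server.on("request", (req, res) => {
  res.writeHead(200, { "content-type": "application/json" });
  res.end(JSON.stringify({ ok: true }));
});

server.listen(8443);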

For "the why", the following links can be of help: