Next.js Discord


Rate limiting static site generation

Unanswered
Stony gall posted this in #help-forum
Stony gallOP
I have an API from a CMS that has rate limiting on. How can I slow down static site generation on building the nextjs app?

17 Replies

@Stony gall I have an API from a CMS that has rate limiting on. How can I slow down static site generation on building the nextjs app?
How do you currently fetch the data? And how do you currently generate your static site?
Stony gallOP
@B33fb0n3 yarn build fetching data using standard fetch call to my CMS API
It has never been a problem in other Nextjs projects, it must have to do with this specific Shared Hosting provider having some sort of rate limiting active
@Stony gall @B33fb0n3 `yarn build` fetching data using standard `fetch` call to my CMS API
so you create your page once on build time and then it's cached until your build again?
Stony gallOP
Can't hit it with too many requests per second
@B33fb0n3 it's a standard nextjs app router hosted on vercel
yea, tell me how you cache your page right now
or havent you done anything in that direction?
Stony gallOP
@B33fb0n3 haven't done anything about cache
American Crow
Don't know of any built-in way of doing this. Guess you'll have to implement some rate-limit-avoiding function and call it in your generateStaticParams, e.g.:

```javascript
let lastFetch = new Date()

// Simple promise-based sleep helper (was missing from the original snippet)
const sleepMs = (ms) => new Promise((resolve) => setTimeout(resolve, ms))

export async function avoidRateLimit() {
    if (process.env.NEXT_PHASE === 'phase-production-build') {
        const sinceLastFetch = new Date().getTime() - lastFetch.getTime()
        if (sinceLastFetch < 5000) {
            // Wait out only the remainder of the 5-second window
            await sleepMs(5000 - sinceLastFetch)
        }
        lastFetch = new Date()
    }
}
```

With the NEXT_PHASE values being:
https://github.com/vercel/next.js/blob/5e6b008b561caf2710ab7be63320a3d549474a5b/packages/next/shared/lib/constants.ts#L19-L23

There is also an option to hook into it in the next.config.js:
```javascript
// @ts-check

const { PHASE_DEVELOPMENT_SERVER } = require('next/constants')

module.exports = (phase, { defaultConfig }) => {
  if (phase === PHASE_DEVELOPMENT_SERVER) {
    return {
      /* development-only config options here */
    }
  }

  return {
    /* config options for all phases except development here */
  }
}
```


And finally, here is a thread about it, though it's from 2022.
https://github.com/vercel/next.js/discussions/18550
@Stony gall @B33fb0n3 haven't done anything about cache
there are multiple options for how you can cache stuff (and avoid the rate limit):
1. During build: you build your page once and then it has the same data until the next build (or until the next revalidation)
2. After time: after some time the page will fetch new data, for example every 5 seconds. If you request your page and then refresh the browser there would normally be 2 calls, but with time-based revalidation you get the same page within those 5 seconds. So you avoided hitting the rate limit
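For option 2 in the App Router, time-based revalidation can be attached directly to the `fetch` call. A minimal sketch (the CMS endpoint URL is made up, and the 5-second window is an assumption to tune against your rate limit):

```javascript
// App Router: `next: { revalidate: 5 }` caches this fetch result for up to
// 5 seconds, so repeated renders within that window reuse the cached
// response instead of hitting the CMS again. The URL is a placeholder.
export async function getPosts() {
  const res = await fetch('https://cms.example.com/api/posts', {
    next: { revalidate: 5 },
  })
  if (!res.ok) throw new Error(`CMS responded with ${res.status}`)
  return res.json()
}
```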
Stony gallOP
@B33fb0n3 thanks but I think there's a misunderstanding. I'm talking about build-time only. The nextjs build script seems to call my CMS API too quickly (probably in parallel) and that triggers some rate limiting on the CMS server. I am asking whether it's possible to set some sort of throttle value (in ms or similar) or any other way to slow down these fetches
I only found this but it is old and seemingly only for pages router: https://github.com/vercel/next.js/discussions/18550
@Stony gall I have an API from a CMS that has rate limiting on. How can I slow down static site generation on building the nextjs app?
I would use this logic if I was you:

In the page, fetch data. If we get the data: great, we are done. If not, wait x milliseconds (the amount depends on the rate limiter, you should already have this information), then retry. Continue looping the retry until we get the data, or get a non-429 response.
Is it the best/fastest? Probably not. But it is definitely one of the simplest possible ways.
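That retry loop could be sketched roughly like this (the 1-second delay and the attempt cap are hypothetical knobs; `doFetch` is injectable so you can wrap whatever call you actually make to the CMS):

```javascript
// Retry a fetch-like call until it returns anything other than 429
// (Too Many Requests), waiting delayMs between attempts.
async function fetchWithRetry(doFetch, { delayMs = 1000, maxAttempts = 10 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const res = await doFetch()
    // 429 means rate limited: wait and retry. Any other status is final.
    if (res.status !== 429) return res
    await new Promise((resolve) => setTimeout(resolve, delayMs))
  }
  throw new Error(`Still rate limited after ${maxAttempts} attempts`)
}
```

Because `doFetch` is just a function, you can use this unchanged in `generateStaticParams` or in a page's data loader.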
Stony gallOP
I would really just like to be able to sequentially build each page / call the CMS API instead of Next.js making hundreds of parallel requests at the same time
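One way to get that sequential behavior yourself is to gather the CMS calls in one place (e.g. in `generateStaticParams` or a shared data module) and await them one at a time instead of firing them all with `Promise.all`. A minimal sketch, where `fetchOne` stands in for the real CMS call:

```javascript
// Fetch items strictly one after another: each request only starts after
// the previous one has finished, so the CMS never sees parallel calls.
async function fetchSequentially(ids, fetchOne) {
  const results = []
  for (const id of ids) {
    results.push(await fetchOne(id))
  }
  return results
}
```

Separately, some Next.js versions let you cap build parallelism via `experimental: { cpus: 1 }` in `next.config.js`, which reduces how many pages render concurrently at build time.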