Next.js Discord

Discord Forum

Conflicting and unclear information about the unstable_cache

Answered
American Shorthair posted this in #help-forum
American ShorthairOP
So, I have two problems right now with unstable_cache. Obviously, the cache is very useful when dealing with computationally heavy server-side tasks.

Problem 1
Say the cache is in need of revalidation: if a ton of people query it at the same time while it is stale, it runs the function FOR ALL OF THEM. So if the function is computationally heavy, it ends up running once per request. Is there any way around this?

Problem 2
According to what I have been told in here, what the documentation would lead me to believe, and what #gpt-help has told me, if the cache is stale, the stale data will be served and the function will run in the background. This is not what I am observing at all. When the cache is stale, it runs the entire thing again and the client has to wait until it finishes to get the latest data, which is not the behavior I want. Is there any way to get it to serve the stale cached data and run the function in the background for next time?

Thanks!
Answered by American Shorthair
Just manually implement a cache, honestly. For both issues. It's not a Next.js tooling solution, but at least you can get the cache to do what you want.

https://github.com/vercel/next.js/discussions/75462#discussioncomment-12003240

9 Replies

American ShorthairOP
If need be I can post these as separate threads, but just solving problem 2 might alleviate the need to solve problem 1.
American ShorthairOP
This is just yikes... so basically, if you are using caching to prevent overuse of an API, you'd better hope a ton of people don't all load the page at nearly the same time, because it absolutely will not batch them; it's going to revalidate individually for each one. It just seems like such a glaring oversight. I can throttle the hell out of my own server's 7950X3D by refreshing the page a bunch of times in a row on what should be cached.

This is just plain awful. The documentation itself recognizes that you will be using this for computationally expensive tasks. How did no one think this would be an issue?
American ShorthairOP
Just manually implement a cache, honestly. For both issues. It's not a Next.js tooling solution, but at least you can get the cache to do what you want.

https://github.com/vercel/next.js/discussions/75462#discussioncomment-12003240
Answer
Paper wasp
@American Shorthair I know it is late to respond to this. If my understanding is correct, Next.js on Vercel partly solves this problem through its CDN. This is also something that OpenNext does.

Basically, when a background regeneration is in progress, the server returns the stale page to the CDN with s-maxage=2 (seconds), and the CDN keeps serving it until the regeneration is done.

Sadly, it is not possible to do this on your own in middleware or anything like that, since you do not have access to the cache HIT or MISS information 😦
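In plain HTTP terms, the mechanism described above boils down to varying the Cache-Control header depending on whether the body being served is stale. A hedged sketch (the helper name and TTL values are illustrative, and this only instructs a CDN sitting in front of you; it does not replicate Vercel's internal HIT/MISS handling):

```typescript
// Build a Cache-Control value for a CDN: while serving a stale body, hand the
// CDN a very short s-maxage (2s, as described above) so it keeps answering
// from its edge cache until regeneration finishes; fresh bodies get a longer
// window. The 300s fresh window is an arbitrary example value.
function cacheControlFor(isStale: boolean): string {
  return isStale
    ? "s-maxage=2, stale-while-revalidate=60"
    : "s-maxage=300, stale-while-revalidate=60";
}

// In a (hypothetical) Next.js Route Handler you would attach it like:
//   return Response.json(data, {
//     headers: { "Cache-Control": cacheControlFor(isStale) },
//   });
```

The catch Paper wasp points out still applies: without HIT/MISS visibility you cannot compute isStale yourself for Next.js's built-in cache, so this only works where you control the cache.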
American ShorthairOP
Hey, thanks for the response. I am actually in the process of solving this, just not through the built-in unstable_cache.

I ended up just recreating a really simple cache, but the difference is that it will always revalidate in the background and serve stale data. It will also only run one request at a time: if multiple requests are made at once, it returns the same promise to all of them.

I plan on releasing it on npm when I am done.
It's for something that is cached server side and is very computationally heavy, so unstable_cache just wasn't going to work for what I was doing.
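The cache described above (serve stale immediately, revalidate in the background, deduplicate concurrent callers) can be sketched in a few lines. This is a minimal illustration of the pattern, not the npm package the OP mentions; all names are made up:

```typescript
// Stale-while-revalidate cache with in-flight deduplication.
type Entry<T> = { value: T; expiresAt: number };

function createSwrCache<T>(fn: () => Promise<T>, ttlMs: number) {
  let entry: Entry<T> | undefined;
  let inflight: Promise<T> | undefined;

  const refresh = (): Promise<T> => {
    // Only one revalidation runs at a time; concurrent callers reuse it,
    // which addresses problem 1 (the thundering herd on a stale cache).
    if (!inflight) {
      inflight = fn()
        .then((value) => {
          entry = { value, expiresAt: Date.now() + ttlMs };
          return value;
        })
        .finally(() => {
          inflight = undefined;
        });
    }
    return inflight;
  };

  return async (): Promise<T> => {
    if (entry) {
      // Stale: kick off a background refresh but answer immediately with the
      // cached value, which addresses problem 2.
      if (Date.now() > entry.expiresAt) void refresh();
      return entry.value;
    }
    // Cold start: everyone awaits the same promise.
    return refresh();
  };
}
```

Usage: wrap the expensive function once (e.g. const getStats = createSwrCache(computeStats, 60_000)) and call getStats() per request; only the very first request, or a background refresh, ever runs the underlying function.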
Paper wasp
If you want to work on this, don't make the same mistake Next.js did with unstable_cache: make it so you can use the request result to tag the cache. They solved this with "use cache", but for now we cannot use it.
Paper wasp
When you cache content, you want to be able to tag by the params but also by the result. A typical use, for example: you fetch a CMS and receive your page data, which is composed of several entries. You want to add the entry IDs to your cache tags, so that when entry X is invalidated, your page is invalidated as well.

But you didn't know entry X was part of the page before you got the response.
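The "tag by result" idea can be sketched as follows (all names here are hypothetical, and the in-memory maps stand in for a real cache store). The key move is that tags are derived after the fetch, from the response itself:

```typescript
// A page fetched from a CMS is composed of several entries.
type CmsPage = { slug: string; entries: { id: string }[] };

const store = new Map<string, CmsPage>();          // cache key -> page
const tagIndex = new Map<string, Set<string>>();   // tag -> cache keys

async function cachePage(
  slug: string,
  fetchPage: (slug: string) => Promise<CmsPage>,
): Promise<CmsPage> {
  const cached = store.get(slug);
  if (cached) return cached;
  const page = await fetchPage(slug);
  store.set(slug, page);
  // Tag by the *result*: every entry id found in the response becomes a tag
  // pointing back at this page, even though we didn't know the ids up front.
  for (const entry of page.entries) {
    const keys = tagIndex.get(entry.id) ?? new Set<string>();
    keys.add(slug);
    tagIndex.set(entry.id, keys);
  }
  return page;
}

function invalidateTag(tag: string): void {
  // Invalidating entry X evicts every cached page that contained it.
  for (const key of tagIndex.get(tag) ?? []) store.delete(key);
  tagIndex.delete(tag);
}
```

This is exactly what unstable_cache cannot express, since its tags are fixed before the wrapped function runs.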