Next.js Discord

Discord Forum

Function wrapped in cache(myfunc) is called multiple times on the same page load.

Answered
Transvaal lion posted this in #help-forum
Transvaal lionOP
Is this intentional?

So it's not the wrapped function that is deduplicated, only the HTTP request itself that the function makes internally with fetch? In that case, how do you typically monitor requests while developing?

Also am I correct in my understanding that the unstable_cache does in fact cache the entire wrapped function?
Answered by Jboncz
Yes, I follow his pattern mostly... I wasn't really worrying about caching too much early on because I'm working on a self-hosted internal network, so caching isn't really THAT important to me, but I started hitting costly fetches and concluded that I should cache regardless of the cost associated.

72 Replies

@iyxan23 heya you might want to take a look at this long thread: https://discord.com/channels/752553802359505017/1277820644297408568/1277820644297408568
Transvaal lionOP
Thanks for the link to the thread. Great read. I kind of know the fundamental differences, but now that we're going to tweak caching I need to know how to monitor whether it works correctly. I know React cache just dedupes requests within the same page load. But what I need to verify is: will the function wrapped in cache console.log multiple times (once for each time it's called), with only the outgoing request deduped?
I don't really use React.cache and unstable_cache that much, but from what I know about caching in general, the whole point is to avoid doing the same data fetch multiple times; so the function wrapped in cache should be expected to execute only once (in the span of either a page load or the whole app) and return the cached value when called multiple times.
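As a sanity check on that claim, here is a minimal, framework-free TypeScript sketch of request-scoped deduplication in the style of React's cache. The dedupe helper and Map store are illustrative stand-ins, not the real React API (React scopes its memo to a single server request; this toy is module-wide):

```typescript
// Toy stand-in for React's cache(): memoize the returned promise per argument.
function dedupe<R>(fn: (arg: string) => Promise<R>): (arg: string) => Promise<R> {
  const memo = new Map<string, Promise<R>>();
  return (arg: string) => {
    if (!memo.has(arg)) memo.set(arg, fn(arg)); // first call runs fn
    return memo.get(arg)!;                      // later calls reuse the same promise
  };
}

let executions = 0; // counts how often the wrapped body actually runs

const getTodo = dedupe(async (id: string) => {
  executions++; // a console.log here would fire once per unique argument
  return { id };
});

async function main(): Promise<number> {
  // three calls with the same argument collapse into one execution
  await Promise.all([getTodo("1"), getTodo("1"), getTodo("1")]);
  return executions;
}
```

Calling main() resolves to 1: the wrapped body runs once, so a console.log inside it would also print once, which bears on the monitoring question above.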
Transvaal lionOP
it seems like it's not deduping the function calls, only the outgoing requests they make
Transvaal lionOP
seems like using cache around the function that is wrapped in unstable cache is the best of both worlds
We have a winner. Yes, that is the case. The react cache specifically dedupes the fetch requests, if I'm not mistaken, which still allows you to mutate the data in different ways, while unstable_cache caches the actual entirety of the data returned by the function.
They both have their use cases... you COULD make a wrapper that combines cache and unstable_cache if it's a pattern you'd use often.
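The wrapper idea can be sketched with toy stand-ins in plain TypeScript. toyReactCache and toyUnstableCache below are illustrative placeholders, not the real APIs; in a Next.js app you would import cache from "react" and unstable_cache from "next/cache" instead:

```typescript
type Thunk<R> = () => Promise<R>;

// Stand-in for unstable_cache: a persistent keyed store shared across "requests".
const dataCache = new Map<string, unknown>();
function toyUnstableCache<R>(fn: Thunk<R>, key: string): Thunk<R> {
  return async () => {
    if (!dataCache.has(key)) dataCache.set(key, await fn()); // miss: run and store
    return dataCache.get(key) as R;                          // hit: serve stored value
  };
}

// Stand-in for React's cache: memoize within one render pass.
function toyReactCache<R>(fn: Thunk<R>): Thunk<R> {
  let memo: Promise<R> | undefined;
  return () => {
    if (!memo) memo = fn(); // first call runs the inner thunk
    return memo;            // later calls reuse the promise
  };
}

// The combined wrapper discussed above: dedupe per request, persist across requests.
function cachedFetcher<R>(fn: Thunk<R>, key: string): Thunk<R> {
  return toyReactCache(toyUnstableCache(fn, key));
}

let runs = 0;
const getAnswer = cachedFetcher(async () => {
  runs++; // observe how many times the expensive body actually executes
  return 42;
}, "answer");
```

With the real APIs the composition is the same shape: the react cache layer goes on the outside so repeated calls in one render collapse, and the unstable_cache layer underneath persists the result across requests.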
Transvaal lionOP
So I guess deduplication with cache is always gonna be my case. But why not always prefer unstable_cache over the fetch/request cache?
because the react 'cache' would still allow you to mutate the data returned from the fetch in different ways if you want.
I get that's not a very common thing to do... but I think that's the intent, reading between the lines.
Transvaal lionOP
I noticed from the logging that wrapping something in unstable_cache disables the fetch cache. So it's really important to use unstable_cache together with cache.
without wrapping it in the react cache, it wouldn't ever cache the fetch though, right?
You gave me a very good idea by echoing what I said previously; I'm going to make a wrapper for react cache and unstable_cache, seems like a good thing to have handy
Transvaal lionOP
why ever use the request cache (directly on the fetch request) when you can use unstable_cache instead, and save some computation along with it? Is it because unstable_cache is costly, memory-wise? Why would I ever not want to use it for cached data, and prefer simply caching the fetch request?
because you could still hit the react cache without hitting the unstable cache, technically... that's where the 'tags' come in.
I wouldn't call them costly, though, for the most part, unless you're pulling back MASSIVE datasets
I don't know all of the details, I won't claim to be able to give you a 100% answer... but if @riský is up to it when he's around, he may be able to provide a little more information... he uses the unstable_cache -> cache pattern a lot in emailthing
Transvaal lionOP
So this pattern should be a good default right?

import { unstable_cache } from 'next/cache';
import { cache } from 'react';

export const myTempFunction = unstable_cache(
    cache(async () => {
        console.log('myTempFunction');
        const response = await fetch('https://jsonplaceholder.typicode.com/todos/1');
        const json = await response.json();
        console.log(`${JSON.stringify(json, null, 2)}`);
        const res2 = await mySecondTempFunction();
        return json;
    }),
    ['todos2'],
    {
        revalidate: 60,
        tags: ['temp2'],
    },
);
Absolutely. That's what I do.
Transvaal lionOP
and then we can use "no-store" on the fetch itself
Transvaal lionOP
or do you use the same revalidation strategy?
Here are practical examples.
Transvaal lionOP
thanks!
ohh he dedupes it first, whereas I did it second
makes sense I guess 😄
Transvaal lionOP
Thanks so much. I now feel I know what to do!!
No problem man! Happy to help!
I know there probably isn't a single 'good' answer to mark here, but if you feel like the issue is resolved, make sure you mark an answer so the post closes properly 🙂
as it'll give the best context if someone finds this while searching the forum
Transvaal lionOP
Only thing I'm now thinking is whether to use risky's pattern of deduping first with cache and then unstable cache, or the other way around like I did 🙂 Seems like he has to make another function call inside (like 34 in your link, where he does )())
But perhaps I'm overthinking it now
Yeah you are 😉 pull it into your VS Code and use the syntax highlighting to see where each function call starts and ends
Transvaal lionOP
I am, however it actually seems clever to use unstable cache AFTER you've first wrapped your function in cache
so you can check stuff like if (!userId) return false;
Exactly!
The extra () at the end of the unstable_cache statement is actually intentional and serves an important purpose. Here's why it's there: the unstable_cache function doesn't directly return the cached value. Instead, it returns a new function that, when called, will either return the cached value or compute and cache a new value if needed.
If you look at the example in the Next.js docs, you set up the cache and then call it; this way he is doing it all inline and calling it immediately.
That's why the extra ().
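The shape being described, a function that returns a function, can be seen with a toy stand-in (plain TypeScript; toyUnstableCache is illustrative, not the real next/cache API):

```typescript
// Stand-in mirroring the shape of unstable_cache: takes an async function
// and returns a NEW function; nothing runs until that function is called.
function toyUnstableCache<R>(fn: () => Promise<R>): () => Promise<R> {
  let stored: Promise<R> | undefined;
  return () => {
    if (!stored) stored = fn(); // first call computes and stores
    return stored;              // later calls serve the stored promise
  };
}

const getTodo = toyUnstableCache(async () => ({ id: 1 }));
// getTodo is a function, not data: typeof getTodo === "function"

// The inline style from the thread: build the cached function and invoke it
// immediately, hence the extra () at the end of the expression.
async function demo() {
  return toyUnstableCache(async () => ({ id: 1 }))();
}
```

Without the trailing () you would be holding the returned function object; with it you get the resolved data.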
Transvaal lionOP
got it
😄
My implementation looked really messy and required like 3 separate declarations until I saw his pattern; it makes it much nicer
Transvaal lionOP
yeah... weird they don't just dedupe requests by default. Seems like a harmless thing to do
or just write in the docs that you usually want to wrap the cache() function around all your functions
but yes, great pattern to always follow
Yeah, it's one of those things that people wouldn't like... lol if they didn't follow the tags and stuff properly, things could get weird with the caching; better to let the user do it purposely, imo.
I'd rather have to opt in than have to opt out.
Transvaal lionOP
could a reason for the fetch cache be that when you merge, transform, or do other computation-heavy processing, if an error occurs and you've cached the entire function, you're stuck until revalidation occurs? Or does it automatically try to revalidate on error? Do you know?
It would not try to revalidate on error.
Transvaal lionOP
So right there's the big "con"
and why you sometimes don't want unstable_cache, but cache on the fetch instead
Yeah, but you could do fancy things with the tags to make errors invalidate the cache.
really getting into the gritty details though hahah.
the tags are additional caching 'identifiers'
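The tag mechanics can be sketched as a keyed store whose entries carry tags; invalidating a tag evicts every entry carrying it. This is a toy TypeScript model, not Next's implementation; in a real app you would call revalidateTag from "next/cache" rather than the hypothetical invalidateTag below:

```typescript
// Toy model of tagged cache entries (not Next's data cache implementation).
type Entry = { value: unknown; tags: string[] };
const store = new Map<string, Entry>();

function put(key: string, value: unknown, tags: string[]): void {
  store.set(key, { value, tags });
}

// Evict every entry carrying the tag, mirroring what revalidateTag triggers.
function invalidateTag(tag: string): void {
  for (const [key, entry] of store) {
    if (entry.tags.includes(tag)) store.delete(key);
  }
}

// The error-handling idea from the thread: if the wrapped work produced a bad
// result, invalidate its tag so the entry doesn't live until `revalidate` expires.
put("todos2", { ok: false }, ["temp2"]);
const cached = store.get("todos2");
if (cached && (cached.value as { ok: boolean }).ok === false) {
  invalidateTag("temp2"); // evict the bad entry immediately
}
```

This is the "fancy things with tags" idea: on a caught error, invalidate the function's own tag instead of waiting out the revalidate window.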
ahh shit here comes risky.
😂
@riský yeah he did
Nice, it looks good from my basic reading of things here then :)
Transvaal lionOP
@risky I adopted your pattern, thanks. Only thing I'm left with now is: in what situations would you opt out of unstable_cache and instead rely on the fetch cache?
I just blindly wrap it in both if I'm caching
:cool_doge:
Transvaal lionOP
noticed if you wrap in unstable_cache the request / fetch cache is disabled
and with the added complexity of unstable_cache for handling error scenarios, I guess there's something to really consider there as well, or you could potentially have data that is not revalidated as long as an error keeps occurring 😄
1. You should be able to still manually opt in
2. In next release fetch cache isn't auto opted in
2.... thank god.... @Transvaal lion like I said earlier, I'd rather opt in than opt out
Ikr
Transvaal lionOP
I've tried with force-cache as well. If the fetch is made within the unstable_cache, I can't cache the fetch request. Tried everything, so feeling quite confident about that one 😄
@Transvaal lion noticed if you wrap in unstable_cache the request / fetch cache is disabled
I just use react cache when deduping between generateMetadata and the page, when I don't need the actual data cache