Ways to minimize ISR cost from bots?
Answered
German Shorthaired Pointer posted this in #help-forum
German Shorthaired PointerOP
I have a site with lots of pages (400+) but low traffic. Bots like Google and Bing hit pages a lot, which quickly increases ISR reads/writes costs. Any ideas on how to minimize this without blocking bots and hurting SEO?
Answered by B33fb0n3
you can render your pages with SSR instead. Google, Bing, and the other crawlers can still index your content, and since you have low traffic, speed isn't that critical (and SSR is still fast)
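In the Next.js App Router, a minimal sketch of that switch might look like this (the `app/products/[id]/page.tsx` path and `fetchProduct` helper are placeholders for illustration, not from the thread):

```typescript
// app/products/[id]/page.tsx  (hypothetical page path)

// ISR (the current setup): pages are cached and revalidated,
// so every bot hit can trigger ISR cache reads/writes.
// export const revalidate = 3600

// SSR instead: render on every request, skipping the ISR cache entirely.
export const dynamic = "force-dynamic"

export default async function Page({ params }: { params: { id: string } }) {
  // fetchProduct is a stand-in for whatever data loading the site does
  const product = await fetchProduct(params.id)
  return <h1>{product.name}</h1>
}
```

The trade-off is exactly what's raised below: each request now counts as a function invocation instead of an ISR read/write.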
4 Replies
German Shorthaired PointerOP
Wouldn’t this just add to edge request/serverless function cost instead?
yeah, some counter will always tick up somewhere. For a "low traffic" website, 1 million requests should be fine. IMO by the time you hit 1M invocations, you have enough customers to cover the bill. If not: optimize your website
@German Shorthaired Pointer solved?