Next.js Discord

Discord Forum

upload files with maximum size limit

Answered
Double-striped Thick-knee posted this in #help-forum
Open in Discord
Double-striped Thick-kneeOP
I have created a server action to upload images from the client. So far it works, but I'm worried about what will happen if someone tries to upload a big file. I want to reject the request in that case to prevent the server from lagging, but I can't figure out how to set a maximum size for uploads. (note: I tried uploading a 5 GB file and my whole PC lagged, so I think it's necessary to put a limit on it)

this is what my current code looks like
Answered by B33fb0n3
@Double-striped Thick-knee another solution would be for the client to upload the file directly to your file storage server. So instead of:
Client (with file) -> Next.js Server -> File Storage Server

it would be:
Client (with file) -> File Storage Server

That way, you won't hit any limit from Next.js itself
View full answer

81 Replies

Velvet ant
I guess you can use the `size` property on the `File` and return an error if it's bigger than your limit https://developer.mozilla.org/en-US/docs/Web/API/Blob/size

`size` is in bytes
Double-striped Thick-kneeOP
yeah I could do that, but then the whole file will already be loaded into memory.

this is what I used to do in a traditional Express app. Look at line 13: I'm pausing the request to stop the server from reading the file any further once the payload size is exceeded.
@Velvet ant maybe instead of using a server action, you could use a route handler and read `request.body`, which is a `ReadableStream`
Double-striped Thick-kneeOP
I have tried that, but the client keeps uploading the file, which creates a relative delay. I can't pause the upload even if the size is exceeded. Is there any way to pause it?
Velvet ant
it seems that you can use `const reader = request.body.getReader()` and cancel the reader with `reader.cancel()` @Double-striped Thick-knee
Double-striped Thick-kneeOP
look at the cpu usage, it will increase as the file size increases, @Velvet ant
Velvet ant
you need to iterate through the reader
for await (const chunk of reader) {
//...     
}
Why not just check client side?
Velvet ant
yes, but you still need a server check
Well, that's a silly question: it can be bypassed. But yeah
I was writing that 😄
Double-striped Thick-kneeOP
yeah that's the issue
Double-striped Thick-kneeOP
ts error, reader is not an array
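A `ReadableStreamDefaultReader` isn't iterable, which is why the `for await` over `reader` errors; a plain `read()` loop works and lets you cancel once a byte budget is exceeded. A minimal sketch (the `MAX_BYTES` value and `readWithLimit` name are illustrative, not Next.js API):

```typescript
// Sketch: read a request body stream chunk by chunk, counting bytes, and
// cancel the stream as soon as an assumed MAX_BYTES budget is exceeded.
const MAX_BYTES = 1024 * 1024; // 1 MiB, illustrative limit

async function readWithLimit(
  body: ReadableStream<Uint8Array>,
  maxBytes: number = MAX_BYTES,
): Promise<Uint8Array> {
  const reader = body.getReader();
  const chunks: Uint8Array[] = [];
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value!.length;
    if (total > maxBytes) {
      // Tell the source to stop producing data instead of buffering it all
      await reader.cancel("payload too large");
      throw new Error("Payload too large");
    }
    chunks.push(value!);
  }
  // Stitch the collected chunks back into one buffer
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.length;
  }
  return out;
}
```

In a route handler you would call it as `await readWithLimit(request.body!)` and return a 413 response when it throws.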
readStream.on('data', (chunk) => {
  totalBytes += chunk.length;
  if (totalBytes > MAX_SIZE) {
    readStream.destroy();
    res.status(413).json({ error: 'File too large' });
  }
});


I do something similar to this in Express... but not sure how it translates.
Not in front of a computer so I can't test anything.
A little different though: I don't care about bandwidth, so I'm just checking once I start reading the file.
That doesn't help
Hrmmm.
Yeah no, I get it.
That's why I said that doesn't help 🙂
I'm kind of thinking through this as I'm typing.
@Double-striped Thick-knee without size limit, someone can overwhelm the server with huge files
Double-striped Thick-kneeOP
I saw a lot of tutorials, but it looks like they don't care about it
Yeah, often overlooked. I know how to do what you're asking in Express, but Next.js is a different animal. I'm looking through some documentation for an answer
Double-striped Thick-kneeOP
true, if only they provided some functions to control the request flow
It's a hard thing to expose with a serverless environment, I would imagine.
Double-striped Thick-kneeOP
they did expose a lot though, but not the request object
I ran into the same kind of issues when I tried to implement streaming from the server side. It's not super straightforward; I had to dissect the Next.js AI package and figure out how it's doing it.
https://github.com/vercel/next.js/discussions/57973

This is somewhat relevant, but unanswered
Double-striped Thick-kneeOP
I tried bodySizeLimit, but it didn't work
I wonder if you could inspect in middleware?
Double-striped Thick-kneeOP
the default is 2mb
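For reference, Next.js does document a body-size knob for server actions; whether and where it applies depends on the Next version (it sat under `experimental` around Next 14). A sketch of the config, with an example value:

```typescript
// next.config.ts (next.config.js / .mjs on older versions)
// Assumption: Next.js 14+, where this option lives under `experimental`.
const nextConfig = {
  experimental: {
    serverActions: {
      // Action requests with a larger payload are rejected by Next itself.
      bodySizeLimit: "10mb", // example value, not a recommendation
    },
  },
};

export default nextConfig;
```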
Double-striped Thick-kneeOP
tried that 2h ago, lemme try again
Are you hosting through vercel?
Double-striped Thick-kneeOP
normally I do
Just so you know before going down this path: Vercel has a max server action request size built in.
Double-striped Thick-kneeOP
what about api routes
With Vercel there is a hard limit, let me find it.
Double-striped Thick-kneeOP
how are they limiting the size? can we do it ourselves?
If you're self-hosting, it depends on what you're using. I use nginx; there I can specify limits for any endpoint, because you're handling it at the network level.
I would open a discussion topic on GitHub, I've had some success with that in the past. There might still be a way; I might look at it later when I have some free time.
Double-striped Thick-kneeOP
lemme know if you find anything, until then I will proceed with anti-DDoS protection
@Double-striped Thick-knee does it perform well on self hosting
Yeah, I use it at the enterprise level: 100+ active users at a time (M-F) and probably 2000-3000 unique visits a day, all routing through nginx as a reverse proxy
http {
    client_max_body_size 10M;  # Default limit for all requests

    server {
        location / {
            # ... other configurations
        }

        location /specific_url {
            client_max_body_size 5M;  # Limit for /specific_url
            # ... other configurations for /specific_url
        }
    }
}


It's pretty easy to do with nginx.
@Double-striped Thick-knee another solution would be for the client to upload the file directly to your file storage server. So instead of:
Client (with file) -> Next.js Server -> File Storage Server

it would be:
Client (with file) -> File Storage Server

That way, you won't hit any limit from Next.js itself
Answer
Double-striped Thick-kneeOP
my formData looks like this,

title: string
content: string
file: File

so with the first approach I had to upload everything to the Next.js server, and the server would handle it all itself, i.e. upload the file and then store the rest of the data in the database
but what about the second approach? do I need to upload the File from the client and send the rest of the data to the Next.js server?
yes, you would upload the file via POST (or PUT) to your file storage server. You already have the file: File in your data, so that's easy for you. After you get the result that the file was uploaded successfully (check your POST (or PUT) request), you can insert it into your database.

As I am using S3, I only know the way for S3 [(see here)](https://aws.amazon.com/de/blogs/compute/uploading-to-amazon-s3-directly-from-a-web-or-mobile-application/)
For it, you get an upload URL (with a policy if needed) and then upload the body of the request directly to your file storage server
I'm curious about something: what if the second request gets blocked, intentionally or unintentionally?
then only the image will be uploaded, but the record won't be stored in the database
It's the same as if you did it all in one function (on the Next.js server): something still finishes first. So there is not much difference
@B33fb0n3 yes and you want to check if the file upload was successful before inserting a new record in your database
Double-striped Thick-kneeOP
since the file is getting uploaded from the client, how would I know?
sorry if my question sounds dumb
oh, when you start uploading by creating your request:
const form = new FormData();
form.append("file", file);
const res = await fetch(uploadUrl, {
  method: "POST",
  body: form,
});

You can then check if the request was successful or not:
if (res.status === 201) {
  // todo create db record
  // ...
}
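If it helps, the "check before insert" pattern above can be factored into a tiny helper (the names here are illustrative stand-ins, not a library API):

```typescript
// Illustrative helper: run the upload first, and only create the database
// record when the storage server reported success. Both callbacks are
// hypothetical stand-ins for your real upload / DB calls.
async function uploadThenRecord<T>(
  uploadFile: () => Promise<Response>,
  createRecord: () => Promise<T>,
): Promise<T | null> {
  const res = await uploadFile();
  if (!res.ok) return null; // upload failed or was blocked: skip the insert
  return createRecord();
}
```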
Double-striped Thick-kneeOP
and then I will post the title and content to the Next.js server, which will store them in the database. does that sound right @B33fb0n3
Double-striped Thick-kneeOP
but I'm concerned about it: what if someone intentionally blocks the second request from the client @B33fb0n3
@B33fb0n3 then there won't be any entry
Double-striped Thick-kneeOP
is there any way to prevent it
The upload request is created by the client, and the client is the one that checks whether it was successful (we took the Next.js server out of the path because of the limit). So I don't know a way
Double-striped Thick-kneeOP
alright thanks, maybe I have to use cron jobs to check if there are any images without entries
that's a possible way of doing cleanup. If you also want to find these orphaned files, you can check the creation date of the image and check whether there is a database entry created around the same time (maybe give it a 5-second window). That way you can find files that don't have any database record with them (but it will be very rare)
@Double-striped Thick-knee I was thinking of using a Redis queue and checking every hour for files with no entries and deleting them
Every hour?? Shouldn't the normal case be that every fetch request goes through? And only for the rare client where it doesn't work, you have this queue?
@Double-striped Thick-knee well, i'm worried about that one malicious client
of course you should use a policy for your upload URL. If you're using S3 upload URLs, you can add an upload limit. Normally only logged-in users can upload stuff (in most of my cases). So you have a user, and normally this user is somehow limited: maybe there are account limits, maybe there are file managers with limits, maybe ... So when creating the upload URL (with the policy), you add the line for how much the person can upload. Creating the upload URL also subtracts from the available limits. The file is uploaded directly after the upload URL is created (without further action from the client), so it will be uploaded fine.

If there is a malicious client that just wants to upload stuff to your storage, they will eventually hit that limit and fail to create more upload URLs.
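The "signed policy" idea above can be sketched without any particular storage provider. This is a hypothetical scheme (S3's real presigned POST policies differ in detail) where the server signs a max size and expiry into the upload token, and the storage endpoint verifies both before accepting data:

```typescript
import { createHmac } from "node:crypto";

// Hypothetical policy scheme: makeUploadPolicy/verifyUploadPolicy are
// illustrative names, not a real storage API. The secret stays server-side.
const SECRET = "server-side-secret"; // assumption: never shipped to the client

function makeUploadPolicy(maxBytes: number, expiresAt: number): string {
  const payload = `${maxBytes}.${expiresAt}`;
  const sig = createHmac("sha256", SECRET).update(payload).digest("hex");
  return `${payload}.${sig}`; // maxBytes.expiresAt.signature
}

function verifyUploadPolicy(policy: string, uploadSize: number, now: number): boolean {
  const [maxStr, expStr, sig] = policy.split(".");
  const expected = createHmac("sha256", SECRET)
    .update(`${maxStr}.${expStr}`)
    .digest("hex");
  if (sig !== expected) return false;     // tampered policy
  if (now > Number(expStr)) return false; // expired policy
  return uploadSize <= Number(maxStr);    // enforce the signed size limit
}
```

Because the limit is inside the signed token, a client can't grant itself a bigger upload by editing the policy; it can only ask the server for a new URL, where the account-level limits apply.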
@Double-striped Thick-knee oh, that looks more promising, thanks, this is definitely what I need
That sounds great. Can I ask you one last question? What file storage server are you using?
Double-striped Thick-kneeOP
I haven't chosen one yet; I was using Cloudinary for testing