Next.js Discord

Discord Forum

vercel and s3 bucket not working?

Answered
Waterman posted this in #help-forum
Hello, I have an admin panel where I upload images to my s3 bucket, and I am 99% sure that I have everything correct in my s3 bucket policy configuration: everything is allowed and it works perfectly in dev (localhost). But once I head to my deployed version on vercel on my own domain, it just doesn't upload the images. I don't know what goes wrong because I can't see any logs in production, right? So does anyone know about this issue?
Answered by linesofcode

69 Replies

You can see production logs in vercel
They remain for 1 hour
Just go to your deployment page in the vercel dashboard and click on the logs tab
Are you sure you added the aws env keys for s3?
And that they aren’t using a reserved env variable name
That vercel uses internally
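(For reference: Vercel reserves some environment variable names that the underlying runtime provides itself, for example AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, which is likely what is meant by a reserved name here. Below is a minimal sketch of a fail-fast check, assuming the S3_ACCESS_KEY and S3_SECRET_ACCESS_KEY names used in the code later in this thread, so a missing key shows up clearly in the Vercel logs.)

// Sketch only: verify the custom S3 env vars exist before using them.
// The names are taken from the route handler posted later in this thread.
const requiredEnv = ["S3_ACCESS_KEY", "S3_SECRET_ACCESS_KEY"];

for (const name of requiredEnv) {
  if (!process.env[name]) {
    // This console.error shows up in the Vercel function logs.
    console.error(`Missing environment variable: ${name}`);
    throw new Error(`Missing environment variable: ${name}`);
  }
}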
the only logs I get are the next/image error, because the image doesn't exist since it didn't get uploaded.
because I don't get any errors that have to do with aws s3 or the actual upload
it would make sense though if it was the env, but I don't really understand what you mean by a reserved env variable name
here is a screenshot of the env; the names are correct and the values are correct, right?
Yep that’s good
I really don't know what to do because it always works in dev, and I have a CORS configuration that allows everything
This is what I was referring to
so have I done something wrong?
Not from what I can tell
But it really depends on your code
you want me to share it?
The more info you can provide the better yeah
import { NextRequest, NextResponse } from "next/server";
import { PutObjectCommand, S3Client } from "@aws-sdk/client-s3";
import { getServerSession } from "next-auth";
import { authOptions } from "../auth/[...nextauth]/route";

export const POST = async (req: NextRequest) => {
  const session = await getServerSession(authOptions);

  if (!session) {
    throw "Not Admin";
  }

  const formData = await req.formData();

  const client = new S3Client({
    region: "eu-north-1",
    credentials: {
      accessKeyId: process.env.S3_ACCESS_KEY || "",
      secretAccessKey: process.env.S3_SECRET_ACCESS_KEY || "",
    },
  });

  const links = [];

  for (const file of formData.getAll("file")) {
    const blob = file as Blob; // Assert that file is a Blob.
    const ext = blob.type?.split("/").pop();
    const newFileName = `${Date.now()}-${Math.random()
      .toString(36)
      .substring(2, 15)}.${ext}`;

    const fileBuffer = Buffer.from(await blob.arrayBuffer()); // Use Buffer.from to convert ArrayBuffer to Buffer.
    const fileUint8Array = new Uint8Array(fileBuffer);

    client.send(
      new PutObjectCommand({
        Bucket: "nextjsleflyxbuckle",
        Key: newFileName,
        Body: fileUint8Array,
        ACL: "public-read",
        ContentType: blob.type,
      })
    );
    const link = `https://nextjsleflyxbuckle.s3.eu-north-1.amazonaws.com/${newFileName}`;
    links.push(link);
  }

  return NextResponse.json({ links });
};
just weird because it is working fine in dev and all the other things work correctly, it is just the upload part to s3 that I have issues with
Wrap all of that in a try/catch
And log the error
the for (const... part?
The contents of the POST function
Then deploy and open up the logs in vercel
And see what happens when you trigger it
is this correct?
import { NextRequest, NextResponse } from "next/server";
import { PutObjectCommand, S3Client } from "@aws-sdk/client-s3";
import { getServerSession } from "next-auth";
import { authOptions } from "../auth/[...nextauth]/route";

export const POST = async (req: NextRequest) => {
  // Declared outside the try block so the return below can still see it.
  const links: string[] = [];

  try {
    const session = await getServerSession(authOptions);

    if (!session) {
      throw "Not Admin";
    }

    const formData = await req.formData();

    const client = new S3Client({
      region: "eu-north-1",
      credentials: {
        accessKeyId: process.env.S3_ACCESS_KEY || "",
        secretAccessKey: process.env.S3_SECRET_ACCESS_KEY || "",
      },
    });

    for (const file of formData.getAll("file")) {
      const blob = file as Blob; // Assert that file is a Blob.
      const ext = blob.type?.split("/").pop();
      const newFileName = `${Date.now()}-${Math.random()
        .toString(36)
        .substring(2, 15)}.${ext}`;

      const fileBuffer = Buffer.from(await blob.arrayBuffer()); // Use Buffer.from to convert ArrayBuffer to Buffer.
      const fileUint8Array = new Uint8Array(fileBuffer);

      client.send(
        new PutObjectCommand({
          Bucket: "nextjsleflyxbuckle",
          Key: newFileName,
          Body: fileUint8Array,
          ACL: "public-read",
          ContentType: blob.type,
        })
      );
      const link = `https://nextjsleflyxbuckle.s3.eu-north-1.amazonaws.com/${newFileName}`;
      links.push(link);
    }
  } catch (er) {
    console.log(er);
  }

  return NextResponse.json({ links });
};
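(A hedged aside on the snippet above: the catch branch only logs, so the client still receives a 200 response with no links when an upload fails. One way to make failures visible to the admin panel as well is to return an error status from the catch, roughly like this.)

  } catch (er) {
    // Still log so the error shows up in the Vercel function logs...
    console.error(er);
    // ...but also tell the caller the upload failed instead of returning an empty 200.
    return NextResponse.json({ error: "Upload failed" }, { status: 500 });
  }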
(It may take a few seconds for logs to show up in vercel btw)
thank you a lot for your time, I will try that and write what happened
Also in case this helps you
Either as a drop-in package, or use the codebase to see examples of how to interact with s3
@Waterman I think I see the issue!!
client.send returns a promise
But you’re not awaiting it
In a serverless environment like vercel
oh but does that explain why it doesn't work in prod but does work in dev?
Any promises which have not completed by the time your function returns an http response will be flushed
This means that when deploying to vercel, you need to await everything
oh right, I don't fully understand, but I am very happy that you seem to be understanding it
Otherwise you’re creating a race condition
could you maybe try giving an example or modify the code a little?
In development this doesn’t happen because it’s not serverless
oh right, kinda makes sense
but is it an easy fix in the code?
One sec..
yes thank you
await client.send(
  new PutObjectCommand({
    Bucket: "nextjsleflyxbuckle",
    Key: newFileName,
    Body: fileUint8Array,
    ACL: "public-read",
    ContentType: blob.type,
  })
);
just add an await to wait for the promise to resolve
so all I need is an await??
oh really
I will try that thank you so much, one sec
Yep
just be sure to remember this for the future
it's really nuanced and a lot of devs get tricked by this
when deploying to a serverless environment
so await all promises and you'll be good
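(To make that concrete, here is a minimal sketch of the upload loop with every promise awaited before the response is built. It assumes the same imports, client, bucket name, and region as the handler above; Promise.all is just one option, awaiting client.send inside the original for loop works too.)

const uploads = formData.getAll("file").map(async (file) => {
  const blob = file as Blob;
  const ext = blob.type?.split("/").pop();
  const newFileName = `${Date.now()}-${Math.random()
    .toString(36)
    .substring(2, 15)}.${ext}`;
  const body = new Uint8Array(await blob.arrayBuffer());

  // Awaiting here (and via Promise.all below) guarantees the upload has
  // finished before the function returns its response.
  await client.send(
    new PutObjectCommand({
      Bucket: "nextjsleflyxbuckle",
      Key: newFileName,
      Body: body,
      ACL: "public-read",
      ContentType: blob.type,
    })
  );

  return `https://nextjsleflyxbuckle.s3.eu-north-1.amazonaws.com/${newFileName}`;
});

const links = await Promise.all(uploads);

return NextResponse.json({ links });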
I really didn't even know it was serverless haha
yeah, because await has saved me a couple of times, but when I wrote this s3 code I didn't understand what I did because I followed an old tutorial, so that's probably the reason
ah okay
it worked 😄
awesome
Thank you so much, I have been struggling with this for 3 weeks
haha
so every time I made a new product for the ecommerce website I had to run it in dev, etc.
😅
thank you
no problem, good luck!