Signed URLs for upload


Feature request

Is your feature request related to a problem? Please describe.

At Labelflow, we developed a tool to upload images to our Supabase storage, built on a single Next.js API route. The goal is to abstract the storage backend away from the client side: the client queries a generic upload route to upload any file, which also eases permission management. On the server side, a service-role Supabase client performs the actual upload. We use next-auth to secure the route (and to manage authentication in the app in general).

Client-side upload looks like this:

await fetch("https://labelflow.ai/api/upload/[key-in-supabase]", {
  method: "PUT",
  body: file,
});

The server-side API route looks more or less like this (omitting the permission management part):

import { createClient } from "@supabase/supabase-js";
import nextConnect from "next-connect";

const apiRoute = nextConnect({});
// Service-role client: created server-side only, never exposed to the browser
const client = createClient(
  process?.env?.SUPABASE_API_URL as string,
  process?.env?.SUPABASE_API_KEY as string
);
const bucket = "labelflow-images";

apiRoute.put(async (req, res) => {
  const key = (req.query.id as string[]).join("/");
  // `req.file` is populated by an upload middleware (e.g. multer), not shown here
  const { file } = req;
  const { error } = await client.storage.from(bucket).upload(key, file.buffer, {
    contentType: file.mimetype,
    upsert: false,
    cacheControl: "public, max-age=31536000, immutable",
  });
  if (error) return res.status(404).end();
  return res.status(200).end();
});

export default apiRoute;

The problem is that we face a serious limitation on upload size: we deploy on Vercel, which doesn’t allow serverless functions to handle request bodies larger than about 5 MB. Since the upload request carries the image bytes from the client through the server, we’re likely to hit that limit quite often.
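Note that this is a platform cap rather than a Next.js one. The usual first attempt, raising the route’s own body-parser limit as sketched below (the 10mb value is purely illustrative), doesn’t help, because the file still has to flow through the serverless function:

// Next.js API route config: raises Next's own body-size limit for this
// route (the Next.js default is 1mb). Vercel's platform-level request
// payload cap still applies regardless of this setting.
export const config = {
  api: {
    bodyParser: {
      sizeLimit: "10mb", // illustrative value only
    },
  },
};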

Describe the solution you’d like

As we don’t want to manipulate Supabase clients on the client side, we think the ideal solution would be to let clients upload directly to Supabase using a signed upload URL. The upload route above would then take only a key as input and return a signed URL to upload to.

Client-side upload would now be in two steps:

// Get a Supabase signed URL
const { signedUrl } = await (
  await fetch("https://labelflow.ai/api/upload/[key-in-supabase]", {
    method: "GET",
  })
).json();

// Upload the file directly to Supabase
await fetch(signedUrl, {
  method: "PUT",
  body: file,
});

And our API route would look more or less like this:

import { createClient } from "@supabase/supabase-js";
import nextConnect from "next-connect";

const apiRoute = nextConnect({});
const client = createClient(
  process?.env?.SUPABASE_API_URL as string,
  process?.env?.SUPABASE_API_KEY as string
);
const bucket = "labelflow-images";

apiRoute.get(async (req, res) => {
  const key = (req.query.id as string[]).join("/");
  const { signedURL } = await client.storage
    .from(bucket)
    .createUploadSignedUrl(key, 3600); // <= this is the missing feature

  if (signedURL) {
    res.setHeader("Content-Type", "application/json");
    return res.status(200).json({ signedURL });
  }

  return res.status(404).end();
});

export default apiRoute;
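As an aside for readers finding this later: supabase-js v2 eventually shipped methods along these lines (createSignedUploadUrl on the server, uploadToSignedUrl on the client). A minimal sketch of the two-step flow under that API follows; the environment-variable names and helper functions are hypothetical, and the exact shapes should be checked against current docs rather than treated as drop-in code:

import { createClient } from "@supabase/supabase-js";

const bucket = "labelflow-images";

// Server side: mint a signed upload URL with the service-role client
async function getSignedUpload(key: string) {
  const admin = createClient(
    process.env.SUPABASE_API_URL as string,
    process.env.SUPABASE_API_KEY as string // service-role key, server-only
  );
  const { data, error } = await admin.storage
    .from(bucket)
    .createSignedUploadUrl(key);
  if (error || !data) throw error;
  return data; // { signedUrl, token, path }
}

// Client side: upload straight to storage with the anon key, so the file
// bytes never pass through our own serverless function
async function uploadDirect(key: string, token: string, file: Blob) {
  const anon = createClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL as string, // hypothetical public env vars
    process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY as string
  );
  return anon.storage.from(bucket).uploadToSignedUrl(key, token, file);
}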

Describe alternatives you’ve considered

I described them in our related issue.

Additional context

We’re happy to work on developing this feature at Labelflow if you think this is the best option!

Issue Analytics

  • State: open
  • Created: 2 years ago
  • Reactions: 23
  • Comments: 10 (1 by maintainers)

Top GitHub Comments

9 reactions
fenos commented, Aug 25, 2022

Hello! Apologies for the late reply,

I really like the idea of a signed URL for upload. I will add this to the backlog for discovery & prioritization

3 reactions
riderx commented, Dec 14, 2022

@fenos thanks for that. For my part, I no longer need the feature: I was able to do the API key check with RLS.

If you want to do it too:

First, create key_mode, an enum for the API key type:

CREATE TYPE "public"."key_mode" AS ENUM (
    'read',
    'write',
    'all',
    'upload'
);

Then create the table:

CREATE TABLE "public"."apikeys" (
    "id" bigint NOT NULL,
    "created_at" timestamp with time zone DEFAULT "now"(),
    "user_id" "uuid" NOT NULL,
    "key" character varying NOT NULL,
    "mode" "public"."key_mode" NOT NULL,
    "updated_at" timestamp with time zone DEFAULT "now"()
);

Then create the Postgres function:

CREATE OR REPLACE FUNCTION public.is_allowed_apikey(apikey text, keymode key_mode[])
 RETURNS boolean
 LANGUAGE plpgsql
 SECURITY DEFINER
AS $function$
BEGIN
  RETURN (SELECT EXISTS (
    SELECT 1
    FROM apikeys
    WHERE key = apikey
    AND mode = ANY(keymode)
  ));
END;
$function$;
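A quick way to sanity-check the function (a sketch, assuming the apikeys table is populated; the connection values are hypothetical) is to call it directly through supabase-js’s rpc helper:

import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL as string, // hypothetical env-var names
  process.env.SUPABASE_ANON_KEY as string
);

// Calls public.is_allowed_apikey directly; `allowed` is true when the key
// exists with one of the requested modes
const { data: allowed, error } = await supabase.rpc("is_allowed_apikey", {
  apikey: "my-test-key",
  keymode: ["all", "write"],
});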

Then add an RLS policy on each table you want to grant access to, using this expression as the policy check:

is_allowed_apikey(((current_setting('request.headers'::text, true))::json ->> 'apikey'::text), '{all,write}'::key_mode[])

In SDK v1, you can attach your API key like this:

const supabase = createClient(hostSupa, supaAnon, {
  headers: {
    apikey: apikey,
  },
});

In SDK v2:

const supabase = createClient(hostSupa, supaAnon, {
  global: {
    headers: {
      apikey: apikey,
    },
  },
});
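With either version, every request made through that client carries the apikey header, so the policy above can vet it. For example, an upload that only succeeds when the key has a matching mode (the bucket, path, and file variable are hypothetical):

// `file` is assumed to be a Blob/File obtained elsewhere
const { error } = await supabase.storage
  .from("my-bucket") // hypothetical bucket guarded by a policy using is_allowed_apikey
  .upload("some/path.png", file);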
