Connect storage to external AWS S3 bucket
Link
https://github.com/supabase/storage-api
Describe the problem
We want to help everybody who wants to host their own storage module. In our case, we need to connect a k8s self-hosted Supabase deployment to an AWS S3 bucket. While browsing the documentation, we found a link on how to create an IAM user for access through the aws-cli or aws-sdk, but nothing about how to configure the storage container itself.
We took a look at the code, searching for something like AWS_ACCESS_ID or SECRET_ID, but nothing came up. After that, we understood that we have to "upload" a credentials file to the container in order to authenticate aws-sdk calls.
And finally, we found the solution!
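For context (this explanation is ours, not from the original report): the aws-sdk used by storage-api resolves credentials through its default provider chain, which is why grepping the code for custom variables turns up nothing. The chain checks the standard AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables first, then falls back to the shared credentials file at ~/.aws/credentials. A minimal TypeScript sketch of that behavior, assuming the v3 JavaScript SDK (@aws-sdk/client-s3); the region is a placeholder:

```ts
import { S3Client, ListBucketsCommand } from "@aws-sdk/client-s3";

// No credentials are passed here: the SDK's default provider chain
// resolves them on its own, checking env vars first and then the
// shared credentials file at ~/.aws/credentials
// (/root/.aws/credentials when the container runs as root).
const client = new S3Client({ region: "us-east-1" }); // placeholder region

async function checkCredentials(): Promise<void> {
  const { Buckets } = await client.send(new ListBucketsCommand({}));
  console.log(
    "Buckets visible with the resolved credentials:",
    Buckets?.map((b) => b.Name),
  );
}

checkCredentials().catch((err) => {
  console.error("Credential resolution failed:", err);
});
```

Running something like this inside the storage container, with no explicit credentials configured, is a quick way to check whether the chain can find anything.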
Describe the improvement
You have to save a credentials file in the storage container so the aws-sdk can authenticate: upload a file to /root/.aws/credentials with the following contents:
```ini
[default]
aws_access_key_id = xxxxxxxxxxxxxxxxxxxxx
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```
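As a hedged follow-up sketch (again ours, not part of the original issue): if the container's HOME is not /root, the SDK also honors the AWS_SHARED_CREDENTIALS_FILE environment variable, or you can point a client at the file explicitly with fromIni from @aws-sdk/credential-providers. The path and profile below are the ones named in this issue; the region is a placeholder:

```ts
import { S3Client } from "@aws-sdk/client-s3";
import { fromIni } from "@aws-sdk/credential-providers";

// Explicitly load the [default] profile from the credentials file,
// instead of relying on the SDK finding it via HOME.
const client = new S3Client({
  region: "us-east-1", // placeholder region
  credentials: fromIni({
    profile: "default", // the [default] section shown above
    filepath: "/root/.aws/credentials", // path used in this issue
  }),
});
```

In a Kubernetes deployment like the one described here, the usual pattern is to store the file in a Secret and mount it at /root/.aws, but verify the mount path against your own pod spec.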
Additional context
Comments
Awesome! Didn’t expect to get any sort of response. The new Docs site does look great!
We rebuilt the docs website from the ground up recently. Appreciate the feedback, will share with the team that the self-hosting docs aren't discoverable in the new site.
We encourage folks to self-host if the hosted platform doesn't meet their requirements. We recently added storage to the self-hosted dashboard, and even new features like image transformations are already available in the self-hosted version. In fact, there are features like rate limiting and webhook support for storage where the platform version lags behind the self-hosted version.