Custom domain name with SSL on Firebase Storage
I was able to get a custom domain name mapped to my Firebase Storage bucket by simply naming the bucket the same as my domain and then pointing a CNAME record to c.storage.googleapis.com. However, HTTPS doesn't work because the common name on the certificate doesn't match my domain. Is it possible for me to upload a certificate or, even better, have GCP or Firebase manage a certificate?
Solution 1:[1]
Currently we don't support custom domains in Cloud Storage for Firebase.
You have two options:
- Use Firebase Hosting (developer generated content)
- Set this up via GCS static hosting (docs)
In either case though, you'll lose the ability to use the Firebase SDKs for Cloud Storage, as well as its authentication and authorization functionality.
Happy to learn more about the use case to see if it's something we should support in the future.
Solution 2:[2]
I'm coming a bit late to the party and this question might have been answered elsewhere. However, since this was the first result I found when googling for this feature, here goes nothing:
For starters, let's say you have a CNAME like assets.somedomain.com pointing to c.storage.googleapis.com, and you create a bucket called assets.somedomain.com.
Then you upload a file, whose public URL will look like:
https://firebasestorage.googleapis.com/v0/b/assets.somedomain.com/o/arduino.png?alt=media&token=asdf
Which can be seen as:
firebasestorage.googleapis.com/v0/b/ + assets.somedomain.com + /o/ + arduino.png?alt=media&token=asdf
You should be able to view said file using:
https://assets.somedomain.com/arduino.png?alt=media&token=asdf
Which is:
assets.somedomain.com/ + arduino.png?alt=media&token=asdf
(basically, you strip the original base URL and the /o/ prefix)
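To make that mapping concrete, here is a small JavaScript sketch of the rewrite. The helper name and the nested-path handling are my own additions, not part of the original answer:

// Sketch: turn a Firebase Storage download URL into the custom-domain form,
// assuming the bucket is named after the domain (assets.somedomain.com).
function toCustomDomainUrl(downloadUrl) {
  const url = new URL(downloadUrl);
  // The path looks like /v0/b/<bucket>/o/<encoded object path>
  const parts = url.pathname.split('/'); // ['', 'v0', 'b', '<bucket>', 'o', '<object>']
  const bucket = parts[3];
  const object = decodeURIComponent(parts[5]); // nested object paths are percent-encoded
  return `https://${bucket}/${object}${url.search}`;
}

// toCustomDomainUrl('https://firebasestorage.googleapis.com/v0/b/assets.somedomain.com/o/arduino.png?alt=media&token=asdf')
// => 'https://assets.somedomain.com/arduino.png?alt=media&token=asdf'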
But of course you get a big fat warning telling you the certificate is invalid, because it's meant for *.storage.googleapis.com.
In my case, I was able to circumvent this using Cloudflare's Universal SSL, which acts like a proxy that asks no questions whatsoever.
You try again, but somewhere in the middle the request becomes anonymous and you get an XML stating that you lack the storage.objects.get permission.
<Error>
  <Code>AccessDenied</Code>
  <Message>Access denied.</Message>
  <Details>
    Anonymous users does not have storage.objects.get access to object.
  </Details>
</Error>
This means that even with the token included in the query string, the proxied request has no permission. The next step, then, is to make the bucket publicly readable in Google Cloud Console -> Storage.
(This can be done using the gcloud CLI, but I found this method easier to explain.)
Pay attention to use the legacy object reader permission, which stops visitors from actually listing the bucket contents.
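The answer uses the Cloud Console UI for this step; if you would rather script it, a rough Node.js sketch with the @google-cloud/storage client could look like the following. The role binding follows the "legacy object reader" suggestion above, but treat the exact role name as an assumption and verify it against your project:

// Sketch: grant read-only object access to allUsers on the bucket,
// using the legacy object reader role so bucket listing stays disabled.
const { Storage } = require('@google-cloud/storage');

async function makeObjectsPubliclyReadable(bucketName) {
  const bucket = new Storage().bucket(bucketName);
  const [policy] = await bucket.iam.getPolicy({ requestedPolicyVersion: 3 });
  policy.bindings.push({
    role: 'roles/storage.legacyObjectReader',
    members: ['allUsers'],
  });
  await bucket.iam.setPolicy(policy);
}

makeObjectsPubliclyReadable('assets.somedomain.com').catch(console.error);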
After that, you should be able to access the image using:
https://assets.somedomain.com/arduino.png
Note that you don't even need to include "alt=media", because Cloudflare will serve the file itself instead of its metadata.
Solution 3:[3]
Update April 2021
Firebase 8.4.0 introduces storage().useEmulator(host, port).
You'll still need a reverse proxy, which you can do with Google Cloud Load Balancer or others.
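For reference, hooking the v8 SDK up to a different Storage host looks roughly like this; localhost:9199 is just the default Storage emulator address and is an assumption here:

import firebase from "firebase/app";
import "firebase/storage";

firebase.initializeApp({ /* your project config */ });

// Available from SDK 8.4.0: send Storage SDK traffic to the given host and port
// instead of firebasestorage.googleapis.com.
firebase.storage().useEmulator("localhost", 9199);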
Solution 4:[4]
I fully agree with the previous answer, and thanks a lot for it. But let me write the instructions out step by step:
- Create a bucket with your custom domain name in Google Cloud Platform -> Storage.
- Grant the legacy object viewer permission to allUsers. Note: you have to search for "legacy object viewer" in the role filter text.
- Add a DNS record at your domain service provider with the CNAME assets pointing to c.storage.googleapis.com.
- Create a Cloudflare account if you do not have one.
- Add your website in Cloudflare, entering your domain name, not the subdomain. Copy the nameserver details from Cloudflare into your DNS provider's nameserver settings.
- It will take some time for all the DNS records to move over to Cloudflare.
- Go to Page Rules in Cloudflare, add assets.yourdomain.com, and turn on "Always Use HTTPS".
- You are done.
Solution 5:[5]
It's actually quite simple to achieve what you need, i.e. to serve your Storage content under your custom domain with SSL support. But you'd take a slightly different approach.
I solved it this way.
- You can create a dedicated endpoint which redirects to a cloud function.
- That endpoint would accept a storage url as a parameter.
- Then the cloud function would read the url and just stream its content back.
That's it.
No need for complex proxies setup etc. All your content will now be served under your custom domain.
Here is a brief example of what the core logic of such a function might look like:
const https = require('https');

(req, res) => {
  const link = req.query.url;
  if (!link) {
    // No storage URL provided
    return res.status(400).send('Missing "url" query parameter');
  }
  // Request the content of the link
  https.get(link, (response) => {
    if (response.statusCode < 200 || response.statusCode > 299) {
      // Handle error: propagate the upstream status and discard the body
      res.status(response.statusCode || 500).end();
      response.resume();
    } else {
      // Stream the content back to the client
      response.pipe(res);
      res.on('close', () => response.destroy());
    }
  });
}
Now you can do something like this (assuming your function is hosted under 'storage/content'):
https://my-custom-domain.com/storage/content?url={{put your storage url that refers to a file}}
Opening such a link in a browser will display your file content (or download it, depending on the browser's settings).
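If you deploy with Cloud Functions for Firebase, exposing that handler could look roughly like the sketch below. The export name storageContent and the Hosting rewrite mentioned in the comment are assumptions of mine, not part of the original answer:

const functions = require('firebase-functions');
const https = require('https');

// Streams the file referenced by ?url=... back to the caller.
exports.storageContent = functions.https.onRequest((req, res) => {
  const link = req.query.url;
  if (!link) {
    return res.status(400).send('Missing "url" query parameter');
  }
  https.get(link, (response) => {
    if (response.statusCode < 200 || response.statusCode > 299) {
      // Propagate the upstream error status and discard the body
      res.status(response.statusCode || 500).end();
      response.resume();
    } else {
      // Stream the content back to the client
      response.pipe(res);
      res.on('close', () => response.destroy());
    }
  });
});

// A Firebase Hosting rewrite in firebase.json (from /storage/content to this
// function) would then expose it under the custom domain path shown above.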
I'll post a more detailed explanation with examples if this answer gets more attention.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source
---|---
Solution 1 | Mike McDonald
Solution 2 | ffflabs
Solution 3 | Brian M. Hunt
Solution 4 | biswarup bannerjee
Solution 5 |