Permission issues while running docker push
I'm trying to push my Docker image to Google Container Registry, but I get an error saying I do not have the needed permission to perform this operation.
I have already tried gcloud auth configure-docker, but it doesn't work for me.
I first build the image using: docker build -t gcr.io/trynew/hello-world-image:v1 .
Then I try to push it: docker push gcr.io/trynew/hello-world-image:v1
This is my output:
The push refers to repository [gcr.io/trynew/hello-world-image]
e62774cdb1c2: Preparing
0f6265b750f3: Preparing
f82351274ce3: Preparing
31a16430afc8: Preparing
67298499a3ed: Preparing
62d5f39c8fe4: Waiting
9f8566ee5135: Waiting
unauthorized: You don't have the needed permissions to perform this
operation, and you may have invalid credentials.
To authenticate your request, follow the steps in:
https://cloud.google.com/container-registry/docs/advanced-authentication
Solution 1:[1]
In order to push images to the private registry you need two things: the right API access scopes, and to authenticate your VM with the registry.
For the API access scopes, we can read in the official documentation (https://cloud.google.com/container-registry/docs/using-with-google-cloud-platform):
For GKE:
By default, new Google Kubernetes Engine clusters are created with read-only permissions for Storage buckets. To set the read-write storage scope when creating a Google Kubernetes Engine cluster, use the --scopes option.
For GCE:
By default, a Compute Engine VM has the read-only access scope configured for storage buckets. To push private Docker images, your instance must have read-write storage access scope configured as described in Access scopes.
So first, verify that your GKE cluster or GCE instance actually has the proper scopes set.
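As a rough sketch (the instance, cluster, and zone names below are placeholders), you can inspect and set the scopes with gcloud:
# check the scopes attached to an existing GCE VM
gcloud compute instances describe my-vm --zone us-central1-a --format="value(serviceAccounts[].scopes)"
# create a GCE VM or GKE cluster with the read-write storage scope
gcloud compute instances create my-vm --zone us-central1-a --scopes=storage-rw
gcloud container clusters create my-cluster --zone us-central1-a --scopes=storage-rw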
The next step is to authenticate to the registry:
a) If you are using a Linux based image, you need to use "gcloud auth configure-docker" (https://cloud.google.com/container-registry/docs/advanced-authentication).
b) For Container-Optimized OS (COS), the command is "docker-credential-gcr configure-docker" (https://cloud.google.com/container-optimized-os/docs/how-to/run-container-instance#accessing_private_google_container_registry).
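If you want to double-check that the helper was actually registered, a quick sketch (assuming the default config location) is to inspect ~/.docker/config.json after running the command; it should contain credHelpers entries for gcr.io and the regional hosts:
gcloud auth configure-docker
cat ~/.docker/config.json   # expect something like "credHelpers": { "gcr.io": "gcloud", ... }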
Solution 2:[2]
Google Cloud has specific documentation on how to grant permissions for docker push; I think that is the first thing you should look at: https://cloud.google.com/container-registry/docs/access-control
After checking that you have sufficient permissions, you should proceed with authentication with something like:
gcloud auth configure-docker
See more here: https://cloud.google.com/container-registry/docs/pushing-and-pulling
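For example, pushing to gcr.io ultimately needs write access to the registry's underlying Cloud Storage bucket. A hedged sketch of granting that to a service account (the account and project names below are placeholders):
# grant on the registry bucket only
gsutil iam ch serviceAccount:pusher@my-project.iam.gserviceaccount.com:roles/storage.admin gs://artifacts.my-project.appspot.com
# or grant at the project level
gcloud projects add-iam-policy-binding my-project --member="serviceAccount:pusher@my-project.iam.gserviceaccount.com" --role="roles/storage.admin"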
Solution 3:[3]
If you are running docker as root (i.e. with sudo docker), then make sure to configure the authentication as root. You can run, for example:
sudo -s
gcloud auth login
gcloud auth configure-docker
...that will create (or update) a file under /root/.docker/config.json.
(Are there any security implications of gcloud auth login as root? Let me know in the comments.)
Solution 4:[4]
As noted in https://stackoverflow.com/a/59799035/26283371, there appears to be a bug in the Linux version of the Cloud SDK where authentication fails using the standard authentication method (gcloud auth configure-docker). Instead, create a JSON keyfile per this, and that tends to work.
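For reference, a minimal sketch of the JSON-keyfile method (the service account and file name are placeholders):
gcloud iam service-accounts keys create keyfile.json --iam-account=pusher@my-project.iam.gserviceaccount.com
cat keyfile.json | docker login -u _json_key --password-stdin https://gcr.io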
Solution 5:[5]
Just in case anyone else is banging their head against a wall: my PIA VPN caused this behavior.
"unauthorized: You don't have the needed permissions to perform this operation, and you may have invalid credentials. To authenticate your request, follow the steps in: https://cloud.google.com/container-registry/docs/advanced-authentication"
Turn my VPN off and it works fine. Turn it back on and it breaks again.
Solution 6:[6]
Windows / PowerShell
I got this error on Windows when I was trying to run docker push from a normal PowerShell window, after authenticating in the Google Cloud shell that had opened when I installed the SDK.
The solution was simple:
Start a new PowerShell window to run docker push after running the gcloud auth configure-docker command.
Make sure you've activated the registry too:
gcloud services enable containerregistry.googleapis.com
Also, Google has a tendency to jump to a default account, which may or may not be the one you want. If you're opening any links in a browser, make sure you're in the correct Google account.
I'm not exactly sure what's going on yet because I'm brand new to Docker, but something got refreshed when starting a new PowerShell instance.
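If you suspect the wrong account is active, a quick sketch for checking and switching (the email below is a placeholder):
gcloud auth list
gcloud config set account you@example.com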
Solution 7:[7]
I still can't get the gcloud auth configure-docker helper to work. What did work was authenticating with an access token, like so:
gcloud auth print-access-token | docker login -u oauth2accesstoken --password-stdin https://HOSTNAME
where HOSTNAME is gcr.io, us.gcr.io, eu.gcr.io, or asia.gcr.io. (Be sure to include https://, otherwise it won't work.)
You can view the options for print-access-token here.
Solution 8:[8]
This is the only way that worked for me. I found it in a kubernetes/kompose GitHub issue.
- Remove the credsStore key in ~/.docker/config.json. This will force docker to write the auth into the json when you use docker login. You can't untick "Securely store Docker logins in macOS keychain" in Docker Desktop any more -- and the current credsStore is no longer the macOS keychain, it's "desktop". (A sketch of the resulting file follows below.)
- gcloud auth login -- auth with gcloud (just to be explicit)
- gcloud auth print-access-token | docker login -u oauth2accesstoken --password-stdin https://eu.gcr.io
You should see this:
WARNING! Your password will be stored unencrypted in /Users/andrew/.docker/config.json. Configure a credential helper to remove this warning. See https://docs.docker.com/engine/reference/commandline/login/#credentials-store
Login Succeeded
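For reference, a rough sketch of what ~/.docker/config.json might look like after removing the credsStore key and logging in (the auth value is just an illustrative placeholder):
cat ~/.docker/config.json
{
  "auths": {
    "https://eu.gcr.io": {
      "auth": "..."
    }
  }
}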
Source: https://github.com/kubernetes/kompose/issues/1043#issuecomment-609019141
Solution 9:[9]
The fix is as follows: run gcloud auth login (the browser will open and allow you to authenticate), then run gcloud auth configure-docker and select Y, then redo the push. It should work like a charm.
Solution 10:[10]
I also had the same issue in a Linux environment. So I just set Docker to run as a non-root user (https://docs.docker.com/engine/install/linux-postinstall/#manage-docker-as-a-non-root-user), and it works.
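The linked steps boil down to roughly this (a sketch; you may need to log out and back in, or run newgrp, for the group change to take effect):
sudo groupadd docker
sudo usermod -aG docker $USER
newgrp docker
gcloud auth configure-docker
docker push gcr.io/trynew/hello-world-image:v1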
Solution 11:[11]
In my case, the DOCKER_CONFIG env variable was defined with an invalid value (not pointing to a docker config json).
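A quick sketch for checking this (DOCKER_CONFIG, if set, should point to the directory that contains config.json; the default is ~/.docker):
echo $DOCKER_CONFIG
unset DOCKER_CONFIG   # fall back to the default ~/.docker
gcloud auth configure-docker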
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow