gsutil cp command error, CommandException: No URLs matched:

Good day. I am not a developer, but I run simple gsutil commands to manage my Google Cloud Storage.

I ran into an issue when running the following command from the cmd:

gsutil -m cp -r gs://bucket/ .

Scenario 1: with most buckets this works just fine.

Scenario 2: there is one bucket where I get an error, and I really have no clue how this is possible.

The error I get is:

CommandException: No URLs matched: gs://content-music.tapgamez.com/

I am hoping someone can share their thoughts with me.

Thanks



Solution 1:[1]

One scenario where this error message appears is when the bucket you're attempting to recursively copy from contains no objects, e.g.:

$ gsutil mb gs://some-random-bucket-name
$ gsutil -m cp -r gs://some-random-bucket-name/ .
CommandException: No URLs matched: gs://some-random-bucket-name/
CommandException: 1 file/object could not be transferred.

The same issue, but for the rm command, is being tracked on GitHub: https://github.com/GoogleCloudPlatform/gsutil/issues/417
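A quick way to confirm this is the cause is to list the bucket before copying; an empty listing explains the error. The bucket name below is the one from the question, used as a placeholder:

```shell
# List the bucket's contents; no output means there is nothing for cp -r to match.
gsutil ls gs://content-music.tapgamez.com/

# Once the bucket contains at least one object (here, streamed from stdin
# as a placeholder), the recursive copy succeeds:
echo placeholder | gsutil cp - gs://content-music.tapgamez.com/placeholder.txt
gsutil -m cp -r gs://content-music.tapgamez.com/ .
```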

Solution 2:[2]

The gsutil rsync command doesn't seem to have this issue (it works fine even on empty buckets). Try it to see if it will do what you need. See the rsync docs.

gsutil rsync -r gs://mybucket1 gs://mybucket2
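For the questioner's use case (downloading a bucket to the current directory), the equivalent rsync invocation would be something like the following, assuming the bucket name from the question:

```shell
# Mirror the bucket into the current directory; unlike cp -r,
# rsync exits cleanly when the source bucket is empty.
gsutil -m rsync -r gs://content-music.tapgamez.com/ .
```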

Solution 3:[3]

I also faced a similar issue; I was making the following mistake:

Mistake - gsutil cp -r gs://<bucket-name>/src/main/resources/output/20220430 .

Correct - gsutil cp -r gs://<bucket-name>//src/main/resources/output/20220430 .

I was missing the extra '/' after the bucket name.

To get the exact name, you can select the object in the console and copy its URL from there.
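Alternatively, the exact object paths can be printed from the command line, so the URL can be copied verbatim (the bucket name is a placeholder):

```shell
# Recursively print the full gs:// URL of every object in the bucket:
gsutil ls -r gs://<bucket-name>/**
```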


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: mhouglum
Solution 2: Donald Duck
Solution 3: pritampanhale