Error "No URLs matched" when copying Google Cloud bucket data to my local computer
I am trying to download a folder which is inside my Google Cloud bucket. I read the Google docs on gsutil/commands/cp and executed the line below.
gsutil cp -r appengine.googleapis.com gs://my-bucket
But I am getting the error:
CommandException: No URLs matched: appengine.googleapis.com
Edit
Running the command below
gsutil cp -r gs://logsnotimelimit .
gives the error:
IOError: [Errno 22] invalid mode ('ab') or filename: u'.\logsnotimelimit\appengine.googleapis.com\nginx.request\2018\03\14\14:00:00_14:59:59_S0.json_.gstmp'
Solution 1:[1]
Just wanted to help people out if they run into this problem on Windows. As administrator:
- Open C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\platform\gsutil\gslib\utils
- Delete copy_helper.pyc
- Change the permissions of copy_helper.py to allow writing
- Open copy_helper.py
- Go to the function _GetDownloadFile
- On line 2312 (at time of writing), change the line
download_file_name = _GetDownloadTempFileName(dst_url)
to (the objective is to remove the colons):
download_file_name = _GetDownloadTempFileName(dst_url).replace(':', '-')
- Go to the function _ValidateAndCompleteDownload
- On line 3184 (at time of writing), change the line
final_file_name = dst_url.object_name
to (the objective is to remove the colons):
final_file_name = dst_url.object_name.replace(':', '-')
- Save the file and rerun the gsutil command
FYI, I was using the command
gsutil -m cp -r gs://my-bucket/* .
to download all my logs, whose names by default contain colons (from their timestamps), which does not bode well for Windows files!
Hope this helps someone. I know it's a somewhat hacky solution, but since you never need (and should never have) colons in Windows filenames, it's fine to do and forget. Just remember that if you update the Google Cloud SDK you'll have to redo this.
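Both patched lines boil down to the same transformation: stripping colons out of the destination file name before it reaches the Windows filesystem. A minimal standalone sketch of that transformation (the sanitize_for_windows helper is my own name for it, not part of gsutil):

```python
# Sketch of the colon-stripping that the patch applies inside gsutil.
# sanitize_for_windows is a hypothetical helper, not a gsutil API.
def sanitize_for_windows(name):
    # ':' is reserved on Windows (drive letters, NTFS alternate streams),
    # so replace each occurrence with '-', exactly as the patched lines do.
    return name.replace(':', '-')

# The temp-file name from the question's IOError:
tmp = r'appengine.googleapis.com\nginx.request\2018\03\14\14:00:00_14:59:59_S0.json_.gstmp'
print(sanitize_for_windows(tmp))
# -> appengine.googleapis.com\nginx.request\2018\03\14\14-00-00_14-59-59_S0.json_.gstmp
```

Only the timestamp colons change; backslashes and everything else pass through untouched.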
Solution 2:[2]
I got the same issue and resolved it as below.
- Open a cloud shell, and copy objects by using gsutil command.
gsutil -m cp -r gs://[some bucket]/[object] .
- On the shell, zip those objects by using zip command.
zip [some file name].zip -r [some name of your specific folder]
- On the shell, copy the zip file into GCS by using gsutil command.
gsutil cp [some file name].zip gs://[some bucket]
- On a Windows Command Prompt, copy the zip file in GCS by using gsutil command.
gsutil cp gs://[some bucket]/[some file name].zip .
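Put together, the round trip looks like the sketch below (bucket, object, and file names are placeholders from the steps above; run the first three commands in Cloud Shell and the last one in the Windows Command Prompt). The point of the zip step is that the colon-bearing object names stay inside the archive and never have to exist as Windows file names:

```shell
# In Cloud Shell: download the objects, zip them, re-upload the archive.
gsutil -m cp -r gs://[some bucket]/[object] .
zip -r [some file name].zip [object]
gsutil cp [some file name].zip gs://[some bucket]

# On the Windows Command Prompt: fetch the single zip file.
gsutil cp gs://[some bucket]/[some file name].zip .
```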
I hope this information helps someone.
Solution 3:[3]
This is also gsutil's way of saying "file not found". The mention of URLs is just confusing in the context of local files.
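That is exactly what happened in the question: the command treated appengine.googleapis.com as a local source path, which does not exist, so nothing matched. To download, the bucket URL must be the source and the local directory the destination (assuming the folder lives under gs://my-bucket; swap in your actual bucket and folder names):

```shell
# Fails: appengine.googleapis.com is read as a local path that doesn't exist.
gsutil cp -r appengine.googleapis.com gs://my-bucket

# Downloads: bucket URL as source, current directory as destination.
gsutil cp -r gs://my-bucket/appengine.googleapis.com .
```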
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source
---|---
Solution 1 | Eugene
Solution 2 |
Solution 3 | huoneusto