Category "google-cloud-platform"

FTP to Google Storage

Files are uploaded daily to an FTP server, and I need those files in Google Cloud Storage. I don't want to bother the users who upload the files…
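
A minimal sketch of one common approach: a small scheduled job (for example on Compute Engine, Cloud Run, or Cloud Functions) that mirrors the FTP directory into a bucket. The host, credentials, and bucket name below are placeholders.

    import ftplib
    import io

    from google.cloud import storage

    FTP_HOST = "ftp.example.com"        # placeholder FTP host
    BUCKET_NAME = "my-target-bucket"    # placeholder bucket name

    def mirror_ftp_to_gcs():
        bucket = storage.Client().bucket(BUCKET_NAME)
        with ftplib.FTP(FTP_HOST) as ftp:
            ftp.login("user", "password")       # or anonymous login
            for name in ftp.nlst():             # files in the current FTP directory
                buf = io.BytesIO()
                ftp.retrbinary(f"RETR {name}", buf.write)
                buf.seek(0)
                # Upload the file bytes straight into the bucket.
                bucket.blob(name).upload_from_file(buf)

Run on a schedule (cron, Cloud Scheduler, etc.), this keeps the uploaders' workflow untouched.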

How to get the project name/ID from a Google Cloud Storage bucket?

I'm given a Google Cloud Storage bucket address (gs://some_bucket_name) to which I've already been granted read access. The bucket belongs to another project.
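
Assuming the granted role includes storage.buckets.get, the bucket metadata exposes the owning project's number (not its name); a short sketch with the Python client:

    from google.cloud import storage

    client = storage.Client()
    # get_bucket() fetches the bucket metadata; this needs storage.buckets.get.
    bucket = client.get_bucket("some_bucket_name")
    print(bucket.project_number)   # numeric project number of the owning project

Mapping that number to a project ID or name additionally requires Resource Manager access on the owning project.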

Celery task log using google-cloud-logging

I'm currently managing my API using Celery tasks and a Kubernetes cluster on Google Cloud Platform. Celery automatically logs the input and output of each task…
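
A hedged sketch of one way to route Celery's loggers through google-cloud-logging, assuming the worker runs with application default credentials on GKE:

    import logging

    import google.cloud.logging
    from google.cloud.logging.handlers import CloudLoggingHandler

    client = google.cloud.logging.Client()
    handler = CloudLoggingHandler(client, name="celery")

    # Attach the Cloud Logging handler to Celery's loggers so task
    # input/output logs end up in Cloud Logging instead of only stdout.
    logging.getLogger("celery").addHandler(handler)
    logging.getLogger("celery.task").addHandler(handler)

This is typically connected inside Celery's after_setup_task_logger signal so Celery's own logging setup doesn't override it.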

kubectl exec/logs on GKE returns "remote error: tls: internal error"

I'm currently getting errors when trying to exec into or get logs from my pods on my GKE cluster. $ kubectl logs <POD-NAME> Error from server: Get "https://<…

GCP - Cloud Function can't find Python package from Artifact Registry in the same project

I've been trying GCP's Artifact Registry, which is currently in alpha for Python packages. I authenticate via Keyring along with my service account, as…
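
For reference, one commonly cited requirements.txt layout for pulling a private package from an Artifact Registry Python repository during a build; the region, project, repository, and package names are placeholders:

    # requirements.txt -- placeholder region/project/repo/package names
    --extra-index-url https://us-central1-python.pkg.dev/my-project/my-python-repo/simple/
    my-private-package==1.0.0

The account performing the build (for Cloud Functions, the Cloud Build service account of the deploying project) also needs the Artifact Registry Reader role on the repository.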

Stage-level data is not returned for running BigQuery jobs through the Java BigQuery libraries

I am using the com.google.cloud.bigquery library to fetch job-level details. We have the following code snippet: Job job = getBigQuery(projectId, location)…
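
The question concerns the Java client, but for comparison, a sketch of how the Python client exposes stage-level statistics once BigQuery reports them (the job ID is hypothetical):

    from google.cloud import bigquery

    client = bigquery.Client()
    job = client.get_job("my-job-id", location="US")   # hypothetical job ID

    # query_plan is only populated once BigQuery reports execution stages;
    # for a still-running job it can legitimately be empty.
    for stage in job.query_plan or []:
        print(stage.name, stage.status, stage.records_read)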

Why does Google Cloud Storage freeze when I try to upload a large folder (2.5GB of images)?

After getting frustrated with Azure, I decided to try GCP. I wanted to try training a deep learning image classification model using GCP. To start off, I went to…
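
If the browser console is the bottleneck, a scripted upload is usually more reliable for a folder of this size; a minimal sketch with the Python client (folder and bucket names are placeholders):

    from pathlib import Path

    from google.cloud import storage

    bucket = storage.Client().bucket("my-training-data")   # placeholder bucket
    root = Path("images")                                   # local folder of images

    for path in root.rglob("*"):
        if path.is_file():
            # Preserve the folder structure as the object name.
            bucket.blob(str(path.relative_to(root.parent))).upload_from_filename(str(path))

gsutil -m cp -r does the same thing in parallel from the command line.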

Cannot add a private Python dependency to a Cloud Function

I am trying to deploy a Python Cloud Function on GCP, using a Python package pushed to the Artifact Registry of another project. I followed the instructions…

How do I get root access to an AI Platform notebook within GCP?

I am logged in as user jupyter in the terminal of my GCP AI Platform notebook instance. I just want to install a few things (which cannot be installed by pip) and I am unable…

How can I access the Schema from the QueryResponse when calling the getQueryResults method from my Java application?

I am using the google.cloud.bigquery library to create and execute a query using the bigquery.query() method. I want to fetch the Schema details from the response, but when…
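
For comparison, in the Python client the schema hangs off the result iterator rather than the query call itself; a short sketch:

    from google.cloud import bigquery

    client = bigquery.Client()
    query_job = client.query("SELECT 1 AS x")   # starts the query
    rows = query_job.result()                   # waits for completion

    # The schema is available on the result iterator once the query has run.
    for field in rows.schema:
        print(field.name, field.field_type)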

OAuth authentication in Apache Airflow (Google Cloud Composer)

I have successfully written an API in Python to read a Gmail message, extract the URL from the message, call the URL, and store a CSV file. However, when I deploy this in…
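
One common pattern in Composer is to avoid the interactive consent flow entirely and rebuild credentials from a stored refresh token; a hedged sketch, assuming the token and client secrets were obtained once locally and stored in an Airflow Variable (the Variable name is hypothetical):

    import json

    from airflow.models import Variable
    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    def read_gmail(**context):
        cfg = json.loads(Variable.get("gmail_oauth"))   # hypothetical Variable name
        creds = Credentials(
            token=None,
            refresh_token=cfg["refresh_token"],
            token_uri="https://oauth2.googleapis.com/token",
            client_id=cfg["client_id"],
            client_secret=cfg["client_secret"],
            scopes=["https://www.googleapis.com/auth/gmail.readonly"],
        )
        service = build("gmail", "v1", credentials=creds)
        # Fetch the most recent message as a smoke test.
        return service.users().messages().list(userId="me", maxResults=1).execute()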

Is Redis ReJSON module compatible with Google Memorystore?

Is the Redis ReJSON module compatible with Google Memorystore? I want to store JSON in Memorystore; can I use the ReJSON module?
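
Memorystore for Redis does not let you load Redis modules, so ReJSON (JSON.SET / JSON.GET) is not available; the usual workaround is to serialize JSON yourself, as in this sketch (the host IP is a placeholder):

    import json

    import redis

    r = redis.Redis(host="10.0.0.3", port=6379)   # placeholder Memorystore IP

    # Store and fetch JSON as a plain string value.
    r.set("user:1", json.dumps({"name": "Ada", "plan": "pro"}))
    user = json.loads(r.get("user:1"))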

Push GCP PubSub events to Cloud Datastore

I'm using GCP Pub/Sub topics to store specific events from my applications, and then have custom code, fired from topic subscriptions, process DB transactions…
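
A minimal sketch of the push-style wiring, assuming a Pub/Sub-triggered Cloud Function (first-generation event signature) that persists each event into Datastore; the kind and field names are placeholders:

    import base64
    import json

    from google.cloud import datastore

    client = datastore.Client()

    def handle_event(event, context):
        """Pub/Sub-triggered Cloud Function: persist the event as a Datastore entity."""
        payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))

        entity = datastore.Entity(key=client.key("Event"))   # placeholder kind
        entity.update({
            "event_id": context.event_id,
            "payload": json.dumps(payload),
        })
        client.put(entity)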

BigQuery keyword REMOTE is not supported

We are trying out REMOTE functions within BigQuery as per this guide. We created the CLOUD_RESOURCE connection using the following command: bq mk --connection --disp…
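
At the time, remote functions were a preview feature, so this error often just means the project or region isn't enabled for it; for reference, a hedged sketch of the DDL shape submitted through the Python client, with every name and endpoint a placeholder:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical remote function DDL; project, dataset, connection, and
    # endpoint are placeholders.
    ddl = """
    CREATE OR REPLACE FUNCTION `my-project.my_dataset.remote_add`(x FLOAT64)
    RETURNS FLOAT64
    REMOTE WITH CONNECTION `my-project.us.my-connection`
    OPTIONS (endpoint = 'https://us-central1-my-project.cloudfunctions.net/add')
    """
    client.query(ddl).result()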

Micronaut GCP global pubsub endpoint

I've searched the Micronaut docs for how to define a GCP topic endpoint globally, but with no luck. Right now, I have to repeat the config below in every…

GCP Vertex Pipelines - why does kfp.v2.dsl.Output as a function argument work without being provided?

Why does kfp.v2.dsl.Output as a function argument work without being provided? I am following the "Create and run ML pipelines with Vertex Pipelines!" Jupyter notebook example…
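
The short answer is that KFP injects Output artifacts at runtime rather than the caller passing them; a minimal component sketch:

    from kfp.v2.dsl import Dataset, Output, component

    @component
    def make_dataset(message: str, out_data: Output[Dataset]):
        # The pipeline backend allocates out_data (including out_data.path)
        # and passes it in; callers never supply this argument themselves.
        with open(out_data.path, "w") as f:
            f.write(message)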

How to use C# and Google.Cloud.Dialogflow.Cx.V3 to generate valid Google DialogFlow CX webhook response JSON

I've created a webhook using C# and ASP.NET Core in order to try to generate the webhook response for Dialogflow, but I'm really struggling with using Google.Cloud.Dialogflow.Cx.V3…

gcloud: The user does not have access to service account "default"

I am attempting to use an activated service account scoped to create and delete gcloud container clusters (k8s clusters), using the following commands: gcloud con…

Allow Public Read access on a GCS bucket?

I am trying to allow anonymous (or just from my application's domain) read access for files in my bucket. When trying to read the files I get an <Error>…
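
For reference, a sketch of granting allUsers object-read access with the Python client (the bucket name is a placeholder; with uniform bucket-level access enabled, per-object ACLs won't apply and this IAM route is the one to use):

    from google.cloud import storage

    bucket = storage.Client().bucket("my-public-assets")   # placeholder bucket name

    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        "role": "roles/storage.objectViewer",
        "members": {"allUsers"},
    })
    bucket.set_iam_policy(policy)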