Category "google-bigquery"

Google CLI does not create Transfer Service and shows no detailed error

I'm very new to the Google Cloud CLI, so sorry for the dumb question, but it's really annoying. I'm trying to execute this command: bq mk --transfer_config --target_dat

I'm trying to use either a sumif or case clause to sum up the values in a data set

I have the total amount expected to be saved, the total amount saved, the principal amount expected to be saved, and the principal amount saved. Now I'm trying t
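
A minimal sketch of the conditional-sum pattern in BigQuery Standard SQL, assuming a hypothetical table with `amount_saved`, `amount_expected`, and a `category` column separating principal from the rest (the real schema isn't shown above):

    -- SUM over an IF that zeroes out non-matching rows is the closest thing to a spreadsheet SUMIF
    SELECT
      SUM(IF(category = 'principal', amount_saved, 0)) AS principal_saved,
      -- The same idea expressed with CASE
      SUM(CASE WHEN category = 'principal' THEN amount_expected ELSE 0 END) AS principal_expected,
      SUM(amount_saved)    AS total_saved,
      SUM(amount_expected) AS total_expected
    FROM `project.dataset.savings`;

The SUM(IF(...)) and SUM(CASE ... END) forms are interchangeable here; pick whichever reads better for the actual columns.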

BigQuery - turn columns to array

I have this table: I'm looking for this table: I have searched for array functions (array_agg) and it didn't work as expected. How can I approach this task? W
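
Since the input and output tables aren't shown, here are two hedged sketches with made-up table and column names: an array literal built from several columns on the same row, and ARRAY_AGG collapsing values from many rows into one array per group.

    -- Build an array literal from columns on the same row (the columns must share a type)
    SELECT id, [col_a, col_b, col_c] AS values_array
    FROM `project.dataset.source_table`;

    -- Or aggregate a column's values across rows into one array per group
    SELECT id, ARRAY_AGG(value ORDER BY value) AS values_array
    FROM `project.dataset.source_table`
    GROUP BY id;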

BigQuery BI Engine is Missing US-Central1

The Docs for the BigQuery BI Engine say "BI Engine is supported in the same regions as BigQuery" (https://cloud.google.com/bigquery/docs/bi-engine-intro). Also

Transfer from Adwords to Google Ads API in Big Query

We are using the BigQuery Data Transfer Service that is based on the AdWords API, but we're missing some of the campaigns. If we write a custom transfer for Goo

BigQuery - unsupported subquery with table in join predicate

Recently I started to work with BigQuery and there's something that still confuses me. What's the alternative for this query in BigQuery? select a.abc, c.x
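
The full query is cut off above, but the usual workaround for an "unsupported subquery" error in a join predicate is to materialize the subquery as a WITH clause (or derived table) and join to it on a plain equality key. A hedged sketch with made-up table and column names:

    -- Pull the subquery out of the join condition into a CTE, then join on a simple key
    WITH filtered_c AS (
      SELECT abc_id, x
      FROM `project.dataset.c`
      WHERE x IN (SELECT x FROM `project.dataset.allowed_values`)
    )
    SELECT a.abc, c.x
    FROM `project.dataset.a` AS a
    JOIN filtered_c AS c
      ON a.id = c.abc_id;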

Why does my table join return values I didn't specify?

I am attempting to join two tables to create a visualization that shows the relationship between weight, BMI, and total steps using the following code: SELECT
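
The SELECT above is truncated, but unexpected rows in a join are most often fan-out from a non-unique join key. A hedged sketch (table and column names are assumptions) that joins on both the user id and the date, so each activity row pairs with at most one weight row:

    SELECT
      a.id,
      a.activity_date,
      a.total_steps,
      w.weight_kg,
      w.bmi
    FROM `project.dataset.daily_activity` AS a
    JOIN `project.dataset.weight_log` AS w
      ON a.id = w.id
     AND a.activity_date = w.log_date;  -- joining on id alone crosses every activity day with every weigh-in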

Power BI: Add bigquery service account key file

I am trying to add a service account key file to access a BigQuery data set in Power BI. I am selecting service account login during the setup process. I added

how to generate n rows based on a value in a column in Big Query? [duplicate]

I have the following table. I need to transform this input as you can see in the output example below. If you have any ideas, please share. Thank you
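
One common approach is GENERATE_ARRAY plus UNNEST, sketched here with a hypothetical `input_table` whose column `n` holds the number of rows to emit per input row:

    -- Each input row is repeated n times; copy_number runs from 1 to n
    SELECT
      t.*,
      copy_number
    FROM `project.dataset.input_table` AS t
    CROSS JOIN UNNEST(GENERATE_ARRAY(1, t.n)) AS copy_number;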

BigQueryInsertJobOperator dryRun is returning success instead of failure on composer (airflow)

When using BigQueryInsertJobOperator and setting the configuration to perform a dry run on a faulty .sql file or a hardcoded query, the task succeeds even though

BigQuery Lead/Lag Analytical Function

I have a table like below: I want to sort the port and value and then apply a LEAD function on eventDateTime like below: I'm able to sort the port and value tog
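
A hedged sketch of the LEAD call, assuming a hypothetical `events` table and that "sort the port and value" means partitioning by those two columns and ordering by eventDateTime within each partition:

    SELECT
      port,
      value,
      eventDateTime,
      -- next event timestamp within the same (port, value) group
      LEAD(eventDateTime) OVER (
        PARTITION BY port, value
        ORDER BY eventDateTime
      ) AS next_eventDateTime
    FROM `project.dataset.events`
    ORDER BY port, value, eventDateTime;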

Documentation on __TABLES__ and __TABLES_SUMMARY__ and other double underscore metadata (bigquery)

I'm diving into metadata available on datasets and tables in BigQuery. There is enough documentation on INFORMATION_SCHEMA: https://cloud.google.com/bigquery/doc
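
For reference, `__TABLES__` can be queried like any other table within a dataset; a small sketch (project and dataset names are placeholders):

    SELECT
      table_id,
      row_count,
      size_bytes,
      TIMESTAMP_MILLIS(creation_time)      AS created,        -- times come back as epoch milliseconds
      TIMESTAMP_MILLIS(last_modified_time) AS last_modified
    FROM `project.dataset.__TABLES__`;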

How to fix 'User does not have permission to query table XYZ.' in BigQuery?

I want to do a BQ query via the bq command. Here is my command: bq query --application_default_credential_file $GOOGLE_APPLICATION_CREDENTIALS --nouse_

Is there any reason or advantage to specify string column length in BigQuery?

I am new to BQ and experienced with OLTP RDBMSs. I found that the data in BQ for my company is mostly of STRING type, while it was VARCHAR(255) or even less in the OLTP
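
For what it's worth, BigQuery DDL does accept a parameterized STRING length; the limit is enforced when data is written but does not change how the column is stored or billed. A small sketch with placeholder names:

    CREATE TABLE `project.dataset.customers` (
      name  STRING(255),  -- writes longer than 255 characters are rejected
      notes STRING        -- unconstrained; storage and cost are the same either way
    );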

How do you extract a specific field from a JSON array in Big Query?

I currently have a JSON array that looks like this in Big Query: [{"name":"","username":null},{"name":"Jimmy Dean","username":"iamjc"},{"name":"Ben Simmons","us
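
A hedged sketch of one way to do this, assuming the array sits in a STRING column (here called `json_column`, a made-up name): unnest it with JSON_EXTRACT_ARRAY, then pull scalars out of each element with JSON_EXTRACT_SCALAR.

    SELECT
      JSON_EXTRACT_SCALAR(person, '$.name')     AS name,
      JSON_EXTRACT_SCALAR(person, '$.username') AS username
    FROM `project.dataset.source_table`,
         UNNEST(JSON_EXTRACT_ARRAY(json_column, '$')) AS person
    WHERE JSON_EXTRACT_SCALAR(person, '$.username') IS NOT NULL;  -- drop entries with no username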

BigQuery query review for session based attribution

I've been struggling for a while to get a query to return the number of sessions and users per source/campaign/medium/content, based on the current session of t
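
The query under review isn't shown above, so this is only a baseline sketch of grouping sessions and users by traffic-source fields, assuming the standard GA 360 BigQuery export schema rather than the question's full attribution logic:

    SELECT
      trafficSource.source,
      trafficSource.medium,
      trafficSource.campaign,
      trafficSource.adContent AS content,
      -- a session is the combination of visitor id and visit id in the GA export
      COUNT(DISTINCT CONCAT(fullVisitorId, '-', CAST(visitId AS STRING))) AS sessions,
      COUNT(DISTINCT fullVisitorId) AS users
    FROM `project.dataset.ga_sessions_*`
    GROUP BY 1, 2, 3, 4;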

GCP Cloud Function to write data to BigQuery runs with success but data doesn't appear in BigQuery table

I am running the following cloud function. It runs with success and indicates data was loaded to the table. But when I query BigQuery, no data has been added

Save BigQuery results to array

I have a query that looks like this: SELECT ids FROM `table_name`. The result set is as follows:

| ids |
|-----|
| 1   |
| 2   |
| 3   |

I need to save this res
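
Two hedged sketches for collapsing that column into an array, using the `table_name` from the excerpt and assuming `ids` is an INT64 column:

    -- In BigQuery scripting, capture the column as an ARRAY variable for later statements
    DECLARE id_list ARRAY<INT64>;
    SET id_list = (SELECT ARRAY_AGG(ids) FROM `table_name`);

    -- Or simply return the whole column as a single one-row ARRAY value
    SELECT ARRAY_AGG(ids) AS ids_array
    FROM `table_name`;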

Issues streaming data from Pub/Sub into BigQuery using Dataflow and Apache Beam (Python)

Currently I am facing issues getting my Beam pipeline running on Dataflow to write data from Pub/Sub into BigQuery. I've looked through the various steps and al