Category "google-bigquery"

GBQ Execute Immediate into a CTE

I am building an application/script for users who do not have write access to the database. Normally I would use Execute Immediate and save that result into a
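
One pattern that often comes up for this (a sketch, not verified against the asker's exact setup; all table and column names below are placeholders): inside a multi-statement script, EXECUTE IMMEDIATE can materialize its result into a TEMP table, which later statements can then read much like a CTE, without needing write access to a permanent dataset.

```python
# Sketch: a BigQuery script in which EXECUTE IMMEDIATE creates a TEMP table
# that later statements can query. Names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

script = """
DECLARE filter_value STRING DEFAULT 'some_value';

-- EXECUTE IMMEDIATE runs dynamically built SQL; here it creates a TEMP table,
-- which lives only for the duration of the script (no permanent dataset needed).
EXECUTE IMMEDIATE FORMAT(
  "CREATE TEMP TABLE tmp_result AS SELECT * FROM `my_project.my_dataset.my_table` WHERE col = '%s'",
  filter_value);

-- Read the temp table the way a CTE would be read.
SELECT COUNT(*) AS row_count FROM tmp_result;
"""

for row in client.query(script).result():
    print(dict(row))
```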

TensorFlow SavedModel output output had no dimensions when loading a model with BigQuery ML

I'm trying to use BigQuery ML to load a saved TensorFlow model to make predictions. However, when I run the query that reads the saved model in GCS, I get the foll
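
For reference, the general shape of the import-and-predict statements looks like the sketch below (bucket path, dataset and model names are placeholders; this shows the pattern only, not a fix for the missing-dimensions error, which typically points at the exported model's output signature).

```python
# Sketch: importing a TensorFlow SavedModel into BigQuery ML and predicting with it.
# Paths and names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

create_model = """
CREATE OR REPLACE MODEL `my_dataset.imported_tf_model`
OPTIONS (model_type = 'TENSORFLOW',
         model_path = 'gs://my_bucket/exported_saved_model/*');
"""
client.query(create_model).result()

predict = """
SELECT *
FROM ML.PREDICT(MODEL `my_dataset.imported_tf_model`,
                (SELECT * FROM `my_dataset.input_table`));
"""
for row in client.query(predict).result():
    print(dict(row))
```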

BigQuery standard SQL: how to group by an ARRAY field

My table has two columns, id and a. Column id contains a number, column a contains an array of strings. I want to count the number of unique id values for a given arra
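
A common workaround (a sketch using the column names from the question): BigQuery can't GROUP BY an ARRAY column directly, but it can group by a canonical string form of the array such as TO_JSON_STRING(a).

```python
# Sketch: grouping by an ARRAY column via its JSON string representation.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  TO_JSON_STRING(a) AS a_key,       -- canonical string form of the array
  COUNT(DISTINCT id) AS unique_ids  -- distinct ids per array value
FROM `my_project.my_dataset.my_table`
GROUP BY a_key
"""
for row in client.query(sql).result():
    print(row.a_key, row.unique_ids)
```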

Discrepancy in Daily user engagement between Firebase Analytics dashboard and BigQuery

I'm trying to develop a query against Firebase Analytics data linked to BigQuery to reproduce the "Daily user engagement" graph from the Firebase Analytics dash
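
For comparison, a sketch of how such a query is often written, assuming the current events_* export schema with user_pseudo_id and a user_engagement event (the dashboard can still differ because of time zones and reporting delay; the dataset name is a placeholder):

```python
# Sketch: daily count of engaged users from a Firebase/GA4 BigQuery export.
# Assumes the events_* export schema; names may differ per project.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  event_date,
  COUNT(DISTINCT user_pseudo_id) AS engaged_users
FROM `my_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
  AND event_name = 'user_engagement'
GROUP BY event_date
ORDER BY event_date
"""
for row in client.query(sql).result():
    print(row.event_date, row.engaged_users)
```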

How to avoid double counting while using SQL

I have a transaction table that contains date, transaction_id and category (it is a clothing sales table). It looks like this: date transaction_id category
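
Without the rest of the question it is a guess, but the usual cure for double counting in a table of this shape is COUNT(DISTINCT transaction_id), e.g. (the table name is a placeholder):

```python
# Sketch: counting each transaction once per date/category, using the
# columns named in the question (date, transaction_id, category).
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  date,
  category,
  COUNT(DISTINCT transaction_id) AS transactions  -- avoids double counting rows
FROM `my_project.my_dataset.transactions`
GROUP BY date, category
ORDER BY date, category
"""
for row in client.query(sql).result():
    print(dict(row))
```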

BigQuery - loading CSV with newlines fails with Node.js, but works with gsutil

I'm trying to load a CSV from Cloud Storage into BigQuery with a Cloud Function. The file has newlines in some of the strings. When I load from Cloud Shell it load
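
The question uses the Node.js client, but the relevant load option is the same idea in every client: quoted newlines must be explicitly allowed (allowQuotedNewlines in Node.js). A sketch of the equivalent configuration in Python, with placeholder bucket and table names:

```python
# Sketch: loading a CSV that contains newlines inside quoted fields.
# The key setting is allow_quoted_newlines.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    allow_quoted_newlines=True,  # without this, embedded newlines break the load
)

load_job = client.load_table_from_uri(
    "gs://my_bucket/my_file.csv",
    "my_project.my_dataset.my_table",
    job_config=job_config,
)
load_job.result()  # wait for completion
print(client.get_table("my_project.my_dataset.my_table").num_rows, "rows loaded")
```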

How do I find yesterday's date

I'm stuck on a query regarding dates in BigQuery using SQL. I have a table that consists of customer_id (int), date_purchase (date), sales (int). The query is
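
A sketch of the usual expression, assuming calendar yesterday relative to the query's run date, with the column names from the question and a placeholder table name:

```python
# Sketch: filtering a table to yesterday's purchases.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT customer_id, date_purchase, sales
FROM `my_project.my_dataset.purchases`
WHERE date_purchase = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
"""
for row in client.query(sql).result():
    print(dict(row))
```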

Is there any way we can load BigTable data into BigQuery?

I want to load BigTable data into BigQuery directly. Until now I have been loading BigTable data into a CSV file using Python and then loading the CSV file into BigQue

UsageError: Line magic function `%%bigquery` not found

I am very new to BigQuery. I am trying to load data from a BigQuery table into a pandas DataFrame. I followed the syntax given in the documentation here. Unfortun
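
The usual cause of this error is that the BigQuery magics extension isn't loaded, and %%bigquery is a cell magic, so it must be the very first line of its own cell. A sketch of the notebook usage (the query runs against a public dataset):

```python
# Cell 1: register the BigQuery magics (shipped with google-cloud-bigquery).
%load_ext google.cloud.bigquery

# Cell 2: %%bigquery is a *cell* magic, so it must be the first line of its own
# cell; the query result is stored in a pandas DataFrame named df.
%%bigquery df
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_2013`
GROUP BY name
ORDER BY total DESC
LIMIT 10
```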

Table expiration in GCS to BQ Airflow task

I am copying a CSV into a new BQ table using the GCSToBigQueryOperator task in Airflow. Is there a way to add a table expiration to this table within this task?
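
GCSToBigQueryOperator does not obviously expose an expiration setting (not verified for every provider version), so one workaround is a small downstream task, e.g. a PythonOperator, that sets the table's expires property with the BigQuery client. A sketch of that callable, with a placeholder table id:

```python
# Sketch: setting an expiration time on an existing BigQuery table, e.g. from a
# task that runs right after the GCSToBigQueryOperator load.
from datetime import datetime, timedelta, timezone

from google.cloud import bigquery


def set_table_expiration(table_id: str = "my_project.my_dataset.my_table",
                         days: int = 7) -> None:
    client = bigquery.Client()
    table = client.get_table(table_id)
    # expires takes an absolute timestamp, not a TTL.
    table.expires = datetime.now(timezone.utc) + timedelta(days=days)
    client.update_table(table, ["expires"])  # patch only the expires field


if __name__ == "__main__":
    set_table_expiration()
```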

Access Denied: BigQuery BigQuery: Permission denied while opening file

When trying to perform a simple query in BigQuery, I am getting this error: Access Denied: BigQuery BigQuery: Permission denied while opening file. I am using a

Can we check if a table in BigQuery is locked or a DML operation is being performed?

There is a BQ table that has multiple data load/update/delete jobs scheduled against it. Since these are automated jobs, many of them are failing due to concurrent update i
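
There is no explicit lock to inspect, but one approximation people use is to look for currently running jobs that reference the table in INFORMATION_SCHEMA.JOBS_BY_PROJECT (a sketch; the region qualifier and table names are placeholders, and the view needs the appropriate permissions):

```python
# Sketch: listing currently running jobs that touch a given table, as a proxy
# for "is something mutating this table right now".
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT job_id, user_email, job_type, statement_type, start_time
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE state = 'RUNNING'
  AND EXISTS (
    SELECT 1
    FROM UNNEST(referenced_tables) AS t
    WHERE t.dataset_id = 'my_dataset' AND t.table_id = 'my_table'
  )
"""
for row in client.query(sql).result():
    print(dict(row))
```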

BigQuery Stored Procedure vs. BigQuery Scripts

I am able to perform a join-and-aggregation task using both a BigQuery script and a BigQuery stored procedure. Which is better, and which one should be my first ch

Extracting Excel files from an FTP server to BigQuery using Cloud Functions

I am working on creating an automated script to download files from an FTP server and store them in BigQuery. The problem is that BigQuery accepts only .csv files. For t
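
One commonly suggested route (a sketch with placeholder names, assuming pandas with openpyxl and pyarrow are available in the function's environment): read the .xlsx with pandas and load the DataFrame straight into BigQuery, skipping the intermediate CSV entirely.

```python
# Sketch: turning a downloaded Excel file into a BigQuery table without an
# intermediate CSV. File and table names are placeholders.
import pandas as pd
from google.cloud import bigquery


def load_excel_to_bq(local_path: str = "/tmp/report.xlsx",
                     table_id: str = "my_project.my_dataset.report") -> None:
    df = pd.read_excel(local_path)  # needs openpyxl for .xlsx files

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    # DataFrame loads require pyarrow in the environment.
    client.load_table_from_dataframe(df, table_id, job_config=job_config).result()


if __name__ == "__main__":
    load_excel_to_bq()
```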

What method does GA4 use for streaming data to BigQuery? In SQL terms, is it just INSERT, or UPDATE too?

Sorry, I'm new to this. I read a few sources, including some Google documentation guides, but still don't quite understand: every time GA4 streams data into bigqu

Get a rolling order count into session data

I have the following table. One client has two purchases in one session. My goal is to assign an order counter to each row of the table. To reach this goal I am
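
Without the full table this is a guess, but this is usually done with a window function rather than a self-join: ROW_NUMBER() (or a running COUNT(*)) over a per-client, per-session window. A sketch with placeholder column and table names:

```python
# Sketch: assigning a running order counter per client session with a window function.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  client_id,
  session_id,
  order_id,
  order_timestamp,
  ROW_NUMBER() OVER (
    PARTITION BY client_id, session_id
    ORDER BY order_timestamp
  ) AS order_counter
FROM `my_project.my_dataset.session_orders`
ORDER BY client_id, session_id, order_counter
"""
for row in client.query(sql).result():
    print(dict(row))
```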

Ingest RDBMS data to BigQuery

We have on-prem sources like SQL Server and Oracle. Data from them has to be ingested periodically in batch mode into BigQuery. What should be the architecture

Get Table_Id along with Rowcount for all Tables in a Project

There are >100 datasets in one of my projects and I want to get the Table_id and No_of_rows of each table lying in these 50 datasets. I can get the metadata o
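
One approach that is often suggested (a sketch; the legacy __TABLES__ metadata view exposes table_id and row_count per dataset, so the client can loop over datasets and collect the results):

```python
# Sketch: collecting table_id and row_count for every table in every dataset of
# a project by querying each dataset's __TABLES__ metadata view.
from google.cloud import bigquery

project_id = "my_project"  # placeholder
client = bigquery.Client(project=project_id)

rows = []
for dataset in client.list_datasets():
    sql = f"""
    SELECT dataset_id, table_id, row_count
    FROM `{project_id}.{dataset.dataset_id}.__TABLES__`
    """
    rows.extend(dict(r) for r in client.query(sql).result())

for r in rows:
    print(r["dataset_id"], r["table_id"], r["row_count"])
```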

How can I refresh datasets/resources in the new Google BigQuery Web UI?

I'm creating tables via the BigQuery command-line utility, but occasionally run ad-hoc queries with the new web UI. After creating a table via the CLI, how do I

Move BigQuery Data Transfer Service (DCM) data to another project

I have the BigQuery Data Transfer Service for Campaign Manager set up in dataset A in GCP project A. I would like to move this to dataset B located in project B. How