Load local CSV file into BigQuery table with Terraform?
I'm new to Terraform. Is it possible to load the contents of a CSV file into a BigQuery table without uploading it to GCS?
I've studied the documentation below, but the approach doesn't seem to work with local files: https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/bigquery_job
Question: Is it somehow possible to do this without uploading the file into Google's environment?
resource "google_bigquery_table" "my_tyable" {
dataset_id = google_bigquery_dataset.bq_config_dataset.dataset_id
table_id = "my_tyable"
schema = file("${path.cwd}/path/to/schema.json")
}
resource "google_bigquery_job" "load_data" {
job_id = "load_data"
load {
source_uris = [
#"gs://cloud-samples-data/bigquery/us-states/us-states-by-date.csv", # this would work
"${path.cwd}/path/to/data.csv", # this is not working
]
destination_table {
project_id = google_bigquery_table.my_tyable.project
dataset_id = google_bigquery_table.my_tyable.dataset_id
table_id = google_bigquery_table.my_tyable.table_id
}
skip_leading_rows = 0
schema_update_options = ["ALLOW_FIELD_RELAXATION", "ALLOW_FIELD_ADDITION"]
write_disposition = "WRITE_APPEND"
autodetect = true
}
}
Solution 1:
Probably the best option is to load it using Terraform's built-in file() function:
file("${path.module}/data.csv")
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow

| Solution | Source |
|---|---|
| Solution 1 | Victor Biga |