Dataflow GCS to BQ Problems
Here's the situation: I have a set of compressed files in GCS with a .gz extension (e.g. 000000_[0-5].gz) that I am trying to import into a single BQ table. I hav…
Solution 1:
I believe the input collection to WriteToBigQuery should be a collection of dictionaries (each key maps to a BigQuery column), rather than a collection of strings. Try passing the lines through something like | beam.Map(lambda line: dict(record=line)).
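Here is a minimal sketch of what that pipeline could look like, assuming the table has a single STRING column named record; the bucket, project, dataset, and table names (my-bucket, my-project, my_dataset, my_table) are placeholders:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            # ReadFromText infers gzip compression from the .gz
            # extension by default, so the files are decompressed
            # automatically.
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/000000_*.gz")
            # Convert each raw line into a dict whose key matches a
            # BigQuery column name.
            | "ToDict" >> beam.Map(lambda line: {"record": line})
            # Write the dicts to BigQuery; schema declares the single
            # STRING column assumed above.
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:my_dataset.my_table",
                schema="record:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

If the lines are actually structured (CSV, JSON, etc.), the Map step would instead parse each line into one dict entry per target column.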