
BigQuery

Setting up the BigQuery destination connector involves choosing a data loading method (BigQuery standard inserts or a Google Cloud Storage bucket) and configuring the BigQuery destination connector using the Airbyte UI.

This page guides you through setting up the BigQuery destination connector.

Prerequisites

Connector modes

While setting up the connector, you can configure it in the following modes:

  • BigQuery: Produces a normalized output by storing the JSON blob data in _airbyte_raw_* tables, then transforming and normalizing it into separate tables, potentially exploding nested streams into their own tables if basic normalization is configured.
  • BigQuery (Denormalized): Leverages BigQuery capabilities with Structured and Repeated fields to produce a single "big" table per stream. Airbyte does not support normalization for this option at this time.

Setup guide

Step 1: Set up a data loading method

Although you can load data using BigQuery's INSERT statements, we highly recommend using a Google Cloud Storage bucket.

To use a Google Cloud Storage bucket:

  1. Create a Cloud Storage bucket with the Protection Tools set to none or Object versioning. Make sure the bucket does not have a retention policy.
  2. Create an HMAC key and access ID.
  3. Grant the Storage Object Admin role to the Google Cloud Service Account.
  4. Make sure your Cloud Storage bucket is accessible from the machine running Airbyte. The easiest way to verify whether Airbyte can connect to your bucket is the check connection tool in the UI.

Using INSERT

You can use BigQuery's INSERT statements to upload data directly from your source to BigQuery. While this is faster to set up initially, we strongly recommend not using this option for anything other than a quick demo. Due to Google BigQuery SDK client limitations, using INSERT is 10x slower than using a Google Cloud Storage bucket, and you may see failures for big datasets and slow sources (for example, if reading from a source takes more than 10-12 hours). For more details, refer to https://github.com/airbytehq/airbyte/issues/3549.

Step 2: Set up the BigQuery connector

  1. Log into your Airbyte Cloud or Airbyte OSS account.

  2. Click Destinations and then click + New destination.

  3. On the Set up the destination page, select BigQuery or BigQuery (denormalized typed struct) from the Destination type dropdown depending on whether you want to set up the connector in BigQuery or BigQuery (Denormalized) mode.

  4. Enter the name for the BigQuery connector.

  5. For Project ID, enter your Google Cloud project ID.

  6. For Dataset Location, select the location of your BigQuery dataset.

    warning

    You cannot change the location later.

  7. For Default Dataset ID, enter the BigQuery Dataset ID.

  8. For Loading Method, select Standard Inserts or GCS Staging.

    tip

    We recommend using the GCS Staging option.

  9. For Service Account Key JSON (Required for cloud, optional for open-source), enter the Google Cloud Service Account Key in JSON format.

  10. For Transformation Query Run Type (Optional), select interactive to have BigQuery run interactive query jobs or batch to have BigQuery run batch queries.

    note

    Interactive queries are executed as soon as possible and count towards daily concurrent quotas and limits, while batch queries are executed as soon as idle resources are available in the BigQuery shared resource pool. If BigQuery hasn't started the query within 24 hours, BigQuery changes the job priority to interactive. Batch queries don't count towards your concurrent rate limit, making it easier to start many queries at once.

  11. For Google BigQuery Client Chunk Size (Optional), use the default value of 15 MiB. Later, if you see networking or memory management problems with the sync (specifically on the destination), try decreasing the chunk size. In that case, the sync will be slower but more likely to succeed.
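
To see why a smaller chunk size trades speed for reliability, consider how many upload requests a given payload needs. A minimal sketch of the arithmetic (the helper name and payload sizes are illustrative, not part of the connector):

```python
# Chunk-count arithmetic for chunked uploads; MIB and chunk_count
# are illustrative names, not Airbyte code.
MIB = 1024 * 1024

def chunk_count(payload_bytes: int, chunk_size: int = 15 * MIB) -> int:
    # Ceiling division: a partial final chunk still needs one request.
    return -(-payload_bytes // chunk_size)

print(chunk_count(100 * MIB))           # 7 requests at the default 15 MiB
print(chunk_count(100 * MIB, 5 * MIB))  # 20 smaller, less memory-hungry requests
```

Smaller chunks mean more round trips (a slower sync) but smaller in-memory buffers, which is why decreasing the value helps with memory problems.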

Supported sync modes

The BigQuery destination connector supports the following sync modes:

  • Full Refresh Sync
  • Incremental - Append Sync
  • Incremental - Deduped History

Output schema

Airbyte outputs each stream into its own table in BigQuery. Each table contains three columns:

  • _airbyte_ab_id: A UUID assigned by Airbyte to each event that is processed. The column type in BigQuery is String.
  • _airbyte_emitted_at: A timestamp representing when the event was pulled from the data source. The column type in BigQuery is Timestamp.
  • _airbyte_data: A JSON blob representing the event data. The column type in BigQuery is String.
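
The three columns above can be sketched as a record builder. This is an illustration of the raw-record shape, not Airbyte's actual code; the sample event is invented:

```python
import json
import uuid
from datetime import datetime, timezone

def to_raw_record(event: dict) -> dict:
    """Shape a source event like a row in an _airbyte_raw_* table."""
    return {
        "_airbyte_ab_id": str(uuid.uuid4()),                            # UUID per event
        "_airbyte_emitted_at": datetime.now(timezone.utc).isoformat(),  # pull time
        "_airbyte_data": json.dumps(event),                             # JSON blob as a string
    }

record = to_raw_record({"id": 1, "name": "Jane"})
print(sorted(record))
```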

The output tables in BigQuery are partitioned and clustered by the Time-unit column _airbyte_emitted_at at a daily granularity. Partition boundaries are based on UTC time. This is useful for limiting the number of partitions scanned when querying these partitioned tables by using a predicate filter (a WHERE clause). Filters on the partitioning column are used to prune the partitions and reduce the query cost. (The parameter Require partition filter is not enabled by Airbyte, but you may toggle it by updating the produced tables.)
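
For example, a query that filters on _airbyte_emitted_at only scans the partitions inside the date range. A sketch of such a query builder (the function and table names are ours, for illustration):

```python
def pruned_query(table: str, start: str, end: str) -> str:
    """Build a SELECT whose WHERE clause prunes the daily partitions."""
    return (
        f"SELECT _airbyte_data FROM `{table}` "
        f"WHERE _airbyte_emitted_at >= TIMESTAMP('{start}') "
        f"AND _airbyte_emitted_at < TIMESTAMP('{end}')"
    )

print(pruned_query("my_project.my_dataset.users", "2022-06-01", "2022-06-02"))
```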

BigQuery Naming Conventions

Follow BigQuery Datasets Naming conventions.

Airbyte converts any invalid characters into _ characters when writing data. However, since datasets that begin with _ are hidden on the BigQuery Explorer panel, Airbyte prepends the namespace with n for converted namespaces.
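
The convention above can be sketched as a small sanitizer. This is an illustration of the described behavior, not Airbyte's actual implementation:

```python
import re

def sanitize_namespace(namespace: str) -> str:
    # Replace characters that are invalid in BigQuery dataset names with "_".
    cleaned = re.sub(r"[^a-zA-Z0-9_]", "_", namespace)
    # Datasets starting with "_" are hidden in the Explorer panel, so prepend "n".
    if cleaned.startswith("_"):
        cleaned = "n" + cleaned
    return cleaned

print(sanitize_namespace("my-namespace!"))  # my_namespace_
print(sanitize_namespace("_hidden"))        # n_hidden
```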

Data type map

Airbyte type | BigQuery type | BigQuery denormalized type
DATE | DATE | DATE
STRING (BASE64) | STRING | STRING
NUMBER | FLOAT | FLOAT
OBJECT | STRING | RECORD
STRING | STRING | STRING
BOOLEAN | BOOLEAN | BOOLEAN
INTEGER | INTEGER | INTEGER
STRING (BIG_NUMBER) | STRING | STRING
STRING (BIG_INTEGER) | STRING | STRING
ARRAY | REPEATED | REPEATED
STRING (TIMESTAMP_WITH_TIMEZONE) | TIMESTAMP | DATETIME
STRING (TIMESTAMP_WITHOUT_TIMEZONE) | TIMESTAMP | DATETIME
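
The same mapping expressed as a lookup table, which can be handy when scripting schema checks (the dict name is ours; the values are copied from the table above):

```python
# Airbyte type -> (BigQuery type, BigQuery denormalized type)
AIRBYTE_TO_BIGQUERY = {
    "DATE": ("DATE", "DATE"),
    "STRING (BASE64)": ("STRING", "STRING"),
    "NUMBER": ("FLOAT", "FLOAT"),
    "OBJECT": ("STRING", "RECORD"),
    "STRING": ("STRING", "STRING"),
    "BOOLEAN": ("BOOLEAN", "BOOLEAN"),
    "INTEGER": ("INTEGER", "INTEGER"),
    "STRING (BIG_NUMBER)": ("STRING", "STRING"),
    "STRING (BIG_INTEGER)": ("STRING", "STRING"),
    "ARRAY": ("REPEATED", "REPEATED"),
    "STRING (TIMESTAMP_WITH_TIMEZONE)": ("TIMESTAMP", "DATETIME"),
    "STRING (TIMESTAMP_WITHOUT_TIMEZONE)": ("TIMESTAMP", "DATETIME"),
}

normalized, denormalized = AIRBYTE_TO_BIGQUERY["OBJECT"]
print(normalized, denormalized)  # STRING RECORD
```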

Troubleshooting permission issues

The service account does not have the proper permissions.

  • Make sure the BigQuery service account has the BigQuery User and BigQuery Data Editor roles, or equivalent permissions.
  • If the GCS staging mode is selected, ensure the BigQuery service account has the right permissions to the GCS bucket and path or the Cloud Storage Admin role, which includes a superset of the required permissions.

The HMAC key is wrong.

  • Make sure the HMAC key is created for the BigQuery service account, and the service account has permission to access the GCS bucket and path.

Tutorials

Now that you have set up the BigQuery destination connector, check out the following BigQuery tutorials:

Changelog

bigquery

Version | Date | Pull Request | Subject
1.1.11 | 2022-06-24 | #14114 | Remove "additionalProperties": false from specs for connectors with staging
1.1.10 | 2022-06-16 | #13852 | Updated stacktrace format for any trace message errors
1.1.9 | 2022-06-17 | #13753 | Deprecate and remove PART_SIZE_MB fields from connectors based on StreamTransferManager
1.1.8 | 2022-06-07 | #13579 | Always check GCS bucket for GCS loading method to catch invalid HMAC keys.
1.1.7 | 2022-06-07 | #13424 | Reordered fields for specification.
1.1.6 | 2022-05-15 | #12768 | Clarify that the service account key json field is required on cloud.
1.1.5 | 2022-05-12 | #12805 | Updated to latest base-java to emit AirbyteTraceMessage on error.
1.1.4 | 2022-05-04 | #12578 | In JSON to Avro conversion, log JSON field values that do not follow Avro schema for debugging.
1.1.3 | 2022-05-02 | #12528 | Update Dataset location field description
1.1.2 | 2022-04-29 | #12477 | Dataset location is a required field
1.1.1 | 2022-04-15 | #12068 | Fixed bug with GCS bucket conditional binding
1.1.0 | 2022-04-06 | #11776 | Use serialized buffering strategy to reduce memory consumption.
1.0.2 | 2022-03-30 | #11620 | Updated spec
1.0.1 | 2022-03-24 | #11350 | Improve check performance
1.0.0 | 2022-03-18 | #11238 | Updated spec and documentation
0.6.12 | 2022-03-18 | #10793 | Fix namespace with invalid characters
0.6.11 | 2022-03-03 | #10755 | Make sure to kill children threads and stop JVM
0.6.8 | 2022-02-14 | #10256 | Add -XX:+ExitOnOutOfMemoryError JVM option
0.6.6 | 2022-02-01 | #9959 | Fix null pointer exception from buffered stream consumer.
0.6.6 | 2022-01-29 | #9745 | Integrate with Sentry.
0.6.5 | 2022-01-18 | #9573 | BigQuery Destination : update description for some input fields
0.6.4 | 2022-01-17 | #8383 | Support dataset-id prefixed by project-id
0.6.3 | 2022-01-12 | #9415 | BigQuery Destination : Fix GCS processing of Facebook data
0.6.2 | 2022-01-10 | #9121 | Fixed check method for GCS mode to verify if all roles assigned to user
0.6.1 | 2021-12-22 | #9039 | Added part_size configuration to UI for GCS staging
0.6.0 | 2021-12-17 | #8788 | BigQuery/BiqQuery denorm Destinations : Add possibility to use different types of GCS files
0.5.1 | 2021-12-16 | #8816 | Update dataset locations
0.5.0 | 2021-10-26 | #7240 | Output partitioned/clustered tables
0.4.1 | 2021-10-04 | #6733 | Support dataset starting with numbers
0.4.0 | 2021-08-26 | #5296 | Added GCS Staging uploading option
0.3.12 | 2021-08-03 | #3549 | Add optional arg to make a possibility to change the BigQuery client's chunk\buffer size
0.3.11 | 2021-07-30 | #5125 | Enable additionalPropertities in spec.json
0.3.10 | 2021-07-28 | #3549 | Add extended logs and made JobId filled with region and projectId
0.3.9 | 2021-07-28 | #5026 | Add sanitized json fields in raw tables to handle quotes in column names
0.3.6 | 2021-06-18 | #3947 | Service account credentials are now optional.
0.3.4 | 2021-06-07 | #3277 | Add dataset location option

bigquery-denormalized

Version | Date | Pull Request | Subject
1.1.12 | 2022-06-29 | #14079 | Map "airbyte_type": "big_integer" to INT64
1.1.11 | 2022-06-24 | #14114 | Remove "additionalProperties": false from specs for connectors with staging
1.1.10 | 2022-06-16 | #13852 | Updated stacktrace format for any trace message errors
1.1.9 | 2022-06-17 | #13753 | Deprecate and remove PART_SIZE_MB fields from connectors based on StreamTransferManager
1.1.8 | 2022-06-07 | #13579 | Always check GCS bucket for GCS loading method to catch invalid HMAC keys.
1.1.7 | 2022-06-07 | #13424 | Reordered fields for specification.
1.1.6 | 2022-05-15 | #12768 | Clarify that the service account key json field is required on cloud.
0.3.5 | 2022-05-12 | #12805 | Updated to latest base-java to emit AirbyteTraceMessage on error.
0.3.4 | 2022-05-04 | #12578 | In JSON to Avro conversion, log JSON field values that do not follow Avro schema for debugging.
0.3.3 | 2022-05-02 | #12528 | Update Dataset location field description
0.3.2 | 2022-04-29 | #12477 | Dataset location is a required field
0.3.1 | 2022-04-15 | #11978 | Fixed emittedAt timestamp.
0.3.0 | 2022-04-06 | #11776 | Use serialized buffering strategy to reduce memory consumption.
0.2.15 | 2022-04-05 | #11166 | Fixed handling of anyOf and allOf fields
0.2.14 | 2022-04-02 | #11620 | Updated spec
0.2.13 | 2022-04-01 | #11636 | Added new unit tests
0.2.12 | 2022-03-28 | #11454 | Integration test enhancement for picking test-data and schemas
0.2.11 | 2022-03-18 | #10793 | Fix namespace with invalid characters
0.2.10 | 2022-03-03 | #10755 | Make sure to kill children threads and stop JVM
0.2.8 | 2022-02-14 | #10256 | Add -XX:+ExitOnOutOfMemoryError JVM option
0.2.7 | 2022-02-01 | #9959 | Fix null pointer exception from buffered stream consumer.
0.2.6 | 2022-01-29 | #9745 | Integrate with Sentry.
0.2.5 | 2022-01-18 | #9573 | BigQuery Destination : update description for some input fields
0.2.4 | 2022-01-17 | #8383 | BigQuery/BiqQuery denorm Destinations : Support dataset-id prefixed by project-id
0.2.3 | 2022-01-12 | #9415 | BigQuery Destination : Fix GCS processing of Facebook data
0.2.2 | 2021-12-22 | #9039 | Added part_size configuration to UI for GCS staging
0.2.1 | 2021-12-21 | #8574 | Added namespace to Avro and Parquet record types
0.2.0 | 2021-12-17 | #8788 | BigQuery/BiqQuery denorm Destinations : Add possibility to use different types of GCS files
0.1.11 | 2021-12-16 | #8816 | Update dataset locations
0.1.10 | 2021-11-09 | #7804 | handle null values in fields described by a $ref definition
0.1.9 | 2021-11-08 | #7736 | Fixed the handling of ObjectNodes with $ref definition key
0.1.8 | 2021-10-27 | #7413 | Fixed DATETIME conversion for BigQuery
0.1.7 | 2021-10-26 | #7240 | Output partitioned/clustered tables
0.1.6 | 2021-09-16 | #6145 | BigQuery Denormalized support for date, datetime & timestamp types through the json "format" key
0.1.5 | 2021-09-07 | #5881 | BigQuery Denormalized NPE fix
0.1.4 | 2021-09-04 | #5813 | fix Stackoverflow error when receive a schema from source where "Array" type doesn't contain a required "items" element
0.1.3 | 2021-08-07 | #5261 | 🐛 Destination BigQuery(Denormalized): Fix processing arrays of records
0.1.2 | 2021-07-30 | #5125 | Enable additionalPropertities in spec.json
0.1.1 | 2021-06-21 | #3555 | Partial Success in BufferedStreamConsumer
0.1.0 | 2021-06-21 | #4176 | Destination using Typed Struct and Repeated fields