Google BigQuery

Supported pipeline types:
  • Data Collector

The Google BigQuery origin executes a query job and reads the result from Google BigQuery.

The origin submits the query that you define, and then Google BigQuery runs the query as an interactive query. When the query is complete, the origin reads the query results to generate records. The origin runs the query once and then the pipeline stops when it finishes reading all query results. If you start the pipeline again, the origin submits the query again.

When you configure the origin, you define the query to run using valid BigQuery standard SQL or legacy SQL syntax. By default, BigQuery writes all query results to a temporary, cached results table. You can choose to disable retrieving cached results and force BigQuery to compute the query result.
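The request that the origin submits can be sketched using the field names of the BigQuery REST API's jobs.query method, which exposes the same toggles as the stage properties. The helper function and sample table name below are illustrative, not part of Data Collector:

```python
# Sketch of the query job request the origin submits, expressed with the
# field names used by the BigQuery REST API's jobs.query method.
# build_query_request is a hypothetical helper for illustration only.

def build_query_request(query, use_legacy_sql=False, use_cached_results=True,
                        timeout_sec=300):
    """Assemble a jobs.query request body mirroring the stage properties."""
    return {
        "query": query,
        "useLegacySql": use_legacy_sql,       # cleared = standard SQL syntax
        "useQueryCache": use_cached_results,  # cleared = force recomputation
        "timeoutMs": timeout_sec * 1000,      # Query Timeout (sec) property
    }

# Force BigQuery to recompute the result instead of using the cache:
request = build_query_request(
    "SELECT name, state FROM `my-project.mydataset.names`",
    use_cached_results=False)
```

Disabling the cache corresponds to clearing the Use Cached Query Results property described below; the 300-second default mirrors the default query timeout.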

You also define the project ID and credentials to use when connecting to Google BigQuery.

The origin can generate events for an event stream. For more information about dataflow triggers and the event framework, see Dataflow Triggers Overview.

Credentials

When the Google BigQuery origin executes a query job and reads the result from Google BigQuery, the origin must pass credentials to Google BigQuery.

You can provide credentials using one of the following options:
  • Google Cloud default credentials
  • Credentials in a file
  • Credentials in a stage property

Default Credentials

You can configure the stage to use Google Cloud default credentials. When using Google Cloud default credentials, the stage checks for the credentials file defined in the GOOGLE_APPLICATION_CREDENTIALS environment variable.

If the environment variable doesn't exist and Data Collector is running on a virtual machine (VM) in Google Cloud Platform (GCP), the stage uses the built-in service account associated with the virtual machine instance.

For more information about the default credentials, see Finding credentials automatically in the Google Cloud documentation.

Complete the following steps to define the credentials file in the environment variable:
  1. Use the Google Cloud Platform Console or the gcloud command-line tool to create a Google service account and have your application use it for API access.
    For example, to use the command line tool, run the following commands:
    gcloud iam service-accounts create my-account
    gcloud iam service-accounts keys create key.json --iam-account=my-account@my-project.iam.gserviceaccount.com
  2. Store the generated credentials file in a local directory external to the Data Collector installation directory.
    For example, if you installed Data Collector in the following directory:
    /opt/sdc/
    you might store the credentials file at:
    /opt/sdc-credentials
  3. Add the GOOGLE_APPLICATION_CREDENTIALS environment variable to the appropriate file and point it to the credentials file.

    Modify environment variables using the method required by your installation type.

    Set the environment variable as follows:

    export GOOGLE_APPLICATION_CREDENTIALS="/var/lib/sdc-resources/keyfile.json"
  4. Restart Data Collector to enable the changes.
  5. On the Credentials tab for the stage, for the Credential Provider property, select Default Credentials Provider.
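The lookup order described above can be sketched as follows. The function is an illustration of the resolution logic only; the actual lookup is performed by Google's client libraries, not by code you write:

```python
import os

def resolve_default_credentials(on_gcp_vm=False):
    """Mimic the default-credentials lookup order: the
    GOOGLE_APPLICATION_CREDENTIALS environment variable wins; on a
    Google Cloud VM, the instance's built-in service account is the
    fallback. Illustrative sketch only."""
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if path:
        return ("credentials-file", path)
    if on_gcp_vm:
        return ("vm-service-account", None)
    raise RuntimeError("no default credentials available")

# With the environment variable set as in the steps above:
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/var/lib/sdc-resources/keyfile.json"
source, path = resolve_default_credentials()
```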

Credentials in a File

You can configure the stage to use credentials in a Google Cloud service account credentials JSON file.

Complete the following steps to use credentials in a file:
  1. Generate a service account credentials file in JSON format.

    Use the Google Cloud Platform Console or the gcloud command-line tool to generate and download the credentials file. For more information, see Generating a service account credential in the Google Cloud Platform documentation.

  2. Store the generated credentials file on the Data Collector machine.

    As a best practice, store the file in the Data Collector resources directory, $SDC_RESOURCES.

  3. On the Credentials tab for the stage, for the Credential Provider property, select Service Account Credentials File. Then, enter the path to the credentials file.
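A generated service account key file is a JSON document containing, among other keys, "type", "project_id", "private_key", and "client_email". The checker below is a hypothetical sketch you might use to verify a file before configuring the stage; it is not part of Data Collector:

```python
import json

# Keys present in every generated service account key file.
REQUIRED_KEYS = {"type", "project_id", "private_key", "client_email"}

def check_credentials_content(text):
    """Sanity-check that text looks like a service account key file.
    Hypothetical helper for illustration only."""
    data = json.loads(text)
    missing = REQUIRED_KEYS - data.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    if data["type"] != "service_account":
        raise ValueError(f"unexpected credential type: {data['type']}")
    return data

sample = (
    '{"type": "service_account", "project_id": "my-project", '
    '"private_key": "-----BEGIN PRIVATE KEY-----", '
    '"client_email": "my-account@my-project.iam.gserviceaccount.com"}'
)
creds = check_credentials_content(sample)
```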

Credentials in a Property

You can configure the stage to use credentials specified in a stage property. When using credentials in stage properties, you provide JSON-formatted credentials from a Google Cloud service account credential file.

You can enter credential details in plain text, but best practice is to secure the credential details using runtime resources or a credential store.

Complete the following steps to use credentials specified in stage properties:

  1. Generate a service account credentials file in JSON format.

    Use the Google Cloud Platform Console or the gcloud command-line tool to generate and download the credentials file. For more information, see Generating a service account credential in the Google Cloud Platform documentation.

  2. As a best practice, secure the credentials using runtime resources or a credential store.
  3. On the Credentials tab for the stage, for the Credential Provider property, select Service Account Credentials. Then, enter the JSON-formatted credential details or an expression that calls the credentials from a credential store.

BigQuery Data Types

The Google BigQuery origin converts the Google BigQuery data types to Data Collector data types.

The following table lists the data types that the Google BigQuery origin supports and the Data Collector data types that the origin converts them to:
BigQuery Data Type    Data Collector Data Type
Boolean               Boolean
Bytes                 Byte Array
Date                  Date
Datetime              Datetime
Float                 Double
Integer               Long
Numeric               Decimal
String                String
Time                  Datetime
Timestamp             Datetime

Datetime Conversion

In Google BigQuery, the Datetime, Time, and Timestamp data types have microsecond precision, but the corresponding Datetime data type in Data Collector has millisecond precision. The conversion between data types results in some precision loss.

To preserve the precision that would otherwise be lost during data type conversion, the Google BigQuery origin generates the bq.fullValue field attribute, which stores a string containing the original value with microsecond precision. You can use the record:fieldAttribute or record:fieldAttributeOrDefault functions to access the information in the attribute.

Generated Field Attribute    Description
bq.fullValue                 Provides the original precision for Datetime, Time, and Timestamp fields.

For more information about field attributes, see Field Attributes.
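The truncation and the preserved attribute can be sketched with the standard library. The function and the record layout are illustrative, not the origin's internal representation:

```python
from datetime import datetime

def convert_with_full_value(value):
    """Parse a microsecond-precision BigQuery DATETIME string into a
    millisecond-precision value, keeping the original string the way
    the origin keeps it in the bq.fullValue field attribute.
    Illustrative sketch only."""
    parsed = datetime.strptime(value, "%Y-%m-%d %H:%M:%S.%f")
    # Data Collector's Datetime type has millisecond precision, so the
    # last three digits of the microseconds are dropped.
    truncated = parsed.replace(microsecond=(parsed.microsecond // 1000) * 1000)
    return {"value": truncated, "attributes": {"bq.fullValue": value}}

field = convert_with_full_value("2023-04-01 12:30:45.123456")
# field["value"] retains only millisecond precision (microsecond=123000);
# field["attributes"]["bq.fullValue"] keeps the full original string.
```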

Event Generation

The Google BigQuery origin generates an event when a query completes successfully.

A Google BigQuery event can be used in any logical way. For example, you can use the event in an event stream to trigger a task in another system, such as sending a notification when the query completes.

For more information about dataflow triggers and the event framework, see Dataflow Triggers Overview.

Event Record

Event records generated by the Google BigQuery origin have the following event-related record header attributes:

Record Header Attribute         Description
sdc.event.type                  Event type. Uses the following type:
                                • big-query-success - Generated when the origin successfully completes a query.
sdc.event.version               Integer that indicates the version of the event record type.
sdc.event.creation_timestamp    Epoch timestamp when the stage created the event.
The origin can generate the following type of event record:
Query success
The origin generates a query success event record when it completes processing the data returned from a query.
The query success event records have the sdc.event.type record header attribute set to big-query-success and include the following fields:
Field            Description
query            Query that completed successfully.
timestamp        Timestamp when the query completed.
row-count        Number of processed rows.
source-offset    Offset after the query completed.
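Putting the header attributes and fields together, a query success event record has roughly the following shape. All values below are made up for illustration, including the event version number:

```python
# Sketch of a query success event record, using the header attributes
# and fields documented above. Values are invented for illustration.

event = {
    "header": {
        "sdc.event.type": "big-query-success",
        "sdc.event.version": 1,               # illustrative version number
        "sdc.event.creation_timestamp": 1680352245000,
    },
    "fields": {
        "query": "SELECT name, state FROM `my-project.mydataset.names`",
        "timestamp": 1680352245000,
        "row-count": 1093,
        "source-offset": 1093,
    },
}

# Downstream event handling can branch on the event type:
is_success = event["header"]["sdc.event.type"] == "big-query-success"
```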

Configuring a Google BigQuery Origin

Configure a Google BigQuery origin to execute a query job and read the result from Google BigQuery.

  1. In the Properties panel, on the General tab, configure the following properties:
    General Property Description
    Name Stage name.
    Description Optional description.
    Produce Events Generates event records when events occur. Use for event handling.
    On Record Error Error record handling for the stage:
    • Discard - Discards the record.
    • Send to Error - Sends the record to the pipeline for error handling.
    • Stop Pipeline - Stops the pipeline.
  2. On the BigQuery tab, configure the following properties:
    BigQuery Property Description
    Query SQL query to use for the query job. Write the query using valid BigQuery standard SQL or legacy SQL syntax.

    Do not include the #legacySql or #standardSql prefix in the query. Instead, select or clear the Use Legacy SQL property to specify the SQL syntax type.

    Use Legacy SQL Specifies whether the query uses standard SQL or legacy SQL syntax.

    Clear to use standard SQL. Select to use legacy SQL.

    Use Cached Query Results Determines whether Google BigQuery retrieves cached results if they are present.

    Select to retrieve cached results. Clear to disable retrieving cached results.

    Query Timeout (sec) Maximum number of seconds to wait for the query to finish. If the query fails to complete within the timeout, the origin aborts the query and the pipeline fails.

    Enter a time in seconds or use the MINUTES or HOURS constant in an expression to define the time increment.

    Default is five minutes, defined as follows: ${5 * MINUTES}.

    Max Batch Size (records) Maximum number of records to include in a batch.
  3. On the Credentials tab, configure the following properties:
    Credentials Property Description
    Project ID Project ID to connect to.
    Credentials Provider Credentials to use:
    • Default Credentials Provider - Uses Google Cloud default credentials.
    • Service Account Credentials File - Uses credentials in a Google Cloud service account credentials file.
    • Service Account Credentials - Uses JSON-formatted credentials entered in a stage property.
    Credentials File Path (JSON) Path to the Google Cloud service account credentials file that the stage uses to connect. The credentials file must be a JSON file.

    Enter a path relative to the Data Collector resources directory, $SDC_RESOURCES, or enter an absolute path.

    Credentials File Content (JSON) Contents of a Google Cloud service account credentials JSON file used to connect.

    Enter JSON-formatted credential information in plain text, or use an expression to call the information from runtime resources or a credential store.