BigQuery


Overview

You can create a Google BigQuery Source Connector to read from your Google BigQuery table(s).

To set up this Connector using a GCP Service Account Key, you will need a GCP Service Account that has access to the project(s) where the resources reside. To learn more about creating and managing service accounts within GCP, visit https://cloud.google.com/iam/docs/creating-managing-service-accounts.

To set up this Connector using Single Sign-On, your Google account will need access to the BigQuery table(s) you are connecting to.

If your Google account has multi-factor authentication enabled, we recommend creating a Connector using a GCP Service Account instead to avoid disruptions from re-authentication.

The schema for this Source Connector is defined by the selected columns from the query.

Prerequisites

Required information:

  • Single Sign-On credentials OR Service Account Key with the proper privileges

  • Project ID

  • SQL Query

Creating a BigQuery Source Connector

Step 1: After selecting + New Connector, under the System prompt, click BigQuery

Step 2: Enter a Connector Name

Step 3: Select Source Connector

Step 4: Authentication

There are two options for authentication: Single Sign-On and Service Account. You can choose to authenticate through your Google account using single sign-on, or by providing a Google Cloud Platform service account JSON key.

Step 5a: To authenticate using Single Sign-On, you will be prompted to sign into your Google Account after entering the required information and clicking Grant Access.

Step 5b: To authenticate using Service Account, you will be prompted to enter the Service Account Key. Provide the Service Account JSON key for the account you wish to connect to.

  • Service accounts associated with a BigQuery Source Connector will need BigQuery Data Editor (read/write) and BigQuery Job User privileges in order to successfully establish a connection.

Creating a Service Account Key in the Google Cloud Console

  1. To create a Service Account JSON key, first navigate to the Service Accounts page in the Google Cloud Console.

  2. Click the project dropdown in the top navigation bar to view all of your projects, choose the project you want to create a service account key for, and then click Open.

  3. Find the row of the service account that you want to create a key for. In that row, click the More button, and then click Create key.

  4. Select the JSON Key type and click Create.

Note: To set up a Source Connector using your service account, the service account you select needs access to the project you want to connect to and must have both the BigQuery Data Editor and BigQuery Job User roles.
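
Before pasting the key into Osmos, you can sanity-check it outside the product. The sketch below, assuming the google-cloud-bigquery Python library and a downloaded key file named key.json (placeholder names, not part of Osmos), runs a trivial query; a failure here usually points to a missing role rather than a problem with the Connector.

```python
# Minimal sketch: verify a service account key can create BigQuery jobs.
# Assumes `pip install google-cloud-bigquery`; the key file path and
# project ID below are placeholders.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file("key.json")
client = bigquery.Client(project="example-project-365122", credentials=credentials)

# Running any query requires the BigQuery Job User role; an account
# without it fails here with a 403 error.
rows = client.query("SELECT 1 AS ok").result()
print(list(rows))
```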

Step 6: Project ID

1. To find the Project ID, first select the project dropdown, found in the upper left of the BigQuery page, to view all of your projects.

2. Find the corresponding ID for the project (for example, example-project-365122), then copy the Project ID and paste it into the Project ID field.
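
If you work with many projects, you can also read the active project ID programmatically. This is an alternative to the console steps above, not something Osmos requires; it assumes Application Default Credentials are configured (for example via gcloud auth application-default login).

```python
# Optional: print the project ID tied to your Application Default
# Credentials as a cross-check against the console value.
import google.auth

credentials, project_id = google.auth.default()
print(project_id)  # e.g. example-project-365122
```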

Step 7: SQL Query

Provide a query to access the data in your BigQuery project(s). You can query against multiple tables and projects, and your query can be as complex as you need it to be.

Note: The schema for this Source Connector is defined by the selected columns from the query.
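
If you want to validate a query before saving the Connector, a dry run against BigQuery is a cheap way to do it. This is a minimal Python sketch, not part of Osmos; the project, dataset, and table names are hypothetical placeholders.

```python
# Minimal sketch: dry-run a candidate query for the SQL Query field.
from google.cloud import bigquery

client = bigquery.Client(project="example-project-365122")

# The selected columns (order_id, customer_name, order_total) become
# the Source Connector's schema.
query = """
SELECT o.order_id, c.customer_name, o.order_total
FROM `example-project-365122.sales.orders` AS o
JOIN `example-project-365122.crm.customers` AS c
  ON o.customer_id = c.customer_id
WHERE o.order_date >= '2024-01-01'
"""

# A dry run validates syntax and permissions without scanning any data.
job = client.query(query, job_config=bigquery.QueryJobConfig(dry_run=True))
print(f"Query OK; would process {job.total_bytes_processed} bytes")
```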

Advanced Options

Header Normalization

The source headers may include leading or trailing characters such as spaces, tabs, carriage returns, and line endings. You can choose to keep all characters from the source or remove this extra whitespace. Select one of the options:

  1. Don't normalize headers. Use headers exactly as they appear in the source: If this option is selected, we will retain all characters from the source.

  2. Remove extra whitespace and other common untypable characters from headers: If this option is selected, we remove all whitespace (spaces, tabs, carriage returns, line endings) at the start and end of each header, as illustrated in the sketch below.
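
For a concrete sense of the difference, here is a small Python illustration of the two choices. The trimming shown is an approximation of option 2, which the UI describes as removing whitespace at the start and end of each header.

```python
# Approximate illustration of the two header-normalization options.
raw_header = "\t Order ID \r\n"

option_1 = raw_header          # 1. keep the header exactly as in the source
option_2 = raw_header.strip()  # 2. trim leading/trailing whitespace -> "Order ID"

print(repr(option_1), repr(option_2))
```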

Connector Options

The connector can be deleted, edited, and duplicated.

Duplication

To save time, an existing connector can be duplicated. The new connector needs to be named and can then be edited as needed.

For more information about writing a query, visit https://cloud.google.com/bigquery/docs/query-overview.