Snowflake

Overview

You can create a Snowflake Source Connector to read from multiple tables within your Snowflake account, using Snowflake's key pair authentication.

To set up this Source Connector, you will need to configure key pair authentication for your Snowflake account and provide the private key. To learn more about configuring key pair authentication within Snowflake, visit: https://docs.snowflake.com/en/user-guide/key-pair-auth.html

The schema for this Source Connector is defined by the selected columns from the query.

Prerequisites

Required information:

  • Private Key

  • Account Name

  • User Name (with appropriate privileges)

  • Warehouse

Creating a Snowflake Source Connector

Step 1: After selecting + New Connector, under the System prompt, click Snowflake.

Step 2: Enter a Connector Name.

Step 3: Select Source Connector.

Authentication

Step 4: Provide a private key for your Snowflake environment in PEM format.

Note: We do not currently support encrypted private keys.
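For reference, an unencrypted private key in PEM format has the following general shape. The key body below is a truncated placeholder, not a usable key:

  -----BEGIN PRIVATE KEY-----
  MIIEvQIBADANBgkqhkiG9w0BAQEFAASC...(base64-encoded key material)...
  -----END PRIVATE KEY-----

The matching public key must be assigned to the Snowflake user you will connect with, for example via ALTER USER <user_name> SET RSA_PUBLIC_KEY='<public_key>'; as described in Snowflake's key pair authentication documentation linked above.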

Account Information

Step 5: Provide your Snowflake account locator in the format <account_locator>.<region_id>.<cloud>.

E.g., xk79910.us-central1.gcp

Note: You can find the account_locator, region_id, and cloud values in your Snowflake account URL.

Note: Your full account name may include segments that identify the region and cloud platform where your account is hosted. Include those additional segments when providing your account name.

To learn more about Snowflake account names, visit: https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#non-vps-account-locator-formats-by-cloud-platform-and-region
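If you have access to a Snowflake worksheet, you can also look up the locator and region with Snowflake's context functions. This is only a quick sketch: CURRENT_REGION returns Snowflake's internal region format (for example GCP_US_CENTRAL1), which you will need to translate into the dotted form shown above.

  -- Returns the account locator, e.g. XK79910
  SELECT CURRENT_ACCOUNT();

  -- Returns the region and cloud in Snowflake's internal format, e.g. GCP_US_CENTRAL1
  SELECT CURRENT_REGION();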

Step 6: Enter the User Name.

This User Name must be associated with an account that has the required permissions enabled in order to successfully build a connector. To learn more about Snowflake access control privileges, visit: https://docs.snowflake.com/en/user-guide/security-access-control.html

Step 7: Provide the name of the warehouse that will execute the query you provide below.

Step 8: Query

Provide a query to access the data in your Snowflake account. You can query against multiple tables, and your query can be as complex as you need it to be. For more information about writing a query, visit: https://docs.snowflake.com/en/user-guide/querying.html

The schema for this Source Connector is defined by the selected columns from the query.
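As an illustration, a query might join two tables and select only the columns the connector should expose. The database, schema, table, and column names below are hypothetical; replace them with your own:

  SELECT
      o.order_id,
      o.order_date,
      c.customer_name,
      c.customer_email
  FROM my_database.sales.orders AS o
  JOIN my_database.sales.customers AS c
      ON o.customer_id = c.customer_id
  WHERE o.order_date >= '2024-01-01';

With this query, the Source Connector's schema would consist of the four selected columns: order_id, order_date, customer_name, and customer_email.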

Advanced Options

Header Normalization

The headers in your source may begin or end with extra characters such as spaces, tabs, carriage returns, and line endings. You can choose to keep all characters exactly as they appear in the source or remove this leading and trailing whitespace. Select one of the following options:

  1. Don't normalize headers. Use headers exactly as they appear in the source: If this option is selected, we retain all characters from the source.

  2. Remove extra whitespace and other common untypable characters from headers: If this option is selected, we remove all whitespace (spaces, tabs, carriage returns, line endings) from the start and end of each header. For example, a header that arrives as "  Customer Name " is kept unchanged by the first option and becomes "Customer Name" with this one.

Connector Options

The connector can be deleted, edited and duplicated.

Duplication

To save time, an existing connector can be duplicated. The new connector needs to be named and can then be edited as needed.
