Step 2. Select a Destination

The second step in creating a Pipeline is selecting its Destination Connector. This Connector provides access to the system you want to send the source data to.

You can either create a new Destination Connector or use an existing one. Both options are described in detail below.

Option 1: Create a New Destination Connector

Step 1: To create a new Connector, click the + New Connector button above the list of Connectors.

You will be directed to the Create a New Connector page.

Step 2: Click on the system dropdown, and select the system for this new Connector.

Step 3: Once you have selected the system, configure and save the new Destination Connector. See the Destination Connectors section for detailed setup instructions for each of our supported systems (e.g. BigQuery, FTP, Google Drive).
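
To make the configuration step more concrete, here is a hypothetical sketch in TypeScript of the kind of information a Destination Connector typically captures. The field names (system, projectId, dataset, table, credentialsJson) are illustrative assumptions only, not Osmos's actual configuration schema; the per-system pages in the Destination Connectors section list the real required fields.

```typescript
// Hypothetical sketch only: field names are illustrative assumptions,
// not Osmos's actual configuration schema.
interface DestinationConnectorConfig {
  name: string;             // display name shown in the connector list
  system: "BigQuery" | "FTP" | "Google Drive" | "Amazon S3";
  // System-specific settings; a BigQuery-style destination might need:
  projectId?: string;
  dataset?: string;
  table?: string;
  credentialsJson?: string; // service-account key or similar secret
}

const exampleDestination: DestinationConnectorConfig = {
  name: "Sample BQ Destination",
  system: "BigQuery",
  projectId: "my-gcp-project",
  dataset: "analytics",
  table: "orders",
  credentialsJson: "<service-account-key>",
};
```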

Step 4: After saving the Destination Connector, you will be directed back to the list of Destination Connectors, where the new Connector now appears alongside your existing Connectors. Follow the instructions under Option 2 below to continue by selecting it.

Option 2: Use an Existing Destination Connector

Step 1: To use an existing Connector, first click on the specific system (e.g. S3) to see all of the Destination Connectors for that system.

Step 2: Look for the name of the Destination Connector you want to use (e.g. Sample BQ Destination), and click on that Connector.
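
Conceptually, this selection flow is a filter by system followed by a lookup by name. The TypeScript sketch below mirrors that logic; the Connector type and the helper function are hypothetical stand-ins for what the Osmos UI displays, not part of any Osmos API.

```typescript
// Hypothetical illustration of the Option 2 flow: filter the existing
// connectors by system, then pick the one with the matching name.
interface Connector {
  name: string;   // e.g. "Sample BQ Destination"
  system: string; // e.g. "BigQuery", "S3", "FTP"
}

function findDestinationConnector(
  connectors: Connector[],
  system: string,
  name: string
): Connector | undefined {
  return connectors
    .filter((c) => c.system === system) // Step 1: narrow to one system
    .find((c) => c.name === name);      // Step 2: select by name
}
```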

Once you have selected the Destination Connector, you can proceed to the next step: Map & Transform Data.