Pipeline Details Page

The Pipeline Details page allows you to review, download and edit important information about the pipeline and associated runs. You can also run the pipeline from the Details page.

Pipeline Details Overview

The title of the pipeline displays in the top left.

You can run the pipeline, edit its mapping, or delete it using the buttons in the top right.

The Source and Destination buttons allow you to drill into the individual connectors.

The Errors button displays the current number of errors in the pipeline.

Under the Status tab, you can view and take action on the individual pipeline runs. See more detail under Pipeline Actions.

Under the Overview tab, you can view details about the pipeline overall, such as date created, total records, schedule, last run, and pipeline ID.

Pipeline Runs

Osmos displays the last 30 days of pipeline runs. If you need to modify the number of days displayed, please contact support@osmos.io.

For each pipeline job, you can view the run date, run time and status.

Your pipeline runs may have a number of different statuses:

  • Complete - 100% successful run with no errors

  • Finished with Errors - The pipeline completed with one or more errors

  • Failure - Typically, a failure means there is an error with the Source Connector that prevented the pipeline from running successfully

For each job, you can also view whether the records are in a Pending, Successful, Skipped, or Failed state.

If you have one or more failed records, you may choose to take an Action.

Pipeline Actions

Pipeline Actions are present only when you have at least one failed record on your Pipeline.

Under Actions you can do one of three things:

  1. Download an Error Report

  2. Fix Errors

  3. Ignore Errors

Download Error Report

Osmos allows you to download a CSV report of the errors associated with the individual pipeline job.

The report displays the following details:

  1. The row number in the original file that contains the errors

  2. A field to show if the error is at the connector level (e.g. auth issues)

  3. Every error row in the report includes all of the source columns for the same row, because the source column names often won't match the destination column names. Showing the source columns helps you understand and act on the errors.

  4. Each destination column will have an "error description" column right next to it, which will contain the error reason if that destination column has errors in it. If that destination column has no errors, the "error description" column will be empty (see the sketch after this list).
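If you want to work with the downloaded report programmatically, the snippet below is a minimal sketch that tallies errors per destination column. It assumes the report is saved as error_report.csv and that the per-column error columns can be recognized by an "error description" header; the file name and header matching are hypothetical and may differ from your actual report.

```python
import csv
from collections import Counter

# Minimal sketch: count non-empty "error description" entries per destination
# column in a downloaded error report. The file name and header convention
# below are assumptions, not a guaranteed Osmos format.
error_counts = Counter()

with open("error_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        for header, value in row.items():
            # A non-empty value in an "error description" column means the
            # corresponding destination column had an error on this row.
            if "error description" in header.lower() and value and value.strip():
                error_counts[header] += 1

for column, count in error_counts.most_common():
    print(f"{column}: {count} error(s)")
```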

Pipeline Fix Errors

This option allows you to fix the errors specific to the pipeline run only.

  1. When you select Fix Errors for the pipeline job, you will be taken to the spreadsheet view, where you can make single-cell edits to update the fields in error for the pipeline. Once the updates are complete, the pipeline can be re-run.

  2. If there are 50+ errors in the pipeline, a box will pop up giving you the option to either edit the cells in the spreadsheet view or go back to the column mapping and update the transformations.

Note: If you wish to update the mapping of your pipeline overall, use the Edit Mapping button in the top right-hand corner.

Ignore Errors

This option allows you to simply ignore the errors in the individual pipeline run. This means that the errors will not be carried over to subsequent runs. This is helpful when the error is a one-off and does not require an overall pipeline mapping change. If you select "Ignore Errors" on your pipeline, the Errors Count button on the Pipeline Details page will be reduced by the number of errors ignored.

Show Jobs

You can streamline your pipeline runs view to show only jobs with records. This is helpful for pipelines running on an automated schedule, where runs with zero records occur periodically.
