Projects

An Osmos Project is a collection of datasets, pipelines, uploaders, and connectors that are organized together to manage and automate data ingestion, preprocessing, transformation, and movement.

Last updated 10 months ago

Here are some key points about Osmos Projects:

  1. Collections: Projects act as containers for organizing your data resources. This includes datasets (tables), pipelines (ETL processes), uploaders (tools for uploading data files), and connectors (connections to source or destination systems).

  2. Multiple Projects: Customers can have more than one project. Each project can be tailored to different use cases, teams, or departments within an organization.

  3. Datasets: Within a project, datasets are managed databases that store customer data. Each dataset can contain multiple tables and serve various purposes, such as a system of record, a landing zone for incoming data, or a source of clean data to be sent to other systems. Please refer to the Datasets documentation for more information.

  4. Pipelines: Pipelines in a project automate the ETL (Extract, Transform, Load) process. They define mappings between source systems and destination systems, including any transformations or data cleaning that needs to be done. Please refer to the Pipelines documentation for more information.

  5. Uploaders: Projects include uploaders that allow users to upload files (CSV, Excel, etc.) into the system. Uploaders can be used directly within the Osmos UI or embedded into third-party websites to enable self-service data uploads. Please refer to the Uploaders documentation for more information.

  6. Connectors: Connectors within a project facilitate integration with other systems. They can act as sources from which data is pulled or as destinations to which data is pushed. Connectors can be reused across multiple pipelines and uploaders within the project. Please refer to the Source connectors and Destination connectors documentation for more information.

  7. Management: Projects provide a way to manage and monitor all associated resources, enabling users to see how data flows through their pipelines, validate data integrity, and troubleshoot any issues that may arise.

  8. Creation and Support: Currently, new projects must be created by Osmos support upon request, but self-service project creation is a feature that is coming soon.
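The container relationship described above can be sketched as a simple data model. This is an illustrative sketch only, not the Osmos API (every class and field name below is hypothetical); it shows the one idea from points 1, 4, and 6 that matters in practice: a project groups resources, and a single connector can be shared by multiple pipelines within that project.

```python
# Hypothetical model of a project's resources (names are illustrative,
# not part of any Osmos SDK or API).
from dataclasses import dataclass, field

@dataclass
class Connector:
    name: str
    kind: str  # "source" or "destination"

@dataclass
class Pipeline:
    name: str
    source: Connector
    destination: Connector

@dataclass
class Project:
    name: str
    datasets: list = field(default_factory=list)
    pipelines: list = field(default_factory=list)
    uploaders: list = field(default_factory=list)
    connectors: list = field(default_factory=list)

# One destination connector reused by two pipelines in the same project.
warehouse = Connector("snowflake-prod", "destination")
s3 = Connector("s3-landing", "source")
sheets = Connector("orders-sheet", "source")

project = Project(
    name="Customer Onboarding",
    connectors=[warehouse, s3, sheets],
    pipelines=[
        Pipeline("s3-to-warehouse", s3, warehouse),
        Pipeline("sheets-to-warehouse", sheets, warehouse),
    ],
)

# Both pipelines reference the same connector object, so its
# configuration lives in one place within the project.
assert all(p.destination is warehouse for p in project.pipelines)
```

Because the connector is a shared resource rather than a per-pipeline copy, changing its configuration once affects every pipeline and uploader in the project that uses it.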

By organizing resources within projects, Osmos ensures that users can effectively manage complex data ingestion workflows, maintain data integrity, and meet their organization's data processing needs.

If you have any more specific questions about projects or need help setting up your projects in Osmos, please contact support@osmos.io.
