You can create an SFTP Destination Connector to write CSV files (by default) or JSONL files to an SFTP folder using an SSH Private Key or your SFTP password.
Note: Only the SFTP Destination Connector is supported; FTP is not supported.
Supported file formats: CSV and JSONL
To set up the connector, you will need the following details:
- SFTP Host
- Port Number
Step 1: After selecting + New Connector, under the System prompt, click SFTP.
Step 2: Provide a Connector Name.
Step 3: Select Destination Connector.
Step 4: Select the SFTP protocol used by the server.
Step 5: Provide the SFTP host.
Step 6: Provide the port number.
Step 7: Provide your SFTP username.
Step 8: Provide the directory path for the SFTP folder you want to connect to. (Note: the path is case-sensitive.)
You have the option to authenticate with an SSH Private key or with a Username and Password.
Enter the SSH Private Key
If you choose to authenticate with an SSH Private Key, provide the SSH private key with the BEGIN prefix and END suffix, as shown below.
Some private keys also have a password. If yours does, provide it in the SSH Private Key Password field. If not, you can leave this field blank.
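For reference, an SSH private key in the OpenSSH format (one common format; yours may use a different header, such as RSA PRIVATE KEY) looks like this:

```
-----BEGIN OPENSSH PRIVATE KEY-----
...base64-encoded key contents...
-----END OPENSSH PRIVATE KEY-----
```

Paste the entire key, including the BEGIN and END lines.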
Enter the Password
If you select this option, provide your SFTP password.
You can design the output schema in two ways: import a schema file, or build the schema within Osmos.
Upload or drag & drop the schema file.
Import a file with the headers along with one row of sample data. This data is used only in schema creation.
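For illustration, a minimal schema import file could look like the following (the column names and sample values here are hypothetical):

```
name,email,order_total
Jane Doe,jane@example.com,42.50
```

The header row defines the field names; the single data row is used only to help infer the schema.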
Use the schema designer to build the output schema for this Destination Connector.
1. Click Add Field for each additional field required in the schema.
2. Select Create Schema once you have built the schema.
Output File Format
By default, this Destination Connector writes CSV files, and each Osmos Pipeline run produces a new file. If preferred, you can change the output format to JSONL instead of CSV.
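The difference between the two output formats can be sketched in Python (hypothetical records, not Osmos code):

```python
import csv
import io
import json

records = [
    {"name": "Jane", "qty": 2},
    {"name": "Sam", "qty": 5},
]

# CSV: one header row, then one comma-separated row per record.
csv_buf = io.StringIO()
writer = csv.DictWriter(csv_buf, fieldnames=["name", "qty"])
writer.writeheader()
writer.writerows(records)

# JSONL: one self-describing JSON object per line, no header row.
jsonl_text = "\n".join(json.dumps(r) for r in records)

print(csv_buf.getvalue())
print(jsonl_text)
```

JSONL is often easier for downstream systems to parse record-by-record, since each line is a complete JSON object.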
You can specify a file prefix to make the output of this Connector easier to manage. The contents of this field are written into the filename of the data this Connector writes. If a prefix is specified, a UUID is appended to it to prevent filename conflicts.
By default, there is no limit on the number of records written to a single destination file by a single job (i.e., a single run of a Pipeline or Uploader). If this box is checked, the data written to the destination is "chunked" into separate files, each containing at most the number of records designated here. These "chunked" files are suffixed with their position in the sequence, e.g., filename_part_1.csv, filename_part_2.csv, etc.
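The chunking behavior can be sketched as follows (an illustrative sketch, not the Osmos implementation; the function name and filename layout are assumptions):

```python
import uuid

def chunked_filenames(records, prefix, max_records, extension="csv"):
    """Split records into chunks of at most max_records and
    generate a _part_N-suffixed filename for each chunk."""
    run_id = uuid.uuid4()  # appended to avoid filename conflicts
    chunks = [records[i:i + max_records]
              for i in range(0, len(records), max_records)]
    return [(f"{prefix}{run_id}_part_{n}.{extension}", chunk)
            for n, chunk in enumerate(chunks, start=1)]

# 250 records with a limit of 100 yields three files: 100, 100, and 50 records.
files = chunked_filenames(list(range(250)), "my_osmos_output_", 100)
for name, chunk in files:
    print(name, len(chunk))
```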
Osmos supports Validation Webhooks to prevent bad data from being written to your systems, adding another layer of protection on top of the built-in validations that Osmos provides. Enter the Webhook URL here.
Enter the name of the destination column where you'd like to store the entire raw source record data. The raw source record data will be stored as a JSON string in the provided destination column.
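For example, if a source record had columns sku and qty, the value stored in the designated destination column would be a JSON string like the one below (a sketch with hypothetical data):

```python
import json

# Hypothetical source record from an upstream file.
source_record = {"sku": "A-100", "qty": "3"}

# The entire raw record is serialized to a JSON string and stored
# in the single destination column you name above.
raw_value = json.dumps(source_record)
print(raw_value)  # {"sku": "A-100", "qty": "3"}
```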
Osmos also supports magic string identifiers in file prefixes, which let you include additional information in your file prefix.
DateTime: You can include datetime values in your file output using strftime ("string from time") format specifiers. The time values created here correspond to Osmos's internal system time at the moment the job was started. See example 3 below.
- Example 1: No file prefix. Output: <user base path>/chunk-<chunk num>-<UUID>.<file extension>
- Example 2: File includes a description in the prefix. Sample prefix: my_osmos_output_ Output: <user base path>/my_osmos_output_chunk-<chunk num>-<UUID>.<file extension>
- Example 3: File includes datetime specifiers in the prefix. Sample prefix: %F_%T_ Output: <user base path>/<YYYY-MM-DD>_<HH:mm:ss>_chunk-<chunk num>-<UUID>.<file extension>
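The %F and %T specifiers in example 3 are standard strftime codes: %F is shorthand for %Y-%m-%d and %T for %H:%M:%S. You can preview how a prefix will expand using Python's datetime.strftime (the job start time below is hypothetical):

```python
from datetime import datetime

# Equivalent expansion of the sample prefix %F_%T_ ;
# the longhand codes are used for portability across platforms.
prefix = "%Y-%m-%d_%H:%M:%S_"

started = datetime(2024, 5, 1, 13, 45, 30)  # hypothetical job start time
print(started.strftime(prefix))  # 2024-05-01_13:45:30_
```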