You can create an Amazon S3 Destination Connector to write CSV files (by default) or JSONL files to an S3 bucket or folder using an Access Key ID and Secret Access Key.
Supported file formats: CSV and JSONL
Prerequisites
Information needed:
Bucket Name
Region
Access Key ID
Secret Access Key
Creating an Amazon S3 Destination Connector
Step 1: Click New Connector.
Step 2: Under the System prompt, click Amazon S3.
Step 3: Provide a Connector Name.
Step 4: Select Destination Connector.
S3 Bucket Information
Step 1: Provide the name of the Amazon S3 bucket in the Bucket Name field.
Step 2: Provide the folder name with a trailing "/". Subfolders within the folder provided will be ignored.
If this field is left blank, files will be written to the root of the Amazon S3 bucket.
Step 3: Provide the Access Key ID and Secret Access Key for the S3 bucket.
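The folder field behaves like an S3 key prefix. As a rough sketch of how the object key for each output file is assembled (the `build_object_key` function name is illustrative, not part of the product):

```python
def build_object_key(folder: str, filename: str) -> str:
    """Build the S3 object key for an output file.

    An empty folder writes to the bucket root; otherwise the
    folder prefix must end with "/", as the connector expects.
    """
    if folder and not folder.endswith("/"):
        folder += "/"
    return folder + filename
```

For example, a folder of "exports/" and a file "run-1.csv" yields the key "exports/run-1.csv".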
Building the Schema for the Destination Connector
Use the schema designer to build the output schema for this Destination Connector.
Field Name: Provide a field name for each output field. These names will be used as the column headers or field names in the output file you are writing to.
Type: Define the type of each field. The field types will be used to enforce rules when you send data to this Connector.
Nullable: Check this box if the field is nullable. If the field is not nullable, you will be required to provide values for this field when sending data to this Connector.
Delete: Deletes the field.
Add Field: Adds another field to the schema.
Step 1: Click Add Field for each additional field required in the schema.
Step 2: Select Create Schema once you have built the schema.
The schema cannot be changed after saving the Destination Connector, so please confirm it is accurate before proceeding.
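The Type and Nullable settings drive validation when data is sent to the Connector. A minimal sketch of how such enforcement could work (the schema representation and `validate` function are illustrative assumptions, not the product's actual API):

```python
# Illustrative schema mirroring the designer's Field Name / Type /
# Nullable settings (an assumption, not the product's internal format).
SCHEMA = [
    {"name": "id", "type": int, "nullable": False},
    {"name": "email", "type": str, "nullable": True},
]

def validate(record: dict, schema: list) -> list:
    """Return a list of validation errors for one record."""
    errors = []
    for field in schema:
        value = record.get(field["name"])
        if value is None:
            # Missing or null values are only allowed for nullable fields.
            if not field["nullable"]:
                errors.append(f"{field['name']} is required")
        elif not isinstance(value, field["type"]):
            errors.append(f"{field['name']} must be {field['type'].__name__}")
    return errors
```

A record missing a non-nullable field, or carrying a value of the wrong type, would be rejected with an error naming the offending field.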
Advanced Options
Output File Format
By default, this Connector writes CSV files, and each Pipeline run produces a new file. If preferred, you can change the output format to JSONL instead of CSV.
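To illustrate the difference between the two output formats, here is a sketch of how the same records serialize to CSV versus JSONL using Python's standard library (the record data is invented for illustration):

```python
import csv
import io
import json

records = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]

def to_csv(records: list) -> str:
    """Serialize records as CSV: one header row, then one row per record."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=records[0].keys())
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def to_jsonl(records: list) -> str:
    """Serialize records as JSONL: one JSON object per line, no header."""
    return "\n".join(json.dumps(r) for r in records)
```

CSV carries the field names once in a header row, while JSONL repeats them in every line but preserves types such as numbers without a separate schema.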