SFTP/FTP Client
The SFTP/FTP Client origin reads files from a server using the Secure File Transfer Protocol (SFTP) or the File Transfer Protocol (FTP).
When you configure the SFTP/FTP Client origin, you specify the URL where the files reside on the remote server. You can specify whether to process files in subdirectories, a file name pattern, and the first file to process.
If the server requires authentication, configure the origin to use login credentials. If using the SFTP protocol, you can also configure the origin for strict host checking.
You can configure the origin to download files to an archive directory if the origin encounters errors while reading the files.
When the pipeline stops, the SFTP/FTP Client origin notes where it stops reading. When the pipeline starts again, the origin continues processing from where it stopped by default. You can reset the origin to process all requested files.
The origin can generate events for an event stream. For more information about dataflow triggers and the event framework, see Dataflow Triggers Overview.
Read Order
The SFTP/FTP Client origin reads files in ascending order based on the last-modified timestamp. Note that commands that copy files while preserving their attributes, such as cp -p, retain the existing timestamp. Preserving the existing timestamp can be problematic in some cases, such as moving files across time zones.
When ordering based on timestamp, any files with the same timestamp are read in lexicographically ascending order based on the file names.
For example, the following files are read in the order listed. The last two files share a timestamp, so log-0055.json is read before log-2.json based on the file names:

| File Name | Last Modified Timestamp |
|---|---|
| log-1.json | APR 24 2016 14:03:35 |
| log-0054.json | APR 24 2016 14:05:03 |
| log-0055.json | APR 24 2016 14:45:11 |
| log-2.json | APR 24 2016 14:45:11 |
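The read order described above can be sketched in Python. This is a simplification for illustration, not the origin's implementation; the directory listing and pattern matching stand in for the origin's configured URL and file name pattern:

```python
import os

def read_order(directory, matches):
    """List files in the documented read order: ascending last-modified
    timestamp, with ties broken by lexicographically ascending file name."""
    names = [n for n in os.listdir(directory) if matches(n)]
    return sorted(
        names,
        key=lambda n: (os.path.getmtime(os.path.join(directory, n)), n),
    )
```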
First File for Processing
Configure a first file for processing when you want the SFTP/FTP Client origin to ignore one or more existing files in the directory.
When you define a first file to process, the origin starts processing with the specified file and continues processing files in the expected read order: files that match the file name pattern in ascending order based on the last-modified timestamp.
When you do not specify a first file, the origin processes the files in the directory that match the file name pattern, starting with the earliest file and continuing in ascending order.
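First-file selection can be sketched as follows, given file names already sorted into the read order. This is a simplified illustration of the documented behavior, not the origin's implementation:

```python
def files_to_process(ordered_names, first_file=None):
    """Return the files the origin would process, given names already in
    read order: everything from the configured first file onward, or all
    files when no first file is set."""
    if first_file is None or first_file not in ordered_names:
        return list(ordered_names)
    # Start with the specified file and continue in read order.
    return list(ordered_names[ordered_names.index(first_file):])
```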
Credentials
If the remote server requires authentication, configure the authentication method that the origin must use to log in to the remote server.
- None
- The SFTP or FTP server does not require authentication.
- Password
- The SFTP or FTP server requires authentication using a user name and password.
- Private key
- The SFTP server requires authentication using a private key file. Store the private key file in a local directory. For the SFTP protocol only.
If using the SFTP protocol, you can also configure the origin to use strict host checking. When enabled, the origin connects to the SFTP server only if the server is listed in the known hosts file stored in a local directory. The known hosts file contains the host keys for the approved SFTP servers.
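With strict host checking enabled, the SFTP server's host key must appear in the known hosts file. An entry in the common OpenSSH known_hosts format lists the host name, the key type, and the base64-encoded public key. The host name and key below are placeholders for illustration:

```
sftp.example.com ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAI...
```

You can typically generate entries for a server with the ssh-keyscan utility; verify the key fingerprint out of band before trusting it.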
Record Header Attributes
The SFTP/FTP Client origin creates record header attributes that include information about the originating file for the record. When the origin processes Avro data, it includes the Avro schema in an avroSchema record header attribute.
You can use the record:attribute or record:attributeOrDefault functions to access the information in the attributes. For more information about working with record header attributes, see Working with Header Attributes.
- avroSchema - When processing Avro data, provides the Avro schema.
- filename - Provides the name of the file where the record originated.
- file - Provides the file path and file name where the record originated.
- mtime - Provides the last-modified time for the file.
- remoteUri - Provides the resource URL used by the stage.
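For example, expressions like the following read these attributes in a downstream processor. The attribute names come from the list above; the default value passed to record:attributeOrDefault is an arbitrary choice for illustration:

```
${record:attribute('filename')}
${record:attributeOrDefault('mtime', '0')}
```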
Event Generation
The SFTP/FTP Client origin can generate events that you can use in an event stream. When you enable event generation, the origin generates event records each time the origin starts or completes reading a file. It can also generate events when it completes processing all available data and the configured batch wait time has elapsed.
You can use the events in any logical way. For example, use the events:
- With the Pipeline Finisher executor to stop the pipeline and transition the pipeline to a Finished state when the origin completes processing available data.
When you restart a pipeline stopped by the Pipeline Finisher executor, the origin continues processing from the last-saved offset unless you reset the origin.
For an example, see Case Study: Stop the Pipeline.
- With the Email executor to send a custom email
after receiving an event.
For an example, see Case Study: Sending Email.
- With a destination to store event information.
For an example, see Case Study: Event Storage.
For more information about dataflow triggers and the event framework, see Dataflow Triggers Overview.
Event Records
Event records generated by the SFTP/FTP Client origin include the following event-related record header attributes:

| Record Header Attribute | Description |
|---|---|
| sdc.event.type | Event type. Uses one of the following types: new-file, finished-file, or no-more-data. |
| sdc.event.version | An integer that indicates the version of the event record type. |
| sdc.event.creation_timestamp | Epoch timestamp when the stage created the event. |
The SFTP/FTP Client origin can generate the following types of event records:
- new-file
- The SFTP/FTP Client origin generates a new-file event record when it starts processing a new file.
- finished-file
- The SFTP/FTP Client origin generates a finished-file event record when it finishes processing a file.
- no-more-data
- The SFTP/FTP Client origin generates a no-more-data event record when the origin completes processing all available records and the number of seconds configured for Batch Wait Time elapses without any new files appearing to be processed.
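For example, to route only no-more-data events to a downstream stage such as the Pipeline Finisher executor, a Stream Selector condition on the event stream might test the sdc.event.type attribute described above. This is a sketch; adapt it to your pipeline:

```
${record:attribute('sdc.event.type') == 'no-more-data'}
```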
Data Formats
The SFTP/FTP Client origin processes data differently based on the data format. The origin processes the following types of data:
- Avro
- Generates a record for every Avro record. Includes a "precision" and "scale" field attribute for each Decimal field. For more information about field attributes, see Field Attributes.
- Delimited
- Generates a record for each delimited line. You can use the
following delimited format types:
- Default CSV - File that includes comma-separated values. Ignores empty lines in the file.
- RFC4180 CSV - Comma-separated file that strictly follows RFC4180 guidelines.
- MS Excel CSV - Microsoft Excel comma-separated file.
- MySQL CSV - MySQL comma-separated file.
- PostgreSQL CSV - PostgreSQL comma-separated file.
- PostgreSQL Text - PostgreSQL text file.
- Tab-Separated Values - File that includes tab-separated values.
- Custom - File that uses user-defined delimiter, escape, and quote characters.
- Excel
- Generates a record for every row in the file. Can process .xls or .xlsx files.
- JSON
- Generates a record for each JSON object. You can process JSON files that include multiple JSON objects or a single JSON array.
- Log
- Generates a record for every log line.
- Protobuf
- Generates a record for every protobuf message.
- SDC Record
- Generates a record for every record. Use to process records generated by a Data Collector pipeline using the SDC Record data format.
- Text
- Generates a record for each line of text or for each section of text based on a custom delimiter.
- Whole File
- Streams whole files from the origin system to the destination system. You can specify a transfer rate or use all available resources to perform the transfer.
- XML
- Generates records based on a user-defined delimiter element. Use an XML element directly under the root element or define a simplified XPath expression. If you do not define a delimiter element, the origin treats the XML file as a single record.
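The two JSON layouts that the origin accepts, a file containing multiple JSON objects or a file containing a single JSON array, can be illustrated with a small parser. This sketch shows how each layout yields one record per object; it is not the origin's implementation:

```python
import json

def parse_json_records(text):
    """Return one dict per record from either accepted JSON layout:
    a single JSON array, or a file of multiple concatenated objects."""
    text = text.strip()
    if text.startswith("["):
        # Single JSON array: each element becomes a record.
        return json.loads(text)
    # Multiple JSON objects: decode one object at a time.
    decoder = json.JSONDecoder()
    records, idx = [], 0
    while idx < len(text):
        obj, end = decoder.raw_decode(text, idx)
        records.append(obj)
        idx = end
        while idx < len(text) and text[idx].isspace():
            idx += 1
    return records
```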
Configuring an SFTP/FTP Client Origin
Configure an SFTP/FTP Client origin to read files from an SFTP or FTP server.