Pipeline Finisher Executor

When it receives an event, the Pipeline Finisher executor stops a pipeline and transitions it to a Finished state. This allows the pipeline to complete all expected processing before stopping.

Use the Pipeline Finisher as part of an event stream. You can use the Pipeline Finisher wherever stopping the pipeline fits your pipeline logic, such as stopping a pipeline upon receiving a no-more-data event from the JDBC Query Consumer origin.

For example, you might use the executor in a pipeline designed to migrate all existing data from Microsoft SQL Server to HDFS, and then use a separate pipeline to process incremental updates. Or, you might use the executor to perform traditional "batch" processing - processing the available data and then stopping, rather than waiting indefinitely for more data.

When you restart a pipeline that was stopped by the Pipeline Finisher, the restart behavior depends on the origin. For example, if the origin stores an offset, when you restart the pipeline the origin begins at the last-saved offset by default. That is, if you run JDBC Query Consumer in incremental mode, the pipeline continues where it left off when you restart the pipeline. However, if you configure the JDBC Query Consumer to perform a full query, when you restart the pipeline, the origin performs the full query again. For more information, see "Event Generation" in the origin documentation.
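As a sketch of the restart behavior described above, consider a hypothetical JDBC Query Consumer query in incremental mode. The table and column names below are placeholders; the origin substitutes the last-saved offset for the ${OFFSET} placeholder each time it runs the query, so a restarted pipeline resumes where it stopped:

```sql
-- Incremental mode: ${OFFSET} is replaced with the last-saved offset value,
-- so after a restart the origin continues from where it left off.
-- "invoice" and "id" are hypothetical table and offset column names.
SELECT * FROM invoice WHERE id > ${OFFSET} ORDER BY id
```

A full query, by contrast, omits the offset condition, so the origin reruns the entire query when you restart the pipeline.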

The Pipeline Finisher executor has no stage-specific properties, but you might use a precondition to limit the records that enter the stage. You might also set up notification to be informed when the Pipeline Finisher stops the pipeline.

Before using the Pipeline Finisher, review the recommended implementation information.

For a case study about using the Pipeline Finisher, see Case Study: Stop the Pipeline. For more information about dataflow triggers and the event framework, see Dataflow Triggers Overview.

Recommended Implementation

The Pipeline Finisher executor is designed to stop and transition a pipeline to a Finished state after processing available data in the origin system. For example, you might use the executor to stop the pipeline after the JDBC Query Consumer processes all available data specified in the query.

When an origin generates only the no-more-data event, you can simply connect the event output to the Pipeline Finisher executor. When an origin generates multiple event types, you need to ensure that the Pipeline Finisher stops the pipeline only after receiving the no-more-data event.

Here are some ways you can ensure the executor receives only the no-more-data event:
Configure a precondition for the Pipeline Finisher
In the executor, add a precondition to allow only a no-more-data event into the stage to trigger the executor. You can use the following expression:
${record:eventType() == 'no-more-data'}
Tip: Records dropped because of a precondition are handled based on the stage error handling configuration. So to avoid racking up error records, you might also configure the Pipeline Finisher executor to discard error records.
Use this method when pipeline logic allows you to discard other event types generated by the origin.
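Putting the precondition and the error-handling tip together, the executor's relevant settings might look like the following summary (a sketch, not exact UI labels):

```
Preconditions:    ${record:eventType() == 'no-more-data'}
On Record Error:  Discard
```

With this combination, events other than no-more-data are silently dropped instead of accumulating as error records, and only the no-more-data event triggers the executor.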
Add a Stream Selector before the Pipeline Finisher
You can add a Stream Selector between the origin and the executor to route only the no-more-data event to the Pipeline Finisher. Use this option when you want to pass other event types to a different branch for processing.
For example, say you're using JDBC Query Consumer origin, which generates no-more-data, query success, and query failure events. And say you want to store the query success and query failure events. You can use a Stream Selector with the following condition to route the no-more-data event to the Pipeline Finisher:
${record:eventType() == 'no-more-data'}
Then you can connect the default stream - which receives the query success and query failure events - to a destination for storage.

Related Event Generating Stages

Best practice is to use the Pipeline Finisher only with origins that generate no-more-data events.

The following origins generate no-more-data events:
  • Amazon S3 origin
  • Directory origin
  • Google Cloud Storage origin
  • Hadoop FS Standalone origin
  • JDBC Multitable Consumer origin
  • JDBC Query Consumer origin
  • MongoDB origin
  • Salesforce origin
  • SFTP/FTP Client origin
  • SQL Server CDC Client origin
  • SQL Server Change Tracking origin

Notification Options

Data Collector can notify you when the Pipeline Finisher stops a pipeline. You can use either of the following notification methods:
Pipeline state notification
You can configure the pipeline to send an email or webhook when the pipeline transitions to a specified state. Use this option to send a webhook or a simple email notification; the email notification that is sent cannot be customized.
To have the pipeline send notification when the Pipeline Finisher stops a pipeline, set the Notify Upon Pipeline State Changes property to Finished. For more information about pipeline state notifications, see Notifications.
Email executor
You can use an Email executor to send email notification. The Email executor allows you to configure the condition to use to send the email, email recipients, subject, and message. You can also use expressions in any property to include details from the event record in the email. Use this option to send a customized email upon receiving an event.
To send a custom email, route the same event that triggers the Pipeline Finisher to the Email executor. After the Email executor and all other stages in the pipeline complete their tasks, the Pipeline Finisher transitions the pipeline to a Finished state.
For more information about using the Email executor, see Email Executor.
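For illustration, a custom notification might use the no-more-data event type as the send condition and pull details from the event record into the message. This sketch assumes the event record carries a record-count field, as some no-more-data events do; check your origin's event record documentation for the actual field names:

```
Condition: ${record:eventType() == 'no-more-data'}
Subject:   Pipeline finished processing
Body:      Processed ${record:value('/record-count')} records before stopping.
```

Because the same event record is routed to both executors, the email can describe exactly the run that the Pipeline Finisher is about to stop.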

Configuring a Pipeline Finisher Executor

Configure a Pipeline Finisher executor to stop and transition the pipeline to a Finished state when the executor receives an event record.
  1. In the Properties panel, on the General tab, configure the following properties:
    • Name - Stage name.
    • Description - Optional description.
    • Stage Library - Library version that you want to use.
    • Required Fields - Fields that must include data for the record to be passed into the stage.
      Tip: You might include fields that the stage uses.
      Records that do not include all required fields are processed based on the error handling configured for the pipeline.
    • Preconditions - Conditions that must evaluate to TRUE to allow a record to enter the stage for processing. All other records are handled based on the On Record Error property.
      Click Add to create additional preconditions.
      Tip: To allow only the no-more-data event to pass to the executor, use the following condition:
      ${record:eventType() == 'no-more-data'}
    • On Record Error - Error record handling for the stage:
      • Discard - Discards the record.
      • Send to Error - Sends the record to the pipeline for error handling.
      • Stop Pipeline - Stops the pipeline.
      Tip: When using preconditions to limit the event type that enters the executor, you might set this property to Discard to avoid processing other event types.