Development Stages

You can use several development stages to help develop and test pipelines.
Note: Do not use development stages in production pipelines.
You can use the following stages when developing or testing pipelines:
Dev Data Generator origin
Generates records with the specified field names and field types. For the field type, you can select a data type, such as String or Integer, or a type of data, such as address information, phone numbers, or book titles.
You can use a Map or List-Map root field type.
The origin can generate events to test event handling functionality. To generate event records, select the Produce Events property.
When generating events, the origin uses the configured fields as the body of the event record and adds event record header attributes. You can also specify the event type with the Event Name property. For example, to create a no-more-data event, enter "no-more-data" for the event name. For more information about the event framework and event record header attributes, see Dataflow Triggers Overview.
The origin can also generate multiple threads for testing a multithreaded pipeline. To generate multiple threads, enter a number larger than 1 for the Number of Threads property. For more information about multithreaded pipelines, see Multithreaded Pipeline Overview.
The On Record Error property has no effect in this stage.
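As a sketch of the behavior described above, the following Python snippet simulates how the origin might build records from a field configuration and wrap one as an event record. The field configuration, function names, and header attribute values are illustrative stand-ins, not the stage's actual implementation:

```python
import random
import string

# Illustrative field configuration: (field name, field type) pairs.
FIELD_CONFIG = [("id", "integer"), ("name", "string")]

def generate_record(field_config):
    """Build one record from a (name, type) field configuration."""
    record = {}
    for name, ftype in field_config:
        if ftype == "integer":
            record[name] = random.randint(0, 1000)
        elif ftype == "string":
            record[name] = "".join(random.choices(string.ascii_lowercase, k=8))
    return record

def generate_event(field_config, event_name="no-more-data"):
    """Wrap a generated record as an event: the configured fields become
    the event body, and event header attributes are added. The attribute
    names below are assumptions for illustration."""
    return {
        "header": {"sdc.event.type": event_name, "sdc.event.version": "1"},
        "body": generate_record(field_config),
    }

event = generate_event(FIELD_CONFIG)
```

Setting a different `event_name`, such as the no-more-data example above, changes only the event type header, not the body fields.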
Dev Random Source origin
Generates records with the configured number of Long fields. You can define a delay between batches and a maximum number of records to generate.
The On Record Error property has no effect in this stage.
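A minimal sketch of the batch behavior described above, assuming illustrative parameter names for the batch delay and maximum record count (the stage's real property names differ):

```python
import random
import time

def random_long_batches(num_fields=3, batch_size=5, max_records=12, delay_s=0.0):
    """Yield batches of records, each record holding num_fields random
    64-bit 'Long' values, pausing delay_s between batches and stopping
    once max_records have been produced."""
    produced = 0
    while produced < max_records:
        size = min(batch_size, max_records - produced)
        batch = [
            {f"field{i}": random.getrandbits(63) for i in range(num_fields)}
            for _ in range(size)
        ]
        produced += size
        yield batch
        time.sleep(delay_s)

batches = list(random_long_batches())
```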
Dev Raw Data Source origin
Generates records based on user-supplied data. You can enter raw data, select its data format, and then configure any format-related options.
For example, you can enter a set of log data, select the log data format, and then define the log format and other log properties for the data.
In data preview, this stage displays the raw source data as well as the data generated by the stage.
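The parsing step described above can be sketched as follows; this hypothetical helper turns raw text into records for two sample formats, and is not the stage's actual parser:

```python
import json

# User-supplied raw data: two JSON objects, one per line.
RAW_DATA = '{"id": 1}\n{"id": 2}\n'

def parse_raw_data(raw, data_format="json"):
    """Parse user-supplied raw text into records per the selected format."""
    if data_format == "json":
        return [json.loads(line) for line in raw.splitlines() if line.strip()]
    if data_format == "text":
        return [{"text": line} for line in raw.splitlines()]
    raise ValueError(f"unsupported format: {data_format}")

records = parse_raw_data(RAW_DATA)
```

Selecting a different format for the same raw input yields differently structured records, which is what data preview lets you compare against the raw source data.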
Dev SDC RPC with Buffering origin
Receives records from an SDC RPC destination, temporarily buffering the records to disk before passing them to the next stage in the pipeline. Use as the origin in an SDC RPC destination pipeline.
Note: After a deliberate or unexpected stop of the pipeline, buffered records are lost.
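The buffering behavior can be illustrated with a small sketch. This class is an assumption-laden stand-in, not the stage's actual buffer: a temporary file holds records until they are drained, and the file is discarded when the process stops, which mirrors the loss of buffered records noted above:

```python
import json
import tempfile

class DiskBuffer:
    """Illustrative disk buffer: append incoming records to a temporary
    file, then drain them in arrival order. The temporary file is discarded
    when the process stops, so anything still buffered is lost."""

    def __init__(self):
        self._file = tempfile.TemporaryFile(mode="w+")

    def enqueue(self, record):
        self._file.write(json.dumps(record) + "\n")

    def drain(self):
        self._file.seek(0)
        records = [json.loads(line) for line in self._file]
        self._file.seek(0)
        self._file.truncate()
        return records

buf = DiskBuffer()
buf.enqueue({"id": 1})
buf.enqueue({"id": 2})
drained = buf.drain()
```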
Dev Snapshot Replaying origin
Reads records from a downloaded snapshot file. The origin can start reading from the first set of records in the snapshot file, or you can configure it to start reading from a specific stage in the snapshot file.
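The replay logic can be sketched as follows. The snapshot layout here, a mapping of stage name to captured records, and the stage names are hypothetical; the real snapshot file format differs:

```python
# Hypothetical snapshot contents: records captured after each stage.
SNAPSHOT = {
    "DevDataGenerator_01": [{"id": 1}],
    "ExpressionEvaluator_01": [{"id": 1, "doubled": 2}],
}

def replay(snapshot, start_stage=None):
    """Return the captured record sets, starting at start_stage if given,
    otherwise from the first stage recorded in the snapshot."""
    stages = list(snapshot)
    start = stages.index(start_stage) if start_stage else 0
    return [snapshot[s] for s in stages[start:]]

all_sets = replay(SNAPSHOT)
later = replay(SNAPSHOT, "ExpressionEvaluator_01")
```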
Sensor Reader origin
Generates records with one of the following types of data:
  • Atmospheric data such as that generated by BMxx80 atmospheric sensors. Records include the following fields: temperature_C, pressure_KPa, and humidity. For example:
    {"temperature_C": "12.34", "pressure_KPa": "567.89", "humidity": "534.44"}
  • CPU temperature data such as that generated by the BCM2835 onboard thermal sensor, as found on the Raspberry Pi family of single-board computers. Records contain a single field: temperature_C. For example:
    {"temperature_C": "123.45"}
For use in edge pipelines only.
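The two record shapes shown above can be sketched as generators; the value ranges below are illustrative, and the string formatting simply matches the quoted numeric values in the sample records:

```python
import random

def atmospheric_record():
    """Record shaped like the BMxx80-style sample above."""
    return {
        "temperature_C": f"{random.uniform(-10, 40):.2f}",
        "pressure_KPa": f"{random.uniform(90, 110):.2f}",
        "humidity": f"{random.uniform(0, 100):.2f}",
    }

def cpu_temperature_record():
    """Record shaped like the BCM2835-style sample: one temperature field."""
    return {"temperature_C": f"{random.uniform(30, 85):.2f}"}

atm = atmospheric_record()
cpu = cpu_temperature_record()
```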
Dev Identity processor
Passes all records through to the next stage. Use as a placeholder in the pipeline. You can define required fields and preconditions, and configure stage error handling.
Dev Random Error processor
Generates error records so you can test pipeline error handling. You can configure the stage to discard records, define required fields and preconditions, and configure stage error handling.
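A sketch of the error-injection idea, assuming a configurable error rate for illustration (the stage's actual selection logic is internal and the `error_rate` parameter is an assumption):

```python
import random

def process_with_random_errors(batch, error_rate=0.3, seed=42):
    """Split a batch into passed records and error records at a given
    rate, so downstream error handling can be exercised."""
    rng = random.Random(seed)  # seeded for reproducible test runs
    passed, errors = [], []
    for record in batch:
        if rng.random() < error_rate:
            errors.append({"record": record, "error": "random error for testing"})
        else:
            passed.append(record)
    return passed, errors

passed, errors = process_with_random_errors([{"id": i} for i in range(10)])
```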
Dev Record Creator processor
Generates two records for each record that enters the stage. You can define required fields and preconditions, and configure stage error handling.
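The duplication behavior is simple enough to sketch directly; deep copies are used here so the two output records stay independent, which is an implementation choice for this illustration rather than a documented detail of the stage:

```python
import copy

def duplicate_records(incoming):
    """Emit two records for each incoming record."""
    out = []
    for record in incoming:
        out.append(copy.deepcopy(record))
        out.append(copy.deepcopy(record))
    return out

result = duplicate_records([{"id": 1}, {"id": 2}])
```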
To Event destination
Generates events for testing event handling functionality. To generate events, select the Produce Events property.
The destination generates an event record for each incoming record. It uses the incoming record as the body of an event record and adds event record header attributes. Note that any record header attributes in the incoming record might be lost or replaced.
For more information about the event framework and event record header attributes, see Dataflow Triggers Overview.
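The event construction described above can be sketched as follows; the header attribute names and values are assumptions for illustration, and discarding the incoming headers mirrors the note that they might be lost or replaced:

```python
def to_event(incoming_record, incoming_headers=None):
    """Build an event record from an incoming record: the incoming record
    becomes the event body, and event header attributes replace any
    incoming record header attributes."""
    del incoming_headers  # discarded: event header attributes take their place
    return {
        "header": {"sdc.event.type": "record-event", "sdc.event.version": "1"},
        "body": incoming_record,
    }

event = to_event({"id": 7}, incoming_headers={"custom": "x"})
```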