Dataflow Performance Manager
Here are some of the new features and enhancements in 126.96.36.199, updated August 9, 2017. For a full list, see What's New.
- Auth Token Administrator role – A role that enables users to register, unregister, and deactivate Data Collectors with DPM, and to regenerate authentication tokens.
- Job history – History now includes all user actions on the job and the progress of all Data Collectors running pipelines for the job.
- Number of pipeline instances – The default number of pipeline instances for a job is now 1, which runs one pipeline instance on the available Data Collector that is running the fewest pipelines.
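The scheduling rule in the last bullet can be sketched as picking the least-loaded collector. This is only an illustration of the described behavior; the function and variable names are hypothetical and not part of the DPM API:

```python
# Illustrative sketch of the rule above: one pipeline instance is started
# on the available Data Collector currently running the fewest pipelines.
# All names here are hypothetical; this is not the DPM API.

def pick_collector(pipeline_counts):
    """Return the collector id with the fewest running pipelines.

    pipeline_counts maps a collector id to the number of pipelines
    it is currently running.
    """
    return min(pipeline_counts, key=pipeline_counts.get)

counts = {"sdc-1": 3, "sdc-2": 1, "sdc-3": 2}
print(pick_collector(counts))  # sdc-2 is running the fewest pipelines
```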
Here are some of the new features and enhancements in 188.8.131.52, released on August 18, 2017. For a full list, see What's New. For a list of bug fixes and known issues, see the Release Notes.
- Credential store API – A new API that integrates with Java keystore and Hashicorp Vault.
- Cloudera Navigator (beta) – Beta support for publishing metadata about running pipelines to Cloudera Navigator.
- Pipeline events – Events generated when a pipeline starts and stops, which you can use as dataflow triggers.
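The credential store idea is that pipeline configuration references a named secret instead of embedding it. A minimal in-memory sketch of that idea follows; the class and method names are hypothetical, and the real stores are backed by a Java keystore or Hashicorp Vault rather than a dict:

```python
# Minimal sketch of the credential-store concept: configuration holds a
# reference like ("devops", "db-password") and the secret is resolved at
# runtime. The names are hypothetical; real implementations integrate
# with a Java keystore or Hashicorp Vault.

class InMemoryCredentialStore:
    def __init__(self, secrets):
        # secrets maps (group, name) -> secret value
        self._secrets = secrets

    def get(self, group, name):
        return self._secrets[(group, name)]

store = InMemoryCredentialStore({("devops", "db-password"): "s3cr3t"})
print(store.get("devops", "db-password"))
```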
The following new origins:
- Google BigQuery – An origin that executes a query job and reads the result from Google BigQuery.
- Google Pub/Sub Subscriber – A multithreaded origin that consumes messages from a Google Pub/Sub subscription.
- OPC UA Client – An origin that processes data from an OPC UA server.
- SQL Server CDC Client – A multithreaded origin that reads data from Microsoft SQL Server CDC tables.
- SQL Server Change Tracking – A multithreaded origin that reads data from Microsoft SQL Server change tracking tables and generates the latest version of each record.
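The Change Tracking origin's behavior of generating the latest version of each record amounts to collapsing change rows by primary key and keeping the highest change version. A sketch of that collapse, with hypothetical field names:

```python
# Sketch of "generate the latest version of each record": given change
# rows carrying a primary key and a change version, keep only the row
# with the highest version per key. The "pk"/"version" field names are
# hypothetical, not the origin's actual record layout.

def latest_versions(rows):
    latest = {}
    for row in rows:
        key = row["pk"]
        if key not in latest or row["version"] > latest[key]["version"]:
            latest[key] = row
    return list(latest.values())

changes = [
    {"pk": 1, "version": 10, "name": "old"},
    {"pk": 1, "version": 12, "name": "new"},
    {"pk": 2, "version": 11, "name": "only"},
]
for rec in latest_versions(changes):
    print(rec["pk"], rec["name"])  # one record per key, latest version
```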
The following new processors:
- Data Parser – A processor that extracts NetFlow messages, syslog messages, and other supported data formats embedded in a field.
- JSON Generator – A processor that serializes data from a record field to a JSON-encoded string.
- Kudu Lookup – A processor that performs lookups in Kudu to enrich records with additional data.
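The JSON Generator's transformation can be illustrated with the standard library alone: serialize one record field to a JSON string and store it in another field. The field names below are hypothetical:

```python
import json

# Sketch of what the JSON Generator processor does: take a record field
# holding structured data and write its JSON-encoded string into a
# target field. Field names here are hypothetical.

def json_generator(record, source_field, target_field):
    record[target_field] = json.dumps(record[source_field])
    return record

record = {"payload": {"id": 7, "tags": ["a", "b"]}}
out = json_generator(record, "payload", "payload_json")
print(out["payload_json"])  # {"id": 7, "tags": ["a", "b"]}
```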
The following new executor:
- Amazon S3 – An executor that, each time it receives an event, creates a new Amazon S3 object with the specified content or adds tags to existing objects.
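The executor's event-driven behavior can be sketched with an in-memory stand-in for an S3 bucket. Everything below is hypothetical: a dict replaces the bucket, and no AWS API is used:

```python
# Sketch of the event-driven behavior described above: each incoming
# event either creates a new object with the given content or adds tags
# to an existing object. A dict stands in for the S3 bucket; nothing
# here calls the real AWS API.

class FakeBucket:
    def __init__(self):
        self.objects = {}  # key -> {"content": ..., "tags": {...}}

    def handle_event(self, event):
        key = event["key"]
        if event["action"] == "create":
            self.objects[key] = {"content": event["content"], "tags": {}}
        elif event["action"] == "tag":
            self.objects[key]["tags"].update(event["tags"])

bucket = FakeBucket()
bucket.handle_event({"action": "create", "key": "out/1.txt", "content": "done"})
bucket.handle_event({"action": "tag", "key": "out/1.txt", "tags": {"stage": "ok"}})
print(bucket.objects["out/1.txt"]["tags"])  # {'stage': 'ok'}
```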