
StreamSets Control Hub

Build, run, monitor, and manage smart data pipelines at scale

Managing Data for Digital Transformation

With digital transformation comes complexity: diverse sources and destinations, multiple platforms, and evolving business demands. StreamSets Control Hub, the central nervous system of the StreamSets Platform, gives you a real-time, end-to-end console to operate all your data pipelines across your enterprise.

Build and run streaming, CDC, batch, ETL and ML data pipelines

Collaborate with shared templates and components, and built-in version control

Gain global operational visibility and control everywhere pipelines run

Screenshot: Design, Deploy, Monitor, And Manage Smart Data Pipelines At Scale

Our customers run millions of data pipelines using StreamSets


Operationalize Your Data Pipeline Lifecycle

Monitor Data Pipelines In One Place

Mission Control for Hybrid and Multi-cloud

Scale to thousands of data pipelines with visibility and control. Control Hub centralizes building, running, monitoring, and management, so a team of any size can collaborate across all lifecycle stages, all design patterns, and all engines. Reuse pipeline assets to scale new data connections and pipeline patterns across your business without requiring specialized skills.

Watch: DataOps in Practice – Designing for Change

Smart Data Pipelines Built for Change

The most vulnerable points in a modern data architecture are the thousands of integrations that make modern analytics possible. Only StreamSets Control Hub is designed to provide transparency into that complexity through live data maps, powered by fully instrumented pipelines. See how your data is flowing in real time and automatically detect and handle data drift.

Watch: Modern Data Integration Using Data Collector
Screenshots: Monitor Data Pipelines At Scale; Build Data Pipelines For Data Drift

Single Design Experience for All Patterns

Deliver on the promise of DataOps in the cloud with a fully managed data integration platform for all your data pipeline needs. A single design canvas for building and running all pipeline patterns reduces the time to design and deploy smart data pipelines by 90%.

Watch: DataOps in Practice – Designing for Change

Frequently Asked Questions

What is Control Hub in StreamSets?

StreamSets Control Hub is a single hub for building, running, monitoring, and managing all your data pipelines and data processing jobs. As the center of the StreamSets Platform, Control Hub lets your entire extended team collaborate on data pipelines running on Data Collector and Transformer, and gives you a real-time, end-to-end view of every data pipeline across your enterprise. It also manages, monitors, and scales those StreamSets engines to optimize your overall data integration environment. Finally, Control Hub provides full transparency and control over all data pipelines and execution engines across your hybrid and multi-cloud architecture from a single place.

Is StreamSets Control Hub free?

StreamSets Control Hub is part of your paid subscription to the StreamSets data integration platform. It is not free on its own, but it is included in the overall platform subscription.

How do you monitor data pipelines?

Pipelines are monitored across two dashboards: Alerts and Topologies. The Alerts dashboard provides a summary of triggered alerts, jobs with errors, offline execution engines, and unhealthy engines that have exceeded their resource thresholds. This dashboard view is used to monitor and troubleshoot jobs. The Topologies Dashboard provides a summary of the number of pipelines, jobs, topologies, and execution engines that you have access to. The Topologies Dashboard was previously named the Default Dashboard.
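Beyond the dashboards, jobs can also be checked programmatically. The snippet below is a minimal sketch using the StreamSets SDK for Python to connect to Control Hub and list job statuses; the credential_id and token values are placeholders for API credentials you generate in Control Hub, and attribute names may vary slightly between SDK versions.

    from streamsets.sdk import ControlHub

    # Connect with API credentials generated in Control Hub
    # (placeholder values below -- substitute your own).
    sch = ControlHub(credential_id='<credential_id>', token='<auth_token>')

    # List every job this account can see, with its current status.
    # Attribute names follow the SDK's snake_case convention and may
    # differ by SDK version.
    for job in sch.jobs:
        print(job.job_name, job.status)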

Helpful Resources
Blog

The Difference Between Data Governance and Data Management

Every day, business is becoming more dependent on data. From reports stating that the world’s data volume will grow by…

Blog

Data Pipeline Architecture: Key Design Principles & Considerations for Data Engineers

Data pipelines are meant to transfer data from a source or legacy system to a target system. Easy right? Well…

Other

Metadata Management

The Basics of Metadata Management: Examples, Tools, and Best Practices. Why it's crucial for data engineers, scientists, and analysts.

Ready to Get Started?

We’re here to help you start building pipelines or see the platform in action.
