Event Types¶
Event types describe common categories of events that can happen in data operations. For technical details on each event type, see the Event Ingestion API documentation.
The system currently supports the following event types:
| Event type | Accepted components | Description |
|---|---|---|
| Run Status | Batch pipeline (`pipeline_key`) | Describes a change in status (running, completed, completed with warnings, failed) for a specified batch pipeline run. Optionally accepts a `task_key` that describes the change in status for a task within a specific run. |
| Message Log | Batch pipeline (`pipeline_key`), Dataset (`dataset_key`), Server (`server_key`), Streaming pipeline (`stream_key`) | Logs a string message related to the component, and optionally to a specific task. Can be used for capturing failure, warning, or debugging messages from external tools and scripts. |
| Metric Log | Batch pipeline (`pipeline_key`), Dataset (`dataset_key`), Server (`server_key`), Streaming pipeline (`stream_key`) | Logs the value of a user-defined datum of interest, such as a row count or an RMSE. Can be used for tracking the value of a metric through a run or for comparing the value of a metric across multiple runs. |
| Test Outcomes | Batch pipeline (`pipeline_key`), Dataset (`dataset_key`), Server (`server_key`), Streaming pipeline (`stream_key`) | Describes the results of a test or a set of tests executed on an external testing tool, such as DataKitchen's DataOps Data Quality TestGen. Accepts a list of objects, where each object represents the outcome of a test. |
| Dataset Operation | Dataset (`dataset_key`) | Reports a read or write operation on a specified dataset component. |
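To make the table concrete, here is a minimal sketch of building and sending a Run Status event. The base URL, endpoint path, authentication header, and field names below are illustrative assumptions, not the documented contract; only `pipeline_key` and the optional `task_key` come from the table above. Consult the Event Ingestion API documentation for the exact payload schema.

```python
import json
import urllib.request

BASE_URL = "https://api.example.com/events/v1"  # hypothetical base URL
API_KEY = "YOUR_SERVICE_ACCOUNT_KEY"            # hypothetical auth token


def build_run_status_event(pipeline_key, status, task_key=None):
    """Build a Run Status event for a batch pipeline run.

    `task_key` is optional: when present, the status change applies to a
    task within the run rather than to the run as a whole.
    """
    event = {"pipeline_key": pipeline_key, "status": status}
    if task_key is not None:
        event["task_key"] = task_key
    return event


def post_event(endpoint, event):
    """POST an event payload as JSON (sketch; no retries or error handling)."""
    req = urllib.request.Request(
        f"{BASE_URL}/{endpoint}",
        data=json.dumps(event).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",  # assumed auth scheme
        },
        method="POST",
    )
    return urllib.request.urlopen(req)  # raises on HTTP error responses


# A run-level status change, then a task-level one for the same run
run_event = build_run_status_event("nightly-etl", "RUNNING")
task_event = build_run_status_event("nightly-etl", "FAILED", task_key="load")
```

A Dataset Operation event would follow the same pattern with a `dataset_key` instead of a `pipeline_key`, plus a field indicating whether the operation was a read or a write.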
Tip
Set rules based on each event type so you can stay up to date on important situations in your data estate.
View event types¶
You can see what types of events are occurring in your data estate from the Events page, instance details, and run details.