Introduction to DaQAPO

Niels Martin1, Greg Van Houdt2, and Gert Janssenswillen3

2020-04-07

Introduction

Process mining techniques generate valuable insights into business processes using automatically recorded process execution data. However, despite the extensive opportunities that process mining techniques provide, the garbage in - garbage out principle still applies. Data quality issues are widespread in real-life data and can lead to misleading results when used for analysis purposes. Currently, there is no systematic way to perform data quality assessment on process-oriented data. To fill this gap, we introduce DaQAPO - Data Quality Assessment for Process-Oriented data. It provides a set of assessment functions to identify a wide array of quality issues.

We identify two stages in the data quality assessment process:

  1. Reading and preparing data;
  2. Assessing the data quality - running quality tests.

If desired, the anomalies detected by these quality tests can also be removed from the log.

Data Sources

Before we can perform the first stage - reading data - we must have access to the appropriate data sources and know the expected data structure. Our package supports two input data formats:

  • an activity log, in which each row represents a single activity instance together with, among others, its start and complete timestamps;
  • an event log, in which each row represents a single event, i.e. only a part (one lifecycle transition) of an activity instance.

Two example datasets are included in daqapo. These are hospital and hospital_events. Below, you can find their respective structures.

str(hospital)
#> Classes 'tbl_df', 'tbl' and 'data.frame':    53 obs. of  7 variables:
#>  $ patient_visit_nr: num  510 512 510 512 512 510 517 518 518 518 ...
#>  $ activity        : chr  "registration" "Registration" "Triage" "Triage" ...
#>  $ originator      : chr  "Clerk 9" "Clerk 12" "Nurse 27" "Nurse 27" ...
#>  $ start_ts        : chr  "20/11/2017 10:18:17" "20/11/2017 10:33:14" "20/11/2017 10:34:08" "20/11/2017 10:44:12" ...
#>  $ complete_ts     : chr  "20/11/2017 10:20:06" "20/11/2017 10:37:00" "20/11/2017 10:41:48" "20/11/2017 10:50:17" ...
#>  $ triagecode      : num  3 3 3 3 3 NA 3 4 4 4 ...
#>  $ specialization  : chr  "TRAU" "URG" "TRAU" "URG" ...
str(hospital_events)
#> Classes 'tbl_df', 'tbl' and 'data.frame':    106 obs. of  8 variables:
#>  $ patient_visit_nr     : num  510 510 510 510 510 510 512 512 512 512 ...
#>  $ activity             : chr  "registration" "registration" "Triage" "Triage" ...
#>  $ originator           : chr  "Clerk 9" "Clerk 9" "Nurse 27" "Nurse 27" ...
#>  $ event_lifecycle_state: chr  "start" "complete" "start" "complete" ...
#>  $ timestamp            : chr  "20/11/2017 10:18:17" "20/11/2017 10:20:06" "20/11/2017 10:34:08" "20/11/2017 10:41:48" ...
#>  $ triagecode           : num  3 3 3 3 NA NA 3 3 3 3 ...
#>  $ specialization       : chr  "TRAU" "TRAU" "TRAU" "TRAU" ...
#>  $ event_matching       : num  1 1 1 1 1 1 1 1 1 1 ...

Both datasets were artificially created merely to illustrate the package’s functionalities.

Stage 1 - Read in data

First of all, data must be read and prepared such that the quality assessment tests can be executed. Data preparation requires transforming the dataset to a standardised activity log format. However, earlier we mentioned two input data formats: an activity log and an event log. When an event log is available, it needs to be converted to an activity log. daqapo provides a set of functions, with the aid of bupaR, to assist the user in this process.

Preparing an Activity Log

As mentioned earlier, the goal of reading and preparing data is to obtain a standardised activity log format. When your source data is already in this format, preparation comes down to the following elements:

  • renaming the timestamp columns to standardised lifecycle names;
  • converting the timestamps to the POSIXct format;
  • creating the activitylog object with its column mapping.

For this section, the dataset hospital will be used to illustrate the data preparations. Three main functions, one for each of the elements above, help the user to prepare his/her own dataset; they are illustrated in the subsections below.

Rename

The activity log object adds a mapping to the data frame to link each column with its specific meaning. In this regard, the timestamp columns each represent a different lifecycle state. daqapo must know which column is which, requiring standardised timestamp names. The accepted timestamp values are:

  • schedule
  • assign
  • reassign
  • start
  • suspend
  • resume
  • abort_activity
  • abort_case
  • complete
  • manualskip
  • autoskip

The two timestamps required by daqapo are start and complete.
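
For the hospital data, this means renaming start_ts and complete_ts to start and complete. A minimal sketch using dplyr (column names taken from the str() output above):

library(dplyr)

hospital <- hospital %>%
  rename(start = start_ts,        # maps to the "start" lifecycle state
         complete = complete_ts)  # maps to the "complete" lifecycle state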

Convert timestamp format

Each timestamp must also be in the POSIXct format.
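
Given the day/month/year pattern visible in the data, a sketch using convert_timestamps() from bupaR together with lubridate's dmy_hms parser (if that helper is not available, calling dmy_hms() inside mutate() achieves the same result):

library(bupaR)
library(lubridate)

hospital <- hospital %>%
  convert_timestamps(columns = c("start", "complete"), format = dmy_hms)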

Create activitylog

When the timestamps are edited to the desired format, the activity log object can be created along with the required mapping.
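
A sketch using the activitylog() constructor from bupaR; the argument names follow the bupaR documentation as recalled, and the object name hospital_activitylog is simply the convention used in the remaining sketches of this vignette:

library(bupaR)

hospital_activitylog <- hospital %>%
  activitylog(case_id = "patient_visit_nr",
              activity_id = "activity",
              resource_id = "originator",
              timestamps = c("start", "complete"))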

Preparing an Event Log

With event logs, things are a bit more complex. In an event log, each row represents only a part of an activity instance. Therefore, more complex data transformations must be executed and several problems could arise. In this section, we will use an event log variant of the activity log used earlier, named hospital_events.

The same principle regarding the timestamps applies: they must be converted to the POSIXct format in advance. Additionally, the event log object requires an activity instance id, which links the events belonging to the same activity instance. If needed, one can be created manually as illustrated below.
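
In hospital_events, the event_matching column already links the start and complete events of the same activity instance and can serve as the instance id. If no such column exists, a hedged sketch for constructing one (assuming each instance produces exactly one start event and events appear in chronological order within a case/activity combination) is:

library(dplyr)

hospital_events <- hospital_events %>%
  group_by(patient_visit_nr, activity) %>%
  mutate(activity_instance = paste(patient_visit_nr, activity,
                                   cumsum(event_lifecycle_state == "start"),
                                   sep = "_")) %>%  # hypothetical column name
  ungroup()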

daqapo and bupaR provide the building blocks for the required data preparation, but not all of them must be called in every situation to obtain a fully prepared event log. One possible workflow is sketched below.
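
As an illustration only, this sketch converts the timestamps, builds an eventlog object using event_matching as the activity instance id, and then converts it to an activity log with bupaR's to_activitylog(); the argument names are recalled from the bupaR documentation and should be verified:

library(daqapo)
library(bupaR)
library(lubridate)

hospital_events_activitylog <- hospital_events %>%
  convert_timestamps("timestamp", format = dmy_hms) %>%
  eventlog(case_id = "patient_visit_nr",
           activity_id = "activity",
           activity_instance_id = "event_matching",
           lifecycle_id = "event_lifecycle_state",
           timestamp = "timestamp",
           resource_id = "originator") %>%
  to_activitylog()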

Stage 2 - Data Quality Assessment

The overview below summarizes the different data quality assessment tests available in daqapo, after which each test is briefly demonstrated.

  • detect_activity_frequency_violations: detects activity frequency anomalies per case. Output: summary in console + activities in cases which are executed too many times.
  • detect_activity_order_violations: detects violations in the activity order. Output: summary in console + detected orders which violate the specified order.
  • detect_attribute_dependencies: detects violations of dependencies between attributes (i.e. condition(s) that should hold when (an)other condition(s) hold(s)). Output: summary in console + rows with dependency violations.
  • detect_case_id_sequence_gaps: detects gaps in the sequence of case identifiers. Output: summary in console + case IDs which would be expected to be present.
  • detect_conditional_activity_presence: detects violations of conditional activity presence (i.e. activity/activities that should be present when (a) particular condition(s) hold(s)). Output: summary in console + cases violating conditional activity presence.
  • detect_duration_outliers: detects duration outliers for a particular activity. Output: summary in console + rows with outliers.
  • detect_inactive_periods: detects inactive periods, i.e. periods of time in which no activity executions/arrivals are recorded. Output: summary in console + periods of inactivity.
  • detect_incomplete_cases: detects incomplete cases in terms of the activities that need to be recorded for a case. Output: summary in console + traces in which the mentioned activities are not present.
  • detect_incorrect_activity_names: returns the incorrect activity labels in the log. Output: summary in console + rows with incorrect activities.
  • detect_missing_values: detects missing values at different levels of aggregation. Output: summary in console + rows with NAs.
  • detect_multiregistration: detects the registration of a series of events in a short time period for the same case or by the same resource. Output: summary in console + rows with multiregistration at resource or case level.
  • detect_overlaps: checks whether a resource has performed two activities in parallel. Output: data frame containing the activities, the number of overlaps and the average overlap in minutes.
  • detect_related_activities: detects missing related activities, i.e. activities that should be registered because another activity is registered for a case. Output: summary in console + cases violating related activities.
  • detect_similar_labels: detects potential spelling mistakes. Output: table showing similarities for each label.
  • detect_time_anomalies: detects activity executions with negative or zero duration. Output: summary in console + rows with negative or zero durations.
  • detect_unique_values: lists all distinct combinations of the given log attributes. Output: summary in console + all unique combinations of values in the given columns.
  • detect_value_range_violations: detects violations of the range of acceptable values. Output: summary in console + rows with value range infringements.

Detect Activity Frequency Violations
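
The sketches in this section are illustrative rather than authoritative: they use hospital_activitylog, the object name chosen in the Stage 1 sketch above, and argument names recalled from the daqapo documentation, so both should be verified against the package help pages. Here, Registration and Triage are assumed to occur at most once per case:

library(daqapo)

hospital_activitylog %>%
  detect_activity_frequency_violations("Registration" = 1,
                                       "Triage" = 1)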

Detect Activity Order Violations
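
A sketch in which the expected order of the activities is passed via activity_order (an assumed argument name), limited here to the labels visible in the example data:

hospital_activitylog %>%
  detect_activity_order_violations(activity_order = c("Registration", "Triage"))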

Detect Attribute Dependencies
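
A sketch checking that every Registration is recorded by a clerk; passing the antecedent and consequent as unquoted conditions is an assumption based on the package documentation:

hospital_activitylog %>%
  detect_attribute_dependencies(antecedent = activity == "Registration",
                                consequent = startsWith(originator, "Clerk"))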

Detect Case ID Sequence Gaps
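
In its simplest form, this check should only need the log itself (a sketch):

hospital_activitylog %>%
  detect_case_id_sequence_gaps()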

Detect Conditional Activity Presence
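
A sketch stating that cases handled by the TRAU specialization should contain a Triage activity; condition and activities are assumed argument names:

hospital_activitylog %>%
  detect_conditional_activity_presence(condition = specialization == "TRAU",
                                       activities = "Triage")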

Detect Duration Outliers
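
A sketch flagging Triage executions whose duration deviates more than one standard deviation from the mean; the duration_within() helper and its bound_sd argument are recalled from the package documentation and should be double-checked:

hospital_activitylog %>%
  detect_duration_outliers(Triage = duration_within(bound_sd = 1))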

Detect Inactive Periods
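
A sketch using a threshold of 30 (assumed to be expressed in minutes) before a period without recorded activity is reported:

hospital_activitylog %>%
  detect_inactive_periods(threshold = 30)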

Detect Incomplete Cases
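
A sketch requiring every case to contain at least Registration and Triage:

hospital_activitylog %>%
  detect_incomplete_cases(activities = c("Registration", "Triage"))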

Detect Incorrect Activity Names
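
A sketch in which only the labels Registration and Triage are considered valid, so the lowercase registration entries visible in hospital would be flagged; allowed_activities is an assumed argument name:

hospital_activitylog %>%
  detect_incorrect_activity_names(allowed_activities = c("Registration", "Triage"))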

Detect Missing Values
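
Sketches at the overall and the column level; the level_of_aggregation and column arguments are assumptions based on the documentation:

hospital_activitylog %>%
  detect_missing_values(level_of_aggregation = "overall")

hospital_activitylog %>%
  detect_missing_values(level_of_aggregation = "column",
                        column = "triagecode")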

Detect Multiregistration
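
A sketch flagging events registered within 10 seconds of each other; threshold_in_seconds is an assumed argument name:

hospital_activitylog %>%
  detect_multiregistration(threshold_in_seconds = 10)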

Detect Overlaps
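
A sketch; no additional parameters should be needed in the basic case:

hospital_activitylog %>%
  detect_overlaps()

Detect Related Activities

A sketch expressing that whenever Triage is recorded for a case, Registration should be recorded as well; passing the related activities as antecedent and consequent is an assumption based on the package documentation:

hospital_activitylog %>%
  detect_related_activities(antecedent = "Triage",
                            consequent = "Registration")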

Detect Similar Labels
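
A sketch comparing activity labels with an edit distance of at most 3, which would surface pairs such as registration/Registration; column_labels and max_edit_distance are assumed argument names:

hospital_activitylog %>%
  detect_similar_labels(column_labels = "activity",
                        max_edit_distance = 3)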

Detect Time Anomalies
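
A sketch; by default the function should report both negative and zero durations:

hospital_activitylog %>%
  detect_time_anomalies()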

Detect Unique Values
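
A sketch listing all observed combinations of activity and originator; column_labels is an assumed argument name:

hospital_activitylog %>%
  detect_unique_values(column_labels = c("activity", "originator"))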

Detect Value Range Violations
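
A sketch restricting triagecode to values between 0 and 5; the domain_numeric() helper and its from/to arguments are recalled from the package documentation and should be verified:

hospital_activitylog %>%
  detect_value_range_violations(triagecode = domain_numeric(from = 0, to = 5))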


  1. Hasselt University, Research group Business Informatics | Research Foundation Flanders (FWO).

  2. Hasselt University, Research group Business Informatics.

  3. Hasselt University, Research group Business Informatics.