Create a Data Quality Job

Data Quality Jobs regularly monitor specific data to identify issues before they affect your business. A Data Quality Job includes a scope query, which selects data from one or more tables, and settings that define which monitors to apply and how and when to run the job. Data Quality Jobs allow you to:

  • Conduct immediate and scheduled data quality checks on your data.
  • Create data profiles.
  • Apply automatic monitoring to track the evolution of your data.
  • Configure automated notifications to send to users when certain conditions are met.
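
To make the scope query idea concrete, the following is a minimal sketch of what such a query conceptually selects, assuming a hypothetical sales.orders table; the query behind an actual Data Quality Job is generated by Collibra and may look different.

```sql
-- Illustrative only: a scope query conceptually selects the rows and columns
-- that a Data Quality Job monitors. The table and columns are hypothetical.
SELECT order_id,
       order_date,
       customer_id,
       order_total
FROM   sales.orders;
```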

Important A Data Quality Job is not an asset; however, it is presented in Collibra in a way that closely resembles an asset.

Example When you expand a table from a schema in Monitoring Overview, you can click + Data Quality Job to create a Data Quality Job on the table. In the dialog that appears, a stepper guides you through the 5-step creation process.

[Screenshot: the Data Quality Job stepper]

  • Step 1: Select the data to monitor (Required)
    Select a table from Monitoring Overview, then create a Data Quality Job on it.
  • Step 2: Add monitors (Required)
    Select the data quality monitors to apply to your job.
  • Step 3: Set a run schedule (Optional)
    Schedule a Data Quality Job to run automatically at a specified frequency.
  • Step 4: Add notifications (Optional)
    Configure notifications to send to job owners when a job:
      • Fails
      • Completes a run
      • Contains a total number of rows equal to or less than a given value
      • Runs without data for a given number of runs
      • Runs without data for a given number of days
      • Results in a score equal to or lower than a given value
      • Has a run time that exceeds a given value
  • Step 5: Review and run (Required)
    Review the SQL query or JSON payload, then save and run your Data Quality Job.

Prerequisites

  • The DQ Pushdown Edge capability is enabled for your data source.
  • You have a global role with the Product Rights > Data Quality global permission or the following permissions based on the actions you want to perform:
    • To create quick monitoring jobs, you must have the Data Quality Manager resource role with the Data Quality Job > Create resource permission.
    • To schedule jobs, you must have the Data Quality Editor or Data Quality Manager resource role with the Data Quality Job > Schedule resource permission.
    • To run quick monitoring jobs, you must have the Data Quality Editor or Data Quality Manager resource role with the Data Quality Job > Run resource permission.
  • If you want to use notifications, ensure that the users who need to receive notifications have a Collibra user account.

Steps

0 Open the Data Quality Job stepper

  1. From Monitoring Overview, expand your data source, then expand a schema within it.
    The available tables within the schema expand.
  2. Click the table where you want to run a Data Quality Job.
    The table opens on the right side of the page.
  3. Click + Data Quality Job in the upper-right corner of the page.
    The Create Data Quality Job stepper opens to the Data step.

    Note If a table does not yet have monitoring, click + Data Quality Job in the center of the page, to the right of the data explorer.

1 Select the data to include in the scan

  1. Optionally, give your Data Quality Job a unique name in the Job Name field. The Job Name cannot contain special characters other than hyphens (-), periods (.), and underscores (_).

    Tip The Job Name field uses a default name, following a schemaName.tableName_number format. A number at the end of the name, such as schemaTest.tableExample_2, indicates that another Data Quality Job called schemaTest.tableExample already exists.

  2. Optionally, click Select columns to include only specific columns from the selected table in your Data Quality Job.

    Tip All columns in your table are included in the Data Quality Job by default. If you want to run a Data Quality Job on the entire table, you do not need to modify the default column selection.

  3. Optionally, select Sample, then enter a value in the Sample rows field to include that number of rows in the sample. A sketch of how column selection and sampling conceptually map to SQL follows these steps.
  4. Click Next.
    The Monitors step opens.
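
As a rough illustration, selecting specific columns and a row sample conceptually narrows the job's scope query in the way sketched below. The sales.orders table, its columns, and the 1,000-row sample are hypothetical, and the query Collibra builds for your job may differ.

```sql
-- Illustrative only: scan two selected columns and sample 1,000 rows.
-- Table and column names are hypothetical; the row-limiting syntax
-- (LIMIT, TOP, FETCH FIRST) depends on your database.
SELECT order_id,
       order_total
FROM   sales.orders
LIMIT  1000;
```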

2 Add monitors

After selecting the data to include in the scan, you need to add monitors to your Data Quality Job. All Data Quality Jobs include data type and schema change checks.

Tip To run basic profiling on your data, click Next to accept the default monitors and proceed to the Schedule step. A Data Quality Job with basic profiling monitors the following changes in your data:
  • Data type
  • Schema evolution
  • Row count
  • Uniqueness
  • Null values
  • Empty fields
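
For orientation only, the default profiling monitors correspond roughly to the kind of aggregate checks in the sketch below. The sales.orders table and its columns are hypothetical, and Collibra computes these metrics for you; you do not write this query yourself.

```sql
-- Illustrative only: the kind of aggregates that the default profiling monitors track.
-- The sales.orders table and its columns are hypothetical.
SELECT COUNT(*)                                   AS row_count,       -- Row count
       COUNT(DISTINCT customer_id)                AS distinct_values, -- Uniqueness
       COUNT(*) - COUNT(order_total)              AS null_values,     -- Null values
       COUNT(CASE WHEN ship_city = '' THEN 1 END) AS empty_values     -- Empty fields
FROM   sales.orders;
```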
  1. Optionally, select Allow descriptive statistics in profiling to show the results of checks for min, median, max, Q1, and Q3 values from numeric columns in the UI. A sketch of the kind of statistics these checks compute follows these steps.

    Warning Allowing descriptive statistics in profiling may expose sensitive values if they are present in the columns you include in the scan.

  2. Select the monitors to include in or exclude from your Data Quality Job.

    • Row count: Tracks changes to the total number of rows. (Default: on)
    • Uniqueness: Tracks changes in the number of distinct values across all columns. (Default: on)
    • Null values: Detects changes in the number of NULL values in all columns. (Default: on)
    • Empty fields: Detects changes in the number of empty values in numeric columns. (Default: on)
    • Min value: Detects changes in the lowest value in numeric columns. (Default: off)
    • Mean value: Detects changes in the average value in numeric columns. (Default: off)
    • Max value: Detects changes in the highest value in numeric columns. (Default: off)
    • Execution time: Tracks changes in the execution time of the data job. (Default: off)

    Note If you do not change the default options, the row count, uniqueness, null values, and empty fields monitors are included in your Data Quality Job.

  3. Optionally, click Advanced, then drag the sliders or enter a value in the fields to configure the adaptive monitoring settings.

    • Data Lookback: The number of runs used to train the adaptive monitoring for expected values. (Default: 10 runs)
    • Learning Phase: The minimum number of runs needed to train the adaptive monitoring before it can begin to indicate whether the adaptive rules are breaking or passing. (Default: 4 runs)

  4. Click Next.
    The Schedule step opens.
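
If you allow descriptive statistics in profiling, the values shown in the UI correspond roughly to the kind of per-column statistics sketched below. This is a sketch only, assuming a hypothetical numeric column order_total; Collibra computes these statistics for you.

```sql
-- Illustrative only: min, Q1, median, Q3, and max for a hypothetical numeric column.
-- PERCENTILE_CONT is shown as in PostgreSQL or Snowflake; syntax varies by database.
SELECT MIN(order_total)                                          AS min_value,
       PERCENTILE_CONT(0.25) WITHIN GROUP (ORDER BY order_total) AS q1,
       PERCENTILE_CONT(0.5)  WITHIN GROUP (ORDER BY order_total) AS median,
       PERCENTILE_CONT(0.75) WITHIN GROUP (ORDER BY order_total) AS q3,
       MAX(order_total)                                          AS max_value
FROM   sales.orders;
```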

3 Set a run schedule

  1. Select the automated run frequency of the Data Quality Job from the Repeat drop-down list. The available run start options differ depending on the Repeat option you select.

    • Never: Your Data Quality Job doesn't run automatically on a schedule, and no additional Repeat configuration is required.
      Note When you select Never, the job runs only once, when you initiate the job run.
      Run start options: None
    • Hourly: Your Data Quality Job runs every hour, at the minute you set in the Run start minute option.
      Run start minute: Enter a value between 00 and 59 in mm format, or click the clock icon and select a time.
    • Daily: Your Data Quality Job runs every day, at the time you set in the Run start time option.
      Run start time: Enter a value between 01:00 and 12:59 in hh:mm format, or click the clock icon and select a time.
      Note Because run start time uses a 12-hour clock, you need to select AM or PM from the clock icon option to set the scheduled time to morning or afternoon.
    • Weekly on: Your Data Quality Job runs every week, on the day(s) of the week and time you set in the Run days and Run start time options.
      Run days: Select one or more days on which you want your Data Quality Job to run automatically at the designated run start time.
      Run start time: Enter a value between 01:00 and 12:59 in hh:mm format, or click the clock icon and select a time.
      Note Because run start time uses a 12-hour clock, you need to select AM or PM from the clock icon option to set the scheduled time to morning or afternoon.
    • Weekdays: Your Data Quality Job runs every day except Saturday and Sunday, at the time you set in the Run start time option.
      Run start time: Enter a value between 01:00 and 12:59 in hh:mm format, or click the clock icon and select a time.
      Note Because run start time uses a 12-hour clock, you need to select AM or PM from the clock icon option to set the scheduled time to morning or afternoon.
    • Monthly: Your Data Quality Job runs every month, on either the first or last occurrence of a selected day of the week, or on a fixed calendar day, at the time you set in the Run start time option.
      Run day: Select a day of the week on which you want your Data Quality Job to run automatically at the designated run start time, and use the drop-down list to run the job on the first or last occurrence of that day in the month. Alternatively, select a specific date from the Day number drop-down list to run your Data Quality Job every month on a fixed date.
      Run start time: Enter a value between 01:00 and 12:59 in hh:mm format, or click the clock icon and select a time.
      Note Because run start time uses a 12-hour clock, you need to select AM or PM from the clock icon option to set the scheduled time to morning or afternoon.

  2. Optionally, configure the compute resourcing.
  3. Optionally, include a time slice to run the Data Quality Job on a moving subset of time, or enable back runs to start Data Quality Job runs on historical data. A sketch of how a time slice conceptually maps to SQL follows these steps.
  4. Click Next.
    The Notifications step opens.
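
To make the time slice idea concrete, a job scoped to a moving subset of time behaves conceptually like the filtered query below. The sales.orders table and order_date column are hypothetical, and Collibra applies the time slice for you; you do not write this filter yourself.

```sql
-- Illustrative only: a time slice restricts each run to a moving window of data,
-- here the last 7 days. Names are hypothetical; date syntax varies by database.
SELECT *
FROM   sales.orders
WHERE  order_date >= CURRENT_DATE - INTERVAL '7' DAY;
```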

4 Add notifications

  1. Select the notification options. Each option sends a notification to the specified recipients when the corresponding condition is met.

    • Job failed: Your Data Quality Job does not complete a run. (Default: on)
    • Job completed: Your Data Quality Job completes a run. (Default: off)
    • Rows <=: The number of rows is less than or equal to the value you set. Enter a value greater than 0. (Default: on)
    • Runs without data >=: The number of runs in which your Data Quality Job contains no data is greater than or equal to the value you set. Enter a value greater than 0. (Default: off)
    • Days without data >=: The number of days without data is greater than or equal to the value you set. Enter a value greater than 0. (Default: off)
    • Score <=: The data quality score is lower than or equal to the value you set. Enter a value between 0 and 100. (Default: on)
    • Run time minutes >: The run time of your Data Quality Job exceeds the value you set, in minutes. Enter a value greater than 0. (Default: on)

  2. Optionally, in the Global custom message input field, enter a custom message to include with each notification.
  3. Optionally, switch on Individual custom messages in the upper-right corner, and enter a unique message for each applicable notification option.

    Note When individual custom messages are enabled, they override the global custom message.

  4. Click the Recipients drop-down list and select a user to add them to the list of recipients, or click the close icon to the right of a recipient's username to remove them.

    Note The drop-down list of users includes all active users in Collibra.

  5. Click Next.
    The Review step opens.

5 Review and run

The Review step provides a summary of the settings of your Data Quality Job.

  1. Review the SQL query or JSON payload.

    Note The code under the Query and JSON tabs is a read-only preview of your job query.

  2. Click Save and run now.
    The job is submitted to the jobs queue and appears in the Job column of its table of origin on the Monitoring Overview page.
    You can open the Job Details page via the Monitoring Overview page. While the job runs, you see a "Running" status on the Job Details page and the Autocomplete icon in the Score and Last Run columns of the new job on the Monitoring Overview page.

    Note If information is incorrect or missing, such as an incorrect data type in a date column, clicking Save and run now returns you to the step with the issue. You must resolve the issue before you can run the job.

What's next?

You can now review the details of your Data Quality Job on the Job Details page.