Review monitor results

From the Monitors tab in the Job Details page, you can:

  • Review monitor results in detail.
  • Find details on data quality issues.
  • Adjust tolerance levels, or suppress rules that are outdated or irrelevant.

Configuring tolerance levels

Tolerance levels allow you to control how rigidly the adaptive rules of monitors are enforced. By default, the tolerance level for all monitors is set to Neutral, but you can choose to enforce them more strictly or more leniently.

Example: In the screenshot below, the details of the column "column2-Row_count" are expanded to show how its row count has evolved over time. With this information, you can decide to leave this adaptive rule in its current state, or click Advanced and change the tolerance level to instruct the behavioral model to enforce it differently.

screenshot of monitor details drawer

Prerequisites

You need a global role with the Product Rights > Data Quality global permission, or the Data Quality Editor or Data Quality Manager resource role with the Data Quality Job > Train Monitors resource permission.

Steps

  1. Open the Job Details page of a Data Quality Job.
  2. Click Monitors.
  3. In the monitor details table, locate the adaptive rule monitor whose tolerance level you want to configure.
  4. On the right side of the monitor row, click the More icon, then click Open.
     The Details drawer appears.
  5. Click Advanced (the Console icon).
     The tolerance options are shown in the Advanced tab.
  6. Select a tolerance option.

     • Strict: Rigidly enforce the learned boundaries. Any data points that fall outside the narrow range that Data Quality & Observability's behavioral model learns to be normal are flagged as potential data quality issues.

     • Neutral: Enforce the learned boundaries with the default sensitivity. Any values outside the neutral boundaries are flagged as potential data quality issues.

     • Lenient: Relax the enforcement and flagging when data points fall outside of the range that Data Quality & Observability's behavioral model learns to be normal. Only values well outside the lenient boundaries are flagged as potential data quality issues.

       Tip: Lenient enforcement can help avoid excessive alerts while Data Quality & Observability is still in the learning phase of a new Data Quality Job.
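To build an intuition for the three tolerance options, you can picture them as widening or narrowing the band a behavioral model learns around a metric such as row count. The sketch below is purely illustrative and is not Data Quality & Observability's actual algorithm; the multiplier values and function names are hypothetical:

```python
# Illustrative sketch only: tolerance levels modeled as hypothetical
# multipliers applied to a learned normal band (mean ± k * stdev).
# These names and values are assumptions, not the product's implementation.
from statistics import mean, stdev

# Hypothetical multipliers: Strict narrows the band, Lenient widens it.
TOLERANCE_K = {"strict": 1.0, "neutral": 2.0, "lenient": 3.0}

def is_flagged(history, new_value, tolerance="neutral"):
    """Return True if new_value falls outside the band the model
    has learned from past observations, at the given tolerance."""
    mu, sigma = mean(history), stdev(history)
    k = TOLERANCE_K[tolerance]
    return not (mu - k * sigma <= new_value <= mu + k * sigma)

# Daily row counts a monitor has observed for a column.
row_counts = [1000, 1020, 980, 1010, 995]

# A drop to 970 rows is flagged under Strict but tolerated under Lenient.
print(is_flagged(row_counts, 970, "strict"))   # True
print(is_flagged(row_counts, 970, "lenient"))  # False
```

The same data point can therefore be a quality issue at one tolerance level and normal variation at another, which is why relaxing the level is useful while a new Job is still accumulating history.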

What's next?