Release 2025.05

Release Information

  • Release date of Data Quality & Observability Classic 2025.05.1: June 16, 2025
  • Release date of Data Quality & Observability Classic 2025.05: June 2, 2025
  • Release notes publication date: May 6, 2025

Announcement

Important 

As a security measure, we are announcing the end of life of the Java 8 and 11 versions of Data Quality & Observability Classic, effective in the August 2025 (2025.08) release.

In this release (2025.05), Data Quality & Observability Classic is only available on Java 17 and Spark 3.5.3. Depending on your installation of Data Quality & Observability Classic, you can expect the following in this release:
  • Kubernetes installations 
    • Kubernetes containers automatically contain Java 17 and Spark 3.5.3.
    • If you use custom drivers, ensure they are compatible with Java 17 and Spark 3.5.3.
    • If you use file-based SAML authentication with the SAML_METADATA_USE_URL variable set to false in the owl-web ConfigMap, update the Meta-Data URL option on the SAML Security Settings page to point to your metadata file. Use the format file:/opt/owl/config/idp-metadata.xml, ensuring the value begins with the file: prefix (see the sketch after this list). For configuration steps, go to the "Enable the SAML SSO sign in option" section in Configuring SAML authentication.
  • Standalone installations
    • To install Data Quality & Observability Classic 2025.05, you must upgrade to Java 17 and Spark 3.5.3.
    • If you use custom drivers, ensure they are compatible with Java 17 and Spark 3.5.3.
    • Follow the latest steps to upgrade to Data Quality & Observability Classic 2025.05 with Java 17.
    • If you use file-based SAML authentication with the SAML_METADATA_USE_URL variable set to false in the owl-env.sh script, update the Meta-Data URL option on the SAML Security Settings page to point to your metadata file. Use the format file:/opt/owl/config/idp-metadata.xml, ensuring the value begins with the file: prefix (see the sketch after this list). For configuration steps, go to the "Enable the SAML SSO sign in option" section in Configuring SAML authentication.
    • We encourage you to migrate to a Kubernetes installation for better scalability and easier future maintenance.
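
As a minimal sketch of the file-based SAML settings described above (the variable name and file path mirror the defaults mentioned in this section; adjust paths for your environment), the relevant entry in the owl-web ConfigMap on Kubernetes looks like this:

  SAML_METADATA_USE_URL: "false"

The equivalent line in owl-env.sh on standalone installations:

  export SAML_METADATA_USE_URL=false

In both cases, the Meta-Data URL option on the SAML Security Settings page then points at your metadata file, for example:

  file:/opt/owl/config/idp-metadata.xml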

Between 2025.05 and 2025.07, only critical and high-priority bug fixes will be made to the Java 8 and 11 versions of Data Quality & Observability Classic. For a breakdown of Java and Spark availability in current and upcoming releases, see "See what is changing" below.

See what is changing
  • 2025.01 and earlier
    • Java 8: Yes; Java 11: Yes; Java 17: No
    • Spark versions:
      • 2.3.0 (Java 8 only)
      • 2.4.5 (Java 8 only)
      • 3.0.1 (Java 8 and 11)
      • 3.1.2 (Java 8 and 11)
      • 3.2.2 (Java 8 and 11)
      • 3.4.1 (Java 11 only)
  • 2025.02
    • Java 8: No; Java 11: No; Java 17: Yes
    • Spark versions: 3.5.3 only
    • Additional notes: Releases 2025.02 and later support Google Cloud Dataproc for Spark 3.5.1. Dataproc requires Java 17 and currently does not support SQL Server pullup datasources.
  • 2025.03
    • Java 8: No; Java 11: No; Java 17: Yes
    • Spark versions: 3.5.3 only
  • 2025.04
    • Java 8: Yes; Java 11: Yes; Java 17: Yes
    • Spark versions:
      • 2.3.0 (Java 8 only)
      • 2.4.5 (Java 8 only)
      • 3.0.1 (Java 8 and 11)
      • 3.1.2 (Java 8 and 11)
      • 3.2.2 (Java 8 and 11)
      • 3.4.1 (Java 11 only)
      • 3.5.3 (Java 17 only)
    • Additional notes: Important: The Java 8 and 11 build profiles contain only the 2025.02 release and critical bug fixes addressed in 2025.03 and 2025.04; they do not contain any feature enhancements from the 2025.03 or 2025.04 releases. Only the Java 17 build profile contains the feature enhancements and bug fixes listed in the 2025.04 release notes.
  • 2025.05
    • Java 8: No; Java 11: No; Java 17: Yes
    • Spark versions: 3.5.3 only
    • Additional notes: Fixes for Java 8 and 11 build profiles will be available only for critical and high-priority defects.
  • 2025.06
    • Java 8: No; Java 11: No; Java 17: Yes
    • Spark versions: 3.5.3 only
    • Additional notes: Fixes for Java 8 and 11 build profiles will be available only for critical and high-priority defects.
  • 2025.07
    • Java 8: No; Java 11: No; Java 17: Yes
    • Spark versions: 3.5.3 only
    • Additional notes: Fixes for Java 8 and 11 build profiles will be available only for critical and high-priority defects.
  • 2025.08
    • Java 8: No; Java 11: No; Java 17: Yes
    • Spark versions: 3.5.3 only
    • Additional notes: Fixes for Java 8 and 11 build profiles will be available only for critical and high-priority defects.

For more information, go to the Data Quality & Observability Classic Java Upgrade FAQ.

New and improved

Warning We identified a known security vulnerability (CVE-2025-48734) in version 2025.05. We addressed this issue in the 2025.05.1 patch.

Platform

  • You can now authenticate users via Microsoft Azure Active Directory B2C when signing into Data Quality & Observability Classic.
  • Only users with the ROLE_ADMIN role can now see the username and connection string of JDBC connections on the Connections page and in API responses, improving connection security.
  • Metastore credentials are no longer stored in plain text when you create a DQ Job via a notebook. This improvement increases the security of your credentials.
  • We enhanced the security of Data Category validation.
  • We improved the security of our application.

Jobs

  • Trino Pushdown jobs now support the validate source activity.
  • The Run Job buttons now have two enhancements:
    • The Run Job with Date button on the Jobs tab of the Findings page is now labeled Select Run Date.
    • The Run Job button on the metadata bar now shows a tooltip on hover.
  • SAP HANA connections now support the operators =>, ||, and &&, the reserved word LIKE_REGEXPR, and the REPLACE function in scoped queries (see the example after this list).
  • The assignments queue is now disabled for the validate source activity on new installations of Data Quality & Observability Classic. To enable it, an admin can set valsrcdisableaq to "false" on the Admin Limits page.
  • Warning When you set valsrcdisableaq to "true", the ability to assign findings is disabled for all finding types, such as rules and outliers. This issue is resolved in version 2025.05.1.
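
For example, a scoped query on an SAP HANA connection can now combine these elements (the table and column names here are hypothetical):

  SELECT
    CUSTOMER_ID,
    FIRST_NAME || ' ' || LAST_NAME AS FULL_NAME,  -- || concatenation
    REPLACE(PHONE, '-', '') AS PHONE_DIGITS       -- REPLACE function
  FROM CUSTOMERS
  WHERE EMAIL LIKE_REGEXPR '[^@]+@[^@]+'          -- LIKE_REGEXPR predicate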

Rules

  • When Archive Break Records is disabled, you can now use NULL link IDs with Athena, BigQuery, Hive, Redshift, and SQL Server connections.
  • When Archive Break Records is enabled and a job specifies a link ID column that a rule does not include, Data Quality & Observability Classic no longer appends the link ID to the rule. The job still runs successfully, but the break records for that rule are not archived, and an exception message explains that the link ID is not part of the rule (see the example after this list).
  • Trino Pushdown jobs now support profiling and custom rules on columns of the ARRAY data type.
  • The Rule Details dialog on the Rule Workbench page now has an optional Purpose field. This allows you to add details about Collibra Platform assets associated with your rule and see which assets should relate to your Data Quality Rule asset upon integration.
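
To illustrate the link ID behavior above (the dataset and column names are hypothetical), suppose a job defines customer_id as its link ID column, but a rule selects only other columns:

  -- Freeform SQL rule that does not include the job's link ID column
  SELECT email FROM @orders WHERE email IS NULL

Because customer_id is not part of the rule's result, the job still completes, but the break records for this rule are not archived, and the exception message notes that the link ID is not part of the rule.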

Alerts

  • You can now set up Data Quality & Observability Classic to send data quality alerts to one or more webhooks, eliminating the dependency on SMTP for email alerts (see the sketch after this list).
  • You can now rename alerts on the Dataset Alerts page. When you rename an alert, the updated name is reflected in the alert_nm column of the alert_cond Metastore table. The updated name applies only to alerts generated in future job runs and does not affect historical data.
  • In addition to the existing rule name condition variable for percentages, you can now include a rule name with a value identifier, such as test_rule.score > 0, to create condition alerts based on rule scores.
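
As a quick way to try out webhook alerts, you can point an alert at a local test receiver. The following is a minimal sketch in Python using Flask; the endpoint path is arbitrary, and the payload fields depend on your alert configuration, so inspect a real alert to see the actual structure:

  # Minimal test receiver for data quality alert webhooks.
  from flask import Flask, request

  app = Flask(__name__)

  @app.route("/dq-alerts", methods=["POST"])
  def receive_alert():
      # Print whatever payload the alert delivers; the exact fields
      # depend on your alert configuration.
      print("Received alert:", request.get_json(silent=True))
      return "", 204

  if __name__ == "__main__":
      app.run(port=8080)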

Profile

  • When you retrain a breaking adaptive rule, negative values are now supported for lower and upper bounds, min, max, and mean.

Findings

  • To enhance usability, we moved the data preview from the Rules results table into a dedicated modal, making it easier to navigate and view data.
    • The Actions button is now always visible on the right side of the Rules tab and includes the following options:
      • Archived Breaks, previously called Rule Breaks, retains the same content and functionality.
      • Preview Breaks, previously embedded under the Plus icon in the first column, shows all breaking records for a given rule. This option provides the same information that was previously available in the Rules tab and still requires the ROLE_DATASET_ACCESS role. Additionally, the Rule Break Preview modal reflects the preview limit of the rule.
    • Tooltips were added to various components on the Rules tab and Rule Break Preview modal.

Integration

  • You can now use the Map to primary column only switch in the Connections step of the integration setup wizard. This allows you to map a rule to only the Column asset in Collibra Platform that corresponds to the primary column in Data Quality & Observability Classic.
  • When you rename a custom rule in Data Quality & Observability Classic with an active integration with Collibra Platform, a new asset is no longer created in Collibra Platform. Instead, renamed rules reuse the Collibra Platform asset IDs associated with the previous rule name when their metadata is integrated.

APIs

  • The /v3/datasetDefs/template API endpoint now returns the nested key-value pairs for shape settings and job schedule in the datasetDefs JSON object.
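
A minimal sketch of calling this endpoint with Python (the host and token-based authentication are placeholders; use whatever authentication your deployment requires):

  import requests

  # Hypothetical host and token; adjust for your deployment.
  response = requests.get(
      "https://your-dq-host/v3/datasetDefs/template",
      headers={"Authorization": "Bearer <token>"},
      timeout=30,
  )
  response.raise_for_status()

  # The datasetDefs JSON object now includes nested key-value pairs
  # for shape settings and the job schedule.
  print(response.json())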

Admin Console

  • You can now use the previewlimit setting on the Admin Limits page to define the maximum number of preview results shown in the Dataset Overview and Rule Workbench. The default value is 250.
  • Warning Large previewlimit values can negatively impact performance.

Fixes

Platform

  • All datasets are now available when you run a dataset rule fetch with dataset security disabled for the admin user account.
  • FIPS-enabled standalone installations of Data Quality & Observability Classic now support SAML authentication.

Jobs

  • When you remove a table from a manually triggered or scheduled job, you now receive a more descriptive error stating that the table does not exist instead of a generic syntax error.
  • You can now use Amazon S3 as a secondary dataset name for a rule query.
  • You can now rerun a job for a selected date using the Run DQ Job button on the metadata bar of the Findings page.
  • The command line now retains the created query, including the ${rd} parameter, when you run a job using the Run DQ Job button on the metadata bar.
  • The controller response for queries to the alert_cond table in the Metastore now maps internal objects correctly.
  • The Profile page now correctly shows the top 5 TopN and BottomN shape results for Pullup jobs.
  • When you add a transform to a job with histogram enabled or set to auto, the job processes as expected, and aliased columns display correctly on the Profile page.

Rules

  • You can again save rules on Pushdown jobs that exceed ⅓ of a buffer page.
  • Pushdown rules that use stat variables now run with the correct information from the current job, rather than using data from previous job runs.
  • The DOUBLECHECK rule on PostgreSQL and Redshift Pullup jobs now flags rows with negative double values as valid.
  • The Result Preview in the Rule Workbench no longer produces an error when you use out-of-the-box templates.

Findings

  • Pushdown jobs run on Snowflake connections now show as failed on the Findings page if a password retrieval issue occurs.

Dataset Manager

  • When you apply an option from the Actions drop-down list, such as deleting a dataset, the action is now applied to the correct dataset.

Admin Console

  • When you select a submenu option from the Admin Console menu, the submenu section now remains open.

Patch release

2025.05.1

  • Columns from Db2, Oracle, SQL Server, Sybase, and Teradata connections now load correctly in Explorer for users without the "ROLE_ADMIN" or "ROLE_CONNECTION_MANAGER" roles.
  • The ability to assign findings for all finding types, such as rules, outliers, and so on, is no longer disabled when the "valsrcdisableaq" limit is set to "true" on the Admin Limits page.
  • We resolved a security vulnerability related to CVE-2025-48734.