Product: DataTrust

Have 100% Trust in Your Data

Data observability and data quality in a single platform, all powered by generative AI


If you can’t trust your data, what’s the point?

DataTrust is built to accelerate test cycles and reduce the cost of delivery by enabling continuous integration and continuous deployment (CI/CD) of data. Explore a quick tour of DataTrust.

Connect to data with hundreds of included connectors

ODBC/JDBC data sources and any other database that supports ODBC/JDBC drivers

It’s everything you need for data observability, data validation, and data reconciliation at massive scale, code-free, and easy to use.

Perform Field-by-Field Data Reconciliation
  • Reconciling data across multiple sources
  • Reconciling a single data source
Generate Business Rules with Machine Learning
  • Rapid generation of business rules using ML
  • Flexibility to accept, modify, or discard rules as needed
Compare Count and Data in Multiple Tables
  • Compare row counts at the schema level for multiple tables
  • Perform checksum data comparisons for multiple tables
Bulk Data Validation
  • Optimized validation of data for multiple tables
  • Runs on source compute, so no ingestion is needed
Executive Reports
  • Interactive executive reports with quality dimension insights
  • Personalized drill-down reports with filters
Combine Data Quality Capabilities
  • Query Builder enables powerful yet easy data profiling, including ML generation of business rules
  • Perform comparisons, validations, and reconciliations with reusable scenarios
  • Automate the testing process and get alerted when issues arise
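The count and checksum comparisons listed above can be sketched in a few lines. This is a minimal illustration of the general technique, not DataTrust's actual algorithm; the `src` and `tgt` table names are invented for the example.

```python
import hashlib
import sqlite3

def table_fingerprint(conn, table):
    """Return (row_count, checksum) for a table.

    The checksum XOR-combines per-row hashes, making it independent of
    row order -- one cheap way to compare the contents of two tables.
    """
    count, checksum = 0, 0
    for row in conn.execute(f"SELECT * FROM {table}"):
        digest = hashlib.sha256(repr(row).encode()).digest()
        checksum ^= int.from_bytes(digest[:8], "big")
        count += 1
    return count, checksum

# Hypothetical source/target tables with the same content in different order.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE tgt (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, "a"), (2, "b")])
conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(2, "b"), (1, "a")])

print(table_fingerprint(conn, "src") == table_fingerprint(conn, "tgt"))  # True
```

Because equal counts and checksums only indicate matching contents, a mismatch would still need a row-level compare to pinpoint the differing records.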

Analyze Your Database

  • DataTrust offers a full set of applications to analyze source and target datasets.
  • The Query Builder component and Data Profiling features help stakeholders understand and analyze the data before using the corresponding datasets in the various validation and reconciliation scenarios available.

Perform Data Reconciliation — Row Counts

  • Compare row counts between source and target dataset pairs and identify the tables whose row counts do not match
  • The row-count comparison algorithm compares the row counts of multiple tables/views simultaneously
  • Best fit for database upgrade testing, big data ingest layer testing, data warehouse staging extract-and-load testing, and master data testing
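A multi-table row-count comparison like the one described above can be sketched as follows. This is an illustrative approach under assumed pre-/post-upgrade databases, with invented table names, not DataTrust's implementation.

```python
import sqlite3

def compare_row_counts(src_conn, tgt_conn, tables):
    """Compare row counts for several tables at once; return the mismatches."""
    mismatches = {}
    for table in tables:
        src = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        tgt = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        if src != tgt:
            mismatches[table] = (src, tgt)
    return mismatches

# Hypothetical pre-upgrade (src) and post-upgrade (tgt) databases.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER)")
    conn.execute("CREATE TABLE customers (id INTEGER)")
src.executemany("INSERT INTO orders VALUES (?)", [(1,), (2,), (3,)])
tgt.executemany("INSERT INTO orders VALUES (?)", [(1,), (2,)])

print(compare_row_counts(src, tgt, ["orders", "customers"]))
# {'orders': (3, 2)}
```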

Perform Data Reconciliation — Row-Level Data Compare

  • Compare datasets between source and target and identify the rows that do not match
  • The field-level data comparison algorithm compares data between multiple pairs of tables/views simultaneously
  • Best fit for database upgrade testing, big data ingest layer testing, data warehouse testing for objects with minimal transformations, production parallel testing, and master data testing
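The row-level compare above boils down to matching rows by key and flagging the three possible outcomes: missing on one side, missing on the other, or present on both but different. A minimal sketch, assuming rows arrive as dicts keyed by a primary-key field (the sample data is invented):

```python
def compare_rows(source, target, key):
    """Identify rows present on only one side or differing between sides.

    source/target are lists of dicts; key names the primary-key field.
    """
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    only_in_source = sorted(src.keys() - tgt.keys())
    only_in_target = sorted(tgt.keys() - src.keys())
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return only_in_source, only_in_target, mismatched

# Hypothetical source and target extracts.
source = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}, {"id": 3, "amount": 30}]
target = [{"id": 1, "amount": 10}, {"id": 2, "amount": 99}]

print(compare_rows(source, target, "id"))  # ([3], [], [2])
```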

Compare Databases

  • Use Key Data Statistics Studio (KDS) to test whether the data from before an upgrade matches the data after the upgrade (technical data testing)
  • Perform bulk comparisons across multiple pairs of datasets to dramatically speed up testing for data integration, upgrades, and data staging loads
  • Use Record Count Compare to quickly identify row count differences between one or more tables/queries, or Row Level Compare to compare data between one or more tables/queries across source and target
  • Create reconciliation scenarios that identify not only the records that do not match between source and target but also the exact set of fields contributing to the mismatch
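Drilling from a mismatched record down to the exact fields contributing to the mismatch can be sketched as a per-field diff of two key-matched rows. This is an illustrative fragment with invented field names, not DataTrust's reconciliation engine:

```python
def mismatched_fields(src_row, tgt_row):
    """Return the exact fields whose values differ between two matched rows."""
    return sorted(
        field
        for field in src_row.keys() & tgt_row.keys()
        if src_row[field] != tgt_row[field]
    )

# Hypothetical source/target rows matched on id = 7.
src_row = {"id": 7, "status": "open", "total": 120.0}
tgt_row = {"id": 7, "status": "closed", "total": 120.0}

print(mismatched_fields(src_row, tgt_row))  # ['status']
```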

Perform data validation

  • Rule-based data validation engine with an easy-to-use interface for creating validation scenarios
  • Define multiple validation rules against the target dataset and capture exceptions
  • Analyze and report on validations

Validate datasets

  • Create validation rules against a dataset, execute the rules, and identify the records violating them
  • Select a data source to validate and define one or more validation rules; DataTrust ingests the dataset, executes the rules defined in the scenario, and returns exceptions
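The define-rules, execute, return-exceptions flow above can be sketched as a small rule engine. The rule names and dataset here are invented for illustration and do not reflect DataTrust's actual rule syntax:

```python
def validate(dataset, rules):
    """Run each named rule over every record; return the exceptions."""
    exceptions = []
    for record in dataset:
        for name, rule in rules.items():
            if not rule(record):
                exceptions.append({"record": record, "rule": name})
    return exceptions

# Hypothetical dataset and validation rules.
dataset = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "", "age": -5},
]
rules = {
    "email_present": lambda r: bool(r["email"]),
    "age_non_negative": lambda r: r["age"] >= 0,
}

print([e["rule"] for e in validate(dataset, rules)])
# ['email_present', 'age_non_negative']
```

Each exception pairs the offending record with the rule it violated, which is the shape a drill-down report needs.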

See what users have to say about RightData:

DataTrust (formerly RDt) reviews sourced by G2

$940K saved annually by automating data quality across nine data sources.

14 FTEs saved through automation. 60% reduction in time needed to test data.