QA Platform for Machine Learning Systems

Deepchecks is a customizable, plug-and-play QA platform for monitoring and testing machine learning systems in production.
Deepchecks can plug into your ML pipelines wherever they are.
We support:
AWS
GCP
Azure
OpenShift
On-prem
Hybrid

How It Works

Phase 1: Validation of the training data and the ML model

Training Data

The training data is analyzed for issues that could compromise the training process, and statistics are collected for later use during monitoring.
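As a rough illustration, this pre-deployment validation step could look something like the following sketch, which uses the open-source deepchecks Python package; the CSV path, label name, and categorical columns are placeholders.

```python
import pandas as pd
from deepchecks.tabular import Dataset
from deepchecks.tabular.suites import data_integrity

# Load the training data (illustrative path and schema).
train_df = pd.read_csv("train.csv")

# Wrap the raw frame with metadata so checks know the label and categorical columns.
train_ds = Dataset(train_df, label="target", cat_features=["plan", "region"])

# Run the data-integrity suite: duplicates, mixed data types, outliers, conflicting labels, etc.
result = data_integrity().run(train_ds)
result.save_as_html("training_data_report.html")
```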

Model

The model is analyzed to characterize its behavior and limitations and to determine the boundaries of its confidence regions.
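A comparable sketch for the model-analysis step, again using the open-source package's model_evaluation suite (the model and the train/test split are assumed to already exist from your own pipeline):

```python
from deepchecks.tabular import Dataset
from deepchecks.tabular.suites import model_evaluation

# train_df, test_df, and model are assumed to come from your own training pipeline.
train_ds = Dataset(train_df, label="target", cat_features=["plan", "region"])
test_ds = Dataset(test_df, label="target", cat_features=["plan", "region"])

# Evaluate the trained model: train/test performance comparison, weak segments,
# and related model-quality checks.
result = model_evaluation().run(train_dataset=train_ds, test_dataset=test_ds, model=model)
result.save_as_html("model_report.html")
```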

Phase 2: Ongoing testing and monitoring of the production data and the ML model

Data Sources

Improved observability of the ML system is obtained by connecting to the data in its raw format, across all of the relevant data sources.

Input Data

The input data is monitored in production, before and after the various preprocessing phases, and is continuously compared both to historical data and to the corresponding data in the original training set.
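One way to approximate this comparison with the open-source package is to treat the latest production batch as the "test" split of a train/test validation suite; prod_batch_df and the column names below are illustrative.

```python
from deepchecks.tabular import Dataset
from deepchecks.tabular.suites import train_test_validation

# Reference data from training time vs. a recent batch of production inputs.
train_ds = Dataset(train_df, label="target", cat_features=["plan", "region"])
prod_ds = Dataset(prod_batch_df, cat_features=["plan", "region"])  # labels may not exist yet

# Checks for feature drift, new categories, schema mismatches, and similar issues.
result = train_test_validation().run(train_dataset=train_ds, test_dataset=prod_ds)
result.save_as_html("input_data_report.html")
```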

Model

Results stored during the pre-launch analysis of the model are used to determine the severity of different issues that are detected.

Predictions

The model's predictions are monitored for anomalies and for patterns in the types of mistakes the model may be making.
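As a simplified stand-in for this kind of prediction monitoring (not Deepchecks' own logic), one could compare the distribution of recent model scores against a reference window with a two-sample test; model, features, and the data frames below are assumptions.

```python
import numpy as np
from scipy.stats import ks_2samp

# Reference scores collected at validation time vs. scores from recent production traffic.
reference_scores = model.predict_proba(train_df[features])[:, 1]
production_scores = model.predict_proba(prod_batch_df[features])[:, 1]

# A Kolmogorov-Smirnov test flags a shift in the score distribution.
statistic, p_value = ks_2samp(reference_scores, production_scores)
if p_value < 0.01:
    print(f"Prediction distribution shift detected (KS statistic = {statistic:.3f})")
```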

Labels

Ground-truth labels are not mandatory for using Deepchecks. However, when they exist, they can be used to display real-time metrics and to help Deepchecks improve all other alerts. They are also scanned for inconsistencies and patterns that don't make sense.
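When delayed ground-truth labels do arrive, real-time metrics can be computed for the matching production predictions; a minimal sketch with scikit-learn, where all variable names are illustrative:

```python
from sklearn.metrics import accuracy_score, roc_auc_score

# ground_truth_labels are the labels that arrived for past production requests;
# production_predictions / production_scores are the predictions logged at serving time.
accuracy = accuracy_score(ground_truth_labels, production_predictions)
auc = roc_auc_score(ground_truth_labels, production_scores)
print(f"accuracy = {accuracy:.3f}, roc_auc = {auc:.3f}")
```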
