ML Monitoring

Model Drift

Model drift occurs when an ML model's predictive ability "drifts" away from its performance over the training period. It is a deterioration of the model's predictive ability: the model makes less accurate predictions on incoming data than it did when it was originally trained and evaluated. Monitoring for drift is crucial in machine learning observability, enabling teams to quickly diagnose production issues that adversely affect model performance.
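
As a concrete illustration, the sketch below compares a model's accuracy on recent production traffic against its training-time baseline and flags windows where performance has degraded. The window size, tolerance, and variable names are illustrative assumptions, not a prescribed setup.

```python
import numpy as np

def rolling_accuracy(y_true, y_pred, window=500):
    """Accuracy over consecutive windows of production predictions."""
    correct = (np.asarray(y_true) == np.asarray(y_pred)).astype(float)
    n_windows = len(correct) // window
    return [correct[i * window:(i + 1) * window].mean() for i in range(n_windows)]

def drifted_windows(prod_accuracies, baseline_accuracy, tolerance=0.05):
    """Indices of windows whose accuracy falls more than `tolerance`
    below the training-time baseline."""
    return [i for i, acc in enumerate(prod_accuracies) if baseline_accuracy - acc > tolerance]

# Hypothetical usage: `baseline_accuracy` comes from offline evaluation, and
# `y_true` / `y_pred` are collected from production once ground truth arrives.
# alerts = drifted_windows(rolling_accuracy(y_true, y_pred), baseline_accuracy=0.91)
```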

Model drift can happen for a variety of reasons, including the following:

  • Data drift: a change in the statistical properties of the independent (input) variables, which typically arises when a substantial amount of time passes between when the training data was gathered and when the model is used to predict on live data (see the sketch after this list).
  • Concept drift: a change in actuals, i.e. the relationship between the input variables and the target variable changes.
  • Upstream data changes: operational changes in the data pipeline that alter the inputs the model receives, such as a feature changing its units or encoding, or no longer being populated.
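
For instance, data drift on a single numeric feature can be checked by comparing its training and production distributions. The sketch below uses a two-sample Kolmogorov-Smirnov test from SciPy; the function name, significance level, and synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.stats import ks_2samp

def detect_feature_drift(train_values, prod_values, alpha=0.05):
    """Compare a feature's training and production distributions with a
    two-sample Kolmogorov-Smirnov test; a small p-value suggests data drift."""
    result = ks_2samp(train_values, prod_values)
    return {"ks_statistic": result.statistic,
            "p_value": result.pvalue,
            "drift": result.pvalue < alpha}

# Synthetic example: the production distribution is shifted relative to training.
rng = np.random.default_rng(0)
train = rng.normal(loc=0.0, scale=1.0, size=5_000)
prod = rng.normal(loc=0.4, scale=1.0, size=5_000)
print(detect_feature_drift(train, prod))
```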

Techniques to measure data drift:
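
Commonly used options include the Population Stability Index (PSI), the Kolmogorov-Smirnov test shown above, and distribution distances such as Kullback-Leibler or Jensen-Shannon divergence. Below is a minimal PSI sketch in NumPy; the bin count, the small-value clipping, and the thresholds in the closing comment are practical rules of thumb rather than fixed standards.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """PSI between a baseline sample of a feature (e.g. training data)
    and a comparison sample (e.g. recent production data)."""
    # Bin edges are derived from the baseline distribution.
    edges = np.histogram_bin_edges(expected, bins=bins)
    expected_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    actual_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Clip empty bins to avoid division by zero and log(0).
    expected_pct = np.clip(expected_pct, 1e-6, None)
    actual_pct = np.clip(actual_pct, 1e-6, None)
    return float(np.sum((actual_pct - expected_pct) * np.log(actual_pct / expected_pct)))

# Rule of thumb often quoted in practice: PSI < 0.1 stable, 0.1-0.25 moderate shift,
# > 0.25 significant shift -- treat these as heuristics, not hard cutoffs.
```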

