Interpretable Drift Correction with Adaptive Transformation Selection

Received: August 25, 2025
Revised: October 23, 2025
Accepted: October 24, 2025

Shakhovska K., Pukach P. Interpretable Drift Correction with Adaptive Transformation Selection. Mathematical Modeling and Computing. Vol. 12, No. 4, pp. 1065–1076 (2025)

1 Lviv Polytechnic National University
2 Lviv Polytechnic National University

An interpretable drift adaptation mechanism for detecting and correcting data drift is introduced, based on statistical tests and transparent transformations. In contrast to prior work that applies a single universal mapping, the method adaptively selects a transformation according to the drift type (location, scale, shape, or extreme), identified via Kolmogorov–Smirnov tests, Wasserstein distance, and distributional comparisons. Each category is corrected with a suitable transformation, such as mean-variance scaling, rank-based adjustment, or quantile mapping. A novel Wasserstein-aware fallback rule ensures balanced corrections across metrics. Applied to salary data across roles and years, the approach reduced the Wasserstein distance by over 95% for combined location and scale drifts (e.g., from 22 416 to 1 118). The method remains easy to interpret, auditable for regulatory checks, and effective for practical drift correction.
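As a brief illustration of the detect-then-correct idea summarized above, the following Python sketch classifies drift with a Kolmogorov–Smirnov test and simple moment heuristics, then applies a matching transparent correction (re-centering, mean-variance scaling, or quantile mapping). The helper names (classify_drift, correct), thresholds, and classification rules are illustrative assumptions rather than the paper's exact procedure, and the Wasserstein-aware fallback rule is omitted.

```python
# Minimal sketch (not the authors' code): detect drift between a reference and a
# current sample, then apply a simple transparent correction. Thresholds and the
# classification rules are illustrative assumptions.
import numpy as np
from scipy.stats import ks_2samp, wasserstein_distance


def classify_drift(ref, cur, alpha=0.05):
    """Label drift as 'location', 'scale', 'shape', or 'none' (illustrative rules)."""
    stat, p = ks_2samp(ref, cur)
    if p >= alpha:
        return "none"
    mean_shift = abs(np.mean(cur) - np.mean(ref)) / (np.std(ref) + 1e-12)
    scale_ratio = np.std(cur) / (np.std(ref) + 1e-12)
    if mean_shift > 0.5 and 0.8 <= scale_ratio <= 1.25:
        return "location"
    if mean_shift <= 0.5 and not (0.8 <= scale_ratio <= 1.25):
        return "scale"
    return "shape"


def correct(ref, cur, kind):
    """Apply a transparent correction mapped to the detected drift type."""
    if kind == "location":                       # mean shift: re-center on the reference
        return cur - np.mean(cur) + np.mean(ref)
    if kind == "scale":                          # mean-variance scaling to reference moments
        z = (cur - np.mean(cur)) / (np.std(cur) + 1e-12)
        return z * np.std(ref) + np.mean(ref)
    if kind == "shape":                          # quantile mapping onto the reference distribution
        q = np.linspace(0, 1, 101)
        return np.interp(cur, np.quantile(cur, q), np.quantile(ref, q))
    return cur                                   # no drift detected: leave unchanged


# Usage example with synthetic salary-like data (values are made up):
rng = np.random.default_rng(0)
ref = rng.normal(50_000, 8_000, 2_000)           # reference-year salaries
cur = rng.normal(62_000, 12_000, 2_000)          # drifted current-year salaries
kind = classify_drift(ref, cur)
fixed = correct(ref, cur, kind)
print(kind, wasserstein_distance(ref, cur), wasserstein_distance(ref, fixed))
```

Printing the Wasserstein distance before and after correction mirrors how the paper quantifies the improvement; on this synthetic example the distance drops by roughly an order of magnitude or more after the mapping.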
