Machine learning models deployed in production systems driven by sensor data are prone to performance degradation caused by dynamic shifts in the data distribution (data drift). Without a structured mechanism for monitoring and updating the model, performance tends to decline over time. This study proposes integrating a Machine Learning Operations (MLOps) approach to build a system for monitoring, data-drift detection, and adaptive model retraining in a production machine learning environment. Evaluation is carried out on the Gas Sensor Array Drift Dataset from the UCI Machine Learning Repository, using a batch-based data processing scheme to simulate an operational data stream. Drift is detected with the Population Stability Index (PSI) and Kullback–Leibler (KL) Divergence, which serve as triggers for adaptive retraining. Experimental results show that the adaptive retraining strategy maintains model performance more consistently than both a no-retraining baseline and periodic retraining, while supporting a measurable and reproducible model lifecycle.
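The drift-triggered retraining logic described above can be sketched as follows. This is a minimal illustration, not the thesis's actual pipeline: the bin count, the PSI threshold of 0.2 (a commonly cited warning level), the KL threshold, and the synthetic batches are all illustrative assumptions.

```python
import numpy as np

def _binned_probs(reference, current, bins=10, eps=1e-6):
    # Bin edges come from the reference (training-time) batch;
    # eps avoids log(0) in empty bins. Quantile bins are also common.
    edges = np.histogram_bin_edges(reference, bins=bins)
    p_ref = np.histogram(reference, bins=edges)[0] / len(reference) + eps
    p_cur = np.histogram(current, bins=edges)[0] / len(current) + eps
    return p_ref, p_cur

def psi(reference, current, bins=10):
    # Population Stability Index: sum over bins of (p_cur - p_ref) * ln(p_cur / p_ref)
    p_ref, p_cur = _binned_probs(reference, current, bins)
    return float(np.sum((p_cur - p_ref) * np.log(p_cur / p_ref)))

def kl_divergence(reference, current, bins=10):
    # KL(p_ref || p_cur) over the same discretization
    p_ref, p_cur = _binned_probs(reference, current, bins)
    return float(np.sum(p_ref * np.log(p_ref / p_cur)))

# Illustrative trigger thresholds (assumptions, not the thesis's tuned values)
PSI_THRESHOLD = 0.2
KL_THRESHOLD = 0.1

def should_retrain(reference, current):
    # Retraining fires when either drift measure exceeds its threshold
    return (psi(reference, current) > PSI_THRESHOLD
            or kl_divergence(reference, current) > KL_THRESHOLD)

# Synthetic stand-ins for one sensor feature, batched over time
rng = np.random.default_rng(0)
ref_batch = rng.normal(0.0, 1.0, 5000)      # training-time distribution
stable_batch = rng.normal(0.0, 1.0, 5000)   # same distribution: no drift
drifted_batch = rng.normal(1.5, 1.0, 5000)  # shifted mean: drift

print(should_retrain(ref_batch, stable_batch))   # stable batch does not trigger
print(should_retrain(ref_batch, drifted_batch))  # shifted batch triggers retraining
```

In a production loop, `reference` would be the feature distribution captured at the last (re)training, and each incoming batch would be checked before scoring; a fired trigger enqueues a retraining job and resets the reference.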