An Ensemble Model to Minimize Fluctuation Influences on Short-Term Medical Workload Prediction

Document Type: Article


1 Department of Mechanical and Industrial Engineering, Ryerson University, 350 Victoria St., Toronto, ON, Canada

2 Department of Mechanical and Industrial Engineering, Ryerson University, 350 Victoria St., Office-EPH 300, Toronto, ON, Canada


Time series forecasting is an important field of machine learning, since many real-world events unfold over time. Real-time data are prone to errors caused by irregular fluctuations, seasonal biases, and missing values; such erroneous data produce inaccurate forecasts that can translate into business losses. In addition, concept drift is a well-known problem in time series forecasting that further degrades forecasting accuracy. This work presents an Adaptive Batched-Ranked Ensemble (ABRE) model that reduces the effect of fluctuations using a time-variant windowing technique. A data aggregation technique is developed and integrated into the offline training phase of the proposed model to address the concept drift problem. The offline phase produces a meta-model, which is then deployed in the online forecasting phase to ensure fast execution on incoming data. After being tested against and compared with several heterogeneous ensemble models, the proposed model is applied to medical workload prediction. In terms of the root mean squared error, the comparison results show that the proposed model performs at least 65.7% better than the heterogeneous stacked ensemble models applied to the experimental dataset. Moreover, the ABRE model reduces the prediction error by approximately 73.6%.
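The offline/online split described above can be sketched as follows. This is a minimal, hypothetical illustration only: base learners with time-variant windows are trained offline on batches, ranked by validation RMSE, and combined through inverse-error weights that stand in for the meta-model; the frozen weights are then applied online to incoming data. The moving-average base learners, the inverse-error weighting rule, and the synthetic workload series are all illustrative assumptions, not the authors' exact ABRE algorithm.

```python
# Hypothetical sketch of a batched-ranked ensemble for short-term forecasting.
import random
random.seed(0)

def moving_average_model(window):
    """Base learner: predict the mean of the last `window` observations."""
    def predict(history):
        recent = history[-window:]
        return sum(recent) / len(recent)
    return predict

def rmse(errors):
    """Root mean squared error of a list of residuals."""
    return (sum(e * e for e in errors) / len(errors)) ** 0.5

# Synthetic workload series with a weekly pattern plus irregular fluctuations.
series = [50 + 10 * ((t % 7) < 5) + random.gauss(0, 3) for t in range(200)]

# Offline phase: evaluate base learners (time-variant windows) on a batch.
windows = [3, 7, 14]
learners = {w: moving_average_model(w) for w in windows}
batch_errors = {w: [] for w in windows}
for t in range(20, 150):
    for w, model in learners.items():
        batch_errors[w].append(model(series[:t]) - series[t])

# Rank learners by RMSE and derive inverse-error weights (the "meta-model").
scores = {w: rmse(batch_errors[w]) for w in windows}
total = sum(1.0 / s for s in scores.values())
weights = {w: (1.0 / s) / total for w, s in scores.items()}

# Online phase: the frozen weights combine base predictions for new data,
# so no retraining is needed as observations arrive.
def ensemble_predict(history):
    return sum(weights[w] * learners[w](history) for w in windows)

online_errors = [ensemble_predict(series[:t]) - series[t]
                 for t in range(150, 200)]
print("online RMSE:", round(rmse(online_errors), 2))
```

Because the meta-model here is just a fixed weight vector, the online step costs one prediction per base learner plus a weighted sum, which reflects the fast-execution property claimed for the online phase.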

