Backbreak in the mining industry presents a considerable challenge, impacting both safety and operational efficiency. Accurate prediction of backbreak is therefore a critical endeavour. This study rigorously evaluates four advanced machine learning (ML) techniques—Lagrangian Support Vector Machine (LSVM), Radial Basis Function Neural Network (RBFNN), Gaussian Process Regression (GPR), and Extreme Gradient Boosting (XGBoost)—to ascertain the most effective method for backbreak prediction. The study utilises a comprehensive dataset of 60 blasting rounds from the Damang Goldfields Open Pit Mine, which underwent a thorough preprocessing phase prior to analysis. The efficacy of each model is assessed using a suite of metrics, including the correlation coefficient (r), coefficient of determination (R2), mean squared error (MSE), root mean squared error (RMSE), and mean absolute error (MAE). The performance of the models is quantitatively compared, revealing XGBoost as the superior predictor in this context, characterised by an r of 0.9788, an R2 of 0.9565, an MSE of 0.1714, an RMSE of 0.4139, and an MAE of 0.2819. The findings of this study underscore the potential of XGBoost as a robust tool for backbreak prediction, offering mining companies a viable solution to enhance safety protocols and mitigate financial losses related to backbreak incidents. This research contributes significantly to the field of predictive analytics in mining, providing a comprehensive comparative analysis of various ML techniques for backbreak prediction.
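The paper itself does not include code, but the five evaluation metrics named in the abstract have standard definitions. As a rough illustration only (the function name and the use of NumPy are assumptions, not the authors' implementation), they could be computed from observed and predicted backbreak values like this:

```python
import numpy as np

def evaluation_metrics(y_true, y_pred):
    """Compute r, R2, MSE, RMSE, and MAE for a set of predictions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    residuals = y_true - y_pred

    mse = np.mean(residuals ** 2)            # mean squared error
    rmse = np.sqrt(mse)                      # root mean squared error
    mae = np.mean(np.abs(residuals))         # mean absolute error

    # Pearson correlation between observed and predicted values
    r = np.corrcoef(y_true, y_pred)[0, 1]

    # Coefficient of determination: 1 - SS_res / SS_tot
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot

    return {"r": r, "R2": r2, "MSE": mse, "RMSE": rmse, "MAE": mae}
```

Note that r and R2 coincide only when predictions are unbiased; comparing models on several metrics at once, as the study does, guards against a single metric masking systematic error.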
Published in: American Journal of Science, Engineering and Technology (Volume 9, Issue 1)
DOI: 10.11648/j.ajset.20240901.11
Page(s): 1-13
License: This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.
Copyright: Copyright © The Author(s), 2024. Published by Science Publishing Group
Keywords: Backbreak, Blasting, Machine Learning (ML), Cosine Amplitude Method (CAM), Simple Linear Regression (SLR)
APA Style
Kunkyin-Saadaari, F., Agadzie, V. K., & Gyebuni, R. (2024). A Comparative Study on the Application of Intelligent Models in the Estimation of Backbreak in Mine Blasting Operations. American Journal of Science, Engineering and Technology, 9(1), 1-13. https://doi.org/10.11648/j.ajset.20240901.11