References
- R.M. Weeks, Ore reserve estimation and grade control at the Quemont Mine, in Ore Reserve Estimation and Grade Control: A Canadian Centennial Conference Sponsored by the Geology and Metal Mining Divisions of the CIM, held at Hotel L'Esterel, Ville d'Esterel, Quebec, September 25–27, 1967, Vol. 9, Canadian Institute of Mining and Metallurgy, 1968, p. 123.
- A.J. Sinclair and G.H. Blackwell, Applied Mineral Inventory Estimation, Cambridge University Press, Cambridge, 2002.
- A.D. Akbari, M. Osanloo, and M.A. Shirazi, Reserve estimation of an open pit mine under price uncertainty by real option approach, Min. Sci. Technol. (China) 19 (6) (2009), pp. 709–717. doi:https://doi.org/10.1016/S1674-5264(09)60130-7.
- J.M. Rendu, An Introduction to Cut-off Grade Estimation, Society for Mining, Metallurgy, and Exploration, Englewood, CO, 2014.
- V.R. Joseph, Limit kriging, Technometrics 48 (4) (2006), pp. 458–466. doi:https://doi.org/10.1198/004017006000000011.
- N. Cressie, The origins of kriging, Math. Geol. 22 (3) (1990), pp. 239–252. doi:https://doi.org/10.1007/BF00889887.
- A.G. Journel and C.J. Huijbregts, Mining Geostatistics, Vol. 600, Academic Press, London, 1978.
- J. Rendu, Kriging, logarithmic kriging, and conditional expectation: Comparison of theory with actual results, in Proceedings of the 16th APCOM Symposium, Tucson, Arizona, 1979, pp. 199–212.
- P. Goovaerts, Geostatistics for Natural Resources Evaluation, Oxford University Press, New York, 1997.
- J.P. Chiles and P. Delfiner, Geostatistics: Modeling Spatial Uncertainty, Wiley Series in Probability and Statistics, Wiley, New York, 1999.
- M. David, Geostatistical Ore Reserve Estimation, Elsevier, Amsterdam, 2012.
- A. Paithankar and S. Chatterjee, Grade and tonnage uncertainty analysis of an African copper deposit using multiple-point geostatistics and sequential Gaussian simulation, Nat. Resour. Res. 27 (4) (2018), pp. 419–436. doi:https://doi.org/10.1007/s11053-017-9364-1.
- M. Badel, S. Angorani, and M.S. Panahi, The application of median indicator kriging and neural network in modeling mixed population in an iron ore deposit, Comput. Geosci. 37 (4) (2011), pp. 530–540. doi:https://doi.org/10.1016/j.cageo.2010.07.009.
- P. Tahmasebi and A. Hezarkhani, Application of a modular feedforward neural network for grade estimation, Nat. Resour. Res. 20 (1) (2011), pp. 25–32. doi:https://doi.org/10.1007/s11053-011-9135-3.
- G. Pan, D.P. Harris, and T. Heiner, Fundamental issues in quantitative estimation of mineral resources, Nonrenewable Res. 1 (4) (1992), pp. 281–292. doi:https://doi.org/10.1007/BF01782693.
- H. Jang and E. Topal, A review of soft computing technology applications in several mining problems, Appl. Soft. Comput. 22 (2014), pp. 638–651. http://www.sciencedirect.com/science/article/pii/S1568494614002464.
- S. Chatterjee, A. Bhattacherjee, B. Samanta, et al., Ore grade estimation of a limestone deposit in India using an artificial neural network, Applied GIS 2 (1) (2006), pp. 1–20. doi:https://doi.org/10.2104/ag060003.
- B. Samanta, Radial basis function network for ore grade estimation, Nat. Resour. Res. 19 (2) (2010), pp. 91–102. doi:https://doi.org/10.1007/s11053-010-9115-z.
- S. Chatterjee, S. Bandopadhyay, and D. Machuca, Ore grade prediction using a genetic algorithm and clustering based ensemble neural network model, Math. Geosci. 42 (3) (2010), pp. 309–326. doi:https://doi.org/10.1007/s11004-010-9264-y.
- H. Mahmoudabadi, M. Izadi, and M.B. Menhaj, A hybrid method for grade estimation using genetic algorithm and neural networks, Comput. Geosci. 13 (1) (2009), pp. 91–101. doi:https://doi.org/10.1007/s10596-008-9107-9.
- B. Tutmez, An uncertainty oriented fuzzy methodology for grade estimation, Comput. Geosci. 33 (2) (2007), pp. 280–288. doi:https://doi.org/10.1016/j.cageo.2006.09.001.
- D. Misra, B. Samanta, S. Dutta, et al., Evaluation of artificial neural networks and kriging for the prediction of arsenic in Alaskan bedrock-derived stream sediments using gold concentration data, Int. J. Min. Reclam. Environ. 21 (4) (2007), pp. 282–294. doi:https://doi.org/10.1080/17480930701259294.
- Y. Dagasan, P. Renard, J. Straubhaar, et al., Automatic parameter tuning of multiple-point statistical simulations for lateritic bauxite deposits, Minerals 8 (5) (2018), pp. 220. doi:https://doi.org/10.3390/min8050220.
- U.E. Kaplan and E. Topal, A new ore grade estimation using combine machine learning algorithms, Minerals 10 (10) (2020), pp. 847. doi:https://doi.org/10.3390/min10100847.
- K. Koike, S. Matsuda, T. Suzuki, et al., Neural network-based estimation of principal metal contents in the Hokuroku district, northern Japan, for exploring Kuroko-type deposits, Nat. Resour. Res. 11 (2) (2002), pp. 135–156. doi:https://doi.org/10.1023/A:1015520204066.
- B. Jafrasteh, N. Fathianpour, and A. Suárez, Comparison of machine learning methods for copper ore grade estimation, Comput. Geosci. 22 (5) (2018), pp. 1371–1388. doi:https://doi.org/10.1007/s10596-018-9758-0.
- S. Cheng, S. Zhang, L. Li, et al., Water quality monitoring method based on TLD 3D fish tracking and XGBoost, Math. Probl. Eng. 2018 (2018), pp. 1–12.
- R. Zhong, R. Johnson, and Z. Chen, Generating pseudo density log from drilling and logging-while-drilling data using extreme gradient boosting (XGBoost), Int. J. Coal Geol. 220 (2020), pp. 103416. http://www.sciencedirect.com/science/article/pii/S0166516219306743.
- H. Nguyen, X.N. Bui, H.B. Bui, et al., Developing an XGBoost model to predict blast-induced peak particle velocity in an open-pit mine: A case study, Acta Geophys. 67 (2) (2019), pp. 477–490. doi:https://doi.org/10.1007/s11600-019-00268-4.
- V.A. Dev and M.R. Eden, Formation lithology classification using scalable gradient boosted decision trees, Comput. Chem. Eng. 128 (2019), pp. 392–404. doi:https://doi.org/10.1016/j.compchemeng.2019.06.001.
- D. He, B.T. Le, D. Xiao, et al., Coal mine area monitoring method by machine learning and multispectral remote sensing images, Infrared Phys. Technol. 103 (2019), pp. 103070. doi:https://doi.org/10.1016/j.infrared.2019.103070.
- G.T. Nwaila, S.E. Zhang, and H.E. Frimmel, Local and target exploration of conglomerate-hosted gold deposits using machine learning algorithms: a case study of the Witwatersrand Gold Ores, South Africa, Nat. Resour. Res., 29(1) (2020), pp. 135–159. doi:https://doi.org/10.1007/s11053-019-09498-1.
- J.H. Friedman, Greedy function approximation: A gradient boosting machine, Ann. Stat. 29 (5) (2001), pp. 1189–1232. doi:https://doi.org/10.1214/aos/1013203451.
- L. Breiman, J. Friedman, R. Olshen, et al., Classification and Regression Trees, Wadsworth International Group, Belmont, CA, 1984.
- K. Song, F. Yan, T. Ding, et al., A steel property optimization model based on the XGBoost algorithm and improved PSO, Comput. Mater. Sci. 174 (2020), pp. 109472. doi:https://doi.org/10.1016/j.commatsci.2019.109472.
- A. Krizhevsky, I. Sutskever, and G.E. Hinton, ImageNet classification with deep convolutional neural networks, in Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA, 2012, pp. 1097–1105.
- G. Ke, Q. Meng, T. Finley, et al., LightGBM: A highly efficient gradient boosting decision tree, in Advances in Neural Information Processing Systems, Long Beach, CA, USA, 2017, pp. 3146–3154.
- L. Prokhorenkova, G. Gusev, A. Vorobev, et al., CatBoost: Unbiased boosting with categorical features, in Advances in Neural Information Processing Systems, Montréal, Canada, 2018, pp. 6638–6648.
- R. Webster and M.A. Oliver, Geostatistics for Environmental Scientists, John Wiley & Sons, Chichester, 2007.
- E. Isaaks and R. Srivastava, Applied Geostatistics, Oxford University Press, New York, 1989.
- U.E. Kaplan, System and method for grade estimation using gradient boosted decision tree based machine learning algorithms, Australian Patent AU2020100630A4, 2020.
- Y. Dagasan, O. Erten, P. Renard, et al., Multiple-point statistical simulation of the ore boundaries for a lateritic bauxite deposit, Stochastic Environ. Res. Risk Assess. 33 (3) (2019), pp. 865–878. doi:https://doi.org/10.1007/s00477-019-01660-8.
- Y. Bengio and Y. Grandvalet, No unbiased estimator of the variance of k-fold cross-validation, J. Mach. Learn. Res. 5 (2004), pp. 1089–1105.
- J.A. Rice, Mathematical statistics and data analysis, Cengage Learning, 2006.
- XGBoost parameters, 2020. Available at https://xgboost.readthedocs.io/en/latest/parameter.html.
- LightGBM parameters, 2020. Available at https://lightgbm.readthedocs.io/en/latest/parameters.html.
- CatBoost training parameters, 2020. Available at https://catboost.ai/docs.
- G. van Rossum and F.L. Drake Jr., Python Tutorial, Centrum voor Wiskunde en Informatica, Amsterdam, The Netherlands, 1995.
- S. van der Walt, S.C. Colbert, and G. Varoquaux, The NumPy array: A structure for efficient numerical computation, Comput. Sci. Eng. 13 (2) (2011), pp. 22–30. doi:https://doi.org/10.1109/MCSE.2011.37.
- W. McKinney, Data structures for statistical computing in python, in Proceedings of the 9th Python in Science Conference, S. van der Walt and J. Millman, eds., Austin, Texas, 2010, pp. 51–56.
- J.D. Hunter, Matplotlib: A 2D graphics environment, Comput. Sci. Eng. 9 (3) (2007), pp. 90–95. doi:https://doi.org/10.1109/MCSE.2007.55.
- F. Pedregosa, G. Varoquaux, A. Gramfort, et al., Scikit-learn: Machine learning in Python, J. Mach. Learn. Res. 12 (2011), pp. 2825–2830.
- T. Chen and C. Guestrin, XGBoost: A scalable tree boosting system, in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 2016, pp. 785–794.
- D. Renard, N. Bez, N. Desassis, et al., RGeostats: The Geostatistical Package [11.0.1], MINES ParisTech, Fontainebleau, 2015.