Original Articles

Wear Debris Recognition and Quantification in Ferrography Images by Instance Segmentation

Pages 508–518 | Received 09 Nov 2021, Accepted 31 Jan 2022, Published online: 20 Apr 2022

References

  • Mehta, P., Werner, A., and Mears, L. (2015), “Condition Based Maintenance—Systems Integration and Intelligence Using Bayesian Classification and Sensor Fusion,” Journal of Intelligent Manufacturing, 26, pp 331–346. doi:https://doi.org/10.1007/s10845-013-0787-1
  • Roylance, B. J. (2005), “Ferrography—Then and Now,” Tribology International, 38, pp 857–862. doi:https://doi.org/10.1016/j.triboint.2005.03.006
  • Raadnui, S. (2005), “Wear Particle Analysis Utilization of Quantitative Computer Image Analysis: A Review,” Tribology International, 38(10), pp 871–878. doi:https://doi.org/10.1016/j.triboint.2005.03.013
  • Wang, J., Zhang, L., Lu, F., and Wang, X. (2014), “The Segmentation of Wear Particles in Ferrograph Images Based on an Improved Ant Colony Algorithm,” Wear, 311, pp 123–129. doi:https://doi.org/10.1016/j.wear.2014.01.004
  • Qin, G., Yi, X., Li, Y., and Xie, W. (2014), “Automatic Detection Technology and System for Tool Wear,” Optics and Precision Engineering, 22(12), pp 3332–3341.
  • Feng, S., Qiu, G., Luo, J., Han, L., Mao, J., and Zhang, Y. (2019), “A Wear Debris Segmentation Method for Direct Reflection Online Visual Ferrography,” Sensors, 19(3), pp 723. doi:https://doi.org/10.3390/s19030723
  • Wang, J., Yao, P., Liu, W., and Wang, X. (2016), “A Hybrid Method for the Segmentation of a Ferrograph Image Using Marker-Controlled Watershed and Grey Clustering,” Tribology Transactions, 59, pp 513–521. doi:https://doi.org/10.1080/10402004.2015.1091534
  • Gu, D. and Zhou, L. (2005), “Wear Particles Pattern Recognition Based on Neural Network,” International Symposium on Test and Measurement, International Academic Publishers Ltd, Dalian, June 1–4, pp 1901–1904.
  • Stachowiak, G. P., Stachowiak, G. W., and Podsiadlo, P. (2008), “Automated Classification of Wear Particles Based on Their Surface Texture and Shape Features,” Tribology International, 41, pp 34–43. doi:https://doi.org/10.1016/j.triboint.2007.04.004
  • Wang, J. and Wang, X. (2013), “A Wear Particle Identification Method by Combining Principal Component Analysis and Grey Relational Analysis,” Wear, 304, pp 96–102. doi:https://doi.org/10.1016/j.wear.2013.04.021
  • Peng, Y., Wu, T., Cao, G., Huang, S., Wu, H., Kwok, N., and Peng, Z. (2017), “A Hybrid Search-Tree Discriminant Technique for Multivariate Wear Debris Classification,” Wear, 392–393, pp 152–158. doi:https://doi.org/10.1016/j.wear.2017.09.022
  • Li, Q., Zhao, T., Zhang, L., Sun, W., and Zhao, X. (2017), “Ferrography Wear Particles Image Recognition Based on Extreme Learning Machine,” Journal of Electrical and Computer Engineering, 2017(2), pp 1–6. doi:https://doi.org/10.1155/2017/3451358
  • Yuan, W., Chin, K. S., Hua, M., Dong, G., and Wang, C. (2016), “Shape Classification of Wear Particles by Image Boundary Analysis Using Machine Learning Algorithms,” Mechanical Systems and Signal Processing, 72–73, pp 346–358. doi:https://doi.org/10.1016/j.ymssp.2015.10.013
  • Liu, H., Wei, H., Wei, L., Li, J., and Yang, Z. (2015), “An Experiment on Wear Particle’s Texture Analysis and Identification by Using Deterministic Tourist Walk Algorithm,” Industrial Lubrication and Tribology, 67(6), pp 582–593. doi:https://doi.org/10.1108/ILT-01-2015-0008
  • Krizhevsky, A., Sutskever, I., and Hinton, G. E. (2012), “ImageNet Classification with Deep Convolutional Neural Networks,” International Conference on Neural Information Processing Systems, Neural Information Processing Systems, Lake Tahoe, December 3, pp 1097–1105.
  • Peng, Y., Cai, J., Wu, T., Cao, G., Kwokc, N., Zhou, S., and Peng, Z. (2019), “A Hybrid Convolutional Neural Network for Intelligent Wear Particle Classification,” Tribology International, 138, pp 166–173. doi:https://doi.org/10.1016/j.triboint.2019.05.029
  • Wang, S., Wu, T., Shao, T., and Peng, Z. (2019), “Integrated Model of BP Neural Network and CNN Algorithm for Automatic Wear Debris Classification,” Wear, 426–427, pp 1761–1770. doi:https://doi.org/10.1016/j.wear.2018.12.087
  • Peng, P. and Wang, J. (2019), “Wear Particle Classification Considering Particle Overlapping,” Wear, 422–423, pp 119–127. doi:https://doi.org/10.1016/j.wear.2019.01.060
  • Peng, P. and Wang, J. (2019), “FECNN: A Promising Model for Wear Particle Recognition,” Wear, 432–433, pp 202968. doi:https://doi.org/10.1016/j.wear.2019.202968
  • Garcia-Garcia, A., Orts-Escolano, S., Oprea, S., Villena-Martinez, V., Martinez-Gonzalez, P., and Garcia-Rodriguez, J. (2018), “A Survey on Deep Learning Techniques for Image and Video Semantic Segmentation,” Applied Soft Computing, 70, pp 41–65.
  • He, K., Gkioxari, G., Dollár, P., and Girshick, R. (2017), “Mask R-CNN,” IEEE International Conference on Computer Vision, IEEE, Venice, October 22–29, pp 2980–2988.
  • Ren, S., He, K., Girshick, R., and Sun, J. (2015), “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(6), pp 1137–1149.
  • He, K., Zhang, X., Ren, S., and Sun, J. (2016), “Deep Residual Learning for Image Recognition,” IEEE Conference on Computer Vision and Pattern Recognition, IEEE, Las Vegas, June 27–30, pp 770–778.
  • Lin, T., Dollár, P., Girshick, R., He, K., Hariharan, B., and Belongie, S. (2017), “Feature Pyramid Networks for Object Detection,” IEEE Conference on Computer Vision and Pattern Recognition, IEEE, Honolulu, July 21–26, pp 936–944.
  • Shelhamer, E., Long, J., and Darrell, T. (2017), “Fully Convolutional Networks for Semantic Segmentation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(4), pp 640–651.
  • Yu, F. and Koltun, V. (2015), “Multi-Scale Context Aggregation by Dilated Convolutions,” arXiv e-prints, arXiv:1511.07122.
  • Chen, L., Papandreou, G., Kokkinos, I., Murphy, K., and Yuille, A. (2018), “DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 40(4), pp 834–848. doi:https://doi.org/10.1109/TPAMI.2017.2699184
  • Zhang, Z., Wang, X., and Jung, C. (2019), “DCSR: Dilated Convolutions for Single Image Super-Resolution,” IEEE Transactions on Image Processing, 28(4), pp 1625–1635. doi:https://doi.org/10.1109/TIP.2018.2877483
  • Ioffe, S. and Szegedy, C. (2015), “Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift,” International Conference on Machine Learning, Journal of Machine Learning Research, Lille, July 7–9, pp 448–456.
  • Glorot, X., Bordes, A., and Bengio, Y. (2011), “Deep Sparse Rectifier Neural Networks,” Journal of Machine Learning Research, 15, pp 315–323.
  • Abadi, M., Barham, P., Chen, J. M., Chen, Z. F., Davis, A., Dean, J., Devin, M., Ghemawat, S., Irving, G., Isard, M., Kudlur, M., Levenberg, J., Monga, R., Moore, S., Murray, D. G., Steiner, B., Tucker, P., Vasudevan, V., Warden, P., Wicke, M., Yu, Y., and Zheng, X. Q. (2016), “TensorFlow: A System for Large-Scale Machine Learning,” Proceedings of the 12th USENIX Conference on Operating Systems Design and Implementation, USENIX, Savannah, November 2–4, pp 265–283.
  • Oquab, M., Bottou, L., Laptev, I., and Sivic, J. (2014), “Learning and Transferring Mid-Level Image Representations Using Convolutional Neural Networks,” IEEE Conference on Computer Vision and Pattern Recognition, IEEE, Columbus, June 23–28, pp 1717–1724.
  • Lin, T., Maire, M., Belongie, S., Hays, J., Perona, P., Ramanan, D., Dollár, P., and Zitnick, C. L. (2014), “Microsoft COCO: Common Objects in Context,” European Conference on Computer Vision, Springer, Zurich, September 6–12, pp 740–755.
