
HEp-Net: a smaller and better deep-learning network for HEp-2 cell classification

Pages 266–272 | Received 06 Nov 2017, Accepted 02 Mar 2018, Published online: 22 Mar 2018

