Research Article

Comparative Investigation of Learning Algorithms for Image Classification with Small Dataset


Figures & data

Figure 1. Three common forms of machine learning, their subclasses, and renowned algorithms


Figure 2. Dataset for the image classification task with ten classes: Calculator, Cigarette Pack, Fork, Glasses, Hook, Mug, Rubber Duck, Scissor, Stapler, and Toothbrush


Table 1. Eight learning algorithms with their learning rates, mini-batch sizes, number of layers, and number of epochs
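Adam is one of the eight adaptive learning algorithms compared in Table 1. As a minimal sketch of its update rule (the hyperparameter values here are common defaults, not necessarily those used in the paper), the following minimizes a toy quadratic with a scalar Adam step:

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter (illustrative hyperparameters)."""
    m = beta1 * m + (1 - beta1) * grad        # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # biased second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(theta) = theta^2, whose gradient is 2 * theta.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 5001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
```

The other optimizers in Table 1 differ mainly in how `m` and `v` are accumulated (e.g. AdaGrad sums squared gradients without decay, AMSGrad keeps a running maximum of `v_hat`).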

Figure 3. The layout of the first and second phases of the proposed algorithm using a coarse-to-fine random sampling scheme

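The two-phase scheme of Figure 3 can be sketched as follows: a coarse phase samples hyperparameters (here, the learning rate) log-uniformly over a wide range, and a fine phase re-samples a narrower band around the best coarse candidate. The `validation_score` function is a hypothetical stand-in for dev-set accuracy, not the paper's objective:

```python
import math
import random

random.seed(0)

def validation_score(lr):
    """Hypothetical stand-in for dev-set accuracy; peaks near lr = 1e-3."""
    return -abs(math.log10(lr) + 3.0)

def sample_log_uniform(lo, hi, n):
    """Draw n learning rates log-uniformly from [lo, hi]."""
    return [10 ** random.uniform(math.log10(lo), math.log10(hi)) for _ in range(n)]

# Phase 1: coarse random search over a wide range.
coarse = sample_log_uniform(1e-6, 1e-1, 20)
best_coarse = max(coarse, key=validation_score)

# Phase 2: fine random search in a narrower band around the phase-1 winner.
fine = sample_log_uniform(best_coarse / 10, best_coarse * 10, 20)
best = max(fine + [best_coarse], key=validation_score)
```

Sampling on a log scale matters here: learning rates spanning several orders of magnitude would be poorly covered by uniform sampling.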

Figure 4. Accuracy and cost curves of eight learning algorithms on training and development sets. (a) Training accuracy, (b) Development accuracy, (c) Training cost, (d) Development cost


Figure 5. Precision-recall curves of eight learning algorithms on test set of ten object classes. (a) SGDNesterov optimizer, (b) AdaGrad optimizer, (c) RMSProp optimizer, (d) AdaDelta optimizer, (e) Adam optimizer, (f) AdaMax optimizer, (g) Nadam optimizer, (h) AMSGrad optimizer

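Each curve in Figure 5 is traced by sweeping a decision threshold over the classifier's per-class scores and computing a (precision, recall) pair at each threshold. A self-contained sketch for one class, on made-up scores and labels:

```python
def precision_recall(scores, labels, threshold):
    """Precision and recall for one class at a given score threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    precision = tp / (tp + fp) if tp + fp else 1.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]   # classifier confidences (made up)
labels = [1,   1,   0,   1,   0,   0]     # ground truth for one class
curve = [precision_recall(scores, labels, t) for t in [0.95, 0.75, 0.35, 0.05]]
```

Lowering the threshold moves along the curve from high precision/low recall toward high recall/lower precision, which is the trade-off each panel of Figure 5 visualizes.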

Figure 6. Confusion matrices of eight learning algorithms on test set of ten object classes. (a) SGDNesterov optimizer, (b) AdaGrad optimizer, (c) RMSProp optimizer, (d) AdaDelta optimizer, (e) Adam optimizer, (f) AdaMax optimizer, (g) Nadam optimizer, (h) AMSGrad optimizer

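A confusion matrix like those in Figure 6 counts, for each true class (row), how often each class was predicted (column). A minimal sketch with the ten class names from Figure 2 and toy predictions (not the paper's results):

```python
CLASSES = ["Calculator", "Cigarette Pack", "Fork", "Glasses", "Hook",
           "Mug", "Rubber Duck", "Scissor", "Stapler", "Toothbrush"]

def confusion_matrix(true_labels, pred_labels, n_classes=10):
    """Rows index the true class, columns the predicted class."""
    cm = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(true_labels, pred_labels):
        cm[t][p] += 1
    return cm

# Toy example: class indices into CLASSES; one Calculator misread as Cigarette Pack.
cm = confusion_matrix([0, 0, 1, 2, 2], [0, 1, 1, 2, 2])
```

Diagonal entries are correct predictions; off-diagonal entries reveal which class pairs the model confuses.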

Table 2. Training time, memory utilization, and test-set accuracy of the eight learning algorithms. Bold font indicates the best results
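The training-time and memory columns of Table 2 could be gathered with a wrapper like the one below, which is only a sketch: it measures wall-clock time with `time.perf_counter` and peak Python-heap allocation with `tracemalloc` (GPU memory, if the paper used it, would need a framework-specific tool instead). The lambda stands in for a training run:

```python
import time
import tracemalloc

def measure(fn):
    """Return (result, elapsed seconds, peak traced bytes) for one run."""
    tracemalloc.start()
    t0 = time.perf_counter()
    result = fn()
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, elapsed, peak

# Stand-in workload for a training loop.
result, elapsed, peak = measure(lambda: sum(i * i for i in range(100_000)))
```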
