Research Article

Effective human–object interaction recognition for edge devices in intelligent space

Pages 1-9 | Received 27 Apr 2023, Accepted 04 Dec 2023, Published online: 23 Dec 2023

Figures & data

Figure 1. Overview of the proposed human–object interaction.

Figure 2. Human–object interaction class examples recognized by the proposed method.

Figure 3. Action class examples recognized by the proposed method.

Figure 4. Model accuracies of the proposed method: (a) bottle, (b) mobile phone, (c) book, (d) keyboard, and (e) actions (no interaction).

Table 1. Robustness experiment.

Table 2. Robustness with various subjects.

Table 3. Robustness with various camera angles.

Table 4. Robustness with various camera frame sizes.

Table 5. Robustness in various environments.

Table 6. Ablation study on the explanatory variables in the proposed method.

Table 7. Comparison of accuracy with consideration of multi-camera complementarity.

Table 8. Effect of the training dataset size on accuracy.

Table 9. Effect of the number of recognized interaction models on accuracy.

Table 10. Recognition speed experiment on various processors.

Table A1. Comparison of accuracy with consideration of depth coordinates.

Figure A1. Multi-person interaction recognition.