Research Article

Explainable human activity recognition based on probabilistic spatial partitions for symbiotic workplaces

Pages 1783-1800 | Received 29 Apr 2022, Accepted 16 Jan 2023, Published online: 27 Feb 2023
 

ABSTRACT

In recent years, smart workplaces that adapt to human activity and motion behavior have been proposed for cognitive production systems. In this respect, methods for identifying the feelings and activities of human workers are being investigated to improve the cognitive capability of smart machines, such as robots, in shared working spaces. Recognizing human activities and predicting the likely next sequence of operations may simplify robot programming and improve collaboration efficiency. However, human activity recognition still requires explainable models that are versatile, robust, and interpretable. It is therefore essential to recognize and analyze the details of human actions using continuous probability density estimates across different workplace layouts. This work presents a novel approach to human activity recognition based on probabilistic spatial partitions (HAROPP). Three scenarios are considered: a standalone workplace, a one-piece-flow U-form workplace, and a human-robot hybrid workplace. The performance of HAROPP is compared to a geometric-bounding-box activity recognition method. Results show that spatial partitions based on probability density contain 20% fewer data frames and cover 10% more spatial area than the geometric bounding box. On average, the approach detects human activities correctly in 81% of cases for a known workplace layout. HAROPP has the potential to scale to cognitive workplaces with a digital twin in the loop, pushing the cognitive capabilities of machine systems and realizing human-centered environments.
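The core comparison in the abstract — a density-based spatial partition versus a geometric bounding box, measured by covered area and by the number of motion frames each region contains — can be sketched numerically. The following is an illustrative reconstruction, not the authors' implementation: the synthetic frames, the grid resolution, the bandwidth h = 0.05, and the 20%-of-peak density threshold are all assumptions made for the sketch.

```python
import numpy as np

def gaussian_kde_grid(points, h, grid_x, grid_y):
    """Evaluate a 2D Gaussian kernel density estimate on a grid.

    h is the kernel bandwidth; the result approximates the density
    estimate f_hat_k from which a contour region can be thresholded.
    """
    xx, yy = np.meshgrid(grid_x, grid_y)  # 'xy' indexing: rows follow y
    density = np.zeros_like(xx)
    for px, py in points:
        density += np.exp(-((xx - px) ** 2 + (yy - py) ** 2) / (2 * h ** 2))
    density /= len(points) * 2 * np.pi * h ** 2
    return density

rng = np.random.default_rng(0)
# Synthetic 2D hand-position frames for one activity (assumed data).
frames = rng.normal(loc=[0.5, 0.5], scale=0.08, size=(500, 2))

gx = np.linspace(0.0, 1.0, 100)
gy = np.linspace(0.0, 1.0, 100)
cell = (gx[1] - gx[0]) * (gy[1] - gy[0])  # area of one grid cell
dens = gaussian_kde_grid(frames, h=0.05, grid_x=gx, grid_y=gy)

# Probabilistic partition: grid cells above an assumed density threshold
# (here 20% of the peak density).
mask = dens >= 0.2 * dens.max()
area_kde = mask.sum() * cell  # approximates A_f_hat_k

# Geometric bounding box over the same frames.
x0, y0 = frames.min(axis=0)
x1, y1 = frames.max(axis=0)
area_bb = (x1 - x0) * (y1 - y0)  # A_bb

# Frames contained in each region.
ix = np.clip(np.searchsorted(gx, frames[:, 0]), 0, len(gx) - 1)
iy = np.clip(np.searchsorted(gy, frames[:, 1]), 0, len(gy) - 1)
in_kde = int(mask[iy, ix].sum())  # frames inside the density region
in_bb = len(frames)               # every frame lies inside its own bbox

print(f"A_kde={area_kde:.4f}  A_bb={area_bb:.4f}  "
      f"frames_kde={in_kde}/{in_bb}")
```

Because the density region discards low-probability outliers while the bounding box must enclose every frame, the two regions trade off area against frame containment — the quantities the paper reports as A_bb, A_f̂k, H̄_bb, and H̄_f̂k.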

Availability of data and material

Not applicable

Code availability

Not applicable.

Consent for publication

The authors affirm that human research participants provided informed consent for publication of the images.

Consent to participate

Voluntary participation and informed consent were obtained from all individual participants included in the study in the lab environment.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Ethics approval

The motion capture data comprise joint positions, orientations, and time stamps. The data are stored anonymously for analysis, with the participants' consent, for academic research.

Symbols and acronyms

λ_a = List of activities
λ_r = List of resources
λ_a = List of cells or tasks
ξ = Observability scope of activities
K = Kernel of density estimates
f̂_k = Kernel density estimate
h = Kernel bandwidth
ζ = Contour path extracted from the density plot
H = Recognized human activity
F = Function
A_bb = Area of the bounding geometry
A_f̂k = Area of the density estimate
H̄_bb = Frames contained within the bounding geometry
H̄_f̂k = Frames contained within the density area
HRC = Human-robot collaboration
HAR = Human activity recognition
HAROPP = Human activity recognition on probabilistic partition
IMU = Inertial measurement unit
YAML = Yet another markup language
MTM = Method-time measurement
MTM-1 = MTM first generation
MTM-2 = MTM second generation
MTM-UAS = MTM Universal Analysing System

Additional information

Funding

This work is supported in part by the Federal Ministry of Education and Research of Germany within the ITEA3 project MOSIM under Grant 01IS18060AH, and by the European Regional Development Fund (EFRE) within the project SMAPS under Grant 0200545. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the funding agencies.
