A Dataset of Human Motion and Muscular Activities in Manual Material Handling tasks for Biomechanical and Ergonomic Analyses
Bassani G.;Filippeschi A.;Avizzano C. A.
2021-01-01
Abstract
Manual Material Handling (MMH) activities represent a large portion of workers’ tasks in the tertiary sector. The ability to monitor, model, and predict human behaviours is crucial both to the design of productive human-robot collaboration and to an efficient physical exposure assessment system that can prevent Work-related Musculoskeletal Disorders (WMSDs), with the ultimate goal of improving workers’ quality of life. The combined use of wearable sensors and machine learning (ML) techniques can fulfil these purposes. Inertial Measurement Units (IMUs) and surface Electromyography (sEMG) allow the collection of kinematic data and muscular activity information that can be used for biomechanical analyses, ergonomic risk assessment, and as input to ML algorithms aimed at joint torque/load estimation and Human Activity Recognition (HAR). The latter needs a large number of annotated training samples, and the use of publicly available datasets is the way forward. Nowadays, the majority of such datasets concern Activities of Daily Life (ADLs) and, since they include only kinematic data, have limited applications. This paper presents a fully labelled dataset of working activities that includes full-body kinematics from 17 IMUs and upper-limb sEMG data from 16 channels. Fourteen subjects participated in the experiment, performed in laboratory settings, for a total of 18.6 hours of recordings. The activities are divided into two sets. The first includes lifting, lowering, and carrying objects, MMH activities suitable for ergonomic risk assessment and HAR. The second includes isokinetic arm movements, mainly targeting load and joint torque estimation.
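As a minimal sketch of how such a dataset could feed a HAR pipeline, the following Python snippet segments multichannel IMU and sEMG streams into fixed-length windows and extracts one RMS feature per channel per window. Only the channel counts (17 IMUs, 16 sEMG channels) come from the paper; the sampling rate, window length, per-IMU axis count, and the synthetic signals are assumptions for illustration.

```python
import numpy as np

# Hypothetical HAR feature-extraction sketch. Channel counts follow the
# dataset description (17 IMUs, 16 sEMG channels); sampling rate, window
# length, and the 3-axis-per-IMU layout are assumptions, not from the paper.
FS = 100          # assumed sampling rate (Hz)
WIN = 200         # window length in samples (2 s)
N_IMU = 17 * 3    # 17 IMUs, 3 acceleration axes each (assumption)
N_EMG = 16        # 16 sEMG channels

def window_features(signals, win=WIN):
    """Split a (samples x channels) array into fixed windows and return
    one per-channel RMS feature vector per window."""
    n_windows = signals.shape[0] // win
    feats = []
    for i in range(n_windows):
        seg = signals[i * win:(i + 1) * win]
        feats.append(np.sqrt(np.mean(seg ** 2, axis=0)))  # per-channel RMS
    return np.array(feats)

# Synthetic stand-in for one 60 s recording
rng = np.random.default_rng(0)
imu = rng.normal(size=(60 * FS, N_IMU))
emg = rng.normal(size=(60 * FS, N_EMG))

# Concatenate kinematic and muscular features window by window
X = rng and np.hstack([window_features(imu), window_features(emg)])
print(X.shape)  # (30, 67): 30 windows, 51 IMU + 16 sEMG features
```

Each row of `X` would then be paired with its activity label (e.g. lifting, lowering, carrying) and passed to a classifier; RMS is only one of many common time-domain features for this purpose.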