Authors: Ramirez, Heilym; Velastin Carroza, Sergio Alejandro; Cuéllar, Sara; Fabregas, Ernesto; Farias, Gonzalo
Date available: 2023-06-08
Date issued: 2023-02-01
Citation: Ramirez, H.; Velastin, S.A.; Cuellar, S.; Fabregas, E.; Farias, G. BERT for Activity Recognition Using Sequences of Skeleton Features and Data Augmentation with GAN. Sensors 2023, 23, 1400. https://doi.org/10.3390/s23031400
ISSN: 1424-3210
URI: https://hdl.handle.net/10016/37440
Abstract: Recently, the scientific community has placed great emphasis on the recognition of human activity, especially in the area of health and care for the elderly. There are already practical applications of activity and unusual-condition recognition that use body sensors such as wrist-worn devices or neck pendants. These relatively simple devices may be prone to errors, can be uncomfortable to wear, may be forgotten or not worn, and cannot detect more subtle conditions such as incorrect postures. Therefore, other proposed methods are based on the use of images and videos to carry out human activity recognition, even in open spaces and with multiple people. However, the resulting increase in the size and complexity of image data requires the most recent advances in machine learning and deep learning techniques. This paper presents an attention-based deep learning approach for recognizing activities from multiple frames. Feature extraction is performed by estimating the pose of the human skeleton, and classification is performed with a neural network based on Bidirectional Encoder Representations from Transformers (BERT). The algorithm was trained on the public UP-Fall dataset, augmented with more balanced artificial data generated by a Generative Adversarial Network (GAN), and evaluated on real data, outperforming the results of other activity recognition methods on the same dataset.
Pages: 15
Language: eng
Rights: © 2023 by the authors
License: Attribution 3.0 Spain (CC BY 3.0 ES)
Keywords: activity recognition; BERT; computer vision; human skeleton; pose estimation
Title: BERT for Activity Recognition Using Sequences of Skeleton Features and Data Augmentation with GAN
Type: research article
Subject: Computer Science
DOI: https://doi.org/10.3390/s23031400
Access rights: open access
Journal: Sensors
Volume: 23
Issue: 3
Record ID: AR/0000033063
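
Note: the abstract describes classifying sequences of skeleton-pose features with a BERT-style transformer encoder. The minimal PyTorch sketch below illustrates that general idea only; it is not the authors' implementation, and the feature dimension (17 2D keypoints per frame, i.e., 34 values), sequence length, model sizes, and number of activity classes are illustrative assumptions.

    # Illustrative sketch: a BERT-style transformer encoder over sequences of
    # per-frame skeleton features, with a [CLS]-style token used for classification.
    import torch
    import torch.nn as nn

    class SkeletonTransformerClassifier(nn.Module):
        def __init__(self, feat_dim=34, d_model=128, n_heads=4, n_layers=4,
                     seq_len=30, n_classes=12):
            super().__init__()
            self.embed = nn.Linear(feat_dim, d_model)               # project per-frame features
            self.cls = nn.Parameter(torch.zeros(1, 1, d_model))     # learnable [CLS]-style token
            self.pos = nn.Parameter(torch.zeros(1, seq_len + 1, d_model))  # learned positions
            layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                               dim_feedforward=4 * d_model,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, n_layers)
            self.head = nn.Linear(d_model, n_classes)               # activity logits

        def forward(self, x):                                       # x: (batch, seq_len, feat_dim)
            h = self.embed(x)
            cls = self.cls.expand(h.size(0), -1, -1)
            h = torch.cat([cls, h], dim=1) + self.pos               # prepend token, add positions
            h = self.encoder(h)
            return self.head(h[:, 0])                               # classify from the [CLS] slot

    model = SkeletonTransformerClassifier()
    logits = model(torch.randn(8, 30, 34))   # batch of 8 clips, 30 frames, 34 features each
    print(logits.shape)                       # torch.Size([8, 12])

In such a setup, the pose estimator supplies the per-frame keypoint vectors, and GAN-generated sequences would simply be added to the training set to balance under-represented classes before fitting the classifier.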