
Paper: Fooling Neural Networks for Motion Forecasting via Adversarial Attacks

Authors: Edgar Medina and Leyong Loh

Affiliation: QualityMinds GmbH, Germany

Keyword(s): Adversarial Attacks, Human Motion Prediction, 3D Deep Learning, Motion Analyses.

Abstract: Human motion prediction remains an open problem that is critical for autonomous driving and safety applications. Despite great advances in this area, the widely studied topic of adversarial attacks has not been applied to multi-regression models such as GCN- and MLP-based architectures for human motion prediction. This work aims to reduce this gap through extensive quantitative and qualitative experiments on state-of-the-art architectures, in a spirit similar to the early studies of adversarial attacks in image classification. The results suggest that these models are susceptible to attacks even at low perturbation levels. We also present experiments with 3D transformations that affect model performance; in particular, we show that most models are sensitive to simple rotations and translations that do not alter joint distances. We conclude that, much like earlier CNN models, motion forecasting models are susceptible to small perturbations and simple 3D transformations.
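To make the two attack families in the abstract concrete, the following is a minimal sketch, assuming a toy PyTorch forecaster: a one-step sign-gradient (FGSM-style) perturbation of the observed motion, and a rigid rotation that leaves all joint distances unchanged. The ToyForecaster model, the tensor shapes, and the eps value are illustrative assumptions, not the paper's actual architectures or attack settings.

import math
import torch
import torch.nn as nn

J, T_IN, T_OUT = 22, 10, 25  # joints, observed frames, predicted frames (assumed values)

class ToyForecaster(nn.Module):
    """Placeholder MLP that maps an observed pose sequence to a future one (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(T_IN * J * 3, 256), nn.ReLU(),
            nn.Linear(256, T_OUT * J * 3),
        )

    def forward(self, x):  # x: (B, T_IN, J, 3)
        return self.net(x).view(-1, T_OUT, J, 3)

def fgsm_attack(model, x, y, eps=1e-2):
    """One-step sign-gradient perturbation of the observed motion (FGSM-style, regression loss)."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.mse_loss(model(x_adv), y)
    loss.backward()
    return (x_adv + eps * x_adv.grad.sign()).detach()

def rotate_about_z(x, angle_rad):
    """Rigid rotation of every joint about the z-axis; pairwise joint distances are preserved."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    R = torch.tensor([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return x @ R.T

if __name__ == "__main__":
    model = ToyForecaster()
    x = torch.randn(4, T_IN, J, 3)   # random observed sequences (stand-in for real data)
    y = torch.randn(4, T_OUT, J, 3)  # random "ground-truth" futures
    x_adv = fgsm_attack(model, x, y)
    x_rot = rotate_about_z(x, 0.1)
    # Compare predictions on clean, perturbed, and rotated inputs.
    print((model(x) - model(x_adv)).abs().mean().item(),
          (model(x) - model(x_rot)).abs().mean().item())

A sensitivity of the second number to the rotation angle would illustrate the abstract's point that rigid transformations, which change no joint distances, can still shift a model's forecasts.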

CC BY-NC-ND 4.0

Paper citation in several formats:
Medina, E. and Loh, L. (2024). Fooling Neural Networks for Motion Forecasting via Adversarial Attacks. In Proceedings of the 19th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP; ISBN 978-989-758-679-8; ISSN 2184-4321, SciTePress, pages 232-242. DOI: 10.5220/0012562800003660

@conference{visapp24,
author={Edgar Medina and Leyong Loh},
title={Fooling Neural Networks for Motion Forecasting via Adversarial Attacks},
booktitle={Proceedings of the 19th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP},
year={2024},
pages={232-242},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0012562800003660},
isbn={978-989-758-679-8},
issn={2184-4321},
}

TY - CONF
JO - Proceedings of the 19th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP
TI - Fooling Neural Networks for Motion Forecasting via Adversarial Attacks
SN - 978-989-758-679-8
IS - 2184-4321
AU - Medina, E.
AU - Loh, L.
PY - 2024
SP - 232
EP - 242
DO - 10.5220/0012562800003660
PB - SciTePress
ER -