A Computer Vision Approach for Pedestrian Walking Direction Estimation with Wearable Inertial Sensors: PatternNet

Bibliographic Details
Title: A Computer Vision Approach for Pedestrian Walking Direction Estimation with Wearable Inertial Sensors: PatternNet
Authors: Fu, Hanyuan; Bonis, Thomas; Renaudin, Valerie; Zhu, Ni
Contributors: Géolocalisation (AME-GEOLOC), Université Gustave Eiffel; Laboratoire d'Analyse et de Mathématiques Appliquées (LAMA), Université Paris-Est Créteil Val-de-Marne - Paris 12 (UPEC UP12)-Centre National de la Recherche Scientifique (CNRS)-Université Gustave Eiffel; Institute of Electrical and Electronics Engineers (IEEE); ANR-20-LCV1-0002, INMOB, cartographie du handicap par mesure INertielle pour faciliter la MOBilité (2020)
Source: 2023 IEEE/ION Position, Location and Navigation Symposium (PLANS 2023)
https://hal.science/hal-04152049
2023 IEEE/ION Position, Location and Navigation Symposium (PLANS 2023), Institute of Electrical and Electronics Engineers (IEEE), Apr 2023, Monterey, CA, United States. pp.691-699, ⟨10.1109/plans53410.2023.10140028⟩
https://www.ion.org/plans/index.cfm
Publisher: HAL CCSD
IEEE
Publication Year: 2023
Subject Terms: Indoor positioning, inertial sensors, pedestrian navigation, pedestrian dead reckoning, walking direction, deep learning, [INFO.INFO-CV]Computer Science [cs]/Computer Vision and Pattern Recognition [cs.CV], [INFO.INFO-LG]Computer Science [cs]/Machine Learning [cs.LG]
Geographic Subject: Monterey, CA, United States
Description: In this paper, we propose an image-based neural network approach (PatternNet) for walking direction estimation with wearable inertial sensors. Gait event segmentation and projection are used to convert the inertial signals into image-like tabular samples, from which a Convolutional Neural Network (CNN) extracts geometrical features for walking direction inference. To embrace the diversity of individual walking characteristics and the different ways of carrying the device, tailor-made models are constructed based on individual users' gait characteristics and the device-carrying mode. Experimental assessments of the proposed method and a competing method (RoNIN) are carried out in real-life situations over a total walking distance of 3 km, covering indoor and outdoor environments and involving both sighted and visually impaired volunteers carrying the device in three different ways: texting, swinging, and in a jacket pocket. PatternNet estimates the walking direction with a mean accuracy between 7 and 10 degrees for the three test persons, 1.5 times better than the RoNIN estimates.
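The abstract describes a two-stage preprocessing pipeline: gait-event segmentation of the inertial signals, then projection of each gait cycle into an image-like tabular sample that a CNN can consume. The following is a minimal NumPy sketch of that general idea, not the paper's actual method; the helper names (`segment_gait_cycles`, `to_image_sample`), the threshold-based event detector, and the fixed-width resampling are all simplifying assumptions made for illustration.

```python
import numpy as np

def segment_gait_cycles(accel_norm, threshold=1.2, min_gap=20):
    """Hypothetical, simplified gait-event detector: local maxima of the
    acceleration norm above a threshold, at least min_gap samples apart."""
    events, last = [], -min_gap
    for i in range(1, len(accel_norm) - 1):
        if (accel_norm[i] > threshold
                and accel_norm[i] >= accel_norm[i - 1]
                and accel_norm[i] >= accel_norm[i + 1]
                and i - last >= min_gap):
            events.append(i)
            last = i
    return events

def to_image_sample(signal_3axis, events, width=64):
    """Resample each between-event segment to a fixed width and stack the
    three axes, yielding one image-like (3 x width) sample per gait cycle."""
    samples = []
    for start, end in zip(events[:-1], events[1:]):
        seg = signal_3axis[start:end]                      # (n, 3)
        t_old = np.linspace(0.0, 1.0, len(seg))
        t_new = np.linspace(0.0, 1.0, width)
        img = np.stack([np.interp(t_new, t_old, seg[:, a]) for a in range(3)])
        samples.append(img)                                # (3, width)
    return np.array(samples)

# Synthetic, walking-like 3-axis accelerometer signal at 100 Hz.
t = np.arange(0.0, 10.0, 0.01)
acc = np.stack([np.sin(2 * np.pi * t + p) for p in (0.0, 1.0, 2.0)], axis=1)
norm = np.abs(acc).sum(axis=1)

events = segment_gait_cycles(norm, threshold=1.5, min_gap=50)
imgs = to_image_sample(acc, events)
print(imgs.shape)  # (num_cycles, 3, 64)
```

In PatternNet these image-like samples would then feed a CNN; any standard 2-D convolutional architecture can read arrays of this shape once a channel dimension is added.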
Document Type: conference object
Language: English
Relation: hal-04152049; https://hal.science/hal-04152049; https://hal.science/hal-04152049/document; https://hal.science/hal-04152049/file/ION_PLANS_paper-8.pdf
DOI: 10.1109/plans53410.2023.10140028
Availability: https://doi.org/10.1109/plans53410.2023.10140028
https://hal.science/hal-04152049
https://hal.science/hal-04152049/document
https://hal.science/hal-04152049/file/ION_PLANS_paper-8.pdf
Rights: info:eu-repo/semantics/OpenAccess
Accession Number: edsbas.982A6506
Database: BASE