Development of dead-reckoning sensor system for indoor environments
Issue: Vol. 22 No. 1 (2026)
Toshihiro YUKAWA, 1–19
Abstract
This paper presents a method for constructing a dead-reckoning sensor system that enables accurate autonomous control of a mobile robot in indoor environments. Using off-the-shelf electronic equipment, we propose a system that measures the actual position and azimuth of a mobile robot. The accuracy of the position/posture estimate depends on how the sensor data are fused into a time series recording the robot's actual motion, and on the programs installed on the microcomputer and PC that control the peripheral devices around the sensors. The dynamic characteristics of the mobile robot can be derived for a system equipped with an optical mouse device. The target application is a novel autonomous assistant robot that operates indoors without guidelines or GPS-based guidance. We discuss the adequacy of the sensor system, which determines the positional accuracy of the robot; its position and orientation are obtained from a gyroscope and azimuth sensors. Finally, we experimentally evaluate the performance of the dead-reckoning strategy, which combines optical, orientation, and gyroscope sensors to achieve highly accurate self-position estimation for a mobile robot moving indoors.
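The paper's own implementation is not reproduced here, but the core dead-reckoning idea the abstract describes — integrating displacement reported by an optical (mouse) sensor along a heading integrated from a gyroscope — can be sketched as follows. This is a minimal illustration, not the authors' algorithm; the function name, the (displacement, yaw-rate, sample-period) input format, and the planar-motion assumption are choices made for this example.

```python
import math

def dead_reckon(steps, x=0.0, y=0.0, theta=0.0):
    """Integrate (d, omega, dt) samples into a 2D pose estimate.

    steps: iterable of tuples (d, omega, dt), where d is the forward
    displacement reported by the optical sensor over the interval (m),
    omega is the yaw rate from the gyroscope (rad/s), and dt is the
    sample period (s).  Returns the final pose (x, y, theta).
    """
    for d, omega, dt in steps:
        theta += omega * dt        # heading by integrating the gyro yaw rate
        x += d * math.cos(theta)   # project displacement onto the heading
        y += d * math.sin(theta)
    return x, y, theta
```

For example, 100 samples of 1 cm forward motion, a 90° turn in place, and another 100 samples of forward motion yield a pose near (1 m, 1 m) with a heading of π/2. Because each step adds sensor noise into the running sums, dead-reckoning error grows with distance travelled, which is why the paper's sensor fusion and calibration matter for the achievable accuracy.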
Keywords:
License

This work is licensed under a Creative Commons Attribution 4.0 International License. All articles published in Applied Computer Science are open-access and distributed under the same terms.