GAP FILLING ALGORITHM FOR MOTION CAPTURE DATA TO CREATE REALISTIC VEHICLE ANIMATION
Weronika WACH, Kinga CHWALEBA
Vol. 20 No. 3 (2024), pp. 17-33
Abstract
The dynamic development of the entertainment market creates a need for new methods that make current scientific achievements applicable in practice. Motion capture is one of the cutting-edge technologies that plays a key role in the computer mapping of movement and trajectories. Optical systems make it possible to obtain highly precise motion data, which is often used in computer animation. This study aimed to define a research methodology for analyzing the movement of remotely controlled cars using a gap filling algorithm, developed as part of post-processing, to create realistic vehicle animation. Six types of movement were recorded on a specially prepared model: driving in a straight line forward, driving in a straight line backwards, driving on a curve to the left, driving on a curve to the right, and driving around a roundabout in both directions. The movements were recorded with a VICON passive motion capture system. As a result, three-dimensional models of the vehicles were created and then post-processed, mainly by filling in gaps in the trajectories. The case study highlighted problems such as missing points at the beginning and end of the recordings. An algorithm was therefore developed to solve this problem, allowing an accurate movement trajectory to be obtained along the entire route. Realistic animations were created from the prepared data. The preliminary studies allowed the research method and the implemented algorithm to be verified for obtaining animations that reflect accurate movements.
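The abstract describes the gap filling step only at a high level; the sketch below is an illustration of that kind of trajectory repair, not the authors' implementation. It assumes a marker trajectory stored as a NumPy array of shape (frames, 3) with NaN marking missing samples; the function name, the spline-based filling of interior gaps and the linear extrapolation of the leading and trailing gaps are assumptions made for this example.

import numpy as np
from scipy.interpolate import CubicSpline

def fill_trajectory_gaps(traj):
    """Fill missing samples (NaN) in a marker trajectory of shape (frames, 3).

    Interior gaps are filled with cubic-spline interpolation; missing samples
    at the beginning and end of the recording are linearly extrapolated from
    the nearest valid frames. Illustrative sketch only.
    """
    traj = np.asarray(traj, dtype=float).copy()
    frames = np.arange(traj.shape[0])
    for axis in range(traj.shape[1]):
        values = traj[:, axis]
        valid = ~np.isnan(values)
        if valid.sum() < 2:
            continue  # not enough data to reconstruct this coordinate
        first, last = frames[valid][0], frames[valid][-1]
        # Interior gaps: a cubic spline through the valid samples.
        spline = CubicSpline(frames[valid], values[valid])
        interior = ~valid & (frames > first) & (frames < last)
        values[interior] = spline(frames[interior])
        # Leading/trailing gaps: extrapolate linearly from the edge samples.
        if first > 0:
            slope = values[first + 1] - values[first] if valid[first + 1] else 0.0
            values[:first] = values[first] - slope * (first - frames[:first])
        if last < frames[-1]:
            slope = values[last] - values[last - 1] if valid[last - 1] else 0.0
            values[last + 1:] = values[last] + slope * (frames[last + 1:] - last)
        traj[:, axis] = values
    return traj

Interior gaps are a standard interpolation problem, while the leading and trailing gaps, the problem highlighted in the case study, require extrapolation from the nearest valid frames; here this is done linearly from the edge velocity as a simple, conservative choice.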
Keywords:
License

This work is licensed under a Creative Commons Attribution 4.0 International License.
All articles published in Applied Computer Science are open-access and distributed under the terms of the Creative Commons Attribution 4.0 International License.
