ICCK Transactions on Advanced Computing and Systems, Volume 2, Issue 1, 2026: 25-41

Open Access | Research Article | 25 December 2025
VNNPF: A Variational Neural Network with Planar Flow for Robust IMU-GPS Fusion and Trajectory Estimation
Heran Fu 1 and Xuebo Jin 1,*
1 School of Computer and Artificial Intelligence, Beijing Technology and Business University, Beijing 100048, China
* Corresponding Author: Xuebo Jin, [email protected]
Received: 16 January 2025, Accepted: 18 December 2025, Published: 25 December 2025  
Abstract
Accurate state estimation for dynamic targets is essential in fields such as target tracking, navigation, and autonomous driving. However, traditional estimation models struggle to handle the nonlinear motion patterns and sensor noise prevalent in real-world environments. To address these challenges, this paper proposes a novel end-to-end estimation model named Variational Neural Network with Planar Flow (VNNPF). The model integrates a Bayesian Gated Recurrent Unit (BGRU) as the process model, a planar flow-based variational autoencoder (PFVAE) as the measurement model, and a Bayesian hyperparameter optimization module inspired by Kalman filtering. The BGRU captures nonlinear temporal dependencies and enhances robustness by modeling parameters as distributions. PFVAE transforms simple latent distributions into more complex posteriors, enabling more accurate modeling of colored noise in sensor data. The Kalman-inspired update module computes a learnable gain to fuse prior and posterior information effectively. Experiments on the KITTI IMU–GPS benchmark demonstrate that VNNPF consistently achieves lower state-estimation errors than several state-of-the-art neural network baselines. These results indicate that VNNPF can provide accurate and robust trajectory estimation for nonlinear dynamic systems with complex sensor noise.
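To make the flow component concrete, the sketch below shows a generic planar flow layer of the kind the PFVAE builds on: the invertible map f(z) = z + u · tanh(wᵀz + b) together with its log-determinant, which is what allows a simple Gaussian posterior to be reshaped into a more flexible one. This is a minimal NumPy illustration under our own naming (planar_flow, the two-layer stack, and the random parameters are all assumptions), not the authors' implementation.

```python
import numpy as np

def planar_flow(z, u, w, b):
    """One planar flow step: f(z) = z + u * tanh(w.z + b).

    Returns the transformed sample and log|det(df/dz)| so the
    change-of-variables correction can be accumulated.
    """
    lin = np.dot(w, z) + b                      # scalar pre-activation
    f_z = z + u * np.tanh(lin)                  # transformed sample
    psi = (1.0 - np.tanh(lin) ** 2) * w         # gradient of tanh(w.z + b) w.r.t. z
    log_det = np.log(np.abs(1.0 + np.dot(u, psi)) + 1e-9)
    return f_z, log_det

# Push one standard-normal latent sample through a small stack of flows.
# (In a trained PFVAE, u, w, b would be learned and u constrained so that
#  w.u >= -1 to keep each layer invertible; random values are used here
#  purely for illustration.)
rng = np.random.default_rng(0)
dim = 4
z = rng.standard_normal(dim)
log_q_correction = 0.0
for _ in range(2):
    u, w, b = rng.standard_normal(dim), rng.standard_normal(dim), rng.standard_normal()
    z, log_det = planar_flow(z, u, w, b)
    log_q_correction -= log_det                 # log q_K(z_K) = log q_0(z_0) + correction
print(z, log_q_correction)
```

Read in the same spirit, the Kalman-inspired update described above can be interpreted as a learnable blend x_post = x_prior + K * (x_meas - x_prior), where the gain K is produced by the network rather than computed from error covariances; this is our reading of the abstract, not the paper's exact formulation.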

Keywords
state estimation
Bayesian recurrent neural networks
variational autoencoders
planar flow
Kalman filter
IMU-GPS fusion

Data Availability Statement
Data will be made available on request.

Funding
This work was supported in part by the National Natural Science Foundation of China under Grant 62173007, Grant 62203020, Grant 62473008, Grant 62433002, and Grant 62476014; in part by the Beijing Nova Program under Grant 20240484710; in part by the Project of Humanities and Social Sciences (Ministry of Education in China, MOC) under Grant 22YJCZH006; in part by the Beijing Scholars Program under Grant 099; in part by the Project of the All-China Federation of Supply and Marketing Cooperatives under Grant 202407; and in part by the Project of the Beijing Municipal University Teacher Team Construction Support Plan under Grant BPHR20220104.

Conflicts of Interest
The authors declare no conflicts of interest.

Ethical Approval and Consent to Participate
Not applicable.

Cite This Article
APA Style
Fu, H., & Jin, X. (2025). VNNPF: A Variational Neural Network with Planar Flow for Robust IMU-GPS Fusion and Trajectory Estimation. ICCK Transactions on Advanced Computing and Systems, 2(1), 25–41. https://doi.org/10.62762/TACS.2025.570823

Publisher's Note
ICCK stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and Permissions
Copyright © 2025 by the Author(s). Published by Institute of Central Computation and Knowledge. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
ICCK Transactions on Advanced Computing and Systems

ISSN: 3068-7969 (Online)

Email: [email protected]

All published articles are preserved permanently in Portico:
https://www.portico.org/publishers/icck/