ICCK Transactions on Emerging Topics in Artificial Intelligence, Volume 3, Issue 2, 2026: 86-127

Open Access | Review Article | 18 February 2026
Exploring Graph-Based Techniques in Text Data Processing: A Comprehensive Survey of NLP Advancements
1 Graduate School of Natural and Applied Sciences, Izmir Katip Çelebi University, Izmir 35620, Turkey
2 Department of Software Engineering, Faculty of Technology, Manisa Celal Bayar University, Manisa 45140, Turkey
3 Department of Computer Engineering, Izmir Institute of Technology, Izmir 35430, Turkey
* Corresponding Author: Tuğba Çelikten, [email protected]
ARK: ark:/57805/tetai.2025.740330
Received: 22 November 2025, Accepted: 02 December 2025, Published: 18 February 2026  
Abstract
Graph Neural Networks (GNNs) have become increasingly prominent in Natural Language Processing (NLP) due to their ability to model intricate relationships and contextual connections between texts. Unlike traditional NLP methods, which typically process text linearly, GNNs utilize graph structures to represent the complex relationships between texts more effectively. This capability has led to significant advancements in various NLP applications, such as social media interaction analysis, sentiment analysis, text classification, and information extraction. Notably, GNNs excel in scenarios with limited labeled data, often outperforming traditional approaches by providing deeper, context-aware solutions. Their versatility in handling different data types has made GNNs a popular choice in NLP research. In this study, we thoroughly explored the application of GNNs across various NLP tasks, demonstrating their advantages in understanding and representing text relationships. We also examined how GNNs address traditional NLP challenges, showcasing their potential to deliver more meaningful and accurate results. Our research underscores the value of GNNs as a potent tool in NLP and suggests future research directions to enhance their applicability and effectiveness further.
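As a minimal illustration of the graph view of text that the abstract describes — a sketch of the general technique, not code from the survey — the snippet below builds a word co-occurrence graph from a toy sentence and performs one mean-aggregation message-passing step, the core update behind GCN-style models. It uses only the Python standard library; the window size, the feature scheme, and all names are invented for this example.

```python
# Toy sketch (not from the survey): text as a word co-occurrence graph,
# followed by one simplified GNN message-passing step. The window size
# and the word-length "features" are arbitrary choices for illustration.
from collections import defaultdict

def build_cooccurrence_graph(tokens, window=2):
    """Undirected word graph: edge (u, v) if u and v co-occur
    within `window` positions of each other."""
    neighbors = defaultdict(set)
    for i, u in enumerate(tokens):
        for v in tokens[i + 1 : i + 1 + window]:
            if u != v:
                neighbors[u].add(v)
                neighbors[v].add(u)
    return neighbors

def message_pass(neighbors, features):
    """One simplified GNN layer: each node's new feature vector is the
    mean of its own and its neighbors' current feature vectors."""
    updated = {}
    for node, feat in features.items():
        stacked = [feat] + [features[n] for n in neighbors[node]]
        dim = len(feat)
        updated[node] = tuple(
            sum(vec[d] for vec in stacked) / len(stacked) for d in range(dim)
        )
    return updated

tokens = "graph neural networks model text as a graph".split()
g = build_cooccurrence_graph(tokens, window=2)
# Trivial 1-d "features": word length, just to make the update visible.
feats = {w: (float(len(w)),) for w in g}
out = message_pass(g, feats)
print(sorted(g["graph"]))        # words co-occurring with "graph"
print(round(out["graph"][0], 2)) # its feature after one update: 4.4
```

After one step, the representation of "graph" already mixes information from every word it co-occurs with — the context-aware behavior the abstract attributes to GNNs, in contrast to a purely linear pass over the token sequence.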


Keywords
graph neural networks (GNN)
natural language processing (NLP)
graph convolutional networks (GCN)
heterogeneous graph neural networks
GNN-based NLP

Data Availability Statement
Not applicable.

Funding
This work received no funding.

Conflicts of Interest
The authors declare no conflicts of interest.

AI Use Statement
The authors declare that no generative AI was used in the preparation of this manuscript.

Ethical Approval and Consent to Participate
Not applicable.


Cite This Article
APA Style
Çelikten, T., & Onan, A. (2026). Exploring Graph-Based Techniques in Text Data Processing: A Comprehensive Survey of NLP Advancements. ICCK Transactions on Emerging Topics in Artificial Intelligence, 3(2), 86–127. https://doi.org/10.62762/TETAI.2025.740330


Publisher's Note
ICCK stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and Permissions
CC BY Copyright © 2026 by the Author(s). Published by Institute of Central Computation and Knowledge. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.
ICCK Transactions on Emerging Topics in Artificial Intelligence

ISSN: 3068-6652 (Online)

Email: [email protected]

Portico

All published articles are preserved here permanently:
https://www.portico.org/publishers/icck/