Comparative Study of Inductive Graph Neural Network Models for Text Classification

EasyChair Preprint no. 9135

5 pages · Date: October 26, 2022

Abstract

Among the many methods proposed for text classification, transformers and graph neural networks (GNNs) have recently gained popularity. GNN-based models are either transductive or inductive. Transductive models such as TextGCN convert the whole corpus into a single graph and therefore fail to scale to larger datasets. Inductive models were introduced to address this: they convert each individual document into a graph that is fed to the model for classification. In this paper, a comparative study of three inductive GNN models, namely TextING, In-GCN, and In-GAT, is presented. The study shows that In-GAT gave better results than the other two models. It is also shown that the message-passing mechanism does not have a significant effect on model performance, and that the entropy loss value depends on the size of the dataset and the model used.
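The inductive approach described above, in which each document becomes its own graph rather than one node in a corpus-level graph, can be sketched as a sliding-window word co-occurrence construction of the kind TextING uses. The function name and the window size below are illustrative, not from the paper:

```python
from collections import defaultdict

def doc_to_graph(tokens, window=3):
    """Build a per-document word co-occurrence graph (a minimal sketch).

    Nodes are the document's unique words; an edge links two words that
    co-occur within a sliding window, weighted by co-occurrence count.
    """
    nodes = sorted(set(tokens))
    index = {w: i for i, w in enumerate(nodes)}
    edges = defaultdict(int)
    for i, w in enumerate(tokens):
        # Pair the current word with the next (window - 1) words.
        for j in range(i + 1, min(i + window, len(tokens))):
            u, v = index[w], index[tokens[j]]
            if u != v:  # skip self-loops from repeated words
                edges[(min(u, v), max(u, v))] += 1
    return nodes, dict(edges)

nodes, edges = doc_to_graph("the cat sat on the mat".split(), window=3)
```

Because the graph is built from a single document, unseen test documents can be classified without rebuilding any corpus-wide structure, which is what makes these models inductive.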

Keyphrases: Entropy, Gated Graph Recurrent Unit, Graph Attention Network, Graph Convolutional Network, inductive model, text classification

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:9135,
  author = {Saran Pandian and Uttkarsh Chaurasia and Shudhanshu Ranjan and Shefali Saxena},
  title = {Comparative Study of Inductive Graph Neural Network Models for Text Classification},
  howpublished = {EasyChair Preprint no. 9135},
  year = {EasyChair, 2022}}