# BERT for Arabic Topic Modeling: An Experimental Study on BERTopic Technique

Read the Full-Text Paper

Topic modeling is an unsupervised machine learning technique for discovering abstract topics in a large collection of documents. It helps organize, understand, and summarize large collections of textual information and uncover the latent topics that vary among documents in a given corpus. Latent Dirichlet Allocation (LDA) and Non-Negative Matrix Factorization (NMF) are two of the most popular topic modeling techniques: LDA takes a probabilistic approach, whereas NMF takes a matrix factorization approach. Newer techniques based on BERT also exist. In this paper, we experiment with BERTopic using different pre-trained Arabic language models as embeddings and compare its results against LDA and NMF. We use the Normalized Pointwise Mutual Information (NPMI) measure to evaluate the resulting topics. Overall, BERTopic outperformed both NMF and LDA.
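For readers who want a feel for the pipeline, here is a minimal sketch in Python. The placeholder corpus, the embedding model name, and the gensim-based NPMI computation are illustrative assumptions, not the paper's exact setup; the actual experiments are in the Colab notebook linked below.

```python
from bertopic import BERTopic
from sentence_transformers import SentenceTransformer
from gensim.corpora import Dictionary
from gensim.models import CoherenceModel

# Placeholder corpus: BERTopic needs a reasonably large document collection;
# substitute the actual Arabic corpus here.
docs = ["وثيقة عربية أولى ...", "وثيقة عربية ثانية ..."]

# Any Arabic-capable sentence-embedding model can be plugged in; this model
# name is an assumption, not necessarily one evaluated in the paper.
embedder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Fit BERTopic with the chosen embedding model.
topic_model = BERTopic(embedding_model=embedder, language="multilingual")
topics, probs = topic_model.fit_transform(docs)
print(topic_model.get_topic_info().head())  # topic sizes and top words

# NPMI coherence via gensim (an assumed evaluation route; the paper only
# states that NPMI was used, not which tooling computed it).
tokenized = [doc.split() for doc in docs]
dictionary = Dictionary(tokenized)
topic_words = [[word for word, _ in topic_model.get_topic(t)]
               for t in topic_model.get_topics() if t != -1]  # skip outlier topic
npmi = CoherenceModel(topics=topic_words, texts=tokenized,
                      dictionary=dictionary,
                      coherence="c_npmi").get_coherence()
print(f"NPMI coherence: {npmi:.4f}")
```

The same NPMI computation can be applied to the top words produced by LDA and NMF, which is what makes the scores comparable across the three techniques.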

Code for this paper: Open In Colab

## Citation

Abeer Abuzayed and Hend Al-Khalifa. BERT for Arabic Topic Modeling: An Experimental Study on BERTopic Technique. 5th International Conference on AI in Computational Linguistics, Procedia Computer Science (ISSN: 1877-0509), Elsevier, 2021 (in press).

## Team