
BERT for Arabic Topic Modeling: An Experimental Study on BERTopic Technique


iwan-rg/Arabic-Topic-Modeling


Read the Full-Text Paper

Topic modeling is an unsupervised machine learning technique for discovering abstract topics in a large collection of documents. It helps organize, understand, and summarize large bodies of textual information by uncovering the latent topics that vary across documents in a corpus. Latent Dirichlet Allocation (LDA) and Non-negative Matrix Factorization (NMF) are two of the most popular topic modeling techniques: LDA takes a probabilistic approach, whereas NMF relies on matrix factorization. More recently, BERT-based topic modeling techniques have also emerged. In this paper, we experiment with BERTopic using different pre-trained Arabic language models as embeddings, and compare its results against LDA and NMF. We use the Normalized Pointwise Mutual Information (NPMI) measure to evaluate the resulting topics. Overall, BERTopic produced better results than both NMF and LDA.
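To illustrate the evaluation measure, below is a minimal, self-contained sketch of NPMI topic coherence: for each pair of a topic's top words, NPMI = PMI(w_i, w_j) / (-log P(w_i, w_j)), with probabilities estimated from document-level co-occurrence. The function name, whitespace tokenization, and the -1 floor for never-co-occurring pairs are illustrative assumptions; the paper's experiments would typically rely on an existing coherence implementation rather than this sketch.

```python
import math
from itertools import combinations

def npmi_coherence(topic_words, documents, eps=1e-12):
    """Mean pairwise NPMI of a topic's top words, estimated from
    document-level co-occurrence frequencies (sketch, not the
    paper's exact implementation)."""
    doc_sets = [set(doc.split()) for doc in documents]  # naive whitespace tokenization
    n_docs = len(doc_sets)

    def prob(*words):
        # Fraction of documents containing all of the given words.
        return sum(all(w in d for w in words) for d in doc_sets) / n_docs

    scores = []
    for wi, wj in combinations(topic_words, 2):
        p_ij = prob(wi, wj)
        if p_ij == 0:
            scores.append(-1.0)  # pair never co-occurs: NPMI floor
            continue
        pmi = math.log(p_ij / (prob(wi) * prob(wj)))
        scores.append(pmi / max(-math.log(p_ij), eps))
    return sum(scores) / len(scores)
```

NPMI ranges from -1 (words never co-occur) to 1 (words always co-occur), so higher mean scores indicate more coherent topics.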

Code for this paper: Open In Colab

Citation

Abeer Abuzayed and Hend Al-Khalifa. BERT for Arabic Topic Modeling: An Experimental Study on BERTopic Technique. 5th International Conference on AI in Computational Linguistics, Procedia Computer Science (ISSN: 1877-0509), Elsevier, 2021. (in press).

Team:
