
Arabic_BERT-Based_Dependency_Parsing

This repository contains the code for the paper “Parsing as Pretraining” by David Vilares et al., which proposes a method for improving the performance of natural language understanding models by using syntactic parsing as a pretraining task. It also includes our modifications for the experiments in our article “Fine-Tuning BERT-Based Pre-Trained Models for Arabic Dependency Parsing”, in which we evaluated the approach on several Arabic treebanks with different Arabic BERT models.

Colaboratory file

Fine-tune AraBERTV02 to perform sequence-labeling dependency parsing on the ArPoT and PADT treebanks: HERE
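To illustrate what “sequence-labeling dependency parsing” means, here is a minimal sketch (not taken from this repository) of one simple encoding: each word receives a single label combining the relative offset of its head and its dependency relation, so a standard token-classification model can predict the tree. Note that “Parsing as Pretraining” uses a relative PoS-based encoding; the relative-offset variant below is a simplified assumption for illustration.

```python
# Sketch: encode a dependency tree as one label per word ("offset@relation"),
# so parsing reduces to token classification. heads are 1-based; 0 = root.

def encode(heads, deprels):
    """Turn head indices and relations into sequence labels."""
    labels = []
    for i, (h, rel) in enumerate(zip(heads, deprels), start=1):
        offset = h - i  # relative position of the head w.r.t. this word
        labels.append(f"{offset:+d}@{rel}")
    return labels

def decode(labels):
    """Recover head indices and relations from the labels."""
    heads, deprels = [], []
    for i, lab in enumerate(labels, start=1):
        off, rel = lab.split("@")
        heads.append(i + int(off))
        deprels.append(rel)
    return heads, deprels

# Example: a 3-word sentence whose second word is the root.
labels = encode([2, 0, 2], ["nsubj", "root", "obj"])
print(labels)  # ['+1@nsubj', '-2@root', '-1@obj']
```

A BERT model fine-tuned for token classification then predicts one such label per word, and `decode` reconstructs the tree.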
