A unified model for Word-in-Context Disambiguation for both Multilingual and Cross-lingual settings.

InverseAddict/MCL-WiC-Disambiguation

SemEval 2021: MCL-WiC

This repository contains the code for the submissions made by Team PAW to the MCL-WiC shared task.

Paper Title: PAW at SemEval-2021 Task 2: Multilingual and Cross-lingual Word-in-Context Disambiguation: Exploring Cross Lingual Transfer, Augmentations and Adversarial Training

Paper Abstract:

We experiment with XLM-RoBERTa for Word-in-Context Disambiguation in the multilingual and cross-lingual settings, with the aim of developing a single model that handles both. We frame the task as binary classification and also experiment with data augmentation and adversarial training techniques, as well as a 2-stage training technique. Our approaches prove beneficial for both performance and robustness.
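As a rough illustration of the binary-classification framing described above, the sketch below marks the target word in each of the two sentences and pairs them for a cross-encoder such as XLM-RoBERTa. The `<t>`/`</t>` markers and the helper `build_wic_input` are hypothetical, not taken from this repository; the paper's exact input formatting may differ.

```python
# Hypothetical sketch: frame a WiC example as a sentence-pair
# binary classification input (markers and helper names assumed,
# not from this repository's code).

def build_wic_input(sent1, span1, sent2, span2):
    """Wrap the target word occurrence in each sentence with <t> markers.

    span1/span2 are (start, end) character offsets of the target word.
    The two marked sentences would then be fed to a tokenizer as a pair,
    e.g. tokenizer(marked1, marked2, ...), and the pooled representation
    classified into same-meaning (1) vs. different-meaning (0).
    """
    def mark(sent, span):
        start, end = span
        return sent[:start] + "<t> " + sent[start:end] + " </t>" + sent[end:]

    return mark(sent1, span1), mark(sent2, span2)


pair = build_wic_input(
    "He sat on the bank of the river.", (14, 18),
    "She deposited cash at the bank.", (26, 30),
)
print(pair[0])  # He sat on the <t> bank </t> of the river.
print(pair[1])  # She deposited cash at the <t> bank </t>.
```

Wrapping both marked sentences into one sequence-pair input lets a single cross-lingual encoder score the pair regardless of whether the two sentences share a language, which is what makes a unified multilingual/cross-lingual model possible.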

Our paper can be found at https://aclanthology.org/2021.semeval-1.98/
