
# Abstractive Summarization of Portuguese Texts by Fine-Tuning the Portuguese-Based T5 Model

This project fine-tunes the Portuguese-vocabulary pretrained T5 model for abstractive text summarization in Brazilian Portuguese.

By fine-tuning Google's T5 model for abstractive text summarization on the XSum dataset, together with its Portuguese-pretrained counterpart PT-T5, this project explores abstractive summarization with deep learning in both English and Portuguese.

Please refer to the adjacent PDF paper for further details on the project's development and results.
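A fine-tuned T5 checkpoint of this kind can be exercised for summarization with the Hugging Face `transformers` library. The sketch below is illustrative only: the checkpoint name, generation settings, and example sentence are assumptions, not this project's exact configuration.

```python
def build_t5_input(text: str, task_prefix: str = "summarize: ", max_chars: int = 4000) -> str:
    """T5 is conditioned on a task prefix; normalize whitespace and
    truncate very long articles before tokenization."""
    return task_prefix + " ".join(text.split())[:max_chars]


if __name__ == "__main__":
    # Requires `transformers` and `torch`. The checkpoint name below is an
    # assumed PTT5 checkpoint for illustration, not necessarily the one
    # produced by this project.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    model_name = "unicamp-dl/ptt5-base-portuguese-vocab"
    tokenizer = T5Tokenizer.from_pretrained(model_name)
    model = T5ForConditionalGeneration.from_pretrained(model_name)

    article = "O governo anunciou hoje novas medidas econômicas para conter a inflação."
    inputs = tokenizer(
        build_t5_input(article), return_tensors="pt", truncation=True, max_length=512
    )
    summary_ids = model.generate(
        **inputs, max_length=64, num_beams=4, early_stopping=True
    )
    print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

Beam search (`num_beams=4`) is a common default for summarization; greedy decoding or sampling are equally valid choices depending on the desired fluency/diversity trade-off.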

## Acknowledgements


This repository was developed as the final project for the IA376 graduate course, taught by Professors Rodrigo Nogueira and Roberto Lotufo at the University of Campinas (UNICAMP).