diff --git a/research/_posts/2021-09-01-self-refine.md b/research/_posts/2021-09-01-self-refine.md
index c91d248..6408748 100644
--- a/research/_posts/2021-09-01-self-refine.md
+++ b/research/_posts/2021-09-01-self-refine.md
@@ -13,7 +13,7 @@ description: "Self-Refine Learning For Data-Centric Deep Learning"
 
 ### Abstract
 
-In industry NLP application, our manually labeled data has a certain number of noise data. We present a simple method to find the noise data and remove them. We select the noise data whose human label is not contained in the top-K model's predictions. The experiment result shows that our method works. For industry deep learning application, our method improve the text classification accuracy from 80.5% to 90.6% in dev dataset, and improve the human-evaluation accuracy from 83.2% to 90.1%. The conclusion is: the self-predict and self-drop method of this paper can not improve the accuracy to more than 95%, without human labeling again for the training dataset.
+In industrial NLP applications, manually labeled data contains a certain amount of noisy data. We present a simple method to find and remove this noisy data: we mark an example as noise when its human label is not contained in the model's top-K predictions. The experimental results show that our method works. In an industrial deep learning application, our method improves text classification accuracy from 80.5% to 90.6% on the dev set, and improves human-evaluation accuracy from 83.2% to 90.1%. The conclusion is: the self-predict and self-drop method of this paper cannot raise accuracy above 95% without manually re-labeling the training dataset.
 
 #### Keywords
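
The abstract's selection rule, drop any example whose human label is missing from the model's top-K predictions, can be sketched roughly as below. This is a minimal illustration rather than the paper's actual code; the function name, the `probs`/`human_labels` arrays, and the choice of K are assumptions made for the example.

```python
import numpy as np

def drop_noisy_examples(probs, human_labels, k=3):
    """Keep only examples whose human label appears in the model's top-k predictions.

    probs: (n_examples, n_classes) array of predicted class probabilities
    human_labels: (n_examples,) array of integer class ids assigned by annotators
    k: how many of the model's highest-probability classes to accept
    """
    # Indices of the k highest-probability classes for each example.
    topk = np.argsort(probs, axis=1)[:, -k:]
    # An example is kept when its human label is among those top-k classes.
    keep = np.array([label in row for label, row in zip(human_labels, topk)])
    return keep

# Usage sketch: filter a labeled set before retraining.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.1, 0.8],
                  [0.4, 0.5, 0.1]])
labels = np.array([0, 0, 1])   # the second example's label is outside its top-2 predictions
keep = drop_noisy_examples(probs, labels, k=2)
clean_labels = labels[keep]    # the suspected noisy example is dropped
```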