Hello, we noticed that the file 20200501.en no longer exists at the path you provided. However, we found the same dataset at another location: https://huggingface.co/datasets/SamuelYang/wikipedia_20200501.en. Additionally, we have precomputed layer-stats weight files for the following models: gpt-j-6B, llama2-7b, llama2-7b-chat, and mistral-7b. I will upload the weights to a cloud drive and write a README listing the download links and the corresponding model files by the day after tomorrow.
@JizhanFang thanks for your reply. It might also be good to modify that line so users can select SamuelYang/wikipedia_20200501.en as the dataset in the hparams file, along the lines of the sketch below.
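A minimal sketch of what that change could look like. load_wiki_for_stats is a hypothetical helper, not the actual layer_stats.py code, and it assumes the SamuelYang mirror loads without an extra config name (not verified):

```python
from datasets import load_dataset

# Hypothetical helper: let the hparams file choose the Wikipedia source
# instead of hard-coding the removed "20200501.en" config.
def load_wiki_for_stats(ds_name: str):
    if ds_name == "SamuelYang/wikipedia_20200501.en":
        # Mirror of the original 2020-05-01 dump; assumed here to load
        # directly from the repo id, with no config name (not verified).
        return load_dataset(ds_name)
    # Fall back to the current canonical dataset and its naming scheme.
    return load_dataset("wikimedia/wikipedia", "20231101.en")
```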
https://github.com/zjunlp/EasyEdit/blob/2bbf0e1e878b355e77279e76fe1f167991a6f19e/easyeditor/models/rome/layer_stats.py#L102C1-L105C10
20200501.en is no longer available in the datasets library, so the call needs to be updated to the current usage (see https://huggingface.co/datasets/wikimedia/wikipedia): ds = load_dataset("wikimedia/wikipedia", "20231101.en"). Since this is likely to affect results, it would be nice to have a way to use the old Wikipedia dataset too.
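For reference, a self-contained version of that updated call; the print line is only an illustrative sanity check, and "20231101.en" follows the dump-date naming scheme of wikimedia/wikipedia (other dump dates use the same YYYYMMDD.lang pattern):

```python
from datasets import load_dataset

# Current canonical Wikipedia dataset on the Hub; configs are named
# by dump date and language, e.g. "20231101.en".
ds = load_dataset("wikimedia/wikipedia", "20231101.en")

# Quick sanity check: peek at the first article's text.
print(ds["train"][0]["text"][:200])
```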