Please can the pre-training code about NLI be shared? #4
Comments
Wow, super thanks! Looking forward to them!
Hello! Could you share the code for NLI?
Here are my WeChat and email. Really looking forward to your reply!
I see. I need to find the code for NLI and will post it in the next couple of days. Thank you for your patience!
Thank you for your quick reply! Hope the code hasn't been eaten by mice~
Hi, I am so sorry for the delay; I have been busy with a conference deadline. I just spent half an hour digging through my hard drive for the NLI fine-tuning code, but I could not find it, which is strange. I do recall that I used the Hugging Face code to train the NLI model, which can be found here: https://github.com/abidlabs/pytorch-transformers. I am sorry that I cannot offer the exact script, but I believe it is easy to adapt the code in that link to fine-tune an ALBERT model on NLI. Of course, you can also use the latest Hugging Face Transformers source code. Let me know if you have more questions.
I plan to train more models like ALBERT and will share them later~