How to win NLP Kaggle competitions
Asked on 2021-03-10 03:29 by Rachel W.

Great video from earlier today: Kaggle grandmasters discuss how they used Hugging Face and other tools to finish in top positions in NLP competitions. A really good intro to the different approaches and strategies.

Response #1
By Lex S. on 2021-03-19 15:53
Yeah, there seem to be two related approaches to building really solid NLP models quickly like this:
  1. Fine-tune an existing pretrained model, like BERT. That's like your point, Rachel.
  2. Pretrain a new BERT-like model on a different corpus of articles and text, so the embeddings fit that domain.
For #2, there are domain-specific variants like LegalBERT, ClimateBERT, medical BERTs, etc.
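Approach #1 can be sketched with the Hugging Face `transformers` Trainer API. This is a minimal sketch only; the checkpoint name, label count, and hyperparameters are illustrative assumptions, not anything from the video:

```python
# Hedged sketch of approach #1: fine-tuning a pretrained BERT checkpoint
# for sequence classification with Hugging Face transformers.
# The checkpoint name, num_labels, and hyperparameters are illustrative
# assumptions, not taken from this thread.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

def build_trainer(train_dataset, eval_dataset,
                  checkpoint="bert-base-uncased", num_labels=2):
    """Wire up a Trainer that fine-tunes `checkpoint` end to end.

    Datasets are assumed to be Hugging Face `datasets` objects with a
    "text" column and a "label" column (an assumption for this sketch).
    """
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=num_labels)

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=256)

    args = TrainingArguments(
        output_dir="bert-finetuned",
        num_train_epochs=3,
        per_device_train_batch_size=16,
        learning_rate=2e-5,  # small LR: nudge the pretrained encoder, don't retrain it
    )
    return Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset.map(tokenize, batched=True),
        eval_dataset=eval_dataset.map(tokenize, batched=True),
        tokenizer=tokenizer,
    )
```

Calling `build_trainer(...).train()` then runs the actual fine-tuning; the small learning rate is the key design choice, since the pretrained weights are already close to where you want them.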

Response #2
By Michelle W. on 2021-03-27 00:23
I wonder if you can combine all these BERTs to build some sort of ensemble model for the next NLP competitions on Kaggle?
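A minimal sketch of one way to do what Michelle suggests: run each BERT variant separately and average their class probabilities (soft voting). Pure Python for illustration; the logit vectors here are made up, whereas in practice each one would come from one fine-tuned model's output layer:

```python
# Hedged sketch of a soft-voting ensemble over several BERT-like models.
# Each model contributes one logit vector per example; we convert each to
# probabilities, average them, and take the argmax. Logit values below are
# made-up placeholders, not real model outputs.
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def ensemble_predict(per_model_logits):
    """Average each model's class probabilities, then pick the argmax.

    per_model_logits: one logit vector per model for a single example,
    e.g. one from a fine-tuned generic BERT, one from LegalBERT, etc.
    Returns (predicted_class, averaged_probabilities).
    """
    probs = [softmax(logits) for logits in per_model_logits]
    n_classes = len(probs[0])
    avg = [sum(p[c] for p in probs) / len(probs) for c in range(n_classes)]
    return max(range(n_classes), key=lambda c: avg[c]), avg

# Hypothetical two-class example: model A leans toward class 0,
# model B leans toward class 1; the average decides.
label, avg = ensemble_predict([[2.0, 0.0], [0.5, 1.5]])
```

Averaging probabilities rather than hard labels lets a confident model outvote an uncertain one, which is why soft voting is the usual default for ensembles like this.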