[EMNLP'21] Mirror-BERT: Converting Pretrained Language Models to universal text encoders without labels.