GitHub repository: https://github.com/tomohideshibata/BERT-related-papers
Table of Contents
Survey paper
Downstream task
Generation
Quality evaluator
Modification (multi-task, masking strategy, etc.)
Transformer variants
Probe
Inside BERT
Multi-lingual
Other than English models
Domain specific
Multi-modal
Model compression
Misc.