MT-DNN is an open-source natural language understanding (NLU) toolkit that makes it easy for researchers and developers to train customized deep learning models. Built on PyTorch and Transformers, MT-DNN is designed to facilitate rapid customization for a broad spectrum of NLU tasks, using a variety of objectives (classification, regression, structured prediction) and text encoders (e.g., RNNs, BERT, RoBERTa, UniLM).
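The core design is a shared text encoder with lightweight task-specific output heads. A minimal sketch of that idea is below; the class, head names, and label sizes are illustrative assumptions, not the toolkit's actual API:

```python
# Minimal sketch of a shared-encoder / per-task-head model in the spirit
# of MT-DNN. Names and head configurations are illustrative only.
import torch.nn as nn
from transformers import AutoModel

class MultiTaskModel(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased"):
        super().__init__()
        # One encoder shared across all tasks.
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # One lightweight output head per task/objective type.
        self.heads = nn.ModuleDict({
            "classification": nn.Linear(hidden, 3),  # e.g., NLI labels
            "regression": nn.Linear(hidden, 1),      # e.g., STS-B similarity
        })

    def forward(self, input_ids, attention_mask, task):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] representation
        return self.heads[task](cls)
```

During multi-task training, mini-batches from the different tasks are interleaved; each batch updates the shared encoder together with the head of the task it came from.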
A unique feature of MT-DNN is its built-in support for robust and transferable learning using the adversarial multi-task learning paradigm. To enable efficient production deployment, MT-DNN also supports multi-task knowledge distillation, which can substantially compress a deep neural model without a significant performance drop. We demonstrate the effectiveness of MT-DNN on a wide range of NLU applications across general and biomedical domains.
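For intuition, the two objectives can be sketched as follows: a SMART-style adversarial smoothness term (see the SMART paper below) and a standard soft-label distillation loss. Function names, the normalization scheme, and hyperparameters here are assumptions for illustration, not MT-DNN's exact implementation:

```python
# Illustrative sketches of the two training objectives; names and
# hyperparameters are assumptions, not the toolkit's exact code.
import torch
import torch.nn.functional as F

def adversarial_smoothness(forward_fn, embeds, clean_logits,
                           noise_eps=1e-5, step_size=1e-3):
    # SMART-style regularizer: find a small embedding perturbation that
    # most changes the prediction, then penalize that change so the model
    # stays smooth around each training input.
    noise = (torch.randn_like(embeds) * noise_eps).requires_grad_()
    adv_logits = forward_fn(embeds + noise)
    kl = F.kl_div(F.log_softmax(adv_logits, dim=-1),
                  F.softmax(clean_logits.detach(), dim=-1),
                  reduction="batchmean")
    (grad,) = torch.autograd.grad(kl, noise)
    delta = step_size * grad / (grad.norm() + 1e-8)  # one worst-case step
    adv_logits = forward_fn(embeds + delta.detach())
    # Symmetric KL between clean and perturbed predictions.
    return (F.kl_div(F.log_softmax(adv_logits, dim=-1),
                     F.softmax(clean_logits, dim=-1), reduction="batchmean")
            + F.kl_div(F.log_softmax(clean_logits, dim=-1),
                       F.softmax(adv_logits, dim=-1), reduction="batchmean"))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Knowledge distillation: the student matches the teacher's softened
    # output distribution in addition to the gold labels.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```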
This repository is a pip-installable package that implements Multi-Task Deep Neural Networks (MT-DNN) for Natural Language Understanding, as described in the following papers:
Xiaodong Liu*, Pengcheng He*, Weizhu Chen and Jianfeng Gao. Multi-Task Deep Neural Networks for Natural Language Understanding. ACL 2019. (*: equal contribution)
Xiaodong Liu, Pengcheng He, Weizhu Chen and Jianfeng Gao. Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding. arXiv preprint.
Pengcheng He, Xiaodong Liu, Weizhu Chen and Jianfeng Gao. Hybrid Neural Network Model for Commonsense Reasoning. arXiv preprint.
Liyuan Liu, Haoming Jiang, Pengcheng He, Weizhu Chen, Xiaodong Liu, Jianfeng Gao and Jiawei Han. On the Variance of the Adaptive Learning Rate and Beyond. arXiv preprint.
Haoming Jiang, Pengcheng He, Weizhu Chen, Xiaodong Liu, Jianfeng Gao and Tuo Zhao. SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization. arXiv preprint.
Xiaodong Liu, Yu Wang, Jianshu Ji, Hao Cheng, Xueyun Zhu, Emmanuel Awa, Pengcheng He, Weizhu Chen, Hoifung Poon, Guihong Cao and Jianfeng Gao. The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding. arXiv preprint.
Xiaodong Liu, Hao Cheng, Pengcheng He, Weizhu Chen, Yu Wang, Hoifung Poon and Jianfeng Gao. Adversarial Training for Large Neural Language Models. arXiv preprint.
GitHub: https://github.com/microsoft/MT-DNN