Erlangshen-MacBERT-325M-NLI-Chinese

Brief Introduction

MacBERT with 325M parameters, pre-trained for Chinese NLI tasks and fine-tuned on the OCNLI task from FewCLUE.

Model Taxonomy

Demand  | Task | Series     | Model   | Parameter | Extra
General | NLU  | Erlangshen | MacBERT | 325M      | Chinese

Model Information

To improve the model's performance on NLI, we collected a large number of NLI datasets for pre-training and then fine-tuned it on OCNLI, an NLI task from FewCLUE. All training was based on UniMC, the framework we proposed. The results show that, with our training strategies, the 325M-parameter model achieves performance on NLI tasks comparable to that of a 1.3B-parameter model.
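
To make the idea concrete, here is a minimal sketch (our illustration, not the official UniMC code; the field names and Chinese verbalizers are assumptions) of how an OCNLI-style premise/hypothesis pair is recast as a multiple-choice problem whose options are the verbalized labels:

# Minimal sketch of the UniMC idea: an NLI example becomes a multiple-choice
# question whose options are the label names. Field names and verbalizers
# below are illustrative assumptions, not the exact UniMC data schema.
def nli_to_multiple_choice(premise: str, hypothesis: str) -> dict:
    # OCNLI's three labels, verbalized in Chinese:
    # 蕴含 (entailment), 中立 (neutral), 矛盾 (contradiction).
    options = ['蕴含', '中立', '矛盾']
    return {
        'passage': premise,
        'question': f'"{hypothesis}"与上文的关系是?',
        'choices': options,
    }

example = nli_to_multiple_choice('一个男人在弹吉他。', '有人在演奏乐器。')
print(example)  # the model would score each choice and pick 蕴含 (entailment)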

Downstream Performance

Results on the BUSTM task:

Model                                      | BUSTM (acc.)
Erlangshen-UniMC-MegatronBERT-1.3B-Chinese | 76.34
Erlangshen-MacBERT-325M-NLI-Chinese        | 74.42

Usage

from modelscope.pipelines import pipeline
from modelscope.utils.constant import Tasks

# Build a ModelScope fill-mask pipeline for this checkpoint.
pipeline_ins = pipeline(
    Tasks.fill_mask,
    model='Fengshenbang/Erlangshen-MacBERT-325M-NLI-Chinese',
    model_revision='v1.0.1'
)

# The input must contain a [MASK] token for the model to fill; here it
# recovers the missing character in the couplet '飞流直下三千尺,疑是银河落九天'.
print(pipeline_ins('飞流直下三千尺,疑是银河落[MASK]天'))
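
Since this checkpoint is an NLI model, a premise/hypothesis pair can also be scored directly through masked-token prediction. The sketch below uses Hugging Face transformers; the model id, the prompt template, and the single-character verbalizers (是/否) are our assumptions for illustration, not an official inference recipe:

import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed Hugging Face mirror of the checkpoint; adjust the id if needed.
model_id = 'IDEA-CCNL/Erlangshen-MacBERT-325M-NLI-Chinese'
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

premise = '一个男人在弹吉他。'
hypothesis = '有人在演奏乐器。'

# Illustrative prompt: ask whether the hypothesis follows from the premise,
# verbalizing the answer as a single masked character (是 = yes, 否 = no).
text = f'{premise}?{tokenizer.mask_token},{hypothesis}'
inputs = tokenizer(text, return_tensors='pt')
mask_index = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero().item()

with torch.no_grad():
    logits = model(**inputs).logits[0, mask_index]

# Compare the scores of the two candidate answer characters.
for word in ['是', '否']:
    token_id = tokenizer.convert_tokens_to_ids(word)
    print(word, logits[token_id].item())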

Citation

If you use our model in your work, please cite our paper:

@article{fengshenbang,
  author    = {Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen and Ruyi Gan and Jiaxing Zhang},
  title     = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
  journal   = {CoRR},
  volume    = {abs/2209.02970},
  year      = {2022}
}

You can also cite our website:

@misc{Fengshenbang-LM,
  title={Fengshenbang-LM},
  author={IDEA-CCNL},
  year={2021},
  howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}