chinese-pert-base-mrc

Example code

from modelscope.pipelines import pipeline
from modelscope.utils.constant import Tasks


pipeline_ins = pipeline(
    'fill-mask',
    model='dienstag/chinese-pert-base-mrc',
    model_revision='v1.0.0'
)

print(pipeline_ins('生活的真谛是[MASK]。'))

A Chinese MRC model built on Chinese PERT-base

Please use BertForQuestionAnswering to load this model!
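For example, with Hugging Face transformers the model can be loaded roughly as follows. This is a minimal sketch: the checkpoint path is an assumption and should point at a local download of this model (or the matching Hugging Face repository), and the question/context strings are illustrative only.

import torch
from transformers import BertForQuestionAnswering, BertTokenizerFast

# Assumption: path to a local copy of this checkpoint.
model_path = 'chinese-pert-base-mrc'
tokenizer = BertTokenizerFast.from_pretrained(model_path)
model = BertForQuestionAnswering.from_pretrained(model_path)

question = '生活的真谛是什么？'
context = '有人说，生活的真谛是爱。'
inputs = tokenizer(question, context, return_tensors='pt')

with torch.no_grad():
    outputs = model(**inputs)

# Decode the highest-scoring answer span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs['input_ids'][0][start:end + 1]))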

This is a Chinese machine reading comprehension (MRC) model built on PERT-base and fine-tuned on a mixture of Chinese MRC datasets.

PERT is a pre-trained model based on the permuted language model (PerLM) objective, which learns text semantics in a self-supervised manner without introducing [MASK] tokens. It yields competitive results in tasks such as reading comprehension and sequence labeling.
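As a toy sketch of the idea (an illustration only, not the authors' implementation), PerLM-style input construction shuffles a small proportion of tokens in place, and the training target is each shuffled token's original position, so the input never contains a [MASK] placeholder:

import random

def permute_tokens(tokens, ratio=0.15, seed=0):
    # Toy PerLM-style corruption: pick roughly `ratio` of the positions
    # and permute the tokens among them; the rest of the input stays
    # intact, and no [MASK] token is ever inserted.
    rng = random.Random(seed)
    positions = rng.sample(range(len(tokens)), max(2, int(len(tokens) * ratio)))
    shuffled = positions[:]
    rng.shuffle(shuffled)
    permuted = list(tokens)
    for src, dst in zip(positions, shuffled):
        permuted[dst] = tokens[src]
    # Targets: for each corrupted position, the original token's position.
    targets = {dst: src for src, dst in zip(positions, shuffled)}
    return permuted, targets

print(permute_tokens(list('生活的真谛是爱')))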

Results on Chinese MRC datasets (EM/F1):

(We report the checkpoint that has the best AVG score)

           CMRC 2018 Dev   DRCD Dev    SQuAD-Zen Dev (Answerable)   AVG
PERT-base  73.2/90.6       88.7/94.1   59.7/76.5                    73.9/87.1

Please visit our GitHub repo for more information: https://github.com/ymcui/PERT

You may also be interested in:

Chinese Minority Languages CINO: https://github.com/ymcui/Chinese-Minority-PLM
Chinese MacBERT: https://github.com/ymcui/MacBERT
Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology