
Please use BERT-related classes (e.g., `BertTokenizer`, `BertModel`) to load this model!

MiniRBT: A Small Chinese Pre-trained Model

To further advance research and development in Chinese information processing, we release MiniRBT, a small Chinese pre-trained model built with our self-developed knowledge distillation toolkit TextBrewer, combining Whole Word Masking with knowledge distillation.

This repository is based on: https://github.com/iflytek/MiniRBT

You may also be interested in:

More resources by HFL: https://github.com/iflytek/HFL-Anthology