Model:
google/mobilebert-uncased
MobileBERT is a thin version of BERT_LARGE, equipped with bottleneck structures and a carefully designed balance between self-attention and feed-forward networks.
This checkpoint is the original MobileBERT Optimized Uncased English checkpoint: uncased_L-24_H-128_B-512_A-4_F-4_OPT.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="google/mobilebert-uncased",
    tokenizer="google/mobilebert-uncased",
)

print(
    fill_mask(f"HuggingFace is creating a {fill_mask.tokenizer.mask_token} that the community uses to solve NLP tasks.")
)
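For finer control than the pipeline offers, the same checkpoint can be loaded directly with the Auto classes and the masked position decoded by hand. This is a minimal sketch (not from the original card), assuming `torch` and `transformers` are installed:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Load the tokenizer and masked-LM head for the same checkpoint.
tokenizer = AutoTokenizer.from_pretrained("google/mobilebert-uncased")
model = AutoModelForMaskedLM.from_pretrained("google/mobilebert-uncased")

text = f"HuggingFace is creating a {tokenizer.mask_token} that the community uses to solve NLP tasks."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the five highest-scoring vocabulary ids.
mask_positions = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top5_ids = logits[0, mask_positions[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top5_ids))
```

This reproduces what the fill-mask pipeline does internally: run the model, select the logits at the mask position, and map the top token ids back to strings.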