Model:

DeepPavlov/bert-base-multilingual-cased-sentence


bert-base-multilingual-cased-sentence

Sentence Multilingual BERT (101 languages, cased, 12‑layer, 768‑hidden, 12‑heads, 180M parameters) is a representation‑based sentence encoder covering the 101 languages of Multilingual BERT. It is initialized from Multilingual BERT and then fine‑tuned on the English MultiNLI corpus[1] and on the development set of the multilingual XNLI benchmark[2]. Sentence representations are mean‑pooled token embeddings, computed in the same manner as in Sentence‑BERT[3].
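Since the card describes the pooling scheme only in prose, below is a minimal sketch of how such sentence embeddings can be computed with the Hugging Face `transformers` library. The masked mean pooling shown is an assumption based on the Sentence‑BERT description above, not code published with this model.

```python
# Minimal sketch: mean-pooled sentence embeddings from this model.
# Assumes `transformers` and `torch` are installed; the pooling logic
# is an assumption following the Sentence-BERT recipe, not official code.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "DeepPavlov/bert-base-multilingual-cased-sentence"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

sentences = ["Hello, world!", "Привет, мир!"]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    # Token-level embeddings: shape (batch, seq_len, 768)
    token_embeddings = model(**inputs).last_hidden_state

# Mean-pool over real tokens only, masking out padding positions.
mask = inputs["attention_mask"].unsqueeze(-1).float()
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embeddings.shape)  # torch.Size([2, 768])
```

The resulting 768‑dimensional vectors can then be compared with, for example, cosine similarity for cross‑lingual sentence matching.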

[1]: Williams A., Nangia N., Bowman S. (2017) A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference. arXiv preprint arXiv:1704.05426

[2]: Conneau A., Rinott R., Lample G., Williams A., Bowman S., Schwenk H., Stoyanov V. (2018) XNLI: Evaluating Cross-lingual Sentence Representations. arXiv preprint arXiv:1809.05053

[3]: Reimers N., Gurevych I. (2019) Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks. arXiv preprint arXiv:1908.10084