ruBert-large

The model was trained by the SberDevices team.

  • Task: mask filling
  • Type: encoder
  • Tokenizer: BPE
  • Dict size: 120 138
  • Number of parameters: 427 M
  • Training data volume: 30 GB
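As an encoder trained for mask filling, the model can be queried through the 🤗 Transformers `fill-mask` pipeline. A minimal sketch, assuming the model is published on the Hugging Face Hub under the id `ai-forever/ruBert-large` and uses the standard `[MASK]` token:

```python
from transformers import pipeline

# Hub model id is an assumption; adjust if the model is hosted under a different name.
fill_mask = pipeline("fill-mask", model="ai-forever/ruBert-large")

# "[MASK]" is assumed to be the tokenizer's mask token.
text = "Москва — [MASK] России."

# Each prediction carries the filled token and its probability.
for pred in fill_mask(text):
    print(pred["token_str"], round(pred["score"], 3))
```

The pipeline returns the top candidate tokens for the masked position together with their scores, which is the intended usage for an encoder of this type.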

Authors