Model:
MilosKosRad/BioNER
This model was created during a research collaboration between Bayer Pharma and The Institute for Artificial Intelligence Research and Development of Serbia. The model is trained on 26 biomedical Named Entity (NE) classes and can perform zero-shot inference. It can also be further fine-tuned for new classes with just a few examples (few-shot learning). For more details about our method, please see the paper "A transformer-based method for zero and few-shot biomedical named entity recognition". This model corresponds to the PubMedBERT-based model, trained with 1 in the first segment (see the paper for more details).
The model takes two strings as input. String1 is the NE label that is being searched for in String2. String2 is a short text in which one wants to search for the NE (represented by String1). The model outputs a list of ones (corresponding to the tokens of found Named Entities) and zeros (corresponding to non-NE tokens) for String2.
```python
from transformers import AutoTokenizer, BertForTokenClassification

modelname = 'MilosKosRad/BioNER'  # model path on the Hugging Face Hub

tokenizer = AutoTokenizer.from_pretrained(modelname)  # load the model's tokenizer

string1 = 'Drug'
string2 = 'No recent antibiotics or other nephrotoxins, and no symptoms of UTI with benign UA.'
encodings = tokenizer(string1, string2,
                      is_split_into_words=False,
                      padding=True,
                      truncation=True,
                      add_special_tokens=True,
                      return_offsets_mapping=False,
                      max_length=512,
                      return_tensors='pt')

model0 = BertForTokenClassification.from_pretrained(modelname, num_labels=2)
prediction_logits = model0(**encodings)
print(prediction_logits)
```
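The model emits two logits per token, so taking the higher of the two (an argmax) yields the 0/1 labels described above. The helper below is a hypothetical sketch (not part of the released code) showing how one might map those per-token predictions back to token strings; note that in practice the tokens are WordPiece subtokens and include special tokens like `[CLS]` and `[SEP]`.

```python
def extract_entities(tokens, logits):
    """Return the tokens predicted as part of the searched NE class.

    tokens  -- list of token strings, e.g. from tokenizer.convert_ids_to_tokens
    logits  -- one [score_class0, score_class1] pair per token
    A token is kept when the class-1 (entity) score is higher (argmax == 1).
    """
    return [tok for tok, (s0, s1) in zip(tokens, logits) if s1 > s0]


# Toy example with hand-made logits (no model download needed):
tokens = ['[CLS]', 'Drug', '[SEP]', 'No', 'recent', 'antibiotics', '[SEP]']
logits = [[2.0, 0.0]] * 5 + [[0.0, 2.0]] + [[2.0, 0.0]]
print(extract_entities(tokens, logits))  # ['antibiotics']
```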
In order to fine-tune the model on a new entity class using a few shots, the dataset needs to be transformed into a torch.utils.data.Dataset containing the BERT tokens and a sequence of 0s and 1s (1 where the token is a positive member of the given NE class and should be predicted as such). After the dataset is created, the following can be done (for more details, please have a look at the code on GitHub - https://github.com/br-ai-ns-institute/Zero-ShotNER ):
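A minimal sketch of such a Dataset wrapper, assuming `encodings` is the dict returned by the tokenizer (with `return_tensors` omitted) and `labels` holds one 0/1 list per example; the class name `NERDataset` is illustrative, not from the released code:

```python
import torch
from torch.utils.data import Dataset


class NERDataset(Dataset):
    """Wraps tokenizer encodings and per-token 0/1 labels for the Trainer."""

    def __init__(self, encodings, labels):
        self.encodings = encodings  # dict of lists from the tokenizer
        self.labels = labels        # one list of 0s and 1s per example

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        item = {key: torch.tensor(val[idx]) for key, val in self.encodings.items()}
        item['labels'] = torch.tensor(self.labels[idx])
        return item


# Toy example with a single pre-tokenized sentence:
enc = {'input_ids': [[101, 7592, 102]], 'attention_mask': [[1, 1, 1]]}
ds = NERDataset(enc, labels=[[0, 1, 0]])
print(len(ds), ds[0]['labels'].tolist())
```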
```python
import os
import time

from transformers import BertForTokenClassification, Trainer, TrainingArguments

# train1shot, train10shot, train100shot and valid_dataset are
# torch.utils.data.Dataset objects prepared as described above.
for n_shots, train_dataset in [(1, train1shot), (10, train10shot), (100, train100shot)]:
    training_args = TrainingArguments(
        output_dir='./Results' + class_unseen + 'FewShot' + str(n_shots),  # folder to store the results
        num_train_epochs=10,                  # number of training epochs
        per_device_train_batch_size=16,       # batch size per device during training
        per_device_eval_batch_size=16,        # batch size for evaluation
        weight_decay=0.01,                    # strength of weight decay
        logging_dir='./Logs' + class_unseen + 'FewShot' + str(n_shots),  # folder to store the logs
        save_strategy='epoch',
        evaluation_strategy='epoch',
        load_best_model_at_end=True
    )

    model0 = BertForTokenClassification.from_pretrained(model_path, num_labels=2)
    trainer = Trainer(
        model=model0,                 # pre-trained model for fine-tuning
        args=training_args,           # training arguments defined above
        train_dataset=train_dataset,  # dataset object for training (1-, 10- or 100-shot)
        eval_dataset=valid_dataset    # dataset object for validation
    )

    start_time = time.time()
    trainer.train()
    total_time = time.time() - start_time

    save_path = os.path.join('Results', class_unseen, 'FewShot', str(n_shots), 'Model')
    os.makedirs(save_path, exist_ok=True)
    model0.save_pretrained(save_path)

    tokenizer_path = os.path.join('Results', class_unseen, 'FewShot', str(n_shots), 'Tokenizer')
    os.makedirs(tokenizer_path, exist_ok=True)
    tokenizer.save_pretrained(tokenizer_path)
```
The following datasets and entities were used for training, and therefore they can be used as labels in the first segment (as String1). Note that multiword strings have been merged.
On top of this, the model can be used for zero-shot inference on other classes, and it can also be fine-tuned with a few examples of other classes.
The code used for training and testing the model is available at https://github.com/br-ai-ns-institute/Zero-ShotNER
If you use this model, or are inspired by it, please cite the following paper:
Košprdić, M., Prodanović, N., Ljajić, A., Bašaragin, B., Milošević, N., 2023. A transformer-based method for zero and few-shot biomedical named entity recognition. arXiv preprint arXiv:2305.04928. https://arxiv.org/abs/2305.04928
or in BibTeX:
```bibtex
@misc{kosprdic2023transformerbased,
      title={A transformer-based method for zero and few-shot biomedical named entity recognition},
      author={Miloš Košprdić and Nikola Prodanović and Adela Ljajić and Bojana Bašaragin and Nikola Milošević},
      year={2023},
      eprint={2305.04928},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```