Model: matth/flowformer
Automatic detection of blast cells in acute lymphoblastic leukemia (ALL) flow cytometry data using transformers.
Official implementation of our work: "Automated Identification of Cell Populations in Flow Cytometry Data with Transformers" by Matthias Wödlinger, Michael Reiter, Lisa Weijler, Margarita Maurer-Granofszky, Angela Schumich, Elisa O Sajaroff, Stefanie Groeneveld-Krentz, Jorge G Rossi, Leonid Karawajew, Richard Ratei and Michael Dworzak
Load the pretrained model from the Hugging Face Hub:
```python
from transformers import AutoModel

flowformer = AutoModel.from_pretrained("matth/flowformer", trust_remote_code=True)
```
Passing `trust_remote_code=True` is necessary because the model uses a custom architecture whose code is loaded from the model repository.
The model expects as input a PyTorch tensor `x` of shape `batch_size x num_cells x num_markers`. The pretrained model was trained with the markers TIME, FSC-A, FSC-W, SSC-A, CD20, CD10, CD45, CD34, CD19, CD38, SY41. If you use different markers (or a different ordering of markers), you need to specify this by setting the `markers` kwarg in the model forward pass:
```python
output = flowformer(x, markers=["Marker1", "Marker2", "Marker3"])
```
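As a minimal sketch of a full forward pass, assuming the default marker panel above and a raw float tensor as input (the exact structure of `output` is defined by the remote model code, so the shape check at the end is illustrative only):

```python
import torch
from transformers import AutoModel

# Load the pretrained model together with its custom (remote) architecture.
flowformer = AutoModel.from_pretrained("matth/flowformer", trust_remote_code=True)

# Dummy input: 1 sample, 10,000 cells, 11 markers in the pretrained ordering
# (TIME, FSC-A, FSC-W, SSC-A, CD20, CD10, CD45, CD34, CD19, CD38, SY41).
x = torch.rand(1, 10_000, 11)

with torch.no_grad():
    output = flowformer(x)  # per-cell predictions; see the demo notebook for details

print(type(output))  # inspect the returned object / tensor shape
```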
For more information about model usage as well as hands-on examples, check out this demo notebook from my colleague Florian Kowarsch: https://github.com/CaRniFeXeR/python4FCM_Examples/blob/main/hyperflow2023.ipynb
If you use this project, please consider citing our work:
```bibtex
@article{wodlinger2022automated,
  title={Automated identification of cell populations in flow cytometry data with transformers},
  author={Wödlinger, Matthias and Reiter, Michael and Weijler, Lisa and Maurer-Granofszky, Margarita and Schumich, Angela and Sajaroff, Elisa O and Groeneveld-Krentz, Stefanie and Rossi, Jorge G and Karawajew, Leonid and Ratei, Richard and others},
  journal={Computers in Biology and Medicine},
  volume={144},
  pages={105314},
  year={2022},
  publisher={Elsevier}
}
```