Model:
laion/CLIP-ViT-B-32-roberta-base-laion2B-s12B-b32k
A CLIP ViT-B/32 model with a RoBERTa-base text encoder, trained on the LAION-2B English subset of LAION-5B ( https://laion.ai/blog/laion-5b/ ) using OpenCLIP ( https://github.com/mlfoundations/open_clip ).
Model training was done by Romain Beaumont on the stability.ai cluster.
Direct uses include zero-shot image classification and image and text retrieval, among others; a minimal sketch follows below.
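A minimal zero-shot classification sketch with OpenCLIP, loading the checkpoint from the Hugging Face Hub via the `hf-hub:` prefix; the image path and prompt list are placeholders for illustration.

```python
import torch
import open_clip
from PIL import Image

model_id = 'hf-hub:laion/CLIP-ViT-B-32-roberta-base-laion2B-s12B-b32k'
model, _, preprocess = open_clip.create_model_and_transforms(model_id)
tokenizer = open_clip.get_tokenizer(model_id)
model.eval()

image = preprocess(Image.open("cat.jpg")).unsqueeze(0)      # placeholder image path
text = tokenizer(["a photo of a cat", "a photo of a dog"])  # candidate labels as prompts

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # Normalize before taking cosine similarities.
    image_features = image_features / image_features.norm(dim=-1, keepdim=True)
    text_features = text_features / text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print("label probabilities:", probs)
```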
Downstream uses include fine-tuning for image classification and other image tasks, linear-probe image classification, and image generation guiding and conditioning, among others; a linear-probe sketch follows below.
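A linear-probe sketch over frozen image features. CIFAR-10 and scikit-learn's LogisticRegression are assumptions made here for illustration, not part of this model card's evaluation setup.

```python
import numpy as np
import torch
import open_clip
from torchvision.datasets import CIFAR10
from torch.utils.data import DataLoader
from sklearn.linear_model import LogisticRegression

model_id = 'hf-hub:laion/CLIP-ViT-B-32-roberta-base-laion2B-s12B-b32k'
model, _, preprocess = open_clip.create_model_and_transforms(model_id)
model.eval()

def encode(train: bool) -> tuple[np.ndarray, np.ndarray]:
    """Encode a CIFAR-10 split into frozen CLIP image features."""
    ds = CIFAR10(root="data", train=train, download=True, transform=preprocess)
    feats, labels = [], []
    with torch.no_grad():
        for images, targets in DataLoader(ds, batch_size=256):
            f = model.encode_image(images)
            f = f / f.norm(dim=-1, keepdim=True)
            feats.append(f.cpu().numpy())
            labels.append(targets.numpy())
    return np.concatenate(feats), np.concatenate(labels)

train_x, train_y = encode(True)
test_x, test_y = encode(False)

# Fit a logistic-regression probe on the frozen features; the backbone is never updated.
probe = LogisticRegression(max_iter=1000)
probe.fit(train_x, train_y)
print("linear-probe accuracy:", probe.score(test_x, test_y))
```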
This model was trained on the 2-billion-sample English subset of LAION-5B ( https://laion.ai/blog/laion-5b/ ).
Training was run with a batch size of 32k for 12B samples of laion2B-en; see the training report at https://wandb.ai/rom1504/open-clip/reports/clip-B-32-roberta-base--VmlldzoyOTM0NDQ3
The model uses a ViT-B/32 image encoder and a RoBERTa-base text encoder initialized from pretrained weights.
Evaluation was done with code from the LAION CLIP Benchmark suite.
Testing is performed with VTAB+ (a combination of VTAB ( https://arxiv.org/abs/1910.04867 ) with additional robustness datasets) for classification, and with COCO and Flickr for retrieval.
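For illustration, a small sketch of the recall@K metric used for text-to-image retrieval on datasets like COCO and Flickr, computed from L2-normalized embedding matrices; this is not the CLIP Benchmark code itself, and the toy inputs below stand in for real CLIP embeddings.

```python
import torch

def recall_at_k(image_feats: torch.Tensor, text_feats: torch.Tensor, k: int = 5) -> float:
    """Text-to-image recall@K, assuming text i is the caption of image i."""
    sims = text_feats @ image_feats.T              # (num_texts, num_images) cosine similarities
    topk = sims.topk(k, dim=-1).indices            # indices of the K most similar images per text
    targets = torch.arange(sims.shape[0]).unsqueeze(-1)
    return (topk == targets).any(dim=-1).float().mean().item()

# Toy usage with random unit vectors in place of real embeddings.
images = torch.nn.functional.normalize(torch.randn(100, 512), dim=-1)
texts = torch.nn.functional.normalize(torch.randn(100, 512), dim=-1)
print("recall@5:", recall_at_k(images, texts, k=5))
```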
The model achieves
We acknowledge stability.ai for the compute used to train this model.
BibTeX:
In addition to the forthcoming LAION-5B ( https://laion.ai/blog/laion-5b/ ) paper, please cite:
OpenAI CLIP paper
```bibtex
@inproceedings{Radford2021LearningTV,
  title     = {Learning Transferable Visual Models From Natural Language Supervision},
  author    = {Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
  booktitle = {ICML},
  year      = {2021}
}
```
OpenCLIP software
```bibtex
@software{ilharco_gabriel_2021_5143773,
  author    = {Ilharco, Gabriel and Wortsman, Mitchell and Wightman, Ross and Gordon, Cade and Carlini, Nicholas and Taori, Rohan and Dave, Achal and Shankar, Vaishaal and Namkoong, Hongseok and Miller, John and Hajishirzi, Hannaneh and Farhadi, Ali and Schmidt, Ludwig},
  title     = {OpenCLIP},
  month     = jul,
  year      = 2021,
  note      = {If you use this software, please cite it as below.},
  publisher = {Zenodo},
  version   = {0.1},
  doi       = {10.5281/zenodo.5143773},
  url       = {https://doi.org/10.5281/zenodo.5143773}
}
```