Falcon 2: An 11B parameter pretrained language model and VLM, trained on over 5,000B tokens and 11 languages

By Hugging Face - Blog / May 24, 2024