Falcon 2: An 11B-parameter pretrained language model and VLM, trained on over 5,000B tokens across 11 languages