
BLOOM

This model was released on 2022-11-09 and added to Hugging Face Transformers on 2022-06-09.

PyTorch

The BLOOM model and its smaller variants were developed through the BigScience Workshop. BigScience is inspired by other open-science initiatives in which researchers pool their time and resources to collectively achieve a higher impact. BLOOM's architecture is essentially the same as GPT-3's (an auto-regressive model trained for next-token prediction), but it was trained on data covering 46 natural languages and 13 programming languages. Several smaller versions of the model were trained on the same dataset. BLOOM is available in the following versions:

- [bloom-560m](https://huggingface.co/bigscience/bloom-560m)
- [bloom-1b1](https://huggingface.co/bigscience/bloom-1b1)
- [bloom-1b7](https://huggingface.co/bigscience/bloom-1b7)
- [bloom-3b](https://huggingface.co/bigscience/bloom-3b)
- [bloom-7b1](https://huggingface.co/bigscience/bloom-7b1)
- [bloom](https://huggingface.co/bigscience/bloom) (176B parameters)

A list of official Hugging Face and community (indicated by 🌎) resources to help you get started with BLOOM. If you’re interested in submitting a resource to be included here, please feel free to open a Pull Request and we’ll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.

See also:

- ⚡️ Inference
- ⚙️ Training

[[autodoc]] BloomConfig - all
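As a sketch of how the configuration drives the model, the snippet below builds a deliberately tiny `BloomConfig` and instantiates a randomly initialized model from it. The sizes are illustrative only and do not correspond to any released checkpoint.

```python
from transformers import BloomConfig, BloomModel

# Tiny, illustrative hyperparameters (not a released BLOOM size).
config = BloomConfig(
    vocab_size=128,  # token vocabulary size
    hidden_size=32,  # embedding / hidden dimension
    n_layer=2,       # number of transformer blocks
    n_head=4,        # attention heads (must divide hidden_size)
)
model = BloomModel(config)  # randomly initialized weights

print(config.hidden_size, config.n_layer)
```

To work with the released pretrained weights instead, pass a checkpoint name such as `bigscience/bloom-560m` to `from_pretrained`.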

[[autodoc]] BloomModel - forward
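A minimal forward pass through the bare `BloomModel`, which returns hidden states rather than logits; the tiny randomly initialized configuration below is assumed purely for illustration.

```python
import torch
from transformers import BloomConfig, BloomModel

config = BloomConfig(vocab_size=128, hidden_size=32, n_layer=2, n_head=4)
model = BloomModel(config)

# A batch of one sequence of 8 arbitrary token ids.
input_ids = torch.randint(0, config.vocab_size, (1, 8))
with torch.no_grad():
    outputs = model(input_ids)

# One hidden vector per input token: (batch, seq_len, hidden_size).
print(outputs.last_hidden_state.shape)
```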

[[autodoc]] BloomForCausalLM - forward
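To sketch the next-token-prediction objective described above, `BloomForCausalLM` adds a language-modeling head on top of the decoder; passing `labels=input_ids` makes the model compute the shifted cross-entropy loss itself. A tiny randomly initialized configuration is assumed here; for real text generation you would load a released checkpoint such as `bigscience/bloom-560m` instead.

```python
import torch
from transformers import BloomConfig, BloomForCausalLM

config = BloomConfig(vocab_size=128, hidden_size=32, n_layer=2, n_head=4)
model = BloomForCausalLM(config)

input_ids = torch.randint(0, config.vocab_size, (1, 8))
# With labels=input_ids the model shifts the labels internally
# and returns the language-modeling loss alongside the logits.
outputs = model(input_ids, labels=input_ids)

print(outputs.logits.shape)  # one score per vocabulary token at each position
print(outputs.loss.item())   # scalar cross-entropy loss
```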

[[autodoc]] BloomForSequenceClassification - forward
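`BloomForSequenceClassification` reuses the decoder and classifies from the hidden state of the last token, as is usual for causal-LM classification heads in Transformers. Below is a minimal sketch with a tiny randomly initialized model and an assumed, arbitrary 3-label task.

```python
import torch
from transformers import BloomConfig, BloomForSequenceClassification

# num_labels is task-specific; 3 is an illustrative choice.
config = BloomConfig(vocab_size=128, hidden_size=32, n_layer=2, n_head=4, num_labels=3)
model = BloomForSequenceClassification(config)

input_ids = torch.randint(0, config.vocab_size, (1, 8))
outputs = model(input_ids)

print(outputs.logits.shape)  # one logit per label for the whole sequence
```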

[[autodoc]] BloomForTokenClassification - forward
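For token-level tasks such as named-entity recognition, `BloomForTokenClassification` produces one label distribution per input token. The tiny configuration and the 5-label tag set below are assumptions for illustration.

```python
import torch
from transformers import BloomConfig, BloomForTokenClassification

# num_labels would match the tag set of the task; 5 is illustrative.
config = BloomConfig(vocab_size=128, hidden_size=32, n_layer=2, n_head=4, num_labels=5)
model = BloomForTokenClassification(config)

input_ids = torch.randint(0, config.vocab_size, (1, 8))
outputs = model(input_ids)

print(outputs.logits.shape)  # (batch, seq_len, num_labels): one prediction per token
```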

[[autodoc]] BloomForQuestionAnswering - forward
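`BloomForQuestionAnswering` adds a span-classification head for extractive question answering, predicting start and end positions of the answer over the input tokens. A minimal sketch with a tiny randomly initialized configuration:

```python
import torch
from transformers import BloomConfig, BloomForQuestionAnswering

config = BloomConfig(vocab_size=128, hidden_size=32, n_layer=2, n_head=4)
model = BloomForQuestionAnswering(config)

input_ids = torch.randint(0, config.vocab_size, (1, 8))
outputs = model(input_ids)

# One start score and one end score per input position.
print(outputs.start_logits.shape, outputs.end_logits.shape)
```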