mT5
This model was released on 2020-10-22 and added to Hugging Face Transformers on 2020-11-17.
mT5 is a multilingual variant of T5, trained on 101 languages. It also incorporates a technique to prevent "accidental translation", where the model incorrectly translates its predictions into the wrong language.
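To get a concrete feel for this multilinguality, the short sketch below (an illustration, assuming the publicly available google/mt5-small checkpoint) shows that a single mT5 tokenizer handles several scripts with one shared SentencePiece vocabulary:

```py
from transformers import AutoTokenizer

# One shared SentencePiece vocabulary covers all 101 languages.
tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")

for text in ["Hello world", "Bonjour le monde", "こんにちは世界", "مرحبا بالعالم"]:
    print(text, "->", tokenizer.tokenize(text))
```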
You can find all the original mT5 checkpoints under the mT5 collection.
Click on the mT5 models in the right sidebar for more examples of how to apply mT5 to different language tasks.
The examples below demonstrate how to summarize text with Pipeline, AutoModel, and from the command line.
```py
# Pipeline
import torch
from transformers import pipeline

pipeline = pipeline(
    task="text2text-generation",
    model="csebuetnlp/mT5_multilingual_XLSum",
    dtype=torch.float16,
    device=0
)
pipeline("""Plants are remarkable organisms that produce their own food using a method called photosynthesis.
This process involves converting sunlight, carbon dioxide, and water into glucose, which provides energy for growth.
Plants play a crucial role in sustaining life on Earth by generating oxygen and serving as the foundation of most ecosystems.""")
```

```py
# AutoModel
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "csebuetnlp/mT5_multilingual_XLSum"
)
model = AutoModelForSeq2SeqLM.from_pretrained(
    "csebuetnlp/mT5_multilingual_XLSum",
    dtype=torch.float16,
    device_map="auto",
)

input_text = """Plants are remarkable organisms that produce their own food using a method called photosynthesis.
This process involves converting sunlight, carbon dioxide, and water into glucose, which provides energy for growth.
Plants play a crucial role in sustaining life on Earth by generating oxygen and serving as the foundation of most ecosystems."""
input_ids = tokenizer(input_text, return_tensors="pt").to(model.device)

output = model.generate(**input_ids, cache_implementation="static")
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

```bash
# transformers CLI
echo -e "Plants are remarkable organisms that produce their own food using a method called photosynthesis." | transformers run --task text2text-generation --model csebuetnlp/mT5_multilingual_XLSum --device 0
```

Quantization reduces the memory burden of large models by representing the weights in a lower precision. Refer to the Quantization overview for more available quantization backends.
The example below uses bitsandbytes to quantize only the weights to int4.

```py
import torch
from transformers import BitsAndBytesConfig, AutoModelForSeq2SeqLM, AutoTokenizer

quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_type="nf4"
)
model = AutoModelForSeq2SeqLM.from_pretrained(
    "csebuetnlp/mT5_multilingual_XLSum",
    dtype=torch.bfloat16,
    device_map="auto",
    quantization_config=quantization_config
)

tokenizer = AutoTokenizer.from_pretrained(
    "csebuetnlp/mT5_multilingual_XLSum"
)
input_text = """Plants are remarkable organisms that produce their own food using a method called photosynthesis.
This process involves converting sunlight, carbon dioxide, and water into glucose, which provides energy for growth.
Plants play a crucial role in sustaining life on Earth by generating oxygen and serving as the foundation of most ecosystems."""
input_ids = tokenizer(input_text, return_tensors="pt").to(model.device)

output = model.generate(**input_ids, cache_implementation="static")
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
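To check how much memory the 4-bit weights actually take, you can continue the example above with get_memory_footprint, which reports the model's parameter and buffer size in bytes:

```py
# Continues the quantized example above; reports model size in bytes.
print(f"Memory footprint: {model.get_memory_footprint() / 1e9:.2f} GB")
```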
- mT5 must be fine-tuned for downstream tasks because it was only pretrained on the mC4 dataset; a minimal fine-tuning sketch follows below.
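The sketch below is only an illustration of what that fine-tuning can look like, assuming the google/mt5-small checkpoint and a hypothetical toy dataset of (input, target) text pairs; a real setup needs a proper corpus, batching, evaluation, and tuned hyperparameters.

```py
import torch
from torch.optim import AdamW
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

# Hypothetical toy pairs; replace with a real downstream dataset.
pairs = [
    ("Plants produce their own food using photosynthesis.",
     "Plants make food via photosynthesis."),
]

optimizer = AdamW(model.parameters(), lr=1e-4)
model.train()
for input_text, target_text in pairs:
    inputs = tokenizer(input_text, return_tensors="pt")
    labels = tokenizer(text_target=target_text, return_tensors="pt").input_ids
    # Seq2seq models in Transformers compute the cross-entropy loss
    # internally when labels are passed.
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```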
MT5Config
[[autodoc]] MT5Config

MT5Model
[[autodoc]] MT5Model

MT5ForConditionalGeneration
[[autodoc]] MT5ForConditionalGeneration

MT5EncoderModel
[[autodoc]] MT5EncoderModel

MT5ForSequenceClassification
[[autodoc]] MT5ForSequenceClassification

MT5ForTokenClassification
[[autodoc]] MT5ForTokenClassification

MT5ForQuestionAnswering
[[autodoc]] MT5ForQuestionAnswering