
ERNIE

This model was released on 2019-04-19 and added to Hugging Face Transformers on 2022-09-09.

PyTorch

ERNIE 1.0, ERNIE 2.0, ERNIE 3.0, ERNIE-Gram, and ERNIE-health are a series of powerful models proposed by Baidu that perform especially well on Chinese tasks.

ERNIE (Enhanced Representation through kNowledge IntEgration) is designed to learn language representations enhanced by knowledge masking strategies, which include entity-level masking and phrase-level masking.
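
To make the masking strategies concrete, here is a minimal, illustrative sketch contrasting token-level masking with entity-level span masking. This is not Baidu's pretraining code; the sentence, tokenization, and spans are invented for the example.

```py
# Illustrative only: contrasts BERT-style token masking with the
# entity/phrase-level span masking ERNIE pretrains with.
tokens = ["Harry", "Potter", "is", "a", "series", "of", "fantasy", "novels"]

def mask_span(tokens, start, end, mask_token="[MASK]"):
    """Replace tokens[start:end] with mask tokens (span-level masking)."""
    return tokens[:start] + [mask_token] * (end - start) + tokens[end:]

# Token-level masking hides isolated tokens, so the remaining context
# ("[MASK] Potter") may make the answer trivially guessable.
print(mask_span(tokens, 1, 2))

# Entity-level masking hides the whole entity "Harry Potter", forcing the
# model to recover it from knowledge of the surrounding context.
print(mask_span(tokens, 0, 2))
```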

Other ERNIE models released by Baidu can be found at Ernie 4.5 and Ernie 4.5 MoE.

Click on the ERNIE models in the right sidebar for more examples of how to apply ERNIE to different language tasks.

The example below demonstrates how to predict the `[MASK]` token with `Pipeline`, `AutoModel`, and from the command line.

```py
from transformers import pipeline

pipeline = pipeline(
    task="fill-mask",
    model="nghuyong/ernie-3.0-xbase-zh"
)
pipeline("巴黎是[MASK]国的首都。")  # "Paris is the capital of [MASK]."
```

```py
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "nghuyong/ernie-3.0-xbase-zh",
)
model = AutoModelForMaskedLM.from_pretrained(
    "nghuyong/ernie-3.0-xbase-zh",
    dtype=torch.float16,
    device_map="auto"
)

inputs = tokenizer("巴黎是[MASK]国的首都。", return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model(**inputs)
    predictions = outputs.logits

# Locate the [MASK] position and take the highest-scoring token there
masked_index = torch.where(inputs["input_ids"] == tokenizer.mask_token_id)[1]
predicted_token_id = predictions[0, masked_index].argmax(dim=-1)
predicted_token = tokenizer.decode(predicted_token_id)
print(f"The predicted token is: {predicted_token}")
```

```bash
echo -e "巴黎是[MASK]国的首都。" | transformers run --task fill-mask --model nghuyong/ernie-3.0-xbase-zh --device 0
```
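
The fill-mask pipeline also accepts a `top_k` argument if you want to inspect several candidate completions rather than only the best one. A minimal sketch:

```py
from transformers import pipeline

fill_mask = pipeline(
    task="fill-mask",
    model="nghuyong/ernie-3.0-xbase-zh",
)

# Each prediction is a dict with the candidate token, its score, and the
# completed sequence.
for pred in fill_mask("巴黎是[MASK]国的首都。", top_k=5):
    print(pred["token_str"], pred["score"])
```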

Model variants are available in different sizes and languages.

| Model Name         | Language | Description                     |
|--------------------|----------|---------------------------------|
| ernie-1.0-base-zh  | Chinese  | Layer:12, Heads:12, Hidden:768  |
| ernie-2.0-base-en  | English  | Layer:12, Heads:12, Hidden:768  |
| ernie-2.0-large-en | English  | Layer:24, Heads:16, Hidden:1024 |
| ernie-3.0-base-zh  | Chinese  | Layer:12, Heads:12, Hidden:768  |
| ernie-3.0-medium-zh| Chinese  | Layer:6, Heads:12, Hidden:768   |
| ernie-3.0-mini-zh  | Chinese  | Layer:6, Heads:12, Hidden:384   |
| ernie-3.0-micro-zh | Chinese  | Layer:4, Heads:12, Hidden:384   |
| ernie-3.0-nano-zh  | Chinese  | Layer:4, Heads:12, Hidden:312   |
| ernie-health-zh    | Chinese  | Layer:12, Heads:12, Hidden:768  |
| ernie-gram-zh      | Chinese  | Layer:12, Heads:12, Hidden:768  |
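
Any of these checkpoints can be dropped into the `Auto` classes shown above. As a hedged sketch (the label count and input sentence below are placeholders, and the classification head is randomly initialized until fine-tuned), loading the English `ernie-2.0-base-en` checkpoint for sequence classification might look like this:

```py
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-2.0-base-en")
model = AutoModelForSequenceClassification.from_pretrained(
    "nghuyong/ernie-2.0-base-en",
    num_labels=2,  # assumption: a binary classification task
)

inputs = tokenizer("ERNIE transfers well to downstream tasks.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# The head is freshly initialized, so these logits are only meaningful
# after fine-tuning on a labeled dataset.
print(logits.shape)  # torch.Size([1, 2])
```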

You can find all the supported models on Hugging Face's model hub at huggingface.co/nghuyong, and model details in Paddle's official repos: PaddleNLP and ERNIE's legacy branch.

[[autodoc]] ErnieConfig - all

[[autodoc]] models.ernie.modeling_ernie.ErnieForPreTrainingOutput

[[autodoc]] ErnieModel - forward

[[autodoc]] ErnieForPreTraining - forward

[[autodoc]] ErnieForCausalLM - forward

[[autodoc]] ErnieForMaskedLM - forward

[[autodoc]] ErnieForNextSentencePrediction - forward

[[autodoc]] ErnieForSequenceClassification - forward

[[autodoc]] ErnieForMultipleChoice - forward

[[autodoc]] ErnieForTokenClassification - forward

[[autodoc]] ErnieForQuestionAnswering - forward