# Apertus
This model was released on 2025-09-02 and added to Hugging Face Transformers on 2025-08-28.
## Overview

Apertus is a family of large language models from the Swiss AI Initiative.
The examples below demonstrate how to generate text with `Pipeline` or `AutoModel`, and from the command line.
Pipeline:

```python
import torch
from transformers import pipeline

pipeline = pipeline(
    task="text-generation",
    model="swiss-ai/Apertus-8B",
    dtype=torch.bfloat16,
    device=0
)
pipeline("Plants create energy through a process known as")
```

AutoModel:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("swiss-ai/Apertus-8B")
model = AutoModelForCausalLM.from_pretrained(
    "swiss-ai/Apertus-8B",
    dtype=torch.bfloat16,
    device_map="auto",
    attn_implementation="sdpa"
)
input_ids = tokenizer("Plants create energy through a process known as", return_tensors="pt").to("cuda")

output = model.generate(**input_ids)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

transformers CLI:

```bash
echo -e "Plants create energy through a process known as" | transformers run --task text-generation --model swiss-ai/Apertus-8B --device 0
```

## ApertusConfig

[[autodoc]] ApertusConfig
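A minimal sketch of building an Apertus model from a configuration. The default `ApertusConfig` values are whatever the library ships with, and the resulting model is randomly initialized rather than loaded from a checkpoint.

```python
from transformers import ApertusConfig, ApertusModel

# Instantiate a configuration with the library defaults
config = ApertusConfig()

# Build a randomly initialized model from that configuration
model = ApertusModel(config)

# Inspect the configuration the model was built with
print(model.config)
```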
## ApertusModel

[[autodoc]] ApertusModel
    - forward
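As a hedged sketch, the bare `ApertusModel` (no language-modeling head) can be used to extract hidden states; the checkpoint name simply reuses the one from the examples above.

```python
import torch
from transformers import AutoTokenizer, ApertusModel

tokenizer = AutoTokenizer.from_pretrained("swiss-ai/Apertus-8B")
model = ApertusModel.from_pretrained("swiss-ai/Apertus-8B", dtype=torch.bfloat16, device_map="auto")

inputs = tokenizer("Plants create energy through a process known as", return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape [batch_size, sequence_length, hidden_size]
print(outputs.last_hidden_state.shape)
```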
## ApertusForCausalLM

[[autodoc]] ApertusForCausalLM
    - forward
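A minimal sketch of computing the causal language-modeling loss with `ApertusForCausalLM`, assuming the usual Transformers convention of passing the input ids as `labels`.

```python
import torch
from transformers import AutoTokenizer, ApertusForCausalLM

tokenizer = AutoTokenizer.from_pretrained("swiss-ai/Apertus-8B")
model = ApertusForCausalLM.from_pretrained("swiss-ai/Apertus-8B", dtype=torch.bfloat16, device_map="auto")

inputs = tokenizer("Plants create energy through a process known as", return_tensors="pt").to(model.device)

# Passing labels equal to input_ids makes the model return the shifted next-token loss
outputs = model(**inputs, labels=inputs["input_ids"])
print(outputs.loss, outputs.logits.shape)
```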
## ApertusForTokenClassification

[[autodoc]] ApertusForTokenClassification
    - forward
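A hedged sketch of loading `ApertusForTokenClassification` on top of the base checkpoint; `num_labels=2` is an illustrative assumption, and the classification head is newly (randomly) initialized until fine-tuned on a labeled dataset.

```python
import torch
from transformers import AutoTokenizer, ApertusForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("swiss-ai/Apertus-8B")
# num_labels is an illustrative assumption; the token-classification head is newly initialized
model = ApertusForTokenClassification.from_pretrained("swiss-ai/Apertus-8B", num_labels=2)

inputs = tokenizer("Plants create energy through a process known as", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# One set of label logits per input token: [batch_size, sequence_length, num_labels]
print(logits.shape)
```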