Merged
3 changes: 2 additions & 1 deletion .github/workflows/test_exporters_common.yml
@@ -36,7 +36,8 @@ jobs:
        run: |
          pip install --upgrade pip
          pip install --no-cache-dir torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
-         pip install .[tests,exporters]
+         pip install optimum-onnx@git+https://github.com/huggingface/optimum-onnx.git
+         pip install .[tests]
Comment on lines +39 to +40
@IlyasMoutawwakil (Member, Author) commented on Sep 2, 2025:
To fix an unresolvable dependency issue: using `pip install .[onnx]` results in a dependency graph where optimum@PR points to optimum-onnx@main, which in turn points back at optimum@main.


      - name: Test with pytest
        run: |
45 changes: 45 additions & 0 deletions .github/workflows/test_pipelines.yml
@@ -0,0 +1,45 @@
name: Optimum Pipelines / Python - Test

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

concurrency:
  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
  cancel-in-progress: true

env:
  UV_SYSTEM_PYTHON: 1
  UV_TORCH_BACKEND: auto
  TRANSFORMERS_IS_CI: true

jobs:
  build:
    strategy:
      fail-fast: false
      matrix:
        python-version: [3.9]
        runs-on: [ubuntu-22.04]

    runs-on: ${{ matrix.runs-on }}

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install dependencies
        run: |
          pip install --upgrade pip uv
          uv pip install --no-cache-dir optimum-onnx[onnxruntime]@git+https://github.com/huggingface/optimum-onnx.git
          uv pip install --no-cache-dir .[tests]
      - name: Test with pytest
        run: |
          pytest tests/pipelines -vvvv --durations=0
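
The suite in `tests/pipelines` is not rendered in this diff. Below is a minimal sketch of the kind of smoke test such a suite might contain, assuming the `optimum.pipelines.pipeline` factory with its `accelerator="ort"` backend; the test name and the tiny model id are hypothetical.

```python
# Hypothetical smoke test -- a sketch of what tests/pipelines might exercise, not code from this PR.
import pytest

from optimum.pipelines import pipeline


@pytest.mark.parametrize("task", ["text-classification"])
def test_ort_pipeline_smoke(task):
    # Build a pipeline backed by ONNX Runtime; the model is exported to ONNX on the fly.
    pipe = pipeline(task, model="hf-internal-testing/tiny-random-bert", accelerator="ort")
    outputs = pipe("Optimum pipelines now have their own CI workflow.")
    # text-classification pipelines return a list of {"label": ..., "score": ...} dicts.
    assert isinstance(outputs, list)
    assert "label" in outputs[0] and "score" in outputs[0]
```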
36 changes: 0 additions & 36 deletions docs/source/quicktour.mdx
@@ -129,42 +129,6 @@ To train transformers on Habana's Gaudi processors, 🤗 Optimum provides a `Gau

You can find more examples in the [documentation](https://huggingface.co/docs/optimum/habana/quickstart) and in the [examples](https://github.com/huggingface/optimum-habana/tree/main/examples).


#### ONNX Runtime

To train transformers with ONNX Runtime's acceleration features, 🤗 Optimum provides a `ORTTrainer` that is very similar to the 🤗 Transformers [Trainer](https://huggingface.co/docs/transformers/main_classes/trainer). Here is a simple example:

```diff
- from transformers import Trainer, TrainingArguments
+ from optimum.onnxruntime import ORTTrainer, ORTTrainingArguments

# Download a pretrained model from the Hub
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# Define the training arguments
- training_args = TrainingArguments(
+ training_args = ORTTrainingArguments(
output_dir="path/to/save/folder/",
optim="adamw_ort_fused",
...
)

# Create a ONNX Runtime Trainer
- trainer = Trainer(
+ trainer = ORTTrainer(
model=model,
args=training_args,
train_dataset=train_dataset,
+ feature="text-classification", # The model type to export to ONNX
...
)

# Use ONNX Runtime for training!
trainer.train()
```

You can find more examples in the [documentation](https://huggingface.co/docs/optimum/onnxruntime/usage_guides/trainer) and in the [examples](https://github.com/huggingface/optimum/tree/main/examples/onnxruntime/training).

## Out of the box ONNX export

The Optimum library handles out of the box the ONNX export of Transformers and Diffusers models!
3 changes: 2 additions & 1 deletion optimum/configuration_utils.py
@@ -342,6 +342,7 @@ def to_dict(self) -> Dict[str, Any]:
output["transformers_version"] = transformers_version_str
output["optimum_version"] = __version__

self.dict_torch_dtype_to_str(output)
if hasattr(self, "dict_torch_dtype_to_str"):
self.dict_torch_dtype_to_str(output)

return output
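
The change replaces an unconditional call with a `hasattr` guard, presumably so that `to_dict` keeps working when the installed transformers version does not define `dict_torch_dtype_to_str`. A standalone sketch of the same defensive pattern (illustrative only, not the actual optimum configuration class):

```python
# Illustrative sketch of the hasattr guard -- not the actual optimum configuration class.
from typing import Any, Dict


class ExampleConfig:
    """Minimal config that may or may not inherit a dict_torch_dtype_to_str helper."""

    torch_dtype = "float32"

    def to_dict(self) -> Dict[str, Any]:
        output = {"torch_dtype": self.torch_dtype}
        # Only call the helper when the (version-dependent) base class actually provides it.
        if hasattr(self, "dict_torch_dtype_to_str"):
            self.dict_torch_dtype_to_str(output)
        return output


print(ExampleConfig().to_dict())  # helper absent here, so the call is simply skipped
```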
4 changes: 2 additions & 2 deletions optimum/exporters/utils.py
@@ -26,8 +26,8 @@

from ..utils import (
    DIFFUSERS_MINIMUM_VERSION,
-   check_if_diffusers_greater,
    is_diffusers_available,
+   is_diffusers_version,
    logging,
)
from ..utils.import_utils import _diffusers_version
@@ -38,7 +38,7 @@


if is_diffusers_available():
-   if not check_if_diffusers_greater(DIFFUSERS_MINIMUM_VERSION.base_version):
+   if is_diffusers_version("<", DIFFUSERS_MINIMUM_VERSION.base_version):
        raise ImportError(
            f"We found an older version of diffusers {_diffusers_version} but we require diffusers to be >= {DIFFUSERS_MINIMUM_VERSION}. "
            "Please update diffusers by running `pip install --upgrade diffusers`"
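
The new helper takes an explicit comparison operator and a reference version instead of the implied `>=` check. Roughly, such a helper can be built on `packaging.version`; the sketch below is only an assumption about its shape, not the implementation in `optimum.utils`:

```python
# Rough sketch of an is_diffusers_version-style helper -- an assumption, not optimum's code.
# Assumes diffusers is installed so importlib.metadata can resolve its version.
import importlib.metadata
import operator

from packaging import version

_OPERATORS = {"<": operator.lt, "<=": operator.le, "==": operator.eq, ">=": operator.ge, ">": operator.gt}


def is_diffusers_version(op: str, ref: str) -> bool:
    """Compare the installed diffusers version against `ref` using the operator named by `op`."""
    installed = version.parse(importlib.metadata.version("diffusers"))
    return _OPERATORS[op](installed, version.parse(ref))


# Example mirroring the check above (0.22.0 is a placeholder, not optimum's actual minimum).
if is_diffusers_version("<", "0.22.0"):
    raise ImportError("Please update diffusers by running `pip install --upgrade diffusers`")
```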
284 changes: 278 additions & 6 deletions optimum/pipelines/__init__.py

Large diffs are not rendered by default.

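The bulk of the PR, the expanded `optimum/pipelines/__init__.py`, is not rendered above. For context, here is a minimal usage sketch of the pipeline factory it exposes, assuming the documented behavior is preserved and `optimum-onnx[onnxruntime]` is installed as in the new workflow; the model id is only an example.

```python
# Minimal usage sketch of the optimum pipeline factory with a pre-exported ONNX Runtime model.
# Assumes optimum and optimum-onnx[onnxruntime] are installed; the model id is only an example.
from transformers import AutoTokenizer

from optimum.onnxruntime import ORTModelForSequenceClassification
from optimum.pipelines import pipeline

model_id = "distilbert-base-uncased-finetuned-sst-2-english"
# export=True converts the PyTorch checkpoint to ONNX before loading it with ONNX Runtime.
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Passing an ORT model instance makes the pipeline run inference on ONNX Runtime.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Optimum pipelines now have a dedicated test workflow."))
```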