
Update FlashInfer version used by vLLM tests on PyTorch CI to v0.3.1 #164562

@huydhn

Description


This is to match what vLLM is using after vllm-project/vllm#25782. We should also explore using the new tools/flashinfer-build.sh script there to simplify the build process. This is an important dependency that we need to update to match the behavior of vLLM CI.
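For context, below is a minimal sketch of what pinning the dependency to the v0.3.1 tag could look like when building from source. The repository URL and the exact commands are assumptions for illustration; the actual CI steps may differ, for example if they go through vLLM's tools/flashinfer-build.sh instead.

```sh
#!/usr/bin/env bash
# Illustrative sketch only: pin FlashInfer to the v0.3.1 tag and build from source.
# The real CI may instead reuse vLLM's tools/flashinfer-build.sh; nothing here is
# that script's actual interface.
set -euo pipefail

# Tag matching what vLLM pins after vllm-project/vllm#25782.
FLASHINFER_VERSION="v0.3.1"

# Assumed upstream repository; clone with submodules in case the build
# vendors third-party dependencies.
git clone --recursive https://github.com/flashinfer-ai/flashinfer.git
cd flashinfer
git checkout "${FLASHINFER_VERSION}"

# Build and install into the current Python environment.
pip install -v .
```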

#164361 attempted to do this while trying to fix trunk, but we encountered several issues along the way that prompted us to abandon the approach:

cc @ezyang @gchanan @zou3519 @kadeng @msaroufim @seemethere @malfet @pytorch/pytorch-dev-infra @yangw-dev

Metadata

Labels

high priority
module: ci (Related to continuous integration)
module: vllm
triage review
triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)


Projects

Status: Prioritized
