torch.export lift symint input to a graph during dynamo #165073

@zhxchen17

Description

This is an issue filed internally for torch.export team:

torch.export will unlift symint inputs from the graph during tracing, but Dynamo will keep them lifted in the torch IR:

torch.export(strict=True):

    class GraphModule(torch.nn.Module):
        def forward(self, x: "f32[s77, 2]"):
            sym_size_int_1: "Sym(s77)" = torch.ops.aten.sym_size.int(x, 0)

            # File: /tmp/ipykernel_183885/3092020146.py:8 in forward, code: return x + x.shape[0]
            add: "f32[s77, 2]" = torch.ops.aten.add.Tensor(x, sym_size_int_1);  x = sym_size_int_1 = None
            return (add,)

Dynamo:

def forward(self, s77 : torch.SymInt, L_x_ : torch.Tensor):
    l_x_ = L_x_
    add = l_x_ + s77;  l_x_ = s77 = None
    return (add,)

This means that at some step we unlift symint inputs. @tugsbayasgalan suspects this happens during tracing. We need to confirm this and make sure export always consumes the lifted graph and unlifts it at a later stage, to reduce the divergence between torch.export and Dynamo.

cc @chauhang @penguinwu @avikchaudhuri @gmagogsfm @tugsbayasgalan @angelayi @suo @ydwu4

Metadata

Labels

export-triaged (This tag is used to tag issues that have been looked at by the PT2 Export team and a next step determined), oncall: export, oncall: pt2
