
Conversation

pmeier
Collaborator

@pmeier pmeier commented Jul 28, 2023

Although the sun is setting for TorchScript, it is not officially deprecated, since nothing currently fully replaces it. Thus, "downstream" libraries like TorchVision that started offering TorchScript support still need to maintain it for backwards compatibility (BC).

TorchScript has forced us into workaround after workaround for a long time. Although this makes the code harder to read and maintain, we made our peace with it. However, we are currently exploring more elaborate API designs that are severely hampered by our TorchScript BC guarantees.

While looking for ways to enable our design while keeping a subset of it scriptable, we found the undocumented __prepare_scriptable__ escape hatch (although it was likely not intended as such):

obj = obj.__prepare_scriptable__() if hasattr(obj, '__prepare_scriptable__') else obj # type: ignore[operator]

If an object defines this method, calling torch.jit.script on the object scripts the method's return value rather than the original object. In TorchVision we use exactly this mechanism to enable BC while allowing the object in eager mode to be a lot more flexible (*args, **kwargs, dynamic dispatch, ...).
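To make the pattern concrete, here is a pure-Python sketch (no torch required) of how such a hook can hand the scripting machinery a rigid surrogate while the eager object stays flexible. The `script` function below only mimics the hook dispatch quoted above; `RigidTransform` and `FlexibleTransform` are hypothetical names, not TorchVision classes.

```python
def script(obj):
    # Mimics the dispatch inside torch.jit.script: if the object opts in
    # via __prepare_scriptable__, script the returned surrogate instead.
    obj = obj.__prepare_scriptable__() if hasattr(obj, "__prepare_scriptable__") else obj
    return obj


class RigidTransform:
    """Scriptable surrogate with a fixed, TorchScript-friendly signature."""

    def __call__(self, x):
        return x + 1


class FlexibleTransform:
    """Eager-mode object free to use *args/**kwargs and dynamic dispatch."""

    def __call__(self, *args, **kwargs):
        return args[0] + 1

    def __prepare_scriptable__(self):
        # Hand the scripting machinery a rigid stand-in instead of self.
        return RigidTransform()


scripted = script(FlexibleTransform())
assert isinstance(scripted, RigidTransform)
assert scripted(41) == 42
```

The eager object never has to satisfy TorchScript's restrictions itself; only the surrogate returned by the hook does.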

Unfortunately, this escape hatch is only available for nn.Modules:

pytorch/torch/jit/_script.py

Lines 1279 to 1283 in 0cf9189

if isinstance(obj, torch.nn.Module):
    obj = call_prepare_scriptable_func(obj)
    return torch.jit._recursive.create_script_module(
        obj, torch.jit._recursive.infer_methods_to_compile
    )

This was fine for the example above, since we were subclassing from nn.Module anyway. However, we recently hit a scenario where the object is not an nn.Module.

Given the frozen state on JIT, would it be possible to give us a general escape hatch so that we can move forward with the design unconstrained while still keeping BC?

This PR implements just this by re-using the __prepare_scriptable__ hook.
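The shape of the change can be sketched as follows. This is a hedged, pure-Python illustration of moving the hook dispatch ahead of the nn.Module check, not the actual torch.jit internals; `Module`, `script`, and the returned tags are stand-ins.

```python
class Module:
    """Stand-in for torch.nn.Module."""


def script(obj):
    # Generalized escape hatch: honor __prepare_scriptable__ for ANY object,
    # before deciding how to script it, instead of only for nn.Module.
    if hasattr(obj, "__prepare_scriptable__"):
        obj = obj.__prepare_scriptable__()
    if isinstance(obj, Module):
        return ("script_module", obj)
    return ("script_callable", obj)


class Plain:
    """Not a Module, but its surrogate is."""

    def __prepare_scriptable__(self):
        return Module()


kind, surrogate = script(Plain())
assert kind == "script_module"
assert isinstance(surrogate, Module)
```

With the hook hoisted above the type check, non-Module objects can substitute a scriptable surrogate exactly the way Modules already could.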

@pytorch-bot

pytorch-bot bot commented Jul 28, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/106229

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure

As of commit 1fcca1a:

NEW FAILURE - The following job has failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the release notes: jit release notes category label Jul 28, 2023
@pmeier pmeier requested review from ezyang and lezcano August 11, 2023 08:15
@pmeier pmeier marked this pull request as ready for review August 11, 2023 08:15
Collaborator

@lezcano lezcano left a comment


SGTM as this unblocks vision, and this feature is not documented anyway, but let's see what @ezyang thinks.

@ezyang ezyang requested a review from suo August 11, 2023 15:23
Contributor

@ezyang ezyang left a comment


OK

@pmeier
Collaborator Author

pmeier commented Aug 11, 2023

@pytorchbot merge -f "unrelated dynamo failure"

@pytorchmergebot
Collaborator

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as last resort and instead consider -i/--ignore-current to continue the merge ignoring current failures. This will allow currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@pmeier pmeier deleted the jit-escape branch August 11, 2023 18:25
dherrera-meta added a commit that referenced this pull request Feb 28, 2024
This fixes a case left incomplete by #106229.
The object uses __prepare_scriptable__ correctly inside of torch.jit.script(),
but the closure that is obtained below uses the non-prepared version.
This causes issues when the prepared and non-prepared versions are in different Python modules.

The next diff fails without this fix.

Differential Revision: [D54308741](https://our.internmc.facebook.com/intern/diff/D54308741/)

[ghstack-poisoned]
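The pitfall the follow-up commit describes can be sketched in plain Python; the names below are illustrative, not the real torch.jit internals. The point is that any object captured in a closure must also be run through __prepare_scriptable__, otherwise the scripted code keeps a reference to the non-prepared version even when the top-level object was prepared correctly.

```python
def prepare(obj):
    # Mirror of the hook dispatch used when scripting an object.
    return obj.__prepare_scriptable__() if hasattr(obj, "__prepare_scriptable__") else obj


def resolve_closure(captured):
    # Before the fix, this hypothetical resolver returned `captured`
    # directly (the non-prepared version); after the fix, the hook is
    # honored for closed-over objects as well.
    return prepare(captured)


class Wrapped:
    def __prepare_scriptable__(self):
        return "prepared"


assert resolve_closure(Wrapped()) == "prepared"  # hook honored in closures
assert resolve_closure(42) == 42                 # objects without the hook pass through
```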
pytorchmergebot pushed a commit that referenced this pull request Mar 11, 2024
Summary:
This fixes a case left incomplete by #106229.
The object uses __prepare_scriptable__ correctly inside of torch.jit.script(),
but the closure that is obtained below uses the non-prepared version.
This causes issues when the prepared and non-prepared versions are in different Python modules.

Test Plan:
```
buck2 run mode/opt caffe2/test:jit -- -r test_decorator
```

Differential Revision: D54308741

Re-exporting, as #120806 #121307 were not properly merged.

Co-authored-by: Daniel Herrera <[email protected]>
Pull Request resolved: #121553
Approved by: https://github.com/huydhn, https://github.com/seemethere