torch.jit.script escape hatch #106229
SGTM as this unblocks vision, and this feature is not documented anyway, but let's see what @ezyang thinks.
OK
@pytorchbot merge -f "unrelated dynamo failure"

Merge started. Your change will be merged immediately since you used the force (`-f`) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
This fixes a case left incomplete by #106229. The object uses `__prepare_scriptable__` correctly inside of `torch.jit.script()`, but the closure that is obtained below uses the non-prepared version. This causes issues when the prepared and non-prepared versions are in different Python modules. The next diff fails without this fix.

Differential Revision: [D54308741](https://our.internmc.facebook.com/intern/diff/D54308741/)
Summary: This fixes a case left incomplete by #106229. The object uses `__prepare_scriptable__` correctly inside of `torch.jit.script()`, but the closure that is obtained below uses the non-prepared version. This causes issues when the prepared and non-prepared versions are in different Python modules.

Test Plan:
```
buck2 run mode/opt caffe2/test:jit -- -r test_decorator
```

Differential Revision: D54308741

Re-exporting, as #120806 and #121307 were not properly merged.
Co-authored-by: Daniel Herrera <[email protected]>
Pull Request resolved: #121553
Approved by: https://github.com/huydhn, https://github.com/seemethere
Although the sun is setting for TorchScript, it is not officially deprecated, since nothing currently fully replaces it. Thus, "downstream" libraries like TorchVision that started offering TorchScript support still need to support it for backwards compatibility (BC).
TorchScript has forced us to use workaround after workaround since forever. Although this makes the code harder to read and maintain, we have made our peace with it. However, we are currently looking into more elaborate API designs that are severely hampered by our TorchScript BC guarantees.
While looking for ways to enable our design while keeping a subset of it scriptable, we found the undocumented `__prepare_scriptable__` escape hatch (although it was likely not intended as such): see `pytorch/torch/jit/_script.py`, line 977 at commit `0cf9189`.
One can define this method, and if `torch.jit.script` is called on the object, the object returned by the method is scripted rather than the original one. In TorchVision we are using exactly this mechanism to enable BC while allowing the object in eager mode to be a lot more flexible (`*args`, `**kwargs`, dynamic dispatch, ...). Unfortunately, this escape hatch is only available for `nn.Module`s: see `pytorch/torch/jit/_script.py`, lines 1279 to 1283 at commit `0cf9189`.
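To illustrate the mechanism, here is a minimal sketch. The class names are made up for illustration, but `__prepare_scriptable__` is the actual hook consulted by `torch.jit.script` for `nn.Module`s:

```python
import torch
import torch.nn as nn


class _RigidImpl(nn.Module):
    # Hypothetical TorchScript-compatible implementation with a
    # fixed, annotated signature.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * 2


class FlexibleModule(nn.Module):
    # Eager-mode module that is free to use *args/**kwargs and
    # dynamic dispatch -- none of which TorchScript supports.
    def forward(self, *args, **kwargs):
        return args[0] * 2

    def __prepare_scriptable__(self):
        # torch.jit.script() scripts the object returned here
        # instead of this module.
        return _RigidImpl()


# Scripts _RigidImpl, not FlexibleModule; eager users keep the
# flexible interface, scripted users get the rigid one.
scripted = torch.jit.script(FlexibleModule())
```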
This was fine for the example above, since we were subclassing from `nn.Module` anyway. However, we recently hit a case where that was not true. Given the frozen state of JIT, would it be possible to give us a general escape hatch, so that we can move forward with the design unconstrained while still keeping BC?
This PR implements just that by re-using the `__prepare_scriptable__` hook.
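Conceptually, the generalization amounts to checking for the hook before the `nn.Module`-only path, so that any object can substitute a scriptable replacement for itself. A pure-Python sketch of that dispatch follows; the names are illustrative, and the real `torch.jit.script` of course compiles the prepared object rather than returning it:

```python
def script(obj):
    # Generalized escape hatch: any object (module, function, or
    # plain class instance) may hand back a scriptable replacement.
    if hasattr(obj, "__prepare_scriptable__"):
        obj = obj.__prepare_scriptable__()
    # ... the real implementation would compile `obj` here ...
    return obj


class FlexibleOp:
    """Eager-mode callable that is not an nn.Module."""

    def __call__(self, *args, **kwargs):
        raise NotImplementedError("flexible eager path elided")

    def __prepare_scriptable__(self):
        # Hand back a rigid, annotation-friendly replacement.
        def rigid_op(x: int) -> int:
            return x + 1

        return rigid_op


prepared = script(FlexibleOp())
```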