
Commit 987b3dc

generatedunixname499836121meta-codesync[bot] authored and committed
Improve-graph-break-skip-logs (#167067)
Summary:

Fixes #150477

### Summary:

- Added frame information (function name, file, line number) to all graph break/skip messages
- Standardized message format: "torch.compile will skip tracing the frame <name> (<file> line <N>) and fall back to eager. Reason: <reason>"

### Impacts:

module: dynamo

X-link: pytorch/pytorch#167067

Approved by: https://github.com/williamwen42
Reviewed By: jeanschmidt
Differential Revision: D87036500
fbshipit-source-id: 62281bad4609b8ea3557f7139695678bed0679cb
1 parent 0c70462 commit 987b3dc


userbenchmark/dynamo/dynamobench/_dynamo/utils.py

Lines changed: 5 additions & 2 deletions
```diff
@@ -2248,12 +2248,15 @@ def skip_frame_if_in_functorch_mode(val: torch.Tensor) -> None:
     try:
         val.data_ptr()  # will throw for functorch tensors
     except RuntimeError as e:
-        from .exc import SkipFrame
+        from .exc import format_skip_frame_message, SkipFrame
 
         # This will be GradTrackingTensor/BatchedTensor/etc
         functorch_subclass_name = re.sub(r"\(.*", "", repr(val))
         raise SkipFrame(
-            f"torch.compile cannot be run in context: {functorch_subclass_name}"
+            format_skip_frame_message(
+                None,
+                f"torch.compile cannot be run in context: {functorch_subclass_name}",
+            )
         ) from e
 
 
```
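The call site above passes None as the frame argument, since no user frame is available at this point. As a rough, hypothetical sketch (not the actual implementation, which lives in torch/_dynamo/exc.py and may differ in signature and wording), the helper presumably builds the standardized message described in the summary, roughly along these lines:

```python
# Hypothetical sketch only -- the real format_skip_frame_message in
# torch/_dynamo/exc.py may have a different signature and wording.
from types import FrameType
from typing import Optional


def format_skip_frame_message(frame: Optional[FrameType], reason: str) -> str:
    if frame is not None:
        # With a frame available, include function name, file, and line number,
        # matching the standardized format from the commit summary.
        return (
            f"torch.compile will skip tracing the frame {frame.f_code.co_name} "
            f"({frame.f_code.co_filename} line {frame.f_lineno}) "
            f"and fall back to eager. Reason: {reason}"
        )
    # No frame (as in the functorch check above): report only the reason.
    return (
        "torch.compile will skip tracing the frame and fall back to eager. "
        f"Reason: {reason}"
    )
```

Called as in the diff, format_skip_frame_message(None, "torch.compile cannot be run in context: GradTrackingTensor") would then yield a skip message ending in that reason string.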
