Forward XLATensorImpl::is_contiguous_custom to TensorImpl. #8032

Open · wants to merge 10 commits into master

Conversation

ysiraichi
Collaborator

This PR fixes #7998. Instead of always returning true, we forward this call to the base class TensorImpl::is_contiguous_custom().

The reason is that, once pytorch/pytorch#135498 is merged, XLA tensors' metadata may stop reflecting the actual XLA storage, which means the tensors' strides might not always be contiguous. Whenever that happens, the tensor.is_contiguous() call should be consistent with the tensors' strides.
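
For context, a minimal sketch of the forwarding described above (file and method names follow the existing XLATensorImpl override; treat this as illustrative, not the exact diff):

```cpp
// Sketch of the change in torch_xla/csrc/tensor_impl.cpp (illustrative).
#include <c10/core/MemoryFormat.h>
#include <c10/core/TensorImpl.h>

// ... XLATensorImpl derives from c10::TensorImpl ...

bool XLATensorImpl::is_contiguous_custom(c10::MemoryFormat memory_format) const {
  // Before: the override unconditionally returned true, since XLA storage is
  // always contiguous and memory formats were not tracked.
  //
  // After: defer to the base class so the answer stays consistent with the
  // tensor's strides once metadata can diverge from the underlying XLA
  // storage (see pytorch/pytorch#135498).
  return c10::TensorImpl::is_contiguous_custom(memory_format);
}
```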

cc @miladm @JackCaoG @alanwaketan

@JackCaoG
Collaborator

test failure seems real?

@JackCaoG
Collaborator

Yeah, test_memory_format_preserved_after_permute_xla is still failing.
