Import "torch_xla.core.xla_model" could not be resolved #7897
Comments
I am assuming you are running on TPU. What command did you use? Do the instructions in https://github.com/pytorch/xla#tpu work?
I had updated the code as follows to resolve the issue:

```python
if is_torch_xla_available():
    ...
else:
    ...
```

but it still isn't working; it shows many compatibility issues.
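The guarded-import pattern above can be sketched with the standard library alone. This is a minimal sketch, assuming `is_torch_xla_available()` essentially reports whether the `torch_xla` package is importable; `importlib.util.find_spec` is used as a stdlib stand-in so the example runs without transformers installed, and `get_device` is a hypothetical helper name.

```python
# Minimal sketch: only import torch_xla when it is actually present,
# and fall back to CPU otherwise. find_spec checks importability
# without triggering the import itself.
import importlib.util

def get_device():
    if importlib.util.find_spec("torch_xla") is not None:
        # TPU path: safe to import torch_xla here
        import torch_xla.core.xla_model as xm
        return xm.xla_device()
    # CPU/GPU fallback when torch_xla is absent
    return "cpu"

print(get_device())
```

With this shape, code that never touches TPUs never imports `torch_xla`, so the unresolved-import error cannot occur on a CUDA-only machine.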
What workload are you trying to run?
I am working on a virtual dressing room project, in which I am getting a CUDA issue.
I guess my question is more basic: it seems you are trying to use HF PEFT and you don't intend to use PyTorch/XLA or TPU. In that case none of the torch_xla logic should be triggered, and `is_torch_xla_available()` should be false. Can you check why it returns True?
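One way to check this is a hedged diagnostic, assuming `is_torch_xla_available()` returns True whenever the `torch-xla` distribution is installed in the active environment (the helper name `torch_xla_install_status` is hypothetical). Querying the installed distribution directly shows whether a stray install is the cause.

```python
# Diagnostic sketch: report whether the torch-xla distribution is
# installed in the current environment, which is the usual reason
# is_torch_xla_available() returns True on a GPU-only machine.
from importlib import metadata

def torch_xla_install_status() -> bool:
    try:
        ver = metadata.version("torch-xla")  # PyPI distribution name
        print(f"torch-xla {ver} is installed; run `pip uninstall torch-xla` "
              "if you do not intend to target TPUs.")
        return True
    except metadata.PackageNotFoundError:
        print("torch-xla is not installed; is_torch_xla_available() "
              "should return False.")
        return False

torch_xla_install_status()
```

If this reports an installed `torch-xla` you never use, uninstalling it is usually enough to stop the XLA code paths from being triggered.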
Hi @JackCaoG, is it OK to assign this ticket to you?
I am getting issues with torch_xla.core.xla_model, and while installing the package I also get errors:

```
ERROR: Could not find a version that satisfies the requirement torch-xla (from versions: none)
ERROR: No matching distribution found for torch-xla
```

My installed Python version is 3.10.0. Any solution?
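A quick way to narrow down this pip error is to check the platform and interpreter, since torch_xla publishes wheels for Linux and specific Python versions only (an assumption based on the pytorch/xla README); on Windows or macOS, or with an unsupported interpreter, pip reports "No matching distribution found".

```python
# Sketch: print the two facts pip uses to match a torch-xla wheel,
# the operating system and the Python minor version.
import platform
import sys

os_name = platform.system()
py = f"{sys.version_info.major}.{sys.version_info.minor}"
print(f"OS: {os_name} (torch_xla wheels target Linux)")
print(f"Python: {py} (check the pytorch/xla README for supported versions)")
```

If the output shows a non-Linux OS, that alone explains the "from versions: none" message, independent of the Python version.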