
[BUG]: No module named 'dropout_layer_norm' #5726

Open · 1 task done
apachemycat opened this issue May 17, 2024 · 2 comments
Labels: bug (Something isn't working)

Comments

apachemycat commented May 17, 2024

Is there an existing issue for this bug?

🐛 Describe the bug

ModuleNotFoundError: No module named 'dropout_layer_norm'
[2024-05-17 03:23:11,932] torch.distributed.elastic.multiprocessing.api: [ERROR] failed (exitcode: 1) local_rank: 0 (pid: 615) of binary: /usr/bin/python

dropout_layer_norm has been deprecated by flash_attn, so is there any other choice?

Environment

No response

apachemycat added the bug label on May 17, 2024
duanjunwen (Member) commented
Hi @apachemycat, would you mind sharing the version of flash_attn in your environment? I am using flash-attn==2.5.7 and everything works fine. Alternatively, you can replace dropout_layer_norm with torch.nn.functional.layer_norm plus torch.nn.functional.dropout, although the fused-kernel acceleration will not be available that way.
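For reference, a minimal unfused sketch of that replacement (the function name, argument order, and the dropout → residual-add → layer-norm sequence here are assumptions modeled on flash-attn's fused dropout_add_layer_norm, not its exact API):

```python
import torch
import torch.nn.functional as F

def dropout_add_layer_norm(x, residual, weight, bias,
                           p=0.1, eps=1e-5, training=True):
    # Plain-PyTorch fallback: dropout on x, optional residual add,
    # then layer norm over the last dimension. No fused kernel,
    # so expect it to be slower than the flash-attn CUDA extension.
    out = F.dropout(x, p=p, training=training)
    if residual is not None:
        out = out + residual
    return F.layer_norm(out, (out.size(-1),), weight, bias, eps)

# Hypothetical usage with illustrative shapes:
hidden = 1024
x = torch.randn(8, 128, hidden)
residual = torch.randn(8, 128, hidden)
weight = torch.ones(hidden)
bias = torch.zeros(hidden)
y = dropout_add_layer_norm(x, residual, weight, bias)
```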

zhurunhua (Contributor) commented

watching...
