
Wrong Pretrained model saved in Pretraining and FineTuning BERT #1924

Open
joHussien opened this issue Sep 4, 2024 · 0 comments
Issue Type

Bug

Source

source

Keras Version

Keras 2.14

Custom Code

Yes

OS Platform and Distribution

No response

Python version

No response

GPU model and memory

No response

Current Behavior?

# Save this base model for further finetuning.
encoder_model.save("encoder_model.keras")

This line is in https://github.com/keras-team/keras-io/tree/master/guides/keras_nlp/transformer_pretraining.py

Here the guide saves the encoder architecture (encoder_model), but it should instead save pretrained_model, which is the final pretrained model to be used for fine-tuning afterwards.
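To illustrate the relationship the report describes, here is a minimal sketch, not the guide's actual code: layer sizes and names are hypothetical, and a simple Dense stack stands in for the Transformer encoder. It shows a pretrained_model that wraps encoder_model with an extra head, applies the suggested fix of saving pretrained_model, and checks that the saved weights round-trip.

```python
import numpy as np
import keras

# Hypothetical stand-in for the guide's Transformer encoder.
inputs = keras.Input(shape=(8,))
x = keras.layers.Dense(16, activation="relu", name="encoder_dense")(inputs)
encoder_model = keras.Model(inputs, x, name="encoder")

# pretrained_model adds a task head on top of the shared encoder,
# mirroring the guide's pretraining setup.
outputs = keras.layers.Dense(4, name="head")(encoder_model.output)
pretrained_model = keras.Model(encoder_model.input, outputs, name="pretrained")

# Suggested fix: save the full pretrained model, not just the encoder.
pretrained_model.save("pretrained_model.keras")

# Reload and confirm the weights survive the round trip.
reloaded = keras.models.load_model("pretrained_model.keras")
for a, b in zip(pretrained_model.get_weights(), reloaded.get_weights()):
    np.testing.assert_allclose(a, b)
```

Note that because encoder_model and pretrained_model share layers, saving either captures the trained encoder weights; the difference is whether the head's weights are included in the checkpoint.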

Standalone code to reproduce the issue or tutorial link

# Save this base model for further finetuning.
encoder_model.save("encoder_model.keras")


Relevant log output

Labels
None yet
Projects
None yet
Development

No branches or pull requests

2 participants