
Update Perplexity Models in Settings #337

Open
hugo-k-m opened this issue Sep 4, 2024 · 0 comments
Labels
enhancement New feature or request

Comments

hugo-k-m commented Sep 4, 2024

Good day,

It seems that the Perplexity settings in `mods --settings` are not up to date with the current Perplexity API. According to their docs, Perplexity has switched to Llama 3.1 models, but it appears that the mods settings are still configured to use Llama 3.

For instance, the configured context window is only 8K tokens rather than the 128K context window that Llama 3.1 supports.

Please let me know if we can resolve this or if I'm misunderstanding something about how the settings are configured.
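For reference, here is a rough sketch of what an updated Perplexity entry in the mods settings YAML might look like. The exact model names, context sizes, and config keys (`base-url`, `models`, `aliases`, `max-input-chars`) are assumptions based on the existing config layout and the Perplexity docs, not a verified patch:

```yaml
# Hypothetical sketch of an updated perplexity section in mods' settings file.
# Model names and max-input-chars values would need to be checked against
# Perplexity's current model list before being adopted.
apis:
  perplexity:
    base-url: https://api.perplexity.ai
    models:
      llama-3.1-sonar-small-128k-online:
        aliases: ["llama3.1-sonar-small"]
        max-input-chars: 128000  # 128K context, up from the 8K configured for Llama 3
      llama-3.1-sonar-large-128k-online:
        aliases: ["llama3.1-sonar-large"]
        max-input-chars: 128000
```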

@hugo-k-m hugo-k-m added the enhancement New feature or request label Sep 4, 2024