
Ollama download model if not present #283

Open
p5 opened this issue Jun 4, 2024 · 1 comment
Labels
enhancement New feature or request

Comments


p5 commented Jun 4, 2024

Super glad to see the Ollama support in the recent releases! This has made the tool far more accessible for those who do not want to risk a large OpenAI bill.

Is your feature request related to a problem? Please describe.
When using mods on a new system, it's frustrating to invoke mods against Ollama only to have it fail because the model is not present locally.

Describe the solution you'd like
It would be great if mods detected (perhaps via the /api/show or /api/tags endpoints) whether the requested model is present before calling it. If it is not present, mods could download it (via /api/pull) and display a message along the lines of "The model you tried to invoke is not present, pulling...".

Currently, it just fails with an "Unknown API error."
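The check-then-pull flow described above could be sketched roughly like this. This is a minimal illustration against Ollama's HTTP API (default address `http://localhost:11434`), not mods' actual implementation; the function names and the match-on-`:latest` behavior are assumptions for the sketch:

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default listen address

def model_present(tags_response: dict, model: str) -> bool:
    """Check whether `model` appears in a /api/tags response.

    Ollama lists installed models with an explicit tag (e.g. "llama3:latest"),
    so a bare name is assumed here to match its ":latest" tag as well.
    """
    wanted = model if ":" in model else model + ":latest"
    return any(m.get("name") == wanted for m in tags_response.get("models", []))

def ensure_model(model: str) -> None:
    """Pull `model` via /api/pull if /api/tags does not already list it."""
    with urllib.request.urlopen(f"{OLLAMA_HOST}/api/tags") as resp:
        tags = json.load(resp)
    if model_present(tags, model):
        return
    print(f"The model you tried to invoke is not present, pulling {model}...")
    body = json.dumps({"name": model, "stream": False}).encode()
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/pull",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    # Blocks until the pull completes (stream=False disables progress events).
    urllib.request.urlopen(req).read()
```

With something like this, mods could call `ensure_model(...)` once before the first chat request instead of surfacing an opaque API error.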

Describe alternatives you've considered
The alternative is to download the model manually before invoking mods.

Additional context
N/A

@p5 p5 added the enhancement New feature or request label Jun 4, 2024
@caarlos0 (Member) commented

I'm not sure this would sit well within the scope of mods; we could probably do a better job with the error, though.
