#llm #ollama #ai #chatbot #cli

ask-ollama

Query any LLM found on Ollama from the terminal!

7 releases (stable)

1.0.5 Dec 1, 2023
0.1.0 Nov 30, 2023

#53 in Machine learning


164 downloads per month


109 lines


Ask-Ollama is a command-line tool that lets you query large language models served by Ollama directly from the terminal. It provides a simple, intuitive way to ask questions and receive responses from Ollama models.
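Under the hood, a tool like this talks to the local Ollama server over HTTP. As a rough sketch, the equivalent raw request would target Ollama's documented /api/generate endpoint; whether ask-ollama calls this exact endpoint is an assumption, so the curl line is left commented out (it also needs a running server):

```shell
# Build the JSON body for Ollama's /api/generate endpoint (printed, not sent)
model="mistral"
prompt="What is the capital of France?"
body=$(printf '{"model": "%s", "prompt": "%s", "stream": false}' "$model" "$prompt")
printf '%s\n' "$body"

# With an Ollama server running on its default port, the same body could be sent:
# curl http://localhost:11434/api/generate -d "$body"
```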


  • Interactive CLI: Easily ask questions and get responses.
  • Model Selection: Choose different models for varied responses.


To install Ask-Ollama, you need to have Rust and Cargo installed on your system. If you haven't already installed Rust, you can do so by following the instructions here.

Once Rust is installed, you can install Ask-Ollama using Cargo:

cargo install ask-ollama


After installation, you can start using Ask-Ollama by running:

ask [OPTIONS] [PROMPT]

  • --model=[MODEL]: Specify the model to use (default is 'mistral').
  • --version: Display the version of the installed Ask-Ollama tool.
  • [PROMPT]: The question or prompt to send to Ollama. Quotation marks are not required.
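One caveat about omitting quotation marks: the shell still performs word splitting and glob expansion before ask ever sees the prompt, so characters like ? or * can behave unexpectedly. Using printf as a stand-in for ask, this shows that an unquoted prompt arrives as separate arguments (assuming a typical bash/POSIX shell, where an unmatched glob like France? is left literal):

```shell
# printf stands in for `ask` to show the arguments the program receives;
# each word of an unquoted prompt arrives as a separate argument
printf '[%s]' What is the capital of France?
echo
# prints [What][is][the][capital][of][France?]
```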


Asking a question using the default model:

ask "What is the capital of France?"


Or, since quotation marks are not required:

ask What is the capital of France?

Specifying a different model:

ask --model=gale "Explain the theory of relativity"

Find all available models from Ollama here.

Checking the version:

ask --version

Seeing the help info:

ask --help


Contributions to Ask-Ollama are welcome! If you have suggestions for improvements or encounter any issues, please feel free to open an issue or submit a pull request on our GitHub repository.


Ask-Ollama is licensed under the MIT License.


~48K SLoC