
Simple chat shell UI for Ollama

This is a simple chat shell UI for convenient communication with Ollama models, built as a pet project with heavy AI assistance.
It requires a running Ollama instance on your machine! Download and install it from ollama.com.

The source code for this chat can be found at github.com/Arsmeen/ollama-chat-shell

The project was originally created for the gemma3:12b model, but you can use any model supported by Ollama and your hardware.

It stores chat history in chat_history/ChatHistory.json.gz and monitors the history file size according to the history_max_file_kb config option, creating additional archive files when necessary.
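The rotation described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the function name `append_message` and the rotation naming scheme are assumptions; only the history path and the idea of a `history_max_file_kb`-style size limit come from the text.

```python
import gzip
import json
import os
import time

HISTORY_PATH = "chat_history/ChatHistory.json.gz"  # path used by the chat
MAX_FILE_KB = 512  # stand-in for history_max_file_kb (value is an assumption)

def append_message(message, path=HISTORY_PATH, max_kb=MAX_FILE_KB):
    """Append a message to the gzipped JSON history, rotating the current
    file into a timestamped archive when it exceeds the size limit."""
    os.makedirs(os.path.dirname(path), exist_ok=True)
    # Rotate: move the current file aside if it has grown past the limit.
    if os.path.exists(path) and os.path.getsize(path) > max_kb * 1024:
        archive = path.replace(".json.gz", f".{int(time.time())}.json.gz")
        os.rename(path, archive)
    history = []
    if os.path.exists(path):
        with gzip.open(path, "rt", encoding="utf-8") as f:
            history = json.load(f)
    history.append(message)
    with gzip.open(path, "wt", encoding="utf-8") as f:
        json.dump(history, f, ensure_ascii=False)
```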

Since gemma3:12b only sees images during initialization and cannot receive them mid-chat, I implemented it so that attaching an image to a message reinitializes the model so it can see the image. It is important to remember that the model then does not see the message history from before that request. For testing, I added the image_include_messages option, which forcibly includes the specified number of chat turns in the message, but any value other than 0 usually caused the model to hallucinate. The next message after one with an image includes the entire history up to the history_max_chars limit as usual.
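The behavior above can be sketched as payload construction for Ollama's REST chat endpoint (POST /api/chat), where images are passed as base64 strings in a message's "images" field. This is an illustrative sketch, not the project's code: the function name `build_chat_payload`, the trimming strategy, and the default limit value are assumptions; only the "drop history when an image is attached" rule and the history_max_chars idea come from the text.

```python
import base64

HISTORY_MAX_CHARS = 8000  # stand-in for history_max_chars (value is an assumption)

def build_chat_payload(history, user_text, image_bytes=None,
                       model="gemma3:12b", max_chars=HISTORY_MAX_CHARS):
    """Build a request body for Ollama's POST /api/chat.
    With an image attached, prior history is dropped (the model is
    reinitialized); otherwise the newest messages are kept up to a
    character budget."""
    msg = {"role": "user", "content": user_text}
    if image_bytes is not None:
        # Ollama expects images as base64 strings in the "images" field.
        msg["images"] = [base64.b64encode(image_bytes).decode("ascii")]
        messages = [msg]  # earlier turns are not sent with an image
    else:
        kept, used = [], 0
        for m in reversed(history):  # keep the most recent turns first
            used += len(m["content"])
            if used > max_chars:
                break
            kept.append(m)
        messages = list(reversed(kept)) + [msg]
    return {"model": model, "messages": messages, "stream": False}
```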

Remember: on first run, if the model has not yet been downloaded with Ollama, this chat will wait until the download completes, and how long that takes depends on the model size and your internet speed. You can pre-download the model yourself with `ollama pull <model>`.
Binary releases are provided as archives: