Metadata-Version: 2.4
Name: ollama-rich
Version: 1.0.0
Summary: A feature-rich Ollama client with enhanced terminal UI using the Rich library.
Home-page: https://github.com/jakbin/ollama-rich
Author: Jak BIn
License: MIT
Project-URL: Source, https://github.com/jakbin/ollama-rich
Project-URL: Tracker, https://github.com/jakbin/ollama-rich/issues
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: >=3.7
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: rich
Requires-Dist: ollama
Dynamic: author
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: license
Dynamic: license-file
Dynamic: project-url
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# ollama-rich

A feature-rich Ollama client with enhanced terminal UI using the Rich library.

[![PyPI version](https://badge.fury.io/py/ollama-rich.svg)](https://pypi.org/project/ollama-rich)
[![Downloads](https://pepy.tech/badge/ollama-rich/month)](https://pepy.tech/project/ollama-rich)
[![Downloads](https://static.pepy.tech/personalized-badge/ollama-rich?period=total&units=international_system&left_color=green&right_color=blue&left_text=Total%20Downloads)](https://pepy.tech/project/ollama-rich)
![GitHub commit activity](https://img.shields.io/github/commit-activity/m/jakbin/ollama-rich)
![GitHub last commit](https://img.shields.io/github/last-commit/jakbin/ollama-rich)

## Features
- List available Ollama models in a beautiful table
- Chat with models directly from the terminal
- Stream responses live with markdown rendering
- Easy-to-use CLI interface
- More coming soon

## Requirements
- Python 3.7+
- [Ollama](https://github.com/jmorganca/ollama) server running
- [rich](https://github.com/Textualize/rich) Python library

## Installation
```bash
pip install ollama-rich
```

Or install from source:

```bash
pip install .
```

## Usage
### List Models
```bash
ollama-rich models
```
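The table this command prints is built with Rich. A minimal sketch of that kind of rendering is below; the sample model data and column names are made up for illustration, not taken from the package's actual output (a real client would fetch this from the Ollama server):

```python
from rich.console import Console
from rich.table import Table

# Hypothetical sample data; the real command queries the Ollama server.
models = [
    {"name": "llama3:8b", "size": "4.7 GB", "modified": "2 days ago"},
    {"name": "mistral:7b", "size": "4.1 GB", "modified": "3 weeks ago"},
]

table = Table(title="Ollama Models")
table.add_column("Name", style="cyan")
table.add_column("Size", justify="right")
table.add_column("Modified")

for m in models:
    table.add_row(m["name"], m["size"], m["modified"])

console = Console()
console.print(table)
```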

### Chat with a Model
```bash
ollama-rich chat <model> "Your message here"
```

### Stream Chat Response
```bash
ollama-rich chat <model> "Your message here" --stream
```
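Streaming with live markdown rendering can be sketched using Rich's `Live` display. This is an illustrative sketch, not the package's actual implementation; it assumes chunks shaped like the ollama Python client's streaming dicts (`{"message": {"content": ...}}`):

```python
from rich.console import Console
from rich.live import Live
from rich.markdown import Markdown

def render_stream(chunks, console=None):
    """Accumulate streamed chat chunks and re-render them as Markdown.

    `chunks` is any iterable of dicts shaped like the ollama client's
    streaming response. Returns the full accumulated text.
    """
    console = console or Console()
    text = ""
    with Live(Markdown(text), console=console, refresh_per_second=10) as live:
        for chunk in chunks:
            text += chunk["message"]["content"]
            live.update(Markdown(text))  # re-render the growing document
    return text
```

With a real server you would pass it the generator returned by `ollama.chat(model=..., messages=[...], stream=True)`.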


