Metadata-Version: 2.4
Name: aureon-smoother
Version: 0.1.0
Summary: Gradient Scale Synchronization for PyTorch - 5.46x smoothness improvement
Home-page: https://github.com/aureonlabs/smoother
Author: AUREON LABS
Author-email: contact@aureonlabs.com
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Requires-Python: >=3.7
Description-Content-Type: text/markdown
Requires-Dist: torch>=1.9.0
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# Smoother

**Gradient Scale Synchronization for PyTorch**

Achieves up to a **5.46x smoothness improvement** on GAN training and a **1.83% accuracy gain** on vision benchmarks.

## Installation

```bash
pip install aureon-smoother
```

## Quick Start

```python
from smoother import SmartAdam

# Your model
model = YourModel()

# Drop-in replacement for torch.optim.Adam
optimizer = SmartAdam(model.parameters(), base_lr=0.001)

# Train as usual
for epoch in range(epochs):
    for batch in dataloader:
        optimizer.zero_grad()
        loss = model(batch)  # forward pass returning the loss
        loss.backward()
        optimizer.step()
```
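To illustrate the idea behind "gradient scale synchronization" (the library's internals are not shown here, so this is a hypothetical sketch, not SmartAdam's actual implementation): before each optimizer step, rescale every parameter's gradient so that all layers share a common scale, taken here as the geometric mean of the per-parameter gradient norms. The function name `sync_gradient_scales` is an assumption for illustration only.

```python
import torch

def sync_gradient_scales(params, eps=1e-12):
    """Rescale each parameter's gradient in place so all share one norm.

    This is an illustrative sketch: the common scale is the geometric
    mean of the individual gradient norms.
    """
    grads = [p.grad for p in params if p.grad is not None]
    norms = torch.stack([g.norm() for g in grads])
    # Geometric mean of the norms (computed in log space for stability)
    target = norms.clamp_min(eps).log().mean().exp()
    for g, n in zip(grads, norms):
        g.mul_(target / (n + eps))
    return target

# Toy usage: two "layers" with very different gradient magnitudes
w1 = torch.nn.Parameter(torch.ones(4))
w2 = torch.nn.Parameter(torch.ones(4))
loss = (10.0 * w1).sum() + (0.1 * w2).sum()
loss.backward()
# Before: w1.grad has norm 20.0, w2.grad has norm 0.2
sync_gradient_scales([w1, w2])
# After: both gradients share the same norm (the geometric mean, 2.0)
```

A hook like this would run between `loss.backward()` and `optimizer.step()`; folding it into the optimizer itself, as SmartAdam's interface suggests, keeps the training loop unchanged.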

## Results

- **GANs**: 5.46x smoothness improvement on MNIST
- **Vision**: 1.83% accuracy improvement on CIFAR-100
- **Transformers**: 2-3x improvement projected (not yet benchmarked)

## By AUREON LABS

Research-backed optimization for deep learning.
