Metadata-Version: 2.4
Name: gcs-easy
Version: 0.1.3
Summary: Simple authentication and utilities for Google Cloud Storage (upload, download, signed URLs).
Author-email: Juan Manuel <juanmanuelcabrera.r@gmail.com>
License-Expression: MIT
License-File: LICENSE
Requires-Python: >=3.9
Requires-Dist: google-auth-oauthlib>=1.2.1
Requires-Dist: google-auth>=2.33.0
Requires-Dist: google-cloud-storage>=2.18.0
Requires-Dist: python-dotenv>=1.0.0
Requires-Dist: pyyaml>=6.0
Requires-Dist: tqdm>=4.66.0
Provides-Extra: test
Requires-Dist: pytest-mock>=3.0.0; extra == 'test'
Requires-Dist: pytest>=7.0.0; extra == 'test'
Description-Content-Type: text/markdown

# GCS Easy

A simple and easy-to-use Python library for Google Cloud Storage operations with CLI support.

## Features

- 🔐 Simple authentication using service account credentials
- 📤 Upload files with resumable uploads and progress tracking
- 📥 Download files from GCS
- 📋 List bucket contents
- 🗑️ Delete files
- 🔗 Generate signed URLs for secure access
- ⚙️ Flexible configuration via YAML
- 🖥️ Command-line interface for quick operations
- 🔍 Built-in permissions checker

## Installation

### From Source
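
Assuming you have cloned the repository, a typical from-source install looks like this (the editable variant matches the development setup described below):

```bash
# From the root of a local checkout of the repository
pip install .

# Or, for development, an editable install with the test extras
pip install -e ".[test]"
```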


### From PyPI (when published)

```bash
pip install gcs-easy
```

## Configuration

1. Create a `config.yml` file:
   ```yaml
    # Google Cloud Storage Configuration
    # Add your GCP credentials file path here
    GOOGLE_APPLICATION_CREDENTIALS: /path/to/your/credentials.json

    # GCS Client Configuration
    # default_bucket can be just bucket name (e.g., "your-bucket-name")
    # or bucket with default path/prefix (e.g., "your-bucket-name/your-path")
    default_bucket: your-bucket-name
    location: EU
    uniform_access: true
    chunk_size: 8388608  # 8MB in bytes
    cache_control: "public, max-age=3600"
    signed_url_expires_minutes: 15
   ```

2. Edit `config.yml` with your settings:
   - Set `GOOGLE_APPLICATION_CREDENTIALS` to the path of your service account JSON file
   - Configure `default_bucket` (bucket name or bucket/path)
   - Adjust other settings as needed

### Configuration Parameters

- `GOOGLE_APPLICATION_CREDENTIALS`: Path to your Google Cloud service account JSON file
- `default_bucket`: Default GCS bucket name (can include a path prefix)
- `location`: GCS location for bucket creation (default: "EU")
- `uniform_access`: Enable uniform bucket-level access (default: true)
- `chunk_size`: Upload chunk size in bytes (default: 8388608)
- `cache_control`: Cache control header for uploaded files
- `signed_url_expires_minutes`: Default expiration time for signed URLs (default: 15)
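
For illustration, the fallback behaviour implied by these defaults can be sketched as a plain dictionary overlay (`with_defaults` is a hypothetical helper for this README, not part of gcs-easy's API):

```python
# Documented defaults; values from config.yml override them.
DEFAULTS = {
    "location": "EU",
    "uniform_access": True,
    "chunk_size": 8 * 1024 * 1024,  # 8388608 bytes = 8 MB
    "cache_control": "public, max-age=3600",
    "signed_url_expires_minutes": 15,
}

def with_defaults(user_config):
    """Overlay values read from config.yml on top of the documented defaults."""
    return {**DEFAULTS, **user_config}
```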


## Usage

### Command Line Interface

The `gcs-easy` command provides quick access to GCS operations:

```bash
# Upload a file
gcs-easy upload --bucket your-bucket --src /path/to/file.txt --dst remote/path/file.txt

# Make file public during upload
gcs-easy upload --bucket your-bucket --src /path/to/file.txt --dst remote/path/file.txt --public

# Download a file
gcs-easy download --bucket your-bucket --src remote/path/file.txt --dst /local/path/file.txt

# List bucket contents
gcs-easy list --bucket your-bucket --prefix folder/

# Generate signed URL
gcs-easy sign --bucket your-bucket --path remote/path/file.txt --minutes 60

# Check permissions and configuration
gcs-easy permissions
```

### Programmatic Usage

```python
from gcs_easy import GCSClient

# Initialize client (uses config.yml settings)
client = GCSClient()

# Or specify settings explicitly
client = GCSClient(
    project="my-project",           # GCP project ID (optional if resolved by ADC)
    default_bucket="my-bucket",     # default bucket (optional); can be just the bucket name
    create_bucket_if_missing=True,  # create the bucket if it doesn't exist
    location="EU",                  # location used when creating a bucket (e.g., EU, US, europe-west1); falls back to config.yml
    uniform_access=True,            # enable uniform bucket-level access; falls back to config.yml
    config_path="config.yml",       # path to the config.yml file (optional); falls back to the default location
)

# Upload a file
result = client.upload_file(
    local_path="/path/to/file.txt",
    blob_path="remote/path/file.txt",
    make_public=False
)
print(f"Uploaded {result.size} bytes to {result.bucket}/{result.blob}")

# Download a file
local_file = client.download_file(
    blob_path="remote/path/file.txt",
    local_path="/local/path/file.txt"
)

# List files
for blob_name in client.list(prefix="folder/"):
    print(blob_name)

# Check if file exists
if client.exists(blob_path="remote/path/file.txt"):
    print("File exists")

# Delete a file
client.delete("remote/path/file.txt")

# Generate signed URL
url = client.signed_url("remote/path/file.txt", expires_minutes=30)
print(f"Signed URL: {url}")
```

## Authentication

This library uses Google Cloud service account authentication. You need:

1. A Google Cloud service account with appropriate GCS permissions
2. The service account JSON key file
3. The path to this file in your `config.yml`

### Required Permissions

The service account needs these GCS permissions:
- `storage.objects.create` - Upload files
- `storage.objects.get` - Download files and generate signed URLs
- `storage.objects.list` - List bucket contents
- `storage.objects.delete` - Delete files
- `storage.buckets.get` - Check bucket existence

### Checking Permissions

Use the built-in permissions checker:

```bash
gcs-easy permissions
```

This will analyze your configuration and show required permissions.
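
Under the hood, such a check can be sketched with the google-cloud-storage client's `test_iam_permissions` (a standalone illustration, not gcs-easy's internal code; `missing_permissions` and `check_bucket` are hypothetical names):

```python
# Permissions listed in the "Required Permissions" section above.
REQUIRED_PERMISSIONS = [
    "storage.objects.create",
    "storage.objects.get",
    "storage.objects.list",
    "storage.objects.delete",
    "storage.buckets.get",
]

def missing_permissions(required, granted):
    """Return the required permissions not present in the granted set."""
    granted = set(granted)
    return [p for p in required if p not in granted]

def check_bucket(bucket_name):
    # Needs valid credentials; test_iam_permissions returns the subset of
    # the requested permissions the caller actually holds on the bucket.
    from google.cloud import storage  # deferred so the pure helper stays importable
    client = storage.Client()
    held = client.bucket(bucket_name).test_iam_permissions(REQUIRED_PERMISSIONS)
    return missing_permissions(REQUIRED_PERMISSIONS, held)
```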

## Examples

### Basic Upload/Download

```python
from gcs_easy import GCSClient

client = GCSClient()

# Upload
client.upload_file("local_file.txt", "uploads/file.txt")

# Download
client.download_file("uploads/file.txt", "downloaded_file.txt")
```

### Working with Different Buckets

```python
# Use default bucket from config
client = GCSClient()

# Specify bucket for this operation
client.upload_file("file.txt", "path/file.txt", bucket_name="other-bucket")
```

### Batch Operations

```python
from pathlib import Path

from gcs_easy import GCSClient

client = GCSClient()

# Upload all .txt files in a directory
for file_path in Path("data/").glob("*.txt"):
    blob_path = f"data/{file_path.name}"
    client.upload_file(str(file_path), blob_path)
    print(f"Uploaded {file_path.name}")
```
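
Because each upload is independent, the loop above parallelizes naturally. Here is a sketch using a thread pool (`blob_path_for` and `upload_many` are illustrative helpers, assuming the `upload_file` signature shown earlier):

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def blob_path_for(file_path, prefix="data"):
    """Mirror a local file name under a remote prefix."""
    return f"{prefix}/{Path(file_path).name}"

def upload_many(client, paths, max_workers=4):
    # Uploads are I/O-bound, so a small thread pool speeds up batches.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [
            pool.submit(client.upload_file, str(p), blob_path_for(p))
            for p in paths
        ]
        return [f.result() for f in futures]  # re-raises any upload error
```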

## Development

### Setup Development Environment

```bash
# Create virtual environment
python -m venv .venv

# Activate (Windows)
.venv\Scripts\activate

# Activate (Linux/Mac)
source .venv/bin/activate

# Install in development mode
pip install -e ".[test]"
```

### Running Tests

```bash
# Run all tests
pytest

# Run specific test file
pytest test/test_smoke.py
```

### Building

```bash
# Build distribution
python -m build

# Upload to PyPI
python -m twine upload dist/*
```

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Ensure all tests pass
6. Submit a pull request

## License

MIT License - see [LICENSE](LICENSE) file for details.

## Author

Juan Manuel Cabrera - juanmanuelcabrera.r@gmail.com