Metadata-Version: 2.4
Name: langspend
Version: 1.0.0
Summary: LangSpend Python SDK - Track LLM costs and usage with customer attribution. Requires LangSpend API key for production use.
Home-page: https://github.com/langspend/langspend-python
Author: LangSpend
Author-email: LangSpend <support@langspend.com>
Maintainer-email: LangSpend <support@langspend.com>
License: MIT
Project-URL: Homepage, https://langspend.com
Project-URL: Documentation, https://docs.langspend.com/python
Project-URL: Repository, https://github.com/langspend/langspend-python
Project-URL: Bug Tracker, https://github.com/langspend/langspend-python/issues
Project-URL: Source Code, https://github.com/langspend/langspend-python
Keywords: llm,ai,cost-tracking,openai,anthropic,bedrock,langspend
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: requests>=2.28.0
Requires-Dist: httpx>=0.24.0
Requires-Dist: typing-extensions>=4.0.0
Provides-Extra: openai
Requires-Dist: openai>=1.0.0; extra == "openai"
Provides-Extra: anthropic
Requires-Dist: anthropic>=0.7.0; extra == "anthropic"
Provides-Extra: bedrock
Requires-Dist: boto3>=1.26.0; extra == "bedrock"
Provides-Extra: vertex
Requires-Dist: google-cloud-aiplatform>=1.38.0; extra == "vertex"
Provides-Extra: all
Requires-Dist: openai>=1.0.0; extra == "all"
Requires-Dist: anthropic>=0.7.0; extra == "all"
Requires-Dist: boto3>=1.26.0; extra == "all"
Requires-Dist: google-cloud-aiplatform>=1.38.0; extra == "all"
Dynamic: author
Dynamic: home-page
Dynamic: license-file
Dynamic: requires-python

# LangSpend Python SDK

Track LLM costs and usage with customer attribution for OpenAI, Anthropic, AWS Bedrock, and Google Vertex AI.

## Features

- **Customer Attribution**: Track costs per customer and feature
- **Zero Code Changes**: Wrap your existing LLM clients
- **Multi-Provider Support**: OpenAI, Anthropic, AWS Bedrock, Google Vertex AI
- **Streaming Support**: Works with streaming responses
- **Async Support**: Full async/await compatibility
- **Error Resilience**: Never breaks your LLM calls

## Installation

### Using pip
```bash
pip install langspend
```

### Using uv (recommended)
```bash
uv add langspend
```

### For specific providers
```bash
# Using pip
pip install langspend[openai]      # OpenAI support
pip install langspend[anthropic]   # Anthropic support
pip install langspend[bedrock]     # AWS Bedrock support
pip install langspend[vertex]      # Google Vertex AI support
pip install langspend[all]         # All providers

# Using uv
uv add "langspend[openai]"         # OpenAI support
uv add "langspend[anthropic]"      # Anthropic support
uv add "langspend[bedrock]"        # AWS Bedrock support
uv add "langspend[vertex]"         # Google Vertex AI support
uv add "langspend[all]"            # All providers
```

## Get Your API Key

**LangSpend is a commercial SaaS product with a free tier available.**

Get your LangSpend API key at [langspend.com](https://langspend.com), or [sign up here](https://langspend.com/signup).

> **Note**: An API key is required for production use. The SDK includes a free tier for development and evaluation.

## Quick Start

### Option 1: Customer and Feature Cost Tracking (Recommended)

```python
from langspend import LangSpend, wrap_openai
from openai import OpenAI

# Initialize LangSpend
langspend = LangSpend(api_key="lsp_your_api_key_here")

# Wrap your OpenAI client
openai = wrap_openai(
    OpenAI(api_key="your_openai_api_key"),
    langspend
)

# Use as normal - tracking happens automatically
response = openai.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}],
    langspend_tags={
        "customer": {
            "id": "user_123",
            "name": "John Doe",
            "email": "john@example.com"
        },
        "feature": {
            "name": "chat-api"
        }
    }
)
```
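
To keep tagging consistent across call sites, the `langspend_tags` payload can be assembled by a small helper. The `build_tags` function below is not part of the SDK; it is a hypothetical convenience that just builds the same dictionary shape shown above:

```python
# Hypothetical helper (not part of the LangSpend SDK) that builds the
# langspend_tags payload used in the example above, so every call site
# tags requests with the same structure.
def build_tags(customer_id, customer_name=None, customer_email=None, feature=None):
    customer = {"id": customer_id}
    if customer_name:
        customer["name"] = customer_name
    if customer_email:
        customer["email"] = customer_email
    tags = {"customer": customer}
    if feature:
        tags["feature"] = {"name": feature}
    return tags

# Produces the same payload as the inline dict in the example above.
tags = build_tags("user_123", "John Doe", "john@example.com", feature="chat-api")
```

You would then pass `langspend_tags=tags` to the wrapped client call.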

### Option 2: Zero-Code Setup (quick, but no customer-level tracking)

```python
from langspend import LangSpend, wrap_openai
from openai import OpenAI

# Initialize LangSpend
langspend = LangSpend(api_key="lsp_your_api_key_here")

# Wrap your OpenAI client
openai = wrap_openai(OpenAI(), langspend)

# Use as normal - tracking happens automatically
response = openai.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)
```

## Supported Providers

LangSpend supports OpenAI, Anthropic, AWS Bedrock, and Google Vertex AI. See the [examples directory](examples/) for integration examples with each provider.

## Demo Application

A complete Next.js + FastAPI demo is available in [`examples/nextjs-chat-demo/`](examples/nextjs-chat-demo/):

```bash
cd examples/nextjs-chat-demo/backend
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
cp env.example .env
# Edit .env with your API keys
python main.py
```

## Examples

See the [`examples/`](examples/) directory for comprehensive usage examples:

- `basic_usage.py` - Basic usage examples
- `customer_attribution.py` - Customer attribution examples
- `anthropic_usage.py` - Anthropic integration
- `bedrock_usage.py` - AWS Bedrock integration
- `flask_integration.py` - Flask web app integration
- `fastapi_integration.py` - FastAPI integration
- `streaming_example.py` - Streaming responses
- `multi_provider.py` - Multiple providers
- `nextjs-chat-demo/` - Complete Next.js + FastAPI demo

## Configuration

```python
from langspend import LangSpend

langspend = LangSpend(
    api_key="lsp_your_api_key",
    base_url="https://platform.langspend.com",  # Optional
    debug=False,  # Enable debug logging
    max_retries=3  # Max retry attempts
)
```
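
Rather than hardcoding the key, you may prefer to read it from the environment. The variable name `LANGSPEND_API_KEY` below is an assumption (the SDK does not document one), and the `lsp_` prefix check simply mirrors the key format shown in this README:

```python
import os

# Hypothetical convention: read the API key from an environment variable
# instead of hardcoding it. LANGSPEND_API_KEY is an assumed name, not
# something the SDK documents.
def get_langspend_key(env=os.environ):
    key = env.get("LANGSPEND_API_KEY")
    if not key:
        raise RuntimeError("Set LANGSPEND_API_KEY before initializing LangSpend")
    if not key.startswith("lsp_"):
        raise RuntimeError("LangSpend keys are expected to start with 'lsp_'")
    return key

# Example with an explicit mapping instead of the real environment:
key = get_langspend_key({"LANGSPEND_API_KEY": "lsp_demo"})
```

You would then call `LangSpend(api_key=get_langspend_key())`.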

## Error Handling

The SDK never raises exceptions for tracking failures: tracking happens asynchronously in the background, and failures are silent by default.

Enable debug mode to log tracking errors:

```python
langspend = LangSpend(api_key="lsp_your_api_key", debug=True)
```
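
For illustration, the fail-silent contract described above can be sketched in plain Python. This is not the SDK's actual internals, just the pattern it guarantees:

```python
import logging

logger = logging.getLogger("langspend.example")

# Sketch of the fail-silent contract: a tracking call is wrapped so any
# exception is logged (when debug is enabled) but never propagates to
# the caller's LLM call.
def track_safely(send, payload, debug=False):
    try:
        send(payload)
    except Exception as exc:
        if debug:
            logger.warning("LangSpend tracking failed: %s", exc)
        # Exception swallowed: the LLM response is returned regardless.

def failing_send(payload):
    raise ConnectionError("network down")

track_safely(failing_send, {"model": "gpt-4"})  # does not raise
```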

## License

MIT License - see LICENSE file for details.
