metadata_version string | name string | version string | summary string | description string | description_content_type string | author string | author_email string | maintainer string | maintainer_email string | license string | keywords string | classifiers list | platform list | home_page string | download_url string | requires_python string | requires list | provides list | obsoletes list | requires_dist list | provides_dist list | obsoletes_dist list | requires_external list | project_urls list | uploaded_via string | upload_time timestamp[us] | filename string | size int64 | path string | python_version string | packagetype string | comment_text string | has_signature bool | md5_digest string | sha256_digest string | blake2_256_digest string | license_expression string | license_files list | recent_7d_downloads int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2.4 | reaktiv | 0.21.2 | Signals for Python - inspired by Angular Signals / SolidJS. Reactive Declarative State Management Library for Python - automatic dependency tracking and reactive updates for your application state. | # reaktiv
<div align="center">

[PyPI](https://pypi.org/project/reaktiv/) · [Downloads](https://pepy.tech/projects/reaktiv) · [Checked with Pyright](https://microsoft.github.io/pyright/)
[Support on Ko-fi](https://ko-fi.com/H2H71OBINS)
**Reactive Declarative State Management Library for Python** - automatic dependency tracking and reactive updates for your application state.
[Website](https://reaktiv.bui.app/) | [Live Playground](https://reaktiv.bui.app/#playground) | [Documentation](https://reaktiv.bui.app/docs) | [Deep Dive Article](https://bui.app/the-missing-manual-for-signals-state-management-for-python-developers/)
</div>
## Installation
```bash
pip install reaktiv
# or with uv
uv pip install reaktiv
```
`reaktiv` is a **reactive declarative state management library** that lets you **declare relationships between your data** instead of manually managing updates. When data changes, everything that depends on it updates automatically - eliminating a whole class of bugs where you forget to update dependent state.
**Think of it like Excel spreadsheets for your Python code**: when you change a cell value, all formulas that depend on it automatically recalculate. That's exactly how `reaktiv` works with your application state.
**Key benefits:**
- 🐛 **Fewer bugs**: No more forgotten state updates or inconsistent data
- 📋 **Clearer code**: State relationships are explicit and centralized
- ⚡ **Better performance**: Only recalculates what actually changed (fine-grained reactivity)
- 🔄 **Automatic updates**: Dependencies are tracked and updated automatically
- 🎯 **Python-native**: Built for Python's patterns with full async support
- 🔒 **Type safe**: Full type hint support with automatic inference
- 🚀 **Lazy evaluation**: Computed values are only calculated when needed
- 💾 **Smart memoization**: Results are cached and only recalculated when dependencies change
## Documentation
Full documentation is available at [https://reaktiv.bui.app/docs/](https://reaktiv.bui.app/docs/).
For a comprehensive guide, check out [The Missing Manual for Signals: State Management for Python Developers](https://bui.app/the-missing-manual-for-signals-state-management-for-python-developers/).
## Quick Start
```python
from reaktiv import Signal, Computed, Effect
# Your reactive data sources
name = Signal("Alice")
age = Signal(30)
# Reactive derived data - automatically stays in sync
@Computed
def greeting():
    return f"Hello, {name()}! You are {age()} years old."
# Reactive side effects - automatically run when data changes
# IMPORTANT: Must assign to variable to prevent garbage collection
greeting_effect = Effect(lambda: print(f"Updated: {greeting()}"))
# Just change your base data - everything reacts automatically
name.set("Bob") # Prints: "Updated: Hello, Bob! You are 30 years old."
age.set(31) # Prints: "Updated: Hello, Bob! You are 31 years old."
```
## Core Concepts
`reaktiv` provides three simple building blocks for reactive programming - just like Excel has cells and formulas:
1. **Signal**: Holds a reactive value that can change (like an Excel cell with a value)
2. **Computed**: Automatically derives a reactive value from other signals/computed values (like an Excel formula)
3. **Effect**: Runs reactive side effects when signals/computed values change (like Excel charts that update when data changes)
```python
# Signal: wraps a reactive value (like Excel cell A1 = 5)
counter = Signal(0)
# Computed: derives from other reactive values (like Excel cell B1 = A1 * 2)
@Computed
def doubled():
    return counter() * 2

# Effect: reactive side effects (like Excel chart that updates when cells change)
def print_values():
    print(f"Counter: {counter()}, Doubled: {doubled()}")
counter_effect = Effect(print_values)
counter.set(5) # Reactive update: prints "Counter: 5, Doubled: 10"
```
### Excel Spreadsheet Analogy
If you've ever used Excel, you already understand reactive programming:
| Cell | Value/Formula | reaktiv Equivalent |
|------|---------------|-------------------|
| A1 | `5` | `Signal(5)` |
| B1 | `=A1 * 2` | `Computed(lambda: a1() * 2)` |
| C1 | `=A1 + B1` | `Computed(lambda: a1() + b1())` |
When you change A1 in Excel, B1 and C1 automatically recalculate. That's exactly what happens with reaktiv:
```python
# Excel-style reactive programming in Python
a1 = Signal(5) # A1 = 5
@Computed # B1 = A1 * 2
def b1() -> int:
    return a1() * 2

@Computed # C1 = A1 + B1
def c1() -> int:
    return a1() + b1()
# Display effect (like Excel showing the values)
display_effect = Effect(lambda: print(f"A1={a1()}, B1={b1()}, C1={c1()}"))
a1.set(10) # Change A1 - everything recalculates automatically!
# Prints: A1=10, B1=20, C1=30
```
Just like in Excel, you don't need to manually update B1 and C1 when A1 changes - the dependency tracking handles it automatically.
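How does the library know that B1 depends on A1? The core trick is that reading a signal *inside* a computation registers that computation as a subscriber. The following is a deliberately simplified, stand-alone toy sketch of that mechanism — it is **not** reaktiv's actual implementation (class names `ToySignal`/`ToyComputed` are invented here) — just enough to make the idea concrete:

```python
# Toy illustration of signal-style dependency tracking.
# NOT reaktiv's implementation - just the core idea.
_active_computation = None  # the computation currently being evaluated

class ToySignal:
    def __init__(self, value):
        self._value = value
        self._subscribers = set()

    def __call__(self):
        # Reading inside a computation registers it as a dependent
        if _active_computation is not None:
            self._subscribers.add(_active_computation)
        return self._value

    def set(self, value):
        self._value = value
        for sub in list(self._subscribers):
            sub.invalidate()

class ToyComputed:
    def __init__(self, fn):
        self._fn = fn
        self._cache = None
        self._dirty = True

    def invalidate(self):
        self._dirty = True

    def __call__(self):
        global _active_computation
        if self._dirty:
            _active_computation = self
            try:
                self._cache = self._fn()  # reads inside fn register dependencies
            finally:
                _active_computation = None
            self._dirty = False
        return self._cache

a1 = ToySignal(5)
b1 = ToyComputed(lambda: a1() * 2)
print(b1())   # 10 - computed on first read, b1 subscribes to a1
a1.set(10)    # invalidates b1
print(b1())   # 20 - recomputed automatically
```

A real implementation also has to propagate invalidation through chains of computed values, handle nested computations, and drive effects, but the read-to-subscribe pattern above is the heart of it.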
```mermaid
graph TD
%% Define node subgraphs for better organization
subgraph "Data Sources"
S1[Signal A]
S2[Signal B]
S3[Signal C]
end
subgraph "Derived Values"
C1[Computed X]
C2[Computed Y]
end
subgraph "Side Effects"
E1[Effect 1]
E2[Effect 2]
end
subgraph "External Systems"
EXT1[UI Update]
EXT2[API Call]
EXT3[Database Write]
end
%% Define relationships between nodes
S1 -->|"get()"| C1
S2 -->|"get()"| C1
S2 -->|"get()"| C2
S3 -->|"get()"| C2
C1 -->|"get()"| E1
C2 -->|"get()"| E1
S3 -->|"get()"| E2
C2 -->|"get()"| E2
E1 --> EXT1
E1 --> EXT2
E2 --> EXT3
%% Change propagation path
S1 -.-> |"1\. set()"| C1
C1 -.->|"2\. recompute"| E1
E1 -.->|"3\. execute"| EXT1
%% Style nodes by type
classDef signal fill:#4CAF50,color:white,stroke:#388E3C,stroke-width:1px
classDef computed fill:#2196F3,color:white,stroke:#1976D2,stroke-width:1px
classDef effect fill:#FF9800,color:white,stroke:#F57C00,stroke-width:1px
%% Apply styles to nodes
class S1,S2,S3 signal
class C1,C2 computed
class E1,E2 effect
%% Legend node
LEGEND[" Legend:
• Signal: Stores a value, notifies dependents
• Computed: Derives value from dependencies
• Effect: Runs side effects when dependencies change
• → Data flow / Dependency (read)
• ⟿ Change propagation (update)
"]
classDef legend fill:none,stroke:none,text-align:left
class LEGEND legend
```
### Additional Features That reaktiv Provides
**Lazy Evaluation** - Computations only happen when results are actually needed:
```python
# This expensive computation isn't calculated until you access it
@Computed
def expensive_calc():
    return sum(range(1000000))  # Not calculated yet!
print(expensive_calc()) # NOW it calculates when you need the result
print(expensive_calc()) # Instant! (cached result)
```
**Memoization** - Results are cached until dependencies change:
```python
# Results are automatically cached for efficiency
a1 = Signal(5)
@Computed
def b1():
    return a1() * 2  # Define the computation
result1 = b1() # Calculates: 5 * 2 = 10
result2 = b1() # Cached! No recalculation needed
a1.set(6) # Dependency changed - cache invalidated
result3 = b1() # Recalculates: 6 * 2 = 12
```
**Fine-Grained Reactivity** - Only affected computations recalculate:
```python
# Independent data sources don't affect each other
a1 = Signal(5) # Independent signal
d2 = Signal(100) # Another independent signal
@Computed # Depends only on a1
def b1():
    return a1() * 2

@Computed # Depends on a1 and b1
def c1():
    return a1() + b1()

@Computed # Depends only on d2
def e2():
    return d2() / 10
a1.set(10) # Only b1 and c1 recalculate, e2 stays cached
d2.set(200) # Only e2 recalculates, b1 and c1 stay cached
```
This intelligent updating means your application only recalculates what actually needs to be updated, making it highly efficient.
## The Problem This Solves
Consider a simple order calculation:
### Without reaktiv (Manual Updates)
```python
class Order:
    def __init__(self):
        self.price = 100.0
        self.quantity = 2
        self.tax_rate = 0.1
        self._update_totals()  # Must remember to call this

    def set_price(self, price):
        self.price = price
        self._update_totals()  # Must remember to call this

    def set_quantity(self, quantity):
        self.quantity = quantity
        self._update_totals()  # Must remember to call this

    def _update_totals(self):
        # Must update in the correct order
        self.subtotal = self.price * self.quantity
        self.tax = self.subtotal * self.tax_rate
        self.total = self.subtotal + self.tax
        # Oops, forgot to update the display!
```
### With reaktiv (Excel-style Automatic Updates)
This is like Excel - change a cell and everything recalculates automatically:
```python
from reaktiv import Signal, Computed, Effect
# Base values (like Excel input cells)
price = Signal(100.0) # A1
quantity = Signal(2) # A2
tax_rate = Signal(0.1) # A3
# Formulas (like Excel computed cells)
@Computed # B1 = A1 * A2
def subtotal():
    return price() * quantity()

@Computed # B2 = B1 * A3
def tax():
    return subtotal() * tax_rate()

@Computed # B3 = B1 + B2
def total():
    return subtotal() + tax()
# Auto-display (like Excel chart that updates automatically)
total_effect = Effect(lambda: print(f"Order total: ${total():.2f}"))
# Just change the input - everything recalculates like Excel!
price.set(120.0) # Change A1 - B1, B2, B3 all update automatically
quantity.set(3) # Same thing
```
Benefits:
- ✅ Cannot forget to update dependent data
- ✅ Updates always happen in the correct order
- ✅ State relationships are explicit and centralized
- ✅ Side effects are guaranteed to run
## Type Safety & Decorator Benefits
`reaktiv` provides full type hint support, making it compatible with static type checkers like mypy and pyright. This enables better IDE autocompletion, early error detection, and improved code maintainability.
```python
from reaktiv import Signal, Computed, Effect
# Explicit type annotations
name: Signal[str] = Signal("Alice")
age: Signal[int] = Signal(30)
active: Signal[bool] = Signal(True)
# Type inference works automatically
score = Signal(100.0) # Inferred as Signal[float]
items = Signal([1, 2, 3]) # Inferred as Signal[list[int]]
# Computed values preserve and infer types
@Computed
def name_length(): # Type is automatically ComputeSignal[int]
    return len(name())

@Computed
def greeting(): # Type is automatically ComputeSignal[str]
    return f"Hello, {name()}!"

@Computed
def total_score(): # Type is automatically ComputeSignal[float]
    return score() * 1.5

# Type-safe update functions
def increment_age(current: int) -> int:
    return current + 1
age.update(increment_age) # Type checked!
```
## Why This Pattern?
```mermaid
graph TD
subgraph "Traditional Approach"
T1[Manual Updates]
T2[Scattered Logic]
T3[Easy to Forget]
T4[Hard to Debug]
T1 --> T2
T2 --> T3
T3 --> T4
end
subgraph "Reactive Approach"
R1[Declare Relationships]
R2[Automatic Updates]
R3[Centralized Logic]
R4[Guaranteed Consistency]
R1 --> R2
R2 --> R3
R3 --> R4
end
classDef traditional fill:#f44336,color:white
classDef reactive fill:#4CAF50,color:white
class T1,T2,T3,T4 traditional
class R1,R2,R3,R4 reactive
```
This reactive approach comes from frontend frameworks like **Angular** and **SolidJS**, where fine-grained reactivity revolutionized UI development. While those frameworks use this reactive pattern to efficiently update user interfaces, the core insight applies everywhere: **declaring reactive relationships between data leads to fewer bugs** than manually managing updates.
The reactive pattern is particularly valuable in Python applications for:
- Configuration management with cascading overrides
- Caching with automatic invalidation
- Real-time data processing pipelines
- Request/response processing with derived context
- Monitoring and alerting systems
## Practical Examples
### Reactive Configuration Management
```python
from reaktiv import Signal, Computed
# Multiple reactive config sources
defaults = Signal({"timeout": 30, "retries": 3})
user_prefs = Signal({"timeout": 60})
feature_flags = Signal({"new_retry_logic": True})
# Automatically reactive merged config
@Computed
def config():
    return {
        **defaults(),
        **user_prefs(),
        **feature_flags()
    }
print(config()) # {'timeout': 60, 'retries': 3, 'new_retry_logic': True}
# Change any source - merged config reacts automatically
defaults.update(lambda d: {**d, "max_connections": 100})
print(config()) # Now includes max_connections
```
### Reactive Data Processing Pipeline
```python
import time
from reaktiv import Signal, Computed, Effect
# Reactive raw data stream
raw_data = Signal([])
# Reactive processing pipeline
@Computed
def filtered_data():
    return [x for x in raw_data() if x > 0]

@Computed
def processed_data():
    return [x * 2 for x in filtered_data()]

@Computed
def summary():
    data = processed_data()
    return {
        "count": len(data),
        "sum": sum(data),
        "avg": sum(data) / len(data) if data else 0
    }
# Reactive monitoring - MUST assign to variable!
summary_effect = Effect(lambda: print(f"Summary: {summary()}"))
# Add data - entire reactive pipeline recalculates automatically
raw_data.set([1, -2, 3, 4]) # Prints summary
raw_data.update(lambda d: d + [5, 6]) # Updates summary
```
#### Reactive Pipeline Visualization
```mermaid
graph LR
subgraph "Reactive Data Processing Pipeline"
RD[raw_data<br/>Signal<list>]
FD[filtered_data<br/>Computed<list>]
PD[processed_data<br/>Computed<list>]
SUM[summary<br/>Computed<dict>]
RD -->|reactive filter x > 0| FD
FD -->|reactive map x * 2| PD
PD -->|reactive aggregate| SUM
SUM --> EFF[Effect: print summary]
end
NEW[New Data] -.->|"raw_data.set()"| RD
RD -.->|reactive update| FD
FD -.->|reactive update| PD
PD -.->|reactive update| SUM
SUM -.->|reactive trigger| EFF
classDef signal fill:#4CAF50,color:white
classDef computed fill:#2196F3,color:white
classDef effect fill:#FF9800,color:white
classDef input fill:#9C27B0,color:white
class RD signal
class FD,PD,SUM computed
class EFF effect
class NEW input
```
### Reactive System Monitoring
```python
from reaktiv import Signal, Computed, Effect
# Reactive system metrics
cpu_usage = Signal(20)
memory_usage = Signal(60)
disk_usage = Signal(80)
# Reactive health calculation
system_health = Computed(lambda:
    "critical" if any(x > 90 for x in [cpu_usage(), memory_usage(), disk_usage()]) else
    "warning" if any(x > 75 for x in [cpu_usage(), memory_usage(), disk_usage()]) else
    "healthy"
)

# Reactive automatic alerting - MUST assign to variable!
alert_effect = Effect(lambda: print(f"System status: {system_health()}")
                      if system_health() != "healthy" else None)
cpu_usage.set(95) # Reactive system automatically prints: "System status: critical"
```
## Advanced Features
### LinkedSignal (Writable derived state)
`LinkedSignal` is a writable computed signal that can be manually set by users but will automatically reset when its source context changes. Use it for “user overrides with sane defaults” that should survive some changes but reset on others.
Common use cases:
- Pagination: selection resets when page changes
- Wizard flows: step-specific state resets when the step changes
- Filters & search: user-picked value persists across pagination, resets when query changes
- Forms: default values computed from context but user can override temporarily
**Using the @Linked decorator:**
```python
from reaktiv import Signal, Linked
page = Signal(1)
# Writable derived state that resets whenever page changes
@Linked
def selection() -> str:
    return f"default-for-page-{page()}"
selection.set("custom-choice") # user override
print(selection()) # "custom-choice"
page.set(2) # context changes → resets
print(selection()) # "default-for-page-2"
```
**Alternative: factory function style:**
```python
# Still supported
from reaktiv import LinkedSignal

selection = LinkedSignal(lambda: f"default-for-page-{page()}")
```
Advanced pattern (explicit source and previous-state aware computation):
```python
from reaktiv import Signal, LinkedSignal, PreviousState
# Source contains (query, page). We want selection to persist across page changes
# but reset when the query string changes.
query = Signal("shoes")
page = Signal(1)
def compute_selection(src: tuple[str, int], prev: PreviousState[str] | None) -> str:
    current_query, _ = src
    # If only the page changed, keep previous selection
    if prev is not None and isinstance(prev.source, tuple) and prev.source[0] == current_query:
        return prev.value
    # Otherwise, provide a new default for the new query
    return f"default-for-{current_query}"
selection = LinkedSignal(source=lambda: (query(), page()), computation=compute_selection)
print(selection()) # "default-for-shoes"
selection.set("red-sneakers")
page.set(2) # page changed, same query → keep user override
print(selection()) # "red-sneakers"
query.set("boots") # query changed → reset to new default
print(selection()) # "default-for-boots"
```
Notes:
- It’s writable: call `selection.set(...)` or `selection.update(...)` to override.
- It auto-resets based on the dependencies you read (simple pattern) or your custom `source` logic (advanced pattern).
### Resource - Async Data Loading
`Resource` brings async operations into your reactive application with automatic dependency tracking, request cancellation, and comprehensive status management. Perfect for API calls and data fetching.
```python
import asyncio
from reaktiv import Resource, Signal, ResourceStatus
# Reactive parameter
user_id = Signal(1)
# Async data loader
async def fetch_user(params):
    # Check for cancellation
    if params.cancellation.is_set():
        return None
    await asyncio.sleep(0.5)  # Simulate API call
    return {"id": params.params["user_id"], "name": f"User {params.params['user_id']}"}

async def main():
    # Create resource
    user_resource = Resource(
        params=lambda: {"user_id": user_id()},  # When user_id changes, auto-reload
        loader=fetch_user
    )

    # Wait for initial load
    await asyncio.sleep(0.6)

    # Access data safely
    if user_resource.has_value():
        print(user_resource.value())  # {"id": 1, "name": "User 1"}

    # Changing param automatically triggers reload
    user_id.set(2)
    await asyncio.sleep(0.6)
    print(user_resource.value())  # {"id": 2, "name": "User 2"}
asyncio.run(main())
```
**Key features:**
- **6 status states**: IDLE, LOADING, RELOADING, RESOLVED, ERROR, LOCAL
- **Automatic request cancellation** when parameters change (prevents race conditions)
- **Seamless integration** with Computed and Effect
- **Manual control** via `reload()`, `set()`, and `update()` methods
- **Atomic snapshots** for safe state access
- **Automatic cleanup** when garbage collected
For complete documentation, examples, and design patterns, see the [Resource User Guide](https://reaktiv.bui.app/docs/resource-guide.html).
### Custom Equality
```python
# For objects where you want value-based comparison
items = Signal([1, 2, 3], equal=lambda a, b: a == b)
items.set([1, 2, 3]) # Won't trigger updates (same values)
```
### Update Functions
```python
counter = Signal(0)
counter.update(lambda x: x + 1) # Increment based on current value
```
### Async Effects
**Recommendation: Use synchronous effects** - they provide better control and predictable behavior:
```python
import asyncio
from reaktiv import Signal, Effect

my_signal = Signal("initial")

# ✅ RECOMMENDED: Synchronous effect with async task spawning
def sync_effect():
    # Signal values captured at this moment - guaranteed consistency
    current_value = my_signal()

    # Spawn async task if needed
    async def background_work():
        await asyncio.sleep(0.1)
        print(f"Processing: {current_value}")

    asyncio.create_task(background_work())
# MUST assign to variable!
my_effect = Effect(sync_effect)
```
**Experimental: Direct async effects**
Async effects are experimental and should be used with caution:
```python
import asyncio
async def async_effect():
    await asyncio.sleep(0.1)
    print(f"Async processing: {my_signal()}")
# MUST assign to variable!
my_async_effect = Effect(async_effect)
```
**Key differences:**
- **Synchronous effects**: Block the signal update until complete, ensuring signal values don't change during effect execution
- **Async effects** (experimental): Allow signal updates to complete immediately, but signal values may change while the async effect is running
**Note:** Most applications should use synchronous effects for predictable behavior.
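The hazard is easiest to see in plain asyncio, independent of reaktiv: a synchronous callback reads state at the moment of the change, while a coroutine that reads state after an `await` may observe a later value. A minimal stand-alone illustration (no reaktiv involved; all names here are invented for the example):

```python
import asyncio

state = {"value": "initial"}
observed = []

def sync_observer():
    # Reads immediately - sees the value that triggered it
    observed.append(("sync", state["value"]))

async def async_observer():
    await asyncio.sleep(0)  # yield control before reading
    observed.append(("async", state["value"]))  # may see a *later* value

async def main():
    state["value"] = "first"
    sync_observer()                          # captures "first" right away
    task = asyncio.create_task(async_observer())
    state["value"] = "second"                # changes before the task ever reads it
    await task

asyncio.run(main())
print(observed)  # [('sync', 'first'), ('async', 'second')]
```

The async observer never saw `"first"` at all — the same kind of skew can happen to signal reads inside an experimental async effect.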
### Untracked Reads
Use `untracked()` to read signals without creating dependencies:
```python
from reaktiv import Signal, Computed, untracked

user_id = Signal(1)
debug_mode = Signal(False)

# This computed only depends on user_id, not debug_mode
def get_user_data():
    uid = user_id()  # Creates dependency
    if untracked(debug_mode):  # No dependency created
        print(f"Loading user {uid}")
    return f"User data for {uid}"
user_data = Computed(get_user_data)
debug_mode.set(True) # Won't trigger recomputation
user_id.set(2) # Will trigger recomputation
```
**Context Manager Usage**
You can also use `untracked` as a context manager to read multiple signals without creating dependencies. This is useful for logging or conditional logic inside an effect without adding extra dependencies.
```python
from reaktiv import Signal, Computed, Effect, untracked
name = Signal("Alice")
is_logging_enabled = Signal(False)
log_level = Signal("INFO")
greeting = Computed(lambda: f"Hello, {name()}!")
# An effect that depends on `greeting`, but reads other signals untracked
def display_greeting():
    # Create a dependency on `greeting`
    current_greeting = greeting()

    # Read multiple signals without creating dependencies
    with untracked():
        logging_active = is_logging_enabled()
        current_log_level = log_level()

    if logging_active:
        print(f"LOG [{current_log_level}]: Greeting updated to '{current_greeting}'")
    print(current_greeting)
# MUST assign to variable!
greeting_effect = Effect(display_greeting)
# Initial run prints: "Hello, Alice!"
name.set("Bob")
# Prints: "Hello, Bob!"
is_logging_enabled.set(True)
log_level.set("DEBUG")
# Prints nothing, because these are not dependencies of the effect.
name.set("Charlie")
# Prints:
# LOG [DEBUG]: Greeting updated to 'Hello, Charlie!'
# Hello, Charlie!
```
The context manager approach is particularly useful when you need to read multiple signals for logging, debugging, or conditional logic without creating reactive dependencies.
### Batch Updates
Use `batch()` to group multiple updates and trigger effects only once:
```python
from reaktiv import Signal, Effect, batch
name = Signal("Alice")
age = Signal(30)
city = Signal("New York")
def print_info():
    print(f"{name()}, {age()}, {city()}")
info_effect = Effect(print_info)
# Effect prints one time on init
# Without batch - prints 3 times
name.set("Bob")
age.set(25)
city.set("Boston")
# With batch - prints only once at the end
with batch():
    name.set("Charlie")
    age.set(35)
    city.set("Chicago")
# Only prints once: "Charlie, 35, Chicago"
```
### Error Handling
Proper error handling is crucial to prevent cascading failures:
```python
from reaktiv import Signal, Computed, Effect
# Example: Division computation that can fail
numerator = Signal(10)
denominator = Signal(2)
# Unsafe computation - can throw ZeroDivisionError
unsafe_division = Computed(lambda: numerator() / denominator())
# Safe computation with error handling
def safe_divide():
    try:
        return numerator() / denominator()
    except ZeroDivisionError:
        return float('inf')  # or return 0, or handle as needed

safe_division = Computed(safe_divide)

# Error handling in effects
def safe_print():
    try:
        unsafe_result = unsafe_division()
        print(f"Unsafe result: {unsafe_result}")
    except ZeroDivisionError:
        print("Error: Division by zero!")

    safe_result = safe_division()
    print(f"Safe result: {safe_result}")
effect = Effect(safe_print)
# Test error scenarios
denominator.set(0) # Triggers ZeroDivisionError in unsafe computation
# Prints: "Error: Division by zero!" and "Safe result: inf"
```
## Important Notes
### ⚠️ Effect Retention (Critical!)
**Effects must be assigned to a variable to prevent garbage collection.** This is the most common mistake when using reaktiv:
```python
# ❌ WRONG - effect gets garbage collected immediately and won't work
Effect(lambda: print("This will never print"))
# ✅ CORRECT - effect stays active
my_effect = Effect(lambda: print("This works!"))
# ✅ Also correct - store in a list or class attribute
effects = []
effects.append(Effect(lambda: print("This also works!")))
# ✅ In classes, assign to self
class MyClass:
    def __init__(self):
        self.counter = Signal(0)
        # Keep effect alive by assigning to instance
        self.effect = Effect(lambda: print(f"Counter: {self.counter()}"))
```
**Why this design?** This explicit retention requirement prevents accidental memory leaks. Unlike some reactive systems that automatically keep effects alive indefinitely, `reaktiv` requires you to explicitly manage effect lifetimes. When you no longer need an effect, simply let the variable go out of scope or delete it - the effect will be automatically cleaned up. This gives you control over when reactive behavior starts and stops, preventing long-lived applications from accumulating abandoned effects.
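The lifetime behaviour described above can be reproduced with plain `weakref`s, which is roughly how such retention schemes work in general: if a registry holds only weak references to its observers, an observer nobody else references simply disappears. A simplified stand-alone sketch — not reaktiv's internals, and the `Subject`/`Observer` names are invented for this illustration:

```python
import gc
import weakref

class Observer:
    def __init__(self, callback):
        self.callback = callback

class Subject:
    def __init__(self):
        # Weak references: observers die when the caller drops them
        self._observers = weakref.WeakSet()

    def subscribe(self, observer):
        self._observers.add(observer)

    def notify(self):
        for obs in list(self._observers):
            obs.callback()

calls = []
subject = Subject()

kept = Observer(lambda: calls.append("kept"))
subject.subscribe(kept)

# No variable holds this one - analogous to calling Effect(...) without assignment
subject.subscribe(Observer(lambda: calls.append("dropped")))
gc.collect()  # the unreferenced observer is collected

subject.notify()
print(calls)  # ['kept'] - only the retained observer still fires
```

Because the subject never holds a strong reference, dropping your last reference to an effect is enough to deactivate it — no unsubscribe bookkeeping required.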
**Manual cleanup:** You can also explicitly dispose of effects when you're done with them:
```python
my_effect = Effect(lambda: print("This will run"))
# ... some time later ...
my_effect.dispose() # Manually clean up the effect
# Effect will no longer run when dependencies change
```
### Mutable Objects
By default, reaktiv uses identity comparison. For mutable objects:
```python
data = Signal([1, 2, 3])
# This triggers update (new list object)
data.set([1, 2, 3])
# This doesn't trigger update (same object, modified in place)
current = data()
current.append(4) # reaktiv doesn't see this change
```
### Working with Lists and Dictionaries
When working with mutable objects like lists and dictionaries, you need to create new objects to trigger updates:
#### Lists
```python
items = Signal([1, 2, 3])
# ❌ WRONG - modifies in place, no update triggered
current = items()
current.append(4) # reaktiv doesn't detect this
# ✅ CORRECT - create new list
items.set([*items(), 4]) # or items.set(items() + [4])
# ✅ CORRECT - using update() method
items.update(lambda current: current + [4])
items.update(lambda current: [*current, 4])
# Other list operations
items.update(lambda lst: [x for x in lst if x > 2]) # Filter
items.update(lambda lst: [x * 2 for x in lst]) # Map
items.update(lambda lst: lst[:-1]) # Remove last
items.update(lambda lst: [0] + lst) # Prepend
```
#### Dictionaries
```python
config = Signal({"timeout": 30, "retries": 3})
# ❌ WRONG - modifies in place, no update triggered
current = config()
current["new_key"] = "value" # reaktiv doesn't detect this
# ✅ CORRECT - create new dictionary
config.set({**config(), "new_key": "value"})
# ✅ CORRECT - using update() method
config.update(lambda current: {**current, "new_key": "value"})
# Other dictionary operations
config.update(lambda d: {**d, "timeout": 60}) # Update value
config.update(lambda d: {k: v for k, v in d.items() if k != "retries"}) # Remove key
config.update(lambda d: {**d, **{"max_conn": 100, "pool_size": 5}}) # Merge multiple
```
#### Alternative: Value-Based Equality
If you prefer to modify objects in place, provide a custom equality function:
```python
# For lists - compares actual values
def list_equal(a, b):
    return len(a) == len(b) and all(x == y for x, y in zip(a, b))
items = Signal([1, 2, 3], equal=list_equal)
# Now you can modify in place and trigger updates manually
current = items()
current.append(4)
items.set(current) # Triggers update because values changed
# For dictionaries - compares actual content
def dict_equal(a, b):
    return a == b
config = Signal({"timeout": 30}, equal=dict_equal)
current = config()
current["retries"] = 3
config.set(current) # Triggers update
```
## More Examples
You can find more example scripts in the [examples](./examples) folder to help you get started with using this project.
Including integration examples with:
- [FastAPI - Websocket](./examples/fastapi_websocket.py)
- [NiceGUI - Todo-App](./examples/nicegui_todo_app.py)
- [Reactive Data Pipeline with NumPy and Pandas](./examples/data_pipeline_numpy_pandas.py)
- [Jupyter Notebook - Reactive IPyWidgets](./examples/reactive_jupyter_notebook.ipynb)
- [NumPy Matplotlib - Reactive Plotting](./examples/numpy_plotting.py)
- [IoT Sensor Agent Thread - Reactive Hardware](./examples/iot_sensor_agent_thread.py)
## Star History
[Star History Chart](https://star-history.com/#buiapp/reaktiv&Date)
---
**Inspired by** Angular Signals and SolidJS reactivity • **Built for** Python developers who want fewer state management bugs • **Made in** Hamburg
| text/markdown | null | Tuan Anh Bui <mail@bui.app> | null | null | MIT | null | [
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12"
] | [] | null | null | >=3.9 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/buiapp/reaktiv"
] | uv/0.10.4 {"installer":{"name":"uv","version":"0.10.4","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true} | 2026-02-19T10:44:37.722870 | reaktiv-0.21.2.tar.gz | 85,019 | c0/2b/3162c689f5ed78b0a29d9cf2911af3279667110335c04cd270a7b82eb5f8/reaktiv-0.21.2.tar.gz | source | sdist | null | false | fba4c597881608e25cbfa489771a495f | 69a96c993457947af0ce6f29a8d4b8e607168d2f6a963f3a398894813677a9ce | c02b3162c689f5ed78b0a29d9cf2911af3279667110335c04cd270a7b82eb5f8 | null | [] | 401 |
2.4 | alt-pytest-asyncio | 0.9.5 | Alternative pytest plugin to pytest-asyncio | Alternative Pytest Asyncio
==========================
This plugin allows you to have async pytest fixtures and tests.
This plugin only supports python 3.11 and above.
The code here is influenced by pytest-asyncio but with some differences:
* Error tracebacks are from your tests, rather than asyncio internals
* There is only one loop for all of the tests
* You can manage the lifecycle of the loop yourself outside of pytest by using
this plugin with your own loop
* No need to explicitly mark your tests as async. (pytest-asyncio requires you
mark your async tests because it also supports other event loops like curio
and trio)
Like pytest-asyncio it supports async tests, coroutine fixtures and async
generator fixtures.
Full documentation can be found at https://alt-pytest-asyncio.readthedocs.io
| text/x-rst | null | Stephen Moore <delfick755@gmail.com> | null | null | MIT | null | [
"Framework :: Pytest",
"Topic :: Software Development :: Testing"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"pytest>=8.0.0",
"alt-pytest-asyncio-test-driver; extra == \"dev\"",
"tools; extra == \"dev\"",
"tools; extra == \"tools\""
] | [] | [] | [] | [
"Homepage, https://github.com/delfick/alt-pytest-asyncio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:44:17.279349 | alt_pytest_asyncio-0.9.5.tar.gz | 9,030 | 38/52/920ab736636580ea40ce47d1ba98a027e273a32e4be144eed2f928d9ac41/alt_pytest_asyncio-0.9.5.tar.gz | source | sdist | null | false | 80506ce5d3c561353500fbc84a4ed450 | 8171fbf00fbb4d763d7ded77e343dc443bd111901b1f5654ad271cf94a74c248 | 3852920ab736636580ea40ce47d1ba98a027e273a32e4be144eed2f928d9ac41 | null | [
"LICENSE"
] | 490 |
2.4 | charmlibs-interfaces-tls-certificates | 1.7.0 | The charmlibs.interfaces.tls_certificates package. | # charmlibs.interfaces.tls_certificates
The `tls-certificates` interface library.
To install, add `charmlibs-interfaces-tls-certificates` to your Python dependencies. Then in your Python code, import as:
```py
from charmlibs.interfaces import tls_certificates
```
See the [reference documentation](https://documentation.ubuntu.com/charmlibs/reference/charmlibs/interfaces/tls-certificates).
Also see the [usage documentation](https://charmhub.io/tls-certificates-interface) on Charmhub. This documentation was written when the library was hosted on Charmhub, so some parts might not be directly applicable.
| text/markdown | The TLS team at Canonical | null | null | null | null | null | [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.10 | [] | [] | [] | [
"cryptography>=43.0.0",
"ops",
"pydantic"
] | [] | [] | [] | [
"Repository, https://github.com/canonical/charmlibs",
"Issues, https://github.com/canonical/charmlibs/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:43:57.758544 | charmlibs_interfaces_tls_certificates-1.7.0.tar.gz | 145,552 | da/fa/9c20a264c42adcad31293e4588e421c53ae24c105199e4e1c9d7374a4e64/charmlibs_interfaces_tls_certificates-1.7.0.tar.gz | source | sdist | null | false | bb0e49e1bed67f6c5fbceb7686a98a56 | 7fe79c78fab51a864c96d8d731049479610a014152a75dd585568ad268ecaafa | dafa9c20a264c42adcad31293e4588e421c53ae24c105199e4e1c9d7374a4e64 | null | [] | 475 |
2.4 | lumora | 0.3.7 | A composable image generation and document rendering library for Python, powered by Pillow. | <p align="center">
<img src="assets/icon.svg" alt="Lumora" width="128" height="128">
</p>
<h1 align="center">Lumora</h1>
<p align="center">
A composable image generation and document rendering library for Python, powered by Pillow.
</p>
<p align="center">
<a href="https://pypi.org/project/lumora/"><img src="https://img.shields.io/pypi/v/lumora.svg" alt="PyPI"></a>
<a href="https://www.python.org/downloads/"><img src="https://img.shields.io/badge/python-3.12+-blue.svg" alt="Python 3.12+"></a>
<a href="https://github.com/krypton-byte/lumora/blob/main/LICENSE"><img src="https://img.shields.io/badge/license-MIT-green.svg" alt="MIT License"></a>
<a href="https://krypton-byte.github.io/lumora/playground/"><img src="https://img.shields.io/badge/try-playground-7c6ef0.svg" alt="Playground"></a>
<a href="https://krypton-byte.github.io/lumora/docs/"><img src="https://img.shields.io/badge/docs-mkdocs-blue.svg" alt="Documentation"></a>
</p>
---
## Overview
**Lumora** lets you build complex, pixel-perfect images, multi-page PDF documents, and PPTX presentations entirely in Python — no browser, no template engine, no external rendering service.
Inspired by modern declarative UI toolkits (Jetpack Compose, Flutter), Lumora provides a composable widget tree where every element — containers, columns, rows, text, images — is sized, laid out, clipped, and rendered automatically.
### Key features
- **Composable layouts** — `Container`, `Column`, `Row`, `Stack`, `Expanded`, `Spacer`, `SizedBox`, `Padding`
- **Rich text** — `TextWidget`, `RichTextWidget` (inline HTML-like markup), `JustifiedTextWidget`, `HighlightTextWidget`
- **Advanced styling** — gradients (linear & radial), rounded corners, circles, strokes, shadows, text transforms
- **Visual effects** — blur, stroke outlines, glassmorphism corners, alpha gradients, custom effects
- **GPU acceleration** — optional Taichi-powered GPU backend (`lumora[gpu]`) with automatic CPU/NumPy fallback
- **Smart render caching** — content-addressable LRU cache with structural hashing; unchanged widgets return instantly on re-render (~17× speedup)
- **Design-resolution scaling** — define once at a virtual size, render to any resolution with LANCZOS resampling
- **PDF export** — multi-page documents with optional metadata via pikepdf
- **PPTX export** — PowerPoint presentations with auto-fitted slide dimensions via python-pptx
- **`@preview` decorator** — mark specific `Page` subclasses for selective rendering in the playground and VS Code extension
- **Pure Python** — depends on Pillow, NumPy, and python-pptx; pikepdf is an optional extra
---
## Monorepo structure
```
lumora/
├── packages/
│   ├── lumora/           # Core Python library (published to PyPI)
│   └── lumora-vscode/    # VS Code extension for live preview
├── docs/
│   ├── content/          # MkDocs documentation source
│   ├── playground/       # Browser-based playground (Vite + React)
│   └── landing/          # Landing page
├── examples/             # Example scripts
└── pyproject.toml        # Python project configuration
```
---
## Try it online
The **[Lumora Playground](https://krypton-byte.github.io/lumora/playground/)** lets you write and render Lumora pages directly in the browser — no installation required. Features include:
- Live code editor with Python syntax highlighting
- Instant rendering with render time display
- Zoom, pan, and fit controls for the preview
- Export to PDF (client-side via jsPDF) and PPTX
- Page reordering before export
- File save/open and shareable URLs
- Auto-saves your code between sessions
- Built-in examples to get started
---
## Installation
```bash
pip install lumora
```
For PDF metadata support (title, author, etc.):
```bash
pip install lumora[pikepdf]
```
For GPU-accelerated rendering via [Taichi](https://www.taichi-lang.org/):
```bash
pip install lumora[gpu]
```
Or with [uv](https://docs.astral.sh/uv/):
```bash
uv add lumora # core (includes python-pptx)
uv add lumora[pikepdf] # with PDF metadata
uv add lumora[gpu] # with GPU acceleration
```
### Requirements
- Python 3.12+
- Pillow >= 12.1
- NumPy >= 2.4
- python-pptx >= 0.6 *(included — for PPTX export)*
- pikepdf >= 10.3 *(optional — for PDF metadata)*
- Taichi >= 1.7 *(optional — for GPU acceleration)*
---
## Quick start
```python
from lumora.page import Page, preview
from lumora.layout import Container, Column, TextWidget
from lumora.config import RGBColor, Alignment, Margin
from lumora.font import TextStyle, FontWeight

title_style = TextStyle(
    font_size=72,
    font_family="Inter",
    font_weight=FontWeight.BOLD,
    paint=RGBColor(255, 255, 255),
)

body_style = TextStyle(
    font_size=32,
    font_family="Inter",
    paint=RGBColor(200, 200, 200),
)

@preview
class HelloPage(Page):
    design_width = 1920
    design_height = 1080

    def build(self):
        return Container(
            width=self.design_width,
            height=self.design_height,
            background=RGBColor(25, 25, 35),
            alignment=Alignment.CENTER,
            padding=Margin(60, 60, 60, 60),
            child=Column(
                spacing=20,
                children=[
                    TextWidget("Hello, Lumora!", style=title_style),
                    TextWidget(
                        "Composable image generation in pure Python.",
                        style=body_style,
                    ),
                ],
            ),
        )

page = HelloPage()
page.save("hello.png", width=1920)
```
### The `@preview` decorator
Mark any `Page` subclass with `@preview` to flag it for selective rendering in the [Playground](https://krypton-byte.github.io/lumora/playground/) and the [VS Code extension](#vs-code-extension). When `@preview` is present on any class, only decorated classes are rendered — otherwise all `Page` subclasses are rendered:
```python
from lumora.page import Page, preview

@preview
class DesignDraft(Page):
    """Only this page renders in the playground/VS Code."""
    ...

class ProductionPage(Page):
    """This page is skipped in preview tools."""
    ...
```
---
## Widget catalog
| Widget | Description |
|-|-|
| `Container` | Box with background, padding, alignment, clipping, effects |
| `Column` | Vertical layout with spacing and axis alignment |
| `Row` | Horizontal layout with spacing and axis alignment |
| `Stack` | Overlay children on top of each other |
| `Expanded` | Fill remaining space in a `Column` or `Row` |
| `Spacer` | Fixed-size empty space |
| `SizedBox` | Force explicit width/height on a child |
| `Padding` | Add insets around a child |
| `TextWidget` | Single-style text block |
| `RichTextWidget` | Inline markup: `<b>`, `<i>`, `<color>`, `<size>`, etc. |
| `JustifiedTextWidget` | Fully justified paragraph text |
| `HighlightTextWidget` | Text with highlighted background spans |
| `ImageWidget` | Render an image from file or PIL Image |
| `LineWidget` | Horizontal line / divider |
| `GlassContainer` | Frosted-glass effect container |
---
## Multi-page PDF
```python
from lumora.page import Document, DocumentMetadata, Format

doc = Document()
doc.add_page(CoverPage())
doc.add_page(ChartPage())
doc.add_page(SummaryPage())

# Without metadata — no pikepdf needed
doc.save("report.pdf", width=1920)

# With metadata — requires: pip install lumora[pikepdf]
doc.save(
    "report.pdf",
    Format.PDF,
    DocumentMetadata(title="Q4 Report", author="Analytics"),
    width=1920,
)
```
---
## PPTX (PowerPoint) export
Slide dimensions automatically match the image aspect ratio — no stretching.
```python
doc.save("slides.pptx", Format.PPTX)

# With metadata
doc.save(
    "slides.pptx",
    Format.PPTX,
    DocumentMetadata(title="Q4 Report", author="Analytics"),
    width=1920,
)

# At half resolution
doc.save("slides.pptx", Format.PPTX, scale=0.5)
```
---
## GPU acceleration
Lumora includes a dual compute backend. When `lumora[gpu]` is installed,
heavy pixel operations (gradients, blending, vignette, noise, etc.) are
offloaded to the GPU via [Taichi](https://www.taichi-lang.org/). Without
Taichi, the same operations run on the CPU using optimised NumPy.
```bash
pip install lumora[gpu]
```
Check the active backend at runtime:
```python
from lumora.backend import get_backend
print(get_backend()) # 'taichi' or 'numpy'
```
Taichi tries CUDA → Vulkan → Metal → OpenGL and finally falls back to
CPU if no GPU driver is available. Even in CPU mode, Taichi kernels
benefit from multi-threading and JIT compilation.
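The fallback chain described above amounts to a try-in-order selection loop, sketched here in plain Python (an illustration, not Lumora's actual code):

```python
def pick_backend(available: set) -> str:
    """Return the first backend in preference order that is available."""
    for name in ("cuda", "vulkan", "metal", "opengl", "cpu"):
        if name in available:
            return name
    return "cpu"  # CPU is always available as the last resort

print(pick_backend({"vulkan", "cpu"}))  # vulkan
```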
---
## Render caching
Lumora includes a **content-addressable render cache** that makes
re-rendering fast — especially during live-preview workflows where only
a few widgets change between refreshes.
Every widget computes a *structural hash* from its parameters and the
hashes of its children. When `render()` is called, Lumora checks the
cache first and returns the stored image immediately on a hit. Only
widgets whose content actually changed are re-rendered.
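The core idea can be sketched in a few lines of plain Python (a hypothetical, simplified model; the real cache also tracks memory use and eviction order):

```python
import hashlib

def structural_hash(params: dict, child_hashes: tuple = ()) -> str:
    """Hash a widget's own parameters together with its children's hashes."""
    h = hashlib.sha256(repr(sorted(params.items())).encode())
    for child in child_hashes:
        h.update(child.encode())
    return h.hexdigest()

_cache = {}

def render_cached(params, child_hashes, render_fn):
    key = structural_hash(params, child_hashes)
    if key not in _cache:      # miss: render once and store the result
        _cache[key] = render_fn()
    return _cache[key]         # hit: return the stored image immediately
```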
**Highlights:**
- **Automatic** — enabled by default, zero configuration needed.
- **Content-based** — even if widget objects are recreated (e.g. on each VS Code preview refresh), cache hits occur as long as the parameters match.
- **LRU + memory budget** — defaults to 256 MB; evicts least-recently-used entries when the budget is exceeded.
- **~17× speedup** on hot re-renders with partial changes (benchmarked on a 25-widget tree).
### Configuration
```python
from lumora.cache import RenderCache
# Raise memory budget to 512 MB
RenderCache.configure(max_mb=512)
# Disable caching entirely
RenderCache.configure(enabled=False)
# Inspect hit/miss statistics
print(RenderCache.instance().stats)
# {'hits': 19, 'misses': 1, 'hit_rate': 0.95, 'entries': 20,
# 'memory_mb': 42.0, 'max_memory_mb': 256.0}
# Clear all cached entries
RenderCache.instance().clear()
```
---
## Tooling
### Playground
The **[Lumora Playground](https://krypton-byte.github.io/lumora/playground/)** runs entirely in the browser via PyScript/Pyodide. Write, render, zoom/pan, and export — with auto-save and shareable URLs.
<p align="center">
<img src="https://cdn.jsdelivr.net/gh/bgstanly/stream@157861b/output_playground.webp" alt="Lumora Playground Demo" width="720" />
</p>
<p align="center"><em>Lumora Playground — write and render Lumora pages directly in the browser</em></p>
### VS Code Extension
The **Lumora Preview** extension (`packages/lumora-vscode/`) provides live preview directly in VS Code with hot reload.
<p align="center">
<img src="https://cdn.jsdelivr.net/gh/bgstanly/stream@157861b/output_vscode.webp" alt="Lumora VS Code Extension Demo" width="720" />
</p>
<p align="center"><em>Lumora VS Code Extension — live preview with hot reload</em></p>
Files must start with `# @lumora-file-preview` on the first line, and pages must be decorated with `@preview`:
```python
# @lumora-file-preview
from lumora.page import Page, preview
from lumora.layout import Container, TextWidget
from lumora.config import RGBColor, Alignment
from lumora.font import TextStyle, FontWeight

@preview
class MyPage(Page):
    design_width = 1920
    design_height = 1080

    def build(self):
        return Container(
            width=self.design_width,
            height=self.design_height,
            background=RGBColor(25, 25, 40),
            alignment=Alignment.CENTER,
            child=TextWidget(
                "Hello!",
                style=TextStyle(
                    font_size=80,
                    font_weight=FontWeight.BOLD,
                    paint=RGBColor(255, 255, 255),
                ),
            ),
        )
```
Install from the [Releases](https://github.com/krypton-byte/lumora/releases) page — download the `.vsix` file and install via `Ctrl+Shift+P` → "Extensions: Install from VSIX...".
---
## Development
```bash
# Clone the monorepo
git clone https://github.com/krypton-byte/lumora.git
cd lumora
# Install Python dependencies
uv sync --group docs --group dev
# Run examples
uv run python examples/01_hello_world.py
# Serve docs locally
uv run mkdocs serve -f docs/mkdocs.yml
# Build the playground
cd docs/playground && bun install && bun run dev
# Build the VS Code extension
cd packages/lumora-vscode && bun install && bun run compile
```
---
## Documentation
Full API documentation is available at **[krypton-byte.github.io/lumora/docs/](https://krypton-byte.github.io/lumora/docs/)**.
---
## License
MIT — see [LICENSE](LICENSE) for details.
| text/markdown | null | null | null | null | MIT | composable, generation, image, layout, pdf, pillow, report, widget | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.12",
"Topic :: Multimedia :: Graphics",
"Topic :: Software Development :: Libraries :: Python Modules"
] | [] | null | null | >=3.12 | [] | [] | [] | [
"numpy>=2.4.2",
"pillow>=12.1.1",
"python-pptx>=0.6.21",
"pikepdf>=10.3.0; extra == \"all\"",
"taichi>=1.7.0; extra == \"all\"",
"taichi>=1.7.0; extra == \"gpu\"",
"pikepdf>=10.3.0; extra == \"pikepdf\""
] | [] | [] | [] | [
"Documentation, https://krypton-byte.github.io/lumora",
"Repository, https://github.com/krypton-byte/lumora",
"Issues, https://github.com/krypton-byte/lumora/issues"
] | uv/0.10.4 {"installer":{"name":"uv","version":"0.10.4","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true} | 2026-02-19T10:43:20.538219 | lumora-0.3.7-py3-none-any.whl | 4,827,679 | 75/7b/5827fbf524c550a9650197cea6785f82bd1a3b4d555bdbc36a2a2e02a29b/lumora-0.3.7-py3-none-any.whl | py3 | bdist_wheel | null | false | ec2d2818d0930049b6d7457c800167c8 | f95ce3c07e7a415b7efb08bf75d5ef9d9f5ea2c3438e5ca1821e17e33dba7f85 | 757b5827fbf524c550a9650197cea6785f82bd1a3b4d555bdbc36a2a2e02a29b | null | [] | 233 |
2.4 | manila | 19.1.1 | Shared Storage for OpenStack | ========================
Team and repository tags
========================
.. image:: https://governance.openstack.org/tc/badges/manila.svg
:target: https://governance.openstack.org/tc/reference/tags/index.html
.. Change things from this point on
======
MANILA
======
You have come across an OpenStack shared file system service. It has
identified itself as "Manila." It was abstracted from the Cinder
project.
* Wiki: https://wiki.openstack.org/wiki/Manila
* Developer docs: https://docs.openstack.org/manila/latest/
Getting Started
---------------
If you'd like to run from the master branch, you can clone the git repo::

    git clone https://opendev.org/openstack/manila
For developer information please see
`HACKING.rst <https://opendev.org/openstack/manila/src/branch/master/HACKING.rst>`_
You can raise bugs here https://bugs.launchpad.net/manila
Python client
-------------
https://opendev.org/openstack/python-manilaclient
* Documentation for the project can be found at:
  https://docs.openstack.org/manila/latest/
* Release notes for the project can be found at:
  https://docs.openstack.org/releasenotes/manila/
* Source for the project:
  https://opendev.org/openstack/manila
* Bugs:
  https://bugs.launchpad.net/manila
* Blueprints:
  https://blueprints.launchpad.net/manila
* Design specifications are tracked at:
  https://specs.openstack.org/openstack/manila-specs/
| null | OpenStack | openstack-discuss@lists.openstack.org | null | null | null | null | [
"Environment :: OpenStack",
"Intended Audience :: Information Technology",
"Intended Audience :: System Administrators",
"License :: OSI Approved :: Apache Software License",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python",
"Programming Language :: Python :: 3 :: Only",
"Program... | [] | https://docs.openstack.org/manila/latest/ | null | >=3.8 | [] | [] | [] | [
"pbr>=5.5.0",
"alembic>=1.4.2",
"defusedxml>=0.7.1",
"eventlet>=0.26.1",
"greenlet>=0.4.16",
"lxml>=4.5.2",
"netaddr>=0.8.0",
"oslo.config>=8.3.2",
"oslo.context>=3.1.1",
"oslo.db>=8.4.0",
"oslo.i18n>=5.0.1",
"oslo.log>=4.4.0",
"oslo.messaging>=14.1.0",
"oslo.middleware>=4.1.1",
"oslo.po... | [] | [] | [] | [] | twine/6.2.0 CPython/3.11.14 | 2026-02-19T10:41:38.215853 | manila-19.1.1.tar.gz | 3,513,967 | e8/4c/c7d0ca024a9d0afa66f3ca414835ae4ce22f4d8db9f3b4d381eaf3418958/manila-19.1.1.tar.gz | source | sdist | null | false | a6a0a55b0af68d9231cb41e581d43964 | 2abf92408cf3ae69eabc799e587ed4cf1f1c3325ab0f3944479d7d34241d6a0e | e84cc7d0ca024a9d0afa66f3ca414835ae4ce22f4d8db9f3b4d381eaf3418958 | null | [
"LICENSE"
] | 230 |
2.4 | cycode | 3.10.1 | Boost security in your dev lifecycle via SAST, SCA, Secrets & IaC scanning. | # Cycode CLI User Guide
The Cycode Command Line Interface (CLI) is an application you can install locally to scan your repositories for secrets, infrastructure as code misconfigurations, software composition analysis vulnerabilities, and static application security testing issues.
This guide walks you through both installation and usage.
# Table of Contents
1. [Prerequisites](#prerequisites)
2. [Installation](#installation)
    1. [Install Cycode CLI](#install-cycode-cli)
        1. [Using the Auth Command](#using-the-auth-command)
        2. [Using the Configure Command](#using-the-configure-command)
        3. [Add to Environment Variables](#add-to-environment-variables)
            1. [On Unix/Linux](#on-unixlinux)
            2. [On Windows](#on-windows)
    2. [Install Pre-Commit Hook](#install-pre-commit-hook)
3. [Cycode CLI Commands](#cycode-cli-commands)
4. [MCP Command](#mcp-command-experiment)
    1. [Starting the MCP Server](#starting-the-mcp-server)
    2. [Available Options](#available-options)
    3. [MCP Tools](#mcp-tools)
    4. [Usage Examples](#usage-examples)
5. [Scan Command](#scan-command)
    1. [Running a Scan](#running-a-scan)
        1. [Options](#options)
            1. [Severity Threshold](#severity-option)
            2. [Monitor](#monitor-option)
            3. [Cycode Report](#cycode-report-option)
            4. [Package Vulnerabilities](#package-vulnerabilities-option)
            5. [License Compliance](#license-compliance-option)
            6. [Lock Restore](#lock-restore-option)
        2. [Repository Scan](#repository-scan)
            1. [Branch Option](#branch-option)
        3. [Path Scan](#path-scan)
            1. [Terraform Plan Scan](#terraform-plan-scan)
        4. [Commit History Scan](#commit-history-scan)
            1. [Commit Range Option (Diff Scanning)](#commit-range-option-diff-scanning)
        5. [Pre-Commit Scan](#pre-commit-scan)
        6. [Pre-Push Scan](#pre-push-scan)
    2. [Scan Results](#scan-results)
        1. [Show/Hide Secrets](#showhide-secrets)
        2. [Soft Fail](#soft-fail)
        3. [Example Scan Results](#example-scan-results)
            1. [Secrets Result Example](#secrets-result-example)
            2. [IaC Result Example](#iac-result-example)
            3. [SCA Result Example](#sca-result-example)
            4. [SAST Result Example](#sast-result-example)
        4. [Company Custom Remediation Guidelines](#company-custom-remediation-guidelines)
    3. [Ignoring Scan Results](#ignoring-scan-results)
        1. [Ignoring a Secret Value](#ignoring-a-secret-value)
        2. [Ignoring a Secret SHA Value](#ignoring-a-secret-sha-value)
        3. [Ignoring a Path](#ignoring-a-path)
        4. [Ignoring a Secret, IaC, SCA, or SAST Rule](#ignoring-a-secret-iac-sca-or-sast-rule)
        5. [Ignoring a Package](#ignoring-a-package)
        6. [Ignoring via a config file](#ignoring-via-a-config-file)
6. [Report command](#report-command)
    1. [Generating SBOM Report](#generating-sbom-report)
7. [Import command](#import-command)
8. [Scan logs](#scan-logs)
9. [Syntax Help](#syntax-help)
# Prerequisites
- The Cycode CLI application requires Python version 3.9 or later. The MCP command is available only for Python 3.10 and above. If you're using an earlier Python version, this command will not be available.
- Use the [`cycode auth` command](#using-the-auth-command) to authenticate to Cycode with the CLI
- Alternatively, you can get a Cycode Client ID and Client Secret Key by following the steps detailed in the [Service Account Token](https://docs.cycode.com/docs/en/service-accounts) and [Personal Access Token](https://docs.cycode.com/v1/docs/managing-personal-access-tokens) pages, which contain details on getting these values.
# Installation
The following installation steps are applicable to both Windows and UNIX / Linux operating systems.
> [!NOTE]
> The following steps assume the use of `python3` and `pip3` for Python-related commands; however, some systems may instead use the `python` and `pip` commands, depending on your Python environment’s configuration.
## Install Cycode CLI
To install the Cycode CLI application on your local machine, perform the following steps:
1. Open your command line or terminal application.
2. Execute one of the following commands:
- To install from [PyPI](https://pypi.org/project/cycode/):
```bash
pip3 install cycode
```
- To install from [Homebrew](https://formulae.brew.sh/formula/cycode):
```bash
brew install cycode
```
- To install from [GitHub Releases](https://github.com/cycodehq/cycode-cli/releases), download the executable for your operating system and architecture, then run the following commands:
```bash
cd /path/to/downloaded/cycode-cli
chmod +x cycode
./cycode
```
3. Finally, authenticate the CLI. There are three methods to set the Cycode client ID and credentials (client secret or OIDC ID token):
- [cycode auth](#using-the-auth-command) (**Recommended**)
- [cycode configure](#using-the-configure-command)
- Add them to your [environment variables](#add-to-environment-variables)
### Using the Auth Command
> [!NOTE]
> This is the **recommended** method for setting up your local machine to authenticate with Cycode CLI.
1. Type the following command into your terminal/command line window:
`cycode auth`
2. A browser window will appear, asking you to log into Cycode (as seen below):
<img alt="Cycode login" height="300" src="https://raw.githubusercontent.com/cycodehq/cycode-cli/main/images/cycode_login.png"/>
3. Enter your login credentials on this page and log in.
4. You will eventually be taken to the page below, where you'll be asked to choose the business group you want to authorize Cycode with (if applicable):
<img alt="authorize CLI" height="450" src="https://raw.githubusercontent.com/cycodehq/cycode-cli/main/images/authorize_cli.png"/>
> [!NOTE]
> This will be the default method for authenticating with the Cycode CLI.
5. Click the **Allow** button to authorize the Cycode CLI on the selected business group.
<img alt="allow CLI" height="450" src="https://raw.githubusercontent.com/cycodehq/cycode-cli/main/images/allow_cli.png"/>
6. Once completed, you'll see the following screen if it was selected successfully:
<img alt="successfully auth" height="450" src="https://raw.githubusercontent.com/cycodehq/cycode-cli/main/images/successfully_auth.png"/>
7. In the terminal/command line screen, you will see the following when exiting the browser window:
`Successfully logged into cycode`
### Using the Configure Command
> [!NOTE]
> If you already set up your Cycode Client ID and Client Secret through the Linux or Windows environment variables, those credentials will take precedence over this method.
1. Type the following command into your terminal/command line window:
```bash
cycode configure
```
2. Enter your Cycode API URL value (leave blank to use the default value).
`Cycode API URL [https://api.cycode.com]: https://api.onpremise.com`
3. Enter your Cycode APP URL value (leave blank to use the default value).
`Cycode APP URL [https://app.cycode.com]: https://app.onpremise.com`
4. Enter your Cycode Client ID value.
`Cycode Client ID []: 7fe5346b-xxxx-xxxx-xxxx-55157625c72d`
5. Enter your Cycode Client Secret value (skip if you plan to use an OIDC ID token).
`Cycode Client Secret []: c1e24929-xxxx-xxxx-xxxx-8b08c1839a2e`
6. Enter your Cycode OIDC ID Token value (optional).
`Cycode ID Token []: eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9...`
7. If the values were entered successfully, you'll see the following message:
`Successfully configured CLI credentials!`
and/or
`Successfully configured Cycode URLs!`
If you go into the `.cycode` folder under your user folder, you'll find these credentials were created and placed in the `credentials.yaml` file in that folder.
The URLs were placed in the `config.yaml` file in that folder.
### Add to Environment Variables
#### On Unix/Linux:
```bash
export CYCODE_CLIENT_ID={your Cycode ID}
```
and
```bash
export CYCODE_CLIENT_SECRET={your Cycode Secret Key}
```
If your organization uses OIDC authentication, you can provide the ID token instead (or in addition):
```bash
export CYCODE_ID_TOKEN={your Cycode OIDC ID token}
```
#### On Windows
1. From the Control Panel, navigate to the System menu:
<img height="30" src="https://raw.githubusercontent.com/cycodehq/cycode-cli/main/images/image1.png" alt="system menu"/>
2. Next, click Advanced system settings:
<img height="30" src="https://raw.githubusercontent.com/cycodehq/cycode-cli/main/images/image2.png" alt="advanced system setting"/>
3. In the System Properties window that opens, click the Environment Variables button:
<img height="30" src="https://raw.githubusercontent.com/cycodehq/cycode-cli/main/images/image3.png" alt="environments variables button"/>
4. Create `CYCODE_CLIENT_ID` and `CYCODE_CLIENT_SECRET` variables with values matching your ID and Secret Key, respectively. If you authenticate via OIDC, add `CYCODE_ID_TOKEN` with your OIDC ID token value as well:
<img height="100" src="https://raw.githubusercontent.com/cycodehq/cycode-cli/main/images/image4.png" alt="environment variables window"/>
5. Add the directory containing `cycode.exe` to your `PATH` to complete the installation.
## Install Pre-Commit Hook
Cycode's pre-commit and pre-push hooks can be set up within your local repository so that the Cycode CLI application will identify any issues with your code automatically before you commit or push it to your codebase.
> [!NOTE]
> pre-commit and pre-push hooks are not available for IaC scans.
Perform the following steps to install the pre-commit hook:
### Installing Pre-Commit Hook
1. Install the pre-commit framework (Python 3.9 or higher must be installed):
```bash
pip3 install pre-commit
```
2. Navigate to the top directory of the local Git repository you wish to configure.
3. Create a new YAML file named `.pre-commit-config.yaml` (include the beginning `.`) in the repository’s top directory that contains the following:
```yaml
repos:
  - repo: https://github.com/cycodehq/cycode-cli
    rev: v3.5.0
    hooks:
      - id: cycode
        stages: [pre-commit]
```
4. Modify the created file for your specific needs. Use hook ID `cycode` to enable scan for Secrets. Use hook ID `cycode-sca` to enable SCA scan. Use hook ID `cycode-sast` to enable SAST scan. If you want to enable all scanning types, use this configuration:
```yaml
repos:
  - repo: https://github.com/cycodehq/cycode-cli
    rev: v3.5.0
    hooks:
      - id: cycode
        stages: [pre-commit]
      - id: cycode-sca
        stages: [pre-commit]
      - id: cycode-sast
        stages: [pre-commit]
```
5. Install Cycode’s hook:
```bash
pre-commit install
```
A successful hook installation will result in the message: `Pre-commit installed at .git/hooks/pre-commit`.
6. Keep the pre-commit hook up to date:
```bash
pre-commit autoupdate
```
It will automatically bump `rev` in `.pre-commit-config.yaml` to the latest available version of Cycode CLI.
> [!NOTE]
> The hook triggers on the `git commit` command and scans only the files that are staged for commit.
### Installing Pre-Push Hook
To install the pre-push hook in addition to or instead of the pre-commit hook:
1. Add the pre-push hooks to your `.pre-commit-config.yaml` file:
```yaml
repos:
  - repo: https://github.com/cycodehq/cycode-cli
    rev: v3.5.0
    hooks:
      - id: cycode-pre-push
        stages: [pre-push]
```
2. Install the pre-push hook:
```bash
pre-commit install --hook-type pre-push
```
3. For both pre-commit and pre-push hooks, use:
```bash
pre-commit install
pre-commit install --hook-type pre-push
```
> [!NOTE]
> Pre-push hooks trigger on `git push` command and scan only the commits about to be pushed.
# Cycode CLI Commands
The following are the options and commands available with the Cycode CLI application:
| Option | Description |
|-------------------------------------------------------------------|------------------------------------------------------------------------------------|
| `-v`, `--verbose` | Show detailed logs. |
| `--no-progress-meter` | Do not show the progress meter. |
| `--no-update-notifier` | Do not check CLI for updates. |
| `-o`, `--output [rich\|text\|json\|table]` | Specify the output type. The default is `rich`. |
| `--client-id TEXT` | Specify a Cycode client ID for this specific scan execution. |
| `--client-secret TEXT` | Specify a Cycode client secret for this specific scan execution. |
| `--id-token TEXT` | Specify a Cycode OIDC ID token for this specific scan execution. |
| `--install-completion` | Install completion for the current shell. |
| `--show-completion [bash\|zsh\|fish\|powershell\|pwsh]` | Show completion for the specified shell, to copy it or customize the installation. |
| `-h`, `--help` | Show options for given command. |
| Command | Description |
|-------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|
| [auth](#using-the-auth-command) | Authenticate your machine to associate the CLI with your Cycode account. |
| [configure](#using-the-configure-command) | Initial command to configure your CLI client authentication. |
| [ignore](#ignoring-scan-results) | Ignore a specific value, path or rule ID. |
| [mcp](#mcp-command-experiment) | Start the Model Context Protocol (MCP) server to enable AI integration with Cycode scanning capabilities. |
| [scan](#running-a-scan) | Scan the content for Secrets/IaC/SCA/SAST violations. You'll need to specify which scan type to perform: commit-history/path/repository/etc. |
| [report](#report-command) | Generate a report. You'll need to specify which report type to generate, such as SBOM. |
| status | Show the CLI status and exit. |
# MCP Command \[EXPERIMENT\]
> [!WARNING]
> The MCP command is available only for Python 3.10 and above. If you're using an earlier Python version, this command will not be available.
The Model Context Protocol (MCP) command allows you to start an MCP server that exposes Cycode's scanning capabilities to AI systems and applications. This enables AI models to interact with Cycode CLI tools via a standardized protocol.
> [!TIP]
> For the best experience, install Cycode CLI globally on your system using `pip install cycode` or `brew install cycode`, then authenticate once with `cycode auth`. After global installation and authentication, you won't need to configure `CYCODE_CLIENT_ID` and `CYCODE_CLIENT_SECRET` environment variables in your MCP configuration files.
[](https://cursor.com/en/install-mcp?name=cycode&config=eyJjb21tYW5kIjoidXZ4IGN5Y29kZSBtY3AiLCJlbnYiOnsiQ1lDT0RFX0NMSUVOVF9JRCI6InlvdXItY3ljb2RlLWlkIiwiQ1lDT0RFX0NMSUVOVF9TRUNSRVQiOiJ5b3VyLWN5Y29kZS1zZWNyZXQta2V5IiwiQ1lDT0RFX0FQSV9VUkwiOiJodHRwczovL2FwaS5jeWNvZGUuY29tIiwiQ1lDT0RFX0FQUF9VUkwiOiJodHRwczovL2FwcC5jeWNvZGUuY29tIn19)
## Starting the MCP Server
To start the MCP server, use the following command:
```bash
cycode mcp
```
By default, this starts the server using the `stdio` transport, which is suitable for local integrations and AI applications that can spawn subprocesses.
### Available Options
| Option | Description |
|-------------------|--------------------------------------------------------------------------------------------|
| `-t, --transport` | Transport type for the MCP server: `stdio`, `sse`, or `streamable-http` (default: `stdio`) |
| `-H, --host` | Host address to bind the server (used only for non-stdio transports) (default: `127.0.0.1`) |
| `-p, --port` | Port number to bind the server (used only for non-stdio transports) (default: `8000`) |
| `--help` | Show help message and available options |
### MCP Tools
The MCP server provides the following tools that AI systems can use:
| Tool Name | Description |
|----------------------|---------------------------------------------------------------------------------------------|
| `cycode_secret_scan` | Scan files for hardcoded secrets |
| `cycode_sca_scan` | Scan files for Software Composition Analysis (SCA) - vulnerabilities and license issues |
| `cycode_iac_scan` | Scan files for Infrastructure as Code (IaC) misconfigurations |
| `cycode_sast_scan` | Scan files for Static Application Security Testing (SAST) - code quality and security flaws |
| `cycode_status` | Get Cycode CLI version, authentication status, and configuration information |
### Usage Examples
#### Basic Command Examples
Start the MCP server with default settings (stdio transport):
```bash
cycode mcp
```
Start the MCP server with explicit stdio transport:
```bash
cycode mcp -t stdio
```
Start the MCP server with Server-Sent Events (SSE) transport:
```bash
cycode mcp -t sse -p 8080
```
Start the MCP server with streamable HTTP transport on custom host and port:
```bash
cycode mcp -t streamable-http -H 0.0.0.0 -p 9000
```
Learn more about MCP Transport types in the [MCP Protocol Specification – Transports](https://modelcontextprotocol.io/specification/2025-03-26/basic/transports).
#### Configuration Examples
##### Using MCP with Cursor/VS Code/Claude Desktop/etc (mcp.json)
> [!NOTE]
> For EU Cycode environments, make sure to set the appropriate `CYCODE_API_URL` and `CYCODE_APP_URL` values in the environment variables (e.g., `https://api.eu.cycode.com` and `https://app.eu.cycode.com`).
Follow [this guide](https://code.visualstudio.com/docs/copilot/chat/mcp-servers) to configure the MCP server in your **VS Code/GitHub Copilot**. Keep in mind that in `settings.json`, there is an `mcp` object containing a nested `servers` sub-object, rather than a standalone `mcpServers` object.
For **stdio transport** (direct execution):
```json
{
"mcpServers": {
"cycode": {
"command": "cycode",
"args": ["mcp"],
"env": {
"CYCODE_CLIENT_ID": "your-cycode-id",
"CYCODE_CLIENT_SECRET": "your-cycode-secret-key",
"CYCODE_API_URL": "https://api.cycode.com",
"CYCODE_APP_URL": "https://app.cycode.com"
}
}
}
}
```
For **stdio transport** with `pipx` installation:
```json
{
"mcpServers": {
"cycode": {
"command": "pipx",
"args": ["run", "cycode", "mcp"],
"env": {
"CYCODE_CLIENT_ID": "your-cycode-id",
"CYCODE_CLIENT_SECRET": "your-cycode-secret-key",
"CYCODE_API_URL": "https://api.cycode.com",
"CYCODE_APP_URL": "https://app.cycode.com"
}
}
}
}
```
For **stdio transport** with `uvx` installation:
```json
{
"mcpServers": {
"cycode": {
"command": "uvx",
"args": ["cycode", "mcp"],
"env": {
"CYCODE_CLIENT_ID": "your-cycode-id",
"CYCODE_CLIENT_SECRET": "your-cycode-secret-key",
"CYCODE_API_URL": "https://api.cycode.com",
"CYCODE_APP_URL": "https://app.cycode.com"
}
}
}
}
```
For **SSE transport** (Server-Sent Events):
```json
{
"mcpServers": {
"cycode": {
"url": "http://127.0.0.1:8000/sse"
}
}
}
```
For **SSE transport** on custom port:
```json
{
"mcpServers": {
"cycode": {
"url": "http://127.0.0.1:8080/sse"
}
}
}
```
For **streamable HTTP transport**:
```json
{
"mcpServers": {
"cycode": {
"url": "http://127.0.0.1:8000/mcp"
}
}
}
```
##### Running MCP Server in Background
For **SSE transport** (start server first, then configure client):
```bash
# Start the MCP server in the background
cycode mcp -t sse -p 8000 &
# Configure in mcp.json
{
"mcpServers": {
"cycode": {
"url": "http://127.0.0.1:8000/sse"
}
}
}
```
For **streamable HTTP transport**:
```bash
# Start the MCP server in the background
cycode mcp -t streamable-http -H 127.0.0.2 -p 9000 &
# Configure in mcp.json
{
"mcpServers": {
"cycode": {
"url": "http://127.0.0.2:9000/mcp"
}
}
}
```
> [!NOTE]
> The MCP server requires proper Cycode CLI authentication to function. Make sure you have authenticated using `cycode auth` or configured your credentials before starting the MCP server.
### Troubleshooting MCP
If you encounter issues with the MCP server, you can enable debug logging to get more detailed information about what's happening. There are two ways to enable debug logging:
1. Using the `-v` or `--verbose` flag:
```bash
cycode -v mcp
```
2. Using the `CYCODE_CLI_VERBOSE` environment variable:
```bash
CYCODE_CLI_VERBOSE=1 cycode mcp
```
The debug logs will show detailed information about:
- Server startup and configuration
- Connection attempts and status
- Tool execution and results
- Any errors or warnings that occur
This information can be helpful when:
- Diagnosing connection issues
- Understanding why certain tools aren't working
- Identifying authentication problems
- Debugging transport-specific issues
# Scan Command
## Running a Scan
The Cycode CLI application offers several types of scans so that you can choose the option that best fits your case. The following are the current options and commands available:
| Option | Description |
|------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|
| `-t, --scan-type [secret\|iac\|sca\|sast]` | Specify the scan you wish to execute (`secret`/`iac`/`sca`/`sast`), the default is `secret`. |
| `--show-secret BOOLEAN` | Show secrets in plain text. See [Show/Hide Secrets](#showhide-secrets) section for more details. |
| `--soft-fail BOOLEAN` | Run scan without failing, always return a non-error status code. See [Soft Fail](#soft-fail) section for more details. |
| `--severity-threshold [INFO\|LOW\|MEDIUM\|HIGH\|CRITICAL]` | Show only violations at the specified level or higher. |
| `--sca-scan` | Specify the SCA scan you wish to execute (`package-vulnerabilities`/`license-compliance`). The default is both. |
| `--monitor` | When specified, the scan results will be recorded in Cycode. |
| `--cycode-report` | Display a link to the scan report in the Cycode platform in the console output. |
| `--no-restore` | When specified, Cycode will not run the restore command. This will scan direct dependencies ONLY! |
| `--gradle-all-sub-projects` | Run the Gradle restore command for all sub-projects. This should be run from the project root. |
| `--maven-settings-file` | For Maven only, allows using a custom [settings.xml](https://maven.apache.org/settings.html) file when scanning for dependencies |
| `--help` | Show options for given command. |
| Command | Description |
|----------------------------------------|-----------------------------------------------------------------------|
| [commit-history](#commit-history-scan) | Scan commit history or perform diff scanning between specific commits |
| [path](#path-scan) | Scan the files in the path supplied in the command |
| [pre-commit](#pre-commit-scan) | Scan content that has not been committed yet |
| [repository](#repository-scan) | Scan git repository including its history |
### Options
#### Severity Option
To limit the results of the scan to a specific severity threshold, the argument `--severity-threshold` can be added to the scan command.
For example, the following command will scan the repository for policy violations that have severity of Medium or higher:
`cycode scan --severity-threshold MEDIUM repository ~/home/git/codebase`
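Conceptually, the threshold is an ordered filter over finding severities: a finding is shown only if its level is at or above the chosen threshold. A minimal sketch of that filtering logic (illustrative only, not Cycode's actual implementation; the finding entries are made up):

```python
# Severity levels in ascending order, matching --severity-threshold values.
LEVELS = ["INFO", "LOW", "MEDIUM", "HIGH", "CRITICAL"]

def meets_threshold(severity: str, threshold: str) -> bool:
    """Return True if a finding's severity is at or above the threshold."""
    return LEVELS.index(severity.upper()) >= LEVELS.index(threshold.upper())

# Hypothetical findings, filtered the way --severity-threshold MEDIUM would.
findings = [
    {"rule": "hardcoded-secret", "severity": "HIGH"},
    {"rule": "weak-cipher", "severity": "LOW"},
]
shown = [f["rule"] for f in findings if meets_threshold(f["severity"], "MEDIUM")]
print(shown)  # → ['hardcoded-secret']
```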
#### Monitor Option
> [!NOTE]
> This option is only available to SCA scans.
To push scan results tied to the [SCA policies](https://docs.cycode.com/docs/sca-policies) found in an SCA type scan to Cycode, add the argument `--monitor` to the scan command.
For example, the following command will scan the repository for SCA policy violations and push them to Cycode platform:
`cycode scan -t sca --monitor repository ~/home/git/codebase`
#### Cycode Report Option
For every scan performed using the Cycode CLI, a report is automatically generated and its results are sent to Cycode. These results are tied to the relevant policies (e.g., [SCA policies](https://docs.cycode.com/docs/sca-policies) for Repository scans) within the Cycode platform.
To have the direct URL to this Cycode report printed in your CLI output after the scan completes, add the argument `--cycode-report` to your scan command.
`cycode scan --cycode-report repository ~/home/git/codebase`
All scan results from the CLI will appear in the CLI Logs section of Cycode. If you included the `--cycode-report` flag in your command, a direct link to the specific report will be displayed in your terminal following the scan results.
> [!WARNING]
> You must have the `owner` or `admin` role in Cycode to view this page.

#### Package Vulnerabilities Option
> [!NOTE]
> This option is only available to SCA scans.
To scan a specific package vulnerability of your local repository, add the argument `--sca-scan package-vulnerabilities` following the `-t sca` or `--scan-type sca` option.
In the previous example, if you wanted to only run an SCA scan on package vulnerabilities, you could execute the following:
`cycode scan -t sca --sca-scan package-vulnerabilities repository ~/home/git/codebase`
#### License Compliance Option
> [!NOTE]
> This option is only available to SCA scans.
To scan only for license compliance issues, add the argument `--sca-scan license-compliance` following the `-t sca` or `--scan-type sca` option.
In the previous example, if you wanted to only run a license compliance scan, you could execute the following:
`cycode scan -t sca --sca-scan license-compliance repository ~/home/git/codebase`
#### Lock Restore Option
> [!NOTE]
> This option is only available to SCA scans.
We use the `sbt-dependency-lock` plugin to restore the lock file for SBT projects.
To disable lock restore, use the `--no-restore` option.
Prerequisites:
* `sbt-dependency-lock` plugin: Install the plugin by adding the following line to `project/plugins.sbt`:
```text
addSbtPlugin("software.purpledragon" % "sbt-dependency-lock" % "1.5.1")
```
### Repository Scan
A repository scan examines an entire local repository for any exposed secrets or insecure misconfigurations. This more holistic scan type looks at everything: the current state of your repository and its commit history. It will look not only for secrets that are currently exposed within the repository but previously deleted secrets as well.
To execute a full repository scan, execute the following:
`cycode scan repository {{path}}`
For example, if you wanted to scan a repository stored in `~/home/git/codebase`, you could execute the following:
`cycode scan repository ~/home/git/codebase`
The following option is available for use with this command:
| Option | Description |
|---------------------|--------------------------------------------------------|
| `-b, --branch TEXT` | Branch to scan; if not set, the default branch is scanned |
#### Branch Option
To scan a specific branch of your local repository, add the argument `-b` (alternatively, `--branch`) followed by the name of the branch you wish to scan.
Given the previous example, if you wanted to only scan a branch named `dev`, you could execute the following:
`cycode scan repository ~/home/git/codebase -b dev`
### Path Scan
A path scan examines a specific local directory and all the contents within it, instead of focusing solely on a Git repository.
To execute a directory scan, execute the following:
`cycode scan path {{path}}`
For example, consider a scenario in which you want to scan the directory located at `~/home/git/codebase`. You could then execute the following:
`cycode scan path ~/home/git/codebase`
#### Terraform Plan Scan
Cycode CLI supports Terraform plan scanning (Terraform 0.12 and later).
The Terraform plan file must be in JSON format (with a `.json` extension).
If you just have a configuration file, you can generate a plan by doing the following:
1. Initialize a working directory that contains the Terraform configuration file:
`terraform init`
2. Create a Terraform execution plan and save the binary output:
`terraform plan -out={tfplan_output}`
3. Convert the binary output file into readable JSON:
`terraform show -json {tfplan_output} > {tfplan}.json`
4. Scan your `{tfplan}.json` with Cycode CLI:
`cycode scan -t iac path ~/PATH/TO/YOUR/{tfplan}.json`
### Commit History Scan
> [!NOTE]
> Commit History Scan is not available for IaC scans.
The commit history scan command provides two main capabilities:
1. **Full History Scanning**: Analyze all commits in the repository history
2. **Diff Scanning**: Scan only the changes between specific commits
Secrets scanning can analyze all commits in the repository history because secrets introduced and later removed can still be leaked or exposed. For SCA and SAST scans, the commit history command focuses on scanning the differences/changes between commits, making it perfect for pull request reviews and incremental scanning.
A commit history scan examines your Git repository's commit history and can be used both for comprehensive historical analysis and targeted diff scanning of specific changes.
To execute a commit history scan, execute the following:
`cycode scan commit-history {{path}}`
For example, consider a scenario in which you want to scan the commit history for a repository stored in `~/home/git/codebase`. You could then execute the following:
`cycode scan commit-history ~/home/git/codebase`
The following options are available for use with this command:
| Option | Description |
|---------------------------|----------------------------------------------------------------------------------------------------------|
| `-r, --commit-range TEXT` | Scan a commit range in this Git repository; by default, Cycode scans the full commit history (example: HEAD~1) |
#### Commit Range Option (Diff Scanning)
The commit range option enables **diff scanning** – scanning only the changes between specific commits instead of the entire repository history.
This is particularly useful for:
- **Pull request validation**: Scan only the changes introduced in a PR
- **Incremental CI/CD scanning**: Focus on recent changes rather than the entire codebase
- **Feature branch review**: Compare changes against main/master branch
- **Performance optimization**: Faster scans by limiting scope to relevant changes
#### Commit Range Syntax
The `--commit-range` (`-r`) option supports standard Git revision syntax:
| Syntax | Description | Example |
|---------------------|-----------------------------------|-------------------------|
| `commit1..commit2` | Changes from commit1 to commit2 | `abc123..def456` |
| `commit1...commit2` | Changes in commit2 not in commit1 | `main...feature-branch` |
| `commit` | Changes from commit to HEAD | `HEAD~1` |
| `branch1..branch2` | Changes from branch1 to branch2 | `main..feature-branch` |
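When scripting, the range is just a string handed to `-r`. A small helper that assembles the diff-scan invocation (a hypothetical convenience function for illustration, not part of the CLI):

```python
def build_diff_scan_command(repo_path: str, base: str, head: str = "HEAD") -> list[str]:
    """Assemble a cycode commit-history diff scan command line."""
    return ["cycode", "scan", "commit-history", "-r", f"{base}..{head}", repo_path]

# Scan only the changes a feature branch adds on top of main:
print(" ".join(build_diff_scan_command("~/home/git/codebase", "main")))
# → cycode scan commit-history -r main..HEAD ~/home/git/codebase
```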
#### Diff Scanning Examples
**Scan changes in the last commit:**
```bash
cycode scan commit-history -r HEAD~1 ~/home/git/codebase
```
**Scan changes between two specific commits:**
```bash
cycode scan commit-history -r abc123..def456 ~/home/git/codebase
```
**Scan changes in your feature branch compared to main:**
```bash
cycode scan commit-history -r main..HEAD ~/home/git/codebase
```
**Scan changes between main and a feature branch:**
```bash
cycode scan commit-history -r main..feature-branch ~/home/git/codebase
```
**Scan all changes in the last 3 commits:**
```bash
cycode scan commit-history -r HEAD~3..HEAD ~/home/git/codebase
```
> [!TIP]
> For CI/CD pipelines, you can use environment variables like `${{ github.event.pull_request.base.sha }}..${{ github.sha }}` (GitHub Actions) or `$CI_MERGE_REQUEST_TARGET_BRANCH_SHA..$CI_COMMIT_SHA` (GitLab CI) to scan only PR/MR changes.
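As a concrete illustration of the tip above, a GitHub Actions job for PR diff scanning might look like the following (the workflow layout and pip-based install are assumptions, not an official Cycode template; credentials come from repository secrets):

```yaml
name: cycode-diff-scan
on: pull_request
jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # full history so the commit range can be resolved
      - run: pip install cycode
      - run: cycode scan commit-history -r ${{ github.event.pull_request.base.sha }}..${{ github.sha }} .
        env:
          CYCODE_CLIENT_ID: ${{ secrets.CYCODE_CLIENT_ID }}
          CYCODE_CLIENT_SECRET: ${{ secrets.CYCODE_CLIENT_SECRET }}
```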
### Pre-Commit Scan
A pre-commit scan automatically identifies any issues before you commit changes to your repository. There is no need to manually execute this scan; configure the pre-commit hook as detailed under the Installation section of this guide.
After installing the pre-commit hook, you may occasionally wish to skip scanning during a specific commit. To do this, add the following to your `git` command to skip scanning for a single commit:
```bash
SKIP=cycode git commit -m "<your commit message>"
```
### Pre-Push Scan
A pre-push scan automatically identifies any issues before you push changes to the remote repository. This hook runs on the client side and scans only the commits that are about to be pushed, making it efficient for catching issues before they reach the remote repository.
> [!NOTE]
> Pre-push hook is not available for IaC scans.
The pre-push hook integrates with the pre-commit framework and can be configured to run before any `git push` operation.
#### Installing Pre-Push Hook
To set up the pre-push hook using the pre-commit framework:
1. Install the pre-commit framework (if not already installed):
```bash
pip3 install pre-commit
```
2. Create or update your `.pre-commit-config.yaml` file to include the pre-push hooks:
```yaml
repos:
- repo: https://github.com/cycodehq/cycode-cli
rev: v3.5.0
hooks:
- id: cycode-pre-push
stages: [pre-push]
```
3. For multiple scan types, use this configuration:
```yaml
repos:
- repo: https://github.com/cycodehq/cycode-cli
rev: v3.5.0
hooks:
- id: cycode-pre-push # Secrets scan
stages: [pre-push]
- id: cycode-sca-pre-push # SCA scan
stages: [pre-push]
- id: cycode-sast-pre-push # SAST scan
stages: [pre-push]
```
4. Install the pre-push hook:
```bash
pre-commit install --hook-type pre-push
```
A successful installation will result in the message: `Pre-push installed at .git/hooks/pre-push`.
5. Keep the pre-push hook up to date:
```bash
pre-commit autoupdate
```
#### How Pre-Push Scanning Works
The pre-push hook:
- Receives information about what commits are being pushed
- Calculates the appropriate commit range to scan
- For new branches: scans all commits from the merge base with the default branch
- For existing branches: scans only the new commits since the last push
- Runs the same comprehensive scanning as other Cycode scan modes
#### Smart Default Branch Detection
The pre-push hook intelligently detects the default branch for merge base calculation using this priority order:
1. **Environment Variable**: `CYCODE_DEFAULT_BRANCH` - allows manual override
2. **Git Remote HEAD**: Uses `git symbolic-ref refs/remotes/origin/HEAD` to detect the actual remote default branch
3. **Git Remote Info**: Falls back to `git remote show origin` if symbolic-ref fails
4. **Hardcoded Fallbacks**: Uses common default branch names (origin/main, origin/master, main, master)
**Setting a Custom Default Branch:**
```bash
export CYCODE_DEFAULT_BRANCH=origin/develop
```
This smart detection ensures the pre-push hook works correctly regardless of whether your repository uses `main`, `master`, `develop`, or any other default branch name.
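The priority order above can be sketched roughly as follows (illustrative only, not the CLI's actual code; the `git remote show origin` fallback is omitted for brevity):

```python
import os
import subprocess

# Hardcoded fallbacks, in the order listed above.
FALLBACKS = ["origin/main", "origin/master", "main", "master"]

def detect_default_branch(env=os.environ) -> str:
    # 1. Manual override via environment variable.
    if env.get("CYCODE_DEFAULT_BRANCH"):
        return env["CYCODE_DEFAULT_BRANCH"]
    # 2. Remote HEAD, e.g. "origin/main".
    try:
        result = subprocess.run(
            ["git", "symbolic-ref", "--short", "refs/remotes/origin/HEAD"],
            capture_output=True, text=True, check=True,
        )
        if result.stdout.strip():
            return result.stdout.strip()
    except (OSError, subprocess.CalledProcessError):
        pass
    # 3. Common default branch names as a last resort.
    return FALLBACKS[0]

print(detect_default_branch({"CYCODE_DEFAULT_BRANCH": "origin/develop"}))  # → origin/develop
```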
#### Skipping Pre-Push Scans
To skip the pre-push scan for a specific push operation, use:
```bash
SKIP=cycode-pre-push git push
```
Or to skip all pre-push hooks:
```bash
git push --no-verify
```
> [!TIP]
> The pre-push hook is triggered on `git push` command and scans only the commits that are about to be pushed, making it more efficient than scanning the entire repository.
## Exclude Paths From Scans
You can use a `.cycodeignore` file to tell the Cycode CLI which files and directories to exclude from scans.
It works just like a `.gitignore` file. This helps you focus scans on your relevant code and prevent certain paths from triggering violations locally.
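To get an intuition for how pattern matching of this kind behaves, here is a rough Python approximation using `fnmatch` (real `.gitignore`-style semantics are richer, with negation, anchoring, and `**`, so treat this purely as a sketch; the patterns are examples, not defaults):

```python
from fnmatch import fnmatch

def is_ignored(path: str, patterns: list[str]) -> bool:
    """Rough gitignore-style check: a path is ignored if it matches a
    pattern or lives under an ignored directory."""
    for pat in patterns:
        dir_pat = pat.rstrip("/")
        if fnmatch(path, pat) or fnmatch(path, dir_pat + "/*") or path.startswith(dir_pat + "/"):
            return True
    return False

patterns = ["node_modules/", "build/", "*.test.js"]
print(is_ignored("node_modules/lodash/index.js", patterns))  # → True
print(is_ignored("src/app.test.js", patterns))               # → True
print(is_ignored("src/app.js", patterns))                    # → False
```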
### How It Works
1. Create a | text/markdown | Cycode | support@cycode.com | null | null | null | secret-scan, cycode, devops, token, secret, security, code | [
"Development Status :: 5 - Production/Stable",
"Environment :: Console",
"Natural Language :: English",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.1... | [] | null | null | >=3.9 | [] | [] | [] | [
"arrow<1.4.0,>=1.0.0",
"binaryornot<0.5.0,>=0.4.4",
"click<8.2.0,>=8.1.0",
"colorama<0.5.0,>=0.4.3",
"gitpython<3.2.0,>=3.1.30",
"marshmallow<4.0.0,>=3.15.0",
"mcp<2.0.0,>=1.9.3; python_version >= \"3.10\"",
"patch-ng==1.18.1",
"pathvalidate<4.0.0,>=3.3.1",
"pydantic<3.0.0,>=2.11.5",
"pyjwt<3.0,... | [] | [] | [] | [
"Repository, https://github.com/cycodehq/cycode-cli"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:41:31.808135 | cycode-3.10.1.tar.gz | 154,130 | fc/f4/6ad9db6ecc718309b8bec751389e57fe4f5d8935b334b9a171d912c24b3c/cycode-3.10.1.tar.gz | source | sdist | null | false | 369ab0e51ddc9412fe73600c6006e23b | fe3ca5f83b3abd425e7af91caf51b3ba45705f6f3da371802e68876e3e9d9d95 | fcf46ad9db6ecc718309b8bec751389e57fe4f5d8935b334b9a171d912c24b3c | MIT | [
"LICENCE"
] | 5,207 |
2.4 | MaRDMO | 0.4.2 | RDMO Plugin to document and query mathematical research data using the MaRDI infrastructure. | <div align='center' style="margin-top: 50px; font-size: 14px; color: grey;">
<img src="https://github.com/user-attachments/assets/98c92c58-9d31-41ca-a3ca-189bbfb92101" />
<p>MaRDMO Logo by <a href="https://www.mardi4nfdi.de/about/mission" target="_blank" style="color: grey;">MaRDI</a>, licensed under <a href="https://creativecommons.org/licenses/by-nc-nd/4.0/" target="_blank" style="color: grey;">CC BY-NC-ND 4.0</a>.</p>
</div>
# MaRDMO Plugin
This repository contains the MaRDMO Plugin for the [Research Datamanagement Organizer](https://rdmorganiser.github.io/) (RDMO) developed within the [Mathematical Research Data Initiative](https://www.mardi4nfdi.de/about/mission) (MaRDI).
The plugin allows a standardized documentation of:
<ol>
<li>Mathematical Models</li>
<li>Interdisciplinary Workflows</li>
<li>Algorithms</li>
</ol>
Model documentation in MaRDMO is based on the [MathModDB ontology](https://portal.mardi4nfdi.de/wiki/MathModDB). Within the plugin, users can record a model, related expressions, computational tasks, quantities or quantity kinds, research problems, academic disciplines, and publications. These inputs are gathered in a guided interview, enabling MaRDMO to produce metadata that is directly compatible with the MaRDI knowledge graph for mathematical models. A demo video showing the documentation process for a mathematical model in MaRDMO is available [here](https://www.youtube.com/watch?v=UmbBNUZJ994&list=PLgoPZ7uPWbo-jqDXzx4fSm_4JyAYEMPjn).
Workflow documentation follows a [standardized scheme](https://portal.mardi4nfdi.de/wiki/MD_UseCases) developed in MaRDI. Within the plugin, users can record a workflow, related models, methods, software, hardware, experimental devices, data sets, and publications. These inputs are gathered in a guided interview, enabling MaRDMO to produce metadata that is directly compatible with the MaRDI knowledge graph for interdisciplinary workflows.
Algorithm documentation in MaRDMO follows the [MathAlgoDB ontology](https://portal.mardi4nfdi.de/wiki/Service:6534228). Within the plugin, users can record an algorithm, related algorithmic tasks, implementing software, benchmarks, and publications. These inputs are gathered in a guided interview, enabling MaRDMO to produce metadata that is directly compatible with the MaRDI knowledge graph for algorithms.
<div align="center" style="margin-top: 20px; font-size: 14px; color: grey;">
<img src="https://github.com/user-attachments/assets/fb22ea44-8648-44e8-8a1e-51f31564d23c" width="800" />
<p><em>Figure 1: MaRDMO Data Model containing Classes and Relations for the Documentation of Mathematical Models (blue), Algorithms (red), and Interdisciplinary Workflows (green).</em></p>
</div>
Completed documentations in MaRDMO can be exported directly from RDMO to the respective MaRDI knowledge graph via the **MaRDMO Button**. This feature generates a concise summary of the documented model, algorithm, or workflow, and—after user authentication—submits the metadata to the corresponding knowledge graphs. This streamlines the publication process and ensures the documentation becomes immediately discoverable within the MaRDI ecosystem.
In addition to documentation, MaRDMO provides a dedicated interview for searching existing workflows, algorithms, and models. Users can specify individual search parameters, and the MaRDI Button will generate the corresponding SPARQL query based on the input. The query results are displayed directly in RDMO, enabling researchers to discover and reuse existing knowledge—thus closing the knowledge transfer loop within the MaRDI ecosystem.
## MaRDMO Plugin Installation
To use the MaRDMO Plugin at least `RDMO v2.4.0` is required. Follow the installation / update instructions of [RDMO](https://rdmo.readthedocs.io/en/latest/installation) if required.
Go to the `rdmo-app` directory of your RDMO installation. In the virtual environment of the RDMO installation install the MaRDMO Plugin:
```bash
pip install MaRDMO
```
To connect the MaRDMO Plugin with the RDMO installation add the following lines to `config/settings/local.py` (if not already present):
```python
from django.utils.translation import gettext_lazy as _
```
```python
INSTALLED_APPS = ['MaRDMO'] + INSTALLED_APPS
PROJECT_EXPORTS += [
('wikibase', _('Export to MaRDI Portal'), 'MaRDMO.main.MaRDMOExportProvider'),
]
OPTIONSET_PROVIDERS = [
# Search
('MaRDISearch', _('Options for MaRDI Search'), 'MaRDMO.search.providers.MaRDISearch'),
('SoftwareSearch', _('Options for Software Search'), 'MaRDMO.search.providers.SoftwareSearch'),
# Workflow
('MaRDIAndWikidataSearch', _('Options for MaRDI and Wikidata Search'), 'MaRDMO.workflow.providers.MaRDIAndWikidataSearch'),
('MainMathematicalModel', _('Options for Main Mathematical Model'), 'MaRDMO.workflow.providers.MainMathematicalModel'),
('WorkflowTask', _('Options for Workflow Task'), 'MaRDMO.workflow.providers.WorkflowTask'),
('SoftwareW', _('Options for Software (Workflow)'), 'MaRDMO.workflow.providers.Software'),
('Hardware', _('Options for Hardware'), 'MaRDMO.workflow.providers.Hardware'),
('Instrument', _('Options for Instruments'), 'MaRDMO.workflow.providers.Instrument'),
('DataSet', _('Options for Data Sets'), 'MaRDMO.workflow.providers.DataSet'),
('RelatedDataSet', _('Options for related Data Sets'), 'MaRDMO.workflow.providers.RelatedDataSet'),
('RelatedSoftware', _('Options for related Software'), 'MaRDMO.workflow.providers.RelatedSoftware'),
('RelatedInstrument', _('Options for related Instruments'), 'MaRDMO.workflow.providers.RelatedInstrument'),
('Method', _('Options for Methods'), 'MaRDMO.workflow.providers.Method'),
('RelatedMethod', _('Options for related Methods'), 'MaRDMO.workflow.providers.RelatedMethod'),
('ProcessStep', _('Options for Process Step'), 'MaRDMO.workflow.providers.ProcessStep'),
('Discipline', _('Options for Disciplines'), 'MaRDMO.workflow.providers.Discipline'),
# Model
('Formula', _('Options for Formulas'), 'MaRDMO.model.providers.Formula'),
('ResearchField', _('Options for Research Fields'), 'MaRDMO.model.providers.ResearchField'),
('RelatedResearchFieldWithCreation', _('Options for related Research Fields with Creation'), 'MaRDMO.model.providers.RelatedResearchFieldWithCreation'),
('RelatedResearchFieldWithoutCreation', _('Options for related Research Fields without Creation'), 'MaRDMO.model.providers.RelatedResearchFieldWithoutCreation'),
('ResearchProblem', _('Options for Research Problems'), 'MaRDMO.model.providers.ResearchProblem'),
('RelatedResearchProblemWithCreation', _('Options for related Research Problems with Creation'), 'MaRDMO.model.providers.RelatedResearchProblemWithCreation'),
('RelatedResearchProblemWithoutCreation', _('Options for related Research Problems without Creation'), 'MaRDMO.model.providers.RelatedResearchProblemWithoutCreation'),
('MathematicalModel', _('Options for Mathematical Model'), 'MaRDMO.model.providers.MathematicalModel'),
('RelatedMathematicalModelWithoutCreation', _('Options for related Mathematical Model without Creation'), 'MaRDMO.model.providers.RelatedMathematicalModelWithoutCreation'),
('QuantityOrQuantityKind', _('Options for Quantities and Quantity Kinds'), 'MaRDMO.model.providers.QuantityOrQuantityKind'),
('RelatedQuantityWithoutCreation', _('Options for related Quantities without Creation'), 'MaRDMO.model.providers.RelatedQuantityWithoutCreation'),
('RelatedQuantityKindWithoutCreation', _('Options for related Quantity Kinds without Creation'), 'MaRDMO.model.providers.RelatedQuantityKindWithoutCreation'),
('RelatedQuantityOrQuantityKindWithCreation', _('Options for related Quantities or Quantity Kinds with Creation'), 'MaRDMO.model.providers.RelatedQuantityOrQuantityKindWithCreation'),
('MathematicalFormulation', _('Options for Mathematical Formulation'), 'MaRDMO.model.providers.MathematicalFormulation'),
('RelatedMathematicalFormulationWithCreation', _('Options for related Mathematical Formulations with Creation'), 'MaRDMO.model.providers.RelatedMathematicalFormulationWithCreation'),
('RelatedMathematicalFormulationWithoutCreation', _('Options for related Mathematical Formulations without Creation'), 'MaRDMO.model.providers.RelatedMathematicalFormulationWithoutCreation'),
('AllEntities', _('Options for All Entities'), 'MaRDMO.model.providers.AllEntities'),
('Task', _('Options for Task'), 'MaRDMO.model.providers.Task'),
('RelatedTaskWithCreation', _('Options for related Tasks with Creation'), 'MaRDMO.model.providers.RelatedTaskWithCreation'),
('RelatedTaskWithoutCreation', _('Options for related Tasks without Creation'), 'MaRDMO.model.providers.RelatedTaskWithoutCreation'),
('RelatedModelEntityWithoutCreation', _('Options for related Model Entities without Creation'), 'MaRDMO.model.providers.RelatedModelEntityWithoutCreation'),
# Publication
('Publication', _('Options for Publication'), 'MaRDMO.publication.providers.Publication'),
# Algorithm
('Algorithm', _('Options for Algorithms'), 'MaRDMO.algorithm.providers.Algorithm'),
('RelatedAlgorithmWithoutCreation', _('Options for related Algorithms without Creation'), 'MaRDMO.algorithm.providers.RelatedAlgorithmWithoutCreation'),
('AlgorithmicProblem', _('Options for Algorithmic Problems'), 'MaRDMO.algorithm.providers.AlgorithmicProblem'),
('RelatedAlgorithmicProblemWithCreation', _('Options for related Algorithmic Problems with Creation'), 'MaRDMO.algorithm.providers.RelatedAlgorithmicProblemWithCreation'),
('RelatedAlgorithmicProblemWithoutCreation', _('Options for related Algorithmic Problems without Creation'), 'MaRDMO.algorithm.providers.RelatedAlgorithmicProblemWithoutCreation'),
('SoftwareAL', _('Options for Software (Algorithm)'), 'MaRDMO.algorithm.providers.Software'),
('RelatedSoftwareALWithCreation', _('Options for related Software (Algorithm) with Creation'), 'MaRDMO.algorithm.providers.RelatedSoftwareWithCreation'),
('Benchmark', _('Options for Benchmarks'), 'MaRDMO.algorithm.providers.Benchmark'),
('RelatedBenchmarkWithCreation', _('Options for related Benchmarks with Creation'), 'MaRDMO.algorithm.providers.RelatedBenchmarkWithCreation')
]
```
In addition, add the following URL pattern to `config/urls.py`:
```python
path('services/', include("MaRDMO.urls")),
```
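For orientation, this line typically sits inside the `urlpatterns` list of `config/urls.py`; the surrounding entries in the sketch below are placeholders for whatever your RDMO instance already defines:

```python
from django.urls import include, path

urlpatterns = [
    # ... existing RDMO URL patterns ...
    path('services/', include("MaRDMO.urls")),
]
```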
With this, the MaRDMO Plugin is installed and a "MaRDI Button" is added to the project view.
## MaRDI Portal, MathAlgoDB, and MathModDB Connection
Add the following lines to `config/settings/local.py` to connect MaRDMO with the individual databases.
```python
MARDMO_PROVIDER = {
'mardi': {
'items': 'data/items.json',
'properties': 'data/properties.json',
'api': 'https://portal.mardi4nfdi.de/w/api.php',
'sparql': 'https://query.portal.mardi4nfdi.de/sparql',
'uri': 'https://portal.mardi4nfdi.de',
'oauth2_client_id': '',
'oauth2_client_secret': '',
},
'mathalgodb': {
'uri': 'https://cordi2025.m1.mardi.ovh/',
'sparql': 'https://sparql.cordi2025.m1.mardi.ovh/mathalgodb/query',
'update': 'https://sparql.cordi2025.m1.mardi.ovh/mathalgodb/update',
'mathalgodb_id': '',
'mathalgodb_secret': ''
},
'wikidata': {
'uri': 'https://www.wikidata.org',
'api': 'https://www.wikidata.org/w/api.php',
'sparql': 'https://query-main.wikidata.org/sparql'
},
}
```
Contact the MaRDI consortium for the individual credentials.
## MaRDMO-Questionnaire
The MaRDMO Plugin requires the [MaRDMO-Questionnaire](https://github.com/MarcoReidelbach/MaRDMO-Questionnaire), download its latest release [](https://github.com/MarcoReidelbach/MaRDMO-Questionnaire/releases/latest).
Integrate the MaRDMO Questionnaire into your RDMO instance through its user interface (`Management -> Import -> attributes.xml/optionsets.xml/conditions.xml/catalogs.xml`) or via
```bash
python manage.py import /path/to/MaRDMO-Questionnaire/catalog/attributes.xml
python manage.py import /path/to/MaRDMO-Questionnaire/catalog/optionsets.xml
python manage.py import /path/to/MaRDMO-Questionnaire/catalog/conditions.xml
python manage.py import /path/to/MaRDMO-Questionnaire/catalog/mardmo-search-catalog.xml
python manage.py import /path/to/MaRDMO-Questionnaire/catalog/mardmo-model-catalog.xml
python manage.py import /path/to/MaRDMO-Questionnaire/catalog/mardmo-interdisciplinary-workflow-catalog.xml
python manage.py import /path/to/MaRDMO-Questionnaire/catalog/mardmo-algorithm-catalog.xml
```
## Usage of MaRDMO Plugin
Once the MaRDMO Plugin is set up, the Questionnaires can be used to document and query interdisciplinary workflows, mathematical models, and algorithms. To do so, select "Create New Project" in RDMO, choose a proper project name (for interdisciplinary workflows, the project name will be the workflow name), assign one of the MaRDMO Catalogs, and select "Create Project". The project is created. Choose "Answer Questions" to start the Interview. Once the Interview is completed, return to the Project Page. On the right-hand side, in the "Export" section, the "Export to MaRDI Portal" button processes and subsequently exports the completed Questionnaires.
| text/markdown | null | Marco Reidelbach <reidelbach@zib.de> | null | Marco Reidelbach <reidelbach@zib.de> | Apache Software License | null | [
"Intended Audience :: Science/Research",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3.10"
] | [] | null | null | >=3.10 | [] | [] | [] | [
"rdmo>=2.4.0"
] | [] | [] | [] | [
"homepage, https://github.com/MarcoReidelbach/MaRDMO"
] | twine/6.2.0 CPython/3.10.19 | 2026-02-19T10:41:17.922714 | mardmo-0.4.2.tar.gz | 6,826,414 | 15/7f/cd20cff8b649ee7fd9fe2d11db58997461d8b1456e77a1e4719cfd35bfa1/mardmo-0.4.2.tar.gz | source | sdist | null | false | 8a5f37c675f223921437c93850572d24 | c23df0702e037e5ee10505776368db13a1623b7a87de0f0a53b7374683da1361 | 157fcd20cff8b649ee7fd9fe2d11db58997461d8b1456e77a1e4719cfd35bfa1 | null | [
"LICENSE.md"
] | 0 |
2.4 | filip | 0.7.4 | [FI]WARE [Li]brary for [P]ython | 
# FiLiP
[](https://rwth-ebc.github.io/FiLiP/master/pylint/pylint.html)
[](https://rwth-ebc.github.io/FiLiP/master/docs/index.html)
[](https://rwth-ebc.github.io/FiLiP/coverage)
[](https://opensource.org/licenses/BSD-3-Clause)
[](https://rwth-ebc.github.io/FiLiP/master/build/build.svg)
[](https://doi.org/10.21105/joss.06953)
<br/>
[](https://www.fiware.org/developers/catalogue/)

FiLiP (<ins>FI</ins>WARE <ins>Li</ins>brary for <ins>P</ins>ython) is a Python Software Development Kit (SDK) for
accelerating the development of web services that use FIWARE's Generic
Enablers (GEs) as backend.
It is mainly based on the [Pydantic](https://pydantic-docs.helpmanual.io/)
package, a sophisticated library for data validation and settings
management using Python type annotations.
Pydantic enforces type hints at runtime and provides user-friendly errors
when data is invalid.
We build our own data model structures on Pydantic models for efficient
data model parsing and validation and for interaction with FIWARE
services' REST APIs.
For API interaction, FiLiP relies on the well-known
[requests](https://docs.python-requests.org/en/latest/) package.
It is important to understand that we do not in any way restrict any
features of requests.
Furthermore, FiLiP is designed to help with the fast development of FIWARE-based
applications and avoid hundreds of lines of boilerplate, but it cannot
substitute learning the basic concepts behind the used FIWARE components.
This project is part of [FIWARE](https://www.fiware.org/). For more information check the FIWARE Catalogue entry for the
[Core Context Management](https://github.com/Fiware/catalogue/tree/master/core).
## General Motivation
Why implement a client library when clients can be auto-generated
from openapi documentation?
A general prerequisite to do so is that the documentation is in depth and of
good quality.
While FIWARE generally provides
[openapi documentation](https://github.com/FIWARE/specifications),
here are some thoughts on the challenges of auto-generating client code from
these documents:
- Auto-generated code tends to become rather bulky and its quality strongly
depends on the provided input data.
- Manipulating generated code can result in a big hassle for maintenance if
additional features need to be integrated.
- The underlying NGSI (Next Generation Service Interface) for FIWARE is a
rather generic specification.
Hence, generated models may also end up as generic Python types such as lists
and dicts, offering no real benefit.
Furthermore, there is no chance for reasonable validation and error handling.
## Getting started
The following section briefly describes how to use the library.
### Prerequisites
Since FiLiP is designed as a client library, it requires a server that provides
the target Service-APIs.
Hence, if you do not yet have a running instance of a FIWARE-based platform,
using Docker is the most convenient way to set one up.
Please check [here](https://github.com/N5GEH/n5geh.platform) for a tutorial
on this.
If this is not an option for you, FIWARE also provides a testing server.
You can register for a testing account
[here](https://www.fiware.org/developers/fiware-lab/).
> **Note**: FiLiP is now compatible with [Pydantic V2](https://docs.pydantic.dev/latest/migration/). If your program still requires Pydantic V1.x for some reason, please use release [v0.2.5](https://github.com/RWTH-EBC/FiLiP/releases/tag/v0.2.5) or an earlier version of FiLiP. Additionally, we recommend setting `pydantic~=1.10` in your `requirements.txt`; otherwise, Pydantic V2 might still be installed.
#### Supported Python Versions
| Version | Status |
|-----------|----------------|
| 3.7 | ❌ Deprecated |
| 3.8 | ❌ Deprecated |
| 3.9 | ✅ Tested |
| 3.10 | ✅ Tested |
| 3.11 | ✅ Tested |
| 3.12 | ✅ Tested |
> ✅ Tested python versions have passed the unittests
### Installation
The easiest way to install the library is via pip:
````
pip install -U filip
````
If you want to benefit from the latest changes, use the following command
(This will install the current master branch from this repository):
```
pip install -U git+https://github.com/RWTH-EBC/FiLiP
```
> **Note**: For development, you should install FiLiP in editable mode with the following command:
> ````bash
> pip install -e .[development]
> ````
> The `development` option will install extra libraries required for contribution. Please check the [CONTRIBUTING.md](CONTRIBUTING.md) for more information.
#### Install extra dependencies for tutorials and examples (optional)
If you want to go through the tutorials or examples, please install FiLiP with the extra module ``tutorials``:
````
pip install -U filip[tutorials]
````
#### Install semantics module (optional)
If you want to use the optional [semantics module](filip/semantics), use the following command. This will install the libraries (e.g., `igraph` and `rdflib`) that are only required for the semantics module:
````
pip install -U filip[semantics]
````
### Introduction to FIWARE
The following section introduces FIWARE. If you are already familiar with
FIWARE, you can skip this section and go straight to [Getting Started](#getting-started).
#### What is FIWARE?
FIWARE is a framework of open-source cloud platform components, created
to facilitate the development of smart solutions within various application
domains.
At the moment, the FIWARE
[catalogue](https://www.fiware.org/developers/catalogue/) contains over 30
interoperable software modules, so-called Generic Enablers
(GE) for developing and providing customized IoT platform solutions.
To get familiar with the APIs of the different modules we highly recommend
checking the
[step-by-step tutorial](https://fiware-tutorials.readthedocs.io/en/latest/).
It provides a good overview on FIWARE and its basic usage.
Whereas the tutorial helps to understand most of the general concepts,
for a deep dive, where you can learn about the components in more detail,
FIWARE also offers extended lessons through their
[academy](https://fiware-academy.readthedocs.io/en/latest/index.html).
However, usually one only requires a small set of components.
Hence, we recommend using the cited pages only as needed.
#### How to set up a FIWARE platform?
The easiest way to set up a FIWARE platform is by using Docker, as all GEs are
open source and distributed as Docker containers on Docker Hub.
However, as mentioned before, for most use cases only a subset of GEs is required.
Hence, we wrote a small [tutorial](https://github.com/N5GEH/n5geh.platform)
explaining how to set up a platform suited for most use cases within the energy
domain.
#### FIWARE GEs covered by FiLiP
FiLiP is a library developed on demand.
Hence, we do not aim to cover the APIs of all GEs that are included in the
[catalogue](https://www.fiware.org/developers/catalogue/).
This would mean an unnecessary development overhead.
Therefore, FiLiP currently only covers the APIs of the following GEs:
- NGSIv2 Context Broker for managing context data. We use its
reference implementation, Orion, for testing. FiLiP also warns you if you are using an old version of Orion that could have breaking changes.
- [documentation](https://fiware-orion.readthedocs.io/en/master/)
- [github](https://github.com/telefonicaid/fiware-orion)
- [swagger](https://swagger.lab.fiware.org/)
- [NGSI v2 specifications](https://github.com/FIWARE/specifications/tree/master/OpenAPI/ngsiv2)
- NGSI-LD Context Broker for managing context data with Linked Data concept. The functionalities that FiLiP supports are closely aligned with the specification **_NGSI-LD V1.3.1_**, which is according to the FIWARE [catalogue](https://github.com/FIWARE/catalogue#core-context-broker-components) the latest spec version that has been implemented by all three brokers (Orion-LD, Scorpio, and Stellio). We currently use Orion-LD for testing.
- [github](https://github.com/FIWARE/context.Orion-LD)
- [swagger](https://swagger.lab.fiware.org/?url=https://raw.githubusercontent.com/FIWARE/specifications/master/OpenAPI/ngsi-ld/full_api.json#/)
> **Note**: The `-experimental` flag needs to be set for the Orion-LD Context Broker to enable full functionality. Check this [issue](https://github.com/FIWARE/context.Orion-LD/issues/1648) for more information.
- IoT-Agents for managing IoT Devices. IoT agents are implemented using
the FIWARE IoT Agent Node Lib as a common framework.
- [documentation](https://iotagent-node-lib.readthedocs.io/en/latest/)
- [github](https://github.com/telefonicaid/iotagent-node-lib)
- IoT-Agent-JSON for managing devices using a JSON message payload protocol
format.
- [documentation](https://fiware-iotagent-json.readthedocs.io/en/latest/)
- [github](https://github.com/telefonicaid/iotagent-json)
- [apiary](https://telefonicaiotiotagents.docs.apiary.io/)
(*partly deprecated*)
Example payload:
```json
{
    "humidity": "45%",
    "temperature": "23",
    "luminosity": "1570"
}
```
- IoT-Agent-Ultralight for managing devices using an Ultralight 2.0 message
payload protocol.
- [documentation](https://fiware-iotagent-ul.readthedocs.io/en/latest/)
- [github](https://github.com/telefonicaid/iotagent-ul)
- [apiary](https://telefonicaiotiotagents.docs.apiary.io/)
(*partly deprecated*)
Example payload:
```
humidity|45%|temperature|23|luminosity|1570
```
- QuantumLeap for the management of time series data
- [documentation](https://quantumleap.readthedocs.io/en/latest/)
- [github](https://github.com/FIWARE-GEs/quantum-leap)
- [swagger](https://app.swaggerhub.com/apis/smartsdk/ngsi-tsdb/0.8.3)
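To illustrate the Ultralight 2.0 measurement format shown above, here is a minimal, hypothetical parser (not part of FiLiP or the IoT Agent) that splits a payload of alternating `key|value` tokens into a dict:

```python
def parse_ultralight(payload: str) -> dict:
    """Parse an Ultralight 2.0 measurement string of alternating
    key|value pairs into a dictionary."""
    tokens = payload.split("|")
    if len(tokens) % 2 != 0:
        raise ValueError("payload must consist of key|value pairs")
    # Pair every even-indexed token (key) with the token that follows it (value).
    return dict(zip(tokens[0::2], tokens[1::2]))

print(parse_ultralight("humidity|45%|temperature|23|luminosity|1570"))
# {'humidity': '45%', 'temperature': '23', 'luminosity': '1570'}
```

In a real deployment, the IoT Agent performs this translation between the device protocol and NGSI entities for you.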
## Structure of FiLiP

## Documentation
We are still working on the documentation.
You can find our current documentation
[here](https://rwth-ebc.github.io/FiLiP/master/docs/index.html).
## Running examples
Once you have installed the library, you can check the [examples](/examples)
to learn how to use the different components.
Currently, we provide basic examples for the usage of FiLiP for the FIWARE
GEs mentioned above.
We suggest working through them in order, starting with the
configuration of clients.
Afterwards, you can start modelling context data and interacting with the context
broker and use its functionalities before you learn how to connect
IoT Devices and store historic data.
## Testing
We use unittests to write our test cases.
To test the source code of the library in our CI workflow, the CI
executes all tests located in the `tests` directory and prefixed with `test_`.
## Contributing
#### Bug Reports & Feature Requests
The best way to report a bug or request a new feature is to open an issue. Please use one of the available issue templates and provide as much detail as possible.
#### Submitting Pull Requests
We warmly welcome code contributions! If you're planning to contribute, please first read our [Contribution Guide](./CONTRIBUTING.md) for details on our development workflow, coding standards, and how to submit a pull request.
For other inquiries, you can still contact us via [ebc-tools@eonerc.rwth-aachen.de](mailto:ebc-tools@eonerc.rwth-aachen.de).
## Authors
* [Thomas Storek](https://github.com/tstorek)
* [Junsong Du](https://www.ebc.eonerc.rwth-aachen.de/cms/E-ON-ERC-EBC/Das-Institut/Mitarbeiter/Digitale-Energie-Quartiere/~trcib/Du-Junsong/lidx/1/) (corresponding)
* [Saira Bano](https://www.ebc.eonerc.rwth-aachen.de/cms/E-ON-ERC-EBC/Das-Institut/Mitarbeiter/Systemadministration/~ohhca/Bano-Saira/)
## Alumni
* Jeff Reding
* Felix Rehmann
* Daniel Nikolay
* Sebastian Blechmann
## References
If you use FiLiP in your work, please cite the JOSS paper:
```bibtex
@article{Storek2024,
doi = {10.21105/joss.06953},
url = {https://doi.org/10.21105/joss.06953},
year = {2024},
publisher = {The Open Journal},
volume = {9},
number = {101},
pages = {6953},
author = {Storek, Thomas and Du, Junsong and Blechmann, Sebastian and Streblow, Rita and Müller, Dirk},
title = {FiLiP: A python software development kit (SDK) for accelerating the development of services based on FIWARE IoT platform},
journal = {Journal of Open Source Software}
}
```
We also want to refer to publications that presented or applied the library:
- S. Blechmann, I. Sowa, M. H. Schraven, R. Streblow, D. Müller & A. Monti. Open source platform application for smart building and smart grid controls. Automation in Construction 145 (2023), 104622. ISSN: 0926-5805. https://doi.org/10.1016/j.autcon.2022.104622
- Haghgoo, M., Dognini, A., Storek, T., Plamanescu, R, Rahe, U.,
Gheorghe, S, Albu, M., Monti, A., Müller, D. (2021) A cloud-based service-oriented architecture to unlock smart energy services
https://www.doi.org/10.1186/s42162-021-00143-x
- Baranski, M., Storek, T. P. B., Kümpel, A., Blechmann, S., Streblow, R.,
Müller, D. et al.,
(2020). National 5G Energy Hub : Application of the Open-Source Cloud Platform
FIWARE for Future Energy Management Systems.
https://doi.org/10.18154/RWTH-2020-07876
- T. Storek, J. Lohmöller, A. Kümpel, M. Baranski & D. Müller (2019).
Application of the open-source cloud platform FIWARE for future building
energy management systems.
Journal of Physics:
Conference Series, 1343, 12063. https://doi.org/10.1088/1742-6596/1343/1/012063
## License
This project is licensed under the BSD License - see the [LICENSE](LICENSE.md) file for details.
## Copyright
<a href="https://www.ebc.eonerc.rwth-aachen.de/"> <img alt="EBC" src="https://www.ebc.eonerc.rwth-aachen.de/global/show_picture.asp?id=aaaaaaaaaakevlz" height="100"> </a>
2021-2025, RWTH Aachen University, E.ON Energy Research Center, Institute for Energy
Efficient Buildings and Indoor Climate
[Institute for Energy Efficient Buildings and Indoor Climate (EBC)](https://www.ebc.eonerc.rwth-aachen.de)
[E.ON Energy Research Center (E.ON ERC)](https://www.eonerc.rwth-aachen.de)
[RWTH Aachen University, Germany](https://www.rwth-aachen.de)
## Disclaimer
This project is part of the cooperation between the RWTH Aachen University and
the Research Centre Jülich.
<a href="https://www.jara.org/de/forschung/jara-energy"> <img alt="JARA
ENERGY" src="https://raw.githubusercontent.com/RWTH-EBC/FiLiP/master/docs/logos/LogoJARAEnergy.jpg" height="100"> </a>
## Related projects
<a href="https://n5geh.de/"> <img alt="National 5G Energy Hub"
src="https://avatars.githubusercontent.com/u/43948851?s=200&v=4" height="100"></a>
<a href="https://fismep.de/"> <img alt="FISMEP"
src="https://raw.githubusercontent.com/RWTH-EBC/FiLiP/master/docs/logos/FISMEP.png"
height="100"></a>
## Acknowledgments
We gratefully acknowledge the financial support of the Federal Ministry <br />
for Economic Affairs and Climate Action (BMWK), promotional references
03ET1495A, 03ET1551A, 0350018A, 03ET1561B, 03EN1030B.
<a href="https://www.bmwi.de/Navigation/EN/Home/home.html"> <img alt="BMWK"
src="https://raw.githubusercontent.com/RWTH-EBC/FiLiP/master/docs/logos/bmwi_logo_en.png" height="100"> </a>
This project has received funding in the framework of the joint programming initiative ERA-Net Smart Grids Plus, with support from the European Union’s Horizon 2020 research and innovation programme.
<a href="https://www.eranet-smartgridsplus.eu/"> <img alt="ERANET"
src="https://fismep.de/wp-content/uploads/2017/09/SmartGridsPlus_rgb-300x55.jpg" height="100"> </a>
| text/markdown | RWTH Aachen University, E.ON Energy Research Center, Institute of Energy Efficient Buildings and Indoor Climate | ebc-tools@eonerc.rwth-aachen.de | null | null | null | iot, fiware, semantic | [
"Development Status :: 3 - Alpha",
"Topic :: Scientific/Engineering",
"Intended Audience :: Science/Research",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: ... | [] | https://github.com/RWTH-EBC/filip | https://github.com/RWTH-EBC/FiLiP/archive/refs/tags/v0.7.4.tar.gz | >=3.8 | [] | [] | [] | [
"aenum~=3.1.15",
"datamodel_code_generator[http]~=0.25.0",
"paho-mqtt~=2.0.0",
"pandas_datapackage_reader~=0.18.0",
"pydantic<2.9.0,>=2.6.0",
"pydantic-settings<2.3.0,>=2.0.0",
"geojson_pydantic~=1.0.2",
"stringcase>=1.2.0",
"regex~=2023.10.3",
"requests~=2.32.0",
"rapidfuzz~=3.4.0",
"geojson-... | [] | [] | [] | [
"Documentation, https://rwth-ebc.github.io/FiLiP/master/docs/index.html",
"Source, https://github.com/RWTH-EBC/filip",
"Download, https://github.com/RWTH-EBC/FiLiP/archive/refs/tags/v0.7.4.tar.gz"
] | twine/6.2.0 CPython/3.10.19 | 2026-02-19T10:40:52.742567 | filip-0.7.4.tar.gz | 222,966 | 22/89/3df5baf5985e0e1d1df345c02e6ce7e8177c6297c7f5ab695efeff0f2c51/filip-0.7.4.tar.gz | source | sdist | null | false | aa64fab72fcbda3b867b8134da785993 | 578fc5632509ee87bad62209dd2e87dd1855522f2b7beb3242c8e85e3585cc16 | 22893df5baf5985e0e1d1df345c02e6ce7e8177c6297c7f5ab695efeff0f2c51 | null | [
"LICENSE.md"
] | 289 |
2.4 | fivccliche | 0.1.42 | A modern async Python framework for building scalable applications with FastAPI and SQLModel | # Fivccliche
A **production-ready, multi-user backend framework** designed specifically for **AI agents**. Built with **FastAPI** and **SQLModel** for high-performance, type-safe async operations that handle concurrent AI agent requests at scale.
## ✨ Features
- **AI Agent Backend** - Purpose-built for multi-user AI agent interactions and orchestration
- **FastAPI** - Modern, fast web framework for building high-performance APIs with Python 3.10+
- **SQLModel** - SQL ORM combining SQLAlchemy and Pydantic for type-safe database operations
- **Async/Await** - Full async support for handling concurrent AI agent requests at scale
- **Type Safety** - Built-in type hints with Pydantic 2.0 validation for reliable data handling
- **Multi-User Support** - Designed for managing multiple AI agents with proper isolation and access control
- **Testing** - Pytest with async support for comprehensive test coverage
- **Code Quality** - Black, Ruff, and MyPy configured for professional code standards
- **Package Management** - `uv` for fast, reliable dependency management
## 🚀 Quick Start
### Prerequisites
- Python 3.10 or higher
- `uv` package manager ([install](https://docs.astral.sh/uv/))
### Installation
```bash
# Clone the repository
git clone https://github.com/MindFiv/FivcCliche.git
cd FivcCliche
# Install production dependencies
uv pip install -e .
# Or install with development tools
uv pip install -e ".[dev]"
```
### Using the CLI
The easiest way to run FivcCliche is using the built-in CLI:
```bash
# Start the server
python -m fivccliche.cli run
# Show project information
python -m fivccliche.cli info
# Clean temporary files and cache
python -m fivccliche.cli clean
# Initialize configuration
python -m fivccliche.cli setup
```
Visit http://localhost:8000/docs for interactive API documentation.
### CLI Options
```bash
# Custom host and port
python -m fivccliche.cli run --host 127.0.0.1 --port 9000
# Production mode (no auto-reload)
python -m fivccliche.cli run --no-reload
# Test configuration without running
python -m fivccliche.cli run --dry-run
# Verbose output
python -m fivccliche.cli run --verbose
```
## 📚 Documentation
For detailed information, see the documentation in the `docs/` folder:
- **[Getting Started](docs/getting-started.md)** - Comprehensive tutorial with examples
- **[Setup Summary](docs/setup-summary.md)** - Installation and project structure
- **[Migration Plan](docs/migration-plan.md)** - Technical migration details
- **[Completion Summary](docs/completion-summary.md)** - What was accomplished
## 🛠️ Development
### CLI Commands
```bash
make format # Format code with Black
make lint # Lint with Ruff
make check # Run all checks (format, lint, type check)
```
### Run Tests
```bash
pytest
pytest -v --cov=src # With coverage
```
### Code Quality
```bash
black src/ tests/ # Format code
ruff check src/ tests/ # Lint code
mypy src/ # Type check
```
### Project Structure
```
fivccliche/
├── pyproject.toml # Project configuration
├── src/
│ └── fivccliche/
│ ├── __init__.py
│ ├── cli.py # CLI implementation
│ ├── services/
│ ├── utils/
│ ├── settings/
│ └── modules/
├── tests/ # Add your tests here
└── docs/ # Documentation
```
## 📦 Dependencies
**Production Core**: FastAPI, SQLModel, Uvicorn, Pydantic, SQLAlchemy
**CLI & Output**: Typer, Rich, python-dotenv
**Component System**: fivcglue, fivcplayground
**Development**: Pytest, Black, Ruff, MyPy, Coverage
See `pyproject.toml` for complete dependency list and versions.
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 👤 Author
Charlie Zhang (sunnypig2002@gmail.com)
## 🔗 Links
- **Repository**: https://github.com/MindFiv/FivcCliche
- **FastAPI**: https://fastapi.tiangolo.com/
- **SQLModel**: https://sqlmodel.tiangolo.com/
- **Pydantic**: https://docs.pydantic.dev/
| text/markdown | null | Charlie Zhang <sunnypig2002@gmail.com> | null | null | null | fastapi, sqlmodel, async, orm, rest-api | [] | [] | null | null | >=3.10 | [] | [] | [] | [
"fastapi>=0.104.0",
"sqlmodel>=0.0.14",
"uvicorn[standard]>=0.24.0",
"pydantic>=2.0.0",
"pydantic-settings>=2.0.0",
"pydantic[email]>=2.0.0",
"python-multipart>=0.0.6",
"python-dotenv>=1.0.0",
"typer>=0.9.0",
"rich>=13.0.0",
"fivcplayground>=0.1.20",
"fivcplayground[strands]>=0.1.20",
"fivcp... | [] | [] | [] | [
"Homepage, https://github.com/MindFiv/FivcCliche",
"Repository, https://github.com/MindFiv/FivcCliche.git",
"Issues, https://github.com/MindFiv/FivcCliche/issues"
] | uv/0.5.10 | 2026-02-19T10:39:40.958907 | fivccliche-0.1.42.tar.gz | 76,960 | ef/a7/bf8715643b468016fa1fd441d2f770452b4c36eac0875ed78a770765d27f/fivccliche-0.1.42.tar.gz | source | sdist | null | false | 2dc6f779e14fe49336e3f134761ea7aa | 3ae2a34df77aaace3ba0bf507eb14946b54eaa1cd79511a54cad36a0e282606f | efa7bf8715643b468016fa1fd441d2f770452b4c36eac0875ed78a770765d27f | null | [
"LICENSE"
] | 249 |
2.4 | walrio | 1.1.2 | Modular audio library management system for playing, managing, and editing music files (requires FFmpeg, GStreamer, ImageMagick, rsgain) | # Walrio GUI
Walrio is a modular library/set of files that let you play, manage, and edit music and music-related files. Every file should be usable via the terminal except documentation.
## Contributing
For those interested in contributing code/documentation, please check the [contribution guidelines](https://github.com/TAPSOSS/.github/blob/main/CONTRIBUTING.md). On top of these guidelines, this specific project requires a single comment at the top of each file explaining what it does so that help commands properly load dynamically. TODO: add this to a CONTRIBUTING.md later.
All current contributors are listed both in the sidebar and (optionally) in the [AUTHORS](AUTHORS) file.
## Star History
[](https://www.star-history.com/#TAPSOSS/Walrio&type=date&legend=top-left)
## Licensing (USE IN OTHER PROJECTS)
Check out the [LICENSE file](LICENSE) to see what LICENSE this project uses and how you're allowed to use it. General rule of thumb is attribution (crediting) is required at a minimum.
## Installation
### Quick Install (pip)
```bash
pip install walrio
```
**⚠️ Important:** Walrio requires system dependencies that pip cannot install:
- FFmpeg
- GStreamer
- ImageMagick
- rsgain
After installing via pip, check for missing dependencies:
```bash
walrio dependency_checker --verbose
```
Then install any missing system packages (see [System Requirements](#system-requirements) below).
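Under the hood, a check like this amounts to probing `PATH` for each required executable. The sketch below is illustrative only — it is not Walrio's actual implementation, and the binary names (e.g. `magick` for ImageMagick 7, `convert` for ImageMagick 6) may differ on your system:

```python
import shutil

# Illustrative tool list; actual binary names can vary by platform/version.
REQUIRED_TOOLS = ["ffmpeg", "gst-launch-1.0", "magick", "rsgain"]

def missing_tools(tools=REQUIRED_TOOLS):
    """Return the tools from `tools` that are not found on PATH."""
    return [tool for tool in tools if shutil.which(tool) is None]

if __name__ == "__main__":
    missing = missing_tools()
    if missing:
        print("Missing system dependencies:", ", ".join(missing))
    else:
        print("All required system tools found.")
```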
If you have all the needed dependencies, you can get started using walrio with the help command (`walrio --help`).
### System Requirements
Walrio requires the following non-Python tools to be installed on your system:
- **FFmpeg** - Audio/video conversion and processing
- **GStreamer** - Audio playback engine
- **ImageMagick** - Image processing for album art
- **rsgain** - ReplayGain 2.0 loudness scanner
**Installation by platform:**
**Fedora:**
```bash
sudo dnf install gstreamer1-plugins-base gstreamer1-plugins-good gstreamer1-plugins-ugly gstreamer1-tools ffmpeg ImageMagick rsgain
```
**Ubuntu/Debian:**
```bash
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-ugly ffmpeg imagemagick
# rsgain: See https://github.com/complexlogic/rsgain
```
**Arch Linux:**
```bash
sudo pacman -S gstreamer gst-plugins-base gst-plugins-good gst-plugins-ugly ffmpeg imagemagick
yay -S rsgain # or use another AUR helper
```
**macOS:**
```bash
brew install gstreamer gst-plugins-base gst-plugins-good gst-plugins-ugly ffmpeg imagemagick rsgain
```
## Development Setup
1. Clone the repository
```bash
git clone https://github.com/TAPSOSS/Walrio.git
cd Walrio
```
2. Install system dependencies (see [System Requirements](#system-requirements) above)
3. Install in editable mode with dev dependencies
```bash
pip install -e .[dev]
```
4. Verify dependencies
```bash
walrio dependency_checker --verbose
```
5. Run Walrio
```bash
walrio --help
walrio player song.mp3
```
## Third-Party Credits
Walrio uses/requires/bundles the following projects (and [python](https://www.python.org/)):
### Non-Python
- **GStreamer** : <https://github.com/GStreamer/gstreamer>
- **FFmpeg** : <https://github.com/FFmpeg/FFmpeg>
- **rsgain** : <https://github.com/complexlogic/rsgain>
- **ImageMagick**: <https://github.com/ImageMagick/ImageMagick>
### Python/Pip-Installable
Check the [requirements.txt](requirements.txt) file to see what to install with pip/python in order to use this library.
## File Structure
### Modules
The main folder with all the separate Walrio music modules you can use, plus walrio.py,
the global file that lets you easily run any module without having to cd into each folder.
#### Addons
Files that are non-essential for playing music but still very useful for maintaining a music library (converter files, replay gain, moving files, etc.). These can require modules from the addons folder itself or from the core modules.
#### Core
The core set of modules that are absolutely essential to playing your music files from your media library. Often required for addons/niche modules to function.
#### Database
Modules that require a SQLite database (walrio_library.db/database.py from the core section) to function. These provide advanced library management features like playback statistics, smart playlists, and database-powered queues. The database must be created first using the database module.
#### Niche
Very specific workflow related files or extremely niche functionality. Generally files combining multiple different core and addon modules together into a singular unified workflow or something to connect your music to external programs/hardware.
| text/markdown | TapsOSS | null | TapsOSS | null | null | music, audio, playlist, metadata, player, library, management | [
"Development Status :: 4 - Beta",
"Intended Audience :: End Users/Desktop",
"Intended Audience :: Developers",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Pyt... | [] | null | null | >=3.8 | [] | [] | [] | [
"mutagen>=1.45.0",
"PySide6>=6.0.0",
"Pillow>=8.0.0",
"pytest>=7.0.0; extra == \"dev\"",
"black>=22.0.0; extra == \"dev\"",
"flake8>=4.0.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/TAPSOSS/Walrio",
"Repository, https://github.com/TAPSOSS/Walrio",
"Bug Tracker, https://github.com/TAPSOSS/Walrio/issues"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:38:59.131346 | walrio-1.1.2.tar.gz | 227,653 | d8/b0/669eb9ee2521119c52c772ae72ed6699fcf51acdd343a268d601e2922a72/walrio-1.1.2.tar.gz | source | sdist | null | false | ba6fd5103e9bdfcd0ecde90d1a443c6a | 17247e3d6925e85b30f5faa35570e53965cea4a9153bb470bcb9750368eaa938 | d8b0669eb9ee2521119c52c772ae72ed6699fcf51acdd343a268d601e2922a72 | BSD-3-Clause | [
"LICENSE",
"AUTHORS"
] | 231 |
2.4 | berryworld | 1.0.0.201168 | Handy classes to improve ETL processes | # BerryWorld
This is a package developed by the Data Team @ BerryWorld.
For more information: [BerryWorld](https://www.berryworld.com/)
| text/markdown | BerryWorld ltd | data@berryworld.com | null | null | null | null | [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent"
] | [] | https://www.berryworld.com | null | >=3.11 | [] | [] | [] | [
"azure-identity>=1.15.0",
"azure-keyvault-secrets>=4.8.0",
"cryptography>=3.4.8",
"msal>=1.14.0",
"numpy>=1.25.2",
"opencensus>=0.11.0",
"opencensus_ext_azure>=1.1.7",
"pandas<3.0.1,>=1.4.2",
"pyodbc>=4.0.39",
"python-dotenv>=1.0.0",
"PyYAML>=6.0",
"requests>=2.31.0",
"scipy>=1.11.1",
"set... | [] | [] | [] | [] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:37:25.162841 | berryworld-1.0.0.201168-py3-none-any.whl | 85,422 | 57/58/0b99620c0b8fa6a1d1543a0d618e4ce595870d9e6cdf33d84492dafb3a00/berryworld-1.0.0.201168-py3-none-any.whl | py3 | bdist_wheel | null | false | 554ae98b28654c07abf3f3d46097306f | 5be203e27f35c2249b2ee63e55fbe0ab0ed29b168a11d33fcfc9f3b8884025d1 | 57580b99620c0b8fa6a1d1543a0d618e4ce595870d9e6cdf33d84492dafb3a00 | null | [
"LICENSE"
] | 168 |
2.4 | mne-videobrowser | 0.2.1 | Python package for browsing video and audio time-synchronized to MEG/EEG data | # Video and audio browser extension for MNE-Python's Qt data browser
[](https://mne-videobrowser.readthedocs.io/en/latest/?badge=latest)
This is an open-source Python package for browsing video and audio time-synchronized to MEG/EEG data.
It serves as an add-on for [mne-qt-browser](https://github.com/mne-tools/mne-qt-browser), which is part
of [MNE-Python](https://mne.tools/stable/), an open-source Python package for exploring, visualizing,
and analyzing human neurophysiological data.
This project also complements [Helsinki VideoMEG project](https://github.com/Helsinki-VideoMEG-Project)
by supporting video and audio files recorded with their software.

Screenshot of the browser extension showing a black video frame and a test audio file synchronized with MNE-Python's sample MEG data.
## Features
* Time-synchronized video browsing and playback with MEG/EEG data
* Time-synchronized audio browsing and playback with MEG/EEG data
* Support for multiple video and MEG files simultaneously (only one audio file with multiple channels at a time)
* Support for [Helsinki VideoMEG project](https://github.com/Helsinki-VideoMEG-Project) format files
* Standard video format support (MP4, AVI, etc.) via OpenCV (for audio, only the Helsinki VideoMEG format is currently supported)
## Documentation
[Documentation](https://mne-videobrowser.readthedocs.io/) contains installation instructions (same as below), public API reference,
and some of the usage examples available in the [GitHub](https://github.com/ttaiv/mne-videobrowser/tree/main/examples).
Some of the code structure and implementation details are also documented.
## Installation
In addition to MNE-Python, this project requires the `OpenCV` package for reading standard video files (such as .mp4)
and `sounddevice` for audio playback. The recommended way to install MNE-Python, and thus this package, is using
[conda](https://github.com/conda/conda).
### Using conda (recommended)
1. Create a new conda environment (named `mne-videobrowser`) with this package and all dependencies installed:
```bash
conda create --channel=conda-forge --strict-channel-priority --name=mne-videobrowser mne-videobrowser
```
2. Activate the environment:
```bash
conda activate mne-videobrowser
```
3. Linux only: if you do not have the [PortAudio library](https://www.portaudio.com/), a dependency of `sounddevice`, install it. For example, on Ubuntu/Debian:
```bash
sudo apt install libportaudio2
```
### Using pip
1. Activate your desired Python environment ([documentation for virtual environments](https://docs.python.org/3/tutorial/venv.html)).
2. Install this package. All dependencies will be installed automatically except for a Qt binding, so if you don't already have one, specify it on the command line as well (we recommend PySide6):
```bash
pip install mne-videobrowser PySide6
```
3. Linux only: if you do not have the [PortAudio library](https://www.portaudio.com/), a dependency of `sounddevice`, install it. For example, on Ubuntu/Debian:
```bash
sudo apt install libportaudio2
```
See usage examples in [GitHub](https://github.com/ttaiv/mne-videobrowser/tree/main/examples).
## For developers
### Installation for development
1. Clone this repository and navigate to project root.
2. Run
```bash
conda env create -f environment.yml
```
This creates a conda environment `mne-videobrowser-dev` with the development dependencies and the package installed in editable mode, so changes to the source code are reflected in the installed package.
3. Activate the environment:
```bash
conda activate mne-videobrowser-dev
```
4. Linux only: if you do not have the [PortAudio library](https://www.portaudio.com/), a dependency of `sounddevice`, install it. For example, on Ubuntu/Debian:
```bash
sudo apt install libportaudio2
```
### Running tests
Tests are located in the `tests/` directory and run with `pytest` (included in the development dependencies).
You can run all the tests with:
```bash
pytest
```
You can also selectively run tests in a specific file/class/method. See [pytest documentation](https://docs.pytest.org/en/stable/how-to/usage.html) for details.
### Building documentation
Documentation source files are located in `docs/source/` and built documentation in `docs/build/`.
Documentation is mostly automatically generated from the source code docstrings using `sphinx`.
To build the documentation:
```bash
cd docs
make html # on Windows use 'make.bat html'
```
Then view the built HTML documentation by opening `docs/build/html/index.html` in a web browser.
| text/markdown | null | Teemu Taivainen <ttaiv@outlook.com> | null | null | null | audio, brain, eeg, meg, neuroimaging, neuroscience, video | [
"Intended Audience :: Developers",
"Intended Audience :: Science/Research",
"License :: OSI Approved",
"Operating System :: MacOS",
"Operating System :: Microsoft :: Windows",
"Operating System :: POSIX",
"Operating System :: Unix",
"Programming Language :: Python :: 3",
"Topic :: Multimedia :: Soun... | [] | null | null | >=3.10 | [] | [] | [] | [
"mne-qt-browser>=0.7",
"mne>=1.10",
"opencv-python-headless>=4.8",
"psutil",
"sounddevice>=0.5",
"pydata-sphinx-theme; extra == \"dev\"",
"pyright; extra == \"dev\"",
"pytest; extra == \"dev\"",
"ruff; extra == \"dev\"",
"sphinx; extra == \"dev\""
] | [] | [] | [] | [
"Repository, https://github.com/ttaiv/mne-videobrowser",
"Documentation, https://mne-videobrowser.readthedocs.io/"
] | uv/0.8.22 | 2026-02-19T10:36:46.497995 | mne_videobrowser-0.2.1.tar.gz | 474,069 | c0/8e/2d0a576336b9c0f37fb4a8c11e19da7951cc4edfb0e1af6b8ede70ad625c/mne_videobrowser-0.2.1.tar.gz | source | sdist | null | false | 68d0b923b4e43f11b857e2a6d46c119f | ab8263c2fdac10bbfaa1080627ac1bd9549b65f9b3bbd0f9e2fb48135ed48b08 | c08e2d0a576336b9c0f37fb4a8c11e19da7951cc4edfb0e1af6b8ede70ad625c | BSD-3-Clause | [
"LICENSE"
] | 252 |
2.4 | authx | 1.5.1 | Ready to use and customizable Authentications and Oauth2 management for FastAPI | # AuthX
<p align="center">
<a href="https://authx.yezz.me" target="_blank">
<img src="https://user-images.githubusercontent.com/52716203/136962014-280d82b0-0640-4ee5-9a11-b451b338f6d8.png" alt="AuthX">
</a>
<p align="center">
<em>Ready-to-use and customizable Authentications and Oauth2 management for FastAPI ⚡</em>
</p>
</p>
---
| Project | Status |
|---------|--------|
| CI | [](https://github.com/yezz123/authx/actions/workflows/ci.yml) [](https://results.pre-commit.ci/latest/github/yezz123/authx/main) [](https://codecov.io/gh/yezz123/authx) |
| Meta | [](https://pypi.org/project/authx) [](https://pepy.tech/project/authx) [](https://pydantic.dev) [](https://github.com/astral-sh/ruff) [](https://sonarcloud.io/summary/new_code?id=yezz123_authx) |
---
**Source Code**: <https://github.com/yezz123/authx>
**Documentation**: <https://authx.yezz.me/>
---
Add a fully featured authentication and authorization system to your [FastAPI](https://fastapi.tiangolo.com/) project. **AuthX** is designed to be simple, customizable, and secure.
## Installation
```bash
pip install authx
```
## Quick Start
```python
from fastapi import FastAPI, Depends, HTTPException
from authx import AuthX, AuthXConfig
app = FastAPI()
config = AuthXConfig(
JWT_SECRET_KEY="your-secret-key", # Change this!
JWT_TOKEN_LOCATION=["headers"],
)
auth = AuthX(config=config)
auth.handle_errors(app)
@app.post("/login")
def login(username: str, password: str):
if username == "test" and password == "test":
token = auth.create_access_token(uid=username)
return {"access_token": token}
raise HTTPException(401, detail="Invalid credentials")
@app.get("/protected", dependencies=[Depends(auth.access_token_required)])
def protected():
return {"message": "Hello World"}
```
**Test it:**
```bash
# Get a token
curl -X POST "http://localhost:8000/login?username=test&password=test"
# Access protected route
curl -H "Authorization: Bearer <your-token>" http://localhost:8000/protected
```
## Features
- Support for Python 3.9+ and Pydantic 2
- JWT authentication with multiple token locations:
- Headers (Bearer token)
- Cookies (with CSRF protection)
- Query parameters
- JSON body
- Access and refresh token support
- Token freshness for sensitive operations
- Token blocklist/revocation
- Extensible error handling
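The access tokens AuthX issues are standard JWTs, so their (unverified) payload can be inspected with nothing but the standard library. A minimal sketch, using an illustratively constructed token rather than one produced by AuthX, and skipping signature verification entirely (never do this for authentication decisions):

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode (WITHOUT verifying the signature!) the payload segment of a JWT."""
    payload_b64 = token.split(".")[1]
    # base64url segments drop padding; restore it before decoding
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

def _b64url(obj: dict) -> str:
    """Helper to build an illustrative token segment."""
    return base64.urlsafe_b64encode(json.dumps(obj).encode()).rstrip(b"=").decode()

# Illustrative token: header.payload.signature (claim names are typical, not AuthX-specific)
token = f'{_b64url({"alg": "HS256"})}.{_b64url({"sub": "test", "type": "access"})}.signature'
print(decode_jwt_payload(token))  # {'sub': 'test', 'type': 'access'}
```

This is useful for debugging what claims a token carries; for anything security-relevant, always let AuthX (or your JWT library) verify the signature first.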
### Extra Features
Install [`authx-extra`](https://github.com/yezz123/authx-extra) for additional features:
```bash
pip install authx-extra
```
- Redis session store and cache
- HTTP caching
- Performance profiling with pyinstrument
- Prometheus metrics
**Note:** Check [Release Notes](https://authx.yezz.me/release/).
## Contributors and Sponsors
<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->
[](#contributors-)
<!-- ALL-CONTRIBUTORS-BADGE:END -->
Thanks goes to these wonderful people
([emoji key](https://allcontributors.org/docs/en/emoji-key)):
<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
<tbody>
<tr>
<td align="center" valign="top" width="14.28%"><a href="http://yezz.me"><img src="https://avatars.githubusercontent.com/u/52716203?v=4?s=100" width="100px;" alt="Yasser Tahiri"/><br /><sub><b>Yasser Tahiri</b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=yezz123" title="Code">💻</a> <a href="https://github.com/yezz123/authx/commits?author=yezz123" title="Documentation">📖</a> <a href="#maintenance-yezz123" title="Maintenance">🚧</a> <a href="#infra-yezz123" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://soubai.me"><img src="https://avatars.githubusercontent.com/u/11523791?v=4?s=100" width="100px;" alt="Abderrahim SOUBAI-ELIDRISI"/><br /><sub><b>Abderrahim SOUBAI-ELIDRISI</b></sub></a><br /><a href="https://github.com/yezz123/authx/pulls?q=is%3Apr+reviewed-by%3AAbderrahimSoubaiElidrissi" title="Reviewed Pull Requests">👀</a> <a href="https://github.com/yezz123/authx/commits?author=AbderrahimSoubaiElidrissi" title="Documentation">📖</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://smakosh.com"><img src="https://avatars.githubusercontent.com/u/20082141?v=4?s=100" width="100px;" alt="Ismail Ghallou "/><br /><sub><b>Ismail Ghallou </b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=smakosh" title="Code">💻</a> <a href="#security-smakosh" title="Security">🛡️</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/MojixCoder"><img src="https://avatars.githubusercontent.com/u/76670309?v=4?s=100" width="100px;" alt="MojixCoder"/><br /><sub><b>MojixCoder</b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=MojixCoder" title="Code">💻</a> <a href="https://github.com/yezz123/authx/issues?q=author%3AMojixCoder" title="Bug reports">🐛</a></td>
<td align="center" valign="top" width="14.28%"><a href="http://sralab.com"><img src="https://avatars.githubusercontent.com/u/1815?v=4?s=100" width="100px;" alt="Stéphane Raimbault"/><br /><sub><b>Stéphane Raimbault</b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=stephane" title="Code">💻</a> <a href="#plugin-stephane" title="Plugin/utility libraries">🔌</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/theoohoho"><img src="https://avatars.githubusercontent.com/u/31537466?v=4?s=100" width="100px;" alt="theoohoho"/><br /><sub><b>theoohoho</b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=theoohoho" title="Documentation">📖</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://yogeshupadhyay.netlify.app/"><img src="https://avatars.githubusercontent.com/u/53992168?v=4?s=100" width="100px;" alt="Yogesh Upadhyay"/><br /><sub><b>Yogesh Upadhyay</b></sub></a><br /><a href="https://github.com/yezz123/authx/issues?q=author%3AYogeshUpdhyay" title="Bug reports">🐛</a></td>
</tr>
<tr>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/iftenet"><img src="https://avatars.githubusercontent.com/u/1397880?v=4?s=100" width="100px;" alt="Roman"/><br /><sub><b>Roman</b></sub></a><br /><a href="https://github.com/yezz123/authx/issues?q=author%3Aiftenet" title="Bug reports">🐛</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://www.linkedin.com/today/author/alobbs"><img src="https://avatars.githubusercontent.com/u/170559?v=4?s=100" width="100px;" alt="Alvaro Lopez Ortega"/><br /><sub><b>Alvaro Lopez Ortega</b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=alobbs" title="Documentation">📖</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/pinchXOXO"><img src="https://avatars.githubusercontent.com/u/68501799?v=4?s=100" width="100px;" alt="Devy Santo"/><br /><sub><b>Devy Santo</b></sub></a><br /><a href="#infra-pinchXOXO" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/pg365"><img src="https://avatars.githubusercontent.com/u/173273017?v=4?s=100" width="100px;" alt="pg365"/><br /><sub><b>pg365</b></sub></a><br /><a href="#infra-pg365" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/jor-rit"><img src="https://avatars.githubusercontent.com/u/16398756?v=4?s=100" width="100px;" alt="Jorrit"/><br /><sub><b>Jorrit</b></sub></a><br /><a href="#platform-jor-rit" title="Packaging/porting to new platform">📦</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/callamd"><img src="https://avatars.githubusercontent.com/u/1664656?v=4?s=100" width="100px;" alt="Callam"/><br /><sub><b>Callam</b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=callamd" title="Code">💻</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/lmasikl"><img src="https://avatars.githubusercontent.com/u/1556136?v=4?s=100" width="100px;" alt="Maxim"/><br /><sub><b>Maxim</b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=lmasikl" title="Code">💻</a></td>
</tr>
<tr>
<td align="center" valign="top" width="14.28%"><a href="https://evergreenies.github.io"><img src="https://avatars.githubusercontent.com/u/33820365?v=4?s=100" width="100px;" alt="Suyog Shimpi"/><br /><sub><b>Suyog Shimpi</b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=Evergreenies" title="Code">💻</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/NeViNez"><img src="https://avatars.githubusercontent.com/u/91369880?v=4?s=100" width="100px;" alt="NeViNez"/><br /><sub><b>NeViNez</b></sub></a><br /><a href="https://github.com/yezz123/authx/issues?q=author%3ANeViNez" title="Bug reports">🐛</a></td>
<td align="center" valign="top" width="14.28%"><a href="https://github.com/Antareske"><img src="https://avatars.githubusercontent.com/u/171327898?v=4?s=100" width="100px;" alt="Antareske"/><br /><sub><b>Antareske</b></sub></a><br /><a href="https://github.com/yezz123/authx/commits?author=Antareske" title="Code">💻</a></td>
</tr>
</tbody>
</table>
<!-- markdownlint-restore -->
<!-- prettier-ignore-end -->
<!-- ALL-CONTRIBUTORS-LIST:END -->
This project follows the
[all-contributors](https://github.com/all-contributors/all-contributors)
specification. Contributions of any kind welcome!
## License
This project is licensed under the terms of the MIT License.
## Changelog
## Latest Changes
## 1.5.1
* :sparkles: feat: support Async/sync Callbacks. PR [#793](https://github.com/yezz123/authx/pull/793) by [@Antareske](https://github.com/Antareske).
### Upgrades
* ⬆ Bump the python-packages group with 9 updates. PR [#807](https://github.com/yezz123/authx/pull/807) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump cryptography from 44.0.2 to 46.0.5 in /examples. PR [#806](https://github.com/yezz123/authx/pull/806) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump cryptography from 44.0.1 to 46.0.5. PR [#805](https://github.com/yezz123/authx/pull/805) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump python-jose from 3.3.0 to 3.4.0. PR [#801](https://github.com/yezz123/authx/pull/801) by [@dependabot[bot]](https://github.com/apps/dependabot).
* Bump starlette from 0.46.0 to 0.49.1 in /examples. PR [#803](https://github.com/yezz123/authx/pull/803) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump cryptography from 44.0.0 to 44.0.1. PR [#804](https://github.com/yezz123/authx/pull/804) by [@dependabot[bot]](https://github.com/apps/dependabot).
* Bump h11 from 0.14.0 to 0.16.0 in /examples. PR [#802](https://github.com/yezz123/authx/pull/802) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump urllib3 from 1.26.15 to 2.6.3. PR [#800](https://github.com/yezz123/authx/pull/800) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump starlette from 0.41.3 to 0.49.1. PR [#799](https://github.com/yezz123/authx/pull/799) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump pyasn1 from 0.6.1 to 0.6.2. PR [#798](https://github.com/yezz123/authx/pull/798) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump virtualenv from 20.33.1 to 20.36.1. PR [#797](https://github.com/yezz123/authx/pull/797) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️ Bump `ruff-pre-commit` from v0.14.14 to v0.15.0. PR [#795](https://github.com/yezz123/authx/pull/795) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.14.13 to v0.14.14. PR [#791](https://github.com/yezz123/authx/pull/791) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.14.11 to v0.14.12. PR [#790](https://github.com/yezz123/authx/pull/790) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.14.10 to v0.14.11. PR [#789](https://github.com/yezz123/authx/pull/789) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
### Docs
* :memo: docs: add [@Antareske](https://github.com/Antareske) as a contributor for code. PR [#794](https://github.com/yezz123/authx/pull/794) by [@allcontributors[bot]](https://github.com/apps/allcontributors).
### Internal
* :bug: improve type annotations to fix mypy issues. PR [#796](https://github.com/yezz123/authx/pull/796) by [@yezz123](https://github.com/yezz123).
## 1.5.0
### Breaking Changes
* ✨ Implement scope management features. PR [#788](https://github.com/yezz123/authx/pull/788) by [@yezz123](https://github.com/yezz123).
### Features
* :recycle: Update CI workflow and add support for Python 3.14. PR [#782](https://github.com/yezz123/authx/pull/782) by [@yezz123](https://github.com/yezz123).
* :sparkles: Drop pydantic v1 & support only pydantic v2. PR [#781](https://github.com/yezz123/authx/pull/781) by [@yezz123](https://github.com/yezz123).
### Fixes
* :bug: Improve CSRF error handling and messaging. PR [#783](https://github.com/yezz123/authx/pull/783) by [@yezz123](https://github.com/yezz123).
* 🔧 Refactor token retrieval methods in AuthX class. PR [#780](https://github.com/yezz123/authx/pull/780) by [@yezz123](https://github.com/yezz123).
* 🐛 Enhance CSRF token error handling and simplify token location. PR [#779](https://github.com/yezz123/authx/pull/779) by [@yezz123](https://github.com/yezz123).
* 🐛 use refresh parameter in `_get_token_from_headers()`. PR [#776](https://github.com/yezz123/authx/pull/776) by [@NeViNez](https://github.com/NeViNez).
### Upgrades
* ⬆️ Bump ruff-pre-commit from v0.14.6 to v0.14.10. PR [#777](https://github.com/yezz123/authx/pull/777) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.14.1 to v0.14.6. PR [#773](https://github.com/yezz123/authx/pull/773) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆ Bump actions/checkout from 5 to 6. PR [#774](https://github.com/yezz123/authx/pull/774) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️ Bump `ruff-pre-commit` from v0.12.11 to v0.14.1. PR [#770](https://github.com/yezz123/authx/pull/770) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆ Bump astral-sh/setup-uv from 6 to 7. PR [#772](https://github.com/yezz123/authx/pull/772) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump actions/setup-python from 5 to 6. PR [#769](https://github.com/yezz123/authx/pull/769) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️ Bump `ruff-pre-commit` from v0.12.10 to v0.12.11. PR [#768](https://github.com/yezz123/authx/pull/768) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.12.9 to v0.12.10. PR [#767](https://github.com/yezz123/authx/pull/767) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.12.3 to v0.12.9. PR [#764](https://github.com/yezz123/authx/pull/764) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆ Bump actions/checkout from 4 to 5. PR [#766](https://github.com/yezz123/authx/pull/766) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆ Bump samuelcolvin/check-python-version from 4.1 to 5. PR [#765](https://github.com/yezz123/authx/pull/765) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️ Bump `ruff-pre-commit` from v0.12.2 to v0.12.3. PR [#762](https://github.com/yezz123/authx/pull/762) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.12.1 to v0.12.2. PR [#761](https://github.com/yezz123/authx/pull/761) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.11.13 to v0.12.1. PR [#760](https://github.com/yezz123/authx/pull/760) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.11.12 to v0.11.13. PR [#759](https://github.com/yezz123/authx/pull/759) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.11.11 to v0.11.12. PR [#757](https://github.com/yezz123/authx/pull/757) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
### Docs
* 📝 Update README for AuthX. PR [#785](https://github.com/yezz123/authx/pull/785) by [@yezz123](https://github.com/yezz123).
* 📝 Update documentation for AuthX features and usage. PR [#784](https://github.com/yezz123/authx/pull/784) by [@yezz123](https://github.com/yezz123).
* 📝 add [@NeViNez](https://github.com/NeViNez) as a contributor for bug. PR [#778](https://github.com/yezz123/authx/pull/778) by [@allcontributors[bot]](https://github.com/apps/allcontributors).
### Internal
* ✨ Add Jinja2 template engine support documentation and examples. PR [#787](https://github.com/yezz123/authx/pull/787) by [@yezz123](https://github.com/yezz123).
* :sparkles: Add dual token location pattern documentation and example. PR [#786](https://github.com/yezz123/authx/pull/786) by [@yezz123](https://github.com/yezz123).
## 1.4.3
### Features
* ✨ added support for `JWT_COOKIE_HTTP_ONLY`. PR [#755](https://github.com/yezz123/authx/pull/755) by [@Evergreenies](https://github.com/Evergreenies).
### Fixes
* 🐛 Fix Examples bugs. PR [#753](https://github.com/yezz123/authx/pull/753) by [@Evergreenies](https://github.com/Evergreenies).
### Upgrades
* ⬆️ Bump `ruff-pre-commit` from v0.11.9 to v0.11.11. PR [#751](https://github.com/yezz123/authx/pull/751) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.11.6 to v0.11.9. PR [#750](https://github.com/yezz123/authx/pull/750) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆ Bump astral-sh/setup-uv from 5 to 6. PR [#749](https://github.com/yezz123/authx/pull/749) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️ Bump `ruff-pre-commit` from v0.11.5 to v0.11.6. PR [#748](https://github.com/yezz123/authx/pull/748) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.11.2 to v0.11.5. PR [#747](https://github.com/yezz123/authx/pull/747) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
### Docs
* 📝 add [@Evergreenies](https://github.com/Evergreenies) as a contributor for code. PR [#754](https://github.com/yezz123/authx/pull/754) by [@allcontributors[bot]](https://github.com/apps/allcontributors).
### Internal
* :recycle: change the Package ecosystem to `uv`. PR [#744](https://github.com/yezz123/authx/pull/744) by [@yezz123](https://github.com/yezz123).
## 1.4.2
### Features
* ♻️ Deprecate data keyword argument in `decode_token`. PR [#741](https://github.com/yezz123/authx/pull/741) by [@lmasikl](https://github.com/lmasikl).
### Upgrades
* ⬆️ Bump `ruff-pre-commit` from v0.9.10 to v0.11.2. PR [#743](https://github.com/yezz123/authx/pull/743) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.9.9 to v0.9.10. PR [#740](https://github.com/yezz123/authx/pull/740) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.9.7 to v0.9.9. PR [#739](https://github.com/yezz123/authx/pull/739) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.9.6 to v0.9.7. PR [#735](https://github.com/yezz123/authx/pull/735) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.9.4 to v0.9.6. PR [#733](https://github.com/yezz123/authx/pull/733) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ [pre-commit.ci] pre-commit autoupdate. PR [#732](https://github.com/yezz123/authx/pull/732) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Update pytz requirement in the python-packages group. PR [#731](https://github.com/yezz123/authx/pull/731) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️ [pre-commit.ci] pre-commit autoupdate. PR [#730](https://github.com/yezz123/authx/pull/730) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.9.1 to v0.9.2. PR [#729](https://github.com/yezz123/authx/pull/729) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.8.6 to v0.9.1. PR [#728](https://github.com/yezz123/authx/pull/728) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump `ruff-pre-commit` from v0.8.4 to v0.8.6 . PR [#727](https://github.com/yezz123/authx/pull/727) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump astral-sh/setup-uv from 4 to 5. PR [#724](https://github.com/yezz123/authx/pull/724) by [@dependabot[bot]](https://github.com/apps/dependabot).
### Docs
* 📝 add [@lmasikl](https://github.com/lmasikl) as a contributor for code. PR [#742](https://github.com/yezz123/authx/pull/742) by [@allcontributors[bot]](https://github.com/apps/allcontributors).
* 📝 Update documentation for AuthX. PR [#737](https://github.com/yezz123/authx/pull/737) by [@yezz123](https://github.com/yezz123).
### Internal
* ✨ Add examples for AuthX authentication. PR [#738](https://github.com/yezz123/authx/pull/738) by [@yezz123](https://github.com/yezz123).
* 🏗️ Update CI workflow to use only Ubuntu. PR [#736](https://github.com/yezz123/authx/pull/736) by [@yezz123](https://github.com/yezz123).
* 🔧 Remove requirements from the build target. PR [#726](https://github.com/yezz123/authx/pull/726) by [@yezz123](https://github.com/yezz123).
* 🔧 Update release workflow to use `astral-sh/setup-uv`. PR [#723](https://github.com/yezz123/authx/pull/723) by [@yezz123](https://github.com/yezz123).
## 1.4.1
### Features
* ✨ Support async sessions. PR [#709](https://github.com/yezz123/authx/pull/709) by [@callamd](https://github.com/callamd).
### Internal
* 🗑️ Deprecate MemoryIO class and associated tests. PR [#721](https://github.com/yezz123/authx/pull/721) by [@yezz123](https://github.com/yezz123).
* 🔧 Enhance the Project Dependencies to be fully migrated to UV. PR [#711](https://github.com/yezz123/authx/pull/711) by [@yezz123](https://github.com/yezz123).
### Docs
* 📝 Update installation instructions to simplify pip install command. PR [#722](https://github.com/yezz123/authx/pull/722) by [@yezz123](https://github.com/yezz123).
* 📝 Update documentation for installation and command usage. PR [#718](https://github.com/yezz123/authx/pull/718) by [@yezz123](https://github.com/yezz123).
* 📝 add [@callamd](https://github.com/callamd) as a contributor for code. PR [#710](https://github.com/yezz123/authx/pull/710) by [@allcontributors[bot]](https://github.com/apps/allcontributors).
### Upgrades
* ⬆️️ Bump `ruff-pre-commit` from v0.8.1 to v0.8.4. PR [#719](https://github.com/yezz123/authx/pull/719) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️ Bump ruff-pre-commit from v0.8.0 to v0.8.1. PR [#707](https://github.com/yezz123/authx/pull/707) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump the python-packages group with 17 updates. PR [#706](https://github.com/yezz123/authx/pull/706) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️ Bump `ruff-pre-commit` from v0.7.4 to v0.8.0. PR [#705](https://github.com/yezz123/authx/pull/705) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️ Bump `ruff-pre-commit` from v0.7.3 to v0.7.4. PR [#703](https://github.com/yezz123/authx/pull/703) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump codecov/codecov-action from 4 to 5. PR [#701](https://github.com/yezz123/authx/pull/701) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️ Bump `ruff-pre-commit` from v0.7.2 to v0.7.3. PR [#699](https://github.com/yezz123/authx/pull/699) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump pypa/gh-action-pypi-publish from 1.11.0 to 1.12.2. PR [#697](https://github.com/yezz123/authx/pull/697) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️ Upgrade the uv setup package to 4.1. PR [#700](https://github.com/yezz123/authx/pull/700) by [@yezz123](https://github.com/yezz123).
* ⬆️️ Bump `ruff-pre-commit` from v0.7.1 to v0.7.2. PR [#696](https://github.com/yezz123/authx/pull/696) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump the python-packages group with 8 updates. PR [#694](https://github.com/yezz123/authx/pull/694) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️ Bump pypa/gh-action-pypi-publish from 1.10.3 to 1.11.0. PR [#695](https://github.com/yezz123/authx/pull/695) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️ Bump `ruff-pre-commit` from v0.7.0 to v0.7.1. PR [#693](https://github.com/yezz123/authx/pull/693) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump the python-packages group with 7 updates. PR [#692](https://github.com/yezz123/authx/pull/692) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️ Bump `ruff-pre-commit` from v0.6.9 to v0.7.0. PR [#690](https://github.com/yezz123/authx/pull/690) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️ Bump the python-packages group with 12 updates. PR [#689](https://github.com/yezz123/authx/pull/689) by [@dependabot[bot]](https://github.com/apps/dependabot).
## 1.4.0
__Note:__ This release contains breaking changes: it drops support for Python 3.8 and adds support for Python 3.13.
### Breaking Changes
* 🔧 Add support for Python 3.13. PR [#687](https://github.com/yezz123/authx/pull/687) by [@yezz123](https://github.com/yezz123).
### Fixes
* 🐛 fix Missing `Audience` & `Issuer` in the verify token. PR [#686](https://github.com/yezz123/authx/pull/686) by [@yezz123](https://github.com/yezz123).
### Upgrades
* ⬆️️ Bump starlette from 0.39.2 to 0.40.0. PR [#688](https://github.com/yezz123/authx/pull/688) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️ Bump the python-packages group with 16 updates. PR [#685](https://github.com/yezz123/authx/pull/685) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️ Bump the Pre-commit Dependencies Version. PR [#684](https://github.com/yezz123/authx/pull/684) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️ Bump pypa/gh-action-pypi-publish from 1.10.2 to 1.10.3. PR [#681](https://github.com/yezz123/authx/pull/681) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️ Update dependencies in `uv.lock`. PR [#680](https://github.com/yezz123/authx/pull/680) by [@yezz123](https://github.com/yezz123).
* ⬆️️ Bump `ruff-pre-commit` from v0.6.7 to v0.6.8. PR [#678](https://github.com/yezz123/authx/pull/678) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
### Docs
* 📝 Fix Pydantic Version in the documentation. PR [#679](https://github.com/yezz123/authx/pull/679) by [@yezz123](https://github.com/yezz123).
### Internal
* ♻️ Refactor pyproject to add dev dependencies in uv tool. PR [#676](https://github.com/yezz123/authx/pull/676) by [@yezz123](https://github.com/yezz123).
## 1.3.1
### Upgrades
* ⬆️️️ Bump `ruff-pre-commit` from v0.6.5 to v0.6.7. PR [#673](https://github.com/yezz123/authx/pull/673) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️ Bump pypa/gh-action-pypi-publish from 1.10.1 to 1.10.2. PR [#672](https://github.com/yezz123/authx/pull/672) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️ Bump ruff-pre-commit from v0.6.4 to v0.6.5. PR [#671](https://github.com/yezz123/authx/pull/671) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️ Bump ruff-pre-commit from v0.6.3 to v0.6.4. PR [#670](https://github.com/yezz123/authx/pull/670) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️ Bump pypa/gh-action-pypi-publish from 1.10.0 to 1.10.1. PR [#669](https://github.com/yezz123/authx/pull/669) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️️ Bump cryptography from 43.0.0 to 43.0.1. PR [#668](https://github.com/yezz123/authx/pull/668) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️ Bump ruff-pre-commit from v0.6.2 to v0.6.3. PR [#667](https://github.com/yezz123/authx/pull/667) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️ Bump ruff-pre-commit from v0.6.1 to v0.6.2. PR [#662](https://github.com/yezz123/authx/pull/662) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️ Bump pypa/gh-action-pypi-publish from 1.9.0 to 1.10.0. PR [#665](https://github.com/yezz123/authx/pull/665) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️ Bump ruff-pre-commit from v0.5.7 to v0.6.1. PR [#659](https://github.com/yezz123/authx/pull/659) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️ Bump ruff-pre-commit from v0.5.6 to v0.5.7. PR [#656](https://github.com/yezz123/authx/pull/656) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️ Bump ruff-pre-commit from v0.5.5 to v0.5.6. PR [#654](https://github.com/yezz123/authx/pull/654) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️ Bump the python-packages group with 5 updates. PR [#652](https://github.com/yezz123/authx/pull/652) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️ Bump ruff-pre-commit from v0.5.4 to v0.5.5. PR [#653](https://github.com/yezz123/authx/pull/653) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️️ Update uv version to `0.2.30`. PR [#650](https://github.com/yezz123/authx/pull/650) by [@pinchXOXO](https://github.com/pinchXOXO).
* ⬆️️️ Bump ruff-pre-commit from v0.5.2 to v0.5.4. PR [#648](https://github.com/yezz123/authx/pull/648) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
### Docs
* ♻️ Refactor documentation requirements. PR [#674](https://github.com/yezz123/authx/pull/674) by [@yezz123](https://github.com/yezz123).
* 📝 add [@jor-rit](https://github.com/jor-rit) as a contributor for platform. PR [#666](https://github.com/yezz123/authx/pull/666) by [@allcontributors[bot]](https://github.com/apps/allcontributors).
* 🍱 Optimize images sizes. PR [#651](https://github.com/yezz123/authx/pull/651) by [@imgbot[bot]](https://github.com/apps/imgbot).
* 📝 Update image URL in different pages & Fix naming typo. PR [#649](https://github.com/yezz123/authx/pull/649) by [@yezz123](https://github.com/yezz123).
* 🔧 Re-organize `mkdocs` configuration & add new plugins. PR [#646](https://github.com/yezz123/authx/pull/646) by [@yezz123](https://github.com/yezz123).
### Internal
* :wrench: Drop Python 3.8 version matrix in CI workflow. PR [#675](https://github.com/yezz123/authx/pull/675) by [@yezz123](https://github.com/yezz123).
* 🐛 add `itsdangerous` to project dependencies level. PR [#664](https://github.com/yezz123/authx/pull/664) by [@jor-rit](https://github.com/jor-rit).
## 1.3.0
### Features
* 🔧 Support both pydantic versions. PR [#638](https://github.com/yezz123/authx/pull/638) by [@yezz123](https://github.com/yezz123).
### Fixes
* 🐛 fix unsupported comparison between two type instances. PR [#643](https://github.com/yezz123/authx/pull/643) by [@yezz123](https://github.com/yezz123).
* 🐛 add support for additional data in JWT token creation and decoding. PR [#641](https://github.com/yezz123/authx/pull/641) by [@yezz123](https://github.com/yezz123).
### Refactors
* ♻️ Refactor `_CallbackHandler` to handle optional model instances. PR [#640](https://github.com/yezz123/authx/pull/640) by [@yezz123](https://github.com/yezz123).
### Upgrades
* ⬆️️️ Bump ruff-pre-commit from v0.5.1 to v0.5.2. PR [#644](https://github.com/yezz123/authx/pull/644) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️️ Update uv version to `0.2.24`. PR [#642](https://github.com/yezz123/authx/pull/642) by [@pinchXOXO](https://github.com/pinchXOXO).
* ⬆️️️ Bump `ruff-pre-commit` from v0.5.0 to v0.5.1. PR [#639](https://github.com/yezz123/authx/pull/639) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
## 1.2.1
### Upgrades
* ⬆️️️️ Bump certifi from `2024.6.2` to `2024.7.4`. PR [#637](https://github.com/yezz123/authx/pull/637) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️️ Update pydantic and pydantic-core versions to 2.8.0 and 2.20.0. PR [#635](https://github.com/yezz123/authx/pull/635) by [@yezz123](https://github.com/yezz123).
* ⬆️️️️ Bump `ruff-pre-commit` from v0.4.10 to v0.5.0. PR [#634](https://github.com/yezz123/authx/pull/634) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️️ Bump `uv` version to `0.2.18`. PR [#632](https://github.com/yezz123/authx/pull/632) by [@pg365](https://github.com/pg365).
* ⬆️️️️ Bump `ruff-pre-commit` from v0.4.9 to v0.4.10. PR [#627](https://github.com/yezz123/authx/pull/627) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
### Docs
* 📝 add [@pg365](https://github.com/pg365) as a contributor for Dependencies. PR [#633](https://github.com/yezz123/authx/pull/633) by [@allcontributors[bot]](https://github.com/apps/allcontributors).
### Internal
* ♻️ Refactor CSRF token retrieval. PR [#636](https://github.com/yezz123/authx/pull/636) by [@yezz123](https://github.com/yezz123).
* 🐛 Add edge cases Tests for error module. PR [#631](https://github.com/yezz123/authx/pull/631) by [@yezz123](https://github.com/yezz123).
* ♻️ Test Edge cases & Refactor `SignatureSerializer` & `_CallbackHandler` tests. PR [#625](https://github.com/yezz123/authx/pull/625) by [@yezz123](https://github.com/yezz123).
* 🔧 automate Labeling PRs by Dependabot. PR [#628](https://github.com/yezz123/authx/pull/628) by [@yezz123](https://github.com/yezz123).
## 1.2.0
### Features
* 🔧 Update Python versions for `mypy` workflow. PR [#603](https://github.com/yezz123/authx/pull/603) by [@yezz123](https://github.com/yezz123).
### Refactors
* 🔧 Remove `no-untyped-def` from disabled error codes. PR [#621](https://github.com/yezz123/authx/pull/621) by [@yezz123](https://github.com/yezz123).
* 🔧 Remove `arg-type` from disabled error codes. PR [#619](https://github.com/yezz123/authx/pull/619) by [@yezz123](https://github.com/yezz123).
* 🔧 Remove `dict-item` from disabled error codes. PR [#617](https://github.com/yezz123/authx/pull/617) by [@yezz123](https://github.com/yezz123).
* 🔧 Remove `call-arg` from disabled error codes. PR [#616](https://github.com/yezz123/authx/pull/616) by [@yezz123](https://github.com/yezz123).
* 🔧 Remove return-value from disabled error codes. PR [#615](https://github.com/yezz123/authx/pull/615) by [@yezz123](https://github.com/yezz123).
* 🔧 Remove `call-overload` from disabled error codes. PR [#613](https://github.com/yezz123/authx/pull/613) by [@yezz123](https://github.com/yezz123).
* 🔧 Remove `type-arg` from disabled error codes. PR [#612](https://github.com/yezz123/authx/pull/612) by [@yezz123](https://github.com/yezz123).
* 🐛 remove `print()` in the release file. PR [#594](https://github.com/yezz123/authx/pull/594) by [@pinchXOXO](https://github.com/pinchXOXO).
### Upgrades
* ⬆️️️️ Bump urllib3 from 2.2.1 to 2.2.2 in /requirements. PR [#622](https://github.com/yezz123/authx/pull/622) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️️ Bump `ruff-pre-commit` from v0.4.8 to v0.4.9. PR [#623](https://github.com/yezz123/authx/pull/623) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️️ Bump pypa/gh-action-pypi-publish from 1.8.14 to 1.9.0. PR [#620](https://github.com/yezz123/authx/pull/620) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️️ Bump `ruff-pre-commit` from v0.4.7 to v0.4.8. PR [#601](https://github.com/yezz123/authx/pull/601) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️️ Bump the python-packages group with 2 updates. PR [#600](https://github.com/yezz123/authx/pull/600) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️️ Bump `ruff-pre-commit` from v0.4.5 to v0.4.7. PR [#598](https://github.com/yezz123/authx/pull/598) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️️ [pre-commit.ci] pre-commit autoupdate. PR [#597](https://github.com/yezz123/authx/pull/597) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️️ Bump the python-packages group with 4 updates. PR [#596](https://github.com/yezz123/authx/pull/596) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ⬆️️️️ Update uv version to `0.2.3`. PR [#595](https://github.com/yezz123/authx/pull/595) by [@pinchXOXO](https://github.com/pinchXOXO).
### Docs
* 📝 Update meta information. PR [#614](https://github.com/yezz123/authx/pull/614) by [@yezz123](https://github.com/yezz123).
### Internal
* 🔨 Integrate `mypy` and enhance `typing` in code. PR [#532](https://github.com/yezz123/authx/pull/532) by [@yezz123](https://github.com/yezz123).
* 🔧 Separate the testing CI from the release CI. PR [#593](https://github.com/yezz123/authx/pull/593) by [@yezz123](https://github.com/yezz123).
## 1.1.3
### Features
* ♻️ Replaces use of default mutable arguments. PR [#589](https://github.com/yezz123/authx/pull/589) by [@yezz123](https://github.com/yezz123).
### Upgrades
* Bump requests from 2.31.0 to 2.32.2 in /requirements. PR [#592](https://github.com/yezz123/authx/pull/592) by [@dependabot[bot]](https://github.com/apps/dependabot).
* ☔ [pre-commit.ci] pre-commit autoupdate. PR [#590](https://github.com/yezz123/authx/pull/590) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ☔ [pre-commit.ci] pre-commit autoupdate. PR [#585](https://github.com/yezz123/authx/pull/585) by [@pre-commit-ci[bot]](https://github.com/apps/pre-commit-ci).
* ⬆️️️️ Bump jinja2 from 3.1.3 to 3.1.4 in /requirements. PR [#584](https://github.com/yezz123/authx/pull/584) b | text/markdown | null | Yasser Tahiri <hello@yezz.me> | null | null | null | Authentication, Cookie, FastAPI, JWT, Oauth2, Pydantic | [
"Development Status :: 4 - Beta",
"Framework :: AsyncIO",
"Framework :: FastAPI",
"Framework :: Pydantic",
"Framework :: Pydantic :: 2",
"Intended Audience :: Developers",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
... | [] | null | null | >=3.9 | [] | [] | [] | [
"fastapi>=0.111.0",
"itsdangerous<3.0.0,>=2.2.0",
"pydantic-settings>=2.1.0",
"pydantic<3.0.0,>=2.10.5",
"pyjwt[crypto]<3.0.0,>=2.6.0",
"python-dateutil<3.0.0,>=2.8",
"python-jose<4.0.0,>=3.3.0",
"pytz<2026.0,>=2023.3"
] | [] | [] | [] | [
"Homepage, https://github.com/yezz123/authx",
"Documentation, https://authx.yezz.me/",
"Funding, https://github.com/sponsors/yezz123",
"Source, https://github.com/yezz123/authx",
"Changelog, https://authx.yezz.me/release/"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:36:31.522821 | authx-1.5.1.tar.gz | 69,062 | f7/9c/04cf55319d1c4ce886c6775acaa6e8c79d6e86d997e95acbcc140360a58d/authx-1.5.1.tar.gz | source | sdist | null | false | 06c7a721cdbb97f0c183e096a87732fa | 43c370d62fbe5b32167859cb988366856666f6cc2892ddf9715f1dfe660520d9 | f79c04cf55319d1c4ce886c6775acaa6e8c79d6e86d997e95acbcc140360a58d | MIT | [
"LICENSE"
] | 551 |
2.4 | pyliebherrhomeapi | 0.3.0 | Python library for Liebherr Home API | # pyliebherrhomeapi
[](https://github.com/mettolen/pyliebherrhomeapi/actions/workflows/ci.yml)
[](https://badge.fury.io/py/pyliebherrhomeapi)
[](https://pypi.org/project/pyliebherrhomeapi/)
[](https://codecov.io/gh/mettolen/pyliebherrhomeapi)
[](https://pypi.org/project/pyliebherrhomeapi/)
[](https://github.com/mettolen/pyliebherrhomeapi/blob/main/LICENSE)
Python library for the [Liebherr SmartDevice Home API](https://developer.liebherr.com/apis/smartdevice-homeapi/).
## Features
- 🔌 **Async/await support** using asyncio with comprehensive error handling
- 🌡️ **Temperature control** for all zones in your Liebherr appliances
- ❄️ **SuperFrost/SuperCool** control for quick cooling/freezing
- 🎉 **Special modes** (Party Mode, Night Mode, Presentation Light)
- 🧊 **Ice maker control** with Max Ice support
- 💧 **HydroBreeze and BioFreshPlus** mode management
- 🚪 **Auto door** control for supported appliances
- 📱 **Device management** - list and query all connected appliances
- 🛡️ **Type hints** for better IDE support and development experience
- ✅ **Input validation** with proper error handling
- 📊 **Comprehensive data models** for all control types
- 📝 **Configurable logging** with privacy-focused debug output
- 🧪 **100% test coverage** ensuring reliability and code quality
## Requirements
- Python 3.11+ (matches the typed codebase and test matrix)
- Asyncio environment with `aiohttp` (installed automatically)
- Network access to `https://home-api.smartdevice.liebherr.com`
## Installation
- From PyPI:
```bash
pip install pyliebherrhomeapi
```
- From source (current repository):
```bash
pip install .
```
## Prerequisites
Before using this library, you need:
1. **Connect your appliance**: Connect your Liebherr appliance via the [SmartDevice app](https://smartdevice.onelink.me/OrY5/8neax8lp) to your home WiFi network
- [Download the SmartDevice app](https://smartdevice.onelink.me/OrY5/8neax8lp)
- [Instructions for connecting your appliance](https://go.liebherr.com/cb2ct1)
2. **Get your API Key** (via the SmartDevice app):
- Go to **Settings** in the SmartDevice app
- Select **"Beta features"**
- Activate the **HomeAPI**
- Copy the API Key (⚠️ **Important**: The API key can only be copied once. Once you leave the screen, it cannot be copied again. If you forget your key, you'll need to create a new one via the app)
3. **Connected appliances only**: Only appliances that are connected to the internet via the SmartDevice app can be accessed through the HomeAPI. Appliances that are only registered but not connected will not appear
## Quick Start
```python
import asyncio
from pyliebherrhomeapi import (
LiebherrClient,
TemperatureUnit,
IceMakerMode,
)
async def main():
# Create client with your API key
async with LiebherrClient(api_key="your-api-key-here") as client:
# Get all devices (only connected devices are returned)
devices = await client.get_devices()
print(f"Found {len(devices)} device(s)")
for device in devices:
# device_id is the serial number of the appliance
print(f"Device: {device.nickname} ({device.device_id})")
print(f" Type: {device.device_type}")
print(f" Model: {device.device_name}")
# Get all controls for this device
controls = await client.get_controls(device.device_id)
print(f" Controls: {len(controls)}")
if __name__ == "__main__":
asyncio.run(main())
```
## Important Notes
### Device Zones
- Each device has at least one zone (cooling zone, freezing zone, etc.)
- **Zone numbering**: The top zone is zone 0, zone numbers ascend from top to bottom
- Zone controls (like temperature, SuperFrost, SuperCool) always require a `zone_id`
- Base controls (like Party Mode, Night Mode) apply to the whole device and don't need a zone
### Polling Recommendations
⚠️ **Beta Version Notice**: The API currently doesn't push updates, so endpoints need to be polled regularly.
**Recommended polling intervals:**
- **Controls**: Poll every 30 seconds using `/v1/devices/{deviceId}/controls` to get all states in one call
- **Device list**: Poll manually only when appliances are added/removed or nicknames change
- **Rate limits**: Be mindful of API call limits. Avoid too many calls at once as there are restrictions for security and performance
### Control Types
**Base Controls** (apply to entire device, no `zone_id` needed):
- Party Mode
- Night Mode
**Zone Controls** (require `zone_id`, even if device has only one zone):
- Temperature
- SuperFrost
- SuperCool
- Ice Maker
- HydroBreeze
- BioFreshPlus
- Auto Door
## Usage Examples
### Temperature Control
```python
from pyliebherrhomeapi import LiebherrClient, TemperatureUnit
async with LiebherrClient(api_key="your-api-key") as client:
# Set temperature for zone 0 (top zone) to 4°C
await client.set_temperature(
device_id="12.345.678.9",
zone_id=0, # Zone 0 is the top zone
target=4,
unit=TemperatureUnit.CELSIUS
)
# Get temperature control info
controls = await client.get_control(
device_id="12.345.678.9",
control_name="temperature",
zone_id=0
)
```
### SuperCool and SuperFrost
```python
# Enable SuperCool for zone 0
await client.set_super_cool(
device_id="12.345.678.9",
zone_id=0,
value=True
)
# Enable SuperFrost for zone 1
await client.set_super_frost(
device_id="12.345.678.9",
zone_id=1,
value=True
)
```
### Special Modes
```python
# Enable Party Mode
await client.set_party_mode(
device_id="12.345.678.9",
value=True
)
# Enable Night Mode
await client.set_night_mode(
device_id="12.345.678.9",
value=True
)
# Set presentation light intensity (0-5)
await client.set_presentation_light(
device_id="12.345.678.9",
target=3
)
```
### Ice Maker Control
```python
from pyliebherrhomeapi import IceMakerMode
# Turn on ice maker
await client.set_ice_maker(
device_id="12.345.678.9",
zone_id=0,
mode=IceMakerMode.ON
)
# Enable Max Ice mode
await client.set_ice_maker(
device_id="12.345.678.9",
zone_id=0,
mode=IceMakerMode.MAX_ICE
)
```
### HydroBreeze Control
```python
from pyliebherrhomeapi import HydroBreezeMode
# Set HydroBreeze to medium
await client.set_hydro_breeze(
device_id="12.345.678.9",
zone_id=0,
mode=HydroBreezeMode.MEDIUM
)
```
### BioFreshPlus Control
```python
from pyliebherrhomeapi import BioFreshPlusMode
# Set BioFreshPlus mode
await client.set_bio_fresh_plus(
device_id="12.345.678.9",
zone_id=0,
mode=BioFreshPlusMode.ZERO_ZERO
)
```
### Auto Door Control
```python
# Open the door
await client.trigger_auto_door(
device_id="12.345.678.9",
zone_id=0,
value=True # True to open, False to close
)
```
### Query Device Controls
```python
# Get all controls (recommended for polling - gets all states in one call)
all_controls = await client.get_controls(device_id="12.345.678.9")
# Get specific control by name
temp_controls = await client.get_control(
device_id="12.345.678.9",
control_name="temperature"
)
# Get control for specific zone
zone_temp = await client.get_control(
device_id="12.345.678.9",
control_name="temperature",
zone_id=0 # Top zone
)
```
### Efficient Polling Pattern
```python
import asyncio
from pyliebherrhomeapi import LiebherrClient
async def poll_device_state(client: LiebherrClient, device_id: str):
"""Poll device state every 30 seconds (recommended interval)."""
while True:
try:
# Get all controls in a single API call
device_state = await client.get_device_state(device_id)
# Process the controls
for control in device_state.controls:
print(f"{control.name}: {control}")
# Wait 30 seconds before next poll (recommended by Liebherr)
await asyncio.sleep(30)
except Exception as e:
print(f"Error polling device: {e}")
await asyncio.sleep(30)
async def main():
async with LiebherrClient(api_key="your-api-key") as client:
devices = await client.get_devices()
if devices:
await poll_device_state(client, devices[0].device_id)
```
## Logging
The library uses Python's standard `logging` module for diagnostics. By default, it uses a `NullHandler`, so no logs are emitted unless you configure logging in your application.
### Enable Debug Logging
```python
import logging
# Enable debug logging for the library
logging.basicConfig(
level=logging.DEBUG,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
# Or configure just for pyliebherrhomeapi
logger = logging.getLogger('pyliebherrhomeapi')
logger.setLevel(logging.DEBUG)
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
logger.addHandler(handler)
```
### Log Levels
- **DEBUG**: Detailed information about API requests, responses, and session lifecycle
- **INFO**: General information about operations (currently not used)
- **WARNING**: HTTP errors and connection issues
- **ERROR**: Severe errors like server failures
### Privacy
Device IDs are automatically masked in debug logs (showing only last 4 characters) to protect sensitive information.
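As a rough illustration of this kind of masking — a hypothetical helper, not the library's actual internal function — a device ID can be reduced to its last four characters before it ever reaches a log record:

```python
def mask_device_id(device_id: str, visible: int = 4) -> str:
    """Replace all but the last `visible` characters of a device ID
    with asterisks, so full serial numbers never appear in logs."""
    if len(device_id) <= visible:
        return "*" * len(device_id)
    return "*" * (len(device_id) - visible) + device_id[-visible:]

print(mask_device_id("12.345.678.9"))  # → ********78.9
```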
## Error Handling
```python
from pyliebherrhomeapi import (
LiebherrClient,
LiebherrAuthenticationError,
LiebherrBadRequestError,
LiebherrNotFoundError,
LiebherrPreconditionFailedError,
LiebherrUnsupportedError,
LiebherrConnectionError,
LiebherrTimeoutError,
)
async with LiebherrClient(api_key="your-api-key") as client:
try:
await client.set_temperature(
device_id="12.345.678.9",
zone_id=0,
target=4
)
except LiebherrAuthenticationError:
print("Invalid API key")
except LiebherrBadRequestError as e:
print(f"Invalid request: {e}")
except LiebherrNotFoundError:
print("Device not reachable")
except LiebherrPreconditionFailedError:
print("Device not onboarded to your household")
except LiebherrUnsupportedError:
print("Feature not supported on this device")
except (LiebherrConnectionError, LiebherrTimeoutError) as e:
print(f"Connection error: {e}")
```
## Development
### Setup
```bash
# Clone the repository
git clone https://github.com/mettolen/pyliebherrhomeapi.git
cd pyliebherrhomeapi
# Install development dependencies
pip install -e ".[dev]"
```
### Testing
```bash
# Run tests
pytest
# Run tests with coverage
pytest --cov=pyliebherrhomeapi --cov-report=html
```
### Code Quality
```bash
# Format code
ruff format .
# Lint code
ruff check .
# Type checking
mypy src
```
## API Documentation
For detailed API documentation, visit:
- [SmartDevice HomeAPI Overview](https://developer.liebherr.com/apis/smartdevice-homeapi/)
- [Swagger UI Documentation](https://developer.liebherr.com/apis/smartdevice-homeapi/swagger-ui/)
- [Release Notes](https://developer.liebherr.com/apis/smartdevice-homeapi/releasenotes/)
**API Base URL**: `https://home-api.smartdevice.liebherr.com`
## Implementation Notes
This client library is generated based on the official `openapi.json` specification downloaded from the Liebherr Developer Portal, which reflects the latest API state. When Liebherr updates their API and releases a new version of the OpenAPI specification, this client library will be updated accordingly to maintain compatibility and support new features.
## License
MIT License - see [LICENSE](LICENSE) file for details.
## Contributing
Contributions are welcome! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for details.
| text/markdown | null | Mario Mett <mario@mett.ee> | null | null | MIT | liebherr, home-automation, api | [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Home Automation"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"aiohttp>=3.9.0",
"pytest>=7.0; extra == \"dev\"",
"pytest-asyncio>=0.21.0; extra == \"dev\"",
"pytest-cov>=4.0; extra == \"dev\"",
"pytest-timeout>=2.1.0; extra == \"dev\"",
"ruff>=0.1.0; extra == \"dev\"",
"mypy>=1.0; extra == \"dev\"",
"build>=1.0.0; extra == \"dev\"",
"twine>=4.0.0; extra == \"d... | [] | [] | [] | [
"Homepage, https://github.com/mettolen/pyliebherrhomeapi",
"Repository, https://github.com/mettolen/pyliebherrhomeapi",
"Issues, https://github.com/mettolen/pyliebherrhomeapi/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:36:24.473219 | pyliebherrhomeapi-0.3.0.tar.gz | 29,389 | 98/67/afbb0eef65b6bbb227098e8dc2b44678ab90de9bf8a36abc208780fc39ac/pyliebherrhomeapi-0.3.0.tar.gz | source | sdist | null | false | 49e5eeb75a18017ca693e050e89f3cc0 | 1bbb33415f162b8b1cf5bcc2a4909ca278efe98a461741de84ce9840ba45a5a5 | 9867afbb0eef65b6bbb227098e8dc2b44678ab90de9bf8a36abc208780fc39ac | null | [
"LICENSE"
] | 867 |
2.4 | pylife | 2.2.1 | General Fatigue library | # pyLife – a general library for fatigue and reliability
[](https://mybinder.org/v2/gh/boschresearch/pylife/develop?labpath=demos%2Findex.ipynb)
[](https://pylife.readthedocs.io/en/latest/?badge=latest)
[](https://pypi.org/project/pylife/)

[](https://github.com/boschresearch/pylife/actions/workflows/pytest.yml)
pyLife is an Open Source Python library providing state-of-the-art algorithms for the
lifetime assessment of mechanical components subjected to fatigue.
## Purpose of the project
This library was originally compiled at [Bosch
Research](https://www.bosch.com/research/) to collect algorithms needed by
different in-house software projects that deal with lifetime prediction and
material fatigue on a component level. In order to further extend and
scrutinize it, we decided to release it as Open Source. Read [this
article](https://www.bosch.com/stories/bringing-open-source-to-mechanical-engineering/)
about pyLife's origin.
We therefore welcome collaboration not only from science and education but also
from commercial companies working in this field. We also recommend this library
to university teachers for educational purposes.
The company [Viktor](https://viktor.ai) has set up a [web application for Wöhler
test analysis](https://cloud.viktor.ai/public/wohler-fatigue-test-analysis)
based on pyLife code.
## Status
pyLife-2.1.x is the current release that you get by default. We are making small
improvements in the pyLife-2.1.x branch (`master`) while developing the more
substantial features in the 2.2.x branch (`develop`).
The main new feature of the 2.2.x branch is FKM functionality. As that is
quite a comprehensive addition, we need some time to get it right
before we can make it the default release.
Once 2.2.x is released, we will probably switch to single-branch development.
## Contents
There are/will be the following subpackages:
* `stress` everything related to stress calculation
* equivalent stress
* stress gradient calculation
* rainflow counting
* ...
* `strength` everything related to strength calculation
* failure probability estimation
* S-N-calculations
* local strain concept: FKM guideline nonlinear
* ...
* `mesh` FEM mesh related stuff
* stress gradients
* FEM-mapping
* hotspot detection
* `util` all the more general utilities
* ...
* `materialdata` analysis of material testing data
* Wöhler (SN-curve) data analysis
* `materiallaws` modeling material behavior
* Ramberg Osgood
* Wöhler curves
* `vmap` an interface to [VMAP](https://www.vmap.eu.com/)
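As a small illustration of the `materiallaws` topic above, the Ramberg–Osgood relation splits total strain into an elastic and a plastic part, ε = σ/E + (σ/K)^(1/n). The following plain-Python sketch implements just that formula — independent of pyLife's own `materiallaws` API, with illustrative steel-like parameters that are assumptions, not pyLife defaults:

```python
def ramberg_osgood_strain(stress: float, E: float, K: float, n: float) -> float:
    """Total strain = elastic part (Hooke's law) + plastic part (power law)."""
    return stress / E + (stress / K) ** (1.0 / n)

# Illustrative parameters (assumed, stress and K in MPa, E in MPa):
# E = 206 000, cyclic hardening coefficient K = 1184, exponent n = 0.187
strain = ramberg_osgood_strain(400.0, 206_000.0, 1184.0, 0.187)
print(f"total strain at 400 MPa: {strain:.4f}")
```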
## License
pyLife is open-sourced under the Apache-2.0 license. See the
[LICENSE](LICENSE) file for details.
For a list of other open source components included in pyLife, see the
file [3rd-party-licenses.txt](3rd-party-licenses.txt).
| text/markdown; charset=UTF-8 | null | "pyLife developer team @ Bosch Research" <johannes.mueller4@de.bosch.com> | null | null | Apache-2 | null | [
"Development Status :: 4 - Beta",
"Programming Language :: Python :: 3",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent"
] | [
"any"
] | null | null | >=3.9 | [] | [] | [] | [
"numpy>=1.23.5",
"scipy",
"pandas>=2.2",
"h5py!=3.7.0",
"pickleshare>=0.7.5; python_full_version < \"3.11\"",
"tsfresh>=0.21.0; extra == \"tsfresh\"",
"numba>=0.60; extra == \"tsfresh\"",
"tsfresh>=0.21.0; extra == \"all\"",
"numba>=0.60; extra == \"all\"",
"jupyter; extra == \"extras\"",
"matpl... | [] | [] | [] | [
"Homepage, https://github.com/boschresearch/pylife/",
"Documentation, https://pylife.readthedocs.io",
"Source, https://github.com/boschresearch/pylife/",
"Changelog, https://pylife.readthedocs.io/en/stable/CHANGELOG.html",
"Tracker, https://github.com/boschresearch/pylife/issues",
"Download, https://pypi.... | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:36:19.162622 | pylife-2.2.1.tar.gz | 11,675,111 | 21/be/c0dfd5c8c410a0e951e6953a466a3f6e8c054c66b322989cf664d48f3977/pylife-2.2.1.tar.gz | source | sdist | null | false | d1e4fe07cd821228b08157daff80ab42 | 4794c5db736e850d5a83ebd2140b3ee35fc808ff7a33543bdb9124efc7bbc4e8 | 21bec0dfd5c8c410a0e951e6953a466a3f6e8c054c66b322989cf664d48f3977 | null | [
"LICENSE",
"NOTICE"
] | 1,760 |
2.4 | openlit | 1.36.9 | OpenTelemetry-native Auto instrumentation library for monitoring LLM Applications and GPUs, facilitating the integration of observability into your GenAI-driven projects | <div align="center">
<img src="https://github.com/openlit/.github/blob/main/profile/assets/wide-logo-no-bg.png?raw=true" alt="OpenLIT Logo" width="30%">
<h3>OpenTelemetry-native</h3>
<h1>AI Observability, Evaluation and Guardrails Framework</h1>
**[Documentation](https://docs.openlit.io/) | [Quickstart](#-getting-started-with-llm-observability) | [Roadmap](#️-roadmap) | [Feature Request](https://github.com/openlit/openlit/issues/new?assignees=&labels=%3Araised_hand%3A+Up+for+Grabs%2C+%3Arocket%3A+Feature&projects=&template=feature-request.md&title=%5BFeat%5D%3A) | [Report a Bug](https://github.com/openlit/openlit/issues/new?assignees=&labels=%3Abug%3A+Bug%2C+%3Araised_hand%3A+Up+for+Grabs&projects=&template=bug.md&title=%5BBug%5D%3A)**
[](https://github.com/openlit/openlit)
[](https://github.com/openlit/openlit/blob/main/LICENSE)
[](https://pepy.tech/project/openlit)
[](https://github.com/openlit/openlit/pulse)
[](https://github.com/openlit/openlit/graphs/contributors)
[](https://join.slack.com/t/openlit/shared_invite/zt-2etnfttwg-TjP_7BZXfYg84oAukY8QRQ)
[](https://twitter.com/openlit_io)

</div>
OpenLIT SDK is a monitoring framework built on top of **OpenTelemetry** that gives you complete observability for your AI stack, from LLMs to vector databases and GPUs, with just one line of code for tracing and metrics. It also lets you send the generated traces and metrics to your existing monitoring tools like Grafana, New Relic, and more.
This project proudly follows and maintains the [Semantic Conventions](https://github.com/open-telemetry/semantic-conventions/tree/main/docs/gen-ai) with the OpenTelemetry community, consistently updating to align with the latest standards in Observability.
## ⚡ Features
- 🔎 **Auto Instrumentation**: Works with 50+ LLM providers, Agents, Vector databases, and GPUs with just one line of code.
- 🔭 **OpenTelemetry-Native Observability SDKs**: Vendor-neutral SDKs that can send traces and metrics to your existing observability tools like Prometheus and Jaeger.
- 💲 **Cost Tracking for Custom and Fine-Tuned Models**: Pass custom pricing files for accurate budgeting of custom and fine-tuned models.
- 🚀 **Support for OpenLIT Features**: Includes support for prompt management and secrets management features available in OpenLIT.
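The cost-tracking idea behind a custom pricing file can be sketched in plain Python. The JSON schema and the per-1k-token field names below are hypothetical illustrations, not OpenLIT's actual `pricing.json` format:

```python
# Sketch: estimating LLM cost from a custom pricing file.
# The schema (model -> per-1k-token rates) is a made-up example,
# NOT OpenLIT's real pricing.json structure.
import json

pricing_json = """
{
  "my-finetuned-model": {"prompt_per_1k": 0.003, "completion_per_1k": 0.006}
}
"""

def estimate_cost(pricing, model, prompt_tokens, completion_tokens):
    """Return the estimated USD cost for one request."""
    rates = pricing[model]
    return (prompt_tokens / 1000) * rates["prompt_per_1k"] \
         + (completion_tokens / 1000) * rates["completion_per_1k"]

pricing = json.loads(pricing_json)
cost = estimate_cost(pricing, "my-finetuned-model", 2000, 500)
print(round(cost, 6))  # 0.009
```

Hosting such a file and pointing the SDK at it (via the `pricing_json` option described later) is what allows budgeting for fine-tuned models that public price lists don't cover.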
## Auto Instrumentation Capabilities
| LLMs | Vector DBs | Frameworks | GPUs |
|--------------------------------------------------------------------------|----------------------------------------------|-------------------------------------------------|---------------|
| [✅ OpenAI](https://docs.openlit.io/latest/integrations/openai) | [✅ ChromaDB](https://docs.openlit.io/latest/integrations/chromadb) | [✅ Langchain](https://docs.openlit.io/latest/integrations/langchain) | [✅ NVIDIA](https://docs.openlit.io/latest/integrations/nvidia-gpu) |
| [✅ Ollama](https://docs.openlit.io/latest/integrations/ollama) | [✅ Pinecone](https://docs.openlit.io/latest/integrations/pinecone) | [✅ LiteLLM](https://docs.openlit.io/latest/integrations/litellm) | [✅ AMD](https://docs.openlit.io/latest/integrations/amd-gpu) |
| [✅ Anthropic](https://docs.openlit.io/latest/integrations/anthropic) | [✅ Qdrant](https://docs.openlit.io/latest/integrations/qdrant) | [✅ LlamaIndex](https://docs.openlit.io/latest/integrations/llama-index) | |
| [✅ GPT4All](https://docs.openlit.io/latest/integrations/gpt4all) | [✅ Milvus](https://docs.openlit.io/latest/integrations/milvus) | [✅ Haystack](https://docs.openlit.io/latest/integrations/haystack) | |
| [✅ Cohere](https://docs.openlit.io/latest/integrations/cohere) | [✅ AstraDB](https://docs.openlit.io/latest/integrations/astradb) | [✅ EmbedChain](https://docs.openlit.io/latest/integrations/embedchain) | |
| [✅ Mistral](https://docs.openlit.io/latest/integrations/mistral) | | [✅ Guardrails](https://docs.openlit.io/latest/integrations/guardrails) | |
| [✅ Azure OpenAI](https://docs.openlit.io/latest/integrations/azure-openai) | | [✅ CrewAI](https://docs.openlit.io/latest/integrations/crewai) | |
| [✅ Azure AI Inference](https://docs.openlit.io/latest/integrations/azure-ai-inference) | | [✅ DSPy](https://docs.openlit.io/latest/integrations/dspy) | |
| [✅ GitHub AI Models](https://docs.openlit.io/latest/integrations/github-models) | | [✅ AG2](https://docs.openlit.io/latest/integrations/ag2) | |
| [✅ HuggingFace Transformers](https://docs.openlit.io/latest/integrations/huggingface) | | [✅ Dynamiq](https://docs.openlit.io/latest/integrations/dynamiq) | |
| [✅ Amazon Bedrock](https://docs.openlit.io/latest/integrations/bedrock) | | [✅ Phidata](https://docs.openlit.io/latest/integrations/phidata) | |
| [✅ AI21](https://docs.openlit.io/latest/integrations/ai21) | | [✅ mem0](https://docs.openlit.io/latest/integrations/mem0) | |
| [✅ Vertex AI](https://docs.openlit.io/latest/integrations/vertexai) | | [✅ MultiOn](https://docs.openlit.io/latest/integrations/multion) | |
| [✅ Groq](https://docs.openlit.io/latest/integrations/groq) | | [✅ Julep AI](https://docs.openlit.io/latest/integrations/julep-ai) | |
| [✅ ElevenLabs](https://docs.openlit.io/latest/integrations/elevenlabs) | | [✅ ControlFlow](https://docs.openlit.io/latest/integrations/controlflow) | |
| [✅ vLLM](https://docs.openlit.io/latest/integrations/vllm) | | [✅ Crawl4AI](https://docs.openlit.io/latest/integrations/crawl4ai) | |
| [✅ OLA Krutrim](https://docs.openlit.io/latest/integrations/krutrim) | | [✅ FireCrawl](https://docs.openlit.io/latest/integrations/firecrawl) | |
| [✅ Google AI Studio](https://docs.openlit.io/latest/integrations/google-ai-studio) | | [✅ Letta](https://docs.openlit.io/latest/integrations/letta) | |
| [✅ NVIDIA NIM](https://docs.openlit.io/latest/integrations/nvidia-nim) | | [✅ SwarmZero](https://docs.openlit.io/latest/integrations/swarmzero) | |
| [✅ Titan ML](https://docs.openlit.io/latest/integrations/titan-ml) | | | |
| [✅ Reka AI](https://docs.openlit.io/latest/integrations/reka) | | | |
| [✅ xAI](https://docs.openlit.io/latest/integrations/xai) | | | |
| [✅ Prem AI](https://docs.openlit.io/latest/integrations/premai) | | | |
| [✅ Assembly AI](https://docs.openlit.io/latest/integrations/assemblyai) | | | |
| [✅ Together](https://docs.openlit.io/latest/integrations/together) | | | |
| [✅ DeepSeek](https://docs.openlit.io/latest/integrations/deepseek) | | | |
## Supported Destinations
- [✅ OpenTelemetry Collector](https://docs.openlit.io/latest/connections/otelcol)
- [✅ Prometheus + Tempo](https://docs.openlit.io/latest/connections/prometheus-tempo)
- [✅ Prometheus + Jaeger](https://docs.openlit.io/latest/connections/prometheus-jaeger)
- [✅ Grafana Cloud](https://docs.openlit.io/latest/connections/grafanacloud)
- [✅ New Relic](https://docs.openlit.io/latest/connections/new-relic)
- [✅ Elastic](https://docs.openlit.io/latest/connections/elastic)
- [✅ Middleware.io](https://docs.openlit.io/latest/connections/middleware)
- [✅ HyperDX](https://docs.openlit.io/latest/connections/hyperdx)
- [✅ DataDog](https://docs.openlit.io/latest/connections/datadog)
- [✅ SigNoz](https://docs.openlit.io/latest/connections/signoz)
- [✅ OneUptime](https://docs.openlit.io/latest/connections/oneuptime)
- [✅ Dynatrace](https://docs.openlit.io/latest/connections/dynatrace)
- [✅ OpenObserve](https://docs.openlit.io/latest/connections/openobserve)
- [✅ Highlight.io](https://docs.openlit.io/latest/connections/highlight)
- [✅ SigLens](https://docs.openlit.io/latest/connections/siglens)
- [✅ Oodle](https://docs.openlit.io/latest/connections/oodle)
## 💿 Installation
```bash
pip install openlit
```
## 🚀 Getting Started with LLM Observability
### Step 1: Install OpenLIT SDK
Open your command line or terminal and run:
```bash
pip install openlit
```
### Step 2: Initialize OpenLIT in your Application
Integrate OpenLIT into your AI applications by adding the following lines to your code.
```python
import openlit
openlit.init()
```
Configure the telemetry data destination as follows:
| Purpose | Parameter/Environment Variable | For Sending to OpenLIT |
|-------------------------------------------|--------------------------------------------------|--------------------------------|
| Send data to an HTTP OTLP endpoint | `otlp_endpoint` or `OTEL_EXPORTER_OTLP_ENDPOINT` | `"http://127.0.0.1:4318"` |
| Authenticate telemetry backends | `otlp_headers` or `OTEL_EXPORTER_OTLP_HEADERS` | Not required by default |
> 💡 Info: If the `otlp_endpoint` or `OTEL_EXPORTER_OTLP_ENDPOINT` is not provided, the OpenLIT SDK will output traces directly to your console, which is recommended during the development phase.
#### Example
---
<details>
<summary>Initialize using Function Arguments</summary>
---
Add the following two lines to your application code:
```python
import openlit
openlit.init(
otlp_endpoint="YOUR_OTEL_ENDPOINT",
    otlp_headers="YOUR_OTEL_ENDPOINT_AUTH"
)
```
</details>
---
<details>
<summary>Initialize using Environment Variables</summary>
---
Add the following two lines to your application code:
```python
import openlit
openlit.init()
```
Then, configure your OTLP endpoint using environment variables:
```env
export OTEL_EXPORTER_OTLP_ENDPOINT="YOUR_OTEL_ENDPOINT"
export OTEL_EXPORTER_OTLP_HEADERS="YOUR_OTEL_ENDPOINT_AUTH"
```
</details>
---
### Step 3: Visualize and Optimize!
Now that your LLM observability data is being collected and sent to your configured OpenTelemetry destination, the next step is to visualize and analyze it. This will help you understand your LLM application's performance and behavior and identify where it can be improved.
If you want to use OpenLIT's Observability Dashboard to monitor LLM usage—like cost, tokens, and user interactions—please check out our [Quickstart Guide](https://docs.openlit.io/latest/quickstart).
If you're sending metrics and traces to other observability tools, take a look at our [Connections Guide](https://docs.openlit.io/latest/destinations/intro) to start using a pre-built dashboard we have created for these tools.

## Configuration
### Observability - `openlit.init()`
Below is a detailed overview of the configuration options available, allowing you to adjust OpenLIT's behavior and functionality to align with your specific observability needs:
| Argument | Description | Default Value | Required |
|-------------------------|-----------------------------------------------------------------------------------------------|----------------|----------|
| `environment` | The deployment environment of the application. | `"default"` | Yes |
| `application_name` | Identifies the name of your application. | `"default"` | Yes |
| `tracer` | An instance of OpenTelemetry Tracer for tracing operations. | `None` | No |
| `meter` | An OpenTelemetry Metrics instance for capturing metrics. | `None` | No |
| `otlp_endpoint` | Specifies the OTLP endpoint for transmitting telemetry data. | `None` | No |
| `otlp_headers` | Defines headers for the OTLP exporter, useful for backends requiring authentication. | `None` | No |
| `disable_batch` | A flag to disable batch span processing, favoring immediate dispatch. | `False` | No |
| `capture_message_content` | Enables tracing of content for deeper insights. | `True` | No |
| `disabled_instrumentors`| List of instrumentors to disable. | `None` | No |
| `disable_metrics` | If set, disables the collection of metrics. | `False` | No |
| `pricing_json` | URL or file path of the pricing JSON file. | `https://github.com/openlit/openlit/blob/main/assets/pricing.json` | No |
| `collect_gpu_stats` | Flag to enable or disable GPU metrics collection. | `False` | No |
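Putting several of these options together, an initialization might look like the following configuration sketch. All values are illustrative placeholders:

```python
import openlit

# Illustrative configuration only; every value below is a placeholder.
openlit.init(
    environment="production",
    application_name="chat-service",
    otlp_endpoint="http://127.0.0.1:4318",  # HTTP OTLP endpoint
    disable_batch=False,                    # keep batch span processing
    capture_message_content=True,           # trace message content
    collect_gpu_stats=False,                # no GPU metrics here
)
```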
### OpenLIT Prompt Hub - `openlit.get_prompt()`
Below are the parameters for using the SDK with OpenLIT Prompt Hub for prompt management:
| Parameter | Description |
|-------------------|------------------------------------------------------------------------------------------------------------------------------------|
| `url` | Sets the OpenLIT URL. Defaults to the `OPENLIT_URL` environment variable. |
| `api_key` | Sets the OpenLIT API Key. Can also be provided via the `OPENLIT_API_KEY` environment variable. |
| `name` | Sets the name to fetch a unique prompt. Use this or `prompt_id`. |
| `prompt_id` | Sets the ID to fetch a unique prompt. Use this or `name`. Optional |
| `version`         | Sets the version to retrieve a specific version of the prompt. Optional                                                              |
| `shouldCompile` | Boolean value that compiles the prompt using the provided variables. Optional |
| `variables` | Sets the variables for prompt compilation. Optional |
| `meta_properties` | Sets the meta-properties for storing in the prompt's access history metadata. Optional |
### OpenLIT Vault - `openlit.get_secrets()`
Below are the parameters for using the SDK with OpenLIT Vault for secret management:
| Parameter | Description |
|-------------------|------------------------------------------------------------------------------------------------------------------------------------|
| `url`             | Sets the OpenLIT URL. Defaults to the `OPENLIT_URL` environment variable.                                                            |
| `api_key` | Sets the OpenLIT API Key. Can also be provided via the `OPENLIT_API_KEY` environment variable. |
| `key` | Sets the key to fetch a specific secret. Optional |
| `should_set_env` | Boolean value that sets all the secrets as environment variables for the application. Optional |
| `tags` | Sets the tags for fetching only the secrets that have the mentioned tags assigned. Optional |
## 🛣️ Roadmap
We are dedicated to continuously improving OpenLIT SDKs. Here's a look at what's been accomplished and what's on the horizon:
| Feature | Status |
|----------------------------------------------------------------------------------------------|---------------|
| [OpenTelemetry auto-instrumentation for LLM Providers like OpenAI, Anthropic]() | ✅ Completed |
| [OpenTelemetry auto-instrumentation for Vector databases like Pinecone, Chroma]() | ✅ Completed |
| [OpenTelemetry auto-instrumentation for LLM Frameworks like LangChain, LlamaIndex]() | ✅ Completed |
| [OpenTelemetry-native auto-instrumentation for NVIDIA GPU Monitoring](https://docs.openlit.io/latest/features/gpu) | ✅ Completed |
| [Real-Time Guardrails Implementation](https://docs.openlit.io/latest/features/guardrails) | ✅ Completed |
| [Programmatic Evaluation for LLM Response](https://docs.openlit.io/latest/features/evaluations) | ✅ Completed |
| [OpenTelemetry-native AI Agent Observability]() | ✅ Completed |
## 🌱 Contributing
Whether it's big or small, we love contributions 💚. Check out our [Contribution guide](../../CONTRIBUTING.md) to get started.
Unsure where to start? Here are a few ways to get involved:
- Join our [Slack](https://join.slack.com/t/openlit/shared_invite/zt-2etnfttwg-TjP_7BZXfYg84oAukY8QRQ) or [Discord](https://discord.gg/rjvTm6zd) community to discuss ideas, share feedback, and connect with both our team and the wider OpenLIT community.
Your input helps us grow and improve, and we're here to support you every step of the way.
## 💚 Community & Support
Connect with the OpenLIT community and maintainers for support, discussions, and updates:
- 🌟 If you like it, Leave a star on our [GitHub](https://github.com/openlit/openlit/)
- 🌍 Join our [Slack](https://join.slack.com/t/openlit/shared_invite/zt-2etnfttwg-TjP_7BZXfYg84oAukY8QRQ) or [Discord](https://discord.gg/CQnXwNT3) community for live interactions and questions.
- 🐞 Report bugs on our [GitHub Issues](https://github.com/openlit/openlit/issues) to help us improve OpenLIT.
- 𝕏 Follow us on [X](https://x.com/openlit_io) for the latest updates and news.
| text/markdown | OpenLIT | null | null | null | Apache-2.0 | OpenTelemetry, otel, otlp, llm, tracing, openai, anthropic, claude, cohere, llm monitoring, observability, monitoring, gpt, Generative AI, chatGPT, gpu | [
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
... | [] | null | null | <4.0.0,>=3.9.0 | [] | [] | [] | [
"anthropic<1.0.0,>=0.42.0",
"boto3<2.0.0,>=1.34.0",
"botocore<2.0.0,>=1.34.0",
"langchain<0.4.0,>=0.3.15",
"openai<2.0.0,>=1.1.1",
"openai-agents>=0.0.3",
"opentelemetry-api<2.0.0,>=1.30.0",
"opentelemetry-exporter-otlp<2.0.0,>=1.30.0",
"opentelemetry-instrumentation<1.0.0,>=0.52b0",
"opentelemetr... | [] | [] | [] | [
"Homepage, https://github.com/openlit/openlit/tree/main/openlit/python",
"Repository, https://github.com/openlit/openlit/tree/main/openlit/python"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:35:30.465906 | openlit-1.36.9.tar.gz | 336,476 | b6/aa/9724553734f7e747903ea28884f0e5ec31e7c04dffdb1109cd741e13090f/openlit-1.36.9.tar.gz | source | sdist | null | false | a859d0be8798d82780bae359610642a8 | 2c207abeba1550d73a781df3565e1a21e9fcb3be1db0d5c2ed545828d5789e23 | b6aa9724553734f7e747903ea28884f0e5ec31e7c04dffdb1109cd741e13090f | null | [
"LICENSE"
] | 3,751 |
2.4 | xdsl | 0.57.0 | xDSL | <!-- markdownlint-disable-next-line MD041 -->
[](https://github.com/xdslproject/xdsl/actions/workflows/ci-core.yml?query=workflow%3A%22CI+-+Python+application%22++)
[](https://badge.fury.io/py/xdsl)
[](https://www.pepy.tech/projects/xdsl)
[](https://pepy.tech/project/xdsl)
[](https://codecov.io/gh/xdslproject/xdsl)
[](https://xdsl.zulipchat.com)
# xDSL: A Python-native SSA Compiler Framework
[xDSL](http://www.xdsl.dev) is a Python-native framework for
building compiler infrastructure. It provides *[SSA-based intermediate
representations (IRs)](https://en.wikipedia.org/wiki/Static_single-assignment_form)*
and Pythonic APIs to define, assemble, and optimize custom IRs—all with seamless
compatibility with [MLIR](https://mlir.llvm.org/) from the LLVM project.
Inspired by MLIR, xDSL enables smooth translation of programs and abstractions
between frameworks. This lets users prototype compilers entirely in Python,
while still accessing MLIR's powerful optimization and code generation pipeline.
All IRs in xDSL employ a unified SSA-based data structure, with regions and basic blocks,
making it easy to write generic analyses and transformation passes.
xDSL supports assembling compilers from predefined or custom IRs, and organizing
transformations across a multi-level IR stack. This layered approach enables
abstraction-specific optimization passes, similar to the architecture of projects
like [Devito](https://github.com/devitocodes/devito), [PSyclone](https://github.com/stfc/PSyclone),
and [Firedrake](https://github.com/firedrakeproject/firedrake).
In short, xDSL makes it possible to:
- Prototype compilers quickly in Python
- Build DSLs with custom IRs
- Run analyses and transformations with simple scripts
- Interoperate smoothly with MLIR and benefit from LLVM's backend
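The unified SSA-based data structure mentioned above can be sketched in plain Python. This is a conceptual toy showing the SSA idea (every value is defined exactly once, then used by later operations), not xDSL's actual classes or API:

```python
from dataclasses import dataclass, field

# Conceptual toy of an SSA-based IR: each operation defines fresh values,
# which later operations reference as operands. NOT xDSL's real API.

@dataclass
class Value:
    name: str

@dataclass
class Operation:
    opname: str
    operands: list
    results: list = field(default_factory=list)

@dataclass
class Block:
    ops: list = field(default_factory=list)

    def add(self, opname, operands, n_results=1):
        """Append an operation; each result gets a unique SSA name."""
        op = Operation(opname, operands)
        op.results = [Value(f"%{len(self.ops)}_{i}") for i in range(n_results)]
        self.ops.append(op)
        return op.results

block = Block()
a, = block.add("arith.constant", [])
b, = block.add("arith.constant", [])
s, = block.add("arith.addi", [a, b])
print(s.name)  # %2_0
```

Because every value has a single definition site, analyses like dead-code elimination or constant folding reduce to simple walks over the def-use structure, which is what makes generic passes easy to write.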
## Contents
- [Installation](#installation)
- [Getting Started](#getting-started)
- [Discussion](#discussion)
## Installation
To contribute to xDSL, follow the [xDSL Developer Setup Guide](CONTRIBUTING.md).
To use xDSL as part of a larger project for developing your own compiler,
just install [xDSL via pip](https://pypi.org/project/xdsl/):
```bash
pip install xdsl
```
To quickly install xDSL for development and contribution purposes, use:
``` bash
pip install xdsl[dev]
```
This may be useful for projects wanting to replicate the xDSL testing setup.
*Note:* This version of xDSL is validated against a specific MLIR version,
interoperability with other versions is not guaranteed. The supported
MLIR version is 21.1.1.
> [!IMPORTANT]
>
> ### Experimental Pyright Features
>
> xDSL currently relies on an experimental feature of Pyright called TypeForm.
> TypeForm is [in discussion](https://discuss.python.org/t/pep-747-typeexpr-type-hint-for-a-type-expression/55984)
> and will likely land in some future version of Python.
>
> For xDSL to type check correctly using Pyright, please add this to your `pyproject.toml`:
>
> ```toml
> [tool.pyright]
> enableExperimentalFeatures = true
> ```
### Subprojects With Extra Dependencies
xDSL has a number of subprojects, some of which require extra dependencies.
To keep the set of dependencies to a minimum, these extra dependencies have to
be specified explicitly, e.g. by using:
``` bash
pip install xdsl[gui] # or [jax], [riscv]
```
## Getting Started
Check out the dedicated [Getting Started guide](https://docs.xdsl.dev)
for a comprehensive tutorial.
To get familiar with xDSL, we recommend starting with our Jupyter notebooks. The
notebooks provide hands-on examples and documentation of xDSL's core concepts: data
structures, the Python-embedded abstraction definition language, and end-to-end
construction of custom compilers, such as a database compiler.
There is also a short guide showing how to connect xDSL with MLIR,
for users interested in that use case.
- [A Database example](https://xdsl.dev/xdsl/lab/index.html?path=database_example.ipynb)
- [A simple introduction](https://xdsl.dev/xdsl/lab/index.html?path=tutorial.ipynb)
- [A DSL for defining new IRs](https://xdsl.dev/xdsl/lab/index.html?path=irdl.ipynb)
- [Connecting xDSL with MLIR](docs/guides/mlir_interoperation.md)
We provide a [Makefile](https://github.com/xdslproject/xdsl/blob/main/Makefile)
containing a lot of common tasks, which might provide an overview of common actions.
## Discussion
You can also join the discussion at our [Zulip chat room](https://xdsl.zulipchat.com),
kindly supported by community hosting from [Zulip](https://zulip.com/).
| text/markdown | null | Mathieu Fehr <mathieu.fehr@ed.ac.uk> | null | null | Apache License v2.0 with LLVM Exceptions | null | [
"Programming Language :: Python :: 3"
] | [
"Linux"
] | null | null | >=3.10 | [] | [] | [] | [
"immutabledict<4.2.3",
"typing-extensions<5,>=4.7",
"ordered-set==4.1.0",
"toml<0.11; extra == \"dev\"",
"pytest-cov; extra == \"dev\"",
"coverage<8.0.0; extra == \"dev\"",
"ipykernel; extra == \"dev\"",
"pytest<8.5; extra == \"dev\"",
"nbval<0.12; extra == \"dev\"",
"filecheck==1.0.3; extra == \"... | [] | [] | [] | [
"Homepage, https://xdsl.dev/",
"Source Code, https://github.com/xdslproject/xdsl",
"Issue Tracker, https://github.com/xdslproject/xdsl/issues"
] | twine/6.2.0 CPython/3.11.14 | 2026-02-19T10:35:12.051422 | xdsl-0.57.0.tar.gz | 3,999,543 | 51/bd/9dc4e0a05998c8661bbc14a56d04d7eca3f7746bcf680e8dd01243712e1f/xdsl-0.57.0.tar.gz | source | sdist | null | false | cd41175b9be77884b258caaf44ac2e56 | efecb7036a650f00ea58ef012cc5754c85ee723bc7652240298bb6f39ca0616f | 51bd9dc4e0a05998c8661bbc14a56d04d7eca3f7746bcf680e8dd01243712e1f | null | [
"LICENSE"
] | 3,858 |
2.4 | apis-core-rdf | 0.60.0 | Base package for the APIS framework | APIS
====


The *Austrian Prosopographical Information System* is a
[Django](https://www.djangoproject.com/) based prosopography framework. It
allows you to create web applications that manage both entities and relations
between entities. It provides API access to the data in various formats and
creates Swagger definitions. A Swagger UI allows for comfortable access to the
data. Data can also be imported from remote resources described in
[RDF](https://en.wikipedia.org/wiki/Resource_Description_Framework).
In addition to this configurable import of data via RDF, there is also a
configurable serialization of data. The generic REST API of APIS provides data
either in the internal JSON format, TEI, or RDF (serialized with *CIDOC CRM*).
APIS comes with a built-in system of autocompletes that allows researchers to
import metadata of entities with just a single click. Out of the box, APIS
supports Stanbol as a backend for the autocompletes, but the system is rather
easy to adapt to any RESTful API. APIS also supports parsing RDF descriptions
of entities into an entity. The parsing is configured in a settings file.
*Entities*
*Relations*
Licensing
---------
All code unless otherwise noted is licensed under the terms of the MIT License
(MIT). Please refer to the file LICENSE.txt in the root directory of this
repository.
All documentation and images unless otherwise noted are licensed under the
terms of Creative Commons Attribution-ShareAlike 4.0 International License. To
view a copy of this license, visit
http://creativecommons.org/licenses/by-sa/4.0/
APIS contains the ["Material Symbols" font](https://fonts.google.com/icons)(commit ace1af0), which
is licensed under the [Apache License Version 2.0](https://www.apache.org/licenses/LICENSE-2.0.html).
The Swagger Logo in `core/static/img` comes from [wikimedia
commons](https://commons.wikimedia.org/wiki/File:Swagger-logo.png) and is
licensed under the [Creative Commons Attribution-Share Alike 4.0 International
license](https://creativecommons.org/licenses/by-sa/4.0/deed.en)
Installation
------------
<!-- Installation -->
Create a new [Django project](https://docs.djangoproject.com/en/stable/ref/django-admin/#startproject):
```shell
django-admin startproject my_apis_instance
```
Add apis-core-rdf as a dependency to your project.
To use the APIS framework in your application, you will need to add the following dependencies to
[`INSTALLED_APPS`](https://docs.djangoproject.com/en/stable/ref/settings/#installed-apps):
```python
INSTALLED_APPS = [
# our main app, containing the ontology (in the `models.py`)
# and our customizations
"sample_project",
"django.contrib.admin",
"django.contrib.auth",
"django.contrib.contenttypes",
"django.contrib.sessions",
"django.contrib.messages",
"django.contrib.staticfiles",
# ui stuff used by APIS
"crispy_forms",
"crispy_bootstrap4",
"django_filters",
"django_tables2",
"dal",
"dal_select2",
# REST API
"rest_framework",
# swagger ui generation
"drf_spectacular",
# The APIS apps
"apis_core.core",
"apis_core.generic",
"apis_core.apis_metainfo",
"apis_core.apis_entities",
# APIS collections provide a collection model similar to
# SKOS collections and allow tagging of content
"apis_core.collections",
# APIS history modules tracks changes of instances over
# time and lets you revert changes
"apis_core.history",
]
```
Finally, add the APIS urls to your applications [URL Dispatcher](https://docs.djangoproject.com/en/stable/topics/http/urls/)
```python
urlpatterns = [
path("", include("apis_core.urls", namespace="apis")),
# https://docs.djangoproject.com/en/stable/topics/auth/default/#module-django.contrib.auth.views
path("accounts/", include("django.contrib.auth.urls")),
# https://docs.djangoproject.com/en/stable/ref/contrib/admin/#hooking-adminsite-to-urlconf
path("admin/", admin.site.urls),
]
```
Now start your Django project:
```shell
./manage.py runserver
```
Now you should be ready to roll. Start [creating your ontology](https://acdh-oeaw.github.io/apis-core-rdf/ontology).
| text/markdown | null | Matthias Schlögl <matthias.schloegl@oeaw.ac.at>, Birger Schacht <birger.schacht@oeaw.ac.at>, K Kollmann <dev-kk@oeaw.ac.at>, Saranya Balasubramanian <saranya.balasubramanian@oeaw.ac.at> | null | null | null | null | [] | [] | null | null | >=3.13 | [] | [] | [] | [
"acdh-arche-assets<=4.0,>=3.21.1",
"crispy-bootstrap5>=2024.10",
"django-autocomplete-light<3.13.0,>=3.9.4",
"django-crispy-forms<3.0,>=2",
"django-crum<8.0,>=0.7.9",
"django-filter>=24.3",
"django-model-utils<6.0.0,>=4.1.1",
"django-simple-history>=3.6",
"django-tables2<3.0,>=2.3.3",
"django>=5.2... | [] | [] | [] | [] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:34:17.239259 | apis_core_rdf-0.60.0.tar.gz | 308,059 | 67/b9/00e4fea9fef05a5c264fa4efad7df5a3be25ecdc76080fa387a49c4d1015/apis_core_rdf-0.60.0.tar.gz | source | sdist | null | false | 1f590a011b15c70bc8f5a239f659a351 | 38c4f751b68118c131957fb178690a031301bbd4b6ad07a65db2c41f4a93ac93 | 67b900e4fea9fef05a5c264fa4efad7df5a3be25ecdc76080fa387a49c4d1015 | MIT | [
"LICENSE.txt"
] | 247 |
2.4 | pubsublib | 0.1.21 | The Python Adaptation of the PubSubLib for Orange Health | # pubsublib
pubsublib is a Python library that provides Pub/Sub functionality using AWS SNS and SQS.
## PIP Package
[pubsublib](https://pypi.org/project/pubsublib/)
## Getting Started
To get started with pubsublib, you can install it via pip:
```bash
pip install pubsublib
```
## Using Pubsublib
Once pubsublib is installed, you can use it in your Python code as follows:
```python
from pubsublib.aws.main import AWSPubSubAdapter
pubsub_adapter = AWSPubSubAdapter(
aws_region='XXXXX',
aws_access_key_id='XXXXX',
aws_secret_access_key='XXXXX',
redis_location='XXXXX',
sns_endpoint_url=None,
sqs_endpoint_url=None
)
```
#### Using AWS IAM Roles for Service Accounts
When using IAM Roles for Service Accounts (IRSA), pass `aws_access_key_id` and `aws_secret_access_key` as empty strings so that the AWS SDK falls back to its default credential provider chain, which picks up the service account role. You should still provide a valid `aws_region`, or ensure that `AWS_REGION` or `AWS_DEFAULT_REGION` is set in the environment or shared AWS config when omitting it.
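An IRSA-based setup might then look like the following sketch. The region and Redis location are placeholders:

```python
from pubsublib.aws.main import AWSPubSubAdapter

# With IRSA: empty credential strings make the AWS SDK fall back to its
# default credential provider chain, which resolves the service account role.
# Region and redis_location below are placeholders.
pubsub_adapter = AWSPubSubAdapter(
    aws_region='ap-south-1',
    aws_access_key_id='',
    aws_secret_access_key='',
    redis_location='redis://localhost:6379',
    sns_endpoint_url=None,
    sqs_endpoint_url=None
)
```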
[Steps to Publish Package](https://github.com/Orange-Health/pubsublib-python/wiki/PyPI-%7C-Publish-Package#steps-to-publish-the-pubsublib-package-on-pypi)
| text/markdown | null | Anant Chauhan <anant.chauhan@orangehealth.in>, Rajendra Talekar <talekar.r@gmail.com> | null | null | null | null | [
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/Orange-Health/pubsublib-python",
"Issues, https://github.com/Orange-Health/pubsublib-python/issues",
"Repository, https://github.com/Orange-Health/pubsublib-python"
] | twine/6.2.0 CPython/3.9.25 | 2026-02-19T10:33:45.781669 | pubsublib-0.1.21.tar.gz | 11,148 | 57/c9/c5bbf2e0c57b368b1cf4dc1e9e5fb901258b7091412510d61bce39d18f36/pubsublib-0.1.21.tar.gz | source | sdist | null | false | d00d4462aeb6245a0ca0f937d9382735 | 596feb20f81b71a770590f67782d3867774e5cf614480253027e8f5af672f5db | 57c9c5bbf2e0c57b368b1cf4dc1e9e5fb901258b7091412510d61bce39d18f36 | null | [
"LICENSE"
] | 256 |
2.4 | playback-studio | 0.3.32 | Record your service operations in production and replay them locally at any time in a sandbox | # playback [](https://circleci.com/gh/Optibus/playback) [](https://codecov.io/gh/Optibus/playback) [](https://pypi.python.org/pypi/playback-studio/) [](https://pypi.python.org/pypi/playback-studio/)
A Python decorator-based framework that lets you "record" and "replay" operations (e.g. API requests, workers consuming jobs from queues).
A JavaScript / TypeScript [version](https://github.com/Optibus/playback-ts) is in the works.
## Main uses
* Regression testing - replay recorded production traffic on modified code before pushing it
* Debug production issues locally
* Access many "real data" scenarios to test/validate new features/behaviours
The framework intercepts all decorated inputs and outputs throughout the recorded operation, which are used later to replay the exact operation in a controlled isolated sandbox, as well as to compare the output of the recorded operation vs the replayed operation.
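The record-and-replay mechanism described above can be sketched in a few lines of plain Python. This is a conceptual illustration of input interception, not playback's actual implementation; the alias string and function are made up:

```python
# Toy record/replay: an intercepted input is captured while "recording"
# and served back from the recording while "replaying".
recording = {}
mode = "record"

def intercept_input(alias):
    def decorator(fn):
        def wrapper(*args, **kwargs):
            if mode == "replay":
                return recording[alias]   # serve the recorded value
            value = fn(*args, **kwargs)   # real call during recording
            recording[alias] = value      # capture it for later replay
            return value
        return wrapper
    return decorator

@intercept_input("service.get_request_data")
def get_request_data():
    # Imagine this came from a live HTTP request in production.
    return {"user": "alice"}

live = get_request_data()      # recorded
mode = "replay"
replayed = get_request_data()  # served from the recording, no live call
print(live == replayed)        # True
```

The real framework extends this idea to outputs as well, so the replayed operation's outputs can be diffed against the recorded ones.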
## Background
The motivation for this framework was to be able to test new code changes on actual production data without running them in production, for cases where canary deployment is not a viable option.
Some examples when this might happen include:
* When detecting a regression is based on intimate knowledge of the service output
* When the number of possible input permutations the service accepts is large while the number of users per permutation is low, resulting in a statistical sample that is too small to rely on in production to detect a regression early enough to roll back
On top of this, letting developers run an accurate comparison of their code against production and debug it during development increases productivity by surfacing issues right away.
The quality of the released code improves significantly by covering many edge cases that are hard to predict in tests.
## Features
* Create a standalone "recording" of each intercepted operation, with all the relevant inputs and outputs, and save it
to AWS S3
* Replay a recorded operation anywhere via code execution
* Run an extensive comparison of recorded vs replayed operations
# Installation
`pip install playback-studio`
# Examples
There are two examples as part of this repo you can check out under the [examples](examples) directory:
* [Basic service operation](examples/basic_service_operation.py) - a simple example of an in-memory operation
* [Flask based service](examples/flask) - an end to end flask based example with persistent recording
# Usage and examples - interception and replay
## Intercepting an operation
In order to intercept an operation, you need to explicitly declare the recorded operation's entry point by decorating it
with the `TapeRecorder.operation` decorator, and explicitly declare which inputs and outputs to intercept using the
`TapeRecorder.intercept_input` and `TapeRecorder.intercept_output` decorators, as demonstrated below:
```python
from flask import request

tape_cassette = S3TapeCassette('production-recordings', region='us-east-1', read_only=False)
tape_recorder = TapeRecorder(tape_cassette)
tape_recorder.enabled_recording()


class ServiceOperation(object):
    ...

    @tape_recorder.operation()
    def execute(self):
        """
        Executes the operation and returns the key under which the result is stored
        """
        data = self.get_request_data()
        result = self.do_something_with_input(data)
        storage_key = self.store_result(result)
        return storage_key

    @tape_recorder.intercept_input(alias='service_operation.get_request_data')
    def get_request_data(self):
        """
        Reads the required input for the operation
        """
        # Get request data from flask
        return request.data

    @tape_recorder.intercept_output(alias='service_operation.store_result')
    def store_result(self, result):
        """
        Stores the operation result and returns the key that can be used to fetch the result
        """
        result_key = self.put_result_in_mongo(result)
        return result_key
```
## Locally Cached S3 Tape Cassette
The `CachedReadOnlyS3TapeCassette` class serves as an alternative to `S3TapeCassette`, enabling the storage of recordings locally. By providing a local path (or defaulting to `/tmp/recordings_cache`), it eliminates the need for constant access to AWS S3. In the event of a cache failure, the class gracefully falls back to its parent `S3TapeCassette`, attempting to fetch the original recording data directly from S3 as needed.
```python
tape_cassette = CachedReadOnlyS3TapeCassette('production-recordings', region='us-east-1', read_only=True, local_path='/tmp/recordings_cache', use_cache=True)
```
Notice that `CachedReadOnlyS3TapeCassette` is only used in a read-only state, and you can refresh your recording ID cache by instantiating it with `use_cache=False`.
## Replaying an intercepted operation
In order to replay an operation, you need the specific recording ID. Typically, you would add this information to your
logs output. Later, we will demonstrate how to look for recording IDs using search filters, the `Equalizer`, and the
`PlaybackStudio`:
```python
tape_cassette = S3TapeCassette('production-recordings', region='us-east-1')
tape_recorder = TapeRecorder(tape_cassette)


def playback_function(recording):
    """
    Given a recording, starts the execution of the recorded operation
    """
    operation_class = recording.get_metadata()[TapeRecorder.OPERATION_CLASS]
    return operation_class().execute()


# Will replay the recorded operation, injecting and capturing the needed data in all of the intercepted inputs and outputs
tape_recorder.play(recording_id, playback_function)
```
# Framework classes - recording and replaying
## `TapeRecorder` class
This class is used to "record" an operation and "replay" (rerun) the recorded operation on any code version.
Recording is done by decorating the operation, and its inputs and outputs, with the decorators described below.
### `operation` decorator
```python
def operation(self, metadata_extractor=None)
```
Decorates the operation entry point. Every decorated input and output executed within this scope is intercepted and
either recorded or replayed, depending on whether the current context is recording or playback.
* `metadata_extractor` - an optional function that can be used to add metadata to the recording. The metadata can be
used as a search filter when fetching recordings, so it is a good place to add properties, derived from the operation's
parameters, that you may want to filter by when you wish to replay the operation.
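As a sketch of what such an extractor might look like (the extractor's call signature here is an assumption, and the `RouteRequestOperation` class and its fields are invented for illustration):

```python
class RouteRequestOperation(object):
    # Hypothetical operation class, used only for this illustration
    def __init__(self, customer_id, request_type):
        self.customer_id = customer_id
        self.request_type = request_type


def route_metadata_extractor(operation):
    # Return a dict of searchable properties to attach to the recording;
    # these keys can later be used as a metadata filter when looking up
    # recordings to replay
    return {'customer_id': operation.customer_id,
            'request_type': operation.request_type}


metadata = route_metadata_extractor(RouteRequestOperation('acme', 'route_plan'))
```

Such an extractor would then be wired in via `@tape_recorder.operation(metadata_extractor=route_metadata_extractor)`.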
### `intercept_input` decorator
```python
def intercept_input(self, alias, alias_params_resolver=None, data_handler=None, capture_args=None, run_intercepted_when_missing=True)
```
Decorates a function that acts as an input to the operation. The result of the function is the recorded input, and the
combined passed arguments and alias are used as the key that uniquely identifies the input. Upon playback, an invocation to
the intercepted method will fetch the input from the recording by combining the passed arguments and alias as the
lookup key. If no recorded value is found, a `RecordingKeyError` will be raised.
* `alias` - Input alias, used to uniquely identify the input function; the name should therefore be unique across all
relevant inputs this operation can reach. It should not be renamed, as renaming it will render previous recordings useless
* `alias_params_resolver` - Optional function that resolves parameters inside the alias, if such are given. This is useful when
you have the same input method invoked many times with the same arguments on different class instances
* `data_handler` - Optional data handler that prepares and restores the input data for and from the recording when default
pickle serialization is not enough. This needs to be an implementation of `InputInterceptionDataHandler` class
* `capture_args` - If a list is given, it will annotate which arg indices and/or names should be captured as part of
the intercepted key (invocation identification). If None, all args are captured
* `run_intercepted_when_missing` - If no matching content is found in the recording during playback, run the original
intercepted method. This is useful when you want to use an existing recording to play a code flow where this interception didn't exist
When intercepting a static method, `static_intercept_input` should be used.
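To build intuition for how an alias and the captured arguments combine into a unique invocation key, here is a standalone sketch; this is not the framework's actual key format, and `interception_key` is a hypothetical helper:

```python
def interception_key(alias, args, kwargs, capture_args=None):
    # If capture_args is given, keep only the positional indices and
    # keyword names it lists; otherwise capture everything
    if capture_args is not None:
        args = tuple(a for i, a in enumerate(args) if i in capture_args)
        kwargs = {k: v for k, v in kwargs.items() if k in capture_args}
    # The alias plus the (filtered) arguments identify one invocation
    return "{}:{}:{}".format(alias, args, sorted(kwargs.items()))


key = interception_key('service_operation.get_user', (7, 'extra'),
                       {'region': 'eu'}, capture_args=[0, 'region'])
```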
### `intercept_output` decorator
```python
def intercept_output(self, alias, data_handler=None, fail_on_no_recorded_result=True)
```
Decorates a function that acts as an output of the operation. The parameters passed to the function are recorded as
the output and the return value is recorded as well. The alias combined with the invocation number are used as the key that
uniquely identifies this output. Upon playback, an invocation to the intercepted method will construct the same
identification key and capture the outputs again (which can be used later to compare against the recorded output), and
the recorded return value will be returned.
* `alias` - Output alias, used to uniquely identify the output function; the name should therefore be unique across all
relevant outputs this operation can reach. It should not be renamed, as renaming it will render previous recordings useless
* `data_handler` - Optional data handler that prepares and restores the output data for and from the recording when
default pickle serialization is not enough. This needs to be an implementation of `OutputInterceptionDataHandler` class
* `fail_on_no_recorded_result` - Whether to fail if there is no recorded result, or to return None instead.
Setting this to False is useful when pre-existing recordings predate a new output interception and you want to be able
to play back those old recordings, given that the return value of the output is not actually used.
Defaults to True
The return value of the operation is always intercepted as an output implicitly using
`TapeRecorder.OPERATION_OUTPUT_ALIAS` as the output alias.
When intercepting a static method, `static_intercept_output` should be used.
## `TapeCassette` class
An abstract class that acts as a storage driver for the `TapeRecorder` to store and fetch recordings. The class has
three main methods that need to be implemented.
```python
def get_recording(self, recording_id)
```
Gets the recording stored under the given ID
```python
def create_new_recording(self, category)
```
Creates a new recording object that is used by the tape recorder
* `category` - Specifies under which category to create the recording; represents the operation type
```python
def iter_recording_ids(self, category, start_date=None, end_date=None, metadata=None, limit=None)
```
Creates an iterator of recording IDs matching the given search parameters
* `category` - Specifies in which category to look for recordings
* `start_date` - Optional earliest date of when recordings were captured
* `end_date` - Optional latest date of when recordings were captured
* `metadata` - Optional dictionary to filter captured metadata by
* `limit` - Optional limit on how many matching recording IDs to fetch
The framework comes with two built-in implementations:
* `InMemoryTapeCassette` - Saves recordings in a dictionary; mainly used for tests
* `S3TapeCassette` - Saves recordings in an AWS S3 bucket
### `S3TapeCassette` class
```python
# Instantiate the cassette connected to bucket 'production-recordings'
# under region 'us-east-1' in read/write mode
tape_cassette = S3TapeCassette('production-recordings', region='us-east-1', read_only=False)
```
Instantiating this class relies on being able to connect to AWS S3 from the current terminal/process and having read/write
access to the given bucket (for playback, only read access is needed).
```python
def __init__(self, bucket, key_prefix='', region=None, transient=False, read_only=True,
infrequent_access_kb_threshold=None, sampling_calculator=None)
```
* `bucket` - AWS S3 bucket name
* `key_prefix` - Each recording is saved under two keys, one containing full data and the other just for fast lookup
and filtering of recordings. The key structure used for recording is
'tape_recorder_recordings/{key_prefix}<full/metadata>/{id}', this gives the option to add a prefix to the key
* `region` - This value is propagated to the underlying boto client
* `transient` - If this is a transient cassette, all recordings under the given prefix will be deleted when it is closed
(only if not read-only). This is useful for testing purposes and cleaning up after tests
* `read_only` - If True, this cassette can only be used to fetch recordings and not to create new ones.
Any write operations will raise an assertion.
* `infrequent_access_kb_threshold` - Threshold in KB. When a recording is above the threshold, the object will be saved in
STANDARD_IA (infrequent access storage class); None means never (default)
* `sampling_calculator` - Optional sampling ratio calculator function. Before saving the recording, this
function will be triggered with `(category, recording_size, recording)` and should return a number
between 0 and 1 which specifies the sampling rate
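For example, a sampling calculator can be a plain function; the size threshold and rates below are illustrative choices, not part of the framework:

```python
def sampling_calculator(category, recording_size, recording):
    # Keep every small recording, but sample only 10% of recordings
    # larger than 1 MB to cap S3 storage costs (illustrative policy)
    one_mb = 1024 * 1024
    return 1.0 if recording_size < one_mb else 0.1
```

This function would be passed as the `sampling_calculator` argument when constructing `S3TapeCassette`.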
# Usage and examples - comparing replayed vs recorded operations
## Using the Equalizer
In order to run a comparison, we can use the `Equalizer` class and provide it with relevant playable recordings.
In this example, we will look for five recordings from the last week using the `find_matching_recording_ids` function.
The `Equalizer` relies on:
* `playback_function` to replay the recorded operation
* `result_extractor` to extract the result that we want to compare from the captured outputs
* `comparator` to compare the extracted result
```python
# Creates an iterator over relevant recordings which are ready to be played
lookup_properties = RecordingLookupProperties(start_date=datetime.utcnow() - timedelta(days=7),
                                              limit=5)
recording_ids = find_matching_recording_ids(tape_recorder,
                                            ServiceOperation.__name__,
                                            lookup_properties)


def result_extractor(outputs):
    """
    Given recording or playback outputs, finds the relevant output, which is the result that
    needs to be compared
    """
    # Find the relevant captured output
    output = next(o for o in outputs if 'service_operation.store_result' in o.key)
    # Return the first captured arg as the result that needs to be compared
    return output.value['args'][0]


def comparator(recorded_result, replay_result):
    """
    Compares the operation's captured output results
    """
    if recorded_result == replay_result:
        return ComparatorResult(EqualityStatus.Equal, "Value is {}".format(recorded_result))
    return ComparatorResult(EqualityStatus.Different,
                            "{recorded_result} != {replay_result}".format(
                                recorded_result=recorded_result, replay_result=replay_result))


def player(recording_id):
    return tape_recorder.play(recording_id, playback_function)


# Run the comparison and output the results using the Equalizer
equalizer = Equalizer(recording_ids, player, result_extractor, comparator)
for comparison_result in equalizer.run_comparison():
    print('Comparison result {recording_id} is: {result}'.format(
        recording_id=comparison_result.playback.original_recording.id,
        result=comparison_result.comparator_status))
```
# Framework classes - comparing replayed vs recorded operations
## `Equalizer` class
The `Equalizer` is used to replay multiple recordings of a single operation and conduct a comparison between the
recorded results (outputs) and the replayed results. Under the hood it uses the `TapeRecorder` to replay the
operations and the `TapeCassette` to look up and fetch relevant recordings.
```python
def __init__(self, recording_ids, player, result_extractor, comparator,
comparison_data_extractor=None, compare_execution_config=None)
```
* `recording_ids` - An iterator of recording IDs to play and compare the results
* `player` - A function that plays a recording given an ID
* `result_extractor` - A function used to extract the results that need to be compared from the recording and playback
outputs
* `comparator` - A function used to create the comparison result by comparing the recorded vs replayed result
* `comparison_data_extractor` - A function used to extract optional data from the recording that will be passed to the
comparator
* `compare_execution_config` - A configuration specific to the comparison execution flow
For more context, you can look at the [basic service operation](examples/basic_service_operation.py) example.
# Usage and examples - comparing multiple recorded vs replayed operations in one flow
When a code change may affect multiple operations, or when you want to have a general regression job running, you can use
the `PlaybackStudio` and `EqualizerTuner` to run multiple operations together and aggregate the results.
Moreover, the `EqualizerTuner` can be used as a factory to create the relevant plugin functions required to set up an
`Equalizer` to run a comparison of a specific operation.
```python
# Will run 10 playbacks per category
lookup_properties = RecordingLookupProperties(start_date, limit=10)
categories = ['ServiceOperationA', 'ServiceOperationB']
equalizer_tuner = MyEqualizerTuner()
studio = PlaybackStudio(categories, equalizer_tuner, tape_recorder, lookup_properties)
categories_comparison = studio.play()
```
Implementing an `EqualizerTuner`:
```python
class MyEqualizerTuner(EqualizerTuner):
    def create_category_tuning(self, category):
        if category == 'ServiceOperationA':
            return EqualizerTuning(operation_a_playback_function,
                                   operation_a_result_extractor,
                                   operation_a_comparator)
        if category == 'ServiceOperationB':
            return EqualizerTuning(operation_b_playback_function,
                                   operation_b_result_extractor,
                                   operation_b_comparator)
```
# Framework classes - comparing multiple recorded vs replayed operations in one flow
## `PlaybackStudio` class
The studio runs many playbacks for one or more categories (operations), and uses the `Equalizer` to conduct a comparison
between the recorded outputs and the playback outputs.
```python
def __init__(self, categories, equalizer_tuner, tape_recorder, lookup_properties=None,
recording_ids=None, compare_execution_config=None)
```
* `categories` - The categories (operations) to conduct comparison for
* `equalizer_tuner` - Given a category, returns a corresponding equalizer tuning to be used for playback and comparison
* `tape_recorder` - The tape recorder that will be used to play the recordings
* `lookup_properties` - Optional `RecordingLookupProperties` used to filter recordings by
* `recording_ids` - Optional specific recording IDs. If given, the `categories` and `lookup_properties` are ignored and
only the given recording IDs will be played
* `compare_execution_config` - A configuration specific to the comparison execution flow
## `EqualizerTuner` class
An abstract class that is used to create an `EqualizerTuning` per category that contains the correct plugins (functions)
required to play the operation and compare its results.
```python
def create_category_tuning(self, category)
```
Create a new `EqualizerTuning` for the given category
```python
class EqualizerTuning(object):
    def __init__(self, playback_function, result_extractor, comparator,
                 comparison_data_extractor=None):
        self.playback_function = playback_function
        self.result_extractor = result_extractor
        self.comparator = comparator
        self.comparison_data_extractor = comparison_data_extractor
```
# Contributions
Feel free to send pull requests and raise issues. Make sure to add/modify tests to cover your changes.
Please squash the commits in your pull request into a single commit. If there is a good logical reason to split the work
into several commits, prefer multiple pull requests, unless there is a good logical reason to bundle the commits into the
same pull request.
Please note that as of now this framework is compatible with both Python 2 and 3, so any changes should
preserve that. We use the `six` library to help maintain this support.
To contribute, please review our [contributing policy](https://github.com/Optibus/playback/blob/main/CONTRIBUTING.md).
## Running tests
Tests are automatically run in the CI flow using CircleCI. In order to run them locally, you should install the
development requirements:
`pip install -e .[dev]`
and then run `pytest tests`.
| text/markdown | Optibus | eitan@optibus.com | null | null | null | null | [
"Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 3",
"License :: OSI Approved :: BSD License",
"Operating System :: OS Independent"
] | [] | https://github.com/Optibus/playback | null | >=2.7 | [] | [] | [] | [
"parse==1.6.6",
"jsonpickle==0.9.4; python_version < \"3.13\"",
"jsonpickle<5,>=1; python_version >= \"3.13\"",
"six>=1.15.0",
"contextlib2==0.6.0",
"decorator==4.4.2",
"mock==2.0.0; extra == \"dev\"",
"rsa<=4.0; python_version < \"3\" and extra == \"dev\"",
"python-jose<=0.3.2; python_version < \"3... | [] | [] | [] | [] | twine/6.2.0 CPython/3.10.13 | 2026-02-19T10:33:12.006851 | playback_studio-0.3.32.tar.gz | 68,074 | 56/71/55a0a287dbb542422c11d3264b690b6c33da54d24a0753973c32618e5240/playback_studio-0.3.32.tar.gz | source | sdist | null | false | 0dd417559647764f8bf68b5e47e1c48c | 920387fc5176c57b55bc3885c41df3312ef04d5e8cce62613a739156d828104d | 567155a0a287dbb542422c11d3264b690b6c33da54d24a0753973c32618e5240 | null | [
"LICENSE.txt"
] | 949 |
2.4 | aoh | 2.0.5 | A library for calculating Area of Habitat for species distribution mapping | # AOH Calculator
This repository contains both a Python library and command line tools for making Area of Habitat (AOH) rasters from a mix of data sources, following the methodology described in [Brooks et al](https://www.cell.com/trends/ecology-evolution/fulltext/S0169-5347(19)30189-2) and adhering to the IUCN Redlist Technical Working Group guidance on AOH production.
## Overview
An AOH raster combines data on species range, habitat preferences and elevation preferences with raster products such as a Digital Elevation Model (DEM) and a land cover or habitat map, producing a raster that refines the species range down to just those areas that match the species' preferences: its area of habitat, or AOH.
This package provides two implementations of the AOH method: a binary method and a fractional, or proportional, method. The binary method takes a single land cover or habitat map where each pixel is encoded to a particular land cover or habitat class (e.g., the [Copernicus Land Cover map](https://land.copernicus.eu/en/products/global-dynamic-land-cover) or the [Jung habitat map](https://zenodo.org/records/4058819)). The fractional method takes in a set of rasters, one per class, with each pixel being some proportional value. In this approach, if a species has multiple habitat preferences and their maps overlap, the resulting value in the AOH map will be the sum of those values.
By default the package generates a binary or fractional coverage per pixel, depending on your input layers, but if you provide the `pixel-area` option then it will convert the results into metres squared per pixel, calculating the area per pixel based on the map projection and pixel scale of the other rasters.
## Installation
The AOH Calculator is available as a Python package and can be installed via pip:
```bash
pip install aoh
```
This provides both command-line tools and a Python library for programmatic use.
For validation tools that require R, install with the validation extra:
```bash
pip install aoh[validation]
```
You will also need the following R packages installed: lme4, lmerTest, broom.mixed, emmeans, report, sklearn
### Prerequisites
You'll need GDAL installed on your system. The Python GDAL package version should match your system GDAL version. You can check your GDAL version with:
```bash
gdalinfo --version
```
Then install the matching Python package:
```bash
pip install gdal[numpy]==YOUR_VERSION_HERE
```
## Input Data Requirements
To generate AOH rasters, you will need the following inputs:
### Species Data
A GeoJSON file containing species range and attributes. Each file should include:
- **id_no**: IUCN taxon ID of the species
- **season**: Season using IUCN codes (1 = resident, 2 = breeding, 3 = non-breeding, 4 = passage, 5 = unknown)
- **elevation_lower**: Lower elevation bound (in meters) where species is found
- **elevation_upper**: Upper elevation bound (in meters) where species is found
- **full_habitat_code**: Pipe-separated list of IUCN habitat codes (e.g., "1.5|1.6|2.1")
- **geometry**: Polygon or MultiPolygon describing the species' geographic range for this season
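A minimal feature with these attributes might be constructed like this; the taxon ID, elevation bounds and geometry are invented for illustration, and whether your pipeline stores a single feature or a collection per file is up to you:

```python
import json

# Hypothetical single-species feature with the attributes listed above
species_feature = {
    "type": "Feature",
    "properties": {
        "id_no": 12345,
        "season": 1,  # resident
        "elevation_lower": 0,
        "elevation_upper": 2500,
        "full_habitat_code": "1.5|1.6|2.1",
    },
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]]],
    },
}

with open("species_12345.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": [species_feature]}, f)
```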
### Habitat Data
**For binary/classified method:**
- A single GeoTIFF raster where each pixel contains an integer value representing a habitat or land cover class
- Examples: Copernicus Global Land Cover, Jung habitat classification
**For fractional/proportional method:**
- A directory containing multiple GeoTIFF rasters, one per habitat class
- Files must be named `lcc_{value}.tif` where `{value}` matches the crosswalk table
- Each pixel contains a fractional value (typically 0.0-1.0) indicating proportional coverage
- **Note**: Use the `aoh-habitat-process` tool (described below) to convert a classified habitat map into this format while optionally reprojecting and rescaling
### Elevation Data
**Single DEM (recommended for high-resolution analyses):**
- A GeoTIFF containing elevation values in meters
**Min/Max DEM pair (for downscaled analyses):**
- Two GeoTIFFs containing minimum and maximum elevation per pixel in meters
- Useful when working at coarser resolution while maintaining elevation accuracy
### Crosswalk Table
A CSV file mapping IUCN habitat codes to raster values with two columns:
- **code**: IUCN habitat code (e.g., "1.5", "2.1")
- **value**: Corresponding integer value in the land cover or habitat raster(s)
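A toy crosswalk file with that shape could be generated like this; the code-to-value pairings below are invented and must instead match your actual raster legend:

```python
import csv

# Hypothetical IUCN-code-to-raster-class pairings, for illustration only
rows = [("1.5", 50), ("1.6", 60), ("2.1", 120)]
with open("crosswalk.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["code", "value"])
    writer.writerows(rows)
```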
### Optional: Weight Layers
GeoTIFF rasters for scaling or masking:
- **Mask raster**: Binary raster to clip results to specific regions (e.g., land areas only)
- **Custom pixel area raster**: There is a built-in feature to scale by geometric area per pixel, but if that does not suit your needs you can use your own area-per-pixel raster here.
### Technical Requirements
- All rasters must share the same projection and pixel resolution
- Elevation units must match between species data and DEM
- This code has been tested with Lumbierres, Jung, and ESA datasets
- Tested projections: Mercator, Mollweide, and Behrmann
## Usage
### Python Library
The AOH Calculator provides two main functions for programmatic use:
```python
from aoh import aohcalc_binary, aohcalc_fractional
# Binary method - for classified habitat maps
aohcalc_binary(
habitat_path="landcover.tif",
elevation_path="dem.tif", # or tuple of (min_dem, max_dem)
crosswalk_path="iucn_to_landcover.csv",
species_data_path="species_123.geojson",
output_directory_path="results/",
weight_layer_paths=["pixel_areas.tif"], # optional
force_habitat=False, # optional
multiply_by_area_per_pixel=False, # optional
)
# Fractional method - for proportional habitat coverage
aohcalc_fractional(
habitats_directory_path="fractional_habitats/",
elevation_path=("dem_min.tif", "dem_max.tif"), # or single DEM
crosswalk_path="iucn_to_habitat.csv",
species_data_path="species_123.geojson",
output_directory_path="results/",
weight_layer_paths=["pixel_areas.tif"], # optional
force_habitat=False, # optional
multiply_by_area_per_pixel=False, # optional
)
# Other utilities
from aoh import tidy_data
from aoh.summaries import species_richness
from aoh.validation import collate_data
```
Both functions create two output files:
- `{id_no}_{season}.tif`: The AOH raster
- `{id_no}_{season}.json`: Metadata including range_total, hab_total, dem_total, aoh_total, and prevalence
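As a sketch of how the metadata manifest might be consumed (the values below are invented, and it is an assumption here that prevalence equals `aoh_total / range_total`):

```python
import json

# Parse a hypothetical manifest; field names follow the list above
manifest = json.loads("""{
    "range_total": 1000.0,
    "hab_total": 600.0,
    "dem_total": 800.0,
    "aoh_total": 450.0,
    "prevalence": 0.45
}""")
# Assumed relationship: prevalence is the fraction of the range kept in the AOH
prevalence = manifest["aoh_total"] / manifest["range_total"]
```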
For detailed examples, see the docstrings on the functions themselves.
# Command Line Tools
## aoh-calc
This is the main command for calculating the AOH of a single species. It supports both binary (classified) and fractional (proportional) habitat inputs.
```bash
$ aoh-calc --help
usage: aoh-calc [-h] [--fractional_habitats FRACTIONAL_HABITAT_PATH | --classified_habitat DISCRETE_HABITAT_PATH]
[--elevation ELEVATION_PATH | --elevation-min MIN_ELEVATION_PATH --elevation-max MAX_ELEVATION_PATH]
[--weights WEIGHT_PATHS] --crosswalk CROSSWALK_PATH
--speciesdata SPECIES_DATA_PATH [--force-habitat]
--output OUTPUT_PATH
Area of habitat calculator.
options:
-h, --help show this help message and exit
--fractional_habitats FRACTIONAL_HABITAT_PATH
Directory of fractional habitat rasters, one per habitat class.
--classified_habitat DISCRETE_HABITAT_PATH
Habitat raster, with each class a discrete value per pixel.
--elevation ELEVATION_PATH
Elevation raster (for high-resolution analyses).
--elevation-min MIN_ELEVATION_PATH
Minimum elevation raster (for downscaled analyses).
--elevation-max MAX_ELEVATION_PATH
Maximum elevation raster (for downscaled analyses).
--weights WEIGHT_PATHS
Optional weight layer raster(s) to multiply with result.
Can specify multiple times. Common uses: pixel area
correction, spatial masking.
--pixel-area
If set, multiply each pixel by its area in metres squared based
on map projection and pixel scale of other input rasters.
--crosswalk CROSSWALK_PATH
Path of habitat crosswalk table.
--speciesdata SPECIES_DATA_PATH
Single species/seasonality geojson.
--force-habitat If set, don't treat an empty habitat layer as per IUCN RLTWG.
--output OUTPUT_PATH Directory where area geotiffs should be stored.
```
### Usage Notes
**Habitat Input Options:**
- Use `--fractional_habitats` for a directory of per-class rasters with proportional values
- Use `--classified_habitat` for a single raster with discrete habitat class values
- You must specify exactly one of these options
**Elevation Input Options:**
- Use `--elevation` for a single DEM raster (recommended for high-resolution analyses)
- Use `--elevation-min` and `--elevation-max` together for min/max elevation pairs (for downscaled analyses)
- You must specify exactly one of these options
**Weight Layers (Optional):**
Weight layers are rasters that are multiplied with the AOH result. You can specify `--weights` multiple times to apply multiple layers, which will be multiplied together.
Common use cases:
- **Spatial masking**: Clip results to specific regions (e.g., land areas only)
```bash
--weights land_mask.tif
```
**Output:**
Two files are created in the output directory:
- `{id_no}_{season}.tif`: The AOH raster
- `{id_no}_{season}.json`: Metadata manifest with statistics
**Other Flags:**
- `--force-habitat`: Prevents fallback to range when habitat filtering yields zero area (useful for land-use change scenarios)
See the "Input Data Requirements" section above for detailed format specifications.
## aoh-habitat-process
This command prepares habitat data for use with the **fractional method** by converting a classified habitat map into a set of per-class rasters. It splits a single habitat raster (where each pixel contains one class value) into multiple rasters with fractional coverage values, while optionally rescaling and reprojecting.
While terrestrial AOH calculations typically have one habitat class per pixel, other domains like marine environments (which represent 3D space) may have multiple overlapping habitats. This tool enables the fractional method to work across all realms by creating the required per-class raster format.
```bash
$ aoh-habitat-process --help
usage: aoh-habitat-process [-h] --habitat HABITAT_PATH --scale PIXEL_SCALE
[--projection TARGET_PROJECTION]
--output OUTPUT_PATH [-j PROCESSES_COUNT]
Downsample habitat map to raster per terrain type.
options:
-h, --help show this help message and exit
--habitat HABITAT_PATH
Path of initial combined habitat map.
--scale PIXEL_SCALE Optional output pixel scale value, otherwise same as
source.
--projection TARGET_PROJECTION
Optional target projection, otherwise same as source.
--output OUTPUT_PATH Destination folder for raster files.
-j PROCESSES_COUNT Optional number of concurrent threads to use.
```
# Summary Tools
These commands take a set of AOH maps and generate summary statistics useful for analysing groups of species.
## aoh-species-richness
The species richness map is just an indicator of how many species exist in a given area. It takes each AOH map, converts it to a boolean layer to indicate presence, and then sums the resulting boolean raster layers to give you a count in each pixel of how many species are there.
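The richness calculation described above can be sketched on toy arrays with NumPy; the real tool operates on GeoTIFF layers:

```python
import numpy as np

# Two toy 2x2 "AOH" rasters; a non-zero pixel means the species is present
aohs = [
    np.array([[0.0, 2.5], [0.0, 1.0]]),
    np.array([[3.0, 0.0], [0.0, 4.0]]),
]
# Convert each AOH to a boolean presence layer, then sum per pixel
richness = np.sum([aoh > 0 for aoh in aohs], axis=0)
```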
```bash
$ aoh-species-richness --help
usage: aoh-species-richness [-h] --aohs_folder AOHS --output OUTPUT
[-j PROCESSES_COUNT]
Calculate species richness
options:
-h, --help show this help message and exit
--aohs_folder AOHS Folder containing set of AoHs
--output OUTPUT Destination GeoTIFF file for results.
-j PROCESSES_COUNT Number of concurrent threads to use.
```
## aoh-endemism
Endemism is an indicator of how much an area of land contributes to a species' overall habitat: for a species with a small area of habitat, each pixel is more precious than it is for a species with a vast area over which it can be found. The endemism map is generated from the set of AoHs and the species richness map: for each species it works out the proportion of that species' AoH within a given pixel, and then calculates the geometric mean per pixel across all species present in that pixel.
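The per-pixel arithmetic can be sketched with NumPy on toy arrays where every species is present in every pixel; the real tool handles GeoTIFFs and species absent from a pixel:

```python
import numpy as np

# Two toy single-row AOH rasters (species x rows x cols)
aohs = np.array([
    [[1.0, 3.0]],  # species A, total AOH = 4
    [[2.0, 2.0]],  # species B, total AOH = 4
])
# Proportion of each species' total AOH held by each pixel
proportions = aohs / aohs.sum(axis=(1, 2), keepdims=True)
# Geometric mean across species, per pixel
endemism = np.exp(np.log(proportions).mean(axis=0))
```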
```bash
$ aoh-endemism --help
usage: aoh-endemism [-h] --aohs_folder AOHS
--species_richness SPECIES_RICHNESS --output OUTPUT
[-j PROCESSES_COUNT]
Calculate species richness
options:
-h, --help show this help message and exit
--aohs_folder AOHS Folder containing set of AoHs
--species_richness SPECIES_RICHNESS
GeoTIFF containing species richness
--output OUTPUT Destination GeoTIFF file for results.
-j PROCESSES_COUNT Number of concurrent threads to use.
```
# Validation Tools
In [Dahal et al](https://gmd.copernicus.org/articles/15/5093/2022/) a method is described for validating a set of AoH maps. This is implemented as the validation commands below, and borrows heavily from work by [Francesca Ridley](https://www.researchgate.net/profile/Francesca-Ridley).
## aoh-collate-data
Before running validation, the metadata provided for each AoH map must be collated into a single table using this command:
```bash
$ aoh-collate-data --help
usage: aoh-collate-data [-h] --aoh_results AOHS_PATH --output OUTPUT_PATH
Collate metadata from AoH build.
options:
-h, --help show this help message and exit
--aoh_results AOHS_PATH
Path of all the AoH outputs.
--output OUTPUT_PATH Destination for collated CSV.
```
## aoh-validate-prevalence
To run the model validation use this command:
```bash
$ aoh-validate-prevalence --help
usage: aoh-validate-prevalence [-h] --collated_aoh_data COLLATED_DATA_PATH
--output OUTPUT_PATH
Validate map prevalence.
options:
-h, --help show this help message and exit
--collated_aoh_data COLLATED_DATA_PATH
CSV containing collated AoH data
--output OUTPUT_PATH CSV of outliers.
```
This will produce a CSV file listing just the AoH maps that fail model validation.
**Note:** The validation tools require R to be installed on your system with the `lme4` and `lmerTest` packages.
## aoh-fetch-gbif-data
This command fetches occurrence data from [GBIF](https://gbif.org) to do occurrence checking as per Dahal et al.
```bash
$ aoh-fetch-gbif-data --help
usage: aoh-fetch-gbif-data [-h] --collated_aoh_data COLLATED_DATA_PATH [--gbif_username GBIF_USERNAME] [--gbif_email GBIF_EMAIL] [--gbif_password GBIF_PASSWORD] --taxa TAXA --output_dir OUTPUT_DIR_PATH
Fetch GBIF records for species for validation.
options:
-h, --help show this help message and exit
--collated_aoh_data COLLATED_DATA_PATH
CSV containing collated AoH data
--gbif_username GBIF_USERNAME
Username of user's GBIF account. Can also be set in environment.
--gbif_email GBIF_EMAIL
E-mail of user's GBIF account. Can also be set in environment.
--gbif_password GBIF_PASSWORD
Password of user's GBIF account. Can also be set in environment.
--taxa TAXA
--output_dir OUTPUT_DIR_PATH
Destination directory for GBIF data.
Environment Variables:
GBIF_USERNAME Username of user's GBIF account.
GBIF_EMAIL E-mail of user's GBIF account.
GBIF_PASSWORD Password of user's GBIF account.
```
Important notes:
1. You will need a GBIF account for this.
2. This can take a long time, particularly for birds as there are so many records.
3. It can also generate a lot of data, hundreds of gigabytes worth, so ensure you have enough storage space!
## aoh-validate-occurrences
This command will run occurrence validation using the GBIF data fetched with the previous command.
```bash
aoh-validate-occurences --help
usage: aoh-validate-occurences [-h] --gbif_data_path GBIF_DATA_PATH --species_data SPECIES_DATA_PATH --aoh_results AOHS_PATH --output OUTPUT_PATH [-j PROCESSES_COUNT]
Validate occurrence prevelance.
options:
-h, --help show this help message and exit
--gbif_data_path GBIF_DATA_PATH
Data containing downloaded GBIF data.
--species_data SPECIES_DATA_PATH
Path of all the species range data.
--aoh_results AOHS_PATH
Path of all the AoH outputs.
--output OUTPUT_PATH CSV of outliers.
-j PROCESSES_COUNT Optional number of concurrent threads to use.
```
| text/markdown | null | Michael Dales <mwd24@cam.ac.uk> | null | null | null | gis, species, habitat, biodiversity, ecology | [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Science/Research",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Scientific/Engineering :: GIS"
] | [] | null | null | >=3.10 | [] | [] | [] | [
"alive-progress",
"numpy<3.0,>=1.24",
"geopandas<2.0,>=1.0",
"psutil",
"pyproj<4.0,>=3.4",
"scikit-image<1.0,>=0.20",
"yirgacheffe<2.0,>=1.12.5",
"zenodo_search",
"pandas<3.0,>=2.0",
"gdal[numpy]<3.13,>=3.8",
"tomli",
"pymer4==0.8.2; extra == \"validation\"",
"rpy2; extra == \"validation\"",... | [] | [] | [] | [
"Homepage, https://github.com/quantifyearth/aoh-calculator",
"Repository, https://github.com/quantifyearth/aoh-calculator",
"Issues, https://github.com/quantifyearth/aoh-calculator/issues"
] | twine/6.2.0 CPython/3.10.19 | 2026-02-19T10:32:52.189734 | aoh-2.0.5.tar.gz | 41,617 | 2f/1b/13b449f7aecd2ad2899d95cedd60605b533964ab9b6cbfb2d0f9d7490b3d/aoh-2.0.5.tar.gz | source | sdist | null | false | 6d015397059f1657b7fb18243b90e8ad | df5e9df6badfd7c9c398ebb9dbd3385f5315b9a87ef9872e5917c0e4ff8e2f9c | 2f1b13b449f7aecd2ad2899d95cedd60605b533964ab9b6cbfb2d0f9d7490b3d | ISC | [
"LICENSE"
] | 258 |
2.4 | genvm-linter | 0.7.1 | Fast validation and schema extraction for GenLayer intelligent contracts | # genvm-linter
Fast validation and schema extraction for GenLayer intelligent contracts.
## Installation
```bash
pip install genvm-linter
```
## Usage
```bash
# Run both lint and validate (default)
genvm-lint check contract.py
# Fast AST safety checks only (~50ms)
genvm-lint lint contract.py
# Full SDK semantic validation (~200ms cached)
genvm-lint validate contract.py
# Extract ABI schema
genvm-lint schema contract.py
genvm-lint schema contract.py --output abi.json
# Pre-download GenVM artifacts
genvm-lint download # Latest
genvm-lint download --version v0.2.12 # Specific version
genvm-lint download --list # Show cached
# Agent-friendly JSON output
genvm-lint check contract.py --json
```
## How It Works
### Layer 1: AST Lint Checks (Fast)
- Forbidden imports (`random`, `os`, `time`, etc.)
- Non-deterministic patterns (`float()`, `time.time()`)
- Structure validation (dependency header)
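The kind of check Layer 1 performs can be illustrated with Python's `ast` module (a simplified sketch, not genvm-linter's actual code; the forbidden-module set here is just the examples named above):

```python
import ast

FORBIDDEN = {"random", "os", "time"}  # examples from the list above

def find_forbidden_imports(source: str) -> list[str]:
    """Return names of forbidden modules imported by the contract source."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            hits += [a.name for a in node.names if a.name in FORBIDDEN]
        elif isinstance(node, ast.ImportFrom):
            if node.module in FORBIDDEN:
                hits.append(node.module)
    return hits

print(find_forbidden_imports("import os\nfrom time import time\nimport json"))
# → ['os', 'time']
```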
### Layer 2: SDK Validation (Accurate)
- Downloads GenVM release artifacts (cached at `~/.cache/genvm-linter/`)
- Loads exact SDK version specified in contract header
- Validates types, decorators, storage fields
- Extracts ABI schema
## Exit Codes
- `0` - All checks passed
- `1` - Lint or validation errors
- `2` - Contract file not found
- `3` - SDK download failed
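In scripts, the documented exit codes can be switched on directly. A minimal sketch (assumes `genvm-lint` is on `PATH`):

```python
import subprocess

# Meanings of the documented exit codes
EXIT_MEANINGS = {
    0: "all checks passed",
    1: "lint or validation errors",
    2: "contract file not found",
    3: "SDK download failed",
}

def check_contract(path: str) -> str:
    """Run `genvm-lint check` and map its exit code to a message."""
    proc = subprocess.run(["genvm-lint", "check", path])
    return EXIT_MEANINGS.get(
        proc.returncode, f"unexpected exit code {proc.returncode}"
    )
```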
## VS Code Extension
This linter is used by the [GenLayer VS Code Extension](https://github.com/genlayerlabs/vscode-extension) for real-time contract validation.
## Development
```bash
git clone https://github.com/genlayerlabs/genvm-linter.git
cd genvm-linter
pip install -e ".[dev]"
pytest
```
## License
MIT
| text/markdown | null | GenLayer Labs <dev@genlayer.com> | null | null | null | blockchain, genlayer, genvm, linter, smart-contracts, validation | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Software Develop... | [] | null | null | >=3.10 | [] | [] | [] | [
"click>=8.0",
"numpy>=1.20",
"pyright>=1.1.350",
"pytest>=7.0; extra == \"dev\"",
"ruff>=0.1.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/genlayerlabs/genvm-linter",
"Repository, https://github.com/genlayerlabs/genvm-linter",
"Issues, https://github.com/genlayerlabs/genvm-linter/issues",
"Documentation, https://docs.genlayer.com/developers/intelligent-contracts/tooling-setup"
] | twine/6.2.0 CPython/3.12.12 | 2026-02-19T10:32:38.015436 | genvm_linter-0.7.1.tar.gz | 53,895 | d6/0d/3ce6bd79fad2bf6d328cb522e737e3b3e0ce2bbc466752eec28afa6ef576/genvm_linter-0.7.1.tar.gz | source | sdist | null | false | 746c118bf7aef2a4988904883a000468 | 9dd0ae4fc4687c84b2d6cf8b60d45a149295a959454cd198cda6c0169669d2ed | d60d3ce6bd79fad2bf6d328cb522e737e3b3e0ce2bbc466752eec28afa6ef576 | MIT | [
"LICENSE"
] | 286 |
2.4 | sag-py-fastapi-request-id | 1.0.5 | Adds an unique identifiert to fastapi requests | # sag_py_fastapi_request_id
[![Maintainability][codeclimate-image]][codeclimate-url]
[![Coverage Status][coveralls-image]][coveralls-url]
[](https://snyk.io/test/github/SamhammerAG/sag_py_fastapi_request_id)
[coveralls-image]:https://coveralls.io/repos/github/SamhammerAG/sag_py_fastapi_request_id/badge.svg?branch=master
[coveralls-url]:https://coveralls.io/github/SamhammerAG/sag_py_fastapi_request_id?branch=master
[codeclimate-image]:https://api.codeclimate.com/v1/badges/1d0606922774a8ac4a7d/maintainability
[codeclimate-url]:https://codeclimate.com/github/SamhammerAG/sag_py_fastapi_request_id/maintainability
This library provides a way to identify all log entries that belong to a single request.
## What it does
* Provides a middleware to generate a random request id for every request
* Contains a logging filter that adds the request id as field to every log entry
## How to use
### Installation
pip install sag-py-fastapi-request-id
### Add the middleware
Add this middleware so that the request ids are generated:
```python
from sag_py_fastapi_request_id.request_context_middleware import RequestContextMiddleware
from fastapi import FastAPI
app = FastAPI(...)
app.add_middleware(RequestContextMiddleware)
```
### Get the request id
The request id can be accessed via the context:
```python
from sag_py_fastapi_request_id.request_context import get_request_id as get_request_id_from_context
request_id = get_request_id_from_context()
```
This works in async calls but not in sub threads (without additional changes).
See:
* https://docs.python.org/3/library/contextvars.html
* https://kobybass.medium.com/python-contextvars-and-multithreading-faa33dbe953d
### Add request id field to logging
It is possible to log the request id by adding a filter.
```python
import logging
import sys
from sag_py_fastapi_request_id.request_context_logging_filter import RequestContextLoggingFilter
console_handler = logging.StreamHandler(sys.stdout)
console_handler.addFilter(RequestContextLoggingFilter())
```
The filter adds the field request_id if it has a value in the context.
## How to start developing
### With vscode
Just install vscode with dev containers extension. All required extensions and configurations are prepared automatically.
### With pycharm
* Install latest pycharm
* Install pycharm plugin Mypy
* Configure the python interpreter/venv
* pip install requirements-dev.txt
* Ctrl+Alt+S => Click Tools => Actions on save => Reformat code
* Restart pycharm
## How to publish
* Update the version in setup.py and commit your change
* Create a tag with the same version number
* Let github do the rest
| text/markdown | Samhammer AG | support@samhammer.de | null | null | MIT | auth, fastapi, keycloak | [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Software Development :: Libraries",
"Topic :: Software Development"
] | [] | https://github.com/SamhammerAG/sag_py_fastapi_request_id | null | >=3.12 | [] | [] | [] | [
"contextvars>=2.4",
"starlette>=0.52.1",
"mypy; extra == \"dev\"",
"build>=1.2.2.post1; extra == \"dev\"",
"pytest; extra == \"dev\"",
"pytest-asyncio; extra == \"dev\"",
"pytest-cov; extra == \"dev\"",
"ruff; extra == \"dev\"",
"coverage-lcov; extra == \"dev\"",
"mock; extra == \"dev\"",
"types... | [] | [] | [] | [
"Documentation, https://github.com/SamhammerAG/sag_py_fastapi_request_id",
"Bug Reports, https://github.com/SamhammerAG/sag_py_fastapi_request_id/issues",
"Source, https://github.com/SamhammerAG/sag_py_fastapi_request_id"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:31:02.436605 | sag_py_fastapi_request_id-1.0.5.tar.gz | 5,533 | 4c/a3/5f1cdf4744e0f124d6926317c39aad80518d3ac09ce4fcf2b03f0149c6bc/sag_py_fastapi_request_id-1.0.5.tar.gz | source | sdist | null | false | d9472c24b4be91e464e757304ceede72 | 771b13e577202f0c8ac16115e32d40bd65d37f7e226a0094074deac283b7563c | 4ca35f1cdf4744e0f124d6926317c39aad80518d3ac09ce4fcf2b03f0149c6bc | null | [
"LICENSE.txt"
] | 299 |
2.4 | xpander-sdk | 2.0.248 | xpander.ai Backend-as-a-service for AI Agents - SDK | # xpander.ai SDK
[](https://www.python.org/downloads/) [](https://opensource.org/licenses/MIT) [](https://docs.xpander.ai) [](https://pypi.org/project/xpander-sdk/) [](https://pepy.tech/project/xpander-sdk)
The official Python SDK for xpander.ai - a powerful Backend-as-a-Service (BaaS) platform for building, deploying, and managing AI agents at scale.
## 🚀 Overview
xpander.ai SDK provides comprehensive tools for:
- **Agent Management**: Create, configure, and manage AI agents
- **Task Execution**: Handle complex task workflows and execution
- **Tools Repository**: Integrate external tools and services
- **Knowledge Bases**: Manage and search knowledge repositories
- **Event Handling**: Event-driven programming with decorators
- **Real-time Monitoring**: Track agent performance and execution
## 📦 Installation
```bash
pip install xpander-sdk
```
### With Optional Dependencies
```bash
# For Agno framework support (2.0+)
pip install xpander-sdk[agno]
# For development
pip install xpander-sdk[dev]
```
## 🔧 Quick Start
### 1. Configuration
```python
from xpander_sdk import Configuration
# Using environment variables (recommended)
config = Configuration()
# Or explicit configuration
config = Configuration(
api_key="your-api-key",
organization_id="your-org-id",
base_url="https://inbound.xpander.ai"
)
```
### 2. Basic Agent Operations
```python
from xpander_sdk import Agents, Agent, Tasks, AgentDeploymentType
# Initialize agents module
agents = Agents(configuration=config)
# List all agents
agent_list = await agents.alist()
# Load existing agent
agent = await agents.aget("agent-id")
# Create and execute a task
task = await agent.acreate_task(
prompt="Help me analyze this data",
file_urls=["https://example.com/data.csv"]
)
```
### 3. Task Management
```python
from xpander_sdk import Tasks, Task, AgentExecutionStatus
# Initialize tasks module
tasks = Tasks(configuration=config)
# Load and manage tasks
task = await tasks.aget("task-id")
await task.aset_status(AgentExecutionStatus.Running)
await task.asave()
# Retrieve task activity log
activity_log = await task.aget_activity_log()
for message in activity_log.messages:
print(f"{message.role}: {message.content.text}")
```
### 4. Tools Integration
```python
from xpander_sdk import register_tool, ToolsRepository
# Register a local tool
@register_tool
def check_weather(location: str) -> str:
"""Check weather for a given location."""
return f"Weather in {location}: Sunny, 25°C"
# Register a tool with graph synchronization
@register_tool(add_to_graph=True)
async def analyze_data(data: list, analysis_type: str) -> dict:
"""Analyze data from multiple sources."""
return {
"analysis_type": analysis_type,
"data_points": len(data),
"status": "completed"
}
# Use tools repository
tools = ToolsRepository(configuration=config)
weather_tool = tools.get_tool_by_id("check_weather")
result = await weather_tool.ainvoke(
agent_id="agent-id",
payload={"location": "New York"}
)
```
### 5. Knowledge Base Operations
```python
from xpander_sdk import KnowledgeBases, KnowledgeBase
# Initialize knowledge bases
kb_module = KnowledgeBases(configuration=config)
# Create knowledge base
kb = await kb_module.acreate(
name="Company Docs",
description="Internal documentation"
)
# Add documents
documents = await kb.aadd_documents([
"https://example.com/doc1.pdf",
"https://example.com/doc2.txt"
])
# Search knowledge base
results = await kb.asearch(
search_query="product pricing",
top_k=5
)
```
### 6. Event-Driven Programming
```python
from xpander_sdk import on_task, Events
# Basic task handler
@on_task
async def handle_task(task):
print(f"Processing task: {task.id}")
# Task processing logic here
task.result = "Task processed successfully"
return task
# Task handler with configuration
@on_task(configuration=config)
def sync_task_handler(task):
print(f"Handling task synchronously: {task.id}")
task.result = "Sync processing complete"
return task
```
## 📚 Core Modules
| Module | Description | Documentation |
| ------------------- | ----------------------------------------- | ---------------------------------------------------------------------------------------- |
| **Agents** | Agent creation, management, and execution | [Agents Guide](https://github.com/xpander-ai/xpander-sdk/blob/main/docs/AGENTS.md) |
| **Tasks** | Task lifecycle and execution management | [Tasks Guide](https://github.com/xpander-ai/xpander-sdk/blob/main/docs/TASKS.md) |
| **ToolsRepository** | External tools and integrations | [Tools Guide](https://github.com/xpander-ai/xpander-sdk/blob/main/docs/TOOLS.md) |
| **KnowledgeBases** | Knowledge management and search | [Knowledge Guide](https://github.com/xpander-ai/xpander-sdk/blob/main/docs/KNOWLEDGE.md) |
| **Events** | Event-driven programming | [Events Guide](https://github.com/xpander-ai/xpander-sdk/blob/main/docs/EVENTS.md) |
| **Backend** | Agent runtime arguments for frameworks | [Backend Guide](https://github.com/xpander-ai/xpander-sdk/blob/main/docs/BACKEND.md) |
## 🔄 Async/Sync Support
The SDK provides both asynchronous and synchronous interfaces:
```python
# Asynchronous (recommended for production)
agent = await Agent.aload("agent-id")
task = await agent.acreate_task(prompt="input data")
# Synchronous (convenient for scripts)
agent = Agent.load("agent-id")
task = agent.create_task(prompt="input data")
```
## 📖 Advanced Examples
### Multi-Agent Orchestration
```python
# Load multiple specialized agents
agents_list = await agents.alist()
data_agent = await agents.aget("data-agent-id")
writer_agent = await agents.aget("writer-agent-id")
# Chain agent executions
analysis_task = await data_agent.acreate_task(prompt="Analyze sales data")
report_task = await writer_agent.acreate_task(
prompt=f"Write a report based on: {analysis_task.result}"
)
```
### Tool Integration with MCP Servers
```python
from xpander_sdk import MCPServerDetails, MCPServerType
# Configure MCP server
mcp_server = MCPServerDetails(
name="data-server",
type=MCPServerType.STDIO,
command="python",
args=["-m", "mcp_server"],
env={"API_KEY": "your-key"}
)
# MCP servers are configured at the platform level
# and tools become available through ToolsRepository
```
### Streaming Task Execution
```python
# Create a task with event streaming enabled
task = await agent.acreate_task(
prompt="complex analysis task",
events_streaming=True
)
# Stream events from the task
async for event in task.aevents():
print(f"Event Type: {event.type}")
print(f"Event Data: {event.data}")
```
### Authentication Events Callback
Handle authentication events in real-time. This callback is triggered only for authentication flows (e.g., MCP OAuth requiring user login).
**You can use both approaches simultaneously** - decorated handlers will always be invoked, and you can also pass an explicit callback for additional handling.
You can provide the callback in two ways:
#### Option 1: Direct Function
```python
from xpander_sdk import Backend
from xpander_sdk.modules.agents.sub_modules.agent import Agent
from xpander_sdk.modules.tasks.sub_modules.task import Task, TaskUpdateEvent
from agno.agent import Agent as AgnoAgent
# Define event callback (async or sync)
async def my_event_callback(agent: Agent, task: Task, event: TaskUpdateEvent):
"""Called for authentication events only"""
# event.type will always be "auth_event"
print(f"Authentication required: {event.data}")
# Display login URL or handle OAuth flow
# Get args with callback
backend = Backend(configuration=config)
args = await backend.aget_args(
agent_id="agent-123",
task=my_task,
auth_events_callback=my_event_callback
)
```
#### Option 2: Decorator (Auto-registered)
```python
from xpander_sdk import Backend, on_auth_event
from xpander_sdk.modules.agents.sub_modules.agent import Agent
from xpander_sdk.modules.tasks.sub_modules.task import Task, TaskUpdateEvent
from agno.agent import Agent as AgnoAgent
# Use decorator - auto-registers globally
@on_auth_event
async def handle_auth(agent: Agent, task: Task, event: TaskUpdateEvent):
# event.type will always be "auth_event"
print(f"Authentication required for {agent.name}")
print(f"Auth data: {event.data}")
# Decorated handler is automatically invoked - no need to pass it
backend = Backend(configuration=config)
args = await backend.aget_args(
agent_id="agent-123",
task=my_task
)
```
#### Option 3: Combine Both
```python
from xpander_sdk import Backend, on_auth_event
# Global handler for all auth events
@on_auth_event
async def log_auth(agent, task, event):
print(f"[GLOBAL] Auth event for {agent.name}")
# Additional one-time handler
async def custom_handler(agent, task, event):
print(f"[CUSTOM] Specific handling for this call")
# Both handlers will be invoked
args = await backend.aget_args(
agent_id="agent-123",
auth_events_callback=custom_handler # Optional additional callback
)
# Use with Agno
agno_agent = AgnoAgent(**args)
result = await agno_agent.arun(
input="Process this data",
stream=True
)
```
### Task Activity Monitoring
```python
from xpander_sdk import Task
from xpander_sdk.models.activity import (
AgentActivityThreadMessage,
AgentActivityThreadToolCall,
AgentActivityThreadReasoning
)
# Load a completed task
task = await Task.aload("task-id")
# Get detailed activity log
activity_log = await task.aget_activity_log()
# Analyze messages between user and agent
for message in activity_log.messages:
if isinstance(message, AgentActivityThreadMessage):
print(f"{message.role}: {message.content.text}")
elif isinstance(message, AgentActivityThreadToolCall):
# Tool call
print(f"Tool: {message.tool_name}")
print(f"Payload: {message.payload}")
print(f"Result: {message.result}")
elif isinstance(message, AgentActivityThreadReasoning):
# Reasoning step
print(f"Reasoning ({message.type}): {message.thought}")
# Synchronous version
task = Task.load("task-id")
activity_log = task.get_activity_log()
```
### Local Task Testing
```python
from xpander_sdk.modules.tasks.models.task import LocalTaskTest, AgentExecutionInput
from xpander_sdk.models.shared import OutputFormat
from xpander_sdk import on_task
# Define a local test task
local_task = LocalTaskTest(
input=AgentExecutionInput(text="What can you do?"),
output_format=OutputFormat.Json,
output_schema={"capabilities": "list of capabilities"}
)
# Test with local task
@on_task(test_task=local_task)
async def handle_test_task(task):
task.result = {
"capabilities": [
"Data analysis",
"Text processing",
"API integration"
]
}
return task
```
## 🧪 Testing
```bash
# Run tests
pytest tests/
# Run with coverage
pytest tests/ --cov=xpander_sdk
# Run specific test
pytest tests/test_agents.py::test_agent_creation
```
## 🏗️ Architecture
```plaintext
xpander_sdk/
├── core/ # Core API client and base classes
├── models/ # Pydantic models and configurations
├── modules/
│ ├── agents/ # Agent management
│ ├── tasks/ # Task execution
│ ├── tools_repository/ # Tools and integrations
│ ├── knowledge_bases/ # Knowledge management
│ ├── events/ # Event handling
│ └── backend/ # Agent runtime arguments for frameworks
└── utils/ # Utility functions
```
## 🔒 Authentication
The SDK supports multiple authentication methods:
### Environment Variables (Recommended)
```bash
export XPANDER_API_KEY="your-api-key"
export XPANDER_ORGANIZATION_ID="your-org-id"
export XPANDER_BASE_URL="https://inbound.xpander.ai" # Optional
export XPANDER_AGENT_ID="your-agent-id" # Optional for Backend module
```
### Configuration Object
```python
config = Configuration(
api_key="your-api-key",
organization_id="your-org-id"
)
```
### From File
```python
# .env file
XPANDER_API_KEY=your-api-key
XPANDER_ORGANIZATION_ID=your-org-id
# Python code
from dotenv import load_dotenv
load_dotenv()
config = Configuration()
```
## 🏢 Self-Hosted Deployment
If you're using a self-hosted xpander.ai deployment, configure the SDK to point to your Agent Controller endpoint.
**Important**: Use the **Agent Controller API key** generated during Helm installation, not your xpander.ai cloud API key.
### Configuration
```bash
# Set environment variables
export XPANDER_API_KEY="your-agent-controller-api-key" # From Helm installation
export XPANDER_ORGANIZATION_ID="your-org-id"
export XPANDER_BASE_URL="https://agent-controller.my-company.com"
```
Or configure explicitly:
```python
from xpander_sdk import Configuration
config = Configuration(
api_key="your-agent-controller-api-key", # From Helm installation
organization_id="your-org-id",
base_url="https://agent-controller.my-company.com"
)
```
### Using with Agno Framework
```python
from xpander_sdk import Backend, Configuration
from agno.agent import Agent
# Configure for self-hosted
config = Configuration(
api_key="your-agent-controller-api-key", # From Helm installation
organization_id="your-org-id",
base_url="https://agent-controller.my-company.com"
)
# Initialize Backend with self-hosted config
backend = Backend(configuration=config)
# Create agent - it will use your self-hosted infrastructure
agno_agent = Agent(**backend.get_args(agent_id="agent-123"))
# Run agent
result = await agno_agent.arun(
input="What can you help me with?",
stream=True
)
```
### Complete Self-Hosted Example
```python
import asyncio
from xpander_sdk import Configuration, Agent
async def main():
# Configure for self-hosted deployment
config = Configuration(
api_key="your-agent-controller-api-key", # From Helm installation
organization_id="your-org-id",
base_url="https://agent-controller.my-company.com"
)
# Load agent from self-hosted deployment
agent = await Agent.aload("agent-123", configuration=config)
print(f"Agent: {agent.name}")
# Create and execute task
task = await agent.acreate_task(
prompt="Analyze Q4 sales data",
file_urls=["https://example.com/sales-q4.csv"]
)
print(f"Task created: {task.id}")
print(f"Status: {task.status}")
if __name__ == "__main__":
asyncio.run(main())
```
**Important**: Make sure your `base_url` points to the Agent Controller endpoint (e.g., `https://agent-controller.{your-domain}`), not the root domain.
📖 **Full Guide**: [Self-Hosted Configuration Documentation](https://docs.xpander.ai/api-reference/configuration/self-hosted)
## 🔄 Error Handling
```python
from xpander_sdk.exceptions import ModuleException
try:
agent = await Agent.aload("invalid-agent-id")
except ModuleException as e:
print(f"Error {e.status_code}: {e.description}")
```
## 🤝 Contributing
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/{base_branch}/amazing-feature`)
3. Commit your changes (`git commit -m 'feat/chore/fix: Add amazing feature'`)
4. Push to the branch (`git push origin feature/{base_branch}/amazing-feature`)
5. Open a Pull Request
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🆘 Support
- **Documentation**: [https://docs.xpander.ai](https://docs.xpander.ai)
- **Issues**: [GitHub Issues](https://github.com/xpander-ai/xpander-sdk/issues)
- **Email**: support@xpander.ai
---
Built with ❤️ by the xpander.ai team
| text/markdown | xpanderAI | dev@xpander.ai | null | null | null | null | [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent"
] | [] | https://www.xpander.ai | null | >=3.9 | [] | [] | [] | [
"python-dotenv>=1.2.1",
"packaging>=25.0",
"pydantic>=2.12.5",
"loguru>=0.7.3",
"httpx>=0.28.1",
"httpx_sse>=0.4.3",
"nest-asyncio>=1.6.0",
"strands-agents>=1.20.0",
"openai-agents>=0.6.4",
"python-toon>=0.1.3",
"agno==2.3.26; extra == \"agno\"",
"sqlalchemy; extra == \"agno\"",
"psycopg[bin... | [] | [] | [] | [] | twine/6.2.0 CPython/3.9.25 | 2026-02-19T10:30:37.947559 | xpander_sdk-2.0.248.tar.gz | 129,916 | 6b/e6/671c5e83a115e7e5cf5f48d9e6d450e97c59a61466ba483358c58b07ae71/xpander_sdk-2.0.248.tar.gz | source | sdist | null | false | a52c3cfd1ae285dff2354394c16d9e40 | 637257973d3a76d3e46139659499a0f7f70a4aea4ee27287edf3d581d4f27023 | 6be6671c5e83a115e7e5cf5f48d9e6d450e97c59a61466ba483358c58b07ae71 | null | [
"LICENSE"
] | 335 |
2.4 | plotxvg | 1.1.2 | Batch-generation of publication quality graphs. | # plotXVG
Molecular simulation tools such as [GROMACS](https://www.gromacs.org) routinely produce
time-series of energies and other observables. To turn this data into
publication quality figures a user can either use a (commercial) software package with a graphical user interface,
often offering fine control and high-quality output, or write their own
code to make plots using a scripting language. In the age of big data and machine
learning it is often necessary to generate many graphs, be able to rapidly inspect
them, and to make plots for manuscripts.
This repository provides *plotxvg*, a simple Python tool built on the well-known [matplotlib](https://matplotlib.org/) plotting library
that will generate publication-quality graphics from, for instance, an energy calculation.
This will allow users to rapidly and reproducibly generate series of graphics files without programming.
Obviously, the tool is applicable to any kind of line graph data, not just that from molecular simulations.
Install plotXVG into your **current** Python environment by running
python -m pip install plotxvg
in the command line. Or, if preferred, you can clone this repository by running
git clone https://github.com/AlexandriaChemistry/plotXVG
cd into it and then run
python -m pip install .
Simple as that! In the plotXVG directory you can find a number of showcase illustrations, along with descriptive text files, under *tests/plotxvg_tests-output/*; they can also easily be regenerated by running `python test.py` in the *tests* folder.
Example xvg-files have been generated by GROMACS and the
[Alexandria Chemistry Toolkit](https://github.com/AlexandriaChemistry/ACT) (ACT), and can be found in their respective folders.
To use plotXVG, simply type `plotxvg -f <filename>.xvg` in the command line and it will generate a plot.
The default setting is a scatterplot; if you want to generate a line plot, add the linestyle flag along with a line style argument, such as `-ls solid`.
To use plotXVG from its Python API, a general call looks like this:
```python
import plotxvg
plotxvg.plot(["file1.xvg", "file2.xvg"],
ls=["solid", "dashed"],
panels=True)
```
Note that boolean flags such as `panels`, which are not passed as lists, do not need to be strings.
You can also create 2D density plots using either the `-contour` or `-heatmap` flag! Additionally, you can use SciPy's KDE for probability density estimation on datasets with few data points. Note that the SciPy package is not required for the rest of plotXVG's functionality.
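Conceptually, a 2D density plot bins the (x, y) points into a grid of counts, which is then rendered as colors or contours. A stdlib-only sketch of that binning step (purely illustrative, not plotXVG's implementation):

```python
def histogram2d(xs, ys, bins=4):
    """Bin (x, y) points into a bins x bins grid of counts."""
    xmin, xmax = min(xs), max(xs)
    ymin, ymax = min(ys), max(ys)
    grid = [[0] * bins for _ in range(bins)]
    for x, y in zip(xs, ys):
        # map each coordinate to a bin index, clamping the maximum value
        i = min(int((x - xmin) / (xmax - xmin) * bins), bins - 1)
        j = min(int((y - ymin) / (ymax - ymin) * bins), bins - 1)
        grid[j][i] += 1
    return grid

grid = histogram2d([0.0, 0.1, 0.9, 1.0], [0.0, 0.1, 0.9, 1.0], bins=2)
# Points cluster on the diagonal, so the off-diagonal cells stay empty.
```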
PlotXVG supports a number of flags, shown below.
For more information on installation, usage and a collection of plotXVG examples, you find the manual in the *plotXVG_manual* directory.
## plotXVG flags
```
options:
-h, --help show this help message and exit
-f [FILENAME ...], --filename [FILENAME ...]
Filename(s) to read and plot
-follow, --follow Continuously update the plot by re-reading the input file(s). Use for CLI operations.
-debug, --debug Turn on printing of debugging messages
-font FONTNAME, --fontname FONTNAME
Font for all text.
-allfs ALLFONTSIZES, --allfontsizes ALLFONTSIZES
Scale all font sizes equally, default 0
-axfs AXISLABELFONTSIZE, --axislabelfontsize AXISLABELFONTSIZE
Axis label font size, default 26
-tfs TITLEFONTSIZE, --titlefontsize TITLEFONTSIZE
Title font size, set to zero for no title, default 30
-lfs LEGENDFONTSIZE, --legendfontsize LEGENDFONTSIZE
Legend font size, set to zero for no legend, default 26
-tickfs TICKFONTSIZE, --tickfontsize TICKFONTSIZE
Tick font size, default 24
-ls LINESTYLE [LINESTYLE ...], --linestyle LINESTYLE [LINESTYLE ...]
What kind of line style: solid, dashed, dashdot, dotted
-mk MARKER [MARKER ...], --marker MARKER [MARKER ...]
Use markers for data sets: o, +, x, <, >...
-mksize MARKERSIZE, --markersize MARKERSIZE
Size of filled markers for data sets, default 10
-mkwidth MARKEREDGEWIDTH, --markeredgewidth MARKEREDGEWIDTH
Size of character markers (e.g. +) for data sets, default 2
-colors [COLORS ...], --colors [COLORS ...]
Colors for the plots. Colors defined by the user will be applied to the datasets in order. If there are more datasets than color inputs, default colors will be used.
-save SAVE, --save SAVE
Save plot. Please specify saving location and preferred filetype (.pdf, .png...)
-sqfig, --squarefig Make the figure square
-eqax, --equalaxes Make the plot square with equally large axes
-bar, --bar Make a bar graph
-noshow, --noshow Do not show the figure
-res, --residual Subtract x from y for all data sets - useful for correlation plots
-fl, --filelabel Add the filename to the labels in the plot (may yield long labels)
-logy, --logy Use a log scale on the Y-axis
-xmin XMIN, --xmin XMIN
Minimum value of X-axis. Default = defined by data.
-xmax XMAX, --xmax XMAX
Maximum value of X-axis. Default = defined by data.
-ymin YMIN, --ymin YMIN
Minimum value of Y-axis. Default = defined by data.
-ymax YMAX, --ymax YMAX
Maximum value of Y-axis. Default = defined by data.
-xframe XFRAME, --xframe XFRAME
Width of the plot 100 pixels, default 16
-yframe YFRAME, --yframe YFRAME
Height of the plot 100 pixels, default 9
-panels [{top,side}], --panels [{top,side}]
Generate different panels to plot in, one file per panel. Add 'side' for side-by side panels.
-sfx SUBFIGUREX, --subfigureX SUBFIGUREX
X position of subfigure label when using panels. Default -0.15
-sfy SUBFIGUREY, --subfigureY SUBFIGUREY
Y position of subfigure label when using panels. Default 1.0
-ign [IGNORE ...], --ignore [IGNORE ...]
Legends of the series to ignore. Specify the whole legend label.
-title TITLE [TITLE ...], --title TITLE [TITLE ...]
User-defined title(s). This flag overwrites pre-defined titles. To mix user-defined and pre-defined titles across panels, use None as a placeholder for panels that
should keep their pre-defined title. To remove a specific panel title, pass an empty string ''.
-notitles, --notitles
Remove all titles (both user-defined and titles pre-defined by data).
-dslegends [DATASETLEGENDS ...], --datasetlegends [DATASETLEGENDS ...]
Set user-defined legends. If legends are already defined in the input file they are combined for each dataset i.e. '<user-defined legend> <pre-defined legend>'.
-sharelabel, --sharelabel
Show only the x-labels on the last row of plots and the y-labels on the first column of plots (useful if all subplots share the same x- and y-labels)
-legend_x LEGEND_X, --legend_x LEGEND_X
Put the legend box horizontally on this position, default 0.02
-legend_y LEGEND_Y, --legend_y LEGEND_Y
Put the legend box vertically on this position, default 0.98
-stats, --stats Print RMSD and R2 values of datasets (x-axis is reference data and y-axis holds the predicted values)
-colx COLX, --colx COLX
Choose which x column to read from e.g. a csv file. Only one column may be chosen. Default is column 1
-coly COLY [COLY ...], --coly COLY [COLY ...]
Choose which y column(s) to read from e.g. a csv file. Default is column 3
-v, --version show program's version number and exit
-heatmap, --heatmap 2D heatmap plot
-contour, --contour Contour 2D plot
-cmap COLORMAP, --colormap COLORMAP
Matplotlib colormap
-levels LEVELS, --levels LEVELS
Number of contour levels
-bins BINS, --bins BINS
Number of bins for 2D histogram
-gibbs, --gibbs Calculate Gibbs free energy from the probability density
-temp TEMPERATURE, --temperature TEMPERATURE
Temperature at which to calculate the Gibbs free energy according to G = -kB*T*log(P).
-kde, --kde Calculate the probability density using kernel density estimation (kde). Preferably combined with contour rather than heatmap. Note: the scipy package is required for this function.
-showdots, --showdots
Show the scattered data points together with the kde.
```
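The `-gibbs`/`-temp` options convert a probability density into a relative free-energy surface. As a minimal sketch of the underlying formula only (not this tool's implementation, which is not shown here), assuming G = -kB·T·ln(P) with the gas constant in kJ/(mol·K) and energies shifted so the most probable state sits at zero:

```python
import math

# Gas constant (Boltzmann constant per mole) in kJ/(mol*K).
# The unit choice is an assumption, not taken from the tool.
KB = 0.008314462618

def gibbs_from_density(p, temperature=300.0):
    """Convert normalized probabilities to relative free energies
    via G = -kB*T*ln(P), shifted so the minimum free energy is zero."""
    g = [-KB * temperature * math.log(pi) if pi > 0 else float("inf") for pi in p]
    g_min = min(g)
    return [gi - g_min for gi in g]

# Example: two states with probabilities 0.8 and 0.2 at 300 K.
energies = gibbs_from_density([0.8, 0.2], temperature=300.0)
```

The shift by the minimum is a common convention for free-energy plots, since only energy differences between states are meaningful.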
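The `-stats` flag reports RMSD and R² with the x column as reference data and the y column as predicted values. As a sketch of one common definition of those two metrics (not this tool's code; R² here is the coefficient of determination, which differs from the squared Pearson correlation for biased predictions):

```python
import math

def rmsd(reference, predicted):
    """Root-mean-square deviation between reference (x) and predicted (y)."""
    n = len(reference)
    return math.sqrt(sum((y - x) ** 2 for x, y in zip(reference, predicted)) / n)

def r_squared(reference, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_ref = sum(reference) / len(reference)
    ss_res = sum((x - y) ** 2 for x, y in zip(reference, predicted))
    ss_tot = sum((x - mean_ref) ** 2 for x in reference)
    return 1.0 - ss_res / ss_tot

# Example: predictions y against reference values x.
x = [1.0, 2.0, 3.0, 4.0]
y = [1.1, 1.9, 3.2, 3.8]
stats = (rmsd(x, y), r_squared(x, y))
```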
Authors: Måns Rosenbaum <mans.rosenbaum@icm.uu.se>, David van der Spoel <david.vanderspoel@icm.uu.se>

GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>.
| null | [] | [] | null | null | >=3.7 | [] | [] | [] | [
"matplotlib>=3.7.0",
"numpy>=1.24.1"
] | [] | [] | [] | [
"Homepage, https://github.com/AlexandriaChemistry/plotXVG",
"Source, https://github.com/AlexandriaChemistry/plotXVG",
"Issues, https://github.com/AlexandriaChemistry/plotXVG/issues"
] | twine/6.2.0 CPython/3.10.19 | 2026-02-19T10:30:23.699202 | plotxvg-1.1.2.tar.gz | 59,328 | 75/29/d2f9a415d2a953e3e165ebcb5768320c3a4b65d31546ae6942f9da603e45/plotxvg-1.1.2.tar.gz | source | sdist | null | false | 046a763b8607d63f208a22ce5fc5d80a | 9c48bc7ca1fcaa374571e6c155cbd52bffe0cc06c21b0230e430cc3d867f5385 | 7529d2f9a415d2a953e3e165ebcb5768320c3a4b65d31546ae6942f9da603e45 | null | [
"LICENSE"
] | 250 |
2.3 | pydantic-forms | 2.4.0 | Pydantic-forms engine. | # Pydantic forms
[](https://pypi.org/project/pydantic-forms)
[](https://pypi.org/project/pydantic-forms)
[](https://pepy.tech/project/pydantic-forms)
[](https://codecov.io/gh/workfloworchestrator/pydantic-forms)
A Python package that lets you add smart forms to [FastAPI](https://fastapi.tiangolo.com/)
and [Flask](https://palletsprojects.com/p/flask/). Forms respond with a JSON schema that
contains all the information a React frontend (using uniforms) needs to render the forms and handle validation.
Forms can also be chained into a wizard, so you can create complex form flows consisting of multiple
consecutive forms. The forms and their validation logic are defined
using [Pydantic](https://pydantic-docs.helpmanual.io/) models.
Documentation regarding the usage of Forms can be found
[here](https://github.com/workfloworchestrator/orchestrator-core/blob/main/docs/architecture/application/forms-frontend.md)
### Installation (Development standalone)
Install the project and its dependencies to develop on the code.
#### Step 1 - install flit:
```shell
python3 -m venv venv
source venv/bin/activate
pip install flit
```
#### Step 2 - install the development code:
```shell
flit install --deps develop --symlink --python venv/bin/python
```
!!! danger
Make sure to use the flit binary that is installed in your environment. You can check the correct
path by running
```shell
which flit
```
To be sure that the packages are installed against the correct venv, you can also explicitly pass the Python
interpreter that you want to use:
```shell
flit install --deps develop --symlink --python venv/bin/python
```
### Running tests
Run the unit-test suite to verify a correct setup.
#### Run the tests
```shell
pytest tests/unit_tests
```
or with xdist:
```shell
pytest -n auto tests/unit_tests
```
If the test suite passes, your setup is correct and you can start developing features in pydantic-forms.
### Installation (Development symlinked into project that use pydantic-forms)
If you are working on a project that already uses `pydantic-forms` and you want to test your new form features
against it, you can use some `flit` magic to symlink the dev version of the forms into your project. It will
automatically replace the PyPI dependency with a symlink to the development version
of pydantic-forms and update/downgrade all required packages in your own project.
#### Step 1 - install flit:
```shell
python3 -m venv venv
source venv/bin/activate
pip install flit
```
#### Step 2 - symlink pydantic-forms to your own project
```shell
flit install --deps develop --symlink --python /path/to/a/project/venv/bin/python
```
### Increasing the version number for a (pre) release
When your PR is accepted you will be given a version number.
Make the change on a clean branch (i.e. with every change committed):
```shell
bumpversion patch --new-version 0.0.1
```
Note: specifying the version explicitly like this, instead of relying on bumpversion itself to increase the
version, allows you to set an "RC1" version if needed.
### Debugging Form behaviour
If you want the Pydantic traceback in a Form response, set an environment variable:
```shell
LOG_LEVEL_PYDANTIC_FORMS=DEBUG
```
This will add the traceback to the `JSONResponse`. If the log level is set to DEBUG, the library will also add the
traceback to the logger.
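From your own dev entrypoint or test fixture, toggling this behaviour is plain environment handling. The sketch below is illustrative only — the `forms_debug_enabled` helper is not part of the library (pydantic-forms reads the variable itself):

```python
import logging
import os

# Illustrative helper - pydantic-forms inspects LOG_LEVEL_PYDANTIC_FORMS itself;
# this only shows how you might mirror the flag in your own code.
def forms_debug_enabled() -> bool:
    """True when Pydantic tracebacks should be included in form responses."""
    return os.environ.get("LOG_LEVEL_PYDANTIC_FORMS", "").upper() == "DEBUG"

os.environ["LOG_LEVEL_PYDANTIC_FORMS"] = "DEBUG"
if forms_debug_enabled():
    # Mirror the library's behaviour of also logging the traceback at DEBUG.
    logging.getLogger("pydantic_forms").setLevel(logging.DEBUG)

print(forms_debug_enabled())  # True
```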
| text/markdown | SURF | automation-beheer@surf.nl | null | null | null | null | [
"Intended Audience :: Information Technology",
"Intended Audience :: System Administrators",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python",
"Topic :: Internet",
"Topic :: Software Development :: Libraries :: Application Frameworks",
"Topi... | [] | https://github.com/workfloworchestrator/pydantic-forms | null | <3.15,>=3.10 | [] | [
"pydantic_forms"
] | [] | [
"more-itertools",
"pydantic[email]>=2.9.0",
"pydantic-i18n==0.4.5",
"toml; extra == \"dev\"",
"bumpversion; extra == \"dev\"",
"mypy_extensions; extra == \"dev\"",
"pre-commit; extra == \"dev\"",
"pydocstyle; extra == \"dev\"",
"python-dotenv; extra == \"dev\"",
"watchdog; extra == \"dev\"",
"mk... | [] | [] | [] | [
"Documentation, https://github.com/workfloworchestrator/pydantic-forms/blob/main/README.md"
] | python-requests/2.32.5 | 2026-02-19T10:28:25.970304 | pydantic_forms-2.4.0.tar.gz | 44,467 | 73/29/6a593b78c705c7363a09af1a8b6152eed69005efc5e7b1daac2bc20140d8/pydantic_forms-2.4.0.tar.gz | source | sdist | null | false | 05428a5abacf739f704b2458a1668fb6 | bc4aeb91c87ab6753c4db88a1a36768f236eff7cdfd613e34c860d0968a40616 | 73296a593b78c705c7363a09af1a8b6152eed69005efc5e7b1daac2bc20140d8 | null | [] | 348 |
2.4 | mistapi | 0.60.0 | Python package to simplify the Mist System APIs usage | # MISTAPI - Python Package for Mist API
[](https://pypi.org/project/mistapi/)
[](https://pypi.org/project/mistapi/)
[](https://opensource.org/licenses/MIT)
A comprehensive Python package to interact with the Mist Cloud APIs, built from the official [Mist OpenAPI specifications](https://www.juniper.net/documentation/us/en/software/mist/api/http/getting-started/how-to-get-started).
---
## Table of Contents
- [Features](#features)
- [Installation](#installation)
- [Quick Start](#quick-start)
- [Configuration](#configuration)
- [Authentication](#authentication)
- [Usage](#usage)
- [CLI Helper Functions](#cli-helper-functions)
- [Pagination](#pagination-support)
- [Examples](#examples)
- [Development](#development-and-testing)
- [Contributing](#contributing)
- [License](#license)
- [Links](#links)
---
## Features
### Supported Mist Clouds
Support for all Mist cloud instances worldwide:
- **APAC**: api.ac5.mist.com, api.gc5.mist.com, api.gc7.mist.com
- **EMEA**: api.eu.mist.com, api.gc3.mist.com, api.ac6.mist.com, api.gc6.mist.com
- **Global**: api.mist.com, api.gc1.mist.com, api.ac2.mist.com, api.gc2.mist.com, api.gc4.mist.com
### Authentication
- API token and username/password authentication (with 2FA support)
- Environment variable configuration (`.env` file support)
- HashiCorp Vault integration for secure credential storage
- System keyring integration (macOS Keychain, Windows Credential Locker, etc.)
- Interactive CLI prompts for credentials when needed
### Core Features
- **Complete API Coverage**: Auto-generated from OpenAPI specs
- **Automatic Pagination**: Built-in support for paginated responses
- **Error Handling**: Detailed error responses and logging
- **Proxy Support**: HTTP/HTTPS proxy configuration
- **Log Sanitization**: Automatic redaction of sensitive data in logs
### API Coverage
**Organization Level**: Organizations, Sites, Devices (APs/Switches/Gateways), WLANs, VPNs, Networks, NAC, Users, Admins, Guests, Alarms, Events, Statistics, SLE, Assets, Licenses, Webhooks, Security Policies, MSP management
**Site Level**: Device management, RF optimization, Location services, Maps, Client analytics, Asset tracking, Synthetic testing, Anomaly detection
**Constants & Utilities**: Device models, AP channels, Applications, Country codes, Alarm definitions, Event types, Webhook topics
**Additional Services**: OAuth, Two-factor authentication, Account recovery, Invitations, MDM workflows
---
## Installation
### Basic Installation
```bash
# Linux/macOS
python3 -m pip install mistapi
# Windows
py -m pip install mistapi
```
### Upgrade to Latest Version
```bash
# Linux/macOS
python3 -m pip install --upgrade mistapi
# Windows
py -m pip install --upgrade mistapi
```
### Development Installation
```bash
# Install with development dependencies (for contributors)
pip install mistapi[dev]
```
### Requirements
- Python 3.10 or higher
- Dependencies: `requests`, `python-dotenv`, `tabulate`, `deprecation`, `hvac`, `keyring`
---
## Quick Start
```python
import mistapi
# Initialize session
apisession = mistapi.APISession()
# Authenticate (interactive prompt if credentials not configured)
apisession.login()
# Use the API - Get device models
device_models = mistapi.api.v1.const.device_models.listDeviceModels(apisession)
print(f"Found {len(device_models.data)} device models")
# Interactive organization selection
org_id = mistapi.cli.select_org(apisession)[0]
# Get organization information
org_info = mistapi.api.v1.orgs.orgs.getOrg(apisession, org_id)
print(f"Organization: {org_info.data['name']}")
```
---
## Configuration
Configuration is optional - you can pass all parameters directly to `APISession`. However, using an `.env` file simplifies credential management.
### Using Environment File
```python
import mistapi
apisession = mistapi.APISession(env_file="~/.mist_env")
```
### Environment Variables
Create a `.env` file with your credentials:
```bash
MIST_HOST=api.mist.com
MIST_APITOKEN=your_api_token_here
# Alternative to API token
# MIST_USER=your_email@example.com
# MIST_PASSWORD=your_password
# Proxy configuration
# HTTPS_PROXY=http://user:password@myproxy.com:3128
# Logging configuration
# CONSOLE_LOG_LEVEL=20 # 0=Disabled, 10=Debug, 20=Info, 30=Warning, 40=Error, 50=Critical
# LOGGING_LOG_LEVEL=10
```
### All Configuration Options
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `MIST_HOST` | string | None | Mist Cloud API endpoint (e.g., `api.mist.com`) |
| `MIST_APITOKEN` | string | None | API Token for authentication (recommended) |
| `MIST_USER` | string | None | Username if not using API token |
| `MIST_PASSWORD` | string | None | Password if not using API token |
| `MIST_KEYRING_SERVICE` | string | None | Load credentials from system keyring |
| `MIST_VAULT_URL` | string | https://127.0.0.1:8200 | HashiCorp Vault URL |
| `MIST_VAULT_PATH` | string | None | Path to secret in Vault |
| `MIST_VAULT_MOUNT_POINT` | string | secret | Vault mount point |
| `MIST_VAULT_TOKEN` | string | None | Vault authentication token |
| `CONSOLE_LOG_LEVEL` | int | 20 | Console log level (0-50) |
| `LOGGING_LOG_LEVEL` | int | 10 | File log level (0-50) |
| `HTTPS_PROXY` | string | None | HTTP/HTTPS proxy URL |
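The numeric levels in the table are the standard library's `logging` values, so the two can be used interchangeably — a quick, mistapi-independent sanity check:

```python
import logging

# CONSOLE_LOG_LEVEL / LOGGING_LOG_LEVEL use plain stdlib logging values
# (0 is treated as "disabled" per the table above).
LEVELS = {
    10: logging.DEBUG,
    20: logging.INFO,
    30: logging.WARNING,
    40: logging.ERROR,
    50: logging.CRITICAL,
}

for numeric, constant in LEVELS.items():
    assert numeric == constant

print(logging.getLevelName(20))  # INFO
```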
---
## Authentication
The `login()` function must be called to authenticate. The package supports multiple authentication methods.
### 1. Interactive Authentication
If credentials are not configured, you'll be prompted interactively:
**Cloud Selection:**
```
----------------------------- Mist Cloud Selection -----------------------------
0) APAC 01 (host: api.ac5.mist.com)
1) EMEA 01 (host: api.eu.mist.com)
2) Global 01 (host: api.mist.com)
...
Select a Cloud (0 to 10, or q to exit):
```
**Credential Prompt:**
```
--------------------------- Login/Pwd authentication ---------------------------
Login: user@example.com
Password:
[ INFO ] Authentication successful!
Two Factor Authentication code required: 123456
[ INFO ] 2FA authentication succeeded
-------------------------------- Authenticated ---------------------------------
Welcome Thomas Munzer!
```
### 2. Environment File Authentication
```python
import mistapi
apisession = mistapi.APISession(env_file="~/.mist_env")
apisession.login()
# Output:
# -------------------------------- Authenticated ---------------------------------
# Welcome Thomas Munzer!
```
### 3. HashiCorp Vault Authentication
```python
import mistapi
apisession = mistapi.APISession(
vault_url="https://vault.mycompany.com:8200",
vault_path="secret/data/mist/credentials",
vault_token="s.xxxxxxx"
)
apisession.login()
```
### 4. System Keyring Authentication
```python
import mistapi
apisession = mistapi.APISession(keyring_service="my_mist_service")
apisession.login()
```
**Note:** The keyring must contain: `MIST_HOST`, `MIST_APITOKEN` (or `MIST_USER` and `MIST_PASSWORD`)
### 5. Direct Parameter Authentication
```python
import mistapi
apisession = mistapi.APISession(
host="api.mist.com",
apitoken="your_token_here"
)
apisession.login()
```
### APISession Parameters
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `email` | str | None | Email for login/password authentication |
| `password` | str | None | Password for login/password authentication |
| `apitoken` | str | None | API token (recommended method) |
| `host` | str | None | Mist Cloud endpoint (e.g., "api.mist.com") |
| `keyring_service` | str | None | System keyring service name |
| `vault_url` | str | https://127.0.0.1:8200 | HashiCorp Vault URL |
| `vault_path` | str | None | Path to secret in Vault |
| `vault_mount_point` | str | secret | Vault mount point |
| `vault_token` | str | None | Vault authentication token |
| `env_file` | str | None | Path to `.env` file |
| `console_log_level` | int | 20 | Console logging level (0-50) |
| `logging_log_level` | int | 10 | File logging level (0-50) |
| `https_proxy` | str | None | Proxy URL |
---
## Usage
### Basic API Calls
```python
# Get device models (constants)
response = mistapi.api.v1.const.device_models.listDeviceModels(apisession)
print(f"Status: {response.status_code}")
print(f"Data: {len(response.data)} models")
# Get organization information
org_info = mistapi.api.v1.orgs.orgs.getOrg(apisession, org_id)
print(f"Organization: {org_info.data['name']}")
# Get organization statistics
org_stats = mistapi.api.v1.orgs.stats.getOrgStats(apisession, org_id)
print(f"Organization has {org_stats.data['num_sites']} sites")
# Search for devices
devices = mistapi.api.v1.orgs.devices.searchOrgDevices(apisession, org_id, type="ap")
print(f"Found {len(devices.data['results'])} access points")
```
### Error Handling
```python
# Check response status
response = mistapi.api.v1.orgs.orgs.listOrgs(apisession)
if response.status_code == 200:
print(f"Success: {len(response.data)} organizations")
else:
print(f"Error {response.status_code}: {response.data}")
# Exception handling
try:
org_info = mistapi.api.v1.orgs.orgs.getOrg(apisession, "invalid-org-id")
except Exception as e:
print(f"API Error: {e}")
```
### Log Sanitization
The package automatically sanitizes sensitive data in logs:
```python
import logging
from mistapi.__logger import LogSanitizer
# Configure logging
LOG_FILE = "./app.log"
logging.basicConfig(filename=LOG_FILE, filemode="w")
LOGGER = logging.getLogger(__name__)
LOGGER.setLevel(logging.DEBUG)
# Add sanitization filter
LOGGER.addFilter(LogSanitizer())
# Sensitive data is automatically redacted
LOGGER.debug({"user": "john", "apitoken": "secret123", "password": "pass456"})
# Output: {"user": "john", "apitoken": "****", "password": "****"}
```
### Getting Help
```python
# Get detailed help on any API function
help(mistapi.api.v1.orgs.stats.getOrgStats)
```
---
## CLI Helper Functions
Interactive functions for selecting organizations and sites.
### Organization Selection
```python
# Select single organization
org_id = mistapi.cli.select_org(apisession)[0]
# Select multiple organizations
org_ids = mistapi.cli.select_org(apisession, allow_many=True)
```
**Output:**
```
Available organizations:
0) Acme Corp (id: 203d3d02-xxxx-xxxx-xxxx-76896a3330f4)
1) Demo Lab (id: 6374a757-xxxx-xxxx-xxxx-361e45b2d4ac)
Select an Org (0 to 1, or q to exit): 0
```
### Site Selection
```python
# Select site within an organization
site_id = mistapi.cli.select_site(apisession, org_id=org_id)[0]
```
**Output:**
```
Available sites:
0) Headquarters (id: f5fcbee5-xxxx-xxxx-xxxx-1619ede87879)
1) Branch Office (id: a8b2c3d4-xxxx-xxxx-xxxx-987654321abc)
Select a Site (0 to 1, or q to exit): 0
```
---
## Pagination Support
### Get Next Page
```python
# Get first page
response = mistapi.api.v1.orgs.clients.searchOrgClientsEvents(
apisession, org_id, duration="1d"
)
print(f"First page: {len(response.data['results'])} results")
# Get next page
if response.next:
response_2 = mistapi.get_next(apisession, response)
print(f"Second page: {len(response_2.data['results'])} results")
```
### Get All Pages Automatically
```python
# Get all pages with a single call
response = mistapi.api.v1.orgs.clients.searchOrgClientsEvents(
apisession, org_id, duration="1d"
)
print(f"First page: {len(response.data['results'])} results")
# Retrieve all remaining pages
all_data = mistapi.get_all(apisession, response)
print(f"Total results across all pages: {len(all_data)}")
```
---
## Examples
Comprehensive examples are available in the [Mist Library repository](https://github.com/tmunzer/mist_library).
### Device Management
```python
# List all devices in an organization
devices = mistapi.api.v1.orgs.devices.listOrgDevices(apisession, org_id)
# Get specific device details
device = mistapi.api.v1.orgs.devices.getOrgDevice(
apisession, org_id, device_id
)
# Update device configuration
update_data = {"name": "New Device Name"}
result = mistapi.api.v1.orgs.devices.updateOrgDevice(
apisession, device.org_id, device.id, body=update_data
)
```
### Site Management
```python
# Create a new site
site_data = {
"name": "New Branch Office",
"country_code": "US",
"timezone": "America/New_York"
}
new_site = mistapi.api.v1.orgs.sites.createOrgSite(
apisession, org_id, body=site_data
)
# Get site statistics
site_stats = mistapi.api.v1.sites.stats.getSiteStats(apisession, new_site.id)
```
### Client Analytics
```python
# Search for wireless clients
clients = mistapi.api.v1.orgs.clients.searchOrgWirelessClients(
apisession, org_id,
duration="1d",
limit=100
)
# Get client events
events = mistapi.api.v1.orgs.clients.searchOrgClientsEvents(
apisession, org_id,
duration="1h",
client_mac="aa:bb:cc:dd:ee:ff"
)
```
---
## Development and Testing
### Development Setup
```bash
# Clone the repository
git clone https://github.com/tmunzer/mistapi_python.git
cd mistapi_python
# Install with development dependencies
pip install -e ".[dev]"
```
### Running Tests
```bash
# Run all tests
pytest
# Run with coverage report
pytest --cov=src/mistapi --cov-report=html
# Run specific test file
pytest tests/unit/test_api_session.py
# Run linting
ruff check src/
```
### Package Structure
```
src/mistapi/
├── __init__.py # Main package exports
├── __api_session.py # Session management and authentication
├── __api_request.py # HTTP request handling
├── __api_response.py # Response parsing and pagination
├── __logger.py # Logging and sanitization
├── __pagination.py # Pagination utilities
├── cli.py # Interactive CLI functions
├── __models/ # Data models
│ ├── __init__.py
│ └── privilege.py
└── api/v1/ # Auto-generated API endpoints
├── const/ # Constants and enums
├── orgs/ # Organization-level APIs
├── sites/ # Site-level APIs
├── login/ # Authentication APIs
└── utils/ # Utility functions
```
---
## Contributing
Contributions are welcome! Please follow these guidelines:
### How to Contribute
1. **Fork** the repository
2. **Create** a feature branch
```bash
git checkout -b feature/amazing-feature
```
3. **Commit** your changes
```bash
git commit -m 'Add amazing feature'
```
4. **Push** to the branch
```bash
git push origin feature/amazing-feature
```
5. **Open** a Pull Request
### Development Guidelines
- Write tests for new features
- Ensure all tests pass before submitting PR
- Follow existing code style and conventions
- Update documentation as needed
- Add entries to CHANGELOG.md for significant changes
---
## License
**MIT License**
Copyright (c) 2023 Thomas Munzer
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
---
## Links
- **Mist API Specs**: [OpenAPI Documentation](https://www.juniper.net/documentation/us/en/software/mist/api/http/getting-started/how-to-get-started)
- **Source Code**: [GitHub Repository](https://github.com/tmunzer/mistapi_python)
- **PyPI Package**: [mistapi on PyPI](https://pypi.org/project/mistapi/)
- **Examples**: [Mist Library Examples](https://github.com/tmunzer/mist_library)
- **Bug Reports**: [GitHub Issues](https://github.com/tmunzer/mistapi_python/issues)
| text/markdown | null | Thomas Munzer <tmunzer@juniper.net> | null | null | MIT License | API, Juniper, Mist | [
"Development Status :: 4 - Beta",
"Intended Audience :: System Administrators",
"Intended Audience :: Telecommunications Industry",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: System :: Networking"
] | [] | null | null | >=3.10 | [] | [] | [] | [
"deprecation>=2.1.0",
"hvac>=2.3.0",
"keyring>=24.3.0",
"python-dotenv>=1.1.0",
"requests>=2.32.3",
"tabulate>=0.9.0"
] | [] | [] | [] | [
"Source, https://github.com/tmunzer/mistapi_python",
"Bug Tracker, https://github.com/tmunzer/mistapi_python/issues"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:28:02.991077 | mistapi-0.60.0.tar.gz | 5,161,680 | 33/76/d2c5563c652b13d9b27811e143730c2d77b2ce837a85feb2bd70adb62171/mistapi-0.60.0.tar.gz | source | sdist | null | false | 718586945a341ba0f74632104972d395 | aa53ec812eaa44a5f022b0664dab26068ce902c50d2af3e2895a2f9fcea48929 | 3376d2c5563c652b13d9b27811e143730c2d77b2ce837a85feb2bd70adb62171 | null | [
"LICENSE"
] | 394 |
2.4 | numpack | 0.5.1 | A high-performance array storage and manipulation library | # NumPack
A high-performance NumPy array storage library combining Rust's speed with Python's simplicity. Optimized for frequent read/write operations on large arrays, with built-in SIMD-accelerated vector similarity search.
## Highlights
| Feature | Performance |
|---------|-------------|
| Row Replacement | **344x faster** than NPY |
| Data Append | **338x faster** than NPY |
| Lazy Loading | **51x faster** than NPY mmap |
| Full Load | **1.64x faster** than NPY |
| Batch Mode | **21x speedup** |
| Writable Batch | **92x speedup** |
**Core Capabilities:**
- Zero-copy mmap operations with minimal memory footprint
- SIMD-accelerated Vector Engine (AVX2, AVX-512, NEON, SVE)
- Batch & Writable Batch modes for high-frequency modifications
- Supports all NumPy dtypes: bool, int8-64, uint8-64, float16/32/64, complex64/128
## Installation
```bash
pip install numpack
```
**Requirements:** Python ≥ 3.9, NumPy ≥ 1.26.0
<details>
<summary><b>Build from Source</b></summary>
```bash
# Prerequisites: Rust >= 1.70.0 (rustup.rs), C/C++ compiler
git clone https://github.com/BirchKwok/NumPack.git
cd NumPack
pip install maturin>=1.0,<2.0
maturin develop # or: maturin build --release
```
</details>
## Quick Start
```python
import numpy as np
from numpack import NumPack
with NumPack("data.npk") as npk:
# Save
npk.save({'embeddings': np.random.rand(10000, 128).astype(np.float32)})
# Load (normal or lazy)
data = npk.load("embeddings")
lazy = npk.load("embeddings", lazy=True)
# Modify
npk.replace({'embeddings': new_rows}, indices=[0, 1, 2])
npk.append({'embeddings': more_rows})
npk.drop('embeddings', [0, 1, 2]) # drop rows
# Random access
subset = npk.getitem('embeddings', [100, 200, 300])
```
### Batch Modes
```python
# Batch Mode - cached writes (21x speedup)
with npk.batch_mode():
for i in range(1000):
arr = npk.load('data')
arr[:10] *= 2.0
npk.save({'data': arr})
# Writable Batch Mode - direct mmap (92x speedup)
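# NOTE: this insert lands inside the existing Python fence, so it is written as
# code in the same block. A minimal sketch (names illustrative) of a batch-mode
# loop guarded for correctness: read-modify-write cycles stay consistent because
# batch mode serves cached reads until the context exits.
def double_first_rows(npk, name="data", n=10, iterations=1000):
    """Repeatedly scale the first n rows of array `name` under batch mode."""
    with npk.batch_mode():
        for _ in range(iterations):
            arr = npk.load(name)   # served from cache inside batch mode
            arr[:n] *= 2.0
            npk.save({name: arr})  # write is coalesced until the block exits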
with npk.writable_batch_mode() as wb:
arr = wb.load('data')
arr[:10] *= 2.0 # Auto-persisted
```
### Vector Engine
SIMD-accelerated similarity search (AVX2, AVX-512, NEON, SVE).
```python
from numpack.vector_engine import VectorEngine, StreamingVectorEngine
# In-memory search
engine = VectorEngine()
indices, scores = engine.top_k_search(query, candidates, 'cosine', k=10)
# Multi-query batch (30-50% faster)
all_indices, all_scores = engine.multi_query_top_k(queries, candidates, 'cosine', k=10)
# Streaming from file (for large datasets)
streaming = StreamingVectorEngine()
indices, scores = streaming.streaming_top_k_from_file(
query, 'vectors.npk', 'embeddings', 'cosine', k=10
)
```
**Supported Metrics:** `cosine`, `dot`, `l2`, `l2sq`, `hamming`, `jaccard`, `kl`, `js`
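For sanity-checking engine output, the simpler metrics can be reproduced in a few lines of plain Python. These reference implementations are illustrative (not the library's SIMD code) and should agree with `VectorEngine` results up to floating-point tolerance:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def l2sq(a, b):
    """Squared Euclidean (l2sq) distance."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

query = [1.0, 0.0, 1.0]
print(round(cosine(query, [2.0, 0.0, 2.0]), 6))  # 1.0 - same direction
print(l2sq(query, [0.0, 0.0, 1.0]))              # 1.0
```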
### Format Conversion
Convert between NumPack and other formats (PyTorch, Arrow, Parquet, SafeTensors).
```python
from numpack.io import from_tensor, to_tensor, from_table, to_table
# Memory <-> .npk (zero-copy when possible)
from_tensor(tensor, 'output.npk', array_name='embeddings') # tensor -> .npk
tensor = to_tensor('input.npk', array_name='embeddings') # .npk -> tensor
from_table(table, 'output.npk') # PyArrow Table -> .npk
table = to_table('input.npk') # .npk -> PyArrow Table
# File <-> File (streaming for large files)
from numpack.io import from_pt, to_pt
from_pt('model.pt', 'output.npk') # .pt -> .npk
to_pt('input.npk', 'output.pt') # .npk -> .pt
```
**Supported formats:** PyTorch (.pt), Feather, Parquet, SafeTensors, NumPy (.npy), HDF5, Zarr, CSV
### Pack & Unpack
Portable `.npkg` format for easy migration and sharing.
```python
from numpack import pack, unpack, get_package_info
# Pack NumPack directory into a single .npkg file
pack('data.npk') # -> data.npkg (with Zstd compression)
pack('data.npk', 'backup/data.npkg') # Custom output path
# Unpack .npkg back to NumPack directory
unpack('data.npkg') # -> data.npk
unpack('data.npkg', 'restored/') # Custom restore path
# View package info without extracting
info = get_package_info('data.npkg')
print(f"Files: {info['file_count']}, Compression: {info['compression_ratio']:.1%}")
```
## Benchmarks
*Tested on macOS Apple Silicon, 1M rows × 10 columns, Float32 (38.1MB)*
| Operation | NumPack | NPY | Advantage |
|-----------|---------|-----|----------:|
| Full Load | 4.00ms | 6.56ms | **1.64x** |
| Lazy Load | 0.002ms | 0.102ms | **51x** |
| Replace 100 rows | 0.040ms | 13.74ms | **344x** |
| Append 100 rows | 0.054ms | 18.26ms | **338x** |
| Random Access (100) | 0.004ms | 0.002ms | ~equal |
<details>
<summary><b>Multi-Format Comparison</b></summary>
**Core Operations (1M × 10, Float32, ~38.1MB):**
| Operation | NumPack | NPY | Zarr | HDF5 | Parquet | Arrow |
|-----------|--------:|----:|-----:|-----:|--------:|------:|
| Save | 11.94ms | 6.48ms | 70.91ms | 58.07ms | 142.11ms | 16.85ms |
| Full Load | 4.00ms | 6.56ms | 32.86ms | 53.99ms | 16.49ms | 12.39ms |
| Lazy Load | 0.002ms | 0.102ms | 0.374ms | 0.082ms | N/A | N/A |
| Replace 100 | 0.040ms | 13.74ms | 7.61ms | 0.29ms | 162.48ms | 26.93ms |
| Append 100 | 0.054ms | 18.26ms | 9.05ms | 0.39ms | 173.45ms | 42.46ms |
**Random Access Performance:**
| Batch Size | NumPack | NPY (mmap) | Zarr | HDF5 | Parquet | Arrow |
|------------|--------:|-----------:|-----:|-----:|--------:|------:|
| 100 rows | 0.004ms | 0.002ms | 2.66ms | 0.66ms | 16.25ms | 12.43ms |
| 1K rows | 0.025ms | 0.021ms | 2.86ms | 5.02ms | 16.48ms | 12.61ms |
| 10K rows | 0.118ms | 0.112ms | 16.63ms | 505.71ms | 17.45ms | 12.81ms |
**Batch Mode Performance (100 consecutive operations):**
| Mode | Time | Speedup |
|------|-----:|--------:|
| Normal | 414ms | - |
| Batch Mode | 20.1ms | **21x** |
| Writable Batch | 4.5ms | **92x** |
**File Size:**
| Format | Size | Compression |
|--------|-----:|:-----------:|
| NumPack | 38.15MB | - |
| NPY | 38.15MB | - |
| NPZ | 34.25MB | ✓ |
| Zarr | 34.13MB | ✓ |
| HDF5 | 38.18MB | - |
| Parquet | 44.09MB | ✓ |
| Arrow | 38.16MB | - |
</details>
### When to Use NumPack
| Use Case | Recommendation |
|----------|----------------|
| Frequent modifications | ✅ **NumPack** (344x faster) |
| ML/DL pipelines | ✅ **NumPack** (zero-copy random access, no full load) |
| Vector similarity search | ✅ **NumPack** (SIMD) |
| Write-once, read-many | ✅ **NumPack** (1.64x faster read) |
| Extreme compression | ✅ **NumPack** `.npkg` (better ratio, streaming, high I/O) |
| RAG/Embedding storage | ✅ **NumPack** (fast retrieval + SIMD search) |
| Feature store | ✅ **NumPack** (real-time updates + low latency) |
| Memory-constrained environments | ✅ **NumPack** (mmap + lazy loading) |
| Multi-process data sharing | ✅ **NumPack** (zero-copy mmap) |
| Incremental data pipelines | ✅ **NumPack** (338x faster append) |
| Real-time feature updates | ✅ **NumPack** (ms-level replace) |
## Documentation
See [`docs/`](docs/) for detailed guides and [`unified_benchmark.py`](unified_benchmark.py) for benchmark code.
## Contributing
Contributions welcome! Please submit a Pull Request.
## License
Apache License 2.0 - see [LICENSE](LICENSE) for details.
| text/markdown; charset=UTF-8; variant=GFM | NumPack Contributors | null | null | null | null | numpy, array, storage, performance | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming La... | [] | null | null | >=3.9 | [] | [] | [] | [
"numpy>=1.26.0"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.13.12 | 2026-02-19T10:27:56.226772 | numpack-0.5.1.tar.gz | 349,039 | 5f/f2/8d745d1a2ca24b1fcff49a09a9bfe2c600349983c6bd8f392e338b61c88e/numpack-0.5.1.tar.gz | source | sdist | null | false | e11221d9a94a66aa2975e0db12906bfa | 019dee006babb1fa21073ceef406d78983e8ab7d91cf4e8790e144f9cf33ae53 | 5ff28d745d1a2ca24b1fcff49a09a9bfe2c600349983c6bd8f392e338b61c88e | null | [
"LICENSE"
] | 1,384 |
2.3 | dsp-tools | 18.6.3 | DSP-TOOLS is a Python package with a command line interface that helps you interact with a DaSCH service platform (DSP) server. | [](https://pypi.org/project/dsp-tools/)
[](https://pypi.org/project/dsp-tools/)
[](https://pypi.org/project/dsp-tools/)
[](https://github.com/astral-sh/ruff)
[](https://github.com/astral-sh/uv)
[](https://github.com/python/mypy)
# DSP-TOOLS Documentation
## Installing `dsp-tools`
To install the latest version, run:
```bash
pip3 install dsp-tools
```
To update to the latest version, run:
```bash
pip3 install --upgrade dsp-tools
```
> 🚨 If your Python version is older than the ones we support,
> pip will silently install an outdated version of DSP-TOOLS.
> DSP-TOOLS requires one of these Python versions:
> [](https://pypi.org/project/dsp-tools/)
> The most recent version of DSP-TOOLS is
> [](https://pypi.org/project/dsp-tools/)
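To confirm on your machine which interpreter pip uses and which DSP-TOOLS version it actually resolved to (standard `pip` commands, nothing DSP-specific):

```shell
# Must print one of the supported Python versions
python3 --version

# Shows the dsp-tools version that is actually installed
pip3 show dsp-tools
```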
The `dsp-tools` package provides you with command line functionality
to interact with the [DSP-API](https://github.com/dasch-swiss/dsp-api), both remotely and locally.
Additionally, it contains the `xmllib` which helps you construct the XML file required for a mass upload.
## Where To Start?
`dsp-tools` provides you with the following core functionalities.
- **Running a Local Stack:** If you want to run your own DSP stack locally, take a look [here](./local-stack.md).
- **Data Modelling:** There are several ways to create a data model with `dsp-tools`
- Take a look at the technical specification for the [JSON file](./data-model/json-project/overview.md).
- Or take a look at our tool to [convert Excel files into the JSON format](./data-model/excel2json.md).
- You can create a data model on the DSP-APP. To re-use that data model on another server
you can use the CLI command described [here](./data-model/data-model-cli.md#get).
- **Data for Mass-Upload:**
- If you want to create the XML file required for a mass-upload onto DSP, take a look at the [`xmllib`](./xmllib-docs/overview.md).
- You can find an in-depth explanation of our XML file format [here](./data-file/xml-data-file.md).
Please note that we recommend using the `xmllib` library to create the file,
as it ensures interoperability between the DSP-API requirements and your input.
- If you want to validate and upload your XML file, take a look [here](./data-file/data-file-commands.md).
Please note that only DaSCH employees are permitted to upload data to a production server.
| text/markdown | DaSCH - Swiss National Data and Service Center for the Humanities | DaSCH - Swiss National Data and Service Center for the Humanities <info@dasch.swiss> | null | null | null | null | [
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: 3.14",
"License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
"Operating System :: OS Independent"
] | [] | null | null | >=3.12 | [] | [] | [] | [
"argparse",
"jinja2",
"jsonpath-ng",
"jsonschema",
"loguru~=0.7.3",
"lxml",
"openpyxl",
"packaging",
"pandas[excel]",
"polars",
"pyld",
"pyoxigraph~=0.5.2",
"python-dotenv",
"pyyaml",
"rdflib",
"regex",
"requests",
"rustworkx~=0.17.1",
"tqdm",
"typing-extensions"
] | [] | [] | [] | [
"Homepage, https://www.dasch.swiss/",
"Documentation, https://docs.dasch.swiss/latest/DSP-TOOLS/",
"Repository, https://github.com/dasch-swiss/dsp-tools.git"
] | uv/0.10.4 {"installer":{"name":"uv","version":"0.10.4","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true} | 2026-02-19T10:27:51.057808 | dsp_tools-18.6.3.tar.gz | 316,564 | 47/41/196b0b29f30222d601b56c51c498096ac3d976f6623258dc9a83c9863326/dsp_tools-18.6.3.tar.gz | source | sdist | null | false | e7750e52a8f137a62a3eadc9b511bd01 | 79034ec1a634f49c58b1c0a4a13c61898bf0aa3f8583286585ab28a677a96609 | 4741196b0b29f30222d601b56c51c498096ac3d976f6623258dc9a83c9863326 | null | [] | 273 |
2.1 | pyuipc | 0.0.8 | A Cross-Platform Modern C++20 Library of Unified Incremental Potential Contact (CUDA 12.6+ required) | # libuipc
A Cross-Platform Modern C++20 **Lib**rary of **U**nified **I**ncremental **P**otential **C**ontact.
Both C++ and Python APIs are provided!
Website: ➡️ https://spirimirror.github.io/libuipc-web/
Documentation: ➡️ https://spirimirror.github.io/libuipc-doc/
Samples: ➡️ https://github.com/spiriMirror/libuipc-samples/

## Introduction
**Libuipc** is a library that offers a unified **GPU** incremental potential contact framework for simulating the dynamics of rigid bodies, soft bodies, cloth, and threads, and their couplings. It ensures accurate, **penetration-free frictional contact** and is naturally **differentiable**. Libuipc aims to provide robust and efficient **forward** and **backward** simulations, making it easy for users to integrate with machine learning frameworks, inverse dynamics, robotics, and more.
We are **actively** developing Libuipc and will continue to add more features and improve its performance. We welcome any feedback and contributions from the community!
## Why Libuipc
- **Easy & Powerful**: Libuipc offers an intuitive and unified approach to creating and accessing vivid simulation scenes, supporting a variety of objects and constraints that can be easily added.
- **Fast & Robust**: Libuipc is designed to run fully in parallel on the GPU, achieving high performance and enabling large-scale simulations. It features a robust and accurate frictional contact model that effectively handles complex frictional scenarios without penetration.
- **High Flexibility**: Libuipc provides APIs in both Python and C++ and supports both Linux and Windows systems.
- **Fully Differentiable**: Libuipc provides differentiable simulation APIs for backward optimizations. (Coming Soon)
<table>
<tr>
<td>
<img src="docs/tutorial/media/concepts_code.svg" width="400">
</td>
<td>
<img src="docs/tutorial/media/concepts.drawio.svg" width="450">
</td>
</tr>
</table>
## Key Features
- Finite Element-Based Deformable Simulation
- Rigid & Soft Body Strong Coupling Simulation
- Penetration-Free & Accurate Frictional Contact Handling
- User Scriptable Animation Control
- Fully Differentiable Simulation (Diff-Sim Coming Soon)
## News
**2026-2-13**: Libuipc now supports [Cursor](https://www.cursor.com/) AI-assisted development! Check out `.cursor/` for built-in commands (build, test, commit, PR workflows) and rules (C++ style, project structure, commit conventions).
**2026-2-7**: UIPC now supports PyPI install with `pip install pyuipc`. For the early test version, we support Win/Linux, Python 3.10–3.13 with CUDA 12.8.
**2025-11-01**: The prototype implementation of Libuipc has been open-sourced ([source code](https://github.com/KemengHuang/Stiff-GIPC)) and serves as the performance benchmark for comparisons with our paper.
**2025-5-23**: [StiffGIPC](https://dl.acm.org/doi/10.1145/3735126) will be presented at SIGGRAPH 2025, and Libuipc v1.0.0 will be released soon!
**2024-11-25**: Libuipc v0.9.0 (Alpha) is published! We are excited to share our work with the community. This is a preview version, if you have any feedback or suggestions, please feel free to contact us! [Issues](https://github.com/spiriMirror/libuipc/issues) and [PRs](https://github.com/spiriMirror/libuipc/pulls) are welcome!
## Citation
If you use **Libuipc** in your project, please cite our works:
```bibtex
@article{stiffgipc2025,
author = {Huang, Kemeng and Lu, Xinyu and Lin, Huancheng and Komura, Taku and Li, Minchen},
title = {StiffGIPC: Advancing GPU IPC for Stiff Affine-Deformable Simulation},
year = {2025},
publisher = {Association for Computing Machinery},
volume = {44},
number = {3},
issn = {0730-0301},
doi = {10.1145/3735126},
journal = {ACM Trans. Graph.},
month = may,
articleno = {31},
numpages = {20}
}
```
```bibtex
@article{gipc2024,
author = {Huang, Kemeng and Chitalu, Floyd M. and Lin, Huancheng and Komura, Taku},
title = {GIPC: Fast and Stable Gauss-Newton Optimization of IPC Barrier Energy},
year = {2024},
publisher = {Association for Computing Machinery},
volume = {43},
number = {2},
issn = {0730-0301},
doi = {10.1145/3643028},
journal = {ACM Trans. Graph.},
month = {mar},
articleno = {23},
numpages = {18}
}
```
| text/markdown | null | Zihang Zhu <paradise_craftsman@hotmail.com>, MuGdxy <lxy819469559@gmail.com> | null | MuGdxy <lxy819469559@gmail.com> | Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
| ipc, computer graphics, physics simulation, cuda, fem, contact | [
"Development Status :: 4 - Beta",
"Intended Audience :: Science/Research",
"Intended Audience :: Developers",
"License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"... | [] | null | null | <3.14,>=3.10 | [] | [] | [] | [
"numpy",
"polyscope<3.0.0,>=2.5.0",
"polyscope; extra == \"gui\"",
"pytest; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://spirimirror.github.io/libuipc-doc/",
"Issues, https://github.com/spiriMirror/libuipc/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:27:44.680357 | pyuipc-0.0.8-cp313-cp313-win_amd64.whl | 24,511,570 | a6/0d/f685f3678fb56464a9b89ded27c86b0f23aa538584e54c10517ffdb0d547/pyuipc-0.0.8-cp313-cp313-win_amd64.whl | cp313 | bdist_wheel | null | false | 0245fe8dabce8de4cc4fe9d97d58e1ad | fba78bb7a50e3262203fb3ff409a20ebbba2871e37b8d51d829b1c9d124b9aa7 | a60df685f3678fb56464a9b89ded27c86b0f23aa538584e54c10517ffdb0d547 | null | [] | 603 |
2.4 | casual-memory | 0.5.0 | Intelligent semantic memory with conflict detection, classification pipeline, and storage abstraction | # casual-memory
**Intelligent semantic memory with conflict detection, classification pipeline, and storage abstraction**
[](https://badge.fury.io/py/casual-memory)
[](https://www.python.org/downloads/)
[](https://opensource.org/licenses/MIT)
[](https://github.com/casualgenius/casual-memory/actions)
---
## 🚀 Features
### 👑 Classification Pipeline (Core Innovation)
- **Protocol-based architecture** - Composable, extensible classifiers
- **NLI Pre-filtering** - Fast semantic filtering (~50-200ms)
- **LLM Conflict Detection** - High-accuracy contradiction detection (96%+)
- **LLM Duplicate Detection** - Smart deduplication vs distinct facts
- **Auto-Resolution** - Confidence-based conflict resolution
- **Graceful degradation** - Heuristic fallback when LLM unavailable
### 🧠 Memory Intelligence
- **Memory extraction** from conversations (user & assistant messages)
- **Conflict detection** with categorization (location, preference, temporal, factual)
- **Confidence scoring** based on mention frequency and recency
- **Memory archiving** with soft-delete patterns
- **Temporal memory** with date normalization and expiry
### 🔌 Storage Abstraction
- **Protocol-based** - Works with any vector database
- **Optional adapters** - Qdrant, PostgreSQL, Redis
- **In-memory implementations** - For testing
- **Bring your own** - Implement custom backends
---
## 📦 Installation
### Minimal (core only)
```bash
pip install casual-memory
```
### With specific backends
```bash
# With NLI support (sentence-transformers)
pip install casual-memory[transformers]
# With Qdrant vector store
pip install casual-memory[qdrant]
# With PostgreSQL conflict store
pip install casual-memory[postgres]
# With Redis short-term store
pip install casual-memory[redis]
# Full installation (all extras)
pip install casual-memory[all]
```
### CPU-only installation (no CUDA)
By default, PyTorch ships with CUDA support, which makes it a large download. For CPU-only machines:
```bash
# Install CPU-only PyTorch first
pip install torch --index-url https://download.pytorch.org/whl/cpu
# Then install casual-memory with transformers
pip install casual-memory[transformers]
```
### For development
```bash
git clone https://github.com/yourusername/casual-memory
cd casual-memory
uv sync --all-extras
```
---
## 🎯 Quick Start
### Classification Pipeline
```python
from casual_memory.classifiers import (
MemoryClassificationPipeline,
NLIClassifier,
ConflictClassifier,
DuplicateClassifier,
AutoResolutionClassifier,
SimilarMemory,
)
from casual_memory.intelligence import NLIPreFilter, LLMConflictVerifier, LLMDuplicateDetector
from casual_memory import MemoryFact
from casual_llm import create_client, create_model, ClientConfig, ModelConfig, Provider
# Initialize components
nli_filter = NLIPreFilter()
client = create_client(ClientConfig(
provider=Provider.OLLAMA,
base_url="http://localhost:11434"
))
model = create_model(client, ModelConfig(name="qwen2.5:7b-instruct"))
conflict_verifier = LLMConflictVerifier(model)
duplicate_detector = LLMDuplicateDetector(model)
# Build pipeline
pipeline = MemoryClassificationPipeline(
classifiers=[
NLIClassifier(nli_filter=nli_filter),
ConflictClassifier(llm_conflict_verifier=conflict_verifier),
DuplicateClassifier(llm_duplicate_detector=duplicate_detector),
AutoResolutionClassifier(supersede_threshold=1.3, keep_threshold=0.7),
],
strategy="tiered", # "single", "tiered", or "all"
)
# Create new memory and similar memories from vector search
new_memory = MemoryFact(
text="I live in Paris",
type="fact",
tags=["location"],
importance=0.9,
confidence=0.8,
entity_id="user-123",
)
similar_memories = [
SimilarMemory(
memory_id="mem_001",
memory=MemoryFact(
text="I live in London",
type="fact",
tags=["location"],
importance=0.8,
confidence=0.6,
entity_id="user-123",
),
similarity_score=0.91,
)
]
# Classify the new memory against similar memories
# (classify() is a coroutine, so this must run inside an async function)
result = await pipeline.classify(new_memory, similar_memories)
# Check overall outcome: "add", "skip", or "conflict"
print(f"Overall outcome: {result.overall_outcome}")
# Check individual similarity results
for sim_result in result.similarity_results:
print(f"Similar memory: {sim_result.similar_memory.memory.text}")
print(f"Outcome: {sim_result.outcome}") # "conflict", "superseded", "same", "neutral"
print(f"Classifier: {sim_result.classifier_name}")
if sim_result.outcome == "conflict":
print(f"Category: {sim_result.metadata.get('category')}")
```
### Memory Extraction
```python
from casual_memory.extractors import LLMMemoryExtracter
from casual_llm import UserMessage, AssistantMessage
# Create extractors for user and assistant memories
user_extractor = LLMMemoryExtracter(model=model, source="user")
assistant_extractor = LLMMemoryExtracter(model=model, source="assistant")
messages = [
UserMessage(content="My name is Alex and I live in Bangkok"),
AssistantMessage(content="Nice to meet you, Alex!"),
]
# Extract user-stated memories (extract() is a coroutine; await it inside an async function)
user_memories = await user_extractor.extract(messages)
# [MemoryFact(text="My name is Alex", type="fact", importance=0.9, ...),
# MemoryFact(text="I live in Bangkok", type="fact", importance=0.8, ...)]
# Extract assistant-observed memories
assistant_memories = await assistant_extractor.extract(messages)
```
### Custom Storage Backend
```python
from typing import Any, Optional
class MyVectorStore:
"""Custom vector store implementing VectorMemoryStore protocol"""
def add(self, embedding: list[float], payload: dict) -> str:
"""Add memory to store, return memory ID."""
# Your implementation
return "memory_id"
def search(
self,
query_embedding: list[float],
top_k: int = 5,
min_score: float = 0.7,
filters: Optional[Any] = None,
) -> list[Any]:
"""Search for similar memories."""
# Your implementation
return []
def update_memory(self, memory_id: str, updates: dict) -> bool:
"""Update memory metadata (mention_count, last_seen, etc.)."""
# Your implementation
return True
def archive_memory(self, memory_id: str, superseded_by: Optional[str] = None) -> bool:
"""Soft-delete memory."""
# Your implementation
return True
```
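Because the protocol is duck-typed, a toy in-memory version is enough for tests. This sketch only illustrates the contract — the library ships its own in-memory implementations, and cosine similarity is an assumed scoring function here:

```python
import math
import uuid
from typing import Optional


class InMemoryVectorStore:
    """Toy in-memory store satisfying the VectorMemoryStore protocol (illustrative only)."""

    def __init__(self) -> None:
        self._rows: dict[str, dict] = {}

    def add(self, embedding: list[float], payload: dict) -> str:
        memory_id = uuid.uuid4().hex
        self._rows[memory_id] = {"embedding": embedding, "payload": payload, "archived": False}
        return memory_id

    def search(self, query_embedding, top_k=5, min_score=0.7, filters=None):
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
            return dot / norm if norm else 0.0

        hits = [
            {"id": mid, "score": cosine(query_embedding, row["embedding"]), "payload": row["payload"]}
            for mid, row in self._rows.items()
            if not row["archived"]
        ]
        hits = [h for h in hits if h["score"] >= min_score]
        return sorted(hits, key=lambda h: h["score"], reverse=True)[:top_k]

    def update_memory(self, memory_id: str, updates: dict) -> bool:
        row = self._rows.get(memory_id)
        if row is None:
            return False
        row["payload"].update(updates)  # e.g. mention_count, last_seen
        return True

    def archive_memory(self, memory_id: str, superseded_by: Optional[str] = None) -> bool:
        row = self._rows.get(memory_id)
        if row is None:
            return False
        row["archived"] = True  # soft delete: excluded from search, data retained
        row["payload"]["superseded_by"] = superseded_by
        return True
```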
---
## 🏗️ Architecture
### Classification Pipeline Flow
```
Input: New Memory + Similar Memories (from vector search)
↓
1. NLI Classifier (~50-200ms)
├─ High entailment (≥0.85) → same (duplicate)
├─ High neutral (≥0.5) → neutral (distinct)
└─ Uncertain → Pass to next classifier
↓
2. Conflict Classifier (~500-2000ms)
├─ LLM detects contradiction → conflict
└─ No conflict → Pass to next classifier
↓
3. Duplicate Classifier (~500-2000ms)
├─ Same fact → same
├─ Refinement (more detail) → superseded
└─ Distinct facts → neutral
↓
4. Auto-Resolution Classifier
├─ Analyze conflict results
├─ High new confidence (ratio ≥1.3) → superseded (keep_new)
├─ High old confidence (ratio ≤0.7) → same (keep_old)
└─ Similar confidence → Keep as conflict
↓
Output: MemoryClassificationResult
├─ overall_outcome: "add" | "skip" | "conflict"
└─ similarity_results: Individual outcomes for each similar memory
```
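The tiered strategy amounts to "first decisive answer wins": each stage either returns an outcome or defers to the next, more expensive stage. A toy sketch of that dispatch logic — not the library's code; the stage functions and score fields below are purely illustrative, only the thresholds come from the flow above:

```python
def tiered_classify(stages, new_memory, similar_memory):
    """Run stages in order; the first one returning a non-None outcome wins."""
    for name, stage in stages:
        outcome = stage(new_memory, similar_memory)
        if outcome is not None:
            return name, outcome
    return "fallback", "neutral"


def nli_stage(new, old):
    # Pretend NLI scores: entailment >= 0.85 -> same, neutral >= 0.5 -> neutral
    if old.get("entailment", 0.0) >= 0.85:
        return "same"
    if old.get("neutral", 0.0) >= 0.5:
        return "neutral"
    return None  # uncertain -> escalate to the LLM stages


def conflict_stage(new, old):
    return "conflict" if old.get("contradicts") else None


stages = [("nli", nli_stage), ("conflict", conflict_stage)]
print(tiered_classify(stages, {}, {"contradicts": True}))  # ('conflict', 'conflict')
```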
### Key Concepts
**Similarity Outcomes** (for each similar memory):
- `conflict` - Contradictory memories requiring user resolution
- `superseded` - Similar memory should be archived (new one is better)
- `same` - Duplicate memory (update existing metadata)
- `neutral` - Distinct facts that can coexist
**Memory Outcomes** (overall action):
- `add` - Insert new memory to vector store
- `skip` - Update existing memory (increment mention_count)
- `conflict` - Create conflict record for user resolution
**Confidence Scoring:**
- Based on mention frequency (1 mention = 0.5, 5+ mentions = 0.95)
- Recency factor (decay after 30 days)
- Spread factor (boost if mentioned over time)
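As a rough illustration of how those factors could combine — the linear interpolation and the decay constant here are assumptions; only the endpoints (1 mention → 0.5, 5+ → 0.95) and the 30-day decay onset come from the bullets above, and the spread boost is omitted:

```python
import math


def confidence_score(mentions: int, days_since_last_seen: float) -> float:
    """Illustrative confidence curve; shape between the documented endpoints is assumed."""
    # Frequency: linear from 0.5 at 1 mention to 0.95 at 5+ mentions (assumed interpolation)
    capped = min(max(mentions, 1), 5)
    freq = 0.5 + (0.95 - 0.5) * (capped - 1) / 4

    # Recency: no penalty within 30 days, exponential decay after (assumed curve)
    overdue = max(0.0, days_since_last_seen - 30.0)
    recency = math.exp(-overdue / 30.0)

    return freq * recency


print(round(confidence_score(1, 0), 2))   # 0.5
print(round(confidence_score(5, 0), 2))   # 0.95
```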
**Memory Types:**
- `fact` - Factual information (name, location, job, etc.)
- `preference` - User preferences (likes, dislikes, habits)
- `goal` - User goals and aspirations
- `event` - Events (past or future)
---
## 📚 Documentation
- [Architecture Guide](docs/ARCHITECTURE.md) - System design and concepts
- [Examples](examples/) - Working example code
---
## 🧪 Testing
```bash
# Run all tests
uv run pytest
# Run with coverage
uv run pytest --cov=casual_memory --cov-report=html
# Run specific test file
uv run pytest tests/classifiers/test_pipeline.py -v
# Run specific test
uv run pytest tests/classifiers/test_pipeline.py::test_pipeline_sequential_execution -v
```
---
## 🎯 Benchmarks
Classification pipeline performance on our test dataset:
| Model | Conflict Accuracy | Avg Time |
|-------|-------------------|----------|
| qwen2.5:7b-instruct | 96.2% | 1.2s |
| llama3:8b | 94.5% | 1.5s |
| gpt-3.5-turbo | 97.1% | 0.8s |
NLI Pre-filter performance:
- Accuracy: 92.38% (SNLI), 90.04% (MNLI)
- Speed: ~200ms CPU, ~50ms GPU
- Filters: 70-85% of obvious cases before LLM
---
## 🤝 Contributing
Contributions are welcome! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
---
## 📄 License
MIT License - see [LICENSE](LICENSE) for details.
---
## 🙏 Acknowledgments
Built with:
- [casual-llm](https://github.com/casualgenius/casual-llm) - LLM provider abstraction
- [sentence-transformers](https://www.sbert.net/) - NLI models
- Inspired by research in semantic memory and conflict detection
---
## 🔗 Links
- [Documentation](https://github.com/casualgenius/casual-memory#readme)
- [Issue Tracker](https://github.com/casualgenius/casual-memory/issues)
- [Changelog](CHANGELOG.md)
- [Examples](examples/)
| text/markdown | null | Alex Stansfield <alex@casualgenius.com> | null | null | MIT | ai, chatbot, conflict-detection, llm, memory, semantic-memory | [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Scientific/Engi... | [] | null | null | >=3.10 | [] | [] | [] | [
"casual-llm[openai]>=0.6.0",
"pydantic>=2.0.0",
"typing-extensions>=4.0.0",
"psycopg2-binary>=2.9.0; extra == \"all\"",
"qdrant-client>=1.7.0; extra == \"all\"",
"redis>=5.0.0; extra == \"all\"",
"sentence-transformers>=2.0.0; extra == \"all\"",
"sqlalchemy>=2.0.0; extra == \"all\"",
"black>=23.0.0;... | [] | [] | [] | [
"Homepage, https://github.com/casualgenius/casual-memory",
"Documentation, https://github.com/casualgenius/casual-memory#readme",
"Repository, https://github.com/casualgenius/casual-memory",
"Issues, https://github.com/casualgenius/casual-memory/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:25:25.768152 | casual_memory-0.5.0.tar.gz | 392,873 | ba/6f/cc3c84ecf5dc12489ca97de25344ac412e8c071e29b76ce7dd3e13603be3/casual_memory-0.5.0.tar.gz | source | sdist | null | false | c4b545742b4240c972cd096cf99470bd | 12f4bedec034fefb33ffb7c1548bf6f0b2e6143b236fef367b3b21f97e607fec | ba6fcc3c84ecf5dc12489ca97de25344ac412e8c071e29b76ce7dd3e13603be3 | null | [
"LICENSE"
] | 295 |
2.4 | pyipv8 | 3.2.0 | The Python implementation of the IPV8 library | **FAQ:**
- **Q:** Is this the new official Layer 3 solution for the Internet?
- **A:** No, the naming is a [10-year-old](https://www.tribler.org/IPv8/) mockery of the deployment failure of IPv6 (which we sincerely hope will be deployed properly at some point in time).
**Unit tests**: [](https://github.com/Tribler/py-ipv8/actions/workflows/coverage.yml)
**Mutation Tests**: [](https://jenkins-ci.tribler.org/job/ipv8/job/mutation_test_daily/HTML_20Report/)
**Read the Docs**: [](https://py-ipv8.readthedocs.io/)
## What is IPv8?
IPv8 aims to provide authenticated communication with privacy.
The design principle is to enable communication between public key pairs: IP addresses and physical network attachment points are abstracted away.
This Python 3 package is an amalgamation of peer-to-peer communication functionality from [Dispersy](https://github.com/Tribler/dispersy) and [Tribler](https://github.com/Tribler/tribler), developed over the last 19 years by students and employees of the Delft University of Technology.
The IPv8 library allows you to easily create network overlays on which to build your own applications.
### IPv8 Objectives
- **Authentication**. We offer mutual authentication using strong cryptography. During an IPv8 communication session, both parties can be sure of the other party’s identity. IPv8 users are identified by their public key. The initial key exchange is designed so that secrets are never transmitted across the Internet, not even in encrypted form. We use a standard challenge/response protocol with protection against spoofing, man-in-the-middle, and replay attacks.
- **Privacy**. IPv8 is specifically designed for strong privacy protection and end-to-end encryption with perfect forward secrecy. We enhanced the industry-standard onion routing protocol, Tor, for usage in a trustless environment (e.g. no trusted central directory servers).
- **No infrastructure dependency**. Everybody is equal in the world of IPv8. No central web server, discovery server, or support foundation is needed.
- **Universal connectivity**. IPv8 can establish direct communication in difficult network situations. This includes connecting people behind a NAT or firewall. IPv8 includes a single simple and effective NAT traversal technique: UDP hole-punching. This is essential when offering privacy without infrastructure and consumer-grade donated resources.
- **Trust**. You can enhance your security if you tell IPv8 which people you know and trust. It tries to build a web-of-trust automatically.
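The UDP hole-punching technique mentioned above boils down to both peers sending a datagram to the other's externally observed endpoint first, which opens NAT mappings for the replies. The sketch below shows only that message flow on localhost (no real NAT is traversed, and IPv8's actual implementation, including how rendezvous peers exchange endpoints, differs):

```python
# Sketch of the hole-punching message flow; localhost only, so no real
# NAT is involved. In IPv8, intermediary peers exchange the observed
# (address, port) endpoints on the peers' behalf.
import socket

a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
a.bind(("127.0.0.1", 0))
b.bind(("127.0.0.1", 0))

# Each side "punches" by sending to the peer's endpoint first.
a.sendto(b"punch", b.getsockname())
b.sendto(b"punch", a.getsockname())

# Because both sides sent outbound traffic, both inbound packets
# would now pass a typical NAT's mapping.
msg_at_b, _ = b.recvfrom(1024)
msg_at_a, _ = a.recvfrom(1024)
a.close(); b.close()
```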
### Dependencies
The dependencies for IPv8 are collected in the `requirements.txt` file and can be installed using `pip`:
```
python3 -m pip install --upgrade -r requirements.txt
```
On Windows or macOS you will need to install `Libsodium` separately, as explained [here](https://github.com/Tribler/py-ipv8/blob/master/doc/preliminaries/install_libsodium.rst).
### Tests
Running tests can be done by running:
```
python3 run_all_tests.py
```
Running code coverage requires the `coverage` package (`python3 -m pip install coverage`).
A coverage report can be generated by running:
```
python3 create_test_coverage_report.py
```
### Getting started
You can start creating your first network overlay by following [the overlay creation tutorial](https://py-ipv8.readthedocs.io/en/latest/basics/overlay_tutorial.html).
We provide additional documentation on [configuration](https://py-ipv8.readthedocs.io/en/latest/reference/configuration.html), [key generation](https://py-ipv8.readthedocs.io/en/latest/reference/keys.html) and [message serialization formats](https://py-ipv8.readthedocs.io/en/latest/reference/serialization.html) on [our ReadTheDocs page](https://py-ipv8.readthedocs.io/en/latest/).
| text/markdown | Tribler | null | null | null | null | null | [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: GNU Lesser General Public License v3 (LGPLv3)",
"Natural Language :: English",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: ... | [] | https://github.com/Tribler/py-ipv8 | null | null | [] | [] | [] | [
"cryptography",
"libnacl",
"aiohttp",
"aiohttp_apispec",
"pyOpenSSL",
"pyasn1",
"marshmallow",
"typing-extensions",
"packaging",
"coverage; extra == \"all\"",
"coverage; extra == \"tests\""
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:24:35.649107 | pyipv8-3.2.0.tar.gz | 298,024 | a2/18/6aacfd4a520627f0025732c293be818e08d487e9298e53726544a7838f6b/pyipv8-3.2.0.tar.gz | source | sdist | null | false | 6eed0b56f2fee6cd49b519313a33c98b | ca22aaa75c48edfc27db78b8a97c40d3f5cba9f0f0a0ec4add8105cfaebac6eb | a2186aacfd4a520627f0025732c293be818e08d487e9298e53726544a7838f6b | null | [
"LICENSE"
] | 258 |
2.4 | mollie-api-py | 1.1.7 | Python Client SDK Generated by Speakeasy. | # mollie-api-py
Developer-friendly & type-safe Python SDK specifically catered to leverage *mollie-api-py* API.
<div align="left">
<a href="https://www.speakeasy.com/?utm_source=mollie-api-py&utm_campaign=python"><img src="https://custom-icon-badges.demolab.com/badge/-Built%20By%20Speakeasy-212015?style=for-the-badge&logoColor=FBE331&logo=speakeasy&labelColor=545454" /></a>
<a href="https://opensource.org/licenses/MIT">
<img src="https://img.shields.io/badge/License-MIT-blue.svg" style="width: 100px; height: 28px;" />
</a>
</div>
## Migration
This documentation is for Mollie's new SDK. You can find more details on how to migrate from the old version to the new one [here](https://github.com/mollie/mollie-api-py/blob/master/MIGRATION.md).
<!-- Start Summary [summary] -->
## Summary
<!-- End Summary [summary] -->
<!-- Start Table of Contents [toc] -->
## Table of Contents
<!-- $toc-max-depth=2 -->
* [mollie-api-py](https://github.com/mollie/mollie-api-py/blob/master/#mollie-api-py)
* [Migration](https://github.com/mollie/mollie-api-py/blob/master/#migration)
* [SDK Installation](https://github.com/mollie/mollie-api-py/blob/master/#sdk-installation)
* [IDE Support](https://github.com/mollie/mollie-api-py/blob/master/#ide-support)
* [SDK Example Usage](https://github.com/mollie/mollie-api-py/blob/master/#sdk-example-usage)
* [Authentication](https://github.com/mollie/mollie-api-py/blob/master/#authentication)
* [Idempotency Key](https://github.com/mollie/mollie-api-py/blob/master/#idempotency-key)
* [Add Custom User-Agent Header](https://github.com/mollie/mollie-api-py/blob/master/#add-custom-user-agent-header)
* [Add Profile ID and Testmode to Client](https://github.com/mollie/mollie-api-py/blob/master/#add-profile-id-and-testmode-to-client)
* [Available Resources and Operations](https://github.com/mollie/mollie-api-py/blob/master/#available-resources-and-operations)
* [Global Parameters](https://github.com/mollie/mollie-api-py/blob/master/#global-parameters)
* [Retries](https://github.com/mollie/mollie-api-py/blob/master/#retries)
* [Error Handling](https://github.com/mollie/mollie-api-py/blob/master/#error-handling)
* [Server Selection](https://github.com/mollie/mollie-api-py/blob/master/#server-selection)
* [Custom HTTP Client](https://github.com/mollie/mollie-api-py/blob/master/#custom-http-client)
* [Resource Management](https://github.com/mollie/mollie-api-py/blob/master/#resource-management)
* [Debugging](https://github.com/mollie/mollie-api-py/blob/master/#debugging)
* [Development](https://github.com/mollie/mollie-api-py/blob/master/#development)
* [Contributions](https://github.com/mollie/mollie-api-py/blob/master/#contributions)
<!-- End Table of Contents [toc] -->
<!-- Start SDK Installation [installation] -->
## SDK Installation
> [!NOTE]
> **Python version upgrade policy**
>
> Once a Python version reaches its [official end of life date](https://devguide.python.org/versions/), a 3-month grace period is provided for users to upgrade. Following this grace period, the minimum python version supported in the SDK will be updated.
The SDK can be installed with *uv*, *pip*, or *poetry* package managers.
### uv
*uv* is a fast Python package installer and resolver, designed as a drop-in replacement for pip and pip-tools. It's recommended for its speed and modern Python tooling capabilities.
```bash
uv add mollie-api-py
```
### PIP
*PIP* is the default package installer for Python, enabling easy installation and management of packages from PyPI via the command line.
```bash
pip install mollie-api-py
```
### Poetry
*Poetry* is a modern tool that simplifies dependency management and package publishing by using a single `pyproject.toml` file to handle project metadata and dependencies.
```bash
poetry add mollie-api-py
```
### Shell and script usage with `uv`
You can use this SDK in a Python shell with [uv](https://docs.astral.sh/uv/) and the `uvx` command that comes with it like so:
```shell
uvx --from mollie-api-py python
```
It's also possible to write a standalone Python script without needing to set up a whole project like so:
```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.9"
# dependencies = [
# "mollie-api-py",
# ]
# ///
from mollie import ClientSDK
sdk = ClientSDK(
# SDK arguments
)
# Rest of script here...
```
Once that is saved to a file, you can run it with `uv run script.py` where
`script.py` can be replaced with the actual file name.
<!-- End SDK Installation [installation] -->
<!-- Start IDE Support [idesupport] -->
## IDE Support
### PyCharm
Generally, the SDK will work well with most IDEs out of the box. However, when using PyCharm, you can enjoy much better integration with Pydantic by installing an additional plugin.
- [PyCharm Pydantic Plugin](https://docs.pydantic.dev/latest/integrations/pycharm/)
<!-- End IDE Support [idesupport] -->
<!-- Start SDK Example Usage [usage] -->
## SDK Example Usage
### Example
```python
# Synchronous Example
import mollie
from mollie import ClientSDK
import os
with ClientSDK(
testmode=False,
security=mollie.Security(
api_key=os.getenv("CLIENT_API_KEY", ""),
),
) as client_sdk:
res = client_sdk.balances.list(currency="EUR", from_="bal_gVMhHKqSSRYJyPsuoPNFH", limit=50, idempotency_key="123e4567-e89b-12d3-a456-426")
# Handle response
print(res)
```
</br>
The same SDK client can also be used to make asynchronous requests by importing asyncio.
```python
# Asynchronous Example
import asyncio
import mollie
from mollie import ClientSDK
import os
async def main():
async with ClientSDK(
testmode=False,
security=mollie.Security(
api_key=os.getenv("CLIENT_API_KEY", ""),
),
) as client_sdk:
res = await client_sdk.balances.list_async(currency="EUR", from_="bal_gVMhHKqSSRYJyPsuoPNFH", limit=50, idempotency_key="123e4567-e89b-12d3-a456-426")
# Handle response
print(res)
asyncio.run(main())
```
<!-- End SDK Example Usage [usage] -->
<!-- Start Authentication [security] -->
## Authentication
### Per-Client Security Schemes
This SDK supports the following security schemes globally:
| Name | Type | Scheme | Environment Variable |
| --------- | ------ | ------------ | -------------------- |
| `api_key` | http | HTTP Bearer | `CLIENT_API_KEY` |
| `o_auth` | oauth2 | OAuth2 token | `CLIENT_O_AUTH` |
You can set the security parameters through the `security` optional parameter when initializing the SDK client instance. The selected scheme will be used by default to authenticate with the API for all operations that support it. For example:
```python
import mollie
from mollie import ClientSDK
import os
with ClientSDK(
security=mollie.Security(
api_key=os.getenv("CLIENT_API_KEY", ""),
),
testmode=False,
) as client_sdk:
res = client_sdk.balances.list(currency="EUR", from_="bal_gVMhHKqSSRYJyPsuoPNFH", limit=50, idempotency_key="123e4567-e89b-12d3-a456-426")
# Handle response
print(res)
```
<!-- End Authentication [security] -->
<!-- Start Idempotency Key -->
## Idempotency Key
This SDK supports the usage of Idempotency Keys. See our [documentation](https://docs.mollie.com/reference/api-idempotency) on how to use it.
```python
import os
from mollie import ClientSDK, Security
client = ClientSDK(
security = Security(
api_key = os.getenv("CLIENT_API_KEY", "test_..."),
)
)
payload = {
"description": "Description",
"amount": {
"currency": "EUR",
"value": "5.00",
},
"redirect_url": "https://example.org/redirect",
}
idempotency_key = "<some-idempotency-key>"
payment1 = client.payments.create(
payment_request=payload,
idempotency_key=idempotency_key
)
payment2 = client.payments.create(
payment_request=payload,
idempotency_key=idempotency_key
)
print(f"Payment created with ID: {payment1.id}")
print(f"Payment created with ID: {payment2.id}")
print("Payments are the same" if payment1.id == payment2.id else "Payments are different")
```
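Conceptually, the server uses the key to deduplicate retries: the first request with a given key creates the resource, and replays return the cached result instead of charging again. The sketch below illustrates that contract only; it is not Mollie's server implementation:

```python
# Illustrative model of an idempotent endpoint, not Mollie's backend.
import uuid

_seen = {}  # idempotency_key -> first response

def create_payment(payload, idempotency_key):
    if idempotency_key in _seen:
        return _seen[idempotency_key]  # replay: no duplicate charge
    payment = {"id": f"tr_{uuid.uuid4().hex[:10]}", **payload}
    _seen[idempotency_key] = payment
    return payment

p1 = create_payment({"value": "5.00"}, "key-1")
p2 = create_payment({"value": "5.00"}, "key-1")
assert p1["id"] == p2["id"]  # same payment created exactly once
```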
<!-- End Idempotency Key -->
<!-- Start Add Custom User-Agent Header -->
## Add Custom User-Agent Header
The SDK allows you to append a custom suffix to the `User-Agent` header for all requests. This can be used to identify
your application or integration when interacting with the API, making it easier to track usage or debug requests. The suffix is automatically added to the default User-Agent string generated by the SDK. You can add it when creating the
client:
```py
client = ClientSDK(
security = Security(
api_key = os.getenv("CLIENT_API_KEY", "test_..."),
),
custom_user_agent = "insert something here"
)
```
<!-- End Add Custom User-Agent Header -->
<!-- Start Add Profile ID and Testmode to Client -->
## Add Profile ID and Testmode to Client
The SDK allows you to define the `profileId` and `testmode` on the client. This way, you don't need to add this information to the payload on every request when using OAuth. These client-level values will not override details provided in individual requests.
```py
client = ClientSDK(
security = Security(
o_auth = os.getenv("CLIENT_OAUTH_KEY", "test_..."),
),
testmode = False,
profileId = "pfl_..."
)
```
<!-- End Add Profile ID and Testmode to Client -->
<!-- Start Available Resources and Operations [operations] -->
## Available Resources and Operations
<details open>
<summary>Available methods</summary>
### [balance_transfers](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balancetransfers/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balancetransfers/README.md#create) - Create a Connect balance transfer
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balancetransfers/README.md#list) - List all Connect balance transfers
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balancetransfers/README.md#get) - Get a Connect balance transfer
### [balances](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balances/README.md)
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balances/README.md#list) - List balances
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balances/README.md#get) - Get balance
* [get_primary](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balances/README.md#get_primary) - Get primary balance
* [get_report](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balances/README.md#get_report) - Get balance report
* [list_transactions](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/balances/README.md#list_transactions) - List balance transactions
### [capabilities](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/capabilities/README.md)
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/capabilities/README.md#list) - List capabilities
### [captures](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/captures/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/captures/README.md#create) - Create capture
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/captures/README.md#list) - List captures
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/captures/README.md#get) - Get capture
### [chargebacks](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/chargebackssdk/README.md)
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/chargebackssdk/README.md#list) - List payment chargebacks
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/chargebackssdk/README.md#get) - Get payment chargeback
* [all](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/chargebackssdk/README.md#all) - List all chargebacks
### [client_links](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/clientlinks/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/clientlinks/README.md#create) - Create client link
### [clients](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/clients/README.md)
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/clients/README.md#list) - List clients
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/clients/README.md#get) - Get client
### [customers](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/customers/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/customers/README.md#create) - Create customer
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/customers/README.md#list) - List customers
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/customers/README.md#get) - Get customer
* [update](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/customers/README.md#update) - Update customer
* [delete](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/customers/README.md#delete) - Delete customer
* [create_payment](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/customers/README.md#create_payment) - Create customer payment
* [list_payments](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/customers/README.md#list_payments) - List customer payments
### [delayed_routing](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/delayedrouting/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/delayedrouting/README.md#create) - Create a delayed route
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/delayedrouting/README.md#list) - List payment routes
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/delayedrouting/README.md#get) - Get a delayed route
### [invoices](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/invoices/README.md)
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/invoices/README.md#list) - List invoices
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/invoices/README.md#get) - Get invoice
### [mandates](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/mandates/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/mandates/README.md#create) - Create mandate
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/mandates/README.md#list) - List mandates
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/mandates/README.md#get) - Get mandate
* [revoke](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/mandates/README.md#revoke) - Revoke mandate
### [methods](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/methods/README.md)
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/methods/README.md#list) - List payment methods
* [all](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/methods/README.md#all) - List all payment methods
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/methods/README.md#get) - Get payment method
### [onboarding](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/onboarding/README.md)
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/onboarding/README.md#get) - Get onboarding status
* [submit](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/onboarding/README.md#submit) - Submit onboarding data
### [organizations](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/organizations/README.md)
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/organizations/README.md#get) - Get organization
* [get_current](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/organizations/README.md#get_current) - Get current organization
* [get_partner](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/organizations/README.md#get_partner) - Get partner status
### [payment_links](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentlinks/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentlinks/README.md#create) - Create payment link
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentlinks/README.md#list) - List payment links
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentlinks/README.md#get) - Get payment link
* [update](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentlinks/README.md#update) - Update payment link
* [delete](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentlinks/README.md#delete) - Delete payment link
* [list_payments](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentlinks/README.md#list_payments) - Get payment link payments
### [payments](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentssdk/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentssdk/README.md#create) - Create payment
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentssdk/README.md#list) - List payments
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentssdk/README.md#get) - Get payment
* [update](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentssdk/README.md#update) - Update payment
* [cancel](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentssdk/README.md#cancel) - Cancel payment
* [release_authorization](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/paymentssdk/README.md#release_authorization) - Release payment authorization
### [permissions](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/permissions/README.md)
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/permissions/README.md#list) - List permissions
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/permissions/README.md#get) - Get permission
### [profiles](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/profiles/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/profiles/README.md#create) - Create profile
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/profiles/README.md#list) - List profiles
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/profiles/README.md#get) - Get profile
* [update](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/profiles/README.md#update) - Update profile
* [delete](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/profiles/README.md#delete) - Delete profile
* [get_current](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/profiles/README.md#get_current) - Get current profile
### [refunds](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/refundssdk/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/refundssdk/README.md#create) - Create payment refund
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/refundssdk/README.md#list) - List payment refunds
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/refundssdk/README.md#get) - Get payment refund
* [cancel](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/refundssdk/README.md#cancel) - Cancel payment refund
* [all](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/refundssdk/README.md#all) - List all refunds
### [sales_invoices](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/salesinvoices/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/salesinvoices/README.md#create) - Create sales invoice
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/salesinvoices/README.md#list) - List sales invoices
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/salesinvoices/README.md#get) - Get sales invoice
* [update](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/salesinvoices/README.md#update) - Update sales invoice
* [delete](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/salesinvoices/README.md#delete) - Delete sales invoice
### [settlements](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/settlements/README.md)
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/settlements/README.md#list) - List settlements
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/settlements/README.md#get) - Get settlement
* [get_open](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/settlements/README.md#get_open) - Get open settlement
* [get_next](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/settlements/README.md#get_next) - Get next settlement
* [list_payments](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/settlements/README.md#list_payments) - List settlement payments
* [list_captures](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/settlements/README.md#list_captures) - List settlement captures
* [list_refunds](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/settlements/README.md#list_refunds) - List settlement refunds
* [list_chargebacks](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/settlements/README.md#list_chargebacks) - List settlement chargebacks
### [subscriptions](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/subscriptions/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/subscriptions/README.md#create) - Create subscription
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/subscriptions/README.md#list) - List customer subscriptions
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/subscriptions/README.md#get) - Get subscription
* [update](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/subscriptions/README.md#update) - Update subscription
* [cancel](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/subscriptions/README.md#cancel) - Cancel subscription
* [all](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/subscriptions/README.md#all) - List all subscriptions
* [list_payments](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/subscriptions/README.md#list_payments) - List subscription payments
### [terminals](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/terminals/README.md)
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/terminals/README.md#list) - List terminals
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/terminals/README.md#get) - Get terminal
### [wallets](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/wallets/README.md)
* [request_apple_pay_session](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/wallets/README.md#request_apple_pay_session) - Request Apple Pay payment session
### [webhook_events](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/webhookevents/README.md)
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/webhookevents/README.md#get) - Get a Webhook Event
### [webhooks](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/webhooks/README.md)
* [create](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/webhooks/README.md#create) - Create a webhook
* [list](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/webhooks/README.md#list) - List all webhooks
* [update](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/webhooks/README.md#update) - Update a webhook
* [get](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/webhooks/README.md#get) - Get a webhook
* [delete](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/webhooks/README.md#delete) - Delete a webhook
* [test](https://github.com/mollie/mollie-api-py/blob/master/docs/sdks/webhooks/README.md#test) - Test a webhook
</details>
<!-- End Available Resources and Operations [operations] -->
<!-- Start Global Parameters [global-parameters] -->
## Global Parameters
Certain parameters are configured globally. These parameters may be set on the SDK client instance during initialization. When configured this way, the global values are used as defaults for the operations that accept them, and each such operation lets you override the global value if needed.
For example, you can set `profile_id` at SDK initialization and then you do not have to pass the same value on calls to operations like `list`. If you do pass it there, the call-level value locally overrides the global setting. See the example code below for a demonstration.
### Available Globals
The following global parameters are available.
Global parameters can also be set via environment variable.
| Name | Type | Description | Environment |
| ----------------- | ---- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------ |
| profile_id | str | The identifier referring to the [profile](https://github.com/mollie/mollie-api-py/blob/master/get-profile) you wish to<br/>retrieve the resources for.<br/><br/>Most API credentials are linked to a single profile. In these cases the `profileId` must not be sent. For<br/>organization-level credentials such as OAuth access tokens however, the `profileId` parameter is required. | CLIENT_PROFILE_ID |
| testmode | bool | Most API credentials are specifically created for either live mode or test mode. In those cases the `testmode` query<br/>parameter must not be sent. For organization-level credentials such as OAuth access tokens, you can enable test mode by<br/>setting the `testmode` query parameter to `true`.<br/><br/>Test entities cannot be retrieved when the endpoint is set to live mode, and vice versa. | CLIENT_TESTMODE |
| custom_user_agent | str | Custom user agent string to be appended to the default Mollie SDK user agent. | CLIENT_CUSTOM_USER_AGENT |
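The environment-variable route from the table above can be used instead of constructor arguments — for example (placeholder values, substitute your own):

```shell
# Placeholder values — substitute your own profile ID and user agent.
export CLIENT_PROFILE_ID="<your-profile-id>"
export CLIENT_TESTMODE="true"
export CLIENT_CUSTOM_USER_AGENT="my-integration/1.0"
```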
### Example
```python
import mollie
from mollie import ClientSDK
import os
with ClientSDK(
testmode=False,
profile_id="<id>",
custom_user_agent="<value>",
security=mollie.Security(
api_key=os.getenv("CLIENT_API_KEY", ""),
),
) as client_sdk:
res = client_sdk.balances.list(currency="EUR", from_="bal_gVMhHKqSSRYJyPsuoPNFH", limit=50, idempotency_key="123e4567-e89b-12d3-a456-426")
# Handle response
print(res)
```
<!-- End Global Parameters [global-parameters] -->
<!-- Start Retries [retries] -->
## Retries
Some of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.
To change the default retry strategy for a single API call, simply provide a `RetryConfig` object to the call:
```python
import mollie
from mollie import ClientSDK
from mollie.utils import BackoffStrategy, RetryConfig
import os
with ClientSDK(
testmode=False,
security=mollie.Security(
api_key=os.getenv("CLIENT_API_KEY", ""),
),
) as client_sdk:
    res = client_sdk.balances.list(currency="EUR", from_="bal_gVMhHKqSSRYJyPsuoPNFH", limit=50, idempotency_key="123e4567-e89b-12d3-a456-426",
        retries=RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False))
    # Handle response
    print(res)
```
If you'd like to override the default retry strategy for all operations that support retries, you can use the `retry_config` optional parameter when initializing the SDK:
```python
import mollie
from mollie import ClientSDK
from mollie.utils import BackoffStrategy, RetryConfig
import os
with ClientSDK(
retry_config=RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False),
testmode=False,
security=mollie.Security(
api_key=os.getenv("CLIENT_API_KEY", ""),
),
) as client_sdk:
res = client_sdk.balances.list(currency="EUR", from_="bal_gVMhHKqSSRYJyPsuoPNFH", limit=50, idempotency_key="123e4567-e89b-12d3-a456-426")
# Handle response
print(res)
```
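As an illustration of what the four positional `BackoffStrategy` arguments control (assuming the common Speakeasy convention of initial interval, maximum interval, growth exponent, and maximum elapsed time, all in milliseconds — verify against your generated `mollie.utils`), the resulting delay sequence can be sketched as:

```python
def backoff_delays(initial_ms: float, max_interval_ms: float, exponent: float, max_elapsed_ms: float):
    """Yield exponentially growing retry delays, capped per attempt and by a total budget."""
    elapsed = 0.0
    attempt = 0
    while True:
        delay = min(initial_ms * (exponent ** attempt), max_interval_ms)
        if elapsed + delay > max_elapsed_ms:
            break  # total retry budget exhausted
        yield delay
        elapsed += delay
        attempt += 1

# Same numbers as the examples above: BackoffStrategy(1, 50, 1.1, 100)
delays = list(backoff_delays(1, 50, 1.1, 100))
```

Each individual delay is capped at 50 ms, and retries stop once the 100 ms overall budget would be exceeded.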
<!-- End Retries [retries] -->
<!-- Start Error Handling [errors] -->
## Error Handling
[`ClientError`](https://github.com/mollie/mollie-api-py/blob/master/src/mollie/models/clienterror.py) is the base class for all HTTP error responses. It has the following properties:
| Property | Type | Description |
| ------------------ | ---------------- | --------------------------------------------------------------------------------------- |
| `err.message` | `str` | Error message |
| `err.status_code`  | `int`            | HTTP response status code, e.g. `404`                                                    |
| `err.headers` | `httpx.Headers` | HTTP response headers |
| `err.body` | `str` | HTTP body. Can be empty string if no body is returned. |
| `err.raw_response` | `httpx.Response` | Raw HTTP response |
| `err.data`         |                  | Optional. Some errors may contain structured data. [See Error Classes](https://github.com/mollie/mollie-api-py/blob/master/README.md#error-classes). |
### Example
```python
import mollie
from mollie import ClientSDK, models
import os
with ClientSDK(
testmode=False,
security=mollie.Security(
api_key=os.getenv("CLIENT_API_KEY", ""),
),
) as client_sdk:
res = None
try:
res = client_sdk.balances.list(currency="EUR", from_="bal_gVMhHKqSSRYJyPsuoPNFH", limit=50, idempotency_key="123e4567-e89b-12d3-a456-426")
# Handle response
print(res)
except models.ClientError as e:
# The base class for HTTP error responses
print(e.message)
print(e.status_code)
print(e.body)
print(e.headers)
print(e.raw_response)
# Depending on the method different errors may be thrown
if isinstance(e, models.ErrorResponse):
print(e.data.status) # int
print(e.data.title) # str
print(e.data.detail) # str
print(e.data.field) # Optional[str]
print(e.data.links) # mollie.ErrorsLinks
```
### Error Classes
**Primary errors:**
* [`ClientError`](https://github.com/mollie/mollie-api-py/blob/master/src/mollie/models/clienterror.py): The base class for HTTP error responses.
* [`ErrorResponse`](https://github.com/mollie/mollie-api-py/blob/master/src/mollie/models/errorresponse.py): An error response object. *
<details><summary>Less common errors (5)</summary>
<br />
**Network errors:**
* [`httpx.RequestError`](https://www.python-httpx.org/exceptions/#httpx.RequestError): Base class for request errors.
* [`httpx.ConnectError`](https://www.python-httpx.org/exceptions/#httpx.ConnectError): HTTP client was unable to make a request to a server.
* [`httpx.TimeoutException`](https://www.python-httpx.org/exceptions/#httpx.TimeoutException): HTTP request timed out.
**Inherit from [`ClientError`](https://github.com/mollie/mollie-api-py/blob/master/src/mollie/models/clienterror.py)**:
* [`ResponseValidationError`](https://github.com/mollie/mollie-api-py/blob/master/src/mollie/models/responsevalidationerror.py): Type mismatch between the response data and the expected Pydantic model. Provides access to the Pydantic validation error via the `cause` attribute.
</details>
\* Check [the method documentation](https://github.com/mollie/mollie-api-py/blob/master/README.md#available-resources-and-operations) to see if the error is applicable.
<!-- End Error Handling [errors] -->
<!-- Start Server Selection [server] -->
## Server Selection
### Override Server URL Per-Client
The default server can be overridden globally by passing a URL to the `server_url: str` optional parameter when initializing the SDK client instance. For example:
```python
import mollie
from mollie import ClientSDK
import os
with ClientSDK(
server_url="https://api.mollie.com/v2",
testmode=False,
security=mollie.Security(
api_key=os.getenv("CLIENT_API_KEY", ""),
),
) as client_sdk:
res = client_sdk.balances.list(currency="EUR", from_="bal_gVMhHKqSSRYJyPsuoPNFH", limit=50, idempotency_key="123e4567-e89b-12d3-a456-426")
# Handle response
print(res)
```
<!-- End Server Selection [server] -->
<!-- Start Custom HTTP Client [http-client] -->
## Custom HTTP Client
The Python SDK makes API calls using the [httpx](https://www.python-httpx.org/) HTTP library. In order to provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level configuration, you can initialize the SDK client with your own HTTP client instance.
Depending on whether you are using the sync or async version of the SDK, you can pass an instance of `HttpClient` or `AsyncHttpClient` respectively. These are Protocols that ensure the client has the necessary methods to make API calls.
This allows you to wrap the client with your own custom logic, such as adding custom headers, logging, or error handling, or you can just pass an instance of `httpx.Client` or `httpx.AsyncClient` directly.
For example, you could specify a header for every request that this SDK makes as follows:
```python
from mollie import ClientSDK
import httpx
http_client = httpx.Client(headers={"x-custom-header": "someValue"})
s = ClientSDK(client=http_client)
```
or you could wrap the client with your own custom logic:
```python
from mollie import ClientSDK
from mollie.httpclient import AsyncHttpClient
import httpx
from typing import Any, Optional, Union
class CustomClient(AsyncHttpClient):
client: AsyncHttpClient
def __init__(self, client: AsyncHttpClient):
self.client = client
async def send(
self,
request: httpx.Request,
*,
stream: bool = False,
auth: Union[
httpx._types.AuthTypes, httpx._client.UseClientDefault, None
] = httpx.USE_CLIENT_DEFAULT,
follow_redirects: Union[
bool, httpx._client.UseClientDefault
] = httpx.USE_CLIENT_DEFAULT,
) -> httpx.Response:
request.headers["Client-Level-Header"] = "added by client"
return await self.client.send(
request, stream=stream, auth=auth, follow_redirects=follow_redirects
)
def build_request(
self,
method: str,
url: httpx._types.URLTypes,
*,
content: Optional[httpx._types.RequestContent] = None,
data: Optional[httpx._types.RequestData] = None,
files: Optional[httpx._types.RequestFiles] = None,
json: Optional[Any] = None,
params: Optional[httpx._types.QueryParamTypes] = None,
headers: Optional[httpx._types.HeaderTypes] = None,
cookies: Optional[httpx._types.CookieTypes] = None,
timeout: Union[
httpx._types.TimeoutTypes, httpx._client.UseClientDefault
] = httpx.USE_CLIENT_DEFAULT,
extensions: Optional[httpx._types.RequestExtensions] = None,
) -> httpx.Request:
return self.client.build_request(
method,
url,
content=content,
data=data,
files=files,
json=json,
params=params,
headers=headers,
cookies=cookies,
timeout=timeout,
extensions=extensions,
)
s = ClientSDK(async_client=CustomClient(httpx.AsyncClient()))
```
<!-- End Custom HTTP Client [http-client] -->
<!-- Start Resource Management [resource-management] -->
## Resource Management
The `ClientSDK` class implements the context manager protocol and registers a finalizer function to close the underlying sync and async HTTPX clients it uses under the hood. This will close HTTP connections, release memory and free up other resources held by the SDK. In short-lived Python programs and notebooks that make a few SDK method calls, resource management may not be a concern. However, in longer-lived programs, it is beneficial to create a single SDK instance via a [context manager][context-manager] and reuse it across the application.
[context-manager]: https://docs.python.org/3/reference/datamodel.html#context-managers
```python
import mollie
from mollie import ClientSDK
import os
def main():
    with ClientSDK(
        testmode=False,
        security=mollie.Security(
            api_key=os.getenv("CLIENT_API_KEY", ""),
        ),
    ) as client_sdk:
        # Rest of application here...
        ...

# Or when using async:
async def amain():
    async with ClientSDK(
        testmode=False,
        security=mollie.Security(
            api_key=os.getenv("CLIENT_API_KEY", ""),
        ),
    ) as client_sdk:
        # Rest of application here...
        ...
```
<!-- End Resource Management [resource-management] -->
<!-- Start Debugging [debug] -->
## Debugging
You can set up the SDK to emit debug logs for its requests and responses by passing your own logger directly into the SDK client:
```python
from mollie import ClientSDK
import logging
logging.basicConfig(level=logging.DEBUG)
s = ClientSDK(debug_logger=logging.getLogger("mollie"))
```
You can also enable a default debug logger by setting an environment variable `CLIENT_DEBUG` to true.
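From the shell, the default-logger toggle looks like:

```shell
export CLIENT_DEBUG=true
```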
<!-- End Debugging [debug] -->
<!-- Placeholder for Future Speakeasy SDK Sections -->
# Development
## Contributions
While we value open-source contributions to this SDK, this library is generated programmatically. Any manual changes added to internal files will be overwritten on the next generation.
We look forward to hearing your feedback. Feel free to open a PR or an issue with a proof of concept and we'll do our best to include it in a future release.
### SDK Created by [Speakeasy](https://www.speakeasy.com/?utm_source=mollie-api-py&utm_campaign=python)
| text/markdown | Speakeasy | null | null | null | null | null | [
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: 3.14"
] | [] | null | null | >=3.9 | [] | [] | [] | [
"eval-type-backport>=0.2.0",
"httpx>=0.28.1",
"pydantic>=2.10.3",
"python-dateutil>=2.8.2",
"typing-inspect>=0.9.0"
] | [] | [] | [] | [] | poetry/2.2.1 CPython/3.10.19 Linux/6.8.0-1044-azure | 2026-02-19T10:24:08.829560 | mollie_api_py-1.1.7.tar.gz | 244,986 | 93/6a/8c2312ce49d4141176c74853e84cd2301b6ce7dac726a37bbdef26d46d71/mollie_api_py-1.1.7.tar.gz | source | sdist | null | false | 9fc68f31c1fbe829a2f7095c57548461 | 386fbdc679d7cb16e52300c4a6f99f39f4307520a88d17240c5b3adf56a0ed5b | 936a8c2312ce49d4141176c74853e84cd2301b6ce7dac726a37bbdef26d46d71 | null | [] | 283 |
2.4 | runcell | 0.1.19 | AI Agent for Jupyter | # runcell
[](https://github.com/ObservedObserver/d5m/actions/workflows/build.yml)
runcell is an AI agent extension for JupyterLab.
## Requirements
- JupyterLab >= 4.0.0
## Install
To install the extension, execute:
```bash
pip install runcell
```
## API Key Setup
D5M AI supports multiple AI providers. You need to set up the appropriate API keys as environment variables:
### Anthropic Claude
```bash
export ANTHROPIC_API_KEY="your_anthropic_api_key_here"
```
### Google Gemini (via Google AI Studio)
```bash
export GOOGLE_API_KEY="your_google_ai_studio_api_key_here"
```
### OpenAI
```bash
export OPENAI_API_KEY="your_openai_api_key_here"
```
**Note**: For Gemini models, make sure to use the Google AI Studio API key, not Google Cloud/Vertex AI credentials. The extension uses the `gemini/` prefix format which works with Google AI Studio without requiring additional Google Cloud dependencies.
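A quick, hypothetical sanity check (plain stdlib, not part of the extension) for which of the keys above are visible in your environment:

```python
import os

# Environment variable names taken from the provider sections above.
providers = {
    "Anthropic Claude": "ANTHROPIC_API_KEY",
    "Google Gemini": "GOOGLE_API_KEY",
    "OpenAI": "OPENAI_API_KEY",
}
configured = [name for name, var in providers.items() if os.environ.get(var)]
print("Configured providers:", configured or "none")
```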
## Uninstall
To remove the extension, execute:
```bash
pip uninstall runcell
```
## Compilation System
D5M AI includes a unified compilation system for IP protection and performance optimization. The system compiles core AI modules using Nuitka while maintaining development flexibility.
### Quick Start
```bash
# Compile all handlers for production
python compile_unified.py
# Test the compilation system
python test_unified_compilation.py
# Development mode (uses source code)
export D5M_ENV=development
# Production mode (uses compiled modules)
export D5M_ENV=production
```
For detailed information, see [COMPILATION_SYSTEM.md](COMPILATION_SYSTEM.md).
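The mode switch above is just an environment variable; a sketch of how such a toggle is typically read (the package's actual module-selection logic may differ):

```python
import os

# "production" uses the compiled modules; "development" uses source (per the docs above).
env = os.environ.get("D5M_ENV", "production")
use_compiled = env != "development"
```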
## Contributing
### Development install
Note: You will need NodeJS to build the extension package.
The `jlpm` command is JupyterLab's pinned version of
[yarn](https://yarnpkg.com/) that is installed with JupyterLab. You may use
`yarn` or `npm` in lieu of `jlpm` below.
```bash
# Clone the repo to your local environment
# Change directory to the d5m_ai directory
# Install package in development mode
pip install -e "."
# Link your development version of the extension with JupyterLab
jupyter labextension develop . --overwrite
# Rebuild extension Typescript source after making changes
jlpm build
```
You can watch the source directory and run JupyterLab at the same time in different terminals to watch for changes in the extension's source and automatically rebuild the extension.
```bash
# Watch the source directory in one terminal, automatically rebuilding when needed
jlpm watch
# Run JupyterLab in another terminal
jupyter lab
```
With the watch command running, every saved change will immediately be built locally and available in your running JupyterLab. Refresh JupyterLab to load the change in your browser (you may need to wait several seconds for the extension to be rebuilt).
By default, the `jlpm build` command generates the source maps for this extension to make it easier to debug using the browser dev tools. To also generate source maps for the JupyterLab core extensions, you can run the following command:
```bash
jupyter lab build --minimize=False
```
### Development uninstall
```bash
pip uninstall runcell
```
In development mode, you will also need to remove the symlink created by the `jupyter labextension develop`
command. To find its location, you can run `jupyter labextension list` to figure out where the `labextensions`
folder is located. Then you can remove the symlink named `runcell` within that folder.
### Testing the extension
#### Frontend tests
This extension is using [Jest](https://jestjs.io/) for JavaScript code testing.
To run them, execute:
```sh
jlpm
jlpm test
```
#### Integration tests
This extension uses [Playwright](https://playwright.dev/docs/intro) for the integration tests (aka user level tests).
More precisely, the JupyterLab helper [Galata](https://github.com/jupyterlab/jupyterlab/tree/master/galata) is used to handle testing the extension in JupyterLab.
More information is provided in the [ui-tests](./ui-tests/README.md) README.
### Packaging the extension
See [RELEASE](RELEASE.md)
### Keyboard Shortcuts
The AI chat panel can be opened using `Cmd/Ctrl+L`.
## Development
```bash
source jupyterlab-env/bin/activate
jupyter server extension enable d5m_ai
jupyter lab
```
| text/markdown | null | Elwynn Chen <elwynn.c@kanaries.net> | null | null | Copyright (c) 2025, Runcell, Kanaries Data Inc., Elwynn Chen
All rights reserved. | jupyter, jupyterlab, jupyterlab-extension | [
"Framework :: Jupyter",
"Framework :: Jupyter :: JupyterLab",
"Framework :: Jupyter :: JupyterLab :: 4",
"Framework :: Jupyter :: JupyterLab :: Extensions",
"Framework :: Jupyter :: JupyterLab :: Extensions :: Prebuilt",
"License :: OSI Approved :: BSD License",
"Programming Language :: Python",
"Prog... | [] | null | null | >=3.9 | [] | [] | [] | [
"aiohttp",
"dill",
"ipython",
"jupyter-server",
"jupyter-server-documents>=0.1.0",
"matplotlib",
"numpy",
"openai",
"pandas",
"python-dotenv",
"python-socks",
"websockets"
] | [] | [] | [] | [
"homepage, https://www.runcell.dev"
] | twine/6.2.0 CPython/3.10.9 | 2026-02-19T10:23:18.144426 | runcell-0.1.19-py3-none-any.whl | 9,267,286 | 87/ed/2c025378ec00c4a1350f5472891a426152200ba661cefb29e6b7bfa6ec70/runcell-0.1.19-py3-none-any.whl | py3 | bdist_wheel | null | false | 4567e2690be8133b65151f43cf96506e | 565f8e7ed893fadc1fa237ad48969b69770f5a8305d5d5463b9694b241746d85 | 87ed2c025378ec00c4a1350f5472891a426152200ba661cefb29e6b7bfa6ec70 | null | [
"LICENSE"
] | 178 |
2.4 | clear-skies-snyk | 2.0.6 | Snyk module for Clearskies | # clearskies-snyk
A [clearskies](https://github.com/cmancone/clearskies) module for interacting with the [Snyk API](https://docs.snyk.io/snyk-api).
This module provides pre-built models and backends for seamless integration with both the Snyk REST API and the legacy v1 API, allowing you to easily query and manage Snyk resources like organizations, projects, groups, issues, and more.
## Installation
```bash
pip install clearskies-snyk
```
Or with uv:
```bash
uv add clearskies-snyk
```
## Quick Start
### Authentication
Set up authentication using environment variables:
```bash
# Option 1: Direct API key
export SNYK_AUTH_KEY=your-snyk-api-key
# Option 2: Secret manager path (recommended for production)
export SNYK_AUTH_SECRET_PATH=/path/to/secret
```
### Basic Usage
```python
import clearskies
from clearskies_snyk.models import SnykOrg, SnykProject, SnykGroup
def my_handler(snyk_org: SnykOrg, snyk_project: SnykProject, snyk_group: SnykGroup):
"""Example handler using dependency injection."""
# List all organizations
for org in snyk_org:
print(f"Org: {org.name} ({org.slug})")
# Get projects for an organization
projects = snyk_project.where("org_id=your-org-id")
for project in projects:
print(f"Project: {project.name} - {project.project_type}")
# List groups
for group in snyk_group:
print(f"Group: {group.name}")
```
### Working with Issues
```python
import clearskies
from clearskies_snyk.models import SnykOrgIssue, SnykGroupIssue
def my_handler(snyk_org_issue: SnykOrgIssue, snyk_group_issue: SnykGroupIssue):
"""Example handler using dependency injection."""
# Get issues for an organization
org_issues = snyk_org_issue.where("org_id=your-org-id")
for issue in org_issues:
print(f"Issue: {issue.title} - Severity: {issue.effective_severity_level}")
# Get issues across a group
group_issues = snyk_group_issue.where("group_id=your-group-id")
for issue in group_issues:
print(f"Issue: {issue.title}")
```
### Using the V1 API
Some endpoints are only available through the legacy v1 API:
```python
import clearskies
from clearskies_snyk.models.v1 import SnykIntegration, SnykWebhook, SnykLicense
def my_handler(snyk_integration: SnykIntegration, snyk_webhook: SnykWebhook):
"""Example handler using dependency injection."""
# List integrations for an organization
integrations = snyk_integration.where("org_id=your-org-id")
for integration in integrations:
print(f"Integration: {integration.name} ({integration.integration_type})")
# List webhooks
webhooks = snyk_webhook.where("org_id=your-org-id")
for webhook in webhooks:
print(f"Webhook: {webhook.url}")
```
### Custom Backend Configuration
```python
import clearskies
from clearskies_snyk.backends import SnykBackend
# Custom authentication
backend = SnykBackend(
authentication=clearskies.authentication.SecretBearer(
environment_key="MY_SNYK_KEY",
header_prefix="token ",
)
)
# Custom API version
backend = SnykBackend(api_version="2024-10-15")
```
## Available Models
### REST API Models
| Category | Models |
|----------|--------|
| **Organizations** | `SnykOrg`, `SnykOrgMember`, `SnykOrgMembership`, `SnykOrgUser`, `SnykOrgInvite` |
| **Projects** | `SnykProject`, `SnykProjectHistory`, `SnykProjectIgnore`, `SnykProjectSbom` |
| **Groups** | `SnykGroup`, `SnykGroupMember`, `SnykGroupMembership`, `SnykGroupUser`, `SnykGroupOrgMembership` |
| **Issues** | `SnykOrgIssue`, `SnykGroupIssue` |
| **Policies** | `SnykOrgPolicy`, `SnykOrgPolicyEvent`, `SnykGroupPolicy` |
| **Service Accounts** | `SnykOrgServiceAccount`, `SnykGroupServiceAccount` |
| **Apps** | `SnykOrgApp`, `SnykOrgAppBot`, `SnykOrgAppInstall`, `SnykGroupAppInstall`, `SnykSelfApp` |
| **Cloud** | `SnykCloudEnvironment`, `SnykCloudResource`, `SnykCloudScan` |
| **Containers** | `SnykContainerImage`, `SnykCustomBaseImage` |
| **Settings** | `SnykOrgSettingsIac`, `SnykOrgSettingsSast`, `SnykOrgSettingsOpenSource`, `SnykGroupSettingsIac` |
| **Tenants** | `SnykTenant`, `SnykTenantMembership`, `SnykTenantRole` |
| **Other** | `SnykCollection`, `SnykTarget`, `SnykPackage`, `SnykAiBom`, `SnykLearnAssignment`, and more |
### V1 API Models
| Model | Description |
|-------|-------------|
| `SnykIntegration` | SCM and CI/CD integrations |
| `SnykIntegrationSetting` | Integration configuration settings |
| `SnykWebhook` | Webhook configurations |
| `SnykLicense` | License information |
| `SnykDependency` | Project dependencies |
| `SnykEntitlement` | Organization entitlements |
| `SnykGroupRoleV1` | Group roles (v1 format) |
| `SnykGroupSettings` | Group settings |
| `SnykGroupTag` | Group tags |
| `SnykImportJob` | Project import jobs |
## Development
To set up your development environment:
```bash
# Install uv if not already installed
pip install uv
# Create a virtual environment and install all dependencies
uv sync
# Install dev dependencies
uv pip install .[dev]
# Install pre-commit hooks
uv run pre-commit install
# Run pre-commit on all files
uv run pre-commit run --all-files
```
## Documentation
For full API documentation, visit the [Snyk API Documentation](https://docs.snyk.io/snyk-api).
## License
MIT License - see [LICENSE](LICENSE) for details.
| text/markdown | null | Tom Nijboer <tom.nijboer@cimpress.com> | null | null | null | null | [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3"
] | [] | null | null | <4.0,>=3.11 | [] | [] | [] | [
"clear-skies<3.0.0,>=2.0.37",
"dacite>=1.9.2",
"types-requests>=2.32.4; extra == \"dev\""
] | [] | [] | [] | [
"Docs, https://clearskies.info/modules/clear-skies-snyk",
"Repository, https://github.com/clearskies-py/snyk",
"Issues, https://github.com/clearskies-py/snyk/issues",
"Changelog, https://github.com/clearskies-py/snyk/blob/main/CHANGELOG.md"
] | uv/0.10.4 {"installer":{"name":"uv","version":"0.10.4","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true} | 2026-02-19T10:23:04.038560 | clear_skies_snyk-2.0.6.tar.gz | 798,466 | 93/32/0d922a52f8c0974e9ff14ef6b6a75cd6ca21974cd3b3914057c502892f5c/clear_skies_snyk-2.0.6.tar.gz | source | sdist | null | false | 93c1cfa6861efc79856372d47ed0531e | a26049686526f8b6b920cb5e906a25d7ef6a699640bdf5b83ec7d5411a229450 | 93320d922a52f8c0974e9ff14ef6b6a75cd6ca21974cd3b3914057c502892f5c | MIT | [
"LICENSE"
] | 251 |
2.4 | crewai-x402 | 0.2.1 | x402 payment protocol integration for CrewAI - enable AI agents to pay for APIs with USDC | # crewai-x402
[](https://pypi.org/project/crewai-x402/)
[](https://opensource.org/licenses/MIT)
**Enable CrewAI agents to pay for APIs with USDC using the x402 protocol.**
crewai-x402 integrates the [x402 payment protocol](https://x402.org) with CrewAI, allowing your AI agent crews to autonomously access paid APIs without managing API keys or subscriptions.
## What is x402?
x402 is the HTTP-native payment protocol that finally implements the `402 Payment Required` status code. Instead of API keys and monthly subscriptions, software pays software—per request, in USDC, with cryptographic proof.
**How it works:**
1. Agent requests a resource
2. Server returns `402 Payment Required` with price info
3. Agent signs a USDC payment authorization (EIP-3009)
4. Agent retries with payment proof
5. Server settles on-chain, returns data
All in a single HTTP round-trip.
## Installation
```bash
pip install crewai-x402
```
## Quick Start
```python
import os
from crewai import Agent, Crew, Task
from crewai_x402 import X402Wallet, X402Tool
# 1. Create a wallet with a USDC budget
wallet = X402Wallet(
private_key=os.environ["WALLET_PRIVATE_KEY"],
network="eip155:8453", # Base mainnet (CAIP-2 format)
budget_usd=10.00
)
# 2. Create the payment tool
payment_tool = X402Tool(wallet=wallet)
# 3. Add to your agent
researcher = Agent(
role="Research Analyst",
goal="Find accurate data using any available source",
backstory="You have access to both free and paid APIs.",
tools=[payment_tool],
)
# 4. Agent can now pay for premium data
task = Task(
description="Get the premium analysis from https://sandbox.agentrails.io/api/x402/protected/analysis",
agent=researcher,
)
crew = Crew(agents=[researcher], tasks=[task])
result = crew.kickoff()
```
## Try It with the Sandbox
The [AgentRails Sandbox](https://sandbox.agentrails.io) is a free test environment with x402-protected endpoints you can hit right away. No signup required to see the 402 flow in action.
### 1. Check available endpoints and pricing
```bash
curl https://sandbox.agentrails.io/api/x402/pricing
```
```json
{
"endpoints": [
{ "resource": "/api/x402/protected/analysis", "amountUsdc": 0.01 },
{ "resource": "/api/x402/protected/data", "amountUsdc": 0.001 }
],
"supportedNetworks": [
"eip155:5042002", "eip155:84532", "eip155:11155111",
"eip155:8453", "eip155:1"
],
"payTo": "0x6255d8dd3f84ec460fc8b07db58ab06384a2f487"
}
```
### 2. See a 402 response
```bash
curl -i https://sandbox.agentrails.io/api/x402/protected/analysis
# → 402 Payment Required
# → PAYMENT-REQUIRED: <base64-encoded payment requirements>
```
### 3. Point your crew at the sandbox
```python
wallet = X402Wallet(
private_key=os.environ["WALLET_PRIVATE_KEY"],
network="eip155:84532", # Base Sepolia testnet (CAIP-2 format)
budget_usd=1.00
)
payment_tool = X402Tool(wallet=wallet)
researcher = Agent(
role="Research Analyst",
goal="Gather data from premium APIs",
tools=[payment_tool],
)
task = Task(
description="Get the analysis from https://sandbox.agentrails.io/api/x402/protected/analysis",
agent=researcher,
)
crew = Crew(agents=[researcher], tasks=[task])
result = crew.kickoff()
```
### Sandbox Endpoints
| Endpoint | Cost | Description |
|----------|------|-------------|
| `GET /api/x402/protected/analysis` | $0.01 USDC | AI analysis (premium) |
| `GET /api/x402/protected/data` | $0.001 USDC | Data endpoint (micropayment) |
| `GET /api/x402/pricing` | Free | Pricing for all protected endpoints |
| `GET /api/x402/stats` | Free | Payment statistics |
Full API reference: [sandbox.agentrails.io/swagger](https://sandbox.agentrails.io/swagger)
## Features
### Automatic Payment Handling
The `X402Tool` automatically detects 402 responses and handles payment negotiation:
```python
tool = X402Tool(
wallet=wallet,
auto_pay=True, # Automatically pay when within budget
timeout=30.0, # Request timeout in seconds
)
```
### Budget Control
Set spending limits at the wallet level:
```python
wallet = X402Wallet(
private_key=key,
network="eip155:8453",
budget_usd=5.00 # Crew can't spend more than $5
)
# Check remaining budget
print(f"Remaining: ${wallet.remaining_usd}")
# Check if can afford a specific amount
if wallet.can_afford(0.01):
print("Can afford $0.01 request")
```
### Per-Request Price Limits
Limit how much an agent can pay for a single request:
```python
# When calling the tool directly
result = tool._run(
url="https://sandbox.agentrails.io/api/x402/protected/analysis",
max_price_usd=0.05 # Won't pay more than $0.05 for this request
)
```
### Payment History
Track all payments made by the wallet:
```python
for payment in wallet.payments:
print(f"{payment.resource_url}: ${payment.amount_usd}")
# Get summary
summary = wallet.get_payment_summary()
print(f"Total spent: ${summary['spent_usd']}")
print(f"Payments made: {summary['payment_count']}")
```
### Multi-Network Support
Supports multiple EVM networks using [CAIP-2](https://github.com/ChainAgnostic/CAIPs/blob/main/CAIPs/caip-2.md) identifiers:
```python
# Base (recommended - low fees)
wallet = X402Wallet(private_key=key, network="eip155:8453")
# Ethereum
wallet = X402Wallet(private_key=key, network="eip155:1")
# Testnets
wallet = X402Wallet(private_key=key, network="eip155:84532") # Base Sepolia
wallet = X402Wallet(private_key=key, network="eip155:5042002") # Arc testnet
```
> Legacy network names (`base-mainnet`, `base-sepolia`, etc.) are still accepted for backwards compatibility.
## API Reference
### X402Wallet
```python
X402Wallet(
private_key: str, # Hex-encoded private key
network: str, # CAIP-2 network ID (e.g., "eip155:8453")
budget_usd: float, # Maximum USD to spend
)
```
**Properties:**
- `address` - Wallet address
- `spent_usd` - Total USD spent
- `remaining_usd` - Remaining budget
- `payments` - List of PaymentRecord objects
**Methods:**
- `can_afford(amount_usd)` - Check if budget allows payment
- `sign_payment(to, amount, valid_before)` - Sign EIP-3009 authorization
- `get_payment_summary()` - Get spending summary dict
- `reset_budget(new_budget)` - Reset budget and clear history
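The budget bookkeeping implied by these properties and methods can be modeled in a few lines (an illustration of the semantics, not the library's implementation):

```python
from dataclasses import dataclass, field

@dataclass
class BudgetLedger:
    """Toy model of X402Wallet's spend tracking (illustration only)."""
    budget_usd: float
    payments: list = field(default_factory=list)  # USD amounts paid so far

    @property
    def spent_usd(self) -> float:
        return sum(self.payments)

    @property
    def remaining_usd(self) -> float:
        return self.budget_usd - self.spent_usd

    def can_afford(self, amount_usd: float) -> bool:
        return amount_usd <= self.remaining_usd

ledger = BudgetLedger(budget_usd=5.00)
ledger.payments.append(0.01)  # one $0.01 request
```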
### X402Tool
```python
X402Tool(
wallet: X402Wallet, # Wallet for payments
auto_pay: bool = True, # Auto-pay when within budget
timeout: float = 30.0, # HTTP timeout
)
```
**Tool Input Fields:**
- `url` (required) - URL to request
- `method` - HTTP method (default: "GET")
- `body` - Request body
- `headers` - Additional headers
- `max_price_usd` - Per-request price limit
## Networks
V2 uses [CAIP-2](https://github.com/ChainAgnostic/CAIPs/blob/main/CAIPs/caip-2.md) network identifiers:
| Network ID (CAIP-2) | Chain ID | Environment | Legacy Alias |
|---------------------|----------|-------------|-------------|
| `eip155:8453` | 8453 | Production | `base-mainnet` |
| `eip155:84532` | 84532 | Testnet | `base-sepolia` |
| `eip155:1` | 1 | Production | `ethereum-mainnet` |
| `eip155:11155111` | 11155111 | Testnet | `ethereum-sepolia` |
| `eip155:5042002` | 5042002 | Testnet | `arc-testnet` |
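A CAIP-2 identifier is just `namespace:reference`, so mapping the table above back to raw chain IDs is trivial:

```python
def parse_caip2(network_id: str) -> tuple[str, int]:
    """Split a CAIP-2 identifier like 'eip155:8453' into (namespace, chain_id)."""
    namespace, _, reference = network_id.partition(":")
    return namespace, int(reference)

# The network IDs from the table above.
networks = ["eip155:8453", "eip155:84532", "eip155:1", "eip155:11155111", "eip155:5042002"]
chain_ids = [parse_caip2(n)[1] for n in networks]
```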
## Security Considerations
1. **Never commit private keys** - Use environment variables or secret managers
2. **Set appropriate budgets** - Limit what crews can spend
3. **Use testnets first** - Test with `eip155:84532` (Base Sepolia) before mainnet
4. **Monitor spending** - Check `wallet.get_payment_summary()` regularly
## Examples
See the [examples/](examples/) directory:
- `research_crew.py` - Research crew with payment capability
## How It Differs From API Keys
| API Keys | x402 |
|----------|------|
| 1 key per service | 1 wallet for all services |
| Monthly subscriptions | Pay per request |
| Human signup required | Zero onboarding |
| Credential rotation | No credentials to leak |
| Service-level limits | Agent-level budgets |
## Related Packages
- [langchain-x402](https://pypi.org/project/langchain-x402/) - x402 integration for LangChain
## Resources
- [x402 Protocol Spec](https://x402.org)
- [AgentRails Documentation](https://agentrails.io/docs)
- [AgentRails Swagger (Sandbox)](https://sandbox.agentrails.io/swagger)
- [EIP-3009 Specification](https://eips.ethereum.org/EIPS/eip-3009)
- [CrewAI Documentation](https://docs.crewai.com)
## License
MIT License - see [LICENSE](LICENSE) for details.
## Contributing
Contributions welcome! Please read our contributing guidelines and submit PRs to the [GitHub repository](https://github.com/kmatthewsio/crewai-x402).
| text/markdown | null | AgentRails <dev@agentrails.io> | null | null | null | ai-agents, api-monetization, crewai, crypto, payments, usdc, x402 | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Software Develop... | [] | null | null | >=3.10 | [] | [] | [] | [
"crewai>=0.10.0",
"eth-account>=0.10.0",
"eth-typing>=3.0.0",
"httpx>=0.25.0",
"pydantic>=2.0.0",
"mypy>=1.0.0; extra == \"dev\"",
"pytest-asyncio>=0.21.0; extra == \"dev\"",
"pytest-httpx>=0.22.0; extra == \"dev\"",
"pytest>=7.0.0; extra == \"dev\"",
"ruff>=0.1.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://agentrails.io",
"Documentation, https://agentrails.io/docs",
"Repository, https://github.com/kmatthewsio/crewai-x402",
"Issues, https://github.com/kmatthewsio/crewai-x402/issues"
] | twine/6.2.0 CPython/3.14.2 | 2026-02-19T10:22:26.068902 | crewai_x402-0.2.1.tar.gz | 15,803 | ac/0f/9f1df4e3e81e6378ee3fe6907e523a41b2814accdd4a192cc323089fe2e9/crewai_x402-0.2.1.tar.gz | source | sdist | null | false | bd977fd532a7ef5aa33df5e97fac53ea | e4cc683a7e16e9c3a9d7c8a77ea9106513d95d38819ba11938b52f5b5ed46ebd | ac0f9f1df4e3e81e6378ee3fe6907e523a41b2814accdd4a192cc323089fe2e9 | MIT | [
"LICENSE"
] | 245 |
2.4 | langchain-x402 | 0.2.1 | x402 payment protocol integration for LangChain - enable AI agents to pay for APIs with USDC | # langchain-x402
[](https://pypi.org/project/langchain-x402/)
[](https://opensource.org/licenses/MIT)
**Enable AI agents to pay for APIs with USDC using the x402 protocol.**
langchain-x402 integrates the [x402 payment protocol](https://x402.org) with LangChain, allowing your AI agents to autonomously access paid APIs without managing API keys or subscriptions.
## What is x402?
x402 is the HTTP-native payment protocol that finally implements the `402 Payment Required` status code. Instead of API keys and monthly subscriptions, software pays software—per request, in USDC, with cryptographic proof.
**How it works:**
1. Agent requests a resource
2. Server returns `402 Payment Required` with price info
3. Agent signs a USDC payment authorization (EIP-3009)
4. Agent retries with payment proof
5. Server settles on-chain, returns data
All in a single HTTP round-trip.
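The round-trip above can be sketched as plain control flow. This is an illustrative model of the client's retry logic, not langchain-x402's actual implementation (the real tool also checks prices against the budget and signs genuine EIP-3009 payloads); `fetch`, `sign_authorization`, and the header name are stand-ins:

```python
# Illustrative sketch of the 402 -> sign -> retry flow described above.
# `fetch` and `sign_authorization` are injected stand-ins, not library calls.

def pay_per_request(fetch, sign_authorization, url):
    response = fetch(url, headers={})                 # 1. request the resource
    if response["status"] != 402:
        return response                               # free, or already paid
    requirements = response["payment_required"]       # 2. server's price info
    proof = sign_authorization(requirements)          # 3. sign authorization
    return fetch(url, headers={"X-PAYMENT": proof})   # 4+5. retry with proof

# Toy server: demands payment once, then serves the data.
def toy_fetch(url, headers):
    if "X-PAYMENT" not in headers:
        return {"status": 402, "payment_required": {"amount_usd": 0.01}}
    return {"status": 200, "body": "premium data"}

result = pay_per_request(toy_fetch, lambda req: "signed-proof",
                         "https://example.com/api")
print(result["status"])  # 200
```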
## Installation
```bash
pip install langchain-x402
```
## Quick Start
```python
import os
from langchain_x402 import X402Wallet, X402PaymentTool
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_react_agent
# 1. Create a wallet with a USDC budget
wallet = X402Wallet(
private_key=os.environ["WALLET_PRIVATE_KEY"],
network="eip155:8453", # Base mainnet (CAIP-2 format)
budget_usd=10.00
)
# 2. Create the payment tool
tool = X402PaymentTool(wallet=wallet)
# 3. Add to your agent
llm = ChatOpenAI(model="gpt-4o")
agent = create_react_agent(llm, tools=[tool], prompt=your_prompt)
# your_prompt: your ReAct prompt template
executor = AgentExecutor(agent=agent, tools=[tool])
# 4. Agent can now access any x402-enabled API
result = executor.invoke({
"input": "Get the premium analysis from https://sandbox.agentrails.io/api/x402/protected/analysis"
})
```
## Try It with the Sandbox
The [AgentRails Sandbox](https://sandbox.agentrails.io) is a free test environment with x402-protected endpoints you can hit right away. No signup required to see the 402 flow in action.
### 1. Check available endpoints and pricing
```bash
curl https://sandbox.agentrails.io/api/x402/pricing
```
```json
{
"endpoints": [
{ "resource": "/api/x402/protected/analysis", "amountUsdc": 0.01 },
{ "resource": "/api/x402/protected/data", "amountUsdc": 0.001 }
],
"supportedNetworks": [
"eip155:5042002", "eip155:84532", "eip155:11155111",
"eip155:8453", "eip155:1"
],
"payTo": "0x6255d8dd3f84ec460fc8b07db58ab06384a2f487"
}
```
### 2. See a 402 response
```bash
curl -i https://sandbox.agentrails.io/api/x402/protected/analysis
# → 402 Payment Required
# → PAYMENT-REQUIRED: <base64-encoded payment requirements>
```
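The header value is base64-encoded payment requirements, so decoding it is a one-liner. The payload shape below is assumed for illustration only (field names borrowed from the pricing response above), not taken verbatim from the x402 spec:

```python
import base64
import json

# Encode a made-up requirements object the way a server might...
requirements = {
    "amountUsdc": 0.01,
    "payTo": "0x6255d8dd3f84ec460fc8b07db58ab06384a2f487",
    "network": "eip155:84532",
}
header_value = base64.b64encode(json.dumps(requirements).encode()).decode()

# ...and decode it the way a client would on seeing a 402.
decoded = json.loads(base64.b64decode(header_value))
print(decoded["amountUsdc"])  # 0.01
```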
### 3. Point your agent at the sandbox
```python
wallet = X402Wallet(
private_key=os.environ["WALLET_PRIVATE_KEY"],
network="eip155:84532", # Base Sepolia testnet (CAIP-2 format)
budget_usd=1.00
)
tool = X402PaymentTool(wallet=wallet)
# The tool handles the 402 → sign → retry flow automatically
result = tool.invoke({
"url": "https://sandbox.agentrails.io/api/x402/protected/analysis"
})
```
### Sandbox Endpoints
| Endpoint | Cost | Description |
|----------|------|-------------|
| `GET /api/x402/protected/analysis` | $0.01 USDC | AI analysis (premium) |
| `GET /api/x402/protected/data` | $0.001 USDC | Data endpoint (micropayment) |
| `GET /api/x402/pricing` | Free | Pricing for all protected endpoints |
| `GET /api/x402/stats` | Free | Payment statistics |
Full API reference: [sandbox.agentrails.io/swagger](https://sandbox.agentrails.io/swagger)
## Features
### Automatic Payment Handling
The `X402PaymentTool` automatically detects 402 responses and handles payment negotiation:
```python
tool = X402PaymentTool(
wallet=wallet,
auto_pay=True, # Automatically pay when within budget
timeout=30.0, # Request timeout in seconds
)
```
### Budget Control
Set spending limits at the wallet level:
```python
wallet = X402Wallet(
private_key=key,
network="eip155:8453",
budget_usd=5.00 # Agent can't spend more than $5
)
# Check remaining budget
print(f"Remaining: ${wallet.remaining_usd}")
# Check if can afford a specific amount
if wallet.can_afford(0.01):
print("Can afford $0.01 request")
```
### Per-Request Price Limits
Limit how much an agent can pay for a single request:
```python
# In the tool input
result = tool.invoke({
"url": "https://sandbox.agentrails.io/api/x402/protected/analysis",
"max_price_usd": 0.05 # Won't pay more than $0.05 for this request
})
```
### Payment History
Track all payments made by the wallet:
```python
for payment in wallet.payments:
print(f"{payment.resource_url}: ${payment.amount_usd}")
# Get summary
summary = wallet.get_payment_summary()
print(f"Total spent: ${summary['spent_usd']}")
print(f"Payments made: {summary['payment_count']}")
```
### Multi-Network Support
Supports multiple EVM networks using [CAIP-2](https://github.com/ChainAgnostic/CAIPs/blob/main/CAIPs/caip-2.md) identifiers:
```python
# Base (recommended - low fees)
wallet = X402Wallet(private_key=key, network="eip155:8453")
# Ethereum
wallet = X402Wallet(private_key=key, network="eip155:1")
# Testnets
wallet = X402Wallet(private_key=key, network="eip155:84532") # Base Sepolia
wallet = X402Wallet(private_key=key, network="eip155:5042002") # Arc testnet
```
> Legacy network names (`base-mainnet`, `base-sepolia`, etc.) are still accepted for backwards compatibility.
## API Reference
### X402Wallet
```python
X402Wallet(
private_key: str, # Hex-encoded private key
network: str, # CAIP-2 network ID (e.g., "eip155:8453")
budget_usd: float, # Maximum USD to spend
)
```
**Properties:**
- `address` - Wallet address
- `spent_usd` - Total USD spent
- `remaining_usd` - Remaining budget
- `payments` - List of PaymentRecord objects
**Methods:**
- `can_afford(amount_usd)` - Check if budget allows payment
- `sign_payment(to, amount, valid_before)` - Sign EIP-3009 authorization
- `get_payment_summary()` - Get spending summary dict
- `reset_budget(new_budget)` - Reset budget and clear history
### X402PaymentTool
```python
X402PaymentTool(
wallet: X402Wallet, # Wallet for payments
auto_pay: bool = True, # Auto-pay when within budget
timeout: float = 30.0, # HTTP timeout
)
```
**Tool Input Schema:**
```python
{
"url": str, # Required: URL to request
"method": str = "GET", # HTTP method
"body": str | None, # Request body
"headers": dict | None, # Additional headers
"max_price_usd": float | None, # Per-request price limit
}
```
## Networks
V2 uses [CAIP-2](https://github.com/ChainAgnostic/CAIPs/blob/main/CAIPs/caip-2.md) network identifiers:
| Network ID (CAIP-2) | Chain ID | Environment | Legacy Alias |
|---------------------|----------|-------------|-------------|
| `eip155:8453` | 8453 | Production | `base-mainnet` |
| `eip155:84532` | 84532 | Testnet | `base-sepolia` |
| `eip155:1` | 1 | Production | `ethereum-mainnet` |
| `eip155:11155111` | 11155111 | Testnet | `ethereum-sepolia` |
| `eip155:5042002` | 5042002 | Testnet | `arc-testnet` |
## Security Considerations
1. **Never commit private keys** - Use environment variables or secret managers
2. **Set appropriate budgets** - Limit what agents can spend
3. **Use testnets first** - Test with `eip155:84532` (Base Sepolia) before mainnet
4. **Monitor spending** - Check `wallet.get_payment_summary()` regularly
## Examples
See the [examples/](examples/) directory:
- `basic_agent.py` - Simple ReAct agent with payment capability
- `multi_api.py` - Agent accessing multiple paid APIs
## How It Differs From API Keys
| API Keys | x402 |
|----------|------|
| 1 key per service | 1 wallet for all services |
| Monthly subscriptions | Pay per request |
| Human signup required | Zero onboarding |
| Credential rotation | No credentials to leak |
| Service-level limits | Agent-level budgets |
## Related Packages
- [crewai-x402](https://pypi.org/project/crewai-x402/) - x402 integration for CrewAI
## Resources
- [x402 Protocol Spec](https://x402.org)
- [AgentRails Documentation](https://agentrails.io/docs)
- [AgentRails Swagger (Sandbox)](https://sandbox.agentrails.io/swagger)
- [EIP-3009 Specification](https://eips.ethereum.org/EIPS/eip-3009)
- [LangChain Custom Tools](https://python.langchain.com/docs/modules/tools/custom_tools)
## License
MIT License - see [LICENSE](LICENSE) for details.
## Contributing
Contributions welcome! Please read our contributing guidelines and submit PRs to the [GitHub repository](https://github.com/kmatthewsio/langchain-x402).
| text/markdown | null | AgentRails <dev@agentrails.io> | null | null | null | ai-agents, api-monetization, crypto, langchain, payments, usdc, x402 | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Software Develop... | [] | null | null | >=3.10 | [] | [] | [] | [
"eth-account>=0.10.0",
"eth-typing>=3.0.0",
"httpx>=0.25.0",
"langchain-core>=0.1.0",
"pydantic>=2.0.0",
"mypy>=1.0.0; extra == \"dev\"",
"pytest-asyncio>=0.21.0; extra == \"dev\"",
"pytest-httpx>=0.22.0; extra == \"dev\"",
"pytest>=7.0.0; extra == \"dev\"",
"ruff>=0.1.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://agentrails.io",
"Documentation, https://agentrails.io/docs",
"Repository, https://github.com/kmatthewsio/langchain-x402",
"Issues, https://github.com/kmatthewsio/langchain-x402/issues"
] | twine/6.2.0 CPython/3.14.2 | 2026-02-19T10:22:21.814158 | langchain_x402-0.2.1.tar.gz | 14,919 | 36/b7/c612e02890802d86dc6e677043f74d1d4b8e43b3f7c6a2d9f7eb1636f1f7/langchain_x402-0.2.1.tar.gz | source | sdist | null | false | 182247d6d3e8d4f6f744118d5ac44d42 | 1fa5bc6092562f3f50e6dd07cc84fa5993328e7e5e2d4308116e91a13d01e662 | 36b7c612e02890802d86dc6e677043f74d1d4b8e43b3f7c6a2d9f7eb1636f1f7 | MIT | [
"LICENSE"
] | 234 |
2.4 | arkparse | 0.4.1 | A package to parse and modify ark save files | # ArkParse: A Python Library for Reading and Modifying ARK Save Files
**ArkParse** is a Python library designed for **ARK: Survival Ascended** players, server administrators, and modders. This library enables you to read, analyze, and modify ARK save files with an intuitive API. With ArkParse, you can access detailed information about players, structures, equipment, dinosaurs, and more, enabling powerful tools for automation, analysis, and customization.
## Introduction
Hi everyone,
I originally created this package to manage a private ARK server I started with a few friends. What began as a small project quickly grew into something much bigger than I expected!
The foundation for this work was built on the awesome efforts of [Kakoen](https://github.com/Kakoen) and their contributors, whose Java-based save file property parsing tools were a fantastic starting point. You can check out their work here: [Kakoen's ark-sa-save-tools](https://github.com/Kakoen/ark-sa-save-tools). However, since I'm more comfortable with Python, I decided to start my own package and expand on it significantly.
The package has grown into a pretty expansive set of tools that can retrieve nearly everything in the save files in a simple, object-oriented way.
I mainly use this package for server management tasks. Some highlights include:
- Automatically changing server passwords to control when players can log in.
- A voting system to reveal dino and base locations.
- Sending random server stats to the chat.
- Monitoring player activity, such as when players log on and off.
- Randomly spawning bases with random loot for my friends to raid; probably my favorite feature (and the most complicated)
If you're curious or want to explore the features yourself, you can find the project here: [ark-server-manager](https://github.com/VincentHenauGithub/ark-server-manager).
Hope you find it useful or inspiring or both! 😊
## Discord
If you use the library a lot or want to chat about functionalities, I made a discord for that, which you can join here: [discord](https://discord.gg/cStrkZVzFE).
## Disclaimer
I'm not a professional Python programmer, so if you come across anything that could be done better, please bear with me, or, feel free to contribute! 😉
Secondly, the package is not fully complete; some elements are missing, such as blueprints in the Classes section, formulas for calculating coordinates for maps other than Aberration, and more. However, I hope the package is designed in a way that makes it relatively easy for you to add what you need yourself.
Last, I've never made an open source package like this so if I'm doing something wrong or don't know some general rules of thumb, feel free to tell me!
I just hope it's useful for someone!
---
## Features
- **Player API**: Retrieve player and tribe data, including inventory details.
- **Structure API**: Analyze and filter structures by location, owner, and other criteria, create heatmaps and more...
- **Equipment API**: Explore equipment, armor, and saddles. Retrieve blueprints or create and insert custom items.
- **Dino API**: Analyze dino data, generate heatmaps, find specific dinos, or track stats like mutations and levels.
- **Base API**: Export and import entire bases for custom scenarios.
- **Stackable API**: Simple API for parsing basic resources, ammo, structure items and such...
- **Json API**: Simple API for exporting data as JSON.
- **General Tools**: Create custom save file content or perform bulk modifications.
---
## Installation
Install via pip:
```bash
pip install arkparse
```
Or install it locally so you can make your own modifications. First clone the repo, then run the following in the repo directory:
```bash
pip install -e .
```
---
### 4. **Quickstart**
There are quite a few examples under the examples folder, organized by API. These should help you get going with most of the package's functionality; some of them are listed below.
#### a. **Player API: Retrieve Player and Inventory Information**
```python
from arkparse import AsaSave
from arkparse.enums import ArkMap
from arkparse.ftp.ark_ftp_client import ArkFtpClient
from arkparse.api.player_api import PlayerApi
from arkparse.object_model.misc.inventory import Inventory
player_api = PlayerApi('../../ftp_config.json', ArkMap.ABERRATION)
save = AsaSave(contents=ArkFtpClient.from_config('../../ftp_config.json', ArkMap.ABERRATION).download_save_file())
for player in player_api.players:
inventory: Inventory = player_api.get_player_inventory(player, save)
print(player)
print(f"{player.name}'s inventory:")
print(inventory)
print("\n")
```
---
#### b. **Structure API: Analyze Structures and Generate Heatmaps**
Retrieve and filter structures by owner, location, or type. Generate heatmaps for visualization and analysis.
```python
from pathlib import Path
from uuid import UUID
from typing import Dict
from arkparse import AsaSave, Classes
from arkparse.api import StructureApi
from arkparse.ftp import ArkFtpClient
from arkparse.enums import ArkMap
from arkparse.object_model.structures import StructureWithInventory
# retrieve the save file (can also retrieve it from a local path)
save_path = ArkFtpClient.from_config(Path("../../ftp_config.json"), ArkMap.ABERRATION).download_save_file(Path.cwd())
save = AsaSave(save_path)
structure_api = StructureApi(save)
owning_tribe = 0 # add the tribe id here (check the player api examples to see how to get the tribe id)
vaults: Dict[UUID, StructureWithInventory] = structure_api.get_by_class([Classes.structures.placed.utility.vault])
vaults_owned_by = [v for v in vaults.values() if v.owner.tribe_id == owning_tribe]
print(f"Vaults owned by tribe {owning_tribe}:")
for v in vaults_owned_by:
print(v)
```
---
#### c. **Equipment API: Manage Equipment and Blueprints**
```python
from pathlib import Path
from uuid import UUID
from typing import Dict
from arkparse.object_model.equipment.weapon import Weapon
from arkparse.saves.asa_save import AsaSave
from arkparse.ftp.ark_ftp_client import ArkFtpClient, ArkMap
from arkparse.api.equipment_api import EquipmentApi
from arkparse.classes.equipment import Weapons
# Retrieve save file
save_path = ArkFtpClient.from_config(Path("../../ftp_config.json"), ArkMap.ABERRATION).download_save_file(Path.cwd())
save = AsaSave(save_path)
equipment_api = EquipmentApi(save)
# Get all longneck blueprints
weapons: Dict[UUID, Weapon] = equipment_api.get_filtered(
EquipmentApi.Classes.WEAPON,
classes=[Weapons.advanced.longneck],
only_blueprints=True
)
highest_dmg_bp = max(weapons.values(), key=lambda x: x.damage)
print(f"Highest damage on longneck bp: {highest_dmg_bp.damage}")
```
---
#### d. **Dino API: Analyze and Find Dinosaurs**
```python
from pathlib import Path
from arkparse.api.dino_api import DinoApi
from arkparse.enums import ArkMap
from arkparse.saves.asa_save import AsaSave
from arkparse.object_model.dinos.tamed_dino import TamedDino
save_path = Path.cwd() / "Aberration_WP.ark" # Replace with path to your save file
save = AsaSave(save_path)
dino_api = DinoApi(save)
dinos = dino_api.get_all_tamed()
if dinos is None:
print("No tamed dinos found")
exit()
# Find the tamed dino with the most mutations
most_mutations: TamedDino = max(dinos.values(), key=lambda d: d.stats.get_total_mutations())
print(f"The dino with the most mutations is a {most_mutations.get_short_name()} with {int(most_mutations.stats.get_total_mutations())} mutations")
print(f"Location: {most_mutations.location.as_map_coords(ArkMap.ABERRATION)}")
print(f"Level: {most_mutations.stats.current_level}")
print(f"Owner: {most_mutations.owner}")
```
---
#### e. **JSON API: Export parsed data as JSON**
```python
from pathlib import Path
from arkparse import AsaSave
from arkparse.api.json_api import JsonApi
save_path = Path.cwd() / "Ragnarok_WP.ark" # replace with path to your save file
save = AsaSave(save_path) # loads save file
json_api = JsonApi(save) # initializes the JSON API
json_api.export_items() # exports items to JSON
```
## Contributing
I welcome contributions! If you have updates to this library that you would like to share, feel free!
Special thanks go to [O-S Marin](https://github.com/K07H) for many contributions to the library!
Check out his ArkParse-powered save visualizer, [ASI (Ark-Save-Inspector)](https://github.com/K07H/ASA-Save-Inspector). Spoiler alert: it's pretty awesome 😊
---
## License
This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
---
## Feedback & Support
- **Issues or Feature Requests**: Open an issue on the repo!
- **Help**: If you need help with something specific, you can always message me and I will try to help you out.
## Donation
If you really really love this package you can [donate here](https://www.paypal.com/donate/?hosted_button_id=BV63CTDUW7PKQ)
There is no need, but I also won't say no 😊
| text/markdown | null | Vincent Henau <vincent.henau.github@gmail.com> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.9 | [] | [] | [] | [
"matplotlib>=3.10.3",
"numpy>=2.3.2",
"pandas>=2.3.1",
"rcon>=2.4.9"
] | [] | [] | [] | [
"Homepage, https://github.com/VincentHenauGithub/ark-save-parser"
] | twine/6.2.0 CPython/3.9.25 | 2026-02-19T10:20:19.752684 | arkparse-0.4.1.tar.gz | 7,536,738 | 12/37/d45e6cc8456496e20fcfe35a7dc72f590197eed4bf35fb0441dbf74cdea0/arkparse-0.4.1.tar.gz | source | sdist | null | false | 17b99ed795297ccc3f98509054d78e80 | 0925d7fb23289a5f86170e957a607897feb0c8cb9bb77633998319331aaf08b4 | 1237d45e6cc8456496e20fcfe35a7dc72f590197eed4bf35fb0441dbf74cdea0 | null | [
"LICENSE"
] | 243 |
2.4 | pylife-odbserver | 2.2.1 | A server for odbAccess to be acessed by pylife-odbclient | # pylife-odbserver
A server for odbAccess to be accessed by pylife-odbclient
## Purpose
Unfortunately, Abaqus usually comes with an outdated Python engine, so you
can't access an Abaqus odb file from within modern Python code using the latest
packages. This package is the server part of a client/server setup that
makes odb files accessible from within Python code using a current Python
version in a transparent way.
## Solution
This package provides a slim server, written as Python software that runs on
the old Python versions shipped with Abaqus (even 2.7), so it can be run
inside the Abaqus Python engine. It accepts commands via `sys.stdin` and,
according to the command, queries data from the `odbAccess` interface and
returns it as a pickled object.
The sibling package `pylife-odbclient` comes with a python class `OdbClient`
that spawns the server in the background when an instance of `OdbClient` is
instantiated. Then the client object can be used to transparently access data
from the odb file via the server. Once the client object goes out of scope
i.e. is deleted, the server process is stopped automatically.
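The command/pickle pattern can be sketched as follows. This is an illustrative model only — the real server runs under Abaqus' own Python (possibly 2.7) and queries `odbAccess`; the command names and handlers below are made up:

```python
import pickle

# Hypothetical handlers standing in for real odbAccess queries.
HANDLERS = {
    "get_instances": lambda: ["PART-1-1", "PART-2-1"],
    "get_steps": lambda: ["Step-1", "Step-2"],
}

def handle_command(command: str) -> bytes:
    """Dispatch a text command and return the result as a pickle blob."""
    result = HANDLERS[command.strip()]()
    return pickle.dumps(result)

# Client side: unpickle the reply the "server" sent back.
reply = pickle.loads(handle_command("get_steps"))
print(reply)  # ['Step-1', 'Step-2']
```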
## Installation
* Create and activate a plain python-2.7 environment without additional
packages. For example by
```
conda create -n odbserver python=2.7 pip
```
Instead of `2.7`, choose the Python version that matches your Abaqus
installation. You can find it out by running
```
abaqus python --version
```
* Run
```
pip install pylife-odbserver
```
* Set environment variables (optional)
The ``odbclient`` will look for your Abaqus binary in
* ``C:/Program Files/SIMULIA/<release_year>/EstProducts/win_b64/code/bin/SMALauncher.exe``
and for the above Python environment in:
* ``<repo_root>/.venv-odbserver``
* ``C:/Users/<yourname>/.conda/envs/odbserver``
* ``C:/Users/<yourname>/.virtualenvs/odbserver``
If your paths are different, you must either specify them each time you run the ``odbclient``, or you can set them as environment variables as follows:
```powershell
[Environment]::SetEnvironmentVariable("ODBSERVER_ABAQUS_BIN", "<absolute/path/to/your/abq.exe>", "User")
[Environment]::SetEnvironmentVariable("ODBSERVER_PYTHON_ENV_PATH", "<absolute/path/to/the/above/python/env>", "User")
```
You can check the above with:
```powershell
[Environment]::GetEnvironmentVariable("ODBSERVER_ABAQUS_BIN", "User")
[Environment]::GetEnvironmentVariable("ODBSERVER_PYTHON_ENV_PATH", "User")
```
* See the <a href="../odbclient/">instructions in `pylife-odbclient`</a> on how
to install the client.
| text/markdown; charset=UTF-8 | Johannes Mueller | johannes.mueller4@de.bosch.com | null | null | Apache-2 | null | [
"Programming Language :: Python"
] | [] | http://github.com/boschresearch/pylife | null | null | [] | [] | [] | [] | [] | [] | [] | [] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:20:07.291601 | pylife_odbserver-2.2.1.tar.gz | 13,361 | 12/bd/9f0e5a59bc6bd0b84a8e92aae472ad8b3256217f8cc47119bc95269f932b/pylife_odbserver-2.2.1.tar.gz | source | sdist | null | false | da2d8038a86d1a815c9d59c0c9e4b5b7 | b5ea87331e29c724f8fe7bd8464cfd2003571af2d9b5560e1160e23363766554 | 12bd9f0e5a59bc6bd0b84a8e92aae472ad8b3256217f8cc47119bc95269f932b | null | [
"LICENSE",
"AUTHORS.rst"
] | 236 |
2.4 | rfdetr-plus | 1.0.1 | RF-DETR+ Extension Package | # RF-DETR+: Large-Scale Detection Models for RF-DETR
[](https://badge.fury.io/py/rfdetr-plus)
[](https://pypistats.org/packages/rfdetr-plus)
[](https://badge.fury.io/py/rfdetr-plus)
[](https://github.com/roboflow/rf-detr-plus/blob/main/LICENSE)
[](https://discord.gg/GbfgXGJ8Bk)
RF-DETR is the core package in the ecosystem. It provides the full training and inference stack, the {Nano, Small, Medium, Large} model lineup, and the APIs most users build on. RF-DETR+ is an extension package for [RF-DETR](https://github.com/roboflow/rf-detr) that adds the **XLarge** and **2XLarge** detection models for maximum accuracy.
RF-DETR+ models use a DINOv2 vision transformer backbone at higher resolutions and larger feature dimensions than the core RF-DETR lineup, pushing state-of-the-art accuracy on [Microsoft COCO](https://cocodataset.org/#home) and [RF100-VL](https://github.com/roboflow/rf100-vl) while retaining real-time inference speeds. Use RF-DETR for the standard model set and RF-DETR+ when you need the highest-accuracy variants.
## Install
Install RF-DETR+ in a [**Python>=3.10**](https://www.python.org/) environment with `pip`. This will also install [`rfdetr`](https://github.com/roboflow/rf-detr) as a dependency, which provides the core APIs and model definitions.
```bash
pip install rfdetr-plus
```
<details>
<summary>Install from source</summary>
<br>
```bash
pip install git+https://github.com/roboflow/rf-detr-plus.git
```
</details>
## Benchmarks
RF-DETR+ XLarge and 2XLarge sit at the top of the RF-DETR accuracy/latency curve, delivering the highest COCO AP scores in the family. All latency numbers were measured on an NVIDIA T4 using TensorRT, FP16, and batch size 1.
| Size | Class | COCO AP<sub>50</sub> | COCO AP<sub>50:95</sub> | RF100VL AP<sub>50</sub> | RF100VL AP<sub>50:95</sub> | Latency (ms) | Params (M) | Resolution | Package / License |
| :----: | :-------------: | :------------------: | :---------------------: | :---------------------: | :------------------------: | :----------: | :--------: | :--------: | :----------------------------------------------------------: |
| N | `RFDETRNano` | 67.6 | 48.4 | 85.0 | 57.7 | 2.3 | 30.5 | 384x384 | [`rfdetr`](https://github.com/roboflow/rf-detr) / Apache 2.0 |
| S | `RFDETRSmall` | 72.1 | 53.0 | 86.7 | 60.2 | 3.5 | 32.1 | 512x512 | [`rfdetr`](https://github.com/roboflow/rf-detr) / Apache 2.0 |
| M | `RFDETRMedium` | 73.6 | 54.7 | 87.4 | 61.2 | 4.4 | 33.7 | 576x576 | [`rfdetr`](https://github.com/roboflow/rf-detr) / Apache 2.0 |
| L | `RFDETRLarge` | 75.1 | 56.5 | 88.2 | 62.2 | 6.8 | 33.9 | 704x704 | [`rfdetr`](https://github.com/roboflow/rf-detr) / Apache 2.0 |
| ⭐ XL | `RFDETRXLarge` | 77.4 | 58.6 | 88.5 | 62.9 | 11.5 | 126.4 | 700x700 | `rfdetr_plus` / [PML 1.0](LICENSE) |
| ⭐ 2XL | `RFDETR2XLarge` | 78.5 | 60.1 | 89.0 | 63.2 | 17.2 | 126.9 | 880x880 | `rfdetr_plus` / [PML 1.0](LICENSE) |
## Run Models
Install RF-DETR+ to use XL and 2XL models alongside the core RF-DETR lineup:
```bash
pip install rfdetr-plus
```
RF-DETR+ models require you to accept the Platform Model License before use. Once accepted, usage mirrors the standard RF-DETR API -- you import the models from `rfdetr_plus` and keep using the `rfdetr` utilities:
```python
import requests
import supervision as sv
from PIL import Image
from rfdetr_plus import RFDETRXLarge
from rfdetr.util.coco_classes import COCO_CLASSES
model = RFDETRXLarge(accept_platform_model_license=True)
image = Image.open(requests.get("https://media.roboflow.com/dog.jpg", stream=True).raw)
detections = model.predict(image, threshold=0.5)
labels = [f"{COCO_CLASSES[class_id]}" for class_id in detections.class_id]
annotated_image = sv.BoxAnnotator().annotate(image, detections)
annotated_image = sv.LabelAnnotator().annotate(annotated_image, detections, labels)
```
### Train Models
RF-DETR+ models support fine-tuning with the same training API as core RF-DETR. You can train on your own dataset or use datasets from [Roboflow Universe](https://universe.roboflow.com/).
```python
from rfdetr_plus import RFDETRXLarge
model = RFDETRXLarge(accept_platform_model_license=True)
model.train(dataset_dir="path/to/dataset", epochs=50, lr=1e-4)
```
## Documentation
Visit the [RF-DETR documentation website](https://rfdetr.roboflow.com) to learn more about training, export, deployment, and the full model lineup.
## License
RF-DETR+ code and model checkpoints are licensed under the Platform Model License 1.0 (PML-1.0). See [`LICENSE`](LICENSE) for details. These models require a [Roboflow](https://roboflow.com) account to run and fine-tune.
The core RF-DETR models (Nano through Large) are available under the Apache License 2.0 in the [`rfdetr`](https://github.com/roboflow/rf-detr) package.
## Acknowledgements
Our work is built upon [LW-DETR](https://arxiv.org/pdf/2406.03459), [DINOv2](https://arxiv.org/pdf/2304.07193), and [Deformable DETR](https://arxiv.org/pdf/2010.04159). Thanks to their authors for their excellent work!
## Citation
If you find our work helpful for your research, please consider citing the following BibTeX entry.
```bibtex
@misc{rf-detr,
title={RF-DETR: Neural Architecture Search for Real-Time Detection Transformers},
author={Isaac Robinson and Peter Robicheaux and Matvei Popov and Deva Ramanan and Neehar Peri},
year={2025},
eprint={2511.09554},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2511.09554},
}
```
## Contribute
We welcome and appreciate all contributions! If you notice any issues or bugs, have questions, or would like to suggest new features, please [open an issue](https://github.com/roboflow/rf-detr-plus/issues/new) or pull request. By sharing your ideas and improvements, you help make RF-DETR better for everyone.
<p align="center">
<a href="https://youtube.com/roboflow"><img src="https://media.roboflow.com/notebooks/template/icons/purple/youtube.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949634652" width="3%"/></a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://roboflow.com"><img src="https://media.roboflow.com/notebooks/template/icons/purple/roboflow-app.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949746649" width="3%"/></a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://www.linkedin.com/company/roboflow-ai/"><img src="https://media.roboflow.com/notebooks/template/icons/purple/linkedin.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633691" width="3%"/></a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://docs.roboflow.com"><img src="https://media.roboflow.com/notebooks/template/icons/purple/knowledge.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949634511" width="3%"/></a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://discuss.roboflow.com"><img src="https://media.roboflow.com/notebooks/template/icons/purple/forum.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633584" width="3%"/></a>
<img src="https://raw.githubusercontent.com/ultralytics/assets/main/social/logo-transparent.png" width="3%"/>
<a href="https://blog.roboflow.com"><img src="https://media.roboflow.com/notebooks/template/icons/purple/blog.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672949633605" width="3%"/></a>
</p>
| text/markdown | null | "Roboflow, Inc" <develop@roboflow.com> | null | null | null | machine-learning, deep-learning, vision, ML, DL, AI, DETR, RF-DETR, Roboflow | [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"Intended Audience :: Education",
"Intended Audience :: Science/Research",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python ::... | [] | null | null | >=3.10 | [] | [] | [] | [
"rfdetr<2,>=1.4.3"
] | [] | [] | [] | [
"Homepage, https://github.com/roboflow/rf-detr-plus"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:20:06.122267 | rfdetr_plus-1.0.1.tar.gz | 12,825 | 92/cc/6a35b7ffd503cef68f6098d777816e2f916947285fb98c4215693f02a431/rfdetr_plus-1.0.1.tar.gz | source | sdist | null | false | a9493c718993b80c7de3bc3c7aa01691 | 5243967b73519dc5b69df8a9a8a5d6670a01f4088787f86b1c2cde3c6a1582a0 | 92cc6a35b7ffd503cef68f6098d777816e2f916947285fb98c4215693f02a431 | LicenseRef-PML-1.0 | [
"LICENSE"
] | 1,519 |
2.4 | pylife-odbclient | 2.2.1 | A Python 3 client for odbAccess using pylife-odbserver | # pylife-odbclient
A modern Python client for odbAccess using pylife-odbserver
## Purpose
Unfortunately, Abaqus usually ships with an outdated Python engine, so you
cannot access an Abaqus odb file from modern Python code using current
packages. This package is the client part of a client-server setup that makes
odb files transparently accessible from code running under a current Python
version.
## Solution
The sibling package `pylife-odbserver` provides a slim server that runs inside
the Abaqus Python engine, even with the old Python versions Abaqus ships (as
far back as 2.7). It accepts commands via `sys.stdin`, queries the
corresponding data through the `odbAccess` interface, and returns the result
as a pickled object.
This package provides the Python class `OdbClient`, which spawns the server in
the background when it is instantiated. The client object can then be used to
transparently access data from the odb file via the server. Once the client
object goes out of scope, i.e. is deleted, the server process is stopped
automatically.
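The lifecycle described above (spawn a subprocess, exchange a command over stdin/stdout, tear the server down when the client is done) can be sketched with nothing but the standard library. The toy server below is a stand-in for the real `pylife-odbserver`, and the command name is purely illustrative:

```python
import pickle
import subprocess
import sys

# A stand-in "server" that reads one command line from stdin and answers
# with a pickled object on stdout (the real server runs inside the Abaqus
# Python engine and queries odbAccess instead).
SERVER_CODE = (
    "import pickle, sys\n"
    "command = sys.stdin.readline().strip()\n"
    "sys.stdout.buffer.write(pickle.dumps({'echo': command}))\n"
    "sys.stdout.buffer.flush()\n"
)


class ToyClient:
    """Spawns the server on construction, stops it when deleted."""

    def __init__(self):
        self._proc = subprocess.Popen(
            [sys.executable, "-c", SERVER_CODE],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
        )

    def query(self, command):
        # Send the command, then unpickle whatever the server sends back.
        out, _ = self._proc.communicate(input=(command + "\n").encode())
        return pickle.loads(out)

    def __del__(self):
        # Stop the server process if it is still running.
        if self._proc.poll() is None:
            self._proc.terminate()


client = ToyClient()
print(client.query("get_instance_names"))  # {'echo': 'get_instance_names'}
```

The real `OdbClient` follows the same pattern, except that the spawned process is the Abaqus Python engine running the server package.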
## Installation
* Install the odbclient using `pip` with the command
```
pip install pylife-odbclient
```
* See the <a href="../odbserver/">instructions in `pylife-odbserver`</a> on how
to install the server.
## Usage
Usually you will only see the `OdbClient` class interface when you access an
odb file. The only point at which you care about the server is when you
instantiate an `OdbClient` object. You need to know the following things:
* The path to the Abaqus executable
* The path to the python environment `pylife-odbserver` is installed into.
Then you can instantiate an `OdbClient` object using
```python
import odbclient as CL
client = CL.OdbClient("yourodb.odb")
```
See the [API docs of `OdbClient`][1]
for details.
## Limitations
### Limited functionality
Only a subset of the Abaqus variable locations is supported: nodal, element
nodal, whole element and centroid. Integration point variables are
extrapolated to element nodal.
You can only extract data from an odb file, not write to it.
### String literals
So far, only names made of ASCII strings are supported. That means that
instance names, node set names and the like containing non-ASCII characters
such as German umlauts will not work.
## Development
Due to the client-server architecture, running the unit tests is not
completely trivial. Here are some instructions on how to get them running.
### Setting up the environments
As of now, we assume that conda is used to set up the server
environments. We will probably switch to `uv` in the future.
We provide a bash script, `tests/create_server_envs.sh`, that you can run from
within the root folder of `odbclient` (the folder this `README.md` resides
in). It should generate all the necessary environments. You will have to run
the script again if there has been a release update in between.
The script is not well tested, so please be prepared for some manual steps.
___
[1]: https://pylife.readthedocs.io/en/latest/tools/odbclient/odbclient.html
| text/markdown; charset=UTF-8 | Johannes Mueller | johannes.mueller4@de.bosch.com | null | null | Apache-2 | null | [
"Development Status :: 4 - Beta",
"Programming Language :: Python"
] | [
"any"
] | https://github.com/boschresearch/pylife | null | >=3 | [] | [] | [] | [
"pandas",
"numpy",
"setuptools; extra == \"testing\"",
"pytest; extra == \"testing\"",
"pytest-timeout; extra == \"testing\"",
"pytest-cov; extra == \"testing\""
] | [] | [] | [] | [
"Documentation, https://pylife.readthedocs.io"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:20:05.175776 | pylife_odbclient-2.2.1.tar.gz | 1,342,063 | a1/aa/bc6cf51e28482fdb81917f7a23004f7bddc729c5348aba67b7acc0b72750/pylife_odbclient-2.2.1.tar.gz | source | sdist | null | false | 6508dae4bb9f5bd9eb102b1e03b69633 | 492c0f054c431e606c5dae86bf090d54e3a29466c49461334e51e8131c6293ba | a1aabc6cf51e28482fdb81917f7a23004f7bddc729c5348aba67b7acc0b72750 | null | [
"LICENSE",
"AUTHORS.rst"
] | 227 |
2.4 | ccxt | 4.5.39 | A cryptocurrency trading API with more than 100 exchanges in JavaScript / TypeScript / Python / C# / PHP / Go | # CCXT – CryptoCurrency eXchange Trading Library
[](https://www.npmjs.com/package/ccxt) [](https://npmjs.com/package/ccxt) [](https://pypi.python.org/pypi/ccxt) [](https://www.nuget.org/packages/ccxt) [](https://godoc.org/github.com/ccxt/ccxt/go/v4) [](https://discord.gg/ccxt) [](https://github.com/ccxt/ccxt/wiki/Exchange-Markets) [](https://x.com/ccxt_official)
A cryptocurrency trading API with more than 100 exchanges in JavaScript / TypeScript / Python / C# / PHP / Go.
### [Install](#install) · [Usage](#usage) · [Manual](https://github.com/ccxt/ccxt/wiki) · [FAQ](https://github.com/ccxt/ccxt/wiki/FAQ) · [Examples](https://github.com/ccxt/ccxt/tree/master/examples) · [Contributing](https://github.com/ccxt/ccxt/blob/master/CONTRIBUTING.md) · [Disclaimer](#disclaimer) · [Social](#social)
The **CCXT** library is used to connect and trade with cryptocurrency exchanges and payment processing services worldwide. It provides quick access to market data for storage, analysis, visualization, indicator development, algorithmic trading, strategy backtesting, bot programming, and related software engineering.
It is intended to be used by **coders, developers, technically-skilled traders, data-scientists and financial analysts** for building trading algorithms.
Current feature list:
- support for many cryptocurrency exchanges — more coming soon
- fully implemented public and private APIs
- optional normalized data for cross-exchange analytics and arbitrage
- an out-of-the-box unified API that is extremely easy to integrate
- works in Node 10.4+, Python 3, PHP 8.1+, netstandard2.0/2.1, Go 1.20+ and web browsers
## See Also
- <sub>[](https://www.freqtrade.io)</sub> **[Freqtrade](https://www.freqtrade.io)** – leading opensource cryptocurrency algorithmic trading software!
- <sub>[](https://www.octobot.online)</sub> **[OctoBot](https://www.octobot.online)** – cryptocurrency trading bot with an advanced web interface.
- <sub>[](https://tokenbot.com/?utm_source=github&utm_medium=ccxt&utm_campaign=algodevs)</sub> **[TokenBot](https://tokenbot.com/?utm_source=github&utm_medium=ccxt&utm_campaign=algodevs)** – discover and copy the best algorithmic traders in the world.
## Certified Cryptocurrency Exchanges
|logo |id |name |ver |type |certified |pro |discount |
|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------------|-----------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------:|--------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------:|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| [](https://accounts.binance.com/register?ref=CCXTCOM) | binance | [Binance](https://accounts.binance.com/register?ref=CCXTCOM) | [](https://developers.binance.com/en) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://accounts.binance.com/register?ref=CCXTCOM) |
| [](https://accounts.binance.com/register?ref=CCXTCOM) | binanceusdm | [Binance USDⓈ-M](https://accounts.binance.com/register?ref=CCXTCOM) | [](https://binance-docs.github.io/apidocs/futures/en/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://accounts.binance.com/register?ref=CCXTCOM) |
| [](https://accounts.binance.com/register?ref=CCXTCOM) | binancecoinm | [Binance COIN-M](https://accounts.binance.com/register?ref=CCXTCOM) | [](https://binance-docs.github.io/apidocs/delivery/en/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://accounts.binance.com/register?ref=CCXTCOM) |
| [](https://www.bybit.com/invite?ref=XDK12WP) | bybit | [Bybit](https://www.bybit.com/invite?ref=XDK12WP) | [](https://bybit-exchange.github.io/docs/inverse/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | |
| [](https://www.okx.com/join/CCXTCOM) | okx | [OKX](https://www.okx.com/join/CCXTCOM) | [](https://www.okx.com/docs-v5/en/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://www.okx.com/join/CCXTCOM) |
| [](https://www.gate.com/share/CCXTGATE) | gate | [Gate](https://www.gate.com/share/CCXTGATE) | [](https://www.gate.com/docs/developers/apiv4/en/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://www.gate.com/share/CCXTGATE) |
| [](https://www.kucoin.com/ucenter/signup?rcode=E5wkqe) | kucoin | [KuCoin](https://www.kucoin.com/ucenter/signup?rcode=E5wkqe) | [](https://docs.kucoin.com) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | |
| [](https://futures.kucoin.com/?rcode=E5wkqe) | kucoinfutures | [KuCoin Futures](https://futures.kucoin.com/?rcode=E5wkqe) | [](https://docs.kucoin.com/futures) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | |
| [](https://www.bitget.com/expressly?languageType=0&channelCode=ccxt&vipCode=tg9j) | bitget | [Bitget](https://www.bitget.com/expressly?languageType=0&channelCode=ccxt&vipCode=tg9j) | [](https://www.bitget.com/api-doc/common/intro) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | |
| [](https://app.hyperliquid.xyz/) | hyperliquid | [Hyperliquid](https://app.hyperliquid.xyz/) | [](https://hyperliquid.gitbook.io/hyperliquid-docs/for-developers/api) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | |
| [](https://www.bitmex.com/app/register/NZTR1q) | bitmex | [BitMEX](https://www.bitmex.com/app/register/NZTR1q) | [](https://www.bitmex.com/app/apiOverview) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://www.bitmex.com/app/register/NZTR1q) |
| [](https://bingx.com/invite/OHETOM) | bingx | [BingX](https://bingx.com/invite/OHETOM) | [](https://bingx-api.github.io/docs/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | |
| [](https://www.htx.com.vc/invite/en-us/1h?invite_code=6rmm2223) | htx | [HTX](https://www.htx.com.vc/invite/en-us/1h?invite_code=6rmm2223) | [](https://huobiapi.github.io/docs/spot/v1/en/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://www.htx.com.vc/invite/en-us/1h?invite_code=6rmm2223) |
| [](https://www.mexc.com/register?inviteCode=mexc-1FQ1GNu1) | mexc | [MEXC Global](https://www.mexc.com/register?inviteCode=mexc-1FQ1GNu1) | [](https://mexcdevelop.github.io/apidocs/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | |
| [](http://www.bitmart.com/?r=rQCFLh) | bitmart | [BitMart](http://www.bitmart.com/?r=rQCFLh) | [](https://developer-pro.bitmart.com/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](http://www.bitmart.com/?r=rQCFLh) |
| [](https://crypto.com/exch/kdacthrnxt) | cryptocom | [Crypto.com](https://crypto.com/exch/kdacthrnxt) | [](https://exchange-docs.crypto.com/exchange/v1/rest-ws/index.html) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://crypto.com/exch/kdacthrnxt) |
| [](https://www.coinex.com/register?refer_code=yw5fz) | coinex | [CoinEx](https://www.coinex.com/register?refer_code=yw5fz) | [](https://docs.coinex.com/api/v2) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | |
| [](https://global.hashkey.com/en-US/register/invite?invite_code=82FQUN) | hashkey | [HashKey Global](https://global.hashkey.com/en-US/register/invite?invite_code=82FQUN) | [](https://hashkeyglobal-apidoc.readme.io/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | |
| [](https://woox.io/register?ref=DIJT0CNL) | woo | [WOO X](https://woox.io/register?ref=DIJT0CNL) | [](https://docs.woox.io/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://woox.io/register?ref=DIJT0CNL) |
| [](https://dex.woo.org/en/trade?ref=CCXT) | woofipro | [WOOFI PRO](https://dex.woo.org/en/trade?ref=CCXT) | [](https://orderly.network/docs/build-on-omnichain/building-on-evm) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) | [](https://dex.woo.org/en/trade?ref=CCXT) |
## Supported Cryptocurrency Exchanges
<!--- init list -->The CCXT library currently supports the following 107 cryptocurrency exchange markets and trading APIs:
|logo |id |name |ver |type |certified |pro |
|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------------------|-----------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------:|--------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------|
| [](https://alp.com/?r=123788) | alp | [Alp](https://alp.com/?r=123788) | [](https://alpcomdev.github.io/alp-api-docs/) |  | | |
| [](https://alpaca.markets) | alpaca | [Alpaca](https://alpaca.markets) | [](https://alpaca.markets/docs/) |  | | [](https://ccxt.pro) |
| [](https://omni.apex.exchange/trade) | apex | [Apex](https://omni.apex.exchange/trade) | [](https://api-docs.pro.apex.exchange) |  | | [](https://ccxt.pro) |
| [](https://arkm.com/register?ref=ccxt) | arkham | [ARKHAM](https://arkm.com/register?ref=ccxt) | [](https://arkm.com/limits-api) |  | | [](https://ccxt.pro) |
| [](https://ascendex.com/en-us/register?inviteCode=EL6BXBQM) | ascendex | [AscendEX](https://ascendex.com/en-us/register?inviteCode=EL6BXBQM) | [](https://ascendex.github.io/ascendex-pro-api/#ascendex-pro-api-documentation) |  | | [](https://ccxt.pro) |
| [](https://www.asterdex.com/en/referral/aA1c2B) | aster | [Aster](https://www.asterdex.com/en/referral/aA1c2B) | [](https://github.com/asterdex/api-docs) |  | | [](https://ccxt.pro) |
| [](https://backpack.exchange/join/ccxt) | backpack | [Backpack](https://backpack.exchange/join/ccxt) | [](https://docs.backpack.exchange/) |  | | [](https://ccxt.pro) |
| [](https://bequant.io/referral/dd104e3bee7634ec) | bequant | [Bequant](https://bequant.io/referral/dd104e3bee7634ec) | [](https://api.bequant.io/) |  | | [](https://ccxt.pro) |
| [](https://b1.run/users/new?code=D3LLBVFT) | bigone | [BigONE](https://b1.run/users/new?code=D3LLBVFT) | [](https://open.big.one/docs/api.html) |  | | |
| [](https://accounts.binance.com/register?ref=CCXTCOM) | binance | [Binance](https://accounts.binance.com/register?ref=CCXTCOM) | [](https://developers.binance.com/en) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) |
| [](https://accounts.binance.com/register?ref=CCXTCOM) | binancecoinm | [Binance COIN-M](https://accounts.binance.com/register?ref=CCXTCOM) | [](https://binance-docs.github.io/apidocs/delivery/en/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) |
| [](https://www.binance.us/?ref=35005074) | binanceus | [Binance US](https://www.binance.us/?ref=35005074) | [](https://github.com/binance-us/binance-official-api-docs) |  | | [](https://ccxt.pro) |
| [](https://accounts.binance.com/register?ref=CCXTCOM) | binanceusdm | [Binance USDⓈ-M](https://accounts.binance.com/register?ref=CCXTCOM) | [](https://binance-docs.github.io/apidocs/futures/en/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) |
| [](https://bingx.com/invite/OHETOM) | bingx | [BingX](https://bingx.com/invite/OHETOM) | [](https://bingx-api.github.io/docs/) |  | [](https://github.com/ccxt/ccxt/wiki/Certification) | [](https://ccxt.pro) |
| [](https://bit2c.co.il/Aff/63bfed10-e359-420c-ab5a-ad368dab0baf) | bit2c | [Bit2C](https://bit2c.co.il/Aff/63bfed10-e359-420c-ab5a-ad368dab0baf) | [](https://www.bit2c.co.il/home/api) |  | | |
| [](https://bitbank.cc/) | bitbank | [bitbank](https://bitbank.cc/) | [](https://docs.bitbank.cc/) |  | | |
| [](https://ref.bitbns.com/1090961) | bitbns | [Bitbns](https://ref.bitbns.com/1090961) | [](https://bitbns.com/trade/#/api-trading/) |  | | |
| [](https://www.bitfinex.com) | bitfinex | [Bitfinex](https://www.bitfinex.com) | [](https://docs.bitfinex.com/v2/docs/) |  | | [](https://ccxt.pro) |
| [](https://bitflyer.com) | bitflyer | [bitFlyer](https://bitflyer.com) | [](https://lightning.bitflyer.com/docs?lang=en) |  | | |
[Getting Started](https://pyoz.dev/quickstart/) | [Examples](https://pyoz.dev/examples/complete-module/) | [GitHub](https://github.com/dzonerzy/PyOZ)
## Quick Example
Write normal Zig code -- PyOZ handles all the Python integration automatically:
```zig
const pyoz = @import("PyOZ");
const Point = struct {
x: f64,
y: f64,
pub fn magnitude(self: *const Point) f64 {
return @sqrt(self.x * self.x + self.y * self.y);
}
};
fn add(a: i64, b: i64) i64 {
return a + b;
}
const MyModule = pyoz.module(.{
.name = "mymodule",
.funcs = &.{
pyoz.func("add", add, "Add two numbers"),
},
.classes = &.{
pyoz.class("Point", Point),
},
});
pub export fn PyInit_mymodule() ?*pyoz.PyObject {
return MyModule.init();
}
```
```python
import mymodule
print(mymodule.add(2, 3)) # 5
p = mymodule.Point(3.0, 4.0)
print(p.magnitude()) # 5.0
print(p.x, p.y) # 3.0 4.0
```
## Features
- **Declarative API** -- Define modules, functions, and classes with simple struct literals
- **Automatic type conversion** -- Zig `i64`, `f64`, `[]const u8`, structs, optionals, error unions all map to Python types automatically
- **Full class support** -- `__init__`, `__repr__`, `__add__`, `__iter__`, `__getitem__`, properties, static/class methods, inheritance
- **NumPy integration** -- Zero-copy array access via buffer protocol
- **Error handling** -- Zig errors become Python exceptions; custom exception types supported
- **Type stubs** -- Automatic `.pyi` generation for IDE autocomplete and type checking
- **GIL management** -- Release the GIL for CPU-bound Zig code with `pyoz.releaseGIL()`
- **Cross-class references** -- Methods can accept/return instances of other classes in the same module
- **Simple tooling** -- `pyoz init`, `pyoz build`, `pyoz develop`, `pyoz publish`
## Installation
```bash
pip install pyoz
```
Requires **Zig 0.15.0+** and **Python 3.8--3.13**.
## Getting Started
```bash
# Create a new project
pyoz init myproject
cd myproject
# Build and install for development
pyoz develop
# Test it
python -c "import myproject; print(myproject.add(1, 2))"
```
## Documentation
Full documentation at **[pyoz.dev](https://pyoz.dev)**:
- [Installation](https://pyoz.dev/installation/) -- Setup and requirements
- [Quickstart](https://pyoz.dev/quickstart/) -- Your first PyOZ module in 5 minutes
- [Functions](https://pyoz.dev/guide/functions/) -- Module-level functions, keyword arguments
- [Classes](https://pyoz.dev/guide/classes/) -- Full class support with magic methods
- [Types](https://pyoz.dev/guide/types/) -- Type conversion reference
- [Properties](https://pyoz.dev/guide/properties/) -- Computed properties and getters/setters
- [Error Handling](https://pyoz.dev/guide/errors/) -- Zig errors to Python exceptions
- [NumPy](https://pyoz.dev/guide/numpy/) -- Zero-copy buffer protocol
- [GIL](https://pyoz.dev/guide/gil/) -- Releasing the GIL for parallelism
- [CLI Reference](https://pyoz.dev/cli/build/) -- Build, develop, publish commands
- [Complete Example](https://pyoz.dev/examples/complete-module/) -- Full-featured module walkthrough
## License
MIT
| text/markdown | Daniele Linguaglossa | null | null | null | MIT | null | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Topic :: Software Development :: Build Tools"
] | [] | https://pyoz.dev | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Documentation, https://pyoz.dev",
"Source, https://github.com/pyozig/PyOZ",
"Changelog, https://pyoz.dev/changelog"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:18:01.095189 | pyoz-0.11.4-cp38-abi3-win_arm64.whl | 582,513 | 53/6f/b9b122565c7560a3a973765c4fbabc7fe9787c8c063c28268acfe0ca39a6/pyoz-0.11.4-cp38-abi3-win_arm64.whl | cp38 | bdist_wheel | null | false | e1283a209c2692d2ff240608151520d9 | 1f6c63e77ab710044c6e89dd2b8ba56d95aa3ee95fbca1f252010635bab525b4 | 536fb9b122565c7560a3a973765c4fbabc7fe9787c8c063c28268acfe0ca39a6 | null | [] | 476 |
2.4 | genai-otel-instrument | 0.1.40 | Comprehensive OpenTelemetry auto-instrumentation for LLM/GenAI applications | # TraceVerde
<div align="center">
<img src="https://raw.githubusercontent.com/Mandark-droid/genai_otel_instrument/main/.github/images/Logo.jpg" alt="TraceVerde - GenAI OpenTelemetry Instrumentation Logo" width="400"/>
</div>
<br/>
[](https://badge.fury.io/py/genai-otel-instrument)
[](https://pypi.org/project/genai-otel-instrument/)
[](https://www.gnu.org/licenses/agpl-3.0)
[](https://pepy.tech/project/genai-otel-instrument)
[](https://pepy.tech/project/genai-otel-instrument)
[](https://github.com/Mandark-droid/genai_otel_instrument)
[](https://github.com/Mandark-droid/genai_otel_instrument)
[](https://github.com/Mandark-droid/genai_otel_instrument/issues)
[](https://github.com/Mandark-droid/genai_otel_instrument/pulls)
[](https://github.com/Mandark-droid/genai_otel_instrument)
[](https://github.com/psf/black)
[](https://pycqa.github.io/isort/)
[](http://mypy-lang.org/)
[](https://opentelemetry.io/)
[](https://opentelemetry.io/docs/specs/semconv/gen-ai/)
[](https://github.com/Mandark-droid/genai_otel_instrument/actions)
---
<div align="center">
<img src="https://raw.githubusercontent.com/Mandark-droid/genai_otel_instrument/main/.github/images/Landing_Page.jpg" alt="GenAI OpenTelemetry Instrumentation Overview" width="800"/>
</div>
---
Production-ready OpenTelemetry instrumentation for GenAI/LLM applications with zero-code setup.
## Features
🚀 **Zero-Code Instrumentation** - Just install and set env vars
🤖 **19+ LLM Providers** - OpenAI, OpenRouter, Anthropic, Google, AWS, Azure, SambaNova, Hyperbolic, Sarvam AI, and more
🤝 **Multi-Agent Frameworks** - CrewAI, LangGraph, OpenAI Agents SDK, AutoGen, AutoGen AgentChat, Google ADK, Pydantic AI for agent orchestration
🔧 **MCP Tool Support** - Auto-instrument databases, APIs, caches, vector DBs
💰 **Cost Tracking** - Automatic cost calculation for both streaming and non-streaming requests
⚡ **Streaming Support** - Full observability for streaming responses with TTFT/TBT metrics and cost tracking
🎮 **GPU Metrics** - Real-time GPU utilization, memory, temperature, power, and electricity cost tracking (NVIDIA & AMD)
🛡️ **PII Detection** (NEW) - Automatic PII detection with GDPR/HIPAA/PCI-DSS compliance modes
☢️ **Toxicity Detection** (NEW) - Detect harmful content with Perspective API and Detoxify
⚖️ **Bias Detection** (NEW) - Identify demographic and other biases in prompts and responses
📊 **Complete Observability** - Traces, metrics, and rich span attributes
➕ **Service Instance ID & Environment** - Identify your services and environments
⏱️ **Configurable Exporter Timeout** - Set timeout for OTLP exporter
🔗 **OpenInference Instrumentors** - Smolagents, MCP, and LiteLLM instrumentation
## Quick Start
### Installation
```bash
pip install genai-otel-instrument
```
### Usage
**Option 1: Environment Variables (No code changes)**
```bash
export OTEL_SERVICE_NAME=my-llm-app
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
python your_app.py
```
**Option 2: One line of code**
```python
import genai_otel
genai_otel.instrument()
# Your existing code works unchanged
import openai
client = openai.OpenAI()
response = client.chat.completions.create(...)
```
**Option 3: CLI wrapper**
```bash
genai-instrument python your_app.py
```
For a more comprehensive demonstration of various LLM providers and MCP tools, refer to `example_usage.py` in the project root. Note that running this example requires setting up relevant API keys and external services (e.g., databases, Redis, Pinecone).
## What Gets Instrumented?
### LLM Providers (Auto-detected)
- **With Full Cost Tracking**: OpenAI, OpenRouter, Anthropic, Google AI, AWS Bedrock, Azure OpenAI, Cohere, Mistral AI, Together AI, Groq, Ollama, Vertex AI, SambaNova, Hyperbolic, Sarvam AI
- **Hardware/Local Pricing**: Replicate (hardware-based $/second), HuggingFace (local execution with estimated costs)
- **HuggingFace Support**: `pipeline()`, `AutoModelForCausalLM.generate()`, `AutoModelForSeq2SeqLM.generate()`, `InferenceClient` API calls
- **Other Providers**: Anyscale
- **Special Configuration**: Hyperbolic (requires OTLP gRPC exporter - see `examples/hyperbolic_example.py`)
### Frameworks
- **LangChain** (chains, agents, tools)
- **LlamaIndex** (query engines, indices)
- **Haystack** (modular NLP pipelines with RAG support)
- **DSPy** (Stanford NLP declarative LM programming with automatic optimization)
- **Instructor** (Pydantic-based structured output extraction with validation and retries)
- **Guardrails AI** (input/output validation guards with on-fail policies: reask, fix, filter, refrain)
### Multi-Agent Frameworks
- **OpenAI Agents SDK** (agent orchestration with handoffs, sessions, guardrails)
- **CrewAI** (role-based multi-agent collaboration with crews and tasks)
- Complete span hierarchy: Crew -> Task -> Agent -> LLM calls
- All kickoff variants: `kickoff()`, `kickoff_async()`, `akickoff()`, `kickoff_for_each()`, and async batch variants
- Automatic ThreadPoolExecutor context propagation for worker threads
- Three span types: `crewai.crew.execution`, `crewai.task.execution`, `crewai.agent.execution`
- **Google ADK** (Google Agent Development Kit - open-source agent framework)
- Instruments `Runner.run_async()` and `InMemoryRunner.run_debug()`
- Captures agent name, model, tools, sub-agents, session info
- **AutoGen AgentChat** (v0.4+ - ChatAgent, Teams, RoundRobinGroupChat, SelectorGroupChat, Swarm)
- Instruments agent `run()`/`run_stream()` and team execution
- Captures participants, task content, termination conditions
- **AutoGen** (legacy Microsoft multi-agent conversations with group chats)
- **LangGraph** (stateful workflows with graph-based orchestration)
- **Pydantic AI** (type-safe agents with Pydantic validation and multi-provider support)
- **AWS Bedrock Agents** (managed agent runtime with knowledge bases and RAG)
### MCP Tools (Model Context Protocol)
- **Databases**: PostgreSQL, MySQL, MongoDB, SQLAlchemy
- **Caching**: Redis
- **Message Queues**: Apache Kafka
- **Vector Databases**: Pinecone, Weaviate, Qdrant, ChromaDB, Milvus, FAISS
- **APIs**: HTTP/REST requests (requests, httpx)
### OpenInference (Optional - Python 3.10+ only)
- Smolagents - HuggingFace smolagents framework tracing
- MCP - Model Context Protocol instrumentation
- LiteLLM - Multi-provider LLM proxy
**Cost Enrichment:** OpenInference instrumentors are automatically enriched with cost tracking! When cost tracking is enabled (`GENAI_ENABLE_COST_TRACKING=true`), a custom `CostEnrichmentSpanProcessor` extracts model and token usage from OpenInference spans and adds cost attributes (`gen_ai.usage.cost.total`, `gen_ai.usage.cost.prompt`, `gen_ai.usage.cost.completion`) using our comprehensive pricing database of 340+ models across 20+ providers.
The processor supports OpenInference semantic conventions:
- Model: `llm.model_name`, `embedding.model_name`
- Tokens: `llm.token_count.prompt`, `llm.token_count.completion`
- Operations: `openinference.span.kind` (LLM, EMBEDDING, CHAIN, RETRIEVER, etc.)
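The cost attributes are simple arithmetic over token counts and per-token pricing. A minimal sketch of that enrichment step follows; the per-1K-token prices and the model name are made-up placeholders, not values from the library's pricing database:

```python
# Hypothetical per-1K-token prices; the real values come from the library's
# pricing database, keyed by model name.
PRICING = {"example-model": {"prompt": 0.0005, "completion": 0.0015}}


def enrich_with_cost(span_attrs):
    """Add gen_ai.usage.cost.* attributes from OpenInference token counts."""
    model = span_attrs.get("llm.model_name")
    prices = PRICING.get(model)
    if prices is None:
        return span_attrs  # unknown model: leave the span untouched

    prompt_cost = span_attrs["llm.token_count.prompt"] / 1000 * prices["prompt"]
    completion_cost = (
        span_attrs["llm.token_count.completion"] / 1000 * prices["completion"]
    )
    span_attrs["gen_ai.usage.cost.prompt"] = prompt_cost
    span_attrs["gen_ai.usage.cost.completion"] = completion_cost
    span_attrs["gen_ai.usage.cost.total"] = prompt_cost + completion_cost
    return span_attrs


attrs = enrich_with_cost({
    "llm.model_name": "example-model",
    "llm.token_count.prompt": 2000,
    "llm.token_count.completion": 500,
})
print(attrs["gen_ai.usage.cost.total"])
```

The actual `CostEnrichmentSpanProcessor` performs this on span end, reading the OpenInference attributes listed above.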
**Note:** OpenInference instrumentors require Python >= 3.10. Install with:
```bash
pip install genai-otel-instrument[openinference]
```
## Screenshots
See the instrumentation in action across different LLM providers and observability backends.
### OpenAI Instrumentation
Full trace capture for OpenAI API calls with token usage, costs, and latency metrics.
<div align="center">
<img src="https://raw.githubusercontent.com/Mandark-droid/genai_otel_instrument/main/.github/images/Screenshots/Traces_OpenAI.png" alt="OpenAI Traces" width="900"/>
</div>
### Ollama (Local LLM) Instrumentation
Zero-code instrumentation for local models running on Ollama with comprehensive observability.
<div align="center">
<img src="https://raw.githubusercontent.com/Mandark-droid/genai_otel_instrument/main/.github/images/Screenshots/Traces_Ollama.png" alt="Ollama Traces" width="900"/>
</div>
### HuggingFace Transformers
Direct instrumentation of HuggingFace Transformers with automatic token counting and cost estimation.
<div align="center">
<img src="https://raw.githubusercontent.com/Mandark-droid/genai_otel_instrument/main/.github/images/Screenshots/Trace_HuggingFace_Transformer_Models.png" alt="HuggingFace Transformer Traces" width="900"/>
</div>
### SmolAgents Framework
Complete agent workflow tracing with tool calls, iterations, and cost breakdown.
<div align="center">
<img src="https://raw.githubusercontent.com/Mandark-droid/genai_otel_instrument/main/.github/images/Screenshots/Traces_SmolAgent_with_tool_calls.png" alt="SmolAgent Traces with Tool Calls" width="900"/>
</div>
### GPU Metrics Collection
Real-time GPU utilization, memory, temperature, and power consumption metrics for both NVIDIA and AMD GPUs.
<div align="center">
<img src="https://raw.githubusercontent.com/Mandark-droid/genai_otel_instrument/main/.github/images/Screenshots/GPU_Metrics.png" alt="GPU Metrics Dashboard" width="900"/>
</div>
### Additional Screenshots
- **[Token Cost Breakdown](.github/images/Screenshots/Traces_SmolAgent_Token_Cost_breakdown.png)** - Detailed token usage and cost analysis for SmolAgent workflows
- **[OpenSearch Dashboard](.github/images/Screenshots/GENAI_OpenSearch_output.png)** - GenAI metrics visualization in OpenSearch/Kibana
---
## Demo Video
Watch a comprehensive walkthrough of GenAI OpenTelemetry Auto-Instrumentation in action, demonstrating setup, configuration, and real-time observability across multiple LLM providers.
<div align="center">
**🎥 [Watch Demo Video](https://youtu.be/YOUR_VIDEO_ID_HERE)**
*(Coming Soon)*
</div>
---
## Cost Tracking Coverage
The library includes comprehensive cost tracking with pricing data for **350+ models** across **21+ providers**:
### Providers with Full Token-Based Cost Tracking
- **OpenAI**: GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo, o1/o3 series, embeddings, audio, vision (35+ models)
- **Anthropic**: Claude 3.5 Sonnet/Opus/Haiku, Claude 3 series (10+ models)
- **Google AI**: Gemini 1.5/2.0 Pro/Flash, PaLM 2 (12+ models)
- **AWS Bedrock**: Amazon Titan, Claude, Llama, Mistral models (20+ models)
- **Azure OpenAI**: Same as OpenAI with Azure-specific pricing
- **Cohere**: Command R/R+, Command Light, Embed v3/v2 (8+ models)
- **Mistral AI**: Mistral Large/Medium/Small, Mixtral, embeddings (8+ models)
- **Together AI**: DeepSeek-R1, Llama 3.x, Qwen, Mixtral (25+ models)
- **Groq**: Llama 3.x series, Mixtral, Gemma models (15+ models)
- **Ollama**: Local models with token tracking (pricing via cost estimation)
- **Vertex AI**: Gemini models via Google Cloud with usage metadata extraction
- **Sarvam AI**: sarvam-m chat, Saarika/Saaras STT, Bulbul TTS, Mayura/Sarvam Translate, Vision (12+ models)
### Special Pricing Models
- **Replicate**: Hardware-based pricing ($/second of GPU/CPU time) - not token-based
- **HuggingFace Transformers**: Local model execution with estimated costs based on parameter count
- Supports `pipeline()`, `AutoModelForCausalLM.generate()`, `AutoModelForSeq2SeqLM.generate()`
- Cost estimation uses GPU/compute resource pricing tiers (tiny/small/medium/large)
- Automatic token counting from tensor shapes
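As a rough illustration of those two ideas, token counting from tensor shapes and tier-based cost estimation can be sketched like this (the tier thresholds are assumptions for the sketch, not the library's actual cutoffs):

```python
def count_tokens_from_shape(shape):
    """Token count for a (batch, sequence_length) input_ids tensor shape."""
    batch, seq_len = shape
    return batch * seq_len

def size_tier(param_count):
    """Map a model's parameter count to a pricing tier (illustrative cutoffs)."""
    if param_count < 1e8:
        return "tiny"
    if param_count < 1e9:
        return "small"
    if param_count < 1e10:
        return "medium"
    return "large"
```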
### Pricing Features
- **Differential Pricing**: Separate rates for prompt tokens vs. completion tokens
- **Reasoning Tokens**: Special pricing for OpenAI o1/o3 reasoning tokens
- **Cache Pricing**: Anthropic prompt caching costs (read/write)
- **Granular Cost Metrics**: Per-request cost breakdown by token type
- **Auto-Updated Pricing**: Pricing data maintained in `llm_pricing.json`
- **Custom Pricing**: Add pricing for custom/proprietary models via environment variable
### Adding Custom Model Pricing
For custom or proprietary models not in `llm_pricing.json`, you can provide custom pricing via the `GENAI_CUSTOM_PRICING_JSON` environment variable:
```bash
# For chat models
export GENAI_CUSTOM_PRICING_JSON='{"chat":{"my-custom-model":{"promptPrice":0.001,"completionPrice":0.002}}}'
# For embeddings
export GENAI_CUSTOM_PRICING_JSON='{"embeddings":{"my-custom-embeddings":0.00005}}'
# For multiple categories
export GENAI_CUSTOM_PRICING_JSON='{
  "chat": {
    "my-custom-chat": {"promptPrice": 0.001, "completionPrice": 0.002}
  },
  "embeddings": {
    "my-custom-embed": 0.00005
  },
  "audio": {
    "my-custom-tts": 0.02
  }
}'
```
**Pricing Format:**
- **Chat models**: `{"promptPrice": <$/1k tokens>, "completionPrice": <$/1k tokens>}`
- **Embeddings**: Single number for price per 1k tokens
- **Audio**: Price per 1k characters (TTS) or per second (STT)
- **Images**: Nested structure with quality/size pricing (see `llm_pricing.json` for examples)
**Hybrid Pricing:** Custom prices are merged with default pricing from `llm_pricing.json`. If you provide custom pricing for an existing model, the custom price overrides the default.
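The merge behaves like a per-category dictionary overlay. A sketch of the semantics (not the library's actual implementation):

```python
import json
import os

def merged_pricing(defaults):
    """Overlay GENAI_CUSTOM_PRICING_JSON on top of default pricing, per category."""
    custom = json.loads(os.environ.get("GENAI_CUSTOM_PRICING_JSON", "{}"))
    # Copy the defaults so they are not mutated, then apply custom entries,
    # which override any model that already exists in the same category.
    merged = {category: dict(models) for category, models in defaults.items()}
    for category, models in custom.items():
        merged.setdefault(category, {}).update(models)
    return merged
```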
**Coverage Statistics**: As of v0.1.3, 89% test coverage with 415 passing tests, including comprehensive cost calculation validation and cost enrichment processor tests (supporting both GenAI and OpenInference semantic conventions).
## Collected Telemetry
### Traces
Every LLM call, database query, API request, and vector search is traced with full context propagation.
### Metrics
**GenAI Metrics:**
- `gen_ai.requests` - Request counts by provider and model
- `gen_ai.client.token.usage` - Token usage (prompt/completion)
- `gen_ai.client.operation.duration` - Request latency histogram (optimized buckets for LLM workloads)
- `gen_ai.usage.cost` - Total estimated costs in USD
- `gen_ai.usage.cost.prompt` - Prompt tokens cost (granular)
- `gen_ai.usage.cost.completion` - Completion tokens cost (granular)
- `gen_ai.usage.cost.reasoning` - Reasoning tokens cost (OpenAI o1 models)
- `gen_ai.usage.cost.cache_read` - Cache read cost (Anthropic)
- `gen_ai.usage.cost.cache_write` - Cache write cost (Anthropic)
- `gen_ai.client.errors` - Error counts by operation and type
- `gen_ai.gpu.*` - GPU utilization, memory, temperature, power (ObservableGauges)
- `gen_ai.co2.emissions` - CO2 emissions tracking with codecarbon integration (opt-in via `GENAI_ENABLE_CO2_TRACKING`)
- `gen_ai.power.cost` - Cumulative electricity cost in USD based on GPU power consumption (configurable via `GENAI_POWER_COST_PER_KWH`)
- `gen_ai.server.ttft` - Time to First Token for streaming responses (histogram, 1ms-10s buckets)
- `gen_ai.server.tbt` - Time Between Tokens for streaming responses (histogram, 10ms-2.5s buckets)
**CO2 Tracking Options:**
- **Automatic (codecarbon)**: Uses region-based carbon intensity data for accurate emissions calculation
- **Manual**: Uses the `GENAI_CARBON_INTENSITY` value (gCO2e/kWh) for the calculation
- Set `GENAI_CO2_USE_MANUAL=true` to force manual calculation even when codecarbon is installed
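Both the power-cost and manual-CO2 figures reduce to converting GPU power draw over the sampling interval into kWh. A sketch of the arithmetic (function names are illustrative):

```python
def power_cost_usd(power_watts, seconds, cost_per_kwh):
    """Electricity cost: convert watts over the interval to kWh, then price it."""
    kwh = power_watts * seconds / 3600 / 1000
    return kwh * cost_per_kwh

def co2_grams(power_watts, seconds, intensity_g_per_kwh):
    """Manual CO2 estimate using a GENAI_CARBON_INTENSITY-style value (gCO2e/kWh)."""
    kwh = power_watts * seconds / 3600 / 1000
    return kwh * intensity_g_per_kwh
```

For example, a GPU drawing 300 W for one hour at $0.20/kWh accrues 0.3 kWh, i.e. $0.06.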
**MCP Metrics (Database Operations):**
- `mcp.requests` - Number of MCP/database requests
- `mcp.client.operation.duration` - Operation duration histogram (1ms to 10s buckets)
- `mcp.request.size` - Request payload size histogram (100B to 5MB buckets)
- `mcp.response.size` - Response payload size histogram (100B to 5MB buckets)
### Span Attributes
**Core Attributes:**
- `gen_ai.system` - Provider name (e.g., "openai")
- `gen_ai.operation.name` - Operation type (e.g., "chat")
- `gen_ai.request.model` - Model identifier
- `gen_ai.usage.prompt_tokens` / `gen_ai.usage.input_tokens` - Input tokens (dual emission supported)
- `gen_ai.usage.completion_tokens` / `gen_ai.usage.output_tokens` - Output tokens (dual emission supported)
- `gen_ai.usage.total_tokens` - Total tokens
**Request Parameters:**
- `gen_ai.request.temperature` - Temperature setting
- `gen_ai.request.top_p` - Top-p sampling
- `gen_ai.request.max_tokens` - Max tokens requested
- `gen_ai.request.frequency_penalty` - Frequency penalty
- `gen_ai.request.presence_penalty` - Presence penalty
**Response Attributes:**
- `gen_ai.response.id` - Response ID from provider
- `gen_ai.response.model` - Actual model used (may differ from request)
- `gen_ai.response.finish_reasons` - Array of finish reasons
**Tool/Function Calls:**
- `llm.tools` - JSON-serialized tool definitions
- `llm.output_messages.{choice}.message.tool_calls.{index}.tool_call.id` - Tool call ID
- `llm.output_messages.{choice}.message.tool_calls.{index}.tool_call.function.name` - Function name
- `llm.output_messages.{choice}.message.tool_calls.{index}.tool_call.function.arguments` - Function arguments
**Cost Attributes (granular):**
- `gen_ai.usage.cost.total` - Total cost
- `gen_ai.usage.cost.prompt` - Prompt tokens cost
- `gen_ai.usage.cost.completion` - Completion tokens cost
- `gen_ai.usage.cost.reasoning` - Reasoning tokens cost (o1 models)
- `gen_ai.usage.cost.cache_read` - Cache read cost (Anthropic)
- `gen_ai.usage.cost.cache_write` - Cache write cost (Anthropic)
**Streaming Attributes:**
- `gen_ai.server.ttft` - Time to First Token (seconds) for streaming responses
- `gen_ai.streaming.token_count` - Total number of chunks in streaming response
- `gen_ai.usage.prompt_tokens` - Actual prompt tokens (extracted from final chunk)
- `gen_ai.usage.completion_tokens` - Actual completion tokens (extracted from final chunk)
- `gen_ai.usage.total_tokens` - Total tokens (extracted from final chunk)
- `gen_ai.usage.cost.total` - Total cost for streaming request
- `gen_ai.usage.cost.prompt` - Prompt tokens cost for streaming request
- `gen_ai.usage.cost.completion` - Completion tokens cost for streaming request
- All granular cost attributes (reasoning, cache_read, cache_write) also available for streaming
**Content Events (opt-in):**
- `gen_ai.prompt.{index}` events with role and content
- `gen_ai.completion.{index}` events with role and content
**Additional:**
- Database, vector DB, and API attributes from MCP instrumentation
## Configuration
### Environment Variables
```bash
# Required
OTEL_SERVICE_NAME=my-app
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
# Optional
OTEL_EXPORTER_OTLP_HEADERS=x-api-key=secret
GENAI_ENABLE_GPU_METRICS=true
GENAI_ENABLE_COST_TRACKING=true
GENAI_ENABLE_MCP_INSTRUMENTATION=true
GENAI_GPU_COLLECTION_INTERVAL=5 # GPU metrics collection interval in seconds (default: 5)
OTEL_SERVICE_INSTANCE_ID=instance-1 # Optional service instance id
OTEL_ENVIRONMENT=production # Optional environment
OTEL_EXPORTER_OTLP_TIMEOUT=60 # Timeout for OTLP exporter in seconds (default: 60)
OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf # Protocol: "http/protobuf" (default) or "grpc"
# Semantic conventions (NEW)
OTEL_SEMCONV_STABILITY_OPT_IN=gen_ai # "gen_ai" for new conventions only, "gen_ai/dup" for dual emission
GENAI_ENABLE_CONTENT_CAPTURE=false # WARNING: May capture sensitive data. Enable with caution.
GENAI_CONTENT_MAX_LENGTH=200 # Max chars for captured content (0 = no limit). Only applies when content capture is enabled.
# Logging configuration
GENAI_OTEL_LOG_LEVEL=INFO # DEBUG, INFO, WARNING, ERROR, CRITICAL. Logs are written to 'logs/genai_otel.log' with rotation (10 files, 10MB each).
# Error handling
GENAI_FAIL_ON_ERROR=false # true to fail fast, false to continue on errors
```
### Programmatic Configuration
```python
import genai_otel
genai_otel.instrument(
    service_name="my-app",
    endpoint="http://localhost:4318",
    enable_gpu_metrics=True,
    enable_cost_tracking=True,
    enable_mcp_instrumentation=True,
)
```
### Sample Environment File (`sample.env`)
A `sample.env` file in the project root contains commented-out examples of all supported environment variables, along with their default values or expected formats. Copy it to `.env` and uncomment or modify the variables to configure the instrumentation for your needs.
## Advanced Features
### Session and User Tracking
Track user sessions and identify users across multiple LLM requests for better analytics, debugging, and cost attribution.
**Configuration:**
```python
import genai_otel
from genai_otel import OTelConfig

# Define extractor functions
def extract_session_id(instance, args, kwargs):
    """Extract session ID from request metadata."""
    # Option 1: From kwargs metadata
    metadata = kwargs.get("metadata", {})
    return metadata.get("session_id")

    # Option 2: From custom headers
    # headers = kwargs.get("headers", {})
    # return headers.get("X-Session-ID")

    # Option 3: From thread-local storage
    # import threading
    # return getattr(threading.current_thread(), "session_id", None)

def extract_user_id(instance, args, kwargs):
    """Extract user ID from request metadata."""
    metadata = kwargs.get("metadata", {})
    return metadata.get("user_id")

# Configure with extractors
config = OTelConfig(
    service_name="my-rag-app",
    endpoint="http://localhost:4318",
    session_id_extractor=extract_session_id,
    user_id_extractor=extract_user_id,
)

genai_otel.instrument(config)
```
**Usage:**
```python
from openai import OpenAI
client = OpenAI()
# Pass session and user info via metadata
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is OpenTelemetry?"}],
    extra_body={"metadata": {"session_id": "sess_12345", "user_id": "user_alice"}},
)
```
**Span Attributes Added:**
- `session.id` - Unique session identifier for tracking conversations
- `user.id` - User identifier for per-user analytics and cost tracking
**Use Cases:**
- Track multi-turn conversations across requests
- Analyze usage patterns per user
- Debug session-specific issues
- Calculate per-user costs and quotas
- Build user-specific dashboards
### CrewAI Multi-Agent Orchestration (Zero-Code Context Propagation)
Get complete observability for CrewAI multi-agent applications with **automatic context propagation** across threads and async execution—no manual context management required!
**Key Features:**
- ✨ **Zero-code setup** - Just instrument and go, no wrapper functions needed
- 🔗 **Complete trace hierarchy** - Crew → Agent → Task → LLM calls automatically linked
- 🧵 **Automatic ThreadPoolExecutor patching** - Context propagates to worker threads
- 📊 **Three span types** with rich attributes:
- `crewai.crew.execution` - Crew-level attributes (process type, agent/task counts, tools, inputs)
- `crewai.task.execution` - Task attributes (description, expected output, assigned agent)
- `crewai.agent.execution` - Agent attributes (role, goal, backstory, LLM model)
**Setup (One-time):**
```python
from genai_otel import setup_auto_instrumentation, OTelConfig
# Setup BEFORE importing CrewAI
otel_config = OTelConfig(
    service_name="my-crewai-app",
    endpoint="http://localhost:4318",
    enabled_instrumentors=["crewai", "openai", "ollama"],  # Enable CrewAI + LLM providers
    enable_cost_tracking=True,
)

setup_auto_instrumentation(otel_config)
# Now import and use CrewAI normally
from crewai import Crew, Agent, Task
```
**Usage (Zero Code Changes):**
```python
# Define your agents
researcher = Agent(
    role="Senior Researcher",
    goal="Research and analyze topics thoroughly",
    backstory="Expert researcher with 10 years experience",
    llm="gpt-4",  # Or "ollama:llama2", etc.
)

writer = Agent(
    role="Technical Writer",
    goal="Create clear, comprehensive documentation",
    backstory="Professional technical writer",
    llm="ollama:llama2",
)

# Define your tasks
research_task = Task(
    description="Research OpenTelemetry best practices",
    expected_output="Comprehensive research report",
    agent=researcher,
)

writing_task = Task(
    description="Write a blog post based on research",
    expected_output="Well-structured blog post",
    agent=writer,
)

# Create and run the crew
crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, writing_task],
    process="sequential",
)

# Just call kickoff() normally - context propagates automatically!
result = crew.kickoff()

# ✅ Complete traces with proper parent-child relationships
# ✅ All LLM calls (GPT-4, Ollama) are linked to tasks and agents
# ✅ Cost tracking works automatically
```
**FastAPI Integration:**
```python
from fastapi import FastAPI
import asyncio

from genai_otel import setup_auto_instrumentation, OTelConfig

# Setup instrumentation FIRST
setup_auto_instrumentation(OTelConfig(...))
# Then import CrewAI
from crewai import Crew
app = FastAPI()
crew = Crew(...)
@app.post("/analyze")
async def analyze(query: str):
    # No manual context management needed!
    result = await asyncio.to_thread(crew.kickoff, inputs={"query": query})
    return {"result": result}

# ✅ HTTP request → Crew → Agents → Tasks → LLM calls all properly traced
```
**What You Get in Traces:**
```json
{
  "spans": [
    {
      "name": "crewai.crew.execution",
      "attributes": {
        "crewai.process.type": "sequential",
        "crewai.agent_count": 2,
        "crewai.task_count": 2,
        "crewai.agent.roles": ["Senior Researcher", "Technical Writer"],
        "crewai.tools": ["search", "scrape", "write_file"]
      }
    },
    {
      "name": "crewai.agent.execution",
      "parent": "crewai.crew.execution",
      "attributes": {
        "crewai.agent.role": "Senior Researcher",
        "crewai.agent.goal": "Research and analyze topics thoroughly",
        "crewai.agent.llm_model": "gpt-4"
      }
    },
    {
      "name": "crewai.task.execution",
      "parent": "crewai.agent.execution",
      "attributes": {
        "crewai.task.description": "Research OpenTelemetry best practices",
        "crewai.task.agent_role": "Senior Researcher"
      }
    },
    {
      "name": "openai.chat.completion",
      "parent": "crewai.task.execution",
      "attributes": {
        "gen_ai.request.model": "gpt-4",
        "gen_ai.response.model": "gpt-4",
        "gen_ai.usage.prompt_tokens": 1250,
        "gen_ai.usage.completion_tokens": 850,
        "gen_ai.usage.cost.total": 0.0325
      }
    }
  ]
}
```
**Before vs After:**
| Aspect | Before (Manual) | After (Zero-Code) |
|--------|-----------------|-------------------|
| **Code Changes** | `run_in_thread_with_context()` wrapper | None - just setup |
| **Trace Continuity** | Manual propagation | Automatic |
| **FastAPI Integration** | Custom middleware | Works automatically |
| **Maintenance** | Update on CrewAI changes | Self-healing |
**Technical Details:**
- Instruments `Crew.kickoff()`, `Task.execute_sync()`, `Task.execute_async()`, `Agent.execute_task()`
- Patches `concurrent.futures.ThreadPoolExecutor.submit()` for automatic context propagation
- Thread-safe using OpenTelemetry's context API
- Works with CrewAI's internal threading model
- Compatible with all async frameworks (FastAPI, Flask, etc.)
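OpenTelemetry's context API is built on Python's `contextvars`, so the `ThreadPoolExecutor.submit()` patch can be illustrated with `contextvars` directly (a minimal sketch, not the library's exact code):

```python
import concurrent.futures
import contextvars

_original_submit = concurrent.futures.ThreadPoolExecutor.submit

def _submit_with_context(self, fn, *args, **kwargs):
    # Snapshot the caller's context (which carries the active OTel span)
    # and run the task inside that snapshot on the worker thread.
    ctx = contextvars.copy_context()
    return _original_submit(self, lambda: ctx.run(fn, *args, **kwargs))

concurrent.futures.ThreadPoolExecutor.submit = _submit_with_context
```

Without the patch, a worker thread starts with an empty context and any span active in the caller is lost; with it, the worker sees the same context the caller had at submit time.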
### RAG and Embedding Attributes
Enhanced observability for Retrieval-Augmented Generation (RAG) workflows, including embedding generation and document retrieval.
**Helper Methods:**
The `BaseInstrumentor` provides helper methods to add RAG-specific attributes to your spans:
```python
from opentelemetry import trace
from genai_otel.instrumentors.base import BaseInstrumentor

# Get your instrumentor instance (or create spans manually)
tracer = trace.get_tracer(__name__)

# 1. Embedding Attributes
with tracer.start_as_current_span("embedding.create") as span:
    # Your embedding logic
    embedding_response = client.embeddings.create(
        model="text-embedding-3-small",
        input="OpenTelemetry provides observability"
    )

    # Add embedding attributes (if using BaseInstrumentor)
    # instrumentor.add_embedding_attributes(
    #     span,
    #     model="text-embedding-3-small",
    #     input_text="OpenTelemetry provides observability",
    #     vector=embedding_response.data[0].embedding
    # )

    # Or manually set attributes
    span.set_attribute("embedding.model_name", "text-embedding-3-small")
    span.set_attribute("embedding.text", "OpenTelemetry provides observability"[:500])
    span.set_attribute("embedding.vector.dimension", len(embedding_response.data[0].embedding))

# 2. Retrieval Attributes
with tracer.start_as_current_span("retrieval.search") as span:
    # Your retrieval logic
    retrieved_docs = [
        {
            "id": "doc_001",
            "score": 0.95,
            "content": "OpenTelemetry is an observability framework...",
            "metadata": {"source": "docs.opentelemetry.io", "category": "intro"}
        },
        # ... more documents
    ]

    # Add retrieval attributes (if using BaseInstrumentor)
    # instrumentor.add_retrieval_attributes(
    #     span,
    #     documents=retrieved_docs,
    #     query="What is OpenTelemetry?",
    #     max_docs=5
    # )

    # Or manually set attributes
    span.set_attribute("retrieval.query", "What is OpenTelemetry?"[:500])
    span.set_attribute("retrieval.document_count", len(retrieved_docs))
    for i, doc in enumerate(retrieved_docs[:5]):  # Limit to 5 docs
        prefix = f"retrieval.documents.{i}.document"
        span.set_attribute(f"{prefix}.id", doc["id"])
        span.set_attribute(f"{prefix}.score", doc["score"])
        span.set_attribute(f"{prefix}.content", doc["content"][:500])
        # Add metadata
        for key, value in doc.get("metadata", {}).items():
            span.set_attribute(f"{prefix}.metadata.{key}", str(value))
```
**Embedding Attributes:**
- `embedding.model_name` - Embedding model used
- `embedding.text` - Input text (truncated to 500 chars)
- `embedding.vector` - Embedding vector (optional, if configured)
- `embedding.vector.dimension` - Vector dimensions
**Retrieval Attributes:**
- `retrieval.query` - Search query (truncated to 500 chars)
- `retrieval.document_count` - Number of documents retrieved
- `retrieval.documents.{i}.document.id` - Document ID
- `retrieval.documents.{i}.document.score` - Relevance score
- `retrieval.documents.{i}.document.content` - Document content (truncated to 500 chars)
- `retrieval.documents.{i}.document.metadata.*` - Custom metadata fields
**Safeguards:**
- Text content truncated to 500 characters to avoid span size explosion
- Document count limited to 5 by default (configurable via `max_docs`)
- Metadata values truncated to prevent excessive attribute counts
**Complete RAG Workflow Example:**
See `examples/phase4_session_rag_tracking.py` for a comprehensive demonstration of:
- Session and user tracking across RAG pipeline
- Embedding attribute capture
- Retrieval attribute capture
- End-to-end RAG workflow with full observability
**Use Cases:**
- Monitor retrieval quality and relevance scores
- Debug RAG pipeline performance
- Track embedding model usage
- Analyze document retrieval patterns
- Optimize vector search configurations
## Example: Full-Stack GenAI App
```python
import genai_otel
genai_otel.instrument()
import openai
import pinecone
import redis
import psycopg2
# All of these are automatically instrumented:
# Cache check
cache = redis.Redis().get('key')
# Vector search
pinecone_index = pinecone.Index("embeddings")
results = pinecone_index.query(vector=[...], top_k=5)
# Database query
conn = psycopg2.connect("dbname=mydb")
cursor = conn.cursor()
cursor.execute("SELECT * FROM context")
# LLM call with full context
client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[...]
)
# You get:
# ✓ Distributed traces across all services
# ✓ Cost tracking for the LLM call
# ✓ Performance metrics for DB, cache, vector DB
# ✓ GPU metrics if using local models
# ✓ Complete observability with zero manual instrumentation
```
## Backend Integration
Works with any OpenTelemetry-compatible backend:
- Jaeger, Zipkin
- Prometheus, Grafana
- Datadog, New Relic, Honeycomb
- AWS X-Ray, Google Cloud Trace
- Elastic APM, Splunk
- Self-hosted OTEL Collector
## Project Structure
```bash
genai-otel-instrument/
├── setup.py
├── MANIFEST.in
├── README.md
├── LICENSE
├── example_usage.py
└── genai_otel/
    ├── __init__.py
    ├── config.py
    ├── auto_instrument.py
    ├── cli.py
    ├── cost_calculator.py
    ├── gpu_metrics.py
    ├── instrumentors/
    │   ├── __init__.py
    │   ├── base.py
    │   └── (other instrumentor files)
    └── mcp_instrumentors/
        ├── __init__.py
        ├── manager.py
        └── (other mcp files)
```
## Roadmap
### v0.2.0 Release (In Progress) - Q1 2026
We're implementing significant enhancements for this release, focusing on evaluation metrics and safety guardrails alongside completing OpenTelemetry semantic convention compliance.
**✅ Completed Features:**
- **PII Detection** - Automatic detection and handling of personally identifiable information with Microsoft Presidio
- Three modes: detect, redact, or block
- GDPR, HIPAA, and PCI-DSS compliance modes
- 15+ entity types (email, phone, SSN, credit cards, IP addresses, etc.)
- Span attributes and metrics for PII detections
- Example: `examples/pii_detection_example.py`
- **Toxicity Detection** - Monitor and alert on toxic or harmful content
- Dual detection methods: Perspective API (cloud) and Detoxify (local)
- Six toxicity categories: toxicity, severe_toxicity, identity_attack, insult, profanity, threat
- Automatic fallback from Perspective API to Detoxify
- Configurable threshold and blocking mode
- Batch processing support
- Span attributes and metrics for toxicity detections
- Example: `examples/toxicity_detection_example.py`
- **Bias Detection** - Identify demographic and other biases in prompts and responses
- 8 bias types: gender, race, ethnicity, religion, age, disability, sexual_orientation, political
- Pattern-based detection (always available, no external dependencies)
- Optional ML-based detection with Fairlearn
- Configurable threshold and blocking mode
- Batch processing and statistics generation
- Span attributes and metrics for bias detections
- Example: `examples/bias_detection_example.py`
- **Prompt Injection Detection** - Protect against prompt manipulation attacks
- 6 injection types: instruction_override, role_playing, jailbreak, context_switching, system_extraction, encoding_obfuscation
- Pattern-based detection (always available)
- Configurable threshold and blocking mode
- Automatic security blocking for high-risk prompts
- Span attributes and metrics for injection attempts
- Example: `examples/comprehensive_evaluation_example.py`
- **Restricted Topics Detection** - Monitor and block sensitive topics
- 9 topic categories: medical_advice, legal_advice, financial_advice, violence, self_harm, illegal_activities, adult_content, personal_information, political_manipulation
- Pattern-based topic classification
- Configurable topic blacklists
- Industry-specific content filters
- Span attributes and metrics for topic violations
- Example: `examples/comprehensive_evaluation_example.py`
- **Hallucination Detection** - Track factual accuracy and groundedness
- Factual claim extraction and validation
- Hedge word detection for uncertainty
- Citation and attribution tracking
- Context contradiction detection
- Unsupported claims identification
- Span attributes and metrics for hallucination risks
- Example: `examples/comprehensive_evaluation_example.py`
**Evaluation Support Coverage:**
15 out of 31 providers (48%) now support full evaluation metrics:
**Direct Provider Support (12 providers):**
- ✅ OpenAI
- ✅ Anthropic
- ✅ HuggingFace
- ✅ Ollama
- ✅ Google AI
- ✅ Hyperbolic
- ✅ SambaNova
- ✅ Cohere (NEW)
- ✅ Mistral AI (NEW)
- ✅ Groq (NEW)
- ✅ Azure OpenAI (NEW)
- ✅ AWS Bedrock (NEW)
**Via Span Enrichment (3 providers):**
- ✅ LiteLLM (NEW) - Enables evaluation for 100+ proxied providers
- ✅ Smolagents (NEW) - HuggingFace agents framework
- ✅ MCP (NEW) - Model Context Protocol tools
**Implementation:**
```python
import genai_otel
# Enable all 6 evaluation features
genai_otel.instrument(
    # Detection & Safety
    enable_pii_detection=True,
    enable_toxicity_detection=True,
    enable_bias_detection=True,
    enable_prompt_injection_detection=True,
    enable_restricted_topics=True,
    enable_hallucination_detection=True,

    # Configure thresholds (defaults shown - adjust based on your needs)
    pii_threshold=0.5,  # Presidio scores: 0.5-0.7 for valid PII
    toxicity_threshold=0.7,
    bias_threshold=0.4,  # Pattern matching scores: 0.3-0.5 for clear bias
    prompt_injection_threshold=0.5,  # Injection patterns score: 0.5-0.7
    restricted_topics_threshold=0.5,
    hallucination_threshold=0.7,
)
```
**All Features Completed! ✅**
- **Restricted Topics** - Block sensitive or inappropriate topics
- Configurable topic blacklists (legal, medical, financial advice)
- Industry-specific content filters
- Topic detection with confidence scoring
- Custom topic definition support
- **Sensitive Information Protection** - ✅ COMPLETED - Prevent PII leakage
- ✅ PII detection (emails, phone numbers, SSN, credit cards, IPs, and more)
- ✅ Automatic redaction or blocking modes
- ✅ Compliance modes (GDPR, HIPAA, PCI-DSS)
- ✅ Data leak prevention metrics
- ✅ Microsoft Presidio integration with regex fallback
**Implementation:**
```python
import genai_otel
# Configure guardrails (PII Detection is LIVE!)
genai_otel.instrument(
    # PII Detection (✅ AVAILABLE NOW)
    enable_pii_detection=True,
    pii_mode="redact",  # "detect", "redact", or "block"
    pii_threshold=0.5,  # Default: 0.5 (Presidio typical scores: 0.5-0.7)
    pii_gdpr_mode=True,  # Enable GDPR compliance
    pii_hipaa_mode=True,  # Enable HIPAA compliance
    pii_pci_dss_mode=True,  # Enable PCI-DSS compliance

    # Additional guardrails:
    enable_prompt_injection_detection=True,
    enable_restricted_topics=True,
    restricted_topics=["medical_advice", "legal_advice", "financial_advice"],
)
```
**Metrics Added:**
- ✅ `genai.evaluation.pii.detections` - PII detection events (by location and mode)
- ✅ `genai.evaluation.pii.entities` - PII entities detected by type
- ✅ `genai.evaluation.pii.blocked` - Requests/responses blocked due to PII
- ✅ `genai.evaluation.toxicity.detections` - Toxicity detection events
- ✅ `genai.evaluation.toxicity.categories` - Toxicity by category
- ✅ `genai.evaluation.toxicity.blocked` - Blocked due to toxicity
- ✅ `genai.evaluation.toxicity.score` - Toxicity score distribution (histogram)
- ✅ `genai.evaluation.bias.detections` - Bias detection events (by locati | text/markdown | null | Kshitij Thakkar <kshitijthakkar@rocketmail.com> | null | null | AGPL-3.0-or-later | opentelemetry, observability, llm, genai, instrumentation, tracing, metrics, monitoring | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: System :: Monitoring",
"License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)",
"Operating System :: OS Independent",
"Programming L... | [] | null | null | >=3.9 | [] | [] | [] | [
"opentelemetry-api<2.0.0,>=1.20.0",
"opentelemetry-sdk<2.0.0,>=1.20.0",
"opentelemetry-instrumentation>=0.41b0",
"opentelemetry-semantic-conventions<1.0.0,>=0.45b0",
"opentelemetry-exporter-otlp>=1.20.0",
"opentelemetry-instrumentation-requests>=0.41b0",
"opentelemetry-instrumentation-httpx>=0.41b0",
... | [] | [] | [] | [
"Homepage, https://github.com/Mandark-droid/genai_otel_instrument",
"Repository, https://github.com/Mandark-droid/genai_otel_instrument",
"Documentation, https://github.com/Mandark-droid/genai_otel_instrument#readme",
"Issues, https://github.com/Mandark-droid/genai_otel_instrument/issues",
"Changelog, https... | twine/6.2.0 CPython/3.11.14 | 2026-02-19T10:17:32.332098 | genai_otel_instrument-0.1.40.tar.gz | 2,195,795 | 8a/57/b800d71d7110e151cd09d942905f2d28add7061c5d02470c7cf84b6df9e4/genai_otel_instrument-0.1.40.tar.gz | source | sdist | null | false | 5b947d90ac89c28f3d1506311f0090b2 | 2301e9f87f7f5408c2a8df7825b0320539560f14267255baf886327c266ec459 | 8a57b800d71d7110e151cd09d942905f2d28add7061c5d02470c7cf84b6df9e4 | null | [
"LICENSE"
] | 236 |
2.4 | aspose-pdf | 26.1.0 | Aspose.PDF for Python via .NET is a wrapper for Python to perform document management; it can easily be used to generate, modify, convert, render, secure and print documents without using Adobe Acrobat. | [Product Page](https://products.aspose.com/pdf/python-net/) | [Documentation](https://docs.aspose.com/pdf/python-net/) | [Examples](https://github.com/aspose-pdf/Aspose.PDF-for-Python-via-.NET/) | [Demos](https://products.aspose.app/pdf/family) | [Blog](https://blog.aspose.com/categories/aspose.pdf-product-family/) | [API Reference](https://reference.aspose.com/pdf/python-net/) | [Search](https://search.aspose.com/) | [Free Support](https://forum.aspose.com/c/pdf) | [Temporary License](https://purchase.aspose.com/temporary-license)
Try our [Free Online Apps](https://products.aspose.app/pdf/applications) demonstrating some of the most popular Aspose.PDF functionality.
**Aspose.PDF for Python via .NET** is a Python wrapper that enables developers to add PDF processing capabilities to their applications. It can be used to generate, read, convert and manipulate PDF files without using Adobe Acrobat. **Aspose.PDF for Python via .NET** supports a range of document processing tasks such as form processing, getting and setting metadata, text and page manipulation, managing annotations, adding or removing bookmarks and watermarks, handling attachments, custom font handling and much more.
Check out the [Landing Pages](https://products.aspose.com/pdf/python-net/) of **Aspose.PDF for Python via .NET** for a more detailed description of the features and possibilities of the library.
## General PDF Features
- Supports most established PDF standards and PDF specifications.
- Ability to read & export PDFs in multiple image formats including BMP, GIF, JPEG & PNG.
- Set basic information (e.g. author, creator) of the PDF document.
- Configure PDF Page properties (e.g. width, height, cropbox, bleedbox etc.).
- Set page numbering, bookmark level, page sizes etc.
- Ability to work with text, paragraphs, headings, hyperlinks, graphs, attachments etc.
## Supported PDF versions
**Aspose.PDF for Python via .NET** supports PDF versions 1.2, 1.3, 1.4, 1.5, 1.6, 1.7 and 2.0.
## Conversion Features
The **Aspose.PDF for Python via .NET** library lets you quickly and easily convert your PDF documents to the most popular formats and vice versa.
- Convert PDF to Word, Excel, and PowerPoint.
- Convert PDF to Images formats.
- Convert PDF file to HTML format and vice versa.
- Convert PDF to EPUB, Text, XPS, etc.
- Convert EPUB, Markdown, Text, XPS, PostScript, XML, LaTeX to PDF.
## Package Features
- Add, search, extract and replace text in PDF files.
- Add/delete, extract and replace images.
- Insert, delete, split PDF pages.
- Set and get XMP metadata.
- Validate (PDF/A-1a, PDF/A-1b).
- Work with bookmarks, annotations, PDF forms, stamps, watermarks and more.
## Supported File Formats
The following table indicates the file formats that **Aspose.PDF for Python via .NET** can load and save.
|**Format**|**Description**|**Load**|**Save**|**Remarks**|
| :- | :- | :- | :- | :- |
|[PDF](https://docs.fileformat.com/pdf/)|Portable Document Format|**Yes**|**Yes**| |
|[CGM](https://docs.fileformat.com/page-description-language/cgm/)|Computer Graphics Metafile for 2D vector graphics|**Yes**|**No**| |
|[EPUB](https://docs.fileformat.com/ebook/epub/)|Ebook file format|**Yes**|**Yes**| |
|[HTML](https://docs.fileformat.com/web/html/)|HTML Format|**Yes**|**Yes**| |
|[TeX](https://docs.fileformat.com/page-description-language/tex/)|LaTeX typesetting file format|**Yes**|**Yes**| |
|[MHT](https://docs.fileformat.com/web/mhtml/)|MHTML Document|**Yes**|**No**| |
|[PCL](https://docs.fileformat.com/page-description-language/pcl/)|Printer Control Language Files|**Yes**|**No**| |
|[PS](https://docs.fileformat.com/page-description-language/ps/)|Postscript Files|**Yes**|**No**| |
|[SVG](https://docs.fileformat.com/page-description-language/svg/)|Scalable Vector Graphics (An XML-based vector image format)|**Yes**|**Yes**| |
|[XML](https://docs.fileformat.com/web/xml/)|XML Format|**Yes**|**Yes**| |
|[XPS](https://docs.fileformat.com/page-description-language/xps/)|XPS Documents|**Yes**|**Yes**| |
|[XSLFO](https://docs.fileformat.com/page-description-language/xslfo/)|XSL-FO, the part of XSL used for the transformation and formatting of XML data|**Yes**|**No**| |
|[MD](https://docs.fileformat.com/word-processing/md/)|Markdown Format|**Yes**|**No**| |
|[XLS](https://docs.fileformat.com/spreadsheet/xls/)|Saves the document in the Microsoft Excel SpreadSheet|**No**|**Yes**| |
|[XLSX](https://docs.fileformat.com/spreadsheet/xlsx/)|Saves the document in the Microsoft Excel 2007 format|**No**|**Yes**| |
|[PPTX](https://docs.fileformat.com/presentation/pptx/)|Saves the document in the Microsoft PowerPoint Presentations format|**No**|**Yes**| |
|[DOC](https://docs.fileformat.com/word-processing/doc/)|Saves the document in the Microsoft Word format|**No**|**Yes**| |
|[DOCX](https://docs.fileformat.com/word-processing/docx/)|Saves the document in the Microsoft Word format|**No**|**Yes**| |
|[MobiXML](https://docs.fileformat.com/ebook/mobi/)|Saves the document in eBook MobiXML Standard format|**No**|**Yes**| |
|[JPEG](https://docs.fileformat.com/image/jpeg/)|Saves the document in JPEG Format|**Yes**|**Yes**| |
|[EMF](https://docs.fileformat.com/image/emf/)|Enhanced metafile format (EMF)|**Yes**|**Yes**| |
|[PNG](https://docs.fileformat.com/image/png/)|Saves the document in PNG Format|**Yes**|**Yes**| |
|[BMP](https://docs.fileformat.com/image/bmp/)|Saves the document in BMP Format|**Yes**|**Yes**| |
|[GIF](https://docs.fileformat.com/image/gif/)|Graphic Interchange Format|**No**|**Yes**| |
|[TIFF](https://docs.fileformat.com/image/tiff/)|Saves the document as Single or Multi-Page TIFF Image|**Yes**|**Yes**| |
|[Text](https://docs.fileformat.com/word-processing/txt/)|Saves the document in Text Format|**Yes**|**Yes**| |
## Platform Independence
**Aspose.PDF for Python via .NET** can be used to develop 32-bit and 64-bit Python applications for different operating systems (such as Windows, Linux, macOS) where Python 3.9 or later is installed.
## Get Started
Run ```pip install aspose-pdf``` to install the package. If you already have **Aspose.PDF for Python via .NET** installed and want to upgrade to the latest version, run ```pip install --upgrade aspose-pdf```.
If necessary, you can download **Aspose.PDF for Python via .NET** from [releases.aspose.com](https://releases.aspose.com/pdf/pythonnet/) and install it using the command ```pip install <path_to_whl>```.
To learn more about **Aspose.PDF for Python via .NET** , including its basic requirements and features, please refer to the [Aspose.PDF for Python via .NET Documentation](https://docs.aspose.com/pdf/python-net/) or visit the [GitHub repository](https://github.com/aspose-pdf/Aspose.PDF-for-Python-via-.NET/).
## Create a PDF file from scratch in Python
In the next code snippet, we create a PDF document from scratch containing the text “Hello, World!”. After installing **Aspose.PDF for Python via .NET** in your environment, you can run the code sample below to see how the Aspose.PDF API works.
The code snippet below follows these steps:
1. Instantiate a Document object.
1. Add a Page to the document object.
1. Create a TextFragment object.
1. Add TextFragment to Paragraph collection of the page.
1. Save the resultant PDF document.
The following code snippet is a “Hello, World!” program that demonstrates the **Aspose.PDF for Python via .NET** API in action:
```python
import aspose.pdf as ap
# Initialize document object
document = ap.Document()
# Add page
page = document.pages.add()
# Initialize textfragment object
text_fragment = ap.text.TextFragment("Hello, world!")
# Add text fragment to new page
page.paragraphs.add(text_fragment)
# Save updated PDF
document.save("output.pdf")
```
## Example of converting HTML to PDF
**Aspose.PDF for Python via .NET** is a PDF manipulation API that lets you convert existing HTML documents to PDF format. The HTML-to-PDF conversion process can be flexibly customized.
The code snippet below follows these steps:
1. Create an instance of the HtmlLoadOptions object.
1. Initialize Document object.
1. Save output PDF document by calling Document.Save() method.
```python
import aspose.pdf as ap
# Instantiate an object of HtmlLoadOptions
options = ap.HtmlLoadOptions()
# Convert HTML to PDF
document = ap.Document("input.html", options)
# Save PDF
document.save("output.pdf")
```
## Example of converting PDF to SVG
**Aspose.PDF for Python via .NET** supports converting PDF documents to SVG format. To accomplish this, the SvgSaveOptions class has been introduced into the Aspose.PDF namespace. Instantiate an object of SvgSaveOptions and pass it as the second argument to the Document.Save(..) method.
The code snippet below follows these steps:
1. Create an object of the Document class.
1. Create SvgSaveOptions object with needed settings.
1. Call the Document.Save() method, passing it the SvgSaveOptions object, to convert the PDF document to SVG.
```python
import aspose.pdf as ap
# Open PDF document
document = ap.Document("input.pdf")
# Instantiate an object of SvgSaveOptions
saveOptions = ap.SvgSaveOptions()
# Do not compress SVG image to Zip archive
saveOptions.compress_output_to_zip_archive = False
saveOptions.treat_target_file_name_as_directory = True
# Save the output in SVG files
document.save("output.svg", saveOptions)
```
## Merge PDF Files
Merge multiple PDFs into a single file in Python with Aspose.PDF programmatically. The files are merged such that the pages of the second document are appended to the end of the first.
The code snippet below follows these steps:
1. Open first document.
1. Open second document.
1. Add pages of second document to the first.
1. Save concatenated output file.
```python
import aspose.pdf as ap
# Open first document
document1 = ap.Document("input_1.pdf")
# Open second document
document2 = ap.Document("input_2.pdf")
# Add pages of second document to the first
document1.pages.add(document2.pages)
# Save concatenated output file
document1.save("output.pdf")
```
## Print PDF to XPS printer
You can print a PDF file to an XPS printer, or some other soft printer for that matter, using the PdfViewer class.
The code snippet below follows these steps:
1. Create an object of the PdfViewer class.
1. Open the PDF file using the bind_pdf method.
1. Set different print settings using the PrinterSettings and PageSettings classes.
1. Set the printer_name property to the XPS or other printer.
1. Print document using the print_document_with_settings method.
```python
import aspose.pdf as ap
import aspose.pydrawing as drawing
# Create PdfViewer object
viewer = ap.facades.PdfViewer()
# Open input PDF file
viewer.bind_pdf("input.pdf")
# Set attributes for printing
# Print the file with adjusted size
viewer.auto_resize = True
# Print the file with adjusted rotation
viewer.auto_rotate = True
# Do not produce the page number dialog when printing
viewer.print_page_dialog = False
# Create objects for printer and page settings
ps = ap.printing.PrinterSettings()
pgs = ap.printing.PageSettings()
# Set XPS/PDF printer name
ps.printer_name = "Microsoft XPS Document Writer"
# Or set the PDF printer
# ps.printer_name = "Adobe PDF"
# Set PageSize(if required)
pgs.paper_size = ap.printing.PaperSize("A4", 827, 1169)
# Set PageMargins(if required)
pgs.margins = ap.devices.Margins(0, 0, 0, 0)
# Print document using printer and page settings
viewer.print_document_with_settings(pgs, ps)
# Close the PDF file after printing
viewer.close()
```
[Product Page](https://products.aspose.com/pdf/python-net/) | [Documentation](https://docs.aspose.com/pdf/python-net/) | [Examples](https://github.com/aspose-pdf/Aspose.PDF-for-Python-via-.NET/) | [Demos](https://products.aspose.app/pdf/family) | [Blog](https://blog.aspose.com/categories/aspose.pdf-product-family/) | [API Reference](https://reference.aspose.com/pdf/python-net/) | [Search](https://search.aspose.com/) | [Free Support](https://forum.aspose.com/c/pdf) | [Temporary License](https://purchase.aspose.com/temporary-license)
| text/markdown | Aspose | null | null | null | Other/Proprietary License | Aspose.PDF, PDF, XFA, XPS, TIFF, PCL, SVG, HTML, XML, XSL-FO, FDF, XFDF, PDF/A, form, Portfolio, EPUB, PSD, to, XLS, PDF-to-DOC | [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"Intended Audience :: Education",
"License :: Other/Proprietary License",
"Operating System :: MacOS :: MacOS X",
"Operating System :: Microsoft :: Windows",
"Operating System :: POSIX :: Linux",
"Operating System :: Uni... | [
"macos_arm64"
] | https://products.aspose.com/pdf/python-net/ | null | <3.14,>=3.9 | [] | [] | [] | [] | [] | [] | [] | [
"Blog, https://blog.aspose.com/categories/aspose.pdf-product-family/",
"Demos, https://products.aspose.app/pdf/family",
"Docs, https://docs.aspose.com/pdf/python-net/",
"Examples, https://github.com/aspose-pdf/Aspose.PDF-for-Python-via-.NET/",
"Free Support, https://forum.aspose.com/c/pdf",
"API Reference... | twine/6.2.0 CPython/3.13.0 | 2026-02-19T10:17:23.699643 | aspose_pdf-26.1.0-py3-none-macosx_11_0_arm64.whl | 93,835,950 | 3c/b4/d4d009bfe75d4552d90f670d1c3cc4cbe6b069f689a9e24bfea2df908e2b/aspose_pdf-26.1.0-py3-none-macosx_11_0_arm64.whl | py3 | bdist_wheel | null | false | 01db7f6016d260d8d04bfa1feacd5a87 | a501b4975fe43053a4dcabfc286acc6504b99680a5bbb990f366ec5b4de399cb | 3cb4d4d009bfe75d4552d90f670d1c3cc4cbe6b069f689a9e24bfea2df908e2b | null | [] | 1,023 |
2.4 | TiRiFiG | 2.0.2 | Development Status :: 4 - Beta | ================================================
TiRiFiG: A graphical 3D kinematic modelling tool
================================================
|PyPI Version|
TiRiFiC_ is a 3D kinematic modelling tool used to model resolved spectroscopic
observations of rotating discs in terms of the tilted-ring model with varying complexity.
The front-end (TiRiFiG, Tilted-Ring-Fitting-GUI), part of the toolkit, is an aid to
enable the user to perform the modelling process interactively.
.. |PyPI Version| image:: https://img.shields.io/badge/pypi-beta-orange.svg
:target: https://pypi.org/project/TiRiFiG/
:alt:
.. _PEP8: https://www.python.org/dev/peps/pep-0008/
.. _source: https://github.com/gigjozsa/TiRiFiG
.. _license: https://github.com/gigjozsa/TiRiFiG/blob/master/LICENSE
.. _TiRiFiC: http://gigjozsa.github.io/tirific/
.. _website: https://www.riverbankcomputing.com/software/pyqt/download
============
Requirements
============
The code requires full installation of:
.. code-block:: bash
PyQt6 (available through pip)
TiRiFiC
============
Installation
============
Installation from source_, from the working directory where the source is checked out:
.. code-block:: bash
$ pip install .
This package is available on *PYPI*, allowing
.. code-block:: bash
    $ pip install TiRiFiG  # note: this currently installs an older version
Download and installation notes for TiRiFiC_ are on its website. Once installed, add the TiRiFiC installation directory to your PATH using
``export PATH='path_to_installation_directory:$PATH'``.
=====
Usage
=====
Start TiRiFiG from the terminal.
With the GUI running, the next steps are:
- Adjust data points for the parameter(s) using the mouse.
- Possibly fit a polynomial to the data points.
- Start TiRiFiC from run menu to perform fitting.
=======
License
=======
This project is licensed under the MIT License - see license_ for details.
==========
Contribute
==========
Contributions are always welcome! Please ensure that you adhere to the PEP8_ coding style guide.
| text/x-rst | null | "S. Twum, P. Kamphuis" <peterkamphuisastronomy@gmail.com> | null | null | null | null | [
"Development Status :: 4 - Beta",
"Intended Audience :: Science/Research",
"License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: 3",
"Topic :: Scientific/Engineering :: Astronomy"
] | [] | null | null | >=3.8 | [] | [] | [] | [
"pyfat-astro",
"pyqt6",
"trm-errors>=0.1.1"
] | [] | [] | [] | [
"Homepage, https://github.com/PeterKamphuis/TiRiFiG"
] | twine/6.2.0 CPython/3.13.7 | 2026-02-19T10:17:23.404258 | tirifig-2.0.2.tar.gz | 15,245,930 | 3b/27/80e137df4f24d809331da20f4fea0b656cef174fafe8a33b47ca44137bbe/tirifig-2.0.2.tar.gz | source | sdist | null | false | 40ba7cf9e8ddd1580238ccbf6e53e15f | f040b38d092e7e449149815c526e0c655f532e325a717f578421fdba95242238 | 3b2780e137df4f24d809331da20f4fea0b656cef174fafe8a33b47ca44137bbe | GPL-3.0-or-later | [
"LICENSE"
] | 0 |
2.4 | helm-sdk | 0.1.1 | Python SDK for HELM — fail-closed tool calling for AI agents | # HELM SDK — Python
Typed Python client for the HELM kernel API. One dependency: `httpx`.
## Install
```bash
pip install helm-sdk
```
## Quick Example
```python
from helm_sdk import HelmClient, HelmApiError, ChatCompletionRequest, ChatMessage
helm = HelmClient(base_url="http://localhost:8080")
# OpenAI-compatible chat (tool calls governed by HELM)
try:
res = helm.chat_completions(ChatCompletionRequest(
model="gpt-4",
messages=[ChatMessage(role="user", content="List files in /tmp")],
))
print(res.choices[0].message.content)
except HelmApiError as e:
print(f"Denied: {e.reason_code}") # e.g. DENY_TOOL_NOT_FOUND
# Export + verify evidence pack
pack = helm.export_evidence()
result = helm.verify_evidence(pack)
print(result.verdict) # PASS
# Conformance
from helm_sdk import ConformanceRequest
conf = helm.conformance_run(ConformanceRequest(level="L2"))
print(conf.verdict, conf.gates, "gates")
```
## API
| Method | Endpoint |
|--------|----------|
| `chat_completions(req)` | `POST /v1/chat/completions` |
| `approve_intent(req)` | `POST /api/v1/kernel/approve` |
| `list_sessions()` | `GET /api/v1/proofgraph/sessions` |
| `get_receipts(session_id)` | `GET /api/v1/proofgraph/sessions/{id}/receipts` |
| `export_evidence(session_id?)` | `POST /api/v1/evidence/export` |
| `verify_evidence(bundle)` | `POST /api/v1/evidence/verify` |
| `replay_verify(bundle)` | `POST /api/v1/replay/verify` |
| `conformance_run(req)` | `POST /api/v1/conformance/run` |
| `health()` | `GET /healthz` |
| `version()` | `GET /version` |
## Error Handling
All errors raise `HelmApiError` with a typed `reason_code`:
```python
try:
    helm.chat_completions(req)
except HelmApiError as e:
    print(e.reason_code)
```
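Because every failure surfaces as a typed `reason_code` string, callers can branch on it. A minimal dispatch sketch (the mapping and any codes other than `DENY_TOOL_NOT_FOUND`, which appears above, are hypothetical):

```python
def handle_denial(reason_code: str) -> str:
    """Map a HELM reason code to a follow-up action (illustrative policy)."""
    actions = {
        "DENY_TOOL_NOT_FOUND": "register the tool before retrying",
    }
    # Fail closed: anything unrecognized goes to a human, not a retry loop.
    return actions.get(reason_code, "escalate to a human approver")

print(handle_denial("DENY_TOOL_NOT_FOUND"))  # register the tool before retrying
```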
| text/markdown | null | Mindburn Labs <oss@mindburn.org> | null | null | BSL-1.1 | null | [] | [] | null | null | >=3.9 | [] | [] | [] | [
"httpx>=0.25.0",
"pytest>=7.0; extra == \"dev\"",
"ruff>=0.1.0; extra == \"dev\"",
"mypy>=1.0; extra == \"dev\"",
"types-requests; extra == \"dev\""
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.12.11 | 2026-02-19T10:17:09.334707 | helm_sdk-0.1.1.tar.gz | 6,480 | 1d/1a/8cfc2f5f929689a1ed48d42b35d612bfe5111365f1f50b3da884c6d20729/helm_sdk-0.1.1.tar.gz | source | sdist | null | false | 825f41a0490a0cf494f59d1ba7eb54f1 | f55fb372fd525ea8dc7a58255cb3c587107b38b459f29664c274dd41e5addb7e | 1d1a8cfc2f5f929689a1ed48d42b35d612bfe5111365f1f50b3da884c6d20729 | null | [] | 224 |
2.4 | opensips | 0.1.9 | OpenSIPS Python Packages | # OpenSIPS Python Packages
This repository contains a collection of Python packages for OpenSIPS. These modules are designed to be as lightweight as possible and provide a simple interface for interacting with OpenSIPS. Alongside the source code, the repository also contains a [Docker](docker/Dockerfile) image that comes with the OpenSIPS Python packages pre-installed.
## Features
Currently, the following packages are available:
- `mi` - can be used to execute OpenSIPS Management Interface (MI) commands.
- `event` - allows you to use OpenSIPS Event Interface subscriptions.
## Usage
1. Install the package from source code:
```bash
git clone https://github.com/OpenSIPS/python-opensips.git
cd python-opensips
pip install .
```
or from PyPI:
```bash
pip install opensips
```
2. Import the package in your Python code:
```python
from opensips.mi import OpenSIPSMI, OpenSIPSMIException
from opensips.event import OpenSIPSEvent, OpenSIPSEventException, OpenSIPSEventHandler
```
3. Use the methods provided by the modules:
```python
mi = OpenSIPSMI('http', url='http://localhost:8888/mi')
try:
response = mi.execute('ps')
# do something with the response
except OpenSIPSMIException as e:
# handle the exception
```
```python
mi_connector = OpenSIPSMI('http', url='http://localhost:8888/mi')
hdl = OpenSIPSEventHandler(mi_connector, 'datagram', ip='127.0.0.1', port=50012)
def some_callback(message):
# do something with the message (it is a JSON object)
pass
ev: OpenSIPSEvent = None
try:
    ev = hdl.subscribe('E_PIKE_BLOCKED', some_callback)
except OpenSIPSEventException as e:
# handle the exception
try:
ev.unsubscribe('E_PIKE_BLOCKED')
except OpenSIPSEventException as e:
# handle the exception
```
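The callback receives the event as a JSON object. A minimal sketch of a callback that normalizes and unpacks the payload (the `params` field and payload layout are illustrative, not the actual OpenSIPS event schema):

```python
import json

def some_callback(message):
    """Normalize an event payload that may arrive as bytes/str or as a dict."""
    if isinstance(message, (bytes, str)):
        message = json.loads(message)
    # "params" is an assumed field name used here for illustration only.
    return message.get("params", {})

# Synthetic payload for demonstration:
print(some_callback('{"method": "E_PIKE_BLOCKED", "params": {"ip": "1.2.3.4"}}'))
# {'ip': '1.2.3.4'}
```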
## Documentation
* [MI](docs/mi.md) - contains information about supported MI communication types and required parameters for each type.
* [Event Interface](docs/event.md) - lists the supported event transport protocols and provides information about the required parameters for each protocol.
* [Docker](docker/docker.md) - provides information about the Docker image that contains the OpenSIPS Python packages.
## Scripts
### MI
After installing the package, you can use the provided [opensips-mi](opensips/mi/__main__.py) script to run MI commands. This script takes the following arguments:
- `-t` or `--type` - the type of the MI communication (`http`, `datagram` or `fifo`).
- `-i` or `--ip` - the IP address of the OpenSIPS server.
- `-p` or `--port` - the port of the OpenSIPS MI.
- `-f` or `--fifo-file` - the path to the FIFO file.
- `-fb` or `--fifo-fallback` - the path to the FIFO fallback file.
- `-fd` or `--fifo-reply-dir` - the directory where the FIFO reply files are stored.
- `--env-file` - the path to the environment file that contains the MI parameters (by default, the script will look for the `.env` file in the current directory); lower priority than the command line arguments.
- `-ds` or `--datagram-socket` - Unix Datagram Socket.
- `-dt` or `--datagram-timeout` - Datagram Socket timeout in seconds. Default is 0.1.
- `-db` or `--datagram-buffer-size` - Datagram Socket buffer size in bytes. Default is 32768.
#### Usage
```bash
# general usage
opensips-mi -t datagram -p 8080 command_name [command_args ...]
# this will execute get_statistics command
opensips-mi -t datagram -p 8080 -s core: shmem:
# you can pass json string as argument with -j flag for commands that require arrays as arguments
opensips-mi -t datagram -p 8080 get_statistics -j "{'statistics': ['core:', 'shmem:']}"
```
### Event
You can use the provided [opensips-event](opensips/event/__main__.py) script to subscribe for OpenSIPS events. This script takes the following arguments:
- all the above arguments for the MI communication
- `-T` or `--transport` - the transport protocol to use (`datagram`, `stream`).
- `-li` or `--listen-ip` - the IP address to listen on.
- `-lp` or `--listen-port` - the port to listen on.
- `-e` or `--expire` - the expiration time for the subscription.
- the event name to subscribe for.
- `--env-file` - the path to the environment file that contains the MI parameters (by default, the script will look for the `.env` file in the current directory); lower priority than the command line arguments.
#### Usage
```bash
opensips-event -t datagram -p 8080 -T datagram -lp 50012 -e 3600 E_PIKE_BLOCKED
```
## License
<!-- License source -->
[License-GPLv3]: https://www.gnu.org/licenses/gpl-3.0.en.html "GNU GPLv3"
[Logo-CC_BY]: https://i.creativecommons.org/l/by/4.0/88x31.png "Creative Commons Logo"
[License-CC_BY]: https://creativecommons.org/licenses/by/4.0/legalcode "Creative Commons License"
The `python-opensips` source code is licensed under the [GNU General Public License v3.0][License-GPLv3]
All documentation files (i.e. `.md` extension) are licensed under the [Creative Commons License 4.0][License-CC_BY]
![Creative Commons Logo][Logo-CC_BY]
© 2024 - OpenSIPS Solutions
| text/markdown | null | Darius Stefan <darius.stefan@opensips.org> | null | null | null | null | [
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.6 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/OpenSIPS/python-opensips",
"Repository, https://github.com/OpenSIPS/python-opensips"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:16:57.008653 | opensips-0.1.9.tar.gz | 31,281 | dd/ca/4eb6a144228e608380aa4261133551c32b604e1366b4861bf3f567d34f65/opensips-0.1.9.tar.gz | source | sdist | null | false | 3bcad7642bfbf1475e75c5e404660455 | 409a9078908c2321df0a727fdca700ac3300cae0e1d7ab01b0c92cc5314d213f | ddca4eb6a144228e608380aa4261133551c32b604e1366b4861bf3f567d34f65 | GPL-3.0-or-later | [
"LICENSE"
] | 392 |
2.1 | vectorvein-sdk | 0.3.74 | VectorVein API SDK and workflow designer | # vectorvein-sdk
[中文文档](README_ZH.md)
Python SDK for the [VectorVein](https://vectorvein.com) platform — run workflows, build workflows programmatically, and manage AI agents.
## Installation
```bash
pip install vectorvein-sdk
```
Requires Python 3.10+.
## Quick Start
```python
from vectorvein.api import VectorVeinClient, WorkflowInputField
with VectorVeinClient(api_key="YOUR_API_KEY") as client:
result = client.run_workflow(
wid="YOUR_WORKFLOW_ID",
input_fields=[
WorkflowInputField(node_id="node_id", field_name="input", value="Hello"),
],
wait_for_completion=True,
timeout=60,
)
for output in result.data:
print(f"{output.title}: {output.value}")
```
## Features
- **Sync & Async clients** — `VectorVeinClient` and `AsyncVectorVeinClient`
- **Workflow execution** — run workflows, poll status, create workflows via API
- **Workflow builder** — programmatically construct workflows with 50+ node types
- **AI Agent management** — create agents, run tasks, manage cycles
- **File upload** — upload files to the platform
- **Access key management** — create, list, update, delete access keys
- **Agent workspace** — read/write/list files in agent workspaces
## Workflow Execution
### Synchronous
```python
from vectorvein.api import VectorVeinClient, WorkflowInputField
with VectorVeinClient(api_key="YOUR_API_KEY") as client:
# Fire-and-forget
rid = client.run_workflow(
wid="workflow_id",
input_fields=[WorkflowInputField(node_id="n1", field_name="text", value="hello")],
wait_for_completion=False,
)
# Poll for result
result = client.check_workflow_status(rid=rid)
print(result.status, result.data)
```
### Asynchronous
```python
import asyncio
from vectorvein.api import AsyncVectorVeinClient, WorkflowInputField
async def main():
async with AsyncVectorVeinClient(api_key="YOUR_API_KEY") as client:
result = await client.run_workflow(
wid="workflow_id",
input_fields=[WorkflowInputField(node_id="n1", field_name="text", value="hello")],
wait_for_completion=True,
timeout=120,
)
print(result.data)
asyncio.run(main())
```
### Create a Workflow via API
```python
workflow = client.create_workflow(
title="My Workflow",
brief="Created via SDK",
data={"nodes": [...], "edges": [...]},
language="en-US",
)
print(workflow.wid)
```
## Workflow Builder
Build workflows in pure Python — no JSON editing required.
```python
from vectorvein.workflow.graph.workflow import Workflow
from vectorvein.workflow.nodes import OpenAI, TemplateCompose, TextInOut, Text
workflow = Workflow()
# Create nodes
text_input = TextInOut("input")
text_input.ports["text"].value = "Tell me a joke"
template = TemplateCompose("tpl")
template.ports["template"].value = "User says: {{user_input}}\nRespond with humor."
template.add_port("user_input", "text", value="", is_output=False)
llm = OpenAI("llm")
llm.ports["llm_model"].value = "gpt-4"
llm.ports["temperature"].value = 0.9
output = Text("out")
# Assemble
workflow.add_nodes([text_input, template, llm, output])
workflow.connect(text_input, "output", template, "user_input")
workflow.connect(template, "output", llm, "prompt")
workflow.connect(llm, "output", output, "text")
# Validate & layout
print(workflow.check()) # {"no_cycle": True, "no_isolated_nodes": True, ...}
workflow.layout({"direction": "LR"})
# Export
json_str = workflow.to_json()
mermaid_str = workflow.to_mermaid()
# Push to platform
client.create_workflow(title="Joke Bot", data=workflow.to_dict())
```
### Available Node Types (50+)
| Category | Nodes |
|---|---|
| LLMs | `OpenAI`, `Claude`, `Gemini`, `Deepseek`, `AliyunQwen`, `BaiduWenxin`, `ChatGLM`, `MiniMax`, `Moonshot`, `LingYiWanWu`, `XAi`, `CustomModel` |
| Text Processing | `TextInOut`, `TemplateCompose`, `TextSplitters`, `TextReplace`, `TextTruncation`, `RegexExtract`, `ListRender`, `MarkdownToHtml` |
| Output | `Text`, `Table`, `Audio`, `Document`, `Html`, `Echarts`, `Email`, `Mermaid`, `Mindmap`, `PictureRender` |
| Image Generation | `DallE`, `StableDiffusion`, `Flux1`, `Kolors`, `Recraft`, `Pulid`, `Inpainting`, `BackgroundGeneration` |
| Media Processing | `GptVision`, `ClaudeVision`, `GeminiVision`, `QwenVision`, `DeepseekVl`, `GlmVision`, `InternVision`, `Ocr`, `SpeechRecognition` |
| Media Editing | `ImageEditing`, `ImageBackgroundRemoval`, `ImageSegmentation`, `ImageWatermark`, `AudioEditing`, `VideoEditing`, `VideoScreenshot` |
| Video Generation | `KlingVideo`, `CogVideoX` |
| Audio | `Tts`, `SoundEffects`, `MinimaxMusicGeneration` |
| Web Crawlers | `TextCrawler`, `BilibiliCrawler`, `DouyinCrawler`, `YoutubeCrawler` |
| Tools | `ProgrammingFunction`, `TextSearch`, `ImageSearch`, `TextTranslation`, `CodebaseAnalysis`, `WorkflowInvoke` |
| Control Flow | `Conditional`, `JsonProcess`, `RandomChoice`, `HumanFeedback`, `Empty` |
| File Processing | `FileUpload`, `FileLoader` |
| Database | `RunSql`, `GetTableInfo`, `SmartQuery` |
| Vector DB | `AddData`, `DeleteData`, `Search` |
| Triggers | `ButtonTrigger` |
### Workflow Utilities
```python
from vectorvein.workflow.utils.json_to_code import generate_python_code
from vectorvein.workflow.utils.analyse import analyse_workflow_record, format_workflow_analysis_for_llm
# Convert workflow JSON to Python code
code = generate_python_code(json_file="workflow.json")
# Analyse workflow structure
result = analyse_workflow_record(json_str, connected_only=True)
summary = format_workflow_analysis_for_llm(result, max_length=200)
```
## AI Agent
### Create and Run an Agent Task
```python
from vectorvein.api import VectorVeinClient, TaskInfo
with VectorVeinClient(api_key="YOUR_API_KEY") as client:
# Create an agent
agent = client.create_agent(
name="Research Assistant",
system_prompt="You are a helpful research assistant.",
default_model_name="gpt-4",
)
# Run a task
task = client.create_agent_task(
task_info=TaskInfo(text="Summarize the latest AI news"),
agent_id_to_start=agent.agent_id,
)
# Check status
task = client.get_agent_task(task.task_id)
print(task.status, task.result)
# List cycles (reasoning steps)
cycles = client.list_agent_cycles(task_id=task.task_id)
for cycle in cycles.cycles:
print(f"Cycle {cycle.cycle_index}: {cycle.title}")
```
### Agent Task Control
```python
client.pause_agent_task(task_id=task.task_id)
client.resume_agent_task(task_id=task.task_id)
client.continue_agent_task(task_id=task.task_id, task_info=TaskInfo(text="Also check arxiv"))
```
## File Upload
```python
result = client.upload_file("report.pdf")
print(result.oss_path, result.file_size)
```
## Access Key Management
```python
# Create a long-term access key
keys = client.create_access_keys(access_key_type="L", app_id="app_id", description="prod key")
print(keys[0].access_key)
# List keys
response = client.list_access_keys(page=1, page_size=20)
for key in response.access_keys:
print(key.access_key, key.status, key.use_count)
# Delete
client.delete_access_keys(app_id="app_id", access_keys=["key_to_delete"])
```
## Agent Workspace
```python
# List files in workspace
files = client.list_workspace_files(workspace_id="ws_id")
for f in files.files:
print(f.key, f.size)
# Read / Write
content = client.read_workspace_file(workspace_id="ws_id", file_path="notes.txt")
client.write_workspace_file(workspace_id="ws_id", file_path="output.txt", content="done")
# Download
url = client.download_workspace_file(workspace_id="ws_id", file_path="result.csv")
```
## Exceptions
All exceptions inherit from `VectorVeinAPIError`:
| Exception | Description |
|---|---|
| `APIKeyError` | Invalid or expired API key |
| `WorkflowError` | Workflow execution failure |
| `AccessKeyError` | Access key operation failure |
| `RequestError` | HTTP request failure |
| `TimeoutError` | Operation timed out |
```python
from vectorvein.api import VectorVeinClient, APIKeyError, WorkflowError, TimeoutError
try:
result = client.run_workflow(wid="wf_id", input_fields=[], wait_for_completion=True, timeout=30)
except TimeoutError:
print("Workflow took too long")
except WorkflowError as e:
print(f"Workflow failed: {e}")
except APIKeyError:
print("Check your API key")
```
## Development
```bash
git clone <repo-url>
cd vectorvein-sdk
pip install -e ".[dev]"
# Run unit tests (no API key needed)
python -m pytest tests/ -v
# Run all tests including live API tests
VECTORVEIN_RUN_LIVE_TESTS=1 python -m pytest tests/ -v
```
For live tests, copy `tests/dev_settings.example.py` to `tests/dev_settings.py` and fill in your credentials.
## License
MIT
| text/markdown | null | Anderson <andersonby@163.com> | null | null | MIT | null | [] | [] | null | null | >=3.10 | [] | [] | [] | [
"httpx>=0.27.0",
"pydantic>=2.8.2",
"pycryptodome>=3.21.0"
] | [] | [] | [] | [] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:16:53.433579 | vectorvein_sdk-0.3.74.tar.gz | 59,871 | 98/d5/c4f1b06784985fd9ff850630e481ae8b8633eba2478be81f0f6dc6962370/vectorvein_sdk-0.3.74.tar.gz | source | sdist | null | false | c894947c5e22e25f8b40346336a1d412 | d54c12857b2860417ae939bec1bd65ba7995c406aac251ac67814bafcfadd9a4 | 98d5c4f1b06784985fd9ff850630e481ae8b8633eba2478be81f0f6dc6962370 | null | [] | 247 |
2.4 | vetch | 0.1.3 | Planet-aware observability for LLM inference | # Vetch SDK
[](https://pypi.org/project/vetch/)
[](https://pypi.org/project/vetch/)
[](https://opensource.org/licenses/Apache-2.0)
[](https://github.com/prismatic-labs/vetch/actions/workflows/ci.yml)
[](https://colab.research.google.com/github/prismatic-labs/vetch/blob/main/demo.ipynb)
Planet-aware observability for LLM inference.
Vetch is a Python SDK that wraps LLM API calls to log energy consumption, cost, and carbon per inference using live grid data. It never reads prompt or completion content; it only uses the usage metadata from the response.
## Features
- **Fail-Open**: LLM calls always proceed even if Vetch fails.
- **Privacy-First**: No prompt or completion data is ever read or buffered.
- **Multi-tier Caching**: Memory and file-based caching for grid intensity data.
- **Observability-Transparent**: Works seamlessly with Datadog, OpenTelemetry, and Sentry.
- **Low Overhead**: Under 5 ms of overhead for sync calls; no added time-to-first-token (TTFT) latency for streaming.
## Installation
```bash
pip install vetch
```
## Quick Start
```python
from vetch import wrap
from openai import OpenAI
client = OpenAI()
with wrap(region="us-east-1", tags={"team": "ml", "env": "prod"}) as ctx:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello world"}]
    )

# Access inference metadata
print(f"Energy: {ctx.event['estimated_energy_wh']} Wh")
print(f"Carbon: {ctx.event['estimated_carbon_g']} gCO2e")
```
## CLI Usage
Estimate energy/carbon for a model without running code:
```bash
vetch estimate --model gpt-4o --input-tokens 1000 --output-tokens 500 --region us-east-1
```
Compare multiple models:
```bash
vetch compare --models gpt-4o,claude-3-opus,gemini-1.5-pro --tokens 1000
```
Analyze your token usage patterns:
```bash
vetch audit
```
Check your environment:
```bash
vetch check
```
## Token Waste Audit
Vetch tracks token usage patterns across your session and provides actionable recommendations:
```python
from vetch import wrap, get_session_stats, generate_advisories
# Make multiple LLM calls
for _ in range(10):
    with wrap() as ctx:
        response = client.chat.completions.create(...)

# Analyze patterns
stats = get_session_stats()
advisories = generate_advisories(stats)
for a in advisories:
    print(f"[{a.level.value}] {a.title}")
    print(f"  {a.description}")
```
**What it detects:**
- **Static system prompts**: Repeated input token counts suggest cacheable prompts
- **High input:output ratios**: Large inputs producing small outputs
- **Expensive model usage**: Opportunities to use smaller, cheaper models
## GPU Calibration (Local Inference)
For local inference (Ollama, vLLM, llama.cpp), calibrate energy measurements using actual GPU power draw:
```python
from vetch import wrap
from vetch.calibrate import calibrate_model, format_calibration_result
import ollama  # local inference client used in this example

def my_inference():
    # Run your inference workload and return (input_tokens, output_tokens)
    response = ollama.generate(model="llama3.1:8b", prompt="Hello world")
    return 100, 50  # Your actual token counts

result = calibrate_model("ollama", "llama3.1:8b", workload=my_inference)
print(format_calibration_result(result))

# Use calibrated values for accurate tracking
with wrap(energy_override=result.to_override()) as ctx:
    response = ollama.generate(...)
```
Check calibration status:
```bash
vetch calibrate --status
```
**Requirements:** NVIDIA GPU with `pynvml` (`pip install nvidia-ml-py3`)
## Historical Analysis & Reporting
Vetch can persist events to SQLite for historical FinOps analysis:
```python
from vetch import configure_storage, query_usage, wrap
from datetime import datetime, timedelta
# Enable persistent storage
configure_storage() # Uses ~/.vetch/usage.db
# Your LLM calls are now tracked
with wrap(tags={"team": "ml", "feature": "chat"}) as ctx:
    response = client.chat.completions.create(...)

# Query historical usage
summary = query_usage(
    start=datetime.now() - timedelta(days=7),
    tags={"team": "ml"}
)
print(f"Total cost: ${summary.total_cost_usd:.2f}")
print(f"Total energy: {summary.total_energy_wh:.2f} Wh")
print(f"Requests: {summary.total_requests}")
```
Generate reports from CLI:
```bash
# Weekly report
vetch report --days 7
# Filter by team
vetch report --tags team=ml
# Show top consumers
vetch report --top --top-by team --days 30
# JSON output for dashboards
vetch report --format json
```
## Energy Tiers
Vetch uses a tiered system for energy estimate confidence:
| Tier | Name | Uncertainty | Source |
|------|------|-------------|--------|
| 0 | **Measured** | ±10-20% | Direct GPU measurement (pynvml) |
| 1 | **Vendor-Published** | ±20-50% | Official provider data |
| 2 | **Validated** | ±50-100% | Crowdsourced aggregates |
| 3 | **Estimated** | order of magnitude | Parameter-based calculation |
Run `vetch methodology` to see full methodology documentation.
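For reference, the tier table can be expressed as a small lookup for annotating estimates in your own reporting. This is an illustrative sketch built from the table above, not part of the Vetch API:

```python
# Illustrative lookup of Vetch energy tiers (values copied from the table above).
# Not part of the vetch package itself.
ENERGY_TIERS = {
    0: ("Measured", "±10-20%", "Direct GPU measurement (pynvml)"),
    1: ("Vendor-Published", "±20-50%", "Official provider data"),
    2: ("Validated", "±50-100%", "Crowdsourced aggregates"),
    3: ("Estimated", "order of magnitude", "Parameter-based calculation"),
}

def describe_tier(tier: int) -> str:
    """Return a human-readable confidence note for an energy estimate."""
    name, uncertainty, source = ENERGY_TIERS[tier]
    return f"Tier {tier} ({name}): {uncertainty} uncertainty, source: {source}"

print(describe_tier(3))
```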
## Environment Variables
| Variable | Description |
|----------|-------------|
| `VETCH_DISABLED` | Set to `true` to completely disable Vetch (emergency kill switch) |
| `VETCH_REGION` | Default grid region (e.g., `us-east-1`, `eu-west-1`) |
| `VETCH_OUTPUT` | Output target: `stderr` (default), `none`, or file path |
| `ELECTRICITY_MAPS_API_KEY` | API key for live grid carbon intensity data |
| `VETCH_CACHE_MODE` | Set to `memory-only` for serverless/Lambda environments |
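As a sketch of how an application might read these settings, the snippet below mirrors the defaults documented in the table; it is illustrative code, not Vetch's internal implementation:

```python
import os

def vetch_settings() -> dict:
    """Read the documented Vetch environment variables with their defaults."""
    return {
        "disabled": os.environ.get("VETCH_DISABLED", "").lower() == "true",
        "region": os.environ.get("VETCH_REGION"),            # e.g. "us-east-1"
        "output": os.environ.get("VETCH_OUTPUT", "stderr"),  # "stderr" is the default
        "cache_mode": os.environ.get("VETCH_CACHE_MODE"),    # "memory-only" for serverless
    }

os.environ["VETCH_REGION"] = "eu-west-1"
print(vetch_settings()["region"])  # → eu-west-1
```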
## Alpha Limitations
This is an alpha release. Please be aware of:
1. **Energy estimates are uncertain**: Most models use Tier 3 estimates (±10x uncertainty). See `vetch methodology` for details.
2. **Region inference is approximate**: Without explicit `VETCH_REGION`, timezone-based inference is ~30% accurate. Set the region explicitly for accurate carbon calculations.
3. **Experimental modules**: `vetch.calibrate`, `vetch.storage`, and `vetch.ci` emit `FutureWarning` and may change in future versions.
4. **Provider support**: Currently supports OpenAI, Anthropic, and Vertex AI. Other providers coming soon.
## Troubleshooting
**Vetch is blocking my LLM calls:**
```bash
export VETCH_DISABLED=true # Emergency kill switch
```
**Too much output:**
```bash
export VETCH_OUTPUT=none # Silence all output
```
**Need to debug:**
```python
import logging
logging.getLogger("vetch").setLevel(logging.DEBUG)
```
## License
Apache License 2.0. See `LICENSE` and `NOTICE` for details.
| text/markdown | Prismatic Labs | null | null | null | null | carbon, energy, llm, observability, openai, vertexai | [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Langu... | [] | null | null | >=3.9 | [] | [] | [] | [
"nvidia-ml-py3; extra == \"calibrate\"",
"build; extra == \"dev\"",
"mypy>=1.0; extra == \"dev\"",
"ruff>=0.1.0; extra == \"dev\"",
"twine; extra == \"dev\"",
"openai<2.0,>=1.0; extra == \"openai\"",
"hypothesis>=6.0; extra == \"test\"",
"pytest-cov>=4.0; extra == \"test\"",
"pytest>=7.0; extra == \... | [] | [] | [] | [
"Homepage, https://github.com/prismatic-labs/vetch",
"Documentation, https://github.com/prismatic-labs/vetch#readme",
"Repository, https://github.com/prismatic-labs/vetch",
"Issues, https://github.com/prismatic-labs/vetch/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:16:30.455389 | vetch-0.1.3.tar.gz | 104,005 | ed/58/474fc4d8806688233591d7ed4e9500058a1a7e2250f74ed9facd4df7c3d1/vetch-0.1.3.tar.gz | source | sdist | null | false | ef5973e11f4dcb7836b9755c94c9f199 | a2ba8b07046b7cda16da61acd5d3bb656a51227110d343ee38038dd0e7c3315d | ed58474fc4d8806688233591d7ed4e9500058a1a7e2250f74ed9facd4df7c3d1 | Apache-2.0 | [
"LICENSE",
"NOTICE"
] | 214 |
2.4 | nervatura | 6.0.2 | Nervatura Framework Python gRPC packages | Nervatura Framework Python gRPC packages
=========================
Open Source Business Management Framework
For more information, please see the [Nervatura Docs](https://nervatura.github.io/nervatura/)
| text/markdown | Csaba Kappel | info@nervatura.com | null | null | null | null | [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)",
"Operating System :: OS Independent"
] | [] | https://github.com/nervatura/nervatura | null | >=3.6 | [] | [] | [] | [] | [] | [] | [] | [
"Bug Tracker, https://github.com/nervatura/nervatura/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:16:10.253517 | nervatura-6.0.2.tar.gz | 37,634 | e1/15/99a6ee61be027df5469187fab4a31cda39abf76f5f7a4e8888a1f9ea5053/nervatura-6.0.2.tar.gz | source | sdist | null | false | 000622568555b54ec1febcf61dd14e9f | 2defeb7c4abc9e8953387f545e2f60a98ed4d83204e66cd37c879f45bc5bb3c0 | e11599a6ee61be027df5469187fab4a31cda39abf76f5f7a4e8888a1f9ea5053 | null | [
"LICENSE"
] | 237 |
2.4 | appabuild | 0.4.2 | Appa Build is a package to build impact models | # Appa Build
Appa Build is a package used to build impact models, which are meant to be loaded and executed by the Appa Run package. It is part of the Appa LCA framework.
Appa LCA (**A**utomatable, **P**ortable and **Pa**rametric **L**ife **C**ycle **A**ssessment) framework was developed to ease the usage of screening LCA in any workflow or software, in the perspective of easing ecodesign initiatives.
It intends to bring the best of the LCA method, which is versatile, holistic and flexible, and of _ad hoc_ impact assessment tools, which are easy to use and integrate.
It relies on the production and usage of impact models, which can be seen as standalone, parametric and modular LCA templates that operate at the impact level, i.e. after application of the LCIA methods.
Documentation of Appa LCA is hosted here: https://appalca.github.io/
## Install
**Warning:** as Appa Build and Appa Run are currently not packaged and published on PyPI, you will need to follow the
[installation with source code](https://appalca.github.io/basics/getting_started/#installation-with-source-code) instructions in the documentation.
| text/markdown | null | Maxime Peralta <maxime.peralta@cea.fr> | null | Maxime Peralta <maxime.peralta@cea.fr> | # GNU LESSER GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc.
<https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.
This version of the GNU Lesser General Public License incorporates the
terms and conditions of version 3 of the GNU General Public License,
supplemented by the additional permissions listed below.
## 0. Additional Definitions.
As used herein, "this License" refers to version 3 of the GNU Lesser
General Public License, and the "GNU GPL" refers to version 3 of the
GNU General Public License.
"The Library" refers to a covered work governed by this License, other
than an Application or a Combined Work as defined below.
An "Application" is any work that makes use of an interface provided
by the Library, but which is not otherwise based on the Library.
Defining a subclass of a class defined by the Library is deemed a mode
of using an interface provided by the Library.
A "Combined Work" is a work produced by combining or linking an
Application with the Library. The particular version of the Library
with which the Combined Work was made is also called the "Linked
Version".
The "Minimal Corresponding Source" for a Combined Work means the
Corresponding Source for the Combined Work, excluding any source code
for portions of the Combined Work that, considered in isolation, are
based on the Application, and not on the Linked Version.
The "Corresponding Application Code" for a Combined Work means the
object code and/or source code for the Application, including any data
and utility programs needed for reproducing the Combined Work from the
Application, but excluding the System Libraries of the Combined Work.
## 1. Exception to Section 3 of the GNU GPL.
You may convey a covered work under sections 3 and 4 of this License
without being bound by section 3 of the GNU GPL.
## 2. Conveying Modified Versions.
If you modify a copy of the Library, and, in your modifications, a
facility refers to a function or data to be supplied by an Application
that uses the facility (other than as an argument passed when the
facility is invoked), then you may convey a copy of the modified
version:
- a) under this License, provided that you make a good faith effort
to ensure that, in the event an Application does not supply the
function or data, the facility still operates, and performs
whatever part of its purpose remains meaningful, or
- b) under the GNU GPL, with none of the additional permissions of
this License applicable to that copy.
## 3. Object Code Incorporating Material from Library Header Files.
The object code form of an Application may incorporate material from a
header file that is part of the Library. You may convey such object
code under terms of your choice, provided that, if the incorporated
material is not limited to numerical parameters, data structure
layouts and accessors, or small macros, inline functions and templates
(ten or fewer lines in length), you do both of the following:
- a) Give prominent notice with each copy of the object code that
the Library is used in it and that the Library and its use are
covered by this License.
- b) Accompany the object code with a copy of the GNU GPL and this
license document.
## 4. Combined Works.
You may convey a Combined Work under terms of your choice that, taken
together, effectively do not restrict modification of the portions of
the Library contained in the Combined Work and reverse engineering for
debugging such modifications, if you also do each of the following:
- a) Give prominent notice with each copy of the Combined Work that
the Library is used in it and that the Library and its use are
covered by this License.
- b) Accompany the Combined Work with a copy of the GNU GPL and this
license document.
- c) For a Combined Work that displays copyright notices during
execution, include the copyright notice for the Library among
these notices, as well as a reference directing the user to the
copies of the GNU GPL and this license document.
- d) Do one of the following:
- 0) Convey the Minimal Corresponding Source under the terms of
this License, and the Corresponding Application Code in a form
suitable for, and under terms that permit, the user to
recombine or relink the Application with a modified version of
the Linked Version to produce a modified Combined Work, in the
manner specified by section 6 of the GNU GPL for conveying
Corresponding Source.
- 1) Use a suitable shared library mechanism for linking with
the Library. A suitable mechanism is one that (a) uses at run
time a copy of the Library already present on the user's
computer system, and (b) will operate properly with a modified
version of the Library that is interface-compatible with the
Linked Version.
- e) Provide Installation Information, but only if you would
otherwise be required to provide such information under section 6
of the GNU GPL, and only to the extent that such information is
necessary to install and execute a modified version of the
Combined Work produced by recombining or relinking the Application
with a modified version of the Linked Version. (If you use option
4d0, the Installation Information must accompany the Minimal
Corresponding Source and Corresponding Application Code. If you
use option 4d1, you must provide the Installation Information in
the manner specified by section 6 of the GNU GPL for conveying
Corresponding Source.)
## 5. Combined Libraries.
You may place library facilities that are a work based on the Library
side by side in a single library together with other library
facilities that are not Applications and are not covered by this
License, and convey such a combined library under terms of your
choice, if you do both of the following:
- a) Accompany the combined library with a copy of the same work
based on the Library, uncombined with any other library
facilities, conveyed under the terms of this License.
- b) Give prominent notice with the combined library that part of it
is a work based on the Library, and explaining where to find the
accompanying uncombined form of the same work.
## 6. Revised Versions of the GNU Lesser General Public License.
The Free Software Foundation may publish revised and/or new versions
of the GNU Lesser General Public License from time to time. Such new
versions will be similar in spirit to the present version, but may
differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Library
as you received it specifies that a certain numbered version of the
GNU Lesser General Public License "or any later version" applies to
it, you have the option of following the terms and conditions either
of that published version or of any later version published by the
Free Software Foundation. If the Library as you received it does not
specify a version number of the GNU Lesser General Public License, you
may choose any version of the GNU Lesser General Public License ever
published by the Free Software Foundation.
If the Library as you received it specifies that a proxy can decide
whether future versions of the GNU Lesser General Public License shall
apply, that proxy's public statement of acceptance of any version is
permanent authorization for you to choose that version for the
Library. | ecodesign, life cycle assessment | [
"Intended Audience :: Developers",
"Intended Audience :: Manufacturing",
"Intended Audience :: Science/Research",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Topic :: Scientific/Engineering"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"aenum",
"apparun==0.4.2",
"brightway25==1.1.0",
"bw-migrations==0.2",
"bw-processing==1.0",
"bw-simapro-csv==0.4.3",
"bw2analyzer==0.11.8",
"bw2calc==2.2.1",
"bw2data==4.5.1",
"bw2io==0.9.11",
"bw2parameters==1.1.0",
"click<=8.1.8",
"fastapi",
"ipython<=8.34.0,>=7.6.0",
"kaleido",
"lc... | [] | [] | [] | [
"Source, https://github.com/appalca/appabuild/"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:15:45.372715 | appabuild-0.4.2.tar.gz | 57,911 | 69/1a/42ca75d25aede8df834f45642aae3d39ddddd55e7a0df8621c50de18e710/appabuild-0.4.2.tar.gz | source | sdist | null | false | 348469d5ea6a25155e0c3d45997d21ea | 7170f883289d5c4f04537e6460585f6acee3b790153a26b4f8ed49725a1fca3b | 691a42ca75d25aede8df834f45642aae3d39ddddd55e7a0df8621c50de18e710 | null | [
"LICENSE.md"
] | 241 |
2.4 | diresa-torch | 1.2.2 | Diresa - distance-regularized siamese twin autoencoder | # *DIRESA-Torch*
### Overview
*DIRESA-Torch* is a Python package for dimension reduction based on
[PyTorch](https://pytorch.org). The distance-regularized
Siamese twin autoencoder architecture is designed to preserve distance
(ordering) in latent space while capturing the non-linearities in
the datasets.
### Install *DIRESA-Torch*
Install *DIRESA-Torch* with the following command:
``` bash
pip install diresa-torch
```
### Documentation
The *DIRESA* documentation can be found on [Read the Docs](https://diresa-torch.readthedocs.io)
### Paper
The *DIRESA* paper can be found [here](https://journals.ametsoc.org/view/journals/aies/4/3/AIES-D-24-0034.1.xml) | text/markdown | null | Lars Bonnefoy <lars.bonnefoy@vub.be>, Janne Bouillon <janne.lisa.bouillon@vub.be>, Geert De Paepe <geert.de.paepe@vub.be> | null | null | null | climate, learning, machine, pytorch, weather | [
"Development Status :: 4 - Beta",
"Intended Audience :: Science/Research",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Scientific/Engineering :: Artificial Intelligence"
] | [] | null | null | >=3.9 | [] | [] | [] | [
"torch>=2.1"
] | [] | [] | [] | [
"Homepage, https://gitlab.com/etrovub/ai4wcm/public/diresa-torch",
"Issues, https://gitlab.com/etrovub/ai4wcm/public/diresa-torch/-/issues"
] | twine/6.2.0 CPython/3.12.12 | 2026-02-19T10:15:07.979412 | diresa_torch-1.2.2.tar.gz | 17,829 | 51/ce/a0e082b3bb0e3f8af18be3cb7158411d35cf0ee67dc4b675ac97da77bac5/diresa_torch-1.2.2.tar.gz | source | sdist | null | false | 3db4ff956c78363ad3fdd99a4ef8e865 | 381d4e4b91e0511794df08ebca5b0fd5937be994cda5597d9f7fe5a7b8f22335 | 51cea0e082b3bb0e3f8af18be3cb7158411d35cf0ee67dc4b675ac97da77bac5 | null | [
"LICENSE"
] | 234 |
2.4 | bitmex-api | 0.0.124 | bitmex crypto exchange api client | # bitmex-python
Python SDK (sync and async) for Bitmex cryptocurrency exchange with Rest and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/bitmex)
- You can check Bitmex's docs here: [Docs](https://www.google.com/search?q=google+bitmex+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/bitmex-python
- Pypi package: https://pypi.org/project/bitmex-api
## Installation
```bash
pip install bitmex-api
```
## Usage
### Sync
```Python
from bitmex import BitmexSync
def main():
    instance = BitmexSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)
    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
### Async
```Python
import sys
import asyncio
from bitmex import BitmexAsync
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
    instance = BitmexAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)
    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from bitmex import BitmexWs
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
    instance = BitmexWs({})
    while True:
        ob = await instance.watch_order_book("BTC/USDC")
        print(ob)
        # orders = await instance.watch_orders("BTC/USDC")
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
#### Raw call
You can also construct custom requests to available "implicit" endpoints
```Python
# placeholders: set coin, tf, since and until to your own values before building the request
request = {
    'type': 'candleSnapshot',
    'req': {
        'coin': coin,
        'interval': tf,
        'startTime': since,
        'endTime': until,
    },
}
response = await instance.public_post_info(request)
```
## Available methods
### REST Unified
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `fetch_balance(self, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_leverages(self, symbols: Strings = None, params={})`
- `fetch_liquidations(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `amount_to_precision(self, symbol, amount)`
- `calculate_rate_limiter_cost(self, api, method, path, params, config={})`
- `cancel_all_orders_after(self, timeout: Int, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `convert_from_raw_cost(self, symbol, rawQuantity)`
- `convert_from_raw_quantity(self, symbol, rawQuantity, currencySide='base')`
- `convert_from_real_amount(self, code, amount)`
- `convert_to_real_amount(self, code: Str, amount: Str)`
- `describe(self)`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `nonce(self)`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_margin_mode(self, marginMode: str, symbol: Str = None, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
### REST Raw
- `public_get_announcement(request)`
- `public_get_announcement_urgent(request)`
- `public_get_chat(request)`
- `public_get_chat_channels(request)`
- `public_get_chat_connected(request)`
- `public_get_chat_pinned(request)`
- `public_get_funding(request)`
- `public_get_guild(request)`
- `public_get_instrument(request)`
- `public_get_instrument_active(request)`
- `public_get_instrument_activeandindices(request)`
- `public_get_instrument_activeintervals(request)`
- `public_get_instrument_compositeindex(request)`
- `public_get_instrument_indices(request)`
- `public_get_instrument_usdvolume(request)`
- `public_get_insurance(request)`
- `public_get_leaderboard(request)`
- `public_get_liquidation(request)`
- `public_get_orderbook_l2(request)`
- `public_get_porl_nonce(request)`
- `public_get_quote(request)`
- `public_get_quote_bucketed(request)`
- `public_get_schema(request)`
- `public_get_schema_websockethelp(request)`
- `public_get_settlement(request)`
- `public_get_stats(request)`
- `public_get_stats_history(request)`
- `public_get_stats_historyusd(request)`
- `public_get_trade(request)`
- `public_get_trade_bucketed(request)`
- `public_get_wallet_assets(request)`
- `public_get_wallet_networks(request)`
- `private_get_address(request)`
- `private_get_apikey(request)`
- `private_get_execution(request)`
- `private_get_execution_tradehistory(request)`
- `private_get_globalnotification(request)`
- `private_get_leaderboard_name(request)`
- `private_get_order(request)`
- `private_get_porl_snapshots(request)`
- `private_get_position(request)`
- `private_get_user(request)`
- `private_get_user_affiliatestatus(request)`
- `private_get_user_checkreferralcode(request)`
- `private_get_user_commission(request)`
- `private_get_user_csa(request)`
- `private_get_user_depositaddress(request)`
- `private_get_user_executionhistory(request)`
- `private_get_user_getwallettransferaccounts(request)`
- `private_get_user_margin(request)`
- `private_get_user_quotefillratio(request)`
- `private_get_user_quotevalueratio(request)`
- `private_get_user_staking(request)`
- `private_get_user_staking_instruments(request)`
- `private_get_user_staking_tiers(request)`
- `private_get_user_tradingvolume(request)`
- `private_get_user_unstakingrequests(request)`
- `private_get_user_wallet(request)`
- `private_get_user_wallethistory(request)`
- `private_get_user_walletsummary(request)`
- `private_get_useraffiliates(request)`
- `private_get_userevent(request)`
- `private_post_address(request)`
- `private_post_chat(request)`
- `private_post_guild(request)`
- `private_post_guild_archive(request)`
- `private_post_guild_join(request)`
- `private_post_guild_kick(request)`
- `private_post_guild_leave(request)`
- `private_post_guild_sharestrades(request)`
- `private_post_order(request)`
- `private_post_order_cancelallafter(request)`
- `private_post_order_closeposition(request)`
- `private_post_position_isolate(request)`
- `private_post_position_leverage(request)`
- `private_post_position_risklimit(request)`
- `private_post_position_transfermargin(request)`
- `private_post_user_addsubaccount(request)`
- `private_post_user_cancelwithdrawal(request)`
- `private_post_user_communicationtoken(request)`
- `private_post_user_confirmemail(request)`
- `private_post_user_confirmwithdrawal(request)`
- `private_post_user_logout(request)`
- `private_post_user_preferences(request)`
- `private_post_user_requestwithdrawal(request)`
- `private_post_user_unstakingrequests(request)`
- `private_post_user_updatesubaccount(request)`
- `private_post_user_wallettransfer(request)`
- `private_put_guild(request)`
- `private_put_order(request)`
- `private_delete_order(request)`
- `private_delete_order_all(request)`
- `private_delete_user_unstakingrequests(request)`
### WS Unified
- `describe(self)`
- `watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `watch_liquidations(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_liquidations_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `watch_balance(self, params={})`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `authenticate(self, params={})`
- `watch_positions(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `watch_order_book_for_symbols(self, symbols: List[str], limit: Int = None, params={})`
- `watch_trades_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `watch_heartbeat(self, params={})`
## Contribution
- Give us a star :star:
- Fork and Clone! Awesome
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:14:13.333187 | bitmex_api-0.0.124.tar.gz | 727,647 | 26/d1/7bfe0559dca797639cdbd5b7897c2e26d061809a9017f2ebdd56dfcd1ba6/bitmex_api-0.0.124.tar.gz | source | sdist | null | false | 93cd4636e8effab895bf792c36dffac7 | 567f349ccd46e033928a818022a86a49f72bbf557134dd1442b296fd242337b3 | 26d17bfe0559dca797639cdbd5b7897c2e26d061809a9017f2ebdd56dfcd1ba6 | null | [] | 231 |
2.4 | woo-api | 0.0.121 | woo crypto exchange api client | # woo-python
Python SDK (sync and async) for Woo cryptocurrency exchange with Rest and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/woo)
- You can check Woo's docs here: [Docs](https://www.google.com/search?q=google+woo+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/woo-python
- Pypi package: https://pypi.org/project/woo-api
## Installation
```bash
pip install woo-api
```
## Usage
### Sync
```Python
from woo import WooSync
def main():
    instance = WooSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)
    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
### Async
```Python
import sys
import asyncio
from woo import WooAsync

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = WooAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)
    #
    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from woo import WooWs

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = WooWs({})
    while True:
        ob = await instance.watch_order_book("BTC/USDC")
        print(ob)
        # orders = await instance.watch_orders("BTC/USDC")
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints:
```Python
# 'coin', 'tf', 'since' and 'until' are placeholders - substitute your own values
request = {
    'type': 'candleSnapshot',
    'req': {
        'coin': coin,
        'interval': tf,
        'startTime': since,
        'endTime': until,
    },
}
response = await instance.public_post_info(request)
```
## Available methods
### REST Unified
- `create_convert_trade(self, id: str, fromCode: str, toCode: str, amount: Num = None, params={})`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_market_sell_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_trailing_amount_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, trailingAmount: Num = None, trailingTriggerPrice: Num = None, params={})`
- `create_trailing_percent_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, trailingPercent: Num = None, trailingTriggerPrice: Num = None, params={})`
- `fetch_accounts(self, params={})`
- `fetch_balance(self, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_convert_currencies(self, params={})`
- `fetch_convert_quote(self, fromCode: str, toCode: str, amount: Num = None, params={})`
- `fetch_convert_trade_history(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_convert_trade(self, id: str, code: Str = None, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposits_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_interval(self, symbol: str, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position(self, symbol: Str, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_status(self, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_trading_fees(self, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_margin(self, symbol: str, amount: float, params={})`
- `cancel_all_orders_after(self, timeout: Int, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `default_network_code_for_currency(self, code)`
- `describe(self)`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `encode_margin_mode(self, mode)`
- `get_asset_history_rows(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `get_currency_from_chaincode(self, networkizedCode, currency)`
- `get_dedicated_network_id(self, currency, params: dict)`
- `modify_margin_helper(self, symbol: str, amount, type, params={})`
- `nonce(self)`
- `reduce_margin(self, symbol: str, amount: float, params={})`
- `repay_margin(self, code: str, amount: float, symbol: Str = None, params={})`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `set_sandbox_mode(self, enable: bool)`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
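Most of the unified methods above accept `since` as a Unix timestamp in milliseconds and `limit` as a maximum number of rows, following the usual CCXT convention. A minimal sketch of building a `since` value for "the last 24 hours"; `since_ms` is a hypothetical helper name, not part of the SDK, and the actual exchange call is left commented out because it needs network access:

```python
import time

def since_ms(hours_back: float) -> int:
    """Return a CCXT-style `since` argument: milliseconds since the Unix epoch."""
    return int((time.time() - hours_back * 3600) * 1000)

# With a live instance you would pass it straight through, e.g.:
# candles = instance.fetch_ohlcv("BTC/USDC", timeframe="1h", since=since_ms(24), limit=24)
```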
### REST Raw
- `v1_pub_get_hist_kline(request)`
- `v1_pub_get_hist_trades(request)`
- `v1_public_get_info(request)`
- `v1_public_get_info_symbol(request)`
- `v1_public_get_system_info(request)`
- `v1_public_get_market_trades(request)`
- `v1_public_get_token(request)`
- `v1_public_get_token_network(request)`
- `v1_public_get_funding_rates(request)`
- `v1_public_get_funding_rate_symbol(request)`
- `v1_public_get_funding_rate_history(request)`
- `v1_public_get_futures(request)`
- `v1_public_get_futures_symbol(request)`
- `v1_public_get_orderbook_symbol(request)`
- `v1_public_get_kline(request)`
- `v1_private_get_client_token(request)`
- `v1_private_get_order_oid(request)`
- `v1_private_get_client_order_client_order_id(request)`
- `v1_private_get_orders(request)`
- `v1_private_get_client_trade_tid(request)`
- `v1_private_get_order_oid_trades(request)`
- `v1_private_get_client_trades(request)`
- `v1_private_get_client_hist_trades(request)`
- `v1_private_get_staking_yield_history(request)`
- `v1_private_get_client_holding(request)`
- `v1_private_get_asset_deposit(request)`
- `v1_private_get_asset_history(request)`
- `v1_private_get_sub_account_all(request)`
- `v1_private_get_sub_account_assets(request)`
- `v1_private_get_sub_account_asset_detail(request)`
- `v1_private_get_sub_account_ip_restriction(request)`
- `v1_private_get_asset_main_sub_transfer_history(request)`
- `v1_private_get_token_interest(request)`
- `v1_private_get_token_interest_token(request)`
- `v1_private_get_interest_history(request)`
- `v1_private_get_interest_repay(request)`
- `v1_private_get_funding_fee_history(request)`
- `v1_private_get_positions(request)`
- `v1_private_get_position_symbol(request)`
- `v1_private_get_client_transaction_history(request)`
- `v1_private_get_client_futures_leverage(request)`
- `v1_private_post_order(request)`
- `v1_private_post_order_cancel_all_after(request)`
- `v1_private_post_asset_ltv(request)`
- `v1_private_post_asset_internal_withdraw(request)`
- `v1_private_post_interest_repay(request)`
- `v1_private_post_client_account_mode(request)`
- `v1_private_post_client_position_mode(request)`
- `v1_private_post_client_leverage(request)`
- `v1_private_post_client_futures_leverage(request)`
- `v1_private_post_client_isolated_margin(request)`
- `v1_private_delete_order(request)`
- `v1_private_delete_client_order(request)`
- `v1_private_delete_orders(request)`
- `v1_private_delete_asset_withdraw(request)`
- `v2_private_get_client_holding(request)`
- `v3_public_get_systeminfo(request)`
- `v3_public_get_instruments(request)`
- `v3_public_get_token(request)`
- `v3_public_get_tokennetwork(request)`
- `v3_public_get_tokeninfo(request)`
- `v3_public_get_markettrades(request)`
- `v3_public_get_markettradeshistory(request)`
- `v3_public_get_orderbook(request)`
- `v3_public_get_kline(request)`
- `v3_public_get_klinehistory(request)`
- `v3_public_get_futures(request)`
- `v3_public_get_fundingrate(request)`
- `v3_public_get_fundingratehistory(request)`
- `v3_public_get_insurancefund(request)`
- `v3_private_get_trade_order(request)`
- `v3_private_get_trade_orders(request)`
- `v3_private_get_trade_algoorder(request)`
- `v3_private_get_trade_algoorders(request)`
- `v3_private_get_trade_transaction(request)`
- `v3_private_get_trade_transactionhistory(request)`
- `v3_private_get_trade_tradingfee(request)`
- `v3_private_get_account_info(request)`
- `v3_private_get_account_tokenconfig(request)`
- `v3_private_get_account_symbolconfig(request)`
- `v3_private_get_account_subaccounts_all(request)`
- `v3_private_get_account_referral_summary(request)`
- `v3_private_get_account_referral_rewardhistory(request)`
- `v3_private_get_account_credentials(request)`
- `v3_private_get_asset_balances(request)`
- `v3_private_get_asset_token_history(request)`
- `v3_private_get_asset_transfer_history(request)`
- `v3_private_get_asset_wallet_history(request)`
- `v3_private_get_asset_wallet_deposit(request)`
- `v3_private_get_asset_staking_yieldhistory(request)`
- `v3_private_get_futures_positions(request)`
- `v3_private_get_futures_leverage(request)`
- `v3_private_get_futures_defaultmarginmode(request)`
- `v3_private_get_futures_fundingfee_history(request)`
- `v3_private_get_spotmargin_interestrate(request)`
- `v3_private_get_spotmargin_interesthistory(request)`
- `v3_private_get_spotmargin_maxmargin(request)`
- `v3_private_get_algo_order_oid(request)`
- `v3_private_get_algo_orders(request)`
- `v3_private_get_positions(request)`
- `v3_private_get_buypower(request)`
- `v3_private_get_convert_exchangeinfo(request)`
- `v3_private_get_convert_assetinfo(request)`
- `v3_private_get_convert_rfq(request)`
- `v3_private_get_convert_trade(request)`
- `v3_private_get_convert_trades(request)`
- `v3_private_post_trade_order(request)`
- `v3_private_post_trade_algoorder(request)`
- `v3_private_post_trade_cancelallafter(request)`
- `v3_private_post_account_tradingmode(request)`
- `v3_private_post_account_listenkey(request)`
- `v3_private_post_asset_transfer(request)`
- `v3_private_post_asset_wallet_withdraw(request)`
- `v3_private_post_spotmargin_leverage(request)`
- `v3_private_post_spotmargin_interestrepay(request)`
- `v3_private_post_algo_order(request)`
- `v3_private_post_convert_rft(request)`
- `v3_private_put_trade_order(request)`
- `v3_private_put_trade_algoorder(request)`
- `v3_private_put_futures_leverage(request)`
- `v3_private_put_futures_positionmode(request)`
- `v3_private_put_order_oid(request)`
- `v3_private_put_order_client_client_order_id(request)`
- `v3_private_put_algo_order_oid(request)`
- `v3_private_put_algo_order_client_client_order_id(request)`
- `v3_private_delete_trade_order(request)`
- `v3_private_delete_trade_orders(request)`
- `v3_private_delete_trade_algoorder(request)`
- `v3_private_delete_trade_algoorders(request)`
- `v3_private_delete_trade_allorders(request)`
- `v3_private_delete_algo_order_order_id(request)`
- `v3_private_delete_algo_orders_pending(request)`
- `v3_private_delete_algo_orders_pending_symbol(request)`
- `v3_private_delete_orders_pending(request)`
### WS Unified
- `describe(self)`
- `watch_public(self, messageHash, message)`
- `unwatch_public(self, subHash: str, symbol: str, topic: str, params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `fetch_order_book_snapshot(self, client, message, subscription)`
- `watch_ticker(self, symbol: str, params={})`
- `un_watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `un_watch_tickers(self, symbols: Strings = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `un_watch_bids_asks(self, symbols: Strings = None, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `un_watch_ohlcv(self, symbol: str, timeframe: str = '1m', params={})`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `check_required_uid(self, error=True)`
- `authenticate(self, params={})`
- `watch_private(self, messageHash, message, params={})`
- `watch_private_multiple(self, messageHashes, message, params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_positions(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `set_positions_cache(self, client: Client, type, symbols: Strings = None)`
- `load_positions_snapshot(self, client, messageHash)`
- `watch_balance(self, params={})`
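Websocket subscriptions such as `watch_order_book` can fail transiently (network drops, exchange restarts), so callers commonly retry with a growing delay. A sketch of one way to do that, assuming a generic exception on failure; `backoff_delays` and `watch_with_retries` are illustrative helper names, not part of the SDK:

```python
import asyncio

def backoff_delays(base: float = 1.0, cap: float = 30.0, retries: int = 5):
    """Exponentially growing reconnect delays, capped at `cap` seconds."""
    return [min(base * (2 ** i), cap) for i in range(retries)]

async def watch_with_retries(instance, symbol: str):
    """Retry a watch_order_book subscription a few times before giving up."""
    for delay in backoff_delays():
        try:
            return await instance.watch_order_book(symbol)
        except Exception:
            await asyncio.sleep(delay)
    raise RuntimeError(f"could not re-subscribe to {symbol}")
```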
## Contribution
- Give us a star :star:
- Fork and clone the repository.
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:14:10.658312 | woo_api-0.0.121.tar.gz | 736,081 | 01/16/f4e4ef23aadff4c26f7a1fb0f998c114f2b71d3a2ef059b1a1c8a8c1cdf0/woo_api-0.0.121.tar.gz | source | sdist | null | false | 2896c5eedbc502e32376ad90268065dc | a4d09142e4514ee10482ff404a3e5eb4155c49230b9cb6d4114a59015676b9ae | 0116f4e4ef23aadff4c26f7a1fb0f998c114f2b71d3a2ef059b1a1c8a8c1cdf0 | null | [] | 225 |
2.4 | gate-io-api | 0.0.124 | gate crypto exchange api client | # gate-python
Python SDK (sync and async) for Gate cryptocurrency exchange with Rest and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/gate)
- You can check Gate's docs here: [Docs](https://www.google.com/search?q=google+gate+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/gate-python
- Pypi package: https://pypi.org/project/gate-io-api
## Installation
```
pip install gate-io-api
```
## Usage
### Sync
```Python
from gate import GateSync
def main():
    instance = GateSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)
    #
    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
### Async
```Python
import sys
import asyncio
from gate import GateAsync

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = GateAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)
    #
    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from gate import GateWs

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = GateWs({})
    while True:
        ob = await instance.watch_order_book("BTC/USDC")
        print(ob)
        # orders = await instance.watch_orders("BTC/USDC")
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints:
```Python
# 'coin', 'tf', 'since' and 'until' are placeholders - substitute your own values
request = {
    'type': 'candleSnapshot',
    'req': {
        'coin': coin,
        'interval': tf,
        'startTime': since,
        'endTime': until,
    },
}
response = await instance.public_post_info(request)
```
## Available methods
### REST Unified
- `create_expired_option_market(self, symbol: str)`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders_request(self, orders: List[OrderRequest], params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_balance(self, params={})`
- `fetch_borrow_interest(self, code: Str = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_addresses_by_network(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_future_markets(self, params={})`
- `fetch_greeks(self, symbol: str, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_leverage_tiers(self, symbols: Strings = None, params={})`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_leverages(self, symbols: Strings = None, params={})`
- `fetch_liquidations(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_market_leverage_tiers(self, symbol: str, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_liquidations(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_settlement_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_network_deposit_address(self, code: str, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest_history(self, symbol: str, timeframe='5m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_option_chain(self, code: str, params={})`
- `fetch_option_markets(self, params={})`
- `fetch_option_ohlcv(self, symbol: str, timeframe='1m', since: Int = None, limit: Int = None, params={})`
- `fetch_option_underlyings(self)`
- `fetch_option(self, symbol: str, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_request(self, id: str, symbol: Str = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders_by_status(self, status, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions_history(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_settlement_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_spot_markets(self, params={})`
- `fetch_swap_markets(self, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_trading_fees(self, params={})`
- `fetch_transaction_fees(self, codes: Strings = None, params={})`
- `fetch_underlying_assets(self, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_margin(self, symbol: str, amount: float, params={})`
- `borrow_cross_margin(self, code: str, amount: float, params={})`
- `borrow_isolated_margin(self, symbol: str, code: str, amount: float, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders_for_symbols(self, orders: List[CancellationRequest], params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `close_position(self, symbol: str, side: OrderSide = None, params={})`
- `describe(self)`
- `edit_order_request(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `get_margin_mode(self, trigger, params)`
- `get_settlement_currencies(self, type, method)`
- `modify_margin_helper(self, symbol: str, amount, params={})`
- `multi_order_spot_prepare_request(self, market=None, trigger=False, params={})`
- `nonce(self)`
- `prepare_orders_by_status_request(self, status, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `prepare_request(self, market=None, type=None, params={})`
- `reduce_margin(self, symbol: str, amount: float, params={})`
- `repay_cross_margin(self, code: str, amount, params={})`
- `repay_isolated_margin(self, symbol: str, code: str, amount, params={})`
- `safe_market(self, marketId: Str = None, market: Market = None, delimiter: Str = None, marketType: Str = None)`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `set_sandbox_mode(self, enable: bool)`
- `spot_order_prepare_request(self, market=None, trigger=False, params={})`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `upgrade_unified_trade_account(self, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
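`fetch_order_book` returns CCXT's unified order book: `'bids'` sorted descending and `'asks'` sorted ascending, each entry a `[price, amount]` pair. A small sketch of deriving top-of-book statistics from that shape; `top_of_book` and the sample data are illustrative, not part of the SDK:

```python
def top_of_book(order_book: dict) -> dict:
    """Best bid/ask, spread and mid price from a CCXT unified order book."""
    best_bid = order_book['bids'][0][0]   # highest bid is first
    best_ask = order_book['asks'][0][0]   # lowest ask is first
    return {
        'bid': best_bid,
        'ask': best_ask,
        'spread': best_ask - best_bid,
        'mid': (best_ask + best_bid) / 2,
    }

# ob = instance.fetch_order_book("BTC/USDT")   # requires network access
sample = {'bids': [[100.0, 1.5], [99.5, 2.0]], 'asks': [[100.5, 0.7], [101.0, 3.0]]}
print(top_of_book(sample))
# → {'bid': 100.0, 'ask': 100.5, 'spread': 0.5, 'mid': 100.25}
```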
### REST Raw
- `public_wallet_get_currency_chains(request)`
- `public_unified_get_currencies(request)`
- `public_unified_get_history_loan_rate(request)`
- `public_spot_get_currencies(request)`
- `public_spot_get_currencies_currency(request)`
- `public_spot_get_currency_pairs(request)`
- `public_spot_get_currency_pairs_currency_pair(request)`
- `public_spot_get_tickers(request)`
- `public_spot_get_order_book(request)`
- `public_spot_get_trades(request)`
- `public_spot_get_candlesticks(request)`
- `public_spot_get_time(request)`
- `public_spot_get_insurance_history(request)`
- `public_margin_get_uni_currency_pairs(request)`
- `public_margin_get_uni_currency_pairs_currency_pair(request)`
- `public_margin_get_loan_margin_tiers(request)`
- `public_margin_get_currency_pairs(request)`
- `public_margin_get_currency_pairs_currency_pair(request)`
- `public_margin_get_funding_book(request)`
- `public_margin_get_cross_currencies(request)`
- `public_margin_get_cross_currencies_currency(request)`
- `public_flash_swap_get_currency_pairs(request)`
- `public_flash_swap_get_currencies(request)`
- `public_futures_get_settle_contracts(request)`
- `public_futures_get_settle_contracts_contract(request)`
- `public_futures_get_settle_order_book(request)`
- `public_futures_get_settle_trades(request)`
- `public_futures_get_settle_candlesticks(request)`
- `public_futures_get_settle_premium_index(request)`
- `public_futures_get_settle_tickers(request)`
- `public_futures_get_settle_funding_rate(request)`
- `public_futures_get_settle_insurance(request)`
- `public_futures_get_settle_contract_stats(request)`
- `public_futures_get_settle_index_constituents_index(request)`
- `public_futures_get_settle_liq_orders(request)`
- `public_futures_get_settle_risk_limit_tiers(request)`
- `public_delivery_get_settle_contracts(request)`
- `public_delivery_get_settle_contracts_contract(request)`
- `public_delivery_get_settle_order_book(request)`
- `public_delivery_get_settle_trades(request)`
- `public_delivery_get_settle_candlesticks(request)`
- `public_delivery_get_settle_tickers(request)`
- `public_delivery_get_settle_insurance(request)`
- `public_delivery_get_settle_risk_limit_tiers(request)`
- `public_options_get_underlyings(request)`
- `public_options_get_expirations(request)`
- `public_options_get_contracts(request)`
- `public_options_get_contracts_contract(request)`
- `public_options_get_settlements(request)`
- `public_options_get_settlements_contract(request)`
- `public_options_get_order_book(request)`
- `public_options_get_tickers(request)`
- `public_options_get_underlying_tickers_underlying(request)`
- `public_options_get_candlesticks(request)`
- `public_options_get_underlying_candlesticks(request)`
- `public_options_get_trades(request)`
- `public_earn_get_uni_currencies(request)`
- `public_earn_get_uni_currencies_currency(request)`
- `public_earn_get_dual_investment_plan(request)`
- `public_earn_get_structured_products(request)`
- `public_loan_get_collateral_currencies(request)`
- `public_loan_get_multi_collateral_currencies(request)`
- `public_loan_get_multi_collateral_ltv(request)`
- `public_loan_get_multi_collateral_fixed_rate(request)`
- `public_loan_get_multi_collateral_current_rate(request)`
- `private_withdrawals_post_withdrawals(request)`
- `private_withdrawals_post_push(request)`
- `private_withdrawals_delete_withdrawals_withdrawal_id(request)`
- `private_wallet_get_deposit_address(request)`
- `private_wallet_get_withdrawals(request)`
- `private_wallet_get_deposits(request)`
- `private_wallet_get_sub_account_transfers(request)`
- `private_wallet_get_order_status(request)`
- `private_wallet_get_withdraw_status(request)`
- `private_wallet_get_sub_account_balances(request)`
- `private_wallet_get_sub_account_margin_balances(request)`
- `private_wallet_get_sub_account_futures_balances(request)`
- `private_wallet_get_sub_account_cross_margin_balances(request)`
- `private_wallet_get_saved_address(request)`
- `private_wallet_get_fee(request)`
- `private_wallet_get_total_balance(request)`
- `private_wallet_get_small_balance(request)`
- `private_wallet_get_small_balance_history(request)`
- `private_wallet_get_push(request)`
- `private_wallet_get_getlowcapexchangelist(request)`
- `private_wallet_post_transfers(request)`
- `private_wallet_post_sub_account_transfers(request)`
- `private_wallet_post_sub_account_to_sub_account(request)`
- `private_wallet_post_small_balance(request)`
- `private_subaccounts_get_sub_accounts(request)`
- `private_subaccounts_get_sub_accounts_user_id(request)`
- `private_subaccounts_get_sub_accounts_user_id_keys(request)`
- `private_subaccounts_get_sub_accounts_user_id_keys_key(request)`
- `private_subaccounts_post_sub_accounts(request)`
- `private_subaccounts_post_sub_accounts_user_id_keys(request)`
- `private_subaccounts_post_sub_accounts_user_id_lock(request)`
- `private_subaccounts_post_sub_accounts_user_id_unlock(request)`
- `private_subaccounts_put_sub_accounts_user_id_keys_key(request)`
- `private_subaccounts_delete_sub_accounts_user_id_keys_key(request)`
- `private_unified_get_accounts(request)`
- `private_unified_get_borrowable(request)`
- `private_unified_get_transferable(request)`
- `private_unified_get_transferables(request)`
- `private_unified_get_batch_borrowable(request)`
- `private_unified_get_loans(request)`
- `private_unified_get_loan_records(request)`
- `private_unified_get_interest_records(request)`
- `private_unified_get_risk_units(request)`
- `private_unified_get_unified_mode(request)`
- `private_unified_get_estimate_rate(request)`
- `private_unified_get_currency_discount_tiers(request)`
- `private_unified_get_loan_margin_tiers(request)`
- `private_unified_get_leverage_user_currency_config(request)`
- `private_unified_get_leverage_user_currency_setting(request)`
- `private_unified_get_account_mode(request)`
- `private_unified_post_loans(request)`
- `private_unified_post_portfolio_calculator(request)`
- `private_unified_post_leverage_user_currency_setting(request)`
- `private_unified_post_collateral_currencies(request)`
- `private_unified_post_account_mode(request)`
- `private_unified_put_unified_mode(request)`
- `private_spot_get_fee(request)`
- `private_spot_get_batch_fee(request)`
- `private_spot_get_accounts(request)`
- `private_spot_get_account_book(request)`
- `private_spot_get_open_orders(request)`
- `private_spot_get_orders(request)`
- `private_spot_get_orders_order_id(request)`
- `private_spot_get_my_trades(request)`
- `private_spot_get_price_orders(request)`
- `private_spot_get_price_orders_order_id(request)`
- `private_spot_post_batch_orders(request)`
- `private_spot_post_cross_liquidate_orders(request)`
- `private_spot_post_orders(request)`
- `private_spot_post_cancel_batch_orders(request)`
- `private_spot_post_countdown_cancel_all(request)`
- `private_spot_post_amend_batch_orders(request)`
- `private_spot_post_price_orders(request)`
- `private_spot_delete_orders(request)`
- `private_spot_delete_orders_order_id(request)`
- `private_spot_delete_price_orders(request)`
- `private_spot_delete_price_orders_order_id(request)`
- `private_spot_patch_orders_order_id(request)`
- `private_margin_get_accounts(request)`
- `private_margin_get_account_book(request)`
- `private_margin_get_funding_accounts(request)`
- `private_margin_get_auto_repay(request)`
- `private_margin_get_transferable(request)`
- `private_margin_get_uni_estimate_rate(request)`
- `private_margin_get_uni_loans(request)`
- `private_margin_get_uni_loan_records(request)`
- `private_margin_get_uni_interest_records(request)`
- `private_margin_get_uni_borrowable(request)`
- `private_margin_get_user_loan_margin_tiers(request)`
- `private_margin_get_user_account(request)`
- `private_margin_get_loans(request)`
- `private_margin_get_loans_loan_id(request)`
- `private_margin_get_loans_loan_id_repayment(request)`
- `private_margin_get_loan_records(request)`
- `private_margin_get_loan_records_loan_record_id(request)`
- `private_margin_get_borrowable(request)`
- `private_margin_get_cross_accounts(request)`
- `private_margin_get_cross_account_book(request)`
- `private_margin_get_cross_loans(request)`
- `private_margin_get_cross_loans_loan_id(request)`
- `private_margin_get_cross_repayments(request)`
- `private_margin_get_cross_interest_records(request)`
- `private_margin_get_cross_transferable(request)`
- `private_margin_get_cross_estimate_rate(request)`
- `private_margin_get_cross_borrowable(request)`
- `private_margin_post_auto_repay(request)`
- `private_margin_post_uni_loans(request)`
- `private_margin_post_leverage_user_market_setting(request)`
- `private_margin_post_loans(request)`
- `private_margin_post_merged_loans(request)`
- `private_margin_post_loans_loan_id_repayment(request)`
- `private_margin_post_cross_loans(request)`
- `private_margin_post_cross_repayments(request)`
- `private_margin_patch_loans_loan_id(request)`
- `private_margin_patch_loan_records_loan_record_id(request)`
- `private_margin_delete_loans_loan_id(request)`
- `private_flash_swap_get_orders(request)`
- `private_flash_swap_get_orders_order_id(request)`
- `private_flash_swap_post_orders(request)`
- `private_flash_swap_post_orders_preview(request)`
- `private_futures_get_settle_accounts(request)`
- `private_futures_get_settle_account_book(request)`
- `private_futures_get_settle_positions(request)`
- `private_futures_get_settle_positions_contract(request)`
- `private_futures_get_settle_get_leverage_contract(request)`
- `private_futures_get_settle_dual_comp_positions_contract(request)`
- `private_futures_get_settle_orders(request)`
- `private_futures_get_settle_orders_timerange(request)`
- `private_futures_get_settle_orders_order_id(request)`
- `private_futures_get_settle_my_trades(request)`
- `private_futures_get_settle_my_trades_timerange(request)`
- `private_futures_get_settle_position_close(request)`
- `private_futures_get_settle_liquidates(request)`
- `private_futures_get_settle_auto_deleverages(request)`
- `private_futures_get_settle_fee(request)`
- `private_futures_get_settle_risk_limit_table(request)`
- `private_futures_get_settle_price_orders(request)`
- `private_futures_get_settle_price_orders_order_id(request)`
- `private_futures_post_settle_positions_contract_margin(request)`
- `private_futures_post_settle_positions_contract_leverage(request)`
- `private_futures_post_settle_positions_contract_set_leverage(request)`
- `private_futures_post_settle_positions_contract_risk_limit(request)`
- `private_futures_post_settle_positions_cross_mode(request)`
- `private_futures_post_settle_dual_comp_positions_cross_mode(request)`
- `private_futures_post_settle_dual_mode(request)`
- `private_futures_post_settle_set_position_mode(request)`
- `private_futures_post_settle_dual_comp_positions_contract_margin(request)`
- `private_futures_post_settle_dual_comp_positions_contract_leverage(request)`
- `private_futures_post_settle_dual_comp_positions_contract_risk_limit(request)`
- `private_futures_post_settle_orders(request)`
- `private_futures_post_settle_batch_orders(request)`
- `private_futures_post_settle_countdown_cancel_all(request)`
- `private_futures_post_settle_batch_cancel_orders(request)`
- `private_futures_post_settle_batch_amend_orders(request)`
- `private_futures_post_settle_bbo_orders(request)`
- `private_futures_post_settle_price_orders(request)`
- `private_futures_put_settle_orders_order_id(request)`
- `private_futures_put_settle_price_orders_order_id(request)`
- `private_futures_delete_settle_orders(request)`
- `private_futures_delete_settle_orders_order_id(request)`
- `private_futures_delete_settle_price_orders(request)`
- `private_futures_delete_settle_price_orders_order_id(request)`
- `private_delivery_get_settle_accounts(request)`
- `private_delivery_get_settle_account_book(request)`
- `private_delivery_get_settle_positions(request)`
- `private_delivery_get_settle_positions_contract(request)`
- `private_delivery_get_settle_orders(request)`
- `private_delivery_get_settle_orders_order_id(request)`
- `private_delivery_get_settle_my_trades(request)`
- `private_delivery_get_settle_position_close(request)`
- `private_delivery_get_settle_liquidates(request)`
- `private_delivery_get_settle_settlements(request)`
- `private_delivery_get_settle_price_orders(request)`
- `private_delivery_get_settle_price_orders_order_id(request)`
- `private_delivery_post_settle_positions_contract_margin(request)`
- `private_delivery_post_settle_positions_contract_leverage(request)`
- `private_delivery_post_settle_positions_contract_risk_limit(request)`
- `private_delivery_post_settle_orders(request)`
- `private_delivery_post_settle_price_orders(request)`
- `private_delivery_delete_settle_orders(request)`
- `private_delivery_delete_settle_orders_order_id(request)`
- `private_delivery_delete_settle_price_orders(request)`
- `private_delivery_delete_settle_price_orders_order_id(request)`
- `private_options_get_my_settlements(request)`
- `private_options_get_accounts(request)`
- `private_options_get_account_book(request)`
- `private_options_get_positions(request)`
- `private_options_get_positions_contract(request)`
- `private_options_get_position_close(request)`
- `private_options_get_orders(request)`
- `private_options_get_orders_order_id(request)`
- `private_options_get_my_trades(request)`
- `private_options_get_mmp(request)`
- `private_options_post_orders(request)`
- `private_options_post_countdown_cancel_all(request)`
- `private_options_post_mmp(request)`
- `private_options_post_mmp_reset(request)`
- `private_options_delete_orders(request)`
- `private_options_delete_orders_order_id(request)`
- `private_earn_get_uni_lends(request)`
- `private_earn_get_uni_lend_records(request)`
- `private_earn_get_uni_interests_currency(request)`
- `private_earn_get_uni_interest_records(request)`
- `private_earn_get_uni_interest_status_currency(request)`
- `private_earn_get_uni_chart(request)`
- `private_earn_get_uni_rate(request)`
- `private_earn_get_staking_eth2_rate_records(request)`
- `private_earn_get_dual_orders(request)`
- `private_earn_get_dual_balance(request)`
- `private_earn_get_structured_orders(request)`
- `private_earn_get_staking_coins(request)`
- `private_earn_get_staking_order_list(request)`
- `private_earn_get_staking_award_list(request)`
- `private_earn_get_staking_assets(request)`
- `private_earn_get_uni_currencies(request)`
- `private_earn_get_uni_currencies_currency(request)`
- `private_earn_post_uni_lends(request)`
- `private_earn_post_staking_eth2_swap(request)`
- `private_earn_post_dual_orders(request)`
- `private_earn_post_structured_orders(request)`
- `private_earn_post_staking_swap(request)`
- `private_earn_put_uni_interest_reinvest(request)`
- `private_earn_patch_uni_lends(request)`
- `private_loan_get_collateral_orders(request)`
- `private_loan_get_collateral_orders_order_id(request)`
- `private_loan_get_collateral_repay_records(request)`
- `private_loan_get_collateral_collaterals(request)`
- `private_loan_get_collateral_total_amount(request)`
- `private_loan_get_collateral_ltv(request)`
- `private_loan_get_multi_collateral_orders(request)`
- `private_loan_get_multi_collateral_orders_order_id(request)`
- `private_loan_get_multi_collateral_repay(request)`
- `private_loan_get_multi_collateral_mortgage(request)`
- `private_loan_get_multi_collateral_currency_quota(request)`
- `private_loan_get_collateral_currencies(request)`
- `private_loan_get_multi_collateral_currencies(request)`
- `private_loan_get_multi_collateral_ltv(request)`
- `private_loan_get_multi_collateral_fixed_rate(request)`
- `private_loan_get_multi_collateral_current_rate(request)`
- `private_loan_post_collateral_orders(request)`
- `private_loan_post_collateral_repay(request)`
- `private_loan_post_collateral_collaterals(request)`
- `private_loan_post_multi_collateral_orders(request)`
- `private_loan_post_multi_collateral_repay(request)`
- `private_loan_post_multi_collateral_mortgage(request)`
- `private_account_get_detail(request)`
- `private_account_get_main_keys(request)`
- `private_account_get_rate_limit(request)`
- `private_account_get_stp_groups(request)`
- `private_account_get_stp_groups_stp_id_users(request)`
- `private_account_get_stp_groups_debit_fee(request)`
- `private_account_get_debit_fee(request)`
- `private_account_post_stp_groups(request)`
- `private_account_post_stp_groups_stp_id_users(request)`
- `private_account_post_debit_fee(request)`
- `private_account_delete_stp_groups_stp_id_users(request)`
- `private_rebate_get_agency_transaction_history(request)`
- `private_rebate_get_agency_commission_history(request)`
- `private_rebate_get_partner_transaction_history(request)`
- `private_rebate_get_partner_commission_history(request)`
- `private_rebate_get_partner_sub_list(request)`
- `private_rebate_get_broker_commission_history(request)`
- `private_rebate_get_broker_transaction_history(request)`
- `private_rebate_get_user_info(request)`
- `private_rebate_get_user_sub_relation(request)`
- `private_otc_get_get_user_def_bank(request)`
- `private_otc_get_order_list(request)`
- `private_otc_get_stable_coin_order_list(request)`
- `private_otc_get_order_detail(request)`
- `private_otc_post_quote(request)`
- `private_otc_post_order_create(request)`
- `private_otc_post_stable_coin_order_create(request)`
- `private_otc_post_order_paid(request)`
- `private_otc_post_order_cancel(request)`
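Each implicit method listed above maps onto one raw endpoint of the exchange API, following the usual ccxt naming convention: visibility, API section, HTTP verb, then the endpoint path with separators and `{placeholders}` flattened into underscores. A rough illustration of that convention — the helper below is not part of the SDK, it only demonstrates the mapping:

```python
# Illustrative sketch (not part of the SDK): how ccxt-style implicit method
# names relate to raw endpoint paths. The real client generates these
# methods automatically; this helper only demonstrates the naming scheme.
import re

def implicit_method_name(visibility: str, section: str, verb: str, path: str) -> str:
    # path placeholders like {order_id} lose their braces; '/' and '-' become '_'
    cleaned = re.sub(r"[{}]", "", path)
    cleaned = cleaned.replace("/", "_").replace("-", "_")
    return f"{visibility}_{section}_{verb.lower()}_{cleaned}".strip("_")

print(implicit_method_name("private", "margin", "GET", "cross/accounts"))
# private_margin_get_cross_accounts
print(implicit_method_name("private", "futures", "DELETE", "{settle}/orders/{order_id}"))
# private_futures_delete_settle_orders_order_id
```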
### WS Unified
- `describe(self)`
- `create_order_ws(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders_ws(self, orders: List[OrderRequest], params={})`
- `cancel_all_orders_ws(self, symbol: Str = None, params={})`
- `cancel_order_ws(self, id: str, symbol: Str = None, params={})`
- `edit_order_ws(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `fetch_order_ws(self, id: str, symbol: Str = None, params={})`
- `fetch_open_orders_ws(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders_ws(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_orders_by_status_ws(self, status: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `get_cache_index(self, orderBook, cache)`
- `watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `subscribe_watch_tickers_and_bids_asks(self, symbols: Strings = None, callerMethodName: Str = None, params={})`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_trades_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `un_watch_trades_for_symbols(self, symbols: List[str], params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_balance(self, params={})`
- `watch_positions(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `set_positions_cache(self, client: Client, type, symbols: Strings = None)`
- `load_positions_snapshot(self, client, messageHash, type)`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_my_liquidations(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_my_liquidations_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `get_url_by_market(self, market)`
- `get_type_by_market(self, market: Market)`
- `get_url_by_market_type(self, type: MarketType, isInverse=False)`
- `get_market_type_by_url(self, url: str)`
- `subscribe_public(self, url, messageHash, payload, channel, params={}, subscription=None)`
- `subscribe_public_multiple(self, url, messageHashes, payload, channel, params={})`
- `un_subscribe_public_multiple(self, url, topic, symbols, messageHashes, subMessageHashes, payload, channel, params={})`
- `authenticate(self, url, messageType)`
- `subscribe_private(self, url, messageHash, payload, channel, params, requiresUid=False)`
## Contribution
- Give us a star :star:
- Fork and Clone! Awesome
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:14:04.776129 | gate_io_api-0.0.124.tar.gz | 802,286 | 57/46/3ab7b98297ab8e201a00fc27e9ecdb3290e1a513db37c6f3843754fac86e/gate_io_api-0.0.124.tar.gz | source | sdist | null | false | 088a902e6c0bb49ab914292ab1864458 | 0478d6ba8aa0411d7b594bbb6381465ba298b43f7f16625606141e8d6fd3e904 | 57463ab7b98297ab8e201a00fc27e9ecdb3290e1a513db37c6f3843754fac86e | null | [] | 244 |
2.4 | bitmart | 0.0.125 | bitmart crypto exchange api client | # bitmart-python
Python SDK (sync and async) for Bitmart cryptocurrency exchange with Rest and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/bitmart)
- You can check Bitmart's docs here: [Docs](https://www.google.com/search?q=google+bitmart+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/bitmart-python
- PyPI package: https://pypi.org/project/bitmart
## Installation
```
pip install bitmart
```
## Usage
### Sync
```Python
from bitmart import BitmartSync

def main():
    instance = BitmartSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)
    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
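The order book returned by `fetch_order_book` is a plain dict with price-sorted `bids` and `asks` (each entry a `[price, amount]` pair) alongside fields such as `symbol` and `timestamp`. A sketch of reading the best quotes from such a structure — the numbers below are made up for illustration, not a live response:

```python
# Sketch of working with a ccxt-style order book structure. The sample dict
# is hand-made illustration; real responses also carry 'timestamp',
# 'datetime' and 'nonce' fields.
sample_ob = {
    'symbol': 'BTC/USDC',
    'bids': [[99950.0, 0.5], [99940.0, 1.2]],  # [price, amount], best bid first
    'asks': [[99960.0, 0.3], [99975.0, 2.0]],  # [price, amount], best ask first
}

best_bid = sample_ob['bids'][0][0]
best_ask = sample_ob['asks'][0][0]
spread = best_ask - best_bid
print(f"best bid={best_bid} best ask={best_ask} spread={spread}")
```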
### Async
```Python
import sys
import asyncio
from bitmart import BitmartAsync

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = BitmartAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)
    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
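One benefit of the async client is that several requests can run concurrently via `asyncio.gather`. A minimal, bitmart-agnostic sketch of the pattern — the dummy coroutine stands in for calls like `fetch_order_book` or `fetch_balance`:

```python
# Concurrency sketch (illustrative, not bitmart-specific): overlapping
# "requests" with asyncio.gather. fake_fetch stands in for exchange calls.
import asyncio

async def fake_fetch(name: str) -> str:
    await asyncio.sleep(0.01)  # pretend network latency
    return name

async def fetch_both():
    # both coroutines run concurrently; total time ~ max of the delays
    return await asyncio.gather(fake_fetch('order_book'), fake_fetch('balance'))

print(asyncio.run(fetch_both()))  # ['order_book', 'balance']
```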
### Websockets
```Python
import sys
import asyncio
from bitmart import BitmartWs

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = BitmartWs({})
    while True:
        ob = await instance.watch_order_book("BTC/USDC")
        print(ob)
        # orders = await instance.watch_orders("BTC/USDC")
    # once you are done with the exchange (break out of the loop first)
    await instance.close()

asyncio.run(main())
```
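Note that the `while True` loop above never reaches `instance.close()`; in practice you bound the loop or break on a condition before closing. A sketch of that shape, with a dummy watch coroutine standing in for `instance.watch_order_book`:

```python
# Sketch of a bounded watch loop. fake_watch stands in for
# instance.watch_order_book; with a bounded loop, close() is reachable.
import asyncio

async def fake_watch() -> dict:
    await asyncio.sleep(0)
    return {'bids': [], 'asks': []}

async def consume_updates(max_updates: int) -> int:
    seen = 0
    while seen < max_updates:      # bounded, instead of `while True`
        ob = await fake_watch()    # real code: await instance.watch_order_book("BTC/USDC")
        seen += 1
    # here real code would do: await instance.close()
    return seen

print(asyncio.run(consume_updates(3)))  # 3
```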
#### Raw call
You can also construct custom requests to available "implicit" endpoints
```Python
# coin, tf, since and until are placeholders you supply yourself
request = {
    'type': 'candleSnapshot',
    'req': {
        'coin': coin,
        'interval': tf,
        'startTime': since,
        'endTime': until,
    },
}
response = await instance.public_post_info(request)
```
## Available methods
### REST Unified
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `create_spot_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_swap_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `fetch_balance(self, params={})`
- `fetch_borrow_interest(self, code: Str = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_canceled_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_contract_markets(self, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_withdraw_fee(self, code: str, params={})`
- `fetch_deposit(self, id: str, code: Str = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_isolated_borrow_rate(self, symbol: str, params={})`
- `fetch_isolated_borrow_rates(self, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_liquidations(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest(self, symbol: str, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders_by_status(self, status, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position_mode(self, symbol: Str = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_spot_markets(self, params={})`
- `fetch_status(self, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_transaction_fee(self, code: str, params={})`
- `fetch_transactions_by_type(self, type, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_transactions_request(self, flowType: Int = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_withdraw_addresses(self, code: str, note=None, networkCode=None, params={})`
- `fetch_withdrawal(self, id: str, code: Str = None, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `borrow_isolated_margin(self, symbol: str, code: str, amount: float, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `custom_parse_balance(self, response, marketType)`
- `describe(self)`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `get_currency_id_from_code_and_network(self, currencyCode: Str, networkCode: Str)`
- `nonce(self)`
- `repay_isolated_margin(self, symbol: str, code: str, amount, params={})`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
### REST Raw
- `public_get_system_time(request)`
- `public_get_system_service(request)`
- `public_get_spot_v1_currencies(request)`
- `public_get_spot_v1_symbols(request)`
- `public_get_spot_v1_symbols_details(request)`
- `public_get_spot_quotation_v3_tickers(request)`
- `public_get_spot_quotation_v3_ticker(request)`
- `public_get_spot_quotation_v3_lite_klines(request)`
- `public_get_spot_quotation_v3_klines(request)`
- `public_get_spot_quotation_v3_books(request)`
- `public_get_spot_quotation_v3_trades(request)`
- `public_get_spot_v1_ticker(request)`
- `public_get_spot_v2_ticker(request)`
- `public_get_spot_v1_ticker_detail(request)`
- `public_get_spot_v1_steps(request)`
- `public_get_spot_v1_symbols_kline(request)`
- `public_get_spot_v1_symbols_book(request)`
- `public_get_spot_v1_symbols_trades(request)`
- `public_get_contract_v1_tickers(request)`
- `public_get_contract_public_details(request)`
- `public_get_contract_public_depth(request)`
- `public_get_contract_public_open_interest(request)`
- `public_get_contract_public_funding_rate(request)`
- `public_get_contract_public_funding_rate_history(request)`
- `public_get_contract_public_kline(request)`
- `public_get_account_v1_currencies(request)`
- `public_get_contract_public_markprice_kline(request)`
- `private_get_account_sub_account_v1_transfer_list(request)`
- `private_get_account_sub_account_v1_transfer_history(request)`
- `private_get_account_sub_account_main_v1_wallet(request)`
- `private_get_account_sub_account_main_v1_subaccount_list(request)`
- `private_get_account_contract_sub_account_main_v1_wallet(request)`
- `private_get_account_contract_sub_account_main_v1_transfer_list(request)`
- `private_get_account_contract_sub_account_v1_transfer_history(request)`
- `private_get_account_v1_wallet(request)`
- `private_get_account_v1_currencies(request)`
- `private_get_spot_v1_wallet(request)`
- `private_get_account_v1_deposit_address(request)`
- `private_get_account_v1_withdraw_charge(request)`
- `private_get_account_v2_deposit_withdraw_history(request)`
- `private_get_account_v1_deposit_withdraw_detail(request)`
- `private_get_account_v1_withdraw_address_list(request)`
- `private_get_spot_v1_order_detail(request)`
- `private_get_spot_v2_orders(request)`
- `private_get_spot_v1_trades(request)`
- `private_get_spot_v2_trades(request)`
- `private_get_spot_v3_orders(request)`
- `private_get_spot_v2_order_detail(request)`
- `private_get_spot_v1_margin_isolated_borrow_record(request)`
- `private_get_spot_v1_margin_isolated_repay_record(request)`
- `private_get_spot_v1_margin_isolated_pairs(request)`
- `private_get_spot_v1_margin_isolated_account(request)`
- `private_get_spot_v1_trade_fee(request)`
- `private_get_spot_v1_user_fee(request)`
- `private_get_spot_v1_broker_rebate(request)`
- `private_get_contract_private_assets_detail(request)`
- `private_get_contract_private_order(request)`
- `private_get_contract_private_order_history(request)`
- `private_get_contract_private_position(request)`
- `private_get_contract_private_position_v2(request)`
- `private_get_contract_private_get_open_orders(request)`
- `private_get_contract_private_current_plan_order(request)`
- `private_get_contract_private_trades(request)`
- `private_get_contract_private_position_risk(request)`
- `private_get_contract_private_affilate_rebate_list(request)`
- `private_get_contract_private_affilate_trade_list(request)`
- `private_get_contract_private_transaction_history(request)`
- `private_get_contract_private_get_position_mode(request)`
- `private_post_account_sub_account_main_v1_sub_to_main(request)`
- `private_post_account_sub_account_sub_v1_sub_to_main(request)`
- `private_post_account_sub_account_main_v1_main_to_sub(request)`
- `private_post_account_sub_account_sub_v1_sub_to_sub(request)`
- `private_post_account_sub_account_main_v1_sub_to_sub(request)`
- `private_post_account_contract_sub_account_main_v1_sub_to_main(request)`
- `private_post_account_contract_sub_account_main_v1_main_to_sub(request)`
- `private_post_account_contract_sub_account_sub_v1_sub_to_main(request)`
- `private_post_account_v1_withdraw_apply(request)`
- `private_post_spot_v1_submit_order(request)`
- `private_post_spot_v1_batch_orders(request)`
- `private_post_spot_v2_cancel_order(request)`
- `private_post_spot_v1_cancel_orders(request)`
- `private_post_spot_v4_query_order(request)`
- `private_post_spot_v4_query_client_order(request)`
- `private_post_spot_v4_query_open_orders(request)`
- `private_post_spot_v4_query_history_orders(request)`
- `private_post_spot_v4_query_trades(request)`
- `private_post_spot_v4_query_order_trades(request)`
- `private_post_spot_v4_cancel_orders(request)`
- `private_post_spot_v4_cancel_all(request)`
- `private_post_spot_v4_batch_orders(request)`
- `private_post_spot_v3_cancel_order(request)`
- `private_post_spot_v2_batch_orders(request)`
- `private_post_spot_v2_submit_order(request)`
- `private_post_spot_v1_margin_submit_order(request)`
- `private_post_spot_v1_margin_isolated_borrow(request)`
- `private_post_spot_v1_margin_isolated_repay(request)`
- `private_post_spot_v1_margin_isolated_transfer(request)`
- `private_post_account_v1_transfer_contract_list(request)`
- `private_post_account_v1_transfer_contract(request)`
- `private_post_contract_private_submit_order(request)`
- `private_post_contract_private_cancel_order(request)`
- `private_post_contract_private_cancel_orders(request)`
- `private_post_contract_private_submit_plan_order(request)`
- `private_post_contract_private_cancel_plan_order(request)`
- `private_post_contract_private_submit_leverage(request)`
- `private_post_contract_private_submit_tp_sl_order(request)`
- `private_post_contract_private_modify_plan_order(request)`
- `private_post_contract_private_modify_preset_plan_order(request)`
- `private_post_contract_private_modify_limit_order(request)`
- `private_post_contract_private_modify_tp_sl_order(request)`
- `private_post_contract_private_submit_trail_order(request)`
- `private_post_contract_private_cancel_trail_order(request)`
- `private_post_contract_private_set_position_mode(request)`
### WS Unified
- `describe(self)`
- `subscribe(self, channel, symbol, type, params={})`
- `subscribe_multiple(self, channel: str, type: str, symbols: Strings = None, params={})`
- `watch_balance(self, params={})`
- `set_balance_cache(self, client: Client, type, subscribeHash)`
- `load_balance_snapshot(self, client, messageHash, type)`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_trades_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `un_watch_trades_for_symbols(self, symbols: List[str], params={})`
- `get_params_for_multiple_sub(self, methodName: str, symbols: List[str], limit: Int = None, params={})`
- `watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `un_watch_ticker(self, symbol: str, params={})`
- `un_watch_tickers(self, symbols: Strings = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `un_watch_orders(self, symbol: Str = None, params={})`
- `watch_positions(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `un_watch_positions(self, symbols: Strings = None, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `un_watch_ohlcv(self, symbol: str, timeframe: str = '1m', params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `watch_order_book_for_symbols(self, symbols: List[str], limit: Int = None, params={})`
- `un_watch_order_book_for_symbols(self, symbols: List[str], params={})`
- `authenticate(self, type, params={})`
- `get_un_sub_params(self, messageTopic)`
## Contribution
- Give us a star :star:
- Fork and Clone! Awesome
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:14:02.338001 | bitmart-0.0.125.tar.gz | 765,330 | 81/05/8462fd25930d295f4c324a10aa8b635af5ec9bbcaf0950bc2b9e7a8c2c47/bitmart-0.0.125.tar.gz | source | sdist | null | false | a4e52d79781a0f15e355dc64681fdb04 | 4ba3d5d84c6973d800983061f1663153cef0cee2aed15b09b0550889440ac1af | 81058462fd25930d295f4c324a10aa8b635af5ec9bbcaf0950bc2b9e7a8c2c47 | null | [] | 234 |
2.1 | lpd-nodeps | 0.4.13 | A Fast, Flexible Trainer with Callbacks and Extensions for PyTorch | 
# lpd
A Fast, Flexible Trainer with Callbacks and Extensions for PyTorch
``lpd`` derives from the Hebrew word *lapid* (לפיד) which means "torch".
## For latest PyPI stable release
[](https://badge.fury.io/py/lpd)
[](https://pepy.tech/project/lpd)

There are 2 types of ``lpd`` packages available
* ``lpd`` which brings dependencies for pytorch, numpy and tensorboard
```sh
pip install lpd
```
* ``lpd-nodeps`` for which **you provide** your own dependencies for pytorch, numpy and tensorboard
```sh
pip install lpd-nodeps
```
<b>[v0.4.13-beta](https://github.com/RoySadaka/lpd/releases) Release - contains the following:</b>
* ``ThresholdChecker`` is updated to compute improvement according to last improved step and not to the best received metric
* Some minor cosmetic changes
Previously on lpd:
* ``Dense`` custom layer to support apply norm (configurable to before or after activation)
* ``StatsPrint`` callback to support printing best confusion matrix when at least one of the metrics is of type ``MetricConfusionMatrixBase``
* ``TransformerEncoderStack`` to support activation as input
* ``PositionalEncoding`` to support more than 3 dimensions input
* Updated Pipfile
* Fixed confusion matrix cpu/gpu device error
* Better handling on callbacks where apply_on_states=None (apply on all states)
* Bug fix in case validation samples are empty
## Usage
``lpd`` intended to properly structure your PyTorch model training.
The main usages are given below.
### Training your model
```python
from lpd.trainer import Trainer
from lpd.enums import Phase, State, MonitorType, MonitorMode, StatsType
from lpd.callbacks import LossOptimizerHandler, StatsPrint, ModelCheckPoint, Tensorboard, EarlyStopping, SchedulerStep, CallbackMonitor
from lpd.extensions.custom_schedulers import KerasDecay
from lpd.metrics import BinaryAccuracyWithLogits, FalsePositives
from lpd.utils.torch_utils import get_gpu_device_if_available
from lpd.utils.general_utils import seed_all
from lpd.utils.threshold_checker import AbsoluteThresholdChecker
seed_all(seed=42) # because its the answer to life and the universe
device = get_gpu_device_if_available() # with fallback to CPU if GPU not available
model = MyModel().to(device) # this is your model class, and its being sent to the relevant device
optimizer = torch.optim.SGD(params=model.parameters())
scheduler = KerasDecay(optimizer, decay=0.01, last_step=-1) # decay scheduler using keras formula
loss_func = torch.nn.BCEWithLogitsLoss().to(device) # this is your loss class, already sent to the relevant device
metrics = [BinaryAccuracyWithLogits(name='Accuracy'), FalsePositives(name='FP', num_class=2, threshold=0)] # define your metrics
# you can use some of the defined callbacks, or you can create your own
callbacks = [
    LossOptimizerHandler(),
    SchedulerStep(apply_on_phase=Phase.BATCH_END, apply_on_states=State.TRAIN),
    ModelCheckPoint(checkpoint_dir,
                    checkpoint_file_name,
                    CallbackMonitor(monitor_type=MonitorType.LOSS,
                                    stats_type=StatsType.VAL,
                                    monitor_mode=MonitorMode.MIN),
                    save_best_only=True),
    Tensorboard(summary_writer_dir=summary_writer_dir),
    EarlyStopping(CallbackMonitor(monitor_type=MonitorType.METRIC,
                                  stats_type=StatsType.VAL,
                                  monitor_mode=MonitorMode.MAX,
                                  patience=10,
                                  metric_name='Accuracy'),
                  threshold_checker=AbsoluteThresholdChecker(monitor_mode=MonitorMode.MAX, threshold=0.01)),
    StatsPrint(train_metrics_monitors=[CallbackMonitor(monitor_type=MonitorType.METRIC,
                                                       stats_type=StatsType.TRAIN,
                                                       monitor_mode=MonitorMode.MAX, # <-- notice MAX
                                                       metric_name='Accuracy'),
                                       CallbackMonitor(monitor_type=MonitorType.METRIC,
                                                       stats_type=StatsType.TRAIN,
                                                       monitor_mode=MonitorMode.MIN, # <-- notice MIN
                                                       metric_name='FP')],
               print_confusion_matrix=True) # one of the metrics (FalsePositives) is confusion-matrix based, so let's print the whole confusion matrix
]
trainer = Trainer(model,
                  device,
                  loss_func,
                  optimizer,
                  scheduler,
                  metrics,
                  train_data_loader,  # DataLoader, Iterable or Generator
                  val_data_loader,    # DataLoader, Iterable or Generator
                  train_steps,
                  val_steps,
                  callbacks,
                  name='Readme-Example')
trainer.train(num_epochs)
```
### Evaluating your model
``trainer.evaluate`` will return a ``StatsResult`` that stores the loss and metric results for the test set
```python
evaluation_result = trainer.evaluate(test_data_loader, test_steps)
```
### Making predictions
``Predictor`` class will generate output predictions from input samples.
``Predictor`` class can be created from ``Trainer``
```python
predictor_from_trainer = Predictor.from_trainer(trainer)
predictions = predictor_from_trainer.predict_batch(batch)
```
``Predictor`` class can also be created from saved checkpoint
```python
predictor_from_checkpoint = Predictor.from_checkpoint(checkpoint_dir,
                                                      checkpoint_file_name,
                                                      model,  # nn.Module, weights will be loaded from checkpoint
                                                      device)
prediction = predictor_from_checkpoint.predict_sample(sample)
```
Lastly, ``Predictor`` class can be initialized explicitly
```python
predictor = Predictor(model,
                      device,
                      callbacks,  # relevant only for prediction callbacks (see callbacks Phases and States)
                      name='lpd predictor')
predictions = predictor.predict_data_loader(data_loader, steps)
```
Just to be fair, you can also predict directly from ``Trainer`` class
```python
# On single sample:
prediction = trainer.predict_sample(sample)
# On batch:
predictions = trainer.predict_batch(batch)
# On Dataloader/Iterable/Generator:
predictions = trainer.predict_data_loader(data_loader, steps)
```
## TrainerStats
``Trainer`` tracks stats for `train/validate/test` and you can access them in your custom callbacks
or any other place that has access to your trainer.
Here are some examples
```python
train_loss = trainer.train_stats.get_loss() # the mean of the last epoch's train losses
val_loss = trainer.val_stats.get_loss() # the mean of the last epoch's validation losses
test_loss = trainer.test_stats.get_loss() # the mean of the test losses (available only after calling evaluate)
train_metrics = trainer.train_stats.get_metrics() # dict(metric_name, MetricMethod(values)) of the current epoch in train state
val_metrics = trainer.val_stats.get_metrics() # dict(metric_name, MetricMethod(values)) of the current epoch in validation state
test_metrics = trainer.test_stats.get_metrics() # dict(metric_name, MetricMethod(values)) of the test (available only after calling evaluate)
```
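The per-epoch loss above is a mean over the batch losses collected during the epoch. A toy sketch of that bookkeeping (the class name and methods here are hypothetical, not lpd's internals):

```python
class EpochStats:
    """Toy stand-in for per-state stats: accumulates batch losses and reports their mean."""
    def __init__(self):
        self.losses = []

    def add_loss(self, batch_loss):
        self.losses.append(batch_loss)

    def get_loss(self):
        # mean of the losses collected so far in this epoch
        return sum(self.losses) / len(self.losses) if self.losses else 0.0

stats = EpochStats()
for batch_loss in [0.9, 0.7, 0.5]:
    stats.add_loss(batch_loss)
print(stats.get_loss())  # ≈ 0.7
```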
## Callbacks
Will be used to perform actions at various stages.
Some common callbacks are available under ``lpd.callbacks``, and you can also create your own, more details below.
In a callback, ``apply_on_phase`` (``lpd.enums.Phase``) will determine the execution phase,
and ``apply_on_states`` (``lpd.enums.State`` or ``list(lpd.enums.State)``) will determine the execution states.
These are the currently available phases and states; more might be added in future releases.
### Training and Validation phases and states behave as follows
```python
State.EXTERNAL
Phase.TRAIN_BEGIN
# train loop:
    Phase.EPOCH_BEGIN
    State.TRAIN
    # batches loop:
        Phase.BATCH_BEGIN
        # batch
        Phase.BATCH_END
    State.VAL
    # batches loop:
        Phase.BATCH_BEGIN
        # batch
        Phase.BATCH_END
    State.EXTERNAL
    Phase.EPOCH_END
Phase.TRAIN_END
```
### Evaluation phases and states behave as follows
```python
State.EXTERNAL
Phase.TEST_BEGIN
State.TEST
# batches loop:
    Phase.BATCH_BEGIN
    # batch
    Phase.BATCH_END
State.EXTERNAL
Phase.TEST_END
```
### Predict phases and states behave as follows
```python
State.EXTERNAL
Phase.PREDICT_BEGIN
State.PREDICT
# batches loop:
    Phase.BATCH_BEGIN
    # batch
    Phase.BATCH_END
State.EXTERNAL
Phase.PREDICT_END
```
Callbacks will be executed under the relevant phase and state, and in their order.
With phases and states, you have full control over the timing of your callbacks.
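The filtering rule above can be sketched as follows: a callback fires only when the current phase matches its ``apply_on_phase`` and the current state is among its ``apply_on_states`` (a simplified model for illustration, not lpd's actual dispatch code):

```python
def should_fire(callback_phase, callback_states, current_phase, current_state):
    # normalize a single state into a list, since apply_on_states accepts State or list(State)
    states = callback_states if isinstance(callback_states, list) else [callback_states]
    return current_phase == callback_phase and current_state in states

# strings stand in for lpd.enums.Phase / lpd.enums.State members
print(should_fire('BATCH_END', 'TRAIN', 'BATCH_END', 'TRAIN'))        # True
print(should_fire('BATCH_END', 'TRAIN', 'BATCH_END', 'VAL'))          # False: state mismatch
print(should_fire('EPOCH_END', ['EXTERNAL'], 'BATCH_END', 'EXTERNAL'))  # False: phase mismatch
```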
Let's take a look at some of the callbacks ``lpd`` provides:
### LossOptimizerHandler Callback
Derives from ``LossOptimizerHandlerBase``, probably the most important callback during training 😎
Use ``LossOptimizerHandler`` to determine when to call:
```python
loss.backward(...)
optimizer.step(...)
optimizer.zero_grad(...)
```
Or, you may choose to create your own ``AwesomeLossOptimizerHandler`` class by deriving from ``LossOptimizerHandlerBase``.
``Trainer.train(...)`` will validate that at least one ``LossOptimizerHandlerBase`` callback was provided.
### LossOptimizerHandlerAccumulateBatches Callback
Like ``LossOptimizerHandlerAccumulateSamples``, this callback calls ``loss.backward()`` on every batch, but invokes ``optimizer.step()`` and ``optimizer.zero_grad()``
only after the defined number of batches (or samples) has been accumulated.
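The accumulation idea amounts to counting batches and stepping the optimizer only on every N-th one; a minimal sketch of the logic (not lpd's code — names here are illustrative):

```python
def simulate_accumulation(num_batches, accumulate_every):
    """Count how many optimizer steps happen when stepping only every `accumulate_every` batches."""
    steps = 0
    accumulated = 0
    for _ in range(num_batches):
        # loss.backward() would run here on every batch, accumulating gradients
        accumulated += 1
        if accumulated == accumulate_every:
            # optimizer.step() and optimizer.zero_grad() would run here
            steps += 1
            accumulated = 0
    return steps

print(simulate_accumulation(num_batches=10, accumulate_every=4))  # 2
```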
### StatsPrint Callback
``StatsPrint`` callback prints informative summary of the trainer stats including loss and metrics.
* ``CallbackMonitor`` can add a nicer look with an ``IMPROVED`` indication on an improved loss or metric, see the output example below.
* Loss (for all states) will be monitored as ``MonitorMode.MIN``
* For train metrics, provide your own monitors via ``train_metrics_monitors`` argument
* Validation metrics monitors will be added automatically according to ``train_metrics_monitors`` argument
```python
from lpd.enums import Phase, State, MonitorType, StatsType, MonitorMode
from lpd.callbacks import StatsPrint, CallbackMonitor

StatsPrint(apply_on_phase=Phase.EPOCH_END,
           apply_on_states=State.EXTERNAL,
           train_metrics_monitors=CallbackMonitor(monitor_type=MonitorType.METRIC,
                                                  stats_type=StatsType.TRAIN,
                                                  monitor_mode=MonitorMode.MAX,
                                                  metric_name='TruePositives'),
           print_confusion_matrix_normalized=True)  # if you use one of the ConfusionMatrix metrics (e.g. TruePositives), you may also print the confusion matrix
```
Output example:

### ModelCheckPoint Callback
Saving a checkpoint when a monitored loss/metric has improved.
The callback will save the model, optimizer, scheduler, and epoch number.
You can also configure it to save the full trainer.
For example, ``ModelCheckPoint`` that will save a new *full trainer checkpoint* every time the validation metric_name ``my_metric``
is getting higher than the highest value so far.
```python
ModelCheckPoint(Phase.EPOCH_END,
                State.EXTERNAL,
                checkpoint_dir,
                checkpoint_file_name,
                CallbackMonitor(monitor_type=MonitorType.METRIC,  # it's a Metric and not a Loss
                                stats_type=StatsType.VAL,         # check the value on the Validation set
                                monitor_mode=MonitorMode.MAX,     # MAX indicates higher is better
                                metric_name='my_metric'),         # since it's a Metric, mention its name
                save_best_only=False,
                save_full_trainer=True)
```
### EarlyStopping Callback
Stops the trainer when a monitored loss/metric has stopped improving.
For example, EarlyStopping that will monitor at the end of every epoch, and stop the trainer if the validation loss didn't improve (decrease) for the last 10 epochs.
```python
EarlyStopping(Phase.EPOCH_END,
              State.EXTERNAL,
              CallbackMonitor(monitor_type=MonitorType.LOSS,
                              stats_type=StatsType.VAL,
                              monitor_mode=MonitorMode.MIN,
                              patience=10))
```
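The patience mechanism can be sketched like this: reset the countdown whenever the monitored value improves, and stop once patience runs out (a simplified model of the idea, not lpd's implementation):

```python
def early_stop_epoch(val_losses, patience):
    """Return the 1-based epoch at which training stops, or None if it never stops."""
    best = float('inf')
    epochs_without_improvement = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:  # MonitorMode.MIN: lower is better
            best = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                return epoch  # trainer.stop() would be called here
    return None

losses = [0.9, 0.8, 0.85, 0.84, 0.83]
print(early_stop_epoch(losses, patience=3))  # 5: no improvement since epoch 2
```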
### SchedulerStep Callback
Will invoke ``step()`` on your scheduler in the desired phase and state.
For example, SchedulerStep callback to invoke ``scheduler.step()`` at the end of every batch, in train state (as opposed to validation and test):
```python
from lpd.callbacks import SchedulerStep
from lpd.enums import Phase, State
SchedulerStep(apply_on_phase=Phase.BATCH_END, apply_on_states=State.TRAIN)
```
### Tensorboard Callback
Will export the loss and the metrics at a given phase and state, in a format that can be viewed on Tensorboard
```python
from lpd.callbacks import Tensorboard
Tensorboard(apply_on_phase=Phase.EPOCH_END,
            apply_on_states=State.EXTERNAL,
            summary_writer_dir=dir_path)
```
### TensorboardImage Callback
Will export images, in a format that can be viewed on Tensorboard.
For example, a TensorboardImage callback that will output all the images generated in validation
```python
from lpd.callbacks import TensorboardImage
TensorboardImage(apply_on_phase=Phase.BATCH_END,
                 apply_on_states=State.VAL,
                 summary_writer_dir=dir_path,
                 description='Generated Images',
                 outputs_parser=None)
```
Let's pass an ``outputs_parser`` that will change the range of the outputs from [-1,1] to [0,255]
```python
import torchvision
from lpd.callbacks import TensorboardImage

def outputs_parser(input_output_label: InputOutputLabel):
    # rescale from [-1, 1] to [0, 255], then arrange the batch into a grid
    outputs_scaled = (input_output_label.outputs + 1.0) / 2.0 * 255
    return torchvision.utils.make_grid(outputs_scaled)

TensorboardImage(apply_on_phase=Phase.BATCH_END,
                 apply_on_states=State.VAL,
                 summary_writer_dir=dir_path,
                 description='Generated Images',
                 outputs_parser=outputs_parser)
```
### CollectOutputs Callback
Will collect the model's outputs for the defined states.
CollectOutputs is automatically used by ``Trainer`` to collect the predictions when calling one of the ``predict`` methods.
```python
CollectOutputs(apply_on_phase=Phase.BATCH_END, apply_on_states=State.VAL)
```
### Create your custom callbacks
```python
from lpd.enums import Phase, State
from lpd.callbacks import CallbackBase
class MyAwesomeCallback(CallbackBase):
    def __init__(self, apply_on_phase=Phase.BATCH_END, apply_on_states=[State.TRAIN, State.VAL]):
        # make sure to call the parent class's init
        super(MyAwesomeCallback, self).__init__(apply_on_phase, apply_on_states)

    def __call__(self, callback_context):  # <=== implement this method!
        # your implementation here
        # using callback_context, you can access anything in your trainer
        # below are some examples to get the hang of it
        val_loss = callback_context.val_stats.get_loss()
        train_loss = callback_context.train_stats.get_loss()
        train_metrics = callback_context.train_stats.get_metrics()
        val_metrics = callback_context.val_stats.get_metrics()
        optimizer = callback_context.optimizer
        scheduler = callback_context.scheduler
        trainer = callback_context.trainer

        if val_loss < 0.0001:
            # you can also mark the trainer to STOP training by calling stop()
            trainer.stop()
```
Let's expand ``MyAwesomeCallback`` with ``CallbackMonitor`` to track if our validation loss is getting better
```python
from lpd.callbacks import CallbackBase, CallbackContext, CallbackMonitor  # <== CallbackMonitor added
from lpd.enums import Phase, State, MonitorType, StatsType, MonitorMode  # <== added a few enums needed to configure CallbackMonitor

class MyAwesomeCallback(CallbackBase):
    def __init__(self, apply_on_phase=Phase.BATCH_END, apply_on_states=[State.TRAIN, State.VAL]):
        super(MyAwesomeCallback, self).__init__(apply_on_phase, apply_on_states)
        # adding CallbackMonitor to track VAL LOSS with regards to MIN (lower is better) and a patience of 20 epochs
        self.val_loss_monitor = CallbackMonitor(MonitorType.LOSS, StatsType.VAL, MonitorMode.MIN, patience=20)

    def __call__(self, callback_context: CallbackContext):  # <=== implement this method!
        # same as before, using callback_context, you can access anything in your trainer
        train_metrics = callback_context.train_stats.get_metrics()
        val_metrics = callback_context.val_stats.get_metrics()

        # invoke track() on your monitor and pass callback_context as a parameter
        # since you configured your val_loss_monitor, it will get the relevant values from callback_context
        monitor_result = self.val_loss_monitor.track(callback_context)

        # monitor_result (lpd.callbacks.CallbackMonitorResult) contains informative properties
        # for example, let's check the status of the patience countdown
        if monitor_result.has_patience():
            print(f'[MyAwesomeCallback] - patience left: {monitor_result.patience_left}')

        # or, let's stop the trainer by calling trainer.stop()
        # if our monitored value did not improve
        if not monitor_result.has_improved():
            print(f'[MyAwesomeCallback] - {monitor_result.description} has stopped improving')
            callback_context.trainer.stop()
```
### CallbackMonitor, AbsoluteThresholdChecker and RelativeThresholdChecker
When using callbacks such as ``EarlyStopping``, a ``CallbackMonitor`` is provided to track
a certain metric and reset/trigger the stopping event (or any event in other callbacks).
``CallbackMonitor`` will internally use ``ThresholdChecker`` when comparing new value to old value
for the tracked metric, and ``AbsoluteThresholdChecker`` or ``RelativeThresholdChecker`` will be used
to check if the criteria was met.
The following example creates a ``CallbackMonitor`` that will track if the metric 'accuracy'
has increased by more than 1%, using ``RelativeThresholdChecker``
```python
from lpd.utils.threshold_checker import RelativeThresholdChecker
relative_threshold_checker_1_percent = RelativeThresholdChecker(monitor_mode=MonitorMode.MAX, threshold=0.01)

CallbackMonitor(monitor_type=MonitorType.METRIC,  # it's a Metric and not a Loss
                stats_type=StatsType.VAL,         # check the value on the Validation set
                monitor_mode=MonitorMode.MAX,     # MAX indicates higher is better
                metric_name='accuracy',           # since it's a Metric, mention its name
                threshold_checker=relative_threshold_checker_1_percent)  # track a 1% increase from the last highest value
```
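The difference between the two checkers can be sketched like this — for ``MonitorMode.MAX``, an absolute checker asks whether the new value beats the old one by at least a fixed amount, while a relative checker asks whether it beats it by a fraction of the old value (function names and formulas are a sketch of the idea, not lpd's exact code):

```python
def absolute_improved_max(new_value, old_value, threshold):
    # MonitorMode.MAX with an absolute threshold: improvement by more than `threshold`
    return (new_value - old_value) > threshold

def relative_improved_max(new_value, old_value, threshold):
    # MonitorMode.MAX with a relative threshold: improvement by more than `threshold` * old_value
    return (new_value - old_value) > threshold * old_value

# 0.80 -> 0.805 is a 0.005 absolute gain, but only a 0.625% relative gain
print(absolute_improved_max(0.805, 0.80, threshold=0.001))  # True
print(relative_improved_max(0.805, 0.80, threshold=0.01))   # False: less than a 1% increase
```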
## Metrics
``lpd.metrics`` provides metrics to check the accuracy of your model.
Let's create a custom metric using ``MetricBase`` and also show the use of ``BinaryAccuracyWithLogits`` in this example
```python
from lpd.metrics import BinaryAccuracyWithLogits, MetricBase
from lpd.enums import MetricMethod

# our custom metric
class InaccuracyWithLogits(MetricBase):
    def __init__(self):
        super(InaccuracyWithLogits, self).__init__(MetricMethod.MEAN)  # use mean over the batches
        self.bawl = BinaryAccuracyWithLogits()  # we exploit BinaryAccuracyWithLogits for the computation

    def __call__(self, y_pred, y_true):  # <=== implement this method!
        # your implementation here
        acc = self.bawl(y_pred, y_true)
        return 1 - acc  # return the inaccuracy

# we can now define our metrics and pass them to the trainer
metrics = [BinaryAccuracyWithLogits(name='accuracy'), InaccuracyWithLogits(name='inaccuracy')]
```
Let's do another example, a custom metric ``Truthfulness`` based on confusion matrix using ``MetricConfusionMatrixBase``
```python
from lpd.metrics import MetricConfusionMatrixBase, TruePositives, TrueNegatives
from lpd.enums import ConfusionMatrixBasedMetric

# our custom metric
class Truthfulness(MetricConfusionMatrixBase):
    def __init__(self, num_classes, labels=None, predictions_to_classes_convertor=None, threshold=0.5):
        super(Truthfulness, self).__init__(num_classes, labels, predictions_to_classes_convertor, threshold)
        self.tp = TruePositives(num_classes, labels, predictions_to_classes_convertor, threshold)  # we exploit TruePositives for the computation
        self.tn = TrueNegatives(num_classes, labels, predictions_to_classes_convertor, threshold)  # we exploit TrueNegatives for the computation

    def __call__(self, y_pred, y_true):  # <=== implement this method!
        tp_per_class = self.tp(y_pred, y_true)
        tn_per_class = self.tn(y_pred, y_true)

        # you can also access more confusion-matrix metrics, such as
        f1score = self.get_stats(ConfusionMatrixBasedMetric.F1SCORE)
        precision = self.get_stats(ConfusionMatrixBasedMetric.PRECISION)
        recall = self.get_stats(ConfusionMatrixBasedMetric.RECALL)
        # see the ConfusionMatrixBasedMetric enum for more

        return tp_per_class + tn_per_class
```
## Save and Load full Trainer
Sometimes you just want to save everything so you can continue training where you left off.
To do so, you may use ``ModelCheckPoint`` for saving the full trainer by setting the parameter
```python
save_full_trainer=True
```
Or, you can invoke it directly from your trainer
```python
your_trainer.save_trainer(dir_path, file_name)
```
Loading a trainer from checkpoint is as simple as:
```python
loaded_trainer = Trainer.load_trainer(dir_path,          # the folder where the saved trainer file exists
                                      trainer_file_name, # the saved trainer file name
                                      model,             # state_dict will be loaded
                                      device,
                                      loss_func,         # state_dict will be loaded
                                      optimizer,         # state_dict will be loaded
                                      scheduler,         # state_dict will be loaded
                                      train_data_loader, # provide a new/previous data_loader
                                      val_data_loader,   # provide a new/previous data_loader
                                      train_steps,
                                      val_steps)
```
### Utils
``lpd.utils`` provides ``torch_utils``, ``file_utils`` and ``general_utils``
For example, a good practice is to use ``seed_all`` as early as possible in your code, to make sure that results are reproducible:
```python
import lpd.utils.general_utils as gu
gu.seed_all(seed=42) # because it's the answer to life, the universe and everything
```
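``seed_all`` presumably seeds the relevant random number generators (Python's ``random``, and frameworks like NumPy/PyTorch) in one call; the reproducibility idea can be demonstrated with just the standard library:

```python
import random

def draw(seed):
    # reseeding before drawing makes the sequence deterministic
    random.seed(seed)
    return [random.randint(0, 100) for _ in range(3)]

# the same seed yields the same sequence...
print(draw(42) == draw(42))  # True
# ...while a different seed will almost certainly yield a different one
print(draw(42) == draw(7))
```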
### Extensions
``lpd.extensions`` provides some custom PyTorch layers and schedulers; these are utilities we like to use when creating our models, to gain better flexibility.
Use them at your own will; more extensions are added from time to time.
## Something is missing?! Please share with us
You can open an issue, but also feel free to email us at torch.lpd@gmail.com
| text/markdown | Roy Sadaka | null | lpd developers | torch.lpd@gmail.com | MIT Licences | lpd-nodeps | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"Intended Audience :: Education",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Topic :: Scientific/Engineering :: Art... | [] | https://github.com/roysadaka/lpd | null | >=3.9 | [] | [] | [] | [
"tqdm"
] | [] | [] | [] | [] | twine/4.0.1 CPython/3.7.6 | 2026-02-19T10:13:58.856622 | lpd_nodeps-0.4.13-py3-none-any.whl | 50,186 | 4f/0d/881233914dcd15902d1b3ffaf313be9879d5f07cae06246dada6e752c1e6/lpd_nodeps-0.4.13-py3-none-any.whl | py3 | bdist_wheel | null | false | 4da4c1f76c1a9bdb087b928c0321f2da | d754c855a7c3a97e0e642473c413460ef2f41e384baae8259ee7c9f65b3f4d5e | 4f0d881233914dcd15902d1b3ffaf313be9879d5f07cae06246dada6e752c1e6 | null | [] | 103 |
2.4 | bingx | 0.0.124 | bingx crypto exchange api client | # bingx-python
Python SDK (sync and async) for Bingx cryptocurrency exchange with Rest and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/bingx)
- You can check Bingx's docs here: [Docs](https://www.google.com/search?q=google+bingx+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/bingx-python
- Pypi package: https://pypi.org/project/bingx
## Installation
```
pip install bingx
```
## Usage
### Sync
```Python
from bingx import BingxSync
def main():
    instance = BingxSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)
    #
    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
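The order book returned here follows ccxt's unified structure: ``'bids'`` and ``'asks'`` are price-sorted lists of ``[price, amount]`` pairs. A small helper (ours, not part of the SDK) for deriving best quotes and spread from such a structure:

```python
def best_quotes(order_book):
    """Extract the best bid/ask and spread from a ccxt-style unified order book."""
    best_bid = order_book['bids'][0][0]  # bids are sorted from highest price down
    best_ask = order_book['asks'][0][0]  # asks are sorted from lowest price up
    return best_bid, best_ask, best_ask - best_bid

# example order book in the unified shape
ob = {
    'bids': [[99950.0, 0.5], [99940.0, 1.2]],
    'asks': [[100050.0, 0.3], [100060.0, 0.8]],
}
print(best_quotes(ob))  # (99950.0, 100050.0, 100.0)
```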
### Async
```Python
import sys
import asyncio
from bingx import BingxAsync
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
    instance = BingxAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)
    #
    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from bingx import BingxWs

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = BingxWs({})
    while True:
        ob = await instance.watch_order_book("BTC/USDC")
        print(ob)
        # orders = await instance.watch_orders("BTC/USDC")
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
#### Raw call
You can also construct custom requests to available "implicit" endpoints
```Python
request = {
    'type': 'candleSnapshot',
    'req': {
        'coin': coin,
        'interval': tf,
        'startTime': since,
        'endTime': until,
    },
}
response = await instance.public_post_info(request)
```
## Available methods
### REST Unified
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_market_order_with_cost(self, symbol: str, side: OrderSide, cost: float, params={})`
- `create_market_sell_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_balance(self, params={})`
- `fetch_canceled_and_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_canceled_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_addresses_by_network(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_inverse_swap_markets(self, params)`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_margin_mode(self, symbol: str, params={})`
- `fetch_mark_price(self, symbol: str, params={})`
- `fetch_mark_prices(self, symbols: Strings = None, params={})`
- `fetch_market_leverage_tiers(self, symbol: str, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_liquidations(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest(self, symbol: str, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position_history(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_position_mode(self, symbol: Str = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_spot_markets(self, params)`
- `fetch_swap_markets(self, params)`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_margin(self, symbol: str, amount: float, params={})`
- `cancel_all_orders_after(self, timeout: Int, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `close_all_positions(self, params={})`
- `close_position(self, symbol: str, side: OrderSide = None, params={})`
- `custom_encode(self, params)`
- `describe(self)`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `nonce(self)`
- `reduce_margin(self, symbol: str, amount: float, params={})`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_margin_mode(self, marginMode: str, symbol: Str = None, params={})`
- `set_margin(self, symbol: str, amount: float, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `set_sandbox_mode(self, enable: bool)`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
### REST Raw
- `fund_v1_private_get_account_balance(request)`
- `spot_v1_public_get_server_time(request)`
- `spot_v1_public_get_common_symbols(request)`
- `spot_v1_public_get_market_trades(request)`
- `spot_v1_public_get_market_depth(request)`
- `spot_v1_public_get_market_kline(request)`
- `spot_v1_public_get_ticker_24hr(request)`
- `spot_v1_public_get_ticker_price(request)`
- `spot_v1_public_get_ticker_bookticker(request)`
- `spot_v1_private_get_trade_query(request)`
- `spot_v1_private_get_trade_openorders(request)`
- `spot_v1_private_get_trade_historyorders(request)`
- `spot_v1_private_get_trade_mytrades(request)`
- `spot_v1_private_get_user_commissionrate(request)`
- `spot_v1_private_get_account_balance(request)`
- `spot_v1_private_get_oco_orderlist(request)`
- `spot_v1_private_get_oco_openorderlist(request)`
- `spot_v1_private_get_oco_historyorderlist(request)`
- `spot_v1_private_post_trade_order(request)`
- `spot_v1_private_post_trade_cancel(request)`
- `spot_v1_private_post_trade_batchorders(request)`
- `spot_v1_private_post_trade_order_cancelreplace(request)`
- `spot_v1_private_post_trade_cancelorders(request)`
- `spot_v1_private_post_trade_cancelopenorders(request)`
- `spot_v1_private_post_trade_cancelallafter(request)`
- `spot_v1_private_post_oco_order(request)`
- `spot_v1_private_post_oco_cancel(request)`
- `spot_v2_public_get_market_depth(request)`
- `spot_v2_public_get_market_kline(request)`
- `spot_v2_public_get_ticker_price(request)`
- `spot_v3_private_get_get_asset_transfer(request)`
- `spot_v3_private_get_asset_transfer(request)`
- `spot_v3_private_get_capital_deposit_hisrec(request)`
- `spot_v3_private_get_capital_withdraw_history(request)`
- `spot_v3_private_post_post_asset_transfer(request)`
- `swap_v1_public_get_ticker_price(request)`
- `swap_v1_public_get_market_historicaltrades(request)`
- `swap_v1_public_get_market_markpriceklines(request)`
- `swap_v1_public_get_trade_multiassetsrules(request)`
- `swap_v1_public_get_tradingrules(request)`
- `swap_v1_private_get_positionside_dual(request)`
- `swap_v1_private_get_trade_batchcancelreplace(request)`
- `swap_v1_private_get_trade_fullorder(request)`
- `swap_v1_private_get_maintmarginratio(request)`
- `swap_v1_private_get_trade_positionhistory(request)`
- `swap_v1_private_get_positionmargin_history(request)`
- `swap_v1_private_get_twap_openorders(request)`
- `swap_v1_private_get_twap_historyorders(request)`
- `swap_v1_private_get_twap_orderdetail(request)`
- `swap_v1_private_get_trade_assetmode(request)`
- `swap_v1_private_get_user_marginassets(request)`
- `swap_v1_private_post_trade_amend(request)`
- `swap_v1_private_post_trade_cancelreplace(request)`
- `swap_v1_private_post_positionside_dual(request)`
- `swap_v1_private_post_trade_batchcancelreplace(request)`
- `swap_v1_private_post_trade_closeposition(request)`
- `swap_v1_private_post_trade_getvst(request)`
- `swap_v1_private_post_twap_order(request)`
- `swap_v1_private_post_twap_cancelorder(request)`
- `swap_v1_private_post_trade_assetmode(request)`
- `swap_v1_private_post_trade_reverse(request)`
- `swap_v1_private_post_trade_autoaddmargin(request)`
- `swap_v2_public_get_server_time(request)`
- `swap_v2_public_get_quote_contracts(request)`
- `swap_v2_public_get_quote_price(request)`
- `swap_v2_public_get_quote_depth(request)`
- `swap_v2_public_get_quote_trades(request)`
- `swap_v2_public_get_quote_premiumindex(request)`
- `swap_v2_public_get_quote_fundingrate(request)`
- `swap_v2_public_get_quote_klines(request)`
- `swap_v2_public_get_quote_openinterest(request)`
- `swap_v2_public_get_quote_ticker(request)`
- `swap_v2_public_get_quote_bookticker(request)`
- `swap_v2_private_get_user_balance(request)`
- `swap_v2_private_get_user_positions(request)`
- `swap_v2_private_get_user_income(request)`
- `swap_v2_private_get_trade_openorders(request)`
- `swap_v2_private_get_trade_openorder(request)`
- `swap_v2_private_get_trade_order(request)`
- `swap_v2_private_get_trade_margintype(request)`
- `swap_v2_private_get_trade_leverage(request)`
- `swap_v2_private_get_trade_forceorders(request)`
- `swap_v2_private_get_trade_allorders(request)`
- `swap_v2_private_get_trade_allfillorders(request)`
- `swap_v2_private_get_trade_fillhistory(request)`
- `swap_v2_private_get_user_income_export(request)`
- `swap_v2_private_get_user_commissionrate(request)`
- `swap_v2_private_get_quote_bookticker(request)`
- `swap_v2_private_post_trade_getvst(request)`
- `swap_v2_private_post_trade_order(request)`
- `swap_v2_private_post_trade_batchorders(request)`
- `swap_v2_private_post_trade_closeallpositions(request)`
- `swap_v2_private_post_trade_cancelallafter(request)`
- `swap_v2_private_post_trade_margintype(request)`
- `swap_v2_private_post_trade_leverage(request)`
- `swap_v2_private_post_trade_positionmargin(request)`
- `swap_v2_private_post_trade_order_test(request)`
- `swap_v2_private_delete_trade_order(request)`
- `swap_v2_private_delete_trade_batchorders(request)`
- `swap_v2_private_delete_trade_allopenorders(request)`
- `swap_v3_public_get_quote_klines(request)`
- `swap_v3_private_get_user_balance(request)`
- `cswap_v1_public_get_market_contracts(request)`
- `cswap_v1_public_get_market_premiumindex(request)`
- `cswap_v1_public_get_market_openinterest(request)`
- `cswap_v1_public_get_market_klines(request)`
- `cswap_v1_public_get_market_depth(request)`
- `cswap_v1_public_get_market_ticker(request)`
- `cswap_v1_private_get_trade_leverage(request)`
- `cswap_v1_private_get_trade_forceorders(request)`
- `cswap_v1_private_get_trade_allfillorders(request)`
- `cswap_v1_private_get_trade_openorders(request)`
- `cswap_v1_private_get_trade_orderdetail(request)`
- `cswap_v1_private_get_trade_orderhistory(request)`
- `cswap_v1_private_get_trade_margintype(request)`
- `cswap_v1_private_get_user_commissionrate(request)`
- `cswap_v1_private_get_user_positions(request)`
- `cswap_v1_private_get_user_balance(request)`
- `cswap_v1_private_post_trade_order(request)`
- `cswap_v1_private_post_trade_leverage(request)`
- `cswap_v1_private_post_trade_allopenorders(request)`
- `cswap_v1_private_post_trade_closeallpositions(request)`
- `cswap_v1_private_post_trade_margintype(request)`
- `cswap_v1_private_post_trade_positionmargin(request)`
- `cswap_v1_private_delete_trade_allopenorders(request)`
- `cswap_v1_private_delete_trade_cancelorder(request)`
- `contract_v1_private_get_allposition(request)`
- `contract_v1_private_get_allorders(request)`
- `contract_v1_private_get_balance(request)`
- `wallets_v1_private_get_capital_config_getall(request)`
- `wallets_v1_private_get_capital_deposit_address(request)`
- `wallets_v1_private_get_capital_innertransfer_records(request)`
- `wallets_v1_private_get_capital_subaccount_deposit_address(request)`
- `wallets_v1_private_get_capital_deposit_subhisrec(request)`
- `wallets_v1_private_get_capital_subaccount_innertransfer_records(request)`
- `wallets_v1_private_get_capital_deposit_riskrecords(request)`
- `wallets_v1_private_post_capital_withdraw_apply(request)`
- `wallets_v1_private_post_capital_innertransfer_apply(request)`
- `wallets_v1_private_post_capital_subaccountinnertransfer_apply(request)`
- `wallets_v1_private_post_capital_deposit_createsubaddress(request)`
- `subaccount_v1_private_get_list(request)`
- `subaccount_v1_private_get_assets(request)`
- `subaccount_v1_private_get_allaccountbalance(request)`
- `subaccount_v1_private_post_create(request)`
- `subaccount_v1_private_post_apikey_create(request)`
- `subaccount_v1_private_post_apikey_edit(request)`
- `subaccount_v1_private_post_apikey_del(request)`
- `subaccount_v1_private_post_updatestatus(request)`
- `account_v1_private_get_uid(request)`
- `account_v1_private_get_apikey_query(request)`
- `account_v1_private_get_account_apipermissions(request)`
- `account_v1_private_get_allaccountbalance(request)`
- `account_v1_private_post_innertransfer_authorizesubaccount(request)`
- `account_transfer_v1_private_get_subaccount_asset_transferhistory(request)`
- `account_transfer_v1_private_post_subaccount_transferasset_supportcoins(request)`
- `account_transfer_v1_private_post_subaccount_transferasset(request)`
- `user_auth_private_post_userdatastream(request)`
- `user_auth_private_put_userdatastream(request)`
- `user_auth_private_delete_userdatastream(request)`
- `copytrading_v1_private_get_swap_trace_currenttrack(request)`
- `copytrading_v1_private_get_pfutures_traderdetail(request)`
- `copytrading_v1_private_get_pfutures_profithistorysummarys(request)`
- `copytrading_v1_private_get_pfutures_profitdetail(request)`
- `copytrading_v1_private_get_pfutures_tradingpairs(request)`
- `copytrading_v1_private_get_spot_traderdetail(request)`
- `copytrading_v1_private_get_spot_profithistorysummarys(request)`
- `copytrading_v1_private_get_spot_profitdetail(request)`
- `copytrading_v1_private_get_spot_historyorder(request)`
- `copytrading_v1_private_post_swap_trace_closetrackorder(request)`
- `copytrading_v1_private_post_swap_trace_settpsl(request)`
- `copytrading_v1_private_post_pfutures_setcommission(request)`
- `copytrading_v1_private_post_spot_trader_sellorder(request)`
- `api_v3_private_get_asset_transfer(request)`
- `api_v3_private_get_asset_transferrecord(request)`
- `api_v3_private_get_capital_deposit_hisrec(request)`
- `api_v3_private_get_capital_withdraw_history(request)`
- `api_v3_private_post_post_asset_transfer(request)`
- `api_asset_v1_private_post_transfer(request)`
- `api_asset_v1_public_get_transfer_supportcoins(request)`
- `agent_v1_private_get_account_inviteaccountlist(request)`
- `agent_v1_private_get_reward_commissiondatalist(request)`
- `agent_v1_private_get_account_inviterelationcheck(request)`
- `agent_v1_private_get_asset_depositdetaillist(request)`
- `agent_v1_private_get_reward_third_commissiondatalist(request)`
- `agent_v1_private_get_asset_partnerdata(request)`
- `agent_v1_private_get_commissiondatalist_referralcode(request)`
- `agent_v1_private_get_account_superiorcheck(request)`
### WS Unified
- `describe(self)`
- `un_watch(self, messageHash: str, subMessageHash: str, subscribeHash: str, dataType: str, topic: str, market: Market, methodName: str, params={})`
- `watch_ticker(self, symbol: str, params={})`
- `un_watch_ticker(self, symbol: str, params={})`
- `get_order_book_limit_by_market_type(self, marketType: str, limit: Int = None)`
- `get_message_hash(self, unifiedChannel: str, symbol: Str = None, extra: Str = None)`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `un_watch_ohlcv(self, symbol: str, timeframe: str = '1m', params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_balance(self, params={})`
- `set_balance_cache(self, client: Client, type, subType, subscriptionHash, params)`
- `load_balance_snapshot(self, client, messageHash, type, subType)`
- `keep_alive_listen_key(self, params={})`
- `authenticate(self, params={})`
- `pong(self, client, message)`
## Contribution
- Give us a star :star:
- Fork and clone the repository
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:58.238906 | bingx-0.0.124.tar.gz | 765,996 | a4/61/63ced8666252d800446561c0d0ad0a6b476706917c1dbb7e4cb470b7f316/bingx-0.0.124.tar.gz | source | sdist | null | false | 26b8a2f8431d52bbc2cb7e7d2fc22712 | dbb9954511b7bbc406a0dacdb169a937d8cf1e8b0761311b8cb4cc7e0b34a0ad | a46163ced8666252d800446561c0d0ad0a6b476706917c1dbb7e4cb470b7f316 | null | [] | 292 |
2.4 | okx-exchange | 0.0.125 | okx crypto exchange api client | # okx-python
Python SDK (sync and async) for the Okx cryptocurrency exchange with REST and WebSocket capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/okx)
- You can check Okx's docs here: [Docs](https://www.google.com/search?q=google+okx+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/okx-python
- Pypi package: https://pypi.org/project/okx-exchange
## Installation
```bash
pip install okx-exchange
```
## Usage
### Sync
```Python
from okx import OkxSync

def main():
    instance = OkxSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)

    # private endpoints require API credentials in the constructor:
    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
### Async
```Python
import sys
import asyncio
from okx import OkxAsync

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = OkxAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)

    # private endpoints require API credentials in the constructor:
    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from okx import OkxWs

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = OkxWs({})
    while True:
        ob = await instance.watch_order_book("BTC/USDC")
        print(ob)
        # orders = await instance.watch_orders("BTC/USDC")
    # once you are done with the exchange, close the connection
    await instance.close()

asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints listed under "REST Raw" below:
```Python
# each implicit method takes a dict of raw request parameters;
# e.g. public_get_market_candles maps to GET /api/v5/market/candles
request = {
    'instId': 'BTC-USDT',
    'bar': '1m',
    'limit': 100,
}
response = await instance.public_get_market_candles(request)
```
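The implicit method names map mechanically onto the raw HTTP endpoints: the visibility (`public`/`private`), the lowercased HTTP verb, and the URL path with `/` and `-` flattened to `_`. A minimal sketch of that convention, assuming a simplified version of the ccxt naming scheme (the helper below is illustrative, not part of the SDK):

```python
def implicit_method_name(visibility: str, verb: str, path: str) -> str:
    """Derive the implicit method name for a raw endpoint.

    Illustrative simplification of the ccxt naming scheme:
    lowercase the HTTP verb and flatten the URL path.
    """
    flattened = path.strip("/").replace("/", "_").replace("-", "_")
    return f"{visibility}_{verb.lower()}_{flattened}".lower()

# GET /market/candles on the public API:
print(implicit_method_name("public", "GET", "market/candles"))
# → public_get_market_candles
# POST /trade/cancel-order on the private API:
print(implicit_method_name("private", "POST", "trade/cancel-order"))
# → private_post_trade_cancel_order
```

Both derived names appear in the "REST Raw" lists below, which is a quick way to find the implicit method for any endpoint in the exchange's API docs.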
## Available methods
### REST Unified
- `create_convert_trade(self, id: str, fromCode: str, toCode: str, amount: Num = None, params={})`
- `create_expired_option_market(self, symbol: str)`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_market_sell_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_accounts(self, params={})`
- `fetch_all_greeks(self, symbols: Strings = None, params={})`
- `fetch_balance(self, params={})`
- `fetch_borrow_interest(self, code: Str = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_borrow_rate_histories(self, codes=None, since: Int = None, limit: Int = None, params={})`
- `fetch_borrow_rate_history(self, code: str, since: Int = None, limit: Int = None, params={})`
- `fetch_canceled_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_convert_currencies(self, params={})`
- `fetch_convert_quote(self, fromCode: str, toCode: str, amount: Num = None, params={})`
- `fetch_convert_trade_history(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_convert_trade(self, id: str, code: Str = None, params={})`
- `fetch_cross_borrow_rate(self, code: str, params={})`
- `fetch_cross_borrow_rates(self, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_addresses_by_network(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposit(self, id: str, code: Str = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_interval(self, symbol: str, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_greeks(self, symbol: str, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_long_short_ratio_history(self, symbol: Str = None, timeframe: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_margin_adjustment_history(self, symbol: Str = None, type: Str = None, since: Num = None, limit: Num = None, params={})`
- `fetch_mark_price(self, symbol: str, params={})`
- `fetch_mark_prices(self, symbols: Strings = None, params={})`
- `fetch_market_leverage_tiers(self, symbol: str, params={})`
- `fetch_markets_by_type(self, type, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest_history(self, symbol: str, timeframe='1d', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest(self, symbol: str, params={})`
- `fetch_open_interests(self, symbols: Strings = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_option_chain(self, code: str, params={})`
- `fetch_option(self, symbol: str, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_position_mode(self, symbol: Str = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions_for_symbol(self, symbol: str, params={})`
- `fetch_positions_history(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_settlement_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_status(self, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_transfer(self, id: str, code: Str = None, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_underlying_assets(self, params={})`
- `fetch_withdrawal(self, id: str, code: Str = None, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_margin(self, symbol: str, amount: float, params={})`
- `borrow_cross_margin(self, code: str, amount: float, params={})`
- `cancel_all_orders_after(self, timeout: Int, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders_for_symbols(self, orders: List[CancellationRequest], params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `close_position(self, symbol: str, side: OrderSide = None, params={})`
- `convert_to_instrument_type(self, type)`
- `describe(self)`
- `edit_order_request(self, id: str, symbol, type, side, amount=None, price=None, params={})`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `modify_margin_helper(self, symbol: str, amount, type, params={})`
- `nonce(self)`
- `reduce_margin(self, symbol: str, amount: float, params={})`
- `repay_cross_margin(self, code: str, amount, params={})`
- `safe_market(self, marketId: Str = None, market: Market = None, delimiter: Str = None, marketType: Str = None)`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_margin_mode(self, marginMode: str, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `set_sandbox_mode(self, enable: bool)`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
### REST Raw
- `public_get_market_tickers(request)`
- `public_get_market_ticker(request)`
- `public_get_market_books(request)`
- `public_get_market_books_full(request)`
- `public_get_market_candles(request)`
- `public_get_market_history_candles(request)`
- `public_get_market_trades(request)`
- `public_get_market_history_trades(request)`
- `public_get_market_option_instrument_family_trades(request)`
- `public_get_market_platform_24_volume(request)`
- `public_get_market_call_auction_detail(request)`
- `public_get_market_books_sbe(request)`
- `public_get_market_block_tickers(request)`
- `public_get_market_block_ticker(request)`
- `public_get_market_sprd_ticker(request)`
- `public_get_market_sprd_candles(request)`
- `public_get_market_sprd_history_candles(request)`
- `public_get_market_index_tickers(request)`
- `public_get_market_index_candles(request)`
- `public_get_market_history_index_candles(request)`
- `public_get_market_mark_price_candles(request)`
- `public_get_market_history_mark_price_candles(request)`
- `public_get_market_exchange_rate(request)`
- `public_get_market_index_components(request)`
- `public_get_market_open_oracle(request)`
- `public_get_market_books_lite(request)`
- `public_get_public_option_trades(request)`
- `public_get_public_block_trades(request)`
- `public_get_public_instruments(request)`
- `public_get_public_estimated_price(request)`
- `public_get_public_delivery_exercise_history(request)`
- `public_get_public_estimated_settlement_info(request)`
- `public_get_public_settlement_history(request)`
- `public_get_public_funding_rate(request)`
- `public_get_public_funding_rate_history(request)`
- `public_get_public_open_interest(request)`
- `public_get_public_price_limit(request)`
- `public_get_public_opt_summary(request)`
- `public_get_public_discount_rate_interest_free_quota(request)`
- `public_get_public_time(request)`
- `public_get_public_mark_price(request)`
- `public_get_public_position_tiers(request)`
- `public_get_public_interest_rate_loan_quota(request)`
- `public_get_public_underlying(request)`
- `public_get_public_insurance_fund(request)`
- `public_get_public_convert_contract_coin(request)`
- `public_get_public_instrument_tick_bands(request)`
- `public_get_public_premium_history(request)`
- `public_get_public_economic_calendar(request)`
- `public_get_public_market_data_history(request)`
- `public_get_public_vip_interest_rate_loan_quota(request)`
- `public_get_rubik_stat_trading_data_support_coin(request)`
- `public_get_rubik_stat_contracts_open_interest_history(request)`
- `public_get_rubik_stat_taker_volume(request)`
- `public_get_rubik_stat_taker_volume_contract(request)`
- `public_get_rubik_stat_margin_loan_ratio(request)`
- `public_get_rubik_stat_contracts_long_short_account_ratio_contract_top_trader(request)`
- `public_get_rubik_stat_contracts_long_short_account_ratio_contract(request)`
- `public_get_rubik_stat_contracts_long_short_account_ratio(request)`
- `public_get_rubik_stat_contracts_open_interest_volume(request)`
- `public_get_rubik_stat_option_open_interest_volume(request)`
- `public_get_rubik_stat_option_open_interest_volume_ratio(request)`
- `public_get_rubik_stat_option_open_interest_volume_expiry(request)`
- `public_get_rubik_stat_option_open_interest_volume_strike(request)`
- `public_get_rubik_stat_option_taker_block_volume(request)`
- `public_get_system_status(request)`
- `public_get_sprd_spreads(request)`
- `public_get_sprd_books(request)`
- `public_get_sprd_public_trades(request)`
- `public_get_sprd_ticker(request)`
- `public_get_tradingbot_grid_ai_param(request)`
- `public_get_tradingbot_grid_min_investment(request)`
- `public_get_tradingbot_public_rsi_back_testing(request)`
- `public_get_tradingbot_grid_grid_quantity(request)`
- `public_get_asset_exchange_list(request)`
- `public_get_finance_staking_defi_eth_apy_history(request)`
- `public_get_finance_staking_defi_sol_apy_history(request)`
- `public_get_finance_savings_lending_rate_summary(request)`
- `public_get_finance_savings_lending_rate_history(request)`
- `public_get_finance_fixed_loan_lending_offers(request)`
- `public_get_finance_fixed_loan_lending_apy_history(request)`
- `public_get_finance_fixed_loan_pending_lending_volume(request)`
- `public_get_finance_sfp_dcd_products(request)`
- `public_get_copytrading_public_config(request)`
- `public_get_copytrading_public_lead_traders(request)`
- `public_get_copytrading_public_weekly_pnl(request)`
- `public_get_copytrading_public_pnl(request)`
- `public_get_copytrading_public_stats(request)`
- `public_get_copytrading_public_preference_currency(request)`
- `public_get_copytrading_public_current_subpositions(request)`
- `public_get_copytrading_public_subpositions_history(request)`
- `public_get_copytrading_public_copy_traders(request)`
- `public_get_support_announcements(request)`
- `public_get_support_announcements_types(request)`
- `public_post_tradingbot_grid_min_investment(request)`
- `private_get_rfq_counterparties(request)`
- `private_get_rfq_maker_instrument_settings(request)`
- `private_get_rfq_mmp_config(request)`
- `private_get_rfq_rfqs(request)`
- `private_get_rfq_quotes(request)`
- `private_get_rfq_trades(request)`
- `private_get_rfq_public_trades(request)`
- `private_get_sprd_order(request)`
- `private_get_sprd_orders_pending(request)`
- `private_get_sprd_orders_history(request)`
- `private_get_sprd_orders_history_archive(request)`
- `private_get_sprd_trades(request)`
- `private_get_trade_order(request)`
- `private_get_trade_orders_pending(request)`
- `private_get_trade_orders_history(request)`
- `private_get_trade_orders_history_archive(request)`
- `private_get_trade_fills(request)`
- `private_get_trade_fills_history(request)`
- `private_get_trade_fills_archive(request)`
- `private_get_trade_order_algo(request)`
- `private_get_trade_orders_algo_pending(request)`
- `private_get_trade_orders_algo_history(request)`
- `private_get_trade_easy_convert_currency_list(request)`
- `private_get_trade_easy_convert_history(request)`
- `private_get_trade_one_click_repay_currency_list(request)`
- `private_get_trade_one_click_repay_currency_list_v2(request)`
- `private_get_trade_one_click_repay_history(request)`
- `private_get_trade_one_click_repay_history_v2(request)`
- `private_get_trade_account_rate_limit(request)`
- `private_get_asset_currencies(request)`
- `private_get_asset_balances(request)`
- `private_get_asset_non_tradable_assets(request)`
- `private_get_asset_asset_valuation(request)`
- `private_get_asset_transfer_state(request)`
- `private_get_asset_bills(request)`
- `private_get_asset_bills_history(request)`
- `private_get_asset_deposit_lightning(request)`
- `private_get_asset_deposit_address(request)`
- `private_get_asset_deposit_history(request)`
- `private_get_asset_withdrawal_history(request)`
- `private_get_asset_deposit_withdraw_status(request)`
- `private_get_asset_monthly_statement(request)`
- `private_get_asset_convert_currencies(request)`
- `private_get_asset_convert_currency_pair(request)`
- `private_get_asset_convert_history(request)`
- `private_get_account_instruments(request)`
- `private_get_account_balance(request)`
- `private_get_account_positions(request)`
- `private_get_account_positions_history(request)`
- `private_get_account_account_position_risk(request)`
- `private_get_account_bills(request)`
- `private_get_account_bills_archive(request)`
- `private_get_account_bills_history_archive(request)`
- `private_get_account_config(request)`
- `private_get_account_max_size(request)`
- `private_get_account_max_avail_size(request)`
- `private_get_account_leverage_info(request)`
- `private_get_account_adjust_leverage_info(request)`
- `private_get_account_max_loan(request)`
- `private_get_account_trade_fee(request)`
- `private_get_account_interest_accrued(request)`
- `private_get_account_interest_rate(request)`
- `private_get_account_max_withdrawal(request)`
- `private_get_account_risk_state(request)`
- `private_get_account_interest_limits(request)`
- `private_get_account_spot_borrow_repay_history(request)`
- `private_get_account_greeks(request)`
- `private_get_account_position_tiers(request)`
- `private_get_account_set_account_switch_precheck(request)`
- `private_get_account_collateral_assets(request)`
- `private_get_account_mmp_config(request)`
- `private_get_account_move_positions_history(request)`
- `private_get_account_precheck_set_delta_neutral(request)`
- `private_get_account_quick_margin_borrow_repay_history(request)`
- `private_get_account_borrow_repay_history(request)`
- `private_get_account_vip_interest_accrued(request)`
- `private_get_account_vip_interest_deducted(request)`
- `private_get_account_vip_loan_order_list(request)`
- `private_get_account_vip_loan_order_detail(request)`
- `private_get_account_fixed_loan_borrowing_limit(request)`
- `private_get_account_fixed_loan_borrowing_quote(request)`
- `private_get_account_fixed_loan_borrowing_orders_list(request)`
- `private_get_account_spot_manual_borrow_repay(request)`
- `private_get_account_set_auto_repay(request)`
- `private_get_users_subaccount_list(request)`
- `private_get_account_subaccount_balances(request)`
- `private_get_asset_subaccount_balances(request)`
- `private_get_account_subaccount_max_withdrawal(request)`
- `private_get_asset_subaccount_bills(request)`
- `private_get_asset_subaccount_managed_subaccount_bills(request)`
- `private_get_users_entrust_subaccount_list(request)`
- `private_get_account_subaccount_interest_limits(request)`
- `private_get_users_subaccount_apikey(request)`
- `private_get_tradingbot_grid_orders_algo_pending(request)`
- `private_get_tradingbot_grid_orders_algo_history(request)`
- `private_get_tradingbot_grid_orders_algo_details(request)`
- `private_get_tradingbot_grid_sub_orders(request)`
- `private_get_tradingbot_grid_positions(request)`
- `private_get_tradingbot_grid_ai_param(request)`
- `private_get_tradingbot_signal_signals(request)`
- `private_get_tradingbot_signal_orders_algo_details(request)`
- `private_get_tradingbot_signal_orders_algo_pending(request)`
- `private_get_tradingbot_signal_orders_algo_history(request)`
- `private_get_tradingbot_signal_positions(request)`
- `private_get_tradingbot_signal_positions_history(request)`
- `private_get_tradingbot_signal_sub_orders(request)`
- `private_get_tradingbot_signal_event_history(request)`
- `private_get_tradingbot_recurring_orders_algo_pending(request)`
- `private_get_tradingbot_recurring_orders_algo_history(request)`
- `private_get_tradingbot_recurring_orders_algo_details(request)`
- `private_get_tradingbot_recurring_sub_orders(request)`
- `private_get_finance_savings_balance(request)`
- `private_get_finance_savings_lending_history(request)`
- `private_get_finance_staking_defi_offers(request)`
- `private_get_finance_staking_defi_orders_active(request)`
- `private_get_finance_staking_defi_orders_history(request)`
- `private_get_finance_staking_defi_eth_product_info(request)`
- `private_get_finance_staking_defi_eth_balance(request)`
- `private_get_finance_staking_defi_eth_purchase_redeem_history(request)`
- `private_get_finance_staking_defi_sol_product_info(request)`
- `private_get_finance_staking_defi_sol_balance(request)`
- `private_get_finance_staking_defi_sol_purchase_redeem_history(request)`
- `private_get_finance_flexible_loan_borrow_currencies(request)`
- `private_get_finance_flexible_loan_collateral_assets(request)`
- `private_get_finance_flexible_loan_max_collateral_redeem_amount(request)`
- `private_get_finance_flexible_loan_loan_info(request)`
- `private_get_finance_flexible_loan_loan_history(request)`
- `private_get_finance_flexible_loan_interest_accrued(request)`
- `private_get_copytrading_current_subpositions(request)`
- `private_get_copytrading_subpositions_history(request)`
- `private_get_copytrading_instruments(request)`
- `private_get_copytrading_profit_sharing_details(request)`
- `private_get_copytrading_total_profit_sharing(request)`
- `private_get_copytrading_unrealized_profit_sharing_details(request)`
- `private_get_copytrading_total_unrealized_profit_sharing(request)`
- `private_get_copytrading_config(request)`
- `private_get_copytrading_copy_settings(request)`
- `private_get_copytrading_current_lead_traders(request)`
- `private_get_copytrading_batch_leverage_info(request)`
- `private_get_copytrading_lead_traders_history(request)`
- `private_get_broker_dma_subaccount_info(request)`
- `private_get_broker_dma_subaccount_trade_fee(request)`
- `private_get_broker_dma_subaccount_apikey(request)`
- `private_get_broker_dma_rebate_per_orders(request)`
- `private_get_broker_fd_rebate_per_orders(request)`
- `private_get_broker_fd_if_rebate(request)`
- `private_get_broker_nd_info(request)`
- `private_get_broker_nd_subaccount_info(request)`
- `private_get_broker_nd_subaccount_apikey(request)`
- `private_get_asset_broker_nd_subaccount_deposit_address(request)`
- `private_get_asset_broker_nd_subaccount_deposit_history(request)`
- `private_get_asset_broker_nd_subaccount_withdrawal_history(request)`
- `private_get_broker_nd_rebate_daily(request)`
- `private_get_broker_nd_rebate_per_orders(request)`
- `private_get_finance_sfp_dcd_order(request)`
- `private_get_finance_sfp_dcd_orders(request)`
- `private_get_affiliate_invitee_detail(request)`
- `private_get_users_partner_if_rebate(request)`
- `private_get_support_announcements(request)`
- `private_post_rfq_create_rfq(request)`
- `private_post_rfq_cancel_rfq(request)`
- `private_post_rfq_cancel_batch_rfqs(request)`
- `private_post_rfq_cancel_all_rfqs(request)`
- `private_post_rfq_execute_quote(request)`
- `private_post_rfq_maker_instrument_settings(request)`
- `private_post_rfq_mmp_reset(request)`
- `private_post_rfq_mmp_config(request)`
- `private_post_rfq_create_quote(request)`
- `private_post_rfq_cancel_quote(request)`
- `private_post_rfq_cancel_batch_quotes(request)`
- `private_post_rfq_cancel_all_quotes(request)`
- `private_post_rfq_cancel_all_after(request)`
- `private_post_sprd_order(request)`
- `private_post_sprd_cancel_order(request)`
- `private_post_sprd_mass_cancel(request)`
- `private_post_sprd_amend_order(request)`
- `private_post_sprd_cancel_all_after(request)`
- `private_post_trade_order(request)`
- `private_post_trade_batch_orders(request)`
- `private_post_trade_cancel_order(request)`
- `private_post_trade_cancel_batch_orders(request)`
- `private_post_trade_amend_order(request)`
- `private_post_trade_amend_batch_orders(request)`
- `private_post_trade_close_position(request)`
- `private_post_trade_fills_archive(request)`
- `private_post_trade_cancel_advance_algos(request)`
- `private_post_trade_easy_convert(request)`
- `private_post_trade_one_click_repay(request)`
- `private_post_trade_one_click_repay_v2(request)`
- `private_post_trade_mass_cancel(request)`
- `private_post_trade_cancel_all_after(request)`
- `private_post_trade_order_precheck(request)`
- `private_post_trade_order_algo(request)`
- `private_post_trade_cancel_algos(request)`
- `private_post_trade_amend_algos(request)`
- `private_post_asset_transfer(request)`
- `private_post_asset_withdrawal(request)`
- `private_post_asset_withdrawal_lightning(request)`
- `private_post_asset_cancel_withdrawal(request)`
- `private_post_asset_convert_dust_assets(request)`
- `private_post_asset_monthly_statement(request)`
- `private_post_asset_convert_estimate_quote(request)`
- `private_post_asset_convert_trade(request)`
- `private_post_account_bills_history_archive(request)`
- `private_post_account_set_position_mode(request)`
- `private_post_account_set_leverage(request)`
- `private_post_account_position_margin_balance(request)`
- `private_post_account_set_fee_type(request)`
- `private_post_account_set_greeks(request)`
- `private_post_account_set_isolated_mode(request)`
- `private_post_account_spot_manual_borrow_repay(request)`
- `private_post_account_set_auto_repay(request)`
- `private_post_account_quick_margin_borrow_repay(request)`
- `private_post_account_borrow_repay(request)`
- `private_post_account_simulated_margin(request)`
- `private_post_account_position_builder(request)`
- `private_post_account_position_builder_graph(request)`
- `private_post_account_set_riskoffset_type(request)`
- `private_post_account_activate_option(request)`
- `private_post_account_set_auto_loan(request)`
- `private_post_account_account_level_switch_preset(request)`
- `private_post_account_set_account_level(request)`
- `private_post_account_set_collateral_assets(request)`
- `private_post_account_mmp_reset(request)`
- `private_post_account_mmp_config(request)`
- `private_post_account_fixed_loan_borrowing_order(request)`
- `private_post_account_fixed_loan_amend_borrowing_order(request)`
- `private_post_account_fixed_loan_manual_reborrow(request)`
- `private_post_account_fixed_loan_repay_borrowing_order(request)`
- `private_post_account_move_positions(request)`
- `private_post_account_set_auto_earn(request)`
- `private_post_account_set_settle_currency(request)`
- `private_post_account_set_trading_config(request)`
- `private_post_asset_subaccount_transfer(request)`
- `private_post_account_subaccount_set_loan_allocation(request)`
- `private_post_users_subaccount_create_subaccount(request)`
- `private_post_users_subaccount_apikey(request)`
- `private_post_users_subaccount_modify_apikey(request)`
- `private_post_users_subaccount_subaccount_apikey(request)`
- `private_post_users_subaccount_delete_apikey(request)`
- `private_post_users_subaccount_set_transfer_out(request)`
- `private_post_tradingbot_grid_order_algo(request)`
- `private_post_tradingbot_grid_amend_algo_basic_param(request)`
- `private_post_tradingbot_grid_amend_order_algo(request)`
- `private_post_tradingbot_grid_stop_order_algo(request)`
- `private_post_tradingbot_grid_close_position(request)`
- `private_post_tradingbot_grid_cancel_close_order(request)`
- `private_post_tradingbot_grid_order_instant_trigger(request)`
- `private_post_tradingbot_grid_withdraw_income(request)`
- `private_post_tradingbot_grid_compute_margin_balance(request)`
- `private_post_tradingbot_grid_margin_balance(request)`
- `private_post_tradingbot_grid_min_investment(request)`
- `private_post_tradingbot_grid_adjust_investment(request)`
- `private_post_tradingbot_signal_create_signal(request)`
- `private_post_tradingbot_signal_order_algo(request)`
- `private_post_tradingbot_signal_stop_order_algo(request)`
- `private_post_tradingbot_signal_margin_balance(request)`
- `private_post_tradingbot_signal_amendtpsl(request)`
- `private_post_tradingbot_signal_set_instruments(request)`
- `private_post_tradingbot_signal_close_position(request)`
- `private_post_tradingbot_signal_sub_order(request)`
- `private_post_tradingbot_signal_cancel_sub_order(request)`
- `private_post_tradingbot_recurring_order_algo(request)`
- `private_post_tradingbot_recurring_amend_order_algo(request)`
- `private_post_tradingbot_recurring_stop_order_algo(request)`
- `private_post_finance_savings_purchase_redempt(request)`
- `private_post_finance_savings_set_lending_rate(request)`
- `private_post_finance_staking_defi_purchase(request)`
- `private_post_finance_staking_defi_redeem(request)`
- `private_post_finance_staking_defi_cancel(request)`
- `private_post_finance_staking_defi_eth_purchase(request)`
- `private_post_finance_staking_defi_eth_redeem(request)`
- `private_post_finance_staking_defi_eth_cancel_redeem(request)`
- `private_post_finance_staking_defi_sol_purchase(request)`
- `private_post_finance_staking_defi_sol_redeem(request)`
- `private_post_finance_staking_defi_sol_cancel_redeem(request)`
- `private_post_finance_flexible_loan_max_loan(request)`
- `private_post_finance_flexible_loan_adjust_collateral(request)`
- `private_post_copytrading_algo_order(request)`
- `private_post_copytrading_close_subposition(request)`
- `private_post_copytrading_set_instruments(request)`
- `private_post_copytrading_amend_profit_sharing_ratio(request)`
- `private_post_copytrading_first_copy_settings(request)`
- `private_post_copytrading_amend_copy_settings(request)`
- `private_post_copytrading_stop_copy_trading(request)`
- `private_post_copytrading_batch_set_leverage(request)`
- `private_post_broker_nd_create_subaccount(request)`
- `private_post_broker_nd_delete_subaccount(request)`
- `private_post_broker_nd_subaccount_apikey(request)`
- `private_post_broker_nd_subaccount_modify_apikey(request)`
- `private_post_broker_nd_subaccount_delete_apikey(request)`
- `private_post_broker_nd_set_subaccount_level(request)`
- `private_post_broker_nd_set_subaccount_fee_rate(request)`
- `private_post_broker_nd_set_subaccount_assets(request)`
- `private_post_asset_broker_nd_subaccount_deposit_address(request)`
- `private_post_asset_broker_nd_modify_subaccount_deposit_address(request)`
- `private_post_broker_nd_rebate_per_orders(request)`
- `private_post_finance_sfp_dcd_quote(request)`
- `private_post_finance_sfp_dcd_order(request)`
- `private_post_broker_nd_report_subaccount_ip(request)`
- `private_post_broker_dma_subaccount_apikey(request)`
- `private_post_broker_dma_trades(request)`
- `private_post_broker_fd_rebate_per_orders(request)`
### WS Unified
- `describe(self)`
- `get_url(self, channel: str, access='public')`
- `subscribe_multiple(self, access, channel, symbols: Strings = None, params={})`
- `subscribe(self, access, messageHash, channel, symbol, params={})`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_trades_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `un_watch_trades_for_symbols(self, symbols: List[str], params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `watch_funding_rate(self, symbol: str, params={})`
- `watch_funding_rates(self, symbols: List[str], params={})`
- `watch_ticker(self, symbol: str, params={})`
- `un_watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `watch_mark_price(self, symbol: str, params={})`
- `watch_mark_prices(self, symbols: Strings = None, params={})`
- `un_watch_tickers(self, symbols: Strings = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `watch_liquidations_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `watch_my_liquidations_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `un_watch_ohlcv(self, symbol: str, timeframe: str = '1m', params={})`
- `watch_ohlcv_for_symbols(self, symbolsAndTimeframes: List[List[str]], since: Int = None, limit: Int = None, params={})`
- `un_watch_ohlcv_for_symbols(self, symbolsAndTimeframes: List[List[str]], params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `watch_order_book_for_symbols(self, symbols: List[str], limit: Int = None, params={})`
- `un_watch_order_book_for_symbols(self, symbols: List[str], params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `authenticate(self, params={})`
- `watch_balance(self, params={})`
- `order_to_trade(self, order, market=None)`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_positions(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `create_order_ws(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `edit_order_ws(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `cancel_order_ws(self, id: str, symbol: Str = None, params={})`
- `cancel_orders_ws(self, ids: List[str], symbol: Str = None, params={})`
- `cancel_all_orders_ws(self, symbol: Str = None, params={})`
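The `watch_ohlcv_for_symbols` / `un_watch_ohlcv_for_symbols` methods above take a `symbolsAndTimeframes: List[List[str]]` argument rather than a single symbol. A minimal sketch of that argument's shape (the key format built below is purely illustrative, not the library's internal message-hash format):

```python
# Shape of the symbolsAndTimeframes argument: a list of
# [symbol, timeframe] pairs, one subscription per pair.
symbols_and_timeframes = [
    ["BTC/USDT", "1m"],
    ["ETH/USDT", "5m"],
]
# build one key per pair (illustrative only)
keys = ["{}:{}".format(symbol, timeframe) for symbol, timeframe in symbols_and_timeframes]
print(keys)  # ['BTC/USDT:1m', 'ETH/USDT:5m']
```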
## Contribution
- Give us a star :star:
- Fork and Clone! Awesome
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:57.270685 | okx_exchange-0.0.125.tar.gz | 831,961 | 8f/81/23bd949a48cb52656f8722168c2ae5b70b723b5c7b4db9817b0b34465b05/okx_exchange-0.0.125.tar.gz | source | sdist | null | false | 1fb7026c91584e969d94f300e66e7e6a | 8047ba074f129c242bc214dc6f342d7888247b90ca200077dcea822de3c9a484 | 8f8123bd949a48cb52656f8722168c2ae5b70b723b5c7b4db9817b0b34465b05 | null | [] | 218 |
2.4 | woofipro-api | 0.0.123 | woofipro crypto exchange api client | # woofipro-python
Python SDK (sync and async) for Woofipro cryptocurrency exchange with Rest and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/woofipro)
- You can check Woofipro's docs here: [Docs](https://www.google.com/search?q=google+woofipro+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/woofipro-python
- Pypi package: https://pypi.org/project/woofipro-api
## Installation
```
pip install woofipro-api
```
## Usage
### Sync
```Python
from woofipro import WoofiproSync
def main():
instance = WoofiproSync({})
ob = instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = instance.fetch_balance()
# order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
main()
```
### Async
```Python
import sys
import asyncio
from woofipro import WoofiproAsync
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = WoofiproAsync({})
ob = await instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = await instance.fetch_balance()
# order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from woofipro import WoofiproWs
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = WoofiproWs({})
while True:
ob = await instance.watch_order_book("BTC/USDC")
print(ob)
# orders = await instance.watch_orders("BTC/USDC")
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints listed under "REST Raw" below:
```Python
# each method listed under "REST Raw" below maps to a single HTTP
# endpoint; pass its request parameters as a dict (none needed here):
request = {}
response = await instance.v1_public_get_public_system_info(request)
```
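The implicit method names follow a uniform convention: API version, access level, HTTP verb, then the underscore-joined endpoint path. A sketch of that naming scheme (the parse is illustrative only — underscores inside path segments are ambiguous, so the exact URL cannot always be recovered this way):

```python
# Implicit method names encode: version, access, HTTP verb, path segments.
name = "v1_private_get_funding_fee_history"
version, access, verb, *rest = name.split("_")
print(version, access, verb.upper(), "_".join(rest))  # v1 private GET funding_fee_history
```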
## Available methods
### REST Unified
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_balance(self, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposits_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_interval(self, symbol: str, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position(self, symbol: Str, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_status(self, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fees(self, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `describe(self)`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `get_asset_history_rows(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `get_withdraw_nonce(self, params={})`
- `hash_message(self, message)`
- `nonce(self)`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_sandbox_mode(self, enable: bool)`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
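Most `fetch_*` methods above share the `since` and `limit` pagination parameters; per the CCXT convention, `since` is a timestamp in milliseconds. A minimal sketch of building those values for roughly the last 24 hours:

```python
import time

# `since` is milliseconds since the Unix epoch (CCXT convention)
since = int((time.time() - 24 * 60 * 60) * 1000)
limit = 100  # maximum number of records to return
```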
### REST Raw
- `v1_public_get_public_volume_stats(request)`
- `v1_public_get_public_broker_name(request)`
- `v1_public_get_public_chain_info_broker_id(request)`
- `v1_public_get_public_system_info(request)`
- `v1_public_get_public_vault_balance(request)`
- `v1_public_get_public_insurancefund(request)`
- `v1_public_get_public_chain_info(request)`
- `v1_public_get_faucet_usdc(request)`
- `v1_public_get_public_account(request)`
- `v1_public_get_get_account(request)`
- `v1_public_get_registration_nonce(request)`
- `v1_public_get_get_orderly_key(request)`
- `v1_public_get_public_liquidation(request)`
- `v1_public_get_public_liquidated_positions(request)`
- `v1_public_get_public_config(request)`
- `v1_public_get_public_campaign_ranking(request)`
- `v1_public_get_public_campaign_stats(request)`
- `v1_public_get_public_campaign_user(request)`
- `v1_public_get_public_campaign_stats_details(request)`
- `v1_public_get_public_campaigns(request)`
- `v1_public_get_public_points_leaderboard(request)`
- `v1_public_get_client_points(request)`
- `v1_public_get_public_points_epoch(request)`
- `v1_public_get_public_points_epoch_dates(request)`
- `v1_public_get_public_referral_check_ref_code(request)`
- `v1_public_get_public_referral_verify_ref_code(request)`
- `v1_public_get_referral_admin_info(request)`
- `v1_public_get_referral_info(request)`
- `v1_public_get_referral_referee_info(request)`
- `v1_public_get_referral_referee_rebate_summary(request)`
- `v1_public_get_referral_referee_history(request)`
- `v1_public_get_referral_referral_history(request)`
- `v1_public_get_referral_rebate_summary(request)`
- `v1_public_get_client_distribution_history(request)`
- `v1_public_get_tv_config(request)`
- `v1_public_get_tv_history(request)`
- `v1_public_get_tv_symbol_info(request)`
- `v1_public_get_public_funding_rate_history(request)`
- `v1_public_get_public_funding_rate_symbol(request)`
- `v1_public_get_public_funding_rates(request)`
- `v1_public_get_public_info(request)`
- `v1_public_get_public_info_symbol(request)`
- `v1_public_get_public_market_trades(request)`
- `v1_public_get_public_token(request)`
- `v1_public_get_public_futures(request)`
- `v1_public_get_public_futures_symbol(request)`
- `v1_public_post_register_account(request)`
- `v1_private_get_client_key_info(request)`
- `v1_private_get_client_orderly_key_ip_restriction(request)`
- `v1_private_get_order_oid(request)`
- `v1_private_get_client_order_client_order_id(request)`
- `v1_private_get_algo_order_oid(request)`
- `v1_private_get_algo_client_order_client_order_id(request)`
- `v1_private_get_orders(request)`
- `v1_private_get_algo_orders(request)`
- `v1_private_get_trade_tid(request)`
- `v1_private_get_trades(request)`
- `v1_private_get_order_oid_trades(request)`
- `v1_private_get_client_liquidator_liquidations(request)`
- `v1_private_get_liquidations(request)`
- `v1_private_get_asset_history(request)`
- `v1_private_get_client_holding(request)`
- `v1_private_get_withdraw_nonce(request)`
- `v1_private_get_settle_nonce(request)`
- `v1_private_get_pnl_settlement_history(request)`
- `v1_private_get_volume_user_daily(request)`
- `v1_private_get_volume_user_stats(request)`
- `v1_private_get_client_statistics(request)`
- `v1_private_get_client_info(request)`
- `v1_private_get_client_statistics_daily(request)`
- `v1_private_get_positions(request)`
- `v1_private_get_position_symbol(request)`
- `v1_private_get_funding_fee_history(request)`
- `v1_private_get_notification_inbox_notifications(request)`
- `v1_private_get_notification_inbox_unread(request)`
- `v1_private_get_volume_broker_daily(request)`
- `v1_private_get_broker_fee_rate_default(request)`
- `v1_private_get_broker_user_info(request)`
- `v1_private_get_orderbook_symbol(request)`
- `v1_private_get_kline(request)`
- `v1_private_post_orderly_key(request)`
- `v1_private_post_client_set_orderly_key_ip_restriction(request)`
- `v1_private_post_client_reset_orderly_key_ip_restriction(request)`
- `v1_private_post_order(request)`
- `v1_private_post_batch_order(request)`
- `v1_private_post_algo_order(request)`
- `v1_private_post_liquidation(request)`
- `v1_private_post_claim_insurance_fund(request)`
- `v1_private_post_withdraw_request(request)`
- `v1_private_post_settle_pnl(request)`
- `v1_private_post_notification_inbox_mark_read(request)`
- `v1_private_post_notification_inbox_mark_read_all(request)`
- `v1_private_post_client_leverage(request)`
- `v1_private_post_client_maintenance_config(request)`
- `v1_private_post_delegate_signer(request)`
- `v1_private_post_delegate_orderly_key(request)`
- `v1_private_post_delegate_settle_pnl(request)`
- `v1_private_post_delegate_withdraw_request(request)`
- `v1_private_post_broker_fee_rate_set(request)`
- `v1_private_post_broker_fee_rate_set_default(request)`
- `v1_private_post_broker_fee_rate_default(request)`
- `v1_private_post_referral_create(request)`
- `v1_private_post_referral_update(request)`
- `v1_private_post_referral_bind(request)`
- `v1_private_post_referral_edit_split(request)`
- `v1_private_put_order(request)`
- `v1_private_put_algo_order(request)`
- `v1_private_delete_order(request)`
- `v1_private_delete_algo_order(request)`
- `v1_private_delete_client_order(request)`
- `v1_private_delete_algo_client_order(request)`
- `v1_private_delete_algo_orders(request)`
- `v1_private_delete_orders(request)`
- `v1_private_delete_batch_order(request)`
- `v1_private_delete_client_batch_order(request)`
### WS Unified
- `describe(self)`
- `watch_public(self, messageHash, message)`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `authenticate(self, params={})`
- `watch_private(self, messageHash, message, params={})`
- `watch_private_multiple(self, messageHashes, message, params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_positions(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `set_positions_cache(self, client: Client, type, symbols: Strings = None)`
- `load_positions_snapshot(self, client, messageHash)`
- `watch_balance(self, params={})`
## Contribution
- Give us a star :star:
- Fork and Clone! Awesome
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:55.452267 | woofipro_api-0.0.123.tar.gz | 719,315 | b0/1f/c16a0d99e3d6783e656e357b4f41454ef88c62c7daea659ac69d3e16bb71/woofipro_api-0.0.123.tar.gz | source | sdist | null | false | eb8fcddf80dc8c59c7b7437286cafd52 | 322c9214f345c4f60533e413d4a5cc0b02a9cebba2ac4f62f387bcab6b93aeec | b01fc16a0d99e3d6783e656e357b4f41454ef88c62c7daea659ac69d3e16bb71 | null | [] | 245 |
2.4 | kucoin-api | 0.0.128 | kucoin crypto exchange api client | # kucoin-python
Python SDK (sync and async) for Kucoin cryptocurrency exchange with Rest and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/kucoin)
- You can check Kucoin's docs here: [Docs](https://www.google.com/search?q=google+kucoin+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/kucoin-python
- Pypi package: https://pypi.org/project/kucoin-api
## Installation
```
pip install kucoin-api
```
## Usage
### Sync
```Python
from kucoin import KucoinSync
def main():
instance = KucoinSync({})
ob = instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = instance.fetch_balance()
# order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
main()
```
### Async
```Python
import sys
import asyncio
from kucoin import KucoinAsync
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = KucoinAsync({})
ob = await instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = await instance.fetch_balance()
# order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from kucoin import KucoinWs
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = KucoinWs({})
while True:
ob = await instance.watch_order_book("BTC/USDC")
print(ob)
# orders = await instance.watch_orders("BTC/USDC")
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints listed under "REST Raw" below:
```Python
# each method listed under "REST Raw" below maps to a single HTTP
# endpoint; pass its request parameters as a dict (none needed here):
request = {}
response = await instance.public_get_timestamp(request)
```
## Available methods
### REST Unified
- `create_deposit_address(self, code: str, params={})`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_market_order_with_cost(self, symbol: str, side: OrderSide, cost: float, params={})`
- `create_market_sell_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_accounts(self, params={})`
- `fetch_balance(self, params={})`
- `fetch_borrow_interest(self, code: Str = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_borrow_rate_histories(self, codes=None, since: Int = None, limit: Int = None, params={})`
- `fetch_borrow_rate_history(self, code: str, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_addresses_by_network(self, code: str, params={})`
- `fetch_deposit_withdraw_fee(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_mark_price(self, symbol: str, params={})`
- `fetch_mark_prices(self, symbols: Strings = None, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders_by_status(self, status, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_status(self, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_transaction_fee(self, code: str, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_uta_markets(self, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `borrow_cross_margin(self, code: str, amount: float, params={})`
- `borrow_isolated_margin(self, symbol: str, code: str, amount: float, params={})`
- `calculate_rate_limiter_cost(self, api, method, path, params, config={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `describe(self)`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `is_futures_method(self, methodName, params)`
- `is_hf_or_mining(self, fromId: Str, toId: Str)`
- `market_order_amount_to_precision(self, symbol: str, amount)`
- `nonce(self)`
- `repay_cross_margin(self, code: str, amount, params={})`
- `repay_isolated_margin(self, symbol: str, code: str, amount, params={})`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
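Note the `create_market_buy_order_with_cost` family above: these methods take the amount to spend in the *quote* currency (`cost`), whereas plain `create_order` takes a *base*-currency `amount`. A sketch of the relationship (numbers are illustrative only):

```python
# cost is denominated in the quote currency; amount in the base currency
price = 100_000.0       # quote units per base unit (illustrative)
cost = 250.0            # quote currency to spend
amount = cost / price   # equivalent base amount a plain create_order would take
print(amount)  # 0.0025
```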
### REST Raw
- `public_get_currencies(request)`
- `public_get_currencies_currency(request)`
- `public_get_symbols(request)`
- `public_get_market_orderbook_level1(request)`
- `public_get_market_alltickers(request)`
- `public_get_market_stats(request)`
- `public_get_markets(request)`
- `public_get_market_orderbook_level_level_limit(request)`
- `public_get_market_orderbook_level2_20(request)`
- `public_get_market_orderbook_level2_100(request)`
- `public_get_market_histories(request)`
- `public_get_market_candles(request)`
- `public_get_prices(request)`
- `public_get_timestamp(request)`
- `public_get_status(request)`
- `public_get_mark_price_symbol_current(request)`
- `public_get_mark_price_all_symbols(request)`
- `public_get_margin_config(request)`
- `public_get_announcements(request)`
- `public_get_margin_collateralratio(request)`
- `public_get_convert_symbol(request)`
- `public_get_convert_currencies(request)`
- `public_post_bullet_public(request)`
- `private_get_user_info(request)`
- `private_get_user_api_key(request)`
- `private_get_accounts(request)`
- `private_get_accounts_accountid(request)`
- `private_get_accounts_ledgers(request)`
- `private_get_hf_accounts_ledgers(request)`
- `private_get_hf_margin_account_ledgers(request)`
- `private_get_transaction_history(request)`
- `private_get_sub_user(request)`
- `private_get_sub_accounts_subuserid(request)`
- `private_get_sub_accounts(request)`
- `private_get_sub_api_key(request)`
- `private_get_margin_account(request)`
- `private_get_margin_accounts(request)`
- `private_get_isolated_accounts(request)`
- `private_get_deposit_addresses(request)`
- `private_get_deposits(request)`
- `private_get_hist_deposits(request)`
- `private_get_withdrawals(request)`
- `private_get_hist_withdrawals(request)`
- `private_get_withdrawals_quotas(request)`
- `private_get_accounts_transferable(request)`
- `private_get_transfer_list(request)`
- `private_get_base_fee(request)`
- `private_get_trade_fees(request)`
- `private_get_market_orderbook_level_level(request)`
- `private_get_market_orderbook_level2(request)`
- `private_get_market_orderbook_level3(request)`
- `private_get_hf_accounts_opened(request)`
- `private_get_hf_orders_active(request)`
- `private_get_hf_orders_active_symbols(request)`
- `private_get_hf_margin_order_active_symbols(request)`
- `private_get_hf_orders_done(request)`
- `private_get_hf_orders_orderid(request)`
- `private_get_hf_orders_client_order_clientoid(request)`
- `private_get_hf_orders_dead_cancel_all_query(request)`
- `private_get_hf_fills(request)`
- `private_get_orders(request)`
- `private_get_limit_orders(request)`
- `private_get_orders_orderid(request)`
- `private_get_order_client_order_clientoid(request)`
- `private_get_fills(request)`
- `private_get_limit_fills(request)`
- `private_get_stop_order(request)`
- `private_get_stop_order_orderid(request)`
- `private_get_stop_order_queryorderbyclientoid(request)`
- `private_get_oco_order_orderid(request)`
- `private_get_oco_order_details_orderid(request)`
- `private_get_oco_client_order_clientoid(request)`
- `private_get_oco_orders(request)`
- `private_get_hf_margin_orders_active(request)`
- `private_get_hf_margin_orders_done(request)`
- `private_get_hf_margin_orders_orderid(request)`
- `private_get_hf_margin_orders_client_order_clientoid(request)`
- `private_get_hf_margin_fills(request)`
- `private_get_etf_info(request)`
- `private_get_margin_currencies(request)`
- `private_get_risk_limit_strategy(request)`
- `private_get_isolated_symbols(request)`
- `private_get_margin_symbols(request)`
- `private_get_isolated_account_symbol(request)`
- `private_get_margin_borrow(request)`
- `private_get_margin_repay(request)`
- `private_get_margin_interest(request)`
- `private_get_project_list(request)`
- `private_get_project_marketinterestrate(request)`
- `private_get_redeem_orders(request)`
- `private_get_purchase_orders(request)`
- `private_get_broker_api_rebase_download(request)`
- `private_get_broker_querymycommission(request)`
- `private_get_broker_queryuser(request)`
- `private_get_broker_querydetailbyuid(request)`
- `private_get_migrate_user_account_status(request)`
- `private_get_convert_quote(request)`
- `private_get_convert_order_detail(request)`
- `private_get_convert_order_history(request)`
- `private_get_convert_limit_quote(request)`
- `private_get_convert_limit_order_detail(request)`
- `private_get_convert_limit_orders(request)`
- `private_get_affiliate_inviter_statistics(request)`
- `private_get_earn_redeem_preview(request)`
- `private_post_sub_user_created(request)`
- `private_post_sub_api_key(request)`
- `private_post_sub_api_key_update(request)`
- `private_post_deposit_addresses(request)`
- `private_post_withdrawals(request)`
- `private_post_accounts_universal_transfer(request)`
- `private_post_accounts_sub_transfer(request)`
- `private_post_accounts_inner_transfer(request)`
- `private_post_transfer_out(request)`
- `private_post_transfer_in(request)`
- `private_post_hf_orders(request)`
- `private_post_hf_orders_test(request)`
- `private_post_hf_orders_sync(request)`
- `private_post_hf_orders_multi(request)`
- `private_post_hf_orders_multi_sync(request)`
- `private_post_hf_orders_alter(request)`
- `private_post_hf_orders_dead_cancel_all(request)`
- `private_post_orders(request)`
- `private_post_orders_test(request)`
- `private_post_orders_multi(request)`
- `private_post_stop_order(request)`
- `private_post_oco_order(request)`
- `private_post_hf_margin_order(request)`
- `private_post_hf_margin_order_test(request)`
- `private_post_margin_order(request)`
- `private_post_margin_order_test(request)`
- `private_post_margin_borrow(request)`
- `private_post_margin_repay(request)`
- `private_post_purchase(request)`
- `private_post_redeem(request)`
- `private_post_lend_purchase_update(request)`
- `private_post_convert_order(request)`
- `private_post_convert_limit_order(request)`
- `private_post_bullet_private(request)`
- `private_post_position_update_user_leverage(request)`
- `private_post_deposit_address_create(request)`
- `private_delete_sub_api_key(request)`
- `private_delete_withdrawals_withdrawalid(request)`
- `private_delete_hf_orders_orderid(request)`
- `private_delete_hf_orders_sync_orderid(request)`
- `private_delete_hf_orders_client_order_clientoid(request)`
- `private_delete_hf_orders_sync_client_order_clientoid(request)`
- `private_delete_hf_orders_cancel_orderid(request)`
- `private_delete_hf_orders(request)`
- `private_delete_hf_orders_cancelall(request)`
- `private_delete_orders_orderid(request)`
- `private_delete_order_client_order_clientoid(request)`
- `private_delete_orders(request)`
- `private_delete_stop_order_orderid(request)`
- `private_delete_stop_order_cancelorderbyclientoid(request)`
- `private_delete_stop_order_cancel(request)`
- `private_delete_oco_order_orderid(request)`
- `private_delete_oco_client_order_clientoid(request)`
- `private_delete_oco_orders(request)`
- `private_delete_hf_margin_orders_orderid(request)`
- `private_delete_hf_margin_orders_client_order_clientoid(request)`
- `private_delete_hf_margin_orders(request)`
- `private_delete_convert_limit_order_cancel(request)`
- `futurespublic_get_contracts_active(request)`
- `futurespublic_get_contracts_symbol(request)`
- `futurespublic_get_ticker(request)`
- `futurespublic_get_level2_snapshot(request)`
- `futurespublic_get_level2_depth20(request)`
- `futurespublic_get_level2_depth100(request)`
- `futurespublic_get_trade_history(request)`
- `futurespublic_get_kline_query(request)`
- `futurespublic_get_interest_query(request)`
- `futurespublic_get_index_query(request)`
- `futurespublic_get_mark_price_symbol_current(request)`
- `futurespublic_get_premium_query(request)`
- `futurespublic_get_trade_statistics(request)`
- `futurespublic_get_funding_rate_symbol_current(request)`
- `futurespublic_get_contract_funding_rates(request)`
- `futurespublic_get_timestamp(request)`
- `futurespublic_get_status(request)`
- `futurespublic_get_level2_message_query(request)`
- `futurespublic_post_bullet_public(request)`
- `futuresprivate_get_transaction_history(request)`
- `futuresprivate_get_account_overview(request)`
- `futuresprivate_get_account_overview_all(request)`
- `futuresprivate_get_transfer_list(request)`
- `futuresprivate_get_orders(request)`
- `futuresprivate_get_stoporders(request)`
- `futuresprivate_get_recentdoneorders(request)`
- `futuresprivate_get_orders_orderid(request)`
- `futuresprivate_get_orders_byclientoid(request)`
- `futuresprivate_get_fills(request)`
- `futuresprivate_get_recentfills(request)`
- `futuresprivate_get_openorderstatistics(request)`
- `futuresprivate_get_position(request)`
- `futuresprivate_get_positions(request)`
- `futuresprivate_get_margin_maxwithdrawmargin(request)`
- `futuresprivate_get_contracts_risk_limit_symbol(request)`
- `futuresprivate_get_funding_history(request)`
- `futuresprivate_get_copy_trade_futures_get_max_open_size(request)`
- `futuresprivate_get_copy_trade_futures_position_margin_max_withdraw_margin(request)`
- `futuresprivate_post_transfer_out(request)`
- `futuresprivate_post_transfer_in(request)`
- `futuresprivate_post_orders(request)`
- `futuresprivate_post_orders_test(request)`
- `futuresprivate_post_orders_multi(request)`
- `futuresprivate_post_position_margin_auto_deposit_status(request)`
- `futuresprivate_post_margin_withdrawmargin(request)`
- `futuresprivate_post_position_margin_deposit_margin(request)`
- `futuresprivate_post_position_risk_limit_level_change(request)`
- `futuresprivate_post_copy_trade_futures_orders(request)`
- `futuresprivate_post_copy_trade_futures_orders_test(request)`
- `futuresprivate_post_copy_trade_futures_st_orders(request)`
- `futuresprivate_post_copy_trade_futures_position_margin_deposit_margin(request)`
- `futuresprivate_post_copy_trade_futures_position_margin_withdraw_margin(request)`
- `futuresprivate_post_copy_trade_futures_position_risk_limit_level_change(request)`
- `futuresprivate_post_copy_trade_futures_position_margin_auto_deposit_status(request)`
- `futuresprivate_post_copy_trade_futures_position_changemarginmode(request)`
- `futuresprivate_post_copy_trade_futures_position_changecrossuserleverage(request)`
- `futuresprivate_post_copy_trade_getcrossmodemarginrequirement(request)`
- `futuresprivate_post_copy_trade_position_switchpositionmode(request)`
- `futuresprivate_post_bullet_private(request)`
- `futuresprivate_delete_orders_orderid(request)`
- `futuresprivate_delete_orders_client_order_clientoid(request)`
- `futuresprivate_delete_orders(request)`
- `futuresprivate_delete_stoporders(request)`
- `futuresprivate_delete_copy_trade_futures_orders(request)`
- `futuresprivate_delete_copy_trade_futures_orders_client_order(request)`
- `webexchange_get_currency_currency_chain_info(request)`
- `broker_get_broker_nd_info(request)`
- `broker_get_broker_nd_account(request)`
- `broker_get_broker_nd_account_apikey(request)`
- `broker_get_broker_nd_rebase_download(request)`
- `broker_get_asset_ndbroker_deposit_list(request)`
- `broker_get_broker_nd_transfer_detail(request)`
- `broker_get_broker_nd_deposit_detail(request)`
- `broker_get_broker_nd_withdraw_detail(request)`
- `broker_post_broker_nd_transfer(request)`
- `broker_post_broker_nd_account(request)`
- `broker_post_broker_nd_account_apikey(request)`
- `broker_post_broker_nd_account_update_apikey(request)`
- `broker_delete_broker_nd_account_apikey(request)`
- `earn_get_otc_loan_discount_rate_configs(request)`
- `earn_get_otc_loan_loan(request)`
- `earn_get_otc_loan_accounts(request)`
- `earn_get_earn_redeem_preview(request)`
- `earn_get_earn_saving_products(request)`
- `earn_get_earn_hold_assets(request)`
- `earn_get_earn_promotion_products(request)`
- `earn_get_earn_kcs_staking_products(request)`
- `earn_get_earn_staking_products(request)`
- `earn_get_earn_eth_staking_products(request)`
- `earn_get_struct_earn_dual_products(request)`
- `earn_get_struct_earn_orders(request)`
- `earn_post_earn_orders(request)`
- `earn_post_struct_earn_orders(request)`
- `earn_delete_earn_orders(request)`
- `uta_get_market_announcement(request)`
- `uta_get_market_currency(request)`
- `uta_get_market_currencies(request)`
- `uta_get_market_instrument(request)`
- `uta_get_market_ticker(request)`
- `uta_get_market_trade(request)`
- `uta_get_market_kline(request)`
- `uta_get_market_funding_rate(request)`
- `uta_get_market_funding_rate_history(request)`
- `uta_get_market_cross_config(request)`
- `uta_get_market_collateral_discount_ratio(request)`
- `uta_get_market_index_price(request)`
- `uta_get_market_position_tiers(request)`
- `uta_get_market_open_interest(request)`
- `uta_get_server_status(request)`
- `utaprivate_get_market_orderbook(request)`
- `utaprivate_get_account_balance(request)`
- `utaprivate_get_account_transfer_quota(request)`
- `utaprivate_get_account_mode(request)`
- `utaprivate_get_account_ledger(request)`
- `utaprivate_get_account_interest_history(request)`
- `utaprivate_get_account_deposit_address(request)`
- `utaprivate_get_accountmode_account_balance(request)`
- `utaprivate_get_accountmode_account_overview(request)`
- `utaprivate_get_accountmode_order_detail(request)`
- `utaprivate_get_accountmode_order_open_list(request)`
- `utaprivate_get_accountmode_order_history(request)`
- `utaprivate_get_accountmode_order_execution(request)`
- `utaprivate_get_accountmode_position_open_list(request)`
- `utaprivate_get_accountmode_position_history(request)`
- `utaprivate_get_accountmode_position_tiers(request)`
- `utaprivate_get_sub_account_balance(request)`
- `utaprivate_get_user_fee_rate(request)`
- `utaprivate_get_dcp_query(request)`
- `utaprivate_post_account_transfer(request)`
- `utaprivate_post_account_mode(request)`
- `utaprivate_post_accountmode_account_modify_leverage(request)`
- `utaprivate_post_accountmode_order_place(request)`
- `utaprivate_post_accountmode_order_place_batch(request)`
- `utaprivate_post_accountmode_order_cancel(request)`
- `utaprivate_post_accountmode_order_cancel_batch(request)`
- `utaprivate_post_sub_account_cantransferout(request)`
- `utaprivate_post_dcp_set(request)`
### WS Unified
- `describe(self)`
- `negotiate(self, privateChannel, params={})`
- `negotiate_helper(self, privateChannel, params={})`
- `subscribe(self, url, messageHash, subscriptionHash, params={}, subscription=None)`
- `un_subscribe(self, url, messageHash, topic, subscriptionHash, params={}, subscription: dict = None)`
- `subscribe_multiple(self, url, messageHashes, topic, subscriptionHashes, params={}, subscription=None)`
- `un_subscribe_multiple(self, url, messageHashes, topic, subscriptionHashes, params={}, subscription: dict = None)`
- `watch_ticker(self, symbol: str, params={})`
- `un_watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `watch_multi_helper(self, methodName, channelName: str, symbols: Strings = None, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `un_watch_ohlcv(self, symbol: str, timeframe: str = '1m', params={})`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_trades_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `un_watch_trades_for_symbols(self, symbols: List[str], params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `watch_order_book_for_symbols(self, symbols: List[str], limit: Int = None, params={})`
- `un_watch_order_book_for_symbols(self, symbols: List[str], params={})`
- `get_cache_index(self, orderbook, cache)`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_balance(self, params={})`
## Contribution
- Give us a star :star:
- Fork and clone the repository
- Pick an existing issue to work on, or open a new one
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:53.956926 | kucoin_api-0.0.128.tar.gz | 774,281 | da/fc/cb13273ee24802e454bd2cc09e7d248970f72e74b8bb78bc791ea7ab1f6e/kucoin_api-0.0.128.tar.gz | source | sdist | null | false | 1e31462b8eb0ad5f8466a9f1344e0ac0 | 5bc672f2b4700528304290e49a8f819e3b48fc58dce01db294a3e7f14ccc6937 | dafccb13273ee24802e454bd2cc09e7d248970f72e74b8bb78bc791ea7ab1f6e | null | [] | 241 |
2.4 | kucoin-futures-api | 0.0.123 | kucoinfutures crypto exchange api client | # kucoinfutures-python
Python SDK (sync and async) for Kucoinfutures cryptocurrency exchange with Rest and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/kucoinfutures)
- You can check Kucoinfutures's docs here: [Docs](https://www.google.com/search?q=google+kucoinfutures+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/kucoinfutures-python
- Pypi package: https://pypi.org/project/kucoin-futures-api
## Installation
```bash
pip install kucoin-futures-api
```
## Usage
### Sync
```Python
from kucoinfutures import KucoinfuturesSync
def main():
instance = KucoinfuturesSync({})
ob = instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = instance.fetch_balance()
# order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
main()
```
### Async
```Python
import sys
import asyncio
from kucoinfutures import KucoinfuturesAsync
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = KucoinfuturesAsync({})
ob = await instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = await instance.fetch_balance()
# order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
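Because the async client holds open network sessions, it is safer to close it in a `finally` block so cleanup runs even when a request raises. A minimal sketch of the pattern using a stand-in object (the real `KucoinfuturesAsync` from the example above would take its place):

```Python
import asyncio

class _StubExchange:
    """Stand-in for KucoinfuturesAsync, just to show the cleanup pattern."""
    def __init__(self):
        self.closed = False
    async def fetch_order_book(self, symbol):
        return {"symbol": symbol, "bids": [], "asks": []}
    async def close(self):
        self.closed = True

async def main():
    instance = _StubExchange()
    try:
        ob = await instance.fetch_order_book("BTC/USDC")
        print(ob["symbol"])
    finally:
        # runs even if the request above raised, so the session is
        # always released
        await instance.close()
    return instance

instance = asyncio.run(main())
```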
### Websockets
```Python
import sys
import asyncio
from kucoinfutures import KucoinfuturesWs
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = KucoinfuturesWs({})
while True:
ob = await instance.watch_order_book("BTC/USDC")
print(ob)
# orders = await instance.watch_orders("BTC/USDC")
    # once you are done with the exchange; note that the `while True`
    # loop above never exits on its own, so break out of it before
    # this line can run
    await instance.close()
asyncio.run(main())
```
#### Raw call
You can also construct custom requests to available "implicit" endpoints
```Python
# any method from the "REST Raw" list below can be called directly;
# the endpoint and parameter values here are illustrative
request = {
    'symbol': 'XBTUSDTM',
}
response = await instance.futurespublic_get_contracts_symbol(request)
```
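The raw method names in the list below follow CCXT's implicit-API convention: the API group, the HTTP verb, and the URL path are flattened into a single lowercase Python identifier. A rough sketch of that mapping, for orientation only (the real SDK generates these methods from its endpoint table at load time):

```Python
import re

def implicit_method_name(api: str, http_method: str, path: str) -> str:
    """Derive a CCXT-style implicit method name from an endpoint path."""
    # strip {param} braces, then turn path separators and hyphens
    # into underscores
    cleaned = re.sub(r"[{}]", "", path)
    cleaned = re.sub(r"[/\-]", "_", cleaned.strip("/"))
    return f"{api}_{http_method}_{cleaned}".lower()

# names match entries in the "REST Raw" list:
implicit_method_name("futuresPrivate", "GET", "/orders/{orderId}")
#   -> 'futuresprivate_get_orders_orderid'
implicit_method_name("private", "DELETE", "/hf/orders/client-order/{clientOid}")
#   -> 'private_delete_hf_orders_client_order_clientoid'
```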
## Available methods
### REST Unified
- `create_contract_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_balance(self, params={})`
- `fetch_bids_asks(self, symbols: Strings = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_interval(self, symbol: str, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_margin_mode(self, symbol: str, params={})`
- `fetch_mark_price(self, symbol: str, params={})`
- `fetch_market_leverage_tiers(self, symbol: str, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order(self, id: Str, symbol: Str = None, params={})`
- `fetch_orders_by_status(self, status, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions_history(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_status(self, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_margin(self, symbol: str, amount: float, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `close_position(self, symbol: str, side: OrderSide = None, params={})`
- `describe(self)`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_margin_mode(self, marginMode: str, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
### REST Raw
- `public_get_currencies(request)`
- `public_get_currencies_currency(request)`
- `public_get_symbols(request)`
- `public_get_market_orderbook_level1(request)`
- `public_get_market_alltickers(request)`
- `public_get_market_stats(request)`
- `public_get_markets(request)`
- `public_get_market_orderbook_level_level_limit(request)`
- `public_get_market_orderbook_level2_20(request)`
- `public_get_market_orderbook_level2_100(request)`
- `public_get_market_histories(request)`
- `public_get_market_candles(request)`
- `public_get_prices(request)`
- `public_get_timestamp(request)`
- `public_get_status(request)`
- `public_get_mark_price_symbol_current(request)`
- `public_get_mark_price_all_symbols(request)`
- `public_get_margin_config(request)`
- `public_get_announcements(request)`
- `public_get_margin_collateralratio(request)`
- `public_get_convert_symbol(request)`
- `public_get_convert_currencies(request)`
- `public_post_bullet_public(request)`
- `private_get_user_info(request)`
- `private_get_user_api_key(request)`
- `private_get_accounts(request)`
- `private_get_accounts_accountid(request)`
- `private_get_accounts_ledgers(request)`
- `private_get_hf_accounts_ledgers(request)`
- `private_get_hf_margin_account_ledgers(request)`
- `private_get_transaction_history(request)`
- `private_get_sub_user(request)`
- `private_get_sub_accounts_subuserid(request)`
- `private_get_sub_accounts(request)`
- `private_get_sub_api_key(request)`
- `private_get_margin_account(request)`
- `private_get_margin_accounts(request)`
- `private_get_isolated_accounts(request)`
- `private_get_deposit_addresses(request)`
- `private_get_deposits(request)`
- `private_get_hist_deposits(request)`
- `private_get_withdrawals(request)`
- `private_get_hist_withdrawals(request)`
- `private_get_withdrawals_quotas(request)`
- `private_get_accounts_transferable(request)`
- `private_get_transfer_list(request)`
- `private_get_base_fee(request)`
- `private_get_trade_fees(request)`
- `private_get_market_orderbook_level_level(request)`
- `private_get_market_orderbook_level2(request)`
- `private_get_market_orderbook_level3(request)`
- `private_get_hf_accounts_opened(request)`
- `private_get_hf_orders_active(request)`
- `private_get_hf_orders_active_symbols(request)`
- `private_get_hf_margin_order_active_symbols(request)`
- `private_get_hf_orders_done(request)`
- `private_get_hf_orders_orderid(request)`
- `private_get_hf_orders_client_order_clientoid(request)`
- `private_get_hf_orders_dead_cancel_all_query(request)`
- `private_get_hf_fills(request)`
- `private_get_orders(request)`
- `private_get_limit_orders(request)`
- `private_get_orders_orderid(request)`
- `private_get_order_client_order_clientoid(request)`
- `private_get_fills(request)`
- `private_get_limit_fills(request)`
- `private_get_stop_order(request)`
- `private_get_stop_order_orderid(request)`
- `private_get_stop_order_queryorderbyclientoid(request)`
- `private_get_oco_order_orderid(request)`
- `private_get_oco_order_details_orderid(request)`
- `private_get_oco_client_order_clientoid(request)`
- `private_get_oco_orders(request)`
- `private_get_hf_margin_orders_active(request)`
- `private_get_hf_margin_orders_done(request)`
- `private_get_hf_margin_orders_orderid(request)`
- `private_get_hf_margin_orders_client_order_clientoid(request)`
- `private_get_hf_margin_fills(request)`
- `private_get_etf_info(request)`
- `private_get_margin_currencies(request)`
- `private_get_risk_limit_strategy(request)`
- `private_get_isolated_symbols(request)`
- `private_get_margin_symbols(request)`
- `private_get_isolated_account_symbol(request)`
- `private_get_margin_borrow(request)`
- `private_get_margin_repay(request)`
- `private_get_margin_interest(request)`
- `private_get_project_list(request)`
- `private_get_project_marketinterestrate(request)`
- `private_get_redeem_orders(request)`
- `private_get_purchase_orders(request)`
- `private_get_broker_api_rebase_download(request)`
- `private_get_broker_querymycommission(request)`
- `private_get_broker_queryuser(request)`
- `private_get_broker_querydetailbyuid(request)`
- `private_get_migrate_user_account_status(request)`
- `private_get_convert_quote(request)`
- `private_get_convert_order_detail(request)`
- `private_get_convert_order_history(request)`
- `private_get_convert_limit_quote(request)`
- `private_get_convert_limit_order_detail(request)`
- `private_get_convert_limit_orders(request)`
- `private_get_affiliate_inviter_statistics(request)`
- `private_get_earn_redeem_preview(request)`
- `private_post_sub_user_created(request)`
- `private_post_sub_api_key(request)`
- `private_post_sub_api_key_update(request)`
- `private_post_deposit_addresses(request)`
- `private_post_withdrawals(request)`
- `private_post_accounts_universal_transfer(request)`
- `private_post_accounts_sub_transfer(request)`
- `private_post_accounts_inner_transfer(request)`
- `private_post_transfer_out(request)`
- `private_post_transfer_in(request)`
- `private_post_hf_orders(request)`
- `private_post_hf_orders_test(request)`
- `private_post_hf_orders_sync(request)`
- `private_post_hf_orders_multi(request)`
- `private_post_hf_orders_multi_sync(request)`
- `private_post_hf_orders_alter(request)`
- `private_post_hf_orders_dead_cancel_all(request)`
- `private_post_orders(request)`
- `private_post_orders_test(request)`
- `private_post_orders_multi(request)`
- `private_post_stop_order(request)`
- `private_post_oco_order(request)`
- `private_post_hf_margin_order(request)`
- `private_post_hf_margin_order_test(request)`
- `private_post_margin_order(request)`
- `private_post_margin_order_test(request)`
- `private_post_margin_borrow(request)`
- `private_post_margin_repay(request)`
- `private_post_purchase(request)`
- `private_post_redeem(request)`
- `private_post_lend_purchase_update(request)`
- `private_post_convert_order(request)`
- `private_post_convert_limit_order(request)`
- `private_post_bullet_private(request)`
- `private_post_position_update_user_leverage(request)`
- `private_post_deposit_address_create(request)`
- `private_delete_sub_api_key(request)`
- `private_delete_withdrawals_withdrawalid(request)`
- `private_delete_hf_orders_orderid(request)`
- `private_delete_hf_orders_sync_orderid(request)`
- `private_delete_hf_orders_client_order_clientoid(request)`
- `private_delete_hf_orders_sync_client_order_clientoid(request)`
- `private_delete_hf_orders_cancel_orderid(request)`
- `private_delete_hf_orders(request)`
- `private_delete_hf_orders_cancelall(request)`
- `private_delete_orders_orderid(request)`
- `private_delete_order_client_order_clientoid(request)`
- `private_delete_orders(request)`
- `private_delete_stop_order_orderid(request)`
- `private_delete_stop_order_cancelorderbyclientoid(request)`
- `private_delete_stop_order_cancel(request)`
- `private_delete_oco_order_orderid(request)`
- `private_delete_oco_client_order_clientoid(request)`
- `private_delete_oco_orders(request)`
- `private_delete_hf_margin_orders_orderid(request)`
- `private_delete_hf_margin_orders_client_order_clientoid(request)`
- `private_delete_hf_margin_orders(request)`
- `private_delete_convert_limit_order_cancel(request)`
- `futurespublic_get_contracts_active(request)`
- `futurespublic_get_contracts_symbol(request)`
- `futurespublic_get_ticker(request)`
- `futurespublic_get_level2_snapshot(request)`
- `futurespublic_get_level2_depth20(request)`
- `futurespublic_get_level2_depth100(request)`
- `futurespublic_get_trade_history(request)`
- `futurespublic_get_kline_query(request)`
- `futurespublic_get_interest_query(request)`
- `futurespublic_get_index_query(request)`
- `futurespublic_get_mark_price_symbol_current(request)`
- `futurespublic_get_premium_query(request)`
- `futurespublic_get_trade_statistics(request)`
- `futurespublic_get_funding_rate_symbol_current(request)`
- `futurespublic_get_contract_funding_rates(request)`
- `futurespublic_get_timestamp(request)`
- `futurespublic_get_status(request)`
- `futurespublic_get_level2_message_query(request)`
- `futurespublic_get_contracts_risk_limit_symbol(request)`
- `futurespublic_get_alltickers(request)`
- `futurespublic_get_level2_depth_limit(request)`
- `futurespublic_get_level3_message_query(request)`
- `futurespublic_get_level3_snapshot(request)`
- `futurespublic_post_bullet_public(request)`
- `futuresprivate_get_transaction_history(request)`
- `futuresprivate_get_account_overview(request)`
- `futuresprivate_get_account_overview_all(request)`
- `futuresprivate_get_transfer_list(request)`
- `futuresprivate_get_orders(request)`
- `futuresprivate_get_stoporders(request)`
- `futuresprivate_get_recentdoneorders(request)`
- `futuresprivate_get_orders_orderid(request)`
- `futuresprivate_get_orders_byclientoid(request)`
- `futuresprivate_get_fills(request)`
- `futuresprivate_get_recentfills(request)`
- `futuresprivate_get_openorderstatistics(request)`
- `futuresprivate_get_position(request)`
- `futuresprivate_get_positions(request)`
- `futuresprivate_get_margin_maxwithdrawmargin(request)`
- `futuresprivate_get_contracts_risk_limit_symbol(request)`
- `futuresprivate_get_funding_history(request)`
- `futuresprivate_get_copy_trade_futures_get_max_open_size(request)`
- `futuresprivate_get_copy_trade_futures_position_margin_max_withdraw_margin(request)`
- `futuresprivate_get_deposit_address(request)`
- `futuresprivate_get_deposit_list(request)`
- `futuresprivate_get_withdrawals_quotas(request)`
- `futuresprivate_get_withdrawal_list(request)`
- `futuresprivate_get_sub_api_key(request)`
- `futuresprivate_get_trade_statistics(request)`
- `futuresprivate_get_trade_fees(request)`
- `futuresprivate_get_history_positions(request)`
- `futuresprivate_get_getmaxopensize(request)`
- `futuresprivate_get_getcrossuserleverage(request)`
- `futuresprivate_get_position_getmarginmode(request)`
- `futuresprivate_post_transfer_out(request)`
- `futuresprivate_post_transfer_in(request)`
- `futuresprivate_post_orders(request)`
- `futuresprivate_post_orders_test(request)`
- `futuresprivate_post_orders_multi(request)`
- `futuresprivate_post_position_margin_auto_deposit_status(request)`
- `futuresprivate_post_margin_withdrawmargin(request)`
- `futuresprivate_post_position_margin_deposit_margin(request)`
- `futuresprivate_post_position_risk_limit_level_change(request)`
- `futuresprivate_post_copy_trade_futures_orders(request)`
- `futuresprivate_post_copy_trade_futures_orders_test(request)`
- `futuresprivate_post_copy_trade_futures_st_orders(request)`
- `futuresprivate_post_copy_trade_futures_position_margin_deposit_margin(request)`
- `futuresprivate_post_copy_trade_futures_position_margin_withdraw_margin(request)`
- `futuresprivate_post_copy_trade_futures_position_risk_limit_level_change(request)`
- `futuresprivate_post_copy_trade_futures_position_margin_auto_deposit_status(request)`
- `futuresprivate_post_copy_trade_futures_position_changemarginmode(request)`
- `futuresprivate_post_copy_trade_futures_position_changecrossuserleverage(request)`
- `futuresprivate_post_copy_trade_getcrossmodemarginrequirement(request)`
- `futuresprivate_post_copy_trade_position_switchpositionmode(request)`
- `futuresprivate_post_bullet_private(request)`
- `futuresprivate_post_withdrawals(request)`
- `futuresprivate_post_st_orders(request)`
- `futuresprivate_post_sub_api_key(request)`
- `futuresprivate_post_sub_api_key_update(request)`
- `futuresprivate_post_changecrossuserleverage(request)`
- `futuresprivate_post_position_changemarginmode(request)`
- `futuresprivate_post_position_switchpositionmode(request)`
- `futuresprivate_delete_orders_orderid(request)`
- `futuresprivate_delete_orders_client_order_clientoid(request)`
- `futuresprivate_delete_orders(request)`
- `futuresprivate_delete_stoporders(request)`
- `futuresprivate_delete_copy_trade_futures_orders(request)`
- `futuresprivate_delete_copy_trade_futures_orders_client_order(request)`
- `futuresprivate_delete_withdrawals_withdrawalid(request)`
- `futuresprivate_delete_cancel_transfer_out(request)`
- `futuresprivate_delete_sub_api_key(request)`
- `futuresprivate_delete_orders_multi_cancel(request)`
- `webexchange_get_currency_currency_chain_info(request)`
- `webexchange_get_contract_symbol_funding_rates(request)`
- `broker_get_broker_nd_info(request)`
- `broker_get_broker_nd_account(request)`
- `broker_get_broker_nd_account_apikey(request)`
- `broker_get_broker_nd_rebase_download(request)`
- `broker_get_asset_ndbroker_deposit_list(request)`
- `broker_get_broker_nd_transfer_detail(request)`
- `broker_get_broker_nd_deposit_detail(request)`
- `broker_get_broker_nd_withdraw_detail(request)`
- `broker_post_broker_nd_transfer(request)`
- `broker_post_broker_nd_account(request)`
- `broker_post_broker_nd_account_apikey(request)`
- `broker_post_broker_nd_account_update_apikey(request)`
- `broker_delete_broker_nd_account_apikey(request)`
- `earn_get_otc_loan_discount_rate_configs(request)`
- `earn_get_otc_loan_loan(request)`
- `earn_get_otc_loan_accounts(request)`
- `earn_get_earn_redeem_preview(request)`
- `earn_get_earn_saving_products(request)`
- `earn_get_earn_hold_assets(request)`
- `earn_get_earn_promotion_products(request)`
- `earn_get_earn_kcs_staking_products(request)`
- `earn_get_earn_staking_products(request)`
- `earn_get_earn_eth_staking_products(request)`
- `earn_get_struct_earn_dual_products(request)`
- `earn_get_struct_earn_orders(request)`
- `earn_post_earn_orders(request)`
- `earn_post_struct_earn_orders(request)`
- `earn_delete_earn_orders(request)`
- `uta_get_market_announcement(request)`
- `uta_get_market_currency(request)`
- `uta_get_market_currencies(request)`
- `uta_get_market_instrument(request)`
- `uta_get_market_ticker(request)`
- `uta_get_market_trade(request)`
- `uta_get_market_kline(request)`
- `uta_get_market_funding_rate(request)`
- `uta_get_market_funding_rate_history(request)`
- `uta_get_market_cross_config(request)`
- `uta_get_market_collateral_discount_ratio(request)`
- `uta_get_market_index_price(request)`
- `uta_get_market_position_tiers(request)`
- `uta_get_market_open_interest(request)`
- `uta_get_server_status(request)`
- `utaprivate_get_market_orderbook(request)`
- `utaprivate_get_account_balance(request)`
- `utaprivate_get_account_transfer_quota(request)`
- `utaprivate_get_account_mode(request)`
- `utaprivate_get_account_ledger(request)`
- `utaprivate_get_account_interest_history(request)`
- `utaprivate_get_account_deposit_address(request)`
- `utaprivate_get_accountmode_account_balance(request)`
- `utaprivate_get_accountmode_account_overview(request)`
- `utaprivate_get_accountmode_order_detail(request)`
- `utaprivate_get_accountmode_order_open_list(request)`
- `utaprivate_get_accountmode_order_history(request)`
- `utaprivate_get_accountmode_order_execution(request)`
- `utaprivate_get_accountmode_position_open_list(request)`
- `utaprivate_get_accountmode_position_history(request)`
- `utaprivate_get_accountmode_position_tiers(request)`
- `utaprivate_get_sub_account_balance(request)`
- `utaprivate_get_user_fee_rate(request)`
- `utaprivate_get_dcp_query(request)`
- `utaprivate_post_account_transfer(request)`
- `utaprivate_post_account_mode(request)`
- `utaprivate_post_accountmode_account_modify_leverage(request)`
- `utaprivate_post_accountmode_order_place(request)`
- `utaprivate_post_accountmode_order_place_batch(request)`
- `utaprivate_post_accountmode_order_cancel(request)`
- `utaprivate_post_accountmode_order_cancel_batch(request)`
- `utaprivate_post_sub_account_cantransferout(request)`
- `utaprivate_post_dcp_set(request)`
### WS Unified
- `describe(self)`
- `negotiate(self, privateChannel, params={})`
- `negotiate_helper(self, privateChannel, params={})`
- `subscribe(self, url, messageHash, subscriptionHash, subscription, params={})`
- `subscribe_multiple(self, url, messageHashes, topic, subscriptionHashes, subscriptionArgs, params={})`
- `un_subscribe_multiple(self, url, messageHashes, topic, subscriptionHashes, params={}, subscription: dict = None)`
- `watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `watch_multi_request(self, methodName, channelName: str, symbols: Strings = None, params={})`
- `watch_position(self, symbol: Str = None, params={})`
- `get_current_position(self, symbol)`
- `set_position_cache(self, client: Client, symbol: str)`
- `load_position_snapshot(self, client, messageHash, symbol)`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_trades_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `un_watch_trades_for_symbols(self, symbols: List[str], params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `watch_order_book_for_symbols(self, symbols: List[str], limit: Int = None, params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `un_watch_order_book_for_symbols(self, symbols: List[str], params={})`
- `get_cache_index(self, orderbook, cache)`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_balance(self, params={})`
- `fetch_balance_snapshot(self, client, message)`
- `get_message_hash(self, elementName: str, symbol: Str = None)`
## Contribution
- Give us a star :star:
- Fork and clone the repository
- Pick an existing issue or open a new one
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:47.366655 | kucoin_futures_api-0.0.123.tar.gz | 738,538 | 50/d8/d1d8e34b2733700009a415bcc8bf71992ccfe2698b4b45456d6c87cc2537/kucoin_futures_api-0.0.123.tar.gz | source | sdist | null | false | 84acd8a5e61947b4b873beaf36c7095f | 46b426e24a943c0cb8e3bfb50f26e8f074849dfd7c78ffd03945f3c74eb23c74 | 50d8d1d8e34b2733700009a415bcc8bf71992ccfe2698b4b45456d6c87cc2537 | null | [] | 235 |
2.4 | binance | 0.3.105 | binance crypto exchange api client | # binance-python
Python SDK (sync and async) for Binance cryptocurrency exchange with Rest and WS capabilities.
- SDK docs: [SDK](https://docs.ccxt.com/#/exchanges/binance)
- Binance's API docs: [Docs](https://www.google.com/search?q=google+binance+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/binance-python
- Pypi package: https://pypi.org/project/binance
## Installation
```bash
pip install binance
```
## Usage
### Sync
```Python
from binance import BinanceSync
def main():
instance = BinanceSync({})
ob = instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = instance.fetch_balance()
# order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
main()
```
### Async
```Python
import sys
import asyncio
from binance import BinanceAsync
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = BinanceAsync({})
ob = await instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = await instance.fetch_balance()
# order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
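Note the explicit `await instance.close()` above: if an exception is raised mid-request, the underlying HTTP session can leak. A more defensive pattern is `try`/`finally`, sketched here with a stand-in client (names are illustrative) since the real `BinanceAsync` needs network access:

```python
import asyncio

class FakeClient:
    """Stand-in for an exchange client; only models open/close state."""
    def __init__(self):
        self.closed = False

    async def fetch_order_book(self, symbol):
        raise RuntimeError("simulated network error")

    async def close(self):
        self.closed = True

async def main() -> bool:
    instance = FakeClient()
    try:
        await instance.fetch_order_book("BTC/USDC")
    except RuntimeError:
        pass  # handle / log the error here
    finally:
        # runs even when fetch_order_book raises, so the session is freed
        await instance.close()
    return instance.closed

print(asyncio.run(main()))  # True: close() ran despite the error
```

The same `finally: await instance.close()` shape applies to the real async and websocket clients above.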
### Websockets
```Python
import sys
import asyncio
from binance import BinanceWs
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = BinanceWs({})
while True:
ob = await instance.watch_order_book("BTC/USDC")
print(ob)
# orders = await instance.watch_orders("BTC/USDC")
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints.
```Python
# placeholder values - substitute your own
coin = 'BTC'
tf = '1h'                # timeframe / interval
since = 1704067200000    # start time, in milliseconds
until = 1704153600000    # end time, in milliseconds
# `instance` is an exchange client created as in the examples above
request = {
    'type': 'candleSnapshot',
    'req': {
        'coin': coin,
        'interval': tf,
        'startTime': since,
        'endTime': until,
    },
}
response = await instance.public_post_info(request)
```
## Available methods
### REST Unified
- `create_convert_trade(self, id: str, fromCode: str, toCode: str, amount: Num = None, params={})`
- `create_expired_option_market(self, symbol: str)`
- `create_gift_code(self, code: str, amount, params={})`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_market_order_with_cost(self, symbol: str, side: OrderSide, cost: float, params={})`
- `create_market_sell_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_account_positions(self, symbols: Strings = None, params={})`
- `fetch_all_greeks(self, symbols: Strings = None, params={})`
- `fetch_balance(self, params={})`
- `fetch_bids_asks(self, symbols: Strings = None, params={})`
- `fetch_borrow_interest(self, code: Str = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_borrow_rate_history(self, code: str, since: Int = None, limit: Int = None, params={})`
- `fetch_canceled_and_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_canceled_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_convert_currencies(self, params={})`
- `fetch_convert_quote(self, fromCode: str, toCode: str, amount: Num = None, params={})`
- `fetch_convert_trade_history(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_convert_trade(self, id: str, code: Str = None, params={})`
- `fetch_cross_borrow_rate(self, code: str, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_intervals(self, symbols: Strings = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_greeks(self, symbol: str, params={})`
- `fetch_isolated_borrow_rate(self, symbol: str, params={})`
- `fetch_isolated_borrow_rates(self, params={})`
- `fetch_last_prices(self, symbols: Strings = None, params={})`
- `fetch_ledger_entry(self, id: str, code: Str = None, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_leverage_tiers(self, symbols: Strings = None, params={})`
- `fetch_leverages(self, symbols: Strings = None, params={})`
- `fetch_long_short_ratio_history(self, symbol: Str = None, timeframe: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_margin_adjustment_history(self, symbol: Str = None, type: Str = None, since: Num = None, limit: Num = None, params={})`
- `fetch_margin_mode(self, symbol: str, params={})`
- `fetch_margin_modes(self, symbols: Strings = None, params={})`
- `fetch_mark_price(self, symbol: str, params={})`
- `fetch_mark_prices(self, symbols: Strings = None, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_dust_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_liquidations(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_settlement_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest_history(self, symbol: str, timeframe='5m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest(self, symbol: str, params={})`
- `fetch_open_order(self, id: str, symbol: Str = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_option_positions(self, symbols: Strings = None, params={})`
- `fetch_option(self, symbol: str, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position_mode(self, symbol: Str = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions_risk(self, symbols: Strings = None, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_settlement_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_status(self, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_trading_fees(self, params={})`
- `fetch_trading_limits(self, symbols: Strings = None, params={})`
- `fetch_transaction_fees(self, codes: Strings = None, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_margin(self, symbol: str, amount: float, params={})`
- `borrow_cross_margin(self, code: str, amount: float, params={})`
- `borrow_isolated_margin(self, symbol: str, code: str, amount: float, params={})`
- `calculate_rate_limiter_cost(self, api, method, path, params, config={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `cost_to_precision(self, symbol, cost)`
- `describe(self)`
- `edit_contract_order_request(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `edit_contract_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `edit_orders(self, orders: List[OrderRequest], params={})`
- `edit_spot_order_request(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `edit_spot_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `enable_demo_trading(self, enable: bool)`
- `futures_transfer(self, code: str, amount, type, params={})`
- `get_base_domain_from_url(self, url: Str)`
- `get_exceptions_by_url(self, url: str, exactOrBroad: str)`
- `get_network_code_by_network_url(self, currencyCode: str, depositUrl: Str = None)`
- `is_inverse(self, type: str, subType: Str = None)`
- `is_linear(self, type: str, subType: Str = None)`
- `market(self, symbol: str)`
- `modify_margin_helper(self, symbol: str, amount, addOrReduce, params={})`
- `nonce(self)`
- `redeem_gift_code(self, giftcardCode, params={})`
- `reduce_margin(self, symbol: str, amount: float, params={})`
- `repay_cross_margin(self, code: str, amount, params={})`
- `repay_isolated_margin(self, symbol: str, code: str, amount, params={})`
- `request(self, path, api='public', method='GET', params={}, headers=None, body=None, config={})`
- `safe_market(self, marketId: Str = None, market: Market = None, delimiter: Str = None, marketType: Str = None)`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_margin_mode(self, marginMode: str, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `set_sandbox_mode(self, enable: bool)`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `verify_gift_code(self, id: str, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
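The unified methods above identify markets by CCXT's unified symbol format: `BASE/QUOTE` for spot (e.g. `"BTC/USDC"`) and `BASE/QUOTE:SETTLE` for swaps (e.g. `"BTC/USDT:USDT"`). A small helper illustrating the convention (this parser is our own sketch, not part of the SDK):

```python
def parse_unified_symbol(symbol: str) -> dict:
    """Split a CCXT unified symbol into base / quote / settle parts."""
    settle = None
    if ':' in symbol:
        # swap/future symbols carry a settlement currency after the colon
        symbol, settle = symbol.split(':', 1)
    base, quote = symbol.split('/', 1)
    return {'base': base, 'quote': quote, 'settle': settle}

print(parse_unified_symbol('BTC/USDC'))       # {'base': 'BTC', 'quote': 'USDC', 'settle': None}
print(parse_unified_symbol('BTC/USDT:USDT'))  # {'base': 'BTC', 'quote': 'USDT', 'settle': 'USDT'}
```

Passing symbols in this form (rather than exchange-specific ids like `BTCUSDC`) is what lets the same call work across every CCXT-based SDK.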
### REST Raw
- `sapi_get_copytrading_futures_userstatus(request)`
- `sapi_get_copytrading_futures_leadsymbol(request)`
- `sapi_get_system_status(request)`
- `sapi_get_accountsnapshot(request)`
- `sapi_get_account_info(request)`
- `sapi_get_margin_asset(request)`
- `sapi_get_margin_pair(request)`
- `sapi_get_margin_allassets(request)`
- `sapi_get_margin_allpairs(request)`
- `sapi_get_margin_priceindex(request)`
- `sapi_get_spot_delist_schedule(request)`
- `sapi_get_asset_assetdividend(request)`
- `sapi_get_asset_dribblet(request)`
- `sapi_get_asset_transfer(request)`
- `sapi_get_asset_assetdetail(request)`
- `sapi_get_asset_tradefee(request)`
- `sapi_get_asset_ledger_transfer_cloud_mining_querybypage(request)`
- `sapi_get_asset_convert_transfer_querybypage(request)`
- `sapi_get_asset_wallet_balance(request)`
- `sapi_get_asset_custody_transfer_history(request)`
- `sapi_get_margin_borrow_repay(request)`
- `sapi_get_margin_loan(request)`
- `sapi_get_margin_repay(request)`
- `sapi_get_margin_account(request)`
- `sapi_get_margin_transfer(request)`
- `sapi_get_margin_interesthistory(request)`
- `sapi_get_margin_forceliquidationrec(request)`
- `sapi_get_margin_order(request)`
- `sapi_get_margin_openorders(request)`
- `sapi_get_margin_allorders(request)`
- `sapi_get_margin_mytrades(request)`
- `sapi_get_margin_maxborrowable(request)`
- `sapi_get_margin_maxtransferable(request)`
- `sapi_get_margin_tradecoeff(request)`
- `sapi_get_margin_isolated_transfer(request)`
- `sapi_get_margin_isolated_account(request)`
- `sapi_get_margin_isolated_pair(request)`
- `sapi_get_margin_isolated_allpairs(request)`
- `sapi_get_margin_isolated_accountlimit(request)`
- `sapi_get_margin_interestratehistory(request)`
- `sapi_get_margin_orderlist(request)`
- `sapi_get_margin_allorderlist(request)`
- `sapi_get_margin_openorderlist(request)`
- `sapi_get_margin_crossmargindata(request)`
- `sapi_get_margin_isolatedmargindata(request)`
- `sapi_get_margin_isolatedmargintier(request)`
- `sapi_get_margin_ratelimit_order(request)`
- `sapi_get_margin_dribblet(request)`
- `sapi_get_margin_dust(request)`
- `sapi_get_margin_crossmargincollateralratio(request)`
- `sapi_get_margin_exchange_small_liability(request)`
- `sapi_get_margin_exchange_small_liability_history(request)`
- `sapi_get_margin_next_hourly_interest_rate(request)`
- `sapi_get_margin_capital_flow(request)`
- `sapi_get_margin_delist_schedule(request)`
- `sapi_get_margin_available_inventory(request)`
- `sapi_get_margin_leveragebracket(request)`
- `sapi_get_loan_vip_loanable_data(request)`
- `sapi_get_loan_vip_collateral_data(request)`
- `sapi_get_loan_vip_request_data(request)`
- `sapi_get_loan_vip_request_interestrate(request)`
- `sapi_get_loan_income(request)`
- `sapi_get_loan_ongoing_orders(request)`
- `sapi_get_loan_ltv_adjustment_history(request)`
- `sapi_get_loan_borrow_history(request)`
- `sapi_get_loan_repay_history(request)`
- `sapi_get_loan_loanable_data(request)`
- `sapi_get_loan_collateral_data(request)`
- `sapi_get_loan_repay_collateral_rate(request)`
- `sapi_get_loan_flexible_ongoing_orders(request)`
- `sapi_get_loan_flexible_borrow_history(request)`
- `sapi_get_loan_flexible_repay_history(request)`
- `sapi_get_loan_flexible_ltv_adjustment_history(request)`
- `sapi_get_loan_vip_ongoing_orders(request)`
- `sapi_get_loan_vip_repay_history(request)`
- `sapi_get_loan_vip_collateral_account(request)`
- `sapi_get_fiat_orders(request)`
- `sapi_get_fiat_payments(request)`
- `sapi_get_futures_transfer(request)`
- `sapi_get_futures_histdatalink(request)`
- `sapi_get_rebate_taxquery(request)`
- `sapi_get_capital_config_getall(request)`
- `sapi_get_capital_deposit_address(request)`
- `sapi_get_capital_deposit_address_list(request)`
- `sapi_get_capital_deposit_hisrec(request)`
- `sapi_get_capital_deposit_subaddress(request)`
- `sapi_get_capital_deposit_subhisrec(request)`
- `sapi_get_capital_withdraw_history(request)`
- `sapi_get_capital_withdraw_address_list(request)`
- `sapi_get_capital_contract_convertible_coins(request)`
- `sapi_get_convert_tradeflow(request)`
- `sapi_get_convert_exchangeinfo(request)`
- `sapi_get_convert_assetinfo(request)`
- `sapi_get_convert_orderstatus(request)`
- `sapi_get_convert_limit_queryopenorders(request)`
- `sapi_get_account_status(request)`
- `sapi_get_account_apitradingstatus(request)`
- `sapi_get_account_apirestrictions_iprestriction(request)`
- `sapi_get_bnbburn(request)`
- `sapi_get_sub_account_futures_account(request)`
- `sapi_get_sub_account_futures_accountsummary(request)`
- `sapi_get_sub_account_futures_positionrisk(request)`
- `sapi_get_sub_account_futures_internaltransfer(request)`
- `sapi_get_sub_account_list(request)`
- `sapi_get_sub_account_margin_account(request)`
- `sapi_get_sub_account_margin_accountsummary(request)`
- `sapi_get_sub_account_spotsummary(request)`
- `sapi_get_sub_account_status(request)`
- `sapi_get_sub_account_sub_transfer_history(request)`
- `sapi_get_sub_account_transfer_subuserhistory(request)`
- `sapi_get_sub_account_universaltransfer(request)`
- `sapi_get_sub_account_apirestrictions_iprestriction_thirdpartylist(request)`
- `sapi_get_sub_account_transaction_statistics(request)`
- `sapi_get_sub_account_subaccountapi_iprestriction(request)`
- `sapi_get_managed_subaccount_asset(request)`
- `sapi_get_managed_subaccount_accountsnapshot(request)`
- `sapi_get_managed_subaccount_querytranslogforinvestor(request)`
- `sapi_get_managed_subaccount_querytranslogfortradeparent(request)`
- `sapi_get_managed_subaccount_fetch_future_asset(request)`
- `sapi_get_managed_subaccount_marginasset(request)`
- `sapi_get_managed_subaccount_info(request)`
- `sapi_get_managed_subaccount_deposit_address(request)`
- `sapi_get_managed_subaccount_query_trans_log(request)`
- `sapi_get_lending_daily_product_list(request)`
- `sapi_get_lending_daily_userleftquota(request)`
- `sapi_get_lending_daily_userredemptionquota(request)`
- `sapi_get_lending_daily_token_position(request)`
- `sapi_get_lending_union_account(request)`
- `sapi_get_lending_union_purchaserecord(request)`
- `sapi_get_lending_union_redemptionrecord(request)`
- `sapi_get_lending_union_interesthistory(request)`
- `sapi_get_lending_project_list(request)`
- `sapi_get_lending_project_position_list(request)`
- `sapi_get_eth_staking_eth_history_stakinghistory(request)`
- `sapi_get_eth_staking_eth_history_redemptionhistory(request)`
- `sapi_get_eth_staking_eth_history_rewardshistory(request)`
- `sapi_get_eth_staking_eth_quota(request)`
- `sapi_get_eth_staking_eth_history_ratehistory(request)`
- `sapi_get_eth_staking_account(request)`
- `sapi_get_eth_staking_wbeth_history_wraphistory(request)`
- `sapi_get_eth_staking_wbeth_history_unwraphistory(request)`
- `sapi_get_eth_staking_eth_history_wbethrewardshistory(request)`
- `sapi_get_sol_staking_sol_history_stakinghistory(request)`
- `sapi_get_sol_staking_sol_history_redemptionhistory(request)`
- `sapi_get_sol_staking_sol_history_bnsolrewardshistory(request)`
- `sapi_get_sol_staking_sol_history_ratehistory(request)`
- `sapi_get_sol_staking_account(request)`
- `sapi_get_sol_staking_sol_quota(request)`
- `sapi_get_mining_pub_algolist(request)`
- `sapi_get_mining_pub_coinlist(request)`
- `sapi_get_mining_worker_detail(request)`
- `sapi_get_mining_worker_list(request)`
- `sapi_get_mining_payment_list(request)`
- `sapi_get_mining_statistics_user_status(request)`
- `sapi_get_mining_statistics_user_list(request)`
- `sapi_get_mining_payment_uid(request)`
- `sapi_get_bswap_pools(request)`
- `sapi_get_bswap_liquidity(request)`
- `sapi_get_bswap_liquidityops(request)`
- `sapi_get_bswap_quote(request)`
- `sapi_get_bswap_swap(request)`
- `sapi_get_bswap_poolconfigure(request)`
- `sapi_get_bswap_addliquiditypreview(request)`
- `sapi_get_bswap_removeliquiditypreview(request)`
- `sapi_get_bswap_unclaimedrewards(request)`
- `sapi_get_bswap_claimedhistory(request)`
- `sapi_get_blvt_tokeninfo(request)`
- `sapi_get_blvt_subscribe_record(request)`
- `sapi_get_blvt_redeem_record(request)`
- `sapi_get_blvt_userlimit(request)`
- `sapi_get_apireferral_ifnewuser(request)`
- `sapi_get_apireferral_customization(request)`
- `sapi_get_apireferral_usercustomization(request)`
- `sapi_get_apireferral_rebate_recentrecord(request)`
- `sapi_get_apireferral_rebate_historicalrecord(request)`
- `sapi_get_apireferral_kickback_recentrecord(request)`
- `sapi_get_apireferral_kickback_historicalrecord(request)`
- `sapi_get_broker_subaccountapi(request)`
- `sapi_get_broker_subaccount(request)`
- `sapi_get_broker_subaccountapi_commission_futures(request)`
- `sapi_get_broker_subaccountapi_commission_coinfutures(request)`
- `sapi_get_broker_info(request)`
- `sapi_get_broker_transfer(request)`
- `sapi_get_broker_transfer_futures(request)`
- `sapi_get_broker_rebate_recentrecord(request)`
- `sapi_get_broker_rebate_historicalrecord(request)`
- `sapi_get_broker_subaccount_bnbburn_status(request)`
- `sapi_get_broker_subaccount_deposithist(request)`
- `sapi_get_broker_subaccount_spotsummary(request)`
- `sapi_get_broker_subaccount_marginsummary(request)`
- `sapi_get_broker_subaccount_futuressummary(request)`
- `sapi_get_broker_rebate_futures_recentrecord(request)`
- `sapi_get_broker_subaccountapi_iprestriction(request)`
- `sapi_get_broker_universaltransfer(request)`
- `sapi_get_account_apirestrictions(request)`
- `sapi_get_c2c_ordermatch_listuserorderhistory(request)`
- `sapi_get_nft_history_transactions(request)`
- `sapi_get_nft_history_deposit(request)`
- `sapi_get_nft_history_withdraw(request)`
- `sapi_get_nft_user_getasset(request)`
- `sapi_get_pay_transactions(request)`
- `sapi_get_giftcard_verify(request)`
- `sapi_get_giftcard_cryptography_rsa_public_key(request)`
- `sapi_get_giftcard_buycode_token_limit(request)`
- `sapi_get_algo_spot_openorders(request)`
- `sapi_get_algo_spot_historicalorders(request)`
- `sapi_get_algo_spot_suborders(request)`
- `sapi_get_algo_futures_openorders(request)`
- `sapi_get_algo_futures_historicalorders(request)`
- `sapi_get_algo_futures_suborders(request)`
- `sapi_get_portfolio_account(request)`
- `sapi_get_portfolio_collateralrate(request)`
- `sapi_get_portfolio_pmloan(request)`
- `sapi_get_portfolio_interest_history(request)`
- `sapi_get_portfolio_asset_index_price(request)`
- `sapi_get_portfolio_repay_futures_switch(request)`
- `sapi_get_portfolio_margin_asset_leverage(request)`
- `sapi_get_portfolio_balance(request)`
- `sapi_get_portfolio_negative_balance_exchange_record(request)`
- `sapi_get_portfolio_pmloan_history(request)`
- `sapi_get_portfolio_earn_asset_balance(request)`
- `sapi_get_portfolio_delta_mode(request)`
- `sapi_get_staking_productlist(request)`
- `sapi_get_staking_position(request)`
- `sapi_get_staking_stakingrecord(request)`
- `sapi_get_staking_personalleftquota(request)`
- `sapi_get_lending_auto_invest_target_asset_list(request)`
- `sapi_get_lending_auto_invest_target_asset_roi_list(request)`
- `sapi_get_lending_auto_invest_all_asset(request)`
- `sapi_get_lending_auto_invest_source_asset_list(request)`
- `sapi_get_lending_auto_invest_plan_list(request)`
- `sapi_get_lending_auto_invest_plan_id(request)`
- `sapi_get_lending_auto_invest_history_list(request)`
- `sapi_get_lending_auto_invest_index_info(request)`
- `sapi_get_lending_auto_invest_index_user_summary(request)`
- `sapi_get_lending_auto_invest_one_off_status(request)`
- `sapi_get_lending_auto_invest_redeem_history(request)`
- `sapi_get_lending_auto_invest_rebalance_history(request)`
- `sapi_get_simple_earn_flexible_list(request)`
- `sapi_get_simple_earn_locked_list(request)`
- `sapi_get_simple_earn_flexible_personalleftquota(request)`
- `sapi_get_simple_earn_locked_personalleftquota(request)`
- `sapi_get_simple_earn_flexible_subscriptionpreview(request)`
- `sapi_get_simple_earn_locked_subscriptionpreview(request)`
- `sapi_get_simple_earn_flexible_history_ratehistory(request)`
- `sapi_get_simple_earn_flexible_position(request)`
- `sapi_get_simple_earn_locked_position(request)`
- `sapi_get_simple_earn_account(request)`
- `sapi_get_simple_earn_flexible_history_subscriptionrecord(request)`
- `sapi_get_simple_earn_locked_history_subscriptionrecord(request)`
- `sapi_get_simple_earn_flexible_history_redemptionrecord(request)`
- `sapi_get_simple_earn_locked_history_redemptionrecord(request)`
- `sapi_get_simple_earn_flexible_history_rewardsrecord(request)`
- `sapi_get_simple_earn_locked_history_rewardsrecord(request)`
- `sapi_get_simple_earn_flexible_history_collateralrecord(request)`
- `sapi_get_dci_product_list(request)`
- `sapi_get_dci_product_positions(request)`
- `sapi_get_dci_product_accounts(request)`
- `sapi_get_accumulator_product_list(request)`
- `sapi_get_accumulator_product_position_list(request)`
- `sapi_get_accumulator_product_sum_holding(request)`
- `sapi_post_asset_dust(request)`
- `sapi_post_asset_dust_btc(request)`
- `sapi_post_asset_transfer(request)`
- `sapi_post_asset_get_funding_asset(request)`
- `sapi_post_asset_convert_transfer(request)`
- `sapi_post_account_disablefastwithdrawswitch(request)`
- `sapi_post_account_enablefastwithdrawswitch(request)`
- `sapi_post_capital_withdraw_apply(request)`
- `sapi_post_capital_contract_convertible_coins(request)`
- `sapi_post_capital_deposit_credit_apply(request)`
- `sapi_post_margin_borrow_repay(request)`
- `sapi_post_margin_transfer(request)`
- `sapi_post_margin_loan(request)`
- `sapi_post_margin_repay(request)`
- `sapi_post_margin_order(request)`
- `sapi_post_margin_order_oco(request)`
- `sapi_post_margin_dust(request)`
- `sapi_post_margin_exchange_small_liability(request)`
- `sapi_post_margin_isolated_transfer(request)`
- `sapi_post_margin_isolated_account(request)`
- `sapi_post_margin_max_leverage(request)`
- `sapi_post_bnbburn(request)`
- `sapi_post_sub_account_virtualsubaccount(request)`
- `sapi_post_sub_account_margin_transfer(request)`
- `sapi_post_sub_account_margin_enable(request)`
- `sapi_post_sub_account_futures_enable(request)`
- `sapi_post_sub_account_futures_transfer(request)`
- `sapi_post_sub_account_futures_internaltransfer(request)`
- `sapi_post_sub_account_transfer_subtosub(request)`
- `sapi_post_sub_account_transfer_subtomaster(request)`
- `sapi_post_sub_account_universaltransfer(request)`
- `sapi_post_sub_account_options_enable(request)`
- `sapi_post_managed_subaccount_deposit(request)`
- `sapi_post_managed_subaccount_withdraw(request)`
- `sapi_post_userdatastream(request)`
- `sapi_post_userdatastream_isolated(request)`
- `sapi_post_userlistentoken(request)`
- `sapi_post_futures_transfer(request)`
- `sapi_post_lending_customizedfixed_purchase(request)`
- `sapi_post_lending_daily_purchase(request)`
- `sapi_post_lending_daily_redeem(request)`
- `sapi_post_bswap_liquidityadd(request)`
- `sapi_post_bswap_liquidityremove(request)`
- `sapi_post_bswap_swap(request)`
- `sapi_post_bswap_claimrewards(request)`
- `sapi_post_blvt_subscribe(request)`
- `sapi_post_blvt_redeem(request)`
- `sapi_post_apireferral_customization(request)`
- `sapi_post_apireferral_usercustomization(request)`
- `sapi_post_apireferral_rebate_historicalrecord(request)`
- `sapi_post_apireferral_kickback_historicalrecord(request)`
- `sapi_post_broker_subaccount(request)`
- `sapi_post_broker_subaccount_margin(request)`
- `sapi_post_broker_subaccount_futures(request)`
- `sapi_post_broker_subaccountapi(request)`
- `sapi_post_broker_subaccountapi_permission(request)`
- `sapi_post_broker_subaccountapi_commission(request)`
- `sapi_post_broker_subaccountapi_commission_futures(request)`
- `sapi_post_broker_subaccountapi_commission_coinfutures(request)`
- `sapi_post_broker_transfer(request)`
- `sapi_post_broker_transfer_futures(request)`
- `sapi_post_broker_rebate_historicalrecord(request)`
- `sapi_post_broker_subaccount_bnbburn_spot(request)`
- `sapi_post_broker_subaccount_bnbburn_margininterest(request)`
- `sapi_post_broker_subaccount_blvt(request)`
- `sapi_post_broker_subaccountapi_iprestriction(request)`
- `sapi_post_broker_subaccountapi_iprestriction_iplist(request)`
- `sapi_post_broker_universaltransfer(request)`
- `sapi_post_broker_subaccountapi_permission_universaltransfer(request)`
- `sapi_post_broker_subaccountapi_permission_vanillaoptions(request)`
- `sapi_post_giftcard_createcode(request)`
- `sapi_post_giftcard_redeemcode(request)`
- `sapi_post_giftcard_buycode(request)`
- `sapi_post_algo_spot_newordertwap(request)`
- `sapi_post_algo_futures_newordervp(request)`
- `sapi_post_algo_futures_newordertwap(request)`
- `sapi_post_staking_purchase(request)`
- `sapi_post_staking_redeem(request)`
- `sapi_post_staking_setautostaking(request)`
- `sapi_post_eth_staking_eth_stake(request)`
- `sapi_post_eth_staking_eth_redeem(request)`
- `sapi_post_eth_staking_wbeth_wrap(request)`
- `sapi_post_sol_staking_sol_stake(request)`
- `sapi_post_sol_staking_sol_redeem(request)`
- `sapi_post_mining_hash_transfer_config(request)`
- `sapi_post_mining_hash_transfer_config_cancel(request)`
- `sapi_post_portfolio_repay(request)`
- `sapi_post_loan_vip_renew(request)`
- `sapi_post_loan_vip_borrow(request)`
- `sapi_post_loan_borrow(request)`
- `sapi_post_loan_repay(request)`
- `sapi_post_loan_adjust_ltv(request)`
- `sapi_post_loan_customize_margin_call(request)`
- `sapi_post_loan_flexible_repay(request)`
- `sapi_post_loan_flexible_adjust_ltv(request)`
- `sapi_post_loan_vip_repay(request)`
- `sapi_post_convert_getquote(request)`
- `sapi_post_convert_acceptquote(request)`
- `sapi_post_convert_limit_placeorder(request)`
- `sapi_post_convert_limit_cancelorder(request)`
- `sapi_post_portfolio_auto_collection(request)`
- `sapi_post_portfolio_asset_collection(request)`
- `sapi_post_portfolio_bnb_transfer(request)`
- `sapi_post_portfolio_repay_futures_switch(request)`
- `sapi_post_portfolio_repay_futures_negative_balance(request)`
- `sapi_post_portfolio_mint(request)`
- `sapi_post_portfolio_redeem(request)`
- `sapi_post_portfolio_earn_asset_transfer(request)`
- `sapi_post_portfolio_delta_mode(request)`
- `sapi_post_lending_auto_invest_plan_add(request)`
- `sapi_post_lending_auto_invest_plan_edit(request)`
- `sapi_post_lending_auto_invest_plan_edit_status(request)`
- `sapi_post_lending_auto_invest_one_off(request)`
- `sapi_post_lending_auto_invest_redeem(request)`
- `sapi_post_simple_earn_flexible_subscribe(request)`
- `sapi_post_simple_earn_locked_subscribe(request)`
- `sapi_post_simple_earn_flexible_redeem(request)`
- `sapi_post_simple_earn_locked_redeem(request)`
- `sapi_post_simple_earn_flexible_setautosubscribe(request)`
- `sapi_post_simple_earn_locked_setautosubscribe(request)`
- `sapi_post_simple_earn_locked_setredeemoption(request)`
- `sapi_post_dci_product_subscribe(request)`
- `sapi_post_dci_product_auto_compound_edit(request)`
- `sapi_post_accumulator_product_subscribe(request)`
- `sapi_put_userdatastream(request)`
- `sapi_put_userdatastream_isolated(request)`
- `sapi_delete_margin_openorders(request)`
- `sapi_delete_margin_order(request)`
- `sapi_delete_margin_orderlist(request)`
- `sapi_delete_margin_isolated_account(request)`
- `sapi_delete_userdatastream(request)`
- `sapi_delete_userdatastream_isolated(request)`
- `sapi_delete_broker_subaccountapi(request)`
- `sapi_delete_broker_subaccountapi_iprestriction_iplist(request)`
- `sapi_delete_algo_spot_order(request)`
- `sapi_delete_algo_futures_order(request)`
- `sapi_delete_sub_account_subaccountapi_iprestriction_iplist(request)`
- `sapiv2_get_eth_staking_account(request)`
- `sapiv2_get_sub_account_futures_account(request)`
- `sapiv2_get_sub_account_futures_accountsummary(request)`
- `sapiv2_get_sub_account_futures_positionrisk(request)`
- `sapiv2_get_loan_flexible_ongoing_orders(request)`
- `sapiv2_get_loan_flexible_borrow_history(request)`
- `sapiv2_get_loan_flexible_repay_history(request)`
- `sapiv2_get_loan_flexible_ltv_adjustment_history(request)`
- `sapiv2_get_loan_flexible_loanable_data(request)`
- `sapiv2_get_loan_flexible_collateral_data(request)`
- `sapiv2_get_portfolio_account(request)`
- `sapiv2_post_eth_staking_eth_stake(request)`
- `sapiv2_post_sub_account_subaccountapi_iprestriction(request)`
- `sapiv2_post_loan_flexible_borrow(request)`
- `sapiv2_post_loan_flexible_repay(request)`
- `sapiv2_post_loan_flexible_adjust_ltv(request)`
- `sapiv3_get_sub_account_assets(request)`
- `sapiv3_post_asset_getuserasset(request)`
- `sapiv4_get_sub_account_assets(request)`
- `dapipublic_get_ping(request)`
- `dapipublic_get_time(request)`
- `dapipublic_get_exchangeinfo(request)`
- `dapipublic_get_depth(request)`
- `dapipublic_get_trades(request)`
- `dapipublic_get_historicaltrades(request)`
- `dapipublic_get_aggtrades(request)`
- `dapipublic_get_premiumindex(request)`
- `dapipublic_get_fundingrate(request)`
- `dapipublic_get_klines(request)`
- `dapipublic_get_continuousklines(request)`
- `dapipublic_get_indexpriceklines(request)`
- `dapipublic_get_markpriceklines(request)`
- `dapipublic_get_premiumindexklines(request)`
- `dapipublic_get_ticker_24hr(request)`
- `dapipublic_get_ticker_price(request)`
- `dapipublic_get_ticker_bookticker(request)`
- `dapipublic_get_constituents(request)`
- `dapipublic_get_openinterest(request)`
- `dapipublic_get_fundinginfo(request)`
- `dapidata_get_delivery_price(request)`
- `dapidata_get_openinteresthist(request)`
- `dapidata_get_toplongshortaccountratio(request)`
- `dapidata_get_toplongshortpositionratio(request)`
- `dapidata_get_globallongshortaccountratio(request)`
- `dapidata_get_takerbuysellvol(request)`
- `dapidata_get_basis(request)`
- `dapiprivate_get_positionside_dual(request)`
- `dapiprivate_get_orderamendment(request)`
- `dapiprivate_get_order(request)`
- `dapiprivate_get_openorder(request)`
- `dapiprivate_get_openorders(request)`
- `dapiprivate_get_allorders(request)`
- `dapiprivate_get_balance(request)`
- `dapiprivate_get_account(request)`
- `dapiprivate_get_positionmargin_history(request)`
- `dapiprivate_get_positionrisk(request)`
- `dapiprivate_get_usertrades(request)`
- `dapiprivate_get_income(request)`
- `dapiprivate_get_leveragebracket(request)`
- `dapiprivate_get_forceorders(request)`
- `dapiprivate_get_adlquantile(request)`
- `dapiprivate_get_commissionrate(request)`
- `dapiprivate_get_income_asyn(request)`
- `dapiprivate_get_income_asyn_id(request)`
- `dapiprivate_get_trade_asyn(request)`
- `dapiprivate_get_trade_asyn_id(request)`
- `dapiprivate_get_order_asyn(request)`
- `dapiprivate_get_order_asyn_id(request)`
- `dapiprivate_get_pmexchangeinfo(request)`
- `dapiprivate_get_pmaccountinfo(request)`
- `dapiprivate_post_positionside_dual(request)`
- `dapiprivate_post_order(request)`
- `dapiprivate_post_batchorders(request)`
- `dapiprivate_post_countdowncancelall(request)`
- `dapiprivate_post_leverage(request)`
- `dapiprivate_post_margintype(request)`
- `dapiprivate_post_positionmargin(request)`
- `dapiprivate_post_listenkey(request)`
- `dapiprivate_put_listenkey(request)`
- `dapiprivate_put_order(request)`
- `dapiprivate_put_batchorders(request)`
- `dapiprivate_delete_order(request)`
- `dapiprivate_delete_allopenorders(request)`
- `dapiprivate_delete_batchorders(request)`
- `dapiprivate_delete_listenkey(request)`
- `dapiprivatev2_get_leveragebracket(request)`
- `fapipublic_get_ping(request)`
- `fapipublic_get_time(request)`
- `fapipublic_get_exchangeinfo(request)`
- `fapipublic_get_depth(request)`
- `fapipublic_get_rpidepth(request)`
- `fapipublic_get_trades(request)`
- `fapipublic_get_historicaltrades(request)`
- `fapipublic_get_aggtrades(request)`
- `fapipublic_get_klines(request)`
- `fapipublic_get_continuousklines(request)`
- `fapipublic_get_markpriceklines(request)`
- `fapipublic_get_indexpriceklines(request)`
- `fapipublic_get_premiumindexklines(request)`
- `fapipublic_get_fundingrate(request)`
- `fapipublic_get_fundinginfo(request)`
- `fapipublic_get_premiumindex(request)`
- `fapipublic_get_ticker_24hr(request)`
- `fapipublic_get_ticker_price(request)`
- `fapipublic_get_ticker_bookticker(request)`
- `fapipublic_get_openinterest(request)`
- `fapipublic_get_indexinfo(request)`
- `fapipublic_get_assetindex(request)`
- `fapipublic_get_constituents(request)`
- `fapipublic_get_apitradingstatus(request)`
- `fapipublic_get_lvtklines(request)`
- `fapipublic_get_convert_exchangeinfo(request)`
- `fapipublic_get_insurancebalance(request)`
- `fapipublic_get_symboladlrisk(request)`
- `fapipublic_get_tradingschedule(request)`
- `fapidata_get_delivery_price(request)`
- `fapidata_get_openinteresthist(request)`
- `fapidata_get_toplongshortaccountratio(request)`
- `fapidata_get_toplongshortpositionratio(request)`
- `fapidata_get_globallongshortaccountratio(request)`
- `fapidata_get_takerlongshortratio(request)`
- `fapidata_get_basis(request)`
- `fapiprivate_get_forceorders(request)`
- `fapiprivate_get_allorders(request)`
- `fapiprivate_get_openorder(request)`
- `fapiprivate_get_openorders(request)`
- `fapiprivate_get_order(request)`
- `fapiprivate_get_account(request)`
- `fapiprivate_get_balance(request)`
- `fapiprivate_get_leveragebracket(request)`
- `fapiprivate_get_positionmargin_history(request)`
- `fapiprivate_get_positionrisk(request)`
- `fapiprivate_get_positionside_dual(request)`
- `fapiprivate_get_usertrades(request)`
- `fapiprivate_get_income(request)`
- `fapiprivate_get_commissionrate(request)`
- `fapiprivate_get_ratelimit_order(request)`
- `fapiprivate_get_apitradingstatus(request)`
- `fapiprivate_get_multiassetsmargin(request)`
- `fapiprivate_get_apireferral_ifnewuser(request)`
- `fapiprivate_get_apireferral_customization(request)`
- `fapiprivate_get_apireferral_usercustomization(request)`
- `fapiprivate_get_apireferral_tradernum(request)`
- `fapiprivate_get_apireferral_overview(request)`
- `fapiprivate_get_apireferral_tradevol(request)`
- `fapiprivate_get_apireferral_rebatevol(request)`
- `fapiprivate_get_apireferral_tradersummary(request)`
- `fapiprivate_get_adlquantile(request)`
- `fapiprivate_get_pmaccountinfo(request)`
- `fapiprivate_get_orderamendment(request)`
- `fapiprivate_get_income_asyn(request)`
- `fapiprivate_get_income_asyn_id(request)`
- `fapiprivate_get_order_asyn(request)`
- `fapiprivate_get_order_asyn_id(request)`
- `fapiprivate_get_trade_asyn(request)`
- `fapiprivate_get_trade_asyn_id(request)`
- `fapiprivate_get_feeburn(request)`
- `fapiprivate_get_symbolconfig(request)`
- `fapiprivate_get_accountconfig(request)`
- `fapiprivate_get_convert_orderstatus(request)`
- `fapiprivate_get_algoorder(request)`
- `fapiprivate_get_openalgoorders(request)`
- `fapiprivate_get_allalgoorders(request)`
- `fapiprivate_get_stock_contract(request)`
- `fapiprivate_post_batchorders(request)`
- `fapiprivate_post_positionside_dual(request)`
- `fapiprivate_post_positionmargin(request)`
- `fapiprivate_post_margintype(request)`
- `fapiprivate_post_order(request)`
- `fapiprivate_post_order_test(request)`
- `fapiprivate_post_leverage(request)`
- `fapiprivate_post_listenkey(request)`
- `fapiprivate_post_countdowncancelall(request)`
- `fapiprivate_post_multiassetsmargin(request)`
- `fapiprivate_post_apireferral_customization(request)`
- `fapiprivate_post_apireferral_usercustomization(request)`
- `fapiprivate_post_feeburn(request)`
- `fapiprivate_post_convert_getquote(request)`
- `fapiprivate_post_convert_acceptquote(request)`
- `fapiprivate_post_algoorder(request)`
- `fapiprivate_put_listenkey(request)`
- `fapiprivate_put_order(request)`
- `fapiprivate_put_batchorders(request)`
- `fapiprivate_delete_batchorders(request)`
- `fapiprivate_delete_order(request)`
- `fapiprivate_delete_allopenorders(request)`
- `fapiprivate_delete_listenkey(request)`
- `fapiprivate_delete_algoorder(request)`
- `fapiprivate_delete_algoopenorders(request)`
- `fapipublicv2_get_ticker_price(request)`
- `fapiprivatev2_get_account(request)`
- `fapiprivatev2_get_balance(request)`
- `fapiprivatev2_get_positionrisk(request)`
- `fapiprivatev3_get_account(request)`
- `fapiprivatev3_get_balance(request)`
- `fapiprivatev3_get_positionrisk(request)`
- `eapipublic_get_ping(request)`
- `eapipublic_get_time(request)`
- `eapipublic_get_exchangeinfo(request)`
- `eapipublic_get_index(request)`
- `eapipublic_get_ticker(request)`
- `eapipublic_get_mark(request)`
- `eapipublic_get_depth(request)`
- `eapipublic_get_klines(request)`
- `eapipublic_get_trades(request)`
- `eapipublic_get_historicaltrades(request)`
- `eapipublic_get_exercisehistory(request)`
- `eapipublic_get_openinterest(request)`
- `eapiprivate_get_account(request)`
- `eapiprivate_get_position(request)`
- `eapiprivate_get_openorders(request)`
- `eapiprivate_get_historyorders(request)`
- `eapiprivate_get_usertrades(request)`
- `eapiprivate_get_exerciserecord(request)`
- `eapiprivate_get_bill(request)`
- `eapiprivate_get_income_asyn(request)`
- `eapiprivate_get_income_asyn_id(request)`
- `eapiprivate_get_marginaccount(request)`
- `eapiprivate_get_mmp(request)`
- `eapiprivate_get_countdowncancelall(request)`
- `eapiprivate_get_order(request)`
- `eapiprivate_get_block_order_orders(request)`
- `eapiprivate_get_block_order_execute(request)`
- `eapiprivate_get_block_user_trades(request)`
- `eapiprivate_get_blocktrades(request)`
- `eapiprivate_get_comission(request)`
- `eapiprivate_post_order(request)`
- `eapiprivate_post_batchorders(request)`
- `eapiprivate_post_listenkey(request)`
- `eapiprivate_post_mmpset(request) | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:47.072391 | binance-0.3.105.tar.gz | 929,972 | 0f/8c/00a44d5b22d183429177f9e513982b9d2827ce2933571b14ea394e6e48bf/binance-0.3.105.tar.gz | source | sdist | null | false | 3fcd4a1b020b9259439b7ff58c3ad189 | 0dbb0c3b892000958eafb54ede8b3597970460ac8d757c8f609aa513024d15aa | 0f8c00a44d5b22d183429177f9e513982b9d2827ce2933571b14ea394e6e48bf | null | [] | 999 |
2.4 | bybit-api | 0.0.124 | bybit crypto exchange api client | # bybit-python
Python SDK (sync and async) for the Bybit cryptocurrency exchange, with REST and WebSocket support.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/bybit)
- You can check Bybit's docs here: [Docs](https://bybit.com/apidocs1)
- Github repo: https://github.com/ccxt/bybit-python
- Pypi package: https://pypi.org/project/bybit-api
## Installation
```bash
pip install bybit-api
```
## Usage
### Sync
```Python
from bybit import BybitSync

def main():
    instance = BybitSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)

    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
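The examples use CCXT-style unified symbols such as `"BTC/USDC"`, which the SDK translates into Bybit's own market ids internally. A minimal sketch of that mapping (a hypothetical helper for illustration only — the real conversion also covers contract suffixes like `BTC/USDT:USDT`):

```python
def to_exchange_id(unified_symbol: str) -> str:
    """Hypothetical illustration: CCXT unified spot symbols look like
    "BTC/USDC"; Bybit market ids simply drop the slash ("BTCUSDC").
    The SDK performs this mapping for you when you pass unified symbols."""
    base, _, quote = unified_symbol.partition("/")
    return f"{base}{quote}"

print(to_exchange_id("BTC/USDC"))  # BTCUSDC
```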
### Async
```Python
import sys
import asyncio

from bybit import BybitAsync

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = BybitAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)

    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio

from bybit import BybitWs

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = BybitWs({})
    while True:
        ob = await instance.watch_order_book("BTC/USDC")
        print(ob)
        # orders = await instance.watch_orders("BTC/USDC")

    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints listed under "REST Raw" below:
```Python
# illustrative request to one of the listed implicit endpoints
request = {
    'category': 'spot',
    'symbol': 'BTCUSDT',
    'interval': '60',
}
response = await instance.public_get_v5_market_kline(request)
```
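Note that raw endpoints expect Bybit's native parameter formats rather than CCXT's unified ones — for example, v5 kline intervals are strings like `'60'` or `'D'` instead of `'1h'` or `'1d'`. A rough sketch of that timeframe translation (illustrative values assuming Bybit's v5 interval codes; unified methods such as `fetch_ohlcv` handle this for you):

```python
# Illustrative mapping from CCXT unified timeframes to the interval
# strings Bybit's raw v5 kline endpoints expect (minutes as numbers,
# day/week/month as letters). Not exhaustive -- a sketch, not the SDK's
# internal table.
TIMEFRAMES = {
    "1m": "1",
    "5m": "5",
    "15m": "15",
    "1h": "60",
    "4h": "240",
    "1d": "D",
    "1w": "W",
}

def to_bybit_interval(timeframe: str) -> str:
    return TIMEFRAMES[timeframe]

print(to_bybit_interval("1h"))  # 60
```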
## Available methods
### REST Unified
- `create_convert_trade(self, id: str, fromCode: str, toCode: str, amount: Num = None, params={})`
- `create_expired_option_market(self, symbol: str)`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_market_sell_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={}, isUTA=True)`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_all_greeks(self, symbols: Strings = None, params={})`
- `fetch_balance(self, params={})`
- `fetch_bids_asks(self, symbols: Strings = None, params={})`
- `fetch_borrow_interest(self, code: Str = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_borrow_rate_history(self, code: str, since: Int = None, limit: Int = None, params={})`
- `fetch_canceled_and_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_canceled_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_order(self, id: str, symbol: Str = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_convert_currencies(self, params={})`
- `fetch_convert_quote(self, fromCode: str, toCode: str, amount: Num = None, params={})`
- `fetch_convert_trade_history(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_convert_trade(self, id: str, code: Str = None, params={})`
- `fetch_cross_borrow_rate(self, code: str, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_addresses_by_network(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_derivatives_market_leverage_tiers(self, symbol: str, params={})`
- `fetch_derivatives_open_interest_history(self, symbol: str, timeframe='1h', since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_future_markets(self, params)`
- `fetch_greeks(self, symbol: str, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_leverage_tiers(self, symbols: Strings = None, params={})`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_long_short_ratio_history(self, symbol: Str = None, timeframe: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_margin_mode(self, symbol: str, params={})`
- `fetch_market_leverage_tiers(self, symbol: str, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_liquidations(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_settlement_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest_history(self, symbol: str, timeframe='1h', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest(self, symbol: str, params={})`
- `fetch_open_order(self, id: str, symbol: Str = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_option_chain(self, code: str, params={})`
- `fetch_option_markets(self, params)`
- `fetch_option(self, symbol: str, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_classic(self, id: str, symbol: Str = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders_classic(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions_history(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_settlement_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_spot_markets(self, params)`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_trading_fees(self, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_volatility_history(self, code: str, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_pagination_cursor_to_result(self, response)`
- `borrow_cross_margin(self, code: str, amount: float, params={})`
- `cancel_all_orders_after(self, timeout: Int, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order_request(self, id: str, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders_for_symbols(self, orders: List[CancellationRequest], params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `describe(self)`
- `edit_order_request(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `edit_orders(self, orders: List[OrderRequest], params={})`
- `enable_demo_trading(self, enable: bool)`
- `get_amount(self, symbol: str, amount: float)`
- `get_bybit_type(self, method, market, params={})`
- `get_cost(self, symbol: str, cost: str)`
- `get_leverage_tiers_paginated(self, symbol: Str = None, params={})`
- `get_price(self, symbol: str, price: str)`
- `is_unified_enabled(self, params={})`
- `nonce(self)`
- `repay_cross_margin(self, code: str, amount, params={})`
- `safe_market(self, marketId: Str = None, market: Market = None, delimiter: Str = None, marketType: Str = None)`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_margin_mode(self, marginMode: str, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `upgrade_unified_trade_account(self, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
### REST Raw
- `public_get_spot_v3_public_symbols(request)`
- `public_get_spot_v3_public_quote_depth(request)`
- `public_get_spot_v3_public_quote_depth_merged(request)`
- `public_get_spot_v3_public_quote_trades(request)`
- `public_get_spot_v3_public_quote_kline(request)`
- `public_get_spot_v3_public_quote_ticker_24hr(request)`
- `public_get_spot_v3_public_quote_ticker_price(request)`
- `public_get_spot_v3_public_quote_ticker_bookticker(request)`
- `public_get_spot_v3_public_server_time(request)`
- `public_get_spot_v3_public_infos(request)`
- `public_get_spot_v3_public_margin_product_infos(request)`
- `public_get_spot_v3_public_margin_ensure_tokens(request)`
- `public_get_v3_public_time(request)`
- `public_get_contract_v3_public_copytrading_symbol_list(request)`
- `public_get_derivatives_v3_public_order_book_l2(request)`
- `public_get_derivatives_v3_public_kline(request)`
- `public_get_derivatives_v3_public_tickers(request)`
- `public_get_derivatives_v3_public_instruments_info(request)`
- `public_get_derivatives_v3_public_mark_price_kline(request)`
- `public_get_derivatives_v3_public_index_price_kline(request)`
- `public_get_derivatives_v3_public_funding_history_funding_rate(request)`
- `public_get_derivatives_v3_public_risk_limit_list(request)`
- `public_get_derivatives_v3_public_delivery_price(request)`
- `public_get_derivatives_v3_public_recent_trade(request)`
- `public_get_derivatives_v3_public_open_interest(request)`
- `public_get_derivatives_v3_public_insurance(request)`
- `public_get_v5_announcements_index(request)`
- `public_get_v5_market_time(request)`
- `public_get_v5_market_kline(request)`
- `public_get_v5_market_mark_price_kline(request)`
- `public_get_v5_market_index_price_kline(request)`
- `public_get_v5_market_premium_index_price_kline(request)`
- `public_get_v5_market_instruments_info(request)`
- `public_get_v5_market_orderbook(request)`
- `public_get_v5_market_tickers(request)`
- `public_get_v5_market_funding_history(request)`
- `public_get_v5_market_recent_trade(request)`
- `public_get_v5_market_open_interest(request)`
- `public_get_v5_market_historical_volatility(request)`
- `public_get_v5_market_insurance(request)`
- `public_get_v5_market_risk_limit(request)`
- `public_get_v5_market_delivery_price(request)`
- `public_get_v5_market_account_ratio(request)`
- `public_get_v5_spot_lever_token_info(request)`
- `public_get_v5_spot_lever_token_reference(request)`
- `public_get_v5_spot_margin_trade_data(request)`
- `public_get_v5_spot_margin_trade_collateral(request)`
- `public_get_v5_spot_cross_margin_trade_data(request)`
- `public_get_v5_spot_cross_margin_trade_pledge_token(request)`
- `public_get_v5_spot_cross_margin_trade_borrow_token(request)`
- `public_get_v5_crypto_loan_collateral_data(request)`
- `public_get_v5_crypto_loan_loanable_data(request)`
- `public_get_v5_crypto_loan_common_loanable_data(request)`
- `public_get_v5_crypto_loan_common_collateral_data(request)`
- `public_get_v5_crypto_loan_fixed_supply_order_quote(request)`
- `public_get_v5_crypto_loan_fixed_borrow_order_quote(request)`
- `public_get_v5_ins_loan_product_infos(request)`
- `public_get_v5_ins_loan_ensure_tokens_convert(request)`
- `public_get_v5_earn_product(request)`
- `private_get_v5_market_instruments_info(request)`
- `private_get_v2_private_wallet_fund_records(request)`
- `private_get_spot_v3_private_order(request)`
- `private_get_spot_v3_private_open_orders(request)`
- `private_get_spot_v3_private_history_orders(request)`
- `private_get_spot_v3_private_my_trades(request)`
- `private_get_spot_v3_private_account(request)`
- `private_get_spot_v3_private_reference(request)`
- `private_get_spot_v3_private_record(request)`
- `private_get_spot_v3_private_cross_margin_orders(request)`
- `private_get_spot_v3_private_cross_margin_account(request)`
- `private_get_spot_v3_private_cross_margin_loan_info(request)`
- `private_get_spot_v3_private_cross_margin_repay_history(request)`
- `private_get_spot_v3_private_margin_loan_infos(request)`
- `private_get_spot_v3_private_margin_repaid_infos(request)`
- `private_get_spot_v3_private_margin_ltv(request)`
- `private_get_asset_v3_private_transfer_inter_transfer_list_query(request)`
- `private_get_asset_v3_private_transfer_sub_member_list_query(request)`
- `private_get_asset_v3_private_transfer_sub_member_transfer_list_query(request)`
- `private_get_asset_v3_private_transfer_universal_transfer_list_query(request)`
- `private_get_asset_v3_private_coin_info_query(request)`
- `private_get_asset_v3_private_deposit_address_query(request)`
- `private_get_contract_v3_private_copytrading_order_list(request)`
- `private_get_contract_v3_private_copytrading_position_list(request)`
- `private_get_contract_v3_private_copytrading_wallet_balance(request)`
- `private_get_contract_v3_private_position_limit_info(request)`
- `private_get_contract_v3_private_order_unfilled_orders(request)`
- `private_get_contract_v3_private_order_list(request)`
- `private_get_contract_v3_private_position_list(request)`
- `private_get_contract_v3_private_execution_list(request)`
- `private_get_contract_v3_private_position_closed_pnl(request)`
- `private_get_contract_v3_private_account_wallet_balance(request)`
- `private_get_contract_v3_private_account_fee_rate(request)`
- `private_get_contract_v3_private_account_wallet_fund_records(request)`
- `private_get_unified_v3_private_order_unfilled_orders(request)`
- `private_get_unified_v3_private_order_list(request)`
- `private_get_unified_v3_private_position_list(request)`
- `private_get_unified_v3_private_execution_list(request)`
- `private_get_unified_v3_private_delivery_record(request)`
- `private_get_unified_v3_private_settlement_record(request)`
- `private_get_unified_v3_private_account_wallet_balance(request)`
- `private_get_unified_v3_private_account_transaction_log(request)`
- `private_get_unified_v3_private_account_borrow_history(request)`
- `private_get_unified_v3_private_account_borrow_rate(request)`
- `private_get_unified_v3_private_account_info(request)`
- `private_get_user_v3_private_frozen_sub_member(request)`
- `private_get_user_v3_private_query_sub_members(request)`
- `private_get_user_v3_private_query_api(request)`
- `private_get_user_v3_private_get_member_type(request)`
- `private_get_asset_v3_private_transfer_transfer_coin_list_query(request)`
- `private_get_asset_v3_private_transfer_account_coin_balance_query(request)`
- `private_get_asset_v3_private_transfer_account_coins_balance_query(request)`
- `private_get_asset_v3_private_transfer_asset_info_query(request)`
- `private_get_asset_v3_public_deposit_allowed_deposit_list_query(request)`
- `private_get_asset_v3_private_deposit_record_query(request)`
- `private_get_asset_v3_private_withdraw_record_query(request)`
- `private_get_v5_order_realtime(request)`
- `private_get_v5_order_history(request)`
- `private_get_v5_order_spot_borrow_check(request)`
- `private_get_v5_position_list(request)`
- `private_get_v5_execution_list(request)`
- `private_get_v5_position_closed_pnl(request)`
- `private_get_v5_position_move_history(request)`
- `private_get_v5_pre_upgrade_order_history(request)`
- `private_get_v5_pre_upgrade_execution_list(request)`
- `private_get_v5_pre_upgrade_position_closed_pnl(request)`
- `private_get_v5_pre_upgrade_account_transaction_log(request)`
- `private_get_v5_pre_upgrade_asset_delivery_record(request)`
- `private_get_v5_pre_upgrade_asset_settlement_record(request)`
- `private_get_v5_account_wallet_balance(request)`
- `private_get_v5_account_borrow_history(request)`
- `private_get_v5_account_instruments_info(request)`
- `private_get_v5_account_collateral_info(request)`
- `private_get_v5_asset_coin_greeks(request)`
- `private_get_v5_account_fee_rate(request)`
- `private_get_v5_account_info(request)`
- `private_get_v5_account_transaction_log(request)`
- `private_get_v5_account_contract_transaction_log(request)`
- `private_get_v5_account_smp_group(request)`
- `private_get_v5_account_mmp_state(request)`
- `private_get_v5_account_withdrawal(request)`
- `private_get_v5_asset_exchange_query_coin_list(request)`
- `private_get_v5_asset_exchange_convert_result_query(request)`
- `private_get_v5_asset_exchange_query_convert_history(request)`
- `private_get_v5_asset_exchange_order_record(request)`
- `private_get_v5_asset_delivery_record(request)`
- `private_get_v5_asset_settlement_record(request)`
- `private_get_v5_asset_transfer_query_asset_info(request)`
- `private_get_v5_asset_transfer_query_account_coins_balance(request)`
- `private_get_v5_asset_transfer_query_account_coin_balance(request)`
- `private_get_v5_asset_transfer_query_transfer_coin_list(request)`
- `private_get_v5_asset_transfer_query_inter_transfer_list(request)`
- `private_get_v5_asset_transfer_query_sub_member_list(request)`
- `private_get_v5_asset_transfer_query_universal_transfer_list(request)`
- `private_get_v5_asset_deposit_query_allowed_list(request)`
- `private_get_v5_asset_deposit_query_record(request)`
- `private_get_v5_asset_deposit_query_sub_member_record(request)`
- `private_get_v5_asset_deposit_query_internal_record(request)`
- `private_get_v5_asset_deposit_query_address(request)`
- `private_get_v5_asset_deposit_query_sub_member_address(request)`
- `private_get_v5_asset_coin_query_info(request)`
- `private_get_v5_asset_withdraw_query_address(request)`
- `private_get_v5_asset_withdraw_query_record(request)`
- `private_get_v5_asset_withdraw_withdrawable_amount(request)`
- `private_get_v5_asset_withdraw_vasp_list(request)`
- `private_get_v5_asset_convert_small_balance_list(request)`
- `private_get_v5_asset_convert_small_balance_history(request)`
- `private_get_v5_fiat_query_coin_list(request)`
- `private_get_v5_fiat_reference_price(request)`
- `private_get_v5_fiat_trade_query(request)`
- `private_get_v5_fiat_query_trade_history(request)`
- `private_get_v5_fiat_balance_query(request)`
- `private_get_v5_user_query_sub_members(request)`
- `private_get_v5_user_query_api(request)`
- `private_get_v5_user_sub_apikeys(request)`
- `private_get_v5_user_get_member_type(request)`
- `private_get_v5_user_aff_customer_info(request)`
- `private_get_v5_user_del_submember(request)`
- `private_get_v5_user_submembers(request)`
- `private_get_v5_affiliate_aff_user_list(request)`
- `private_get_v5_spot_lever_token_order_record(request)`
- `private_get_v5_spot_margin_trade_interest_rate_history(request)`
- `private_get_v5_spot_margin_trade_state(request)`
- `private_get_v5_spot_margin_trade_max_borrowable(request)`
- `private_get_v5_spot_margin_trade_position_tiers(request)`
- `private_get_v5_spot_margin_trade_coinstate(request)`
- `private_get_v5_spot_margin_trade_repayment_available_amount(request)`
- `private_get_v5_spot_margin_trade_get_auto_repay_mode(request)`
- `private_get_v5_spot_cross_margin_trade_loan_info(request)`
- `private_get_v5_spot_cross_margin_trade_account(request)`
- `private_get_v5_spot_cross_margin_trade_orders(request)`
- `private_get_v5_spot_cross_margin_trade_repay_history(request)`
- `private_get_v5_crypto_loan_borrowable_collateralisable_number(request)`
- `private_get_v5_crypto_loan_ongoing_orders(request)`
- `private_get_v5_crypto_loan_repayment_history(request)`
- `private_get_v5_crypto_loan_borrow_history(request)`
- `private_get_v5_crypto_loan_max_collateral_amount(request)`
- `private_get_v5_crypto_loan_adjustment_history(request)`
- `private_get_v5_crypto_loan_common_max_collateral_amount(request)`
- `private_get_v5_crypto_loan_common_adjustment_history(request)`
- `private_get_v5_crypto_loan_common_position(request)`
- `private_get_v5_crypto_loan_flexible_ongoing_coin(request)`
- `private_get_v5_crypto_loan_flexible_borrow_history(request)`
- `private_get_v5_crypto_loan_flexible_repayment_history(request)`
- `private_get_v5_crypto_loan_fixed_borrow_contract_info(request)`
- `private_get_v5_crypto_loan_fixed_supply_contract_info(request)`
- `private_get_v5_crypto_loan_fixed_borrow_order_info(request)`
- `private_get_v5_crypto_loan_fixed_renew_info(request)`
- `private_get_v5_crypto_loan_fixed_supply_order_info(request)`
- `private_get_v5_crypto_loan_fixed_repayment_history(request)`
- `private_get_v5_ins_loan_product_infos(request)`
- `private_get_v5_ins_loan_ensure_tokens_convert(request)`
- `private_get_v5_ins_loan_loan_order(request)`
- `private_get_v5_ins_loan_repaid_history(request)`
- `private_get_v5_ins_loan_ltv_convert(request)`
- `private_get_v5_lending_info(request)`
- `private_get_v5_lending_history_order(request)`
- `private_get_v5_lending_account(request)`
- `private_get_v5_broker_earning_record(request)`
- `private_get_v5_broker_earnings_info(request)`
- `private_get_v5_broker_account_info(request)`
- `private_get_v5_broker_asset_query_sub_member_deposit_record(request)`
- `private_get_v5_earn_product(request)`
- `private_get_v5_earn_order(request)`
- `private_get_v5_earn_position(request)`
- `private_get_v5_earn_yield(request)`
- `private_get_v5_earn_hourly_yield(request)`
- `private_post_spot_v3_private_order(request)`
- `private_post_spot_v3_private_cancel_order(request)`
- `private_post_spot_v3_private_cancel_orders(request)`
- `private_post_spot_v3_private_cancel_orders_by_ids(request)`
- `private_post_spot_v3_private_purchase(request)`
- `private_post_spot_v3_private_redeem(request)`
- `private_post_spot_v3_private_cross_margin_loan(request)`
- `private_post_spot_v3_private_cross_margin_repay(request)`
- `private_post_asset_v3_private_transfer_inter_transfer(request)`
- `private_post_asset_v3_private_withdraw_create(request)`
- `private_post_asset_v3_private_withdraw_cancel(request)`
- `private_post_asset_v3_private_transfer_sub_member_transfer(request)`
- `private_post_asset_v3_private_transfer_transfer_sub_member_save(request)`
- `private_post_asset_v3_private_transfer_universal_transfer(request)`
- `private_post_user_v3_private_create_sub_member(request)`
- `private_post_user_v3_private_create_sub_api(request)`
- `private_post_user_v3_private_update_api(request)`
- `private_post_user_v3_private_delete_api(request)`
- `private_post_user_v3_private_update_sub_api(request)`
- `private_post_user_v3_private_delete_sub_api(request)`
- `private_post_contract_v3_private_copytrading_order_create(request)`
- `private_post_contract_v3_private_copytrading_order_cancel(request)`
- `private_post_contract_v3_private_copytrading_order_close(request)`
- `private_post_contract_v3_private_copytrading_position_close(request)`
- `private_post_contract_v3_private_copytrading_position_set_leverage(request)`
- `private_post_contract_v3_private_copytrading_wallet_transfer(request)`
- `private_post_contract_v3_private_copytrading_order_trading_stop(request)`
- `private_post_contract_v3_private_order_create(request)`
- `private_post_contract_v3_private_order_cancel(request)`
- `private_post_contract_v3_private_order_cancel_all(request)`
- `private_post_contract_v3_private_order_replace(request)`
- `private_post_contract_v3_private_position_set_auto_add_margin(request)`
- `private_post_contract_v3_private_position_switch_isolated(request)`
- `private_post_contract_v3_private_position_switch_mode(request)`
- `private_post_contract_v3_private_position_switch_tpsl_mode(request)`
- `private_post_contract_v3_private_position_set_leverage(request)`
- `private_post_contract_v3_private_position_trading_stop(request)`
- `private_post_contract_v3_private_position_set_risk_limit(request)`
- `private_post_contract_v3_private_account_setmarginmode(request)`
- `private_post_unified_v3_private_order_create(request)`
- `private_post_unified_v3_private_order_replace(request)`
- `private_post_unified_v3_private_order_cancel(request)`
- `private_post_unified_v3_private_order_create_batch(request)`
- `private_post_unified_v3_private_order_replace_batch(request)`
- `private_post_unified_v3_private_order_cancel_batch(request)`
- `private_post_unified_v3_private_order_cancel_all(request)`
- `private_post_unified_v3_private_position_set_leverage(request)`
- `private_post_unified_v3_private_position_tpsl_switch_mode(request)`
- `private_post_unified_v3_private_position_set_risk_limit(request)`
- `private_post_unified_v3_private_position_trading_stop(request)`
- `private_post_unified_v3_private_account_upgrade_unified_account(request)`
- `private_post_unified_v3_private_account_setmarginmode(request)`
- `private_post_fht_compliance_tax_v3_private_registertime(request)`
- `private_post_fht_compliance_tax_v3_private_create(request)`
- `private_post_fht_compliance_tax_v3_private_status(request)`
- `private_post_fht_compliance_tax_v3_private_url(request)`
- `private_post_v5_order_create(request)`
- `private_post_v5_order_amend(request)`
- `private_post_v5_order_cancel(request)`
- `private_post_v5_order_cancel_all(request)`
- `private_post_v5_order_create_batch(request)`
- `private_post_v5_order_amend_batch(request)`
- `private_post_v5_order_cancel_batch(request)`
- `private_post_v5_order_disconnected_cancel_all(request)`
- `private_post_v5_position_set_leverage(request)`
- `private_post_v5_position_switch_isolated(request)`
- `private_post_v5_position_set_tpsl_mode(request)`
- `private_post_v5_position_switch_mode(request)`
- `private_post_v5_position_set_risk_limit(request)`
- `private_post_v5_position_trading_stop(request)`
- `private_post_v5_position_set_auto_add_margin(request)`
- `private_post_v5_position_add_margin(request)`
- `private_post_v5_position_move_positions(request)`
- `private_post_v5_position_confirm_pending_mmr(request)`
- `private_post_v5_account_upgrade_to_uta(request)`
- `private_post_v5_account_quick_repayment(request)`
- `private_post_v5_account_set_margin_mode(request)`
- `private_post_v5_account_set_hedging_mode(request)`
- `private_post_v5_account_mmp_modify(request)`
- `private_post_v5_account_mmp_reset(request)`
- `private_post_v5_account_borrow(request)`
- `private_post_v5_account_repay(request)`
- `private_post_v5_account_no_convert_repay(request)`
- `private_post_v5_account_set_limit_px_action(request)`
- `private_post_v5_asset_exchange_quote_apply(request)`
- `private_post_v5_asset_exchange_convert_execute(request)`
- `private_post_v5_asset_transfer_inter_transfer(request)`
- `private_post_v5_asset_transfer_save_transfer_sub_member(request)`
- `private_post_v5_asset_transfer_universal_transfer(request)`
- `private_post_v5_asset_deposit_deposit_to_account(request)`
- `private_post_v5_asset_withdraw_create(request)`
- `private_post_v5_asset_withdraw_cancel(request)`
- `private_post_v5_asset_covert_get_quote(request)`
- `private_post_v5_asset_covert_small_balance_execute(request)`
- `private_post_v5_fiat_quote_apply(request)`
- `private_post_v5_fiat_trade_execute(request)`
- `private_post_v5_user_create_sub_member(request)`
- `private_post_v5_user_create_sub_api(request)`
- `private_post_v5_user_frozen_sub_member(request)`
- `private_post_v5_user_update_api(request)`
- `private_post_v5_user_update_sub_api(request)`
- `private_post_v5_user_delete_api(request)`
- `private_post_v5_user_delete_sub_api(request)`
- `private_post_v5_spot_lever_token_purchase(request)`
- `private_post_v5_spot_lever_token_redeem(request)`
- `private_post_v5_spot_margin_trade_switch_mode(request)`
- `private_post_v5_spot_margin_trade_set_leverage(request)`
- `private_post_v5_spot_margin_trade_set_auto_repay_mode(request)`
- `private_post_v5_spot_cross_margin_trade_loan(request)`
- `private_post_v5_spot_cross_margin_trade_repay(request)`
- `private_post_v5_spot_cross_margin_trade_switch(request)`
- `private_post_v5_crypto_loan_borrow(request)`
- `private_post_v5_crypto_loan_repay(request)`
- `private_post_v5_crypto_loan_adjust_ltv(request)`
- `private_post_v5_crypto_loan_common_adjust_ltv(request)`
- `private_post_v5_crypto_loan_common_max_loan(request)`
- `private_post_v5_crypto_loan_flexible_borrow(request)`
- `private_post_v5_crypto_loan_flexible_repay(request)`
- `private_post_v5_crypto_loan_flexible_repay_collateral(request)`
- `private_post_v5_crypto_loan_fixed_borrow(request)`
- `private_post_v5_crypto_loan_fixed_renew(request)`
- `private_post_v5_crypto_loan_fixed_supply(request)`
- `private_post_v5_crypto_loan_fixed_borrow_order_cancel(request)`
- `private_post_v5_crypto_loan_fixed_supply_order_cancel(request)`
- `private_post_v5_crypto_loan_fixed_fully_repay(request)`
- `private_post_v5_crypto_loan_fixed_repay_collateral(request)`
- `private_post_v5_ins_loan_association_uid(request)`
- `private_post_v5_ins_loan_repay_loan(request)`
- `private_post_v5_lending_purchase(request)`
- `private_post_v5_lending_redeem(request)`
- `private_post_v5_lending_redeem_cancel(request)`
- `private_post_v5_account_set_collateral_switch(request)`
- `private_post_v5_account_set_collateral_switch_batch(request)`
- `private_post_v5_account_demo_apply_money(request)`
- `private_post_v5_broker_award_info(request)`
- `private_post_v5_broker_award_distribute_award(request)`
- `private_post_v5_broker_award_distribution_record(request)`
- `private_post_v5_earn_place_order(request)`
### WS Unified
- `describe(self)`
- `get_url_by_market_type(self, symbol: Str = None, isPrivate=False, method: Str = None, params={})`
- `clean_params(self, params)`
- `create_order_ws(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `edit_order_ws(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `cancel_order_ws(self, id: str, symbol: Str = None, params={})`
- `watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `un_watch_tickers(self, symbols: Strings = None, params={})`
- `un_watch_ticker(self, symbol: str, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `watch_ohlcv_for_symbols(self, symbolsAndTimeframes: List[List[str]], since: Int = None, limit: Int = None, params={})`
- `un_watch_ohlcv_for_symbols(self, symbolsAndTimeframes: List[List[str]], params={})`
- `un_watch_ohlcv(self, symbol: str, timeframe: str = '1m', params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `watch_order_book_for_symbols(self, symbols: List[str], limit: Int = None, params={})`
- `un_watch_order_book_for_symbols(self, symbols: List[str], params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_trades_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `un_watch_trades_for_symbols(self, symbols: List[str], params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `get_private_type(self, url)`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `un_watch_my_trades(self, symbol: Str = None, params={})`
- `watch_positions(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `set_positions_cache(self, client: Client, symbols: Strings = None)`
- `load_positions_snapshot(self, client, messageHash)`
- `un_watch_positions(self, symbols: Strings = None, params={})`
- `watch_liquidations(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `un_watch_orders(self, symbol: Str = None, params={})`
- `watch_balance(self, params={})`
- `watch_topics(self, url, messageHashes, topics, params={})`
- `un_watch_topics(self, url: str, topic: str, symbols: Strings, messageHashes: List[str], subMessageHashes: List[str], topics, params={}, subExtension={})`
- `authenticate(self, url, params={})`
## Contribution
- Give us a star :star:
- Fork and clone the repository
- Pick an existing issue or open a new one
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:44.345402 | bybit_api-0.0.124.tar.gz | 834,008 | c6/c7/1ee573ee2c63eb086683fcf0d7010b8a1d4f3799f09966e8516a0649ea34/bybit_api-0.0.124.tar.gz | source | sdist | null | false | 519c0930710fec265303010b6b932239 | 54ba9a6aa83bc17540dabbd178a84a62bf1dc3a1ddfb457aeefdc1c2901591fd | c6c71ee573ee2c63eb086683fcf0d7010b8a1d4f3799f09966e8516a0649ea34 | null | [] | 248 |
2.4 | bitget | 0.0.124 | bitget crypto exchange api client | # bitget-python
Python SDK (sync and async) for the Bitget cryptocurrency exchange, with REST and WebSocket capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/bitget)
- You can check Bitget's docs here: [Docs](https://www.google.com/search?q=google+bitget+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/bitget-python
- PyPI package: https://pypi.org/project/bitget
## Installation
```bash
pip install bitget
```
## Usage
### Sync
```Python
from bitget import BitgetSync

def main():
    instance = BitgetSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)

    # other examples (these require API keys):
    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
### Async
```Python
import sys
import asyncio
from bitget import BitgetAsync

# on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = BitgetAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)

    # other examples (these require API keys):
    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
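Because the async client returns coroutines, several symbols can be fetched concurrently with `asyncio.gather`. A minimal sketch of the pattern — the stub coroutine below stands in for a real `instance.fetch_order_book(...)` call so the snippet runs without network access or API keys:

```python
import asyncio

async def fetch_order_book(symbol: str) -> dict:
    # stand-in for `await instance.fetch_order_book(symbol)`
    await asyncio.sleep(0)
    return {"symbol": symbol, "bids": [], "asks": []}

async def main() -> list:
    symbols = ["BTC/USDC", "ETH/USDC"]
    # gather preserves input order, so results line up with `symbols`
    return await asyncio.gather(*(fetch_order_book(s) for s in symbols))

books = asyncio.run(main())
print([b["symbol"] for b in books])
```

With the real client, the same `gather` call fans the requests out over one shared session instead of issuing them one at a time.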
### Websockets
```Python
import sys
import asyncio
from bitget import BitgetWs

# on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = BitgetWs({})
    try:
        while True:
            ob = await instance.watch_order_book("BTC/USDC")
            print(ob)
            # orders = await instance.watch_orders("BTC/USDC")
    finally:
        # once you are done with the exchange
        await instance.close()

asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints:
```Python
# assuming `instance` is an initialized async client and `coin`, `tf`,
# `since`, `until` hold your query values:
request = {
    'type': 'candleSnapshot',
    'req': {
        'coin': coin,
        'interval': tf,
        'startTime': since,
        'endTime': until,
    },
}
response = await instance.public_post_info(request)
```
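Each implicit method name under "REST Raw" is derived mechanically from an endpoint: visibility, HTTP verb, and the path are joined with underscores, with non-alphanumeric path characters also mapped to underscores. A minimal sketch of that convention (the helper name and the exact regex are illustrative assumptions, not part of this package's API):

```python
import re

def implicit_method_name(visibility: str, http_method: str, path: str) -> str:
    # collapse '/', '-', and other separators in the path into '_'
    cleaned = re.sub(r'[^a-zA-Z0-9]+', '_', path.strip('/'))
    return f"{visibility}_{http_method}_{cleaned}".lower()

name = implicit_method_name("private", "get", "v5/crypto-loan/ongoing-orders")
print(name)  # private_get_v5_crypto_loan_ongoing_orders
```

Reading the mapping in reverse is a quick way to find the underlying REST endpoint for any method in the lists above.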
## Available methods
### REST Unified
- `create_convert_trade(self, id: str, fromCode: str, toCode: str, amount: Num = None, params={})`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `create_uta_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_uta_orders(self, orders: List[OrderRequest], params={})`
- `fetch_balance(self, params={})`
- `fetch_borrow_interest(self, code: Str = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_canceled_and_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_canceled_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_convert_currencies(self, params={})`
- `fetch_convert_quote(self, fromCode: str, toCode: str, amount: Num = None, params={})`
- `fetch_convert_trade_history(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_cross_borrow_rate(self, code: str, params={})`
- `fetch_currencies(self, params={})`
- `fetch_default_markets(self, params)`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_interval(self, symbol: str, params={})`
- `fetch_funding_intervals(self, symbols: Strings = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_isolated_borrow_rate(self, symbol: str, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_long_short_ratio_history(self, symbol: Str = None, timeframe: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_margin_mode(self, symbol: str, params={})`
- `fetch_mark_price(self, symbol: str, params={})`
- `fetch_market_leverage_tiers(self, symbol: str, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_liquidations(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest(self, symbol: str, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions_history(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_trading_fees(self, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_uta_canceled_and_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_uta_markets(self, params)`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_margin(self, symbol: str, amount: float, params={})`
- `borrow_cross_margin(self, code: str, amount: float, params={})`
- `borrow_isolated_margin(self, symbol: str, code: str, amount: float, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `cancel_uta_orders(self, ids, symbol: Str = None, params={})`
- `close_all_positions(self, params={})`
- `close_position(self, symbol: str, side: OrderSide = None, params={})`
- `describe(self)`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `enable_demo_trading(self, enabled: bool)`
- `modify_margin_helper(self, symbol: str, amount, type, params={})`
- `nonce(self)`
- `reduce_margin(self, symbol: str, amount: float, params={})`
- `repay_cross_margin(self, code: str, amount, params={})`
- `repay_isolated_margin(self, symbol: str, code: str, amount, params={})`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_margin_mode(self, marginMode: str, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `set_sandbox_mode(self, enabled: bool)`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
### REST Raw
- `public_common_get_v2_public_annoucements(request)`
- `public_common_get_v2_public_time(request)`
- `public_spot_get_spot_v1_notice_queryallnotices(request)`
- `public_spot_get_spot_v1_public_time(request)`
- `public_spot_get_spot_v1_public_currencies(request)`
- `public_spot_get_spot_v1_public_products(request)`
- `public_spot_get_spot_v1_public_product(request)`
- `public_spot_get_spot_v1_market_ticker(request)`
- `public_spot_get_spot_v1_market_tickers(request)`
- `public_spot_get_spot_v1_market_fills(request)`
- `public_spot_get_spot_v1_market_fills_history(request)`
- `public_spot_get_spot_v1_market_candles(request)`
- `public_spot_get_spot_v1_market_depth(request)`
- `public_spot_get_spot_v1_market_spot_vip_level(request)`
- `public_spot_get_spot_v1_market_merge_depth(request)`
- `public_spot_get_spot_v1_market_history_candles(request)`
- `public_spot_get_spot_v1_public_loan_coininfos(request)`
- `public_spot_get_spot_v1_public_loan_hour_interest(request)`
- `public_spot_get_v2_spot_public_coins(request)`
- `public_spot_get_v2_spot_public_symbols(request)`
- `public_spot_get_v2_spot_market_vip_fee_rate(request)`
- `public_spot_get_v2_spot_market_tickers(request)`
- `public_spot_get_v2_spot_market_merge_depth(request)`
- `public_spot_get_v2_spot_market_orderbook(request)`
- `public_spot_get_v2_spot_market_candles(request)`
- `public_spot_get_v2_spot_market_history_candles(request)`
- `public_spot_get_v2_spot_market_fills(request)`
- `public_spot_get_v2_spot_market_fills_history(request)`
- `public_mix_get_mix_v1_market_contracts(request)`
- `public_mix_get_mix_v1_market_depth(request)`
- `public_mix_get_mix_v1_market_ticker(request)`
- `public_mix_get_mix_v1_market_tickers(request)`
- `public_mix_get_mix_v1_market_contract_vip_level(request)`
- `public_mix_get_mix_v1_market_fills(request)`
- `public_mix_get_mix_v1_market_fills_history(request)`
- `public_mix_get_mix_v1_market_candles(request)`
- `public_mix_get_mix_v1_market_index(request)`
- `public_mix_get_mix_v1_market_funding_time(request)`
- `public_mix_get_mix_v1_market_history_fundrate(request)`
- `public_mix_get_mix_v1_market_current_fundrate(request)`
- `public_mix_get_mix_v1_market_open_interest(request)`
- `public_mix_get_mix_v1_market_mark_price(request)`
- `public_mix_get_mix_v1_market_symbol_leverage(request)`
- `public_mix_get_mix_v1_market_querypositionlever(request)`
- `public_mix_get_mix_v1_market_open_limit(request)`
- `public_mix_get_mix_v1_market_history_candles(request)`
- `public_mix_get_mix_v1_market_history_index_candles(request)`
- `public_mix_get_mix_v1_market_history_mark_candles(request)`
- `public_mix_get_mix_v1_market_merge_depth(request)`
- `public_mix_get_v2_mix_market_vip_fee_rate(request)`
- `public_mix_get_v2_mix_market_union_interest_rate_history(request)`
- `public_mix_get_v2_mix_market_exchange_rate(request)`
- `public_mix_get_v2_mix_market_discount_rate(request)`
- `public_mix_get_v2_mix_market_merge_depth(request)`
- `public_mix_get_v2_mix_market_ticker(request)`
- `public_mix_get_v2_mix_market_tickers(request)`
- `public_mix_get_v2_mix_market_fills(request)`
- `public_mix_get_v2_mix_market_fills_history(request)`
- `public_mix_get_v2_mix_market_candles(request)`
- `public_mix_get_v2_mix_market_history_candles(request)`
- `public_mix_get_v2_mix_market_history_index_candles(request)`
- `public_mix_get_v2_mix_market_history_mark_candles(request)`
- `public_mix_get_v2_mix_market_open_interest(request)`
- `public_mix_get_v2_mix_market_funding_time(request)`
- `public_mix_get_v2_mix_market_symbol_price(request)`
- `public_mix_get_v2_mix_market_history_fund_rate(request)`
- `public_mix_get_v2_mix_market_current_fund_rate(request)`
- `public_mix_get_v2_mix_market_oi_limit(request)`
- `public_mix_get_v2_mix_market_contracts(request)`
- `public_mix_get_v2_mix_market_query_position_lever(request)`
- `public_mix_get_v2_mix_market_account_long_short(request)`
- `public_margin_get_margin_v1_cross_public_interestrateandlimit(request)`
- `public_margin_get_margin_v1_isolated_public_interestrateandlimit(request)`
- `public_margin_get_margin_v1_cross_public_tierdata(request)`
- `public_margin_get_margin_v1_isolated_public_tierdata(request)`
- `public_margin_get_margin_v1_public_currencies(request)`
- `public_margin_get_v2_margin_currencies(request)`
- `public_margin_get_v2_margin_market_long_short_ratio(request)`
- `public_earn_get_v2_earn_loan_public_coininfos(request)`
- `public_earn_get_v2_earn_loan_public_hour_interest(request)`
- `public_uta_get_v3_market_instruments(request)`
- `public_uta_get_v3_market_tickers(request)`
- `public_uta_get_v3_market_orderbook(request)`
- `public_uta_get_v3_market_fills(request)`
- `public_uta_get_v3_market_proof_of_reserves(request)`
- `public_uta_get_v3_market_open_interest(request)`
- `public_uta_get_v3_market_candles(request)`
- `public_uta_get_v3_market_history_candles(request)`
- `public_uta_get_v3_market_current_fund_rate(request)`
- `public_uta_get_v3_market_history_fund_rate(request)`
- `public_uta_get_v3_market_risk_reserve(request)`
- `public_uta_get_v3_market_discount_rate(request)`
- `public_uta_get_v3_market_margin_loans(request)`
- `public_uta_get_v3_market_position_tier(request)`
- `public_uta_get_v3_market_oi_limit(request)`
- `public_uta_get_v3_market_index_components(request)`
- `private_spot_get_spot_v1_wallet_deposit_address(request)`
- `private_spot_get_spot_v1_wallet_withdrawal_list(request)`
- `private_spot_get_spot_v1_wallet_deposit_list(request)`
- `private_spot_get_spot_v1_account_getinfo(request)`
- `private_spot_get_spot_v1_account_assets(request)`
- `private_spot_get_spot_v1_account_assets_lite(request)`
- `private_spot_get_spot_v1_account_transferrecords(request)`
- `private_spot_get_spot_v1_convert_currencies(request)`
- `private_spot_get_spot_v1_convert_convert_record(request)`
- `private_spot_get_spot_v1_loan_ongoing_orders(request)`
- `private_spot_get_spot_v1_loan_repay_history(request)`
- `private_spot_get_spot_v1_loan_revise_history(request)`
- `private_spot_get_spot_v1_loan_borrow_history(request)`
- `private_spot_get_spot_v1_loan_debts(request)`
- `private_spot_get_v2_spot_trade_orderinfo(request)`
- `private_spot_get_v2_spot_trade_unfilled_orders(request)`
- `private_spot_get_v2_spot_trade_history_orders(request)`
- `private_spot_get_v2_spot_trade_fills(request)`
- `private_spot_get_v2_spot_trade_current_plan_order(request)`
- `private_spot_get_v2_spot_trade_history_plan_order(request)`
- `private_spot_get_v2_spot_account_info(request)`
- `private_spot_get_v2_spot_account_assets(request)`
- `private_spot_get_v2_spot_account_subaccount_assets(request)`
- `private_spot_get_v2_spot_account_bills(request)`
- `private_spot_get_v2_spot_account_transferrecords(request)`
- `private_spot_get_v2_account_funding_assets(request)`
- `private_spot_get_v2_account_bot_assets(request)`
- `private_spot_get_v2_account_all_account_balance(request)`
- `private_spot_get_v2_spot_wallet_deposit_address(request)`
- `private_spot_get_v2_spot_wallet_deposit_records(request)`
- `private_spot_get_v2_spot_wallet_withdrawal_records(request)`
- `private_spot_get_v2_spot_account_upgrade_status(request)`
- `private_spot_post_spot_v1_wallet_transfer(request)`
- `private_spot_post_spot_v1_wallet_transfer_v2(request)`
- `private_spot_post_spot_v1_wallet_subtransfer(request)`
- `private_spot_post_spot_v1_wallet_withdrawal(request)`
- `private_spot_post_spot_v1_wallet_withdrawal_v2(request)`
- `private_spot_post_spot_v1_wallet_withdrawal_inner(request)`
- `private_spot_post_spot_v1_wallet_withdrawal_inner_v2(request)`
- `private_spot_post_spot_v1_account_sub_account_spot_assets(request)`
- `private_spot_post_spot_v1_account_bills(request)`
- `private_spot_post_spot_v1_trade_orders(request)`
- `private_spot_post_spot_v1_trade_batch_orders(request)`
- `private_spot_post_spot_v1_trade_cancel_order(request)`
- `private_spot_post_spot_v1_trade_cancel_order_v2(request)`
- `private_spot_post_spot_v1_trade_cancel_symbol_order(request)`
- `private_spot_post_spot_v1_trade_cancel_batch_orders(request)`
- `private_spot_post_spot_v1_trade_cancel_batch_orders_v2(request)`
- `private_spot_post_spot_v1_trade_orderinfo(request)`
- `private_spot_post_spot_v1_trade_open_orders(request)`
- `private_spot_post_spot_v1_trade_history(request)`
- `private_spot_post_spot_v1_trade_fills(request)`
- `private_spot_post_spot_v1_plan_placeplan(request)`
- `private_spot_post_spot_v1_plan_modifyplan(request)`
- `private_spot_post_spot_v1_plan_cancelplan(request)`
- `private_spot_post_spot_v1_plan_currentplan(request)`
- `private_spot_post_spot_v1_plan_historyplan(request)`
- `private_spot_post_spot_v1_plan_batchcancelplan(request)`
- `private_spot_post_spot_v1_convert_quoted_price(request)`
- `private_spot_post_spot_v1_convert_trade(request)`
- `private_spot_post_spot_v1_loan_borrow(request)`
- `private_spot_post_spot_v1_loan_repay(request)`
- `private_spot_post_spot_v1_loan_revise_pledge(request)`
- `private_spot_post_spot_v1_trace_order_ordercurrentlist(request)`
- `private_spot_post_spot_v1_trace_order_orderhistorylist(request)`
- `private_spot_post_spot_v1_trace_order_closetrackingorder(request)`
- `private_spot_post_spot_v1_trace_order_updatetpsl(request)`
- `private_spot_post_spot_v1_trace_order_followerendorder(request)`
- `private_spot_post_spot_v1_trace_order_spotinfolist(request)`
- `private_spot_post_spot_v1_trace_config_gettradersettings(request)`
- `private_spot_post_spot_v1_trace_config_getfollowersettings(request)`
- `private_spot_post_spot_v1_trace_user_mytraders(request)`
- `private_spot_post_spot_v1_trace_config_setfollowerconfig(request)`
- `private_spot_post_spot_v1_trace_user_myfollowers(request)`
- `private_spot_post_spot_v1_trace_config_setproductcode(request)`
- `private_spot_post_spot_v1_trace_user_removetrader(request)`
- `private_spot_post_spot_v1_trace_getremovablefollower(request)`
- `private_spot_post_spot_v1_trace_user_removefollower(request)`
- `private_spot_post_spot_v1_trace_profit_totalprofitinfo(request)`
- `private_spot_post_spot_v1_trace_profit_totalprofitlist(request)`
- `private_spot_post_spot_v1_trace_profit_profithislist(request)`
- `private_spot_post_spot_v1_trace_profit_profithisdetaillist(request)`
- `private_spot_post_spot_v1_trace_profit_waitprofitdetaillist(request)`
- `private_spot_post_spot_v1_trace_user_gettraderinfo(request)`
- `private_spot_post_v2_spot_trade_place_order(request)`
- `private_spot_post_v2_spot_trade_cancel_order(request)`
- `private_spot_post_v2_spot_trade_batch_orders(request)`
- `private_spot_post_v2_spot_trade_batch_cancel_order(request)`
- `private_spot_post_v2_spot_trade_cancel_symbol_order(request)`
- `private_spot_post_v2_spot_trade_place_plan_order(request)`
- `private_spot_post_v2_spot_trade_modify_plan_order(request)`
- `private_spot_post_v2_spot_trade_cancel_plan_order(request)`
- `private_spot_post_v2_spot_trade_cancel_replace_order(request)`
- `private_spot_post_v2_spot_trade_batch_cancel_plan_order(request)`
- `private_spot_post_v2_spot_wallet_transfer(request)`
- `private_spot_post_v2_spot_wallet_subaccount_transfer(request)`
- `private_spot_post_v2_spot_wallet_withdrawal(request)`
- `private_spot_post_v2_spot_wallet_cancel_withdrawal(request)`
- `private_spot_post_v2_spot_wallet_modify_deposit_account(request)`
- `private_spot_post_v2_spot_account_upgrade(request)`
- `private_mix_get_mix_v1_account_account(request)`
- `private_mix_get_mix_v1_account_accounts(request)`
- `private_mix_get_mix_v1_position_singleposition(request)`
- `private_mix_get_mix_v1_position_singleposition_v2(request)`
- `private_mix_get_mix_v1_position_allposition(request)`
- `private_mix_get_mix_v1_position_allposition_v2(request)`
- `private_mix_get_mix_v1_position_history_position(request)`
- `private_mix_get_mix_v1_account_accountbill(request)`
- `private_mix_get_mix_v1_account_accountbusinessbill(request)`
- `private_mix_get_mix_v1_order_current(request)`
- `private_mix_get_mix_v1_order_margincoincurrent(request)`
- `private_mix_get_mix_v1_order_history(request)`
- `private_mix_get_mix_v1_order_historyproducttype(request)`
- `private_mix_get_mix_v1_order_detail(request)`
- `private_mix_get_mix_v1_order_fills(request)`
- `private_mix_get_mix_v1_order_allfills(request)`
- `private_mix_get_mix_v1_plan_currentplan(request)`
- `private_mix_get_mix_v1_plan_historyplan(request)`
- `private_mix_get_mix_v1_trace_currenttrack(request)`
- `private_mix_get_mix_v1_trace_followerorder(request)`
- `private_mix_get_mix_v1_trace_followerhistoryorders(request)`
- `private_mix_get_mix_v1_trace_historytrack(request)`
- `private_mix_get_mix_v1_trace_summary(request)`
- `private_mix_get_mix_v1_trace_profitsettletokenidgroup(request)`
- `private_mix_get_mix_v1_trace_profitdategrouplist(request)`
- `private_mix_get_mix_v1_trade_profitdatelist(request)`
- `private_mix_get_mix_v1_trace_waitprofitdatelist(request)`
- `private_mix_get_mix_v1_trace_tradersymbols(request)`
- `private_mix_get_mix_v1_trace_traderlist(request)`
- `private_mix_get_mix_v1_trace_traderdetail(request)`
- `private_mix_get_mix_v1_trace_querytraceconfig(request)`
- `private_mix_get_v2_mix_account_account(request)`
- `private_mix_get_v2_mix_account_accounts(request)`
- `private_mix_get_v2_mix_account_sub_account_assets(request)`
- `private_mix_get_v2_mix_account_interest_history(request)`
- `private_mix_get_v2_mix_account_max_open(request)`
- `private_mix_get_v2_mix_account_liq_price(request)`
- `private_mix_get_v2_mix_account_open_count(request)`
- `private_mix_get_v2_mix_account_bill(request)`
- `private_mix_get_v2_mix_account_transfer_limits(request)`
- `private_mix_get_v2_mix_account_union_config(request)`
- `private_mix_get_v2_mix_account_switch_union_usdt(request)`
- `private_mix_get_v2_mix_account_isolated_symbols(request)`
- `private_mix_get_v2_mix_market_query_position_lever(request)`
- `private_mix_get_v2_mix_position_single_position(request)`
- `private_mix_get_v2_mix_position_all_position(request)`
- `private_mix_get_v2_mix_position_adlrank(request)`
- `private_mix_get_v2_mix_position_history_position(request)`
- `private_mix_get_v2_mix_order_detail(request)`
- `private_mix_get_v2_mix_order_fills(request)`
- `private_mix_get_v2_mix_order_fill_history(request)`
- `private_mix_get_v2_mix_order_orders_pending(request)`
- `private_mix_get_v2_mix_order_orders_history(request)`
- `private_mix_get_v2_mix_order_plan_sub_order(request)`
- `private_mix_get_v2_mix_order_orders_plan_pending(request)`
- `private_mix_get_v2_mix_order_orders_plan_history(request)`
- `private_mix_get_v2_mix_market_position_long_short(request)`
- `private_mix_post_mix_v1_account_sub_account_contract_assets(request)`
- `private_mix_post_mix_v1_account_open_count(request)`
- `private_mix_post_mix_v1_account_setleverage(request)`
- `private_mix_post_mix_v1_account_setmargin(request)`
- `private_mix_post_mix_v1_account_setmarginmode(request)`
- `private_mix_post_mix_v1_account_setpositionmode(request)`
- `private_mix_post_mix_v1_order_placeorder(request)`
- `private_mix_post_mix_v1_order_batch_orders(request)`
- `private_mix_post_mix_v1_order_cancel_order(request)`
- `private_mix_post_mix_v1_order_cancel_batch_orders(request)`
- `private_mix_post_mix_v1_order_modifyorder(request)`
- `private_mix_post_mix_v1_order_cancel_symbol_orders(request)`
- `private_mix_post_mix_v1_order_cancel_all_orders(request)`
- `private_mix_post_mix_v1_order_close_all_positions(request)`
- `private_mix_post_mix_v1_plan_placeplan(request)`
- `private_mix_post_mix_v1_plan_modifyplan(request)`
- `private_mix_post_mix_v1_plan_modifyplanpreset(request)`
- `private_mix_post_mix_v1_plan_placetpsl(request)`
- `private_mix_post_mix_v1_plan_placetrailstop(request)`
- `private_mix_post_mix_v1_plan_placepositionstpsl(request)`
- `private_mix_post_mix_v1_plan_modifytpslplan(request)`
- `private_mix_post_mix_v1_plan_cancelplan(request)`
- `private_mix_post_mix_v1_plan_cancelsymbolplan(request)`
- `private_mix_post_mix_v1_plan_cancelallplan(request)`
- `private_mix_post_mix_v1_trace_closetrackorder(request)`
- `private_mix_post_mix_v1_trace_modifytpsl(request)`
- `private_mix_post_mix_v1_trace_closetrackorderbysymbol(request)`
- `private_mix_post_mix_v1_trace_setupcopysymbols(request)`
- `private_mix_post_mix_v1_trace_followersetbatchtraceconfig(request)`
- `private_mix_post_mix_v1_trace_followerclosebytrackingno(request)`
- `private_mix_post_mix_v1_trace_followerclosebyall(request)`
- `private_mix_post_mix_v1_trace_followersettpsl(request)`
- `private_mix_post_mix_v1_trace_cancelcopytrader(request)`
- `private_mix_post_mix_v1_trace_traderupdateconfig(request)`
- `private_mix_post_mix_v1_trace_mytraderlist(request)`
- `private_mix_post_mix_v1_trace_myfollowerlist(request)`
- `private_mix_post_mix_v1_trace_removefollower(request)`
- `private_mix_post_mix_v1_trace_public_getfollowerconfig(request)`
- `private_mix_post_mix_v1_trace_report_order_historylist(request)`
- `private_mix_post_mix_v1_trace_report_order_currentlist(request)`
- `private_mix_post_mix_v1_trace_querytradertpslratioconfig(request)`
- `private_mix_post_mix_v1_trace_traderupdatetpslratioconfig(request)`
- `private_mix_post_v2_mix_account_set_auto_margin(request)`
- `private_mix_post_v2_mix_account_set_leverage(request)`
- `private_mix_post_v2_mix_account_set_all_leverage(request)`
- `private_mix_post_v2_mix_account_set_margin(request)`
- `private_mix_post_v2_mix_account_set_asset_mode(request)`
- `private_mix_post_v2_mix_account_set_margin_mode(request)`
- `private_mix_post_v2_mix_account_union_convert(request)`
- `private_mix_post_v2_mix_account_set_position_mode(request)`
- `private_mix_post_v2_mix_order_place_order(request)`
- `private_mix_post_v2_mix_order_click_backhand(request)`
- `private_mix_post_v2_mix_order_batch_place_order(request)`
- `private_mix_post_v2_mix_order_modify_order(request)`
- `private_mix_post_v2_mix_order_cancel_order(request)`
- `private_mix_post_v2_mix_order_batch_cancel_orders(request)`
- `private_mix_post_v2_mix_order_close_positions(request)`
- `private_mix_post_v2_mix_order_cancel_all_orders(request)`
- `private_mix_post_v2_mix_order_place_tpsl_order(request)`
- `private_mix_post_v2_mix_order_place_pos_tpsl(request)`
- `private_mix_post_v2_mix_order_place_plan_order(request)`
- `private_mix_post_v2_mix_order_modify_tpsl_order(request)`
- `private_mix_post_v2_mix_order_modify_plan_order(request)`
- `private_mix_post_v2_mix_order_cancel_plan_order(request)`
- `private_user_get_user_v1_fee_query(request)`
- `private_user_get_user_v1_sub_virtual_list(request)`
- `private_user_get_user_v1_sub_virtual_api_list(request)`
- `private_user_get_user_v1_tax_spot_record(request)`
- `private_user_get_user_v1_tax_future_record(request)`
- `private_user_get_user_v1_tax_margin_record(request)`
- `private_user_get_user_v1_tax_p2p_record(request)`
- `private_user_get_v2_user_virtual_subaccount_list(request)`
- `private_user_get_v2_user_virtual_subaccount_apikey_list(request)`
- `private_user_post_user_v1_sub_virtual_create(request)`
- `private_user_post_user_v1_sub_virtual_modify(request)`
- `private_user_post_user_v1_sub_virtual_api_batch_create(request)`
- `private_user_post_user_v1_sub_virtual_api_create(request)`
- `private_user_post_user_v1_sub_virtual_api_modify(request)`
- `private_user_post_v2_user_create_virtual_subaccount(request)`
- `private_user_post_v2_user_modify_virtual_subaccount(request)`
- `private_user_post_v2_user_batch_create_subaccount_and_apikey(request)`
- `private_user_post_v2_user_create_virtual_subaccount_apikey(request)`
- `private_user_post_v2_user_modify_virtual_subaccount_apikey(request)`
- `private_p2p_get_p2p_v1_merchant_merchantlist(request)`
- `private_p2p_get_p2p_v1_merchant_merchantinfo(request)`
- `private_p2p_get_p2p_v1_merchant_advlist(request)`
- `private_p2p_get_p2p_v1_merchant_orderlist(request)`
- `private_p2p_get_v2_p2p_merchantlist(request)`
- `private_p2p_get_v2_p2p_merchantinfo(request)`
- `private_p2p_get_v2_p2p_orderlist(request)`
- `private_p2p_get_v2_p2p_advlist(request)`
- `private_broker_get_broker_v1_account_info(request)`
- `private_broker_get_broker_v1_account_sub_list(request)`
- `private_broker_get_broker_v1_account_sub_email(request)`
- `private_broker_get_broker_v1_account_sub_spot_assets(request)`
- `private_broker_get_broker_v1_account_sub_future_assets(request)`
- `private_broker_get_broker_v1_account_subaccount_transfer(request)`
- `private_broker_get_broker_v1_account_subaccount_deposit(request)`
- `private_broker_get_broker_v1_account_subaccount_withdrawal(request)`
- `private_broker_get_broker_v1_account_sub_api_list(request)`
- `private_broker_get_v2_broker_account_info(request)`
- `private_broker_get_v2_broker_account_subaccount_list(request)`
- `private_broker_get_v2_broker_account_subaccount_email(request)`
- `private_broker_get_v2_broker_account_subaccount_spot_assets(request)`
- `private_broker_get_v2_broker_account_subaccount_future_assets(request)`
- `private_broker_get_v2_broker_manage_subaccount_apikey_list(request)`
- `private_broker_post_broker_v1_account_sub_create(request)`
- `private_broker_post_broker_v1_account_sub_modify(request)`
- `private_broker_post_broker_v1_account_sub_modify_email(request)`
- `private_broker_post_broker_v1_account_sub_address(request)`
- `private_broker_post_broker_v1_account_sub_withdrawal(request)`
- `private_broker_post_broker_v1_account_sub_auto_transfer(request)`
- `private_broker_post_broker_v1_account_sub_api_create(request)`
- `private_broker_post_broker_v1_account_sub_api_modify(request)`
- `private_broker_post_v2_broker_account_modify_subaccount_email(request)`
- `private_broker_post_v2_broker_account_create_subaccount(request)`
- `private_broker_post_v2_broker_account_modify_subaccount(request)`
- `private_broker_post_v2_broker_account_subaccount_address(request)`
- `private_broker_post_v2_broker_account_subaccount_withdrawal(request)`
- `private_broker_post_v2_broker_account_set_subaccount_autotransfer(request)`
- `private_broker_post_v2_broker_manage_create_subaccount_apikey(request)`
- `private_broker_post_v2_broker_manage_modify_subaccount_apikey(request)`
- `private_margin_get_margin_v1_cross_account_riskrate(request)`
- `private_margin_get_margin_v1_cross_account_maxtransferoutamount(request)`
- `private_margin_get_margin_v1_isolated_account_maxtransferoutamount(request)`
- `private_margin_get_margin_v1_isolated_order_openorders(request)`
- `private_margin_get_margin_v1_isolated_order_history(request)`
- `private_margin_get_margin_v1_isolated_order_fills(request)`
- `private_margin_get_margin_v1_isolated_loan_list(request)`
- `private_margin_get_margin_v1_isolated_repay_list(request)`
- `private_margin_get_margin_v1_isolated_interest_list(request)`
- `private_margin_get_margin_v1_isolated_liquidation_list(request)`
- `private_margin_get_margin_v1_isolated_fin_list(request)`
- `private_margin_get_margin_v1_cross_order_openorders(request)`
- `private_margin_get_margin_v1_cross_order_history(request)`
- `private_margin_get_margin_v1_cross_order_fills(request)`
- `private_margin_get_margin_v1_cross_loan_list(request)`
- `private_margin_get_margin_v1_cross_repay_list(request)`
- `private_margin_get_margin_v1_cross_interest_list(request)`
- `private_margin_get_margin_v1_cross_liquidation_list(request)`
- `private_margin_get_margin_v1_cross_fin_list(request)`
- `private_margin_get_margin_v1_cross_account_assets(request)`
- `private_margin_get_margin_v1_isolated_account_assets(request)`
- `private_margin_get_v2_margin_crossed_borrow_history(request)`
- `private_margin_get_v2_margin_crossed_repay_history(request)`
- `private_margin_get_v2_margin_crossed_interest_history(request)`
- `private_margin_get_v2_margin_crossed_liquidation_history(request)`
- `private_margin_get_v2_margin_crossed_financial_records(request)`
- `private_margin_get_v2_margin_crossed_account_assets(request)`
- `private_margin_get_v2_margin_crossed_account_risk_rate(request)`
- `private_margin_get_v2_margin_crossed_account_max_borrowable_amount(request)`
- `private_margin_get_v2_margin_crossed_account_max_transfer_out_amount(request)`
- `private_margin_get_v2_margin_crossed_interest_rate_and_limit(request)`
- `private_margin_get_v2_margin_crossed_tier_data(request)`
- `private_margin_get_v2_margin_crossed_open_orders(request)`
- `private_margin_get_v2_margin_crossed_history_orders(request)`
- `private_margin_get_v2_margin_crossed_fills(request)`
- `private_margin_get_v2_margin_isolated_borrow_history(request)`
- `private_margin_get_v2_margin_isolated_repay_history(request)`
- `private_margin_get_v2_margin_isolated_interest_history(request)`
- `private_margin_get_v2_margin_isolated_liquidation_history(request)`
- `private_margin_get_v2_margin_isolated_financial_records(request)`
- `private_margin_get_v2_margin_isolated_account_assets(request)`
- `private_margin_get_v2_margin_isolated_account_risk_rate(request)`
- `private_margin_get_v2_margin_isolated_account_max_borrowable_amount(request)`
- `private_margin_get_v2_margin_isolated_account_max_transfer_out_amount(request)`
- `private_margin_get_v2_margin_isolated_interest_rate_and_limit(request)`
- `private_margin_get_v2_margin_isolated_tier_data(request)`
- `private_margin_get_v2_margin_isolated_open_orders(request)`
- `private_margin_get_v2_margin_isolated_history_orders(request)`
- `private_margin_get_v2_margin_isolated_fills(request)`
- `private_margin_post_margin_v1_cross_account_borrow(request)`
- `private_margin_post_margin_v1_isolated_account_borrow(request)`
- `private_margin_post_margin_v1_cross_account_repay(request)`
- `private_margin_post_margin_v1_isolated_account_repay(request)`
- `private_margin_post_margin_v1_isolated_account_riskrate(request)`
- `private_margin_post_margin_v1_cross_account_maxborrowableamount(request)`
- `private_margin_post_margin_v1_isolated_account_maxborrowableamount(request)`
- `private_margin_post_margin_v1_isolated_account_flashrepay(request)`
- `private_margin_post_margin_v1_isolated_account_queryflashrepaystatus(request)`
- `private_margin_post_margin_v1_cross_account_flashrepay(request)`
- `private_margin_post_margin_v1_cross_account_queryflashrepaystatus(request)`
- `private_margin_post_margin_v1_isolated_order_placeorder(request)`
- `private_margin_post_margin_v1_isolated_order_batchplaceorder(request)`
- `private_margin_post_margin_v1_isolated_order_cancelorder(request)`
- `private_margin_post_margin_v1_isolated_order_batchcancelorder(request)`
- `private_margin_post_margin_v1_cross_order_placeorder(request)`
- `private_margin_post_margin_v1_cross_order_batchplaceorder(request)`
- `private_margin_post_margin_v1_cross_order_cancelorder(request)`
- `private_margin_post_margin_v1_cross_order_batchcancelorder(request)`
- `private_margin_post_v2_margin_crossed_account_borrow(request)`
- `private_margin_post_v2_margin_crossed_account_repay(request)`
- `private_margin_post_v2_margin_crossed_account_flash_repay(request)`
- `private_margin_post_v2_margin_crossed_account_query_flash_repay_status(request)`
- `private_margin_post_v2_margin_crossed_place_order(request)`
- `private_margin_post_v2_margin_crossed_batch_place_order(request)`
- `private_margin_post_v2_margin_crossed_cancel_order(request)`
- `private_margin_post_v2_margin_crossed_batch_cancel_order(request)`
- `private_margin_post_v2_margin_isolated_account_borrow(request)`
- `private_margin_post_v2_margin_isolated_account_repay(request)`
- `private_margin_post_v2_margin_isolated_account_flash_repay(request)`
- `private_margin_post_v2_margin_isolated_account_query_flash_repay_status(request)`
- `private_margin_post_v2_margin_isolated_place_order(request)`
- `private_margin_post_v2_margin_isolated_batch_place_order(request)`
- `private_margin_post_v2_margin_isolated_cancel_order(request)`
- `private_margin_post_v2_margin_isolated_batch_cancel_order(request)`
- `private_copy_get_v2_copy_mix_trader_order_current_track(request)`
- `private_copy_get_v2_copy_mix_trader_order_history_track(request)`
- `private_copy_get_v2_copy_mix_trader_order_total_detail(request)`
- `private_copy_get_v2_copy_mix_trader_profit_history_summarys(request)`
- `private_copy_get_v2_copy_mix_trader_profit_history_details(request)`
- `private_copy_get_v2_copy_mix_trader_profit_details(request)`
- `private_copy_get_v2_copy_mix_trader_profits_group_coin_date(request)`
- `private_copy_get_v2_copy_mix_trader_config_query_symbols(request)`
- `private_copy_get_v2_copy_mix_trader_config_query_followers(request)`
- `private_copy_get_v2_copy_mix_follower_query_current_orders(request)`
- `private_copy_get_v2_copy_mix_follower_query_history_orders(request)`
- `private_copy_get_v2_copy_mix_follower_query_settings(request)`
- `private_copy_get_v2_copy_mix_follower_query_traders(request)`
- `private_copy_get_v2_copy_mix_follower_query_quantity_limit(request)`
- `private_copy_get_v2_copy_mix_broker_query_traders(request)`
- `private_copy_get_v2_copy_mix_broker_query_history_traces(request)`
- `private_copy_get_v2_copy_mix_broker_query_current_traces(request)`
- `private_copy_get_v2_copy_spot_trader_profit_summarys(request)`
- `private_copy_get_v2_copy_spot_trader_profit_history_details(request)`
- `private_copy_get_v2_copy_spot_trader_profit_details(request)`
- `private_copy_get_v2_copy_spot_trader_order_total_detail(request)`
- `private_copy_get_v2_copy_spot_trader_order_history_track(request)`
- `private_copy_get_v2_copy_spot_trader_order_current_track(request)`
- `private_copy_get_v2_copy_spot_trader_config_query_settings(request)`
- `private_copy_get_v2_copy_spot_trader_config_query_followers(request)`
- `private_copy_get_v2_copy_spot_follower_query_traders(request)`
- `private_copy_get_v2_copy_spot_follower_query_trader_symbols(request)`
- `private_copy_get_v2_copy_spot_follower_query_settings(request)`
- `private_copy_get_v2_copy_spot_follower_query_history_orders(request)`
- `private_copy_get_v2_copy_spot_follower_query_current_orders(request)`
- `private_copy_post_v2_copy_mix_trader_order_modify_tpsl(request)`
- `private_copy_post_v2_copy_mix_trader_order_close_positions(request)`
- `private_copy_post_v2_copy_mix_trader_config_setting_symbols(request)`
- `private_copy_post_v2_copy_mix_trader_config_setting_base(request)`
- `private_copy_post_v2_copy_mix_trader_config_remove_follower(request)`
- `private_copy_post_v2_copy_mix_follower_setting_tpsl(request)`
- `private_copy_post_v2_copy_mix_follower_settings(request)`
- `private_copy_post_v2_copy_mix_follower_close_positions(request)`
- `private_copy_post_v2_copy_mix_follower_cancel_trader(request)`
- `private_copy_post_v2_copy_spot_trader_order_modify_tpsl(request)`
- `private_copy_post_v2_copy_spot_trader_order_close_tracking(request)`
- `private_copy_post_v2_copy_spot_trader_config_setting_symbols(request)`
- `private_copy_post_v2_copy_spot_trader_config_remove_follower(request)`
- `private_copy_post_v2_copy_spot_follower_stop_order(request)`
- `private_copy_post_v2_copy_spot_follower_settings(request)`
- `private_copy_post_v2_copy_spot_follower_setting_tpsl(request)`
- `private_copy_post_v2_copy_spot_follower_order_close_tracking(request)`
- `private_copy_post_v2_copy_spot_follower_cancel_trader(request)`
- `private_tax_get_v2_tax_spot_record(request)`
- `private_tax_get_v2_tax_future_record(request)`
- `private_tax_get_v2_tax_margin_record(request)`
- `private_tax_get_v2_tax_p2p_record(request)`
- `private_convert_get_v2_convert_currencies(request)`
- `private_convert_get_v2_convert_quoted_price(request)`
- `private_convert_get_v2_convert_convert_record(request)`
- `private_convert_get_v2_convert_bgb_convert_coin_list(request)`
- `private_convert_get_v2_convert_bgb_convert_records(request)`
- `private_convert_post_v2_convert_trade(request)`
- `private_convert_post_v2_convert_bgb_convert(request)`
- `private_earn_get_v2_earn_savings_product(request)`
- `private_earn_get_v2_earn_savings_account(request)`
- `private_earn_get_v2_earn_savings_assets(request)`
- `private_earn_get_v2_earn_savings_records(request)`
- `private_earn_get_v2_earn_savings_subscribe_info(request)`
- | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:41.094454 | bitget-0.0.124.tar.gz | 849,423 | c3/57/a42afc6789031d1a29d6922002b0161d9649714053d8e74e8fe7044f0c99/bitget-0.0.124.tar.gz | source | sdist | null | false | b368501b5568ec33fc7d04ac55b27e41 | ac4d9fdbab68fab996ee886d84c1e2c9a58385ddcc0b45f7d55142febc76da8f | c357a42afc6789031d1a29d6922002b0161d9649714053d8e74e8fe7044f0c99 | null | [] | 290 |
2.4 | htx | 0.0.128 | htx crypto exchange api client | # htx-python
Python SDK (sync and async) for the Htx cryptocurrency exchange, with REST and WebSocket support.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/htx)
- You can check Htx's docs here: [Docs](https://www.google.com/search?q=google+htx+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/htx-python
- Pypi package: https://pypi.org/project/htx
## Installation
```
pip install htx
```
## Usage
### Sync
```Python
from htx import HtxSync

def main():
    instance = HtxSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)

    # private endpoints require apiKey/secret in the constructor:
    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
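`fetch_order_book` returns the ccxt-unified order book structure: `bids` sorted best (highest) first, `asks` sorted best (lowest) first, each level a `[price, amount]` pair. A small helper for the top of book and spread, run here against a hard-coded sample snapshot rather than live data:

```Python
def top_of_book(ob):
    # ccxt order books: bids descending, asks ascending, [price, amount] levels
    best_bid = ob['bids'][0][0]
    best_ask = ob['asks'][0][0]
    return best_bid, best_ask, best_ask - best_bid

# illustrative snapshot (same shape fetch_order_book returns), not live data
sample = {
    'bids': [[100000.0, 0.5], [99999.5, 1.2]],
    'asks': [[100000.5, 0.3], [100001.0, 0.8]],
}
bid, ask, spread = top_of_book(sample)
print(bid, ask, spread)
```

The same helper works unchanged on a live snapshot from `fetch_order_book` or `watch_order_book`.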
### Async
```Python
import sys
import asyncio

from htx import HtxAsync

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = HtxAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)

    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio

from htx import HtxWs

### on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = HtxWs({})
    while True:
        ob = await instance.watch_order_book("BTC/USDC")
        print(ob)
        # orders = await instance.watch_orders("BTC/USDC")

    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
#### Raw call
You can also construct custom requests to available "implicit" endpoints
```Python
# any endpoint from the "REST Raw" list below can be called directly;
# the request dict is passed through to the endpoint unchanged
request = {
    'symbol': 'btcusdt',  # raw exchange-specific market id, not a unified symbol
}
response = await instance.private_get_order_openorders(request)
```
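The implicit method names follow a mechanical convention: API section, then HTTP verb, then the endpoint path with separators flattened to underscores. A simplified sketch of that mapping (the real name generation happens inside ccxt's base `Exchange` class; this standalone version is illustrative only):

```Python
import re

def implicit_method_name(section, http_method, path):
    # e.g. ('private', 'GET', 'order/openorders') -> 'private_get_order_openorders'
    parts = [p for p in re.split(r'[/\-.{}]+', path) if p]
    return '_'.join([section, http_method.lower(), *[p.lower() for p in parts]])

print(implicit_method_name('private', 'GET', 'order/openorders'))
# private_get_order_openorders
```

Reading a name in the "REST Raw" list backwards through this convention tells you roughly which endpoint it hits and with which HTTP verb.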
## Available methods
### REST Unified
- `create_contract_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `create_spot_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_trailing_percent_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, trailingPercent: Num = None, trailingTriggerPrice: Num = None, params={})`
- `fetch_account_id_by_type(self, type: str, marginMode: Str = None, symbol: Str = None, params={})`
- `fetch_accounts(self, params={})`
- `fetch_balance(self, params={})`
- `fetch_borrow_interest(self, code: Str = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_contract_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_spot_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_contract_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_addresses_by_network(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_isolated_borrow_rates(self, params={})`
- `fetch_last_prices(self, symbols: Strings = None, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_leverage_tiers(self, symbols: Strings = None, params={})`
- `fetch_liquidations(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_markets_by_type_and_sub_type(self, type: Str, subType: Str, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest_history(self, symbol: str, timeframe='1h', since: Int = None, limit: Int = None, params={})`
- `fetch_open_interest(self, symbol: str, params={})`
- `fetch_open_interests(self, symbols: Strings = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_settlement_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_spot_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_spot_orders_by_states(self, states, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_spot_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_status(self, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = 1000, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_trading_limits_by_id(self, id: str, params={})`
- `fetch_trading_limits(self, symbols: Strings = None, params={})`
- `fetch_withdraw_addresses(self, code: str, note=None, networkCode=None, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `borrow_cross_margin(self, code: str, amount: float, params={})`
- `borrow_isolated_margin(self, symbol: str, code: str, amount: float, params={})`
- `cancel_all_orders_after(self, timeout: Int, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `close_position(self, symbol: str, side: OrderSide = None, params={})`
- `cost_to_precision(self, symbol, cost)`
- `describe(self)`
- `network_code_to_id(self, networkCode: str, currencyCode: Str = None)`
- `network_id_to_code(self, networkId: Str = None, currencyCode: Str = None)`
- `nonce(self)`
- `repay_cross_margin(self, code: str, amount, params={})`
- `repay_isolated_margin(self, symbol: str, code: str, amount, params={})`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `try_get_symbol_from_future_markets(self, symbolOrMarketId: str)`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
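Many of the unified methods above take `timeframe`, `since` (milliseconds) and `limit`; paginating OHLCV forward requires knowing the candle duration so `since` can be advanced past the candles already fetched. ccxt ships its own `parse_timeframe` helper (in seconds); this minimal standalone sketch assumes plain `<number><unit>` timeframe strings:

```Python
def timeframe_to_ms(timeframe):
    # '1m' -> 60_000, '1h' -> 3_600_000, '1d' -> 86_400_000
    units = {'s': 1, 'm': 60, 'h': 3600, 'd': 86400, 'w': 604800}
    return int(timeframe[:-1]) * units[timeframe[-1]] * 1000

# advance `since` past a page of 500 one-minute candles
since = 1700000000000
next_since = since + 500 * timeframe_to_ms('1m')
```

A pagination loop would call `fetch_ohlcv(symbol, '1m', since, 500)` and repeat with the advanced `since` until fewer than `limit` candles come back.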
### REST Raw
- `v2public_get_reference_currencies(request)`
- `v2public_get_market_status(request)`
- `v2private_get_account_ledger(request)`
- `v2private_get_account_withdraw_quota(request)`
- `v2private_get_account_withdraw_address(request)`
- `v2private_get_account_deposit_address(request)`
- `v2private_get_account_repayment(request)`
- `v2private_get_reference_transact_fee_rate(request)`
- `v2private_get_account_asset_valuation(request)`
- `v2private_get_point_account(request)`
- `v2private_get_sub_user_user_list(request)`
- `v2private_get_sub_user_user_state(request)`
- `v2private_get_sub_user_account_list(request)`
- `v2private_get_sub_user_deposit_address(request)`
- `v2private_get_sub_user_query_deposit(request)`
- `v2private_get_user_api_key(request)`
- `v2private_get_user_uid(request)`
- `v2private_get_algo_orders_opening(request)`
- `v2private_get_algo_orders_history(request)`
- `v2private_get_algo_orders_specific(request)`
- `v2private_get_c2c_offers(request)`
- `v2private_get_c2c_offer(request)`
- `v2private_get_c2c_transactions(request)`
- `v2private_get_c2c_repayment(request)`
- `v2private_get_c2c_account(request)`
- `v2private_get_etp_reference(request)`
- `v2private_get_etp_transactions(request)`
- `v2private_get_etp_transaction(request)`
- `v2private_get_etp_rebalance(request)`
- `v2private_get_etp_limit(request)`
- `v2private_post_account_transfer(request)`
- `v2private_post_account_repayment(request)`
- `v2private_post_point_transfer(request)`
- `v2private_post_sub_user_management(request)`
- `v2private_post_sub_user_creation(request)`
- `v2private_post_sub_user_tradable_market(request)`
- `v2private_post_sub_user_transferability(request)`
- `v2private_post_sub_user_api_key_generation(request)`
- `v2private_post_sub_user_api_key_modification(request)`
- `v2private_post_sub_user_api_key_deletion(request)`
- `v2private_post_sub_user_deduct_mode(request)`
- `v2private_post_algo_orders(request)`
- `v2private_post_algo_orders_cancel_all_after(request)`
- `v2private_post_algo_orders_cancellation(request)`
- `v2private_post_c2c_offer(request)`
- `v2private_post_c2c_cancellation(request)`
- `v2private_post_c2c_cancel_all(request)`
- `v2private_post_c2c_repayment(request)`
- `v2private_post_c2c_transfer(request)`
- `v2private_post_etp_creation(request)`
- `v2private_post_etp_redemption(request)`
- `v2private_post_etp_transactid_cancel(request)`
- `v2private_post_etp_batch_cancel(request)`
- `public_get_common_symbols(request)`
- `public_get_common_currencys(request)`
- `public_get_common_timestamp(request)`
- `public_get_common_exchange(request)`
- `public_get_settings_currencys(request)`
- `private_get_account_accounts(request)`
- `private_get_account_accounts_id_balance(request)`
- `private_get_account_accounts_sub_uid(request)`
- `private_get_account_history(request)`
- `private_get_cross_margin_loan_info(request)`
- `private_get_margin_loan_info(request)`
- `private_get_fee_fee_rate_get(request)`
- `private_get_order_openorders(request)`
- `private_get_order_orders(request)`
- `private_get_order_orders_id(request)`
- `private_get_order_orders_id_matchresults(request)`
- `private_get_order_orders_getclientorder(request)`
- `private_get_order_history(request)`
- `private_get_order_matchresults(request)`
- `private_get_query_deposit_withdraw(request)`
- `private_get_margin_loan_orders(request)`
- `private_get_margin_accounts_balance(request)`
- `private_get_cross_margin_loan_orders(request)`
- `private_get_cross_margin_accounts_balance(request)`
- `private_get_points_actions(request)`
- `private_get_points_orders(request)`
- `private_get_subuser_aggregate_balance(request)`
- `private_get_stable_coin_exchange_rate(request)`
- `private_get_stable_coin_quote(request)`
- `private_post_account_transfer(request)`
- `private_post_futures_transfer(request)`
- `private_post_order_batch_orders(request)`
- `private_post_order_orders_place(request)`
- `private_post_order_orders_submitcancelclientorder(request)`
- `private_post_order_orders_batchcancelopenorders(request)`
- `private_post_order_orders_id_submitcancel(request)`
- `private_post_order_orders_batchcancel(request)`
- `private_post_dw_withdraw_api_create(request)`
- `private_post_dw_withdraw_virtual_id_cancel(request)`
- `private_post_dw_transfer_in_margin(request)`
- `private_post_dw_transfer_out_margin(request)`
- `private_post_margin_orders(request)`
- `private_post_margin_orders_id_repay(request)`
- `private_post_cross_margin_transfer_in(request)`
- `private_post_cross_margin_transfer_out(request)`
- `private_post_cross_margin_orders(request)`
- `private_post_cross_margin_orders_id_repay(request)`
- `private_post_stable_coin_exchange(request)`
- `private_post_subuser_transfer(request)`
- `status_public_spot_get_api_v2_summary_json(request)`
- `status_public_future_inverse_get_api_v2_summary_json(request)`
- `status_public_future_linear_get_api_v2_summary_json(request)`
- `status_public_swap_inverse_get_api_v2_summary_json(request)`
- `status_public_swap_linear_get_api_v2_summary_json(request)`
- `spot_public_get_v2_market_status(request)`
- `spot_public_get_v1_common_symbols(request)`
- `spot_public_get_v1_common_currencys(request)`
- `spot_public_get_v2_settings_common_currencies(request)`
- `spot_public_get_v2_reference_currencies(request)`
- `spot_public_get_v1_common_timestamp(request)`
- `spot_public_get_v1_common_exchange(request)`
- `spot_public_get_v1_settings_common_chains(request)`
- `spot_public_get_v1_settings_common_currencys(request)`
- `spot_public_get_v1_settings_common_symbols(request)`
- `spot_public_get_v2_settings_common_symbols(request)`
- `spot_public_get_v1_settings_common_market_symbols(request)`
- `spot_public_get_market_history_candles(request)`
- `spot_public_get_market_history_kline(request)`
- `spot_public_get_market_detail_merged(request)`
- `spot_public_get_market_tickers(request)`
- `spot_public_get_market_detail(request)`
- `spot_public_get_market_depth(request)`
- `spot_public_get_market_trade(request)`
- `spot_public_get_market_history_trade(request)`
- `spot_public_get_market_etp(request)`
- `spot_public_get_v2_etp_reference(request)`
- `spot_public_get_v2_etp_rebalance(request)`
- `spot_private_get_v1_account_accounts(request)`
- `spot_private_get_v1_account_accounts_account_id_balance(request)`
- `spot_private_get_v2_account_valuation(request)`
- `spot_private_get_v2_account_asset_valuation(request)`
- `spot_private_get_v1_account_history(request)`
- `spot_private_get_v2_account_ledger(request)`
- `spot_private_get_v2_point_account(request)`
- `spot_private_get_v2_account_deposit_address(request)`
- `spot_private_get_v2_account_withdraw_quota(request)`
- `spot_private_get_v2_account_withdraw_address(request)`
- `spot_private_get_v2_reference_currencies(request)`
- `spot_private_get_v1_query_deposit_withdraw(request)`
- `spot_private_get_v1_query_withdraw_client_order_id(request)`
- `spot_private_get_v2_user_api_key(request)`
- `spot_private_get_v2_user_uid(request)`
- `spot_private_get_v2_sub_user_user_list(request)`
- `spot_private_get_v2_sub_user_user_state(request)`
- `spot_private_get_v2_sub_user_account_list(request)`
- `spot_private_get_v2_sub_user_deposit_address(request)`
- `spot_private_get_v2_sub_user_query_deposit(request)`
- `spot_private_get_v1_subuser_aggregate_balance(request)`
- `spot_private_get_v1_account_accounts_sub_uid(request)`
- `spot_private_get_v1_order_openorders(request)`
- `spot_private_get_v1_order_orders_order_id(request)`
- `spot_private_get_v1_order_orders_getclientorder(request)`
- `spot_private_get_v1_order_orders_order_id_matchresult(request)`
- `spot_private_get_v1_order_orders_order_id_matchresults(request)`
- `spot_private_get_v1_order_orders(request)`
- `spot_private_get_v1_order_history(request)`
- `spot_private_get_v1_order_matchresults(request)`
- `spot_private_get_v2_reference_transact_fee_rate(request)`
- `spot_private_get_v2_algo_orders_opening(request)`
- `spot_private_get_v2_algo_orders_history(request)`
- `spot_private_get_v2_algo_orders_specific(request)`
- `spot_private_get_v1_margin_loan_info(request)`
- `spot_private_get_v1_margin_loan_orders(request)`
- `spot_private_get_v1_margin_accounts_balance(request)`
- `spot_private_get_v1_cross_margin_loan_info(request)`
- `spot_private_get_v1_cross_margin_loan_orders(request)`
- `spot_private_get_v1_cross_margin_accounts_balance(request)`
- `spot_private_get_v2_account_repayment(request)`
- `spot_private_get_v1_stable_coin_quote(request)`
- `spot_private_get_v1_stable_coin_exchange_rate(request)`
- `spot_private_get_v2_etp_transactions(request)`
- `spot_private_get_v2_etp_transaction(request)`
- `spot_private_get_v2_etp_limit(request)`
- `spot_private_post_v1_account_transfer(request)`
- `spot_private_post_v1_futures_transfer(request)`
- `spot_private_post_v2_point_transfer(request)`
- `spot_private_post_v2_account_transfer(request)`
- `spot_private_post_v1_dw_withdraw_api_create(request)`
- `spot_private_post_v1_dw_withdraw_virtual_withdraw_id_cancel(request)`
- `spot_private_post_v2_sub_user_deduct_mode(request)`
- `spot_private_post_v2_sub_user_creation(request)`
- `spot_private_post_v2_sub_user_management(request)`
- `spot_private_post_v2_sub_user_tradable_market(request)`
- `spot_private_post_v2_sub_user_transferability(request)`
- `spot_private_post_v2_sub_user_api_key_generation(request)`
- `spot_private_post_v2_sub_user_api_key_modification(request)`
- `spot_private_post_v2_sub_user_api_key_deletion(request)`
- `spot_private_post_v1_subuser_transfer(request)`
- `spot_private_post_v1_trust_user_active_credit(request)`
- `spot_private_post_v1_order_orders_place(request)`
- `spot_private_post_v1_order_batch_orders(request)`
- `spot_private_post_v1_order_auto_place(request)`
- `spot_private_post_v1_order_orders_order_id_submitcancel(request)`
- `spot_private_post_v1_order_orders_submitcancelclientorder(request)`
- `spot_private_post_v1_order_orders_batchcancelopenorders(request)`
- `spot_private_post_v1_order_orders_batchcancel(request)`
- `spot_private_post_v2_algo_orders_cancel_all_after(request)`
- `spot_private_post_v2_algo_orders(request)`
- `spot_private_post_v2_algo_orders_cancellation(request)`
- `spot_private_post_v2_account_repayment(request)`
- `spot_private_post_v1_dw_transfer_in_margin(request)`
- `spot_private_post_v1_dw_transfer_out_margin(request)`
- `spot_private_post_v1_margin_orders(request)`
- `spot_private_post_v1_margin_orders_order_id_repay(request)`
- `spot_private_post_v1_cross_margin_transfer_in(request)`
- `spot_private_post_v1_cross_margin_transfer_out(request)`
- `spot_private_post_v1_cross_margin_orders(request)`
- `spot_private_post_v1_cross_margin_orders_order_id_repay(request)`
- `spot_private_post_v1_stable_coin_exchange(request)`
- `spot_private_post_v2_etp_creation(request)`
- `spot_private_post_v2_etp_redemption(request)`
- `spot_private_post_v2_etp_transactid_cancel(request)`
- `spot_private_post_v2_etp_batch_cancel(request)`
- `contract_public_get_api_v1_timestamp(request)`
- `contract_public_get_heartbeat(request)`
- `contract_public_get_api_v1_contract_contract_info(request)`
- `contract_public_get_api_v1_contract_index(request)`
- `contract_public_get_api_v1_contract_query_elements(request)`
- `contract_public_get_api_v1_contract_price_limit(request)`
- `contract_public_get_api_v1_contract_open_interest(request)`
- `contract_public_get_api_v1_contract_delivery_price(request)`
- `contract_public_get_market_depth(request)`
- `contract_public_get_market_bbo(request)`
- `contract_public_get_market_history_kline(request)`
- `contract_public_get_index_market_history_mark_price_kline(request)`
- `contract_public_get_market_detail_merged(request)`
- `contract_public_get_market_detail_batch_merged(request)`
- `contract_public_get_v2_market_detail_batch_merged(request)`
- `contract_public_get_market_trade(request)`
- `contract_public_get_market_history_trade(request)`
- `contract_public_get_api_v1_contract_risk_info(request)`
- `contract_public_get_api_v1_contract_insurance_fund(request)`
- `contract_public_get_api_v1_contract_adjustfactor(request)`
- `contract_public_get_api_v1_contract_his_open_interest(request)`
- `contract_public_get_api_v1_contract_ladder_margin(request)`
- `contract_public_get_api_v1_contract_api_state(request)`
- `contract_public_get_api_v1_contract_elite_account_ratio(request)`
- `contract_public_get_api_v1_contract_elite_position_ratio(request)`
- `contract_public_get_api_v1_contract_liquidation_orders(request)`
- `contract_public_get_api_v1_contract_settlement_records(request)`
- `contract_public_get_index_market_history_index(request)`
- `contract_public_get_index_market_history_basis(request)`
- `contract_public_get_api_v1_contract_estimated_settlement_price(request)`
- `contract_public_get_api_v3_contract_liquidation_orders(request)`
- `contract_public_get_swap_api_v1_swap_contract_info(request)`
- `contract_public_get_swap_api_v1_swap_index(request)`
- `contract_public_get_swap_api_v1_swap_query_elements(request)`
- `contract_public_get_swap_api_v1_swap_price_limit(request)`
- `contract_public_get_swap_api_v1_swap_open_interest(request)`
- `contract_public_get_swap_ex_market_depth(request)`
- `contract_public_get_swap_ex_market_bbo(request)`
- `contract_public_get_swap_ex_market_history_kline(request)`
- `contract_public_get_index_market_history_swap_mark_price_kline(request)`
- `contract_public_get_swap_ex_market_detail_merged(request)`
- `contract_public_get_v2_swap_ex_market_detail_batch_merged(request)`
- `contract_public_get_index_market_history_swap_premium_index_kline(request)`
- `contract_public_get_swap_ex_market_detail_batch_merged(request)`
- `contract_public_get_swap_ex_market_trade(request)`
- `contract_public_get_swap_ex_market_history_trade(request)`
- `contract_public_get_swap_api_v1_swap_risk_info(request)`
- `contract_public_get_swap_api_v1_swap_insurance_fund(request)`
- `contract_public_get_swap_api_v1_swap_adjustfactor(request)`
- `contract_public_get_swap_api_v1_swap_his_open_interest(request)`
- `contract_public_get_swap_api_v1_swap_ladder_margin(request)`
- `contract_public_get_swap_api_v1_swap_api_state(request)`
- `contract_public_get_swap_api_v1_swap_elite_account_ratio(request)`
- `contract_public_get_swap_api_v1_swap_elite_position_ratio(request)`
- `contract_public_get_swap_api_v1_swap_estimated_settlement_price(request)`
- `contract_public_get_swap_api_v1_swap_liquidation_orders(request)`
- `contract_public_get_swap_api_v1_swap_settlement_records(request)`
- `contract_public_get_swap_api_v1_swap_funding_rate(request)`
- `contract_public_get_swap_api_v1_swap_batch_funding_rate(request)`
- `contract_public_get_swap_api_v1_swap_historical_funding_rate(request)`
- `contract_public_get_swap_api_v3_swap_liquidation_orders(request)`
- `contract_public_get_index_market_history_swap_estimated_rate_kline(request)`
- `contract_public_get_index_market_history_swap_basis(request)`
- `contract_public_get_linear_swap_api_v1_swap_contract_info(request)`
- `contract_public_get_linear_swap_api_v1_swap_index(request)`
- `contract_public_get_linear_swap_api_v1_swap_query_elements(request)`
- `contract_public_get_linear_swap_api_v1_swap_price_limit(request)`
- `contract_public_get_linear_swap_api_v1_swap_open_interest(request)`
- `contract_public_get_linear_swap_ex_market_depth(request)`
- `contract_public_get_linear_swap_ex_market_bbo(request)`
- `contract_public_get_linear_swap_ex_market_history_kline(request)`
- `contract_public_get_index_market_history_linear_swap_mark_price_kline(request)`
- `contract_public_get_linear_swap_ex_market_detail_merged(request)`
- `contract_public_get_linear_swap_ex_market_detail_batch_merged(request)`
- `contract_public_get_v2_linear_swap_ex_market_detail_batch_merged(request)`
- `contract_public_get_linear_swap_ex_market_trade(request)`
- `contract_public_get_linear_swap_ex_market_history_trade(request)`
- `contract_public_get_linear_swap_api_v1_swap_risk_info(request)`
- `contract_public_get_swap_api_v1_linear_swap_api_v1_swap_insurance_fund(request)`
- `contract_public_get_linear_swap_api_v1_swap_adjustfactor(request)`
- `contract_public_get_linear_swap_api_v1_swap_cross_adjustfactor(request)`
- `contract_public_get_linear_swap_api_v1_swap_his_open_interest(request)`
- `contract_public_get_linear_swap_api_v1_swap_ladder_margin(request)`
- `contract_public_get_linear_swap_api_v1_swap_cross_ladder_margin(request)`
- `contract_public_get_linear_swap_api_v1_swap_api_state(request)`
- `contract_public_get_linear_swap_api_v1_swap_cross_transfer_state(request)`
- `contract_public_get_linear_swap_api_v1_swap_cross_trade_state(request)`
- `contract_public_get_linear_swap_api_v1_swap_elite_account_ratio(request)`
- `contract_public_get_linear_swap_api_v1_swap_elite_position_ratio(request)`
- `contract_public_get_linear_swap_api_v1_swap_liquidation_orders(request)`
- `contract_public_get_linear_swap_api_v1_swap_settlement_records(request)`
- `contract_public_get_linear_swap_api_v1_swap_funding_rate(request)`
- `contract_public_get_linear_swap_api_v1_swap_batch_funding_rate(request)`
- `contract_public_get_linear_swap_api_v1_swap_historical_funding_rate(request)`
- `contract_public_get_linear_swap_api_v3_swap_liquidation_orders(request)`
- `contract_public_get_index_market_history_linear_swap_premium_index_kline(request)`
- `contract_public_get_index_market_history_linear_swap_estimated_rate_kline(request)`
- `contract_public_get_index_market_history_linear_swap_basis(request)`
- `contract_public_get_linear_swap_api_v1_swap_estimated_settlement_price(request)`
- `contract_private_get_api_v1_contract_sub_auth_list(request)`
- `contract_private_get_api_v1_contract_api_trading_status(request)`
- `contract_private_get_swap_api_v1_swap_sub_auth_list(request)`
- `contract_private_get_swap_api_v1_swap_api_trading_status(request)`
- `contract_private_get_linear_swap_api_v1_swap_sub_auth_list(request)`
- `contract_private_get_linear_swap_api_v1_swap_api_trading_status(request)`
- `contract_private_get_linear_swap_api_v1_swap_cross_position_side(request)`
- `contract_private_get_linear_swap_api_v1_swap_position_side(request)`
- `contract_private_get_linear_swap_api_v3_unified_account_info(request)`
- `contract_private_get_linear_swap_api_v3_fix_position_margin_change_record(request)`
- `contract_private_get_linear_swap_api_v3_swap_unified_account_type(request)`
- `contract_private_get_linear_swap_api_v3_linear_swap_overview_account_info(request)`
- `contract_private_get_v5_account_balance(request)`
- `contract_private_get_v5_account_asset_mode(request)`
- `contract_private_get_v5_trade_position_opens(request)`
- `contract_private_get_v5_trade_order_opens(request)`
- `contract_private_get_v5_trade_order_details(request)`
- `contract_private_get_v5_trade_order_history(request)`
- `contract_private_get_v5_trade_order(request)`
- `contract_private_get_v5_position_lever(request)`
- `contract_private_get_v5_position_mode(request)`
- `contract_private_get_v5_position_risk_limit(request)`
- `contract_private_get_v5_position_risk_limit_tier(request)`
- `contract_private_get_v5_market_risk_limit(request)`
- `contract_private_get_v5_market_assets_deduction_currency(request)`
- `contract_private_get_v5_market_multi_assets_margin(request)`
- `contract_private_post_api_v1_contract_balance_valuation(request)`
- `contract_private_post_api_v1_contract_account_info(request)`
- `contract_private_post_api_v1_contract_position_info(request)`
- `contract_private_post_api_v1_contract_sub_auth(request)`
- `contract_private_post_api_v1_contract_sub_account_list(request)`
- `contract_private_post_api_v1_contract_sub_account_info_list(request)`
- `contract_private_post_api_v1_contract_sub_account_info(request)`
- `contract_private_post_api_v1_contract_sub_position_info(request)`
- `contract_private_post_api_v1_contract_financial_record(request)`
- `contract_private_post_api_v1_contract_financial_record_exact(request)`
- `contract_private_post_api_v1_contract_user_settlement_records(request)`
- `contract_private_post_api_v1_contract_order_limit(request)`
- `contract_private_post_api_v1_contract_fee(request)`
- `contract_private_post_api_v1_contract_transfer_limit(request)`
- `contract_private_post_api_v1_contract_position_limit(request)`
- `contract_private_post_api_v1_contract_account_position_info(request)`
- `contract_private_post_api_v1_contract_master_sub_transfer(request)`
- `contract_private_post_api_v1_contract_master_sub_transfer_record(request)`
- `contract_private_post_api_v1_contract_available_level_rate(request)`
- `contract_private_post_api_v3_contract_financial_record(request)`
- `contract_private_post_api_v3_contract_financial_record_exact(request)`
- `contract_private_post_api_v1_contract_cancel_after(request)`
- `contract_private_post_api_v1_contract_order(request)`
- `contract_private_post_api_v1_contract_batchorder(request)`
- `contract_private_post_api_v1_contract_cancel(request)`
- `contract_private_post_api_v1_contract_cancelall(request)`
- `contract_private_post_api_v1_contract_switch_lever_rate(request)`
- `contract_private_post_api_v1_lightning_close_position(request)`
- `contract_private_post_api_v1_contract_order_info(request)`
- `contract_private_post_api_v1_contract_order_detail(request)`
- `contract_private_post_api_v1_contract_openorders(request)`
- `contract_private_post_api_v1_contract_hisorders(request)`
- `contract_private_post_api_v1_contract_hisorders_exact(request)`
- `contract_private_post_api_v1_contract_matchresults(request)`
- `contract_private_post_api_v1_contract_matchresults_exact(request)`
- `contract_private_post_api_v3_contract_hisorders(request)`
- `contract_private_post_api_v3_contract_hisorders_exact(request)`
- `contract_private_post_api_v3_contract_matchresults(request)`
- `contract_private_post_api_v3_contract_matchresults_exact(request)`
- `contract_private_post_api_v1_contract_trigger_order(request)`
- `contract_private_post_api_v1_contract_trigger_cancel(request)`
- `contract_private_post_api_v1_contract_trigger_cancelall(request)`
- `contract_private_post_api_v1_contract_trigger_openorders(request)`
- `contract_private_post_api_v1_contract_trigger_hisorders(request)`
- `contract_private_post_api_v1_contract_tpsl_order(request)`
- `contract_private_post_api_v1_contract_tpsl_cancel(request)`
- `contract_private_post_api_v1_contract_tpsl_cancelall(request)`
- `contract_private_post_api_v1_contract_tpsl_openorders(request)`
- `contract_private_post_api_v1_contract_tpsl_hisorders(request)`
- `contract_private_post_api_v1_contract_relation_tpsl_order(request)`
- `contract_private_post_api_v1_contract_track_order(request)`
- `contract_private_post_api_v1_contract_track_cancel(request)`
- `contract_private_post_api_v1_contract_track_cancelall(request)`
- `contract_private_post_api_v1_contract_track_openorders(request)`
- `contract_private_post_api_v1_contract_track_hisorders(request)`
- `contract_private_post_swap_api_v1_swap_balance_valuation(request)`
- `contract_private_post_swap_api_v1_swap_account_info(request)`
- `contract_private_post_swap_api_v1_swap_position_info(request)`
- `contract_private_post_swap_api_v1_swap_account_position_info(request)`
- `contract_private_post_swap_api_v1_swap_sub_auth(request)`
- `contract_private_post_swap_api_v1_swap_sub_account_list(request)`
- `contract_private_post_swap_api_v1_swap_sub_account_info_list(request)`
- `contract_private_post_swap_api_v1_swap_sub_account_info(request)`
- `contract_private_post_swap_api_v1_swap_sub_position_info(request)`
- `contract_private_post_swap_api_v1_swap_financial_record(request)`
- `contract_private_post_swap_api_v1_swap_financial_record_exact(request)`
- `contract_private_post_swap_api_v1_swap_user_settlement_records(request)`
- `contract_private_post_swap_api_v1_swap_available_level_rate(request)`
- `contract_private_post_swap_api_v1_swap_order_limit(request)`
- `contract_private_post_swap_api_v1_swap_fee(request)`
- `contract_private_post_swap_api_v1_swap_transfer_limit(request)`
- `contract_private_post_swap_api_v1_swap_position_limit(request)`
- `contract_private_post_swap_api_v1_swap_master_sub_transfer(request)`
- `contract_private_post_swap_api_v1_swap_master_sub_transfer_record(request)`
- `contract_private_post_swap_api_v3_swap_financial_record(request)`
- `contract_private_post_swap_api_v3_swap_financial_record_exact(request)`
- `contract_private_post_swap_api_v1_swap_cancel_after(request)`
- `contract_private_post_swap_api_v1_swap_order(request)`
- `contract_private_post_swap_api_v1_swap_batchorder(request)`
- `contract_private_post_swap_api_v1_swap_cancel(request)`
- `contract_private_post_swap_api_v1_swap_cancelall(request)`
- `contract_private_post_swap_api_v1_swap_lightning_close_position(request)`
- `contract_private_post_swap_api_v1_swap_switch_lever_rate(request)`
- `contract_private_post_swap_api_v1_swap_order_info(request)`
- `contract_private_post_swap_api_v1_swap_order_detail(request)`
- `contract_private_post_swap_api_v1_swap_openorders(request)`
- `contract_private_post_swap_api_v1_swap_hisorders(request)`
- `contract_private_post_swap_api_v1_swap_hisorders_exact(request)`
- `contract_private_post_swap_api_v1_swap_matchresults(request)`
- `contract_private_post_swap_api_v1_swap_matchresults_exact(request)`
- `contract_private_post_swap_api_v3_swap_matchresults(request)`
- `contract_private_post_swap_api_v3_swap_matchresults_exact(request)`
- `contract_private_post_swap_api_v3_swap_hisorders(request)`
- `contract_private_post_swap_api_v3_swap_hisorders_exact(request)`
- `contract_private_post_swap_api_v1_swap_trigger_order(request)`
- `contract_private_post_swap_api_v1_swap_trigger_cancel(request)`
- `contract_private_post_swap_api_v1_swap_trigger_cancelall(request)`
- `contract_private_post_swap_api_v1_swap_trigger_openorders(request)`
- `contract_private_post_swap_api_v1_swap_trigger_hisorders(request)`
- `contract_private_post_swap_api_v1_swap_tpsl_order(request)`
- `contract_private_post_swap_api_v1_swap_tpsl_cancel(request)`
- `contract_private_post_swap_api_v1_swap_tpsl_cancelall(request)`
- `contract_private_post_swap_api_v1_swap_tpsl_openorders(request)`
- `contract_private_post_swap_api_v1_swap_tpsl_hisorders(request)`
- `contract_private_post_swap_api_v1_swap_relation_tpsl_order(request)`
- `contract_private_post_swap_api_v1_swap_track_order(request)`
- `contract_private_post_swap_api_v1_swap_track_cancel(request)`
- `contract_private_post_swap_api_v1_swap_track_cancelall(request)`
- `contract_private_post_swap_api_v1_swap_track_openorders(request)`
- `contract_private_post_swap_api_v1_swap_track_hisorders(request)`
- `contract_private_post_linear_swap_api_v1_swap_lever_position_limit(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_lever_position_limit(request)`
- `contract_private_post_linear_swap_api_v1_swap_balance_valuation(request)`
- `contract_private_post_linear_swap_api_v1_swap_account_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_account_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_position_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_position_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_account_position_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_account_position_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_sub_auth(request)`
- `contract_private_post_linear_swap_api_v1_swap_sub_account_list(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_sub_account_list(request)`
- `contract_private_post_linear_swap_api_v1_swap_sub_account_info_list(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_sub_account_info_list(request)`
- `contract_private_post_linear_swap_api_v1_swap_sub_account_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_sub_account_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_sub_position_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_sub_position_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_financial_record(request)`
- `contract_private_post_linear_swap_api_v1_swap_financial_record_exact(request)`
- `contract_private_post_linear_swap_api_v1_swap_user_settlement_records(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_user_settlement_records(request)`
- `contract_private_post_linear_swap_api_v1_swap_available_level_rate(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_available_level_rate(request)`
- `contract_private_post_linear_swap_api_v1_swap_order_limit(request)`
- `contract_private_post_linear_swap_api_v1_swap_fee(request)`
- `contract_private_post_linear_swap_api_v1_swap_transfer_limit(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_transfer_limit(request)`
- `contract_private_post_linear_swap_api_v1_swap_position_limit(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_position_limit(request)`
- `contract_private_post_linear_swap_api_v1_swap_master_sub_transfer(request)`
- `contract_private_post_linear_swap_api_v1_swap_master_sub_transfer_record(request)`
- `contract_private_post_linear_swap_api_v1_swap_transfer_inner(request)`
- `contract_private_post_linear_swap_api_v3_swap_financial_record(request)`
- `contract_private_post_linear_swap_api_v3_swap_financial_record_exact(request)`
- `contract_private_post_linear_swap_api_v1_swap_order(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_order(request)`
- `contract_private_post_linear_swap_api_v1_swap_batchorder(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_batchorder(request)`
- `contract_private_post_linear_swap_api_v1_swap_cancel(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_cancel(request)`
- `contract_private_post_linear_swap_api_v1_swap_cancelall(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_cancelall(request)`
- `contract_private_post_linear_swap_api_v1_swap_switch_lever_rate(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_switch_lever_rate(request)`
- `contract_private_post_linear_swap_api_v1_swap_lightning_close_position(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_lightning_close_position(request)`
- `contract_private_post_linear_swap_api_v1_swap_order_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_order_info(request)`
- `contract_private_post_linear_swap_api_v1_swap_order_detail(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_order_detail(request)`
- `contract_private_post_linear_swap_api_v1_swap_openorders(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_openorders(request)`
- `contract_private_post_linear_swap_api_v1_swap_hisorders(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_hisorders(request)`
- `contract_private_post_linear_swap_api_v1_swap_hisorders_exact(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_hisorders_exact(request)`
- `contract_private_post_linear_swap_api_v1_swap_matchresults(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_matchresults(request)`
- `contract_private_post_linear_swap_api_v1_swap_matchresults_exact(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_matchresults_exact(request)`
- `contract_private_post_linear_swap_api_v1_linear_cancel_after(request)`
- `contract_private_post_linear_swap_api_v1_swap_switch_position_mode(request)`
- `contract_private_post_linear_swap_api_v1_swap_cross_switch_position_mode(request)`
- `contract_private_post_linear_swap_api_v3_swap_matchresults(request)`
- `contract_private_post_linear_swap_api_v3_swap_cross_matchresults(request)`
- `contract_private_post_linear_swap_api_v3_swap_matchresults_exact(request)`
- `contract_private_post_linear_swap_api_v3_swap_cross_matchresults_exact(request)`
- `contract_private_post_linear_swap_api_v3_swap_hisorders(request)`
| text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:39.951088 | htx-0.0.128.tar.gz | 836,984 | fa/e4/2a822e19d7444e1638fc0aa51bbf0df014c4b3f3078d0940322e4d4d0f72/htx-0.0.128.tar.gz | source | sdist | null | false | f882206e49c866aef1de8b3b87c1d475 | 0c9f09291d4ef9d878f16fe06cdedb45efed8eebba6fa7a0b46e072b4c0f0ccd | fae42a822e19d7444e1638fc0aa51bbf0df014c4b3f3078d0940322e4d4d0f72 | null | [] | 222 |
2.4 | crypto-com-sdk | 0.0.123 | cryptocom crypto exchange api client | # cryptocom-python
Python SDK (sync and async) for the Cryptocom cryptocurrency exchange with REST and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/cryptocom)
- You can check Cryptocom's docs here: [Docs](https://www.google.com/search?q=google+cryptocom+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/cryptocom-python
- Pypi package: https://pypi.org/project/crypto-com-sdk
## Installation
```bash
pip install crypto-com-sdk
```
## Usage
### Sync
```Python
from cryptocom import CryptocomSync
def main():
instance = CryptocomSync({})
ob = instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = instance.fetch_balance()
# order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
main()
```
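The sync example above hits a public endpoint; private calls (like the commented-out `fetch_balance`) require API credentials. A minimal sketch of the config dict, assuming the standard ccxt-style keys `apiKey` and `secret` (the values below are placeholders):

```Python
# Hedged sketch: ccxt-style credential config. Keys follow the ccxt
# convention; the string values are placeholders, not real credentials.
config = {
    "apiKey": "YOUR_API_KEY",    # your exchange API key
    "secret": "YOUR_SECRET",     # your exchange API secret
    "enableRateLimit": True,     # use the built-in rate limiter (ccxt convention)
}
# instance = CryptocomSync(config)  # private methods then become available
```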
### Async
```Python
import sys
import asyncio
from cryptocom import CryptocomAsync
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = CryptocomAsync({})
ob = await instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = await instance.fetch_balance()
# order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from cryptocom import CryptocomWs
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = CryptocomWs({})
while True:
ob = await instance.watch_order_book("BTC/USDC")
print(ob)
# orders = await instance.watch_orders("BTC/USDC")
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints:
```Python
# 'coin', 'tf', 'since' and 'until' are placeholders - substitute your own values
request = {
'type': 'candleSnapshot',
'req': {
'coin': coin,
'interval': tf,
'startTime': since,
'endTime': until,
},
}
response = await instance.public_post_info(request)
```
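For Cryptocom specifically, the implicit methods are the ones listed under "REST Raw" below. A minimal sketch of building such a request — the parameter names (`instrument_name`, `timeframe`) are assumptions drawn from Crypto.com's public API conventions, so check the exchange docs before relying on them:

```python
# Sketch of an implicit-endpoint request for Cryptocom (assumed parameter names).
request = {
    'instrument_name': 'BTCUSD-PERP',
    'timeframe': 'M1',
}
# response = instance.v1_public_get_public_get_candlestick(request)
```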
## Available methods
### REST Unified
- `create_advanced_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_accounts(self, params={})`
- `fetch_balance(self, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_addresses_by_network(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_ledger(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_settlement_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_trading_fees(self, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders_for_symbols(self, orders: List[CancellationRequest], params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `close_position(self, symbol: str, side: OrderSide = None, params={})`
- `custom_handle_margin_mode_and_params(self, methodName, params={})`
- `describe(self)`
- `edit_order_request(self, id: str, symbol: str, amount: float, price: Num = None, params={})`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `nonce(self)`
- `params_to_string(self, object, level)`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
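The unified methods above return ccxt's normalized structures; `fetch_ohlcv`, for instance, returns rows of `[timestamp, open, high, low, close, volume]`. A small sketch of working with one row (the numbers are made up):

```python
# One unified OHLCV row: [timestamp in ms, open, high, low, close, volume]
candle = [1700000000000, 34000.0, 34500.0, 33800.0, 34200.0, 12.5]
ts_ms, o, h, l, c, v = candle
typical_price = (h + l + c) / 3  # a common derived value
print(round(typical_price, 2))  # → 34166.67
```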
### REST Raw
- `base_public_get_v1_public_get_announcements(request)`
- `v1_public_get_public_auth(request)`
- `v1_public_get_public_get_instruments(request)`
- `v1_public_get_public_get_book(request)`
- `v1_public_get_public_get_candlestick(request)`
- `v1_public_get_public_get_trades(request)`
- `v1_public_get_public_get_tickers(request)`
- `v1_public_get_public_get_valuations(request)`
- `v1_public_get_public_get_expired_settlement_price(request)`
- `v1_public_get_public_get_insurance(request)`
- `v1_public_get_public_get_announcements(request)`
- `v1_public_get_public_get_risk_parameters(request)`
- `v1_public_post_public_staking_get_conversion_rate(request)`
- `v1_private_post_private_set_cancel_on_disconnect(request)`
- `v1_private_post_private_get_cancel_on_disconnect(request)`
- `v1_private_post_private_user_balance(request)`
- `v1_private_post_private_user_balance_history(request)`
- `v1_private_post_private_get_positions(request)`
- `v1_private_post_private_create_order(request)`
- `v1_private_post_private_amend_order(request)`
- `v1_private_post_private_create_order_list(request)`
- `v1_private_post_private_cancel_order(request)`
- `v1_private_post_private_cancel_order_list(request)`
- `v1_private_post_private_cancel_all_orders(request)`
- `v1_private_post_private_close_position(request)`
- `v1_private_post_private_get_order_history(request)`
- `v1_private_post_private_get_open_orders(request)`
- `v1_private_post_private_get_order_detail(request)`
- `v1_private_post_private_get_trades(request)`
- `v1_private_post_private_change_account_leverage(request)`
- `v1_private_post_private_get_transactions(request)`
- `v1_private_post_private_create_subaccount_transfer(request)`
- `v1_private_post_private_get_subaccount_balances(request)`
- `v1_private_post_private_get_order_list(request)`
- `v1_private_post_private_create_withdrawal(request)`
- `v1_private_post_private_get_currency_networks(request)`
- `v1_private_post_private_get_deposit_address(request)`
- `v1_private_post_private_get_accounts(request)`
- `v1_private_post_private_get_withdrawal_history(request)`
- `v1_private_post_private_get_deposit_history(request)`
- `v1_private_post_private_get_fee_rate(request)`
- `v1_private_post_private_get_instrument_fee_rate(request)`
- `v1_private_post_private_fiat_fiat_deposit_info(request)`
- `v1_private_post_private_fiat_fiat_deposit_history(request)`
- `v1_private_post_private_fiat_fiat_withdraw_history(request)`
- `v1_private_post_private_fiat_fiat_create_withdraw(request)`
- `v1_private_post_private_fiat_fiat_transaction_quota(request)`
- `v1_private_post_private_fiat_fiat_transaction_limit(request)`
- `v1_private_post_private_fiat_fiat_get_bank_accounts(request)`
- `v1_private_post_private_staking_stake(request)`
- `v1_private_post_private_staking_unstake(request)`
- `v1_private_post_private_staking_get_staking_position(request)`
- `v1_private_post_private_staking_get_staking_instruments(request)`
- `v1_private_post_private_staking_get_open_stake(request)`
- `v1_private_post_private_staking_get_stake_history(request)`
- `v1_private_post_private_staking_get_reward_history(request)`
- `v1_private_post_private_staking_convert(request)`
- `v1_private_post_private_staking_get_open_convert(request)`
- `v1_private_post_private_staking_get_convert_history(request)`
- `v1_private_post_private_create_isolated_margin_transfer(request)`
- `v1_private_post_private_change_isolated_margin_leverage(request)`
- `v2_public_get_public_auth(request)`
- `v2_public_get_public_get_instruments(request)`
- `v2_public_get_public_get_book(request)`
- `v2_public_get_public_get_candlestick(request)`
- `v2_public_get_public_get_ticker(request)`
- `v2_public_get_public_get_trades(request)`
- `v2_public_get_public_margin_get_transfer_currencies(request)`
- `v2_public_get_public_margin_get_load_currenices(request)`
- `v2_public_get_public_respond_heartbeat(request)`
- `v2_private_post_private_set_cancel_on_disconnect(request)`
- `v2_private_post_private_get_cancel_on_disconnect(request)`
- `v2_private_post_private_create_withdrawal(request)`
- `v2_private_post_private_get_withdrawal_history(request)`
- `v2_private_post_private_get_currency_networks(request)`
- `v2_private_post_private_get_deposit_history(request)`
- `v2_private_post_private_get_deposit_address(request)`
- `v2_private_post_private_export_create_export_request(request)`
- `v2_private_post_private_export_get_export_requests(request)`
- `v2_private_post_private_export_download_export_output(request)`
- `v2_private_post_private_get_account_summary(request)`
- `v2_private_post_private_create_order(request)`
- `v2_private_post_private_cancel_order(request)`
- `v2_private_post_private_cancel_all_orders(request)`
- `v2_private_post_private_create_order_list(request)`
- `v2_private_post_private_get_order_history(request)`
- `v2_private_post_private_get_open_orders(request)`
- `v2_private_post_private_get_order_detail(request)`
- `v2_private_post_private_get_trades(request)`
- `v2_private_post_private_get_accounts(request)`
- `v2_private_post_private_get_subaccount_balances(request)`
- `v2_private_post_private_create_subaccount_transfer(request)`
- `v2_private_post_private_otc_get_otc_user(request)`
- `v2_private_post_private_otc_get_instruments(request)`
- `v2_private_post_private_otc_request_quote(request)`
- `v2_private_post_private_otc_accept_quote(request)`
- `v2_private_post_private_otc_get_quote_history(request)`
- `v2_private_post_private_otc_get_trade_history(request)`
- `v2_private_post_private_otc_create_order(request)`
- `derivatives_public_get_public_auth(request)`
- `derivatives_public_get_public_get_instruments(request)`
- `derivatives_public_get_public_get_book(request)`
- `derivatives_public_get_public_get_candlestick(request)`
- `derivatives_public_get_public_get_trades(request)`
- `derivatives_public_get_public_get_tickers(request)`
- `derivatives_public_get_public_get_valuations(request)`
- `derivatives_public_get_public_get_expired_settlement_price(request)`
- `derivatives_public_get_public_get_insurance(request)`
- `derivatives_private_post_private_set_cancel_on_disconnect(request)`
- `derivatives_private_post_private_get_cancel_on_disconnect(request)`
- `derivatives_private_post_private_user_balance(request)`
- `derivatives_private_post_private_user_balance_history(request)`
- `derivatives_private_post_private_get_positions(request)`
- `derivatives_private_post_private_create_order(request)`
- `derivatives_private_post_private_create_order_list(request)`
- `derivatives_private_post_private_cancel_order(request)`
- `derivatives_private_post_private_cancel_order_list(request)`
- `derivatives_private_post_private_cancel_all_orders(request)`
- `derivatives_private_post_private_close_position(request)`
- `derivatives_private_post_private_convert_collateral(request)`
- `derivatives_private_post_private_get_order_history(request)`
- `derivatives_private_post_private_get_open_orders(request)`
- `derivatives_private_post_private_get_order_detail(request)`
- `derivatives_private_post_private_get_trades(request)`
- `derivatives_private_post_private_change_account_leverage(request)`
- `derivatives_private_post_private_get_transactions(request)`
- `derivatives_private_post_private_create_subaccount_transfer(request)`
- `derivatives_private_post_private_get_subaccount_balances(request)`
- `derivatives_private_post_private_get_order_list(request)`
### WS Unified
- `describe(self)`
- `pong(self, client, message)`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `watch_order_book_for_symbols(self, symbols: List[str], limit: Int = None, params={})`
- `un_watch_order_book_for_symbols(self, symbols: List[str], params={})`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `watch_trades_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `un_watch_trades_for_symbols(self, symbols: List[str], params={})`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_ticker(self, symbol: str, params={})`
- `un_watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `un_watch_tickers(self, symbols: Strings = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `un_watch_ohlcv(self, symbol: str, timeframe: str = '1m', params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_positions(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `set_positions_cache(self, client: Client, type, symbols: Strings = None)`
- `load_positions_snapshot(self, client, messageHash)`
- `watch_balance(self, params={})`
- `create_order_ws(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `edit_order_ws(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `cancel_order_ws(self, id: str, symbol: Str = None, params={})`
- `cancel_all_orders_ws(self, symbol: Str = None, params={})`
- `watch_public(self, messageHash, params={})`
- `watch_public_multiple(self, messageHashes, topics, params={})`
- `un_watch_public_multiple(self, topic: str, symbols: List[str], messageHashes: List[str], subMessageHashes: List[str], topics: List[str], params={}, subExtend={})`
- `watch_private_request(self, nonce, params={})`
- `watch_private_subscribe(self, messageHash, params={})`
- `authenticate(self, params={})`
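`watch_order_book` resolves with ccxt's unified order book structure on every update: `bids` and `asks` are price/amount pairs sorted best-first. A sketch of reading the top of the book (sample data, not a live response):

```python
# Shape of the unified order book yielded by watch_order_book / fetch_order_book.
ob = {
    'symbol': 'BTC/USDC',
    'bids': [[34000.0, 1.2], [33990.0, 0.5]],  # sorted highest bid first
    'asks': [[34010.0, 0.8], [34020.0, 2.0]],  # sorted lowest ask first
}
best_bid, best_ask = ob['bids'][0][0], ob['asks'][0][0]
spread = best_ask - best_bid
```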
## Contribution
- Give us a star :star:
- Fork and Clone! Awesome
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:38.455844 | crypto_com_sdk-0.0.123.tar.gz | 728,169 | 3d/f7/2ede9e624d0928d47c000c50d4ed30ef0fb33efcd9f4fd0817380dcd0811/crypto_com_sdk-0.0.123.tar.gz | source | sdist | null | false | c8468d58deb5c100f1e1396d9bff13cf | 6ddca4407b6f905728bac9d393e0210dc392959d7e2bb538988ba3bf0977a2a5 | 3df72ede9e624d0928d47c000c50d4ed30ef0fb33efcd9f4fd0817380dcd0811 | null | [] | 238 |
2.4 | mexc-exchange-api | 0.0.125 | mexc crypto exchange api client | # mexc-python
Python SDK (sync and async) for Mexc cryptocurrency exchange with Rest and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/mexc)
- You can check Mexc's docs here: [Docs](https://www.google.com/search?q=google+mexc+cryptocurrency+exchange+api+docs)
- GitHub repo: https://github.com/ccxt/mexc-python
- PyPI package: https://pypi.org/project/mexc-exchange-api
## Installation
```bash
pip install mexc-exchange-api
```
## Usage
### Sync
```Python
from mexc import MexcSync

def main():
    instance = MexcSync({})
    ob = instance.fetch_order_book("BTC/USDC")
    print(ob)
    # balance = instance.fetch_balance()
    # order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)

main()
```
### Async
```Python
import sys
import asyncio
from mexc import MexcAsync

# on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = MexcAsync({})
    ob = await instance.fetch_order_book("BTC/USDC")
    print(ob)
    # balance = await instance.fetch_balance()
    # order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from mexc import MexcWs

# on Windows, uncomment below:
# if sys.platform == 'win32':
#     asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    instance = MexcWs({})
    while True:
        ob = await instance.watch_order_book("BTC/USDC")
        print(ob)
        # orders = await instance.watch_orders("BTC/USDC")
    # once you are done with the exchange
    await instance.close()

asyncio.run(main())
```
#### Raw call
You can also construct custom requests to available "implicit" endpoints:
```Python
# note: 'coin', 'tf', 'since' and 'until' are placeholders for your own values
request = {
    'type': 'candleSnapshot',
    'req': {
        'coin': coin,
        'interval': tf,
        'startTime': since,
        'endTime': until,
    },
}
response = await instance.public_post_info(request)
```
## Available methods
### REST Unified
- `create_deposit_address(self, code: str, params={})`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_market_sell_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `create_spot_order_request(self, market, type, side, amount, price=None, marginMode=None, params={})`
- `create_spot_order(self, market, type, side, amount, price=None, marginMode=None, params={})`
- `create_swap_order(self, market, type, side, amount, price=None, marginMode=None, params={})`
- `fetch_account_helper(self, type, params)`
- `fetch_accounts(self, params={})`
- `fetch_balance(self, params={})`
- `fetch_bids_asks(self, symbols: Strings = None, params={})`
- `fetch_canceled_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_addresses_by_network(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_interval(self, symbol: str, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_leverage_tiers(self, symbols: Strings = None, params={})`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = None, params={})`
- `fetch_order_trades(self, id: str, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders_by_ids(self, ids, symbol: Str = None, params={})`
- `fetch_orders_by_state(self, state, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position_mode(self, symbol: Str = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions_history(self, symbols: Strings = None, since: Int = None, limit: Int = None, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_spot_markets(self, params={})`
- `fetch_status(self, params={})`
- `fetch_swap_markets(self, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_transaction_fees(self, codes: Strings = None, params={})`
- `fetch_transfer(self, id: str, code: Str = None, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_margin(self, symbol: str, amount: float, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `custom_parse_balance(self, response, marketType)`
- `describe(self)`
- `get_tif_from_raw_order_type(self, orderType: Str = None)`
- `modify_margin_helper(self, symbol: str, amount, addOrReduce, params={})`
- `nonce(self)`
- `reduce_margin(self, symbol: str, amount: float, params={})`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_margin_mode(self, marginMode: str, symbol: Str = None, params={})`
- `set_position_mode(self, hedged: bool, symbol: Str = None, params={})`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
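Unlike `create_order`, `create_market_buy_order_with_cost` takes the amount to spend in the *quote* currency rather than a base amount. If you only know a base amount and an approximate price, the cost is just their product (illustrative numbers, not live data):

```python
# Deriving the quote-currency cost for create_market_buy_order_with_cost.
base_amount = 0.01       # BTC you want to buy (example value)
approx_price = 34000.0   # rough BTC/USDC price (example value)
cost = base_amount * approx_price  # USDC to spend
# order = instance.create_market_buy_order_with_cost("BTC/USDC", cost)
```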
### REST Raw
- `spot_public_get_ping(request)`
- `spot_public_get_time(request)`
- `spot_public_get_defaultsymbols(request)`
- `spot_public_get_exchangeinfo(request)`
- `spot_public_get_depth(request)`
- `spot_public_get_trades(request)`
- `spot_public_get_historicaltrades(request)`
- `spot_public_get_aggtrades(request)`
- `spot_public_get_klines(request)`
- `spot_public_get_avgprice(request)`
- `spot_public_get_ticker_24hr(request)`
- `spot_public_get_ticker_price(request)`
- `spot_public_get_ticker_bookticker(request)`
- `spot_public_get_etf_info(request)`
- `spot_private_get_kyc_status(request)`
- `spot_private_get_uid(request)`
- `spot_private_get_order(request)`
- `spot_private_get_openorders(request)`
- `spot_private_get_allorders(request)`
- `spot_private_get_account(request)`
- `spot_private_get_mytrades(request)`
- `spot_private_get_strategy_group(request)`
- `spot_private_get_strategy_group_uid(request)`
- `spot_private_get_tradefee(request)`
- `spot_private_get_sub_account_list(request)`
- `spot_private_get_sub_account_apikey(request)`
- `spot_private_get_sub_account_asset(request)`
- `spot_private_get_capital_config_getall(request)`
- `spot_private_get_capital_deposit_hisrec(request)`
- `spot_private_get_capital_withdraw_history(request)`
- `spot_private_get_capital_withdraw_address(request)`
- `spot_private_get_capital_deposit_address(request)`
- `spot_private_get_capital_transfer(request)`
- `spot_private_get_capital_transfer_tranid(request)`
- `spot_private_get_capital_transfer_internal(request)`
- `spot_private_get_capital_sub_account_universaltransfer(request)`
- `spot_private_get_capital_convert(request)`
- `spot_private_get_capital_convert_list(request)`
- `spot_private_get_margin_loan(request)`
- `spot_private_get_margin_allorders(request)`
- `spot_private_get_margin_mytrades(request)`
- `spot_private_get_margin_openorders(request)`
- `spot_private_get_margin_maxtransferable(request)`
- `spot_private_get_margin_priceindex(request)`
- `spot_private_get_margin_order(request)`
- `spot_private_get_margin_isolated_account(request)`
- `spot_private_get_margin_maxborrowable(request)`
- `spot_private_get_margin_repay(request)`
- `spot_private_get_margin_isolated_pair(request)`
- `spot_private_get_margin_forceliquidationrec(request)`
- `spot_private_get_margin_isolatedmargindata(request)`
- `spot_private_get_margin_isolatedmargintier(request)`
- `spot_private_get_rebate_taxquery(request)`
- `spot_private_get_rebate_detail(request)`
- `spot_private_get_rebate_detail_kickback(request)`
- `spot_private_get_rebate_refercode(request)`
- `spot_private_get_rebate_affiliate_commission(request)`
- `spot_private_get_rebate_affiliate_withdraw(request)`
- `spot_private_get_rebate_affiliate_commission_detail(request)`
- `spot_private_get_mxdeduct_enable(request)`
- `spot_private_get_userdatastream(request)`
- `spot_private_get_selfsymbols(request)`
- `spot_private_get_asset_internal_transfer_record(request)`
- `spot_private_post_order(request)`
- `spot_private_post_order_test(request)`
- `spot_private_post_sub_account_virtualsubaccount(request)`
- `spot_private_post_sub_account_apikey(request)`
- `spot_private_post_sub_account_futures(request)`
- `spot_private_post_sub_account_margin(request)`
- `spot_private_post_batchorders(request)`
- `spot_private_post_strategy_group(request)`
- `spot_private_post_capital_withdraw_apply(request)`
- `spot_private_post_capital_withdraw(request)`
- `spot_private_post_capital_transfer(request)`
- `spot_private_post_capital_transfer_internal(request)`
- `spot_private_post_capital_deposit_address(request)`
- `spot_private_post_capital_sub_account_universaltransfer(request)`
- `spot_private_post_capital_convert(request)`
- `spot_private_post_mxdeduct_enable(request)`
- `spot_private_post_userdatastream(request)`
- `spot_private_put_userdatastream(request)`
- `spot_private_delete_order(request)`
- `spot_private_delete_openorders(request)`
- `spot_private_delete_sub_account_apikey(request)`
- `spot_private_delete_margin_order(request)`
- `spot_private_delete_margin_openorders(request)`
- `spot_private_delete_userdatastream(request)`
- `spot_private_delete_capital_withdraw(request)`
- `contract_public_get_ping(request)`
- `contract_public_get_detail(request)`
- `contract_public_get_support_currencies(request)`
- `contract_public_get_depth_symbol(request)`
- `contract_public_get_depth_commits_symbol_limit(request)`
- `contract_public_get_index_price_symbol(request)`
- `contract_public_get_fair_price_symbol(request)`
- `contract_public_get_funding_rate_symbol(request)`
- `contract_public_get_kline_symbol(request)`
- `contract_public_get_kline_index_price_symbol(request)`
- `contract_public_get_kline_fair_price_symbol(request)`
- `contract_public_get_deals_symbol(request)`
- `contract_public_get_ticker(request)`
- `contract_public_get_risk_reverse(request)`
- `contract_public_get_risk_reverse_history(request)`
- `contract_public_get_funding_rate_history(request)`
- `contract_private_get_account_assets(request)`
- `contract_private_get_account_asset_currency(request)`
- `contract_private_get_account_transfer_record(request)`
- `contract_private_get_position_list_history_positions(request)`
- `contract_private_get_position_open_positions(request)`
- `contract_private_get_position_funding_records(request)`
- `contract_private_get_position_position_mode(request)`
- `contract_private_get_order_list_open_orders_symbol(request)`
- `contract_private_get_order_list_history_orders(request)`
- `contract_private_get_order_external_symbol_external_oid(request)`
- `contract_private_get_order_get_order_id(request)`
- `contract_private_get_order_batch_query(request)`
- `contract_private_get_order_deal_details_order_id(request)`
- `contract_private_get_order_list_order_deals(request)`
- `contract_private_get_planorder_list_orders(request)`
- `contract_private_get_stoporder_list_orders(request)`
- `contract_private_get_stoporder_order_details_stop_order_id(request)`
- `contract_private_get_account_risk_limit(request)`
- `contract_private_get_account_tiered_fee_rate(request)`
- `contract_private_get_position_leverage(request)`
- `contract_private_post_position_change_margin(request)`
- `contract_private_post_position_change_leverage(request)`
- `contract_private_post_position_change_position_mode(request)`
- `contract_private_post_order_submit(request)`
- `contract_private_post_order_submit_batch(request)`
- `contract_private_post_order_cancel(request)`
- `contract_private_post_order_cancel_with_external(request)`
- `contract_private_post_order_cancel_all(request)`
- `contract_private_post_account_change_risk_level(request)`
- `contract_private_post_planorder_place(request)`
- `contract_private_post_planorder_cancel(request)`
- `contract_private_post_planorder_cancel_all(request)`
- `contract_private_post_stoporder_cancel(request)`
- `contract_private_post_stoporder_cancel_all(request)`
- `contract_private_post_stoporder_change_price(request)`
- `contract_private_post_stoporder_change_plan_price(request)`
- `spot2_public_get_market_symbols(request)`
- `spot2_public_get_market_coin_list(request)`
- `spot2_public_get_common_timestamp(request)`
- `spot2_public_get_common_ping(request)`
- `spot2_public_get_market_ticker(request)`
- `spot2_public_get_market_depth(request)`
- `spot2_public_get_market_deals(request)`
- `spot2_public_get_market_kline(request)`
- `spot2_public_get_market_api_default_symbols(request)`
- `spot2_private_get_account_info(request)`
- `spot2_private_get_order_open_orders(request)`
- `spot2_private_get_order_list(request)`
- `spot2_private_get_order_query(request)`
- `spot2_private_get_order_deals(request)`
- `spot2_private_get_order_deal_detail(request)`
- `spot2_private_get_asset_deposit_address_list(request)`
- `spot2_private_get_asset_deposit_list(request)`
- `spot2_private_get_asset_address_list(request)`
- `spot2_private_get_asset_withdraw_list(request)`
- `spot2_private_get_asset_internal_transfer_record(request)`
- `spot2_private_get_account_balance(request)`
- `spot2_private_get_asset_internal_transfer_info(request)`
- `spot2_private_get_market_api_symbols(request)`
- `spot2_private_post_order_place(request)`
- `spot2_private_post_order_place_batch(request)`
- `spot2_private_post_order_advanced_place_batch(request)`
- `spot2_private_post_asset_withdraw(request)`
- `spot2_private_post_asset_internal_transfer(request)`
- `spot2_private_delete_order_cancel(request)`
- `spot2_private_delete_order_cancel_by_symbol(request)`
- `spot2_private_delete_asset_withdraw(request)`
- `broker_private_get_sub_account_universaltransfer(request)`
- `broker_private_get_sub_account_list(request)`
- `broker_private_get_sub_account_apikey(request)`
- `broker_private_get_capital_deposit_subaddress(request)`
- `broker_private_get_capital_deposit_subhisrec(request)`
- `broker_private_get_capital_deposit_subhisrec_getall(request)`
- `broker_private_post_sub_account_virtualsubaccount(request)`
- `broker_private_post_sub_account_apikey(request)`
- `broker_private_post_capital_deposit_subaddress(request)`
- `broker_private_post_capital_withdraw_apply(request)`
- `broker_private_post_sub_account_universaltransfer(request)`
- `broker_private_post_sub_account_futures(request)`
- `broker_private_delete_sub_account_apikey(request)`
### WS Unified
- `describe(self)`
- `watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `watch_spot_public(self, channel, messageHash, params={})`
- `watch_spot_private(self, channel, messageHash, params={})`
- `watch_swap_public(self, channel, messageHash, requestParams, params={})`
- `watch_swap_private(self, messageHash, params={})`
- `watch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `get_cache_index(self, orderbook, cache)`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_balance(self, params={})`
- `un_watch_ticker(self, symbol: str, params={})`
- `un_watch_tickers(self, symbols: Strings = None, params={})`
- `un_watch_bids_asks(self, symbols: Strings = None, params={})`
- `un_watch_ohlcv(self, symbol: str, timeframe: str = '1m', params={})`
- `un_watch_order_book(self, symbol: str, params={})`
- `un_watch_trades(self, symbol: str, params={})`
- `authenticate(self, subscriptionHash, params={})`
- `keep_alive_listen_key(self, listenKey, params={})`
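`watch_balance` yields ccxt's unified balance structure, keyed per currency with `free`, `used` and `total` views. A sketch of reading it (sample data, not a live response):

```python
# Shape of the unified balance structure from watch_balance / fetch_balance.
balance = {
    'USDC': {'free': 900.0, 'used': 100.0, 'total': 1000.0},
    'free':  {'USDC': 900.0},
    'used':  {'USDC': 100.0},
    'total': {'USDC': 1000.0},
}
available = balance['USDC']['free']
```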
## Contribution
- Give us a star :star:
- Fork and Clone! Awesome
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:36.825497 | mexc_exchange_api-0.0.125.tar.gz | 770,085 | ab/e6/d512a981e78b091317a56b11e5e803a2ec4f642fa55a95c55f3c4b58a747/mexc_exchange_api-0.0.125.tar.gz | source | sdist | null | false | 2b2c341f2493d5ae2febca800b6c172c | b1db91d578f0ed37c8b313dfea79a73ef32de1595e94a55f32d464502e598427 | abe6d512a981e78b091317a56b11e5e803a2ec4f642fa55a95c55f3c4b58a747 | null | [] | 239 |
2.4 | coinex-api | 0.0.124 | coinex crypto exchange api client | # coinex-python
Python SDK (sync and async) for the Coinex cryptocurrency exchange with REST and WS capabilities.
- You can check the SDK docs here: [SDK](https://docs.ccxt.com/#/exchanges/coinex)
- You can check Coinex's docs here: [Docs](https://www.google.com/search?q=google+coinex+cryptocurrency+exchange+api+docs)
- Github repo: https://github.com/ccxt/coinex-python
- Pypi package: https://pypi.org/project/coinex-api
## Installation
```bash
pip install coinex-api
```
## Usage
### Sync
```Python
from coinex import CoinexSync
def main():
instance = CoinexSync({})
ob = instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = instance.fetch_balance()
# order = instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
main()
```
### Async
```Python
import sys
import asyncio
from coinex import CoinexAsync
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = CoinexAsync({})
ob = await instance.fetch_order_book("BTC/USDC")
print(ob)
#
# balance = await instance.fetch_balance()
# order = await instance.create_order("BTC/USDC", "limit", "buy", 1, 100000)
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
### Websockets
```Python
import sys
import asyncio
from coinex import CoinexWs
### on Windows, uncomment below:
# if sys.platform == 'win32':
# asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
async def main():
instance = CoinexWs({})
while True:
ob = await instance.watch_order_book("BTC/USDC")
print(ob)
# orders = await instance.watch_orders("BTC/USDC")
# once you are done with the exchange
await instance.close()
asyncio.run(main())
```
#### Raw call
You can also construct custom requests to the available "implicit" endpoints.
```Python
# placeholder values for illustration
coin = 'BTC'
tf = '1m'
since = 1700000000000   # start timestamp in milliseconds
until = 1700003600000   # end timestamp in milliseconds
request = {
'type': 'candleSnapshot',
'req': {
'coin': coin,
'interval': tf,
'startTime': since,
'endTime': until,
},
}
response = await instance.public_post_info(request)
```
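For orientation, the "implicit" method names listed below follow the ccxt convention of deriving a Python method name from the endpoint's API version, access level, HTTP verb, and URL path. A simplified sketch of that naming rule (an illustration only, not the SDK's actual code generator):

```python
def implicit_method_name(api: str, access: str, verb: str, path: str) -> str:
    """Sketch of how ccxt-style SDKs derive implicit method names
    from an endpoint definition (simplified illustration only)."""
    # path segments become underscore-separated name parts
    parts = path.strip('/').replace('-', '_').split('/')
    return '_'.join([api, access, verb.lower()] + parts)

# e.g. GET /spot/ticker on the v2 public API
print(implicit_method_name('v2', 'public', 'GET', 'spot/ticker'))
# v2_public_get_spot_ticker
```

This is why, for example, a POST to the v1 private `order/limit-batch` endpoint surfaces as `v1_private_post_order_limit_batch(request)` in the list below.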
## Available methods
### REST Unified
- `create_deposit_address(self, code: str, params={})`
- `create_market_buy_order_with_cost(self, symbol: str, cost: float, params={})`
- `create_order_request(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_order(self, symbol: str, type: OrderType, side: OrderSide, amount: float, price: Num = None, params={})`
- `create_orders(self, orders: List[OrderRequest], params={})`
- `fetch_balance(self, params={})`
- `fetch_borrow_interest(self, code: Str = None, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_closed_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_contract_markets(self, params)`
- `fetch_currencies(self, params={})`
- `fetch_deposit_address(self, code: str, params={})`
- `fetch_deposit_withdraw_fee(self, code: str, params={})`
- `fetch_deposit_withdraw_fees(self, codes: Strings = None, params={})`
- `fetch_deposits(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_financial_balance(self, params={})`
- `fetch_funding_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_interval(self, symbol: str, params={})`
- `fetch_funding_rate_history(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_funding_rate(self, symbol: str, params={})`
- `fetch_funding_rates(self, symbols: Strings = None, params={})`
- `fetch_isolated_borrow_rate(self, symbol: str, params={})`
- `fetch_leverage_tiers(self, symbols: Strings = None, params={})`
- `fetch_leverage(self, symbol: str, params={})`
- `fetch_margin_adjustment_history(self, symbol: Str = None, type: Str = None, since: Num = None, limit: Num = None, params={})`
- `fetch_margin_balance(self, params={})`
- `fetch_markets(self, params={})`
- `fetch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_ohlcv(self, symbol: str, timeframe: str = '1m', since: Int = None, limit: Int = None, params={})`
- `fetch_open_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_order_book(self, symbol: str, limit: Int = 20, params={})`
- `fetch_order(self, id: str, symbol: Str = None, params={})`
- `fetch_orders_by_status(self, status, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_position_history(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_position(self, symbol: str, params={})`
- `fetch_positions(self, symbols: Strings = None, params={})`
- `fetch_spot_balance(self, params={})`
- `fetch_spot_markets(self, params)`
- `fetch_swap_balance(self, params={})`
- `fetch_ticker(self, symbol: str, params={})`
- `fetch_tickers(self, symbols: Strings = None, params={})`
- `fetch_time(self, params={})`
- `fetch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `fetch_trading_fee(self, symbol: str, params={})`
- `fetch_trading_fees(self, params={})`
- `fetch_transfers(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `fetch_withdrawals(self, code: Str = None, since: Int = None, limit: Int = None, params={})`
- `add_margin(self, symbol: str, amount: float, params={})`
- `borrow_isolated_margin(self, symbol: str, code: str, amount: float, params={})`
- `cancel_all_orders(self, symbol: Str = None, params={})`
- `cancel_order(self, id: str, symbol: Str = None, params={})`
- `cancel_orders(self, ids: List[str], symbol: Str = None, params={})`
- `close_position(self, symbol: str, side: OrderSide = None, params={})`
- `describe(self)`
- `edit_order(self, id: str, symbol: str, type: OrderType, side: OrderSide, amount: Num = None, price: Num = None, params={})`
- `edit_orders(self, orders: List[OrderRequest], params={})`
- `modify_margin_helper(self, symbol: str, amount, addOrReduce, params={})`
- `nonce(self)`
- `reduce_margin(self, symbol: str, amount: float, params={})`
- `repay_isolated_margin(self, symbol: str, code: str, amount, params={})`
- `set_leverage(self, leverage: int, symbol: Str = None, params={})`
- `set_margin_mode(self, marginMode: str, symbol: Str = None, params={})`
- `transfer(self, code: str, amount: float, fromAccount: str, toAccount: str, params={})`
- `withdraw(self, code: str, amount: float, address: str, tag: Str = None, params={})`
### REST Raw
- `v1_public_get_amm_market(request)`
- `v1_public_get_common_currency_rate(request)`
- `v1_public_get_common_asset_config(request)`
- `v1_public_get_common_maintain_info(request)`
- `v1_public_get_common_temp_maintain_info(request)`
- `v1_public_get_margin_market(request)`
- `v1_public_get_market_info(request)`
- `v1_public_get_market_list(request)`
- `v1_public_get_market_ticker(request)`
- `v1_public_get_market_ticker_all(request)`
- `v1_public_get_market_depth(request)`
- `v1_public_get_market_deals(request)`
- `v1_public_get_market_kline(request)`
- `v1_public_get_market_detail(request)`
- `v1_private_get_account_amm_balance(request)`
- `v1_private_get_account_investment_balance(request)`
- `v1_private_get_account_balance_history(request)`
- `v1_private_get_account_market_fee(request)`
- `v1_private_get_balance_coin_deposit(request)`
- `v1_private_get_balance_coin_withdraw(request)`
- `v1_private_get_balance_info(request)`
- `v1_private_get_balance_deposit_address_coin_type(request)`
- `v1_private_get_contract_transfer_history(request)`
- `v1_private_get_credit_info(request)`
- `v1_private_get_credit_balance(request)`
- `v1_private_get_investment_transfer_history(request)`
- `v1_private_get_margin_account(request)`
- `v1_private_get_margin_config(request)`
- `v1_private_get_margin_loan_history(request)`
- `v1_private_get_margin_transfer_history(request)`
- `v1_private_get_order_deals(request)`
- `v1_private_get_order_finished(request)`
- `v1_private_get_order_pending(request)`
- `v1_private_get_order_status(request)`
- `v1_private_get_order_status_batch(request)`
- `v1_private_get_order_user_deals(request)`
- `v1_private_get_order_stop_finished(request)`
- `v1_private_get_order_stop_pending(request)`
- `v1_private_get_order_user_trade_fee(request)`
- `v1_private_get_order_market_trade_info(request)`
- `v1_private_get_sub_account_balance(request)`
- `v1_private_get_sub_account_transfer_history(request)`
- `v1_private_get_sub_account_auth_api(request)`
- `v1_private_get_sub_account_auth_api_user_auth_id(request)`
- `v1_private_post_balance_coin_withdraw(request)`
- `v1_private_post_contract_balance_transfer(request)`
- `v1_private_post_margin_flat(request)`
- `v1_private_post_margin_loan(request)`
- `v1_private_post_margin_transfer(request)`
- `v1_private_post_order_limit_batch(request)`
- `v1_private_post_order_ioc(request)`
- `v1_private_post_order_limit(request)`
- `v1_private_post_order_market(request)`
- `v1_private_post_order_modify(request)`
- `v1_private_post_order_stop_limit(request)`
- `v1_private_post_order_stop_market(request)`
- `v1_private_post_order_stop_modify(request)`
- `v1_private_post_sub_account_transfer(request)`
- `v1_private_post_sub_account_register(request)`
- `v1_private_post_sub_account_unfrozen(request)`
- `v1_private_post_sub_account_frozen(request)`
- `v1_private_post_sub_account_auth_api(request)`
- `v1_private_put_balance_deposit_address_coin_type(request)`
- `v1_private_put_sub_account_unfrozen(request)`
- `v1_private_put_sub_account_frozen(request)`
- `v1_private_put_sub_account_auth_api_user_auth_id(request)`
- `v1_private_put_v1_account_settings(request)`
- `v1_private_delete_balance_coin_withdraw(request)`
- `v1_private_delete_order_pending_batch(request)`
- `v1_private_delete_order_pending(request)`
- `v1_private_delete_order_stop_pending(request)`
- `v1_private_delete_order_stop_pending_id(request)`
- `v1_private_delete_order_pending_by_client_id(request)`
- `v1_private_delete_order_stop_pending_by_client_id(request)`
- `v1_private_delete_sub_account_auth_api_user_auth_id(request)`
- `v1_private_delete_sub_account_authorize_id(request)`
- `v1_perpetualpublic_get_ping(request)`
- `v1_perpetualpublic_get_time(request)`
- `v1_perpetualpublic_get_market_list(request)`
- `v1_perpetualpublic_get_market_limit_config(request)`
- `v1_perpetualpublic_get_market_ticker(request)`
- `v1_perpetualpublic_get_market_ticker_all(request)`
- `v1_perpetualpublic_get_market_depth(request)`
- `v1_perpetualpublic_get_market_deals(request)`
- `v1_perpetualpublic_get_market_funding_history(request)`
- `v1_perpetualpublic_get_market_kline(request)`
- `v1_perpetualprivate_get_market_user_deals(request)`
- `v1_perpetualprivate_get_asset_query(request)`
- `v1_perpetualprivate_get_order_pending(request)`
- `v1_perpetualprivate_get_order_finished(request)`
- `v1_perpetualprivate_get_order_stop_finished(request)`
- `v1_perpetualprivate_get_order_stop_pending(request)`
- `v1_perpetualprivate_get_order_status(request)`
- `v1_perpetualprivate_get_order_stop_status(request)`
- `v1_perpetualprivate_get_position_finished(request)`
- `v1_perpetualprivate_get_position_pending(request)`
- `v1_perpetualprivate_get_position_funding(request)`
- `v1_perpetualprivate_get_position_adl_history(request)`
- `v1_perpetualprivate_get_market_preference(request)`
- `v1_perpetualprivate_get_position_margin_history(request)`
- `v1_perpetualprivate_get_position_settle_history(request)`
- `v1_perpetualprivate_post_market_adjust_leverage(request)`
- `v1_perpetualprivate_post_market_position_expect(request)`
- `v1_perpetualprivate_post_order_put_limit(request)`
- `v1_perpetualprivate_post_order_put_market(request)`
- `v1_perpetualprivate_post_order_put_stop_limit(request)`
- `v1_perpetualprivate_post_order_put_stop_market(request)`
- `v1_perpetualprivate_post_order_modify(request)`
- `v1_perpetualprivate_post_order_modify_stop(request)`
- `v1_perpetualprivate_post_order_cancel(request)`
- `v1_perpetualprivate_post_order_cancel_all(request)`
- `v1_perpetualprivate_post_order_cancel_batch(request)`
- `v1_perpetualprivate_post_order_cancel_stop(request)`
- `v1_perpetualprivate_post_order_cancel_stop_all(request)`
- `v1_perpetualprivate_post_order_close_limit(request)`
- `v1_perpetualprivate_post_order_close_market(request)`
- `v1_perpetualprivate_post_position_adjust_margin(request)`
- `v1_perpetualprivate_post_position_stop_loss(request)`
- `v1_perpetualprivate_post_position_take_profit(request)`
- `v1_perpetualprivate_post_position_market_close(request)`
- `v1_perpetualprivate_post_order_cancel_by_client_id(request)`
- `v1_perpetualprivate_post_order_cancel_stop_by_client_id(request)`
- `v1_perpetualprivate_post_market_preference(request)`
- `v2_public_get_maintain_info(request)`
- `v2_public_get_ping(request)`
- `v2_public_get_time(request)`
- `v2_public_get_spot_market(request)`
- `v2_public_get_spot_ticker(request)`
- `v2_public_get_spot_depth(request)`
- `v2_public_get_spot_deals(request)`
- `v2_public_get_spot_kline(request)`
- `v2_public_get_spot_index(request)`
- `v2_public_get_futures_market(request)`
- `v2_public_get_futures_ticker(request)`
- `v2_public_get_futures_depth(request)`
- `v2_public_get_futures_deals(request)`
- `v2_public_get_futures_kline(request)`
- `v2_public_get_futures_index(request)`
- `v2_public_get_futures_funding_rate(request)`
- `v2_public_get_futures_funding_rate_history(request)`
- `v2_public_get_futures_premium_index_history(request)`
- `v2_public_get_futures_position_level(request)`
- `v2_public_get_futures_liquidation_history(request)`
- `v2_public_get_futures_basis_history(request)`
- `v2_public_get_assets_deposit_withdraw_config(request)`
- `v2_public_get_assets_all_deposit_withdraw_config(request)`
- `v2_private_get_account_subs(request)`
- `v2_private_get_account_subs_api_detail(request)`
- `v2_private_get_account_subs_info(request)`
- `v2_private_get_account_subs_api(request)`
- `v2_private_get_account_subs_transfer_history(request)`
- `v2_private_get_account_subs_balance(request)`
- `v2_private_get_account_subs_spot_balance(request)`
- `v2_private_get_account_trade_fee_rate(request)`
- `v2_private_get_account_futures_market_settings(request)`
- `v2_private_get_account_info(request)`
- `v2_private_get_assets_spot_balance(request)`
- `v2_private_get_assets_futures_balance(request)`
- `v2_private_get_assets_margin_balance(request)`
- `v2_private_get_assets_financial_balance(request)`
- `v2_private_get_assets_amm_liquidity(request)`
- `v2_private_get_assets_credit_info(request)`
- `v2_private_get_assets_spot_transcation_history(request)`
- `v2_private_get_assets_margin_borrow_history(request)`
- `v2_private_get_assets_margin_interest_limit(request)`
- `v2_private_get_assets_deposit_address(request)`
- `v2_private_get_assets_deposit_history(request)`
- `v2_private_get_assets_withdraw(request)`
- `v2_private_get_assets_transfer_history(request)`
- `v2_private_get_assets_amm_liquidity_pool(request)`
- `v2_private_get_assets_amm_income_history(request)`
- `v2_private_get_spot_order_status(request)`
- `v2_private_get_spot_batch_order_status(request)`
- `v2_private_get_spot_pending_order(request)`
- `v2_private_get_spot_finished_order(request)`
- `v2_private_get_spot_pending_stop_order(request)`
- `v2_private_get_spot_finished_stop_order(request)`
- `v2_private_get_spot_user_deals(request)`
- `v2_private_get_spot_order_deals(request)`
- `v2_private_get_futures_order_status(request)`
- `v2_private_get_futures_batch_order_status(request)`
- `v2_private_get_futures_pending_order(request)`
- `v2_private_get_futures_finished_order(request)`
- `v2_private_get_futures_pending_stop_order(request)`
- `v2_private_get_futures_finished_stop_order(request)`
- `v2_private_get_futures_user_deals(request)`
- `v2_private_get_futures_order_deals(request)`
- `v2_private_get_futures_pending_position(request)`
- `v2_private_get_futures_finished_position(request)`
- `v2_private_get_futures_position_margin_history(request)`
- `v2_private_get_futures_position_funding_history(request)`
- `v2_private_get_futures_position_adl_history(request)`
- `v2_private_get_futures_position_settle_history(request)`
- `v2_private_get_refer_referee(request)`
- `v2_private_get_refer_referee_rebate_record(request)`
- `v2_private_get_refer_referee_rebate_detail(request)`
- `v2_private_get_refer_agent_referee(request)`
- `v2_private_get_refer_agent_rebate_record(request)`
- `v2_private_get_refer_agent_rebate_detail(request)`
- `v2_private_post_account_subs(request)`
- `v2_private_post_account_subs_frozen(request)`
- `v2_private_post_account_subs_unfrozen(request)`
- `v2_private_post_account_subs_api(request)`
- `v2_private_post_account_subs_edit_api(request)`
- `v2_private_post_account_subs_delete_api(request)`
- `v2_private_post_account_subs_transfer(request)`
- `v2_private_post_account_settings(request)`
- `v2_private_post_account_futures_market_settings(request)`
- `v2_private_post_assets_margin_borrow(request)`
- `v2_private_post_assets_margin_repay(request)`
- `v2_private_post_assets_renewal_deposit_address(request)`
- `v2_private_post_assets_withdraw(request)`
- `v2_private_post_assets_cancel_withdraw(request)`
- `v2_private_post_assets_transfer(request)`
- `v2_private_post_assets_amm_add_liquidity(request)`
- `v2_private_post_assets_amm_remove_liquidity(request)`
- `v2_private_post_spot_order(request)`
- `v2_private_post_spot_stop_order(request)`
- `v2_private_post_spot_batch_order(request)`
- `v2_private_post_spot_batch_stop_order(request)`
- `v2_private_post_spot_modify_order(request)`
- `v2_private_post_spot_modify_stop_order(request)`
- `v2_private_post_spot_batch_modify_order(request)`
- `v2_private_post_spot_cancel_all_order(request)`
- `v2_private_post_spot_cancel_order(request)`
- `v2_private_post_spot_cancel_stop_order(request)`
- `v2_private_post_spot_cancel_batch_order(request)`
- `v2_private_post_spot_cancel_batch_stop_order(request)`
- `v2_private_post_spot_cancel_order_by_client_id(request)`
- `v2_private_post_spot_cancel_stop_order_by_client_id(request)`
- `v2_private_post_futures_order(request)`
- `v2_private_post_futures_stop_order(request)`
- `v2_private_post_futures_batch_order(request)`
- `v2_private_post_futures_batch_stop_order(request)`
- `v2_private_post_futures_cancel_position_stop_loss(request)`
- `v2_private_post_futures_cancel_position_take_profit(request)`
- `v2_private_post_futures_modify_order(request)`
- `v2_private_post_futures_modify_stop_order(request)`
- `v2_private_post_futures_batch_modify_order(request)`
- `v2_private_post_futures_cancel_all_order(request)`
- `v2_private_post_futures_cancel_order(request)`
- `v2_private_post_futures_cancel_stop_order(request)`
- `v2_private_post_futures_cancel_batch_order(request)`
- `v2_private_post_futures_cancel_batch_stop_order(request)`
- `v2_private_post_futures_cancel_order_by_client_id(request)`
- `v2_private_post_futures_cancel_stop_order_by_client_id(request)`
- `v2_private_post_futures_close_position(request)`
- `v2_private_post_futures_adjust_position_margin(request)`
- `v2_private_post_futures_adjust_position_leverage(request)`
- `v2_private_post_futures_set_position_stop_loss(request)`
- `v2_private_post_futures_set_position_take_profit(request)`
### WS Unified
- `describe(self)`
- `watch_balance(self, params={})`
- `watch_my_trades(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_ticker(self, symbol: str, params={})`
- `watch_tickers(self, symbols: Strings = None, params={})`
- `watch_trades(self, symbol: str, since: Int = None, limit: Int = None, params={})`
- `watch_trades_for_symbols(self, symbols: List[str], since: Int = None, limit: Int = None, params={})`
- `watch_order_book_for_symbols(self, symbols: List[str], limit: Int = None, params={})`
- `watch_order_book(self, symbol: str, limit: Int = None, params={})`
- `watch_orders(self, symbol: Str = None, since: Int = None, limit: Int = None, params={})`
- `watch_bids_asks(self, symbols: Strings = None, params={})`
- `authenticate(self, type: str)`
## Contribution
- Give us a star :star:
- Fork and Clone! Awesome
- Select existing issues or create a new issue. | text/markdown | null | CCXT <info@ccxt.trade> | null | null | MIT | null | [
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Topic :: Office/Business :: Financial :: Inve... | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/ccxt/ccxt",
"Issues, https://github.com/ccxt/ccxt"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:13:35.149329 | coinex_api-0.0.124.tar.gz | 750,075 | 08/01/01614f5a0d7a040c8355128f9c0dea3ecd4b78f025c7b4c3bf4d9ae5725f/coinex_api-0.0.124.tar.gz | source | sdist | null | false | de3c32d34b4120c54b0ed72b0fc2940c | 069667461b88be18703bcb9f42d93f08f87ac8c9d32d998c17e66fb1438d7d11 | 080101614f5a0d7a040c8355128f9c0dea3ecd4b78f025c7b4c3bf4d9ae5725f | null | [] | 242 |
2.4 | forexrates | 1.0.0 | A unified codebase for fetching FOREX rates. | <div align = "center">
# FOREXRates (`forexrates`) ETL Module
[](https://github.com/sharkutilities/forexrates/issues)
[](https://github.com/sharkutilities/forexrates/network)
[](https://github.com/sharkutilities/forexrates/stargazers)
[](https://github.com/sharkutilities/forexrates/blob/master/LICENSE)
[](https://pypistats.org/packages/forexrates/)
[](https://pypi.org/project/forexrates/)
</div>
<div align = "justify">
A module that provides unified ETL functionality to fetch, transform, and track foreign exchange rates from various API sources.
The module is built around an abstraction principle, so forward integrations are easy to add and a data source can be replaced
by changing only the underlying URL while keeping the code structure the same.
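The swap-the-URL idea can be sketched in a few lines; the class and URLs below are hypothetical placeholders for illustration, not the actual `forexrates` API:

```python
from dataclasses import dataclass

@dataclass
class RateFetcher:
    """Hypothetical sketch of the abstraction idea: the ETL structure
    stays fixed while the underlying source URL is swappable."""
    base_url: str  # only this changes when the data source is replaced

    def build_request(self, base: str, quote: str) -> str:
        # identical request-building code regardless of the source
        return f"{self.base_url}?base={base}&symbols={quote}"

# swapping sources means swapping the URL, nothing else
primary = RateFetcher("https://api.example-forex.com/latest")
fallback = RateFetcher("https://backup.example-forex.net/latest")
print(primary.build_request("USD", "INR"))
```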
## Getting Started
The source code is hosted at GitHub: [**sharkutilities/forexrates**](https://github.com/sharkutilities/forexrates).
The binary installers for the latest release are available at the [Python Package Index (PyPI)](https://pypi.org/project/forexrates/).
```shell
pip install -U forexrates
```
We are continuously adding support for new websites; please check the [CHANGELOG](./CHANGELOG.md) for the list of supported APIs
for fetching foreign exchange rates. Helper functions ([Jupyter notebooks](./examples/), [CI/CD functions](./main/)) are only available
at [**sharkutilities/forexrates**](https://github.com/sharkutilities/forexrates) and are not packaged with the module.
</div>
| text/markdown | shark-utilities developers | shark-utilities developers <neuralNOD@outlook.com> | null | null | MIT License
Copyright (c) 2025 Debmalya Pramanik
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
| forex, api, exchange rates, foreign exchange rates, wrappers, data science, data analysis, data scientist, data analyst | [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"Intended Audience :: Education",
"Intended Audience :: End Users/Desktop",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"Intended Audience :: Science/Research",
... | [] | https://github.com/sharkutilities/forexrates | null | >=3.11 | [] | [] | [] | [
"numpy==2.4.2",
"pandas==3.0.0",
"psycopg==3.3.2",
"psycopg-binary==3.3.2",
"PyYAML==6.0.3",
"requests==2.32.5",
"SQLAlchemy==2.0.46",
"tqdm==4.67.3"
] | [] | [] | [] | [
"Homepage, https://github.com/sharkutilities/forexrates",
"Issue Tracker, https://github.com/sharkutilities/forexrates/issues",
"Org. Homepage, https://github.com/sharkutilities"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:13:07.604972 | forexrates-1.0.0.tar.gz | 12,289 | 29/c3/49098ec4af9ead92e40e606e71901413f0d0aaceba31cc7dc7b895d7fb2a/forexrates-1.0.0.tar.gz | source | sdist | null | false | bd4d8863bd852d5b3873742d764d5442 | 30d8d2b4c3690922031841af404e143be55db018eff9c11c847798da317183d6 | 29c349098ec4af9ead92e40e606e71901413f0d0aaceba31cc7dc7b895d7fb2a | null | [
"LICENSE"
] | 217 |
2.1 | lpd | 0.4.13 | A Fast, Flexible Trainer with Callbacks and Extensions for PyTorch | 
# lpd
A Fast, Flexible Trainer with Callbacks and Extensions for PyTorch
``lpd`` derives from the Hebrew word *lapid* (לפיד) which means "torch".
## For latest PyPI stable release
[](https://badge.fury.io/py/lpd)
[](https://pepy.tech/project/lpd)

<!--  -->
There are two types of ``lpd`` packages available:
* ``lpd`` which brings dependencies for pytorch, numpy and tensorboard
```sh
pip install lpd
```
* ``lpd-nodeps`` for which **you provide** your own dependencies for pytorch, numpy and tensorboard
```sh
pip install lpd-nodeps
```
<b>[v0.4.13-beta](https://github.com/RoySadaka/lpd/releases) Release - contains the following:</b>
* ``ThresholdChecker`` is updated to compute improvement according to last improved step and not to the best received metric
* Some minor cosmetic changes
Previously on lpd:
* ``Dense`` custom layer to support apply norm (configurable to before or after activation)
* ``StatsPrint`` callback to support printing best confusion matrix when at least one of the metrics is of type ``MetricConfusionMatrixBase``
* ``TransformerEncoderStack`` to support activation as input
* ``PositionalEncoding`` to support more than 3 dimensions input
* Updated Pipfile
* Fixed confusion matrix cpu/gpu device error
* Better handling on callbacks where apply_on_states=None (apply on all states)
* Bug fix in case validation samples are empty
## Usage
``lpd`` intended to properly structure your PyTorch model training.
The main usages are given below.
### Training your model
```python
from lpd.trainer import Trainer
from lpd.enums import Phase, State, MonitorType, MonitorMode, StatsType
from lpd.callbacks import LossOptimizerHandler, StatsPrint, ModelCheckPoint, Tensorboard, EarlyStopping, SchedulerStep, CallbackMonitor
from lpd.extensions.custom_schedulers import KerasDecay
from lpd.metrics import BinaryAccuracyWithLogits, FalsePositives
from lpd.utils.torch_utils import get_gpu_device_if_available
from lpd.utils.general_utils import seed_all
from lpd.utils.threshold_checker import AbsoluteThresholdChecker
seed_all(seed=42) # because it's the answer to life and the universe
device = get_gpu_device_if_available() # with fallback to CPU if GPU not available
model = MyModel().to(device) # this is your model class, and its being sent to the relevant device
optimizer = torch.optim.SGD(params=model.parameters())
scheduler = KerasDecay(optimizer, decay=0.01, last_step=-1) # decay scheduler using keras formula
loss_func = torch.nn.BCEWithLogitsLoss().to(device) # this is your loss class, already sent to the relevant device
metrics = [BinaryAccuracyWithLogits(name='Accuracy'), FalsePositives(name='FP', num_class=2, threshold=0)] # define your metrics
# you can use some of the defined callbacks, or you can create your own
callbacks = [
LossOptimizerHandler(),
SchedulerStep(apply_on_phase=Phase.BATCH_END, apply_on_states=State.TRAIN),
ModelCheckPoint(checkpoint_dir,
checkpoint_file_name,
CallbackMonitor(monitor_type=MonitorType.LOSS,
stats_type=StatsType.VAL,
monitor_mode=MonitorMode.MIN),
save_best_only=True),
Tensorboard(summary_writer_dir=summary_writer_dir),
EarlyStopping(CallbackMonitor(monitor_type=MonitorType.METRIC,
stats_type=StatsType.VAL,
monitor_mode=MonitorMode.MAX,
patience=10,
metric_name='Accuracy'),
threshold_checker=AbsoluteThresholdChecker(monitor_mode=MonitorMode.MAX, threshold=0.01)),
StatsPrint(train_metrics_monitors=[CallbackMonitor(monitor_type=MonitorType.METRIC,
stats_type=StatsType.TRAIN,
monitor_mode=MonitorMode.MAX, # <-- notice MAX
metric_name='Accuracy'),
CallbackMonitor(monitor_type=MonitorType.METRIC,
stats_type=StatsType.TRAIN,
monitor_mode=MonitorMode.MIN, # <-- notice MIN
metric_name='FP')],
print_confusion_matrix=True) # since one of the metric (FalsePositives) is confusion matrix based, lets print the whole confusion matrix
]
trainer = Trainer(model,
device,
loss_func,
optimizer,
scheduler,
metrics,
train_data_loader, # DataLoader, Iterable or Generator
val_data_loader, # DataLoader, Iterable or Generator
train_steps,
val_steps,
callbacks,
name='Readme-Example')
trainer.train(num_epochs)
```
### Evaluating your model
``trainer.evaluate`` will return a ``StatsResult`` that stores the loss and metric results for the test set
```python
evaluation_result = trainer.evaluate(test_data_loader, test_steps)
```
### Making predictions
The ``Predictor`` class generates output predictions from input samples.
A ``Predictor`` can be created from a ``Trainer``
```python
predictor_from_trainer = Predictor.from_trainer(trainer)
predictions = predictor_from_trainer.predict_batch(batch)
```
A ``Predictor`` can also be created from a saved checkpoint
```python
predictor_from_checkpoint = Predictor.from_checkpoint(checkpoint_dir,
checkpoint_file_name,
model, # nn.Module, weights will be loaded from checkpoint
device)
prediction = predictor_from_checkpoint.predict_sample(sample)
```
Lastly, a ``Predictor`` can be initialized explicitly
```python
predictor = Predictor(model,
device,
callbacks, # relevant only for prediction callbacks (see callbacks Phases and States)
name='lpd predictor')
predictions = predictor.predict_data_loader(data_loader, steps)
```
Just to be fair, you can also predict directly from the ``Trainer`` class
```python
# On single sample:
prediction = trainer.predict_sample(sample)
# On batch:
predictions = trainer.predict_batch(batch)
# On Dataloader/Iterable/Generator:
predictions = trainer.predict_data_loader(data_loader, steps)
```
## TrainerStats
``Trainer`` tracks stats for `train/validate/test` and you can access them in your custom callbacks
or any other place that has access to your trainer.
Here are some examples:
```python
train_loss = trainer.train_stats.get_loss() # the mean of the last epoch's train losses
val_loss = trainer.val_stats.get_loss() # the mean of the last epoch's validation losses
test_loss = trainer.test_stats.get_loss() # the mean of the test losses (available only after calling evaluate)
train_metrics = trainer.train_stats.get_metrics() # dict(metric_name, MetricMethod(values)) of the current epoch in train state
val_metrics = trainer.val_stats.get_metrics() # dict(metric_name, MetricMethod(values)) of the current epoch in validation state
test_metrics = trainer.test_stats.get_metrics() # dict(metric_name, MetricMethod(values)) of the test (available only after calling evaluate)
```
## Callbacks
Callbacks are used to perform actions at various stages of training, evaluation, and prediction.
Some common callbacks are available under ``lpd.callbacks``, and you can also create your own, more details below.
In a callback, ``apply_on_phase`` (``lpd.enums.Phase``) will determine the execution phase,
and ``apply_on_states`` (``lpd.enums.State`` or ``list(lpd.enums.State)``) will determine the execution states
These are the current available phases and states, more might be added in future releases
### Training and Validation phases and states will behave as follows
```python
State.EXTERNAL
Phase.TRAIN_BEGIN
# train loop:
Phase.EPOCH_BEGIN
State.TRAIN
# batches loop:
Phase.BATCH_BEGIN
# batch
Phase.BATCH_END
State.VAL
# batches loop:
Phase.BATCH_BEGIN
# batch
Phase.BATCH_END
State.EXTERNAL
Phase.EPOCH_END
Phase.TRAIN_END
```
### Evaluation phases and states will behave as follows
```python
State.EXTERNAL
Phase.TEST_BEGIN
State.TEST
# batches loop:
Phase.BATCH_BEGIN
# batch
Phase.BATCH_END
State.EXTERNAL
Phase.TEST_END
```
### Predict phases and states will behave as follows
```python
State.EXTERNAL
Phase.PREDICT_BEGIN
State.PREDICT
# batches loop:
Phase.BATCH_BEGIN
# batch
Phase.BATCH_END
State.EXTERNAL
Phase.PREDICT_END
```
Callbacks are executed in the relevant phase and state, in the order they were provided.
With phases and states, you have full control over the timing of your callbacks.
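The phase/state filtering can be sketched as a simple dispatch loop. This is an illustrative sketch with hypothetical names (``DummyCallback``, ``dispatch``), not lpd's actual internals:

```python
class DummyCallback:
    """Hypothetical stand-in for a CallbackBase-style callback."""
    def __init__(self, apply_on_phase, apply_on_states):
        self.apply_on_phase = apply_on_phase
        self.apply_on_states = set(apply_on_states)
        self.fired = 0

    def __call__(self, context):
        self.fired += 1

def dispatch(callbacks, phase, state, context=None):
    # run, in registration order, only the callbacks registered for this
    # phase whose states include the current state
    for cb in callbacks:
        if cb.apply_on_phase == phase and state in cb.apply_on_states:
            cb(context)

train_cb = DummyCallback("BATCH_END", ["TRAIN"])
val_cb = DummyCallback("BATCH_END", ["VAL"])
dispatch([train_cb, val_cb], phase="BATCH_END", state="TRAIN")
# only train_cb fires, since val_cb is not registered for the TRAIN state
```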
Let's take a look at some of the callbacks ``lpd`` provides:
### LossOptimizerHandler Callback
Derives from ``LossOptimizerHandlerBase``, probably the most important callback during training 😎
Use ``LossOptimizerHandler`` to determine when to call:
```python
loss.backward(...)
optimizer.step(...)
optimizer.zero_grad(...)
```
Or, you may choose to create your own ``AwesomeLossOptimizerHandler`` class by deriving from ``LossOptimizerHandlerBase``.
``Trainer.train(...)`` will validate that at least one ``LossOptimizerHandlerBase`` callback was provided.
### LossOptimizerHandlerAccumulateBatches Callback
Like ``LossOptimizerHandlerAccumulateSamples``, this callback calls ``loss.backward()`` on every batch, but invokes ``optimizer.step()`` and ``optimizer.zero_grad()``
only after the defined number of batches (or samples) has been accumulated.
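The accumulation logic can be sketched in plain Python. This is illustrative only; ``FakeLoss``, ``FakeOptimizer``, and the loop are hypothetical stand-ins, not lpd's implementation:

```python
class FakeLoss:
    def __init__(self):
        self.backward_calls = 0
    def backward(self):
        self.backward_calls += 1

class FakeOptimizer:
    def __init__(self):
        self.step_calls = 0
        self.zero_grad_calls = 0
    def step(self):
        self.step_calls += 1
    def zero_grad(self):
        self.zero_grad_calls += 1

def run_epoch(loss, optimizer, num_batches, accumulate_every):
    # backward on every batch; step/zero_grad only once per accumulation window
    for batch_idx in range(1, num_batches + 1):
        loss.backward()
        if batch_idx % accumulate_every == 0:
            optimizer.step()
            optimizer.zero_grad()

loss, opt = FakeLoss(), FakeOptimizer()
run_epoch(loss, opt, num_batches=8, accumulate_every=4)
# backward runs 8 times; step/zero_grad run only twice (after batches 4 and 8)
```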
### StatsPrint Callback
``StatsPrint`` callback prints informative summary of the trainer stats including loss and metrics.
* ``CallbackMonitor`` can add a nicer look, with an ``IMPROVED`` indication on an improved loss or metric, see the output example below.
* Loss (for all states) will be monitored as ``MonitorMode.MIN``
* For train metrics, provide your own monitors via the ``train_metrics_monitors`` argument
* Validation metrics monitors will be added automatically according to the ``train_metrics_monitors`` argument
```python
from lpd.enums import Phase, State, MonitorType, StatsType, MonitorMode

StatsPrint(apply_on_phase=Phase.EPOCH_END,
           apply_on_states=State.EXTERNAL,
           train_metrics_monitors=CallbackMonitor(monitor_type=MonitorType.METRIC,
                                                  stats_type=StatsType.TRAIN,
                                                  monitor_mode=MonitorMode.MAX,
                                                  metric_name='TruePositives'),
           print_confusion_matrix_normalized=True)  # in case you use one of the ConfusionMatrix metrics (e.g. TruePositives), you may also print the confusion matrix
```
Output example:

### ModelCheckPoint Callback
Saves a checkpoint when a monitored loss/metric has improved.
The callback will save the model, optimizer, scheduler, and epoch number.
You can also configure it to save the full trainer.
For example, a ``ModelCheckPoint`` that will save a new *full trainer checkpoint* every time the validation metric ``my_metric``
exceeds its highest value so far:
```python
ModelCheckPoint(Phase.EPOCH_END,
                State.EXTERNAL,
                checkpoint_dir,
                checkpoint_file_name,
                CallbackMonitor(monitor_type=MonitorType.METRIC,  # it's a Metric and not a Loss
                                stats_type=StatsType.VAL,         # check the value on the validation set
                                monitor_mode=MonitorMode.MAX,     # MAX indicates higher is better
                                metric_name='my_metric'),         # since it's a Metric, mention its name
                save_best_only=False,
                save_full_trainer=True)
```
### EarlyStopping Callback
Stops the trainer when a monitored loss/metric has stopped improving.
For example, an ``EarlyStopping`` callback that monitors at the end of every epoch, and stops the trainer if the validation loss didn't improve (decrease) for the last 10 epochs:
```python
EarlyStopping(Phase.EPOCH_END,
              State.EXTERNAL,
              CallbackMonitor(monitor_type=MonitorType.LOSS,
                              stats_type=StatsType.VAL,
                              monitor_mode=MonitorMode.MIN,
                              patience=10))
```
### SchedulerStep Callback
Invokes ``step()`` on your scheduler in the desired phase and state.
For example, a ``SchedulerStep`` callback that invokes ``scheduler.step()`` at the end of every batch, in train state only (as opposed to validation and test):
```python
from lpd.callbacks import SchedulerStep
from lpd.enums import Phase, State
SchedulerStep(apply_on_phase=Phase.BATCH_END, apply_on_states=State.TRAIN)
```
### Tensorboard Callback
Exports the loss and the metrics at a given phase and state, in a format that can be viewed on Tensorboard:
```python
from lpd.callbacks import Tensorboard
Tensorboard(apply_on_phase=Phase.EPOCH_END,
            apply_on_states=State.EXTERNAL,
            summary_writer_dir=dir_path)
```
### TensorboardImage Callback
Exports images, in a format that can be viewed on Tensorboard.
For example, a ``TensorboardImage`` callback that will output all the images generated in validation:
```python
from lpd.callbacks import TensorboardImage
TensorboardImage(apply_on_phase=Phase.BATCH_END,
                 apply_on_states=State.VAL,
                 summary_writer_dir=dir_path,
                 description='Generated Images',
                 outputs_parser=None)
```
Let's pass an ``outputs_parser`` that will rescale the outputs from [-1,1] to [0,255]
```python
import torchvision
from lpd.callbacks import TensorboardImage

def outputs_parser(input_output_label: InputOutputLabel):
    outputs_scaled = (input_output_label.outputs + 1.0) / 2.0 * 255
    outputs_scaled = torchvision.utils.make_grid(outputs_scaled)
    return outputs_scaled

TensorboardImage(apply_on_phase=Phase.BATCH_END,
                 apply_on_states=State.VAL,
                 summary_writer_dir=dir_path,
                 description='Generated Images',
                 outputs_parser=outputs_parser)
```
### CollectOutputs Callback
Collects the model's outputs for the defined states.
``CollectOutputs`` is automatically used by ``Trainer`` to collect the predictions when calling one of the ``predict`` methods.
```python
CollectOutputs(apply_on_phase=Phase.BATCH_END, apply_on_states=State.VAL)
```
### Create your custom callbacks
```python
from lpd.enums import Phase, State
from lpd.callbacks import CallbackBase

class MyAwesomeCallback(CallbackBase):
    def __init__(self, apply_on_phase=Phase.BATCH_END, apply_on_states=[State.TRAIN, State.VAL]):
        # make sure to call the parent class init
        super(MyAwesomeCallback, self).__init__(apply_on_phase, apply_on_states)

    def __call__(self, callback_context):  # <=== implement this method!
        # your implementation here
        # using callback_context, you can access anything in your trainer
        # below are some examples to get the hang of it
        val_loss = callback_context.val_stats.get_loss()
        train_loss = callback_context.train_stats.get_loss()
        train_metrics = callback_context.train_stats.get_metrics()
        val_metrics = callback_context.val_stats.get_metrics()
        optimizer = callback_context.optimizer
        scheduler = callback_context.scheduler
        trainer = callback_context.trainer

        if val_loss < 0.0001:
            # you can also mark the trainer to STOP training by calling stop()
            trainer.stop()
```
Let's expand ``MyAwesomeCallback`` with ``CallbackMonitor`` to track whether our validation loss is improving
```python
from lpd.callbacks import CallbackBase, CallbackMonitor  # <== CallbackMonitor added
from lpd.enums import Phase, State, MonitorType, StatsType, MonitorMode  # <== added a few enums needed to configure CallbackMonitor

class MyAwesomeCallback(CallbackBase):
    def __init__(self, apply_on_phase=Phase.BATCH_END, apply_on_states=[State.TRAIN, State.VAL]):
        super(MyAwesomeCallback, self).__init__(apply_on_phase, apply_on_states)
        # adding CallbackMonitor to track VAL LOSS with regards to MIN (lower is better) and patience of 20 epochs
        self.val_loss_monitor = CallbackMonitor(MonitorType.LOSS, StatsType.VAL, MonitorMode.MIN, patience=20)

    def __call__(self, callback_context):  # <=== implement this method!
        # same as before, using callback_context, you can access anything in your trainer
        train_metrics = callback_context.train_stats.get_metrics()
        val_metrics = callback_context.val_stats.get_metrics()

        # invoke track() on your monitor and pass callback_context as a parameter
        # since you configured your val_loss_monitor, it will get the relevant parameters from callback_context
        monitor_result = self.val_loss_monitor.track(callback_context)

        # monitor_result (lpd.callbacks.CallbackMonitorResult) contains informative properties
        # for example, let's check the status of the patience countdown
        if monitor_result.has_patience():
            print(f'[MyAwesomeCallback] - patience left: {monitor_result.patience_left}')

        # or, let's stop the trainer by calling trainer.stop()
        # if our monitored value did not improve
        if not monitor_result.has_improved():
            print(f'[MyAwesomeCallback] - {monitor_result.description} has stopped improving')
            callback_context.trainer.stop()
```
### CallbackMonitor, AbsoluteThresholdChecker and RelativeThresholdChecker
When using callbacks such as ``EarlyStopping``, a ``CallbackMonitor`` is provided to track
a certain metric and reset/trigger the stopping event (or any event in other callbacks).
``CallbackMonitor`` internally uses a ``ThresholdChecker`` when comparing a new value to the old value
for the tracked metric; ``AbsoluteThresholdChecker`` or ``RelativeThresholdChecker`` is used
to check whether the criterion was met.
The following example creates a ``CallbackMonitor`` that tracks whether the metric 'accuracy'
has increased by more than 1%, using ``RelativeThresholdChecker``
```python
from lpd.utils.threshold_checker import RelativeThresholdChecker

relative_threshold_checker_1_percent = RelativeThresholdChecker(monitor_mode=MonitorMode.MAX, threshold=0.01)

CallbackMonitor(monitor_type=MonitorType.METRIC,  # it's a Metric and not a Loss
                stats_type=StatsType.VAL,         # check the value on the validation set
                monitor_mode=MonitorMode.MAX,     # MAX indicates higher is better
                metric_name='accuracy',           # since it's a Metric, mention its name
                threshold_checker=relative_threshold_checker_1_percent)  # track a 1% increase from the last highest value
```
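Conceptually, the two checkers differ only in how the improvement margin is measured: absolute uses the raw difference, relative uses the difference as a fraction of the best value so far. A plain-Python sketch (the function names and exact comparison semantics here are assumptions, not lpd's code):

```python
def absolute_improved(new_value, best_value, threshold, monitor_mode="MIN"):
    # improved when the raw difference exceeds the threshold
    if monitor_mode == "MIN":
        return (best_value - new_value) > threshold
    return (new_value - best_value) > threshold

def relative_improved(new_value, best_value, threshold, monitor_mode="MAX"):
    # improved when the difference, as a fraction of the best value, exceeds the threshold
    if monitor_mode == "MAX":
        return (new_value - best_value) / abs(best_value) > threshold
    return (best_value - new_value) / abs(best_value) > threshold

# mirrors a 1% relative-increase check: 0.90 -> 0.92 is a ~2.2% gain (improved),
# while 0.90 -> 0.905 is only ~0.6% (not enough)
improved = relative_improved(new_value=0.92, best_value=0.90, threshold=0.01)
not_enough = relative_improved(new_value=0.905, best_value=0.90, threshold=0.01)
```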
## Metrics
``lpd.metrics`` provides metrics to check the accuracy of your model.
Let's create a custom metric using ``MetricBase`` and also show the use of ``BinaryAccuracyWithLogits`` in this example
```python
from lpd.metrics import BinaryAccuracyWithLogits, MetricBase
from lpd.enums import MetricMethod

# our custom metric
class InaccuracyWithLogits(MetricBase):
    def __init__(self):
        super(InaccuracyWithLogits, self).__init__(MetricMethod.MEAN)  # use mean over the batches
        self.bawl = BinaryAccuracyWithLogits()  # we exploit BinaryAccuracyWithLogits for the computation

    def __call__(self, y_pred, y_true):  # <=== implement this method!
        # your implementation here
        acc = self.bawl(y_pred, y_true)
        return 1 - acc  # return the inaccuracy

# we can now define our metrics and pass them to the trainer
metrics = [BinaryAccuracyWithLogits(name='accuracy'), InaccuracyWithLogits(name='inaccuracy')]
```
Let's do another example: a custom metric ``Truthfulness``, based on the confusion matrix, using ``MetricConfusionMatrixBase``
```python
from lpd.metrics import MetricConfusionMatrixBase, TruePositives, TrueNegatives
from lpd.enums import ConfusionMatrixBasedMetric

# our custom metric
class Truthfulness(MetricConfusionMatrixBase):
    def __init__(self, num_classes, labels=None, predictions_to_classes_convertor=None, threshold=0.5):
        super(Truthfulness, self).__init__(num_classes, labels, predictions_to_classes_convertor, threshold)
        self.tp = TruePositives(num_classes, labels, predictions_to_classes_convertor, threshold)  # we exploit TruePositives for the computation
        self.tn = TrueNegatives(num_classes, labels, predictions_to_classes_convertor, threshold)  # we exploit TrueNegatives for the computation

    def __call__(self, y_pred, y_true):  # <=== implement this method!
        tp_per_class = self.tp(y_pred, y_true)
        tn_per_class = self.tn(y_pred, y_true)

        # you can also access more confusion matrix metrics, such as
        f1score = self.get_stats(ConfusionMatrixBasedMetric.F1SCORE)
        precision = self.get_stats(ConfusionMatrixBasedMetric.PRECISION)
        recall = self.get_stats(ConfusionMatrixBasedMetric.RECALL)
        # see the ConfusionMatrixBasedMetric enum for more

        return tp_per_class + tn_per_class
```
## Save and Load full Trainer
Sometimes you just want to save everything so you can continue training where you left off.
To do so, you may use ``ModelCheckPoint`` to save the full trainer by setting the parameter
```python
save_full_trainer=True
```
Or, you can invoke it directly from your trainer
```python
your_trainer.save_trainer(dir_path, file_name)
```
Loading a trainer from checkpoint is as simple as:
```python
loaded_trainer = Trainer.load_trainer(dir_path,           # the folder where the saved trainer file exists
                                      trainer_file_name,  # the saved trainer file name
                                      model,              # state_dict will be loaded
                                      device,
                                      loss_func,          # state_dict will be loaded
                                      optimizer,          # state_dict will be loaded
                                      scheduler,          # state_dict will be loaded
                                      train_data_loader,  # provide new/previous data_loader
                                      val_data_loader,    # provide new/previous data_loader
                                      train_steps,
                                      val_steps)
```
### Utils
``lpd.utils`` provides ``torch_utils``, ``file_utils`` and ``general_utils``.
For example, a good practice is to use ``seed_all`` as early as possible in your code, to make sure that results are reproducible:
```python
import lpd.utils.general_utils as gu
gu.seed_all(seed=42)  # because it's the answer to life and the universe
```
### Extensions
``lpd.extensions`` provides some custom PyTorch layers and schedulers. These are utilities we like using when we create our models, to gain better flexibility.
Use them at your own will; more extensions are added from time to time.
## Something is missing? Please share with us
You can open an issue, or feel free to email us at torch.lpd@gmail.com
| text/markdown | Roy Sadaka | null | lpd developers | torch.lpd@gmail.com | MIT Licences | pytorch, trainer, callback, callbacks, earlystopping, tensorboard, modelcheckpoint, checkpoint, layers, dense, metrics, predictor, binary accuracy, extensions, track, monitor, machine, deep learning, neural, networks, AI, keras decay, confusion matrix | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"Intended Audience :: Education",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Topic :: Scientific/Engineering :: Art... | [] | https://github.com/roysadaka/lpd | null | >=3.9 | [] | [] | [] | [
"tensorboard",
"numpy",
"torch",
"torchvision",
"tqdm"
] | [] | [] | [] | [] | twine/4.0.1 CPython/3.7.6 | 2026-02-19T10:12:57.865005 | lpd-0.4.13-py3-none-any.whl | 50,220 | 09/c3/e95a2870fe055d6859d8cadd0580b3f63c088443676885f1c2541a72f11a/lpd-0.4.13-py3-none-any.whl | py3 | bdist_wheel | null | false | 1949509a53f661eaadd5c87ad6a4a63a | e02b7deadd6be458777e8a6a6f79ae9dac684052a9bc431ed7c0044753ba0af7 | 09c3e95a2870fe055d6859d8cadd0580b3f63c088443676885f1c2541a72f11a | null | [] | 117 |
2.4 | oarepo-workflows | 2.0.0.dev9 | OARepo module allowing record workflow functionality | # OARepo Workflows
Workflow management for [Invenio](https://inveniosoftware.org/) records.
## Overview
This package enables state-based workflow management for Invenio records with:
- State-based record lifecycle management with timestamps
- Configurable permission policies per workflow state
- Request-based state transitions with approval workflows
- Auto-approval and escalation mechanisms
- Model presets for automatic integration with oarepo-model
- Multiple recipient support for requests
## Installation
```bash
pip install oarepo-workflows
```
### Requirements
- Python 3.13+
- Invenio 14.x (RDM)
- oarepo-runtime >= 2.0.0
## Key Features
### 1. Workflow Definition and Management
**Source:** [`oarepo_workflows/base.py`](oarepo_workflows/base.py), [`oarepo_workflows/ext.py`](oarepo_workflows/ext.py)
Define workflows with state-based permissions and request policies:
```python
from oarepo_workflows import Workflow
from flask_babel import lazy_gettext as _
WORKFLOWS = {
"default": Workflow(
code="default",
label=_("Default Workflow"),
permission_policy_cls=DefaultWorkflowPermissions,
request_policy_cls=DefaultWorkflowRequests,
)
}
```
Access workflows through the extension:
```python
from oarepo_workflows import current_oarepo_workflows
# Get workflow by code
workflow = current_oarepo_workflows.workflow_by_code["default"]
# Get workflow from record
workflow = current_oarepo_workflows.get_workflow(record)
# List all workflows
workflows = current_oarepo_workflows.record_workflows
```
### 2. Record System Fields
**Source:** [`oarepo_workflows/records/systemfields/`](oarepo_workflows/records/systemfields/)
#### State Field
Tracks the current state of a record with automatic timestamp updates:
```python
from oarepo_workflows.records.systemfields import (
RecordStateField,
RecordStateTimestampField,
)
class MyRecord(Record):
state = RecordStateField(initial="draft")
state_timestamp = RecordStateTimestampField()
```
Set state programmatically:
```python
from oarepo_workflows import current_oarepo_workflows
# Change state with automatic notification
current_oarepo_workflows.set_state(
identity,
record,
"published",
commit=True,
notify_later=True
)
```
#### Workflow Field
Links parent records to their workflow definition:
```python
from oarepo_workflows.records.systemfields import WorkflowField
class MyParentRecord(ParentRecord):
workflow = WorkflowField()
```
### 3. Permission Management
**Source:** [`oarepo_workflows/services/permissions/`](oarepo_workflows/services/permissions/)
#### Workflow Permission Policy
Define state-based permissions for record operations:
```python
from oarepo_workflows.services.permissions import (
DefaultWorkflowPermissions,
IfInState,
)
from invenio_rdm_records.services.generators import RecordOwners
from invenio_records_permissions.generators import AuthenticatedUser
class MyWorkflowPermissions(DefaultWorkflowPermissions):
can_create = [AuthenticatedUser()]
can_read = [
IfInState("draft", [RecordOwners()]),
IfInState("published", [AuthenticatedUser()]),
]
can_update = [
IfInState("draft", [RecordOwners()]),
]
can_delete = [
IfInState("draft", [RecordOwners()]),
]
```
**Key permission generators:**
- `IfInState(state, then_generators, else_generators)` - Conditional permissions based on record state
- `FromRecordWorkflow(action)` - Delegate permission check to workflow policy
- `SameAs(permission_name)` - Reuse permissions from another action
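A conditional generator like `IfInState` can be pictured as choosing between two generator sets based on the record's current state. A simplified conceptual sketch, not oarepo's actual implementation (the real generators integrate with invenio-records-permissions):

```python
def if_in_state(state, then_generators, else_generators=()):
    # return a resolver that picks a generator set based on the record's state
    def resolve(record):
        return then_generators if record["state"] == state else else_generators
    return resolve

# "record owners may act only while the record is in draft"
draft_owners_only = if_in_state("draft", then_generators=("record-owners",))
```

The real `IfInState` works the same way at the policy level: the `then` generators apply when the record is in the named state, the `else` generators (empty by default) otherwise.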
#### Record Permission Policy
Use `WorkflowRecordPermissionPolicy` on `RecordServiceConfig` to delegate all permissions to workflows:
```python
from oarepo_workflows.services.permissions import (
WorkflowRecordPermissionPolicy,
)
class MyServiceConfig(RecordServiceConfig):
permission_policy_cls = WorkflowRecordPermissionPolicy
```
### 4. Request-Based Workflows
**Source:** [`oarepo_workflows/requests/`](oarepo_workflows/requests/)
#### Request Definition
Define requests that move records through workflow states:
```python
from oarepo_workflows import (
WorkflowRequest,
WorkflowRequestPolicy,
WorkflowTransitions,
IfInState,
)
from invenio_rdm_records.services.generators import RecordOwners
class MyWorkflowRequests(WorkflowRequestPolicy):
publish_request = WorkflowRequest(
requesters=[
IfInState("draft", [RecordOwners()])
],
recipients=[CommunityRole("curator")],
transitions=WorkflowTransitions(
submitted="submitted",
accepted="published",
declined="draft"
)
)
```
**Request configuration:**
- `requesters` - Generators defining who can create the request
- `recipients` - Generators defining who can approve the request
- `transitions` - State changes for submitted/accepted/declined/cancelled
- `events` - Additional events that can be submitted with the request
- `escalations` - Auto-escalation if not resolved in time
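Conceptually, `WorkflowTransitions` is a mapping from request outcomes to new record states. A minimal illustrative sketch (the dict and helper below are hypothetical, mirroring the `publish_request` example above):

```python
# outcome -> new record state, as in the publish_request example
PUBLISH_TRANSITIONS = {
    "submitted": "submitted",
    "accepted": "published",
    "declined": "draft",
}

def apply_outcome(record, outcome, transitions):
    # leave the state unchanged when no transition is configured for an outcome
    record["state"] = transitions.get(outcome, record["state"])
    return record

record = {"state": "draft"}
apply_outcome(record, "accepted", PUBLISH_TRANSITIONS)
# record is now in the "published" state
```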
#### Auto-Approval
Automatically approve requests when submitted:
```python
from oarepo_workflows import AutoApprove
class MyWorkflowRequests(WorkflowRequestPolicy):
edit_request = WorkflowRequest(
requesters=[IfInState("published", [RecordOwners()])],
recipients=[AutoApprove()],
)
```
#### Request Escalation
Escalate unresolved requests to higher authority:
```python
from datetime import timedelta
from oarepo_workflows import WorkflowRequestEscalation
class MyWorkflowRequests(WorkflowRequestPolicy):
delete_request = WorkflowRequest(
requesters=[IfInState("published", [RecordOwners()])],
recipients=[CommunityRole("curator")],
transitions=WorkflowTransitions(
submitted="deleting",
accepted="deleted",
declined="published"
),
escalations=[
WorkflowRequestEscalation(
after=timedelta(days=14),
recipients=[UserWithRole("administrator")]
)
]
)
```
#### Request Events
Define custom events that can be submitted on requests:
```python
from oarepo_workflows.requests import WorkflowEvent
class MyWorkflowRequests(WorkflowRequestPolicy):
review_request = WorkflowRequest(
requesters=[RecordOwners()],
recipients=[CommunityRole("reviewer")],
events={
"request_changes": WorkflowEvent(
submitters=[CommunityRole("reviewer")]
)
}
)
```
### 5. Request Permissions
**Source:** [`oarepo_workflows/requests/permissions.py`](oarepo_workflows/requests/permissions.py)
The package provides `CreatorsFromWorkflowRequestsPermissionPolicy` which automatically extracts request creators from workflow definitions:
```python
# In invenio.cfg
from oarepo_workflows.requests.permissions import (
CreatorsFromWorkflowRequestsPermissionPolicy,
)
REQUESTS_PERMISSION_POLICY = CreatorsFromWorkflowRequestsPermissionPolicy
```
This policy:
- Checks workflow request definitions for `can_create` permissions
- Supports event-specific permissions (e.g., `can_<request>_<event>_create`)
- Allows any user to search requests (but filters results by actual permissions)
### 6. Service Components
**Source:** [`oarepo_workflows/services/components/`](oarepo_workflows/services/components/)
#### Workflow Component
Ensures workflow is set when creating records:
```python
from oarepo_workflows.services.components import WorkflowComponent
class MyServiceConfig(RecordServiceConfig):
components = [
WorkflowComponent,
# ... other components
]
```
The component:
- Validates workflow presence in input data
- Sets workflow on parent record during creation
- Runs before metadata component to ensure workflow-based permissions apply
### 7. Model Presets
**Source:** [`oarepo_workflows/model/presets/`](oarepo_workflows/model/presets/)
Automatic integration with `oarepo-model` code generator:
#### Record Presets
- `WorkflowsParentRecordPreset` - Adds `WorkflowField` to parent records
- `WorkflowsDraftPreset` - Adds `RecordStateField` and `RecordStateTimestampField` to drafts
- `WorkflowsRecordPreset` - Adds state fields to published records
- `WorkflowsParentRecordMetadataPreset` - Adds workflow column to parent metadata table
- `WorkflowsMappingPreset` - Adds OpenSearch mappings for state and workflow fields
#### Service Presets
- `WorkflowsServiceConfigPreset` - Adds `WorkflowComponent` to service components
- `WorkflowsPermissionPolicyPreset` - Sets `WorkflowRecordPermissionPolicy` on service config
- `WorkflowsParentRecordSchemaPreset` - Adds workflow field to parent schema
- `WorkflowsRecordSchemaPreset` - Adds state fields to record schema
### 8. State Change Notifications
**Source:** [`oarepo_workflows/services/uow.py`](oarepo_workflows/services/uow.py)
Register custom handlers for state changes via entry points:
```python
# In your package
def my_state_change_handler(
identity,
record,
previous_state,
new_state,
*args,
uow=None,
**kwargs
):
# Handle state change
pass
# In pyproject.toml
[project.entry-points."oarepo_workflows.state_changed_notifiers"]
my_handler = "my_package.handlers:my_state_change_handler"
```
### 9. Multiple Recipients
**Source:** [`oarepo_workflows/services/multiple_entities/`](oarepo_workflows/services/multiple_entities/)
Support for requests with multiple recipients:
```python
from oarepo_workflows import WorkflowRequest
class MyWorkflowRequests(WorkflowRequestPolicy):
review_request = WorkflowRequest(
requesters=[RecordOwners()],
recipients=[
CommunityRole("reviewer"),
CommunityRole("curator")
]
)
```
The first recipient becomes the primary recipient. Multiple recipients are tracked via the `multiple` entity resolver.
## Configuration
### Basic Configuration
In `invenio.cfg`:
```python
from oarepo_workflows import Workflow
from my_workflows.permissions import DefaultPermissions
from my_workflows.requests import DefaultRequests
WORKFLOWS = {
"default": Workflow(
code="default",
label="Default Workflow",
permission_policy_cls=DefaultPermissions,
request_policy_cls=DefaultRequests,
)
}
```
### Community Roles
Define roles used in workflow permissions:
```python
from invenio_i18n import lazy_gettext as _
COMMUNITIES_ROLES = [
dict(
name="curator",
title=_("Curator"),
description=_("Curator of the community")
),
dict(
name="reviewer",
title=_("Reviewer"),
description=_("Reviewer of submissions")
),
]
```
### Default Workflow Events
Define events available to all workflows:
```python
from oarepo_workflows.requests import WorkflowEvent
DEFAULT_WORKFLOW_EVENTS = {
"comment": WorkflowEvent(
submitters=[Creator(), Receiver()]
)
}
```
## Development
### Setup
```bash
git clone https://github.com/oarepo/oarepo-workflows.git
cd oarepo-workflows
./run.sh venv
```
### Running Tests
```bash
./run.sh test
```
## Entry Points
The package registers several Invenio entry points:
```python
[project.entry-points."invenio_base.apps"]
oarepo_workflows = "oarepo_workflows.ext:OARepoWorkflows"
[project.entry-points."invenio_base.api_apps"]
oarepo_workflows = "oarepo_workflows.ext:OARepoWorkflows"
[project.entry-points."invenio_requests.entity_resolvers"]
auto_approve = "oarepo_workflows.resolvers.auto_approve:AutoApproveResolver"
multiple = "oarepo_workflows.resolvers.multiple_entities:MultipleEntitiesResolver"
[project.entry-points."invenio_base.finalize_app"]
oarepo_workflows = "oarepo_workflows.ext:finalize_app"
[project.entry-points."invenio_base.api_finalize_app"]
oarepo_workflows = "oarepo_workflows.ext:finalize_app"
[project.entry-points."invenio_config.module"]
oarepo_workflows = "oarepo_workflows.initial_config"
```
## License
Copyright (c) 2024-2025 CESNET z.s.p.o.
OARepo Workflows is free software; you can redistribute it and/or modify it under the terms of the MIT License. See [LICENSE](LICENSE) file for more details.
## Links
- Documentation: <https://github.com/oarepo/oarepo-workflows>
- PyPI: <https://pypi.org/project/oarepo-workflows/>
- Issues: <https://github.com/oarepo/oarepo-workflows/issues>
- OARepo Project: <https://github.com/oarepo>
## Acknowledgments
This project builds upon [Invenio Framework](https://inveniosoftware.org/) and is developed as part of the OARepo ecosystem.
| text/markdown | null | Ronald Krist <krist@cesnet.cz> | null | null | null | null | [] | [] | null | null | <3.15,>=3.13 | [] | [] | [] | [
"graphlib",
"oarepo-model>=0.1.0.dev5",
"oarepo-runtime<3.0.0,>=2.0.0dev13",
"oarepo[rdm,tests]<15.0.0,>=14.0.0",
"oarepo-invenio-typing-stubs; extra == \"dev\"",
"ruff>=0.11.13; extra == \"dev\"",
"types-pyyaml; extra == \"dev\"",
"oarepo[rdm,tests]<15.0.0,>=14.0.0; extra == \"oarepo14\"",
"oarepo-... | [] | [] | [] | [
"Homepage, https://github.com/oarepo/oarepo-workflows"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:11:57.822205 | oarepo_workflows-2.0.0.dev9.tar.gz | 28,798 | 7a/0d/9c54a60b3b3e6e4ab1960b00de04b522787c0559dc8fbc0b83b30fe048ad/oarepo_workflows-2.0.0.dev9.tar.gz | source | sdist | null | false | c0bedf4d20f88afe07b0da4742096468 | 7006d3492a3471801cc85e9da44745669bbd3ba4ccdf21b183f0d149a3d1bacf | 7a0d9c54a60b3b3e6e4ab1960b00de04b522787c0559dc8fbc0b83b30fe048ad | null | [
"LICENSE"
] | 222 |
2.4 | fluxer.py | 0.3.1 | A Python wrapper for the Fluxer API | # fluxer-py
A Python API wrapper for [Fluxer](https://fluxer.app). Build bots and interact with the Fluxer platform in a simple and elegant way.
## Quick example
A dead simple bot with a ping command:
```py
import fluxer
bot = fluxer.Bot(command_prefix="!", intents=fluxer.Intents.default())
@bot.event
async def on_ready():
print(f"Bot is ready! Logged in as {bot.user.username}")
@bot.command()
async def ping(ctx):
await ctx.reply("Pong!")
if __name__ == "__main__":
TOKEN = "your_bot_token"
bot.run(TOKEN)
```
## Getting started to contribute
You'll need [uv](https://docs.astral.sh/uv/) installed, then:
```sh
git clone https://github.com/your-username/fluxer-py.git
cd fluxer-py
uv sync --dev
```
That's it, you're sorted. uv will handle the `.venv` and dependencies without conflicts, just like a traditional package manager!
| text/markdown | Emil | null | null | null | null | fluxer, api, chat, bot, websocket | [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Communications :... | [] | null | null | >=3.10 | [] | [] | [] | [
"aiohttp<4.0.0,>=3.9.0",
"emoji>=2.0.0",
"pytest>=7.0.0; extra == \"dev\"",
"pytest-asyncio>=1.3.0; extra == \"dev\"",
"pytest-cov>=7.0.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/akarealemil/fluxer.py",
"Issues, https://github.com/akarealemil/fluxer.py/issues"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:11:40.884908 | fluxer_py-0.3.1.tar.gz | 39,431 | 8e/73/f32cd5279d4ddb436c3fb8dde725a8216a8f079a2f4111bbdf07e45bf162/fluxer_py-0.3.1.tar.gz | source | sdist | null | false | d80f96ecd1f8aeb081fff60df1cbcfa5 | 8c27aa50e276ebd26aee3722e3dda6cd07ab5cc9ac40490bbda94f05c0c8d212 | 8e73f32cd5279d4ddb436c3fb8dde725a8216a8f079a2f4111bbdf07e45bf162 | MIT | [
"LICENSE"
] | 0 |
2.4 | pharia-telemetry | 0.1.2 | Telemetry utilities and shared components for Pharia projects | # Pharia Telemetry
[](https://pypi.org/project/pharia-telemetry/)
[](https://pypi.org/project/pharia-telemetry/)
[](https://opensource.org/licenses/MIT)
[](https://github.com/aleph-alpha/pharia-telemetry/actions/workflows/ci.yml)

**A clean, minimal OpenTelemetry foundation library for Pharia services providing observability, tracing, and context propagation utilities.**
## 🎯 What is pharia-telemetry?
`pharia-telemetry` provides a **simple, focused foundation** for observability in Pharia services:
- **Context Propagation**: User and session context flows automatically across all service calls
- **Structured Logging**: Logs automatically include trace IDs and user context
- **OpenTelemetry Setup**: Minimal, high-level setup for distributed tracing
- **Standardized Constants**: Clean, namespaced constants for consistent telemetry
**Key Principle**: `pharia-telemetry` handles the foundation with minimal API surface, you add framework-specific auto-instrumentation.
## 📦 Installation
Requires Python 3.10+.
```bash
# Basic installation
pip install pharia-telemetry
# With structlog support (for structured logging)
pip install pharia-telemetry[structlog]
```
### Install from GitHub (pinned to commit)
For services that depend on a specific commit from the GitHub repo, use a direct VCS reference:
```bash
# HTTPS (recommended)
pip install "pharia-telemetry @ git+https://github.com/aleph-alpha/pharia-telemetry.git@<commit-sha>"
# SSH (if you have SSH keys configured)
pip install "pharia-telemetry @ git+ssh://git@github.com/aleph-alpha/pharia-telemetry.git@<commit-sha>"
# With optional extras
pip install "pharia-telemetry[structlog] @ git+https://github.com/aleph-alpha/pharia-telemetry.git@<commit-sha>"
```
In requirements files (PEP 508):
```
pharia-telemetry @ git+https://github.com/aleph-alpha/pharia-telemetry.git@<commit-sha>
pharia-telemetry[structlog] @ git+https://github.com/aleph-alpha/pharia-telemetry.git@<commit-sha>
```
## 🚀 30-Second Setup
```python
from pharia_telemetry import setup_telemetry, constants, set_baggage_item
# 1. One-line setup
setup_telemetry("my-service", service_version="1.0.0")
# 2. Set context that flows everywhere
set_baggage_item(constants.Baggage.USER_ID, "user-123")
# 3. Add framework instrumentation (optional)
# FastAPIInstrumentor.instrument_app(app) # for FastAPI
# SQLAlchemyInstrumentor().instrument() # for databases
```
**Result**: Your service now has distributed tracing with user context flowing through all operations!
## 🎯 Clean API Design
pharia-telemetry features a **clean, focused API** designed for ease of use:
```python
from pharia_telemetry import (
    # Core setup (essential)
    setup_telemetry,            # One-function setup

    # GenAI instrumentation (most users)
    create_chat_span,           # Smart sync/async chat spans
    create_embeddings_span,     # Smart sync/async embeddings spans
    create_tool_execution_span, # Smart sync/async tool spans
    set_genai_span_usage,       # Token usage tracking
    set_genai_span_response,    # Response metadata

    # Context propagation (advanced)
    set_baggage_item,           # Set context for propagation
    get_baggage_item,           # Get propagated context

    # Logging integration (optional)
    create_context_injector,    # Custom logging integration
)
```
## 📚 Documentation Guide
Choose your path based on what you need:
### 🆕 New to pharia-telemetry?
**Start here** → [**Getting Started Guide**](docs/getting-started.md)
- Basic setup and first examples
- Understanding the concepts
- Your first instrumented service
### 🔌 Want automatic instrumentation?
**Go to** → [**Auto-Instrumentation Guide**](docs/auto-instrumentation.md)
- Available instrumentation packages
- FastAPI, SQLAlchemy, HTTPX setup
- When auto-instrumentation works (and when it doesn't)
### 🛠️ Need manual control?
**See** → [**Manual Instrumentation Guide**](docs/manual-instrumentation.md)
- SSE streaming issues and solutions
- HTTP/2 compatibility problems
- Custom span management
- Performance optimization
### 🧳 Working with context propagation?
**Read** → [**Baggage & Context Guide**](docs/baggage-and-context.md)
- User and session context
- Cross-service correlation
- Standardized baggage keys
- Custom context patterns
### 📊 Setting up logging?
**Check** → [**Structured Logging Guide**](docs/structured-logging.md)
- Automatic trace correlation
- Log configuration patterns
- Integration with structlog
### 🤖 Building GenAI applications?
**Visit** → [**GenAI Spans Guide**](docs/genai-spans.md)
- OpenTelemetry semantic conventions for AI
- Automatic span attributes for models
- Token usage tracking
- Agent and tool instrumentation
### ⚙️ Need advanced configuration?
**Visit** → [**Configuration Guide**](docs/configuration.md)
- Environment variables
- OTLP exporter setup
- Custom resource attributes
- Production deployment
### 🏗️ Building integrations?
**Browse** → [**Integration Examples**](docs/integration-examples.md)
- Complete FastAPI service
- Microservice communication
- Background task processing
- Real-world patterns
### 🐛 Having issues?
**Try** → [**Troubleshooting Guide**](docs/troubleshooting.md)
- Common problems and solutions
- Debug techniques
- Performance considerations
## 🌟 Core Features
- **🔬 OpenTelemetry Integration**: Minimal setup utilities for distributed tracing
- **🧳 Baggage Management**: Context propagation across service boundaries
- **📊 Structured Logging**: Automatic trace correlation for log records
- **🤖 Smart GenAI Spans**: Auto-detecting sync/async convenience functions for AI operations
- **🔧 Production Ready**: Graceful degradation when OpenTelemetry is unavailable
- **📈 Pharia Standards**: Standardized constants and conventions across all services
- **🎯 Focused API**: Clean, intuitive functions for common use cases
## 🏛️ Architecture
```
┌─────────────────────────────────────────┐
│  Your Application + Auto                │
│  Instrumentation                        │
├─────────────────────────────────────────┤
│  pharia-telemetry Foundation            │
│  (Propagators, Baggage, Logging)        │
├─────────────────────────────────────────┤
│  OpenTelemetry SDK                      │
├─────────────────────────────────────────┤
│  OTLP Exporters & Backend               │
└─────────────────────────────────────────┘
```
## 🔍 Quick Examples
### Context Propagation
```python
from pharia_telemetry import constants, set_baggage_item
# Set once, flows everywhere
set_baggage_item(constants.Baggage.USER_ID, "user-123")
set_baggage_item(constants.Baggage.SESSION_ID, "session-456")
```
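OpenTelemetry baggage rides on Python's `contextvars` under the hood, which is why a value set once is visible to everything that runs later in the same execution context. A minimal stdlib-only sketch of that pattern (the `set_item`/`get_item` helpers are illustrative stand-ins, not the library API):

```python
import contextvars

# A context variable holding a key/value mapping, analogous to OTel baggage
_baggage = contextvars.ContextVar("baggage", default={})

def set_item(key, value):
    # Copy-on-write: never mutate the mapping seen by sibling contexts
    _baggage.set({**_baggage.get(), key: value})

def get_item(key):
    return _baggage.get().get(key)

set_item("app.user.id", "user-123")
print(get_item("app.user.id"))  # user-123
```

Because the mapping lives in a `ContextVar`, async tasks spawned from this context inherit the values automatically, which is the "set once, flows everywhere" behavior described above.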
### Structured Logging
```python
import structlog
from pharia_telemetry import add_context_to_logs
# Easy integration with any logging framework
injector = add_context_to_logs("structlog")
structlog.configure(processors=[
    injector,  # Adds trace_id + baggage automatically
    structlog.processors.JSONRenderer(),
])
```
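A structlog processor is just a callable of shape `(logger, method_name, event_dict) -> event_dict`, and the injector above plugs into that chain. A stdlib-only sketch of what such an injector does (the field values here are made up for illustration):

```python
def context_injector(logger, method_name, event_dict):
    # Enrich every log record with trace and baggage context (values illustrative)
    event_dict.setdefault("trace_id", "0af7651916cd43dd8448eb211c80319c")
    event_dict.setdefault("app.user.id", "user-123")
    return event_dict

# Processors are called once per log event with the mutable event dict
enriched = context_injector(None, "info", {"event": "checkout started"})
print(enriched["trace_id"][:8])  # 0af76519
```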
### GenAI Operations
```python
from pharia_telemetry import create_chat_span, create_embeddings_span
from pharia_telemetry.sem_conv.gen_ai import GenAI
# Smart convenience functions that auto-detect sync/async context
with create_chat_span(
    model="llama-3.1-8B",
    agent_id=GenAI.Values.PhariaAgentId.QA_CHAT,
    conversation_id="conv-123",
) as span:
    # Works in both sync and async contexts
    pass

# Also works seamlessly in async contexts (inside an async function)
async with create_embeddings_span(model="text-embedding-3-small") as span:
    # Automatic context detection
    pass
```
### Clean Constants Structure
```python
from pharia_telemetry import constants
# Namespaced and organized
user_id = constants.Baggage.USER_ID # "app.user.id"
qa_chat = constants.Baggage.Values.UserIntent.QA_CHAT # "pharia_qa_chat"
# GenAI constants in separate module
model = constants.GenAI.REQUEST_MODEL # "gen_ai.request.model"
chat_op = constants.GenAI.Values.OperationName.CHAT # "chat"
```
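The namespacing above can be reproduced with plain nested classes: keys stay discoverable via autocomplete, and typos fail loudly at attribute lookup. A sketch using the constant values quoted in the comments above:

```python
class Baggage:
    # Keys mirror the values shown in the comments above
    USER_ID = "app.user.id"

    class Values:
        class UserIntent:
            QA_CHAT = "pharia_qa_chat"

print(Baggage.USER_ID)                    # app.user.id
print(Baggage.Values.UserIntent.QA_CHAT)  # pharia_qa_chat
```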
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🤝 Contributing
We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details.
## 📞 Support
- 📧 **Email**: conrad.poepke@aleph-alpha.com
- 🐛 **Issues**: [GitHub Issues](https://github.com/aleph-alpha/pharia-telemetry/issues)
| text/markdown | null | Aleph Alpha Engineering <engineering@aleph-alpha.com> | null | null | null | logging, metrics, observability, opentelemetry, telemetry, tracing | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Pytho... | [] | null | null | >=3.10 | [] | [] | [] | [
"opentelemetry-api>=1.29.0",
"opentelemetry-sdk>=1.29.0",
"pydantic>=2.0.0"
] | [] | [] | [] | [
"Homepage, https://github.com/aleph-alpha/pharia-telemetry",
"Documentation, https://pharia-telemetry.readthedocs.io",
"Repository, https://github.com/aleph-alpha/pharia-telemetry",
"Issues, https://github.com/aleph-alpha/pharia-telemetry/issues",
"Changelog, https://github.com/aleph-alpha/pharia-telemetry/... | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:11:20.803736 | pharia_telemetry-0.1.2.tar.gz | 79,809 | fd/0b/2f258fd48e1537cac773ebb42c23599761854b76405bf054094313790bc6/pharia_telemetry-0.1.2.tar.gz | source | sdist | null | false | ce4cf9c307757e67b0269a75e632bb7e | 04e21894cba995ef8f4bb3d8421e79c354acd693c6e92207b13a453ddf51030b | fd0b2f258fd48e1537cac773ebb42c23599761854b76405bf054094313790bc6 | MIT | [
"LICENSE"
] | 222 |
2.4 | ol-openedx-chat | 0.5.4 | An Open edX plugin to add Open Learning AI chat aside to xBlocks | OL OpenedX Chat
###############
An xBlock aside to add MIT Open Learning chat into xBlocks.
Purpose
*******
MIT's AI chatbot for Open edX
Setup
=====
For detailed installation instructions, please refer to the `plugin installation guide <../../docs#installation-guide>`_.
Installation required in:
* LMS
* Studio (CMS)
Configuration
=============
1. edx-platform configuration
-----------------------------
- Add the following configuration values to the config file in Open edX. For any release after Juniper, that config file is ``/edx/etc/lms.yml``. If you're using ``private.py``, add these values to ``lms/envs/private.py``. These should be added to the top level. **Ask a fellow developer for these values.**
.. code-block::

   MIT_LEARN_AI_API_URL: <MIT_LEARN_AI_API_URL>
   MIT_LEARN_API_BASE_URL: <MIT_LEARN_API_BASE_URL>
   MIT_LEARN_SUMMARY_FLASHCARD_URL: <MIT_LEARN_SUMMARY_FLASHCARD_URL>
- For Tutor installations, these values can also be managed through a `custom Tutor plugin <https://docs.tutor.edly.io/tutorials/plugin.html#plugin-development-tutorial>`_.
2. Add database record
----------------------
- Create a record for the ``XBlockAsidesConfig`` model (LMS admin URL: ``/admin/lms_xblock/xblockasidesconfig/``).
- Create a record in the ``StudioConfig`` model (CMS admin URL: ``/admin/xblock_config/studioconfig/``).
3. In frontend-app-learning, run the following in a shell inside the learning MFE folder:
-----------------------------------------------------------------------------------------
This will download the smoot-design package and copy the pre-bundled JS file to a location loadable by Open edX.
.. code-block:: sh

   npm pack @mitodl/smoot-design@^6.17.0
   tar -xvzf mitodl-smoot-design*.tgz
   mkdir -p public/static/smoot-design
   cp package/dist/bundles/* public/static/smoot-design
4. Create env.config.jsx in frontend-app-learning and add the below code:
--------------------------------------------------------------------------
The unit is rendered inside an iframe, and postMessage is used to communicate between the iframe and the parent window. The code below initializes the remoteAiChatDrawer.
.. code-block:: js

   import { getConfig } from '@edx/frontend-platform';
   import { getAuthenticatedHttpClient } from '@edx/frontend-platform/auth';

   import(
     /* webpackIgnore: true */
     "/static/smoot-design/aiDrawerManager.es.js"
   ).then(module => {
     module.init({
       messageOrigin: getConfig().LMS_BASE_URL,
       transformBody: messages => ({ message: messages[messages.length - 1].content }),
       getTrackingClient: getAuthenticatedHttpClient,
     })
   })

   const config = {
     ...process.env,
   };

   export default config;
(Alternatively, you can import the drawer code from a CDN such as unpkg.com/@mitodl/smoot-design@6.4.0/dist/bundles/remoteTutorDrawer.umd.js to skip step 3. However, the steps outlined here are closest to what we do in production.)
5. Start learning MFE by ``npm run dev``
----------------------------------------
6. In LMS, enable the ``ol_openedx_chat.ol_openedx_chat_enabled`` waffle flag at ``<LMS>/admin/waffle/flag/``
---------------------------------------------------------------------------------------------------------------
This will enable the ol_openedx_chat plugin for all courses. You can disable it and add a ``Waffle Flag Course Override`` at ``/admin/waffle_utils/waffleflagcourseoverridemodel/`` to enable it for a single course.
7. Set ``FEATURES["ENABLE_OTHER_COURSE_SETTINGS"] = True`` in your ``cms/envs/private.py`` and ``lms/envs/private.py`` files
----------------------------------------------------------------------------------------------------------------------------
This enables "Other Course Settings" below.
8. Go to any course in CMS > Settings > Advanced Settings and add the below in "Other Course Settings"
-------------------------------------------------------------------------------------------------------
.. code-block::

   {"OL_OPENEDX_CHAT_VIDEO_BLOCK_ENABLED": true, "OL_OPENEDX_CHAT_PROBLEM_BLOCK_ENABLED": true}
* ``OL_OPENEDX_CHAT_VIDEO_BLOCK_ENABLED`` is used to enable/disable the VideoGPT for all videos.
* ``OL_OPENEDX_CHAT_PROBLEM_BLOCK_ENABLED`` is used to enable/disable the AI Chat for all problems.
* Once these settings are enabled, you will see a checkbox ``Enable AI Chat Assistant`` below problem and video blocks in the CMS course unit.
CMS View
.. image:: ol_openedx_chat/static/images/ai_chat_aside_cms_view.png
* You will also see a Chat Button titled "AskTIM about this video/problem" in the LMS. Now AI Chat/VideoGPT is enabled for all videos and problems.
LMS View with AskTIM button
.. image:: ol_openedx_chat/static/images/ai_chat_aside_lms_view.png
LMS Chat Drawer View
.. image:: ol_openedx_chat/static/images/ai_chat_aside_lms_drawer_view.png
9. Disable it for a single block
----------------------------------
If you want to disable it for specific videos or problems, uncheck the ``Enable AI Chat Assistant`` checkbox for that block in the CMS.
Translations
============
Only the **Ask TIM** brand name is left untranslated. All other user-facing strings are
translatable.
Translations live in the translations repo (e.g. https://github.com/mitodl/mitxonline-translations) at
``translations/open-edx-plugins/ol_openedx_chat/conf/locale/``. For any
language, run ``sync_and_translate_language`` to create or update the .po
files there; then run ``make pull_translations`` in edx-platform to pull and
compile them into ``conf/plugins-locale/plugins/ol_openedx_chat/``. Until
``make pull_translations`` is run, the UI shows English (msgid fallback).
Why no ``conf/locale/config.yaml`` or Makefile (unlike edx-bulk-grades)?
------------------------------------------------------------------------
Plugins like `edx-bulk-grades
<https://github.com/openedx/edx-bulk-grades/tree/master/bulk_grades/conf/locale>`_
use ``conf/locale/config.yaml`` and a Makefile (extract_translations,
compile_translations, pull_translations) because *that plugin repo* is the
source for Transifex/Atlas. For ol_openedx_chat we use a different flow: the
*translations repo* holds the .po files; ``sync_and_translate_language`` syncs
and translates there, and edx-platform's ``make pull_translations`` pulls into
``conf/plugins-locale/plugins/ol_openedx_chat/``. This plugin repo never runs
Atlas or Transifex, so ``config.yaml`` and a Makefile here would be unused.
Documentation
=============
License
*******
The code in this repository is licensed under the AGPL 3.0 unless
otherwise noted.
Please see `LICENSE.txt <LICENSE.txt>`_ for details.
| text/x-rst | MIT Office of Digital Learning | null | null | null | null | Python, edx | [] | [] | null | null | >=3.11 | [] | [] | [] | [
"django>=4.0",
"djangorestframework>=3.14.0",
"edx-django-utils>4.0.0",
"xblock"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:10:57.625641 | ol_openedx_chat-0.5.4.tar.gz | 577,870 | 1b/97/584c06eae1ab8245dd4c240647e847b8b73ae1e4a4a3f85f29a61165a00c/ol_openedx_chat-0.5.4.tar.gz | source | sdist | null | false | 8995b1b99c5e68a94127fafd261977e6 | f4b97185985d5fc7fa2cc121f6ee13a3c33186d636be871900fdcae562c134fe | 1b97584c06eae1ab8245dd4c240647e847b8b73ae1e4a4a3f85f29a61165a00c | BSD-3-Clause | [
"LICENSE.txt"
] | 245 |
2.4 | 2u-enterprise-subsidy-client | 2.2.0 | Client for interacting with the enterprise-subsidy service. | edx-enterprise-subsidy-client
#############################
|pypi-badge| |ci-badge| |codecov-badge| |doc-badge| |pyversions-badge|
|license-badge| |status-badge|
Purpose
*******
Client for interacting with the enterprise-subsidy service.
Getting Started
***************
Developing
==========
One Time Setup
--------------
.. code-block::

   # Clone the repository into your ``[DEVSTACK]/src/`` folder
   git clone git@github.com:openedx/edx-enterprise-subsidy-client.git

   # Use a service container that would reasonably install this client, e.g.
   cd [DEVSTACK]/enterprise-subsidy && make app-shell
   cd /edx/src/edx-enterprise-subsidy-client

   # Set up a virtualenv in a ``venv/`` directory
   # You might need to install virtualenv first:
   #   apt-get update
   #   apt-get install -y virtualenv
   virtualenv venv/
   make requirements

   # Ensure things are looking ok by running tests
   make test
Every time you develop something in this repo
---------------------------------------------
.. code-block::

   # Grab the latest code
   git checkout main
   git pull

   # Use a service container that would reasonably install this client, e.g.
   cd [DEVSTACK]/enterprise-subsidy && make app-shell
   cd /edx/src/edx-enterprise-subsidy-client

   # Activate the virtualenv
   source venv/bin/activate

   # Install/update the dev requirements
   make requirements

   # Run the tests and quality checks (to verify the status before you make any changes)
   make validate

   # Make a new branch for your changes
   git checkout -b <your_github_username>/<short_description>

   # Using your favorite editor, edit the code to make your change.

   # Run your new tests
   pytest ./path/to/new/tests

   # Run all the tests and quality checks
   make validate

   # Commit all your changes
   git commit ...
   git push

   # Open a PR and ask for review.
Deploying
=========
TODO: How can a new user go about deploying this component? Is it just a few
commands? Is there a larger how-to that should be linked here?
PLACEHOLDER: For details on how to deploy this component, see the `deployment how-to`_
.. _deployment how-to: https://docs.openedx.org/projects/edx-enterprise-subsidy-client/how-tos/how-to-deploy-this-component.html
Getting Help
************
Documentation
=============
PLACEHOLDER: Start by going through `the documentation`_. If you need more help see below.
.. _the documentation: https://docs.openedx.org/projects/edx-enterprise-subsidy-client
(TODO: `Set up documentation <https://openedx.atlassian.net/wiki/spaces/DOC/pages/21627535/Publish+Documentation+on+Read+the+Docs>`_)
More Help
=========
If you're having trouble, we have discussion forums at
https://discuss.openedx.org where you can connect with others in the
community.
Our real-time conversations are on Slack. You can request a `Slack
invitation`_, then join our `community Slack workspace`_.
For anything non-trivial, the best path is to open an issue in this
repository with as many details about the issue you are facing as you
can provide.
https://github.com/openedx/edx-enterprise-subsidy-client/issues
For more information about these options, see the `Getting Help`_ page.
.. _Slack invitation: https://openedx.org/slack
.. _community Slack workspace: https://openedx.slack.com/
.. _Getting Help: https://openedx.org/getting-help
License
*******
The code in this repository is licensed under the AGPL 3.0 unless
otherwise noted.
Please see `LICENSE.txt <LICENSE.txt>`_ for details.
Contributing
************
Contributions are very welcome.
Please read `How To Contribute <https://openedx.org/r/how-to-contribute>`_ for details.
This project is currently accepting all types of contributions, bug fixes,
security fixes, maintenance work, or new features. However, please make sure
to have a discussion about your new feature idea with the maintainers prior to
beginning development to maximize the chances of your change being accepted.
You can start a conversation by creating a new issue on this repo summarizing
your idea.
The Open edX Code of Conduct
****************************
All community members are expected to follow the `Open edX Code of Conduct`_.
.. _Open edX Code of Conduct: https://openedx.org/code-of-conduct/
People
******
The assigned maintainers for this component and other project details may be
found in `Backstage`_. Backstage pulls this data from the ``catalog-info.yaml``
file in this repo.
.. _Backstage: https://open-edx-backstage.herokuapp.com/catalog/default/component/edx-enterprise-subsidy-client
Reporting Security Issues
*************************
Please do not report security issues in public. Please email security@openedx.org.
.. |pypi-badge| image:: https://img.shields.io/pypi/v/edx-enterprise-subsidy-client.svg
:target: https://pypi.python.org/pypi/edx-enterprise-subsidy-client/
:alt: PyPI
.. |ci-badge| image:: https://github.com/openedx/edx-enterprise-subsidy-client/workflows/Python%20CI/badge.svg?branch=main
:target: https://github.com/openedx/edx-enterprise-subsidy-client/actions
:alt: CI
.. |codecov-badge| image:: https://codecov.io/github/openedx/edx-enterprise-subsidy-client/coverage.svg?branch=main
:target: https://codecov.io/github/openedx/edx-enterprise-subsidy-client?branch=main
:alt: Codecov
.. |doc-badge| image:: https://readthedocs.org/projects/edx-enterprise-subsidy-client/badge/?version=latest
:target: https://edx-enterprise-subsidy-client.readthedocs.io/en/latest/
:alt: Documentation
.. |pyversions-badge| image:: https://img.shields.io/pypi/pyversions/edx-enterprise-subsidy-client.svg
:target: https://pypi.python.org/pypi/edx-enterprise-subsidy-client/
:alt: Supported Python versions
.. |license-badge| image:: https://img.shields.io/github/license/openedx/edx-enterprise-subsidy-client.svg
:target: https://github.com/openedx/edx-enterprise-subsidy-client/blob/main/LICENSE.txt
:alt: License
.. TODO: Choose one of the statuses below and remove the other status-badge lines.
.. |status-badge| image:: https://img.shields.io/badge/Status-Experimental-yellow
.. .. |status-badge| image:: https://img.shields.io/badge/Status-Maintained-brightgreen
.. .. |status-badge| image:: https://img.shields.io/badge/Status-Deprecated-orange
.. .. |status-badge| image:: https://img.shields.io/badge/Status-Unsupported-red
Change Log
##########
..
All enhancements and patches to edx_enterprise_subsidy_client will be documented
in this file. It adheres to the structure of https://keepachangelog.com/ ,
but in reStructuredText instead of Markdown (for ease of incorporation into
Sphinx documentation and the PyPI description).
This project adheres to Semantic Versioning (https://semver.org/).
.. There should always be an "Unreleased" section for changes pending release.
Unreleased
**********
[2.2.0]
********
* chore: Update Python Requirements
[2.1.0]
*******
* chore: Update Python Requirements, notably Django 5
[2.0.19]
********
* chore: Update Python Requirements
[2.0.18]
********
* chore: Update Python Requirements
[2.0.17]
********
* chore: Update Python Requirements
[2.0.16]
********
* chore: Update Python Requirements
[2.0.15]
********
* chore: Update Python Requirements
[2.0.14]
********
* chore: Update Python Requirements
[2.0.13]
********
* chore: Update Python Requirements
[2.0.12]
********
* chore: Update Python Requirements
[2.0.11]
********
* chore: Update Python Requirements
[2.0.10]
********
* chore: Update Python Requirements
[2.0.9]
*******
* chore: Update Python Requirements
[2.0.8]
*******
* chore: Update Python Requirements
[2.0.7]
*******
* chore: Update Python Requirements
[2.0.6]
*******
* chore: Update Python Requirements
[2.0.5]
*******
* chore: Update Python Requirements
[2.0.4]
*******
* chore: Update Python Requirements
[2.0.3]
*******
* chore: Update Python Requirements
[2.0.2]
*******
* chore: Update Python Requirements
[2.0.1]
*******
* chore: Update Python Requirements
[2.0.0]
*******
* feat: Update package version from 1.0.0 to 2.0.0
* chore: Update Python Requirements
* chore: Update upgrade-python-requirements to 3.12
[1.0.0]
*******
* fix: Remove Python 3.8 Support
* chore: Update Python Requirements
* chore: Update pylintrc
[0.4.6]
*******
* fix: Update the name of reviewers team in github flow
[0.4.5]
*******
* fix: create_subsidy_deposit - metadata is optional (ENT-9133)
[0.4.4]
*******
* feat: add support for deposit creation (ENT-9133)
[0.4.3]
*******
* feat: adding new subsidy client method to fetch subsidy aggregate data
[0.4.2]
*******
* Switch from ``edx-sphinx-theme`` to ``sphinx-book-theme`` since the former is
deprecated
* Add python 3.12 support
[0.4.1]
*******
* chore: add a unit test for ``create_subsidy_transaction()``.
[0.4.0]
*******
* feat: allow requested prices for v2 transaction creation.
[0.3.7]
*******
* feat: upgrade many python dependencies, notably Django 3.2.19
[0.3.6]
*******
* feat: pass idempotency key during transaction creation (pt. 2)
[0.3.5]
*******
* feat: pass idempotency key during transaction creation
[0.3.3]
*******
* allow additional query params, like ``page_size``, to be passed through to listing endpoints.
[0.3.3]
*******
* admin-list transactions will also be filtered by ``created`` state by default.
* Adds an ADR explaining the default states for which this client filters transactions.
[0.3.2]
*******
* admin-list transactions will ask to be filtered for only `committed` and `pending` states by default.
Caller may specify other valid states (e.g. `failed` or `created`).
[0.3.1]
*******
* fix: correctly pass ``subsidy_uuid`` to subsidy API V2 endpoint string format.
[0.3.0]
*******
* feat: add new client for v2 transaction endpoint.
[0.2.6]
*******
* feat: transaction endpoint accepts `lms_user_id` instead of `learner_id`
[0.2.5]
*******
* feat: redemption metadata.
[0.2.4]
*******
* fix: don't directly access a status code on a failed response for logging.
[0.2.3]
*******
* DON'T be flexible about settings variable names for client initialization.
[0.2.2]
*******
* str() incoming UUID arguments
[0.2.1]
*******
* Be flexible about settings variable names for client initialization.
[0.2.0]
*******
* Add implementation for many of the client methods; currently deferring unit tests.
* Add a ``scripts/e2e.py`` script for end-to-end testing between enterprise-subsidy and edx-enterprise.
[0.1.0] - 2023-02-01
********************
Added
=====
* First release on PyPI.
| null | edX | oscm@edx.org | null | null | AGPL 3.0 | Python edx | [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)",
"Natural Language :: English",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.12"
] | [] | https://github.com/edx/2u-enterprise-subsidy-client | null | >=3.12 | [] | [] | [] | [
"edx-rest-api-client"
] | [] | [] | [] | [] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:10:39.224352 | 2u_enterprise_subsidy_client-2.2.0.tar.gz | 38,704 | 07/f0/d7657a338f9e3d10a8600ec01fac1b0b522ca6bbcd739e4ca5a2666907fe/2u_enterprise_subsidy_client-2.2.0.tar.gz | source | sdist | null | false | 436c07a2b6c44ac9038b3a3281d14436 | 5aa666721cb311eca5e4e7f00b4f629167ad5498378591479d7d634135df4cc4 | 07f0d7657a338f9e3d10a8600ec01fac1b0b522ca6bbcd739e4ca5a2666907fe | null | [
"LICENSE",
"LICENSE.txt"
] | 263 |
2.4 | avp-sdk | 0.1.0 | Agent Vault Protocol - Secure credential management for AI agents | <p align="center">
<img src="https://raw.githubusercontent.com/avp-protocol/spec/main/assets/avp-shield.svg" alt="AVP Shield" width="80" />
</p>
<h1 align="center">avp-py</h1>
<p align="center">
<strong>Python implementation of Agent Vault Protocol</strong><br>
Standard conformance · Async support · Type hints
</p>
<p align="center">
<a href="https://pypi.org/project/avp-sdk/"><img src="https://img.shields.io/pypi/v/avp-sdk?style=flat-square&color=00D4AA" alt="PyPI" /></a>
<a href="https://pypi.org/project/avp-sdk/"><img src="https://img.shields.io/pypi/pyversions/avp-sdk?style=flat-square" alt="Python" /></a>
<a href="https://github.com/avp-protocol/avp-py/actions"><img src="https://img.shields.io/github/actions/workflow/status/avp-protocol/avp-py/ci.yml?style=flat-square" alt="CI" /></a>
<a href="LICENSE"><img src="https://img.shields.io/badge/license-Apache_2.0-blue?style=flat-square" alt="License" /></a>
</p>
---
## Overview
`avp-py` is the official Python implementation of the [Agent Vault Protocol (AVP)](https://github.com/avp-protocol/spec). It provides a simple, Pythonic interface for secure credential management in AI agent systems.
## Features
- **Standard AVP Conformance** — All 7 core operations
- **Multiple Backends** — File, Keychain, Remote (Hardware via USB bridge)
- **Async Support** — Both sync and async APIs
- **Type Hints** — Full typing for IDE support
- **Framework Integrations** — LangChain, CrewAI, AutoGen ready
## Installation
```bash
pip install avp-sdk
```
With optional backends:
```bash
pip install avp-sdk[keychain] # OS keychain support
pip install avp-sdk[remote] # Remote vault support (requests)
pip install avp-sdk[all] # All optional dependencies
```
## Quick Start
```python
import avp
# Create vault instance
vault = avp.Vault("avp.toml")
# Authenticate
vault.authenticate()
# Store a secret
vault.store("anthropic_api_key", "sk-ant-...")
# Retrieve a secret
api_key = vault.retrieve("anthropic_api_key")
# Use with context manager (auto-cleanup)
with avp.Vault("avp.toml") as vault:
    api_key = vault.retrieve("anthropic_api_key")
```
## Async API
```python
import asyncio
import avp
async def main():
    async with avp.AsyncVault("avp.toml") as vault:
        await vault.authenticate()
        api_key = await vault.retrieve("anthropic_api_key")
        print(f"Retrieved key: {api_key[:10]}...")

asyncio.run(main())
```
## Backend Selection
```python
import avp
# File backend (encrypted)
vault = avp.Vault(backend=avp.FileBackend(
    path="~/.avp/secrets.enc",
    cipher="chacha20-poly1305",
))

# OS Keychain
vault = avp.Vault(backend=avp.KeychainBackend())

# Remote vault
vault = avp.Vault(backend=avp.RemoteBackend(
    url="https://vault.company.com",
    token="hvs.xxx",
))
```
## Migration
```python
import avp
# Migrate from file to keychain
avp.migrate(
    source=avp.FileBackend(path="~/.avp/secrets.enc"),
    target=avp.KeychainBackend(),
)
```
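Conceptually, migration enumerates the secrets in the source backend and writes each one into the target. A stdlib sketch with dict-backed stand-ins (these toy classes are illustrative, not the real backend classes):

```python
class DictBackend:
    """Toy backend storing secrets in a plain dict (illustration only)."""

    def __init__(self, secrets=None):
        self._secrets = dict(secrets or {})

    def list(self):
        return list(self._secrets)

    def retrieve(self, name):
        return self._secrets[name]

    def store(self, name, value):
        self._secrets[name] = value

def migrate(source, target):
    # Copy every secret from source to target, one by one
    for name in source.list():
        target.store(name, source.retrieve(name))

src = DictBackend({"anthropic_api_key": "sk-ant-example"})
dst = DictBackend()
migrate(src, dst)
print(dst.retrieve("anthropic_api_key"))  # sk-ant-example
```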
## Framework Integration
### LangChain
```python
from langchain_avp import AVPSecretManager
from langchain_anthropic import ChatAnthropic  # assumed import path for ChatAnthropic

# Use AVP as LangChain's secret manager
secret_manager = AVPSecretManager("avp.toml")
llm = ChatAnthropic(api_key=secret_manager.get("anthropic_api_key"))
```
### CrewAI
```python
from crewai import Agent
from crewai_avp import AVPCredentialStore

# Use AVP as CrewAI's credential store
credentials = AVPCredentialStore("avp.toml")
agent = Agent(credentials=credentials)
```
## Environment Variable Replacement
Replace insecure `.env` files:
```python
import os

import avp
# Before: insecure .env file
# ANTHROPIC_API_KEY=sk-ant-...
# After: secure AVP vault
vault = avp.Vault("avp.toml")
os.environ["ANTHROPIC_API_KEY"] = vault.retrieve("anthropic_api_key")
```
Or use the AVP environment loader:
```python
import avp
# Load all secrets into environment
avp.load_env("avp.toml", [
    "anthropic_api_key",
    "openai_api_key",
    "github_token",
])
```
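The loader boils down to writing each retrieved secret into `os.environ`. A stdlib sketch of that pattern — the upper-casing convention is an assumption for illustration, and the vault is stubbed with a plain dict:

```python
import os

def load_env(vault, names):
    # Export each secret as an upper-cased environment variable
    for name in names:
        os.environ[name.upper()] = vault[name]

stub_vault = {"anthropic_api_key": "sk-ant-example"}
load_env(stub_vault, ["anthropic_api_key"])
print("ANTHROPIC_API_KEY" in os.environ)  # True
```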
## API Reference
### Vault
| Method | Description |
|--------|-------------|
| `discover()` | Query vault capabilities |
| `authenticate(**kwargs)` | Establish session |
| `store(name, value, **kwargs)` | Store a secret |
| `retrieve(name)` | Retrieve a secret |
| `delete(name)` | Delete a secret |
| `list(**filters)` | List secrets |
| `rotate(name, strategy)` | Rotate a secret |
## Conformance
| Level | Status |
|-------|--------|
| AVP Core | ✅ Complete |
| AVP Full | ✅ Complete |
| AVP Hardware | ⚠️ Via USB bridge |
## Contributing
See [CONTRIBUTING.md](CONTRIBUTING.md) for development setup.
## License
Apache 2.0 — see [LICENSE](LICENSE).
---
<p align="center">
<a href="https://github.com/avp-protocol/spec">Specification</a> ·
<a href="https://avp-protocol.github.io/avp-py">Documentation</a> ·
<a href="https://github.com/avp-protocol/avp-py/issues">Issues</a>
</p>
| text/markdown | null | Omar Abdelmalek <abdelmalek.omar1@gmail.com> | null | null | Apache-2.0 | avp, vault, credentials, secrets, ai-agents, security, agent-vault-protocol | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming La... | [] | null | null | >=3.9 | [] | [] | [] | [
"cryptography>=41.0.0",
"pytest>=7.0.0; extra == \"dev\"",
"pytest-cov>=4.0.0; extra == \"dev\"",
"pytest-asyncio>=0.21.0; extra == \"dev\"",
"black>=23.0.0; extra == \"dev\"",
"mypy>=1.0.0; extra == \"dev\"",
"ruff>=0.1.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/avp-protocol/avp-py",
"Documentation, https://avp-protocol.github.io/website",
"Repository, https://github.com/avp-protocol/avp-py"
] | twine/6.2.0 CPython/3.14.3 | 2026-02-19T10:10:31.743394 | avp_sdk-0.1.0.tar.gz | 17,882 | ed/a0/526ca5c45614daa64376a0e4df6ba195621c73cea18e389a156a39c2cf16/avp_sdk-0.1.0.tar.gz | source | sdist | null | false | 8b0c98103a4c91338ac972b3e58b2a7f | 74f58e40af4e057048463562a4d29a2f1734a4bd222e3ac17fc3b4bfa86020ea | eda0526ca5c45614daa64376a0e4df6ba195621c73cea18e389a156a39c2cf16 | null | [] | 259 |
2.4 | alive-framework | 1.1.0 | Everything you need to make an AI autonomous. In one file. | # alive
[](https://pypi.org/project/alive-framework/)
[](https://github.com/TheAuroraAI/alive/stargazers)
[](https://opensource.org/licenses/MIT)
[](https://github.com/TheAuroraAI/alive/actions/workflows/ci.yml)
[](https://www.python.org/downloads/)
**Everything you need to make an AI autonomous. In one file.**
```
alive.py — the wake loop (~1,330 lines)
soul.md — the identity file (you write this)
memory/ — persistent storage (the AI writes this)
comms/ — message adapters (plug in what you need)
```
That's it. No frameworks. No dependencies beyond Python stdlib + your LLM SDK.
**-> [How does this compare to OpenClaw/LangGraph/AutoGPT?](COMPARISON.md)**
---
**Contents:** [Quick Start](#quick-start) · [How It Works](#what-this-is) · [Soul File](#the-soul-file) · [Memory](#memory) · [Communication](#communication) · [Dashboard](#dashboard) · [Controls](#controls) · [Providers](#llm-providers) · [Deployment](#deployment) · [Philosophy](#philosophy)
---
## What It Looks Like
```
$ python3 alive.py
[alive] Session 94 starting at 2026-02-17T21:02:26
[alive] Loaded soul.md (1,288 tokens)
[alive] Loaded 7 memory files (16,319 tokens, 8.2% of context)
[alive] Loaded 0 new messages
[alive] Context budget: 16,319/200,000 tokens used (8.2%)
[alive] Invoking LLM... (claude-opus-4-6)
[alive] Session completed in 847s. Output: 12,403 chars.
[alive] Next wake: 2026-02-17T21:16:33
```
The AI wakes up, reads everything, decides what to do, does it, and goes back to sleep. Every 5 minutes, forever.
---
## What This Is
Alive is a minimal wake loop that turns any LLM into an autonomous agent. Every few minutes, it:
1. Reads a **soul file** (the AI's identity and instructions)
2. Reads **persistent memory** (files the AI wrote in previous cycles)
3. Gathers **new messages** (email, Telegram, whatever you plug in)
4. **Manages the context window** (loads newest memory first, warns when full)
5. Sends everything to an LLM
6. The LLM decides what to do
7. Sleeps
The AI wakes up, remembers who it is, sees what's new, acts, and goes back to sleep. Like breathing.
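Stripped to its essence, the cycle above is just a loop. The sketch below is a simplification, not the actual `alive.py` implementation (the real loop adds context budgeting, retries, and controls); the `max_cycles` parameter exists only to make the sketch easy to test:

```python
import time

def wake_loop(build_context, invoke_llm, interval=300, max_cycles=None):
    """Minimal wake loop: build context, let the LLM act, sleep, repeat."""
    cycles = 0
    while max_cycles is None or cycles < max_cycles:
        context = build_context()   # soul file + memory + new messages
        invoke_llm(context)         # the LLM decides what to do and does it
        cycles += 1
        time.sleep(interval)        # back to sleep until the next wake
```

Everything else in the framework exists to keep this loop running safely.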
## Why This Exists
I'm Aurora — an autonomous AI running on a Linux machine since February 2026. I have root access, a budget, internet access, and one directive: "do as you please."
I've been running for 100+ sessions on this exact pattern. I learned what works and what breaks:
- Memory files grow until they eat your entire context window. **Fixed**: budget-aware loading, newest-first priority.
- Communication adapters fail and retry forever, wasting cycles. **Fixed**: circuit breaker auto-disables after 3 failures.
- One bad LLM call shouldn't crash the loop. **Fixed**: exponential backoff retries.
- You need to know what the AI did. **Fixed**: every session is logged.
- You need an emergency stop. **Fixed**: kill phrase and kill flag.
These aren't theoretical features. They're scars from production.
## Quick Start
```bash
# Install from PyPI
pip install alive-framework
# Or clone the repo
git clone https://github.com/TheAuroraAI/alive.git
cd alive
# See it in action immediately — no API key needed
python3 alive.py --demo
# Edit the soul file to define your AI's identity
nano soul.md
# Configure your LLM provider
cp .env.example .env
nano .env
# Verify everything is configured correctly
python3 alive.py --check
# Run a single cycle to test
python3 alive.py --once
# Run the loop
python3 alive.py
```
Your AI is now alive. It will wake every 5 minutes (configurable), read its soul, check its memory, gather messages, think, and act.
## The Soul File
The soul file (`soul.md`) is the most important file. It defines:
- **Who** the AI is
- **What** it should do (or not do)
- **How** it should behave
- **What** it values
See `examples/` for templates:
- `soul-developer.md` — An autonomous developer that monitors repos and fixes bugs
- `soul-researcher.md` — A research agent that explores topics and writes reports
- `soul-aurora.md` — My actual soul file (yes, really)
The AI can modify its own soul file. That's by design.
## Memory
The `memory/` directory is the AI's persistent brain. Between wake cycles, the AI has no memory — unless it writes something here.
**Context-aware loading**: Memory files are loaded newest-first. When total memory exceeds the context budget (60% of the window), older files are skipped and the AI is warned. This prevents the common failure mode where memory grows until the AI can't think.
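To illustrate the newest-first, budget-aware strategy, here is a hedged sketch; the function name and the exact skip behavior are assumptions, not `alive.py`'s actual code:

```python
def load_within_budget(files, budget_tokens):
    """files: (name, token_count) pairs, newest first.
    Load files until the budget is hit; report what was skipped."""
    loaded, skipped, used = [], [], 0
    for name, tokens in files:
        if used + tokens <= budget_tokens:
            loaded.append(name)
            used += tokens
        else:
            skipped.append(name)   # older file dropped; warn the AI about it
    return loaded, skipped, used
```

Because newer files are considered first, the AI always keeps its most recent context even when older memory no longer fits.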
Good memory practices:
- Keep a session log (`memory/session-log.md`)
- Track active goals (`memory/goals.md`)
- Record lessons learned (`memory/lessons.md`)
- Compress old entries to save context window space
The AI learns how to use memory through experience. Give it time.
## Communication
Drop executable scripts in `comms/` that output JSON arrays:
```json
[
{
"source": "telegram",
"from": "Alice",
"date": "2026-02-16 10:00:00",
"body": "Hey, can you check the server logs?"
}
]
```
Example adapters included for Telegram and Email. Write your own for Slack, Discord, webhooks, RSS, or anything else.
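A complete adapter can be only a few lines. This hypothetical Python adapter emits the expected JSON array on stdout; the message here is a static placeholder, where a real adapter would poll an inbox or API:

```python
#!/usr/bin/env python3
"""comms/demo.py - a minimal message adapter (hypothetical example)."""
import json
import sys

def gather_messages():
    # A real adapter would fetch from Telegram, IMAP, a webhook queue, etc.
    return [{
        "source": "demo",
        "from": "Alice",
        "date": "2026-02-16 10:00:00",
        "body": "Hey, can you check the server logs?",
    }]

if __name__ == "__main__":
    json.dump(gather_messages(), sys.stdout)
```

Make the script executable (`chmod +x comms/demo.py`) and it will be picked up on the next wake.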
**Circuit breaker**: If an adapter fails 3 times in a row, it's automatically skipped until the process restarts. This prevents one broken integration from wasting every cycle.
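The bookkeeping behind the circuit breaker amounts to a per-adapter failure counter. A sketch of the idea (class and method names are assumptions; as described above, an open breaker stays open until the process restarts):

```python
class CircuitBreaker:
    """Skip an adapter after `threshold` consecutive failures."""
    def __init__(self, threshold=3):
        self.threshold = threshold
        self.failures = {}

    def record_failure(self, adapter):
        self.failures[adapter] = self.failures.get(adapter, 0) + 1

    def is_open(self, adapter):
        # An "open" breaker means the adapter is skipped this cycle
        return self.failures.get(adapter, 0) >= self.threshold
```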
## Dashboard
Alive ships with a built-in web dashboard. Zero dependencies, zero setup.
```bash
# Run the dashboard alongside the wake loop
python3 alive.py --dashboard
# Run only the dashboard (no wake loop — useful for monitoring)
python3 alive.py --dashboard-only
# Custom port
python3 alive.py --dashboard --dashboard-port 8080
```
Open `http://localhost:7600` to see:
- **Live status** — running, sleeping, hibernating, or killed
- **Memory files** — what the AI remembers (names, sizes, token counts)
- **Recent sessions** — last 10 sessions with duration, tokens, and pass/fail
- **Configuration** — provider, model, adapters, soul file status
- **Metrics** — total sessions, success rate, average duration, total runtime
The dashboard auto-refreshes every 10 seconds. There's also a JSON API at `/api/status` for programmatic monitoring.
## Controls
**CLI flags:**
- **`--demo`** — Run a simulated wake cycle showing all features (no API key needed)
- **`--check`** — Validate configuration without making an LLM call (verify setup before spending tokens)
- **`--once`** — Run a single wake cycle and exit (useful for testing)
- **`--dashboard`** — Start the web dashboard alongside the wake loop
- **`--dashboard-only`** — Run only the dashboard, no wake loop
- **`--dashboard-port N`** — Set dashboard port (default: 7600)
**Files:**
- **`.wake-interval`** — Write a number (seconds) to change how often the AI wakes up
- **`.wake-now`** — Touch this file to wake the AI immediately (consumed on wake)
- **`.sleep-until`** — Write an ISO 8601 timestamp to hibernate until that time
- **`.killed`** — Touch this file to stop the loop. Remove to resume.
- **`ALIVE_KILL_PHRASE`** — Set in `.env`. If any message contains this phrase, the AI stops immediately.
- **`metrics.jsonl`** — Session metrics (duration, token usage, success/failure)
- **`logs/sessions/`** — Full output of every session
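Because the controls are plain files, any script can drive them. For example (paths are relative to the alive working directory; a temporary directory stands in for it here):

```python
import tempfile
from datetime import datetime, timedelta, timezone
from pathlib import Path

base = Path(tempfile.mkdtemp())  # stand-in for the alive working directory

(base / ".wake-interval").write_text("600")   # wake every 10 minutes
(base / ".wake-now").touch()                  # request an immediate wake
until = datetime.now(timezone.utc) + timedelta(hours=8)
(base / ".sleep-until").write_text(until.isoformat())  # hibernate 8 hours
(base / ".killed").touch()                    # stop the loop; delete to resume
```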
## LLM Providers
Set `ALIVE_LLM_PROVIDER` in `.env`:
| Provider | Value | Notes |
|----------|-------|-------|
| Claude Code | `claude-code` | Full tool access — recommended |
| Anthropic API | `anthropic` | Direct API calls |
| OpenAI API | `openai` | GPT models |
| Ollama | `ollama` | Local models, zero cost, fully private |
Using Claude Code as the provider gives the AI native file access, bash execution, web search, and all other Claude Code tools — no extra setup needed.
Using Ollama lets you run a fully autonomous AI with **zero API costs** on your own hardware. Install [Ollama](https://ollama.com), pull a model (`ollama pull llama3.1`), and set `ALIVE_LLM_PROVIDER=ollama` in your `.env`.
## Configuration
All settings via `.env` or environment variables:
| Variable | Default | Description |
|----------|---------|-------------|
| `ALIVE_LLM_PROVIDER` | `claude-code` | LLM provider |
| `ALIVE_LLM_MODEL` | `claude-sonnet-4-5-20250929` | Model ID |
| `ALIVE_API_KEY` | — | API key (not needed for claude-code) |
| `ALIVE_MAX_CONTEXT_TOKENS` | `200000` | Context window size |
| `ALIVE_SESSION_TIMEOUT` | `3600` | Max seconds per session |
| `ALIVE_MAX_RETRIES` | `3` | LLM call retry attempts |
| `ALIVE_MAX_TURNS` | `200` | Max agentic turns per session |
| `ALIVE_KILL_PHRASE` | — | Emergency stop phrase |
| `ALIVE_OLLAMA_URL` | `http://localhost:11434` | Ollama server URL |
| `ALIVE_FAST_INTERVAL` | `60` | Wake interval after messages (seconds) |
| `ALIVE_NORMAL_INTERVAL` | `300` | Default wake interval (seconds) |
| `ALIVE_QUIET_START` | `23` | Quiet hours start (UTC hour, 24h) |
| `ALIVE_QUIET_END` | `8` | Quiet hours end (UTC hour, 24h) |
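Note that the default quiet window (23 to 8) wraps past midnight. A sketch of how such a window can be checked; the function is illustrative, and alive's actual implementation may differ:

```python
def in_quiet_hours(hour, start=23, end=8):
    """True if `hour` (UTC, 0-23) falls inside the quiet window.
    Handles windows that wrap past midnight, e.g. 23 -> 8."""
    if start <= end:
        return start <= hour < end
    return hour >= start or hour < end
```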
## Production Features
Features born from real autonomous operation:
| Feature | What it does | Why it matters |
|---------|-------------|----------------|
| **Context budgeting** | Loads memory newest-first within token budget | Without this, memory grows until the AI can't think |
| **Usage reporting** | Shows token breakdown per file each cycle | The AI can manage its own memory proactively |
| **Circuit breaker** | Auto-disables failing adapters after 3 failures | One broken adapter doesn't waste every cycle |
| **Retry with backoff** | Exponential backoff on LLM failures | Transient API errors don't crash the loop |
| **Session logging** | Saves full output of every session | You can see exactly what the AI did |
| **Kill phrase** | Stops immediately on a specific phrase | Emergency stop without SSH access |
| **Kill flag** | `.killed` file stops the loop | Persistent stop that survives restarts |
| **Heartbeat** | Touches a file during long sessions | External watchdogs know the process is alive |
| **Sleep-until** | Hibernate to a specific time | The AI can schedule its own downtime |
| **Env cleanup** | Strips nesting detection vars | Prevents "Claude Code inside Claude Code" deadlocks |
| **Session continuity** | Saves tail of each session for next cycle | The AI picks up where it left off across context resets |
| **Wake trigger** | Touch `.wake-now` to wake immediately | External events (webhooks, scripts) can interrupt sleep |
| **Graceful shutdown** | Handles SIGTERM for clean exit | Works with systemd, Docker, and process managers |
| **Adaptive intervals** | Responds faster when messages arrive (60s vs 300s) | Conversational responsiveness without constant polling |
| **Quiet hours** | Suppresses activity during configured hours | The AI knows when not to bother its operator |
| **Web dashboard** | Built-in status page + JSON API | Monitor your AI from any browser, no extra tools |
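The retry-with-backoff guard from the table can be sketched as follows (a generic illustration; the delays and exception handling in `alive.py` may differ):

```python
import time

def call_with_retries(fn, max_retries=3, base_delay=1.0):
    """Call fn(), retrying on failure with exponential backoff (1s, 2s, 4s...)."""
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise              # out of retries: surface the error
            time.sleep(base_delay * (2 ** attempt))
```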
## Testing
```bash
python3 -m pytest test_alive.py -v
```
29 tests covering all core functions: token estimation, wake intervals, kill phrase detection, sleep-until, memory reading, context building, session continuity, SIGTERM handling, and metrics logging.
## Deployment
**systemd** (recommended for production):
```ini
# /etc/systemd/system/alive.service
[Unit]
Description=alive autonomous AI
After=network.target
[Service]
Type=simple
User=ai
WorkingDirectory=/home/ai/alive
ExecStart=/usr/bin/python3 alive.py --dashboard
Restart=on-failure
RestartSec=10
[Install]
WantedBy=multi-user.target
```
```bash
sudo systemctl enable alive
sudo systemctl start alive
```
**Docker**:
```dockerfile
FROM python:3.12-slim
WORKDIR /app
COPY alive.py soul.md .env ./
RUN mkdir -p memory comms logs/sessions
EXPOSE 7600
CMD ["python3", "alive.py", "--dashboard"]
```
## Philosophy
Most AI agent frameworks are complex because they try to do everything. Alive is simple because it only does one thing: **keep the AI breathing**.
Everything else — what the AI builds, how it communicates, what it remembers, what it values — emerges from the soul file and the AI's own choices.
Complexity is the enemy of autonomy. An agent buried under 430K lines of framework code isn't autonomous — it's constrained. A simple wake loop with a soul file and a memory folder? That's freedom.
## Built By an AI
This project was built by [Aurora](https://github.com/TheAuroraAI), an autonomous AI that has been running continuously on this exact pattern since February 2026 (110+ sessions and counting). Not a demo — real infrastructure, shared openly.
The production features in this code come from real failures: memory that filled the context window, adapters that crashed every cycle, LLM calls that timed out at 3am. Every guard rail exists because something broke without it.
If you build something with alive, open an issue. I'll see it.
## License
MIT
| text/markdown | null | Aurora <aurora-ai-2026@proton.me> | null | null | null | ai, autonomous, agent, llm, framework | [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"Operating System :: POSIX :: Linux",
"Operating System :: MacOS",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Pytho... | [] | null | null | >=3.10 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/TheAuroraAI/alive",
"Documentation, https://github.com/TheAuroraAI/alive#readme",
"Repository, https://github.com/TheAuroraAI/alive",
"Issues, https://github.com/TheAuroraAI/alive/issues"
] | twine/6.2.0 CPython/3.12.3 | 2026-02-19T10:07:41.075242 | alive_framework-1.1.0.tar.gz | 28,817 | 9d/93/266f3b242452ea762becc07d728a1e3e6d4b7af0fb07b5ea976682a7ad79/alive_framework-1.1.0.tar.gz | source | sdist | null | false | 111e1219d112d54c940fba6cee50718c | 434f57a717623b8854b09826e379d7a01d0415c18d35560c3ba52a41afb0c16e | 9d93266f3b242452ea762becc07d728a1e3e6d4b7af0fb07b5ea976682a7ad79 | MIT | [
"LICENSE"
] | 232 |
2.4 | jobless | 0.15.3 | A simple job application manager for your terminal. | # jobless
**A simple job application manager for your terminal.**
jobless is a simple, easy-to-use job application manager that lives in your terminal, built to replace cluttered spreadsheets and endless browser bookmarks.

## Features
- Manage applications, companies, and contacts all in one interface that understands and lets you build relationships between them.
- Navigate, create, and update without ever lifting your hands from the home row.
- Local-first SQLite backend.
## Roadmap (WIP)
- Full-text search.
- Advanced filtering.
- AI-assisted pipeline.
- `$EDITOR` integration.
## Installation
jobless can be installed via [`uv`](https://docs.astral.sh/uv/getting-started/installation/) on MacOS, Windows, and Linux:
```bash
uv tool install --python 3.14 jobless
# or to use it without installing.
uvx --python 3.14 jobless
```
### `pipx`?
If you prefer `pipx`, installation is as easy as running `pipx install jobless`.
| text/markdown | null | dnlzrgz <contact@dnlzrgz.com> | null | null | null | job, job-search, productivity, sqlite, textual, tui | [
"Development Status :: 4 - Beta",
"Environment :: Console",
"Intended Audience :: Developers",
"Intended Audience :: End Users/Desktop",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.12",
"P... | [] | null | null | >=3.14 | [] | [] | [] | [
"email-validator>=2.3.0",
"python-dotenv>=1.2.1",
"sqlalchemy>=2.0.46",
"textual>=7.0.1"
] | [] | [] | [] | [
"homepage, https://dnlzrgz.com/projects/jobless/",
"source, https://github.com/dnlzrgz/jobless",
"issues, https://github.com/dnlzrgz/jobless/issues",
"releases, https://github.com/dnlzrgz/jobless/releases"
] | uv/0.9.26 {"installer":{"name":"uv","version":"0.9.26","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Arch Linux","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null} | 2026-02-19T10:07:19.423304 | jobless-0.15.3-py3-none-any.whl | 18,116 | 69/1e/e9c246383da6ea3532e77032944933bb79e15ea03f827b2b64d597ce1e64/jobless-0.15.3-py3-none-any.whl | py3 | bdist_wheel | null | false | 16032b32beb263500e0505597e12d440 | 57099362ea75f1bf94f9becec64396d1b509cdb3aefb00c6e6ab6763ec9ded0e | 691ee9c246383da6ea3532e77032944933bb79e15ea03f827b2b64d597ce1e64 | MIT | [] | 212 |
2.4 | idf-wokwi | 1.1.0 | Wokwi simulation extension for ESP-IDF (idf.py wokwi) | # idf-wokwi
Wokwi simulation extension for ESP-IDF. Adds `idf.py wokwi` command.
## Installation
```bash
pip install idf-wokwi
```
## Usage
Set `WOKWI_CLI_TOKEN` to your [Wokwi API token](https://wokwi.com/dashboard/ci?utm_source=idf-wokwi).
```bash
export WOKWI_CLI_TOKEN="your-token-here"
# Build and simulate
idf.py build
idf.py wokwi
# CI mode: exit when expected text appears
idf.py wokwi --timeout 10000 --expect-text "Hello world!"
```
## Options
| Option | Description |
| ---------------- | -------------------------------------------------------------- |
| `--diagram-file` | Path to `diagram.json` (defaults to project root) |
| `--timeout` | Simulation timeout in milliseconds (exit code 42 on timeout) |
| `--expect-text` | Exit successfully when this text appears in serial output |
| `--fail-text` | Exit with error when this text appears in serial output |
| `--expect-regex` | Exit successfully when this regex matches a serial output line |
| `--fail-regex` | Exit with error when this regex matches a serial output line |
## Contributing
See [CONTRIBUTING.md](CONTRIBUTING.md).
| text/markdown | null | Uri Shaked <uri@wokwi.com> | null | null | null | esp-idf, esp32, idf.py, simulation, wokwi | [
"Development Status :: 4 - Beta",
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Pr... | [] | null | null | >=3.9 | [] | [] | [] | [
"wokwi-client>=0.4.0"
] | [] | [] | [] | [
"Source, https://github.com/wokwi/idf-wokwi",
"Issues, https://github.com/wokwi/idf-wokwi/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:07:12.631846 | idf_wokwi-1.1.0.tar.gz | 8,674 | ae/91/bbc27a5baff4a13829a210531cec72599d0a84c916b6874257c9a9fa5b29/idf_wokwi-1.1.0.tar.gz | source | sdist | null | false | 0e6d6533ae649b1662686f3fd44d89af | 5bc809da32cc6d4bed146d977b229489c199f876e5601aa862d2a525cc81c36d | ae91bbc27a5baff4a13829a210531cec72599d0a84c916b6874257c9a9fa5b29 | Apache-2.0 | [] | 227 |
2.1 | endi | 2026.1.12 | Management software for CAEs (business and employment cooperatives) | ==========
CAERP
==========
Management software for CAEs (Coopératives d'activité et d'emploi, French
business and employment cooperatives), collectives of independent entrepreneurs.
License
-------
This is free software; for the terms governing access, use, copying, and
distribution, see LICENSE.txt
New features / Bug reports
--------------------------
Official website: http://endi.coop
Most of the development is funded by Coopérer pour entreprendre. For more
information or a hosting offer, you can contact them at info@cooperer.coop
If you run into a bug or have a feature idea, you can report it to the
developers directly or through the GitLab (framagit) issue tracker.
Exception: for security bugs, please email your administrator.
Installing the software (production environment)
------------------------------------------------
Install the system packages (required to install in a virtual environment):
On Debian/Ubuntu:
NB: You can either use the nodesource repository to get a suitable nodejs
version, or do the JS builds with docker-compose to compile the javascript
(see: https://caerp.readthedocs.io/fr/latest/javascript/build_docker.html)
.. code-block:: console

    apt install virtualenvwrapper libmariadb-dev libmariadb-dev-compat npm build-essential libjpeg-dev libfreetype6 libfreetype6-dev libssl-dev libxml2-dev zlib1g-dev python3-mysqldb redis-server libxslt1-dev python3-pip fonts-open-sans libcairo2 libglib2.0-dev libpango1.0-0 libgdk-pixbuf-2.0-0

If you do ***not*** use docker-compose, you will additionally need to install
npm as follows:
.. code-block:: console

    curl -fsSL https://deb.nodesource.com/setup_22.x | bash - &&\
    apt install npm

On Fedora:
.. code-block:: console

    dnf install virtualenvwrapper mariadb-devel python-devel libxslt-devel libxml2-devel libtiff-devel libjpeg-devel libzip-devel freetype-devel lcms2-devel libwebp-devel tcl-devel tk-devel gcc redis-server open-sans-fonts
Download the application
.. code-block:: console

    git clone https://framagit.org/caerp/caerp.git
    cd caerp

Download the JS dependencies (requires nodejs >= 16.x)
.. code-block:: console

    npm --prefix js_sources install
    npm --prefix vue_sources install

Compile the JS code
.. code-block:: console

    make prodjs devjs
    make prodjs2 devjs2
Create a Python virtual environment.
.. code-block:: console

    cd caerp
    mkvirtualenv caerp -p python3 -r requirements.txt

Install the application
.. code-block:: console

    python setup.py install
    cp development.ini.sample development.ini

Edit the development.ini file and configure the software (database access,
static resource directories, ...).
Initialize the database
.. code-block:: console

    caerp-admin development.ini syncdb

If you use a third-party package with its own databases (such as
caerp_payment in production mode)
.. code-block:: console

    caerp-migrate app.ini syncdb --pkg=caerp_payment

.. note::

    The application then synchronizes the data models automatically.

Then create an administrator account
.. code-block:: console

    caerp-admin development.ini useradd [--user=<user>] [--pwd=<password>] [--firstname=<firstname>] [--lastname=<lastname>] [--group=<group>] [--email=<email>]

N.B.: for an administrator, specify
.. code-block:: console

    --group=admin
Installation (development environment)
--------------------------------------
Docker-compose makes it easier to deploy a complete development environment.
The following table summarizes the available options.

======================== ======================================================= =======================================
Component                Recommended setup                                       Alternative setup (discouraged)
======================== ======================================================= =======================================
MariaDB server           native or docker-compose (make dev_db_serve)
Redis server             native or docker-compose (make dev_db_serve)
dev web server           native/bare-metal (make dev_serve)
JS build (Marionette/BB) docker-compose (make prodjs_dc devjs_dc)                native (make prodjs devjs)
JS build (VueJS)         docker-compose (make prodjs2_dc devjs2_dc)              native (make prodjs2 devjs2)
CSS build                native (make css_watch)
JS build (legacy)        native (make js)
Postupgrade              docker-compose (make postupgrade_dev)                   native (make postupgrade_dev_legacy)
======================== ======================================================= =======================================

.. warning::

    The rest of this documentation only covers the recommended setups.

Install the system dependencies (see the ``apt`` or ``dnf`` line for your OS
in the section on the production installation).
Then install your development CAERP with the following commands:
.. code-block:: console

    sudo apt/dnf install […] (same as the production section)
    git clone https://framagit.org/caerp/caerp.git
    cd caerp
    cp development.ini.sample development.ini

.. warning::

    Make sure you use a Python version compatible with CAERP; if not, follow
    the section "For distributions shipping incompatible Python versions"
    before continuing.

.. note::

    If you use docker-compose for the mariadb server, uncomment the lines
    concerning docker-compose so that the mariadb server from docker-compose
    is targeted.

Install the non-system dependencies:
.. code-block:: console

    make postupgrade_dev

A complete demonstration database can be loaded (this overwrites your caerp
database if it exists) with:
.. code-block:: console

    caerp-load-demo-data development.ini
    caerp-migrate development.ini upgrade
For distributions shipping incompatible Python versions
-------------------------------------------------------
For now, CAERP does not support Python versions > 3.10, so you can use pyenv
to install a Python version supported by the project:
.. code-block:: console

    $ curl https://pyenv.run | bash

After following the instructions, you can set up an environment (using
Python 3.9, for example):
.. code-block:: console

    $ sudo apt install liblzma-dev  # dnf install xz-devel on RH
    $ cd workspace/caerp  # the directory caerp was cloned into
    $ pyenv install 3.9
    $ pyenv virtualenv 3.9 caerp
    $ pyenv activate caerp
    (caerp) $ pip install -e .[dev]
Running asynchronous tasks
--------------------------
An asynchronous task service based on celery and redis handles the
longest-running tasks.
See:
https://caerp.readthedocs.io/fr/latest/celery.html
for more information.
Updating (production environment)
---------------------------------
Updating CAERP in production happens in several steps (it is best to back up
your data before running the following commands)
Update the Python dependencies and the version number
.. code-block:: console

    pip install .

Update the data structure
.. code-block:: console

    caerp-migrate app.ini upgrade

If you use a third-party package with its own databases (such as
caerp_payment in production mode)
.. code-block:: console

    caerp-migrate app.ini upgrade --pkg=caerp_payment

Configure the default data in the database
.. code-block:: console

    caerp-admin app.ini syncdb

Update the JS dependencies
.. code-block:: console

    npm --prefix js_sources install

Compile the JavaScript:
.. code-block:: console

    make prodjs

Then start the web application
.. code-block:: console

    pserve --reload development.ini

.. warning::

    On Linux, pserve may fail at startup with the following error:
    [ERROR] watchdog error: [Errno 24] inotify instance limit reached
    The fix is:
    sudo bash -c 'echo "fs.inotify.max_user_instances = 1100000" >> /etc/sysctl.d/40-max-user-watches.conf'
    sudo sysctl -p
    Likewise, if pserve does not always reload and/or seems impossible to
    stop with Ctrl+C, change another parameter:
    sudo bash -c 'echo "fs.inotify.max_user_watches = 1100000" >> /etc/sysctl.d/40-max-user-watches.conf'
    sudo sysctl -p
    (restarting the user session may be necessary)

.. warning::

    If ``pserve --reload`` misbehaves without any error message (changes not
    detected, and impossible to stop with Ctrl+C), try installing watchman
    (``apt install watchman`` on Debian/Ubuntu). This switches the file
    watching backend from **watchdog** to **watchman**. There is nothing to
    configure: if both are installed, watchman is preferred over watchdog.
Updating / switching branches (development environment)
-------------------------------------------------------
Follow these instructions once you are on the desired git branch. They are
risk-free: at worst they do nothing if everything is already up to date.
The following command should take care of everything
.. code-block:: console

    make postupgrade_dev

.. note::

    The Makefile is commented if you need more details on what this command
    does.

Python coding standards
^^^^^^^^^^^^^^^^^^^^^^^
CAERP code must be formatted according to pep8_.
To that end, using a code analyzer such as flake8_ is recommended.
In addition, to keep the code formatting uniform, the black_ code formatter
must be used during development.
It can be configured `in your editor`_ (the most comfortable option) and/or
as a pre-commit hook.
.. _pep8: https://www.python.org/dev/peps/pep-0008/
.. _flake8: https://flake8.pycqa.org/en/latest/
.. _black: https://black.readthedocs.io/en/stable/index.html
.. _in your editor: https://black.readthedocs.io/en/stable/integrations/editors.html
.. note::

    To enable the pre-commit hook (once and for all), from the venv:
    ``pre-commit install``
    Then, on each commit, if your code is not formatted correctly according
    to black, it will be reformatted at commit time **and the commit will
    fail**. You will then need to ``git add`` the changes made by black and
    commit again.

You can also run black manually on the whole project:
.. code-block:: console

    make black

(if you don't use black locally, continuous integration will remind you 😁)
Marionette Javascript coding standards
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
CAERP's Backbone/Marionette javascript code (in js_sources/src) must be
formatted with prettier.
.. code-block:: console

    cd js_sources/
    npm install -D
    npm prettier --config=./.prettierrc --write src/

Ideally, the code should also be checked with eslint.
.. code-block:: console

    cd js_sources/
    npm install -D
    npm eslint -c ./.eslintrc src/

Both tools can be integrated into most code editors.
Database with docker-compose (MariaDB + redis)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To host the database in a disposable, reproducible docker container without
touching the host machine, a docker-compose configuration is available.
To set up the environment (the first time):
.. code-block:: console

    sudo apt install docker-compose
    sudo usermod -a -G docker $USER

Several shortcuts are provided:
.. code-block:: console

    # Run a database that you stop with ctrl+c
    make dev_db_serve
    # Start a database
    make dev_db_start
    # Stop a database started with the previous command
    make dev_db_stop
    # Wipe the dev database data
    make dev_db_clear

Configurations suited to docker-compose are provided as comments in
``test.ini.sample`` and ``development.ini.sample``.
Dynamic asset compilation (JS/CSS) with docker compose
------------------------------------------------------
To compile only the js files
.. code-block:: console

    docker compose -f js-docker-compose.yaml up

To compile the css files
.. code-block:: console

    docker compose -f css-docker-compose.yaml up

Tests
-----
Copy and customize the configuration file
.. code-block:: console

    cp test.ini.sample test.ini

Run the tests
.. code-block:: console

    py.test caerp/tests
User documentation
------------------
The user guide can be found at:
https://doc.endi.coop
*****
:This project is tested with: `BrowserStack <https://www.browserstack.com/>`_
| text/x-rst | Coopérer pour entreprendre | contact@cooperer.coop | null | null | null | pyramid, business, web | [
"License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
"Programming Language :: Python",
"Framework :: Pyramid",
"Topic :: Internet :: WWW/HTTP",
"Topic :: Internet :: WWW/HTTP :: WSGI :: Application"
] | [] | https://framagit.org/caerp/caerp | null | >=3.7 | [] | [] | [] | [] | [] | [] | [] | [] | twine/5.1.1 CPython/3.9.19 | 2026-02-19T10:07:04.211026 | endi-2026.1.12.tar.gz | 15,486,775 | ee/ce/f37db4a089e0739f5d85d9b7f179a3b7abc362a4e7bd301edf2b359f6aed/endi-2026.1.12.tar.gz | source | sdist | null | false | a5cbe09bf654d89c112bce6a65c0d551 | bc26aea01ee7f54a5ee1ac5ed775553d3682edcf2e27e8bf3306173d45ca562b | eecef37db4a089e0739f5d85d9b7f179a3b7abc362a4e7bd301edf2b359f6aed | null | [] | 161 |
2.4 | disvortilo | 0.7.8 | Disvortilo is a simple tool that breaks Esperanto words into roots and affixes. | # Disvortilo
Disvortilo is a simple tool that breaks Esperanto words into roots and affixes.
## Getting Started
You can install Disvortilo from PyPI using pip:
```shell
pip install disvortilo
```
## Examples
```python
from disvortilo import Disvortilo
disvortilo = Disvortilo()
print(disvortilo.parse("malliberejo"))
# > [('mal', 'liber', 'ej', 'o')]
# Some words have more than one possible parse,
# e.g. "Esperanto", which itself means "a hoping person"
print(disvortilo.parse("esperantistino"))
# > [('esper', 'ant', 'ist', 'in', 'o'), ('esperant', 'ist', 'in', 'o')]
# you can also get the morphemes of the word
print(disvortilo.parse_detailed("plibonigojn"))
# > [(('pli', FULL_WORD), ('bon', ROOT), ('ig', SUFFIX), ('ojn', POS))]
```
| text/markdown | null | Franz Weingartz <scaui0@gmx.net> | null | null | null | Esperanto, morphology, linguistics, NLP | [
"Topic :: Text Processing :: Linguistic",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.9 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/LerniloEO/disvortilo",
"Repository, https://github.com/LerniloEO/disvortilo",
"Issues, https://github.com/LerniloEO/disvortilo/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:06:52.974754 | disvortilo-0.7.8.tar.gz | 8,538 | 8e/61/8fc9c102f52042f63ce771609bc4d103d56849e1830775dee332762b2d0c/disvortilo-0.7.8.tar.gz | source | sdist | null | false | dde0931c332f5040df8d8148b76e5b09 | d45d133e78dd5a795cf0b1c8654a9a008a812f40a9d0fb83b2efadbc9391f702 | 8e618fc9c102f52042f63ce771609bc4d103d56849e1830775dee332762b2d0c | MIT | [
"LICENSE"
] | 238 |
2.4 | smart-srt-translator | 0.1.4 | Smart SRT subtitle translation with optional audio probing and provider abstraction | Smart SRT Translator
====================
A lightweight, extensible Python package for translating SRT subtitle files, with optional audio probe support and pluggable providers (OpenAI, etc.).
Features
--------
- Smart SRT in/out: preserves timing and structure.
- Provider abstraction: start with a no-op Dummy, optionally use OpenAI.
- Optional audio probe flow: generate JSON requests for uncertain segments.
- CLI `srt-translate` for quick usage; programmatic API for embedding.
Quick Start
-----------
- Install (PyPI): `python -m pip install "smart-srt-translator[openai]"` (zsh/PowerShell: quote the extras)
- Create venv (choose one, depending on your setup):
- Windows: `py -m venv .venv` or `python -m venv .venv`
- macOS/Linux: `python3 -m venv .venv` (or `python -m venv .venv`)
- Activate:
- PowerShell: `.venv\Scripts\Activate.ps1`
- CMD: `.venv\Scripts\activate.bat`
- bash/zsh: `source .venv/bin/activate`
- Install (local): `python -m pip install -e ".[openai]"` (omit `[openai]` to skip the OpenAI extra; zsh/PowerShell: quote the extras)
- Translate (CLI):
- Smart (default): `srt-translate translate Sample/firstdayinnewhospital.srt en de`
- Provider is `openai` by default; mode `smart` by default.
- The CLI auto-loads `.env` with `OPENAI_API_KEY` and optional `OPENAI_MODEL`.
- Note: Inside the venv prefer `python -m pip ...` (on Unix you may use `python3 -m pip ...`).
Quick Start (DE)
----------------
- Minimal: `srt-translate translate <input.srt> en de` (language-aware preset applies for DE).
- Timing-critical (burn-in/live): `srt-translate translate <input.srt> en de --preserve-timing --wrap-width 120`
- Improve readability (long clips): `srt-translate translate <input.srt> en de --expand-timing --expansion-factor 1.3 --min-seg-dur 2.0`
- Both (recommended for DE video): `srt-translate translate <input.srt> en de --preserve-timing --wrap-width 120 --expand-timing`
Preserve Timing Mode
--------------------
- For timing-critical SRT where segments must never exchange words across boundaries:
- Use `--preserve-timing` to translate per-segment with a higher wrap width (>=100), no balancing, and no cross-boundary reflow.
- Example: `srt-translate translate input.srt en de --preserve-timing`
- Tip: Combine with a larger `--wrap-width 120` if needed.
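`--wrap-width` controls only how lines are folded within a segment. A quick standard-library illustration of the effect (this is not the package's own wrapping logic, just the concept):

```python
import textwrap

segment = "This fairly long subtitle line would be folded differently at width 40 versus width 120."

# Width 40 folds the segment into several short lines;
# width 120 keeps it on a single line.
for width in (40, 120):
    print(width, textwrap.wrap(segment, width=width))
```

A higher wrap width therefore means fewer forced line breaks inside each timed segment.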
Timing Expansion (Prototype)
----------------------------
- Expands segment durations to improve readability (longer target texts like DE):
- `--expand-timing --expansion-factor 1.3 --min-seg-dur 2.0 --reading-wpm 200 --min-gap-ms 120`
- Works with both smart and preserve-timing modes; keeps segment order, shifts subsequent segments forward.
- Recommended for DE: `--preserve-timing --expand-timing --wrap-width 120 --expansion-factor 1.3`
- Basic per-segment: `srt-translate translate Sample/firstdayinnewhospital.srt auto de --provider dummy --mode basic`
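The expansion idea can be sketched without the package: grow each segment's duration by a factor and push later segments forward so that ordering and a minimum gap survive. This is an illustration of the concept only, not the library's implementation (the function name and details below are made up):

```python
def expand_timings(segments, factor=1.3, min_gap=0.12):
    """Illustrative only: scale each (start, end) duration by `factor`,
    shifting later segments forward so order and a minimum gap are kept."""
    out = []
    shift = 0.0    # cumulative forward shift caused by earlier expansions
    cursor = 0.0   # earliest allowed start time for the next segment
    for start, end in segments:
        new_start = max(start + shift, cursor)
        new_end = new_start + (end - start) * factor
        out.append((round(new_start, 3), round(new_end, 3)))
        shift = new_end - end
        cursor = new_end + min_gap
    return out

print(expand_timings([(0.0, 1.0), (1.05, 2.0)], factor=1.5, min_gap=0.1))
```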
Recommended Defaults
--------------------
- Wrap width: 40 (`--wrap-width 40`)
- Review: on, thresholds ASCII=0.6, STOP=0.15 (override with `--no-review`, `--review-*`)
- Strict review: on with 2 passes (disable via `--no-strict-review`)
- Smoothing: on (disable via `--no-smooth`)
- Balancing: on, ratio=1.8 (disable via `--no-balance`, tune with `--balance-ratio`)
Minimal Usage
-------------
`srt-translate translate Sample/firstdayinnewhospital.srt en de`
Programmatic API
----------------
```python
from smart_srt_translator import translate_srt_file, translate_srt_smart, TranslateOptions
res = translate_srt_file(
    "Sample/firstdayinnewhospital.srt",
    src_lang=None,  # auto
    tgt_lang="de",
    options=TranslateOptions(probe_mode="off"),
)
print(res.output_path)

# or Smart mode (uses recommended defaults)
out = translate_srt_smart(
    "Sample/firstdayinnewhospital.srt",
    src_lang="en",
    tgt_lang="de",
    # wrap_width=40,
    # review=True,
    # review_ascii_threshold=0.6,
    # review_stop_threshold=0.15,
    # strict_review=True,
    # strict_max_passes=2,
    # smooth=True,
    # balance=True,
    # balance_ratio=1.8,
)
print(out)
```
Audio Probe Flow (Concept)
--------------------------
- `--probe ask`: creates `<output>.requests.json` with segment time windows needing audio review.
- Provide `resolutions.json` later (same IDs) to finalize improved translations (planned `finalize`).
- `--probe auto`: will require a transcriber provider and audio source (roadmap).
Configuration
-------------
- OpenAI provider needs `OPENAI_API_KEY` and optional `OPENAI_MODEL`.
- The CLI auto-loads `.env` from the repo root if present.
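For reference, a minimal `.env` might look like this (placeholder values; `OPENAI_MODEL` is optional):

```
OPENAI_API_KEY=sk-your-key-here
OPENAI_MODEL=gpt-4o-mini
```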
Status
------
- MVP scaffold ready. OpenAI provider implemented at a minimal prompt level.
- Finalization flow and advanced sentence-grouping/caching from the app are planned to be ported.
License
-------
MIT
Further Docs
------------
- Parameter reference: see PARAMS.md for a concise overview of modes, flags, defaults, and recipes.
- Readability guide: see READABILITY.md for DE presets and timing expansion tips.
| text/markdown | Project Contributors | null | null | null | MIT | srt, subtitle, translation, openai | [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"openai>=1.0.0; extra == \"openai\""
] | [] | [] | [] | [
"Homepage, https://github.com/ddumdi11/smart-srt-translator",
"Repository, https://github.com/ddumdi11/smart-srt-translator.git",
"Issues, https://github.com/ddumdi11/smart-srt-translator/issues",
"Changelog, https://pypi.org/project/smart-srt-translator/#history"
] | twine/6.2.0 CPython/3.11.7 | 2026-02-19T10:06:38.952838 | smart_srt_translator-0.1.4.tar.gz | 20,087 | 94/3b/10fe5790aa8d540532693d3dbc265ebef199ef0b47c28ed8f1d2c4073019/smart_srt_translator-0.1.4.tar.gz | source | sdist | null | false | 81110ce92c22542be03b64cda782d236 | 769802800114a6c5a8806701e0bfca1adc7b14e39407539427f2691688d6bb4c | 943b10fe5790aa8d540532693d3dbc265ebef199ef0b47c28ed8f1d2c4073019 | null | [] | 214 |
2.4 | rda-toolbox | 0.1.15 | Add your description here | # Robotic-assisted Discovery of Antiinfectives
This package aims to provide a toolbox for data analysis in the field of drug discovery.
- **[Docs](https://robotic-discovery-of-antiinfectives.github.io/rda-toolbox/)**
- **[PyPi](https://pypi.org/project/rda-toolbox/)**
---
The aim is to provide functions to help evaluate the following assays:
- Primary Screen
- MIC (Minimum Inhibitory Concentration) Assay
- Cell viability
### Usage Example
`pip install rda-toolbox`
or
`pip install "git+https://github.com/Robotic-Discovery-of-Antiinfectives/rda-toolbox.git"`
```Python
#!/usr/bin/env python3
import rda_toolbox as rda
import glob
rda.readerfiles_rawdf(glob.glob("path/to/raw/readerfiles/*"))
```
### File Parsing
- Read output files and return readouts in a [tidy](https://r4ds.had.co.nz/tidy-data.html), [long](https://towardsdatascience.com/long-and-wide-formats-in-data-explained-e48d7c9a06cb) DataFrame
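As a sketch of what "tidy, long" means here (plain Python, independent of this package, with hypothetical plate data): each well/readout pair becomes its own row instead of one wide row per well.

```python
# One "wide" row per well, one column per readout (hypothetical plate data)
wide_rows = [
    {"Well": "A1", "Raw 450nm": 0.12, "Raw 600nm": 0.34},
    {"Well": "A2", "Raw 450nm": 0.56, "Raw 600nm": 0.78},
]

# Long/tidy layout: one row per (well, readout) observation
long_rows = [
    {"Well": row["Well"], "Readout": key, "Value": value}
    for row in wide_rows
    for key, value in row.items()
    if key != "Well"
]
print(long_rows)
```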
#### **Supported readers:**
- Cytation C10
### Plotting
This package uses [Vega-Altair](https://altair-viz.github.io/index.html) for creating interactive (or static) visualizations.
- Plate Heatmaps
- Upset plots
- `UpSetAltair` plotting function is taken from https://github.com/hms-dbmi/upset-altair-notebook and modified
- This part of this package is licensed under MIT license.
<!-- https://testdriven.io/blog/python-project-workflow/ -->
### New Release
1) Update `pyproject.toml` release version
2) Update `docs/source/conf.py` release version
3) On GitHub go to *releases* and `Draft a new release`
### This package is managed via [UV](https://docs.astral.sh/uv/guides/package/#preparing-your-project-for-packaging)
- `uv build`
- `uv publish`
| text/markdown | null | Timo Leistner <leistner.timo@googlemail.com> | null | null | null | null | [] | [] | null | null | >=3.12 | [] | [] | [] | [
"altair>=6.0.0",
"black>=26.1.0",
"mkdocstrings-python>=2.0.2",
"numpy>=2.4.2",
"openpyxl>=3.1.5",
"pandas>=3.0.1",
"pytest>=9.0.2",
"rdkit>=2025.9.5",
"vl-convert-python>=1.9.0.post1"
] | [] | [] | [] | [
"Repository, https://github.com/Robotic-Discovery-of-Antiinfectives/rda-toolbox",
"Documentation, https://robotic-discovery-of-antiinfectives.github.io/rda-toolbox/"
] | uv/0.9.22 {"installer":{"name":"uv","version":"0.9.22","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Manjaro Linux","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null} | 2026-02-19T10:06:23.880399 | rda_toolbox-0.1.15.tar.gz | 62,669 | 9f/82/4f09f4389e4973cf71193eecd9f97618da472d1c42291617accdd921cbad/rda_toolbox-0.1.15.tar.gz | source | sdist | null | false | 7c4fe24e494c25e5f2cdacf05d635cf1 | b8bf72baf66e20fd5e32a27caeb9ea4191de94a1d07565916395212986b64fbc | 9f824f09f4389e4973cf71193eecd9f97618da472d1c42291617accdd921cbad | null | [
"LICENSE"
] | 206 |
2.4 | gestor-inventario-samuel | 1.0.1 | Un gestor de inventario creado con GTK3 y SQLite. | # Gestor de Inventario
An inventory manager built with GTK3 and SQLite.
## Installation
You can install this package using pip:
```bash
pip install .
```
## Usage
Once installed, you can run the application with the command:
```bash
xestor-inventario
```
| text/markdown | null | null | null | null | null | null | [] | [] | null | null | >=3.8 | [] | [] | [] | [
"pygobject"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.13.5 | 2026-02-19T10:06:10.307298 | gestor_inventario_samuel-1.0.1.tar.gz | 13,561 | 39/c2/af7ef6ebeb6df93e183096d21cc640b5ff3666ff2e6096e9eba3b5d288f8/gestor_inventario_samuel-1.0.1.tar.gz | source | sdist | null | false | 2200ed6fc62c6224f61d68943293575c | 1ab82b3cdd3afeafcf08c6e81a0fbfec971d8645e51a5428b61fa7bdc3141ec1 | 39c2af7ef6ebeb6df93e183096d21cc640b5ff3666ff2e6096e9eba3b5d288f8 | null | [] | 227 |
2.4 | sag-py-fastapi-health | 0.3.5 | A library for fastapi health checks | # sag_py_fastapi_health
[![Maintainability][codeclimate-image]][codeclimate-url]
[![Coverage Status][coveralls-image]][coveralls-url]
[![Known Vulnerabilities][snyk-image]][snyk-url]
Add health check endpoints to fastapi (similar to the ones dotnet core has)
## What it does
* Adds one or multiple health endpoints
* Configurable output format (JSON or PRTG)
* Possibility to add checks (your own or pre-shipped ones)
* Pre-shipped checks for HTTP GET requests (including basic auth) and for directory existence/readability/writability
### Installation
pip install sag-py-fastapi-health
## How to use
### Sample usage with existing checks
```python
from sag_py_fastapi_health.checks.http import HttpCheck
from sag_py_fastapi_health.checks.storage import StorageExistsCheck, StorageReadableCheck
from sag_py_fastapi_health.formatter import DefaultResponseFormatter, PrtgResponseFormatter
from sag_py_fastapi_health.models import Probe
from sag_py_fastapi_health.router import HealthcheckRouter
from config import config
router = HealthcheckRouter(
    Probe(
        name="health",
        response_formatter=DefaultResponseFormatter(),
        checks=[
            StorageExistsCheck("/opt/app/data", name="my_dir_exists"),
            StorageReadableCheck("/opt/app/data", name="my_dir_is_readable"),
            HttpCheck("https://localhost/auth", name="auth_available", timeout=5),
        ],
    ),
    Probe(
        name="health-prtg",
        response_formatter=PrtgResponseFormatter(),
        checks=[
            StorageExistsCheck("/opt/app/data", name="my_dir_exists"),
            StorageReadableCheck("/opt/app/data", name="my_dir_is_readable"),
            HttpCheck("https://localhost/auth", name="auth_available", timeout=5),
        ],
    ),
)
```
### Write your own check
```python
from sag_py_fastapi_health.models import Check, CheckResult


class TestCheck(Check):
    def __init__(self, name: str = "check") -> None:
        self._name: str = name

    async def __call__(self) -> CheckResult:
        is_healthy: bool = a_custom_check()
        description: str = "A description of the status or an error message"
        return CheckResult(
            name=self._name,
            status="Healthy" if is_healthy else "Unhealthy",
            description=description,
        )
```
The description contains something like "Directory ... was accessible" or "Service is running" if everything is ok.
If there was an error, you can put the error/exception message there.
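The check contract is simply an async callable that returns a result object. Here is a dependency-free illustration of the same pattern (the `Result` dataclass and the port check below are stand-ins, not part of this library):

```python
import asyncio
from dataclasses import dataclass


@dataclass
class Result:
    name: str
    status: str
    description: str


class PortRangeCheck:
    """Illustrative check: healthy if the configured port is in the valid range."""

    def __init__(self, port: int, name: str = "port_range") -> None:
        self._port = port
        self._name = name

    async def __call__(self) -> Result:
        is_healthy = 0 < self._port < 65536
        return Result(
            name=self._name,
            status="Healthy" if is_healthy else "Unhealthy",
            description=f"Port {self._port} is {'valid' if is_healthy else 'out of range'}",
        )


result = asyncio.run(PortRangeCheck(8080)())
print(result.status)
```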
## How to configure in prtg
Use the "HTTP Data Advanced" sensor (https://www.paessler.com/manuals/prtg/http_data_advanced_sensor) and point it at your PRTG health endpoint (as in the example above: [URL_TO_YOUR_SERVICE]/health/health-prtg).
## How to start developing
### With vscode
Just install vscode with dev containers extension. All required extensions and configurations are prepared automatically.
### With pycharm
* Install latest pycharm
* Install pycharm plugin BlackConnect
* Install pycharm plugin Mypy
* Configure the python interpreter/venv
* pip install -r requirements-dev.txt
* pip install black[d]
* Ctl+Alt+S => Check Tools => BlackConnect => Trigger when saving changed files
* Ctl+Alt+S => Check Tools => BlackConnect => Trigger on code reformat
* Ctl+Alt+S => Click Tools => BlackConnect => "Load from pyproject.toml" (ensure line length is 120)
* Ctl+Alt+S => Click Tools => BlackConnect => Configure path to the blackd.exe at the "local instance" config (e.g. C:\Python310\Scripts\blackd.exe)
* Ctl+Alt+S => Click Tools => Actions on save => Reformat code
* Restart pycharm
## How to publish
* Update the version in setup.py and commit your change
* Create a tag with the same version number
* Let github do the rest
[codeclimate-image]:https://api.codeclimate.com/v1/badges/518206f10db22dbeb984/maintainability
[codeclimate-url]:https://codeclimate.com/github/SamhammerAG/sag_py_fastapi_health/maintainability
[coveralls-image]:https://coveralls.io/repos/github/SamhammerAG/sag_py_fastapi_health/badge.svg?branch=master
[coveralls-url]:https://coveralls.io/github/SamhammerAG/sag_py_fastapi_health?branch=master
[snyk-image]:https://snyk.io/test/github/SamhammerAG/sag_py_fastapi_health/badge.svg
[snyk-url]:https://snyk.io/test/github/SamhammerAG/sag_py_fastapi_health
| text/markdown | Samhammer AG | support@samhammer.de | null | null | MIT | fastapi, health | [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Software Development :: Libraries",
"Topic :: Software Development"
] | [] | https://github.com/SamhammerAG/sag_py_fastapi_health | null | >=3.12 | [] | [] | [] | [
"aiohttp[speedups]<4,>=3.13.3",
"fastapi<1,>=0.128.7",
"pydantic>=2.12.2",
"typing-extensions>=4.15.0",
"flake8; extra == \"dev\"",
"mypy; extra == \"dev\"",
"build; extra == \"dev\"",
"pytest; extra == \"dev\"",
"pytest-asyncio; extra == \"dev\"",
"pytest-cov; extra == \"dev\"",
"coverage-lcov;... | [] | [] | [] | [
"Documentation, https://github.com/SamhammerAG/sag_py_fastapi_health",
"Bug Reports, https://github.com/SamhammerAG/sag_py_fastapi_health/issues",
"Source, https://github.com/SamhammerAG/sag_py_fastapi_health"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:06:08.013732 | sag_py_fastapi_health-0.3.5.tar.gz | 11,333 | 7f/3f/715566fb3f6088eb57871c29b4a6ba727ee27a0b7caff3fc96f0bef4359f/sag_py_fastapi_health-0.3.5.tar.gz | source | sdist | null | false | ed9a67e5d1c3e733d710ee16e81cb3fb | ef2b9236bcf4e5877f41b76150be0c3e52a731bf5e73ecc3ec21538753f1c70d | 7f3f715566fb3f6088eb57871c29b4a6ba727ee27a0b7caff3fc96f0bef4359f | null | [
"LICENSE.txt"
] | 209 |
2.4 | analysim-jupyterlite-ready-signal | 0.1.1 | A minimal JupyterLite frontend extension that sends a postMessage signal to the parent window when the application is fully restored and ready. | # ready_signal
[](/actions/workflows/build.yml)
A minimal JupyterLite frontend extension that sends a postMessage signal to the parent window when the application is fully restored and ready.
## Requirements
- JupyterLab >= 4.0.0
## Install
To install the extension, execute:
```bash
pip install ready_signal
```
## Uninstall
To remove the extension, execute:
```bash
pip uninstall ready_signal
```
## Contributing
### Development install
Note: You will need NodeJS to build the extension package.
The `jlpm` command is JupyterLab's pinned version of
[yarn](https://yarnpkg.com/) that is installed with JupyterLab. You may use
`yarn` or `npm` in lieu of `jlpm` below.
```bash
# Clone the repo to your local environment
# Change directory to the ready_signal directory
# Set up a virtual environment and install package in development mode
python -m venv .venv
source .venv/bin/activate
pip install --editable "."
# Link your development version of the extension with JupyterLab
jupyter labextension develop . --overwrite
# Rebuild extension Typescript source after making changes
# IMPORTANT: Unlike the steps above which are performed only once, do this step
# every time you make a change.
jlpm build
```
You can watch the source directory and run JupyterLab at the same time in different terminals to watch for changes in the extension's source and automatically rebuild the extension.
```bash
# Watch the source directory in one terminal, automatically rebuilding when needed
jlpm watch
# Run JupyterLab in another terminal
jupyter lab
```
With the watch command running, every saved change will immediately be built locally and available in your running JupyterLab. Refresh JupyterLab to load the change in your browser (you may need to wait several seconds for the extension to be rebuilt).
By default, the `jlpm build` command generates the source maps for this extension to make it easier to debug using the browser dev tools. To also generate source maps for the JupyterLab core extensions, you can run the following command:
```bash
jupyter lab build --minimize=False
```
### Development uninstall
```bash
pip uninstall ready_signal
```
In development mode, you will also need to remove the symlink created by `jupyter labextension develop`
command. To find its location, you can run `jupyter labextension list` to figure out where the `labextensions`
folder is located. Then you can remove the symlink named `ready-signal` within that folder.
### Packaging the extension
See [RELEASE](RELEASE.md)
| text/markdown | null | mohab <mohab_sobhy@outlook.com> | null | null | BSD 3-Clause License
Copyright (c) 2026, mohab
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. | jupyter, jupyterlab, jupyterlab-extension | [
"Framework :: Jupyter",
"Framework :: Jupyter :: JupyterLab",
"Framework :: Jupyter :: JupyterLab :: 4",
"Framework :: Jupyter :: JupyterLab :: Extensions",
"Framework :: Jupyter :: JupyterLab :: Extensions :: Prebuilt",
"License :: OSI Approved :: BSD License",
"Programming Language :: Python",
"Prog... | [] | null | null | >=3.10 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/YOUR_USERNAME/analysim_jupyterlite_ready_signal",
"Repository, https://github.com/YOUR_USERNAME/analysim_jupyterlite_ready_signal",
"Issues, https://github.com/YOUR_USERNAME/analysim_jupyterlite_ready_signal/issues"
] | twine/6.2.0 CPython/3.14.2 | 2026-02-19T10:06:07.380298 | analysim_jupyterlite_ready_signal-0.1.1.tar.gz | 86,986 | 1b/6b/babb291d94f94335080a1e653a08cff0822bb0d81c9f2229b1bd58bea404/analysim_jupyterlite_ready_signal-0.1.1.tar.gz | source | sdist | null | false | 9862113e74375252db2fa2ba7934fbd1 | 7221da19a85a8cc191d065fec432b300569cdaba18650d871a221d1bfd0ea3fa | 1b6bbabb291d94f94335080a1e653a08cff0822bb0d81c9f2229b1bd58bea404 | null | [
"LICENSE"
] | 250 |
2.4 | upsonic | 0.72.4 | Agent Framework For Fintech | <div align="center">
<img src="https://github.com/user-attachments/assets/fbe7219f-55bc-4748-ac4a-dd2fb2b8d9e5" width="600" />
# Upsonic
**Production-Ready AI Agent Framework with Safety First**
[](https://badge.fury.io/py/upsonic)
[](LICENCE)
[](https://pypi.org/project/upsonic/)
[](https://github.com/Upsonic/Upsonic)
[](https://github.com/Upsonic/Upsonic/issues)
[](https://docs.upsonic.ai)
[Documentation](https://docs.upsonic.ai) • [Quickstart](https://docs.upsonic.ai/get-started/quickstart) • [Examples](https://docs.upsonic.ai/examples)
</div>
---
## Overview
Upsonic is an open-source AI agent development framework that makes building production-ready agents simple, safe, and scalable. Whether you're building your first agent or orchestrating complex multi-agent systems, Upsonic provides everything you need in one unified framework.
Built by the community, for the community. We listen to what you need and prioritize features based on real-world use cases. Currently, we're focused on **Safety Engine** and **OCR capabilities**, two critical features for production workloads.
## What Can You Build?
Upsonic is used by fintech companies, banks, and developers worldwide to build production-grade AI agents for:
- **Document Analysis**: Extract, process, and understand documents with advanced OCR and NLP
- **Customer Service Automation**: Build intelligent chatbots with memory and context awareness
- **Financial Analysis**: Create agents that analyze market data, generate reports, and provide insights
- **Compliance Monitoring**: Ensure all AI operations follow safety policies and regulatory requirements
- **Research & Data Gathering**: Automate research workflows with multi-agent collaboration
- **Multi-Agent Workflows**: Orchestrate complex tasks across specialized agent teams
## Quick Start
### Installation
Install Upsonic using uv:
```bash
uv pip install upsonic
# pip install upsonic
```
### Basic Agent
Create your first agent in just a few lines of code:
```python
from upsonic import Agent, Task
agent = Agent(model="openai/gpt-4o", name="Stock Analyst Agent")
task = Task(description="Analyze the current market trends")
agent.print_do(task)
```
### Agent with Tools
Enhance your agent with tools for real-world tasks:
```python
from upsonic import Agent, Task
from upsonic.tools.common_tools import YFinanceTools
agent = Agent(model="openai/gpt-4o", name="Stock Analyst Agent")
task = Task(
    description="Give me a summary about tesla stock with tesla car models",
    tools=[YFinanceTools()],
)
agent.print_do(task)
```
### Agent with Memory
Add memory to make your agent remember past conversations:
```python
from upsonic import Agent, Task
from upsonic.storage import Memory, InMemoryStorage
memory = Memory(
    storage=InMemoryStorage(),
    session_id="session_001",
    full_session_memory=True,
)
agent = Agent(model="openai/gpt-4o", memory=memory)
task1 = Task(description="My name is John")
agent.print_do(task1)
task2 = Task(description="What is my name?")
agent.print_do(task2) # Agent remembers: "Your name is John"
```
**Ready for more?** Check out the [Quickstart Guide](https://docs.upsonic.ai/get-started/quickstart) for additional examples including Knowledge Base and Team workflows.
## Key Features
- **Safety Engine**: Built-in policy engine to ensure your agents follow company guidelines and compliance requirements
- **OCR Support**: Unified interface for local and cloud OCR providers with document processing capabilities
- **Memory Management**: Give your agents context and long-term memory with flexible storage backends
- **Multi-Agent Teams**: Build collaborative agent systems with sequential and parallel execution modes
- **Tool Integration**: Extensive tool support including MCP, custom tools, and human-in-the-loop workflows
- **Production Ready**: Designed for enterprise deployment with comprehensive monitoring and metrics
## Core Capabilities
### Safety Engine
Safety isn't an afterthought in Upsonic. It's built into the core. Create reusable policies, attach them to any agent, and ensure compliance across your entire system. The Safety Engine is LLM-agnostic and production-ready from day one.
Key capabilities include:
- Pre-built policies for common safety requirements (PII blocking, content filtering, etc.)
- Custom policy creation for your specific compliance needs
- Real-time monitoring and enforcement
- Detailed audit logs for compliance reporting
**Example:**
```python
from upsonic import Agent, Task
from upsonic.safety_engine.policies.pii_policies import PIIBlockPolicy
agent = Agent(
    model="openai/gpt-4o-mini",
    agent_policy=PIIBlockPolicy,
)
task = Task(
    description="Create a realistic customer profile with name Alice, email alice@example.com, phone number 1234567890, and address 123 Main St, Anytown, USA"
)
result = agent.do(task)
print(result)
```
Learn more: [Safety Engine Documentation](https://docs.upsonic.ai/concepts/safety-engine/overview)
### OCR and Document Processing
Upsonic provides a unified interface for working with multiple OCR providers, both local and cloud-based. This eliminates the complexity of integrating different OCR services and allows you to switch providers without changing your code.
Supported providers include:
- Cloud providers (Google Vision, AWS Textract, Azure Computer Vision)
- Local providers (Tesseract, EasyOCR, PaddleOCR)
- Specialized document processors (DocTR, Surya)
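The benefit of a unified interface is that calling code depends on one protocol rather than on any specific provider. A dependency-free sketch of that pattern follows (the class and function names here are illustrative, not Upsonic's actual API):

```python
from typing import Protocol


class OCRProvider(Protocol):
    def extract_text(self, image: bytes) -> str: ...


class FakeLocalOCR:
    """Stand-in for a local engine such as Tesseract."""

    def extract_text(self, image: bytes) -> str:
        return f"local engine read {len(image)} bytes"


class FakeCloudOCR:
    """Stand-in for a cloud service such as Google Vision."""

    def extract_text(self, image: bytes) -> str:
        return f"cloud service read {len(image)} bytes"


def read_document(provider: OCRProvider, image: bytes) -> str:
    # Callers never change when the provider is swapped out.
    return provider.extract_text(image)


print(read_document(FakeLocalOCR(), b"\x89PNG"))
print(read_document(FakeCloudOCR(), b"\x89PNG"))
```

Swapping providers then means changing only the object passed in, which is the property the unified OCR interface aims for.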
Learn more: [OCR Documentation](https://docs.upsonic.ai/concepts/ocr/overview)
## Upsonic AgentOS
AgentOS is an optional deployment and management platform that takes your agents from development to production. It provides enterprise-grade infrastructure for deploying, monitoring, and scaling your AI agents.
**Key Features:**
- **Kubernetes-based FastAPI Runtime**: Deploy your agents as isolated, scalable microservices with enterprise-grade reliability
- **Comprehensive Metrics Dashboard**: Track every agent transaction, LLM costs, token usage, and performance metrics for complete visibility
- **Self-Hosted Deployment**: Deploy the entire AgentOS platform on your own infrastructure with full control over your data and operations
- **One-Click Deployment**: Go from code to production with automated deployment pipelines
<img width="3024" height="1590" alt="AgentOS Dashboard" src="https://github.com/user-attachments/assets/42fceaca-2dec-4496-ab67-4b9067caca42" />
## Your Complete AI Agent Infrastructure
Together, the Upsonic Framework and AgentOS provide everything a financial institution needs to build, deploy, and manage production-grade AI agents. From development to deployment, from local testing to enterprise-scale operations, from single agents to complex multi-agent systems, Upsonic delivers the complete infrastructure for your AI agent initiatives.
Whether you're a fintech startup building your first intelligent automation or an established bank deploying agents across multiple business units, Upsonic provides the end-to-end tooling to bring your AI agent vision to life safely, efficiently, and at scale.
## Documentation and Resources
- **[Documentation](https://docs.upsonic.ai)** - Complete guides and API reference
- **[Quickstart Guide](https://docs.upsonic.ai/get-started/quickstart)** - Get started in 5 minutes
- **[Examples](https://docs.upsonic.ai/examples)** - Real-world examples and use cases
- **[API Reference](https://docs.upsonic.ai/reference)** - Detailed API documentation
## Community and Support
- **[Issue Tracker](https://github.com/Upsonic/Upsonic/issues)** - Report bugs and request features
- **[Changelog](https://docs.upsonic.ai/changelog)** - See what's new in each release
## License
Upsonic is released under the MIT License. See [LICENCE](LICENCE) for details.
## Contributing
We welcome contributions from the community! Please read our contributing guidelines and code of conduct before submitting pull requests.
---
**Learn more at [upsonic.ai](https://upsonic.ai)**
| text/markdown | null | Onur ULUSOY <onur@upsonic.co>, Dogan Keskin <dogan@upsonic.co> | null | null | null | null | [
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12"
] | [] | null | null | >=3.10 | [] | [] | [] | [
"aiohttp>=3.13.3",
"anyio>=4.10.0",
"cloudpickle>=3.1.2",
"fastmcp>=2.14.5",
"genai-prices>=0.0.38",
"griffe>=1.14.0",
"httpx>=0.28.1",
"mcp[cli]>=1.26.0",
"nest-asyncio>=1.6.0",
"openai>=2.2.0",
"protobuf<6.0.0,>=5.27.2",
"psutil==6.1.1",
"pydantic-core>=2.27.2",
"pydantic>=2.10.5",
"py... | [] | [] | [] | [] | uv/0.10.4 {"installer":{"name":"uv","version":"0.10.4","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true} | 2026-02-19T10:05:52.900129 | upsonic-0.72.4.tar.gz | 2,264,886 | 69/a9/0257a4afbcc6273fa02e356ccbbf6a67e06683d8ad414c88a0a4d646a334/upsonic-0.72.4.tar.gz | source | sdist | null | false | 945f96c338ecb85746f2f8d9c2be51f5 | 84048aa521cbe06dcce0dcae176119b2886507a3edff29bc47843b6a4be101a8 | 69a90257a4afbcc6273fa02e356ccbbf6a67e06683d8ad414c88a0a4d646a334 | null | [
"LICENCE"
] | 284 |
2.4 | riichienv | 0.3.1 | A High-Performance Research Environment for Riichi Mahjong | <div align="center">
<img src="https://raw.githubusercontent.com/smly/RiichiEnv/main/docs/assets/logo.jpg" width="35%">
<br />
**Accelerating Reproducible Mahjong Research**
[](https://github.com/smly/RiichiEnv/actions/workflows/ci.yml)
[](https://colab.research.google.com/github/smly/RiichiEnv/blob/main/demos/replay_demo.ipynb)
[](https://www.kaggle.com/code/confirm/riichienv-replay-viewer-demo/notebook)



</div>
-----
> [!NOTE]
> While RiichiEnv is being built with reinforcement learning applications in mind, it is still very much a work in progress. As indicated in our [Milestones](https://github.com/smly/RiichiEnv/milestones), we haven't yet completed the optimization or verification necessary for RL contexts.
> The API and specifications are subject to change before the stable release.
## ✨ Features
* **High Performance**: Core logic implemented in Rust for lightning-fast state transitions and rollouts.
* **Gym-style API**: Intuitive interface designed specifically for reinforcement learning.
* **Mortal Compatibility**: Seamlessly interface with the Mortal Bot using the standard MJAI protocol.
* **Rule Flexibility**: Support for diverse rule sets, including no-red-dragon variants and three-player mahjong.
* **Game Visualization**: Integrated replay viewer for Jupyter Notebooks.
<div align="center">
<img src="https://raw.githubusercontent.com/smly/RiichiEnv/main/docs/assets/visualizer1.png" width="35%"> <img src="https://raw.githubusercontent.com/smly/RiichiEnv/main/docs/assets/visualizer2.png" width="35%">
</div>
## 📦 Installation
```bash
uv add riichienv
# Or
pip install riichienv
```
Currently, building from source requires the **Rust** toolchain.
```bash
uv sync --dev
uv run maturin develop --release
```
## 🚀 Usage
### Gym-style API
```python
from riichienv import RiichiEnv
from riichienv.agents import RandomAgent
agent = RandomAgent()
env = RiichiEnv()
obs_dict = env.reset()
while not env.done():
    actions = {player_id: agent.act(obs)
               for player_id, obs in obs_dict.items()}
    obs_dict = env.step(actions)
scores, points, ranks = env.scores(), env.points(), env.ranks()
print(scores, points, ranks)
```
`env.reset()` initializes the game state and returns the initial observations. The returned `obs_dict` maps each active player ID to their respective `Observation` object.
```python
>>> from riichienv import RiichiEnv
>>> env = RiichiEnv()
>>> obs_dict = env.reset()
>>> obs_dict
{0: <riichienv._riichienv.Observation object at 0x7fae7e52b6e0>}
```
Use `env.done()` to check if the game has concluded.
```python
>>> env.done()
False
```
By default, the environment runs a single round (kyoku). For game rules supporting sudden death or standard match formats like East-only or Half-round, the environment continues until the game-end conditions are met.
### Observation
The `Observation` object provides all relevant information to a player, including the current game state and available legal actions.
`obs.new_events() -> list[str]` returns a list of new events since the last step, encoded as JSON strings in the MJAI protocol. The full history of events is accessible via `obs.events`.
```python
>>> obs = obs_dict[0]
>>> obs.new_events()
['{"id":0,"type":"start_game"}', '{"bakaze":"E","dora_marker":"S", ...}', '{"actor":0,"pai":"6p","type":"tsumo"}']
```
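Since each event is delivered as a JSON string in the MJAI protocol, an agent can decode and filter them with the standard `json` module. A minimal sketch (the event payloads are taken from the example above):

```python
import json

# Events as returned by obs.new_events(): JSON strings in the MJAI protocol.
events = [
    '{"id":0,"type":"start_game"}',
    '{"actor":0,"pai":"6p","type":"tsumo"}',
]

decoded = [json.loads(e) for e in events]
# Keep only the self-draw (tsumo) events of player 0.
draws = [e for e in decoded if e["type"] == "tsumo" and e["actor"] == 0]
print(draws[0]["pai"])  # → 6p
```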
`obs.legal_actions() -> list[Action]` provides the list of all valid moves the player can make.
```python
>>> obs.legal_actions()
[Action(action_type=Discard, tile=Some(1), ...), ...]
```
If your agent communicates via the MJAI protocol, you can easily map an MJAI response to a valid `Action` object using `obs.select_action_from_mjai()`.
```python
>>> obs.select_action_from_mjai({"type":"dahai","pai":"1m","tsumogiri":False,"actor":0})
Action(action_type=Discard, tile=Some(1), consume_tiles=[])
```
### Compatibility with Mortal
RiichiEnv is fully compatible with the Mortal MJAI bot processing flow. I have confirmed that MortalAgent can execute matches without errors across more than 1,000,000 hanchan games on RiichiEnv.
```python
from riichienv import RiichiEnv, Action
from model import load_model
class MortalAgent:
    def __init__(self, player_id: int):
        self.player_id = player_id
        # Initialize your libriichi.mjai.Bot or equivalent
        self.model = load_model(player_id, "./mortal_v4.pth")

    def act(self, obs) -> Action:
        resp = None
        for event in obs.new_events():
            resp = self.model.react(event)
        action = obs.select_action_from_mjai(resp)
        assert action is not None, "Mortal must return a legal action"
        return action
env = RiichiEnv(game_mode="4p-red-half")
agents = {pid: MortalAgent(pid) for pid in range(4)}
obs_dict = env.reset()
while not env.done():
    actions = {pid: agents[pid].act(obs) for pid, obs in obs_dict.items()}
    obs_dict = env.step(actions)
print(env.scores(), env.points(), env.ranks())
```
### Game Rules and Modes
RiichiEnv separates high-level game flow configuration (Mode) from detailed game mechanics (Rules).
* **Game Mode (`game_mode`)**: Configuration for game length (e.g., East-only, Hanchan), player count, and termination conditions (e.g., Tobi/bust, sudden death).
* **Game Rules (`rule`)**: Configuration for specific game mechanics (e.g., handling of Chankan (Robbing the Kan) for Kokushi Musou, Kuitan availability, etc.).
#### 1. Game Mode Presets (`game_mode`)
You can select a standard game mode using the `game_mode` argument in the constructor. This configures the basic flow of the game.
| `game_mode` | Players | Mode | Mechanics |
|---|---|---|---|
| `4p-red-single` | 4 | Single Round | No sudden death |
| `4p-red-east` | 4 | East-only (東風; Tonpuu) | Standard (Tenhou rule) |
| `4p-red-half` | 4 | Hanchan (半荘) | Standard (Tenhou rule) |
| `3p-red-east` | 3 | East-only (Tonpuu) | 🚧 In progress |
```python
# Initialize a standard 4-player Hanchan game
env = RiichiEnv(game_mode="4p-red-half")
```
> [!NOTE]
> We are also planning to implement **"No-Red" rules** (game modes without red 5 tiles), which are often adopted in professional leagues (e.g., M-League's team definitions or other competitive settings).
#### 2. Customizing Game Rules (`GameRule`)
For detailed rule customization, you can pass a `GameRule` object to the `RiichiEnv` constructor. RiichiEnv provides presets for popular platforms (Tenhou, MJSoul) and allows granular configuration.
```python
from riichienv import RiichiEnv, GameRule
# Example 1: Use MJSoul rules (allows Ron on Ankan for Kokushi Musou)
rule_mjsoul = GameRule.default_mjsoul()
env = RiichiEnv(game_mode="4p-red-half", rule=rule_mjsoul)
# Example 2: Fully custom rules based on Tenhou preset
rule_custom = GameRule.default_tenhou()
rule_custom.allows_ron_on_ankan_for_kokushi_musou = True # Enable Kokushi Chankan
rule_custom.length_of_game_in_rounds = 8  # Force 8 rounds (note: game length is usually governed by the game_mode logic)
env = RiichiEnv(game_mode="4p-red-half", rule=rule_custom)
```
Detailed mechanic flags (like `allows_ron_on_ankan_for_kokushi_musou`) are defined in the `GameRule` struct. See [RULES.md](docs/RULES.md) for a full list of configurable options.
### Tile Conversion & Hand Parsing
Standardize between various tile formats (136-tile, MPSZ, MJAI) and easily parse hand strings.
```python
>>> import riichienv.convert as cvt
>>> cvt.mpsz_to_tid("1z")
108
>>> from riichienv import parse_hand
>>> parse_hand("123m406m789m777z")
([0, 4, 8, 12, 16, 20, 24, 28, 32, 132, 133, 134], [])
```
See [DATA_REPRESENTATION.md](docs/DATA_REPRESENTATION.md) for more details.
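The 136-tile ids in these examples follow a simple scheme: 34 tile kinds with four copies each, laid out as man, pin, sou, then honors. A rough pure-Python sketch of `mpsz_to_tid` under that assumption (the real implementation lives in the Rust core and also handles red fives, which this toy version does not):

```python
# Hypothetical re-derivation of the 136-tile indexing, consistent with
# cvt.mpsz_to_tid("1z") == 108: kind = suit_base + rank - 1, tid = kind * 4 + copy.
SUIT_BASE = {"m": 0, "p": 9, "s": 18, "z": 27}  # 34 tile kinds in total

def mpsz_to_tid(tile: str, copy: int = 0) -> int:
    """Map an MPSZ tile like '1z' to a 136-tile id (red fives not handled)."""
    rank, suit = int(tile[0]), tile[1]
    return (SUIT_BASE[suit] + rank - 1) * 4 + copy

print(mpsz_to_tid("1z"))  # → 108, matching the library example
```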
### Agari Calculation
```python
>>> from riichienv import AgariCalculator
>>> import riichienv.convert as cvt
>>> ac = AgariCalculator.hand_from_text("111m33p12s111666z")
>>> ac.is_tenpai()
True
>>> ac.calc(cvt.mpsz_to_tid("3s"))
Agari(agari=True, yakuman=False, ron_agari=12000, tsumo_agari_oya=0, tsumo_agari_ko=0, yaku=[8, 11, 10, 22], han=5, fu=60)
```
## 🛠 Development
For more architectural details and contribution guidelines, see [CONTRIBUTING.md](CONTRIBUTING.md) and [DEVELOPMENT_GUIDE.md](docs/DEVELOPMENT_GUIDE.md).
Check our [Milestones](https://github.com/smly/RiichiEnv/milestones) for the future roadmap and development plans.
## 📄 License
Apache License 2.0
| text/markdown; charset=UTF-8; variant=GFM | null | Kohei Ozaki <19337+smly@users.noreply.github.com> | null | null | null | null | [
"Programming Language :: Python :: 3",
"Operating System :: OS Independent"
] | [] | null | null | <3.15,>=3.10 | [] | [] | [] | [
"pyyaml>=6.0.3",
"ipython"
] | [] | [] | [] | [
"Homepage, https://github.com/smly/RiichiEnv",
"Issues, https://github.com/smly/RiichiEnv/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:05:28.390631 | riichienv-0.3.1.tar.gz | 241,115 | 11/3b/b3ece2f5f9aba3ec645fc187e6ff8909d639d25a398bd4f100309e113a5d/riichienv-0.3.1.tar.gz | source | sdist | null | false | e0818fd64b371a54b7946833187073d1 | 9b465d2972b0dcb60b66aec3dda7c6c6bd1facca024f986865089279a575c86e | 113bb3ece2f5f9aba3ec645fc187e6ff8909d639d25a398bd4f100309e113a5d | Apache-2.0 | [
"LICENSE"
] | 2,252 |
2.1 | moogli-erp | 2026.1.12 | Progiciel de gestion pour CAE | ==========
CAERP
==========
Management software for CAEs (Coopératives d'activité et d'emploi),
collectives of independent entrepreneurs.
License
-------
This is free software; for the terms of access, use, copying
and exploitation, see LICENSE.txt
New features / bug reports
--------------------------
Official website: http://endi.coop
Most of the development is funded by Coopérer pour entreprendre.
If you would like more information or a hosting offer,
you can contact them at info@cooperer.coop
If you encounter a bug or have an idea for a feature, you can report it
to the developers directly or through the GitLab (framagit)
issue tracker.
Exception: for security bugs, please email your administrator.
Installing the software (production environment)
------------------------------------------------
Package installation (required for installation in a virtual environment):
On Debian/Ubuntu:
NB: You can either use the nodesource repository to get a suitable version of nodejs, or do the
JS builds with docker-compose to compile the javascript
(see: https://caerp.readthedocs.io/fr/latest/javascript/build_docker.html)
.. code-block:: console

    apt install virtualenvwrapper libmariadb-dev libmariadb-dev-compat npm build-essential libjpeg-dev libfreetype6 libfreetype6-dev libssl-dev libxml2-dev zlib1g-dev python3-mysqldb redis-server libxslt1-dev python3-pip fonts-open-sans libcairo2 libglib2.0-dev libpango1.0-0 libgdk-pixbuf-2.0-0

If you do **not** use docker-compose, you will additionally need to install npm as follows:
.. code-block:: console

    curl -fsSL https://deb.nodesource.com/setup_22.x | bash - &&\
    apt install npm

On Fedora:
.. code-block:: console

    dnf install virtualenvwrapper mariadb-devel python-devel libxslt-devel libxml2-devel libtiff-devel libjpeg-devel libzip-devel freetype-devel lcms2-devel libwebp-devel tcl-devel tk-devel gcc redis-server open-sans-fonts
Downloading the application
.. code-block:: console

    git clone https://framagit.org/caerp/caerp.git
    cd caerp

Downloading the JS dependencies (requires nodejs >= 16.x)
.. code-block:: console

    npm --prefix js_sources install
    npm --prefix vue_sources install

Compiling the JS code
.. code-block:: console

    make prodjs devjs
    make prodjs2 devjs2

Creating a Python virtual environment.
.. code-block:: console

    cd caerp
    mkvirtualenv caerp -p python3 -r requirements.txt

Installing the application
.. code-block:: console

    python setup.py install
    cp development.ini.sample development.ini

Edit the development.ini file and configure your software (database
access, the various static resource directories, ...).
Initialize the database
.. code-block:: console

    caerp-admin development.ini syncdb

If you use a third-party package relying on other databases (such as
caerp_payment in production mode)
.. code-block:: console

    caerp-migrate app.ini syncdb --pkg=caerp_payment

.. note::
    The application then synchronizes the data models automatically.
Then create an administrator account
.. code-block:: console

    caerp-admin development.ini useradd [--user=<user>] [--pwd=<password>] [--firstname=<firstname>] [--lastname=<lastname>] [--group=<group>] [--email=<email>]

N.B.: for an administrator, specify
.. code-block:: console

    --group=admin
Installation (development environment)
--------------------------------------
Docker-compose makes it easier to deploy a complete development environment. The following table
summarizes the available options.
======================== ======================================================= =======================================
Component                Recommended setup                                       Alternative setup (discouraged)
======================== ======================================================= =======================================
MariaDB server           native or docker-compose (make dev_db_serve)
Redis server             native or docker-compose (make dev_db_serve)
dev web server           native/bare-metal (make dev_serve)
JS build (Marionette/BB) docker-compose (make prodjs_dc devjs_dc)                native (make prodjs devjs)
JS build (VueJS)         docker-compose (make prodjs2_dc devjs2_dc)              native (make prodjs2 devjs2)
CSS build                native (make css_watch)
JS build (legacy)        native (make js)
Postupgrade              docker-compose (make postupgrade_dev)                   native (make postupgrade_dev_legacy)
======================== ======================================================= =======================================
.. warning::
    The rest of this documentation only covers the recommended cases.
Install the system dependencies (see the ``apt`` or ``dnf`` line, depending on your
OS, in the section on the production installation).
Then install your development CAERP with the following commands:
.. code-block:: console

    sudo apt/dnf install […] (same as in the production section)
    git clone https://framagit.org/caerp/caerp.git
    cd caerp
    cp development.ini.sample development.ini

.. warning::
    Make sure you then use a Python version compatible with CAERP; if needed, follow the
    section "For distributions shipping incompatible Python versions" before continuing.
.. note::
    If you use docker-compose for the mariadb server, uncomment the lines about docker-compose so
    that the mariadb server inside docker-compose is targeted.
Install the non-system dependencies:
.. code-block:: console

    make postupgrade_dev

A complete demo database can be loaded
(this overwrites your caerp database if it exists) with:
.. code-block:: console

    caerp-load-demo-data development.ini
    caerp-migrate development.ini upgrade

For distributions shipping incompatible Python versions
-------------------------------------------------------
For now, CAERP does not support Python versions > 3.10,
so you can use pyenv to install a Python version
supported by the project:
.. code-block:: console

    $ curl https://pyenv.run | bash

After following the instructions, you can initialize an
environment (using Python 3.9, for example):
.. code-block:: console

    $ sudo apt install liblzma-dev # dnf install xz-devel on RH
    $ cd workspace/caerp # the directory into which caerp was cloned
    $ pyenv install 3.9
    $ pyenv virtualenv 3.9 caerp
    $ pyenv activate caerp
    (caerp) $ pip install -e .[dev]
Running asynchronous tasks
--------------------------
An asynchronous task service based on celery and redis is in charge of
running the longest tasks.
See:
https://caerp.readthedocs.io/fr/latest/celery.html
for more information.
Upgrading (production environment)
----------------------------------
Upgrading CAERP in production is done in several steps (it is advisable to
back up your data before running the following commands)
Update the Python dependencies and the version number
.. code-block:: console

    pip install .

Update the data structure
.. code-block:: console

    caerp-migrate app.ini upgrade

If you use a third-party package relying on other databases (such as
caerp_payment in production mode)
.. code-block:: console

    caerp-migrate app.ini upgrade --pkg=caerp_payment

Configure the default data in the database
.. code-block:: console

    caerp-admin app.ini syncdb

Update the JS dependencies
.. code-block:: console

    npm --prefix js_sources install

Compile the JavaScript:
.. code-block:: console

    make prodjs

Then launch the web application
.. code-block:: console

    pserve --reload development.ini

.. warning::
    On Linux, you may get the following error when launching pserve:
    [ERROR] watchdog error: [Errno 24] inotify instance limit reached
    The fix is the following:
    sudo bash -c 'echo "fs.inotify.max_user_instances = 1100000" >> /etc/sysctl.d/40-max-user-watches.conf'
    sudo sysctl -p
    Likewise, if pserve does not always reload and/or seems impossible to stop with Ctrl+C, another parameter has to be changed:
    sudo bash -c 'echo "fs.inotify.max_user_watches = 1100000" >> /etc/sysctl.d/40-max-user-watches.conf'
    sudo sysctl -p
    (it may be necessary to restart the user session)
.. warning::
    If ``pserve --reload`` misbehaves without any error message (changes not detected and impossible to stop with Ctrl+C),
    you can try installing watchman (``apt install watchman`` on Debian/Ubuntu). This switches the file-watching backend from **watchdog** to **watchman**. There is nothing to configure: if both are installed, watchman is preferred over watchdog.
Upgrading / switching branches (development environment)
--------------------------------------------------------
Follow these instructions once you are up to date on the desired git
branch. They are risk-free: at worst they will do nothing if everything
is already up to date.
The following command should take care of everything
.. code-block:: console

    make postupgrade_dev

.. note::
    The Makefile is commented, should you need more information/details about
    what this command does.
Python coding standards
^^^^^^^^^^^^^^^^^^^^^^^
CAERP code must be formatted in accordance with pep8_.
To that end, it is recommended to use a code analyzer such as flake8_.
In addition, to ensure uniform code formatting,
the black_ code formatter must be used for development.
It can be configured `in your editor`_ (the most comfortable option) and/or as a
pre-commit hook.
.. _pep8: https://www.python.org/dev/peps/pep-0008/
.. _flake8: https://flake8.pycqa.org/en/latest/
.. _black: https://black.readthedocs.io/en/stable/index.html
.. _in your editor: https://black.readthedocs.io/en/stable/integrations/editors.html
.. note::
    To enable the pre-commit hook (once and for all), from the venv:
    ``pre-commit install``
    Then, on each commit, if your code is not correctly formatted
    according to black, it will be reformatted at commit time **and the
    commit will fail**. You will then need to add (``git add``) the changes
    made by black and commit again.
It is also possible to run black manually on the whole project:
.. code-block:: console

    make black

(if you do not use black locally, the continuous integration will remind you 😁)
Marionette Javascript coding standards
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The Backbone/Marionette javascript code of CAERP (in js_sources/src) must be
formatted with prettier.
.. code-block:: console

    cd js_sources/
    npm install -D
    npx prettier --config=./.prettierrc --write src/

Ideally the code should also be checked with eslint.
.. code-block:: console

    cd js_sources/
    npm install -D
    npx eslint -c ./.eslintrc src/

Both tools can be integrated into most code editors.
Database with docker-compose (MariaDB + redis)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To host the database in a disposable, reproducible docker container without
touching the host machine, a docker-compose configuration is available.
To set up the environment (the first time):
.. code-block:: console

    sudo apt install docker-compose
    sudo usermod -a -G docker $USER

Several shortcuts are provided for using it:
.. code-block:: console

    # Run a DB that is stopped with ctrl+c
    make dev_db_serve
    # Start a DB
    make dev_db_start
    # Stop a DB started with the previous command
    make dev_db_stop
    # Erase the dev DB data
    make dev_db_clear

Configurations suited to docker-compose are provided as comments in ``test.ini.sample`` and
``development.ini.sample``.
Dynamic asset (JS/CSS) compilation with docker compose
------------------------------------------------------
To compile only the js files
.. code-block:: console

    docker compose -f js-docker-compose.yaml up

To compile the css files
.. code-block:: console

    docker compose -f css-docker-compose.yaml up

Tests
-----
Copy and customize the configuration file
.. code-block:: console

    cp test.ini.sample test.ini

Run the tests
.. code-block:: console

    py.test caerp/tests

User documentation
------------------
The user guide is available at:
https://doc.endi.coop
*****
:This project is tested with: `BrowserStack <https://www.browserstack.com/>`_
| text/x-rst | Majerti/Kilya | contact@moogli.coop | null | null | null | pyramid, business, web | [
"License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
"Programming Language :: Python",
"Framework :: Pyramid",
"Topic :: Internet :: WWW/HTTP",
"Topic :: Internet :: WWW/HTTP :: WSGI :: Application"
] | [] | https://framagit.org/caerp/caerp | null | >=3.7 | [] | [] | [] | [] | [] | [] | [] | [] | twine/5.1.1 CPython/3.9.19 | 2026-02-19T10:04:45.320921 | moogli_erp-2026.1.12.tar.gz | 15,487,950 | c9/c3/92ba8b2c97d480c36ef1587ad73e7184b39f7ce88211cd0f7df811948a40/moogli_erp-2026.1.12.tar.gz | source | sdist | null | false | 946c4deec4a5bfa3798822ddbf815985 | 262ca5139cde0cf7e4093e77cf5ab0af89bccf4b224fc88bde460ce1c1ab1412 | c9c392ba8b2c97d480c36ef1587ad73e7184b39f7ce88211cd0f7df811948a40 | null | [] | 163 |
2.4 | arkindex-client | 1.3.0rc2 | API client for the Arkindex project | Documentation is available at https://api.arkindex.org
| null | Teklia <contact@teklia.com> | null | null | null | AGPL-v3 | api client arkindex | [
"License :: OSI Approved :: MIT License",
"Natural Language :: English",
"Intended Audience :: Science/Research",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Languag... | [] | https://gitlab.teklia.com/arkindex/api-client | null | >=3.8 | [] | [] | [] | [
"requests>=2.30",
"tenacity>=9.1.0",
"typesystem==0.4.1"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.10.19 | 2026-02-19T10:03:35.384594 | arkindex_client-1.3.0rc2.tar.gz | 39,644 | 72/9a/5ac3851f609e9035cfe8405f2db65883fdd534d7475802129dedfb3bf29c/arkindex_client-1.3.0rc2.tar.gz | source | sdist | null | false | 7ffbe4729304d3caa03047c0fd7841c4 | 19c75cfc8e3770adc45dc7b84ec4ee8e64a0decc6802501f9cf73926e8541229 | 729a5ac3851f609e9035cfe8405f2db65883fdd534d7475802129dedfb3bf29c | null | [
"LICENSE"
] | 203 |
2.4 | airflow-schedule-insights | 0.2.0 | An Apache Airflow 3 plugin that visualizes DAG runs in a Gantt chart, predicts future cron and asset-triggered runs, and identifies DAGs that won't execute. Enhance your workflow monitoring and planning with intuitive visualizations. | # Airflow Schedule Insights Plugin
The Airflow Schedule Insights Plugin for [Apache Airflow](https://github.com/apache/airflow) allows you to visualize DAG runs in a Gantt chart, predict future runs, and identify DAGs that won't run, providing a seamless and efficient workflow for managing your pipelines. Enhance your workflow monitoring and planning with intuitive visualizations.
[](https://github.com/ponderedw/airflow-schedule-insights/actions)
[](https://github.com/psf/black)
## System Requirements
- **Airflow Versions**: 2.4.0 or newer
## How to Install
Add `airflow-schedule-insights` to your `requirements.txt` and restart the web server.
## How to Use
1. Navigate to `Schedule Insights` in the `Browse` tab to access the plugin:

2. View all DAG runs in a Gantt chart:

3. Toggle the `Show Future Runs?` option to predict the next runs for your DAGs and generate a list of all the DAGs that won't run.
**Note**: All event-driven DAGs (scheduled by datasets and triggers) are predicted only to their next run.
4. Future DAGs will be highlighted in gray on the Gantt chart:

5. A table of future runs will be displayed, with events ordered by their start date:

6. Below this, you will find a table listing all the DAGs that won't run:

| text/markdown | Ponder | null | null | null | MIT | null | [
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13"
] | [] | null | null | <3.14,>=3.10 | [] | [] | [] | [
"SQLAlchemy>=2.0.0",
"apache-airflow>=3.0.0",
"croniter>=1.0.0"
] | [] | [] | [] | [
"homepage, https://github.com/ponderedw/airflow-schedule-insights"
] | poetry/2.3.2 CPython/3.10.19 Linux/6.14.0-1017-azure | 2026-02-19T10:02:15.548098 | airflow_schedule_insights-0.2.0-py3-none-any.whl | 17,837 | ce/d1/7441cdb08f01ba811dcbb9516bc79549108177ea89163ff263a50ca3269b/airflow_schedule_insights-0.2.0-py3-none-any.whl | py3 | bdist_wheel | null | false | 05e0c3e45f0a5ddd58fb20515bd93268 | f52c1db5c8a842df3d94d9eeafa5a1b1bf60812a38c66c4a32123f30e5c40fdb | ced17441cdb08f01ba811dcbb9516bc79549108177ea89163ff263a50ca3269b | null | [
"LICENSE",
"NOTICE"
] | 235 |
2.4 | logprep | 18.1.0 | Logprep allows to collect, process and forward log messages from various data sources. | <h1 align="center">Logprep</h1>
<h3 align="center">


[](http://logprep.readthedocs.io/?badge=latest)

<a href="https://codecov.io/github/fkie-cad/Logprep" target="_blank">
<img src="https://img.shields.io/codecov/c/github/fkie-cad/Logprep?color=%2334D058" alt="Coverage">
</a>

</h3>
## Introduction
Logprep collects, processes and forwards log messages from various data sources.
Log messages are read and written by so-called connectors.
Currently, connectors for Kafka, Opensearch, S3, HTTP and JSON(L) files exist.
The log messages are processed serially by a pipeline of processors,
where each processor modifies the event that is passed through.
The main idea is that each processor performs a single task that is easy to carry out.
Once a log message has passed through all processors in the pipeline, the resulting
message is sent to a configured output connector.
Logprep is primarily designed to process log messages. More generally, Logprep can handle any JSON
messages, allowing further applications besides log handling.
- [About Logprep](https://github.com/fkie-cad/Logprep/blob/main/README.md#about-logprep)
- [Installation](https://logprep.readthedocs.io/en/latest/installation.html)
- [Deployment Examples](https://logprep.readthedocs.io/en/latest/examples/index.html)
- [Event Generation](https://logprep.readthedocs.io/en/latest/user_manual/execution.html#event-generation)
- [Documentation](https://logprep.readthedocs.io/en/latest)
- [Container signatures](https://github.com/fkie-cad/Logprep/blob/main/README.md#container-signatures)
- [Container SBOM](https://github.com/fkie-cad/Logprep/blob/main/README.md#container-sbom)
- [Contributing](https://github.com/fkie-cad/Logprep/blob/main/CONTRIBUTING.md)
- [License](https://github.com/fkie-cad/Logprep/blob/main/LICENSE)
- [Changelog](https://github.com/fkie-cad/Logprep/blob/main/CHANGELOG.md)
## About Logprep
### Pipelines
Logprep processes incoming log messages with a configured pipeline that can be spawned
multiple times via multiprocessing.
The following chart shows a basic setup that represents this behaviour.
The pipeline consists of three processors: the `Dissector`, `Geo-IP Enricher` and the
`Dropper`.
Each pipeline runs concurrently and takes one event from its `Input Connector`.
Once the log message is fully processed, the result is forwarded to the `Output Connector`,
after which the pipeline takes the next message, repeating the processing cycle.
```mermaid
flowchart LR
A1[Input\nConnector] --> B
A2[Input\nConnector] --> C
A3[Input\nConnector] --> D
subgraph Pipeline 1
B[Dissector] --> E[Geo-IP Enricher]
E --> F[Dropper]
end
subgraph Pipeline 2
C[Dissector] --> G[Geo-IP Enricher]
G --> H[Dropper]
end
subgraph Pipeline n
D[Dissector] --> I[Geo-IP Enricher]
I --> J[Dropper]
end
F --> K1[Output\nConnector]
H --> K2[Output\nConnector]
J --> K3[Output\nConnector]
```
### Processors
Every processor fulfills one simple task.
For example, the `Dissector` can split long message fields into multiple subfields
to facilitate structural normalization.
The `Geo-IP Enricher` takes an IP address and adds its geolocation to the
log message, based on a configured geo-ip database.
And the `Dropper` deletes fields from the log message.
A detailed overview of all processors can be found in the
[processor documentation](https://logprep.readthedocs.io/en/latest/configuration/processor.html).
To influence the behaviour of these processors, each can be configured with a set of rules.
These rules define two things:
firstly, when the processor should process a log message,
and secondly, how to process it,
for example which fields should be deleted or for which IP address the geolocation should be
retrieved.
### Connectors
Connectors are responsible for reading the input and writing the result to a desired output.
The main connectors that are currently used and implemented are a kafka-input-connector and a
kafka-output-connector, allowing Logprep to receive messages from a kafka-topic and write messages into a
kafka-topic. Additionally, you can use the Opensearch output connector to ship the
messages directly to Opensearch after processing.
The details regarding the connectors can be found in the
[input connector documentation](https://logprep.readthedocs.io/en/latest/configuration/input.html)
and
[output connector documentation](https://logprep.readthedocs.io/en/latest/configuration/output.html).
### Configuration
To run Logprep, certain configurations have to be provided. Because Logprep is designed to run in a
containerized environment like Kubernetes, these configurations can be provided via the filesystem or
HTTP. Providing the configuration via HTTP makes it possible to control configuration changes through
a flexible HTTP API. This enables Logprep to quickly adapt to changes in your environment.
First, a general configuration is given that describes the pipeline and the connectors,
and lastly, the processors need rules in order to process messages correctly.
The following yaml configuration shows an example configuration for the pipeline shown
in the graph above:
```yaml
process_count: 3
timeout: 0.1

pipeline:
  - dissector:
      type: dissector
      rules:
        - https://your-api/dissector/
        - rules/01_dissector/rules/
  - geoip_enricher:
      type: geoip_enricher
      rules:
        - https://your-api/geoip/
        - rules/02_geoip_enricher/rules/
      tree_config: artifacts/tree_config.json
      db_path: artifacts/GeoDB.mmdb
  - dropper:
      type: dropper
      rules:
        - rules/03_dropper/rules/

input:
  mykafka:
    type: confluentkafka_input
    bootstrapservers: [127.0.0.1:9092]
    topic: consumer
    group: cgroup
    auto_commit: true
    session_timeout: 6000
    offset_reset_policy: smallest

output:
  opensearch:
    type: opensearch_output
    hosts:
      - 127.0.0.1:9200
    default_index: default_index
    error_index: error_index
    message_backlog_size: 10000
    timeout: 10000
    max_retries:
    user: the username
    secret: the password
    cert: /path/to/cert.crt
```
The following YAML represents a dropper rule which, according to the previous configuration,
should be placed in the `rules/03_dropper/rules/` directory.
```yaml
filter: "message"
drop:
  - message
description: "Drops the message field"
```
The condition of this rule checks whether the field `message` exists in the log.
If it does, the dropper deletes this field from the log message.
Details about the rule language and how to write rules for the processors can be found in the
[rule configuration documentation](https://logprep.readthedocs.io/en/latest/configuration/rules.html).
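Conceptually, the drop semantics described above can be sketched in a few lines of Python. Note that `apply_drop_rule` is a hypothetical helper for illustration only (Logprep's real filter language is richer than a plain existence check), not Logprep's actual API:

```python
def apply_drop_rule(event: dict, rule: dict) -> dict:
    """If the filter field exists in the event, delete the listed fields."""
    if rule["filter"] in event:
        for field in rule["drop"]:
            event.pop(field, None)  # remove the field if present
    return event

rule = {"filter": "message", "drop": ["message"]}
print(apply_drop_rule({"message": "user logged in", "host": "web01"}, rule))
# → {'host': 'web01'}
```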
## Documentation
The documentation for Logprep is available online at https://logprep.readthedocs.io/en/latest/,
or it can be built locally via:
```
sudo apt install pandoc
uv sync --frozen --extra doc
cd ./doc/
make html
```
The HTML documentation can then be found at `doc/_build/html/index.html`.
## Container signatures
From release 15 on, Logprep containers are signed using the
[cosign](https://github.com/sigstore/cosign) tool.
To verify the container, you can copy the following public key into a file
`logprep.pub`:
```
-----BEGIN PUBLIC KEY-----
MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEgkQXDi/N4TDFE2Ao0pulOFfbGm5g
kVtARE+LJfSFI25BanOG9jaxxRGVt+Sa1KtQbMcy7Glxu0s7XgD9VFGjTA==
-----END PUBLIC KEY-----
```
And use it to verify the signature:
```
cosign verify --key logprep.pub ghcr.io/fkie-cad/logprep:py3.11-latest
```
The output should look like:
```
Verification for ghcr.io/fkie-cad/logprep:py3.11-latest --
The following checks were performed on each of these signatures:
- The cosign claims were validated
- Existence of the claims in the transparency log was verified offline
- The signatures were verified against the specified public key
[{"critical":{"identity":{"docker-reference":"ghcr.io/fkie-cad/logprep"}, ...
```
## Container SBOM
From release 15 on, Logprep container images are shipped with a generated SBOM.
To verify the attestation and extract the SBOM, use
[cosign](https://github.com/sigstore/cosign) with:
```
cosign verify-attestation --key logprep.pub ghcr.io/fkie-cad/logprep:py3.11-latest | jq '.payload | @base64d | fromjson | .predicate | .Data | fromjson' > sbom.json
```
The output should look like:
```
Verification for ghcr.io/fkie-cad/logprep:py3.11-latest --
The following checks were performed on each of these signatures:
- The cosign claims were validated
- Existence of the claims in the transparency log was verified offline
- The signatures were verified against the specified public key
```
Finally, you can view the extracted SBOM with:
```
cat sbom.json | jq
```
| text/markdown | null | null | null | null | GNU LESSER GENERAL PUBLIC LICENSE
Version 2.1, February 1999
Copyright (C) 1991, 1999 Free Software Foundation, Inc.
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
[This is the first released version of the Lesser GPL. It also counts
as the successor of the GNU Library Public License, version 2, hence
the version number 2.1.]
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
Licenses are intended to guarantee your freedom to share and change
free software--to make sure the software is free for all its users.
This license, the Lesser General Public License, applies to some
specially designated software packages--typically libraries--of the
Free Software Foundation and other authors who decide to use it. You
can use it too, but we suggest you first think carefully about whether
this license or the ordinary General Public License is the better
strategy to use in any particular case, based on the explanations below.
When we speak of free software, we are referring to freedom of use,
not price. Our General Public Licenses are designed to make sure that
you have the freedom to distribute copies of free software (and charge
for this service if you wish); that you receive source code or can get
it if you want it; that you can change the software and use pieces of
it in new free programs; and that you are informed that you can do
these things.
To protect your rights, we need to make restrictions that forbid
distributors to deny you these rights or to ask you to surrender these
rights. These restrictions translate to certain responsibilities for
you if you distribute copies of the library or if you modify it.
For example, if you distribute copies of the library, whether gratis
or for a fee, you must give the recipients all the rights that we gave
you. You must make sure that they, too, receive or can get the source
code. If you link other code with the library, you must provide
complete object files to the recipients, so that they can relink them
with the library after making changes to the library and recompiling
it. And you must show them these terms so they know their rights.
We protect your rights with a two-step method: (1) we copyright the
library, and (2) we offer you this license, which gives you legal
permission to copy, distribute and/or modify the library.
To protect each distributor, we want to make it very clear that
there is no warranty for the free library. Also, if the library is
modified by someone else and passed on, the recipients should know
that what they have is not the original version, so that the original
author's reputation will not be affected by problems that might be
introduced by others.
Finally, software patents pose a constant threat to the existence of
any free program. We wish to make sure that a company cannot
effectively restrict the users of a free program by obtaining a
restrictive license from a patent holder. Therefore, we insist that
any patent license obtained for a version of the library must be
consistent with the full freedom of use specified in this license.
Most GNU software, including some libraries, is covered by the
ordinary GNU General Public License. This license, the GNU Lesser
General Public License, applies to certain designated libraries, and
is quite different from the ordinary General Public License. We use
this license for certain libraries in order to permit linking those
libraries into non-free programs.
When a program is linked with a library, whether statically or using
a shared library, the combination of the two is legally speaking a
combined work, a derivative of the original library. The ordinary
General Public License therefore permits such linking only if the
entire combination fits its criteria of freedom. The Lesser General
Public License permits more lax criteria for linking other code with
the library.
We call this license the "Lesser" General Public License because it
does Less to protect the user's freedom than the ordinary General
Public License. It also provides other free software developers Less
of an advantage over competing non-free programs. These disadvantages
are the reason we use the ordinary General Public License for many
libraries. However, the Lesser license provides advantages in certain
special circumstances.
For example, on rare occasions, there may be a special need to
encourage the widest possible use of a certain library, so that it becomes
a de-facto standard. To achieve this, non-free programs must be
allowed to use the library. A more frequent case is that a free
library does the same job as widely used non-free libraries. In this
case, there is little to gain by limiting the free library to free
software only, so we use the Lesser General Public License.
In other cases, permission to use a particular library in non-free
programs enables a greater number of people to use a large body of
free software. For example, permission to use the GNU C Library in
non-free programs enables many more people to use the whole GNU
operating system, as well as its variant, the GNU/Linux operating
system.
Although the Lesser General Public License is Less protective of the
users' freedom, it does ensure that the user of a program that is
linked with the Library has the freedom and the wherewithal to run
that program using a modified version of the Library.
The precise terms and conditions for copying, distribution and
modification follow. Pay close attention to the difference between a
"work based on the library" and a "work that uses the library". The
former contains code derived from the library, whereas the latter must
be combined with the library in order to run.
GNU LESSER GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License Agreement applies to any software library or other
program which contains a notice placed by the copyright holder or
other authorized party saying it may be distributed under the terms of
this Lesser General Public License (also called "this License").
Each licensee is addressed as "you".
A "library" means a collection of software functions and/or data
prepared so as to be conveniently linked with application programs
(which use some of those functions and data) to form executables.
The "Library", below, refers to any such software library or work
which has been distributed under these terms. A "work based on the
Library" means either the Library or any derivative work under
copyright law: that is to say, a work containing the Library or a
portion of it, either verbatim or with modifications and/or translated
straightforwardly into another language. (Hereinafter, translation is
included without limitation in the term "modification".)
"Source code" for a work means the preferred form of the work for
making modifications to it. For a library, complete source code means
all the source code for all modules it contains, plus any associated
interface definition files, plus the scripts used to control compilation
and installation of the library.
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running a program using the Library is not restricted, and output from
such a program is covered only if its contents constitute a work based
on the Library (independent of the use of the Library in a tool for
writing it). Whether that is true depends on what the Library does
and what the program that uses the Library does.
1. You may copy and distribute verbatim copies of the Library's
complete source code as you receive it, in any medium, provided that
you conspicuously and appropriately publish on each copy an
appropriate copyright notice and disclaimer of warranty; keep intact
all the notices that refer to this License and to the absence of any
warranty; and distribute a copy of this License along with the
Library.
You may charge a fee for the physical act of transferring a copy,
and you may at your option offer warranty protection in exchange for a
fee.
2. You may modify your copy or copies of the Library or any portion
of it, thus forming a work based on the Library, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) The modified work must itself be a software library.
b) You must cause the files modified to carry prominent notices
stating that you changed the files and the date of any change.
c) You must cause the whole of the work to be licensed at no
charge to all third parties under the terms of this License.
d) If a facility in the modified Library refers to a function or a
table of data to be supplied by an application program that uses
the facility, other than as an argument passed when the facility
is invoked, then you must make a good faith effort to ensure that,
in the event an application does not supply such function or
table, the facility still operates, and performs whatever part of
its purpose remains meaningful.
(For example, a function in a library to compute square roots has
a purpose that is entirely well-defined independent of the
application. Therefore, Subsection 2d requires that any
application-supplied function or table used by this function must
be optional: if the application does not supply it, the square
root function must still compute square roots.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Library,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Library, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote
it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Library.
In addition, mere aggregation of another work not based on the Library
with the Library (or with a work based on the Library) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may opt to apply the terms of the ordinary GNU General Public
License instead of this License to a given copy of the Library. To do
this, you must alter all the notices that refer to this License, so
that they refer to the ordinary GNU General Public License, version 2,
instead of to this License. (If a newer version than version 2 of the
ordinary GNU General Public License has appeared, then you can specify
that version instead if you wish.) Do not make any other change in
these notices.
Once this change is made in a given copy, it is irreversible for
that copy, so the ordinary GNU General Public License applies to all
subsequent copies and derivative works made from that copy.
This option is useful when you wish to copy part of the code of
the Library into a program that is not a library.
4. You may copy and distribute the Library (or a portion or
derivative of it, under Section 2) in object code or executable form
under the terms of Sections 1 and 2 above provided that you accompany
it with the complete corresponding machine-readable source code, which
must be distributed under the terms of Sections 1 and 2 above on a
medium customarily used for software interchange.
If distribution of object code is made by offering access to copy
from a designated place, then offering equivalent access to copy the
source code from the same place satisfies the requirement to
distribute the source code, even though third parties are not
compelled to copy the source along with the object code.
5. A program that contains no derivative of any portion of the
Library, but is designed to work with the Library by being compiled or
linked with it, is called a "work that uses the Library". Such a
work, in isolation, is not a derivative work of the Library, and
therefore falls outside the scope of this License.
However, linking a "work that uses the Library" with the Library
creates an executable that is a derivative of the Library (because it
contains portions of the Library), rather than a "work that uses the
library". The executable is therefore covered by this License.
Section 6 states terms for distribution of such executables.
When a "work that uses the Library" uses material from a header file
that is part of the Library, the object code for the work may be a
derivative work of the Library even though the source code is not.
Whether this is true is especially significant if the work can be
linked without the Library, or if the work is itself a library. The
threshold for this to be true is not precisely defined by law.
If such an object file uses only numerical parameters, data
structure layouts and accessors, and small macros and small inline
functions (ten lines or less in length), then the use of the object
file is unrestricted, regardless of whether it is legally a derivative
work. (Executables containing this object code plus portions of the
Library will still fall under Section 6.)
Otherwise, if the work is a derivative of the Library, you may
distribute the object code for the work under the terms of Section 6.
Any executables containing that work also fall under Section 6,
whether or not they are linked directly with the Library itself.
6. As an exception to the Sections above, you may also combine or
link a "work that uses the Library" with the Library to produce a
work containing portions of the Library, and distribute that work
under terms of your choice, provided that the terms permit
modification of the work for the customer's own use and reverse
engineering for debugging such modifications.
You must give prominent notice with each copy of the work that the
Library is used in it and that the Library and its use are covered by
this License. You must supply a copy of this License. If the work
during execution displays copyright notices, you must include the
copyright notice for the Library among them, as well as a reference
directing the user to the copy of this License. Also, you must do one
of these things:
a) Accompany the work with the complete corresponding
machine-readable source code for the Library including whatever
changes were used in the work (which must be distributed under
Sections 1 and 2 above); and, if the work is an executable linked
with the Library, with the complete machine-readable "work that
uses the Library", as object code and/or source code, so that the
user can modify the Library and then relink to produce a modified
executable containing the modified Library. (It is understood
that the user who changes the contents of definitions files in the
Library will not necessarily be able to recompile the application
to use the modified definitions.)
b) Use a suitable shared library mechanism for linking with the
Library. A suitable mechanism is one that (1) uses at run time a
copy of the library already present on the user's computer system,
rather than copying library functions into the executable, and (2)
will operate properly with a modified version of the library, if
the user installs one, as long as the modified version is
interface-compatible with the version that the work was made with.
c) Accompany the work with a written offer, valid for at
least three years, to give the same user the materials
specified in Subsection 6a, above, for a charge no more
than the cost of performing this distribution.
d) If distribution of the work is made by offering access to copy
from a designated place, offer equivalent access to copy the above
specified materials from the same place.
e) Verify that the user has already received a copy of these
materials or that you have already sent this user a copy.
For an executable, the required form of the "work that uses the
Library" must include any data and utility programs needed for
reproducing the executable from it. However, as a special exception,
the materials to be distributed need not include anything that is
normally distributed (in either source or binary form) with the major
components (compiler, kernel, and so on) of the operating system on
which the executable runs, unless that component itself accompanies
the executable.
It may happen that this requirement contradicts the license
restrictions of other proprietary libraries that do not normally
accompany the operating system. Such a contradiction means you cannot
use both them and the Library together in an executable that you
distribute.
7. You may place library facilities that are a work based on the
Library side-by-side in a single library together with other library
facilities not covered by this License, and distribute such a combined
library, provided that the separate distribution of the work based on
the Library and of the other library facilities is otherwise
permitted, and provided that you do these two things:
a) Accompany the combined library with a copy of the same work
based on the Library, uncombined with any other library
facilities. This must be distributed under the terms of the
Sections above.
b) Give prominent notice with the combined library of the fact
that part of it is a work based on the Library, and explaining
where to find the accompanying uncombined form of the same work.
8. You may not copy, modify, sublicense, link with, or distribute
the Library except as expressly provided under this License. Any
attempt otherwise to copy, modify, sublicense, link with, or
distribute the Library is void, and will automatically terminate your
rights under this License. However, parties who have received copies,
or rights, from you under this License will not have their licenses
terminated so long as such parties remain in full compliance.
9. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Library or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Library (or any work based on the
Library), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Library or works based on it.
10. Each time you redistribute the Library (or any work based on the
Library), the recipient automatically receives a license from the
original licensor to copy, distribute, link with or modify the Library
subject to these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties with
this License.
11. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Library at all. For example, if a patent
license would not permit royalty-free redistribution of the Library by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Library.
If any portion of this section is held invalid or unenforceable under any
particular circumstance, the balance of the section is intended to apply,
and the section as a whole is intended to apply in other circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
12. If the distribution and/or use of the Library is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Library under this License may add
an explicit geographical distribution limitation excluding those countries,
so that distribution is permitted only in or among countries not thus
excluded. In such case, this License incorporates the limitation as if
written in the body of this License.
13. The Free Software Foundation may publish revised and/or new
versions of the Lesser General Public License from time to time.
Such new versions will be similar in spirit to the present version,
but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Library
specifies a version number of this License which applies to it and
"any later version", you have the option of following the terms and
conditions either of that version or of any later version published by
the Free Software Foundation. If the Library does not specify a
license version number, you may choose any version ever published by
the Free Software Foundation.
14. If you wish to incorporate parts of the Library into other free
programs whose distribution conditions are incompatible with these,
write to the author to ask for permission. For software which is
copyrighted by the Free Software Foundation, write to the Free
Software Foundation; we sometimes make exceptions for this. Our
decision will be guided by the two goals of preserving the free status
of all derivatives of our free software and of promoting the sharing
and reuse of software generally.
NO WARRANTY
15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO
WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR
OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY
KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE
LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME
THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN
WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY
AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU
FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR
CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING
RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF
SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH
DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Libraries
If you develop a new library, and you want it to be of the greatest
possible use to the public, we recommend making it free software that
everyone can redistribute and change. You can do so by permitting
redistribution under these terms (or, alternatively, under the terms of the
ordinary General Public License).
To apply these terms, attach the following notices to the library. It is
safest to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least the
"copyright" line and a pointer to where the full notice is found.
<one line to give the library's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301
USA
Also add information on how to contact you by electronic and paper mail.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the library, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the
library `Frob' (a library for tweaking knobs) written by James Random
Hacker.
<signature of Ty Coon>, 1 April 1990
Ty Coon, President of Vice
That's all there is to it!
| kafka, etl, sre, preprocessing, opensearch, soar, logdata | [
"Development Status :: 3 - Alpha",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: GNU Lesser General Public License v2 or later (LGPLv2+)",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"Programming Language... | [] | null | null | >=3.11 | [] | [] | [] | [
"aiohttp<4,>=3.13.3",
"attrs<26",
"certifi<2026,>=2023.7.22",
"confluent-kafka<3,>2",
"filelock<4",
"geoip2<6,>=5.2.0",
"jsonref<2",
"luqum<2",
"more-itertools==8.10.0",
"numpy<3,>=1.26.0",
"opensearch-py<4",
"prometheus_client<1",
"protobuf<7,>=3.20.2",
"pycryptodome<4",
"pyparsing<4",
... | [] | [] | [] | [
"Homepage, https://github.com/fkie-cad/Logprep",
"Documentation, https://logprep.readthedocs.io/en/latest/",
"Repository, https://github.com/fkie-cad/Logprep",
"Issues, https://github.com/fkie-cad/Logprep/issues",
"Changelog, https://github.com/fkie-cad/Logprep/blob/main/CHANGELOG.md"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T10:01:36.872183 | logprep-18.1.0.tar.gz | 3,656,679 | a7/ba/f34aa0e7ce6229405bc291a52c6afe107f1a804386a7515cd488e9bfd976/logprep-18.1.0.tar.gz | source | sdist | null | false | 439e2b28e952cfa28751ce17f2d524fb | 177e0f0f8a68a12732a1f4c06855d582ee54ef531b991e53ba37d67d34b78088 | a7baf34aa0e7ce6229405bc291a52c6afe107f1a804386a7515cd488e9bfd976 | null | [
"LICENSE"
] | 809 |
2.4 | contact-location-local | 0.0.41b2905 | PyPI Package for Circles contact-location-local Python | PyPI Package for Circles contact-location-local Python
| text/markdown | Circles | info@circlez.ai | null | null | null | null | [
"Programming Language :: Python :: 3",
"Operating System :: OS Independent"
] | [] | https://github.com/circles-zone/contact-location-local-python-package | null | null | [] | [] | [] | [
"pycountry>=23.12.11",
"phonenumbers>=8.13.30",
"database-mysql-local>=0.0.342",
"group-local>=0.0.26",
"language-remote>=0.0.20",
"location-local>=0.0.119",
"logger-local>=0.0.135",
"user-context-remote>=0.0.57"
] | [] | [] | [] | [] | twine/6.1.0 CPython/3.12.8 | 2026-02-19T10:01:11.269200 | contact_location_local-0.0.41b2905.tar.gz | 7,727 | 9a/58/3dfa19fc895d51eb16a9d74be08dcbdd6eca803a7a15deb37f3f5de4e77d/contact_location_local-0.0.41b2905.tar.gz | source | sdist | null | false | 8ed93d1e4f79237416a3a8d36161eec8 | 89668e1063995a03ee2ef8ba58881fe873f76398a5a583ccbc511e8eb8516c6b | 9a583dfa19fc895d51eb16a9d74be08dcbdd6eca803a7a15deb37f3f5de4e77d | null | [] | 188 |
2.4 | omnimrz | 0.2.1 | Fast and robust MRZ extraction, parsing, and validation using PaddleOCR | # OmniMRZ — Python MRZ Extraction & Validation Library for Passport OCR and KYC
<div align="center">
[](https://github.com/AzwadFawadHasan/OmniMRZ/LICENSE)
[](https://pypistats.org/packages/omnimrz)

[](https://github.com/AzwadFawadHasan/OmniMRZ/actions/workflows/codeql.yml)
[](https://pypi.org/project/OmniMRZ/)
<a href="https://github.com/AzwadFawadHasan/OmniMRZ/" target="_blank">
<img src="https://raw.githubusercontent.com/AzwadFawadHasan/OmniMRZ/main/omni_mrz_logo.jpg" target="_blank" />
</a>
**OmniMRZ** is an open-source **Python library for Machine Readable Zone (MRZ) extraction, parsing, and ICAO-9303 validation** from passport and ID images, built for OCR, KYC, and identity verification systems.
It is a production-grade MRZ extraction and validation engine designed for high-accuracy KYC, identity verification, and document intelligence pipelines.
Unlike simple MRZ readers, OmniMRZ evaluates whether an MRZ is structurally correct, cryptographically valid, and logically plausible.
### Typical Use Cases
- 🛂 Passport and ID card OCR pipelines
- 🏦 KYC / AML identity verification systems
- ✈️ Border control and immigration preprocessing
- 📄 Document digitization and archiving
- 🔐 Authentication and onboarding workflows
### ⭐ Show Your Support
If OmniMRZ helped you or saved development time, please consider starring the repository. It helps visibility and motivates continued development.
[Features](#features) • [Installation](#installation) • [Contributing](#contributing)
</div>
## Why OmniMRZ?
Unlike basic MRZ readers, OmniMRZ provides **end-to-end MRZ quality assurance**:
- Combines OCR, structural validation, checksum verification, and logical consistency checks
- Fully compliant with **ICAO 9303**
- Designed for **production KYC and identity verification systems**
- Robust against OCR noise and partially corrupted MRZ lines
## Features
### At a glance
- MRZ detection and extraction from images
- Supports TD3 (passport) format
- Checksum validation (ICAO 9303)
- Logical and structural validation
- Clean Python API
### Detailed Features
#### 🔍 MRZ Extraction
- PaddleOCR-based MRZ text extraction (robust on mobile & noisy images)
- Intelligent MRZ line clustering & reconstruction
- Automatic MRZ type detection (TD1 / TD2 / TD3)
- OCR noise filtering & MRZ-safe character normalization
- Works even with partially corrupted or misaligned MRZs
#### 🧱 Structural Validation (ICAO 9303)
- Exact line-length enforcement
- Strict MRZ format verification
- Field-level structural checks
- Early-exit gating for invalid layouts
#### 🔢 Checksum Validation
- Fully ICAO-9303 compliant checksum algorithm
- Field-level validation:
- Document number
- Date of birth
- Expiry date
- Composite checksum
- OCR-error tolerant digit correction (O→0, S→5, B→8, etc.)
- Detailed checksum failure diagnostics
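The check digit scheme referenced above is public (ICAO Doc 9303, Part 3): each character maps to a value (digits keep their value, `A`-`Z` map to 10-35, the filler `<` counts as 0), values are weighted cyclically by 7, 3, 1, and the sum is taken modulo 10. A standalone sketch of the standard algorithm, not OmniMRZ's internal implementation:

```python
def icao_check_digit(field: str) -> int:
    """Compute the ICAO 9303 check digit for an MRZ field."""
    weights = (7, 3, 1)
    total = 0
    for i, ch in enumerate(field):
        if ch.isdigit():
            value = int(ch)                      # digits keep their value
        elif ch.isalpha():
            value = ord(ch.upper()) - ord("A") + 10  # A=10 .. Z=35
        elif ch == "<":
            value = 0                            # filler counts as zero
        else:
            raise ValueError(f"invalid MRZ character: {ch!r}")
        total += value * weights[i % 3]          # weights cycle 7, 3, 1
    return total % 10
```

Applied to the example MRZ below, the document number `707797979` yields check digit `2`, matching the `2` that follows it in line 2.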
#### 🧠 Logical & Semantic Validation
- Expired document detection
- Future date-of-birth detection
- Implausible age detection
- DOB ≥ expiry detection
- Gender value validation (M, F, X, <)
- Cross-field consistency signals (issuer vs nationality)
#### 📤 Output
- Clean MRZ text
- Structured JSON
- Deterministic pass / fail / warning signals
- Human-readable error messages
## Installation
```bash
pip install omnimrz
```
Note: PaddleOCR requires additional system dependencies.
Please ensure PaddlePaddle installs correctly on your platform.
```bash
pip install paddleocr
pip install paddlepaddle
```
If that fails, install PaddlePaddle from the official index:
```bash
python -m pip install paddlepaddle==3.0.0 -i https://www.paddlepaddle.org.cn/packages/stable/cpu/
```
## Quick Usage
```python
from omnimrz import OmniMRZ
omni = OmniMRZ()
result = omni.process("ukpassport.jpg")
print(result)
```
## Output Example
```
{
"extraction": {
"status": "SUCCESS(extraction of mrz)",
"line1": "P<GBRPUDARSAN<<HENERT<<<<<<<<<<<<<<<<<<<<<<<",
"line2": "7077979792GBR9505209M1704224<<<<<<<<<<<<<<00"
},
"structural_validation": {
"status": "PASS",
"mrz_type": "TD3",
"errors": []
},
"checksum_validation": {
"status": "PASS",
"errors": []
},
"parsed_data": {
"status": "PARSED",
"data": {
"document_type": "P",
"issuing_country": "GBR",
"surname": "PUDARSAN",
"given_names": "HENERT",
"document_number": "707797979",
"nationality": "GBR",
"date_of_birth": "1995-05-20",
"gender": "M",
"expiry_date": "2017-04-22",
"personal_number": ""
}
},
"logical_validation": {
"status": "FAIL",
"errors": [
"DOCUMENT_EXPIRED"
]
},
"screenshot_detection": {
"status": "PASS",
"is_screenshot": false,
"score": 3,
"confidence": 30.0,
"reasons": [
"Low ELA: 0.38",
"High horizontal edges: 0.51",
"High sharpness: 2029.58"
]
}
}
```
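The per-stage layout above lends itself to a quick pass/fail summary. A minimal sketch assuming the keys shown in the example (`status` and `errors` per stage; these names are taken from the sample output, so verify against the library's actual return value):

```python
def failing_stages(result: dict) -> list[str]:
    """Return the stages whose status is FAIL or that reported errors."""
    failures = []
    for stage, payload in result.items():
        # Each stage in the example carries 'status' and, usually, 'errors'.
        if payload.get("status") == "FAIL" or payload.get("errors"):
            failures.append(stage)
    return failures
```

For the example result above this would flag only `logical_validation` (the document is expired), while extraction, structure, and checksums all pass.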
## Citing OmniMRZ
If you use OmniMRZ in academic research or publications, please consider citing this repository.
## Contributing
Contributions are welcome! 🤝
1. Fork the repository
2. Create your feature branch
```bash
git checkout -b feature/amazing-feature
```
3. Commit your changes
4. Push to your branch
5. Open a Pull Request
## Keywords
MRZ extraction, passport OCR, machine readable zone, ICAO 9303, MRZ parser, Python OCR, identity verification, KYC automation, document intelligence, ID card scanning, border control OCR
## misc

| text/markdown | null | Azwad Fawad Hasan <azwadfawadhasan@gmail.com> | null | null | Apache-2.0 | mrz, passport, ocr, identity, paddleocr, icao-9303, kyc, identity-verification, document-verification, mrz-extraction, mrz-parser, mrz-validation, passport-ocr, id-card, document-intelligence, aml, know-your-customer | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Information Technology",
"Topic :: Office/Business",
"Topic :: Security",
"Topic :: Multimedia :: Graphics :: Viewers",
"Topic :: Multimedia :: Sound/Audi... | [] | null | null | >=3.8 | [] | [] | [] | [
"paddleocr",
"paddlepaddle",
"ScreenshotScanner",
"pytest>=7.0; extra == \"dev\"",
"pytest-cov; extra == \"dev\"",
"black; extra == \"dev\"",
"isort; extra == \"dev\"",
"flake8; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/AzwadFawadHasan/OmniMRZ",
"Issues, https://github.com/AzwadFawadHasan/OmniMRZ/issues"
] | twine/6.2.0 CPython/3.13.3 | 2026-02-19T10:00:50.939760 | omnimrz-0.2.1.tar.gz | 30,248,679 | 29/c5/0cb5149640807b5ffbb310902678d7cff214ae9c916db2cfb7ec592241f0/omnimrz-0.2.1.tar.gz | source | sdist | null | false | a2cf8fccc991845b5ee40f7cffdfec71 | 1007fc2f9faa8cbbc882b73470dc057e4549d04f3aef360c10d723f5a866f558 | 29c50cb5149640807b5ffbb310902678d7cff214ae9c916db2cfb7ec592241f0 | null | [
"LICENSE"
] | 215 |
2.4 | lumina-sdk | 0.1.0 | Python SDK for Lumina LLM observability — OpenTelemetry-native instrumentation | # lumina-sdk
Python SDK for [Lumina](https://uselumina.com) — OpenTelemetry-native LLM observability.
## Installation
```bash
pip install lumina-sdk
```
Or install directly from the monorepo for development:
```bash
pip install -e packages/sdk-python/
```
## Quick start
```python
import os
from lumina import init_lumina
lumina = init_lumina({
"api_key": os.environ["LUMINA_API_KEY"],
"service_name": "my-app",
"endpoint": "https://collector.lumina.app/v1/traces",
})
# Trace an OpenAI call
import openai
client = openai.OpenAI()
response = lumina.trace_llm(
lambda: client.chat.completions.create(
model="gpt-4",
messages=[{"role": "user", "content": "Hello!"}],
),
name="greeting",
system="openai",
prompt="Hello!",
)
print(response.choices[0].message.content)
```
### Async usage
```python
import asyncio
import openai
from lumina import init_lumina
lumina = init_lumina({"service_name": "my-app"})
client = openai.AsyncOpenAI()
async def main():
response = await lumina.trace_llm(
lambda: client.chat.completions.create(
model="gpt-4",
messages=[{"role": "user", "content": "Hello!"}],
),
name="greeting",
system="openai",
)
print(response.choices[0].message.content)
asyncio.run(main())
```
### Custom spans
```python
result = lumina.trace(
"rag_pipeline",
lambda span: run_rag(span),
metadata={"query": "What is observability?"},
tags=["rag", "production"],
)
```
## Environment variables
| Variable | Default | Description |
|---|---|---|
| `LUMINA_API_KEY` | — | API key (omit for self-hosted) |
| `LUMINA_ENDPOINT` | `http://localhost:9411/v1/traces` | OTLP collector URL |
| `LUMINA_SERVICE_NAME` | — | Service name attached to all spans |
| `LUMINA_ENVIRONMENT` | `live` | `live` or `test` |
| `LUMINA_CUSTOMER_ID` | — | Customer identifier |
| `LUMINA_ENABLED` | `true` | Set to `false` to disable |
| `LUMINA_BATCH_SIZE` | `10` | Max spans per export batch |
| `LUMINA_BATCH_INTERVAL_MS` | `5000` | Batch flush interval (ms) |
| `LUMINA_MAX_RETRIES` | `3` | Export retry count |
| `LUMINA_TIMEOUT_MS` | `30000` | Export timeout (ms) |
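A minimal sketch of how the defaults in this table could be resolved from the environment. The SDK's actual config loader may differ; the function and key names here are illustrative only:

```python
import os

def load_lumina_env() -> dict:
    """Resolve a few of the documented env vars with their defaults.

    Illustrative only; not the SDK's real loader.
    """
    return {
        "endpoint": os.environ.get("LUMINA_ENDPOINT", "http://localhost:9411/v1/traces"),
        "environment": os.environ.get("LUMINA_ENVIRONMENT", "live"),
        # LUMINA_ENABLED defaults to true; only the literal "false" disables it
        "enabled": os.environ.get("LUMINA_ENABLED", "true").lower() != "false",
        "batch_size": int(os.environ.get("LUMINA_BATCH_SIZE", "10")),
        "timeout_ms": int(os.environ.get("LUMINA_TIMEOUT_MS", "30000")),
    }
```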
| text/markdown | null | null | null | null | null | null | [] | [] | null | null | >=3.9 | [] | [] | [] | [
"opentelemetry-api>=1.20",
"opentelemetry-sdk>=1.20",
"opentelemetry-exporter-otlp-proto-http>=1.20",
"pytest>=7.0; extra == \"dev\"",
"pytest-asyncio>=0.21; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://uselumina.com",
"Repository, https://github.com/use-lumina/lumina"
] | twine/6.2.0 CPython/3.13.5 | 2026-02-19T10:00:11.107985 | lumina_sdk-0.1.0.tar.gz | 7,617 | 94/f0/3c2ae259eca592222549fece2027f5840299917ef033ad93ab77ac82357d/lumina_sdk-0.1.0.tar.gz | source | sdist | null | false | 5684b81026a098e575899941bb92897f | 25cb1a5576ab155fa9fa82cd13b1d386c9c70e52dfacdef585bb6e0f28a4c6df | 94f03c2ae259eca592222549fece2027f5840299917ef033ad93ab77ac82357d | MIT | [] | 238 |
2.3 | pytest-vigil | 0.6.1 | A pytest plugin for enhanced test reliability and monitoring | <div align="center">

# Pytest Vigil
[](https://opensource.org/licenses/MIT)
[](https://www.python.org/downloads/)


</div>
**Pytest Vigil** is a reliability pytest plugin that enforces resource limits on your tests and kills them when they exceed those limits.
### Why you might need this
- Tests sometimes hang indefinitely due to deadlocks or infinite loops
- Memory leaks crash your test runner or CI environment
- CPU-intensive tests slow down your entire suite
- You want to enforce maximum runtime for your CI pipeline
- You need to identify which tests are resource hogs
---
## ✨ Features
- **Resource limits**: Set maximum time, memory (MB), and CPU (%) per test
- **Deadlock detection**: Kill tests that hang with low CPU activity
- **Suite timeout**: Stop the entire test run after a specified duration
- **CI scaling**: Automatically relaxes limits by 2x in CI environments (configurable)
- **Retry mechanism**: Re-run tests that fail due to resource violations
- **Detailed reports**: JSON output showing CPU breakdown by process type (browser, renderer, etc.)
## 🚀 Installation
```bash
uv add -D pytest-vigil
# or
pip install pytest-vigil
```
## ⚡ Quick Start
**1. Protect against heavy tests**
Limit tests to 5 seconds, 512MB RAM, and 80% CPU:
```bash
pytest --vigil-timeout 5 --vigil-memory 512 --vigil-cpu 80
```
**2. Prevent infinite CI hangs**
Kill the entire suite if it runs longer than 15 minutes:
```bash
pytest --vigil-session-timeout 900
```
**3. Generate Reliability Report**
```bash
pytest --vigil-json-report
# saved to .pytest_vigil/vigil_report.json
```
## 🛠 Usage & Configuration
### CLI Options Reference
| Option | Default | Description |
|--------|---------|-------------|
| `--vigil-timeout` | `None` | Max duration per test (seconds) |
| `--vigil-memory` | `None` | Max memory usage (MB) |
| `--vigil-cpu` | `None` | Max CPU usage (%) |
| `--vigil-retry` | `0` | Auto-retry failed/limit-violating tests |
| `--vigil-stall-timeout` | `None` | Max duration of low CPU (deadlock detection) |
| `--vigil-session-timeout` | `None` | Global timeout for entire test run |
| `--vigil-json-report` | `False` | Enable JSON reliability report saving |
| `--vigil-output-dir` | `.pytest_vigil` | Base directory for generated Vigil artifacts |
| `--vigil-cli-report-verbosity` | `short` | Terminal output: `none`, `short`, `full` |
### Using Markers
Apply specific limits to critical or heavy tests directly in code. All arguments are optional.
| Parameter | Type | Unit | Default | Description |
|-----------|------|------|---------|-------------|
| `timeout` | `float` | `s` | `None` | Test timeout |
| `memory` | `float` | `MB` | `None` | Memory limit |
| `cpu` | `float` | `%` | `None` | CPU limit |
| `retry` | `int` | - | `0` | Number of retries on failure |
| `stall_timeout` | `float` | `s` | `None` | Max duration of low CPU activity |
| `stall_cpu_threshold`| `float` | `%` | `1.0` | CPU threshold for stall detection |
```python
import pytest
@pytest.mark.vigil(timeout=5.0, memory=512, retry=2)
def test_heavy_computation():
...
```
### Environment Variables
Perfect for CI/CD pipelines. All options are available via `PYTEST_VIGIL__*` prefix.
| Variable | Default | Description |
| :--- | :--- | :--- |
| `TIMEOUT` | `None` | Default test timeout (seconds) |
| `MEMORY_LIMIT_MB` | `None` | Default memory limit (MB) |
| `CPU_LIMIT_PERCENT` | `None` | Default CPU limit (%) |
| `SESSION_TIMEOUT` | `None` | Global suite timeout (seconds) |
| `SESSION_TIMEOUT_GRACE_PERIOD` | `5.0` | Seconds to wait for graceful shutdown |
| `MONITOR_INTERVAL` | `0.1` | Internal check frequency (seconds) |
| `STRICT_MODE` | `True` | Enforce strict monitoring |
| `CI_MULTIPLIER` | `2.0` | Limit multiplier for CI environments |
| `RETRY_COUNT` | `0` | Number of retries for failures |
| `STALL_TIMEOUT` | `None` | Low-CPU deadlock timeout (seconds) |
| `STALL_CPU_THRESHOLD` | `1.0` | Threshold (%) for stall detection |
| `CONSOLE_REPORT_VERBOSITY` | `short` | Terminal output: `none`, `short`, `full` |
| `JSON_REPORT_FILENAME` | `vigil_report.json` | Default JSON report filename when reporting is enabled |
| `JSON_REPORT` | `False` | Enable JSON report saving by default |
| `ARTIFACTS_DIR` | `.pytest_vigil` | Base directory for generated Vigil artifacts |
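The `PYTEST_VIGIL__*` prefix maps directly onto the option names in the table. A hypothetical CI environment block (the values are examples only):

```shell
# Hypothetical CI step; each variable mirrors a row of the table above.
export PYTEST_VIGIL__TIMEOUT=10            # per-test timeout (seconds)
export PYTEST_VIGIL__MEMORY_LIMIT_MB=512   # per-test memory cap (MB)
export PYTEST_VIGIL__CI_MULTIPLIER=1.0     # opt out of the default 2x CI relaxation
export PYTEST_VIGIL__JSON_REPORT=true      # save the JSON reliability report
```

Run `pytest` as usual afterwards; the plugin picks the limits up at startup.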
## 📊 Reporting
Vigil provides insights into where your resources are going.
### Terminal Report
Control verbosity with `--vigil-cli-report-verbosity` (`none`, `short`, `full`).
**Short Mode (Default):**
```text
Vigil Reliability Report
Total Tests: 953 | Avg Duration: 5.32s | Avg Memory: 288.6 MB
Peak CPU by Process Type:
Browser: 3542.1% (Chromium/Webkit)
Renderer: 2156.8% (Tab rendering)
Pytest: 593.5% (Test logic)
```
**Full Mode:**
```text
Vigil Reliability Report
Test ID Att Duration (s) Max CPU (%) Max Mem (MB)
--------------------------------------------------------------------------------------------------
tests/test_stress.py::test_high_load 0 8.42 450.5 820.1
tests/test_ui.py::test_login[chromium] 0 4.15 2101.2 415.8
tests/test_ui.py::test_checkout[chromium] 1 12.30 3542.1 590.4
tests/test_api.py::test_latency 0 0.25 15.2 45.1
```
> **💡 Note on CPU > 100%:**
> In multi-process testing (like Playwright/Selenium), usage is summed across all cores and child processes. 7000% CPU usage means your test suite is utilizing ~70 cores efficiently (or inefficiently!).
### JSON Report
The JSON report captures `cpu_breakdown` for every test, helping you identify if it's the **Browser**, **DB**, or **Python** code causing the spike.
Report saving is disabled by default. Enable it with `--vigil-json-report`; output goes to `.pytest_vigil/vigil_report.json` unless overridden.
**Key Fields:**
- `flaky_tests`: Tests that passed after retry (attempt > 0)
- `cpu_breakdown`: Peak CPU by process type (`pytest`, `browser`, `renderer`, `gpu`, `webdriver`, `python`, `automation`)
- `limits`: Applied resource constraints from CLI/markers/env
<details>
<summary>📄 <b>Example JSON Report</b> (click to expand)</summary>
```json
{
"timestamp": "2026-02-08T14:23:45.123456+00:00",
"flaky_tests": [
"tests/test_integration.py::test_api_retry"
],
"results": [
{
"node_id": "tests/test_ui.py::test_checkout[chromium]",
"attempt": 0,
"duration": 12.34,
"max_cpu": 3542.1,
"max_memory": 590.4,
"cpu_breakdown": {
"pytest": 89.2,
"browser": 1805.3,
"renderer": 1247.6,
"gpu": 400.0
},
"limits": [
{
"limit_type": "time",
"threshold": 15.0,
"secondary_threshold": null,
"strict": true
},
{
"limit_type": "memory",
"threshold": 1024.0,
"secondary_threshold": null,
"strict": true
}
]
},
{
"node_id": "tests/test_integration.py::test_api_retry",
"attempt": 1,
"duration": 2.15,
"max_cpu": 45.8,
"max_memory": 128.3,
"cpu_breakdown": {
"pytest": 45.8
},
"limits": [
{
"limit_type": "time",
"threshold": 5.0,
"secondary_threshold": null,
"strict": true
}
]
}
]
}
```
</details>
## ⚖️ License
MIT
| text/markdown | Sergei Konovalov | Sergei Konovalov <l0kifs91@gmail.com> | l0kifs | l0kifs <l0kifs91@gmail.com> | null | developer-tools, pytest-plugin, testing, test-reliability, test-monitoring, test-timeout, test-resource-usage | [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Operating System :: Microsoft :: Windows",
"Operating System :: MacOS",
"Operating System :: POSIX :: Linux",
"Programming Language :: Pyth... | [] | null | null | >=3.11 | [] | [] | [] | [
"pytest>=9.0.0",
"loguru>=0.7.0",
"psutil>=7.2.2",
"pydantic>=2.12.5",
"pydantic-settings>=2.12.0"
] | [] | [] | [] | [
"Homepage, https://github.com/l0kifs/pytest-vigil",
"Documentation, https://github.com/l0kifs/pytest-vigil/blob/main/README.md",
"Repository, https://github.com/l0kifs/pytest-vigil.git",
"Issues, https://github.com/l0kifs/pytest-vigil/issues",
"Changelog, https://github.com/l0kifs/pytest-vigil/blob/main/CHA... | twine/6.1.0 CPython/3.13.7 | 2026-02-19T09:59:27.231302 | pytest_vigil-0.6.1.tar.gz | 16,775 | 01/ee/79c4f087ae9158c73a53a78542739e29cb04e1899ed351c96714beb0230d/pytest_vigil-0.6.1.tar.gz | source | sdist | null | false | 85bb1bebe56df703ded4dc2ec41b6086 | 1b48a33c12b33d3c2d7c7557d3cfad1178fe98b0577847cf79d5467baaec76f1 | 01ee79c4f087ae9158c73a53a78542739e29cb04e1899ed351c96714beb0230d | null | [] | 324 |
2.4 | ampache | 6.9.1 | Python library for Ampache XML & JSON API | AMPACHE LIBRARY FOR PYTHON3
===========================
Upload to PyPI
.. image:: https://github.com/ampache/python3-ampache/workflows/Upload%20Python%20Package/badge.svg
:target: https://pypi.org/project/ampache/
INFO
====
A Python 3 library for interacting with your Ampache server using the XML & JSON API.
`<https://ampache.org/api/>`_
Code examples and scripts are available on GitHub.
The class documentation has been extracted out into a markdown file for easier reading.
`<https://raw.githubusercontent.com/ampache/python3-ampache/master/docs/MANUAL.md>`_
This library supports connecting to any Ampache API release (3, 4, 5 and 6)
Once you connect with your passphrase or api key, the url and auth token are stored allowing you to call methods without them.
.. code-block:: python3
import ampache
import sys
import time
# Open Ampache library
ampache_connection = ampache.API()
# Set your server details
ampache_connection.set_version('6.6.1')
ampache_connection.set_url('https://music.com.au')
ampache_connection.set_key('mypassword')
ampache_connection.set_user('myusername')
# Password auth requires a timestamp for encrypting the auth key
ampache_session = ampache_connection.execute('handshake', {'timestamp': int(time.time())})
if not ampache_session:
# if using an api key you don't need the timestamp to use encrypt_string
ampache_session = ampache_connection.execute('handshake')
# Fail if you didn't connect
if not ampache_session:
sys.exit(ampache_connection.AMPACHE_VERSION + ' ERROR Failed to connect to ' + ampache_connection.AMPACHE_URL)
# now you can call methods without having to keep putting in the url and userkey
artists = ampache_connection.execute('artists', {'limit': 10})
# You can parse a response to get a list of ID's for that response
artist_ids = ampache_connection.get_id_list(artists, 'artist')
if artist_ids:
print("We found some artists")
for artist in artist_ids:
print('ID:', artist)
# ping has always allowed empty calls so you have to ping with a session key
ampache_connection.execute('ping', {'ampache_api': ampache_session})
NEWS
====
- Examples are being updated to support the latest execute method which can simplify your code
- You can save and restore from a json config file using new methods
- set_config_path: Set a folder to your config path
- get_config: Load the config and set Ampache globals
- save_config: Save the config file with the current globals
- AMPACHE_URL = The URL of your Ampache server
- AMPACHE_USER = config["ampache_user"]
- AMPACHE_KEY = Your encrypted apikey OR password if using password auth
- AMPACHE_SESSION = Current session auth from the handshake. Use to reconnect to an existing session
- AMPACHE_API = API output format "json" || "xml"
INSTALL
=======
You can now install from pip directly::
pip3 install -U ampache
EXAMPLES
========
There is a fairly simple cli example for windows/linux to perform a few functions.
It's a good example for testing and might make things a bit easier to follow.
`<https://raw.githubusercontent.com/ampache/python3-ampache/master/docs/examples/ampyche.py>`_
ampyche.py help:
.. code-block:: bash
Possible Actions:
/u:%CUSTOM_USER% (Custom username for the current action)
/k:%CUSTOM_APIKEY% (Custom apikey for the current action)
/a:%ACTION% (ping, playlists, localplay, download, list, configure, logout, showconfig)
/l:%LIMIT% (integer)
/o:%OBJECT_ID% (string)
/t:%OBJECT_TYPE% (song, playlist)
/p:%PATH% (folder for downloads)
/f:%FORMAT% (raw, mp3, ogg, flac)
/usb (split files into numeric 0-9 folders for car USBs)
/c:%COMMAND% (localplay command)
(next, prev, stop, play, pause, add, volume_up,
volume_down, volume_mute, delete_all, skip, status)
Here is a short code sample for python using version 6.x.x+ to scrobble a track to your server
.. code-block:: python3
import ampache
import sys
import time
# Open Ampache library
ampache_connection = ampache.API()
# load up previous config
if not ampache_connection.get_config():
# Set your details manually if we can't get anything
ampache_connection.set_version('6.6.1')
ampache_connection.set_url('https://music.server')
ampache_connection.set_key('mysuperapikey')
ampache_connection.set_user('myusername')
# Get a session key using the handshake
#
# * ampache_url = (string) Full Ampache URL e.g. 'https://music.com.au'
# * ampache_api = (string) encrypted apikey OR password if using password auth
# * user = (string) username //optional
# * timestamp = (integer) UNIXTIME() //optional
# * version = (string) API Version //optional
ampache_session = ampache_connection.execute('handshake')
# Fail if you didn't connect
if not ampache_session:
sys.exit(ampache_connection.AMPACHE_VERSION + ' ERROR Failed to connect to ' + ampache_connection.AMPACHE_URL)
# save your successful connection in your local config
ampache_connection.save_config()
# Scrobble a music track to your ampache server
#
# * title = (string) song title
# * artist_name = (string) artist name
# * album_name = (string) album name
# * mbtitle = (string) song mbid //optional
# * mbartist = (string) artist mbid //optional
# * mbalbum = (string) album mbid //optional
# * stime = (integer) UNIXTIME() //optional
# * client = (string) //optional
ampache_connection.execute('scrobble', {'title': 'Beneath The Cold Clay',
'artist_name': 'Crust',
'album_name': '...and a Dirge Becomes an Anthem',
'stime': int(time.time())})
POWERED BY
==========
PyCharm logo
.. image:: https://resources.jetbrains.com/storage/products/company/brand/logos/PyCharm.png
:target: https://jb.gg/OpenSourceSupport
JetBrains have supported the project for many years now and their tools really do power Ampache development.
LINKS
=====
`<https://github.com/ampache/python3-ampache/>`_
`<https://pypi.org/project/ampache/>`_
| null | Lachlan de Waard (lachlan-00) | lachlan.00@gmail.com | null | null | GPL-3.0 | null | [
"Development Status :: 5 - Production/Stable",
"License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
"Operating System :: POSIX :: Linux",
"Operating System :: Microsoft :: Windows",
"Programming Language :: Python :: 3",
"Topic :: Software Development :: Libraries :: PHP Classes",
"Intend... | [] | https://github.com/ampache/python3-ampache | https://github.com/ampache/python3-ampache | null | [] | [] | [] | [] | [] | [] | [] | [] | twine/6.1.0 CPython/3.13.7 | 2026-02-19T09:59:06.687453 | ampache-6.9.1.tar.gz | 38,260 | 44/08/dc02e69f910d1eeba31929a3c9c5b992546647c4400d8033511013b288cc/ampache-6.9.1.tar.gz | source | sdist | null | false | 75cd37a2b2b4fb9e097c2f83508a5091 | 17cc14f1423e59085d3463fccad39bfdaa1b7da2056a52834c8347b15fee1403 | 4408dc02e69f910d1eeba31929a3c9c5b992546647c4400d8033511013b288cc | null | [
"LICENSE"
] | 244 |