metadata_version string | name string | version string | summary string | description string | description_content_type string | author string | author_email string | maintainer string | maintainer_email string | license string | keywords string | classifiers list | platform list | home_page string | download_url string | requires_python string | requires list | provides list | obsoletes list | requires_dist list | provides_dist list | obsoletes_dist list | requires_external list | project_urls list | uploaded_via string | upload_time timestamp[us] | filename string | size int64 | path string | python_version string | packagetype string | comment_text string | has_signature bool | md5_digest string | sha256_digest string | blake2_256_digest string | license_expression string | license_files list | recent_7d_downloads int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2.4 | karrio-veho | 2026.1.14 | Karrio - Veho Shipping Extension | # karrio.veho
This package is a Veho extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.veho
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.veho.settings import Settings
# Initialize a carrier gateway
veho = karrio.gateway["veho"].create(
Settings(
...
)
)
```
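Once the gateway is configured, the same `karrio.sdk` entry point can fetch live rates. The sketch below follows the rating pattern from the Karrio SDK docs; the model import path (`karrio.core.models`), the `Settings` credential fields, and the address values are assumptions to verify against your installed release, and running it requires real Veho credentials.

```python
import karrio.sdk as karrio
from karrio.core.models import Address, Parcel, RateRequest
from karrio.mappers.veho.settings import Settings

# Hypothetical credentials — replace with a real Veho account.
veho = karrio.gateway["veho"].create(
    Settings(client_id="...", client_secret="...")
)

# Build a rate request and send it through the gateway.
request = karrio.Rating.fetch(
    RateRequest(
        shipper=Address(postal_code="10001", country_code="US"),
        recipient=Address(postal_code="94105", country_code="US"),
        parcels=[Parcel(weight=1.0, weight_unit="LB")],
    )
)

# parse() returns the computed rates and any carrier messages/errors.
rates, messages = request.from_(veho).parse()
```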
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:41:07.796854 | karrio_veho-2026.1.14-py3-none-any.whl | 16,402 | 08/bc/088908caf6d3ba00be78067d06b8015189d9b1af4a45721158a5c63e9e1f/karrio_veho-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 7203c257c95086e59dd3d48120e9fa14 | 9091dcdc0ca0b6ace3d79618a175daec30374041e4374156719ad7a8b19e5706 | 08bc088908caf6d3ba00be78067d06b8015189d9b1af4a45721158a5c63e9e1f | LGPL-3.0 | [] | 82 |
2.4 | karrio-usps-international | 2026.1.14 | Karrio - USPS Shipping Extension |
# karrio.usps_international
This package is a USPS International extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.usps_international
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.usps_international.settings import Settings
# Initialize a carrier gateway
usps_international = karrio.gateway["usps_international"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:41:05.848634 | karrio_usps_international-2026.1.14-py3-none-any.whl | 34,621 | 78/72/cce8fc557332cae70d776605e4c56ff9930a11ea57e47bf0851bcbf3bc50/karrio_usps_international-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 16aae9f9e66b947560fc63b93bc4850e | 6073f45f54d1dc3daf4d8aa818ed40fe84629b5602d4227a83b273d9f5edc49e | 7872cce8fc557332cae70d776605e4c56ff9930a11ea57e47bf0851bcbf3bc50 | LGPL-3.0 | [] | 80 |
2.4 | regula-facesdk-webclient | 8.1.545rc0 | Regula's FaceSDK web python client | # Regula Face SDK web API Python 3.9+ client
[](https://github.com/regulaforensics/FaceSDK-web-openapi)
[](https://support.regulaforensics.com/hc/en-us/articles/115000916306-Documentation)
[](https://faceapi.regulaforensics.com/)
## ⚠️ Warning: Package Name Changed
## Package name has been changed from `regula.facesdk.webclient` to `regula_facesdk_webclient`
Face recognition as easy as reading two bytes.
If you have any problems with or questions about this client, please contact us
through a [GitHub issue](https://github.com/regulaforensics/FaceSDK-web-python-client/issues).
You are invited to contribute [new features, fixes, or updates](https://github.com/regulaforensics/FaceSDK-web-python-client/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22), large or small.
We are always thrilled to receive pull requests and do our best to process them as quickly as we can.
See the [dev guide](./dev.md).
## Install package
`regula_facesdk_webclient` is on the Python Package Index (PyPI):
```bash
pip install regula_facesdk_webclient
```
Or using `pipenv`:
```bash
pipenv install regula_facesdk_webclient
```
## Example
Performing a request:
```python
from regula.facesdk.webclient import MatchImage, MatchRequest
from regula.facesdk.webclient.ext import FaceSdk, DetectRequest
from regula.facesdk.webclient.gen.model.image_source import ImageSource
with open("face1.jpg", "rb") as f:
face_1_bytes = f.read()
with open("face2.jpg", "rb") as f:
face_2_bytes = f.read()
with FaceSdk(host="http://0.0.0.0:41101") as sdk:
images = [
MatchImage(index=1, data=face_1_bytes, type=ImageSource.LIVE),
MatchImage(index=2, data=face_1_bytes, type=ImageSource.DOCUMENT_RFID),
MatchImage(index=3, data=face_2_bytes)
]
match_request = MatchRequest(images=images)
match_response = sdk.match_api.match(match_request)
detect_request = DetectRequest(image=face_1_bytes)
detect_response = sdk.match_api.detect(detect_request)
```
You can find a more detailed guide and run this sample in the [example](./example) folder.
| text/markdown | Regula Forensics, Inc. | support@regulaforensics.com | null | null | null | face recognition, facesdk, regulaforensics, regula | [] | [] | https://regulaforensics.com | null | >=3.9 | [] | [] | [] | [
"certifi>=2024.07.04",
"six>=1.10",
"python-dateutil>=2.8.2",
"urllib3<3.0.0,>=1.25.3",
"vistir<=0.6.1,>=0.4.0",
"idna==3.7",
"requests>=2.32.3",
"pydantic>=2",
"typing-extensions>=4.7.1",
"lazy-imports==1.0.1"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.9.25 | 2026-02-21T02:41:05.209244 | regula_facesdk_webclient-8.1.545rc0.tar.gz | 62,600 | 31/0f/0fe3063d30314fe8d8c96522e9e9cce6562b48121a80552bf96314e54f06/regula_facesdk_webclient-8.1.545rc0.tar.gz | source | sdist | null | false | 1d9f860c09ab139751a2230e8e7f8538 | 9fbcbbf690e44e629667e498a4371bb3a4aa6561a6934bd0d6ea9e0e48086a44 | 310f0fe3063d30314fe8d8c96522e9e9cce6562b48121a80552bf96314e54f06 | null | [] | 186 |
2.4 | karrio-usps | 2026.1.14 | Karrio - USPS Shipping Extension |
# karrio.usps
This package is a USPS extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.usps
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.usps.settings import Settings
# Initialize a carrier gateway
usps = karrio.gateway["usps"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:41:03.915016 | karrio_usps-2026.1.14-py3-none-any.whl | 35,806 | e1/5a/2f3b8b703e5a089dbd298a9630b9042628f0ba23470d8ccf168fdd5fef1a/karrio_usps-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 6973918358620046dbce43148b1181ed | 39877db18e6833bda23a96366e40eb81852b285a4ff46c33c0c349be0c809f7c | e15a2f3b8b703e5a089dbd298a9630b9042628f0ba23470d8ccf168fdd5fef1a | LGPL-3.0 | [] | 82 |
2.4 | karrio-ups | 2026.1.14 | Karrio - UPS Shipping extension | # karrio.ups
This package is a UPS extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.ups
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.ups.settings import Settings
# Initialize a carrier gateway
ups = karrio.gateway["ups"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:41:02.926083 | karrio_ups-2026.1.14-py3-none-any.whl | 51,204 | f3/77/100c565c3b34dc3dfeb2e23b2b92f2266885b6c2a0a4c3f058c9501fafb3/karrio_ups-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 18c5abefca71923b90935a4239efa768 | 16f291d86a1501a33090c5cd2feb828ec03d8945eaf51bd7ea2ef49696550eed | f377100c565c3b34dc3dfeb2e23b2b92f2266885b6c2a0a4c3f058c9501fafb3 | LGPL-3.0 | [] | 82 |
2.4 | karrio-tnt | 2026.1.14 | Karrio - TNT Shipping extension | # karrio.tnt
This package is a TNT extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.tnt
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.tnt.settings import Settings
# Initialize a carrier gateway
tnt = karrio.gateway["tnt"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:41:01.967051 | karrio_tnt-2026.1.14-py3-none-any.whl | 216,497 | 61/56/a3e2effd92cc2105547ca428dff46b997c60f1ed32cbb28d9e196666eca7/karrio_tnt-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 21ad7c53b4640329f49e81120e97e58d | 20266e926c9ba6554b35cb35ea929caed3d5dc81110de4841f4a97b3a763b67c | 6156a3e2effd92cc2105547ca428dff46b997c60f1ed32cbb28d9e196666eca7 | LGPL-3.0 | [] | 84 |
2.4 | karrio-tge | 2026.1.14 | Karrio - TGE Shipping Extension |
# karrio.tge
This package is a TGE extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.tge
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.tge.settings import Settings
# Initialize a carrier gateway
tge = karrio.gateway["tge"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:41:00.766860 | karrio_tge-2026.1.14-py3-none-any.whl | 21,896 | c9/46/6ff896cd4f2553ed68f894d13ab031835cfcee515789e8e4c7f9fa549bbc/karrio_tge-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | c0152f92bf81c63a2fd50169ff7b73db | 70333b3fe8ff288805fbcb8bd47d4c08f352d1b021b77c40efb423c1aa9ecaf6 | c9466ff896cd4f2553ed68f894d13ab031835cfcee515789e8e4c7f9fa549bbc | LGPL-3.0 | [] | 85 |
2.4 | karrio-teleship | 2026.1.14 | Karrio - Teleship Shipping Extension | # karrio.teleship
This package is a Teleship extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.teleship
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.teleship.settings import Settings
# Initialize a carrier gateway
teleship = karrio.gateway["teleship"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:59.874516 | karrio_teleship-2026.1.14-py3-none-any.whl | 39,053 | 6c/7d/2a7ac40a704c56a0b465fa4d82e69d1477540a7a3dfabfef1b6f5776af81/karrio_teleship-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 4239017ac32c48130724d94971c53391 | 469d8b6f80db5e343f89286da190f08542bdc6a91f88460cfdb708b0fbcf350a | 6c7d2a7ac40a704c56a0b465fa4d82e69d1477540a7a3dfabfef1b6f5776af81 | LGPL-3.0 | [] | 82 |
2.4 | karrio-spring | 2026.1.14 | Karrio - Spring Shipping Extension | # karrio.spring
This package is a Spring extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.spring
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.spring.settings import Settings
# Initialize a carrier gateway
spring = karrio.gateway["spring"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:58.836819 | karrio_spring-2026.1.14-py3-none-any.whl | 20,433 | 71/84/edcf5feda3fc09d90c0cd16ab3b654c58ebb60ce5abbcfc621e1267d3d01/karrio_spring-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 8ec8db68879e4ffb0bd1c28ac9142c5e | 6468c9a2bd09313831f87dba4a539f5c645615944fe1d07e094594e4b926b53b | 7184edcf5feda3fc09d90c0cd16ab3b654c58ebb60ce5abbcfc621e1267d3d01 | LGPL-3.0 | [] | 81 |
2.4 | karrio-smartkargo | 2026.1.14 | Karrio - SmartKargo Shipping Extension | # karrio.smartkargo
This package is a SmartKargo extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.smartkargo
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.smartkargo.settings import Settings
# Initialize a carrier gateway
smartkargo = karrio.gateway["smartkargo"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:58.044192 | karrio_smartkargo-2026.1.14-py3-none-any.whl | 21,940 | 95/56/1c783bd3b2317c63941483e630081f33520989c7edc0ead2bddfd9eb8d5b/karrio_smartkargo-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | f46cd667ed449c05a2d108ff4ba5a312 | 15033c35ceb63bb5dc3539e9dac294020da979f9e74531fc1b29d786e6468af6 | 95561c783bd3b2317c63941483e630081f33520989c7edc0ead2bddfd9eb8d5b | LGPL-3.0 | [] | 74 |
2.4 | karrio-shipengine | 2026.1.14 | Karrio - ShipEngine Shipping Extension | # karrio.shipengine
This package is a ShipEngine extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.shipengine
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.shipengine.settings import Settings
# Initialize a carrier gateway
shipengine = karrio.gateway["shipengine"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:57.151891 | karrio_shipengine-2026.1.14-py3-none-any.whl | 21,852 | 5c/2a/da542301a55e4d877be2555e5843b19895ce9491c3691622318aa8666fce/karrio_shipengine-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 3b0cb4209a2c25c72ded2d1a57f6d077 | 7e011a64f288d29d7ab36e60c85a6c1b41e18a927b037830c29416f77f249053 | 5c2ada542301a55e4d877be2555e5843b19895ce9491c3691622318aa8666fce | LGPL-3.0 | [] | 81 |
2.4 | karrio-server-proxy | 2026.1.14 | Multi-carrier shipping API Proxy module | # karrio.server.proxy
This package is a module of the [karrio](https://pypi.org/project/karrio.server) universal shipping API.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.server.proxy
```
Check the [karrio docs](https://docs.karrio.io) to get started.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio_server_core"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:56.386640 | karrio_server_proxy-2026.1.14-py3-none-any.whl | 16,399 | 22/2b/1040ad03ba9890bdd8371219ccf0ce95a51a157fcf37d9b94d56b6600dc5/karrio_server_proxy-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 71f91c2a6a7cc46874274e874dd51dcc | 97fbd6aac14fa6c83984a7ed0747aebd58167e71bcde50c841b77441c0006627 | 222b1040ad03ba9890bdd8371219ccf0ce95a51a157fcf37d9b94d56b6600dc5 | LGPL-3.0 | [] | 82 |
2.4 | karrio-server-pricing | 2026.1.14 | Multi-carrier shipping API Pricing panel | # karrio.server.pricing
This package is a module of the [karrio](https://pypi.org/project/karrio.server) universal shipping API.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.server.pricing
```
Check the [karrio docs](https://docs.karrio.io) to get started.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio_server_core"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:55.347548 | karrio_server_pricing-2026.1.14-py3-none-any.whl | 692,848 | 2e/5e/a773cdf82b17887ec8f9ef60609b1bc9d4f70d2fbb3ae1ab576381f50cd6/karrio_server_pricing-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | b19ba65d4a48cd77550d42d11cb60b23 | 91edd8cba87d19e9449daf0966b8166b2cac6ce8f086435588d0dc9d73e3709b | 2e5ea773cdf82b17887ec8f9ef60609b1bc9d4f70d2fbb3ae1ab576381f50cd6 | LGPL-3.0 | [] | 81 |
2.4 | karrio-server-orders | 2026.1.14 | Multi-carrier shipping API orders module | # karrio-server
Karrio server orders component
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio_server_core",
"karrio_server_graph",
"karrio_server_manager",
"karrio_server_events"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:54.414299 | karrio_server_orders-2026.1.14-py3-none-any.whl | 42,440 | 8b/28/2dba0eec5753b687826e8e867cbae1529747f425f36caa0705b7c1106239/karrio_server_orders-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | bd95881f46c5d4481a45eae5e3bc0ffa | 6beedf412ba5f7fd74ad5eb756282538f30a3ba365622b999e9d07f94bbd55d0 | 8b282dba0eec5753b687826e8e867cbae1529747f425f36caa0705b7c1106239 | LGPL-3.0 | [] | 80 |
2.4 | karrio-server-manager | 2026.1.14 | Multi-carrier shipping API Shipments manager module | # karrio.server.manager
This package is a module of the [karrio](https://pypi.org/project/karrio.server) universal shipping API.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.server.manager
```
Check the [karrio docs](https://docs.karrio.io) to get started.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio_server_core",
"django-downloadview"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:53.146314 | karrio_server_manager-2026.1.14-py3-none-any.whl | 150,173 | 31/9a/2f32a7fe4ce87c9c0583c5d9bb3e3b8e4b984d8209b0f462334f3755b338/karrio_server_manager-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | a28a13e4e70fb4235647832e1c9aeb4d | 3f92e7114200edeaaf8a61e855dedab7bd538bbb63706af985ec318211cb8d4b | 319a2f32a7fe4ce87c9c0583c5d9bb3e3b8e4b984d8209b0f462334f3755b338 | LGPL-3.0 | [] | 81 |
2.4 | karrio-server-graph | 2026.1.14 | Multi-carrier shipping API Graph module | # karrio.server.graph
This package is a module of the [karrio](https://pypi.org/project/karrio.server) universal shipping API.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.server.graph
```
Check the [karrio docs](https://docs.karrio.io) to get started.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio_server_core",
"django-filter",
"strawberry-graphql"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:52.201395 | karrio_server_graph-2026.1.14-py3-none-any.whl | 72,443 | 61/f9/e5138679f7b5f680f377e51f090814d5fcf1ca51c7fce9b5c7f43a255b8d/karrio_server_graph-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 7964db230241f07639b07a7ef8ee9d31 | d10550df8e2acac122d79614fab9d4656adc1a85af7767afcf59974994099214 | 61f9e5138679f7b5f680f377e51f090814d5fcf1ca51c7fce9b5c7f43a255b8d | LGPL-3.0 | [] | 79 |
2.4 | karrio-server-events | 2026.1.14 | Multi-carrier shipping API Events module | # karrio.server.events
This package is a module of the [karrio](https://pypi.org/project/karrio.server) universal shipping API.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.server.events
```
Check the [karrio docs](https://docs.karrio.io) to get started.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio_server_core",
"huey"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:51.380031 | karrio_server_events-2026.1.14-py3-none-any.whl | 40,057 | 0d/e3/1979fb813a65d93f3200d32b1a07e66a419180b55799e47465f46bbb1d3f/karrio_server_events-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | c0447b011a0590a14f8cd6aa046c0396 | 911089c69594779c131ef9912fa25811644c026cc2ec3cd372d7968a3935a94d | 0de31979fb813a65d93f3200d32b1a07e66a419180b55799e47465f46bbb1d3f | LGPL-3.0 | [] | 79 |
2.4 | karrio-server-documents | 2026.1.14 | Multi-carrier shipping API apps module | # karrio-server
Karrio server documents module.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"pyzint",
"weasyprint",
"karrio_server_core",
"karrio_server_graph",
"karrio_server_manager"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:50.330537 | karrio_server_documents-2026.1.14-py3-none-any.whl | 37,738 | 5b/42/77622f8d77a8b908952566a2c835f479ff1658ea65a142ca95dbacb1786b/karrio_server_documents-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 6087481cb6b8516812a40e172ab2dcb4 | e3abe12041e32fa6217a5bf02eb89c068e9db105ad06ea55234d5e8153ad6247 | 5b4277622f8d77a8b908952566a2c835f479ff1658ea65a142ca95dbacb1786b | LGPL-3.0 | [] | 79 |
2.4 | karrio-server-data | 2026.1.14 | Multi-carrier shipping API data import/export module | # karrio-server
Karrio server data import/export module.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio_server_core",
"karrio_server_events",
"karrio_server_manager",
"django-import-export"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:49.471786 | karrio_server_data-2026.1.14-py3-none-any.whl | 36,795 | f8/19/f91883dbd130a08326e2793a5f9ab40bea345039162307c81eca7b2963f4/karrio_server_data-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 5d43db4f25912f0d0e258269d7ea6aa1 | 862ddba476de47f27cd6b820f09ea23e0cd355823fc95d15aac5c2e886433cf3 | f819f91883dbd130a08326e2793a5f9ab40bea345039162307c81eca7b2963f4 | LGPL-3.0 | [] | 81 |
2.4 | karrio-server-core | 2026.1.14 | Multi-carrier shipping API Core module | # karrio.server.core
This package is a module of the [karrio](https://pypi.org/project/karrio.server) universal shipping API.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.server.core
```
Check the [karrio docs](https://docs.karrio.io) to get started.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio",
"psycopg2-binary",
"django-health-check",
"dnspython",
"psutil",
"pyyaml",
"Jinja2"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:48.455510 | karrio_server_core-2026.1.14-py3-none-any.whl | 279,556 | 6e/0f/48a4d626ea2e626a5147a652f9eadcf5a47979b0197a64b5a8e974e1596a/karrio_server_core-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | d5ac5b8a465255d4681e3af3444a2db6 | 72d7ed4d9120580934e13c072e20db671b5e4d6de13c3f3f2ff799a78a332db8 | 6e0f48a4d626ea2e626a5147a652f9eadcf5a47979b0197a64b5a8e974e1596a | LGPL-3.0 | [] | 80 |
2.4 | karrio-server | 2026.1.14 | Multi-carrier shipping API | # <a href="https://karrio.io" target="_blank"><img alt="Karrio" src="https://docs.karrio.io/img/logo.svg" height="50px" /></a>
**The Universal Shipping API**
[](https://github.com/karrioapi/karrio/actions/workflows/tests.yml)
[](./LICENSE)
[](https://www.codacy.com/gh/karrioapi/karrio/dashboard?utm_source=github.com&utm_medium=referral&utm_content=karrioapi/karrio&utm_campaign=Badge_Grade)
karrio makes shipping services simple and accessible.
**Features**
- **Headless Shipping**: Access a network of traditional and modern shipping carriers, API-first.
- **Multi-carrier SDK**: Integrate karrio once and connect to multiple shipping carrier APIs
- **Extensible**: Use the karrio SDK Framework to integrate with custom carrier APIs.
- **Shipping**: Connect carrier accounts, get live rates and purchase shipping labels.
- **Tracking**: Create package trackers, get real-time tracking statuses, and provide a branded tracking page.
- **Address Validation**: Validate shipping addresses using integrated 3rd party APIs.
- **Cloud**: Optimized for deployments using Docker.
- **Dashboard**: Use the [karrio dashboard](https://github.com/karrioapi/karrio-dashboard) to orchestrate your logistics operations.
## Try it now
There are several ways to use Karrio:
- [Karrio Cloud](https://app.karrio.io) lets you use the full set of shipping features. You don't need to deploy anything; we will manage and scale your infrastructure.
- [Karrio OSS](https://github.com/karrioapi/karrio) is an open-source version of karrio that provides the core functionality of karrio (rating API, tracking API, shipping API), but lacks more advanced features (multi-tenant/orgs, shipping billing data, built-in address validation, etc.)
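For the self-hosted OSS route, the server is typically run with Docker. The Compose fragment below is a hedged sketch, not an official reference: the `karrio/server` image name, the `5002` port, and the `DATABASE_*` environment variables are assumptions to check against the deployment docs.

```yaml
# docker-compose.yml — minimal illustrative setup; verify names against the karrio docs.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: karrio
      POSTGRES_USER: karrio
      POSTGRES_PASSWORD: karrio
  api:
    image: karrio/server:latest   # assumed image name
    depends_on: [db]
    ports:
      - "5002:5002"               # assumed default API port
    environment:
      DATABASE_HOST: db
      DATABASE_NAME: karrio
      DATABASE_USERNAME: karrio
      DATABASE_PASSWORD: karrio
```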
## Resources
- [**Documentation**](https://docs.karrio.io)
- [**Community Discussions**](https://github.com/karrioapi/karrio/discussions)
- [**Issue Tracker**](https://github.com/karrioapi/karrio/issues)
- [**Blog**](https://docs.karrio.io/blog)
> [Join us on Discord](https://discord.gg/gS88uE7sEx)
## License
This repository contains both OSS-licensed and non-OSS-licensed files. We maintain one repository rather than two separate repositories mainly for development convenience.
All files in the `/ee` directory fall under the [Karrio LICENSE](/ee/LICENSE).
The remaining files fall under the [Apache 2 license](LICENSE). Karrio OSS is built only from the Apache-licensed files in this repository.
For any other questions, email us at hello@karrio.io. We’d love to meet you!
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"django",
"djangorestframework",
"djangorestframework-simplejwt",
"django-constance",
"django-filter",
"django-picklefield",
"django-email-verification",
"django-cors-headers",
"django-redis",
"django-two-factor-auth",
"django-oauth-toolkit",
"drf-api-tracking",
"drf-spectacular",
"dj-database-url",
"gunicorn",
"hiredis",
"uvicorn",
"jsonfield",
"more-itertools",
"requests",
"posthog",
"python-decouple",
"karrio_server_core",
"sentry-sdk",
"whitenoise",
"opentelemetry-instrumentation-django",
"opentelemetry-api",
"opentelemetry-sdk",
"opentelemetry-exporter-otlp",
"opentelemetry-instrumentation-requests",
"opentelemetry-instrumentation-logging",
"opentelemetry-instrumentation-psycopg2",
"opentelemetry-instrumentation-redis",
"opentelemetry-semantic-conventions",
"loguru"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio-server"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:47.194117 | karrio_server-2026.1.14-py3-none-any.whl | 2,409,486 | 80/1b/df1f6f7b622285d154d75186b20d72dfaae9cf4cddecd9aeba55ecaee7be/karrio_server-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 0e46d63e451f46adf6c2ade900f0bcd5 | 945cbbb8b9bffb1fe3b7f01b4270ecb00834b4d6392325601b2d82728cf22e10 | 801bdf1f6f7b622285d154d75186b20d72dfaae9cf4cddecd9aeba55ecaee7be | Apache-2.0 | [] | 85 |
2.4 | karrio-sendle | 2026.1.14 | Karrio - Sendle Shipping Extension |
# karrio.sendle
This package is a Sendle extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.sendle
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.sendle.settings import Settings
# Initialize a carrier gateway
sendle = karrio.gateway["sendle"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:46.039008 | karrio_sendle-2026.1.14-py3-none-any.whl | 19,482 | b0/74/498c0a4e583810229a105c3b76cb6318bf70bf4514ca1c46342cd9706932/karrio_sendle-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 3367b4ff1a9606636458ad81682bd184 | 1b3cb177cd782a4d4b4c93a6f0f79024bcae5e143b4663fec0c1da8bf5cda982 | b074498c0a4e583810229a105c3b76cb6318bf70bf4514ca1c46342cd9706932 | LGPL-3.0 | [] | 81 |
2.4 | karrio-seko | 2026.1.14 | Karrio - SEKO Logistics Shipping Extension |
# karrio.seko
This package is a SEKO Logistics extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.seko
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.seko.settings import Settings
# Initialize a carrier gateway
seko = karrio.gateway["seko"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:44.895907 | karrio_seko-2026.1.14-py3-none-any.whl | 25,287 | 0b/65/a2ad1345c5bed38f49efd163336a1149c849f4466720b1cbecf6ba6b4aee/karrio_seko-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 7c3722d387e5cdd4375d2ed913bd1944 | 300d27d4c5c57b837031a23bddafee9771831a7677207044e8988b5c955686a4 | 0b65a2ad1345c5bed38f49efd163336a1149c849f4466720b1cbecf6ba6b4aee | LGPL-3.0 | [] | 81 |
2.4 | karrio-sapient | 2026.1.14 | Karrio - SAPIENT Shipping Extension |
# karrio.sapient
This package is a SAPIENT extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.sapient
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.sapient.settings import Settings
# Initialize a carrier gateway
sapient = karrio.gateway["sapient"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:43.245400 | karrio_sapient-2026.1.14-py3-none-any.whl | 22,473 | e6/c4/070e8faed9af252823c5a6332687009c64b28246c39eecc700cbd6f31461/karrio_sapient-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 8c8c63091ea82b4d1b478b5147638d85 | 6fc133a55421aa7942f1f53667dfbd9cbe7bc1a43dd5f8992a1abb5c89a9b804 | e6c4070e8faed9af252823c5a6332687009c64b28246c39eecc700cbd6f31461 | LGPL-3.0 | [] | 88 |
2.4 | karrio-royalmail | 2026.1.14 | Karrio - Royal Mail Shipping extension | # karrio.royalmail
This package is a Royal Mail extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.royalmail
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.royalmail.settings import Settings
# Initialize a carrier gateway
royalmail = karrio.gateway["royalmail"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:41.978395 | karrio_royalmail-2026.1.14-py3-none-any.whl | 9,222 | 41/40/c8befc8c4f7f410b120f7098236c03917b8a917695665cd0cd6ba6926f19/karrio_royalmail-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 95adf3ce55be38635a57fff27ad7011a | b961786a874a4634d7a732951f3aa44b7c6a1d64c3fb632f7a0fe3cd61afc00c | 4140c8befc8c4f7f410b120f7098236c03917b8a917695665cd0cd6ba6926f19 | LGPL-3.0 | [] | 84 |
2.4 | karrio-roadie | 2026.1.14 | Karrio - Roadie Shipping Extension |
# karrio.roadie
This package is a Roadie extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.roadie
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.roadie.settings import Settings
# Initialize a carrier gateway
roadie = karrio.gateway["roadie"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:41.173759 | karrio_roadie-2026.1.14-py3-none-any.whl | 15,553 | e4/fe/ddf2d68a9694211f13d3cd36a1345a9c08f0e1cb34b0c4caf2a59c2cca7b/karrio_roadie-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | aa66499d0f64ebf7b8d67d825ffeded1 | 87da4894d75332cae8aa7b7a086a9b733ebcb7aa696079148a2d7125ebaeb89f | e4feddf2d68a9694211f13d3cd36a1345a9c08f0e1cb34b0c4caf2a59c2cca7b | LGPL-3.0 | [] | 83 |
2.4 | karrio-purolator | 2026.1.14 | Karrio - Purolator Shipping extension | # karrio.purolator
This package is a Purolator extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.purolator
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.purolator.settings import Settings
# Initialize a carrier gateway
purolator = karrio.gateway["purolator"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:39.605547 | karrio_purolator-2026.1.14-py3-none-any.whl | 397,053 | 83/b3/75c3844603e1bf1796d4426173d3c5566c70f404719e29ccbb4bf01b7dfa/karrio_purolator-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 99529878cdd9e5d5928e21373b485342 | 039eb22abb7bef5d3a3edd701b4da83e7e75829dba6e84f4af0fe0b5b5b557e1 | 83b375c3844603e1bf1796d4426173d3c5566c70f404719e29ccbb4bf01b7dfa | LGPL-3.0 | [] | 81 |
2.4 | karrio-postat | 2026.1.14 | Karrio - Austrian Post Shipping Extension | # karrio.postat
This package is a Post AT (Austrian Post) extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.postat
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.postat.settings import Settings
# Initialize a carrier gateway
postat = karrio.gateway["postat"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:38.612422 | karrio_postat-2026.1.14-py3-none-any.whl | 41,603 | e3/86/d6c0faff09214aebea2e420fb449bfabadea51c9342d7835aaad5a5d37c1/karrio_postat-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 40c2aa2a72671e984835c0e82430a6c5 | 8f048bdfa11e27b05425311d978b624ad511244001682d82efbf6dd7db3ce99d | e386d6c0faff09214aebea2e420fb449bfabadea51c9342d7835aaad5a5d37c1 | LGPL-3.0 | [] | 85 |
2.4 | karrio-parcelone | 2026.1.14 | Karrio - ParcelOne Shipping Extension | # karrio.parcelone
This package is a ParcelOne extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.parcelone
```
## Usage
```python
import karrio
from karrio.mappers.parcelone.settings import Settings
# Initialize a gateway with your ParcelOne credentials
gateway = karrio.gateway["parcelone"].create(
Settings(
username="your-username",
password="your-password",
mandator_id="your-mandator-id",
consigner_id="your-consigner-id",
test_mode=True,
)
)
```
Check the [karrio documentation](https://docs.karrio.io) for more details.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:37.439291 | karrio_parcelone-2026.1.14-py3-none-any.whl | 23,765 | d0/05/7760bc3f6cdc3ab51a5978891ae08369d2a719262efd05c314d82e46493b/karrio_parcelone-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 4bfddf13724b74c5349e4fb75365365a | f159b79c7378830c84a34c6dcb39d40ebe8886ab41a7ee472056af739bcee407 | d0057760bc3f6cdc3ab51a5978891ae08369d2a719262efd05c314d82e46493b | LGPL-3.0 | [] | 80 |
2.4 | matty | 0.10.0 | A Terminal UI for Matrix chat - simple and AI-friendly | <div align="center">
<img src="logo.svg" alt="Matrix TUI Logo" width="200" height="200">
# Matty - Matrix CLI Client
A simple, functional Matrix chat client built with Python, Typer, Pydantic, Nio, and Rich. Every interaction is a single CLI command for easy automation.
</div>
[](https://pypi.python.org/pypi/matty)
[](https://github.com/basnijholt/matty/actions/workflows/pytest.yml)
[](https://codecov.io/gh/basnijholt/matty)
[](https://github.com/basnijholt/matty)
[](https://github.com/astral-sh/ruff)
## Features
- Fast CLI commands for quick Matrix operations
- Thread support - view and navigate threaded conversations
- Reactions support - add and view emoji reactions on messages
- Message redaction - delete messages with optional reasons
- AI-friendly - every action is a single CLI command
- Functional programming style (minimal classes, maximum functions)
- Environment-based configuration
- Multiple output formats (rich, simple, JSON)
- Type-safe with dataclasses and type hints
- Persistent simple ID mapping for complex Matrix IDs
## Installation
```bash
uv tool install matty
# or
pipx install matty
# or
pip install matty
```
For development, clone the repo and install dependencies:
```bash
# Clone the repository
git clone https://github.com/basnijholt/matrix-cli
cd matrix-cli
# Install dependencies with uv
uv sync
# Optional: Install pre-commit hooks
uv run pre-commit install
```
## Configuration
Matty uses environment variables for configuration. Create a `.env` file in your working directory with your Matrix credentials:
```bash
MATRIX_HOMESERVER=https://matrix.org
MATRIX_USERNAME=your_username
MATRIX_PASSWORD=your_password
MATRIX_SSL_VERIFY=true # Set to false for test servers
```
### Environment Variables
| Variable | Description | Default | Example |
|----------|-------------|---------|---------|
| `MATRIX_HOMESERVER` | The Matrix homeserver URL to connect to | `https://matrix.org` | `https://matrix.example.com` |
| `MATRIX_USERNAME` | Your Matrix username (without @ or :server) | None (required) | `alice` |
| `MATRIX_PASSWORD` | Your Matrix account password | None (required) | `secretpassword` |
| `MATRIX_SSL_VERIFY` | Whether to verify SSL certificates | `true` | `false` (for test servers) |
**Notes:**
- The username should be provided without the `@` prefix or `:server` suffix
- Set `MATRIX_SSL_VERIFY=false` when connecting to test servers with self-signed certificates
- Command-line options (`--username`, `--password`) override environment variables
## Usage
### Available Commands
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty [OPTIONS] COMMAND [ARGS]...
Functional Matrix CLI client
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --install-completion Install completion for the current shell. │
│ --show-completion Show completion for the current shell, to copy it or │
│ customize the installation. │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Commands ─────────────────────────────────────────────────────────────────────────────╮
│ rooms List all joined rooms. (alias: r) │
│ messages Show recent messages from a room. (alias: m) │
│ users Show users in a room. (alias: u) │
│ send Send a message to a room. Supports @mentions. (alias: s) │
│ threads List all threads in a room. (alias: t) │
│ thread Show all messages in a specific thread. (alias: th) │
│ reply Reply to a specific message using its handle. (alias: re) │
│ thread-start Start a new thread from a message using its handle. (alias: ts) │
│ thread-reply Reply within an existing thread. (alias: tr) │
│ react Add a reaction to a message using its handle. (alias: rx) │
│ edit Edit a message using its handle. (alias: e) │
│ redact Delete/redact a message using its handle. (alias: del) │
│ reactions Show detailed reactions for a specific message. (alias: rxs) │
│ tui Launch interactive TUI chat interface. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Rooms Command
List all joined Matrix rooms:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty rooms --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty rooms [OPTIONS]
List all joined rooms. (alias: r)
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env │
│ var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env │
│ var) │
│ --format -f [rich|simple|json] Output format (rich/simple/json) │
│ [default: rich] │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Messages Command
Get recent messages from a room:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty messages --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty messages [OPTIONS] [ROOM]
Show recent messages from a room. (alias: m)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --limit -l INTEGER [default: 20] │
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env │
│ var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env │
│ var) │
│ --format -f [rich|simple|json] Output format (rich/simple/json) │
│ [default: rich] │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Users Command
List users in a room:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty users --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty users [OPTIONS] [ROOM]
Show users in a room. (alias: u)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env │
│ var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env │
│ var) │
│ --format -f [rich|simple|json] Output format (rich/simple/json) │
│ [default: rich] │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Thread Commands
View and interact with threads:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty threads --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty threads [OPTIONS] [ROOM]
List all threads in a room. (alias: t)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --limit -l INTEGER Number of messages to check [default: 50] │
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env │
│ var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env │
│ var) │
│ --format -f [rich|simple|json] Output format (rich/simple/json) │
│ [default: rich] │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty thread --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty thread [OPTIONS] [ROOM] [THREAD_ID]
Show all messages in a specific thread. (alias: th)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
│ thread_id [THREAD_ID] Thread ID (t1, t2, etc.) or full Matrix ID │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --limit -l INTEGER Number of messages to fetch [default: 50] │
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env │
│ var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env │
│ var) │
│ --format -f [rich|simple|json] Output format (rich/simple/json) │
│ [default: rich] │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Send Command
Send messages to rooms:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty send --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty send [OPTIONS] [ROOM] [MESSAGE]
Send a message to a room. Supports @mentions. (alias: s)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
│ message [MESSAGE] Message to send (use @username for mentions) │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --stdin Read message from stdin │
│ --file -f PATH Read message from file │
│ --no-mentions Don't parse @mentions in messages │
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env var) │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Reply Command
Reply to messages:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty reply --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty reply [OPTIONS] [ROOM] [HANDLE] [MESSAGE]
Reply to a specific message using its handle. (alias: re)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
│ handle [HANDLE] Message handle (m1, m2, etc.) to reply to │
│ message [MESSAGE] Reply message │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --no-mentions Don't parse @mentions in messages │
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env var) │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Thread Start Command
Start a thread from a message:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty thread-start --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty thread-start [OPTIONS] [ROOM] [HANDLE] [MESSAGE]
Start a new thread from a message using its handle. (alias: ts)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
│ handle [HANDLE] Message handle (m1, m2, etc.) to start thread from │
│ message [MESSAGE] First message in the thread │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --no-mentions Don't parse @mentions in messages │
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env var) │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Thread Reply Command
Reply in a thread:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty thread-reply --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty thread-reply [OPTIONS] [ROOM] [THREAD_ID] [MESSAGE]
Reply within an existing thread. (alias: tr)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
│ thread_id [THREAD_ID] Thread ID (t1, t2, etc.) or full Matrix ID │
│ message [MESSAGE] Reply message │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --no-mentions Don't parse @mentions in messages │
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env var) │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### React Command
Add reactions to messages:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty react --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty react [OPTIONS] [ROOM] [HANDLE] [EMOJI]
Add a reaction to a message using its handle. (alias: rx)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
│ handle [HANDLE] Message handle (m1, m2, etc.) to react to │
│ emoji [EMOJI] Emoji reaction (e.g., 👍, ❤️, 😄) │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env var) │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Reactions Command
View reactions on a message:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty reactions --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty reactions [OPTIONS] [ROOM] [HANDLE]
Show detailed reactions for a specific message. (alias: rxs)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
│ handle [HANDLE] Message handle (m1, m2, etc.) to show reactions for │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env │
│ var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env │
│ var) │
│ --format -f [rich|simple|json] Output format (rich/simple/json) │
│ [default: rich] │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
### Redact Command
Delete/redact messages:
<!-- CODE:BASH:START -->
<!-- echo '```' -->
<!-- matty redact --help -->
<!-- echo '```' -->
<!-- CODE:END -->
<!-- OUTPUT:START -->
<!-- ⚠️ This content is auto-generated by `markdown-code-runner`. -->
```
Usage: matty redact [OPTIONS] [ROOM] [HANDLE]
Delete/redact a message using its handle. (alias: del)
╭─ Arguments ────────────────────────────────────────────────────────────────────────────╮
│ room [ROOM] Room ID or name │
│ handle [HANDLE] Message handle (m1, m2, etc.) to redact/delete │
╰────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ──────────────────────────────────────────────────────────────────────────────╮
│ --reason -r TEXT Reason for redaction │
│ --username -u TEXT Matrix username (overrides MATRIX_USERNAME env var) │
│ --password -p TEXT Matrix password (overrides MATRIX_PASSWORD env var) │
│ --help -h Show this message and exit. │
╰────────────────────────────────────────────────────────────────────────────────────────╯
```
<!-- OUTPUT:END -->
## Command Aliases
For faster typing, all commands have short aliases:
- `matty r` → `matty rooms` - List all rooms
- `matty m` → `matty messages` - Show messages from a room
- `matty u` → `matty users` - Show users in a room
- `matty s` → `matty send` - Send a message
- `matty t` → `matty threads` - List threads
- `matty th` → `matty thread` - Show thread messages
- `matty re` → `matty reply` - Reply to a message
- `matty ts` → `matty thread-start` - Start a thread
- `matty tr` → `matty thread-reply` - Reply in a thread
- `matty rx` → `matty react` - Add a reaction to a message
- `matty del` → `matty redact` - Delete/redact a message
- `matty rxs` → `matty reactions` - Show reactions on a message
## Examples
### Basic Usage
```bash
# List all rooms
matty rooms
# or use alias: matty r
# Show users in a room (with mention hints)
matty users lobby
# or: matty u lobby
# Get recent messages from a room
matty messages lobby --limit 10
# or: matty m lobby --limit 10
# Send a message to a room
matty send lobby "Hello from CLI!"
# or: matty s lobby "Hello from CLI!"
# Send a message with mentions
matty send lobby "@alice check this out!"
# or: matty s lobby "@bob @alice meeting at 3pm"
# Use different output formats
matty rooms --format json
matty rooms --format simple
# or: matty r --format json
```
### Working with Threads
```bash
# List threads in a room
matty threads lobby
# View messages in a specific thread (using simple ID)
matty thread lobby t1
# Start a thread from a message
matty thread-start lobby m2 "Starting a thread!"
# Reply in a thread (using simple thread ID)
matty thread-reply lobby t1 "Reply in thread"
```
### Reactions and Redaction
```bash
# Add a reaction to a message
matty react lobby m3 "👍"
# or: matty rx lobby m3 "🚀"
# View reactions on a message
matty reactions lobby m3
# or: matty rxs lobby m3 --format simple
# Delete/redact a message
matty redact lobby m5 --reason "Accidental message"
# or: matty del lobby m5
```
### Message Handles and Replies
```bash
# Reply to a message using handle
matty reply lobby m3 "This is a reply!"
# Reply to the 5th message in a room
matty messages lobby --limit 10
matty reply lobby m5 "Replying to message 5"
```
### Mentions
The CLI supports @mentions in messages:
```bash
# Mention a user by username
matty send lobby "@alice can you check this?"
# Multiple mentions
matty send lobby "@bob @alice meeting in 5 minutes"
# List users to see available mentions
matty users lobby # Shows User IDs and simplified @mentions
# Mentions work in replies and threads too
matty reply lobby m3 "@alice I agree with your point"
matty thread-reply lobby t1 "@bob what do you think?"
```
The mention system will:
- Automatically find the full Matrix ID for @username mentions
- Support full Matrix IDs like @user:server.com
- Format mentions properly so users get notified
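The lookup described above can be sketched as a small helper. This is a hypothetical illustration of the idea, not matty's actual code; the regex, function name, and matching rule are assumptions:

```python
import re

# Hypothetical sketch of @mention resolution: a short @username token becomes a
# full Matrix ID when exactly one room member matches; full @user:server.com IDs
# and ambiguous or unknown names pass through unchanged.
MENTION = re.compile(r"@(?P<name>[A-Za-z0-9_.-]+?)(?P<server>:[A-Za-z0-9_.-]+)?\b")

def resolve_mentions(text: str, room_user_ids: list[str]) -> str:
    def replace(match: re.Match) -> str:
        if match.group("server"):  # already a full @user:server.com ID
            return match.group(0)
        hits = [u for u in room_user_ids if u.split(":")[0] == f"@{match.group('name')}"]
        return hits[0] if len(hits) == 1 else match.group(0)
    return MENTION.sub(replace, text)
```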
## Message Handles and Thread IDs
The CLI uses convenient handles to reference messages and threads:
- **Message handles**: `m1`, `m2`, `m3`, etc. - Reference messages by their position
- **Thread IDs**: `t1`, `t2`, `t3`, etc. - Reference threads with simple persistent IDs
These IDs are stored in `~/.matrix_cli_ids.json` and persist across sessions.
### Why Simple IDs?
Matrix uses complex IDs like:
- Event: `$Uj2XuH2a8EqJBh4g:matrix.org`
- Room: `!DfQvqvwXYsFjVcfLTp:matrix.org`
Our CLI simplifies these to:
- Messages: `m1`, `m2`, `m3` (temporary handles for current view)
- Threads: `t1`, `t2`, `t3` (persistent IDs across sessions)
## Output Formats
The CLI supports three output formats:
1. **Rich** (default) - Beautiful terminal UI with tables and colors
2. **Simple** - Plain text output, perfect for scripts
3. **JSON** - Machine-readable format for automation
Example:
```bash
# Pretty tables with colors
matty rooms
# Simple text output
matty rooms --format simple
# JSON for automation
matty rooms --format json | jq '.[] | .name'
```
## Project Structure
```
matrix-cli/
├── matty.py # Main CLI application (functional style)
├── test_client.py # Connection testing utility
├── tests/ # Test suite
│ ├── __init__.py
│ ├── conftest.py # Pytest configuration
│ └── test_matrix_cli.py # Unit tests
├── .github/ # GitHub Actions workflows
│ └── workflows/
│ ├── pytest.yml # Test runner
│ ├── release.yml # PyPI release
│ └── markdown-code-runner.yml # README updater
├── .env # Your credentials (not in git)
├── .env.example # Example environment file
├── CLAUDE.md # Development guidelines
├── pyproject.toml # Project configuration
└── README.md # This file
```
## Development
This project follows functional programming principles:
- Private functions (`_function_name`) for internal logic
- Dataclasses over dictionaries for data structures
- Type hints everywhere for clarity
- No unnecessary abstractions or class hierarchies
- Functions over classes where possible
See `CLAUDE.md` for detailed development guidelines.
## Testing
```bash
# Run tests
uv run pytest tests/ -v
# Test with coverage
uv run pytest tests/ -v --cov=matty --cov-report=term-missing
# Test connection to Matrix server
uv run python test_client.py
# Run pre-commit checks
uv run pre-commit run --all-files
```
## License
MIT
| text/markdown | null | Bas Nijholt <bas@nijho.lt> | null | null | MIT | ai-friendly, chat, cli, client, matrix | [
"Development Status :: 4 - Beta",
"Environment :: Console",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.12",
"Topic :: Communications :: Chat",
"Topic :: Software Development :: Libraries :: Python Modules"
] | [] | null | null | >=3.12 | [] | [] | [] | [
"aiofiles>=24.1.0",
"matrix-nio>=0.25.2",
"pydantic-settings>=2.10.1",
"pydantic>=2.11.7",
"python-dotenv>=1.1.1",
"rich>=13.0.0",
"textual>=1.0.0",
"typer>=0.16.1"
] | [] | [] | [] | [
"Homepage, https://github.com/basnijholt/matty",
"Documentation, https://github.com/basnijholt/matty#readme",
"Repository, https://github.com/basnijholt/matty",
"Issues, https://github.com/basnijholt/matty/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:36.780387 | matty-0.10.0.tar.gz | 32,094 | 39/d3/c037b762ab1abffcaf36cbecc22c0c1c1f423e4ad130df37c40d1ac35922/matty-0.10.0.tar.gz | source | sdist | null | false | 6b6afbe5862fff6a30e20a0262e7ef0e | f1a5b409d3c99b50544f84b8214dad59cce6576504ebf324e066fc1a8ca96e29 | 39d3c037b762ab1abffcaf36cbecc22c0c1c1f423e4ad130df37c40d1ac35922 | null | [] | 167 |
2.4 | karrio-nationex | 2026.1.14 | Karrio - Nationex Shipping Extension |
# karrio.nationex
This package is a Nationex extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.nationex
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.nationex.settings import Settings
# Initialize a carrier gateway
nationex = karrio.gateway["nationex"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:35.740913 | karrio_nationex-2026.1.14-py3-none-any.whl | 16,733 | 34/6d/13bd1585a7ac756572f81cec34a768029a1b9d3425981903baf6c247c151/karrio_nationex-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 2aa1e1a156657b44f8bb10958969f1b8 | b3dcbc993a200df6115ec4f1924e2c1b4fdfdb389271e54209cff0a412c3c9b8 | 346d13bd1585a7ac756572f81cec34a768029a1b9d3425981903baf6c247c151 | LGPL-3.0 | [] | 85 |
2.4 | karrio-mydhl | 2026.1.14 | Karrio - MyDHL Shipping Extension | # karrio.mydhl
This package is a MyDHL extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.mydhl
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.mydhl.settings import Settings
# Initialize a carrier gateway
mydhl = karrio.gateway["mydhl"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:34.957796 | karrio_mydhl-2026.1.14-py3-none-any.whl | 40,175 | 47/40/4315dc223f723aac1377122bef97bd2def3f83eeaf9997c09b2db03c28a1/karrio_mydhl-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 54e6edab250a368b372ea331199adbb9 | 0701b83ed31d6a93651e79e257a7d2ae7d27ec27191bda3838abe750d42aa193 | 47404315dc223f723aac1377122bef97bd2def3f83eeaf9997c09b2db03c28a1 | Apache-2.0 | [] | 82 |
2.4 | karrio-locate2u | 2026.1.14 | Karrio - Locate2u Shipping Extension |
# karrio.locate2u
This package is a Locate2u extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.locate2u
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.locate2u.settings import Settings
# Initialize a carrier gateway
locate2u = karrio.gateway["locate2u"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:34.175126 | karrio_locate2u-2026.1.14-py3-none-any.whl | 13,875 | 80/d8/808eb4b875ecf5f40cbdb41a09bc790c77c38d48fe614c7fb3eedcf76a5b/karrio_locate2u-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 3749aa4c7feacd4fd76699d393f3107f | c094ebeb156b790ec900caced5db231406d0b4b1e62369619a29ea9988e481e0 | 80d8808eb4b875ecf5f40cbdb41a09bc790c77c38d48fe614c7fb3eedcf76a5b | LGPL-3.0 | [] | 84 |
2.4 | karrio-laposte | 2026.1.14 | Karrio - La Poste Shipping Extension |
# karrio.laposte
This package is a La Poste extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.laposte
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.laposte.settings import Settings
# Initialize a carrier gateway
laposte = karrio.gateway["laposte"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:33.386528 | karrio_laposte-2026.1.14-py3-none-any.whl | 9,549 | e5/a9/b8846287aa8bba80a7fdcff9fa7b8f0838222140580428a100c1a6033f56/karrio_laposte-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 96eaa55fddb9a9dc89a11c54b1f96e5f | fe56cbcbec2fd03353c7122426b61435329e0b002075f9b76e013d3852d3c675 | e5a9b8846287aa8bba80a7fdcff9fa7b8f0838222140580428a100c1a6033f56 | LGPL-3.0 | [] | 80 |
2.4 | karrio-landmark | 2026.1.14 | Karrio - Landmark Global Shipping Extension | # karrio.landmark
This package is a Landmark Global extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.landmark
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.landmark.settings import Settings
# Initialize a carrier gateway
landmark = karrio.gateway["landmark"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:32.219465 | karrio_landmark-2026.1.14-py3-none-any.whl | 147,302 | 60/44/a0b29806a2d4d7fbf8ce29d41ef688545281255ec83750487adad49aac79/karrio_landmark-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 01cf9f6c8fdf8c494d631f919c68ca8e | a16e940cf02ca34901ad53215dae047136cbb8ecc63c0751d77d088a55765dfb | 6044a0b29806a2d4d7fbf8ce29d41ef688545281255ec83750487adad49aac79 | LGPL-3.0 | [] | 81 |
2.4 | karrio-hermes | 2026.1.14 | Karrio - Hermes Shipping Extension | # karrio.hermes
This package is a Hermes extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.hermes
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.hermes.settings import Settings
# Initialize a carrier gateway
hermes = karrio.gateway["hermes"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:31.129940 | karrio_hermes-2026.1.14-py3-none-any.whl | 27,035 | a1/50/44e2e2a3039bd996a19c4e79031efe0f179492176f59920204ef35de042b/karrio_hermes-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | ad9b7637de0a486b29602389b146d06f | 019cbe3a5b993d20946d08e7402812cbef5575faf956fcb60a8157d3bc8c4147 | a15044e2e2a3039bd996a19c4e79031efe0f179492176f59920204ef35de042b | LGPL-3.0 | [] | 80 |
2.4 | karrio-hay-post | 2026.1.14 | Karrio - HayPost Shipping Extension | # karrio.hay_post
This package is a HayPost extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.hay_post
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.hay_post.settings import Settings
# Initialize a carrier gateway
hay_post = karrio.gateway["hay_post"].create(
Settings(
...
)
)
```
For test mode, you need to route requests through a proxy or a static IP address that HayPost has whitelisted.
You also need to contact HayPost's managers to obtain test and production
accounts: [HayPost Contact Us](https://www.haypost.am/en/contact-us).
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
## Additional information
### Services
* letter_ordered
* letter_simple
* letter_valued
* package_ordered
* package_simple
* package_valued
* parcel_simple
* parcel_valued
* postcard_ordered
* postcard_simple
* sekogram_simple
* sprint_simple
* yes_ordered_value
### Currencies
* RUB
* USD
* EUR
* AMD
### Additional Services as Option
* notification
* ordered_packaging
* pick_up
* postmen_delivery_value
* delivery
* international_notification
* domestic_sms
* international_sms
### Note
* To obtain rates, you first need to configure them in the Carrier Configurations section (Services and Options).
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:30.332764 | karrio_hay_post-2026.1.14-py3-none-any.whl | 16,535 | 0d/6f/ac18f892d02a3b39c5fa9b15034d734c5d020176ad48f6f1b267b5059730/karrio_hay_post-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | b08ec672207abbfebf68a4ca7d670194 | 9ee25c97166327fec2d34ebf5e70abfa88b8843b80798f030889164b54ba80b8 | 0d6fac18f892d02a3b39c5fa9b15034d734c5d020176ad48f6f1b267b5059730 | LGPL-3.0 | [] | 83 |
2.4 | karrio-gls | 2026.1.14 | Karrio - GLS Group Shipping Extension | # Karrio - GLS Group Shipping Extension
This extension adds support for GLS Group shipping services to the Karrio platform.
## Features
- Shipment creation with label generation
- Tracking information retrieval
- OAuth2 authentication
- Support for multiple parcel types and services
## Installation
```bash
pip install karrio.gls
```
## Configuration
```python
import karrio
from karrio.mappers.gls.settings import Settings
settings = Settings(
client_id="your_client_id",
client_secret="your_client_secret",
test_mode=True
)
gateway = karrio.gateway["gls"].create(settings)
```
## API Documentation
- Production: https://api.gls-group.net
- Sandbox: https://api-sandbox.gls-group.net
- Developer Portal: https://dev-portal.gls-group.net
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:29.527747 | karrio_gls-2026.1.14-py3-none-any.whl | 17,558 | 46/84/fc8d11156d8a3d6844f711e26a593ab3002187d33032952c64aa00f4d441/karrio_gls-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 9f1966d442ef48c69c555669c465c681 | 13421eb02daef4feead32f080ba1d1a844356732c24ff450002f76fec3dabad9 | 4684fc8d11156d8a3d6844f711e26a593ab3002187d33032952c64aa00f4d441 | Apache-2.0 | [] | 81 |
2.4 | karrio-geodis | 2026.1.14 | Karrio - GEODIS Shipping Extension |
# karrio.geodis
This package is a GEODIS extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.geodis
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.geodis.settings import Settings
# Initialize a carrier gateway
geodis = karrio.gateway["geodis"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:28.756619 | karrio_geodis-2026.1.14-py3-none-any.whl | 19,926 | 8e/eb/b97da3f56ac3dfce970fe7ddb1cd6e6341884a065f82b860a25798657442/karrio_geodis-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 4932ddd2815fd262821aab6ca98d7fb2 | 1c7184c9e9d1007688de3b0c59ca850a7ececd0671c3dfb366d3183d25a15c4e | 8eebb97da3f56ac3dfce970fe7ddb1cd6e6341884a065f82b860a25798657442 | LGPL-3.0 | [] | 82 |
2.4 | karrio-generic | 2026.1.14 | Karrio - Custom carrier Shipping extension | # karrio.generic
This package is a Generic extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.generic
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.generic.settings import Settings
# Initialize a carrier gateway
generic = karrio.gateway["generic"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:27.756368 | karrio_generic-2026.1.14-py3-none-any.whl | 5,767 | 7a/18/282595e5e49b167745dc69c4ae827eda44a0f9c5ee9cffde56993611a1e6/karrio_generic-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | f7f0a44fa1db42d863f96136e1a4735b | 182fe9fad5b79eb11218f0d0ad5fe6749ac7cf413f3b5c1bcece7d055aa2ffa8 | 7a18282595e5e49b167745dc69c4ae827eda44a0f9c5ee9cffde56993611a1e6 | LGPL-3.0 | [] | 83 |
2.4 | karrio-freightcom | 2026.1.14 | Karrio - Freightcom Shipping extension | # karrio.freightcom
This package is a Freightcom extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.freightcom
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.freightcom.settings import Settings
# Initialize a carrier gateway
freightcom = karrio.gateway["freightcom"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:26.874633 | karrio_freightcom-2026.1.14-py3-none-any.whl | 112,874 | 1b/9e/9d366221296407ec6ebea72fa2a981891acc107ee5a01fb2961430f24763/karrio_freightcom-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | e48ee356c34a35614606b78d880aa54e | 2f6f499268ed86a85e9757ff636a18a6ceddd002e686b3a121e5d2a1ddb25c4e | 1b9e9d366221296407ec6ebea72fa2a981891acc107ee5a01fb2961430f24763 | LGPL-3.0 | [] | 80 |
2.4 | karrio-fedex | 2026.1.14 | Karrio - FedEx Shipping Extension |
# karrio.fedex
This package is a FedEx extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.fedex
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.fedex.settings import Settings
# Initialize a carrier gateway
fedex = karrio.gateway["fedex"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:25.934455 | karrio_fedex-2026.1.14-py3-none-any.whl | 56,077 | 0a/a2/80f8c68a05697dc8963560ae492e57aa71eaea4f7051fb7382a9cad72b81/karrio_fedex-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 89eb8e85147f3beabb1856bee9133992 | 887233c3d40c62470bdbe62e6ceb8f47149ead5a7bf2f7ff55160f21b5d95157 | 0aa280f8c68a05697dc8963560ae492e57aa71eaea4f7051fb7382a9cad72b81 | LGPL-3.0 | [] | 80 |
2.4 | karrio-eshipper | 2026.1.14 | Karrio - eShipper Shipping Extension |
# karrio.eshipper
This package is an eShipper extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.eshipper
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.eshipper.settings import Settings
# Initialize a carrier gateway
eshipper = karrio.gateway["eshipper"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:24.941422 | karrio_eshipper-2026.1.14-py3-none-any.whl | 39,065 | 66/ac/488c5988600e98581a11417947ff0cfd3bfdc967f42305f7c22e7e0e5037/karrio_eshipper-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 8db5bcd0a52442e674f4370b3e7e63d6 | e4e58e37f0fcb7590bb0d5f96cda69dc3779ba4ac07513da25a96a1573a726de | 66ac488c5988600e98581a11417947ff0cfd3bfdc967f42305f7c22e7e0e5037 | LGPL-3.0 | [] | 84 |
2.4 | karrio-easyship | 2026.1.14 | Karrio - Easyship Shipping Extension |
# karrio.easyship
This package is an Easyship extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.easyship
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.easyship.settings import Settings
# Initialize a carrier gateway
easyship = karrio.gateway["easyship"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:23.845539 | karrio_easyship-2026.1.14-py3-none-any.whl | 47,699 | 34/2f/2b2b8eacf8a0229b3b267dfd833d2941b6f3045ba35e1e5923490de3c142/karrio_easyship-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 7c3bcb29d0e5703443330b68e232a6b0 | c9b89d6f6cee7de064724f5d40d5dfaa20e967c9e40bb909d2e85f33d8739d39 | 342f2b2b8eacf8a0229b3b267dfd833d2941b6f3045ba35e1e5923490de3c142 | LGPL-3.0 | [] | 81 |
2.4 | karrio-easypost | 2026.1.14 | Karrio - EasyPost Shipping extension | # karrio.easypost
This package is an EasyPost extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.easypost
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.easypost.settings import Settings
# Initialize a carrier gateway
easypost = karrio.gateway["easypost"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:23.006338 | karrio_easypost-2026.1.14-py3-none-any.whl | 30,134 | fa/e3/84fce39bc1742f842418b0bdf9c52ec2971fb2879d1824702ba159caa00b/karrio_easypost-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 8acc7461be8cce0ed6748396dfcb0bde | c4fdb0940d7456dfa51d416eb46855cee68f06241a23cb0df423ea3c25a23cf7 | fae384fce39bc1742f842418b0bdf9c52ec2971fb2879d1824702ba159caa00b | LGPL-3.0 | [
"LICENSE"
] | 82 |
2.4 | karrio-dpd-meta | 2026.1.14 | Karrio - DPD Meta Shipping Extension | # karrio.dpd_meta
This package is a DPD Group extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.dpd_meta
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.dpd_meta.settings import Settings
# Initialize a carrier gateway
dpd_meta = karrio.gateway["dpd_meta"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:22.059473 | karrio_dpd_meta-2026.1.14-py3-none-any.whl | 24,152 | 1a/2f/6a57516c350588d363cd81eb2eeed27e6208ed41c272325a64d384d0ff83/karrio_dpd_meta-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 6fb8a3f198ab60234ef4400135730222 | f99829bf383a0c19170464089dbd1cb5efd2df076f30d18c920c9072839f1cad | 1a2f6a57516c350588d363cd81eb2eeed27e6208ed41c272325a64d384d0ff83 | LGPL-3.0 | [] | 80 |
2.4 | karrio-dpd | 2026.1.14 | Karrio - DPD Shipping Extension |
# karrio.dpd
This package is a DPD extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.dpd
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.dpd.settings import Settings
# Initialize a carrier gateway
dpd = karrio.gateway["dpd"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:21.188328 | karrio_dpd-2026.1.14-py3-none-any.whl | 117,822 | 3b/60/344340ba79242521a796baa7a272ab5289f2cff2df2da8f25649d054eb0f/karrio_dpd-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 9efec8cb45b2c0312aa7e8ba06ebba36 | 9d09a4c4bf4198868aedc3c785ec2954ebb7de3db8e5f97b34bd0bd3020d44ce | 3b60344340ba79242521a796baa7a272ab5289f2cff2df2da8f25649d054eb0f | LGPL-3.0 | [] | 84 |
2.4 | karrio-dicom | 2026.1.14 | Karrio - Dicom Shipping extension | # karrio.dicom
This package is a Dicom extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.dicom
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.dicom.settings import Settings
# Initialize a carrier gateway
dicom = karrio.gateway["dicom"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:20.324185 | karrio_dicom-2026.1.14-py3-none-any.whl | 18,955 | 05/43/4046428c749c095ee25ca69981068592d4ec897beedff085d427e499dbf2/karrio_dicom-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | d11304316b137b8721b47861af8fa2da | 46d62ca13b60a00785cfad76fe1f770c49e338f2a6ebc1c8c6a6d92febe45d72 | 05434046428c749c095ee25ca69981068592d4ec897beedff085d427e499dbf2 | LGPL-3.0 | [] | 85 |
2.4 | karrio-dhl-universal | 2026.1.14 | DHL Universal Tracking karrio extension | # karrio.dhl_universal
This package is a DHL Universal Tracking extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.dhl_universal
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.dhl_universal.settings import Settings
# Initialize a carrier gateway
dhl_universal = karrio.gateway["dhl_universal"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:18.685695 | karrio_dhl_universal-2026.1.14-py3-none-any.whl | 9,421 | 4a/42/cb4760e2fa82758201e5b0e14c71bb96cbee53e828e93d8d77e9132c2855/karrio_dhl_universal-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 66d86b6063eaf938799dc22576afb775 | 59c96a762be6b9e20a9210b3055daafdef756a6d5bcbb7dc44b220abf64e0c0a | 4a42cb4760e2fa82758201e5b0e14c71bb96cbee53e828e93d8d77e9132c2855 | LGPL-3.0 | [] | 78 |
2.4 | karrio-dhl-poland | 2026.1.14 | Karrio - DHL Parcel Poland Shipping Extension | # karrio.dhl_poland
This package is a DHL Parcel Poland extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.dhl_poland
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.dhl_poland.settings import Settings
# Initialize a carrier gateway
dhl_poland = karrio.gateway["dhl_poland"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:17.459342 | karrio_dhl_poland-2026.1.14-py3-none-any.whl | 106,078 | 41/58/3454c22fec1481df2713ec4270ce5fc0d58e2b9a6375b2b8a50d0d9bc171/karrio_dhl_poland-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | e7b2f83399a2af6a3c9346196c6ca0b4 | f67dd889fb3d6cb34c0430cee95f02247bca8607da3eb59decb279895d18dc56 | 41583454c22fec1481df2713ec4270ce5fc0d58e2b9a6375b2b8a50d0d9bc171 | LGPL-3.0 | [] | 75 |
2.4 | karrio-dhl-parcel-de | 2026.1.14 | Karrio - DHL Germany Shipping Extension |
# karrio.dhl_parcel_de
This package is a DHL Germany extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.dhl_parcel_de
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.dhl_parcel_de.settings import Settings
# Initialize a carrier gateway
dhl_parcel_de = karrio.gateway["dhl_parcel_de"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:16.491940 | karrio_dhl_parcel_de-2026.1.14-py3-none-any.whl | 66,102 | fe/31/476f0206314687bd9a67068af6021495d512f4313489595264e5a5c1021b/karrio_dhl_parcel_de-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 30d6afdd3384c5b6815184f5ba48bcda | 1b66af130971cf7936eb7ff969affb51af6237f5184c6a73ab882b19edcaf912 | fe31476f0206314687bd9a67068af6021495d512f4313489595264e5a5c1021b | LGPL-3.0 | [] | 77 |
2.4 | karrio-dhl-express | 2026.1.14 | Karrio - DHL Express Shipping Extension | # karrio.dhl_express
This package is a DHL Express extension of the [karrio](https://pypi.org/project/karrio) multi-carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.dhl_express
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.dhl_express.settings import Settings
# Initialize a carrier gateway
dhl_express = karrio.gateway["dhl_express"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests.
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:14.950193 | karrio_dhl_express-2026.1.14-py3-none-any.whl | 2,955,773 | e5/b7/80c4999dda8a29e7480d3c9aee2928a236227ee345b81afaaac8feaf8881/karrio_dhl_express-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | dfc4d830329d2fb55f6da1f1b79c8990 | 2eabaeea298a4967510159d06b83f82b9b10fef68cb4e6fc437563edaa32a83e | e5b780c4999dda8a29e7480d3c9aee2928a236227ee345b81afaaac8feaf8881 | LGPL-3.0 | [] | 79 |
2.4 | karrio-colissimo | 2026.1.14 | Karrio - Colissimo Shipping Extension |
# karrio.colissimo
This package is a Colissimo extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.colissimo
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.colissimo.settings import Settings
# Initialize a carrier gateway
colissimo = karrio.gateway["colissimo"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:13.966813 | karrio_colissimo-2026.1.14-py3-none-any.whl | 15,678 | 19/33/cca54bc1efa9cb62db070b5a5d73a16c808f4d2352914cb1943264275ae1/karrio_colissimo-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | b472f050a03a85aaba840cfb010550a2 | 700ee1c17be39faf00f17f7e530ce87fdfddb99db4644769a1b75d3c67f91dcc | 1933cca54bc1efa9cb62db070b5a5d73a16c808f4d2352914cb1943264275ae1 | LGPL-3.0 | [] | 82 |
2.4 | karrio-cli | 2026.1.14 | Command line interface for Karrio | # Karrio CLI Tools
This folder contains CLI tools for the Karrio project.
## Codegen
The `codegen` command provides utilities for code generation, particularly for transforming JSON schemas into Python code using jstruct.
### Transform
The `transform` command converts Python code generated by quicktype (using dataclasses) into code that uses attrs and jstruct decorators.
```bash
# Transform from stdin to stdout
cat input.py | karrio codegen transform > output.py
# Transform from file to file
karrio codegen transform input.py output.py
# Transform from file to stdout
karrio codegen transform input.py
# Disable appending 'Type' to class names
karrio codegen transform input.py output.py --no-append-type-suffix
```
The `transform` command makes the following changes:
- Replaces `from dataclasses import dataclass` with explicit imports: `import attr`, `import jstruct`, `import typing`
- Removes any `from typing import ...` lines entirely
- Replaces `@dataclass` with `@attr.s(auto_attribs=True)`
- Appends 'Type' to all class names (unless they already end with 'Type') by default
- Replaces complex type annotations with jstruct equivalents
- Ensures there are no duplicate import statements
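The rules above can be sketched as a line-by-line rewrite. The helper below is a hypothetical illustration of those rules, not the actual karrio implementation (it handles a subset: the import swap, decorator replacement, typing qualification, and the 'Type' suffix):

```python
import re

# Hypothetical sketch of the transform rules -- NOT the actual karrio code.
# Rewrites quicktype's dataclass output into the attrs + jstruct style.
def transform(source: str, append_type_suffix: bool = True) -> str:
    out = []
    for line in source.splitlines():
        if line.startswith("from dataclasses import"):
            # Replace the dataclasses import with the explicit trio of imports.
            out.extend(["import attr", "import jstruct", "import typing"])
            continue
        if line.startswith("from typing import"):
            continue  # typing members are referenced via the typing module
        if line.strip() == "@dataclass":
            out.append("@attr.s(auto_attribs=True)")
            continue
        # Qualify bare typing names so the dropped import is not missed.
        line = re.sub(r"(?<!typing\.)\b(Optional|List|Dict|Any)\b", r"typing.\1", line)
        if append_type_suffix:
            # Append 'Type' to class names unless they already end with it.
            line = re.sub(r"^class (\w+?)(Type)?:", r"class \1Type:", line)
        out.append(line)
    return "\n".join(out)

before = '''from dataclasses import dataclass
from typing import Optional

@dataclass
class Person:
    name: Optional[str] = None
'''
print(transform(before))
```

The real command additionally deduplicates imports and maps complex annotations onto jstruct equivalents, which this sketch omits.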
### Generate
The `generate` command generates Python code with jstruct from a JSON schema file using quicktype.
```bash
# Generate Python code with jstruct from a JSON schema
karrio codegen generate --src=schema.json --out=output.py
# Specify Python version
karrio codegen generate --src=schema.json --out=output.py --python-version=3.8
# Generate without --just-types
karrio codegen generate --src=schema.json --out=output.py --just-types=false
# Disable appending 'Type' to class names
karrio codegen generate --src=schema.json --out=output.py --no-append-type-suffix
```
### Create Tree
The `create-tree` command generates a Python code tree from a class definition. It's useful for visualizing the structure of complex nested objects and generating initialization code templates.
```bash
# Generate a tree for a class
karrio codegen create-tree --module=karrio.schemas.allied_express.label_request --class-name=LabelRequest
# Generate a tree with a module alias
karrio codegen create-tree --module=karrio.schemas.allied_express.label_request --class-name=LabelRequest --module-alias=allied
```
Example output:
```python
LabelRequestType(
bookedBy=None,
account=None,
instructions=None,
itemCount=None,
items=[
ItemType(
dangerous=None,
height=None,
itemCount=None,
length=None,
volume=None,
weight=None,
width=None,
)
],
jobStopsP=JobStopsType(
companyName=None,
contact=None,
emailAddress=None,
geographicAddress=GeographicAddressType(
address1=None,
address2=None,
country=None,
postCode=None,
state=None,
suburb=None,
),
phoneNumber=None,
),
jobStopsD=JobStopsType(...),
referenceNumbers=[],
serviceLevel=None,
volume=None,
weight=None,
)
```
## Migrating carrier connector generate scripts
To migrate carrier connector generate scripts to use the new codegen command, run:
```bash
python karrio/modules/cli/commands/migrate_generate_scripts.py /path/to/karrio/workspace
```
This will update all generate scripts in the carrier connectors to use the new karrio codegen command.
## JStruct Overview
JStruct is a Python library for nested object models that offers serialization to and deserialization from Python dictionaries. It leverages the `attrs` library to define structs without boilerplate.
Here's an example of the transformed code:
```python
import attr
import jstruct
import typing
@attr.s(auto_attribs=True)
class PersonType:
first_name: typing.Optional[str] = None
last_name: typing.Optional[str] = None
@attr.s(auto_attribs=True)
class RoleModelsType:
scientists: typing.Optional[typing.List[PersonType]] = jstruct.JList[PersonType]
```
For more information, see the [JStruct README](https://github.com/karrioapi/jstruct/blob/main/README.md).
| text/markdown | null | Karrio Team <hello@karrio.io> | null | null | MIT | null | [
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"typer>=0.7.0",
"rich>=13.0.0",
"requests>=2.28.0",
"importlib-metadata>=4.12.0",
"jstruct>=0.1.0",
"generateDS",
"tabulate",
"Jinja2",
"fastapi",
"uvicorn",
"python-multipart",
"python-dotenv; extra == \"dev\"",
"karrio; extra == \"dev\"",
"beautifulsoup4; extra == \"dev\"",
"pdfplumber; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio",
"Bug Tracker, https://github.com/karrioapi/karrio/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:13.049383 | karrio_cli-2026.1.14-py3-none-any.whl | 97,589 | 21/7a/f7196251eec73d8df6cec96507be040e532cb765891e012311c618b13c53/karrio_cli-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | a70090d07828484640cf02d0723e07e3 | e851780f72183a33f1f422ca9dcd44284bc26bd227978294178b193f2f4aad2c | 217af7196251eec73d8df6cec96507be040e532cb765891e012311c618b13c53 | null | [] | 79 |
2.4 | karrio-chronopost | 2026.1.14 | Karrio - Chronopost Shipping Extension |
# karrio.chronopost
This package is a Chronopost extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.chronopost
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.chronopost.settings import Settings
# Initialize a carrier gateway
chronopost = karrio.gateway["chronopost"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:11.667738 | karrio_chronopost-2026.1.14-py3-none-any.whl | 149,724 | b1/d0/2365a4b38257d706f1355ada454a6bccf16091ca60ec7b29f19ceb30c76a/karrio_chronopost-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 4708b29628eff99f5a2a0d69e3785970 | b46e754d938f16aac71253e2da18cb325565344889aca5fd17c1ddb668d5f52e | b1d02365a4b38257d706f1355ada454a6bccf16091ca60ec7b29f19ceb30c76a | LGPL-3.0 | [] | 83 |
2.4 | karrio-canpar | 2026.1.14 | Karrio - Canpar Shipping Extension | # karrio.canpar
This package is a Canpar extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.canpar
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.canpar.settings import Settings
# Initialize a carrier gateway
canpar = karrio.gateway["canpar"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:10.552423 | karrio_canpar-2026.1.14-py3-none-any.whl | 210,081 | 49/1d/d613beaa32d47bfcc6870330dd8688c579383cf5a2416d7308f8b81bc40d/karrio_canpar-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | d6c7d9b2a3b9d5b69c69518c9d73d2b3 | 5fb9e094f7c4d4449f539de0a9af5e1657670bca02120223a04e64e50aeeca80 | 491dd613beaa32d47bfcc6870330dd8688c579383cf5a2416d7308f8b81bc40d | LGPL-3.0 | [] | 82 |
2.4 | karrio-canadapost | 2026.1.14 | Karrio - Canada Post Shipping Extension | # karrio.canadapost
This package is a Canada Post extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.canadapost
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.canadapost.settings import Settings
# Initialize a carrier gateway
canadapost = karrio.gateway["canadapost"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:08.675309 | karrio_canadapost-2026.1.14-py3-none-any.whl | 316,153 | 9f/16/65fdeccf05d5bd407301a6b922df376b2ce751f023947ce5e0c8581271ac/karrio_canadapost-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 2f406692fd5499bf079de1add7a73589 | 79222aad89b0607ce197192836ed932358dbaf2e3a9c6c03b3d4dbc8109cbbb2 | 9f1665fdeccf05d5bd407301a6b922df376b2ce751f023947ce5e0c8581271ac | LGPL-3.0 | [] | 77 |
2.4 | karrio-bpost | 2026.1.14 | Karrio - Belgian Post Shipping Extension |
# karrio.bpost
This package is a Belgian Post extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.bpost
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.bpost.settings import Settings
# Initialize a carrier gateway
bpost = karrio.gateway["bpost"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:07.391729 | karrio_bpost-2026.1.14-py3-none-any.whl | 219,403 | 8c/1b/4d124f86bf40e428c139b80720fdcd189328edc48621efa92d88277cef9d/karrio_bpost-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | b5ba10a6f203495133bb373ff1aad0cf | a9d7498bcffc459880e381033d3ab10529565dd4dd50dd53730c458c47dbbd4d | 8c1b4d124f86bf40e428c139b80720fdcd189328edc48621efa92d88277cef9d | LGPL-3.0 | [] | 83 |
2.4 | karrio-boxknight | 2026.1.14 | Karrio - BoxKnight Shipping Extension |
# karrio.boxknight
This package is a BoxKnight extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.boxknight
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.boxknight.settings import Settings
# Initialize a carrier gateway
boxknight = karrio.gateway["boxknight"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:06.489502 | karrio_boxknight-2026.1.14-py3-none-any.whl | 14,770 | a9/ac/8048ca7b28fca59b94b28981f69b7b06f263869f1c1e64be6c376997a5bf/karrio_boxknight-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 12d80804a8494814e008d0ff460c5928 | ae4ae413c8342f329879f29c64849d15ed57ba82445968b1da10e6adeccce68e | a9ac8048ca7b28fca59b94b28981f69b7b06f263869f1c1e64be6c376997a5bf | LGPL-3.0 | [] | 82 |
2.4 | karrio-australiapost | 2026.1.14 | Karrio - Australia Post Shipping Extension |
# karrio.australiapost
This package is an Australia Post extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.australiapost
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.australiapost.settings import Settings
# Initialize a carrier gateway
australiapost = karrio.gateway["australiapost"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:05.666238 | karrio_australiapost-2026.1.14-py3-none-any.whl | 24,312 | de/42/50821e94f76098d1fa01f31645143d987022cb47683117bac6e9f2539b69/karrio_australiapost-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 3fb624a7f66c6c1f7a96e579e846cae7 | 2b2c85fc852999149c7a58eff943cdb9ad8c1407678a67cba4fe9ff4f6c9ade8 | de4250821e94f76098d1fa01f31645143d987022cb47683117bac6e9f2539b69 | LGPL-3.0 | [] | 76 |
2.4 | karrio-asendia-us | 2026.1.14 | Karrio - Asendia US Shipping Extension |
# karrio.asendia_us
This package is an Asendia US extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.asendia_us
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.asendia_us.settings import Settings
# Initialize a carrier gateway
asendia_us = karrio.gateway["asendia_us"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:04.808904 | karrio_asendia_us-2026.1.14-py3-none-any.whl | 15,754 | 11/cd/72fc4c6988f74beebccaf5f42c3b1e40817b9db3f425f89922d4d25d9fa2/karrio_asendia_us-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 2c2178151c13b58fb156914ea8f5ebfe | 42b573e9b01934029f61e213653cc6953c1b0ce6d0f91ef24b001b777e2e76c6 | 11cd72fc4c6988f74beebccaf5f42c3b1e40817b9db3f425f89922d4d25d9fa2 | LGPL-3.0 | [] | 81 |
2.4 | karrio-asendia | 2026.1.14 | Karrio - Asendia Shipping Extension | # karrio.asendia
This package is an Asendia extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.11+`
## Installation
```bash
pip install karrio.asendia
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.asendia.settings import Settings
# Initialize a carrier gateway
asendia = karrio.gateway["asendia"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:03.917467 | karrio_asendia-2026.1.14-py3-none-any.whl | 22,790 | 18/74/9d36c2a3d767197552f0cdb1e5c56339be1ff0c82c22ca50cb7c8bd4189c/karrio_asendia-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 5fc0d9aa72984ac7faec1eea90ea5e73 | ef122af99d89861f3d1733a61558e95e10844d20ff14dc504ca7118ac2844c49 | 18749d36c2a3d767197552f0cdb1e5c56339be1ff0c82c22ca50cb7c8bd4189c | LGPL-3.0 | [] | 77 |
2.4 | karrio-aramex | 2026.1.14 | Karrio - Aramex Shipping extension | # karrio.aramex
This package is an Aramex extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.aramex
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.aramex.settings import Settings
# Initialize a carrier gateway
aramex = karrio.gateway["aramex"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:02.805940 | karrio_aramex-2026.1.14-py3-none-any.whl | 146,430 | ff/6c/44a772271a134b5b9e5c087dc4c214d37e089dd04034609acb3ed3f8a21f/karrio_aramex-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 68469bed943168866199aa42786a25f8 | a7d05484758bf05384de1228aa7f3f3f3d59a43f00145d4670781c777084ed9e | ff6c44a772271a134b5b9e5c087dc4c214d37e089dd04034609acb3ed3f8a21f | LGPL-3.0 | [] | 79 |
2.4 | voice-vibecoder | 2.12.3 | Olaf The Vibecoder — a voice-controlled coding assistant using OpenAI Realtime API + Claude Code | # Olaf The Vibecoder
A voice-controlled coding assistant that combines OpenAI's Realtime API for voice conversation with Claude Code for autonomous coding tasks.
## Quick Start
```bash
# Install and run
uv tool install voice-vibecoder
vibecoder
```
On first launch, a setup wizard will guide you through connecting your OpenAI or Azure OpenAI account. Press **s** anytime to change settings.
## Features
- Voice conversation via OpenAI Realtime API (supports OpenAI and Azure)
- Multi-instance Claude Code agents running in separate git worktrees
- Git diff visualization with file tree and colorized hunks
- Session persistence across restarts
- Push-to-talk and voice activity detection modes
- Multi-language support (English, Norwegian, Swedish)
## Requirements
- Python 3.10+
- An OpenAI API key with Realtime API access
- [PortAudio](http://www.portaudio.com/) for microphone input (`brew install portaudio` on macOS)
- [Claude Code](https://docs.anthropic.com/en/docs/claude-code) CLI installed
## Usage as a Library
Embed the voice coding screen in your own Textual app:
```python
from pathlib import Path
from voice_vibecoder import VoiceCodingApp, VoiceCodingScreen, VoiceConfig
# Standalone app with custom config
config = VoiceConfig(
app_name="My Coding Assistant",
config_dir=Path.home() / ".my-app",
data_dir=Path.home() / ".my-app",
log_dir=Path.home() / ".my-app" / "logs",
on_startup=lambda send: send("Hello from startup!"),
)
VoiceCodingApp(config=config).run()
# Or embed the screen in your own Textual app
screen = VoiceCodingScreen(repo_root, config=config)
```
## Configuration
Settings are stored using platform-standard directories:
| Platform | Config | Data | Logs |
|----------|--------|------|------|
| macOS | `~/Library/Application Support/voice-vibecoder/` | `~/Library/Application Support/voice-vibecoder/` | `~/Library/Logs/voice-vibecoder/` |
| Linux | `~/.config/voice-vibecoder/` | `~/.local/share/voice-vibecoder/` | `~/.local/state/voice-vibecoder/log/` |
Git worktrees are created as siblings of your project directory (e.g., `../my-project-feat-login`).
## Keyboard Shortcuts
| Key | Action |
|-----|--------|
| `q` / `Esc` | Quit |
| `m` | Toggle mute |
| `s` | Settings |
| `c` | Cancel current Claude task |
| `Space` | Push-to-talk (when in PTT mode) |
## License
MIT
| text/markdown | null | Morten Jansrud <morten@jansrud.com> | null | null | null | assistant, claude, coding, openai, realtime, tui, vibecoder, voice | [
"Development Status :: 3 - Alpha",
"Environment :: Console",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Software Development"
] | [] | null | null | >=3.10 | [] | [] | [] | [
"claude-agent-sdk>=0.1.30",
"cryptography>=42.0",
"numpy>=1.24",
"platformdirs>=4.0",
"rich>=13.0",
"sounddevice>=0.4",
"textual>=0.50",
"websockets>=12"
] | [] | [] | [] | [
"Homepage, https://github.com/snokam/voice-vibecoder",
"Repository, https://github.com/snokam/voice-vibecoder",
"Issues, https://github.com/snokam/voice-vibecoder/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:02.393001 | voice_vibecoder-2.12.3.tar.gz | 83,684 | ae/4f/8b327a78847f3dae7698c3c033bd041113723679adfdd4649059f2ea2674/voice_vibecoder-2.12.3.tar.gz | source | sdist | null | false | 6424b1b07862f3d59b0f783f16df3adf | a0e178a7d6fa9552ceb2ac4ccaa09ba373be6e9b908c1066f11839e762fddeb1 | ae4f8b327a78847f3dae7698c3c033bd041113723679adfdd4649059f2ea2674 | MIT | [
"LICENSE"
] | 135 |
2.4 | karrio-amazon-shipping | 2026.1.14 | Karrio - Amazon Shipping Extension |
# karrio.amazon_shipping
This package is an Amazon Shipping extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.amazon_shipping
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.amazon_shipping.settings import Settings
# Initialize a carrier gateway
amazon_shipping = karrio.gateway["amazon_shipping"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:40:00.629350 | karrio_amazon_shipping-2026.1.14-py3-none-any.whl | 21,796 | 78/4f/604b4009ad1f937899a3caa48f029e1896f6c5435d6eb16e6ba26150fc50/karrio_amazon_shipping-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 30c77b1f0487560c2511b3e06a170e71 | fadf7a40388f89bc6469d61a364b8c291f4e85dafaaffd9272cffddb6c1e791e | 784f604b4009ad1f937899a3caa48f029e1896f6c5435d6eb16e6ba26150fc50 | LGPL-3.0 | [] | 79 |
2.4 | karrio-allied-express-local | 2026.1.14 | Karrio - Allied Express Local Shipping Extension |
# karrio.allied_express_local
This package is an Allied Express Local extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.allied_express_local
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.allied_express_local.settings import Settings
# Initialize a carrier gateway
allied_express_local = karrio.gateway["allied_express_local"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:39:59.859373 | karrio_allied_express_local-2026.1.14-py3-none-any.whl | 18,384 | ab/d1/ae5ee0987998566e58d4bb4d3602a7c66f22c453442e29958f1411e8cdf5/karrio_allied_express_local-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 5fcbccceed75262c55027e30eb6e2def | 33af81922545621e7bd83410cb77025778198cb246b76ebdefd9ce457dd862dc | abd1ae5ee0987998566e58d4bb4d3602a7c66f22c453442e29958f1411e8cdf5 | LGPL-3.0 | [] | 80 |
2.4 | karrio-allied-express | 2026.1.14 | Karrio - Allied Express Shipping Extension |
# karrio.allied_express
This package is an Allied Express extension of the [karrio](https://pypi.org/project/karrio) multi carrier shipping SDK.
## Requirements
`Python 3.7+`
## Installation
```bash
pip install karrio.allied_express
```
## Usage
```python
import karrio.sdk as karrio
from karrio.mappers.allied_express.settings import Settings
# Initialize a carrier gateway
allied_express = karrio.gateway["allied_express"].create(
Settings(
...
)
)
```
Check the [Karrio Multi-carrier SDK docs](https://docs.karrio.io) for Shipping API requests
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.7 | [] | [] | [] | [
"karrio"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:39:58.737651 | karrio_allied_express-2026.1.14-py3-none-any.whl | 17,317 | 10/cb/2dc2ab91d18a49cd384bba6fd3265735197ee9bebc81415e47fc7d28ef1d/karrio_allied_express-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | bc74b66fc059acfb718bd717f7451f51 | 8f96c35eb0f06186c477d71253475582f5df25cc0cf88241c5f4a8290187faa8 | 10cb2dc2ab91d18a49cd384bba6fd3265735197ee9bebc81415e47fc7d28ef1d | LGPL-3.0 | [] | 80 |
2.4 | karrio | 2026.1.14 | Multi-carrier shipping API integration with python | # <a href="https://karrio.io" target="_blank"><img alt="Karrio" src="https://docs.karrio.io/img/logo.svg" height="50px" /></a>
[](https://github.com/karrioapi/karrio/actions/workflows/tests.yml)
[](./LICENSE)
[](https://www.codacy.com/gh/karrio/karrio/dashboard?utm_source=github.com&utm_medium=referral&utm_content=karrio/karrio&utm_campaign=Badge_Grade)
karrio is a multi-carrier shipping SDK.
The key features are:
- **Unified API**: A standardized set of models representing the common shipping data (`Address`, `Parcel`, `Shipment`...)
- **Intuitive API**: A library that abstracts and unifies the typical shipping API services (`Rating`, `Shipping`, `Tracking`...)
- **Multi-carrier**: Integrate karrio once and connect to multiple shipping carrier APIs
- **Custom carrier**: A framework to integrate a shipping carrier services within hours instead of months
## Requirements
Python 3.11+
## Installation
```bash
# install karrio core
pip install karrio
# e.g. install the karrio canadapost extension
pip install karrio.canadapost
```
<details>
<summary>Additional carrier extensions</summary>
- `karrio.aramex`
- `karrio.australiapost`
- `karrio.canadapost`
- `karrio.canpar`
- `karrio.dhl-express`
- `karrio.dhl-universal`
- `karrio.dicom`
- `karrio.fedex`
- `karrio.purolator`
- `karrio.royalmail`
- `karrio.sendle`
- `karrio.sf-express`
- `karrio.tnt`
- `karrio.ups`
- `karrio.usps`
- `karrio.usps-international`
- `karrio.yanwen`
- `karrio.yunexpress`
</details>
## Usage
<details>
<summary>Rates Fetching</summary>
- Fetch shipping rates
```python
import karrio.sdk as karrio
from karrio.core.models import Address, Parcel, RateRequest
from karrio.mappers.canadapost.settings import Settings
# Initialize a carrier gateway
canadapost = karrio.gateway["canadapost"].create(
Settings(
username="6e93d53968881714",
password="0bfa9fcb9853d1f51ee57a",
customer_number="2004381",
contract_id="42708517",
test=True
)
)
# Fetching shipment rates
# Provide the shipper's address
shipper = Address(
postal_code="V6M2V9",
city="Vancouver",
country_code="CA",
state_code="BC",
address_line1="5840 Oak St"
)
# Provide the recipient's address
recipient = Address(
postal_code="E1C4Z8",
city="Moncton",
country_code="CA",
state_code="NB",
residential=False,
address_line1="125 Church St"
)
# Specify your package dimensions and weight
parcel = Parcel(
height=3.0,
length=6.0,
width=3.0,
weight=0.5,
weight_unit='KG',
dimension_unit='CM'
)
# Prepare a rate request
rate_request = RateRequest(
shipper=shipper,
recipient=recipient,
parcels=[parcel],
services=["canadapost_xpresspost"],
)
# Send a rate request using a carrier gateway
response = karrio.Rating.fetch(rate_request).from_(canadapost)
# Parse the returned response
rates, messages = response.parse()
print(rates)
# [
# RateDetails(
# carrier_name="canadapost",
# carrier_id="canadapost",
# currency="CAD",
# transit_days=2,
# service="canadapost_xpresspost",
# discount=1.38,
# base_charge=12.26,
# total_charge=13.64,
# duties_and_taxes=0.0,
# extra_charges=[
# ChargeDetails(name="Automation discount", amount=-0.37, currency="CAD"),
# ChargeDetails(name="Fuel surcharge", amount=1.75, currency="CAD"),
# ],
# meta=None,
# id=None,
# )
# ]
```
</details>
## Resources
- [**Documentation**](https://docs.karrio.io)
- [**Community Discussions**](https://github.com/karrioapi/karrio/discussions)
- [**Issue Tracker**](https://github.com/karrioapi/karrio/issues)
- [**Blog**](https://docs.karrio.io/blog)
> [Join us on Discord](https://discord.gg/gS88uE7sEx)
| text/markdown | null | karrio <hello@karrio.io> | null | null | null | null | [
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"attrs",
"jstruct",
"xmltodict",
"lxml",
"lxml-stubs",
"py-soap",
"Pillow",
"phonenumbers",
"python-barcode",
"PyPDF2",
"toml",
"loguru"
] | [] | [] | [] | [
"Homepage, https://github.com/karrioapi/karrio"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:39:57.048108 | karrio-2026.1.14-py3-none-any.whl | 206,239 | 96/4f/435020414308c6a173aff47abc84b6584d55511fff745da582c5a2ba7fcd/karrio-2026.1.14-py3-none-any.whl | py3 | bdist_wheel | null | false | 39d00ab16ba7ceb54117cf86772a67ce | 01e3cffc08bc197b5f061c67c0b349b174d663c7aacf50c9a6f20415dcf3dc67 | 964f435020414308c6a173aff47abc84b6584d55511fff745da582c5a2ba7fcd | LGPL-3.0 | [] | 85 |
2.1 | odoo-addon-account-invoice-triple-discount | 18.0.1.0.0.9 | Manage triple discount on invoice lines | .. image:: https://odoo-community.org/readme-banner-image
:target: https://odoo-community.org/get-involved?utm_source=readme
:alt: Odoo Community Association
===============================
Account Invoice Triple Discount
===============================
..
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! This file is generated by oca-gen-addon-readme !!
!! changes will be overwritten. !!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! source digest: sha256:96fbed1626bb94b34b29d3287cbf750e394cae6f90526ddba1450a75f4c45b49
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
.. |badge1| image:: https://img.shields.io/badge/maturity-Beta-yellow.png
:target: https://odoo-community.org/page/development-status
:alt: Beta
.. |badge2| image:: https://img.shields.io/badge/license-AGPL--3-blue.png
:target: http://www.gnu.org/licenses/agpl-3.0-standalone.html
:alt: License: AGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Faccount--invoicing-lightgray.png?logo=github
:target: https://github.com/OCA/account-invoicing/tree/18.0/account_invoice_triple_discount
:alt: OCA/account-invoicing
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
:target: https://translation.odoo-community.org/projects/account-invoicing-18-0/account-invoicing-18-0-account_invoice_triple_discount
:alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runboat-Try%20me-875A7B.png
:target: https://runboat.odoo-community.org/builds?repo=OCA/account-invoicing&target_branch=18.0
:alt: Try me on Runboat
|badge1| |badge2| |badge3| |badge4| |badge5|
This module allows you to apply three successive discounts on each
invoice line.
**Table of contents**
.. contents::
:local:
Usage
=====
Create a new invoice and add discounts in any of the three discount
fields provided. They are applied in order of precedence: discount 2 is
calculated on the result of discount 1, and discount 3 on the result of
discount 2.
For example, let's halve the price at each step:
Unit price: 600.00 ->
- Disc. 1 = 50% -> Amount = 300.00
- Disc. 2 = 50% -> Amount = 150.00
- Disc. 3 = 50% -> Amount = 75.00
You can also use negative values to apply a surcharge instead of a discount:
Unit price: 600.00 ->
- Disc. 1 = 50% -> Amount = 300.00
- Disc. 2 = -5% -> Amount = 315.00
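The successive application shown above can be sketched in Python (purely illustrative; this is not the module's actual implementation):

```python
def apply_discounts(unit_price, discounts):
    """Apply successive percentage discounts; negative values act as surcharges."""
    amount = unit_price
    for pct in discounts:
        amount *= 1 - pct / 100.0
    return round(amount, 2)  # currency rounding

print(apply_discounts(600.0, [50, 50, 50]))  # 75.0
print(apply_discounts(600.0, [50, -5]))      # 315.0
```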
Bug Tracker
===========
Bugs are tracked on `GitHub Issues <https://github.com/OCA/account-invoicing/issues>`_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing a detailed and welcomed
`feedback <https://github.com/OCA/account-invoicing/issues/new?body=module:%20account_invoice_triple_discount%0Aversion:%2018.0%0A%0A**Steps%20to%20reproduce**%0A-%20...%0A%0A**Current%20behavior**%0A%0A**Expected%20behavior**>`_.
Do not contact contributors directly about support or help with technical issues.
Credits
=======
Authors
-------
* QubiQ
* Tecnativa
* GRAP
Contributors
------------
- David Vidal <david.vidal@tecnativa.com>
- Pedro M. Baeza <pedro.baeza@tecnativa.com>
- Nikul Chaudhary <nikulchaudhary2112@gmail.com>
- `Aion Tech <https://aiontech.company/>`__:
- Simone Rubino <simone.rubino@aion-tech.it>
- Laurent Mignon <laurent.mignon@acsone.eu>
- Akim Juillerat <akim.juillerat@camptocamp.com>
Maintainers
-----------
This module is maintained by the OCA.
.. image:: https://odoo-community.org/logo.png
:alt: Odoo Community Association
:target: https://odoo-community.org
OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.
This module is part of the `OCA/account-invoicing <https://github.com/OCA/account-invoicing/tree/18.0/account_invoice_triple_discount>`_ project on GitHub.
You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
| text/x-rst | QubiQ, Tecnativa, GRAP, Odoo Community Association (OCA) | support@odoo-community.org | null | null | AGPL-3 | null | [
"Programming Language :: Python",
"Framework :: Odoo",
"Framework :: Odoo :: 18.0",
"License :: OSI Approved :: GNU Affero General Public License v3"
] | [] | https://github.com/OCA/account-invoicing | null | >=3.10 | [] | [] | [] | [
"odoo==18.0.*"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.12.3 | 2026-02-21T02:39:25.273174 | odoo_addon_account_invoice_triple_discount-18.0.1.0.0.9-py3-none-any.whl | 70,548 | a3/d5/13c5f9d28a574fba8396a6cda892e7e1aec09a2eb2c03e080f1b309dc919/odoo_addon_account_invoice_triple_discount-18.0.1.0.0.9-py3-none-any.whl | py3 | bdist_wheel | null | false | 7d6b1cc02bd1a22151e8fde9ce2fd672 | 8b441d87bd714f9690214be046360b3c03cf62ac04ead43717a1a95dbf7ed4af | a3d513c5f9d28a574fba8396a6cda892e7e1aec09a2eb2c03e080f1b309dc919 | null | [] | 75 |
2.1 | odoo-addon-account-invoice-pricelist | 18.0.1.0.3.1 | Add partner pricelist on invoices | .. image:: https://odoo-community.org/readme-banner-image
:target: https://odoo-community.org/get-involved?utm_source=readme
:alt: Odoo Community Association
===============================
Account - Pricelist on Invoices
===============================
..
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! This file is generated by oca-gen-addon-readme !!
!! changes will be overwritten. !!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! source digest: sha256:d5d32486340a48ed58a998c2f5dffb72adf89db5f6c05ef5bfca2831985ddff4
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
.. |badge1| image:: https://img.shields.io/badge/maturity-Beta-yellow.png
:target: https://odoo-community.org/page/development-status
:alt: Beta
.. |badge2| image:: https://img.shields.io/badge/license-AGPL--3-blue.png
:target: http://www.gnu.org/licenses/agpl-3.0-standalone.html
:alt: License: AGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Faccount--invoicing-lightgray.png?logo=github
:target: https://github.com/OCA/account-invoicing/tree/18.0/account_invoice_pricelist
:alt: OCA/account-invoicing
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
:target: https://translation.odoo-community.org/projects/account-invoicing-18-0/account-invoicing-18-0-account_invoice_pricelist
:alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runboat-Try%20me-875A7B.png
:target: https://runboat.odoo-community.org/builds?repo=OCA/account-invoicing&target_branch=18.0
:alt: Try me on Runboat
|badge1| |badge2| |badge3| |badge4| |badge5|
- Adds a stored pricelist field on invoices, related to the partner's pricelist;
- Uses this pricelist when invoice lines are added manually;
- Applies pricelist rules in a multi-currency context;
- Allows grouping by pricelist in the account.invoice view;
|image|
For further information, please visit:
- https://www.odoo.com/forum/help-1
.. |image| image:: https://raw.githubusercontent.com/OCA/account-invoicing/18.0/account_invoice_pricelist/static/src/description/screenshot_group_by.png
**Table of contents**
.. contents::
:local:
Bug Tracker
===========
Bugs are tracked on `GitHub Issues <https://github.com/OCA/account-invoicing/issues>`_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing a detailed and welcomed
`feedback <https://github.com/OCA/account-invoicing/issues/new?body=module:%20account_invoice_pricelist%0Aversion:%2018.0%0A%0A**Steps%20to%20reproduce**%0A-%20...%0A%0A**Current%20behavior**%0A%0A**Expected%20behavior**>`_.
Do not contact contributors directly about support or help with technical issues.
Credits
=======
Authors
-------
* GRAP
* Therp BV
* Tecnativa
Contributors
------------
- Sylvain LE GAL (https://twitter.com/legalsylvain)
- Holger Brunn <hbrunn@therp.nl>
- Sergio Teruel <sergio.teruel@tecnativa.com>
- Raphaël Valyi <rvalyi@akretion.com>
- Alberto Martín <alberto.martin@guadaltech.es>
- Nikul Chaudhary <nikulchaudhary2112@gmail.com>
- Manuel Regidor <manuel.regidor@sygel.es>
- `APSL-Nagarro <https://www.apsl.tech>`__:
- Antoni Marroig <amarroig@apsl.net>
Maintainers
-----------
This module is maintained by the OCA.
.. image:: https://odoo-community.org/logo.png
:alt: Odoo Community Association
:target: https://odoo-community.org
OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.
This module is part of the `OCA/account-invoicing <https://github.com/OCA/account-invoicing/tree/18.0/account_invoice_pricelist>`_ project on GitHub.
You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
| text/x-rst | GRAP,Therp BV,Tecnativa,Odoo Community Association (OCA) | support@odoo-community.org | null | null | AGPL-3 | null | [
"Programming Language :: Python",
"Framework :: Odoo",
"Framework :: Odoo :: 18.0",
"License :: OSI Approved :: GNU Affero General Public License v3"
] | [] | https://github.com/OCA/account-invoicing | null | >=3.10 | [] | [] | [] | [
"odoo==18.0.*"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.12.3 | 2026-02-21T02:39:21.201824 | odoo_addon_account_invoice_pricelist-18.0.1.0.3.1-py3-none-any.whl | 118,734 | d7/19/ca4c39ba6ee94d2a84f6a42b363bc5cce4589293396bdfd8b353d924d66f/odoo_addon_account_invoice_pricelist-18.0.1.0.3.1-py3-none-any.whl | py3 | bdist_wheel | null | false | c0124eac8da402097b953346bc2960aa | 8c3f978b0e2a6c1967f7a6273bc9f87e7909fdb58be7682bb814a0b4cc93f981 | d719ca4c39ba6ee94d2a84f6a42b363bc5cce4589293396bdfd8b353d924d66f | null | [] | 67 |
2.1 | odoo-addon-account-invoice-supplierinfo-update | 16.0.1.1.3.1 | In the supplier invoice, automatically updates all products whose unit price on the line is different from the supplier price | .. image:: https://odoo-community.org/readme-banner-image
:target: https://odoo-community.org/get-involved?utm_source=readme
:alt: Odoo Community Association
======================================
Account Invoice - Supplier Info Update
======================================
..
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! This file is generated by oca-gen-addon-readme !!
!! changes will be overwritten. !!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! source digest: sha256:976b15f293014f48e7da3d48fb3c0bd34425924dc8d057af5219103f06b6d819
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
.. |badge1| image:: https://img.shields.io/badge/maturity-Beta-yellow.png
:target: https://odoo-community.org/page/development-status
:alt: Beta
.. |badge2| image:: https://img.shields.io/badge/license-AGPL--3-blue.png
:target: http://www.gnu.org/licenses/agpl-3.0-standalone.html
:alt: License: AGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Faccount--invoicing-lightgray.png?logo=github
:target: https://github.com/OCA/account-invoicing/tree/16.0/account_invoice_supplierinfo_update
:alt: OCA/account-invoicing
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
:target: https://translation.odoo-community.org/projects/account-invoicing-16-0/account-invoicing-16-0-account_invoice_supplierinfo_update
:alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runboat-Try%20me-875A7B.png
:target: https://runboat.odoo-community.org/builds?repo=OCA/account-invoicing&target_branch=16.0
:alt: Try me on Runboat
|badge1| |badge2| |badge3| |badge4| |badge5|
This module automatically updates the supplier information of every
product in a vendor bill whose purchase information on the line differs
from the vendor information defined on the product form.
It creates a new vendor information line if none exists, or updates the
first one in the list.
**Table of contents**
.. contents::
:local:
Usage
=====
This module adds a new 'Check Supplier Info' button on the supplier
invoice form.
.. image:: https://raw.githubusercontent.com/OCA/account-invoicing/16.0/account_invoice_supplierinfo_update/static/description/supplier_invoice_form.png
When the user clicks on it, they can review the changes that will be
applied to the vendor information. Optionally, they can discard temporary
changes, for example when a vendor applied an exceptional, one-off price change.
.. image:: https://raw.githubusercontent.com/OCA/account-invoicing/16.0/account_invoice_supplierinfo_update/static/description/main_screenshot.png
* blue: Creates a full new supplier info line
* brown: Updates current settings, displaying price variation (%)
This module adds an extra boolean field, 'Supplier Informations Checked',
in the 'Other Info' tab of the supplier invoice form.
This field indicates that the prices have been checked and the
supplier info updated (or, possibly, that the changes have been deliberately ignored).
.. image:: https://raw.githubusercontent.com/OCA/account-invoicing/16.0/account_invoice_supplierinfo_update/static/description/supplier_invoice_form_other_info_tab.png
Known issues / Roadmap
======================
* This module does not correctly handle the difference when invoice line
  taxes are not the same as the product taxes (e.g. when one is marked as
  tax-included in the price and the other as tax-excluded).
* Refactor this module to share algorithm with the similar module
`purchase_order_supplierinfo_update`
Bug Tracker
===========
Bugs are tracked on `GitHub Issues <https://github.com/OCA/account-invoicing/issues>`_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing a detailed and welcomed
`feedback <https://github.com/OCA/account-invoicing/issues/new?body=module:%20account_invoice_supplierinfo_update%0Aversion:%2016.0%0A%0A**Steps%20to%20reproduce**%0A-%20...%0A%0A**Current%20behavior**%0A%0A**Expected%20behavior**>`_.
Do not contact contributors directly about support or help with technical issues.
Credits
=======
Authors
~~~~~~~
* Akretion
* GRAP
Contributors
~~~~~~~~~~~~
* Chafique Delli <chafique.delli@akretion.com>
* Sylvain LE GAL (https://twitter.com/legalsylvain)
* Mourad EL HADJ MIMOUNE <mourad.elhadj.mimoune@akretion.com>
* Stefan Rijnhart <stefan@opener.amsterdam>
* `Tecnativa <https://www.tecnativa.com>`_:
* Ernesto Tejeda
* Luis D. Lafaurie
Maintainers
~~~~~~~~~~~
This module is maintained by the OCA.
.. image:: https://odoo-community.org/logo.png
:alt: Odoo Community Association
:target: https://odoo-community.org
OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.
.. |maintainer-legalsylvain| image:: https://github.com/legalsylvain.png?size=40px
:target: https://github.com/legalsylvain
:alt: legalsylvain
Current `maintainer <https://odoo-community.org/page/maintainer-role>`__:
|maintainer-legalsylvain|
This module is part of the `OCA/account-invoicing <https://github.com/OCA/account-invoicing/tree/16.0/account_invoice_supplierinfo_update>`_ project on GitHub.
You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
| null | Akretion, GRAP, Odoo Community Association (OCA) | support@odoo-community.org | null | null | AGPL-3 | null | [
"Programming Language :: Python",
"Framework :: Odoo",
"Framework :: Odoo :: 16.0",
"License :: OSI Approved :: GNU Affero General Public License v3"
] | [] | https://github.com/OCA/account-invoicing | null | >=3.10 | [] | [] | [] | [
"odoo<16.1dev,>=16.0a"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.12.3 | 2026-02-21T02:39:17.588565 | odoo_addon_account_invoice_supplierinfo_update-16.0.1.1.3.1-py3-none-any.whl | 252,030 | cb/6c/093d06e579a6805ac0af22c75e74850c6cf2d1cd885555b41d9a17b77e22/odoo_addon_account_invoice_supplierinfo_update-16.0.1.1.3.1-py3-none-any.whl | py3 | bdist_wheel | null | false | 91ee5da46e22d2c28f9d6d1e13aad991 | 66f34d29226643b5a6e34c491dd149a4539560247be8831e32fbefcbcb356bc4 | cb6c093d06e579a6805ac0af22c75e74850c6cf2d1cd885555b41d9a17b77e22 | null | [] | 68 |
2.1 | odoo-addon-account-invoice-check-total | 18.0.1.0.0.5 | Check if the verification total is equal to the bill's total | .. image:: https://odoo-community.org/readme-banner-image
:target: https://odoo-community.org/get-involved?utm_source=readme
:alt: Odoo Community Association
===========================
Account Invoice Check Total
===========================
..
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! This file is generated by oca-gen-addon-readme !!
!! changes will be overwritten. !!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!! source digest: sha256:6450065eb3a5c2f531e4a25977d91d4e92e016c7e1b625cc5df4e62d87a4111d
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
.. |badge1| image:: https://img.shields.io/badge/maturity-Beta-yellow.png
:target: https://odoo-community.org/page/development-status
:alt: Beta
.. |badge2| image:: https://img.shields.io/badge/license-AGPL--3-blue.png
:target: http://www.gnu.org/licenses/agpl-3.0-standalone.html
:alt: License: AGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Faccount--invoicing-lightgray.png?logo=github
:target: https://github.com/OCA/account-invoicing/tree/18.0/account_invoice_check_total
:alt: OCA/account-invoicing
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
:target: https://translation.odoo-community.org/projects/account-invoicing-18-0/account-invoicing-18-0-account_invoice_check_total
:alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runboat-Try%20me-875A7B.png
:target: https://runboat.odoo-community.org/builds?repo=OCA/account-invoicing&target_branch=18.0
:alt: Try me on Runboat
|badge1| |badge2| |badge3| |badge4| |badge5|
Add a Verification Total field on vendor bills.
The user enters the tax-included invoice total as printed on the
vendor bill, then enters the invoice lines and taxes.
The system then checks that the total computed by Odoo matches the
verification total.
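The comparison amounts to a tolerance check against currency rounding. A minimal sketch (illustrative only; the actual module relies on Odoo's currency precision, not a hard-coded tolerance):

```python
def check_total(computed_total, verification_total, precision=0.01):
    """True when the computed bill total matches the printed vendor total
    within currency rounding (illustrative sketch)."""
    return abs(computed_total - verification_total) < precision

assert check_total(142.50, 142.50)       # totals agree: bill can be posted
assert not check_total(142.50, 145.00)   # mismatch: user must fix lines/taxes
```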
**Table of contents**
.. contents::
:local:
Configuration
=============
The setting **Check Total on Vendor Bills** needs to be enabled. This
can be done either:
- on the user's Access Rights (Technical Settings)
- in the Invoicing Settings, under the *Vendor Payments* section
Bug Tracker
===========
Bugs are tracked on `GitHub Issues <https://github.com/OCA/account-invoicing/issues>`_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing a detailed and welcomed
`feedback <https://github.com/OCA/account-invoicing/issues/new?body=module:%20account_invoice_check_total%0Aversion:%2018.0%0A%0A**Steps%20to%20reproduce**%0A-%20...%0A%0A**Current%20behavior**%0A%0A**Expected%20behavior**>`_.
Do not contact contributors directly about support or help with technical issues.
Credits
=======
Authors
-------
* Acsone SA/NV
Contributors
------------
- Thomas Binsfeld <thomas.binsfeld@acsone.eu>
- Raf Ven <raf.ven@dynapps.be>
- Andrea Stirpe <a.stirpe@onestein.nl>
- `Tecnativa <https://www.tecnativa.com>`__:
- Ernesto Tejeda
Maintainers
-----------
This module is maintained by the OCA.
.. image:: https://odoo-community.org/logo.png
:alt: Odoo Community Association
:target: https://odoo-community.org
OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.
This module is part of the `OCA/account-invoicing <https://github.com/OCA/account-invoicing/tree/18.0/account_invoice_check_total>`_ project on GitHub.
You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
| text/x-rst | Acsone SA/NV, Odoo Community Association (OCA) | support@odoo-community.org | null | null | AGPL-3 | null | [
"Programming Language :: Python",
"Framework :: Odoo",
"Framework :: Odoo :: 18.0",
"License :: OSI Approved :: GNU Affero General Public License v3"
] | [] | https://github.com/OCA/account-invoicing | null | >=3.10 | [] | [] | [] | [
"odoo==18.0.*"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.12.3 | 2026-02-21T02:39:15.818774 | odoo_addon_account_invoice_check_total-18.0.1.0.0.5-py3-none-any.whl | 81,127 | 7b/e7/c0b44b6c43fb7ec34bbaf38b3ca0d7ca78afb1220a3830289777f15f0379/odoo_addon_account_invoice_check_total-18.0.1.0.0.5-py3-none-any.whl | py3 | bdist_wheel | null | false | b7df89ed5ee6b3551506599b12de922f | 940430890a66a3fc515350c692f961f207d5e625a60ab22c1b0235b3f95df0f8 | 7be7c0b44b6c43fb7ec34bbaf38b3ca0d7ca78afb1220a3830289777f15f0379 | null | [] | 74 |
2.4 | solweig | 0.1.0b54 | High-performance SOLWEIG urban microclimate model (Rust + Python) | # SOLWEIG
**Map how hot it _feels_ across a city — pixel by pixel.**
SOLWEIG computes **Mean Radiant Temperature (Tmrt)** and thermal comfort indices (**UTCI**, **PET**) for urban environments. Give it a building height model and weather data, and it produces high-resolution maps showing where people experience heat stress — and where trees, shade, and cool surfaces make a difference.
Adapted from the [UMEP](https://github.com/UMEP-dev/UMEP-processing) (Urban Multi-scale Environmental Predictor) platform by Fredrik Lindberg, Sue Grimmond, and contributors — see Lindberg et al. ([2008](https://doi.org/10.1007/s00484-008-0162-7), [2018](https://doi.org/10.1016/j.envsoft.2017.09.020)). Re-implemented in Rust for speed, with optional GPU acceleration.

_DSM/DEM data: [PNOA-LiDAR](https://pnoa.ign.es/pnoa-lidar), Instituto Geográfico Nacional (IGN), Spain. CC BY 4.0._
> **Experimental:** This package and QGIS plugin are released for testing and discussion purposes. The API is stabilising but may change. Feedback and bug reports welcome — [open an issue](https://github.com/UMEP-dev/solweig/issues).
**[Documentation](https://umep-dev.github.io/solweig/)** · [Installation](https://umep-dev.github.io/solweig/getting-started/installation/) · [Quick Start](https://umep-dev.github.io/solweig/getting-started/quick-start/) · [API Reference](https://umep-dev.github.io/solweig/api/)
---
## What can you do with it?
- **Urban planning** — Compare street canyon designs, tree planting scenarios, or cool-roof strategies by mapping thermal comfort before and after.
- **Heat risk assessment** — Identify the hottest spots in a neighbourhood during a heatwave, hour by hour.
- **Research** — Run controlled microclimate experiments at 1 m resolution with full radiation budgets.
- **Climate services** — Generate thermal comfort maps for public health warnings or outdoor event planning.
## How it works
SOLWEIG models the complete radiation budget experienced by a person standing in an urban environment:
1. **Shadows** — Which pixels are shaded by buildings and trees at a given sun angle?
2. **Sky View Factor (SVF)** — How much sky can a person see from each point? (More sky = more incoming longwave and diffuse radiation.)
3. **Surface temperatures** — How hot are the ground and surrounding walls, accounting for thermal inertia across the diurnal cycle?
4. **Radiation balance** — Sum shortwave (sun) and longwave (heat) radiation from all directions, using either isotropic or Perez anisotropic sky models.
5. **Tmrt** — Convert total absorbed radiation into Mean Radiant Temperature.
6. **Thermal comfort** — Optionally derive UTCI or PET, which combine Tmrt with air temperature, humidity, and wind.
The computation pipeline is implemented in Rust and exposed to Python via PyO3. Shadow casting and anisotropic sky calculations can optionally run on the GPU via WebGPU. Large rasters are automatically tiled to fit GPU memory constraints.
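Step 5 above is essentially an inversion of the Stefan–Boltzmann law: the total flux absorbed by the body is converted to the temperature of an equivalent radiating enclosure. A minimal sketch, with assumed constants and a single aggregate flux (the actual model weights shortwave and longwave fluxes from six directions):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant (W/m^2 K^4)
EPS_P = 0.97     # longwave emissivity/absorptivity of the human body

def tmrt_from_flux(sstr):
    """Mean Radiant Temperature (deg C) from mean radiant flux density (W/m^2)."""
    return (sstr / (EPS_P * SIGMA)) ** 0.25 - 273.15

# A sunlit pixel absorbing ~500 W/m^2 feels much hotter than a shaded one at ~420 W/m^2
print(f"{tmrt_from_flux(500.0):.1f} degC")  # ~35.6
print(f"{tmrt_from_flux(420.0):.1f} degC")  # ~22.5
```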
---
## Install
```bash
pip install solweig
```
For all features (rasterio, geopandas, progress bars):
```bash
pip install solweig[full]
```
**Requirements:** Python 3.11–3.13. Pre-built wheels are available for Linux, macOS, and Windows.
### From source
```bash
git clone https://github.com/UMEP-dev/solweig.git
cd solweig
pip install maturin
maturin develop --release
```
This compiles the Rust extension locally. A Rust toolchain is required.
---
## Quick start
### Minimal example (numpy arrays)
```python
import numpy as np
import solweig
from datetime import datetime
# A flat surface with one 15 m building
dsm = np.full((200, 200), 2.0, dtype=np.float32)
dsm[80:120, 80:120] = 15.0
surface = solweig.SurfaceData(dsm=dsm, pixel_size=1.0)
surface.compute_svf() # Required before calculate()
location = solweig.Location(latitude=48.8, longitude=2.3, utc_offset=1) # Paris
weather = solweig.Weather(
datetime=datetime(2025, 7, 15, 14, 0),
ta=32.0, # Air temperature (°C)
rh=40.0, # Relative humidity (%)
global_rad=850.0, # Solar radiation (W/m²)
)
result = solweig.calculate(surface, location, weather)
print(f"Sunlit Tmrt: {result.tmrt[result.shadow > 0.5].mean():.0f}°C")
print(f"Shaded Tmrt: {result.tmrt[result.shadow < 0.5].mean():.0f}°C")
```
### Real-world workflow (GeoTIFFs + EPW weather)
```python
import solweig
# 1. Load surface — prepare() computes and caches walls/SVF when missing
surface = solweig.SurfaceData.prepare(
dsm="data/dsm.tif",
cdsm="data/trees.tif", # Optional: vegetation canopy heights
working_dir="cache/", # Expensive preprocessing cached here
)
# 2. Load weather from an EPW file (standard format from climate databases)
weather_list = solweig.Weather.from_epw(
"data/weather.epw",
start="2025-07-01",
end="2025-07-03",
)
location = solweig.Location.from_epw("data/weather.epw")
# 3. Run — outputs saved as GeoTIFFs, thermal state carried between timesteps
summary = solweig.calculate(
surface=surface,
weather=weather_list,
location=location,
output_dir="output/",
outputs=["tmrt", "shadow"],
)
# 4. Inspect results
print(summary.report())
summary.plot()
```
---
## API overview
### Core classes
| Class | Purpose |
|-------|---------|
| `SurfaceData` | Holds all spatial inputs (DSM, CDSM, DEM, land cover) and precomputed arrays (walls, SVF). Use `.prepare()` to load GeoTIFFs with automatic caching. |
| `Location` | Geographic coordinates (latitude, longitude, UTC offset). Create from coordinates, DSM CRS, or an EPW file. |
| `Weather` | Per-timestep meteorological data (air temperature, relative humidity, global radiation, optional wind speed). Load from EPW files or create manually. |
| `SolweigResult` | Output grids from a single timestep: Tmrt, shadow, UTCI, PET, radiation components. |
| `TimeseriesSummary` | Aggregated results from a multi-timestep run: mean/max/min grids, sun hours, UTCI threshold exceedance, per-timestep scalars. |
| `HumanParams` | Body parameters: posture (standing/sitting), absorption coefficients, PET body parameters (age, weight, height, etc.). |
| `ModelConfig` | Runtime settings: anisotropic sky, max shadow distance, tiling workers. |
### Main functions
```python
# Single timestep
summary = solweig.calculate(surface, weather=[weather], output_dir="output/")
# Multi-timestep with thermal inertia (auto-tiles large rasters)
summary = solweig.calculate(surface, weather=weather_list, output_dir="output/")
# Include UTCI and/or PET in per-timestep GeoTIFFs
summary = solweig.calculate(
surface, weather=weather_list,
output_dir="output/",
outputs=["tmrt", "utci", "shadow"],
)
# Input validation
warnings = solweig.validate_inputs(surface, location, weather)
```
### Convenience I/O
```python
# Load/save GeoTIFFs
data, transform, crs, nodata = solweig.io.load_raster("dsm.tif")
solweig.io.save_raster("output.tif", data, transform, crs)
# Rasterise vector data (e.g., tree polygons → height grid)
raster, transform = solweig.io.rasterise_gdf(gdf, "geometry", "height", bbox=bbox, pixel_size=1.0)
# Download EPW weather data (no API key needed)
epw_path = solweig.download_epw(latitude=37.98, longitude=23.73, output_path="athens.epw")
```
---
## Inputs and outputs
### What you need
| Input | Required? | What it is |
|-------|-----------|------------|
| **DSM** | Yes | Digital Surface Model — a height grid (metres) including buildings. GeoTIFF or numpy array. |
| **Location** | Yes | Latitude, longitude, and UTC offset. Can be extracted from the DSM's CRS or an EPW file. |
| **Weather** | Yes | Air temperature, relative humidity, and global solar radiation. Load from an EPW file or create manually. |
| **CDSM** | No | Canopy heights (trees). Adds vegetation shading. |
| **DEM** | No | Ground elevation. Separates terrain from buildings. |
| **Land cover** | No | Surface type grid (paved, grass, water, etc.). Affects surface temperatures. |
### What you get
| Output | Unit | Description |
|--------|------|-------------|
| **Tmrt** | °C | Mean Radiant Temperature — how much radiation a person absorbs. |
| **Shadow** | 0–1 | Shadow fraction (1 = sunlit, 0 = fully shaded). |
| **UTCI** | °C | Universal Thermal Climate Index — "feels like" temperature. |
| **PET** | °C | Physiological Equivalent Temperature — similar to UTCI with customisable body parameters. |
| Kdown / Kup | W/m² | Shortwave radiation (down and reflected up). |
| Ldown / Lup | W/m² | Longwave radiation (thermal, down and emitted up). |
### Timeseries summary grids
When running `calculate()` with a list of weather timesteps, the returned `TimeseriesSummary` provides aggregated grids across all timesteps:
| Grid | Description |
|------|-------------|
| `tmrt_mean`, `tmrt_max`, `tmrt_min` | Overall Tmrt statistics |
| `tmrt_day_mean`, `tmrt_night_mean` | Day/night Tmrt averages |
| `utci_mean`, `utci_max`, `utci_min` | Overall UTCI statistics |
| `utci_day_mean`, `utci_night_mean` | Day/night UTCI averages |
| `sun_hours`, `shade_hours` | Hours of direct sun / shade per pixel |
| `utci_hours_above` | Dict of threshold → grid of hours exceeding that UTCI value |
Plus a `Timeseries` object with per-timestep spatial means (Tmrt, UTCI, sun fraction, air temperature, radiation, etc.) for plotting.
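The aggregation semantics of `utci_hours_above` can be illustrated with plain Python (hypothetical 2×2 grids, one per hourly timestep; the real grids are numpy arrays produced by the Rust core):

```python
# One hourly UTCI grid per timestep (2x2 pixels, hypothetical values in degC)
utci_steps = [
    [[28.0, 31.0], [33.0, 27.0]],
    [[30.0, 34.0], [36.0, 29.0]],
    [[29.0, 33.0], [35.0, 26.0]],
]

def hours_above(grids, threshold, hours_per_step=1.0):
    """Per-pixel hours exceeding a UTCI threshold across all timesteps."""
    rows, cols = len(grids[0]), len(grids[0][0])
    return [
        [hours_per_step * sum(1 for g in grids if g[r][c] > threshold)
         for c in range(cols)]
        for r in range(rows)
    ]

print(hours_above(utci_steps, 32.0))  # [[0.0, 2.0], [3.0, 0.0]]
```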
### Don't have an EPW file? Download one
```python
epw_path = solweig.download_epw(latitude=37.98, longitude=23.73, output_path="athens.epw")
weather_list = solweig.Weather.from_epw(epw_path)
```
---
## Configuration
### Human body parameters
```python
human = solweig.HumanParams(
posture="standing", # or "sitting"
abs_k=0.7, # Shortwave absorption coefficient
abs_l=0.97, # Longwave absorption coefficient
# PET-specific:
age=35, weight=75, height=1.75, sex=1, activity=80, clothing=0.9,
)
result = solweig.calculate(surface, location, weather, human=human)
```
### Model options
Key parameters accepted by `calculate()`:
| Parameter | Default | Description |
|-----------|---------|-------------|
| `use_anisotropic_sky` | `True` | Use Perez anisotropic sky model for more accurate diffuse radiation. |
| `conifer` | `False` | Treat trees as evergreen (skip seasonal leaf-off). |
| `max_shadow_distance_m` | `1000` | Maximum shadow reach in metres. Increase for mountainous terrain. |
| `output_dir` | _(required)_ | Working directory for all output (summary grids, per-timestep GeoTIFFs, metadata). |
| `outputs` | `None` | Which per-timestep grids to save: `"tmrt"`, `"utci"`, `"pet"`, `"shadow"`, `"kdown"`, `"kup"`, `"ldown"`, `"lup"`. |
### Physics and materials
```python
# Custom vegetation transmissivity, posture geometry, etc.
physics = solweig.load_physics("custom_physics.json")
# Custom surface materials (albedo, emissivity per land cover class)
materials = solweig.load_materials("site_materials.json")
summary = solweig.calculate(
surface=surface,
weather=weather_list,
location=location,
physics=physics,
materials=materials,
)
```
---
## GPU acceleration
SOLWEIG uses WebGPU (via wgpu/Rust) for shadow casting and anisotropic sky computations. GPU is enabled by default when available.
```python
import solweig
# Check GPU status
print(solweig.is_gpu_available()) # True/False
print(solweig.get_compute_backend()) # "gpu" or "cpu"
print(solweig.get_gpu_limits()) # {"max_buffer_size": ..., "backend": "Metal"}
# Disable GPU (fall back to CPU)
solweig.disable_gpu()
```
Large rasters are automatically tiled to fit within GPU buffer limits. Tile size, worker count, and prefetch depth are configurable via `ModelConfig` or keyword arguments.
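How such tiling might split a raster can be sketched as follows (illustrative only; the actual logic lives in the Rust core and queries wgpu for the real buffer limits):

```python
import math

def tile_grid(rows, cols, max_buffer_bytes, bytes_per_px=4):
    """Smallest square tile grid whose tiles each fit the GPU buffer limit."""
    max_px = max_buffer_bytes // bytes_per_px
    side = math.isqrt(max_px)                  # largest square tile that fits
    return math.ceil(rows / side), math.ceil(cols / side)

# A 10,000 x 10,000 float32 raster against a 128 MB buffer limit
print(tile_grid(10_000, 10_000, 128 * 1024**2))  # (2, 2)
```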
---
## Run metadata and reproducibility
Every timeseries run records a `run_metadata.json` in the output directory capturing the full parameter set:
```python
metadata = solweig.load_run_metadata("output/run_metadata.json")
print(metadata["solweig_version"])
print(metadata["location"])
print(metadata["parameters"]["use_anisotropic_sky"])
print(metadata["timeseries"]["start"], "to", metadata["timeseries"]["end"])
```
---
## QGIS plugin
SOLWEIG is also available as a **QGIS Processing plugin** for point-and-click spatial analysis — no Python scripting required.
### Installation
1. **Plugins** → **Manage and Install Plugins**
2. **Settings** tab → Check **"Show also experimental plugins"**
3. Search for **"SOLWEIG"** → **Install Plugin**
The plugin requires QGIS 4.0+ (Qt6, Python 3.11+). On first use it will offer to install the `solweig` Python library automatically.
### Processing algorithms
Once installed, SOLWEIG algorithms appear in the **Processing Toolbox** under the SOLWEIG group:
| Algorithm | Description |
|-----------|-------------|
| **Download / Preview Weather File** | Download a TMY EPW file from PVGIS, or preview an existing EPW file. |
| **Prepare Surface Data** | Align rasters, compute wall heights, wall aspects, and SVF. Results are cached and reused. |
| **SOLWEIG Calculation** | Single-timestep or timeseries Tmrt with optional inline UTCI/PET. Supports EPW and UMEP met files. |
### QGIS-specific features
- All inputs and outputs are standard QGIS raster layers (GeoTIFF)
- Automatic tiling for large rasters with GPU support
- QGIS progress bar integration with cancellation support
- Configurable vegetation parameters (transmissivity, seasonal leaf dates, conifer/deciduous)
- Configurable land cover materials table
- UTCI heat stress thresholds for day and night
- Run metadata saved alongside outputs for reproducibility
### Typical QGIS workflow
1. **Surface Preparation** — Load your DSM (and optionally CDSM, DEM, land cover). The algorithm computes walls, SVF, and caches everything to a working directory.
2. **Tmrt Timeseries** — Point to the prepared surface directory and an EPW file. Select your date range, outputs, and run. Results are saved as GeoTIFFs and loaded into the QGIS canvas.
3. **Inspect results** — Use standard QGIS tools to style, compare, and export the output layers.
---
## Demos
Complete working scripts:
- **[demos/athens-demo.py](demos/athens-demo.py)** — Full workflow: rasterise tree vectors, load GeoTIFFs, run a multi-day timeseries, visualise summary grids.
- **[demos/solweig_gbg_test.py](demos/solweig_gbg_test.py)** — Gothenburg: surface preparation with SVF caching, timeseries calculation.
---
## Citation
Adapted from [UMEP](https://github.com/UMEP-dev/UMEP-processing) by Fredrik Lindberg, Sue Grimmond, and contributors.
If you use SOLWEIG in your research, please cite the original model paper and the UMEP platform:
1. Lindberg F, Holmer B, Thorsson S (2008) SOLWEIG 1.0 – Modelling spatial variations of 3D radiant fluxes and mean radiant temperature in complex urban settings. _International Journal of Biometeorology_ 52, 697–713 [doi:10.1007/s00484-008-0162-7](https://doi.org/10.1007/s00484-008-0162-7)
2. Lindberg F, Grimmond CSB, Gabey A, Huang B, Kent CW, Sun T, Theeuwes N, Järvi L, Ward H, Capel-Timms I, Chang YY, Jonsson P, Krave N, Liu D, Meyer D, Olofson F, Tan JG, Wästberg D, Xue L, Zhang Z (2018) Urban Multi-scale Environmental Predictor (UMEP) – An integrated tool for city-based climate services. _Environmental Modelling and Software_ 99, 70–87 [doi:10.1016/j.envsoft.2017.09.020](https://doi.org/10.1016/j.envsoft.2017.09.020)
## Demo data
The Athens demo dataset (`demos/data/athens/`) uses the following sources:
- **DSM/DEM** — Derived from LiDAR data available via the [Hellenic Cadastre geoportal](https://www.ktimatologio.gr/)
- **Tree vectors** (`trees.gpkg`) — Derived from the [Athens Urban Atlas](https://land.copernicus.eu/local/urban-atlas) and municipal open data at [geodata.gov.gr](https://geodata.gov.gr/)
- **EPW weather** (`athens_2023.epw`) — Generated using Copernicus Climate Change Service information [2025] via [PVGIS](https://re.jrc.ec.europa.eu/pvg_tools/en/). Contains modified Copernicus Climate Change Service information; neither the European Commission nor ECMWF is responsible for any use that may be made of the Copernicus information or data it contains.
## License
GNU Affero General Public License v3.0 — see [LICENSE](LICENSE).
| text/markdown; charset=UTF-8; variant=GFM | UMEP Developers | null | UMEP Developers | null | AGPL-3.0 | python3, geographical-information-system, spatial-data, spatial-data-analysis, urban-climate, urban-heat-island, urban-meteorology, urban-microclimate, urban-planning, urban-sustainability, urban-thermal-comfort, urban-thermal-environment, urban-thermal-mapping, urban-thermal-modelling, urban-thermal-simulation, urban-thermal-sustainability | [
"Development Status :: 3 - Alpha",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Rust",
"Topic :: Scientific/Engineering :: Atmospheric Science",
"Topic :: Scientific/Engineering :: GIS"
] | [] | null | null | <3.14,>=3.11 | [] | [] | [] | [
"numpy>=1.26.0",
"pyproj>=3.7.0",
"shapely>=2.0.4",
"geopandas>=1.0.1; extra == \"full\"",
"rasterio>=1.3.0; extra == \"full\"",
"tqdm>=4.67.1; extra == \"full\"",
"pillow>=9.0.0; extra == \"full\""
] | [] | [] | [] | [
"documentation, https://umep-dev.github.io/solweig/",
"homepage, https://github.com/UMEP-dev/solweig",
"repository, https://github.com/UMEP-dev/solweig"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:39:10.248222 | solweig-0.1.0b54.tar.gz | 279,540 | aa/41/d5381560c3644d3f65c02dd0dab6047688261a11bba1b7f2e318250d8c6f/solweig-0.1.0b54.tar.gz | source | sdist | null | false | 45585b2963acecbf9efc175ce3da45bf | 2672e6b40b31bc6c6e6f363ff63d930185cc72357ff060fdebf2dbe161ad2a8d | aa41d5381560c3644d3f65c02dd0dab6047688261a11bba1b7f2e318250d8c6f | null | [
"LICENSE"
] | 644 |
2.4 | flask-openapi-swagger | 5.31.2 | Provide Swagger UI for flask-openapi. | Provide Swagger UI for [flask-openapi](https://github.com/luolingchun/flask-openapi). | text/markdown | null | null | null | llc <luolingchun@outlook.com> | MIT | null | [
"Development Status :: 5 - Production/Stable",
"Environment :: Web Environment",
"Framework :: Flask",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3 :: Only"
] | [] | null | null | >=3.8 | [] | [] | [] | [
"flask-openapi"
] | [] | [] | [] | [
"Homepage, https://github.com/luolingchun/flask-openapi-plugins/tree/master/flask-openapi-swagger",
"Documentation, https://luolingchun.github.io/flask-openapi/latest/Usage/UI_Templates/"
] | uv/0.10.4 {"installer":{"name":"uv","version":"0.10.4","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true} | 2026-02-21T02:38:40.211286 | flask_openapi_swagger-5.31.2.tar.gz | 516,945 | e2/9d/0db79cb44960c412011b0897c41eb1b06ae107ef99abf460f28962dbfbed/flask_openapi_swagger-5.31.2.tar.gz | source | sdist | null | false | 0621b939ecfc9c004b305065ac80bbd1 | 64b7f37dbe26680faae6d5b9cb8c598c9d459c3b8a0596bbc8c5c9558c8abd5d | e29d0db79cb44960c412011b0897c41eb1b06ae107ef99abf460f28962dbfbed | null | [] | 146 |
2.4 | stringty | 0.0.2.1 | A Python library for editing file content: replace the entire content or append something more. | **Stringty** is a Python library for **manipulating file content in a simple way**, either replacing all content or appending new content.
## Installation
```bash
pip install stringty
```
## Replace the entire content
```python
from stringty import Stringty
content = """
a = "ab2c"
print(a)"""
Stringty().rescript("test.py", content)
```
## Append content to the end
```python
from stringty import Stringty
content = """
a = "ab2c"
print(a)"""
Stringty().imprement("test.py", content)
```
| text/markdown | silvaleal | lealsilva.ctt@outlook.com | null | null | MIT License | redacty | [] | [] | null | null | null | [] | [] | [] | [] | [] | [] | [] | [] | twine/6.2.0 CPython/3.12.10 | 2026-02-21T02:37:59.326476 | stringty-0.0.2.1.tar.gz | 2,488 | d8/a3/a35aff170900b6ba503ba52a3a8a55eef9aec1b51f58345e24b347312e03/stringty-0.0.2.1.tar.gz | source | sdist | null | false | 8b5d0025c1e0bf32d6df9176152a7e13 | 46e239a736761dc4e4850993aa10407fdf5aca156b10ef744713410fbba3c2ec | d8a3a35aff170900b6ba503ba52a3a8a55eef9aec1b51f58345e24b347312e03 | null | [
"LICENSE"
] | 93 |
2.4 | moonbridge | 0.16.0 | MCP server for spawning AI coding agents (Kimi, Codex, and more) | # Moonbridge
**Your MCP client just got a team.**
Spawn AI coding agents from Claude Code, Cursor, or any MCP client. Run 10 approaches in parallel for a fraction of the cost.
```bash
uvx moonbridge
```
## Quick Start
1. **Install at least one supported CLI:**
| Adapter | Install | Authenticate |
|---------|---------|--------------|
| Kimi (default) | `uv tool install --python 3.13 kimi-cli` | `kimi login` |
| Codex | `npm install -g @openai/codex` | Set `OPENAI_API_KEY` |
| OpenCode | `curl -fsSL https://opencode.ai/install \| bash` | `opencode auth login` |
| Gemini CLI | `npm install -g @google/gemini-cli` | Run `gemini` login flow or set `GEMINI_API_KEY` |
2. **Add to MCP config** (`~/.mcp.json`):
```json
{
"mcpServers": {
"moonbridge": {
"type": "stdio",
"command": "uvx",
"args": ["moonbridge"]
}
}
}
```
3. **Use it.** Your MCP client now has `spawn_agent` and `spawn_agents_parallel` tools.
## Security Warning (Read First)
Moonbridge executes agentic CLIs (Kimi/Codex/OpenCode/Gemini). A malicious or careless prompt can cause an
agent to run shell commands, read accessible files, or exfiltrate data via network calls.
Moonbridge adds guardrails (`MOONBRIDGE_ALLOWED_DIRS`, environment allowlists, optional
`MOONBRIDGE_SANDBOX=1`), but these are not equivalent to OS-level containment.
For untrusted prompts or shared environments, run Moonbridge inside a container or VM with
least-privilege filesystem and network access.
## Updating
Moonbridge checks for updates on startup (cached for 24h). To update manually:
```bash
# If using uvx (recommended)
uvx moonbridge --refresh
# If installed as a tool
uv tool upgrade moonbridge
```
Disable update checks for CI/automation:
```bash
export MOONBRIDGE_SKIP_UPDATE_CHECK=1
```
## When to Use Moonbridge
| Task | Why Moonbridge |
|------|----------------|
| Parallel exploration | Run 10 approaches simultaneously, pick the best |
| Frontend/UI work | Kimi excels at visual coding and component design |
| Tests and documentation | Cost-effective for high-volume tasks |
| Refactoring | Try multiple strategies in one request |
**Best for:** Tasks that benefit from parallel execution or volume.
## How it Works
### Connection Flow
1. MCP client (Claude Code, Cursor, etc.) connects to Moonbridge over stdio
2. Client discovers available tools via `list_tools`
3. Client calls `spawn_agent` or `spawn_agents_parallel`
### Spawn Process
1. Moonbridge validates the prompt and working directory
2. Resolves which adapter to use (Kimi, Codex, OpenCode, Gemini)
3. Adapter builds the CLI command with appropriate flags
4. Spawns subprocess in a separate process group
5. Captures stdout/stderr, enforces timeout
6. Returns structured JSON result
### Parallel Execution
- `spawn_agents_parallel` runs up to 10 agents concurrently via `asyncio.gather`
- Each agent is independent (separate process, separate output)
- All results returned together when the last agent finishes (or times out)
```
MCP Client → stdio → Moonbridge → adapter → CLI subprocess
→ CLI subprocess (parallel)
→ CLI subprocess (parallel)
```
## Tools
| Tool | Use case |
|------|----------|
| `spawn_agent` | Single task: "Write tests for auth.ts" |
| `spawn_agents_parallel` | Go wide: 10 agents, 10 approaches, pick the best |
| `check_status` | Verify an adapter CLI is installed and authenticated |
| `list_adapters` | Show available adapters and their status |
| `list_models` | Show known/dynamic model options for an adapter |
### Example: Parallel Exploration
```json
{
"agents": [
{"prompt": "Refactor to React hooks"},
{"prompt": "Refactor to Zustand"},
{"prompt": "Refactor to Redux Toolkit"}
]
}
```
Three approaches. One request. You choose the winner.
### Tool Parameters
**`spawn_agent`**
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `prompt` | string | Yes | Task description for the agent |
| `adapter` | string | No | Backend to use: `kimi`, `codex`, `opencode`, `gemini` (default from `MOONBRIDGE_ADAPTER`, fallback `kimi`) |
| `model` | string | No | Model override (e.g., `gpt-5.2-codex`, `openrouter/minimax/minimax-m2.5`, `gemini-2.5-pro`). For `opencode`, models use `provider/model`. |
| `thinking` | boolean | No | Enable reasoning mode (Kimi only) |
| `reasoning_effort` | string | No | Reasoning budget: `low`, `medium`, `high`, `xhigh` (Codex only, default `xhigh`) |
| `timeout_seconds` | integer | No | Override default timeout (30-3600) |
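Putting the parameters together, a single-agent call might look like this (values are illustrative, not defaults):

```json
{
  "prompt": "Write unit tests for src/auth.ts",
  "adapter": "codex",
  "reasoning_effort": "high",
  "timeout_seconds": 1200
}
```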
**`spawn_agents_parallel`**
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `agents` | array | Yes | List of agent configs (max 10) |
| `agents[].prompt` | string | Yes | Task for this agent |
| `agents[].adapter` | string | No | Backend for this agent |
| `agents[].model` | string | No | Model override for this agent (`codex` default: `gpt-5.3-codex`; `opencode` uses `provider/model`; `gemini` default: `gemini-2.5-pro`) |
| `agents[].thinking` | boolean | No | Enable reasoning (Kimi only) |
| `agents[].reasoning_effort` | string | No | Reasoning budget (Codex only, default `xhigh`) |
| `agents[].timeout_seconds` | integer | No | Timeout for this agent |
**`check_status`**
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `adapter` | string | No | Adapter to check explicitly. Defaults to `MOONBRIDGE_ADAPTER` when omitted. |
**`list_models`**
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `adapter` | string | No | Adapter to inspect. Defaults to `MOONBRIDGE_ADAPTER` when omitted. |
| `provider` | string | No | Provider filter for OpenCode model catalogs (e.g., `openrouter`). |
| `refresh` | boolean | No | Refresh model catalog for adapters that support dynamic discovery. |
## Response Format
All tools return JSON with these fields:
| Field | Type | Description |
|-------|------|-------------|
| `status` | string | `success`, `error`, `timeout`, `auth_error`, or `cancelled` |
| `output` | string | stdout from the agent |
| `stderr` | string\|null | stderr if any |
| `returncode` | int | Process exit code (-1 for timeout/error) |
| `duration_ms` | int | Execution time in milliseconds |
| `agent_index` | int | Agent index (0 for single, 0-N for parallel) |
| `message` | string? | Human-readable error context (when applicable) |
| `raw` | object? | Optional structured metadata (e.g., sandbox diff) |
When output is too large, Moonbridge truncates it and adds `raw.output_limit` metadata
with original sizes.
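The truncation behaviour can be pictured with a small sketch. This is a simplification — the real limiter splits the budget across `stdout` and `stderr`, and the function name and metadata field names here are assumptions, not Moonbridge's exact schema:

```python
def truncate_output(text: str, limit: int = 120_000):
    """Truncate text to `limit` chars, recording the original size.

    Sketch only: the keys under ``raw.output_limit`` are assumptions.
    Returns (possibly truncated text, metadata dict or None).
    """
    if len(text) <= limit:
        return text, None
    # Keep the head of the output and note how much was dropped.
    meta = {"original_chars": len(text), "returned_chars": limit}
    return text[:limit], meta

# Example: a 150-char output against a 100-char budget.
out, meta = truncate_output("x" * 150, limit=100)
```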
## Configuration
### Environment Variables
| Variable | Description |
|----------|-------------|
| `MOONBRIDGE_ADAPTER` | Default adapter (default: `kimi`) |
| `MOONBRIDGE_TIMEOUT` | Default timeout in seconds (30-3600) |
| `MOONBRIDGE_KIMI_TIMEOUT` | Kimi-specific default timeout |
| `MOONBRIDGE_CODEX_TIMEOUT` | Codex-specific default timeout |
| `MOONBRIDGE_OPENCODE_TIMEOUT` | OpenCode-specific default timeout |
| `MOONBRIDGE_GEMINI_TIMEOUT` | Gemini-specific default timeout |
| `MOONBRIDGE_MODEL` | Global default model override |
| `MOONBRIDGE_KIMI_MODEL` | Kimi-specific model override |
| `MOONBRIDGE_CODEX_MODEL` | Codex-specific model override |
| `MOONBRIDGE_OPENCODE_MODEL` | OpenCode-specific model override |
| `MOONBRIDGE_GEMINI_MODEL` | Gemini-specific model override |
| `MOONBRIDGE_MAX_AGENTS` | Maximum parallel agents |
| `MOONBRIDGE_MAX_OUTPUT_CHARS` | Max chars returned per agent across `stdout`+`stderr` (default 120000; timeout tails are per stream) |
| `MOONBRIDGE_ALLOWED_DIRS` | Colon-separated allowlist of working directories |
| `MOONBRIDGE_STRICT` | Set to `1` to require `ALLOWED_DIRS` (exits if unset) |
| `MOONBRIDGE_SANDBOX` | Set to `1` to run agents in a temp copy of cwd |
| `MOONBRIDGE_SANDBOX_KEEP` | Set to `1` to keep sandbox dir for inspection |
| `MOONBRIDGE_SANDBOX_MAX_DIFF` | Max diff size in bytes (default 500000) |
| `MOONBRIDGE_SANDBOX_MAX_COPY` | Max sandbox copy size in bytes (default 500MB) |
| `MOONBRIDGE_LOG_LEVEL` | Set to `DEBUG` for verbose logging |
## Security
Moonbridge inherits the security model of the selected adapter CLI. Kimi, Codex, OpenCode, and Gemini are
agentic CLIs; prompts can trigger command execution, file access, and network activity
within the process permissions.
### 1. Threat Model (Prompt Injection Included)
If an attacker can influence prompt input sent through MCP, they can attempt to make the
agent:
- read sensitive files (for example `~/.ssh` or `.env`),
- run destructive shell commands,
- exfiltrate data over the network.
Moonbridge does not inspect prompt intent. Treat prompt input as potentially untrusted.
### 2. Directory Restrictions (`MOONBRIDGE_ALLOWED_DIRS`)
By default, agents can operate in any directory. To restrict them, set `MOONBRIDGE_ALLOWED_DIRS` to a colon-separated list of allowed paths. Symlinks are resolved via `os.path.realpath` before the check. Strict mode (`MOONBRIDGE_STRICT=1`) exits on startup if no valid allowed directories are configured.
```bash
export MOONBRIDGE_ALLOWED_DIRS="/home/user/projects:/home/user/work"
export MOONBRIDGE_STRICT=1 # require restrictions
```
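The allowlist check can be sketched in a few lines. The function name and exact matching rules are assumptions — Moonbridge's implementation may differ — but the symlink resolution via `os.path.realpath` mirrors the documented behaviour:

```python
import os


def is_allowed(cwd: str, allowed_dirs: list[str]) -> bool:
    """Return True if `cwd` resolves inside one of `allowed_dirs`.

    Symlinks are resolved first, so a link pointing outside the
    allowlist cannot bypass the check. Sketch only.
    """
    real = os.path.realpath(cwd)
    for base in allowed_dirs:
        base_real = os.path.realpath(base)
        # commonpath guards against prefix tricks like /home/user-evil
        # matching an allowlist entry of /home/user.
        if os.path.commonpath([real, base_real]) == base_real:
            return True
    return False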
### 3. Environment Sanitization
Only whitelisted env vars are passed to spawned agents. Each adapter defines its own allowlist (`PATH`, `HOME`, plus adapter-specific like `OPENAI_API_KEY` for Codex). Your shell environment (secrets, tokens, SSH keys) is not inherited by default.
### 4. Input Validation
Model parameters are validated to prevent flag injection (values starting with `-` are rejected). Prompts are capped at 100,000 characters and cannot be empty.
### 5. Process Isolation and Sandbox Mode
Agents run in separate process groups (`start_new_session=True`). Orphan cleanup on exit. Sandbox mode available (`MOONBRIDGE_SANDBOX=1`) for copy-on-run isolation.
> **Not OS-level sandboxing.** Agents can still read or write arbitrary host files they can access.
### 6. Hardened Deployment Checklist
- Set `MOONBRIDGE_ALLOWED_DIRS` to the smallest possible set.
- Enable `MOONBRIDGE_STRICT=1` so missing restrictions fail closed.
- Enable `MOONBRIDGE_SANDBOX=1` to avoid direct workspace mutation.
- Run Moonbridge in a container/VM for strong isolation.
- Do not expose Moonbridge to untrusted clients without additional auth controls.
## Troubleshooting
### "CLI not found"
Install the CLI for your chosen adapter:
```bash
# Kimi
uv tool install --python 3.13 kimi-cli
which kimi
# Codex
npm install -g @openai/codex
which codex
# OpenCode
curl -fsSL https://opencode.ai/install | bash
which opencode
# Gemini
npm install -g @google/gemini-cli
which gemini
```
### "auth_error" responses
Authenticate with your chosen CLI:
```bash
# Kimi
kimi login
# Codex
export OPENAI_API_KEY=sk-...
# OpenCode
opencode auth login
# Gemini
gemini # complete login flow, or set GEMINI_API_KEY
```
### Timeout errors
Adapters have sensible defaults: Codex=1800s, Kimi=600s, OpenCode=1200s, Gemini=1200s.
For exceptionally long tasks, override explicitly:
```json
{"prompt": "...", "timeout_seconds": 3600}
```
Or set per-adapter defaults via environment:
```bash
export MOONBRIDGE_CODEX_TIMEOUT=2400 # 40 minutes
export MOONBRIDGE_KIMI_TIMEOUT=900 # 15 minutes
export MOONBRIDGE_OPENCODE_TIMEOUT=1200 # 20 minutes
export MOONBRIDGE_GEMINI_TIMEOUT=1200 # 20 minutes
```
## Timeout Best Practices
| Task Type | Recommended |
|-----------|-------------|
| Quick query, status | 60-180s |
| Simple edits | 300-600s |
| Feature implementation | 1200-1800s |
| Large refactor | 1800-3600s |
Priority resolution: explicit param > adapter env > adapter default > global env > 600s fallback
### "MOONBRIDGE_ALLOWED_DIRS is not set" warning
By default, Moonbridge warns at startup if no directory restrictions are configured. This is expected for local development. For shared/production environments, set allowed directories:
```bash
export MOONBRIDGE_ALLOWED_DIRS="/path/to/project:/another/path"
```
## Sandbox Mode (Copy-on-Run)
Enable sandbox mode to run agents in a temporary copy of the working directory:
```bash
export MOONBRIDGE_SANDBOX=1
```
When enabled:
- Agents run in a temp copy of `cwd`.
- Host files stay unchanged by default.
- A unified diff + summary is included in `raw.sandbox`.
Optional:
```bash
export MOONBRIDGE_SANDBOX_KEEP=1 # keep temp dir
export MOONBRIDGE_SANDBOX_MAX_DIFF=200000
export MOONBRIDGE_SANDBOX_MAX_COPY=300000000
```
Limitations: this is not OS-level isolation. Agents can still read/write arbitrary host paths if they choose to. Use containers/VMs for strong isolation.
To enforce restrictions (exit instead of warn):
```bash
export MOONBRIDGE_STRICT=1
```
### Permission denied on working directory
Verify the directory is in your allowlist:
```bash
export MOONBRIDGE_ALLOWED_DIRS="/path/to/project:/another/path"
```
### Debug logging
Enable verbose logging:
```bash
export MOONBRIDGE_LOG_LEVEL=DEBUG
```
## Platform Support
macOS and Linux only. Windows is not supported.
## License
MIT. See `LICENSE`.
| text/markdown | null | Phaedrus <hello@mistystep.io> | null | null | null | agent, ai, claude, codex, kimi, mcp | [
"Development Status :: 4 - Beta",
"Environment :: Console",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"mcp>=1.0.0",
"mypy>=1.8; extra == \"dev\"",
"pytest-asyncio>=0.23; extra == \"dev\"",
"pytest-mock>=3.12; extra == \"dev\"",
"pytest>=8.0; extra == \"dev\"",
"ruff>=0.2; extra == \"dev\"",
"opentelemetry-api>=1.20; extra == \"telemetry\""
] | [] | [] | [] | [
"Homepage, https://github.com/misty-step/moonbridge",
"Repository, https://github.com/misty-step/moonbridge",
"Issues, https://github.com/misty-step/moonbridge/issues"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:37:37.061659 | moonbridge-0.16.0.tar.gz | 104,264 | 6f/8c/7c3a4d6c9282b70ba5e6cde1a8f67da9894dfd1b2d68a50d5cfa724080d5/moonbridge-0.16.0.tar.gz | source | sdist | null | false | ef65e20d3b129394daea53786055e546 | 0d2e8dd66cd6961cc0cc7be9ece16f493ac5fce60ddff46447881fd6119d2f95 | 6f8c7c3a4d6c9282b70ba5e6cde1a8f67da9894dfd1b2d68a50d5cfa724080d5 | MIT | [
"LICENSE"
] | 152 |
2.4 | kitchenowl-cli | 0.2.0 | CLI for interacting with the KitchenOwl API. | # kitchenowl-cli
Command-line client for KitchenOwl's `/api` endpoints, covering auth, household, recipe, shopping list, and user workflows.
## Supported CLI surface
- `kitchenowl auth [login|logout|status|signup]` — JWT login with refresh, logout and signup helpers.
- `kitchenowl category [list]` and `kitchenowl tag [list]` — discovery commands for household categories and tags.
- `kitchenowl config [show|set-default-household|server-settings]` — manage stored tokens/default household and inspect read-only server flags from `/api/health`.
- `kitchenowl expense [list|create|update|delete|overview|balance]` and `kitchenowl expense category [list|create]`.
- `kitchenowl household [list|use|get|create|update|delete]` and `kitchenowl household member [list|add|remove]`.
- `kitchenowl planner [list|add-recipe|remove-recipe|recent-recipes|suggested-recipes|refresh-suggestions]`.
- `kitchenowl recipe [list|get|add|edit|delete]` with JSON/YAML payload support and flag-based editors, including ingredients management.
- `kitchenowl recipe [search|search-by-tag]` for discovery and filtering.
- `kitchenowl shoppinglist [list|create|delete|items|add-item|add-item-by-name|add-items-from-file|suggested|remove-item|remove-items|clear]`.
- `kitchenowl user [list|get|search|create|update|delete]` for admins.
- The `run_cli_e2e.sh` script exercises login, household creation, lists, recipes, planner, expenses, and optional household cleanup.
## Install
Recommended for normal usage (installs globally via `pipx`):
```bash
pipx install kitchenowl-cli
```
The installed command is:
```bash
kitchenowl --help
```
For local development / modifying this CLI:
```bash
cd kitchenowl-cli
python3 -m pip install -e .
```
For test/development dependencies:
```bash
cd kitchenowl-cli
uv sync --extra dev
uv run pytest
```
## Publish to PyPI via GitHub Actions
This repository includes `.github/workflows/publish.yml` for Trusted Publishing.
One-time setup:
- In PyPI, create a Trusted Publisher for this GitHub repository and workflow file.
- In GitHub, keep the repository public and allow Actions to run.
Release flow:
```bash
git tag v0.1.1
git push origin v0.1.1
```
Then create a GitHub Release for that tag (or publish from the Releases UI).
When the release is published, the workflow builds and uploads `kitchenowl-cli` to PyPI.
## Quick start examples
```bash
kitchenowl auth login --server https://your-kitchenowl.example.com
# tip: both https://host and https://host/api are accepted
kitchenowl config server-settings
kitchenowl household list
kitchenowl household member list --household-id 42
kitchenowl household member add 17 --household-id 42 --admin
kitchenowl shoppinglist create "Weekly List" --household-id 42
kitchenowl shoppinglist add-item-by-name 12 Milk --description "2L"
kitchenowl shoppinglist remove-item 12 456 -y
kitchenowl shoppinglist remove-items 12 456 457 -y
kitchenowl shoppinglist clear 12 -y
kitchenowl recipe add --name "Tomato Soup" --description "Simple soup" --household-id 42 --yields 2 --time 25 --ingredient "Tomatoes|6 pcs|false" --ingredient "Salt|1 tsp|false"
kitchenowl recipe edit 123 --description "Updated" --ingredient "Basil|a handful|true"
kitchenowl recipe search --household-id 42 --query soup
kitchenowl recipe search-by-tag --tag vegetarian
kitchenowl recipe delete 123
kitchenowl planner add-recipe 123 --household-id 42 --date 2026-02-20
kitchenowl expense create --household-id 42 --name "Groceries" --amount 35.5 --paid-by 17 --paid-for 17:1 --paid-for 21:1
kitchenowl expense overview --household-id 42 --view 0 --frame 2
kitchenowl user list
kitchenowl auth signup --username newuser --name "New User"
```
`auth login` / `auth signup` will always ask for the server URL when `--server` is not provided, using your last saved server as the default.
## Read-only server settings
Inspect the public read-only settings exposed by the server health endpoint (works without login if you pass `--server`):
```bash
kitchenowl config server-settings --server https://your-kitchenowl.example.com
kitchenowl config server-settings --json
```
This currently shows:
- `open_registration`
- `email_mandatory`
- `oidc_provider`
- `privacy_policy` (if configured)
- `terms` (if configured)
## File-based recipe editing
```bash
kitchenowl recipe add --household-id 1 --from-file recipe.yml
kitchenowl recipe edit 42 --from-file recipe.yml
```
Example `recipe.yml`:
```yaml
name: Tomato Soup
description: Simple soup
time: 25
cook_time: 20
prep_time: 5
yields: 2
visibility: 0
source: ""
items:
- name: Blend
description: Blend until smooth
optional: false
ingredients:
- name: Tomatoes
description: 6 pcs
optional: false
- name: Salt
description: 1 tsp
optional: false
tags:
- soup
- vegetarian
```
## What’s not implemented yet
- Advanced category/tag management commands (create/update/delete/merge) beyond discovery/listing.
- More advanced auth/admin workflows (OIDC login, password reset/email verification, token management, deeper server admin tooling) are still API-only.
- Data portability and advanced media workflows (import/export wrappers, upload helpers, recipe scraping shortcuts) are still API-only.
## Troubleshooting
- `No server configured` or `Not authenticated`:
Run `kitchenowl auth login` and confirm `kitchenowl config show`.
- Network/transport failures:
The CLI now wraps transport errors as user-facing messages. Verify server URL, DNS, and connectivity.
- Corrupted config file:
The CLI defensively recovers malformed config content, but you can reset with:
`rm ~/.config/kitchenowl/config.json` and re-login.
- Token refresh edge cases in concurrent CLI usage:
Retry the command once; if refresh token rotation invalidates the session chain, run `kitchenowl auth login` again.
## Release Notes (v0.2 Progress)
- Stability and blocker fixes:
- Recipe edit payload handling corrected to avoid stale item overwrite behavior.
- Transport-level request failures now surface through `ApiError` with cleaner CLI messages.
- User command init/auth/config error handling aligned with Click-friendly failures.
- Config parsing hardened for malformed JSON and invalid key types.
- Signup prompt semantics and e2e admin lookup behavior fixed.
- New command surfaces:
- Planner command group.
- Expense command group with category/overview/balance flows.
- Recipe search and search-by-tag commands.
- Shopping list bulk operations.
- Category/tag discovery listing commands.
- Scope split:
- Core blocker/stability and main CLI surfaces are implemented.
- Remaining advanced admin/auth/data workflows are tracked for follow-up releases.
| text/markdown | null | null | null | null | null | null | [] | [] | null | null | >=3.10 | [] | [] | [] | [
"click>=8.1.7",
"requests>=2.31.0",
"PyYAML>=6.0.1",
"rich>=13.7.1",
"pytest>=8.1.0; extra == \"dev\""
] | [] | [] | [] | [] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:37:18.282020 | kitchenowl_cli-0.2.0.tar.gz | 27,707 | da/7f/8836299c4c466f53a0dfaf1e377d6ada118cb9d7f08c2637f075620ecd73/kitchenowl_cli-0.2.0.tar.gz | source | sdist | null | false | 61dfd6c0abe6ff378e4153b7e79fbc4c | 819b3054de2f8ae1984731a1d9ffb26b54226b314e1d50da8d8e122470ea2b01 | da7f8836299c4c466f53a0dfaf1e377d6ada118cb9d7f08c2637f075620ecd73 | null | [] | 147 |
2.4 | hydra-router | 0.16.1 | ZeroMQ-based distributed framework | # Hydra Router
A distributed network architecture based on ZeroMQ.
| text/markdown | Nadim-Daniel Ghaznavi | nghaznavi@gmail.com | null | null | null | null | [
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: 3.14"
] | [] | null | null | <4.0,>=3.11 | [] | [] | [] | [
"pyzmq<26.0.0,>=25.0.0",
"textual<9.0.0,>=8.0.0"
] | [] | [] | [] | [] | poetry/2.3.2 CPython/3.11.14 Linux/6.11.0-1018-azure | 2026-02-21T02:36:03.716162 | hydra_router-0.16.1-py3-none-any.whl | 18,858 | da/58/77c2c2aa7f5c2189efda16e917437956b6e9162fcecc1b784e62cd067596/hydra_router-0.16.1-py3-none-any.whl | py3 | bdist_wheel | null | false | 924234a0da8dd4cdf12992914fc9c85e | e5133d0ec1946d9cea91e9d5d219633f0f3452599d85059e7ca5b667a3174ea6 | da5877c2c2aa7f5c2189efda16e917437956b6e9162fcecc1b784e62cd067596 | null | [] | 149 |
2.4 | nginx-proxy-manager-mcp | 2.14.0 | MCP server for managing Nginx Proxy Manager instances | # npm-mcp
MCP server for managing Nginx Proxy Manager (NPM) instances via Claude/AI assistants. **50 tools** covering the full NPM API.
> Version tracks NPM releases — v2.14.0 targets [Nginx Proxy Manager v2.14.0](https://github.com/NginxProxyManager/nginx-proxy-manager/releases/tag/v2.14.0).
## Installation
```bash
# Install via uvx (recommended)
uvx nginx-proxy-manager-mcp
# Or install from source
uv sync
uv run nginx-proxy-manager-mcp
```
## Configuration
Add to your Claude Desktop config (`~/.config/claude/config.json`):
```json
{
"mcpServers": {
"nginx-proxy-manager": {
"command": "uvx",
"args": ["nginx-proxy-manager-mcp"],
"env": {
"NPM_URL": "http://your-npm-instance:81",
"NPM_EMAIL": "admin@example.com",
"NPM_PASSWORD": "your-password"
}
}
}
}
```
## Available Tools (50)
### Proxy Hosts (7 tools)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `list_proxy_hosts` | List all proxy hosts | — |
| `get_proxy_host` | Get proxy host by ID | `host_id` |
| `create_proxy_host` | Create a new proxy host | `domain_names`, `forward_host`, `forward_port` |
| `update_proxy_host` | Update a proxy host | `host_id` + any fields to change |
| `delete_proxy_host` | Delete a proxy host | `host_id` |
| `enable_proxy_host` | Enable a proxy host | `host_id` |
| `disable_proxy_host` | Disable a proxy host | `host_id` |
Optional create/update params: `forward_scheme`, `certificate_id`, `ssl_forced`, `block_exploits`, `advanced_config`
### Redirection Hosts (7 tools)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `list_redirection_hosts` | List all redirection hosts | — |
| `get_redirection_host` | Get redirection host by ID | `host_id` |
| `create_redirection_host` | Create HTTP redirect | `domain_names`, `forward_http_code`, `forward_domain_name` |
| `update_redirection_host` | Update a redirection host | `host_id` + any fields to change |
| `delete_redirection_host` | Delete a redirection host | `host_id` |
| `enable_redirection_host` | Enable a redirection host | `host_id` |
| `disable_redirection_host` | Disable a redirection host | `host_id` |
Optional create/update params: `forward_scheme` (auto/http/https), `preserve_path`, `certificate_id`, `ssl_forced`, `block_exploits`, `advanced_config`
### Streams (7 tools)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `list_streams` | List all TCP/UDP streams | — |
| `get_stream` | Get stream by ID | `stream_id` |
| `create_stream` | Create a TCP/UDP stream proxy | `incoming_port`, `forwarding_host`, `forwarding_port` |
| `update_stream` | Update a stream | `stream_id` + any fields to change |
| `delete_stream` | Delete a stream | `stream_id` |
| `enable_stream` | Enable a stream | `stream_id` |
| `disable_stream` | Disable a stream | `stream_id` |
Optional create/update params: `tcp_forwarding`, `udp_forwarding`, `certificate_id`
### Dead Hosts / 404 Hosts (7 tools)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `list_dead_hosts` | List all 404 dead hosts | — |
| `get_dead_host` | Get dead host by ID | `host_id` |
| `create_dead_host` | Create a 404 dead host | `domain_names` |
| `update_dead_host` | Update a dead host | `host_id` + any fields to change |
| `delete_dead_host` | Delete a dead host | `host_id` |
| `enable_dead_host` | Enable a dead host | `host_id` |
| `disable_dead_host` | Disable a dead host | `host_id` |
Optional create/update params: `certificate_id`, `ssl_forced`, `hsts_enabled`, `hsts_subdomains`, `http2_support`, `advanced_config`
### SSL Certificates (7 tools)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `list_certificates` | List all SSL certificates | — |
| `get_certificate` | Get certificate by ID | `certificate_id` |
| `request_certificate` | Request a Let's Encrypt cert | `domain_names`, `nice_name` |
| `delete_certificate` | Delete a certificate | `certificate_id` |
| `renew_certificate` | Renew a Let's Encrypt cert | `certificate_id` |
| `list_dns_providers` | List supported DNS providers | — |
| `test_http_challenge` | Test HTTP-01 ACME reachability | `domains` |
### Access Lists (5 tools)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `list_access_lists` | List all access lists | — |
| `get_access_list` | Get access list by ID | `access_list_id` |
| `create_access_list` | Create an access list | `name` |
| `update_access_list` | Update an access list | `access_list_id` + any fields to change |
| `delete_access_list` | Delete an access list | `access_list_id` |
Optional create/update params: `satisfy_any`, `pass_auth`
### Users (5 tools)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `list_users` | List all NPM users | — |
| `get_user` | Get user by ID | `user_id` |
| `create_user` | Create a new user | `name`, `email` |
| `update_user` | Update a user | `user_id` + any fields to change |
| `delete_user` | Delete a user | `user_id` |
Optional create/update params: `nickname`, `roles`, `is_disabled`
### Settings (3 tools)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `list_settings` | List all NPM settings | — |
| `get_setting` | Get a setting by ID | `setting_id` |
| `update_setting` | Update a setting | `setting_id` |
Optional update params: `value`, `meta`
### Audit Log (1 tool)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `list_audit_log` | List recent audit log entries | — |
### Reports (1 tool)
| Tool | Description | Required Params |
|------|-------------|-----------------|
| `get_host_report` | Get host count report | — |
## Usage Examples
Once configured, use natural language:
- "List all my proxy hosts"
- "Create a proxy host for example.com pointing to 192.168.1.100:8080"
- "Set up a 301 redirect from old.example.com to new.example.com"
- "Create a TCP stream forwarding port 3306 to my database server"
- "Request a Let's Encrypt certificate for api.example.com"
- "Enable the proxy host with ID 5"
- "Show me the audit log"
## Development
```bash
uv sync --dev
uv run pytest
```
## Requirements
- Python 3.10+
- Nginx Proxy Manager instance
- Valid NPM credentials
## License
MIT
| text/markdown | null | null | null | null | null | null | [] | [] | null | null | >=3.10 | [] | [] | [] | [
"httpx>=0.28.1",
"mcp>=1.0.0",
"pydantic>=2.0.0",
"pytest-asyncio>=0.24.0; extra == \"dev\"",
"pytest-cov>=6.0.0; extra == \"dev\"",
"pytest-httpx>=0.34.0; extra == \"dev\"",
"pytest>=8.0.0; extra == \"dev\"",
"ruff>=0.8.0; extra == \"dev\""
] | [] | [] | [] | [] | uv/0.9.26 {"installer":{"name":"uv","version":"0.9.26","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null} | 2026-02-21T02:35:59.863409 | nginx_proxy_manager_mcp-2.14.0.tar.gz | 130,678 | ca/5a/40a6e459ead73234132ba8629c7357b73dab1698b3a23b257c47ce6b509f/nginx_proxy_manager_mcp-2.14.0.tar.gz | source | sdist | null | false | c253f28c90d630b314a69f49a65900b7 | 4fbab573541b180b89ae1417d9480a2c04ccf240f07fdb9c9f8800a88a1fd591 | ca5a40a6e459ead73234132ba8629c7357b73dab1698b3a23b257c47ce6b509f | null | [
"LICENSE"
] | 152 |
2.4 | assert-no-inline-directives | 20260221023510 | CLI tool to assert that files contain no inline directives | # assert-no-inline-directives
A CLI tool to assert that files contain no inline directives for
clang-diagnostic, clang-format, clang-tidy, coverage, markdownlint,
mypy, pylint, and yamllint.
## Usage
```bash
assert-no-inline-directives --tools TOOLS [OPTIONS] PATH [PATH ...]
```
### Required Arguments
- `--tools TOOLS` - Comma-separated list of tools: `clang-diagnostic,clang-format,clang-tidy,coverage,markdownlint,mypy,pylint,yamllint`
- `PATH` - One or more file paths, directory paths, or glob patterns to scan
(directories are scanned recursively, globs support hidden directories)
### Optional Arguments
- `--allow PATTERNS` - Comma-separated patterns to allow
- `--count` - Print finding count only
- `--exclude PATTERNS` - Comma-separated glob patterns to exclude files
- `--fail-fast` - Exit on first finding
- `--quiet` - Suppress output, exit code only
- `-v, --verbose` - Show tools, files scanned, findings, and summary
- `--warn-only` - Always exit 0, report only
### Examples
```bash
# Check all clang tools
assert-no-inline-directives --tools clang-tidy,clang-format,clang-diagnostic src/
# Check for clang-tidy NOLINT directives in C++ files
assert-no-inline-directives --tools clang-tidy "**/*.cpp" "**/*.h"
# Check for coverage pragmas
assert-no-inline-directives --tools coverage src/
# Check for markdownlint suppressions in markdown files
assert-no-inline-directives --tools markdownlint "**/*.md"
# Check for pylint and mypy suppressions in files
assert-no-inline-directives --tools pylint,mypy src/*.py
# Scan a directory recursively
assert-no-inline-directives --tools pylint,mypy src/
# Use glob patterns (including hidden directories like .github)
assert-no-inline-directives --tools yamllint "**/*.yml" "**/*.yaml"
# Check all Python/YAML tools, exclude vendor files
assert-no-inline-directives --tools coverage,mypy,pylint,yamllint \
--exclude "*vendor*" src/ config/
# Allow specific type: ignore patterns
assert-no-inline-directives --tools mypy \
--allow "type: ignore[import]" src/*.py
# CI mode: quiet, just exit code
assert-no-inline-directives --tools pylint,mypy --quiet src/
# Verbose mode: show progress
assert-no-inline-directives --tools pylint,mypy --verbose src/
# Non-blocking check (always exit 0)
assert-no-inline-directives --tools mypy --warn-only src/
```
### Exit Codes
- `0` - No inline directives found
- `1` - One or more inline directives found
- `2` - Usage or runtime error (e.g., file not found, invalid tool)
### Output Formats
**Default format** (one finding per line):
```text
config.yaml:5:yamllint:yamllint disable
README.md:3:markdownlint:markdownlint-disable
src/example.py:10:pylint:pylint: disable
src/example.py:15:mypy:type: ignore
src/example.py:20:coverage:pragma: no cover
src/main.cpp:8:clang-tidy:NOLINT
```
**Count format** (`--count`):
```text
2
```
## Detected Directives
### clang-diagnostic (suppressions only)
- `#pragma clang diagnostic ignored`
### clang-format (suppressions only)
- `clang-format off`
### clang-tidy (suppressions only)
- `NOLINT` (including check-specific forms like `NOLINT(bugprone-*)`)
- `NOLINTBEGIN`
- `NOLINTNEXTLINE`
### coverage
- `pragma: no branch`
- `pragma: no cover`
### markdownlint (suppressions only)
- `markdownlint-capture`
- `markdownlint-configure-file`
- `markdownlint-disable`
(including rule-specific forms like `markdownlint-disable MD001`)
- `markdownlint-disable-file`
- `markdownlint-disable-line`
- `markdownlint-disable-next-line`
### mypy (suppressions only)
- `mypy: ignore-errors`
- `type: ignore` (including bracketed forms like `type: ignore[attr-defined]`)
### pylint (suppressions only)
- `pylint: disable`
- `pylint: disable-line`
- `pylint: disable-next`
- `pylint: skip-file`
### yamllint (suppressions only)
- `yamllint disable`
- `yamllint disable-file`
- `yamllint disable-line`
## Matching Behavior
- Case-insensitive matching
- Tolerates extra whitespace (e.g., `pylint: disable`, `type: ignore`)
- Only detects directives in comments (after `#` for Python/YAML, `//` and `/* */` for C/C++), not in string literals
- `clang-diagnostic` matches `#pragma` preprocessor directives on the full line
- `markdownlint` matches `<!-- markdownlint-... -->` HTML comment directives on the full line
- Does **not** flag "enable" directives (e.g., `clang-format on`, `markdownlint-enable`, `markdownlint-restore`, `NOLINTEND`, `yamllint enable`)
- Files are scanned in alphabetical order for consistent output
- Glob patterns support hidden directories (e.g., `.github`)
- Tools only check files with matching extensions:
  - `clang-diagnostic`: `.c`, `.cc`, `.cpp`, `.cxx`, `.h`, `.hpp`, `.hxx`, `.m`, `.mm`
  - `clang-format`: `.c`, `.cc`, `.cpp`, `.cxx`, `.h`, `.hpp`, `.hxx`, `.m`, `.mm`
  - `clang-tidy`: `.c`, `.cc`, `.cpp`, `.cxx`, `.h`, `.hpp`, `.hxx`, `.m`, `.mm`
  - `coverage`: `.py`, `.toml`
  - `markdownlint`: `.md`
  - `mypy`: `.py`, `.toml`
  - `pylint`: `.py`, `.toml`
  - `yamllint`: `.yaml`, `.yml`, `.toml`
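As a rough illustration (a sketch of the matching rules described above, not the tool's actual implementation), the case-insensitive, whitespace-tolerant detection could look like this — `PATTERNS` and `find_directives` are hypothetical names:

```python
import re

# Hypothetical sketch only; the real tool's internals may differ.
# Patterns are case-insensitive and tolerate extra whitespace around ':'.
PATTERNS = {
    "pylint": re.compile(
        r"pylint\s*:\s*(disable|disable-line|disable-next|skip-file)",
        re.IGNORECASE,
    ),
    "mypy": re.compile(r"type\s*:\s*ignore(\[[^\]]*\])?", re.IGNORECASE),
    "coverage": re.compile(r"pragma\s*:\s*no\s+(cover|branch)", re.IGNORECASE),
}

def find_directives(line: str, tools: list[str]) -> list[str]:
    """Return the tools whose suppression directive appears in a '#' comment."""
    # Naive split on '#'; the real tool also skips string literals and
    # handles C-style comments.
    if "#" not in line:
        return []
    comment = line.split("#", 1)[1]
    return [t for t in tools if PATTERNS[t].search(comment)]

print(find_directives("x = 1  # pylint: disable=invalid-name", ["pylint", "mypy"]))
# → ['pylint']
```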
| text/markdown | null | 10U Labs <dev@10ulabs.com> | null | null | null | null | [
"Development Status :: 4 - Beta",
"Environment :: Console",
"Intended Audience :: Developers",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Software Development :: Quality Assurance"
] | [] | null | null | >=3.10 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/10U-Labs-LLC/assert-no-inline-directives",
"Issues, https://github.com/10U-Labs-LLC/assert-no-inline-directives/issues",
"Repository, https://github.com/10U-Labs-LLC/assert-no-inline-directives"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:35:33.355555 | assert_no_inline_directives-20260221023510.tar.gz | 44,020 | fd/65/f50725279b71816b3b8130fd935cdf5fc3dc10978f1e6529c646c07a60d7/assert_no_inline_directives-20260221023510.tar.gz | source | sdist | null | false | 96647c4e41b06f1bee07ad2ea1e1d4bb | ec29ef91262c1baf135261147a963c4c2ef0292575f1192b1d95e279cb212393 | fd65f50725279b71816b3b8130fd935cdf5fc3dc10978f1e6529c646c07a60d7 | Apache-2.0 | [
"LICENSE.txt"
] | 221 |
2.4 | plato-sdk-v2 | 2.22.2 | Python SDK for the Plato API | # Plato Python SDK
Python SDK for the Plato platform. Uses [Harbor](https://harborframework.com) for agent execution.
## Installation
```bash
pip install plato-sdk-v2
# For agent functionality (requires Python 3.12+)
pip install 'plato-sdk-v2[agents]'
```
Or with uv:
```bash
uv add plato-sdk-v2
uv add 'plato-sdk-v2[agents]' # for agent support
```
## Configuration
Create a `.env` file in your project root:
```bash
PLATO_API_KEY=your-api-key
PLATO_BASE_URL=https://plato.so # optional, defaults to https://plato.so
```
Or set environment variables directly:
```bash
export PLATO_API_KEY=your-api-key
```
## Agents
The SDK uses Harbor's agent framework. All agents are `BaseInstalledAgent` subclasses that run in containers.
### Available Agents
**Harbor built-in agents** (code agents):
| Agent | Description |
|-------|-------------|
| `claude-code` | Claude Code CLI |
| `openhands` | OpenHands/All Hands AI |
| `codex` | OpenAI Codex CLI |
| `aider` | Aider pair programming |
| `gemini-cli` | Google Gemini CLI |
| `goose` | Block Goose |
| `swe-agent` | SWE-agent |
| `mini-swe-agent` | Mini SWE-agent |
| `cline-cli` | Cline CLI |
| `cursor-cli` | Cursor CLI |
| `opencode` | OpenCode |
| `qwen-coder` | Qwen Coder |
**Plato custom agents** (browser/automation):
| Agent | Description |
|-------|-------------|
| `computer-use` | Browser automation (install: `pip install plato-agent-computer-use`) |
### Python Usage
```python
from plato.agents import ClaudeCode, OpenHands, AgentFactory, AgentName
from pathlib import Path
# Option 1: Use AgentFactory
agent = AgentFactory.create_agent_from_name(
AgentName.CLAUDE_CODE,
logs_dir=Path("./logs"),
model_name="anthropic/claude-sonnet-4",
)
# Option 2: Import agent class directly
agent = ClaudeCode(
logs_dir=Path("./logs"),
model_name="anthropic/claude-sonnet-4",
)
# Option 3: Create custom BaseInstalledAgent
from plato.agents import BaseInstalledAgent
```
### CLI Usage
```bash
# Run an agent
plato agent run -a claude-code -m anthropic/claude-sonnet-4 -d swe-bench-lite
# List available agents
plato agent list
# Get agent config schema
plato agent schema claude-code
# Publish custom agent to Plato PyPI
plato agent publish ./my-agent
```
### Agent Schemas
Get configuration schemas for any agent:
```python
from plato.agents import get_agent_schema, AGENT_SCHEMAS
# Get schema for specific agent
schema = get_agent_schema("claude-code")
print(schema)
# List all available schemas
print(list(AGENT_SCHEMAS.keys()))
```
### Custom Agents
Create a custom agent by extending `BaseInstalledAgent`:
```python
from harbor.agents.installed.base import BaseInstalledAgent, ExecInput
from pathlib import Path
class MyAgent(BaseInstalledAgent):
@staticmethod
def name() -> str:
return "my-agent"
@property
def _install_agent_template_path(self) -> Path:
return Path(__file__).parent / "install.sh.j2"
def create_run_agent_commands(self, instruction: str) -> list[ExecInput]:
return [ExecInput(command=f"my-agent --task '{instruction}'")]
```
Publish to Plato PyPI:
```bash
plato agent publish ./my-agent-package
```
---
## Sessions & Environments
### Flow 1: Create Session from Environments
Use this when you want to spin up environments for development, testing, or custom automation.
```python
import asyncio
from plato.v2 import AsyncPlato, Env
async def main():
plato = AsyncPlato()
# Create session with one or more environments
# (heartbeat starts automatically to keep session alive)
session = await plato.sessions.create(
envs=[
Env.simulator("gitea", dataset="blank", alias="gitea"),
Env.simulator("kanboard", alias="kanboard"),
],
timeout=600,
)
# Reset environments to initial state
await session.reset()
# Get public URLs for browser access
public_urls = await session.get_public_url()
for alias, url in public_urls.items():
print(f"{alias}: {url}")
# Get state mutations from all environments
state = await session.get_state()
print(state)
# Cleanup
await session.close()
await plato.close()
asyncio.run(main())
```
### Flow 2: Create Session from Task
Use this when running evaluations against predefined tasks. This flow includes task evaluation at the end.
```python
import asyncio
from plato.v2 import AsyncPlato
async def main():
plato = AsyncPlato()
# Create session from task ID
session = await plato.sessions.create(task=123, timeout=600)
# Reset environments to initial state
await session.reset()
# Get public URLs for browser access
public_urls = await session.get_public_url()
for alias, url in public_urls.items():
print(f"{alias}: {url}")
# Evaluate task completion
evaluation = await session.evaluate()
print(f"Task completed: {evaluation}")
# Cleanup
await session.close()
await plato.close()
asyncio.run(main())
```
## Environment Configuration
Two ways to specify environments:
```python
from plato.v2 import Env
# 1. From simulator (most common)
Env.simulator("gitea") # default tag
Env.simulator("gitea", tag="staging") # specific tag
Env.simulator("gitea", dataset="blank") # specific dataset
Env.simulator("gitea", alias="my-git") # custom alias
# 2. From artifact ID
Env.artifact("artifact-abc123")
Env.artifact("artifact-abc123", alias="my-env")
```
## Per-Environment Operations
Access individual environments within a session:
```python
# Get all environments
for env in session.envs:
print(f"{env.alias}: {env.job_id}")
# Get specific environment by alias
gitea = session.get_env("gitea")
if gitea:
# Execute shell command
result = await gitea.execute("whoami", timeout=30)
print(result)
# Get state for this environment only
state = await gitea.get_state()
# Reset this environment only
await gitea.reset()
```
## Sync Client
A synchronous client is also available:
```python
from plato.v2 import Plato, Env
plato = Plato()
session = plato.sessions.create(
envs=[Env.simulator("gitea", alias="gitea")],
timeout=600,
)
session.reset()
public_urls = session.get_public_url()
state = session.get_state()
session.close()
plato.close()
```
## Architecture
```
plato/
├── agents/ # Harbor agent re-exports + schemas
├── sims/ # Simulator clients (Spree, Firefly, etc.)
├── world/ # World/environment abstractions
├── v1/ # Legacy SDK + CLI
└── v2/ # New API client
```
## Documentation
- [Generating Simulator SDKs](docs/GENERATING_SIM_SDKS.md) - How to create API clients for simulators
- [Building Simulators](BUILDING_SIMS.md) - Internal docs for snapshotting simulators
## License
MIT
| text/markdown | null | Plato <support@plato.so> | null | null | null | null | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Typing :: Typed"
] | [] | null | null | <3.14,>=3.10 | [] | [] | [] | [
"aiohttp>=3.8.0",
"boto3",
"cloudpickle>=3.0.0",
"cryptography>=43.0.0",
"datamodel-code-generator>=0.43.0",
"email-validator>=2.0.0",
"fastmcp>=3.0.0",
"google-genai>=1.0.0",
"httpx>=0.25.0",
"jinja2>=3.1.0",
"litellm>=1.40.0",
"openapi-pydantic>=0.5.1",
"opentelemetry-api>=1.20.0",
"opentelemetry-exporter-otlp-proto-http>=1.20.0",
"opentelemetry-sdk>=1.20.0",
"pydantic-settings>=2.12.0",
"pydantic>=2.0.0",
"python-dotenv>=1.2.1",
"pyyaml>=6.0",
"requests>=2.32.5",
"rich>=13.0.0",
"tenacity>=9.1.2",
"tomli>=2.0.0",
"typer>=0.9.0",
"aiomysql>=0.2; extra == \"db-cleanup\"",
"aiosqlite>=0.20; extra == \"db-cleanup\"",
"asyncpg>=0.29; extra == \"db-cleanup\"",
"sqlalchemy[asyncio]>=2.0; extra == \"db-cleanup\"",
"basedpyright>=1.18; extra == \"dev\"",
"pytest-asyncio>=0.21; extra == \"dev\"",
"pytest>=7.0; extra == \"dev\"",
"ruff>=0.1; extra == \"dev\""
] | [] | [] | [] | [] | uv/0.10.4 {"installer":{"name":"uv","version":"0.10.4","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true} | 2026-02-21T02:33:15.569813 | plato_sdk_v2-2.22.2-py3-none-any.whl | 917,489 | 48/ab/193f52be9db3ca48e9f428a8fe8014d8d7d88d655f43b6a51355bc61fe7a/plato_sdk_v2-2.22.2-py3-none-any.whl | py3 | bdist_wheel | null | false | d4270c918829e89d93f42a2fb5f6e1e1 | 3b2c191db9d6f1afcc272960a35d8e743f3656759515b2d3fb23031b739cebd7 | 48ab193f52be9db3ca48e9f428a8fe8014d8d7d88d655f43b6a51355bc61fe7a | MIT | [] | 1,183 |
2.4 | ollama-agentic | 0.1.7 | A beautiful, agentic CLI for Ollama — run local LLMs with auto tool-calling, memory, and more | # ollama-agentic
A beautiful, agentic terminal interface for [Ollama](https://ollama.com) — run local LLMs with auto tool-calling, long-term memory, iterative code debugging, and more.



## Install
```bash
pip install ollama-agentic
ollama-cli
```
Ollama is installed and started automatically if not already present.
---
## Features
- ⚡ **Auto mode** — model autonomously calls tools to complete tasks (`/auto`)
- 🔁 **Iterative debug loop** — `/run file.py` auto-fixes errors until code passes
- 📋 **Plan executor** — `/plan <goal>` breaks goals into typed steps and executes them
- 🧠 **Long-term memory** — `/remember` stores facts that persist across sessions
- 📦 **Auto-installs Ollama** — detects if Ollama is missing and installs it for you
- 🚀 **Auto-starts Ollama** — spins up `ollama serve` automatically if not running
- ⬇️ **Arrow-key model picker** — `/install` lets you browse and download 25+ models
- 🔧 **Agent tools** — `/shell`, `/file`, `/fetch`, `/ls` inject real context into chats
- 💾 **Conversation saving** — `/save` and `/load` persist chats as JSON
- 🎭 **Personas** — save and load system prompt presets
- 🆚 **Compare mode** — run the same prompt through two models side by side
---
## Usage
```bash
ollama-cli # start chatting
ollama-cli --model qwen2.5:7b # start with a specific model
ollama-cli --auto # start in autonomous agent mode
ollama-cli --compare # compare two models side by side
```
---
## Commands
### Chat & Navigation
| Command | Description |
|---|---|
| `/cls` | Clear screen (keep context) |
| `/clear` | Clear conversation and screen |
| `Ctrl+L` | Clear screen |
| `/retry` | Regenerate last response |
| `/tokens` | Toggle token count display |
### Models
| Command | Description |
|---|---|
| `/model` | Switch active model (arrow-key picker) |
| `/current` | Show currently active model |
| `/install` | Browse & install models from catalogue |
| `/models` | List all installed models |
| `/compare` | Compare two models side by side |
### Agentic
| Command | Description |
|---|---|
| `/auto` | Toggle autonomous tool-calling mode |
| `/plan <goal>` | Break a goal into steps and execute |
| `/run <file.py>` | Run code, auto-fix errors in a loop |
### Memory
| Command | Description |
|---|---|
| `/remember <fact>` | Store a fact in long-term memory |
| `/memories` | List all stored memories |
| `/forget <id>` | Delete a memory by ID |
### Context Injection
| Command | Description |
|---|---|
| `/file <path>` | Load a file into context |
| `/shell <cmd>` | Run a shell command, inject output |
| `/fetch <url>` | Fetch a webpage into context |
| `/ls <path>` | Inject a directory listing |
| `/context` | View or clear active injections |
### Conversations & Personas
| Command | Description |
|---|---|
| `/save <n>` | Save conversation |
| `/load <n>` | Load conversation |
| `/list` | List saved conversations |
| `/system <prompt>` | Set a system prompt |
| `/persona <n>` | Load a saved persona |
| `/personas` | List saved personas |
| `/save-persona <n>` | Save current system prompt as persona |
---
## Agent Mode
Toggle with `/auto` or launch with `--auto`. In auto mode the model can call tools, read results, and loop until the task is done — no manual `/file` or `/shell` needed.
```
⚡ you › look at main.py and find any bugs
⚡ you › write a web scraper for hacker news and run it
⚡ you › set up a basic Flask app in this folder
```
---
## Config & Data
All config and data is stored in your home directory:
| Path | Description |
|---|---|
| `~/.ollama_cli_config.json` | Settings (model, auto mode, etc) |
| `~/.ollama_cli_history` | Input history |
| `~/.ollama_cli_memory.json` | Long-term memories |
| `~/.ollama_cli_saves/` | Saved conversations |
| `~/.ollama_cli_personas/` | Saved personas |
---
## Requirements
- Python 3.10+
- macOS, Linux, or Windows
- Ollama (handled automatically on first run)
---
## Roadmap
- [ ] MCP server — expose tools to Claude Code, Cursor, and other agents
- [ ] Repo-aware context — auto-index codebase on launch from a project folder
- [ ] Git tools — `/diff`, `/commit`, `/log`
- [ ] API key integrations — Claude, OpenAI, Gemini, Groq as model backends
- [ ] Symbol search across codebase
---
## Contributing
PRs and issues welcome at [github.com/Akhil123454321/ollama-cli](https://github.com/Akhil123454321/ollama-cli). Keep changes focused and include tests where appropriate.
## License
MIT — see [LICENSE](LICENSE)
| text/markdown | null | null | null | null | Copyright (c) 2026 Akhil Sagaran Kasturi | ollama, llm, cli, ai, agent, local-ai, terminal | [
"Development Status :: 4 - Beta",
"Environment :: Console",
"Intended Audience :: Developers",
"Intended Audience :: End Users/Desktop",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Scientific/Engineering :: Artificial Intelligence",
"Topic :: Terminals"
] | [] | null | null | >=3.10 | [] | [] | [] | [
"rich>=13.0",
"prompt_toolkit>=3.0",
"ollama>=0.4",
"requests>=2.28",
"beautifulsoup4>=4.11",
"build; extra == \"dev\"",
"twine; extra == \"dev\"",
"pytest; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/Akhil123454321/ollama-cli",
"Repository, https://github.com/Akhil123454321/ollama-cli",
"Issues, https://github.com/Akhil123454321/ollama-cli/issues"
] | twine/6.2.0 CPython/3.12.8 | 2026-02-21T02:33:09.666750 | ollama_agentic-0.1.7.tar.gz | 18,120 | fd/49/bf1f2b9b1af4b7b9891b276cdc37c0d236082a6566472f0e5e775e55947b/ollama_agentic-0.1.7.tar.gz | source | sdist | null | false | 19cb9b80ff5ac3e59dd3a2b3919ad668 | f2e0411a849928989fee586e589d4dcf45498cbed4148a6a9b87a63c4e853aea | fd49bf1f2b9b1af4b7b9891b276cdc37c0d236082a6566472f0e5e775e55947b | null | [
"LICENSE"
] | 166 |
2.4 | playpy | 0.1.0 | PlayPy is a lightweight UI toolkit built off pygame designed to make building apps on python easier. | # PlayPy (0.1.0)
PlayPy is a lightweight UI toolkit built on top of `pygame`, designed to make building apps in Python easier.
It provides a small scene-based workspace, UI elements, style modifiers, and decorator-based event hooks.
## Requirements
- Python `>=3.11`
- `pygame-ce >=2.5.6`
## Installation
To install PlayPy, enter the following in your terminal:
```bash
pip install playpy
```
This installs PlayPy so it can be imported in any Python project.
## Core Concepts
### Workspace
`Workspace` owns the window, render loop, input state, scene stack, and modal stack.
Key methods:
- `run()` starts the loop
- `quit()` stops the loop
- `set_scene(scene)` replaces the current scene
- `push_scene(scene)` / `pop_scene()` stacks scenes
- `push_modal(element)` / `pop_modal()` / `clear_modals()` manages overlays
### Layout Values
PlayPy uses two rectangle value types:
- `FRectValue(x, y, w, h)`
Relative scale (fraction of parent rectangle).
- `RectValue(x, y, w, h)`
Absolute pixel offsets applied on top of scale.
Final element rect = `scale * parent_size + offset`.
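The layout formula above can be sketched in plain Python (illustrative only, not PlayPy's internal code; `resolve_rect` is a hypothetical name):

```python
# Sketch of "final rect = scale * parent_size + offset" — not PlayPy internals.
def resolve_rect(fract, offset, parent_w, parent_h):
    """Combine an FRectValue-style scale with a RectValue-style pixel offset.

    fract  = (fx, fy, fw, fh) as fractions of the parent rect
    offset = (ox, oy, ow, oh) in absolute pixels
    """
    fx, fy, fw, fh = fract
    ox, oy, ow, oh = offset
    return (
        fx * parent_w + ox,
        fy * parent_h + oy,
        fw * parent_w + ow,
        fh * parent_h + oh,
    )

# A half-size panel, centered, nudged 10 px right/down, in an 800x600 parent:
print(resolve_rect((0.25, 0.25, 0.5, 0.5), (10, 10, 0, 0), 800, 600))
# → (210.0, 160.0, 400.0, 300.0)
```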
### Parenting
Any `UIElement` can contain children. Assigning a parent wires the relationship up automatically:
```python
child.parent = parent
```
Or use helpers:
```python
parent.add_child(child)
```
## Built-in Elements
- `UIPanel`: colored rectangle container
- `UIScrollablePanel`: panel with wheel scrolling
- `UIText`: wrapped text rendering with alignment
- `UIButton`: clickable button with hover/pressed colors
- `UITextbox`: single-line text input with placeholder/caret
- `Scene`: root container for scene lifecycle
## Modifiers
Modifiers attach style/behavior to a single element.
- `UIPadding(scale=0, offset=10)`
- `UIOutline(color, width, edge_type)`
- `UIBorderRadius(radius)`
- `UIGradient(start_color, end_color, direction)`
- `UIFont(font_path=None, font_size=None, bold=None, italic=None, antialias=None)`
Attach/remove/get:
```python
element.set_modifier(plp.UIOutline((0, 0, 0), 2, "middle"))
outline = element.get_modifier(plp.UIOutline)
element.remove_modifier(plp.UIOutline)
```
## Event Helpers
Decorator helpers create `Event` elements attached to a workspace, scene, or element.
- `@on_start(target)`
- `@on_update(target)`
- `@on_quit(target)`
- `@on_scene_change(target)`
- `@on_modal_change(target)`
- `@create_event(target)(condition)` for custom conditions
Example:
```python
@plp.on_update(ws)
def tick(w: plp.Workspace):
if plp.pg.K_ESCAPE in w.input.key_downs:
w.quit()
```
## Scene and Modal Behavior
- If a modal is active, input is only routed to the modal tree
- `Workspace` tracks scene/modal transitions with:
- `current_scene`, `previous_scene`, `scene_changed`
- `current_modal`, `previous_modal`, `modal_changed`
- Scenes can implement lifecycle hooks:
- `on_enter`, `on_exit`, `on_pause`, `on_resume`
## Input State
Read per-frame input from `workspace.input`:
- `keys_pressed`, `key_downs`, `key_ups`
- `mouse_buttons_pressed`, `mouse_downs`, `mouse_ups`
- `mouse_pos`, `mouse_wheel`
- `text_input`
- `dt`, `runtime`, `quit`
| text/markdown | null | angle <angyv2861@gmail.com> | null | null | MIT License
Copyright (c) 2026 Angel Christopher Ventura
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. | null | [] | [] | null | null | >=3.11 | [] | [] | [] | [
"pygame-ce>=2.5.6"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.14.3 | 2026-02-21T02:32:30.567782 | playpy-0.1.0.tar.gz | 14,418 | df/89/50dbbd55b875074f1b047fdeccc84c6aa8e2f37a9f39cb96ae2a58bd98aa/playpy-0.1.0.tar.gz | source | sdist | null | false | e020115a9eadc33d668c64665dff2c50 | 190523769adb7262b37bf60df5f6971b8606790fbc854a733c24dd7337282421 | df8950dbbd55b875074f1b047fdeccc84c6aa8e2f37a9f39cb96ae2a58bd98aa | null | [
"LICENSE"
] | 169 |
2.4 | preduce | 0.3.0 | Compress LLM prompts 50%+ while preserving meaning. Lightweight API client. | # preduce
Compress LLM prompts 50%+ while preserving meaning. Cut your API costs in half with one line of code.
## Install
```bash
pip install preduce
```
## Quick Start
```python
from preduce import compress
result = compress("Your verbose text here...", api_key="your-key")
print(result.compressed_text) # compressed output
print(result.category) # GENERAL, FINANCIAL, MEDICAL, CODE, ACADEMIC, SUPPORT, or MARKETING
print(result.token_reduction_pct) # e.g. 52.3
print(result.stats) # full reduction metrics
```
## Category Override
```python
result = compress(
    "Total revenue for fiscal year 2025 was $847,300,000, "
    "representing an increase of 22.4% year-over-year.",
    api_key="your-key",
    category="FINANCIAL",
)
# → "Total rev FY2025 $847.3M, +22.4% YoY."
```
## Categories
| Category | What it compresses | Example |
|---|---|---|
| **GENERAL** | Business docs, emails, reports | "we are pleased to announce" → removed |
| **FINANCIAL** | Earnings, metrics, filings | "$847,300,000" → "$847.3M" |
| **MEDICAL** | Clinical notes, discharge summaries | "type 2 diabetes mellitus" → "T2DM" |
| **CODE** | Preserves code, compresses prose | Code untouched, comments compressed |
| **ACADEMIC** | Research papers, dissertations | "statistically significant" → "sig." |
| **SUPPORT** | Support tickets, help desk | "thank you for contacting" → "thanks for contacting" |
| **MARKETING** | Marketing copy, promotions | "limited time offer" → "limited time" |
## Response
```python
result.compressed_text # str — the compressed text
result.category # str — detected or overridden category
result.original_tokens # int — estimated original token count
result.compressed_tokens # int — estimated compressed token count
result.confidence_pct # dict — confidence % per category (e.g. {"FINANCIAL": 85.2, "GENERAL": 14.8, ...})
result.token_reduction_pct # float — percentage reduction
result.char_reduction_pct # float — percentage reduction
result.stats # dict — full metrics
result.to_dict() # dict — serialize everything
```
## Confidence Scores
The API returns confidence percentages for each category, useful for debugging classification:
```python
result = compress("Revenue increased 15% this quarter.", api_key="your-key")
print(result.confidence_pct)
# {"FINANCIAL": 78.5, "GENERAL": 12.3, "MARKETING": 9.2, ...}
print(result.category) # "FINANCIAL"
```
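Since `confidence_pct` is a plain dict, the detected category is simply its highest-valued key — a one-liner you can check locally (values below are illustrative, not real API output):

```python
# Illustrative confidence dict, shaped like result.confidence_pct
confidence_pct = {"FINANCIAL": 78.5, "GENERAL": 12.3, "MARKETING": 9.2}

# The detected category corresponds to the highest-confidence key.
top_category = max(confidence_pct, key=confidence_pct.get)
print(top_category)  # → FINANCIAL
```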
## Error Handling
```python
from preduce import compress
try:
    result = compress("text", api_key="invalid-key")
except PermissionError:
    print("Bad API key or quota exceeded")
except ValueError:
    print("Invalid input (empty text or bad category)")
except RuntimeError:
    print("API error")
```
## Get an API Key
Visit [preduce.dev](https://preduce.dev) to get your API key.
## License
MIT
| text/markdown | Eddy Ding | null | null | null | MIT | prompt, compression, llm, tokens, openai, gpt, claude, cost-reduction | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Text Processing",
"Topic :: Scientific/Engineering :: Artificial Intelligence",
"Typing :: Typed"
] | [] | null | null | >=3.8 | [] | [] | [] | [
"requests>=2.20.0",
"pytest>=7.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/eddydings/preduce",
"Documentation, https://github.com/eddydings/preduce#readme",
"Issues, https://github.com/eddydings/preduce/issues"
] | twine/6.2.0 CPython/3.12.0 | 2026-02-21T02:31:02.286960 | preduce-0.3.0.tar.gz | 5,951 | b7/38/6372acca654ee38aec248dbd4684ac86cc0d23e59b6c73864f3493f4b043/preduce-0.3.0.tar.gz | source | sdist | null | false | b86d89a24e23e860325f08712966898e | 6010d1df41678e7b08be6c08ba0e8db1d81bcca03579e0d5b55f77c7e8cf4500 | b7386372acca654ee38aec248dbd4684ac86cc0d23e59b6c73864f3493f4b043 | null | [] | 162 |
2.4 | standup-cli-tool | 0.1.0 | ⚡ Generate your daily standup from git commits — right in your terminal | # ⚡ standup-cli
> Generate your daily standup from git commits — right in your terminal.
Never manually write a standup again. `standup-cli` scans your git commits from the last 24 hours, asks what you're working on today and if you have blockers, then formats a clean standup message ready to paste anywhere.
```
$ standup
⚡ standup-cli
Generate your daily standup in seconds
🔍 Scanning git commits from last 24hrs...
✅ Found 3 commit(s):
• Fixed auth bug in login flow
• Updated API documentation
• Refactor user model
🚀 What are you working on today?
> Integrating Stripe payment API
🚧 Any blockers? (press Enter for "None")
> None
──────────────────────────────────────────────────
✅ Your Standup [plain]
Yesterday: Fixed auth bug in login flow, Updated API documentation, Refactor user model
Today: Integrating Stripe payment API
Blockers: None
──────────────────────────────────────────────────
💡 Tip: use --format slack | markdown | plain
```
## Install
**via npm:**
```bash
npm install -g standup-cli
```
**via pip:**
```bash
pip install standup-cli
```
## Usage
```bash
# Default (plain text output)
standup
# Slack-ready output
standup --format slack
# Markdown output
standup --format markdown
```
## Output Formats
**Plain** (default) — paste anywhere:
```
Yesterday: Fixed auth bug, updated docs
Today: Stripe integration
Blockers: None
```
**Slack** — with bold formatting:
```
*📋 Yesterday:* Fixed auth bug, updated docs
*🚀 Today:* Stripe integration
*🚧 Blockers:* None
```
**Markdown** — for GitHub, Notion, etc:
```markdown
### Daily Standup
**Yesterday:**
Fixed auth bug, updated docs
**Today:**
Stripe integration
**Blockers:**
None
```
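All three formats are just different templates around the same three fields. A minimal sketch of that formatting step (a hypothetical helper, not the tool's actual code):

```python
def format_standup(yesterday: str, today: str,
                   blockers: str = "None", fmt: str = "plain") -> str:
    """Render a standup in plain, slack, or markdown style (illustrative)."""
    if fmt == "slack":
        return (f"*📋 Yesterday:* {yesterday}\n"
                f"*🚀 Today:* {today}\n"
                f"*🚧 Blockers:* {blockers}")
    if fmt == "markdown":
        return (f"### Daily Standup\n\n"
                f"**Yesterday:**\n{yesterday}\n\n"
                f"**Today:**\n{today}\n\n"
                f"**Blockers:**\n{blockers}")
    # plain (default)
    return f"Yesterday: {yesterday}\nToday: {today}\nBlockers: {blockers}"

print(format_standup("Fixed auth bug", "Stripe integration", fmt="slack"))
```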
## How it works
1. Runs `git log --since="24 hours ago"` in your current directory
2. Prompts you for today's focus and any blockers
3. Formats and prints your standup
> **Tip:** Run it from your project root for best results. Works with any git repo.
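The commit-scanning step above can be sketched in a few lines of Python; the helper names here are illustrative, not the tool's internals:

```python
import subprocess

def git_log_cmd(since: str = "24 hours ago") -> list[str]:
    """Build the git log invocation described above (one subject line per commit)."""
    return ["git", "log", f"--since={since}", "--pretty=format:%s"]

def recent_commits(since: str = "24 hours ago") -> list[str]:
    """Run git log in the current directory and return commit subjects."""
    out = subprocess.run(git_log_cmd(since), capture_output=True, text=True)
    return [line for line in out.stdout.splitlines() if line.strip()]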
## Roadmap (v1 ideas)
- [ ] Copy to clipboard automatically
- [ ] Support multiple repos
- [ ] `.standuprc` config file for team name, format preference
- [ ] Weekly summary mode
## License
MIT © [Muhammad Talha Khan](https://github.com/muhtalhakhan)
| text/markdown | null | null | null | null | MIT | standup, cli, git, developer, productivity, scrum | [
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Environment :: Console",
"Topic :: Utilities"
] | [] | null | null | >=3.8 | [] | [] | [] | [] | [] | [] | [] | [
"Homepage, https://github.com/muhtalhakhan/standup-cli",
"Repository, https://github.com/muhtalhakhan/standup-cli",
"Issues, https://github.com/muhtalhakhan/standup-cli/issues"
] | twine/6.2.0 CPython/3.12.4 | 2026-02-21T02:30:17.351432 | standup_cli_tool-0.1.0.tar.gz | 3,596 | ca/21/d544f95826909b8899de2f32857ac53a4931e21245f6578f9368e115ccbf/standup_cli_tool-0.1.0.tar.gz | source | sdist | null | false | e74df002f8d415e6ed3df7d02e6f5af8 | d1a33bdb94e1b9c1d4c646ee71797458d02b3246ecd250bd7058eb73b99a854a | ca21d544f95826909b8899de2f32857ac53a4931e21245f6578f9368e115ccbf | null | [] | 177 |
2.4 | filerohr | 0.6.0 | A pipeline-based file processing library. | # filerohr
filerohr is a pipeline-based file processing library and CLI tool.
Users can configure a custom processing pipeline to suit their needs
using freely interchangeable tasks.
filerohr comes with a number of built-in tasks that specialize in
audio processing, all based on ffmpeg and ffprobe.
Adding new task definitions is relatively easy and
only requires some knowledge of Python.
NOTE: filerohr is currently a proof of concept and doesn’t have any tests yet.
filerohr’s name is a wordplay on the German _Fallrohr_ (literally _downpipe_).
## CLI
filerohr comes with a CLI that can be executed with `uv run python -m filerohr`
Quick start:
```shell
# validate a pipeline config
uv run python -m filerohr validate-config my-pipeline.yaml
# Show available jobs
uv run python -m filerohr list-jobs
# import a file
uv run python -m filerohr import-file --config my-pipeline.yaml ~/Music/song.mp3
```
## Configuration
filerohr itself is mostly configured through environment variables.
The pipeline configuration is YAML and is documented below.
### Environment variables
Environment variable configuration options include:
`TZ`
: The local timezone (e.g. `Europe/Vienna`)
`FILEROHR_FFMPEG_BIN`
: Path to the ffmpeg binary
`FILEROHR_FFPROBE_BIN`
: Path to the ffprobe binary
`FILEROHR_DATA_DIR`
: Path to the data directory
`FILEROHR_TMP_DIR`
: Path to the temporary directory
`FILEROHR_TASK_MODULES`
: comma-separated list of Python modules to import as task definitions
`FILEROHR_PIPELINE_CONFIG_DIR`
: Path to the pipeline configuration file directory
Most of these variables will be set to sensible defaults
in the upcoming container image.
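A typical local setup might export something like the following (all paths and module names are illustrative):

```shell
export TZ=Europe/Vienna
export FILEROHR_FFMPEG_BIN=/usr/bin/ffmpeg
export FILEROHR_FFPROBE_BIN=/usr/bin/ffprobe
export FILEROHR_DATA_DIR=$HOME/filerohr/data
export FILEROHR_TMP_DIR=/tmp/filerohr
export FILEROHR_TASK_MODULES=my_tasks,my_other_tasks
export FILEROHR_PIPELINE_CONFIG_DIR=$HOME/filerohr/pipelines
```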
### Pipeline configuration
The following pipeline configuration is based on filerohr’s built-in tasks:
```yaml
# You can give pipelines a name.
# That makes it easy to reference them in an API or with the CLI.
name: audio
# Pipelines can be marked as default (but only one).
use_as_default: true
jobs:
  # Downloads the file if a remote URL was provided.
  # Skips it otherwise.
  - job: download_file
    match_content_type: ["audio/*", "video/*", "pass_unset"]
    max_download_size_mib: 25
  # Logs the file size.
  - job: log_file_size
  # Sanitizes the file as audio/video.
  # Automatically skips the file if it is not audio/video.
  - job: sanitize_av
  # Extracts all audio streams as separate files.
  - job: extract_audio
  # Extract metadata from the audio files.
  - job: extract_audio_metadata
  # Normalize the audio file with the podcast preset.
  - job: normalize_audio
    preset: podcast
  # Converts the audio file (if necessary).
  # Only allow opus and flac and convert to flac as needed.
  - job: convert_audio
    allowed_formats: ["opus", "flac"]
    fallback_format: flac
  # Now that the last job that could have changed the audio format is finished,
  # we can extract the mime type from the audio file.
  - job: extract_mime_type
  # Log the file size again.
  # This time it will include the reduction in file size in percent.
  - job: log_file_size
  # Hash the file content with SHA3-512.
  # This will be re-used in store_by_content_hash.
  - job: hash_file
    alg: sha3_512
  # Store the files by their content hash.
  - job: store_by_content_hash
    storage_dir: /home/you/data/audio/by-hash
    # Emit the stored file in the hash-based directory as a pipeline result.
    emit: true
  # Additionally, store by upload date, but only symlink to files in hash storage.
  - job: store_by_creation_date
    storage_dir: /home/you/data/audio/by-date
    symlink: true
    # Do not emit but keep the file in the upload-date-based directory.
    keep: true
```
Note that you can include jobs multiple times.
This can be helpful, e.g., for logging file sizes
before and after converting audio streams.
## Built-in tasks
### convert_audio
Ensures an audio file matches the allowed formats.
Audio files that don’t match the formats will be converted
to the fallback format.
Note: This job must be placed after an `extract_audio` job.
#### Configuration options:
##### skip: `boolean`, default: `False`
##### allowed_formats: `string[]`, _required_
List of allowed audio formats. Use 'any' to allow any audio format.
Examples:
- `['vorbis', 'flac', 'opus']`
##### fallback_format: `string`, default: `'flac'`
The format to fallback to if the detected format is not allowed.
ffmpeg often has different encoders for the same audio format. Some of
these encoders are experimental; e.g., you probably want to
use `libopus` over `opus`. If you select an experimental encoder, you
might have to add `['-strict', '-2']` to
`fallback_format_encoder_args` to avoid errors.
##### fallback_format_ext: `string | null`, default: `null`
The file extension of the fallback format. This is automatically
inferred for the most common audio formats.
##### fallback_format_encoder_args: `string[]`, _required_
Additional arguments passed to ffmpeg when encoding to the fallback
format.
### discard_remote
Stops the pipeline if the current file is not a local file.
#### Configuration options:
##### skip: `boolean`, default: `False`
### download_file
Downloads files from remote sources.
When called with a local file path, the job will simply be skipped.
#### Configuration options:
##### skip: `boolean`, default: `False`
##### storage_dir: `string | null`, default: `null`
Base directory to store downloaded files in.
##### max_download_size_mib: `number`, default: `1024`
Maximum size allowed for downloaded files in MiB (mebibytes). Use `0`
for no limit.
##### allow_streams: `boolean`, default: `False`
Whether to download files from sources that are streaming responses
and do not have a fixed file size.
##### allowed_protocols: `('http' | 'https')[]`, default: `['http', 'https']`
List of allowed protocols to download files from.
##### timeout_seconds: `integer`, default: `600`
Timeout in seconds for downloading.
##### follow_redirects: `boolean`, default: `True`
Follow redirects when downloading.
##### match_content_type: `(string | 'pass_unset')[] | null`, default: `null`
List of mime types to check. Supports glob patterns like `audio/*`.
Add `pass_unset` to the list if you want this check to pass in case no
Content-Type was given in the server’s response headers. Setting
`match_content_type` to `null` will allow all content types.
##### download_chunk_size_mib: `number`, default: `1`
Size of chunks to download in MiB (mebibytes).
### extract_audio
Extracts all audio streams in the job’s file and saves
them as separate files.
In case more than one stream is found, this job will
spawn a subsequent job for each stream file.
If you only want to extract a single stream,
set `count: 1` in the job configuration.
#### Configuration options:
##### skip: `boolean`, default: `False`
##### count: `integer`, default: `0`
Maximum number of audio streams to extract. This is useful for files
that contain multiple audio streams. Extracts all audio streams if
number is `0`.
### extract_audio_metadata
Extracts metadata, like duration, artist, title, album
from the job’s file.
#### Configuration options:
##### skip: `boolean`, default: `False`
### extract_mime_type
Extracts the mime type of the job’s file.
#### Configuration options:
##### skip: `boolean`, default: `False`
### ffmpeg
Run a custom ffmpeg command with args specified in the job configuration.
Ensure that you include `{input_file}` and `{output_file}` placeholders.
#### Configuration options:
##### skip: `boolean`, default: `False`
##### args: `string[]`, _required_
FFMPEG command line arguments. Must contain `{input_file}` and
`{output_file}` as literal placeholder strings for actual file paths.
Examples:
- `['-i', '{input_file}', '-vn', '{output_file}']`
##### output_format: `string`, _required_
Output file format. Must be a valid file extension understood by
ffmpeg.
Examples:
- `'.mp3'`
- `'.flac'`
### hash_file
Hash the file content.
#### Configuration options:
##### skip: `boolean`, default: `False`
##### alg: `'blake2b' | 'blake2s' | 'md5' | 'sha1' | 'sha224' | 'sha256' | 'sha384' | 'sha3_224' | 'sha3_256' | 'sha3_384' | 'sha3_512' | 'sha512' | 'shake_128' | 'shake_256'`, default: `'sha256'`
Hash algorithm to use.
### keep_file
Keep the current job file and do not delete it after
the pipeline finishes.
This can be helpful to debug intermediate job output.
#### Configuration options:
##### skip: `boolean`, default: `False`
### log_file_size
Logs the file size in MiB.
If used multiple times in the same pipeline config, it will also
display the change in file size compared to the reference.
#### Configuration options:
##### skip: `boolean`, default: `False`
##### force_update_reference: `boolean`, default: `False`
If set to true, the reference used to calculate changes in the file
size between jobs will be updated. Implicitly `true` on first call.
### normalize_audio
Normalizes the audio stream with [ffmpeg-normalize](https://slhck.info/ffmpeg-normalize/).
If no additional options are given, the `podcast` preset will be used.
Note: This job must be placed after an `extract_audio` job.
#### Configuration options:
##### skip: `boolean`, default: `False`
##### preset: `'music' | 'podcast' | 'streaming-video' | null`, default: `null`
The audio normalization preset to use. See https://slhck.info/ffmpeg-
normalize/usage/presets/ for available presets. Mutually exclusive
with `args`.
##### args: `string[]`, _required_
ffmpeg-normalize command line arguments. See
https://slhck.info/ffmpeg-normalize/usage/cli-options/ for available
options. Mutually exclusive with `preset`.
### sanitize_av
Sanitizes the job file as an audio/video stream.
This performs a simple copy operation for all streams in the
job’s file and ensures that the resulting file is playable.
If the file is not an audio file, it will be skipped by default.
#### Configuration options:
##### skip: `boolean`, default: `False`
##### error_handling: `'ignore' | 'ignore_minor'`, default: `'ignore'`
How to handle errors during file sanitization of broken files.
`ignore` will ignore all but critical errors in the audio and
corresponds to ffmpeg’s `-err_detect ignore_err`. The resulting file
will play, but may not be pleasant to listen to. `ignore_minor` will
let minor errors pass and corresponds to ffmpeg’s `-err_detect
careful`.
Examples:
- `'ignore'`
- `'ignore_minor'`
##### skip_invalid: `boolean`, default: `True`
Silently skip files that do not contain audio.
### store_by_content_hash
Stores the job file in a directory-tree based on the file’s content hash.
The generated directory structure looks like this:
```
audio_dir/
  8d/
    a9/
      8da9bce68a6aebdcba325cf21402c78c1628c9da1278b817a600cdd92b720653.flac
      8da9fc4939da378a720ba1ba310d3d7a1a85e44b79cd4c68bfc4bd3081f01062.flac
  e0/
    0a/
      e00afc4939da378a720ba1ba310d3d7a1a85e44b79cd4c68bfc4bd3081f01062.flac
```
#### Configuration options:
##### skip: `boolean`, default: `False`
##### storage_dir: `string | null`, default: `null`
Base directory to store files in.
##### symlink: `boolean`, default: `False`
If set to true, a symlink is created to the file from the previous
job. In that case, the previous job must create the file in a
persistent storage location.
##### relative: `boolean`, default: `True`
If set to true, the symlink will be relative to the storage path.
##### keep: `boolean`, default: `False`
If set to true, the file is kept after the pipeline finishes.
##### emit: `boolean`, default: `False`
If set to true, the file is emitted as a pipeline result. Implies
keep.
##### alg: `'blake2b' | 'blake2s' | 'md5' | 'sha1' | 'sha224' | 'sha256' | 'sha384' | 'sha3_224' | 'sha3_256' | 'sha3_384' | 'sha3_512' | 'sha512' | 'shake_128' | 'shake_256' | null`, default: `null`
Hash algorithm to use. If unset the job will try to re-use an existing
file hash calculated in a `hash_file` job. Otherwise, a new hash will
be calculated and stored along with the file. In case the file already
has a hash but uses a different algorithm, a new hash will be
calculated for storage but the hash on the file will be kept.
##### levels: `integer`, default: `2`
Number of directory levels that should be created for stored files.
##### chars_per_level: `integer`, default: `2`
Number of hash-characters per level.
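How `levels` and `chars_per_level` shape the directory tree can be sketched in plain Python; this is an illustration of the layout, not filerohr’s actual implementation:

```python
def hashed_path(digest: str, levels: int = 2,
                chars_per_level: int = 2, ext: str = ".flac") -> str:
    """Build the nested content-hash storage path (illustrative sketch)."""
    # One directory level per `chars_per_level`-character slice of the digest.
    parts = [
        digest[i * chars_per_level:(i + 1) * chars_per_level]
        for i in range(levels)
    ]
    return "/".join(parts + [digest + ext])

# With the defaults, a digest starting with "8da9…" lands under 8d/a9/.
print(hashed_path("8da9bce6"))  # → 8d/a9/8da9bce6.flac
```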
### store_by_creation_date
Stores the job file in a directory-tree based on the creation date.
The generated directory structure looks like this:
```
audio_dir/
  2025/
    11/
      20/
        filename2.flac
        filename3.flac
    10/
      16/
        filename1.flac
```
#### Configuration options:
##### skip: `boolean`, default: `False`
##### storage_dir: `string | null`, default: `null`
Base directory to store files in.
##### symlink: `boolean`, default: `False`
If set to true, a symlink is created to the file from the previous
job. In that case, the previous job must create the file in a
persistent storage location.
##### relative: `boolean`, default: `True`
If set to true, the symlink will be relative to the storage path.
##### keep: `boolean`, default: `False`
If set to true, the file is kept after the pipeline finishes.
##### emit: `boolean`, default: `False`
If set to true, the file is emitted as a pipeline result. Implies
keep.
##### month: `boolean`, default: `True`
Include month in storage subdirectories.
##### day: `boolean`, default: `True`
Include day in storage subdirectories.
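The date-based layout with the `month`/`day` toggles can be sketched as follows (an illustration, not filerohr’s actual code; the rule that a day level requires a month level is an assumption):

```python
from datetime import date

def dated_path(created: date, filename: str,
               month: bool = True, day: bool = True) -> str:
    """Build the creation-date storage path (illustrative sketch)."""
    parts = [f"{created.year:04d}"]
    if month:
        parts.append(f"{created.month:02d}")
        if day:  # assumption: a day level only makes sense under a month level
            parts.append(f"{created.day:02d}")
    return "/".join(parts + [filename])

print(dated_path(date(2025, 11, 20), "filename2.flac"))  # → 2025/11/20/filename2.flac
```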
## Custom tasks
You can define custom tasks by creating a Python module/file.
After that, set the `FILEROHR_TASK_MODULES` environment variable to
the name of your module. If you have created multiple modules, separate them with commas.
You can find examples for tasks in the `my-custom-tasks.py` file
or in filerohr’s own `filerohr/tasks/` directory.
When implementing a custom task, there are three important guidelines:
1. DO NOT block the loop.
All code must run asynchronously, also known as cooperative multitasking.
filerohr includes the [`aiofiles`](https://pypi.org/project/aiofiles/)
and [`pebble`](https://pypi.org/project/pebble/) libraries and some additional
helpers in `filerohr.utils`. Use these to do blocking IO or CPU-intensive work.
2. Call `job.next()` when you are done with the task.
That is, if you want the pipeline to move on.
If you don’t call `job.next()` the pipeline will stop after your job finished.
Sometimes a job might do that intentionally, but most of the time you don’t want that.
You may also call `job.next()` multiple times if you want to enqueue multiple
follow-up jobs. This can be useful when your job creates multiple new files
(like `filerohr.tasks.av.extract_audio` does).
3. Don’t modify the `job.file` directly.
Instead, call `job.next(file.clone(path=...))`.
| text/markdown | null | Konrad Mohrfeldt <km@roko.li> | null | null | null | null | [] | [] | null | null | >=3.13 | [] | [] | [] | [
"aiofiles>=25.1.0",
"blinker>=1.9.0",
"httpx>=0.28.1",
"pebble>=5.1.3",
"pydantic>=2.12.4",
"pyyaml>=6.0.3",
"ffmpeg-normalize>=1.37.2; extra == \"av\""
] | [] | [] | [] | [
"repository, https://codeberg.org/kmohrf/filerohr",
"documentation, https://python-poetry.org/docs/",
"Bug Tracker, https://codeberg.org/kmohrf/filerohr/issues"
] | uv/0.10.2 {"installer":{"name":"uv","version":"0.10.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Arch Linux","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null} | 2026-02-21T02:29:43.685124 | filerohr-0.6.0.tar.gz | 65,774 | 09/9c/b277edd8afec7bb7899a571cb474ae608bb664ea3f12c09d937788764e46/filerohr-0.6.0.tar.gz | source | sdist | null | false | 7bfb2c7211da383e85a06bb0224f6895 | 785d5dc69c298e409fba6ad3ff12066abc333f064297e5e3fa7576e3c8eb154b | 099cb277edd8afec7bb7899a571cb474ae608bb664ea3f12c09d937788764e46 | AGPL-3.0-or-later | [
"LICENSE"
] | 182 |
2.4 | scientific-writer | 2.12.0 | Deep research and writing tool - combines AI-driven deep research with well-formatted written outputs. Generates publication-ready scientific documents with verified citations. | # Claude Scientific Writer
[](https://pypi.org/project/scientific-writer/)
[](https://pepy.tech/project/scientific-writer)
[](https://opensource.org/licenses/MIT)
> 🚀 **Looking for more advanced capabilities?** For end-to-end scientific writing, deep scientific search, advanced image generation and enterprise solutions, visit **[www.k-dense.ai](https://www.k-dense.ai)**
**A deep research and writing tool** that combines the power of AI-driven deep research with well-formatted written outputs. Generate publication-ready scientific papers, reports, posters, grant proposals, literature reviews, and more academic documents—all backed by real-time literature search and verified citations.
Scientific Writer performs comprehensive research before writing, ensuring every claim is supported by real, verifiable sources. Features include real-time research lookup via Perplexity Sonar Pro Search, intelligent paper detection, comprehensive document conversion, and AI-powered diagram generation with Nano Banana Pro. You can use it as a Claude Code plugin, a Python package, or a native CLI.
## Quick Start
### Prerequisites
- Python 3.10-3.12
- ANTHROPIC_API_KEY (required), OPENROUTER_API_KEY (optional for research lookup)
### Installation Options
#### Option 1: Claude Code Plugin (Recommended) ⭐
The easiest way to use Scientific Writer is as a Claude Code plugin. See the [Plugin Installation](#-use-as-a-claude-code-plugin-recommended) section below.
#### Option 2: Install from PyPI (CLI/API Usage)
```bash
pip install scientific-writer
```
#### Option 3: Install from source with uv
```bash
git clone https://github.com/K-Dense-AI/claude-scientific-writer.git
cd claude-scientific-writer
uv sync
```
### Configure API keys
```bash
# .env file (recommended)
echo "ANTHROPIC_API_KEY=your_key" > .env
echo "OPENROUTER_API_KEY=your_openrouter_key" >> .env
# or export in your shell
export ANTHROPIC_API_KEY='your_key'
```
### Usage Options
#### Use as Plugin (Recommended)
After installing the plugin and running `/scientific-writer:init`, simply ask Claude:
```bash
> Create a Nature paper on CRISPR gene editing. Present experimental_data.csv
(efficiency across 5 cell lines), include Western_blot.png and flow_cytometry.png
showing 87% editing efficiency (p<0.001). Compare with literature benchmarks.
> Generate an NSF grant proposal presenting preliminary data from quantum_results.csv
(99.2% gate fidelity), circuit_topology.png, and error_rates.csv.
Include 5-year timeline with milestones_budget.xlsx.
> @research-lookup Find papers on mRNA vaccine efficacy (2022-2024). Compare
with our trial_outcomes.csv (n=500, 94% efficacy) and antibody_titers.png.
```
#### Use the CLI
```bash
# If installed via pip
scientific-writer
# If installed from source with uv
uv run scientific-writer
```
#### Use the Python API
```python
import asyncio
from scientific_writer import generate_paper
async def main():
    # Detailed prompt with specific data and figures
    async for update in generate_paper(
        query=(
            "Create a Nature paper on CRISPR gene editing. "
            "Present editing_efficiency.csv (5 cell lines, n=200 cells each). "
            "Include Western blot (protein_knockout.png) showing target depletion, "
            "flow cytometry data (editing_percentages.png) with 87% efficiency in HEK293, "
            "and off_target_analysis.csv showing <0.1% off-target effects. "
            "Compare results to published Cas9 benchmarks (typically 70-75% efficiency)."
        ),
        data_files=[
            "editing_efficiency.csv",
            "protein_knockout.png",
            "editing_percentages.png",
            "off_target_analysis.csv"
        ]
    ):
        if update["type"] == "progress":
            print(f"[{update['stage']}] {update['message']}")
        else:
            print(f"✓ PDF: {update['files']['pdf_final']}")
            print(f"  Figures: {len(update.get('figures', []))} included")

asyncio.run(main())
```
## 🎯 Use as a Claude Code Plugin (Recommended)
**Scientific Writer works best as a Claude Code (Cursor) plugin**, providing seamless access to all scientific writing capabilities directly in your IDE. No CLI required!
### Quick Start - Plugin Installation
1. **Add the plugin marketplace** in Claude Code:
```bash
/plugin marketplace add https://github.com/K-Dense-AI/claude-scientific-writer
```
2. **Install the plugin**:
```bash
/plugin install claude-scientific-writer
```
3. **Restart Claude Code** when prompted.
4. **Initialize in your project**:
```bash
/scientific-writer:init
```
This creates a `CLAUDE.md` file with comprehensive scientific writing instructions and makes all 19+ skills available.
5. **Start using immediately**:
```bash
# Create papers with data and figures
> Create a Nature paper on CRISPR gene editing. Present knockout_efficiency.csv
(5 cell lines tested), include Western blot (protein_levels.png) and flow
cytometry data (editing_rates.png). Highlight 87% efficiency in HEK293 cells.
> Write an NSF grant proposal for quantum computing. Present preliminary results
from gate_fidelity.csv (99.2% fidelity), include circuit_diagram.png and
error_analysis.png. Compare to state-of-art 95% baseline.
> Generate conference poster. Feature results from clinical_trial.csv
(n=150), survival_curves.png, biomarker_heatmap.png, and mechanism_diagram.svg.
# Use specific skills with research data
> @research-lookup Find papers on mRNA vaccine efficacy (2022-2024). Compare
with our trial_data.csv showing 94% efficacy and antibody_titers.xlsx.
> @peer-review Evaluate this manuscript. Reference sample size in methods.csv
(n=30) and effect_sizes.png. Assess if statistical power is adequate.
> @clinical-reports Create case report for autoimmune disorder. Include patient_labs.xlsx
(6 months data), MRI_scans/ folder, treatment_timeline.csv showing response.
```
### Why Use the Plugin?
- ✅ **No CLI Required** - Everything works directly in Claude Code
- ✅ **Instant Access** - All 19+ skills available immediately
- ✅ **IDE Integration** - Files created and edited in your project
- ✅ **Context Aware** - Skills understand your project structure
- ✅ **Seamless Workflow** - No switching between tools
### Available Skills
When installed as a plugin, you get instant access to:
- `scientific-schematics` - AI diagram generation with Nano Banana Pro (CONSORT, neural networks, pathways)
- `research-lookup` - Real-time literature search
- `peer-review` - Systematic manuscript evaluation
- `citation-management` - BibTeX and reference handling
- `clinical-reports` - Medical documentation standards
- `research-grants` - NSF, NIH, DOE proposal support
- `scientific-slides` - Research presentations
- `latex-posters` - Conference poster generation
- `hypothesis-generation` - Scientific hypothesis development
- `market-research-reports` - Comprehensive 50+ page market analysis reports with visuals
- And 10+ more specialized skills...
See the [Plugin Testing Guide](#plugin-testing-local-development) below for local development instructions.
## Features
### 📝 Document Generation
- **Scientific papers** with IMRaD structure (Nature, Science, NeurIPS, etc.)
- **Clinical reports** (case reports, diagnostic reports, trial reports, patient documentation)
- **Research posters** using LaTeX (beamerposter, tikzposter, baposter)
- **Grant proposals** (NSF, NIH, DOE, DARPA) with agency-specific formatting
- **Literature reviews** with systematic citation management
- **Scientific schematics** powered by Nano Banana Pro (CONSORT diagrams, neural architectures, biological pathways, circuit diagrams)
### 🤖 AI-Powered Capabilities
- **Real-time research lookup** using Perplexity Sonar Pro Search (via OpenRouter)
- **AI-powered diagram generation** with Nano Banana Pro - create any scientific diagram from natural language descriptions
- **Intelligent paper detection** - automatically identifies references to existing papers
- **Peer review feedback** with quantitative ScholarEval framework (8-dimension scoring)
- **Iterative editing** with context-aware revision suggestions
### 🔧 Developer-Friendly
- **Programmatic API** - Full async Python API with type hints
- **CLI interface** - Interactive command-line tool with progress tracking
- **Progress streaming** - Real-time updates during generation
- **Comprehensive results** - JSON output with metadata, file paths, citations
### 📦 Data & File Integration
- **Automatic data handling** - Drop files in `data/`, auto-sorted to `figures/` or `data/`
- **Document conversion** - PDF, DOCX, PPTX, XLSX to Markdown with MarkItDown
- **Bibliography management** - Automatic BibTeX generation and citation formatting
- **Figure integration** - Images automatically referenced and organized
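The auto-sorting described above can be sketched as a simple extension-based router. This is a minimal illustration of the idea, not the package's actual implementation — `sort_dropped_file` and the extension set are hypothetical names:

```python
from pathlib import Path
import shutil

# Extension → destination mapping mirroring the documented sorting rules
# (illustrative only; the real package may recognize more formats).
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".svg", ".tif", ".tiff"}

def sort_dropped_file(path: Path, project_root: Path) -> Path:
    """Route a file dropped in data/ to figures/ (images) or data/ (everything else)."""
    dest_dir = project_root / ("figures" if path.suffix.lower() in IMAGE_EXTS else "data")
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / path.name
    if dest == path:  # already in the right place, nothing to move
        return path
    return Path(shutil.move(str(path), str(dest)))
```

In practice you never call anything like this yourself — dropping files in `data/` is enough — but it shows why naming files with conventional extensions matters for the sorting step.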
## Typical Workflow
### CLI Usage
1. Place figures and data in `data/` at the project root (images → `figures/`, files → `data/` automatically)
2. Run `scientific-writer` and describe what you want
3. Follow progress updates; outputs saved to `writing_outputs/<timestamp>_<topic>/`
```bash
# Start a new paper with figures and data
> Create a Nature paper on CRISPR gene editing. Include experimental_results.csv showing knockout efficiency across 5 cell lines. Reference figure1.png (Western blot) and figure2.png (flow cytometry data) in the results section. Discuss the 87% efficiency improvement observed in HEK293 cells.
# Continue editing with additional research results
> Add a methods section describing the experimental setup used to generate the data in results_table.csv. Reference the protocols for transfection, selection, and validation shown in microscopy_images/ folder.
# Grant proposal with preliminary data
> Write an NSF proposal for quantum computing research. Present preliminary results from quantum_fidelity.csv showing 99.2% gate fidelity. Include circuit_diagram.png and error_rates.png figures. Emphasize the breakthrough results compared to current state-of-art (95% fidelity).
# Research poster with comprehensive figures
> Generate a conference poster from my paper. Feature dose_response_graph.png as the central figure. Include mechanism_schematic.png, compare_treatments.png, and statistical_analysis.png. Highlight the p<0.001 significance for the primary outcome shown in the results.
# Clinical case report with patient data
> Create a clinical case report for rare disease presentation. Reference patient_timeline.csv showing symptom progression over 6 months. Include diagnostic_images/ (CT scans, MRI). Discuss lab_values.xlsx showing elevated biomarkers and treatment response documented in follow_up_data.csv.
# Literature review with meta-analysis
> Create a literature review on machine learning in healthcare. Reference the comparison in studies_comparison.csv covering 50 papers. Include forest_plot.png showing pooled effect sizes and quality_assessment.png from bias analysis. Synthesize the findings showing diagnostic accuracy (AUC 0.89), treatment prediction (accuracy 82%), and risk stratification results.
```
### API Usage
```python
import asyncio
from scientific_writer import generate_paper
async def main():
    async for update in generate_paper(
        query="Create a NeurIPS paper on transformers",
        data_files=["results.csv", "figure.png"],
        output_dir="./my_papers",
        track_token_usage=True  # Optional: track token consumption
    ):
        if update["type"] == "progress":
            print(f"[{update['stage']}] {update['message']}")
        else:
            print(f"✓ PDF: {update['files']['pdf_final']}")
            # Token usage available when track_token_usage=True
            if "token_usage" in update:
                print(f"  Tokens used: {update['token_usage']['total_tokens']:,}")

asyncio.run(main())
```
## Quick Reference
### Common Commands
| Task | Command Example |
|------|----------------|
| **Scientific Paper** | `> Create a Nature paper on CRISPR gene editing. Present knockout efficiency data from results.csv (5 cell lines tested). Include Western blot (figure1.png) and flow cytometry (figure2.png) showing 87% efficiency in HEK293 cells. Compare with published benchmarks.` |
| **Clinical Report** | `> Create a clinical case report for rare mitochondrial disease. Include patient_timeline.csv (6-month progression), diagnostic_scans/ folder (MRI, CT images), and lab_values.xlsx showing elevated lactate (8.2 mmol/L) and creatine kinase (450 U/L). Describe treatment response.` |
| **Grant Proposal** | `> Write an NSF proposal for quantum error correction research. Present preliminary data from gate_fidelity.csv showing 99.2% fidelity (vs 95% state-of-art). Include circuit_topology.png, error_rates_comparison.png, and scalability_projections.csv for 100-qubit systems.` |
| **Research Poster** | `> Generate an A0 conference poster. Highlight findings from efficacy_study.csv (n=150 patients, 40% response rate). Feature mechanism_diagram.png, survival_curves.png, biomarker_heatmap.png, and statistical_forest_plot.png (p<0.001 primary endpoint).` |
| **Literature Review** | `> Create a systematic review on AI in drug discovery. Reference studies_database.csv (127 papers, 2020-2024). Include success_rates_meta.png (pooled OR=2.3, 95% CI 1.8-2.9), publication_trends.png, and therapeutic_areas_breakdown.csv showing oncology dominance (45% of studies).` |
| **Peer Review** | `> Evaluate this manuscript using ScholarEval. Reference figures (power_analysis.png shows n=30, underpowered), review statistics in results_table.csv, assess methodology against CONSORT standards, verify citations match claims.` |
| **Hypothesis Paper** | `> Generate research hypotheses on aging interventions. Reference transcriptomics_data.csv (15,000 genes across tissues), pathway_enrichment.png, and longevity_correlations.csv. Propose 5 testable hypotheses linking NAD+ metabolism, senescence, and lifespan extension.` |
| **Continue Editing** | `> Add methods section describing the protocols used to generate binding_assay.csv data. Include equipment specs, statistical tests used (t-tests in stats_summary.csv), and sample size justification from power_calculation.xlsx` |
| **Find Existing Paper** | `> Find the CRISPR paper and add discussion of limitations shown in off_target_analysis.csv and efficiency_variation.png across different cell types` |
### Research Lookup Examples
```bash
# Recent research with data integration (auto-triggers research lookup)
> Create a paper on recent advances in quantum computing (2024). Compare published values with our gate_fidelity_results.csv (99.2% for 2-qubit gates). Include our error_correction_benchmarks.png and cite papers achieving >98% fidelity. Discuss how our topology_diagram.png relates to Google's and IBM's recent architectures.
# Fact verification with experimental context
> What are the current success rates for CAR-T therapy in B-cell lymphoma? Compare with our clinical_trial_outcomes.csv (n=45 patients, 62% complete response). Include our response_timeline.png and cytokine_profiles.csv. How do our results compare to published JULIET and ZUMA trials?
# Literature search with data-driven focus
> Find 10 recent papers on transformer efficiency optimizations (2023-2024). Compare their reported FLOPS and memory usage with our benchmark_results.csv testing GPT-4, Claude, and Llama models. Include our latency_comparison.png and throughput_scaling.csv for context.
# Meta-analysis with new data
> Search for RCTs on metformin in aging (last 5 years). Compare published efficacy data with our mouse_longevity_study.csv (18% lifespan extension, n=120). Include our survival_curves.png, biomarker_changes.xlsx (AMPK, mTOR, NAD+ levels), and dose_response.png. How do our findings align with human trial outcomes?
# Comparative analysis
> Find papers on CRISPR base editors vs prime editors (2022-2024). Compare their reported efficiency and specificity with our editing_efficiency.csv (5 targets, 3 cell lines). Include our off_target_analysis.png and on_target_rates.csv. Discuss if our 89% on-target rate is competitive.
```
### Document Types
| Type | Example with Data/Figures |
|------|---------|
| **Papers** | `> Create a Nature paper on neural plasticity. Present electrophysiology_data.csv (n=30 neurons), include LTP_traces.png, calcium_imaging_timelapse/ folder, and synaptic_strength.csv showing 156% potentiation (p<0.001).` |
| **Clinical Reports** | `> Write a case report for autoimmune encephalitis. Include MRI_series/ (FLAIR, T2 sequences), CSR_results.xlsx (oligoclonal bands, elevated IgG), EEG_recordings.png, treatment_timeline.csv showing immunotherapy response over 8 weeks.` |
| **Grants** | `> NSF proposal for optogenetics. Present pilot_data/ with behavioral_results.csv (n=24 mice), neural_activation_maps.png, circuit_tracing.tif, and projection_analysis.csv showing 78% success in behavior modification. Include 5-year timeline with milestones.xlsx.` |
| **Posters** | `> A0 poster for ASCO conference. Feature trial_demographics.csv (n=200), primary_outcome_kaplan_meier.png, adverse_events_heatmap.png, biomarker_correlations.csv, mechanism_schematic.png. Highlight 8.5 month median PFS improvement.` |
| **Reviews** | `> Systematic review of immunotherapy combinations. Reference extracted_data.csv from 85 trials, include forest_plot_OS.png and forest_plot_PFS.png for meta-analysis, risk_of_bias_summary.png, network_meta_analysis.csv comparing 12 regimens.` |
| **Schematics** | `> Generate CONSORT diagram for RCT using Nano Banana Pro. Use enrollment_data.csv (n=450 screened, 312 randomized), show flowchart with allocation. Create transformer architecture diagram showing encoder-decoder. Generate biological pathway diagrams for MAPK signaling.` |
### File Handling
```bash
# 1. Drop all your research files in data/ folder
cp experimental_data.csv ~/Documents/claude-scientific-writer/data/
cp western_blot.png ~/Documents/claude-scientific-writer/data/
cp flow_cytometry.png ~/Documents/claude-scientific-writer/data/
cp statistical_summary.xlsx ~/Documents/claude-scientific-writer/data/
cp methods_diagram.svg ~/Documents/claude-scientific-writer/data/
# 2. Files are automatically sorted by type:
# Images (png, jpg, svg, tif, pdf figures) → figures/
# Data files (csv, json, txt, xlsx, tsv) → data/
# Documents (pdf, docx, pptx) → converted to markdown
# 3. Reference files explicitly in your prompt with specific details
> Create a NeurIPS paper on deep learning optimization. Include training_curves.csv showing convergence after 50 epochs across 5 model architectures. Reference accuracy_comparison.png (our method: 94.2% vs baseline: 89.1%), loss_landscapes.png visualizing optimization trajectories, and hyperparameter_grid.csv with 100 configurations tested. Include architecture_diagram.svg in methods. Discuss the 5.1% accuracy improvement and 30% faster convergence shown in benchmark_results.xlsx.
# 4. Reference folders for multiple related files
> Write a radiology case report. Include the CT_scans/ folder (20 slices showing tumor progression), lab_results/ with weekly bloodwork CSVs, and treatment_response.xlsx documenting lesion measurements. Reference dates in imaging_timeline.csv for timeline.
# 5. Combine data files for comprehensive presentation
> Generate grant proposal presenting preliminary data from: dose_response.csv (6 doses, 4 replicates), survival_analysis.csv (Kaplan-Meier data, n=80 mice), mechanism_pathway.png, gene_expression.csv (RNA-seq, 15,000 genes), and protein_validation.xlsx (Western blots quantified). Include budget from project_costs.xlsx.
```
### API Quick Start
```python
import asyncio
from scientific_writer import generate_paper
# Simple usage with detailed prompt
async for update in generate_paper(
    "Create a Nature paper on CRISPR base editing. Present editing efficiency from "
    "results.csv (5 cell lines, n=200 per line). Include Western blots (protein_expression.png), "
    "flow cytometry (editing_rates.png), and off-target analysis (specificity_heatmap.png). "
    "Highlight 89% on-target efficiency with <0.1% off-target effects."
):
    if update["type"] == "result":
        print(f"PDF: {update['files']['pdf_final']}")

# With multiple data files and specific instructions
async for update in generate_paper(
    query=(
        "Create an ICML paper on reinforcement learning for robotics. "
        "Present training_metrics.csv (1M timesteps, 5 environments). "
        "Include learning_curves.png comparing our method (reward: 450) vs baselines (320), "
        "success_rates.csv across 100 test episodes, policy_visualizations.png, "
        "and ablation_study.xlsx testing 8 hyperparameter configurations. "
        "Include robot_architecture.svg diagram and trajectory_examples.png in methods. "
        "Emphasize 40% improvement over SAC and 25% over TD3."
    ),
    data_files=[
        "training_metrics.csv",
        "learning_curves.png",
        "success_rates.csv",
        "policy_visualizations.png",
        "ablation_study.xlsx",
        "robot_architecture.svg",
        "trajectory_examples.png"
    ],
    output_dir="./papers"
):
    if update["type"] == "progress":
        print(f"[{update['stage']}] {update['message']}")
    elif update["type"] == "result":
        print("✓ Paper completed!")
        print(f"  PDF: {update['files']['pdf_final']}")
        print(f"  LaTeX: {update['files']['tex_final']}")
        print(f"  Figures: {len(update.get('figures', []))} included")

# Clinical trial report with comprehensive data
async for update in generate_paper(
    query=(
        "Generate Phase 2 clinical trial report for novel immunotherapy. "
        "Present patient_demographics.csv (n=120, stratified by age/stage), "
        "primary_endpoint_PFS.csv (median 12.3 months, HR=0.65, p=0.003), "
        "secondary_outcomes.xlsx (ORR 45%, DCR 78%), "
        "kaplan_meier_curves.png for OS and PFS, "
        "adverse_events.csv (Grade 3+: 23%), "
        "biomarker_analysis.csv (PD-L1, TMB correlations), "
        "and response_waterfall.png. Include CONSORT diagram based on enrollment_flow.csv."
    ),
    data_files=[
        "patient_demographics.csv",
        "primary_endpoint_PFS.csv",
        "secondary_outcomes.xlsx",
        "kaplan_meier_curves.png",
        "adverse_events.csv",
        "biomarker_analysis.csv",
        "response_waterfall.png",
        "enrollment_flow.csv"
    ]
):
    if update["type"] == "result":
        print(f"Trial report: {update['files']['pdf_final']}")
```
## Plugin Testing (Local Development)
For developers working on the plugin or testing locally:
### Setup Local Marketplace
1. **Create a test marketplace** in the parent directory:
```bash
cd ..
mkdir -p test-marketplace/.claude-plugin
```
2. **Create marketplace configuration** (`test-marketplace/.claude-plugin/marketplace.json`):
Copy the example from `test-marketplace-example.json` or create:
```json
{
  "name": "test-marketplace",
  "owner": { "name": "K-Dense" },
  "plugins": [
    {
      "name": "claude-scientific-writer",
      "source": "../claude-scientific-writer",
      "description": "Scientific writing skills and CLAUDE.md initializer"
    }
  ]
}
```
**Note**: Update the `source` path to match your local directory structure (relative to the test-marketplace directory).
### Install and Test
3. **Add the test marketplace** in Claude Code:
```bash
/plugin marketplace add ../test-marketplace
```
(Use the correct relative or absolute path to your test-marketplace directory)
4. **Install the plugin**:
```bash
/plugin install claude-scientific-writer@test-marketplace
```
5. **Restart Claude Code** when prompted.
6. **Test the plugin**:
- Open any project directory
- Run `/scientific-writer:init`
- Verify CLAUDE.md is created
- Test skills: "What skills are available?"
- Try creating a document: "Create a short scientific abstract on quantum computing"
### Verify Plugin Structure
Your plugin should have this structure:
```
claude-scientific-writer/
├── .claude-plugin/
│   └── plugin.json                   # Plugin metadata
├── commands/
│   └── scientific-writer-init.md     # /scientific-writer:init command
├── skills/                           # All 20 skills
│   ├── citation-management/
│   ├── clinical-decision-support/
│   ├── clinical-reports/
│   ├── document-skills/
│   ├── hypothesis-generation/
│   ├── latex-posters/
│   ├── literature-review/
│   ├── market-research-reports/
│   ├── markitdown/
│   ├── paper-2-web/
│   ├── peer-review/
│   ├── research-grants/
│   ├── research-lookup/
│   ├── scholar-evaluation/
│   ├── scientific-critical-thinking/
│   ├── scientific-schematics/
│   ├── scientific-slides/
│   ├── scientific-writing/
│   ├── treatment-plans/
│   └── venue-templates/
├── templates/
│   └── CLAUDE.scientific-writer.md   # CLAUDE.md template
└── ... (existing Python package files)
```
### Troubleshooting Plugin Installation
- **Skills not showing**: Verify each `SKILL.md` has valid YAML frontmatter (name, description, allowed-tools)
- **Command not working**: Check `commands/scientific-writer-init.md` exists and has proper frontmatter
- **Template not found**: Ensure `templates/CLAUDE.scientific-writer.md` is present
- **Marketplace not loading**: Verify `marketplace.json` syntax and relative path to plugin
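As a point of reference, a minimal `SKILL.md` frontmatter that satisfies the first check might look like the sketch below — the field values and tool names are illustrative assumptions, so treat the plugin documentation as the authoritative schema:

```yaml
---
name: my-skill
description: One-line summary of when this skill should be used
allowed-tools:
  - Read
  - Write
---
```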
## 📄 Example Outputs
Want to see what Scientific Writer can create? Check out real examples in the [`docs/examples/`](docs/examples/) directory!
| Document Type | Example | Description |
|--------------|---------|-------------|
| **Research Paper** | Coming soon | Full scientific papers with IMRaD structure |
| **Grant Proposal** | [NSF Proposal](docs/examples/grants/v6_draft.pdf) | Complete NSF grant with budget and timeline |
| **Research Poster** | [Conference Poster](docs/examples/poster/poster.pdf) | LaTeX-generated academic poster |
| **Presentation Slides** | [AI Scientist Talk](docs/examples/slides/ai_scientist_talk.pdf) | Professional research presentation |
| **Clinical Report** | [Treatment Plan](docs/examples/treatment_plan/GERD.pdf) | Patient treatment documentation |
| **Clinical Decision Support** | [Breast Cancer](docs/examples/clinical_decision_support/breast_cancer.pdf) | Evidence-based clinical recommendations |
| **Hypothesis Generation** | [AI Weather Prediction](docs/examples/hypotheses_generation/AI_in_weather.pdf) | Research hypothesis development |
| **Market Research** | [Agentic AI Report](docs/examples/market%20research%20reports/agentic_ai_life_sciences.pdf) | Industry analysis and market insights |
**🎯 Browse the examples** to see formatting, structure, and quality before starting your own projects!
## Documentation
### User Guides
- [📖 Complete Features Guide](docs/FEATURES.md) - Comprehensive overview of all capabilities
- [🔧 API Reference](docs/API.md) - Full programmatic API documentation
- [🎯 Skills Overview](docs/SKILLS.md) - All available skills and tools
- [🐛 Troubleshooting](docs/TROUBLESHOOTING.md) - Common issues and solutions
### Developer Resources
- [💻 Development Guide](docs/DEVELOPMENT.md) - Contributing and development setup
- [📦 Releasing Guide](docs/RELEASING.md) - Versioning and publishing
- [📋 Release Notes](CHANGELOG.md) - Version history and updates
- [🤖 System Instructions](CLAUDE.md) - Agent instructions (advanced)
## Versioning and Publishing (short)
Use `uv` and the helper scripts:
- Bump the version (keeps `pyproject.toml` and `__init__.py` in sync): `uv run scripts/bump_version.py [patch|minor|major]`
- Build and publish: `uv run scripts/publish.py` (or `--bump patch|minor|major`)
See [docs/RELEASING.md](docs/RELEASING.md) for prerequisites, dry runs, tagging, and verification.
## Migration (v1.x -> v2.0)
- The CLI remains unchanged (`scientific-writer`).
- New programmatic API: `from scientific_writer import generate_paper`.
- The legacy single-file script is replaced by a proper package; no action needed for CLI users.
## License
MIT - see LICENSE.
## Support
- Open an issue on GitHub
- See [docs/TROUBLESHOOTING.md](docs/TROUBLESHOOTING.md) for common problems
## 💬 Join Our Community!
**Want to connect with other researchers, share tips, and get help in real-time?** Join our vibrant Slack community! 🎉
Whether you're writing your first paper, exploring advanced features, or just want to chat about scientific writing and AI, we'd love to have you! Get faster support, share your success stories, and collaborate with fellow users.
👉 **[Join the K-Dense Community on Slack](https://join.slack.com/t/k-densecommunity/shared_invite/zt-3iajtyls1-EwmkwIZk0g_o74311Tkf5g)** 👈
We're excited to meet you! 🚀
## ⭐ Show Your Support
If you find this project helpful for your research or work, please consider giving it a star on GitHub! It helps others discover the tool and motivates continued development. Thank you! 🙏

## Star History
[](https://star-history.com/#K-Dense-AI/claude-scientific-writer&Date)
| text/markdown | null | null | null | null | MIT | null | [] | [] | null | null | >=3.10 | [] | [] | [] | [
"claude-agent-sdk>=0.1.0",
"pymupdf>=1.24.0",
"python-dotenv>=1.0.0",
"requests>=2.31.0"
] | [] | [] | [] | [] | uv/0.9.14 {"installer":{"name":"uv","version":"0.9.14","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true} | 2026-02-21T02:28:45.592555 | scientific_writer-2.12.0.tar.gz | 7,566,182 | 26/e8/79948984f6a946f9106b526ec50b097d176cc2d9a5f94d86925262aa91da/scientific_writer-2.12.0.tar.gz | source | sdist | null | false | a61af83b7a1475b8b2793023eb972db5 | 1f92e0cbf3ea885eb5b2efcdd59c25421a5c626a072f9983f20be5ce1103ec66 | 26e879948984f6a946f9106b526ec50b097d176cc2d9a5f94d86925262aa91da | null | [
"LICENSE"
] | 195 |
2.4 | scurrypy | 2.2.0 | Dataclass-driven Discord API Wrapper in Python | <div align='center'>
## ScurryPy
[](https://badge.fury.io/py/scurrypy)
[](https://discord.gg/D4SdHxcujM)
<img
src="assets/banner.png"
width="450"
alt="Fire-breathing squirrel"
/>
✨ **Clarity over magic**: build a bot that lasts ✨
</div>
## Features
* Lightweight core
* Rate limit handling
* Automatic session & gateway management
* Automatic sharding
* Predictable event models and resource classes
Focus on building what you want instead of fighting a framework.
## Installation
Install ScurryPy with pip:
```bash
pip install scurrypy
```
## Examples
The following examples are quick drop-in starters if you wish to try ScurryPy.
> [!TIP]
> It is recommended to use a `.env` file for bot tokens. More details about using a `.env` file [here](https://scurry-works.github.io/scurrypy/getting_started/start_here/).
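A minimal stdlib-only way to keep the token out of source code is to read it from the environment (the variable name `BOT_TOKEN` is just an example; the linked guide covers `.env` loading in detail):

```python
import os

# Read the bot token from the environment rather than hard-coding it.
# Your .env loader or shell is expected to have set BOT_TOKEN beforehand;
# the empty fallback makes missing config fail loudly at connect time.
TOKEN = os.environ.get("BOT_TOKEN", "")
```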
### Slash Command
```python
# Set TOKEN, APP_ID (bot user ID), and GUILD_ID (for guild command)
# --- Core library imports ---
from scurrypy import Client
from scurrypy.ext.commands import CommandsAddon, ApplicationCommandContext
# --- Setup bot ---
client = Client(token=TOKEN)
commands = CommandsAddon(client, APP_ID)
@commands.slash_command('greet', 'Greet the bot!', guild_ids=[GUILD_ID])
async def on_greet(ctx: ApplicationCommandContext):
    await ctx.respond("Hello!")
# --- Run the bot ---
client.run()
```
### Prefix Command (Legacy)
```python
# Set TOKEN and APP_ID (bot user ID)
# --- Core library imports ---
from scurrypy import Client, Intents
from scurrypy.ext.prefixes import PrefixAddon, PrefixCommandContext
client = Client(token=TOKEN, intents=Intents.DEFAULT | Intents.MESSAGE_CONTENT)
prefixes = PrefixAddon(client, APP_ID, '!')
# --- Setup bot ---
@prefixes.listen('ping')
async def on_ping(ctx: PrefixCommandContext):
    await ctx.send("Pong!")
# --- Run the bot ---
client.run()
```
## Dependencies
ScurryPy has exactly 3 required dependencies:
- aiohttp (HTTP client)
- websockets (Gateway connection)
- aiofiles (Async file operations)
These dependencies are automatically installed with ScurryPy's pip package.
## Learn More
Explore the full [documentation](https://scurry-works.github.io/scurrypy) for more examples, guides, and API reference.
Curious about the design philosophy? See the [manifesto](https://scurry-works.github.io/scurrypy/manifesto) for details!
**Got some questions?**
Check out the [FAQ](https://scurry-works.github.io/scurrypy/faq) page for commonly asked questions!
**Looking for changes?**
See the [Changelog](https://github.com/scurry-works/scurrypy/blob/main/CHANGELOG.md).
| text/markdown | Furmissile | null | null | null | null | null | [] | [] | null | null | >=3.11 | [] | [] | [] | [
"aiohttp>=3.8.0",
"websockets>=11.0.0",
"aiofiles>=23.0.0"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.13.3 | 2026-02-21T02:28:28.122990 | scurrypy-2.2.0.tar.gz | 68,851 | c6/71/af497ab64e95a946c3d88818236eccd94ec673d59ebf7369b79440ec323a/scurrypy-2.2.0.tar.gz | source | sdist | null | false | 22d694cf5c2223b38fffb4a286a01c9f | b883f7a9dc8fb4ad7e38f690ec19d9e45de5fc7923fb638c391d6587202b0830 | c671af497ab64e95a946c3d88818236eccd94ec673d59ebf7369b79440ec323a | null | [
"LICENSE"
] | 189 |
2.4 | seaduck | 1.0.5 | A python package that interpolates data from ocean dataset from both Eulerian and Lagrangian perspective. | # seaduck
A Python package that interpolates data from ocean datasets from both Eulerian and Lagrangian perspectives.
## Quick Start
```python
>>> import seaduck as sd
```
## Documentation
Seaduck documentation:
https://macekuailv.github.io/seaduck/
## Citation
Please cite our paper in the Journal of Open Source Software:
[](https://doi.org/10.21105/joss.05967)
Jiang et al., (2023). Seaduck: A python package for Eulerian and Lagrangian interpolation on ocean datasets. Journal of Open Source Software, 8(92), 5967, https://doi.org/10.21105/joss.05967
## Workflow for developers/contributors
For the best experience, create a new conda environment (e.g. `bubblebath`):
```
conda create -n bubblebath
conda activate bubblebath
```
Before pushing to GitHub, run the following commands:
1. Update conda environment: `make conda-env-update`
1. Install this package: `pip install -e .`
1. Run quality assurance checks: `make qa`
1. Run tests: `make unit-tests`
1. Run the static type checker: `make type-check`
1. Build the documentation: `make docs-build`
1. Build the PDF version of the paper: `make joss`
## License
```
Copyright 2023, Wenrui Jiang.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
| text/markdown | null | null | null | null | Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2023, Wenrui Jiang.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
| null | [
"Development Status :: 2 - Pre-Alpha",
"Intended Audience :: Science/Research",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Topic :: Scientific/Engineering"
] | [] | null | null | null | [] | [] | [] | [
"numpy",
"pandas",
"scipy",
"dask[array]",
"xarray"
] | [] | [] | [] | [] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:27:52.561439 | seaduck-1.0.5.tar.gz | 730,749 | 07/3c/fa881bdf1d5ccf923463819fdfb4f73c7256533fa0e60361590ab4767b5e/seaduck-1.0.5.tar.gz | source | sdist | null | false | fe69e32a5ee23f10535aa5eca344392a | c65d16ebac0e37448c770dc18fc82700a7e3f67833babf407ecec3ac79d992c2 | 073cfa881bdf1d5ccf923463819fdfb4f73c7256533fa0e60361590ab4767b5e | null | [
"LICENSE"
] | 190 |
2.4 | routeme | 0.1.2 | Intelligent LLM Router - Route requests across multiple LLM providers with automatic failover, quota management, and caching | # Intelligent LLM Router
<p align="center">
<a href="https://pypi.org/project/llm_router/">
<img src="https://img.shields.io/pypi/v/llm_router.svg" alt="PyPI version">
</a>
<a href="https://pypi.org/project/llm_router/">
<img src="https://img.shields.io/pypi/pyversions/llm_router.svg" alt="Python versions">
</a>
<a href="https://github.com/anomalyco/llm_router/blob/main/LICENSE">
<img src="https://img.shields.io/pypi/l/llm_router.svg" alt="License">
</a>
</p>
Intelligent LLM Router is a Python library that routes requests across multiple LLM providers with automatic failover, quota management, and response caching.
## Features
- **Multi-Provider Support**: OpenAI, Anthropic, Google Gemini, Groq, Mistral, Cohere, DeepSeek, Together, HuggingFace, OpenRouter, xAI, DashScope, and Ollama
- **Automatic Failover**: Automatically switches to the next provider if one fails
- **Quota Management**: Tracks RPM/RPD limits per provider and routes around rate limits
- **Response Caching**: Exact and semantic caching to reduce costs and latency
- **Multiple Routing Strategies**: Auto, Cost-Optimized, Quality-First, Latency-First, Round-Robin
- **Vision & Embeddings**: Full support for vision models and embeddings
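The automatic-failover behaviour can be sketched as a simple priority loop. This is an illustrative sketch only, not routeme's actual implementation; the `(name, callable)` provider shape is an assumption:

```python
# Minimal failover sketch: try each provider in priority order and
# fall through to the next one on any error. Illustrative only.

def route_with_failover(providers, request):
    """providers: list of (name, callable) pairs tried in order."""
    errors = {}
    for name, call in providers:
        try:
            return name, call(request)
        except Exception as exc:  # a real router matches specific error types
            errors[name] = exc
    raise RuntimeError(f"all providers failed: {list(errors)}")


if __name__ == "__main__":
    def flaky(_req):
        raise TimeoutError("rate limited")

    def healthy(req):
        return f"echo: {req}"

    provider, answer = route_with_failover(
        [("groq", flaky), ("gemini", healthy)], "hello"
    )
    print(provider, answer)  # gemini echo: hello
```

In the real library the equivalent loop also consults quota and latency state before picking the next candidate.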
## Installation
```bash
pip install llm_router
```
Or with specific extras:
```bash
pip install llm_router[server] # Includes FastAPI server
```
## Quick Start
```python
import asyncio
from llm_router import IntelligentRouter, RoutingOptions, RoutingStrategy
async def main():
# Initialize the router
router = IntelligentRouter()
await router.start()
# Define your request
request_data = {
"messages": [
{"role": "user", "content": "What is the capital of France?"}
],
"temperature": 0.7,
}
# Configure routing options (optional)
options = RoutingOptions(
strategy=RoutingStrategy.AUTO,
)
# Route the request
response = await router.route(request_data, options)
print(response["choices"][0]["message"]["content"])
print(f"Provider: {response['routing_metadata']['provider']}")
await router.stop()
asyncio.run(main())
```
## Configuration
### Environment Variables
All configuration can be done via environment variables with the `ROUTER_` prefix:
| Variable | Default | Description |
|----------|---------|-------------|
| `ROUTER_LLM_TIMEOUT` | 60 | Timeout for LLM calls in seconds |
| `ROUTER_MAX_RETRIES` | 3 | Maximum retry attempts |
| `ROUTER_ENABLE_OLLAMA_FALLBACK` | true | Enable Ollama fallback when cloud providers fail |
| `ROUTER_CACHE_DIR` | /tmp/llm_router_cache | Directory for response cache |
| `ROUTER_RESPONSE_CACHE_TTL` | 3600 | Cache TTL in seconds |
| `ROUTER_DEFAULT_STRATEGY` | auto | Default routing strategy |
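The exact-response cache governed by `ROUTER_CACHE_DIR` and `ROUTER_RESPONSE_CACHE_TTL` can be illustrated with a minimal in-memory sketch. The library's real cache is persistent and also supports semantic matching; the class below is a stdlib-only toy, not its implementation:

```python
import hashlib
import json
import time

# Exact-match response cache sketch: key = hash of the canonicalised
# request; entries expire after a TTL (cf. ROUTER_RESPONSE_CACHE_TTL).

class ExactCache:
    def __init__(self, ttl=3600.0):
        self.ttl = ttl
        self._store = {}  # key -> (expires_at, response)

    @staticmethod
    def key(request: dict) -> str:
        # sort_keys makes the key insensitive to dict ordering
        canonical = json.dumps(request, sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()

    def get(self, request):
        k = self.key(request)
        entry = self._store.get(k)
        if entry and entry[0] > time.monotonic():
            return entry[1]
        self._store.pop(k, None)  # drop missing or expired entry
        return None

    def put(self, request, response):
        self._store[self.key(request)] = (time.monotonic() + self.ttl, response)
```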
### Provider API Keys
Set API keys as environment variables:
```bash
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
export GROQ_API_KEY=gsk_...
export GEMINI_API_KEY=AIza...
# etc.
```
## Routing Strategies
| Strategy | Description |
|----------|-------------|
| `auto` | Balanced: quota + latency + quality (default) |
| `cost_optimized` | Maximize remaining free quota |
| `quality_first` | Prioritize highest quality models |
| `latency_first` | Prioritize fastest responding models |
| `round_robin` | Uniform spread across providers |
## Request Options
```python
from llm_router import RoutingOptions, CachePolicy
options = RoutingOptions(
strategy=RoutingStrategy.COST_OPTIMIZED,
free_tier_only=True, # Only use free tier providers
preferred_providers=["groq", "gemini"], # Prefer these providers
excluded_providers=["openai"], # Skip these providers
cache_policy=CachePolicy.ENABLED, # Enable response caching
)
```
## API Reference
### IntelligentRouter
```python
router = IntelligentRouter()
await router.start() # Initialize router
response = await router.route(request_data, options) # Route request
stats = router.get_stats() # Get router statistics
await router.stop() # Cleanup
```
### Models
- `RoutingOptions`: Configuration for routing behavior
- `RoutingStrategy`: Enum for routing strategies
- `TaskType`: Enum for task types (chat, embeddings, vision, etc.)
- `CachePolicy`: Enum for caching behavior
- `settings`: Global settings object
## Running the Server
```bash
# Install with server extras
pip install llm_router[server]
# Run the server
llm-router
# Or with custom settings
ROUTER_PORT=8000 llm-router
```
The server, built on FastAPI, exposes an OpenAI-compatible API at `/v1/chat/completions`, `/v1/embeddings`, and `/health`.
## Development
```bash
# Clone the repository
git clone https://github.com/anomalyco/llm_router.git
# Install in development mode
pip install -e ".[dev]"
# Run tests
pytest
# Run linting
ruff check src/
```
## License
MIT License - see [LICENSE](LICENSE) for details.
| text/markdown | null | remixonwin <remixonwin@gmail.com> | null | null | null | llm, router, load-balancing, failover, litellm, openai, anthropic, gemini, mistral, groq, embedding, caching, intelligent-routing, ai, llm-proxy | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Software Development :: Libraries :: Application Frameworks",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Internet :: Proxy Servers"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"aiofiles>=23.2.0",
"anyio>=4.3.0",
"cachetools>=5.3.0",
"diskcache>=5.6.0",
"fastapi>=0.111.0",
"httpx>=0.27.0",
"limits>=3.12.0",
"litellm>=1.35.0",
"prometheus-client>=0.20.0",
"pydantic>=2.7.0",
"pydantic-settings>=2.2.0",
"python-dotenv>=1.0.0",
"tenacity>=8.3.0",
"uvicorn[standard]>=0.29.0",
"xxhash>=3.4.0",
"ruff>=0.4.0; extra == \"dev\"",
"mypy>=1.10.0; extra == \"dev\"",
"pytest>=8.0.0; extra == \"dev\"",
"pytest-asyncio>=0.23.0; extra == \"dev\"",
"pytest-cov>=4.1.0; extra == \"dev\"",
"httpx>=0.27.0; extra == \"dev\"",
"fastapi>=0.111.0; extra == \"server\"",
"uvicorn[standard]>=0.29.0; extra == \"server\"",
"sphinx>=7.0.0; extra == \"docs\"",
"sphinx-rtd-theme>=2.0.0; extra == \"docs\"",
"myst-parser>=2.0.0; extra == \"docs\""
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.12.3 | 2026-02-21T02:27:37.356784 | routeme-0.1.2.tar.gz | 47,232 | 51/18/9db2d0ef0f8140dbc7697c14ac804316de8e29cba2cf9a1e22e7fa1b0c2d/routeme-0.1.2.tar.gz | source | sdist | null | false | 3b9d0ef425e2e717504f1aaa7d9e109b | 891d0ca372b4beb934fe2e3626bcaa76c404d7319675d40c28b00bb5e64661ef | 51189db2d0ef0f8140dbc7697c14ac804316de8e29cba2cf9a1e22e7fa1b0c2d | null | [] | 175 |
2.4 | libtpu | 0.0.36 | Google Cloud TPU runtime library. | # What is libtpu?
`libtpu` is the core library that enables machine learning frameworks like
`JAX`, `PyTorch`, and `TensorFlow` to execute models on Google Cloud TPUs. It
provides core functionality for compilation, inter-chip communication (ICI), and
runtime execution.
`libtpu` also includes a set of SDK primitives for direct TPU interaction and
deployment.
Learn more about Cloud TPUs at
[Google Cloud TPUs](https://cloud.google.com/tpu).
********************************************************************************
## Version 0.0.21.1
### Compatibility
* JAX Compatibility: `libtpu` supports `JAX` 0.7.1 or newer.
* Python Compatibility: The `libtpu` SDK is now compatible with Python
  versions 3.11, 3.12, 3.13, 3.13-ft, 3.14, and 3.14-ft, where `ft`
  denotes the free-threaded Python variant.
| text/markdown | Google, Inc. | null | null | null | Google Cloud Platform Terms of Service | null | [] | [] | null | null | >=3.11 | [] | [] | [] | [] | [] | [] | [] | [] | twine/6.2.0 CPython/3.14.3 | 2026-02-21T02:25:28.111825 | libtpu-0.0.36-cp314-cp314-manylinux_2_31_x86_64.whl | 208,651,928 | ed/f0/0673bc11a997edcc4e53fe2d24e9a100f7f4938fd955a9837524ea7b6282/libtpu-0.0.36-cp314-cp314-manylinux_2_31_x86_64.whl | cp314 | bdist_wheel | null | false | f5d64239bad655454c427c7df4c7275c | e0062407121ecf0e96eaff3695ea889a2fcadad1fd08727d4d96b85af6677548 | edf00673bc11a997edcc4e53fe2d24e9a100f7f4938fd955a9837524ea7b6282 | null | [] | 629 |
2.4 | thalamus-neuro | 0.3.41 | Data Acquisition and Behavioral Experiment Platform | # Thalamus
Thalamus is an open-source Python program designed for real-time, synchronized, closed-loop multimodal data capture, specifically tailored to meet the stringent demands of neurosurgical environments.
# Overview
Thalamus facilitates the advancement of clinical applications of Brain-Computer Interface (BCI) technology by integrating behavioral and electrophysiological data streams. Thalamus prioritizes the following design requirements:
1. Requires minimal setup in operating room, clinical, and research environments, and can be easily controlled and quickly modified by the experimenter
2. Operates with high reliability and few crashes
3. Fail-safe architecture that guarantees minimal data loss in the setting of a crash
4. Allows for real-time computation to support visualizations of research and clinical data streams
5. Closed-loop control based on research and/or clinical data streams
6. Acquires synchronous data from the available research and clinical sensors including relevant behavioral, physiologic, and neural sensors that could easily be scaled over time
7. Supports a high-bandwidth, low latency, parallel distributed architecture for modular acquisition and computation that could easily be upgraded as technology continues to advance
8. Open-source with source code available to support research use
9. Embodies best practice in software engineering using unit tests and validation checks
10. Supports advances in translational applications and, hence, also operates in research domains
# System Requirements
## Hardware Requirements
Thalamus requires only a standard computer with enough RAM to support the in-memory operations.
External hardware devices for data acquisition depend on the goals of individual projects.
## Software Requirements
Thalamus requires Python.
### OS Requirements
We provide auto builds for Linux (glibc 2.35) and Windows (10).
### Python Dependencies
**requirements.txt** includes the required dependencies if installing from GitHub. However, all dependencies have been packaged into the auto builds.
# Installation Guide
## Install from Build
Download appropriate (Windows or Linux) build directly from actions tab or under Releases.
For Windows:
```python -m pip install thalamus-0.3.0-py3-none-win_amd64.whl```
For Linux:
```python -m pip install thalamus-0.3.0-py3-none-manylinux_2_27.whl```
You should now be able to run any of the Thalamus tools
```python -m thalamus.pipeline # Data pipeline, no task controller```
```python -m thalamus.task_controller # Data pipeline and task controller```
```python -m thalamus.hydrate # Convert capture files to sharable formats```
Approximately 1 hour set-up time
# Documentation
The code respository for Thalamus is hosted on GitHub at https://github.com/cajigaslab/thalamus. For detailed documentation of Thalamus visit https://cajigaslab.github.io/Thalamus/.
For additional examples and generation of figures in our paper, refer to the **SimpleUseCase** folder in the repo.
# License
If you use Thalamus in your work, please remember to cite the repository in any publications.
# Issues
Like all open-source projects, Thalamus will benefit from your involvement, suggestions, and contributions. This platform is intended as a repository for extensions to the program based on your code contributions, as well as for flagging and tracking open issues. Please use the **Issues** tab as you see fit.
| text/markdown | null | null | Jarl Haggerty | Jarl.Haggerty@Pennmedicine.penn.edu | null | null | [] | [] | null | null | null | [] | [] | [] | [
"grpcio-tools==1.78.1",
"grpcio-reflection==1.78.1",
"pyyaml",
"numpy>=1.24.0; python_version >= \"3.8\"",
"numpy>=1.19.5; python_version < \"3.8\"",
"PyOpenGL>=3.1.6",
"PyQt5; python_version < \"3.7\"",
"PyQt6; python_version >= \"3.7\"",
"matplotlib",
"dataclasses",
"jinja2",
"h5py>=3.1.0",
"scipy",
"typing_extensions>=4.4.0; python_version >= \"3.8\"",
"typing_extensions; python_version < \"3.8\"",
"opencv-contrib-python",
"numpy-stl",
"scikit-learn",
"Pillow",
"lark",
"jsonpath_ng",
"nidaqmx",
"wheel",
"toml",
"build",
"meson",
"ninja",
"oslex",
"psutil"
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.12.3 | 2026-02-21T02:25:14.600112 | thalamus_neuro-0.3.41-py3-none-win_amd64.whl | 37,755,014 | 42/e5/20cf1b97d6b494754d177d27b57973e53c27818212965ebe69a198ff4ab4/thalamus_neuro-0.3.41-py3-none-win_amd64.whl | py3 | bdist_wheel | null | false | a691730f1aab05f74e51586c5569003a | 3236098a549c56b2ce13336627e66183b32684663ed5ecc211d08121a43418c7 | 42e520cf1b97d6b494754d177d27b57973e53c27818212965ebe69a198ff4ab4 | GPL-3.0-only | [
"LICENSE"
] | 184 |
2.4 | synth-nmr | 0.6.0 | NMR spectroscopy calculations for protein structures | # synth-nmr
[](https://github.com/elkins/synth-nmr/actions/workflows/test.yml)
[](https://elkins.github.io/synth-nmr/)
<img src="https://raw.githubusercontent.com/elkins/synth-nmr/master/images/NOE_Avenue.jpg" alt="NOE Avenue" width="50%">
**NMR spectroscopy calculations for protein structures**
A lightweight, standalone Python package for calculating NMR observables from protein structures. Originally extracted from the [synth-pdb](https://github.com/elkins/synth-pdb) package to provide a focused toolkit that works with any protein structure source.
**[Read the full documentation here!](https://elkins.github.io/synth-nmr/)**
## Features
- **NOE Calculations**: Synthetic NOE distance restraints
- **Relaxation Rates**: R1, R2, and heteronuclear NOE predictions
- **Chemical Shifts**: High-accuracy Neural Network predictions with SPARTA+ empirical fallback
- **J-Couplings**: Karplus equation for scalar couplings
- **RDC Calculations**: Prediction of residual dipolar couplings
- **NEF I/O**: Read and write NMR Exchange Format files
- **Secondary Structure**: Automatic classification for enhanced predictions
## Installation
```bash
pip install synth-nmr
```
For improved performance with JIT compilation:
```bash
pip install synth-nmr[performance]
```
## Command-Line Interface
`synth-nmr` provides a command-line interface for common tasks, allowing you to perform calculations directly from your terminal.
### Usage
You can run `synth-nmr` CLI commands directly or enter an interactive mode.
#### Non-Interactive Mode
Execute commands by passing them as arguments to the `synth_nmr.synth_nmr_cli` module:
```bash
python -m synth_nmr.synth_nmr_cli <command> [arguments]
```
**Examples:**
1. **Read a PDB file and calculate RDCs:**
```bash
python -m synth_nmr.synth_nmr_cli read pdb protein.pdb calculate rdc 10.0 0.5
```
2. **Read a PDB file and predict chemical shifts:**
```bash
python -m synth_nmr.synth_nmr_cli read pdb protein.pdb predict shifts
```
3. **Read a PDB file and calculate J-couplings:**
```bash
python -m synth_nmr.synth_nmr_cli read pdb protein.pdb calculate j-coupling
```
#### Interactive Mode
To enter interactive mode, run the CLI without any arguments:
```bash
python -m synth_nmr.synth_nmr_cli
```
Once in interactive mode, you will see a `SynthNMR>` prompt. Type `help` to see available commands:
```
SynthNMR> help
Commands:
read pdb <filename>
calculate rdc [Da] [R]
predict shifts
calculate j-coupling
exit
SynthNMR> read pdb protein.pdb
SynthNMR> calculate rdc 10.0 0.5
SynthNMR> exit
```
### Available Commands
- `read pdb <filename>`: Loads a protein structure from the specified PDB file. This command must be executed before any calculation commands.
- `calculate rdc [Da] [R]`: Calculates Residual Dipolar Couplings.
- `Da`: (Optional) Axial component of the alignment tensor in Hz (default: 10.0).
- `R`: (Optional) Rhombicity of the alignment tensor (dimensionless) (default: 0.5).
- `predict shifts`: Predicts chemical shifts using SPARTA+ with ring current corrections.
- `calculate j-coupling`: Calculates ³J(HN-Hα) couplings using the Karplus equation.
- `help`: (Interactive mode only) Displays a list of available commands.
- `exit`: (Interactive mode only) Exits the CLI.
## Quick Start
```python
import biotite.structure.io as strucio
from synth_nmr import (
calculate_synthetic_noes,
calculate_relaxation_rates,
predict_chemical_shifts,
calculate_hn_ha_coupling,
calculate_rdcs
)
# Load a protein structure
structure = strucio.load_structure("protein.pdb")
# Calculate NOEs
noes = calculate_synthetic_noes(structure, cutoff=5.0)
# Predict relaxation rates
relaxation = calculate_relaxation_rates(
structure,
field_strength=600.0, # MHz
temperature=298.0, # K
correlation_time=5.0 # ns
)
# Predict chemical shifts
shifts = predict_chemical_shifts(structure)
# Calculate J-couplings
j_couplings = calculate_hn_ha_coupling(structure)
# Predict RDCs
rdcs = calculate_rdcs(
structure,
Da=10.0, # Axial component of alignment tensor (Hz)
R=0.5 # Rhombic component of alignment tensor
)
```
## Requirements
- Python ≥ 3.8
- NumPy ≥ 1.20
- Biotite ≥ 0.35.0
- Numba ≥ 0.55.0 (optional, for performance)
## Documentation
### Core Functions
#### `calculate_synthetic_noes(structure, cutoff=5.0)`
Calculate synthetic NOE distance restraints.
**Parameters:**
- `structure`: biotite AtomArray
- `cutoff`: Distance cutoff in Ångströms (default: 5.0)
**Returns:** Dictionary of NOE restraints
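The distance-cutoff idea behind synthetic NOEs can be shown with toy coordinates. This is a sketch only, not biotite's `AtomArray` API; the proton labels and geometry below are made up:

```python
import math

# Sketch of the NOE cutoff: any pair of protons closer than `cutoff`
# angstroms yields a distance restraint.

def synthetic_noes(protons, cutoff=5.0):
    """protons: list of (label, (x, y, z)); returns {(label_i, label_j): distance}."""
    noes = {}
    for i in range(len(protons)):
        for j in range(i + 1, len(protons)):
            (la, a), (lb, b) = protons[i], protons[j]
            d = math.dist(a, b)
            if d <= cutoff:
                noes[(la, lb)] = round(d, 2)
    return noes
```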
#### `calculate_relaxation_rates(structure, field_strength, temperature, correlation_time)`
Predict NMR relaxation rates (R1, R2, heteronuclear NOE).
**Parameters:**
- `structure`: biotite AtomArray
- `field_strength`: Spectrometer frequency in MHz
- `temperature`: Temperature in Kelvin
- `correlation_time`: Molecular correlation time in nanoseconds
**Returns:** Dictionary of relaxation rates per residue
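These predictions rest on Lipari–Szabo-style spectral densities. As a simplified sketch (rigid isotropic rotor, no order parameter or internal-motion terms, so not the package's full model):

```python
import math

# Rigid-rotor spectral density J(w) = (2/5) * tau_c / (1 + (w * tau_c)^2),
# the simplest ingredient of Lipari-Szabo relaxation predictions.

def spectral_density(omega_rad_s, tau_c_s):
    return 0.4 * tau_c_s / (1.0 + (omega_rad_s * tau_c_s) ** 2)


# At a 600 MHz spectrometer, the 1H Larmor frequency in rad/s:
omega_h = 2 * math.pi * 600e6
tau_c = 5e-9  # 5 ns correlation time
print(spectral_density(0.0, tau_c) > spectral_density(omega_h, tau_c))  # True
```

The monotonic fall-off of J(ω) with frequency is what couples the field strength, temperature, and correlation-time inputs to the predicted R1/R2 rates.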
#### `predict_chemical_shifts(structure)`
Predict chemical shifts using SPARTA+ with ring current corrections.
**Parameters:**
- `structure`: biotite AtomArray
**Returns:** Dictionary of chemical shifts by residue and atom type
#### `calculate_hn_ha_coupling(structure)`
Calculate ³J(HN-Hα) couplings using the Karplus equation.
**Parameters:**
- `structure`: biotite AtomArray
**Returns:** Dictionary of J-coupling values per residue
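The Karplus relation used here has the form ³J(φ) = A·cos²(φ − 60°) + B·cos(φ − 60°) + C. The coefficients below are common literature values (Vuister–Bax style) and may differ from the package's exact constants:

```python
import math

# Karplus relation for 3J(HN-Ha): J(phi) = A*cos^2(t) + B*cos(t) + C,
# with t = phi - 60 degrees. Coefficients are common literature values.

A, B, C = 6.51, -1.76, 1.60

def hn_ha_coupling(phi_deg):
    t = math.radians(phi_deg - 60.0)
    return A * math.cos(t) ** 2 + B * math.cos(t) + C

# Alpha-helical phi (~ -60 deg) gives a small coupling,
# beta-sheet phi (~ -120 deg) a large one:
print(round(hn_ha_coupling(-60.0), 2), round(hn_ha_coupling(-120.0), 2))
```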
#### `calculate_rdcs(structure, Da, R)`
Predict residual dipolar couplings (RDCs) for backbone N-H vectors.
**Parameters:**
- `structure`: biotite AtomArray
- `Da`: Axial component of the alignment tensor in Hz
- `R`: Rhombicity of the alignment tensor (dimensionless)
**Returns:** Dictionary of RDC values per residue
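The underlying equation is the standard D(θ, φ) = Da·[(3cos²θ − 1) + (3/2)·R·sin²θ·cos 2φ]. The sketch below takes explicit polar angles, whereas the package derives them from the structure's N-H vectors:

```python
import math

# Standard RDC equation for an N-H vector at polar angles (theta, phi)
# in the alignment frame:
#   D = Da * ((3*cos(theta)^2 - 1) + 1.5 * R * sin(theta)^2 * cos(2*phi))

def rdc(theta_deg, phi_deg, Da=10.0, R=0.5):
    th = math.radians(theta_deg)
    ph = math.radians(phi_deg)
    return Da * ((3 * math.cos(th) ** 2 - 1)
                 + 1.5 * R * math.sin(th) ** 2 * math.cos(2 * ph))

# A vector along the alignment axis (theta = 0) gives the maximum, 2*Da:
print(rdc(0.0, 0.0, Da=10.0, R=0.5))  # 20.0
```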
## Use Cases
- **Structure Validation**: Compare predicted vs experimental NMR data
- **MD Analysis**: Calculate NMR observables from molecular dynamics trajectories
- **Protein Design**: Predict NMR properties of designed structures
- **Data Integration**: Generate synthetic NMR data for machine learning
## Compatibility
Works with protein structures from any source:
- PDB files
- AlphaFold predictions
- Molecular dynamics simulations
- De novo structure generation (e.g., synth-pdb)
## Citation
If you use synth-nmr in your research, please cite:
```bibtex
@software{synth_nmr,
author = {Elkins, George},
title = {synth-nmr: NMR spectroscopy calculations for protein structures},
year = {2026},
url = {https://github.com/elkins/synth-nmr}
}
```
## License
MIT License - see LICENSE file for details
## Related Projects
- [synth-pdb](https://github.com/elkins/synth-pdb) - Synthetic protein structure generation
- [Biotite](https://www.biotite-python.org/) - Computational biology toolkit
## References
This package relies on the following peer-reviewed research:
- **SPARTA+**: For chemical shift predictions.
> Yang, Y., & Bax, A. (2011). *Journal of Biomolecular NMR*, 51(3), 259–274.
- **Karplus Equation**: For J-coupling calculations.
> Karplus, M. (1959). *The Journal of Chemical Physics*, 30(1), 11–15.
- **NMR Relaxation**: The underlying theory for relaxation rate predictions.
> Lipari, G., & Szabo, A. (1982). *Journal of the American Chemical Society*, 104(17), 4546–4559.
- **Residual Dipolar Couplings**: Seminal work on applying RDCs to proteins.
> Bax, A., & Tjandra, N. (1997). *Journal of the American Chemical Society*, 119(49), 12041-12042.
- **Nuclear Overhauser Effect**: Foundational experimental observation.
> Solomon, I. (1955). *Physical Review*, 99(2), 559.
- **2D NOESY**: Development of two-dimensional NOE spectroscopy for biomolecules.
> Kumar, A., Ernst, R. R., & Wüthrich, K. (1980). *Biochemical and Biophysical Research Communications*, 95(1), 1-6.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## Support
For issues and questions, please use the [GitHub issue tracker](https://github.com/elkins/synth-nmr/issues).
| text/markdown | George Elkins | null | null | null | MIT | nmr, spectroscopy, protein, structural biology, computational chemistry | [
"Development Status :: 4 - Beta",
"Intended Audience :: Science/Research",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Scientific/Engineering :: Bio-Informatics",
"Topic :: Scientific/Engineering :: Chemistry"
] | [] | null | null | >=3.8 | [] | [] | [] | [
"numpy<2.0,>=1.20",
"biotite>=0.35.0",
"numba>=0.55.0; extra == \"performance\"",
"torch>=2.0.0; extra == \"ml\"",
"scikit-learn>=1.0.0; extra == \"ml\"",
"pytest>=7.0; extra == \"dev\"",
"pytest-mock>=3.10.0; extra == \"dev\"",
"pytest-cov>=4.0; extra == \"dev\"",
"black>=23.0; extra == \"dev\"",
"ruff>=0.1.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/elkins/synth-nmr",
"Repository, https://github.com/elkins/synth-nmr",
"Documentation, https://github.com/elkins/synth-nmr#readme"
] | twine/6.2.0 CPython/3.12.10 | 2026-02-21T02:24:00.232854 | synth_nmr-0.6.0.tar.gz | 89,159 | a6/3d/679375a31b7ba4c914bf7e58ed5e77b68742bdf235ef74e056e269513edb/synth_nmr-0.6.0.tar.gz | source | sdist | null | false | c92f53a7c4b1b2b219d348f251659906 | f4e19f623c8cf8d2b6b10143f401587708b10bb15af491ee406acf2fbc6611ff | a63d679375a31b7ba4c914bf7e58ed5e77b68742bdf235ef74e056e269513edb | null | [
"LICENSE"
] | 226 |
2.4 | temporal-mcp-server | 0.1.0 | MCP server for Temporal workflow orchestration | # Temporal MCP Server
## Overview
This is a Model Context Protocol (MCP) server that provides tools for interacting with Temporal workflow orchestration. It enables AI assistants and other MCP clients to manage Temporal workflows, schedules, and workflow executions through a standardized interface. The server supports both local and remote Temporal instances.
## Tools
### Workflow Execution
- **`start_workflow`** - Start a new Temporal workflow execution with specified parameters, workflow ID, and task queue
- **`get_workflow_result`** - Retrieve the result of a completed workflow execution
- **`describe_workflow`** - Get detailed information about a workflow execution including status, timing, and metadata
- **`list_workflows`** - List workflow executions based on a query filter with pagination support (limit/skip)
- **`get_workflow_history`** - Retrieve the complete event history of a workflow execution
### Workflow Control
- **`query_workflow`** - Query a running workflow for its current state without affecting execution
- **`signal_workflow`** - Send a signal to a running workflow to change its behavior or provide data
- **`cancel_workflow`** - Request cancellation of a running workflow execution
- **`terminate_workflow`** - Forcefully terminate a workflow execution with a reason
- **`continue_as_new`** - Signal a workflow to continue as new (restart with new inputs while preserving history link)
### Batch Operations
- **`batch_signal`** - Send a signal to multiple workflows matching a query (configurable batch size)
- **`batch_cancel`** - Cancel multiple workflows matching a query (configurable batch size)
- **`batch_terminate`** - Terminate multiple workflows matching a query with a specified reason (configurable batch size)
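The configurable batch size in the `batch_*` tools follows a standard chunking pattern: fetch the workflows matching a query, then apply the action in fixed-size chunks so a large result set never triggers one giant burst of RPCs. This is a generic sketch, not the server's code:

```python
# Generic batching pattern behind the batch_* tools. `action` stands in
# for the per-workflow signal/cancel/terminate call.

def run_batches(items, action, batch_size=50):
    """Apply `action` to every item in fixed-size chunks; returns chunk count."""
    chunks = 0
    for start in range(0, len(items), batch_size):
        for item in items[start:start + batch_size]:
            action(item)
        chunks += 1
    return chunks
```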
### Schedule Management
- **`create_schedule`** - Create a new schedule for periodic workflow execution using cron expressions
- **`list_schedules`** - List all schedules with pagination support (limit/skip)
- **`pause_schedule`** - Pause a schedule to temporarily stop workflow executions
- **`unpause_schedule`** - Resume a paused schedule
- **`delete_schedule`** - Permanently delete a schedule
- **`trigger_schedule`** - Manually trigger a scheduled workflow immediately
## Temporal Documentation
For more information about Temporal, refer to the official Temporal documentation:
- **Temporal Documentation**: https://docs.temporal.io/
- **Workflows**: https://docs.temporal.io/workflows
- **Activities**: https://docs.temporal.io/activities
- **Python SDK**: https://docs.temporal.io/dev-guide/python
## MCP Client Config Examples
### OpenCode
Add to your OpenCode settings (`~/.config/opencode/opencode.json`):
```json
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"temporal": {
"type": "local",
"command": [
"docker", "run", "-i", "--rm", "--network", "host",
"-e", "TEMPORAL_HOST=192.168.69.98:7233",
"-e", "TEMPORAL_NAMESPACE=default",
"-e", "TEMPORAL_TLS_ENABLED=false",
"temporal-mcp-server:latest"
],
"enabled": true
}
}
}
```
### VS Code (Copilot MCP)
Add to your VS Code settings (`.vscode/mcp.json` or global settings):
```json
{
"mcp.servers": {
"temporal": {
"command": "docker",
"args": [
"run", "-i", "--rm", "--network", "host",
"-e", "TEMPORAL_HOST=192.168.69.98:7233",
"-e", "TEMPORAL_NAMESPACE=default",
"-e", "TEMPORAL_TLS_ENABLED=false",
"temporal-mcp-server:latest"
]
}
}
}
```
### Cursor
Add to your Cursor MCP settings (`~/.cursor/mcp.json` on macOS/Linux or `%APPDATA%\Cursor\mcp.json` on Windows):
```json
{
"mcpServers": {
"temporal": {
"command": "docker",
"args": [
"run", "-i", "--rm", "--network", "host",
"-e", "TEMPORAL_HOST=192.168.69.98:7233",
"-e", "TEMPORAL_NAMESPACE=default",
"-e", "TEMPORAL_TLS_ENABLED=false",
"temporal-mcp-server:latest"
]
}
}
}
```
### Claude Desktop
Add to your Claude Desktop configuration (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):
```json
{
"mcpServers": {
"temporal": {
"command": "docker",
"args": [
"run", "-i", "--rm", "--network", "host",
"-e", "TEMPORAL_HOST=192.168.69.98:7233",
"-e", "TEMPORAL_NAMESPACE=default",
"-e", "TEMPORAL_TLS_ENABLED=false",
"temporal-mcp-server:latest"
]
}
}
}
```
### Configuration Notes
- **TEMPORAL_HOST**: Set to your Temporal server address (default: `localhost:7233`)
- **TEMPORAL_NAMESPACE**: Set to your Temporal namespace (default: `default`)
- **TEMPORAL_TLS_ENABLED**: Set to `true` for remote servers, `false` for local, or omit for auto-detection
- Replace `192.168.69.98:7233` with your actual Temporal server address
- For local development, you can use `localhost:7233` or `host.docker.internal:7233` (when running in Docker)
## Development
### Running Tests
Install development dependencies:
```bash
pip install -r requirements-dev.txt
```
Run the test suite:
```bash
pytest test.py -v
```
### Building the Docker Image
```bash
# tag matches the image name used in the MCP config examples above
docker build -t temporal-mcp-server:latest .
```
| text/markdown | null | Mike <mike@miketoscano.com> | null | null | Apache-2.0 | temporal, mcp, model-context-protocol, workflow, orchestration | [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Software Development :: Libraries"
] | [] | null | null | >=3.10 | [] | [] | [] | [
"temporalio>=1.5.0",
"mcp>=0.9.0",
"asyncio-atexit>=1.0.1"
] | [] | [] | [] | [
"Homepage, https://github.com/GethosTheWalrus/temporal-mcp",
"Repository, https://github.com/GethosTheWalrus/temporal-mcp",
"Issues, https://github.com/GethosTheWalrus/temporal-mcp/issues"
] | twine/6.2.0 CPython/3.14.2 | 2026-02-21T02:23:44.845873 | temporal_mcp_server-0.1.0.tar.gz | 18,622 | 82/e5/cc87285d9f7816cee432745ceb8d9e497a4a87a46aa7ae0cd05b240fbf73/temporal_mcp_server-0.1.0.tar.gz | source | sdist | null | false | 3062beb27ba8e7bc83ab3e0479f27253 | a85ab8fe0c89d248a7904e322074e409c19b7e62fbf03cc7263a1f53b361387c | 82e5cc87285d9f7816cee432745ceb8d9e497a4a87a46aa7ae0cd05b240fbf73 | null | [
"LICENSE"
] | 209 |
2.4 | labenv-embedding-cache | 0.1.3 | Shared embedding cache core for cross-project reuse | # labenv_embedding_cache
Shared policy and Python cache-core library for projects that reuse text embedding caches.
## Files
- `embedding_rulebook.yaml`: machine-readable cache policy (path/layout/consistency)
- `embedding_registry.yaml`: machine-readable model registry shared across projects
- `embedding_cache_spec.yaml`: machine-readable spec for dataset key / metadata / variant tag
- `src/labenv_embedding_cache/`: reusable Python library
- `CODEX_PROMPT.md`: reusable prompt template for Codex sessions
## Install library
The preferred distribution is the wheel (`v0.1.3`).
```bash
# Internal index (recommended)
pip install "labenv-embedding-cache==0.1.3"
# Optional: explicitly point pip to internal index
# PIP_EXTRA_INDEX_URL="https://<internal-index>/simple" pip install "labenv-embedding-cache==0.1.3"
# GitHub Release wheel fallback
pip install "labenv-embedding-cache @ https://github.com/ryuuua/labenv_embedding_cache/releases/download/v0.1.3/labenv_embedding_cache-0.1.3-py3-none-any.whl"
# Local editable (maintainer workflow)
pip install -e /path/to/labenv_embedding_cache
```
Rollback-only legacy install (VCS pin):
```bash
pip install "git+ssh://git@github.com/ryuuua/labenv_embedding_cache.git@c0154c06ee6e41852c58ac76d6504f5b38d20168#egg=labenv-embedding-cache"
```
## Standalone run (smoke tests)
Install with optional embedding/debug extras:
```bash
pip install -e ".[embed,debug]"
```
Embedding generation only (no cache):
```bash
python tools/embedding_smoketest.py
```
Embedding + cache read/write:
```bash
python tools/embedding_cache_smoketest.py --backend dummy
# or (downloads model)
python tools/embedding_cache_smoketest.py --backend sentence-transformers --model sentence-transformers/all-MiniLM-L6-v2
```
Debugpy (wait for attach):
```bash
python tools/embedding_cache_smoketest.py --backend dummy --debugpy --wait-for-client
```
Embedding generation from `conf/embedding` presets (auto DDP/pipeline policy via `embedding_model.md`):
```bash
python tools/generate_embeddings_from_conf.py --preset conf/embedding/qwen3_embedding.yaml --strategy auto
```
Default is non-normalized embeddings. Use `--normalize` to generate L2-normalized caches.
`normalize_embeddings=true` is treated as a separate model variant (`registry_key=...__l2`) and cache variant (`norm=l2`).
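As a minimal sketch of the variant naming described above (the helper and the tag used for the non-normalized default are assumptions for illustration, not the library's API):

```python
def variant_keys(model_name: str, normalize: bool) -> tuple[str, str]:
    """Illustrative derivation of registry key and cache variant tag."""
    registry_key = f"{model_name}__l2" if normalize else model_name
    norm_tag = "l2" if normalize else "none"  # default-case tag is an assumption
    return registry_key, norm_tag
```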
Best practices: `docs/EMBEDDING_BEST_PRACTICES.md`
## Policy path setup
No environment variable is required for normal use. The package resolves its
bundled `embedding_rulebook.yaml` automatically.
If you want to pin `EMBEDDING_RULEBOOK_PATH` explicitly in your shell profile:
```bash
export EMBEDDING_RULEBOOK_PATH="$(labenv-embedding-cache-path rulebook)"
```
Then reload shell:
```bash
source ~/.zshrc
```
## Legacy cache compatibility index (temporary)
Build a read-only index for existing `lm/**` cache files:
```bash
labenv-embedding-cache index-build --cache-dir /work/$USER/data/embedding_cache
```
Show index stats:
```bash
labenv-embedding-cache index-stats --cache-dir /work/$USER/data/embedding_cache
```
Verify whether request manifests can be served without regeneration:
```bash
labenv-embedding-cache verify-requests --requests /path/to/request_manifest.jsonl --min-selected-models 2
```
Build request manifest rows from canonical metadata in Python:
```python
import labenv_embedding_cache as lec

# `metadata` below is the canonical metadata mapping for this dataset/model
# pair, loaded elsewhere by the caller.
record = lec.build_request_manifest_entry(
dataset_name="ag_news",
model_id="bert",
model_name="bert-base-uncased",
expected_cache_path="/work/$USER/data/embedding_cache/lm/bert-base-uncased/ag_news__x.npz",
metadata=metadata,
)
```
Export a lock payload for CI/DVC (`policy digest + index + optional verify report`):
```bash
labenv-embedding-cache lock-export \
--cache-dir /work/$USER/data/embedding_cache \
--requests /path/to/request_manifest.jsonl \
--output /work/$USER/data/embedding_cache/.labenv/lock.json \
--min-selected-models 2
```
Index file location:
- `${EMBEDDING_CACHE_DIR}/.labenv/index_v1.jsonl`
Compatibility is controlled by rulebook:
- `compatibility.legacy_index.enabled`
- `compatibility.legacy_index.sunset_date`
- `compatibility.legacy_index.require_ids_sha256_match`
Identity expansion is controlled by spec:
- `identity.profiles.default.hard_fields`
- `identity.profiles.default.soft_fields`
- `identity.profiles.default.defaults`
- `identity.profiles.default.legacy_match_policy`
## Quick usage
```python
from labenv_embedding_cache.api import get_or_compute_embeddings
vectors, resolution = get_or_compute_embeddings(
cfg,
texts,
ids,
labels=labels,
compute_embeddings=my_backend_compute_fn,
)
print(resolution.cache_path, resolution.was_cache_hit)
```
## Docker / Docker Compose (env1-env4)
Compose files match the `labenv_config/envkit-templates` profiles (env1_a6000/env2_3090/env3_cc21_a100/env4_cc21_cpu):
- env1: `nvcr.io/nvidia/pytorch:25.02-py3`
- env2: `nvcr.io/nvidia/pytorch:25.09-py3`
- env3: `nvcr.io/nvidia/pytorch:23.10-py3`
- env4: `python:3.11-slim`
Examples:
```bash
docker compose -f docker/compose.env1.yaml run --rm app python tools/embedding_cache_smoketest.py --backend dummy
docker compose -f docker/compose.env4.yaml run --rm app python tools/embedding_smoketest.py --backend transformers
# debugpy (port publish requires --service-ports)
docker compose -f docker/compose.env1.yaml run --rm --service-ports app python tools/embedding_cache_smoketest.py --backend dummy --debugpy --wait-for-client
```
## Sweep utility
Create a stable, line-based sweep list file:
```bash
python tools/make_sweep_list.py --glob "configs/sweep/*.yaml" --out sweep.txt --root .
```
## Per-project Codex usage
In each project, paste `CODEX_PROMPT.md` (or add equivalent guidance in `AGENTS.md`) so Codex always aligns with this policy.
For cross-project rollout requests, use `PROJECT_MIGRATION_PROMPT.md`.
## Automation runbook
- Runtime-aware automation prompt: `docs/AUTOMATION_PROMPT_RUNTIME_AWARE.md`
- Copy/paste execution checklist: `docs/AUTOMATION_EXECUTION_CHECKLIST.md`
- Pre-filled ready-to-run checklist: `docs/AUTOMATION_EXECUTION_CHECKLIST_READY.md`
- What to place in central/downstream/runtime repos: `docs/REPO_AUTOMATION_ASSETS.md`
## Updating shared standards
1. Edit `embedding_rulebook.yaml`, `embedding_registry.yaml`, and/or `embedding_cache_spec.yaml`.
2. In each project, run its embedding-cache validation/tests.
3. If schema behavior changes, bump `identity.version` in `embedding_rulebook.yaml`.
| text/markdown | labenv | null | null | null | Proprietary | null | [] | [] | null | null | >=3.10 | [] | [] | [] | [
"numpy>=1.24",
"omegaconf>=2.3",
"torch>=2.1; extra == \"embed\"",
"accelerate>=0.33; extra == \"embed\"",
"transformers>=4.40; extra == \"embed\"",
"sentence-transformers>=2.6; extra == \"embed\"",
"debugpy>=1.8; extra == \"debug\""
] | [] | [] | [] | [] | twine/6.2.0 CPython/3.12.12 | 2026-02-21T02:23:43.933459 | labenv_embedding_cache-0.1.3.tar.gz | 33,172 | 2e/b1/bcb6d0280630fedf79bba9923a0f5220eebe84b894d5db7e158a18a069cd/labenv_embedding_cache-0.1.3.tar.gz | source | sdist | null | false | fa0948a3807c723a11af31a0c213bd40 | 1eece82ffffb1fe98ba8c710f4f2df41ca81da3966a494e98a778a2ef8893cc0 | 2eb1bcb6d0280630fedf79bba9923a0f5220eebe84b894d5db7e158a18a069cd | null | [] | 258 |
2.4 | obsidian-rag-mcp | 0.2.0 | Semantic search for Obsidian vaults via MCP | # Obsidian RAG MCP Server
**Your notes, searchable by meaning.**
An MCP server that gives Claude Code semantic search over your Obsidian vault. Ask questions in natural language, get answers from your own documents.
[](https://github.com/danielscholl/obsidian-rag-mcp/actions/workflows/ci.yml)
[](https://www.python.org/downloads/)
[](https://opensource.org/licenses/MIT)
[](https://modelcontextprotocol.io)
---
## The Problem
You have 500 notes in Obsidian. You know you wrote something about database timeouts causing customer issues... somewhere. Ctrl+F won't help when you don't remember the exact words.
## The Solution
This server indexes your vault with vector embeddings. Ask for "RCAs where database timeouts caused customer-facing issues" and get results even if your notes use terms like "CosmosDB latency", "connection pool exhaustion", or "query timeout."
Semantic search. Vectors stored locally. Sub-second queries.
---
## Quick Start
```bash
# Clone and install (using uv - recommended)
git clone https://github.com/danielscholl/obsidian-rag-mcp.git
cd obsidian-rag-mcp
uv sync
# Set your API key
export OPENAI_API_KEY="sk-..."
# Index the sample vault (or your own)
uv run obsidian-rag index --vault ./vault
# Search it
uv run obsidian-rag search "database connection issues" --vault ./vault
```
<details>
<summary>Output</summary>
```
Query: database connection issues
Found 3 results (searched 1154 chunks)
--- Result 1 (score: 0.480) ---
Source: RCAs/2025-05-17-database-connection-pool-exhaustion.md
Tags: database, p1, rca
# 2025-05-17 - Database Connection Pool Exhaustion in payment-gateway...
--- Result 2 (score: 0.466) ---
Source: RCAs/2025-06-18-database-connection-pool-exhaustion.md
Tags: database, p2, rca
# 2025-06-18 - Database Connection Pool Exhaustion in auth-service...
```
</details>
---
## Connect to Claude Code
Add to `~/.claude/claude_desktop_config.json`:
```json
{
"mcpServers": {
"obsidian-rag": {
"command": "uv",
"args": ["run", "--directory", "/path/to/obsidian-rag-mcp", "obsidian-rag", "serve"],
"env": {
"OBSIDIAN_VAULT_PATH": "/path/to/your/vault",
"OPENAI_API_KEY": "sk-..."
}
}
}
}
```
Then ask Claude things like:
- "Search my vault for notes about Kubernetes deployments"
- "Find RCAs related to authentication failures"
- "What did I write about the Q3 migration?"
---
## MCP Tools
| Tool | What it does |
|------|--------------|
| `search_vault` | Semantic search across all content |
| `search_by_tag` | Filter by Obsidian tags |
| `get_note` | Retrieve full note content |
| `get_related` | Find notes similar to a given note |
| `list_recent` | Recently modified notes |
| `index_status` | Index statistics |
| `search_with_reasoning` | Search with extracted conclusions |
| `get_conclusion_trace` | Trace reasoning for a conclusion |
| `explore_connected_conclusions` | Find related conclusions |
---
## How It Works
1. **Index**: Scans your vault, chunks markdown intelligently (respecting headers, code blocks), generates embeddings via OpenAI
2. **Store**: Vectors go into ChromaDB (local, no external database needed)
3. **Query**: Your question gets embedded, matched against stored vectors, ranked results returned
4. **Reasoning** (optional): Extract conclusions from notes at index time for richer search results
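The index/store/query flow can be sketched with a toy in-memory version. A deterministic dummy embedding stands in for the OpenAI API and a plain list stands in for ChromaDB, so the ranking here is not semantically meaningful — the sketch only shows the shape of the pipeline:

```python
import hashlib
import math

def embed(text: str, dim: int = 8) -> list[float]:
    """Deterministic dummy embedding standing in for the OpenAI API."""
    digest = hashlib.sha256(text.encode()).digest()
    vec = [b / 255.0 for b in digest[:dim]]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

# 1-2. Index and store: (chunk, vector) pairs in place of ChromaDB.
store = [(chunk, embed(chunk)) for chunk in (
    "Database connection pool exhaustion in payment-gateway",
    "Q3 migration retrospective notes",
)]

# 3. Query: embed the question, rank stored chunks by cosine similarity.
query_vec = embed("database connection issues")
ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
```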
**Documentation:**
- **[Getting Started](docs/GETTING_STARTED.md)** - 5-minute tutorial with Claude Desktop setup
- [Architecture](docs/ARCHITECTURE.md) - System design
- [Development](docs/DEVELOPMENT.md) - Local setup (includes Windows)
- [Integration](docs/INTEGRATION.md) - Use with other agents/pipelines
---
## Configuration
| Variable | Required | Description |
|----------|----------|-------------|
| `OPENAI_API_KEY` | Yes* | For embeddings (and reasoning if enabled) |
| `AZURE_OPENAI_ENDPOINT` | No* | Azure OpenAI endpoint URL |
| `AZURE_API_KEY` | No* | Azure OpenAI API key |
| `AZURE_OPENAI_VERSION` | No | Azure API version (default: `2024-10-21`) |
| `AZURE_EMBEDDING_DEPLOYMENT` | No | Azure deployment name (default: `text-embedding-3-small`) |
| `OBSIDIAN_VAULT_PATH` | No | Default vault path |
| `REASONING_ENABLED` | No | Enable conclusion extraction (default: false) |
\* Either `OPENAI_API_KEY` **or** `AZURE_OPENAI_ENDPOINT` + `AZURE_API_KEY` is required. When both Azure variables are set, Azure OpenAI is used automatically.
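The selection rule can be sketched as follows (the helper is hypothetical; the variable names match the table above):

```python
def pick_provider(env: dict) -> str:
    """Sketch of the provider-selection rule described above."""
    if env.get("AZURE_OPENAI_ENDPOINT") and env.get("AZURE_API_KEY"):
        return "azure"  # both Azure variables set: Azure wins automatically
    if env.get("OPENAI_API_KEY"):
        return "openai"
    raise RuntimeError(
        "Set OPENAI_API_KEY, or AZURE_OPENAI_ENDPOINT + AZURE_API_KEY"
    )
```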
**Cost**: ~$0.02 to index 100 notes. Queries are essentially free.
---
## Development
```bash
uv sync
# Tests
uv run pytest
# Lint + format
uv run black obsidian_rag_mcp/ tests/
uv run ruff check obsidian_rag_mcp/ tests/
```
See [docs/DEVELOPMENT.md](docs/DEVELOPMENT.md) for the full guide.
---
## Requirements
- Python 3.11+
- OpenAI API key
- An Obsidian vault (or use `vault/` for testing)
---
## License
MIT
---
<div align="center">
[Getting Started](docs/GETTING_STARTED.md) | [Architecture](docs/ARCHITECTURE.md) | [Development](docs/DEVELOPMENT.md) | [Integration](docs/INTEGRATION.md)
</div>
| text/markdown | Daniel Scholl | null | Daniel Scholl | null | null | ai, mcp, obsidian, rag, semantic-search | [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13"
] | [] | null | null | <3.14,>=3.11 | [] | [] | [] | [
"chromadb<2.0.0,>=1.0.0",
"click>=8.3.0",
"mcp>=1.20.0",
"openai<3.0.0,>=1.0.0",
"python-dotenv>=1.2.0",
"python-frontmatter>=1.1.0",
"tenacity>=9.0.0",
"tiktoken>=0.10.0",
"black>=26.0.0; extra == \"dev\"",
"mypy>=1.15.0; extra == \"dev\"",
"pre-commit>=4.0.0; extra == \"dev\"",
"pytest-asyncio>=1.0.0; extra == \"dev\"",
"pytest-cov>=6.0.0; extra == \"dev\"",
"pytest>=8.0.0; extra == \"dev\"",
"ruff>=0.10.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/danielscholl/obsidian-rag-mcp",
"Repository, https://github.com/danielscholl/obsidian-rag-mcp",
"Issues, https://github.com/danielscholl/obsidian-rag-mcp/issues",
"Changelog, https://github.com/danielscholl/obsidian-rag-mcp/blob/main/CHANGELOG.md"
] | twine/6.1.0 CPython/3.13.7 | 2026-02-21T02:21:44.620159 | obsidian_rag_mcp-0.2.0.tar.gz | 264,033 | 42/5e/5680a021093e93cfc580a1dfaf05b90f81fdc3be9505c6ef99149ccd62fc/obsidian_rag_mcp-0.2.0.tar.gz | source | sdist | null | false | c12aa110d8753f718de1439d94a82461 | a3cbbac7f02da7a1c55b5fdc63b55c3596b4dd60a138db8473ee27a00a3cfd03 | 425e5680a021093e93cfc580a1dfaf05b90f81fdc3be9505c6ef99149ccd62fc | MIT | [
"LICENSE"
] | 205 |
2.4 | specfact-cli | 0.35.0 | The swiss knife CLI for agile DevOps teams. Keep backlog, specs, tests, and code in sync with validation and contract enforcement for new projects and long-lived codebases. | # SpecFact CLI
> **The "swiss knife" CLI that turns any codebase into a clear, safe, and shippable workflow.**
> Keep backlog, specs, tests, and code in sync so AI-assisted changes do not break production.
> Works for brand-new projects and long-lived codebases - even if you are new to coding.
**No API keys required. Works offline. Zero vendor lock-in.**
[](https://pypi.org/project/specfact-cli/)
[](https://pypi.org/project/specfact-cli/)
[](LICENSE)
[](https://github.com/nold-ai/specfact-cli)
<div align="center">
**[specfact.com](https://specfact.com)** • **[specfact.io](https://specfact.io)** • **[specfact.dev](https://specfact.dev)** • **[Documentation](https://docs.specfact.io/)** • **[Support](mailto:hello@noldai.com)**
</div>
---
## Start Here (60 seconds)
### Install
```bash
# Zero-install (recommended)
uvx specfact-cli@latest
# Or install globally
pip install -U specfact-cli
```
### Bootstrap and IDE Setup
```bash
# Bootstrap module registry and local config (~/.specfact)
specfact init
# Configure IDE prompts/templates (interactive selector by default)
specfact init ide
specfact init ide --ide cursor
specfact init ide --ide vscode
```
### Run Your First Flow
```bash
# Analyze an existing codebase
specfact import from-code my-project --repo .
# Validate external code without modifying source
specfact validate sidecar init my-project /path/to/repo
specfact validate sidecar run my-project /path/to/repo
```
**AI IDE quick start**
```bash
# In your IDE chat (Cursor, VS Code, Copilot, etc.)
/specfact.01-import my-project --repo .
```
**Next steps**
- **[Getting Started](docs/getting-started/README.md)**
- **[AI IDE Workflow](docs/guides/ai-ide-workflow.md)**
- **[Command Chains](docs/guides/command-chains.md)**
---
## Proof and Expectations
- **Typical runtime**: 2-15 minutes depending on repo size and complexity.
- **Checkpointing**: Progress is saved during analysis so you can resume safely.
- **Performance**: Optimized for large codebases with cached parsing and file hashes.
---
## Why It Matters (Plain Language)
- **Clarity**: Turn messy code into clear specs your team can trust.
- **Safety**: Catch risky changes early with validation and contract checks.
- **Sync**: Keep backlog items, specs, tests, and code aligned end to end.
- **Adoption**: Simple CLI, no platform lock-in, works offline.
---
## Who It Is For
- **Vibe coders and new builders** who want to ship fast with guardrails and confidence.
- **Legacy professionals** who want AI speed without lowering standards.
- **DevOps and engineering leaders** who need evidence and repeatable workflows.
---
## How It Works (High Level)
1. **Analyze**: Read code and extract specs, gaps, and risks.
2. **Sync**: Connect specs, backlog items, and plans in one workflow.
3. **Validate**: Enforce contracts and block regressions before production.
---
## The Missing Link (Coder + DevOps Bridge)
Most tools help **either** coders **or** agile teams. SpecFact does both:
- **Backlog sync that is actually strong**: round-trip sync + refinement with GitHub, Azure DevOps, Jira, Linear.
- **Ceremony support teams can run**: standup, refinement, sprint planning, flow metrics (Scrum/Kanban/SAFe).
- **Policy + validation**: DoR/DoD/flow checks plus contract enforcement for production-grade stability.
Recommended command entrypoints:
- `specfact backlog ceremony standup ...`
- `specfact backlog ceremony refinement ...`
- `specfact policy validate ...`
- `specfact policy suggest ...`
What the Policy Engine does in practice:
- Turns team agreements (DoR, DoD, flow checks) into executable checks against your real backlog data.
- Shows exactly what is missing per item (for example missing acceptance criteria or definition of done).
- Generates patch-ready suggestions so teams can fix policy gaps quickly without guessing.
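As an illustration only, a Definition-of-Ready check in the spirit described above might look like this (the field names and helper are hypothetical, not SpecFact's API):

```python
def check_dor(item: dict) -> list[str]:
    """Hypothetical Definition-of-Ready check: report what is missing per item."""
    missing = []
    if not item.get("acceptance_criteria"):
        missing.append("acceptance criteria")
    if not item.get("estimate"):
        missing.append("estimate")
    return missing
```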
Start with:
- `specfact policy init --template scrum`
- `specfact policy validate --group-by-item`
- `specfact policy suggest --group-by-item --limit 5`
**Try it now**
- **Coders**: [AI IDE Workflow](docs/guides/ai-ide-workflow.md)
- **Agile teams**: [Agile/Scrum Workflows](docs/guides/agile-scrum-workflows.md)
---
## Modules and Capabilities
**Core modules**
- **Analyze**: Extract specs and plans from existing code.
- **Validate**: Enforce contracts, run reproducible checks, and block regressions.
- **Report**: CI/CD summaries and evidence outputs.
**Agile DevOps modules**
- **Backlog**: Refinement, dependency analysis, sprint summaries, risk rollups.
- **Ceremony**: Standup, refinement, and planning entry points.
- **Policy**: DoR, DoD, flow, PI readiness checks.
- **Patch**: Preview, apply, and write changes safely.
**Adapters and bridges**
- **Specs**: Spec-Kit and OpenSpec
- **Backlogs**: GitHub Issues, Azure DevOps, Jira, Linear
- **Contracts**: Specmatic, OpenAPI
### Module Lifecycle Baseline
SpecFact now has a lifecycle-managed module system:
- `specfact init` is bootstrap-first: initializes local CLI state and reports prompt status.
- `specfact init ide` handles IDE prompt/template sync and IDE settings updates.
- `specfact module` is the canonical lifecycle surface:
- `specfact module install <namespace/name>` installs marketplace modules into `~/.specfact/marketplace-modules/`.
- `specfact module list [--source builtin|marketplace|custom]` shows multi-source discovery state.
- `specfact module enable <id>` / `specfact module disable <id> [--force]` manage enabled state.
- `specfact module uninstall <name>` and `specfact module upgrade <name>` manage marketplace lifecycle.
- `specfact init --list-modules`, `--enable-module`, and `--disable-module` remain supported as compatibility aliases during migration.
- Module lifecycle operations keep dependency-aware safety checks with `--force` cascading behavior.
- Module manifests support dependency and core-version compatibility enforcement at registration time.
This lifecycle model is the baseline for future granular module updates and enhancements. Module installation from third-party or open-source community providers is planned, but not implemented yet.
Contract-first module architecture highlights:
- `ModuleIOContract` formalizes module IO operations (`import`, `export`, `sync`, `validate`) on `ProjectBundle`.
- Core-module isolation is enforced by static analysis (`core` never imports `specfact_cli.modules.*` directly).
- Registration tracks protocol operation coverage and schema compatibility metadata.
- Bridge registry support allows module manifests to declare `service_bridges` converters (for example ADO/Jira/Linear/GitHub) loaded at lifecycle startup without direct core-to-module imports.
- Protocol reporting classifies modules from effective runtime interfaces with a single aggregate summary (`Full/Partial/Legacy`).
- Module manifests support publisher and integrity metadata (arch-06) with optional checksum and signature verification at registration time.
Why this matters:
- Feature areas can evolve independently without repeatedly modifying core CLI wiring.
- Module teams can ship at different speeds while preserving stable core behavior.
- Clear IO contracts reduce coupling and make future migrations (e.g., new adapters/modules) lower risk.
- Core remains focused on lifecycle, registry, and validation orchestration rather than tool-specific command logic.
---
## Developer Note: Command Layout
- Primary command implementations live in `src/specfact_cli/modules/<module>/src/commands.py`.
- Legacy imports from `src/specfact_cli/commands/*.py` are compatibility shims and only guarantee `app` re-exports.
- Preferred imports for module code:
- `from specfact_cli.modules.<module>.src.commands import app`
- `from specfact_cli.modules.<module>.src.commands import <symbol>`
- Shim deprecation timeline:
- Use of the legacy shims for non-`app` symbols is already deprecated.
- Shim removal is planned no earlier than `v0.30` (or the next major migration window).
---
## Where SpecFact Fits
SpecFact complements your stack rather than replacing it.
- **Spec-Kit/OpenSpec** for authoring and change tracking
- **Backlog tools** for planning and delivery
- **CI/CD** for enforcement and regression prevention
**SpecFact connects them** with adapters, policy checks, and deterministic validation.
**Integrations snapshot**: GitHub, Azure DevOps, Jira, Linear, Spec-Kit, OpenSpec, Specmatic.
- **[Integrations Overview](docs/guides/integrations-overview.md)**
- **[DevOps Adapter Integration](docs/guides/devops-adapter-integration.md)**
---
## Recommended Paths
- **I want a quick win**: [Getting Started](docs/getting-started/README.md)
- **I use an AI IDE**: [AI IDE Workflow](docs/guides/ai-ide-workflow.md)
- **We have a team backlog**: [Agile/Scrum Workflows](docs/guides/agile-scrum-workflows.md)
- **We have a long-lived codebase**: [Working With Existing Code](docs/guides/brownfield-engineer.md)
---
## Documentation Map
- **[Documentation Index](docs/README.md)**
- **[Command Reference](docs/reference/commands.md)**
- **[Backlog Refinement](docs/guides/backlog-refinement.md)**
- **[Backlog Dependency Analysis](docs/guides/backlog-dependency-analysis.md)**
- **[Backlog Delta Commands](docs/guides/backlog-delta-commands.md)**
- **[Policy Engine Commands](docs/guides/policy-engine-commands.md)**
- **[Project DevOps Flow](docs/guides/project-devops-flow.md)**
- **[Sidecar Validation](docs/guides/sidecar-validation.md)**
- **[OpenSpec Journey](docs/guides/openspec-journey.md)**
---
## Contributing
We welcome contributions. See [CONTRIBUTING.md](CONTRIBUTING.md).
```bash
git clone https://github.com/nold-ai/specfact-cli.git
cd specfact-cli
pip install -e ".[dev]"
hatch run contract-test-full
```
---
## License
**Apache License 2.0** - Open source and enterprise-friendly.
[Full license](LICENSE)
---
## Support
- **GitHub Discussions**: https://github.com/nold-ai/specfact-cli/discussions
- **GitHub Issues**: https://github.com/nold-ai/specfact-cli/issues
- **Email**: hello@noldai.com
- **Debug logs**: Run with `--debug` and check `~/.specfact/logs/specfact-debug.log`.
---
<div align="center">
**Built by [NOLD AI](https://noldai.com)**
Copyright © 2025-2026 Nold AI (Owner: Dominikus Nold)
**Trademarks**: NOLD AI (NOLDAI) is a registered trademark (wordmark) at the European Union Intellectual Property Office (EUIPO). All other trademarks mentioned in this project are the property of their respective owners. See [TRADEMARKS.md](TRADEMARKS.md) for more information.
</div>
| text/markdown | null | "NOLD AI (Owner: Dominikus Nold)" <hello@noldai.com> | null | null | Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2025 Nold AI (Owner: Dominikus Nold)
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. | agile, backlog, beartype, ceremonies, cli, contract-driven-development, contracts, crosshair, devops, existing-code, icontract, kanban, legacy, modernization, policy-as-code, property-based-testing, safe, scrum, specfact, specs, tdd, validation | [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Software Development :: Quality Assurance",
"Topic :: Software Development :: Testing"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"azure-identity>=1.17.1",
"beartype>=0.22.4",
"crosshair-tool>=0.0.97",
"gitpython>=3.1.45",
"graphviz>=0.20.1",
"hypothesis>=6.142.4",
"icontract>=2.7.1",
"jinja2>=3.1.6",
"jsonschema>=4.23.0",
"networkx>=3.4.2",
"opentelemetry-exporter-otlp-proto-http>=1.27.0",
"opentelemetry-sdk>=1.27.0",
"pydantic>=2.12.3",
"python-dotenv>=1.2.1",
"pyyaml>=6.0.3",
"questionary>=2.0.1",
"requests>=2.32.3",
"rich<13.6.0,>=13.5.2",
"ruamel-yaml>=0.18.16",
"ruff>=0.14.2",
"typer>=0.20.0",
"typing-extensions>=4.15.0",
"watchdog>=6.0.0",
"basedpyright>=1.32.1; extra == \"dev\"",
"bearer>=3.1.0; extra == \"dev\"",
"beartype>=0.22.4; extra == \"dev\"",
"crosshair-tool>=0.0.97; extra == \"dev\"",
"graphviz>=0.20.1; extra == \"dev\"",
"hypothesis>=6.142.4; extra == \"dev\"",
"icontract>=2.7.1; extra == \"dev\"",
"isort>=7.0.0; extra == \"dev\"",
"pip-tools>=7.5.1; extra == \"dev\"",
"pyan3>=1.2.0; extra == \"dev\"",
"pylint>=4.0.2; extra == \"dev\"",
"pytest-asyncio>=1.2.0; extra == \"dev\"",
"pytest-cov>=7.0.0; extra == \"dev\"",
"pytest-mock>=3.15.1; extra == \"dev\"",
"pytest-xdist>=3.8.0; extra == \"dev\"",
"pytest>=8.4.2; extra == \"dev\"",
"ruff>=0.14.2; extra == \"dev\"",
"semgrep>=1.144.0; extra == \"dev\"",
"tomlkit>=0.13.3; extra == \"dev\"",
"types-pyyaml>=6.0.12.20250516; extra == \"dev\"",
"bearer>=3.1.0; extra == \"enhanced-analysis\"",
"graphviz>=0.20.1; extra == \"enhanced-analysis\"",
"pyan3>=1.2.0; extra == \"enhanced-analysis\"",
"syft>=0.9.5; extra == \"enhanced-analysis\"",
"semgrep>=1.144.0; extra == \"scanning\""
] | [] | [] | [] | [
"Homepage, https://github.com/nold-ai/specfact-cli",
"Repository, https://github.com/nold-ai/specfact-cli.git",
"Documentation, https://github.com/nold-ai/specfact-cli#readme",
"Issues, https://github.com/nold-ai/specfact-cli/issues",
"Trademarks, https://github.com/nold-ai/specfact-cli/blob/main/TRADEMARKS.md"
] | twine/6.2.0 CPython/3.12.12 | 2026-02-21T02:21:27.132366 | specfact_cli-0.35.0.tar.gz | 826,839 | ad/ee/2a4a0873e5711d083b609320cbac746a327ccd62dcdef24d33972c65781e/specfact_cli-0.35.0.tar.gz | source | sdist | null | false | 6fabffccae87527daf2017f2ef631de1 | 682875e8250e2d36e9133e269570e74157940b5501af39c8590e878edf518c84 | adee2a4a0873e5711d083b609320cbac746a327ccd62dcdef24d33972c65781e | null | [
"LICENSE"
] | 210 |
2.4 | python-fasthtml | 0.12.47 | The fastest way to create an HTML app | # FastHTML
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->
Welcome to the official FastHTML documentation.
FastHTML is a next-generation web framework for fast, scalable web
applications with minimal, compact code. It’s designed to be:
- Powerful and expressive enough to build the most advanced, interactive
web apps you can imagine.
- Fast and lightweight, so you can write less code and get more done.
- Easy to learn and use, with a simple, intuitive syntax that makes it
easy to build complex apps quickly.
FastHTML apps are just Python code, so you can use FastHTML with the
full power of the Python language and ecosystem. FastHTML’s
functionality maps 1:1 directly to HTML and HTTP, but allows them to be
encapsulated using good software engineering practices—so you’ll need to
understand these foundations to use this library fully. To understand
how and why this works, please read this first:
[fastht.ml/about](https://fastht.ml/about).
## Installation
Since `fasthtml` is a Python library, you can install it with:
``` sh
pip install python-fasthtml
```
In the near future, we hope to add component libraries that can likewise
be installed via `pip`.
## Usage
For a minimal app, create a file “main.py” as follows:
<div class="code-with-filename">
**main.py**
``` python
from fasthtml.common import *
app,rt = fast_app()
@rt('/')
def get(): return Div(P('Hello World!'), hx_get="/change")
serve()
```
</div>
Running the app with `python main.py` prints out a link to your running
app: `http://localhost:5001`. Visit that link in your browser and you
should see a page with the text “Hello World!”. Congratulations, you’ve
just created your first FastHTML app!
Adding interactivity is surprisingly easy, thanks to HTMX. Modify the
file to add this function:
<div class="code-with-filename">
**main.py**
``` python
@rt('/change')
def get(): return P('Nice to be here!')
```
</div>
You now have a page with a clickable element that changes the text when
clicked. When clicking on this link, the server will respond with an
“HTML partial”—that is, just a snippet of HTML which will be inserted
into the existing page. In this case, the returned element will replace
the original P element (since that’s the default behavior of HTMX) with
the new version returned by the second route.
This “hypermedia-based” approach to web development is a powerful way to
build web applications.
### Getting help from AI
Because FastHTML is newer than most LLMs, AI systems like Cursor,
ChatGPT, Claude, and Copilot won’t give useful answers about it. To fix
that problem, we’ve provided an LLM-friendly guide that teaches them how
to use FastHTML. To use it, add this link for your AI helper to use:
- [/llms-ctx.txt](https://www.fastht.ml/docs/llms-ctx.txt)
This example is in a format based on recommendations from Anthropic for
use with [Claude
Projects](https://support.anthropic.com/en/articles/9517075-what-are-projects).
This works so well that we’ve actually found that Claude can provide
even better information than our own documentation! For instance, read
through [this annotated Claude
chat](https://gist.github.com/jph00/9559b0a563f6a370029bec1d1cc97b74)
for some great getting-started information, entirely generated from a
project using the above text file as context.
If you use Cursor, type `@doc` then choose “*Add new doc*”, and use the
/llms-ctx.txt link above. The context file is auto-generated from our
[`llms.txt`](https://llmstxt.org/) (our proposed standard for providing
AI-friendly information)—you can generate alternative versions suitable
for other models as needed.
## Next Steps
Start with the official sources to learn more about FastHTML:
- [About](https://fastht.ml/about): Learn about the core ideas behind
FastHTML
- [Documentation](https://www.fastht.ml/docs): Learn from examples how
to write FastHTML code
- [Idiomatic
app](https://github.com/AnswerDotAI/fasthtml/blob/main/examples/adv_app.py):
Heavily commented source code walking through a complete application,
including custom authentication, JS library connections, and database
use.
We also have a 1-hour intro video:
<https://www.youtube.com/embed/Auqrm7WFc0I>
The capabilities of FastHTML are vast and growing, and not all the
features and patterns have been documented yet. Be prepared to invest
time into studying and modifying source code, such as the main FastHTML
repo’s notebooks and the official FastHTML examples repo:
- [FastHTML Examples Repo on
GitHub](https://github.com/AnswerDotAI/fasthtml-example)
- [FastHTML Repo on GitHub](https://github.com/AnswerDotAI/fasthtml)
Then explore the small but growing third-party ecosystem of FastHTML
tutorials, notebooks, libraries, and components:
- [FastHTML Gallery](https://gallery.fastht.ml): Learn from minimal
  examples of components (e.g. chat bubbles, click-to-edit, infinite
  scroll)
- [Creating Custom FastHTML Tags for Markdown
Rendering](https://isaac-flath.github.io/website/posts/boots/FasthtmlTutorial.html)
by Isaac Flath
- [How to Build a Simple Login System in
FastHTML](https://blog.mariusvach.com/posts/login-fasthtml) by Marius
Vach
- Your tutorial here!
Finally, join the FastHTML community to ask questions, share your work,
and learn from others:
- [Discord](https://discord.gg/qcXvcxMhdP)
## Other languages and related projects
If you’re not a Python user, or are keen to try out a new language,
we’ll list here other projects that have a similar approach to FastHTML.
(Please reach out if you know of any other projects that you’d like to
see added.)
- [htmgo](https://htmgo.dev/) (Go): “*htmgo is a lightweight pure go way
to build interactive websites / web applications using go & htmx. By
combining the speed & simplicity of go + hypermedia attributes (htmx)
to add interactivity to websites, all conveniently wrapped in pure go,
you can build simple, fast, interactive websites without touching
javascript. All compiled to a single deployable binary*”
If you’re just interested in functional HTML components, rather than a
full HTMX server solution, consider:
- [fastcore.xml.FT](https://fastcore.fast.ai/xml.html): This is actually
what FastHTML uses behind the scenes
- [htpy](https://htpy.dev/): Similar to
[`fastcore.xml.FT`](https://fastcore.fast.ai/xml.html#ft), but with a
somewhat different syntax
- [elm-html](https://package.elm-lang.org/packages/elm/html/latest/):
Elm’s built-in HTML library with a type-safe functional approach
- [hiccup](https://github.com/weavejester/hiccup): Popular library for
representing HTML in Clojure using vectors
- [hiccl](https://github.com/garlic0x1/hiccl): HTML generation library
for Common Lisp inspired by Clojure’s Hiccup
- [Falco.Markup](https://github.com/pimbrouwers/Falco): F# HTML DSL and
web framework with type-safe HTML generation
- [Lucid](https://github.com/chrisdone/lucid): Type-safe HTML generation
for Haskell using monad transformers
- [dream-html](https://github.com/aantron/dream): Part of the Dream web
framework for OCaml, provides type-safe HTML templating
For other hypermedia application platforms, not based on HTMX, take a
look at:
- [Hotwire/Turbo](https://turbo.hotwired.dev/): Rails-oriented framework
that similarly uses HTML-over-the-wire
- [LiveView](https://hexdocs.pm/phoenix_live_view/Phoenix.LiveView.html):
Phoenix framework’s solution for building interactive web apps with
minimal JavaScript
- [Unpoly](https://unpoly.com/): Another HTML-over-the-wire framework
with progressive enhancement
- [Livewire](https://laravel-livewire.com/): Laravel’s take on building
dynamic interfaces with minimal JavaScript
| text/markdown | null | Jeremy Howard and contributors <github@jhoward.fastmail.fm> | null | null | Apache-2.0 | nbdev, jupyter, notebook, python | [
"Natural Language :: English",
"Intended Audience :: Developers",
"Development Status :: 3 - Alpha",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3 :: Only"
] | [] | null | null | >=3.10 | [] | [] | [] | [
"fastcore>=1.12.16",
"python-dateutil",
"starlette>0.33",
"oauthlib",
"itsdangerous",
"uvicorn[standard]>=0.30",
"httpx",
"fastlite>=0.1.1",
"python-multipart",
"beautifulsoup4",
"ipython; extra == \"dev\"",
"lxml; extra == \"dev\"",
"pysymbol_llm; extra == \"dev\"",
"monsterui; extra == \"dev\"",
"PyJWT; extra == \"dev\""
] | [] | [] | [] | [
"Repository, https://github.com/AnswerDotAI/fasthtml",
"Documentation, https://www.fastht.ml/docs/"
] | twine/6.2.0 CPython/3.12.0 | 2026-02-21T02:20:52.171349 | python_fasthtml-0.12.47.tar.gz | 71,755 | eb/47/afd5be266a7215921495553ad7afa26fa4c4a4b6e2f0d8f076f6098dfc1a/python_fasthtml-0.12.47.tar.gz | source | sdist | null | false | 455cb25a5b0a5d8184bb6c25038d4858 | 9efa6e1ff846e34889fcc4cbbab0b33b9e4d12c6a5d12aa1b8cf613675b7cee5 | eb47afd5be266a7215921495553ad7afa26fa4c4a4b6e2f0d8f076f6098dfc1a | null | [
"LICENSE"
] | 7,259 |
2.4 | chatjimmy-proxy | 0.1.0 | OpenAI-compatible API proxy for chatjimmy.ai | # chatjimmy-proxy
OpenAI-compatible HTTP proxy for chatjimmy.ai. Point any OpenAI SDK or tool at
it and use model `jimmy`.
## Quick start
1. clone and install:
```bash
git clone <repo>
cd chatjimmy-proxy
uv sync
uv run playwright install chromium
```
2. configure:
```bash
cp .env.example .env
# edit PROXY_API_KEY (leave blank to disable auth)
```
> **Note:** if you have a `PROXY_API_KEY` set in your shell environment (for
> example some systems default it to your username), the proxy will require
> that exact value in the `Authorization` header. Use
> `export PROXY_API_KEY=` to clear it, or choose a different secret.
3. run discovery once:
```bash
uv run chatjimmy-discover
```
4. start proxy (default port 8000, change with `PORT` env var):
```bash
uv run chatjimmy-proxy
# or explicitly:
uv run uvicorn chatjimmy_proxy.main:app --host 0.0.0.0 --port ${PORT:-8000}
```
If you see "address already in use", set `PORT` to a free port (e.g. 8001) or
kill the process currently listening on it.
## Usage
```bash
curl http://localhost:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $PROXY_API_KEY" \
-d '{"model":"jimmy","messages":[{"role":"user","content":"Hi"}]}'
```
Streaming: add `--no-buffer` and `"stream": true`.
Python example:
```python
from openai import OpenAI
client = OpenAI(base_url="http://localhost:8000/v1", api_key="key")
print(client.chat.completions.create(model="jimmy", messages=[{"role":"user","content":"Hi"}]).choices[0].message.content)
```
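Streaming responses arrive as OpenAI-style server-sent events. The sketch below shows how the `content` deltas can be reassembled client-side; it assumes the standard `data:`-prefixed chunk format and is illustrative, not part of the proxy itself:

```python
import json

def join_stream(raw: str) -> str:
    """Reassemble content deltas from OpenAI-style SSE lines."""
    parts = []
    for line in raw.splitlines():
        if not line.startswith("data: "):
            continue  # skip blank lines and SSE comments
        data = line[len("data: "):]
        if data == "[DONE]":
            break  # end-of-stream sentinel
        delta = json.loads(data)["choices"][0]["delta"]
        if "content" in delta:
            parts.append(delta["content"])
    return "".join(parts)

sample = (
    'data: {"choices":[{"delta":{"content":"Hel"}}]}\n'
    'data: {"choices":[{"delta":{"content":"lo"}}]}\n'
    'data: [DONE]\n'
)
print(join_stream(sample))  # prints "Hello"
```

The OpenAI Python SDK does this assembly for you when you pass `stream=True`; the sketch is only to show what travels over the wire.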
### Editors and agents
Any tool that lets you supply a custom OpenAI‑compatible provider should work.
You need three things:
1. **Base URL** – the root of the OpenAI API, not a specific endpoint. Use
`http://<host>:<port>/v1` (omit `/chat/completions`); most clients append the
path themselves. If you include `/chat/completions` twice you'll see
404s like `/v1/chat/completions/chat/completions`.
2. **API key** – the secret from `.env` (or any string if auth is off).
3. **Model** – `jimmy`.
Correct Roo Code configuration example:
```json
{
"provider": "OpenAI Compatible",
"baseUrl": "http://localhost:8000/v1",
"apiKey": "<your-proxy-key>",
"model": "jimmy"
}
```
If you accidentally set the base URL to include `/chat/completions`, the agent
will produce 404 errors when it tries to call `/v1/chat/completions/chat/completions`.
## Development
```
uv run pytest tests/ -v
uv run ruff check src/
uv run ruff format src/
```
## Packaging & publishing
Build with `uv run hatch build`. Releases are made by tagging `vX.Y.Z`;
GitHub Actions tests and publishes to PyPI using `PYPI_API_TOKEN`.
## Troubleshooting
* **Port already in use** – if the proxy fails to start with
`address already in use`:
```bash
# locate the offending PID
sudo lsof -i :8000 -t # substitute whatever port you were using
# kill it (or choose a different port)
sudo kill <pid>
# or directly:
sudo kill -9 $(sudo lsof -i :8000 -t)
```
alternatively, set `PORT` to a free port before launching:
```bash
PORT=8001 uv run chatjimmy-proxy
```
* **Discovery failures** – run with `HEADLESS=false` or switch to
  `mode: browser_relay` in the blueprint.
* **401/403** – clear `.jimmy_blueprint.json`/`.jimmy_state.json` and
re-run discovery; ensure `PROXY_API_KEY` is correct.
* **Slow first response** – discovery runs on startup; subsequent
requests are fast under HTTP‑replay mode.
## License
MIT – see LICENSE.
| text/markdown | chatjimmy-proxy contributors | null | null | null | MIT | chatjimmy, fastapi, openai, proxy, python | [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.11",
"Topic :: Internet :: WWW/HTTP"
] | [] | null | null | >=3.11 | [] | [] | [] | [
"fastapi>=0.115.0",
"httpx>=0.28.0",
"playwright>=1.48.0",
"pydantic-settings>=2.6.0",
"pydantic>=2.9.0",
"python-dotenv>=1.0.0",
"structlog>=24.4.0",
"tenacity>=9.0.0",
"uvicorn[standard]>=0.32.0",
"httpx>=0.28.0; extra == \"dev\"",
"pytest-asyncio>=0.24.0; extra == \"dev\"",
"pytest>=8.3.0; extra == \"dev\"",
"ruff>=0.8.0; extra == \"dev\""
] | [] | [] | [] | [
"Homepage, https://github.com/remixonwin/chatjimmy-proxy",
"Repository, https://github.com/remixonwin/chatjimmy-proxy",
"Bug Tracker, https://github.com/remixonwin/chatjimmy-proxy/issues"
] | twine/6.2.0 CPython/3.11.14 | 2026-02-21T02:20:18.155380 | chatjimmy_proxy-0.1.0.tar.gz | 3,718 | f4/eb/790fce56840ae8d3c9de62601c6b8f0baf8ef6568182d61d1fa319a454c5/chatjimmy_proxy-0.1.0.tar.gz | source | sdist | null | false | 813fd3f564e1c1b71980530efa4e6b92 | 2e18a6567d2341296982f66dc2a3792c6fc3ee9e640ac3db021741fd20a997b3 | f4eb790fce56840ae8d3c9de62601c6b8f0baf8ef6568182d61d1fa319a454c5 | null | [
"LICENSE"
] | 216 |
2.4 | aiel-sdk | 1.3.0 | AI Execution Layer SDK (contracts + registry + decorators) with curated facades. | # AIEL SDK
**AI Execution Layer SDK** — Enterprise-grade contract-first decorators, registry system, and curated facade for AI orchestration.
[](https://pypi.org/project/aiel-sdk/)
[](https://www.python.org/downloads/)
[](LICENSE)
---
## Overview
AIEL SDK provides a stable contract surface for building AI-powered applications that execute on the AIEL Execution Plane. This lightweight SDK enables:
- **Contract-first development** via decorators and type-safe registry
- **Curated facade modules** (`aiel_core.*`) mirroring server runtime imports
- **Clear error messages** for missing optional dependencies
- **Type safety** and IDE support for local development
> **Note:** Code execution is performed server-side by the Execution Plane using `aiel-runtime`. This SDK enables local development, type-checking, linting, and testing with identical import paths used in production.
---
## Table of Contents
- [Installation](#installation)
- [Quick Start](#quick-start)
- [Core Concepts](#core-concepts)
- [API Reference](#api-reference)
- [Runtime Execution](#runtime-execution)
- [Best Practices](#best-practices)
- [Troubleshooting](#troubleshooting)
---
## Installation
### Installation Matrix
Choose the appropriate installation based on your requirements:
| Use Case | Command | Description |
|----------|---------|-------------|
| **Minimal (Recommended)** | `pip install aiel-sdk` | Core contracts, decorators, and registry only |
| **LangGraph Development** | `pip install "aiel-sdk[langgraph]"` | Adds LangGraph facade support |
| **LangChain Development** | `pip install "aiel-sdk[langchain]"` | Adds LangChain Core facade support |
| **LangSmith Integration** | `pip install "aiel-sdk[langsmith]"` | Adds LangSmith facade support |
| **AIEL CLI Context** | `pip install "aiel-sdk[aiel-cli]"` | Adds CLI user context facade support |
| **Full Development Suite** | `pip install "aiel-sdk[all]"` | All curated integrations (best for experimentation) |
> **Shell Note:** When using zsh, always quote extras: `pip install "aiel-sdk[all]"`
### Requirements
- Python 3.8 or higher
- pip 21.0 or higher
---
## Quick Start
### Basic Example
```python
from aiel_sdk import tool, agent, flow, http, mcp_server
from aiel_core.pydantic import BaseModel, Field
# Define data models
class NormalizeEmailPayload(BaseModel):
email: str = Field(..., description="Raw email address")
name: str = Field(..., description="User's full name")
class NormalizeEmailOut(BaseModel):
email: str = Field(..., description="Normalized email address")
# Define a tool
@tool("normalize_email")
def normalize_email(ctx, payload: dict) -> dict:
"""Normalize email addresses to lowercase."""
data = NormalizeEmailPayload.model_validate(payload)
email = (data.email or "").strip().lower()
ctx.log("normalize_email", email=email)
return NormalizeEmailOut(email=email).model_dump()
# Define an agent
@agent("collect_personal_data")
def collect_personal_data(ctx, state: dict) -> dict:
"""Collect and validate personal information."""
email_info = normalize_email(ctx, {
"email": state.get("email"),
"name": state.get("name")
})
state = dict(state)
state["email"] = email_info["email"]
state["personal_validated"] = True
return state
# Define a flow
@flow("driver_onboarding")
def driver_onboarding(ctx, input: dict) -> dict:
"""Main onboarding orchestration flow."""
state = collect_personal_data(ctx, input)
return {"status": "ok", "state": state}
# Define HTTP endpoint
@http.post("/driver/onboard")
def http_driver_onboard(ctx, body: dict) -> dict:
"""HTTP handler for driver onboarding."""
return driver_onboarding(ctx, body)
# Expose via MCP
mcp_server("driver_support", tools=["normalize_email"])
```
### Using Facade Imports
```python
from aiel_core.langgraph.graph import StateGraph, MessagesState, START, END
from aiel_core.langchain.agents import create_agent
# Build a state graph
graph = StateGraph(MessagesState)
graph.add_node("process", lambda state: state)
graph.add_edge(START, "process")
graph.add_edge("process", END)
app = graph.compile()
result = app.invoke({"messages": []})
```
### Backend Integrations (Jira, Slack, Postgres)
```python
from aiel_sdk import IntegrationsClient
client = IntegrationsClient(
base_url="https://api.example.com",
api_key="REPLACE_ME",
)
connections = client.list("workspace_id", "project_id")
first = connections[0]
detail = client.get(first.workspace_id, first.project_id, first.connection_id)
# Example action (Slack)
client.invoke_action(
first.workspace_id,
first.project_id,
first.connection_id,
action="slack.post_message",
payload={"params": {"channel": "#general", "text": "Hello from AIEL SDK"}},
)
```
---
## Core Concepts
### Contract-First Architecture
The SDK defines a stable contract surface that mirrors the server runtime environment. This approach ensures:
- **Local Development Parity:** Import paths work identically in local and production environments
- **Type Safety:** Full IDE support and type checking during development
- **Clear Contracts:** Explicit interfaces between your code and the Execution Plane
### Registry System
Decorators automatically register exports into an in-memory registry:
- **`@tool(name)`** — Discrete actions callable by agents and flows
- **`@agent(name)`** — Orchestration steps that operate on state
- **`@flow(name)`** — Main orchestration entry points
- **`@http.get(path)` / `@http.post(path)`** — HTTP handler exports
- **`mcp_server(name, tools=[])`** — MCP server exposure metadata
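Conceptually, each decorator records its export in a lookup table keyed by kind and name, while leaving the function directly callable. A minimal pure-Python sketch of the idea (not the actual `aiel_sdk` implementation):

```python
# Illustrative registry sketch, NOT the real aiel_sdk internals.
REGISTRY = {"tool": {}, "agent": {}, "flow": {}}

def tool(name):
    """Register a function as a tool export while leaving it callable."""
    def wrap(fn):
        REGISTRY["tool"][name] = fn  # record the export under its name
        return fn                    # function stays usable directly
    return wrap

@tool("normalize_email")
def normalize_email(ctx, payload):
    return {"email": payload["email"].strip().lower()}

# The runtime can now discover and invoke the export by name:
fn = REGISTRY["tool"]["normalize_email"]
print(fn(None, {"email": "  User@Example.COM "}))  # {'email': 'user@example.com'}
```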
### Facade Modules
The `aiel_core.*` namespace provides stable import paths for curated integrations:
| Module | Import Path | Purpose |
|--------|-------------|---------|
| **Pydantic** | `aiel_core.pydantic` | Data validation and serialization |
| **LangGraph** | `aiel_core.langgraph.graph` | State graph orchestration |
| **LangChain** | `aiel_core.langchain.core` | Prompt templates and runnables |
| **LangChain Messages** | `aiel_core.langchain.messages` | AI/Human message types |
| **LangSmith** | `aiel_core.langsmith.client` | Observability and tracing |
| **AIEL CLI** | `aiel_core.aiel_cli.context.user_context` | CLI user context helpers |
Missing integrations fail with actionable error messages indicating the required installation command.
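That failure mode can be sketched as a guarded import that turns `ImportError` into an actionable message (illustrative only; the real facade modules implement this internally):

```python
import importlib

def require(module_name: str, extra: str):
    """Import a curated dependency, or explain how to install it."""
    try:
        return importlib.import_module(module_name)
    except ImportError as exc:
        raise ImportError(
            f"{module_name} is not available; "
            f'install it with: pip install "aiel-sdk[{extra}]"'
        ) from exc

# require("langgraph", "langgraph")  # raises with the pip hint if missing
```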
---
## API Reference
### Decorators
#### `@tool(name: str)`
Register a discrete tool that can be invoked by agents and flows.
```python
@tool("my_tool")
def my_tool(ctx, payload: dict) -> dict:
"""
Args:
ctx: Execution context with logging and utilities
payload: Input data dictionary
Returns:
Output data dictionary
"""
ctx.log("operation", key="value")
return {"result": "success"}
```
#### `@agent(name: str)`
Register an orchestration agent that operates on state.
```python
@agent("my_agent")
def my_agent(ctx, state: dict) -> dict:
"""
Args:
ctx: Execution context
state: Current state dictionary
Returns:
Updated state dictionary
"""
return {**state, "processed": True}
```
#### `@flow(name: str)`
Register a main orchestration entry point.
```python
@flow("my_flow")
def my_flow(ctx, input: dict) -> dict:
"""
Args:
ctx: Execution context
input: Initial input dictionary
Returns:
Final output dictionary
"""
return {"status": "complete"}
```
#### `@http.get(path: str)` / `@http.post(path: str)`
Register HTTP endpoint handlers.
```python
@http.get("/status")
def get_status(ctx, query: dict) -> dict:
"""Handle GET requests."""
return {"status": "healthy"}
@http.post("/process")
def process_data(ctx, body: dict) -> dict:
"""Handle POST requests."""
return {"processed": True}
```
#### `flow_graph(name: str, builder_fn: Callable)`
Register a LangGraph-based flow.
```python
def build_graph():
graph = StateGraph(dict)
graph.add_node("start", lambda s: s)
graph.add_edge(START, "start")
graph.add_edge("start", END)
return graph.compile()
flow_graph("my_graph_flow", build_graph)
```
#### `mcp_server(name: str, tools: List[str] = [])`
Register MCP server metadata.
```python
mcp_server("my_server", tools=["tool1", "tool2"])
```
### Facade Imports
#### Pydantic
```python
from aiel_core.pydantic import BaseModel, Field
class MyModel(BaseModel):
field: str = Field(..., description="Description")
```
#### LangGraph
```python
from aiel_core.langgraph.graph import StateGraph, START, END, MessagesState
graph = StateGraph(MessagesState)
```
#### LangChain Core
```python
from aiel_core.langchain.core import ChatPromptTemplate, PromptTemplate, Runnable, RunnableConfig
template = ChatPromptTemplate.from_template("Hello {name}")
```
#### LangSmith
```python
from aiel_core.langsmith.client import Client
client = Client()
```
---
## Runtime Execution
### Execution Plane Architecture
In production, the Execution Plane runs your code using `aiel-runtime`:
1. **Download:** Fetches project snapshot from Data Plane
2. **Import:** Loads `entry_point.py` to register exports
3. **Validate:** Ensures contracts are properly implemented
4. **Execute:** Invokes exports in a sandboxed runtime
The runtime is invoked via: `python -m aiel_runtime.runner`
### Runtime Bundles
The Execution Plane provides curated runtime images with approved dependencies:
#### Minimal Runtime
```
aiel-runtime==X.Y.Z
aiel-sdk==X.Y.Z
```
Core Python dependencies only.
#### AI Runtime
```
aiel-runtime[ai]==X.Y.Z
```
Includes curated orchestration dependencies:
- `langgraph`
- `langchain-core`
- `langsmith` (optional)
### Data Plane Manifest
Projects include a manifest specifying:
- **`runtime`**: Which runtime bundle/image to use
- **`sdk_version`**: Target SDK contract level
The Execution Plane allowlists runtime identifiers and pins package versions accordingly.
---
## Best Practices
### Project Structure
```
my-aiel-project/
├── entry_point.py # Main export definitions
├── tools/
│ ├── __init__.py
│ └── validation.py # Tool implementations
├── agents/
│ ├── __init__.py
│ └── orchestration.py # Agent implementations
├── flows/
│ ├── __init__.py
│ └── main.py # Flow definitions
├── models/
│ ├── __init__.py
│ └── schemas.py # Pydantic models
├── requirements.txt
└── README.md
```
### Type Safety
Always use Pydantic models for input validation:
```python
from aiel_sdk import tool
from aiel_core.pydantic import BaseModel, Field

class Input(BaseModel):
    field: str = Field(..., min_length=1)

@tool("my_tool")
def my_tool(ctx, payload: dict) -> dict:
    # Validate the raw payload into a typed model before use
    data = Input.model_validate(payload)
    # Type-safe access
    return {"result": data.field}
```
### Error Handling
Implement proper error handling and logging:
```python
from aiel_sdk import tool

@tool("safe_tool")
def safe_tool(ctx, payload: dict) -> dict:
    try:
        ctx.log("processing", status="started")
        result = process(payload)  # `process` is your business logic
        ctx.log("processing", status="completed")
        return result
    except Exception as e:
        ctx.log("error", message=str(e))
        return {"error": str(e), "status": "failed"}
```
### Naming Conventions
- Use **snake_case** for function and variable names
- Use **descriptive names** that reflect functionality
- Prefix internal helpers with underscore: `_internal_helper()`
---
## Troubleshooting
### Common Issues
#### Missing Integration Error
**Error:** `ImportError: aiel.langgraph is not available`
**Solution:** Install the required extra:
```bash
pip install "aiel-sdk[langgraph]"
```
#### Contract Validation Failure
**Error:** `Contract validation failed: invalid signature`
**Solution:** Ensure your functions match the expected signature:
- Tools: `(ctx, payload: dict) -> dict`
- Agents: `(ctx, state: dict) -> dict`
- Flows: `(ctx, input: dict) -> dict`
- HTTP handlers: `(ctx, body/query: dict) -> dict`
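A quick way to check a function against the Tools contract before deploying is to inspect its parameter names. The check below is a hypothetical approximation of what the runtime's validator might do, not its actual implementation:

```python
import inspect

def good_tool(ctx, payload: dict) -> dict:   # matches the Tools contract
    return payload

def bad_tool(payload):                        # missing `ctx` -- would fail validation
    return payload

def matches_tool_contract(fn) -> bool:
    # Assumed rule: exactly two parameters named `ctx` and `payload`.
    params = list(inspect.signature(fn).parameters)
    return params == ["ctx", "payload"]

print(matches_tool_contract(good_tool))  # True
print(matches_tool_contract(bad_tool))   # False
```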
#### Import Path Issues
**Error:** `ModuleNotFoundError: No module named 'aiel'`
**Solution:** Use correct import paths:
- SDK exports: `from aiel_sdk import tool, agent`
- Facade imports: `from aiel_core.langgraph.graph import StateGraph`
### Getting Help
- **Documentation:** [Internal docs portal]
- **Support:** [Internal support channel]
- **Issues:** [Internal issue tracker]
---
## Migration Guide
### From Legacy SDK
If migrating from a previous SDK version:
1. Update imports to use the `aiel_core.*` facade
2. Update decorator signatures to match new contracts
3. Replace direct library imports with facade imports
4. Update manifest `sdk_version` field
---
## Contributing
[Internal contribution guidelines]
---
## License
[Internal license information]
---
## Changelog
See [CHANGELOG.md](CHANGELOG.md) for version history.
---
**Maintained by:** Platform Engineering Team
**Last Updated:** January 2026