cloned_public_repos/zenml/CLA.md
# Fiduciary License Agreement 2.0
based on the
## Individual Contributor Exclusive License Agreement
(including the Traditional Patent License OPTION)
Thank you for your interest in contributing to ZenML by ZenML GmbH ("We" or "Us").
The purpose of this contributor agreement ("Agreement") is to clarify and document the rights granted by contributors to Us. To make this document effective, please follow the instructions at https://zenml.io/cla/.
### 0. Preamble
Software is deeply embedded in all aspects of our lives and it is important that it empower, rather than restrict us. Free Software gives everybody the rights to use, understand, adapt and share software. These rights help support other fundamental freedoms like freedom of speech, press and privacy.
Development of Free Software can follow many patterns. In some cases, the whole development is handled by a sole programmer or a small group of people, but usually the creation and maintenance of software is a complex process that requires the contribution of many individuals. This also affects who owns the rights to the software: in the latter case, rights in the software are owned jointly by a great number of individuals.
To tackle this issue some projects require a full copyright assignment to be signed by all contributors. The problem with such assignments is that they often lack checks and balances that would protect the contributors from potential abuse of power from the new copyright holder.
FSFE's Fiduciary License Agreement (FLA) was created by the Free Software Foundation Europe e.V. with just that in mind: to concentrate all deciding power within one entity and prevent fragmentation of rights on one hand, while on the other preventing that single entity from abusing its power. The main aim is to ensure that the software covered under the FLA will forever remain Free Software.
This process only serves for the transfer of economic rights. So-called moral rights (e.g. the author's right to be identified as author) remain with the original author(s) and are inalienable.
How to use this FLA
If You are an employee and have created the Contribution as part of your employment, You need to have Your employer approve this Agreement or sign the Entity version of this document. If You do not own the Copyright in the entire work of authorship, any other author of the Contribution should also sign this. In any event, please contact Us at support@zenml.io.
### 1. Definitions
"You" means the individual Copyright owner who Submits a Contribution to Us.
"Contribution" means any original work of authorship, including any original modifications or additions to an existing work of authorship, Submitted by You to Us, in which You own the Copyright.
"Copyright" means all rights protecting works of authorship, including copyright, moral and neighboring rights, as appropriate, for the full term of their existence.
"Material" means the software or documentation made available by Us to third parties. When this Agreement covers more than one software project, the Material means the software or documentation to which the Contribution was Submitted. After You Submit the Contribution, it may be included in the Material.
"Submit" means any act by which a Contribution is transferred to Us by You by means of tangible or intangible media, including but not limited to electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, Us, but excluding any transfer that is conspicuously marked or otherwise designated in writing by You as "Not a Contribution."
"Documentation" means any non-software portion of a Contribution.
### 2. License grant
#### 2.1 Copyright license to Us
Subject to the terms and conditions of this Agreement, You hereby grant to Us a worldwide, royalty-free, exclusive, perpetual and irrevocable (except as stated in Section 8.2) license, with the right to transfer an unlimited number of non-exclusive licenses or to grant sublicenses to third parties, under the Copyright covering the Contribution to use the Contribution by all means, including, but not limited to:
- publish the Contribution,
- modify the Contribution,
- prepare derivative works based upon or containing the Contribution and/or to combine the Contribution with other Materials,
- reproduce the Contribution in original or modified form,
- distribute, to make the Contribution available to the public, display and publicly perform the Contribution in original or modified form.
#### 2.2 Moral rights
Moral Rights remain unaffected to the extent they are recognized and not waivable by applicable law. Notwithstanding, You may add your name to the attribution mechanism customarily used in the Materials you Contribute to, such as the header of the source code files of Your Contribution, and We will respect this attribution when using Your Contribution.
#### 2.3 Copyright license back to You
Upon such grant of rights to Us, We immediately grant to You a worldwide, royalty-free, non-exclusive, perpetual and irrevocable license, with the right to transfer an unlimited number of non-exclusive licenses or to grant sublicenses to third parties, under the Copyright covering the Contribution to use the Contribution by all means, including, but not limited to:
- publish the Contribution,
- modify the Contribution,
- prepare derivative works based upon or containing the Contribution and/or to combine the Contribution with other Materials,
- reproduce the Contribution in original or modified form,
- distribute, to make the Contribution available to the public, display and publicly perform the Contribution in original or modified form.
This license back is limited to the Contribution and does not provide any rights to the Material.
### 3. Patents
#### 3.1 Patent license
Subject to the terms and conditions of this Agreement You hereby grant to Us and to recipients of Materials distributed by Us a worldwide, royalty-free, non-exclusive, perpetual and irrevocable (except as stated in Section 3.2) patent license, with the right to transfer an unlimited number of non-exclusive licenses or to grant sublicenses to third parties, to make, have made, use, sell, offer for sale, import and otherwise transfer the Contribution and the Contribution in combination with any Material (and portions of such combination). This license applies to all patents owned or controlled by You, whether already acquired or hereafter acquired, that would be infringed by making, having made, using, selling, offering for sale, importing or otherwise transferring of Your Contribution(s) alone or by combination of Your Contribution(s) with any Material.
#### 3.2 Revocation of patent license
You reserve the right to revoke the patent license stated in section 3.1 if We make any infringement claim that is targeted at your Contribution and not asserted for a Defensive Purpose. An assertion of claims of the Patents shall be considered for a "Defensive Purpose" if the claims are asserted against an entity that has filed, maintained, threatened, or voluntarily participated in a patent infringement lawsuit against Us or any of Our licensees.
### 4. License obligations by Us
We agree to (sub)license the Contribution or any Materials containing, based on or derived from your Contribution under the terms of any licenses the Free Software Foundation classifies as Free Software licenses and which are approved by the Open Source Initiative as Open Source licenses.
More specifically and in strict accordance with the above paragraph, we agree to (sub)license the Contribution or any Materials containing, based on or derived from the Contribution only under the terms of the following license(s): Apache-2.0 (including any right to adopt any future version of a license if permitted).
We agree to license patents owned or controlled by You only to the extent necessary to (sub)license Your Contribution(s) and the combination of Your Contribution(s) with the Material under the terms of any licenses the Free Software Foundation classifies as Free Software licenses and which are approved by the Open Source Initiative as Open Source licenses.
### 5. Disclaimer
THE CONTRIBUTION IS PROVIDED "AS IS". MORE PARTICULARLY, ALL EXPRESS OR IMPLIED WARRANTIES INCLUDING, WITHOUT LIMITATION, ANY IMPLIED WARRANTY OF SATISFACTORY QUALITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT ARE EXPRESSLY DISCLAIMED BY YOU TO US AND BY US TO YOU. TO THE EXTENT THAT ANY SUCH WARRANTIES CANNOT BE DISCLAIMED, SUCH WARRANTY IS LIMITED IN DURATION AND EXTENT TO THE MINIMUM PERIOD AND EXTENT PERMITTED BY LAW.
### 6. Consequential damage waiver
TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, IN NO EVENT WILL YOU OR WE BE LIABLE FOR ANY LOSS OF PROFITS, LOSS OF ANTICIPATED SAVINGS, LOSS OF DATA, INDIRECT, SPECIAL, INCIDENTAL, CONSEQUENTIAL AND EXEMPLARY DAMAGES ARISING OUT OF THIS AGREEMENT REGARDLESS OF THE LEGAL OR EQUITABLE THEORY (CONTRACT, TORT OR OTHERWISE) UPON WHICH THE CLAIM IS BASED.
### 7. Approximation of disclaimer and damage waiver
IF THE DISCLAIMER AND DAMAGE WAIVER MENTIONED IN SECTION 5. AND SECTION 6. CANNOT BE GIVEN LEGAL EFFECT UNDER APPLICABLE LOCAL LAW, REVIEWING COURTS SHALL APPLY LOCAL LAW THAT MOST CLOSELY APPROXIMATES AN ABSOLUTE WAIVER OF ALL CIVIL OR CONTRACTUAL LIABILITY IN CONNECTION WITH THE CONTRIBUTION.
### 8. Term
#### 8.1 This Agreement shall come into effect upon Your acceptance of the terms and conditions.
#### 8.2 This Agreement shall apply for the term of the copyright and patents licensed here. However, You shall have the right to terminate the Agreement if We do not fulfill the obligations as set forth in Section 4. Such termination must be made in writing.
#### 8.3 In the event of a termination of this Agreement Sections 5., 6., 7., 8., and 9. shall survive such termination and shall remain in full force thereafter. For the avoidance of doubt, Free and Open Source Software (sub)licenses that have already been granted for Contributions at the date of the termination shall remain in full force after the termination of this Agreement.
### 9. Miscellaneous
#### 9.1 This Agreement and all disputes, claims, actions, suits or other proceedings arising out of this agreement or relating in any way to it shall be governed by the laws of Germany excluding its private international law provisions.
#### 9.2 This Agreement sets out the entire agreement between You and Us for Your Contributions to Us and overrides all other agreements or understandings.
#### 9.3 In case of Your death, this agreement shall continue with Your heirs. In case of more than one heir, all heirs must exercise their rights through a commonly authorized person.
#### 9.4 If any provision of this Agreement is found void and unenforceable, such provision will be replaced to the extent possible with a provision that comes closest to the meaning of the original provision and that is enforceable. The terms and conditions set forth in this Agreement shall apply notwithstanding any failure of essential purpose of this Agreement or any limited remedy to the maximum extent possible under law.
#### 9.5 You agree to notify Us of any facts or circumstances of which you become aware that would make this Agreement inaccurate in any respect.
**You**
Date:_______________________________
Name:_______________________________
Title:______________________________
Address:____________________________
**Us**
Date:_______________________________
Name:_______________________________
Title:_______________________________
Address:_______________________________
cloned_public_repos/zenml/pyproject.toml
[tool.poetry]
name = "zenml"
version = "0.80.1"
packages = [{ include = "zenml", from = "src" }]
description = "ZenML: Write production-ready ML code."
authors = ["ZenML GmbH <info@zenml.io>"]
readme = "README.md"
homepage = "https://zenml.io"
documentation = "https://docs.zenml.io"
repository = "https://github.com/zenml-io/zenml"
license = "Apache-2.0"
keywords = ["machine learning", "production", "pipeline", "mlops", "devops"]
classifiers = [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"Intended Audience :: Science/Research",
"Intended Audience :: System Administrators",
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: System :: Distributed Computing",
"Topic :: Software Development :: Libraries :: Python Modules",
"Typing :: Typed",
]
exclude = [
"tests.*",
"*.tests",
"docs",
"tests",
"legacy",
"*.tests.*",
"examples",
]
include = ["src/zenml", "*.txt", "*.sh", "*.md"]
[tool.poetry.scripts]
zenml = "zenml.cli.cli:cli"
[tool.poetry.dependencies]
alembic = { version = "~1.8.1" }
bcrypt = { version = "4.0.1" }
click = "^8.0.1,<8.1.8"
cloudpickle = ">=2.0.0,<3"
distro = "^1.6.0"
docker = "~7.1.0"
gitpython = "^3.1.18"
packaging = ">=24.1"
passlib = { extras = ["bcrypt"], version = "~1.7.4" }
psutil = ">=5.0.0"
pydantic = "~2.8"
pydantic-settings = "*"
pymysql = { version = "~1.1.0,>=1.1.1" }
python = ">=3.9,<3.13"
python-dateutil = "^2.8.1"
pyyaml = ">=6.0.1"
rich = { extras = ["jupyter"], version = ">=12.0.0" }
setuptools = "*"
sqlalchemy = "^2.0.0"
sqlalchemy_utils = "*"
sqlmodel = "0.0.18"
importlib_metadata = { version = "<=7.0.0", python = "<3.10" }
# Optional dependencies for the ZenServer
fastapi = { version = ">=0.100, <=0.115.8", optional = true }
uvicorn = { extras = ["standard"], version = ">=0.17.5", optional = true }
python-multipart = { version = "~0.0.9", optional = true }
pyjwt = { extras = ["crypto"], version = "2.7.*", optional = true }
orjson = { version = "~3.10.0", optional = true }
Jinja2 = { version = "*", optional = true }
ipinfo = { version = ">=4.4.3", optional = true }
secure = { version = "~0.3.0", optional = true }
tldextract = { version = "~5.1.0", optional = true }
itsdangerous = { version = "~2.2.0", optional = true }
# Optional dependencies for project templates
copier = { version = ">=8.1.0", optional = true }
pyyaml-include = { version = "<2.0", optional = true }
jinja2-time = { version = "^0.2.0", optional = true }
# Optional dependencies for the AWS secrets store
boto3 = { version = ">=1.16.0", optional = true }
# Optional dependencies for the GCP secrets store
google-cloud-secret-manager = { version = ">=2.12.5", optional = true }
# Optional dependencies for the Azure Key Vault secrets store
requests = { version = "^2.27.11", optional = true }
azure-identity = { version = ">=1.4.0", optional = true }
azure-keyvault-secrets = { version = ">=4.0.0", optional = true }
# Optional dependencies for the HashiCorp Vault secrets store
hvac = { version = ">=0.11.2", optional = true }
# Optional dependencies for the AWS connector
aws-profile-manager = { version = ">=0.5.0", optional = true }
# Optional dependencies for the Kubernetes connector
kubernetes = { version = ">=18.20.0", optional = true }
# Optional dependencies for the GCP connector
google-cloud-container = { version = ">=2.21.0", optional = true }
google-cloud-storage = { version = ">=2.9.0", optional = true }
google-cloud-artifact-registry = { version = ">=1.11.3", optional = true }
# Optional dependencies for the Azure connector
azure-mgmt-containerservice = { version = ">=20.0.0", optional = true }
azure-mgmt-containerregistry = { version = ">=10.0.0", optional = true }
azure-mgmt-storage = { version = ">=20.0.0", optional = true }
azure-storage-blob = { version = ">=12.0.0", optional = true }
azure-mgmt-resource = { version = ">=21.0.0", optional = true }
# Optional dependencies for the S3 artifact store
s3fs = { version = ">=2022.11.0", optional = true }
# Optional dependencies for the Sagemaker orchestrator
sagemaker = { version = ">=2.199.0", optional = true }
# Optional dependencies for the GCS artifact store
gcsfs = { version = ">=2022.11.0", optional = true }
# Optional dependencies for the Vertex orchestrator
kfp = { version = ">=2.6.0", optional = true }
google-cloud-aiplatform = { version = ">=1.34.0", optional = true }
# Optional dependencies for the Azure artifact store
adlfs = { version = ">=2021.10.0", optional = true }
# Optional dependencies for the AzureML orchestrator
azure-ai-ml = { version = "1.23.1", optional = true }
# Optional development dependencies
bandit = { version = "^1.7.5", optional = true }
coverage = { extras = ["toml"], version = "^5.5", optional = true }
mypy = { version = "1.7.1", optional = true }
pyment = { version = "^0.3.3", optional = true }
tox = { version = "^3.24.3", optional = true }
hypothesis = { version = "^6.43.1", optional = true }
typing-extensions = { version = ">=3.7.4", optional = true }
darglint = { version = "^1.8.1", optional = true }
ruff = { version = ">=0.1.7", optional = true }
yamlfix = { version = "^1.16.0", optional = true }
maison = { version = "<2.0", optional = true }
# pytest
pytest = { version = "^7.4.0", optional = true }
pytest-randomly = { version = "^3.10.1", optional = true }
pytest-mock = { version = "^3.6.1", optional = true }
pytest-clarity = { version = "^1.0.1", optional = true }
pytest-instafail = { version = ">=0.5.0", optional = true }
pytest-rerunfailures = { version = ">=13.0", optional = true }
pytest-split = { version = "^0.8.1", optional = true }
# mkdocs including plugins
mkdocs = { version = "^1.6.1,<2.0.0", optional = true }
mkdocs-material = { version = ">=9.6.5,<10.0.0", optional = true }
mkdocs-awesome-pages-plugin = { version = ">=2.10.1,<3.0.0", optional = true }
mkdocstrings = { extras = ["python"], version = "^0.28.1,<1.0.0", optional = true }
mkdocs-autorefs = { version = ">=1.4.0,<2.0.0", optional = true }
mike = { version = ">=1.1.2,<2.0.0", optional = true }
# mypy type stubs
types-certifi = { version = "^2021.10.8.0", optional = true }
types-croniter = { version = "^1.0.2", optional = true }
types-futures = { version = "^3.3.1", optional = true }
types-Markdown = { version = "^3.3.6", optional = true }
types-paramiko = { version = ">=3.4.0", optional = true }
types-Pillow = { version = "^9.2.1", optional = true }
types-protobuf = { version = "^3.18.0", optional = true }
types-PyMySQL = { version = "^1.0.4", optional = true }
types-python-dateutil = { version = "^2.8.2", optional = true }
types-python-slugify = { version = "^5.0.2", optional = true }
types-PyYAML = { version = "^6.0.0", optional = true }
types-redis = { version = "^4.1.19", optional = true }
types-requests = { version = "^2.27.11", optional = true }
types-setuptools = { version = "^57.4.2", optional = true }
types-six = { version = "^1.16.2", optional = true }
types-termcolor = { version = "^1.1.2", optional = true }
types-psutil = { version = "^5.8.13", optional = true }
types-passlib = { version = "^1.7.7", optional = true }
[tool.poetry.extras]
server = [
"fastapi",
"uvicorn",
"python-multipart",
"pyjwt",
"fastapi-utils",
"orjson",
"Jinja2",
"ipinfo",
"secure",
"tldextract",
"itsdangerous",
]
templates = ["copier", "jinja2-time", "ruff", "pyyaml-include"]
terraform = ["python-terraform"]
secrets-aws = ["boto3"]
secrets-gcp = ["google-cloud-secret-manager"]
secrets-azure = ["azure-identity", "azure-keyvault-secrets"]
secrets-hashicorp = ["hvac"]
s3fs = ["s3fs"]
gcsfs = ["gcsfs"]
adlfs = ["adlfs"]
connectors-kubernetes = ["kubernetes"]
connectors-aws = ["boto3", "kubernetes", "aws-profile-manager"]
connectors-gcp = [
"google-cloud-container",
"google-cloud-storage",
"google-cloud-artifact-registry",
"kubernetes",
]
connectors-azure = [
"azure-identity",
"azure-mgmt-containerservice",
"azure-mgmt-containerregistry",
"azure-mgmt-storage",
"azure-storage-blob",
"azure-mgmt-resource",
"kubernetes",
"requests",
]
sagemaker = [
"sagemaker",
]
vertex = [
"google-cloud-aiplatform",
"kfp",
]
azureml = [
"azure-ai-ml",
]
dev = [
"bandit",
"ruff",
"yamlfix",
"coverage",
"pytest",
"mypy",
"pre-commit",
"pyment",
"tox",
"hypothesis",
"typing-extensions",
"darglint",
"pytest-randomly",
"pytest-mock",
"pytest-clarity",
"pytest-instafail",
"pytest-rerunfailures",
"pytest-split",
"mkdocs",
"mkdocs-material",
"mkdocs-awesome-pages-plugin",
"mkdocstrings",
"mkdocstrings-python",
"mkdocs-autorefs",
"mike",
"maison",
"types-certifi",
"types-croniter",
"types-futures",
"types-Markdown",
"types-paramiko",
"types-Pillow",
"types-protobuf",
"types-PyMySQL",
"types-python-dateutil",
"types-python-slugify",
"types-PyYAML",
"types-redis",
"types-requests",
"types-setuptools",
"types-six",
"types-termcolor",
"types-psutil",
"types-passlib",
]
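The extras defined under `[tool.poetry.extras]` above map directly onto pip's optional-dependency syntax. A minimal sketch of how a user might pull in some of them (the combination of extras chosen here is purely illustrative):

```shell
# Install ZenML together with the server and AWS secrets-store extras;
# pip resolves the optional dependencies listed for each extra above.
pip install "zenml[server,secrets-aws]"

# Multiple extras can also be installed in separate invocations:
pip install "zenml[connectors-gcp]"
```

Note that extras only toggle optional dependencies; the core `zenml` package is installed either way.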
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
[tool.poetry-version-plugin]
source = "init"
[tool.pytest.ini_options]
filterwarnings = ["ignore::DeprecationWarning"]
log_cli = true
log_cli_level = "INFO"
testpaths = "tests"
xfail_strict = true
norecursedirs = [
"tests/integration/examples/*", # ignore example folders
]
[tool.coverage.run]
parallel = true
source = ["src/zenml"]
[tool.coverage.report]
exclude_lines = [
"pragma: no cover",
'if __name__ == "__main__":',
"if TYPE_CHECKING:",
]
[tool.ruff]
line-length = 79
# Exclude a variety of commonly ignored directories.
exclude = [
".bzr",
".direnv",
".eggs",
".git",
".hg",
".mypy_cache",
".nox",
".pants.d",
".ruff_cache",
".svn",
".tox",
".venv",
"__pypackages__",
"_build",
"buck-out",
".test_durations",
"build",
"dist",
"node_modules",
"venv",
'__init__.py',
'src/zenml/cli/version.py',
# LitGPT files from the LLM Finetuning example
'examples/llm_finetuning/evaluate',
'examples/llm_finetuning/finetune',
'examples/llm_finetuning/generate',
'examples/llm_finetuning/lit_gpt',
'examples/llm_finetuning/scripts',
]
src = ["src", "tests"]
# use Python 3.9 as the minimum version for autofixing
target-version = "py39"
[tool.ruff.format]
exclude = [
"*.git",
"*.hg",
".mypy_cache",
".tox",
".venv",
"_build",
"buck-out",
"build",
]
[tool.ruff.lint]
# Disable autofix for unused imports (`F401`).
unfixable = ["F401"]
per-file-ignores = { }
ignore = [
"E501",
"F401",
"F403",
"D301",
"D401",
"D403",
"D407",
"D213",
"D203",
"S101",
"S104",
"S105",
"S106",
"S107",
]
select = ["D", "E", "F", "I", "I001", "Q"]
[tool.ruff.lint.flake8-import-conventions.aliases]
altair = "alt"
"matplotlib.pyplot" = "plt"
numpy = "np"
pandas = "pd"
seaborn = "sns"
[tool.ruff.lint.mccabe]
max-complexity = 18
[tool.ruff.lint.pydocstyle]
# Use Google-style docstrings.
convention = "google"
[tool.mypy]
plugins = ["pydantic.mypy"]
strict = true
namespace_packages = true
show_error_codes = true
# import all google, transformers and datasets files as `Any`
[[tool.mypy.overrides]]
module = [
"google.*",
"transformers.*", # https://github.com/huggingface/transformers/issues/13390
"datasets.*",
"langchain_community.*",
"IPython.core.*",
]
follow_imports = "skip"
[[tool.mypy.overrides]]
module = [
"airflow.*",
"tensorflow.*",
"apache_beam.*",
"pandas.*",
"distro.*",
"analytics.*",
"absl.*",
"gcsfs.*",
"s3fs.*",
"adlfs.*",
"fsspec.*",
"torch.*",
"pytorch_lightning.*",
"sklearn.*",
"numpy.*",
"facets_overview.*",
"IPython.core.*",
"IPython.display.*",
"plotly.*",
"dash.*",
"dash_bootstrap_components.*",
"dash_cytoscape",
"dash.dependencies",
"docker.*",
"flask.*",
"kfp.*",
"kubernetes.*",
"urllib3.*",
"kfp_server_api.*",
"sagemaker.*",
"azureml.*",
"google.*",
"google_cloud_pipeline_components.*",
"neuralprophet.*",
"lightgbm.*",
"scipy.*",
"deepchecks.*",
"boto3.*",
"botocore.*",
"jupyter_dash.*",
"slack_sdk.*",
"azure-keyvault-keys.*",
"azure-mgmt-resource.*",
"azure.mgmt.resource.*",
"model_archiver.*",
"kfp_tekton.*",
"mlflow.*",
"python_terraform.*",
"bentoml.*",
"multipart.*",
"jose.*",
"sqlalchemy_utils.*",
"sky.*",
"copier.*",
"datasets.*",
"pyngrok.*",
"cloudpickle.*",
"matplotlib.*",
"IPython.*",
"huggingface_hub.*",
"distutils.*",
"accelerate.*",
"label_studio_sdk.*",
"argilla.*",
"lightning_sdk.*",
"peewee.*",
"prodigy.*",
"prodigy.components.*",
"prodigy.components.db.*",
"transformers.*",
"vllm.*",
"numba.*",
"uvloop.*",
]
ignore_missing_imports = true
cloned_public_repos/zenml/release-cloudbuild-preparation.yaml
steps:
# login to Dockerhub
- name: gcr.io/cloud-builders/docker
args:
- '-c'
- docker login --username=$$USERNAME --password=$$PASSWORD
id: docker-login
entrypoint: bash
secretEnv:
- USERNAME
- PASSWORD
# Build base image
- name: gcr.io/cloud-builders/docker
args:
- '-c'
- |
docker build . \
--platform linux/amd64 \
-f docker/zenml-dev.Dockerfile \
-t $$USERNAME/prepare-release:base-${_ZENML_NEW_VERSION}
id: build-base
waitFor: ['-']
entrypoint: bash
secretEnv:
- USERNAME
# Push base image
- name: gcr.io/cloud-builders/docker
args:
- '-c'
- docker push $$USERNAME/prepare-release:base-${_ZENML_NEW_VERSION}
id: push-base
waitFor:
- docker-login
- build-base
entrypoint: bash
secretEnv:
- USERNAME
# Build server image
- name: gcr.io/cloud-builders/docker
args:
- '-c'
- |
docker build . \
--platform linux/amd64 \
-f docker/zenml-server-dev.Dockerfile \
-t $$USERNAME/prepare-release:server-${_ZENML_NEW_VERSION}
id: build-server
waitFor: ['-']
entrypoint: bash
secretEnv:
- USERNAME
# Push server image
- name: gcr.io/cloud-builders/docker
args:
- '-c'
- docker push $$USERNAME/prepare-release:server-${_ZENML_NEW_VERSION}
id: push-server
waitFor:
- docker-login
- build-server
entrypoint: bash
secretEnv:
- USERNAME
# Build Quickstart GCP Image
- name: gcr.io/cloud-builders/docker
args:
- '-c'
- |
docker build . \
--platform linux/amd64 \
--build-arg BASE_IMAGE=$$USERNAME/prepare-release:base-${_ZENML_NEW_VERSION} \
--build-arg CLOUD_PROVIDER=gcp \
--build-arg ZENML_BRANCH=${_ZENML_BRANCH} \
-f docker/zenml-quickstart-dev.Dockerfile \
-t $$USERNAME/prepare-release:quickstart-gcp-${_ZENML_NEW_VERSION}
id: build-quickstart-gcp
waitFor:
- push-base
entrypoint: bash
secretEnv:
- USERNAME
# Build Quickstart AWS image
- name: gcr.io/cloud-builders/docker
args:
- '-c'
- |
docker build . \
--platform linux/amd64 \
--build-arg BASE_IMAGE=$$USERNAME/prepare-release:base-${_ZENML_NEW_VERSION} \
--build-arg CLOUD_PROVIDER=aws \
--build-arg ZENML_BRANCH=${_ZENML_BRANCH} \
-f docker/zenml-quickstart-dev.Dockerfile \
-t $$USERNAME/prepare-release:quickstart-aws-${_ZENML_NEW_VERSION}
id: build-quickstart-aws
waitFor:
- push-base
entrypoint: bash
secretEnv:
- USERNAME
# Build Quickstart Azure image
- name: gcr.io/cloud-builders/docker
args:
- '-c'
- |
docker build . \
--platform linux/amd64 \
--build-arg BASE_IMAGE=$$USERNAME/prepare-release:base-${_ZENML_NEW_VERSION} \
--build-arg CLOUD_PROVIDER=azure \
--build-arg ZENML_BRANCH=${_ZENML_BRANCH} \
-f docker/zenml-quickstart-dev.Dockerfile \
-t $$USERNAME/prepare-release:quickstart-azure-${_ZENML_NEW_VERSION}
id: build-quickstart-azure
waitFor:
- push-base
entrypoint: bash
secretEnv:
- USERNAME
# Push Quickstart images
- name: gcr.io/cloud-builders/docker
args:
- '-c'
- |
docker push $$USERNAME/prepare-release:quickstart-aws-${_ZENML_NEW_VERSION}
docker push $$USERNAME/prepare-release:quickstart-azure-${_ZENML_NEW_VERSION}
docker push $$USERNAME/prepare-release:quickstart-gcp-${_ZENML_NEW_VERSION}
id: push-quickstart
waitFor:
- docker-login
- build-quickstart-gcp
- build-quickstart-aws
- build-quickstart-azure
entrypoint: bash
secretEnv:
- USERNAME
timeout: 3600s
availableSecrets:
secretManager:
- versionName: projects/$PROJECT_ID/secrets/docker-password/versions/1
env: PASSWORD
- versionName: projects/$PROJECT_ID/secrets/docker-username/versions/1
env: USERNAME
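A build like this is typically triggered with `gcloud builds submit`, passing values for the user-defined substitutions (`_ZENML_NEW_VERSION`, `_ZENML_BRANCH`) referenced in the steps above. A sketch, assuming the config file lives at the repository root; the version and branch values are placeholders:

```shell
# Submit the build to Cloud Build, supplying the custom substitutions
# consumed by the steps in release-cloudbuild-preparation.yaml.
gcloud builds submit . \
  --config=release-cloudbuild-preparation.yaml \
  --substitutions=_ZENML_NEW_VERSION=0.80.1,_ZENML_BRANCH=release/0.80.1
```

The `docker-password` and `docker-username` secrets must already exist in Secret Manager for the `availableSecrets` section to resolve.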
cloned_public_repos/zenml/ROADMAP.md
# Roadmap
The roadmap is an encapsulation of the features that we intend to build for ZenML. However, please note that we have limited resources and therefore no means of guaranteeing that this roadmap will be followed precisely as described on this page. Rest assured, we are working to follow it diligently - please keep us in check!
The roadmap is public and can be found [here](https://zenml.io/roadmap).
cloned_public_repos/zenml/.typos.toml
[files]
extend-exclude = [
"*.json",
"*.js",
"*.ipynb",
"src/zenml/zen_stores/migrations/versions/",
"tests/unit/materializers/test_built_in_materializer.py",
"tests/integration/functional/cli/test_pipeline.py",
"src/zenml/zen_server/dashboard/",
"examples/llm_finetuning/lit_gpt/"
]
[default.extend-identifiers]
HashiCorp = "HashiCorp"
NDArray = "NDArray"
K_Scatch = "K_Scatch"
MCAGA1UECgwZQW1hem9uIFdlYiBTZXJ2aWNlcywgSW5jLjETMBEGA1UECwwKQW1h = "MCAGA1UECgwZQW1hem9uIFdlYiBTZXJ2aWNlcywgSW5jLjETMBEGA1UECwwKQW1h"
VQQGEwJVUzEQMA4GA1UEBwwHU2VhdHRsZTETMBEGA1UECAwKV2FzaGluZ3RvbjEi = "VQQGEwJVUzEQMA4GA1UEBwwHU2VhdHRsZTETMBEGA1UECAwKV2FzaGluZ3RvbjEi"
MDEyOk9yZ2FuaXphdGlvbjg4Njc2OTU1 = "MDEyOk9yZ2FuaXphdGlvbjg4Njc2OTU1"
[default.extend-words]
# Don't correct the surname "Teh"
aks = "aks"
hashi = "hashi"
womens = "womens"
prepend = "prepend"
prepended = "prepended"
goes = "goes"
bare = "bare"
prepending = "prepending"
prev = "prev"
creat = "creat"
ret = "ret"
daa = "daa"
arange = "arange"
cachable = "cachable"
OT = "OT"
cll = "cll"
[default]
locale = "en-us"
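This configuration is picked up automatically by the `typos` spell checker (the typos-cli project) when invoked from the repository root; a brief sketch of typical usage:

```shell
# Report spelling issues; .typos.toml in the working directory is applied,
# so the excluded paths and extended identifiers above are honored.
typos .

# Apply safe corrections in place.
typos --write-changes .
```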
cloned_public_repos/zenml/README.md
<div align="center">
<img referrerpolicy="no-referrer-when-downgrade" src="https://static.scarf.sh/a.png?x-pxid=0fcbab94-8fbe-4a38-93e8-c2348450a42e" />
<h1 align="center">Beyond The Demo: Production-Grade AI Systems</h1>
<h3 align="center">ZenML brings battle-tested MLOps practices to your AI applications, handling evaluation, monitoring, and deployment at scale</h3>
</div>
<!-- PROJECT SHIELDS -->
<!--
*** I'm using markdown "reference style" links for readability.
*** Reference links are enclosed in brackets [ ] instead of parentheses ( ).
*** See the bottom of this document for the declaration of the reference variables
*** for contributors-url, forks-url, etc. This is an optional, concise syntax you may use.
*** https://www.markdownguide.org/basic-syntax/#reference-style-links
-->
<div align="center">
<!-- PROJECT LOGO -->
<br />
<a href="https://zenml.io">
<img alt="ZenML Logo" src="docs/book/.gitbook/assets/header.png">
</a>
<br />
[![PyPi][pypi-shield]][pypi-url]
[![PyPi][pypiversion-shield]][pypi-url]
[![PyPi][downloads-shield]][downloads-url]
[![Contributors][contributors-shield]][contributors-url]
[![License][license-shield]][license-url]
<!-- [![Build][build-shield]][build-url] -->
<!-- [![CodeCov][codecov-shield]][codecov-url] -->
</div>
<!-- MARKDOWN LINKS & IMAGES -->
<!-- https://www.markdownguide.org/basic-syntax/#reference-style-links -->
[pypi-shield]: https://img.shields.io/pypi/pyversions/zenml?color=281158
[pypi-url]: https://pypi.org/project/zenml/
[pypiversion-shield]: https://img.shields.io/pypi/v/zenml?color=361776
[downloads-shield]: https://img.shields.io/pypi/dm/zenml?color=431D93
[downloads-url]: https://pypi.org/project/zenml/
[codecov-shield]: https://img.shields.io/codecov/c/gh/zenml-io/zenml?color=7A3EF4
[codecov-url]: https://codecov.io/gh/zenml-io/zenml
[contributors-shield]: https://img.shields.io/github/contributors/zenml-io/zenml?color=7A3EF4
[contributors-url]: https://github.com/zenml-io/zenml/graphs/contributors
[license-shield]: https://img.shields.io/github/license/zenml-io/zenml?color=9565F6
[license-url]: https://github.com/zenml-io/zenml/blob/main/LICENSE
[linkedin-shield]: https://img.shields.io/badge/-LinkedIn-black.svg?style=for-the-badge&logo=linkedin&colorB=555
[linkedin-url]: https://www.linkedin.com/company/zenml/
[twitter-shield]: https://img.shields.io/twitter/follow/zenml_io?style=for-the-badge
[twitter-url]: https://twitter.com/zenml_io
[slack-shield]: https://img.shields.io/badge/-Slack-black.svg?style=for-the-badge&logo=slack&colorB=555
[slack-url]: https://zenml.io/slack-invite
[build-shield]: https://img.shields.io/github/workflow/status/zenml-io/zenml/Build,%20Lint,%20Unit%20&%20Integration%20Test/develop?logo=github&style=for-the-badge
[build-url]: https://github.com/zenml-io/zenml/actions/workflows/ci.yml
---
Need help with documentation? Visit our [docs site](https://docs.zenml.io) for comprehensive guides and tutorials, or browse the [SDK reference](https://sdkdocs.zenml.io/) to find specific functions and classes.
## โญ๏ธ Show Your Support
If you find ZenML helpful or interesting, please consider giving us a star on GitHub. Your support helps promote the project and lets others know that it's worth checking out.
Thank you for your support! ๐
[](https://github.com/zenml-io/zenml/stargazers)
## ๐คธ Quickstart
[](https://colab.research.google.com/github/zenml-io/zenml/blob/main/examples/quickstart/quickstart.ipynb)
[Install ZenML](https://docs.zenml.io/getting-started/installation) via [PyPI](https://pypi.org/project/zenml/). Python 3.9 - 3.12 is required:
```bash
pip install "zenml[server]" notebook
```
Take a tour with the guided quickstart by running:
```bash
zenml go
```
## ๐ช From Prototype to Production: AI Made Simple
### Create AI pipelines with minimal code changes
ZenML is an open-source framework that handles MLOps and LLMOps for engineers scaling AI beyond prototypes. Automate evaluation loops, track performance, and deploy updates across 100s of pipelinesโall while your RAG apps run like clockwork.
```python
from zenml import pipeline, step
@step
def load_rag_documents() -> dict:
# Load and chunk documents for RAG pipeline
documents = extract_web_content(url="https://www.zenml.io/")
return {"chunks": chunk_documents(documents)}
@step
def generate_embeddings(data: dict) -> dict:
# Generate embeddings for RAG pipeline
embeddings = embed_documents(data['chunks'])
return {"embeddings": embeddings}
@step
def index_generator(
embeddings: dict,
) -> str:
# Generate index for RAG pipeline
index = create_index(embeddings)
return index.id
@pipeline
def rag_pipeline() -> str:
documents = load_rag_documents()
embeddings = generate_embeddings(documents)
index = index_generator(embeddings)
return index
```
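The helpers called in the snippet above (`extract_web_content`, `chunk_documents`, `embed_documents`, `create_index`) are placeholders, not part of ZenML. For illustration only, a naive sliding-window `chunk_documents` might look like this:

```python
def chunk_documents(documents, chunk_size: int = 200, overlap: int = 20):
    """Split each document into fixed-size character windows with overlap.

    A deliberately simple sketch; real RAG pipelines usually chunk on
    token or sentence boundaries instead of raw characters.
    """
    chunks = []
    step = chunk_size - overlap
    for doc in documents:
        for start in range(0, len(doc), step):
            chunks.append(doc[start:start + chunk_size])
    return chunks
```

Because the windows overlap by `overlap` characters, text near a chunk boundary still appears in full in at least one chunk, which helps retrieval quality.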

### Easily provision an MLOps stack or reuse your existing infrastructure
The framework is a gentle entry point for practitioners to build complex ML pipelines with little knowledge of the underlying infrastructure. ZenML pipelines can run on AWS, GCP, Azure, Airflow, Kubeflow, and even plain Kubernetes without any code changes or knowledge of the underlying internals.
ZenML also provides features that help you get started quickly in a remote setting. If you want to deploy a remote stack from scratch on your cloud provider of choice, you can use the 1-click deployment feature, either through the dashboard:

Or, through our CLI command:
```bash
zenml stack deploy --provider aws
```
Alternatively, if the necessary pieces of infrastructure are already deployed, you can register a cloud stack seamlessly through the stack wizard:
```bash
zenml stack register <STACK_NAME> --provider aws
```
Read more about [ZenML stacks](https://docs.zenml.io/user-guide/production-guide/understand-stacks).
### Run workloads easily on your production infrastructure
Once you have your MLOps stack configured, you can easily run workloads on it:
```bash
zenml stack set <STACK_NAME>
python run.py
```
```python
from zenml.config import ResourceSettings, DockerSettings
@step(
settings={
        "resources": ResourceSettings(memory="16GB", gpu_count=1, cpu_count=8),
"docker": DockerSettings(parent_image="pytorch/pytorch:1.12.1-cuda11.3-cudnn8-runtime")
}
)
def training(...):
...
```

### Track models, pipelines, and artifacts
Create a complete lineage showing who produced each model and artifact, where, when, and from which data.
You'll be able to find out who produced which model, at what time, with which data, and on which version of the code. This guarantees full reproducibility and auditability.
```python
from zenml import Model
@step(model=Model(name="rag_llm", tags=["staging"]))
def deploy_rag(index_id: str) -> str:
deployment_id = deploy_to_endpoint(index_id)
return deployment_id
```

## ๐ Key LLMOps Capabilities
### Continual RAG Improvement
**Build production-ready retrieval systems**
<div align="center">
<img src="/docs/book/.gitbook/assets/rag_zenml_home.png" width="800" alt="RAG Pipeline">
</div>
ZenML tracks document ingestion, embedding versions, and query patterns. Implement feedback loops and:
- Fix your RAG logic based on production logs
- Automatically re-ingest updated documents
- A/B test different embedding models
- Monitor retrieval quality metrics
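The A/B-testing and quality-monitoring bullets above can be illustrated with a small, self-contained sketch. The bucketing scheme and the hit-rate metric are illustrative choices, and the ZenML pipeline plumbing is omitted:

```python
import hashlib
from typing import List


def ab_route(query_id: str, split: float = 0.5) -> str:
    """Deterministically bucket a query into embedding model A or B.

    Hashing the query id (rather than random choice) keeps assignments
    stable, so the same query is always served by the same model.
    """
    bucket = int(hashlib.md5(query_id.encode()).hexdigest(), 16) % 100
    return "embedding_model_a" if bucket < split * 100 else "embedding_model_b"


def hit_rate(retrieved: List[List[str]], relevant: List[str]) -> float:
    """Fraction of queries whose relevant document appears in the retrieved set."""
    hits = sum(1 for docs, gold in zip(retrieved, relevant) if gold in docs)
    return hits / len(relevant)
```

In a ZenML setup, a metric like `hit_rate` would typically be computed in an evaluation step and logged per embedding-model variant, so regressions show up in pipeline history.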
### Reproducible Model Fine-Tuning
**Confidence in model updates**
<div align="center">
<img src="/docs/book/.gitbook/assets/finetune_zenml_home.png" width="800" alt="Finetuning Pipeline">
</div>
Maintain full lineage of SLM/LLM training runs:
- Version training data and hyperparameters
- Track performance across iterations
- Automatically promote validated models
- Roll back to previous versions if needed
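The promote/roll-back bullets above boil down to a gating decision. Here is a minimal sketch of that logic; the metric name and threshold are made up for illustration, and in practice the decision would run inside a ZenML step against tracked evaluation results:

```python
def should_promote(candidate: dict, production: dict,
                   metric: str = "eval_score", min_gain: float = 0.01) -> bool:
    """Promote only if the candidate beats production by at least min_gain."""
    if production is None:  # nothing in production yet: promote the first model
        return True
    return candidate[metric] >= production[metric] + min_gain


def next_stage(candidate: dict, production: dict) -> str:
    """Decide which stage tag the candidate model version should receive."""
    return "production" if should_promote(candidate, production) else "staging"
```

Requiring a minimum gain (rather than any improvement) avoids churning the production model on noise-level metric differences.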
### Purpose built for machine learning with integrations to your favorite tools
While ZenML brings a lot of value out of the box, it also integrates into your existing tooling and infrastructure without you having to be locked in.
```python
import mlflow
import pandas as pd
from bentoml._internal.bento import bento
@step(on_failure=alert_slack, experiment_tracker="mlflow")
def train_and_deploy(training_df: pd.DataFrame) -> bento.Bento:
mlflow.autolog()
...
return bento
```

## ๐ Your LLM Framework Isn't Enough for Production
While tools like LangChain and LlamaIndex help you **build** LLM workflows, ZenML helps you **productionize** them by adding:
โ
**Artifact Tracking** - Every vector store index, fine-tuned model, and evaluation result versioned automatically
โ
**Pipeline History** - See exactly what code/data produced each version of your RAG system
โ
**Stage Promotion** - Move validated pipelines from staging โ production with one click
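The pipeline-history idea can be sketched as a tiny in-memory lineage registry. This is illustrative only; ZenML's real metadata store records far more (code version, parameters, run environment):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ArtifactVersion:
    name: str
    version: int
    produced_by: str       # id of the pipeline run that created it
    input_names: tuple     # names of the artifacts it was built from


class LineageRegistry:
    """Minimal in-memory stand-in for a versioned artifact store."""

    def __init__(self) -> None:
        self._versions = {}

    def log(self, name: str, produced_by: str, input_names: tuple = ()) -> ArtifactVersion:
        # Each new record for the same artifact name gets the next version number.
        version = len(self._versions.setdefault(name, [])) + 1
        av = ArtifactVersion(name, version, produced_by, input_names)
        self._versions[name].append(av)
        return av

    def latest(self, name: str) -> ArtifactVersion:
        return self._versions[name][-1]


registry = LineageRegistry()
registry.log("embeddings", produced_by="run-1")
registry.log("vector_index", produced_by="run-1", input_names=("embeddings",))
registry.log("vector_index", produced_by="run-2", input_names=("embeddings",))
```

Walking `input_names` backwards from any version answers "what data produced this index?", which is exactly the question a lineage store exists to answer.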
## ๐ผ๏ธ Learning
The best way to learn about ZenML is the [docs](https://docs.zenml.io/). We recommend beginning with the [Starter Guide](https://docs.zenml.io/user-guide/starter-guide) to get up and running quickly.
If you are a visual learner, this 11-minute video tutorial is also a great start:
[](https://www.youtube.com/watch?v=wEVwIkDvUPs)
And finally, here are some other examples and use cases for inspiration:
1. [E2E Batch Inference](examples/e2e/): Feature engineering, training, and inference pipelines for tabular machine learning.
2. [Basic NLP with BERT](examples/e2e_nlp/): Feature engineering, training, and inference focused on NLP.
3. [LLM RAG Pipeline with Langchain and OpenAI](https://github.com/zenml-io/zenml-projects/tree/main/zenml-support-agent): Using Langchain to create a simple RAG pipeline.
4. [Huggingface Model to Sagemaker Endpoint](https://github.com/zenml-io/zenml-projects/tree/main/huggingface-sagemaker): Automated MLOps on Amazon Sagemaker and HuggingFace
5. [LLMOps](https://github.com/zenml-io/zenml-projects/tree/main/llm-complete-guide): A complete guide to working with LLMs using ZenML
## ๐ Learn from Books
<div align="center">
<a href="https://www.amazon.com/LLM-Engineers-Handbook-engineering-production/dp/1836200072">
<img src="docs/book/.gitbook/assets/llm_engineering_handbook_cover.jpg" alt="LLM Engineer's Handbook Cover" width="200"/></img>
</a>
<a href="https://www.amazon.com/-/en/Andrew-McMahon/dp/1837631964">
<img src="docs/book/.gitbook/assets/ml_engineering_with_python.jpg" alt="Machine Learning Engineering with Python Cover" width="200"/></img>
</a>
</br></br>
</div>
ZenML is featured in these comprehensive guides to modern MLOps and LLM engineering. Learn how to build production-ready machine learning systems with real-world examples and best practices.
## ๐ Deploy ZenML
For full functionality, ZenML should be deployed in the cloud, where it enables collaborative features and serves as the central MLOps interface for teams.
Read more about various deployment options [here](https://docs.zenml.io/getting-started/deploying-zenml).
Or, sign up for [ZenML Pro to get a fully managed server on a free trial](https://cloud.zenml.io/?utm_source=readme&utm_medium=referral_link&utm_campaign=cloud_promotion&utm_content=signup_link).
## Use ZenML with VS Code
ZenML has a [VS Code extension](https://marketplace.visualstudio.com/items?itemName=ZenML.zenml-vscode) that allows you to inspect your stacks and pipeline runs directly from your editor. The extension also allows you to switch your stacks without needing to type any CLI commands.
<details>
<summary>๐ฅ๏ธ VS Code Extension in Action!</summary>
<div align="center">
<img width="60%" src="/docs/book/.gitbook/assets/zenml-extension-shortened.gif" alt="ZenML Extension">
</div>
</details>
## ๐บ Roadmap
ZenML is being built in public. The [roadmap](https://zenml.io/roadmap) is a regularly updated source of truth for the ZenML community to understand where the product is going in the short, medium, and long term.
ZenML is managed by a [core team](https://zenml.io/company) of developers that are responsible for making key decisions and incorporating feedback from the community. The team oversees feedback via various channels,
and you can directly influence the roadmap as follows:
- Vote on your most wanted feature on our [Discussion
board](https://zenml.io/discussion).
- Start a thread in our [Slack channel](https://zenml.io/slack).
- [Create an issue](https://github.com/zenml-io/zenml/issues/new/choose) on our GitHub repo.
## ๐ Contributing and Community
We would love to develop ZenML together with our community! The best way to get
started is to pick any issue from the
[`good-first-issue` label](https://github.com/issues?q=is%3Aopen+is%3Aissue+archived%3Afalse+user%3Azenml-io+label%3A%22good+first+issue%22)
and open up a Pull Request!
If you
would like to contribute, please review our [Contributing
Guide](CONTRIBUTING.md) for all relevant details.
## ๐ Getting Help
The first port of call should be [our Slack group](https://zenml.io/slack-invite/).
Ask your questions about bugs or specific use cases, and someone from
the [core team](https://zenml.io/company) will respond.
Or, if you
prefer, [open an issue](https://github.com/zenml-io/zenml/issues/new/choose) on
our GitHub repo.
## ๐ LLM-focused Learning Resources
1. [LLM Complete Guide - Full RAG Pipeline](https://github.com/zenml-io/zenml-projects/tree/main/llm-complete-guide) - Document ingestion, embedding management, and query serving
2. [LLM Fine-Tuning Pipeline](https://github.com/zenml-io/zenml-projects/tree/main/zencoder) - From data prep to deployed model
3. [LLM Agents Example](https://github.com/zenml-io/zenml-projects/tree/main/zenml-support-agent) - Track conversation quality and tool usage
## ๐ค AI-Friendly Documentation with llms.txt
ZenML implements the llms.txt standard to make our documentation more accessible to AI assistants and LLMs. Our implementation includes:
- Base documentation at [zenml.io/llms.txt](https://zenml.io/llms.txt) with core user guides
- Specialized files for different documentation aspects:
- [Component guides](https://zenml.io/component-guide.txt) for integration details
- [How-to guides](https://zenml.io/how-to-guides.txt) for practical implementations
- [Complete documentation corpus](https://zenml.io/llms-full.txt) for comprehensive access
This structured approach helps AI tools better understand and utilize ZenML's documentation, enabling more accurate code suggestions and improved documentation search.
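Since the llms.txt files are plain markdown, tooling can consume them with a few lines of code. The snippet below parses a small inline excerpt (the URLs are the ones listed above; the parsing logic is just a sketch and fetching over the network is left out):

```python
import re

# A small inline excerpt in llms.txt style (markdown bullet links).
LLMS_TXT_SNIPPET = """\
# ZenML documentation index
- [Component guides](https://zenml.io/component-guide.txt)
- [How-to guides](https://zenml.io/how-to-guides.txt)
- [Complete documentation corpus](https://zenml.io/llms-full.txt)
"""


def parse_llms_txt(text: str) -> dict:
    """Extract {link title: url} pairs from markdown-style bullet links."""
    return dict(re.findall(r"\[([^\]]+)\]\((https?://[^)\s]+)\)", text))


links = parse_llms_txt(LLMS_TXT_SNIPPET)
```

An AI assistant (or any script) can then pick the file matching its task, for example `links["Component guides"]` for integration questions.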
## ๐ License
ZenML is distributed under the terms of the Apache License Version 2.0.
A complete version of the license is available in the [LICENSE](LICENSE) file in
this repository. Any contribution made to this project will be licensed under
the Apache License Version 2.0.
<div>
<p align="left">
<div align="left">
Join our <a href="https://zenml.io/slack" target="_blank">
<img width="18" src="https://cdn3.iconfinder.com/data/icons/logos-and-brands-adobe/512/306_Slack-512.png" alt="Slack"/>
<b>Slack Community</b> </a> and be part of the ZenML family.
</div>
<br />
<a href="https://zenml.io/features">Features</a>
ยท
<a href="https://zenml.io/roadmap">Roadmap</a>
ยท
<a href="https://github.com/zenml-io/zenml/issues">Report Bug</a>
ยท
<a href="https://zenml.io/pro">Sign up for ZenML Pro</a>
ยท
<a href="https://www.zenml.io/blog">Read Blog</a>
ยท
<a href="https://github.com/issues?q=is%3Aopen+is%3Aissue+archived%3Afalse+user%3Azenml-io+label%3A%22good+first+issue%22">Contribute to Open Source</a>
ยท
<a href="https://github.com/zenml-io/zenml-projects">Projects Showcase</a>
<br />
<br />
๐ Version 0.80.1 is out. Check out the release notes
<a href="https://github.com/zenml-io/zenml/releases">here</a>.
<br />
๐ฅ๏ธ Download our VS Code Extension <a href="https://marketplace.visualstudio.com/items?itemName=ZenML.zenml-vscode">here</a>.
<br />
</p>
</div>
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
overall community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or
advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
[support@zenml.io](mailto:support@zenml.io).
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series
of actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within
the community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
[https://www.contributor-covenant.org/version/2/0/code_of_conduct.html][v2.0].
Community Impact Guidelines were inspired by
[Mozilla's code of conduct enforcement ladder][Mozilla CoC].
For answers to common questions about this code of conduct, see the FAQ at
[https://www.contributor-covenant.org/faq][FAQ]. Translations are available
at [https://www.contributor-covenant.org/translations][translations].
[homepage]: https://www.contributor-covenant.org
[v2.0]: https://www.contributor-covenant.org/version/2/0/code_of_conduct.html
[Mozilla CoC]: https://github.com/mozilla/inclusion
[FAQ]: https://www.contributor-covenant.org/faq
[translations]: https://www.contributor-covenant.org/translations