| content (string) | path (string) | filename (string) | language (string, 15 classes) | size_bytes (int64) | quality_score (float64) | complexity (float64) | documentation_ratio (float64) | repository (string, 5 classes) | stars (int64) | created_date (string) | license (string, 4 classes) | is_test (bool) | file_hash (string, 32 chars) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
name: "docker: build release containers for all tags"\n\non:\n push:\n tags:\n - '*'\n workflow_dispatch: {}\n\npermissions:\n contents: read\n\njobs:\n build-default-release-container:\n runs-on: [ubuntu-latest]\n\n steps:\n -\n name: Checkout\n uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # v2\n -\n name: Docker meta\n id: docker_meta\n uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804 # v3\n with:\n images: |\n chrislusf/seaweedfs\n tags: |\n type=ref,event=tag,suffix=_full\n flavor: |\n latest=false\n labels: |\n org.opencontainers.image.title=seaweedfs\n org.opencontainers.image.description=SeaweedFS is a distributed storage system for blobs, objects, files, and data lake, to store and serve billions of files fast!\n org.opencontainers.image.vendor=Chris Lu\n -\n name: Set up QEMU\n uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392 # v1\n -\n name: Set up Docker Buildx\n uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v1\n -\n name: Login to Docker Hub\n if: github.event_name != 'pull_request'\n uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v1\n with:\n username: ${{ secrets.DOCKER_USERNAME }}\n password: ${{ secrets.DOCKER_PASSWORD }}\n -\n name: Build\n uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v2\n with:\n context: ./docker\n push: ${{ github.event_name != 'pull_request' }}\n file: ./docker/Dockerfile.go_build\n build-args: TAGS=elastic,gocdk,rclone,sqlite,tarantool,tikv,ydb\n platforms: linux/amd64\n tags: ${{ steps.docker_meta.outputs.tags }}\n labels: ${{ steps.docker_meta.outputs.labels }}\n | dataset_sample\yaml\seaweedfs_seaweedfs\.github\workflows\container_release4.yml | container_release4.yml | YAML | 1,964 | 0.8 | 0.051724 | 0 | awesome-app | 539 | 2023-09-05T20:45:00.131378 | GPL-3.0 | false | 0e98fc778f0152e8751260cd6640ef00 |
name: "docker: build release containers for all tags and large volume"\n\non:\n push:\n tags:\n - '*'\n workflow_dispatch: {}\n\npermissions:\n contents: read\n\njobs:\n build-default-release-container:\n runs-on: [ubuntu-latest]\n\n steps:\n -\n name: Checkout\n uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # v2\n -\n name: Docker meta\n id: docker_meta\n uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804 # v3\n with:\n images: |\n chrislusf/seaweedfs\n tags: |\n type=ref,event=tag,suffix=_large_disk_full\n flavor: |\n latest=false\n labels: |\n org.opencontainers.image.title=seaweedfs\n org.opencontainers.image.description=SeaweedFS is a distributed storage system for blobs, objects, files, and data lake, to store and serve billions of files fast!\n org.opencontainers.image.vendor=Chris Lu\n -\n name: Set up QEMU\n uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392 # v1\n -\n name: Set up Docker Buildx\n uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v1\n -\n name: Login to Docker Hub\n if: github.event_name != 'pull_request'\n uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v1\n with:\n username: ${{ secrets.DOCKER_USERNAME }}\n password: ${{ secrets.DOCKER_PASSWORD }}\n -\n name: Build\n uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v2\n with:\n context: ./docker\n push: ${{ github.event_name != 'pull_request' }}\n file: ./docker/Dockerfile.go_build\n build-args: TAGS=5BytesOffset,elastic,gocdk,rclone,sqlite,tarantool,tikv,ydb\n platforms: linux/amd64\n tags: ${{ steps.docker_meta.outputs.tags }}\n labels: ${{ steps.docker_meta.outputs.labels }}\n | dataset_sample\yaml\seaweedfs_seaweedfs\.github\workflows\container_release5.yml | container_release5.yml | YAML | 2,005 | 0.8 | 0.051724 | 0 | node-utils | 23 | 2023-07-28T00:13:39.482639 | Apache-2.0 | false | 0cbd929876aba8abd8609fc1b7acc5f8 |
name: 'Dependency Review'\non: [pull_request]\n\npermissions:\n contents: read\n\njobs:\n dependency-review:\n runs-on: ubuntu-latest\n steps:\n - name: 'Checkout Repository'\n uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633\n - name: 'Dependency Review'\n uses: actions/dependency-review-action@0659a74c94536054bfa5aeb92241f70d680cc78e\n | dataset_sample\yaml\seaweedfs_seaweedfs\.github\workflows\depsreview.yml | depsreview.yml | YAML | 372 | 0.7 | 0 | 0 | react-lib | 482 | 2025-03-25T21:20:35.071339 | BSD-3-Clause | false | 05245b803e3dcc81bd3c02d9e003a33e |
name: "End to End"\n\non:\n push:\n branches: [ master ]\n pull_request:\n branches: [ master ]\n\nconcurrency:\n group: ${{ github.head_ref }}/e2e\n cancel-in-progress: true\n\npermissions:\n contents: read\n\ndefaults:\n run:\n working-directory: docker\n\njobs:\n e2e:\n name: FUSE Mount\n runs-on: ubuntu-22.04\n timeout-minutes: 30\n steps:\n - name: Set up Go 1.x\n uses: actions/setup-go@0c52d547c9bc32b1aa3301fd7a9cb496313a4491 # v2\n with:\n go-version: ^1.13\n id: go\n\n - name: Check out code into the Go module directory\n uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # v2\n\n - name: Install dependencies\n run: |\n sudo apt-get update\n sudo apt-get install -y fuse\n\n - name: Start SeaweedFS\n timeout-minutes: 5\n run: make build_e2e && docker compose -f ./compose/e2e-mount.yml up --wait\n\n - name: Run FIO 4k\n timeout-minutes: 15\n run: |\n echo "Starting FIO at: $(date)"\n # Concurrent r/w\n echo 'Run randrw with size=16M bs=4k'\n docker compose -f ./compose/e2e-mount.yml exec mount timeout -k5 60 fio --name=fiotest --filename=/mnt/seaweedfs/fiotest --size=16M --rw=randrw --bs=4k --direct=1 --numjobs=8 --ioengine=libaio --group_reporting --runtime=30 --time_based=1\n\n echo "Verify FIO at: $(date)"\n # Verified write\n echo 'Run randwrite with size=16M bs=4k'\n docker compose -f ./compose/e2e-mount.yml exec mount timeout -k5 60 fio --name=fiotest --filename=/mnt/seaweedfs/fiotest --size=16M --rw=randwrite --bs=4k --direct=1 --numjobs=8 --ioengine=libaio --iodepth=32 --group_reporting --runtime=30 --time_based=1 --do_verify=0 --verify=crc32c --verify_backlog=1\n\n - name: Run FIO 128k\n timeout-minutes: 15\n run: |\n echo "Starting FIO at: $(date)"\n # Concurrent r/w\n echo 'Run randrw with size=16M bs=128k'\n docker compose -f ./compose/e2e-mount.yml exec mount timeout -k5 60 fio --name=fiotest --filename=/mnt/seaweedfs/fiotest --size=16M --rw=randrw --bs=128k --direct=1 --numjobs=8 --ioengine=libaio --iodepth=32 --group_reporting --runtime=30 --time_based=1\n\n echo "Verify FIO at: $(date)"\n # Verified write\n echo 'Run randwrite with size=16M bs=128k'\n docker compose -f ./compose/e2e-mount.yml exec mount timeout -k5 60 fio --name=fiotest --filename=/mnt/seaweedfs/fiotest --size=16M --rw=randwrite --bs=128k --direct=1 --numjobs=8 --ioengine=libaio --iodepth=32 --group_reporting --runtime=30 --time_based=1 --do_verify=0 --verify=crc32c --verify_backlog=1\n\n - name: Run FIO 1MB\n timeout-minutes: 15\n run: |\n echo "Starting FIO at: $(date)"\n # Concurrent r/w\n echo 'Run randrw with size=16M bs=1m'\n docker compose -f ./compose/e2e-mount.yml exec mount timeout -k5 60 fio --name=fiotest --filename=/mnt/seaweedfs/fiotest --size=16M --rw=randrw --bs=1m --direct=1 --numjobs=8 --ioengine=libaio --iodepth=32 --group_reporting --runtime=30 --time_based=1\n \n echo "Verify FIO at: $(date)"\n # Verified write\n echo 'Run randwrite with size=16M bs=1m'\n docker compose -f ./compose/e2e-mount.yml exec mount timeout -k5 60 fio --name=fiotest --filename=/mnt/seaweedfs/fiotest --size=16M --rw=randwrite --bs=1m --direct=1 --numjobs=8 --ioengine=libaio --iodepth=32 --group_reporting --runtime=30 --time_based=1 --do_verify=0 --verify=crc32c --verify_backlog=1\n\n - name: Save logs\n if: always()\n run: |\n docker compose -f ./compose/e2e-mount.yml logs > output.log\n echo 'Showing last 500 log lines of mount service:'\n docker compose -f ./compose/e2e-mount.yml logs --tail 500 mount\n\n - name: Check for data races\n if: always()\n continue-on-error: true # TODO: remove this comment to enable build failure on data races (after all are fixed)\n run: grep -A50 'DATA RACE' output.log && exit 1 || exit 0\n\n - name: Archive logs\n if: always()\n uses: actions/upload-artifact@v4\n with:\n name: output-logs\n path: docker/output.log\n\n - name: Cleanup\n if: always()\n run: docker compose -f ./compose/e2e-mount.yml down --volumes --remove-orphans --rmi all\n | dataset_sample\yaml\seaweedfs_seaweedfs\.github\workflows\e2e.yml | e2e.yml | YAML | 4,243 | 0.8 | 0.048077 | 0.069767 | node-utils | 914 | 2025-01-07T14:38:08.683915 | GPL-3.0 | false | 2fe2c9943fea4a6d2f2f04086035b1f9 |
name: "go: build binary"\n\non:\n push:\n branches: [ master ]\n pull_request:\n branches: [ master ]\n\nconcurrency:\n group: ${{ github.head_ref }}/go\n cancel-in-progress: true\n\npermissions:\n contents: read\n\njobs:\n\n build:\n name: Build\n runs-on: ubuntu-latest\n steps:\n\n - name: Set up Go 1.x\n uses: actions/setup-go@0c52d547c9bc32b1aa3301fd7a9cb496313a4491 # v2\n with:\n go-version: ^1.13\n id: go\n\n - name: Check out code into the Go module directory\n uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # v2\n\n - name: Get dependencies\n run: |\n cd weed; go get -v -t -d ./...\n\n - name: Build\n run: cd weed; go build -tags "elastic gocdk sqlite ydb tarantool tikv rclone" -v .\n\n - name: Test\n run: cd weed; go test -tags "elastic gocdk sqlite ydb tarantool tikv rclone" -v ./...\n | dataset_sample\yaml\seaweedfs_seaweedfs\.github\workflows\go.yml | go.yml | YAML | 867 | 0.8 | 0 | 0 | react-lib | 996 | 2024-08-22T10:06:27.123646 | MIT | false | f35793c7cf3e16f458820fe975424474 |
name: "helm: publish charts"\non:\n push:\n tags:\n - '*'\n\npermissions:\n contents: write\n pages: write\n\njobs:\n release:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633\n - name: Publish Helm charts\n uses: stefanprodan/helm-gh-pages@master\n with: \n token: ${{ secrets.GITHUB_TOKEN }}\n charts_dir: k8s/charts\n target_dir: helm\n branch: gh-pages\n | dataset_sample\yaml\seaweedfs_seaweedfs\.github\workflows\helm_chart_release.yml | helm_chart_release.yml | YAML | 476 | 0.7 | 0 | 0 | vue-tools | 653 | 2023-12-09T08:40:36.115014 | BSD-3-Clause | false | f80e244ecfeb39885ff433752681088a |
name: "helm: lint and test charts"\n\non:\n push:\n branches: [ master ]\n paths: ['k8s/**']\n pull_request:\n branches: [ master ]\n paths: ['k8s/**']\n\npermissions:\n contents: read\n\njobs:\n lint-test:\n runs-on: ubuntu-latest\n steps:\n - name: Checkout\n uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633\n with:\n fetch-depth: 0\n\n - name: Set up Helm\n uses: azure/setup-helm@v4\n with:\n version: v3.10.0\n\n - uses: actions/setup-python@v5\n with:\n python-version: '3.9'\n check-latest: true\n\n - name: Set up chart-testing\n uses: helm/chart-testing-action@v2.7.0\n\n - name: Run chart-testing (list-changed)\n id: list-changed\n run: |\n changed=$(ct list-changed --target-branch ${{ github.event.repository.default_branch }} --chart-dirs k8s/charts)\n if [[ -n "$changed" ]]; then\n echo "::set-output name=changed::true"\n fi\n\n - name: Run chart-testing (lint)\n run: ct lint --target-branch ${{ github.event.repository.default_branch }} --all --validate-maintainers=false --chart-dirs k8s/charts\n\n - name: Create kind cluster\n uses: helm/kind-action@v1.12.0\n\n - name: Run chart-testing (install)\n run: ct install --target-branch ${{ github.event.repository.default_branch }} --all --chart-dirs k8s/charts\n | dataset_sample\yaml\seaweedfs_seaweedfs\.github\workflows\helm_ci.yml | helm_ci.yml | YAML | 1,409 | 0.8 | 0.019608 | 0 | awesome-app | 231 | 2023-10-13T09:39:18.238180 | GPL-3.0 | false | 012c06d5c9df80ab408d5715a3ea81de |
version: '3.9'\n\nservices:\n master:\n image: chrislusf/seaweedfs # use a remote image\n ports:\n - 9333:9333\n - 19333:19333\n - 9324:9324\n command: "master -ip=master -ip.bind=0.0.0.0 -metricsPort=9324"\n volume:\n image: chrislusf/seaweedfs # use a remote image\n ports:\n - 8080:8080\n - 18080:18080\n - 9325:9325\n command: 'volume -mserver="master:9333" -ip.bind=0.0.0.0 -port=8080 -metricsPort=9325'\n depends_on:\n - master\n filer:\n image: chrislusf/seaweedfs # use a remote image\n ports:\n - 8888:8888\n - 18888:18888\n - 9326:9326\n command: 'filer -master="master:9333" -ip.bind=0.0.0.0 -metricsPort=9326'\n tty: true\n stdin_open: true\n depends_on:\n - master\n - volume\n s3:\n image: chrislusf/seaweedfs # use a remote image\n ports:\n - 8333:8333\n - 9327:9327\n command: 's3 -filer="filer:8888" -ip.bind=0.0.0.0 -metricsPort=9327'\n depends_on:\n - master\n - volume\n - filer\n webdav:\n image: chrislusf/seaweedfs # use a remote image\n ports:\n - 7333:7333\n command: 'webdav -filer="filer:8888"'\n depends_on:\n - master\n - volume\n - filer\n prometheus:\n image: prom/prometheus:v2.21.0\n ports:\n - 9000:9090\n volumes:\n - ./prometheus:/etc/prometheus\n command: --web.enable-lifecycle --config.file=/etc/prometheus/prometheus.yml\n depends_on:\n - s3\n | dataset_sample\yaml\seaweedfs_seaweedfs\docker\seaweedfs-compose.yml | seaweedfs-compose.yml | YAML | 1,430 | 0.8 | 0 | 0 | vue-tools | 28 | 2024-01-20T18:16:04.641377 | MIT | false | 792d25fcdbcdf58beee88527cabc0506 |
version: '3.9'\n\nservices:\n master:\n image: chrislusf/seaweedfs:dev # use a remote dev image\n ports:\n - 9333:9333\n - 19333:19333\n command: "master -ip=master"\n volume:\n image: chrislusf/seaweedfs:dev # use a remote dev image\n ports:\n - 8080:8080\n - 18080:18080\n command: 'volume -mserver="master:9333" -port=8080 -ip=volume'\n depends_on:\n - master\n filer:\n image: chrislusf/seaweedfs:dev # use a remote dev image\n ports:\n - 8888:8888\n - 18888:18888\n command: 'filer -master="master:9333" -ip.bind=0.0.0.0'\n depends_on:\n - master\n - volume\n s3:\n image: chrislusf/seaweedfs:dev # use a remote dev image\n ports:\n - 8333:8333\n command: 's3 -filer="filer:8888" -ip.bind=0.0.0.0'\n depends_on:\n - master\n - volume\n - filer\n webdav:\n image: chrislusf/seaweedfs:dev # use a remote dev image\n ports:\n - 7333:7333\n command: 'webdav -filer="filer:8888"'\n depends_on:\n - master\n - volume\n - filer\n | dataset_sample\yaml\seaweedfs_seaweedfs\docker\seaweedfs-dev-compose.yml | seaweedfs-dev-compose.yml | YAML | 1,030 | 0.8 | 0 | 0 | node-utils | 929 | 2023-08-09T12:54:04.300350 | MIT | false | 76a00268db47c7aba408c85687cb1be4 |
version: '3.9'\n\nservices:\n master:\n image: chrislusf/seaweedfs:e2e\n command: "-v=4 master -ip=master -ip.bind=0.0.0.0 -raftBootstrap"\n healthcheck:\n test: [ "CMD", "curl", "--fail", "-I", "http://localhost:9333/cluster/healthz" ]\n interval: 1s\n timeout: 60s\n\n volume:\n image: chrislusf/seaweedfs:e2e\n command: "-v=4 volume -mserver=master:9333 -ip=volume -ip.bind=0.0.0.0 -preStopSeconds=1"\n healthcheck:\n test: [ "CMD", "curl", "--fail", "-I", "http://localhost:8080/healthz" ]\n interval: 1s\n timeout: 30s\n depends_on:\n master:\n condition: service_healthy\n\n filer:\n image: chrislusf/seaweedfs:e2e\n command: "-v=4 filer -master=master:9333 -ip=filer -ip.bind=0.0.0.0"\n healthcheck:\n test: [ "CMD", "curl", "--fail", "-I", "http://localhost:8888" ]\n interval: 1s\n timeout: 30s\n depends_on:\n volume:\n condition: service_healthy\n\n mount:\n image: chrislusf/seaweedfs:e2e\n command: "-v=4 mount -filer=filer:8888 -filer.path=/ -dirAutoCreate -dir=/mnt/seaweedfs"\n cap_add:\n - SYS_ADMIN\n devices:\n - /dev/fuse\n security_opt:\n - apparmor:unconfined\n deploy:\n resources:\n limits:\n memory: 4096m\n healthcheck:\n test: [ "CMD", "mountpoint", "-q", "--", "/mnt/seaweedfs" ]\n interval: 1s\n timeout: 30s\n depends_on:\n filer:\n condition: service_healthy\n | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\e2e-mount.yml | e2e-mount.yml | YAML | 1,431 | 0.8 | 0 | 0 | node-utils | 694 | 2023-08-13T20:18:10.191643 | BSD-3-Clause | false | e93da99c34eaf8e911afc0e0c151365b |
version: '3.9'\n\nservices:\n s3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8333:8333\n - 9333:9333\n - 19333:19333\n - 8084:8080\n - 18084:18080\n - 8888:8888\n - 18888:18888\n - 8000:8000\n command: "server -ip=s3 -filer -s3 -s3.config=/etc/seaweedfs/s3.json -s3.port=8000 -s3.auditLogConfig=/etc/seaweedfs/fluent.json -volume.max=0 -master.volumeSizeLimitMB=8 -volume.preStopSeconds=1"\n volumes:\n - ./fluent.json:/etc/seaweedfs/fluent.json\n - ./s3.json:/etc/seaweedfs/s3.json\n depends_on:\n - fluent\n fluent:\n image: fluent/fluentd:v1.17\n volumes:\n - ./fluent.conf:/fluentd/etc/fluent.conf\n ports:\n - 24224:24224\n #s3tests:\n # image: chrislusf/ceph-s3-tests:local\n # volumes:\n # - ./s3tests.conf:/opt/s3-tests/s3tests.conf\n # environment:\n # S3TEST_CONF: "s3tests.conf"\n # NOSETESTS_OPTIONS: "--verbose --logging-level=ERROR --with-xunit --failure-detail s3tests_boto3.functional.test_s3"\n # NOSETESTS_ATTR: "!tagging,!fails_on_aws,!encryption,!bucket-policy,!versioning,!fails_on_rgw,!bucket-policy,!fails_with_subdomain,!policy_status,!object-lock,!lifecycle,!cors,!user-policy"\n # NOSETESTS_EXCLUDE: "(get_bucket_encryption|put_bucket_encryption|bucket_list_delimiter_basic|bucket_listv2_delimiter_basic|bucket_listv2_encoding_basic|bucket_list_encoding_basic|bucket_list_delimiter_prefix|bucket_listv2_delimiter_prefix_ends_with_delimiter|bucket_list_delimiter_prefix_ends_with_delimiter|bucket_list_delimiter_alt|bucket_listv2_delimiter_alt|bucket_list_delimiter_prefix_underscore|bucket_list_delimiter_percentage|bucket_listv2_delimiter_percentage|bucket_list_delimiter_whitespace|bucket_listv2_delimiter_whitespace|bucket_list_delimiter_dot|bucket_listv2_delimiter_dot|bucket_list_delimiter_unreadable|bucket_listv2_delimiter_unreadable|bucket_listv2_fetchowner_defaultempty|bucket_listv2_fetchowner_empty|bucket_list_prefix_delimiter_alt|bucket_listv2_prefix_delimiter_alt|bucket_list_prefix_delimiter_prefix_not_exist|bucket_listv2_prefix_delimiter_prefix_not_exist|bucket_list_prefix_delimiter_delimiter_not_exist|bucket_listv2_prefix_delimiter_delimiter_not_exist|bucket_list_prefix_delimiter_prefix_delimiter_not_exist|bucket_listv2_prefix_delimiter_prefix_delimiter_not_exist|bucket_list_maxkeys_none|bucket_listv2_maxkeys_none|bucket_list_maxkeys_invalid|bucket_listv2_continuationtoken_empty|bucket_list_return_data|bucket_list_objects_anonymous|bucket_listv2_objects_anonymous|bucket_notexist|bucketv2_notexist|bucket_delete_nonempty|bucket_concurrent_set_canned_acl|object_write_to_nonexist_bucket|object_requestid_matches_header_on_error|object_set_get_metadata_none_to_good|object_set_get_metadata_none_to_empty|object_set_get_metadata_overwrite_to_empty|post_object_anonymous_request|post_object_authenticated_request|post_object_authenticated_no_content_type|post_object_authenticated_request_bad_access_key|post_object_set_success_code|post_object_set_invalid_success_code|post_object_upload_larger_than_chunk|post_object_set_key_from_filename|post_object_ignored_header|post_object_case_insensitive_condition_fields|post_object_escaped_field_values|post_object_success_redirect_action|post_object_invalid_signature|post_object_invalid_access_key|post_object_missing_policy_condition|post_object_user_specified_header|post_object_request_missing_policy_specified_field|post_object_expired_policy|post_object_invalid_request_field_value|get_object_ifunmodifiedsince_good|put_object_ifmatch_failed|object_raw_get_bucket_gone|object_delete_key_bucket_gone|object_raw_get_bucket_acl|object_raw_get_object_acl|object_raw_response_headers|object_raw_authenticated_bucket_gone|object_raw_get_x_amz_expires_out_max_range|object_raw_get_x_amz_expires_out_positive_range|object_anon_put_write_access|object_raw_put_authenticated_expired|bucket_create_exists|bucket_create_naming_bad_short_one|bucket_create_naming_bad_short_two|bucket_get_location|bucket_acl_default|bucket_acl_canned|bucket_acl_canned_publicreadwrite|bucket_acl_canned_authenticatedread|object_acl_default|object_acl_canned_during_create|object_acl_canned|object_acl_canned_publicreadwrite|object_acl_canned_authenticatedread|object_acl_canned_bucketownerread|object_acl_canned_bucketownerfullcontrol|object_acl_full_control_verify_attributes|bucket_acl_canned_private_to_private|bucket_acl_grant_nonexist_user|bucket_acl_no_grants|bucket_acl_grant_email_not_exist|bucket_acl_revoke_all|bucket_recreate_not_overriding|object_copy_verify_contenttype|object_copy_to_itself_with_metadata|object_copy_not_owned_bucket|object_copy_not_owned_object_bucket|object_copy_retaining_metadata|object_copy_replacing_metadata|multipart_upload_empty|multipart_copy_invalid_range|multipart_copy_special_names|multipart_upload_resend_part|multipart_upload_size_too_small|abort_multipart_upload_not_found|multipart_upload_missing_part|multipart_upload_incorrect_etag|100_continue|ranged_request_invalid_range|ranged_request_empty_object|access_bucket)"\n # depends_on:\n # - s3\n # - fluent | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-auditlog-compose.yml | local-auditlog-compose.yml | YAML | 5,071 | 0.95 | 0 | 0.324324 | node-utils | 438 | 2025-01-03T11:55:12.149585 | GPL-3.0 | false | ac1721e12089b9388578628c8fa2422c |
version: '3.9'\n\nservices:\n master0:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n command: "-v=0 master -volumeSizeLimitMB 100 -resumeState=false -ip=master0 -port=9333 -peers=master0:9333,master1:9334,master2:9335 -mdir=/tmp"\n environment:\n WEED_MASTER_VOLUME_GROWTH_COPY_1: 1\n WEED_MASTER_VOLUME_GROWTH_COPY_2: 2\n WEED_MASTER_VOLUME_GROWTH_COPY_OTHER: 1\n master1:\n image: chrislusf/seaweedfs:local\n ports:\n - 9334:9334\n - 19334:19334\n command: "-v=0 master -volumeSizeLimitMB 100 -resumeState=false -ip=master1 -port=9334 -peers=master0:9333,master1:9334,master2:9335 -mdir=/tmp"\n environment:\n WEED_MASTER_VOLUME_GROWTH_COPY_1: 1\n WEED_MASTER_VOLUME_GROWTH_COPY_2: 2\n WEED_MASTER_VOLUME_GROWTH_COPY_OTHER: 1\n master2:\n image: chrislusf/seaweedfs:local\n ports:\n - 9335:9335\n - 19335:19335\n command: "-v=0 master -volumeSizeLimitMB 100 -resumeState=false -ip=master2 -port=9335 -peers=master0:9333,master1:9334,master2:9335 -mdir=/tmp"\n environment:\n WEED_MASTER_VOLUME_GROWTH_COPY_1: 1\n WEED_MASTER_VOLUME_GROWTH_COPY_2: 2\n WEED_MASTER_VOLUME_GROWTH_COPY_OTHER: 1\n volume1:\n image: chrislusf/seaweedfs:local\n ports:\n - 8080:8080\n - 18080:18080\n command: 'volume -dataCenter=dc1 -rack=v1 -mserver="master0:9333,master1:9334,master2:9335" -port=8080 -ip=volume1 -publicUrl=localhost:8080 -preStopSeconds=1'\n depends_on:\n - master0\n - master1\n - master2\n volume2:\n image: chrislusf/seaweedfs:local\n ports:\n - 8082:8082\n - 18082:18082\n command: 'volume -dataCenter=dc2 -rack=v2 -mserver="master0:9333,master1:9334,master2:9335" -port=8082 -ip=volume2 -publicUrl=localhost:8082 -preStopSeconds=1'\n depends_on:\n - master0\n - master1\n - master2\n volume3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8083:8083\n - 18083:18083\n command: 'volume -dataCenter=dc3 -rack=v3 -mserver="master0:9333,master1:9334,master2:9335" -port=8083 -ip=volume3 -publicUrl=localhost:8083 -preStopSeconds=1'\n depends_on:\n - master0\n - master1\n - master2\n filer1:\n image: chrislusf/seaweedfs:local\n ports:\n - 8888:8888\n - 18888:18888\n command: 'filer -defaultReplicaPlacement=100 -iam -master="master0:9333,master1:9334,master2:9335" -port=8888 -ip=filer1'\n depends_on:\n - master0\n - master1\n - master2\n - volume1\n - volume2\n filer2:\n image: chrislusf/seaweedfs:local\n ports:\n - 8889:8889\n - 18889:18889\n command: 'filer -defaultReplicaPlacement=100 -iam -master="master0:9333,master1:9334,master2:9335" -port=8889 -ip=filer2'\n depends_on:\n - master0\n - master1\n - master2\n - volume1\n - volume2\n - filer1\n broker1:\n image: chrislusf/seaweedfs:local\n ports:\n - 17777:17777\n command: 'mq.broker -master="master0:9333,master1:9334,master2:9335" -port=17777 -ip=broker1'\n depends_on:\n - master0\n - master1\n - master2\n - volume1\n - volume2\n - filer1\n - filer2\n broker2:\n image: chrislusf/seaweedfs:local\n ports:\n - 17778:17778\n command: 'mq.broker -master="master0:9333,master1:9334,master2:9335" -port=17778 -ip=broker2'\n depends_on:\n - master0\n - master1\n - master2\n - volume1\n - volume2\n - filer1\n - filer2\n broker3:\n image: chrislusf/seaweedfs:local\n ports:\n - 17779:17779\n command: 'mq.broker -master="master0:9333,master1:9334,master2:9335" -port=17779 -ip=broker3'\n depends_on:\n - master0\n - master1\n - master2\n - volume1\n - volume2\n - filer1\n - filer2\n | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-brokers-compose.yml | local-brokers-compose.yml | YAML | 3,740 | 0.7 | 0 | 0 | python-kit | 242 | 2024-09-30T23:20:20.510723 | GPL-3.0 | false | b5e893bdb621d2aa03941ca17d82abe2 |
version: '3.9'\n\nservices:\n master0:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n command: "-v=1 master -volumeSizeLimitMB 100 -resumeState=false -ip=master0 -port=9333 -peers=master0:9333,master1:9334,master2:9335 -mdir=/tmp"\n environment:\n WEED_MASTER_VOLUME_GROWTH_COPY_1: 1\n WEED_MASTER_VOLUME_GROWTH_COPY_2: 2\n WEED_MASTER_VOLUME_GROWTH_COPY_OTHER: 1\n master1:\n image: chrislusf/seaweedfs:local\n ports:\n - 9334:9334\n - 19334:19334\n command: "-v=1 master -volumeSizeLimitMB 100 -resumeState=false -ip=master1 -port=9334 -peers=master0:9333,master1:9334,master2:9335 -mdir=/tmp"\n environment:\n WEED_MASTER_VOLUME_GROWTH_COPY_1: 1\n WEED_MASTER_VOLUME_GROWTH_COPY_2: 2\n WEED_MASTER_VOLUME_GROWTH_COPY_OTHER: 1\n master2:\n image: chrislusf/seaweedfs:local\n ports:\n - 9335:9335\n - 19335:19335\n command: "-v=1 master -volumeSizeLimitMB 100 -resumeState=false -ip=master2 -port=9335 -peers=master0:9333,master1:9334,master2:9335 -mdir=/tmp"\n environment:\n WEED_MASTER_VOLUME_GROWTH_COPY_1: 1\n WEED_MASTER_VOLUME_GROWTH_COPY_2: 2\n WEED_MASTER_VOLUME_GROWTH_COPY_OTHER: 1\n volume1:\n image: chrislusf/seaweedfs:local\n ports:\n - 8080:8080\n - 18080:18080\n command: 'volume -dataCenter=dc1 -rack=v1 -mserver="master0:9333,master1:9334,master2:9335" -port=8080 -ip=volume1 -publicUrl=localhost:8080 -preStopSeconds=1'\n depends_on:\n - master0\n - master1\n - master2\n volume2:\n image: chrislusf/seaweedfs:local\n ports:\n - 8082:8082\n - 18082:18082\n command: 'volume -dataCenter=dc2 -rack=v2 -mserver="master0:9333,master1:9334,master2:9335" -port=8082 -ip=volume2 -publicUrl=localhost:8082 -preStopSeconds=1'\n depends_on:\n - master0\n - master1\n - master2\n volume3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8083:8083\n - 18083:18083\n command: 'volume -dataCenter=dc3 -rack=v3 -mserver="master0:9333,master1:9334,master2:9335" -port=8083 -ip=volume3 -publicUrl=localhost:8083 -preStopSeconds=1'\n depends_on:\n - master0\n - master1\n - master2\n filer:\n image: chrislusf/seaweedfs:local\n ports:\n - 8888:8888\n - 18888:18888\n - 8111:8111\n command: 'filer -defaultReplicaPlacement=100 -iam -master="master0:9333,master1:9334,master2:9335"'\n depends_on:\n - master0\n - master1\n - master2\n - volume1\n - volume2\n s3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8333:8333\n command: '-v=9 s3 -filer="filer:8888"'\n depends_on:\n - master0\n - master1\n - master2\n - volume1\n - volume2\n - filer\n | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-cluster-compose.yml | local-cluster-compose.yml | YAML | 2,718 | 0.7 | 0 | 0 | vue-tools | 969 | 2024-02-11T17:21:00.985723 | Apache-2.0 | false | add4a87c707afa2ef060288a2e5e6d36 |
version: '3.9'\n\nservices:\n server1:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n - 8084:8080\n - 18084:18080\n - 8888:8888\n - 18888:18888\n command: "server -ip=server1 -filer -volume.max=0 -master.volumeSizeLimitMB=1024 -volume.preStopSeconds=1"\n volumes:\n - ./master-cloud.toml:/etc/seaweedfs/master.toml\n depends_on:\n - server2\n server2:\n image: chrislusf/seaweedfs:local\n ports:\n - 9334:9333\n - 19334:19333\n - 8085:8080\n - 18085:18080\n - 8889:8888\n - 18889:18888\n - 8334:8333\n command: "server -ip=server2 -filer -s3 -volume.max=0 -master.volumeSizeLimitMB=1024 -volume.preStopSeconds=1"\n | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-clusters-compose.yml | local-clusters-compose.yml | YAML | 716 | 0.7 | 0 | 0 | vue-tools | 121 | 2023-12-08T09:40:07.550605 | MIT | false | 216d063eed3a351b50c6e85e2dcf5c3b |
version: '3.9'\n\nservices:\n master:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n command: "-v=1 master -ip=master -volumeSizeLimitMB=10"\n volumes:\n - ./tls:/etc/seaweedfs/tls\n env_file:\n - ${ENV_FILE:-dev.env}\n volume:\n image: chrislusf/seaweedfs:local\n ports:\n - 8080:8080\n - 18080:18080\n command: "-v=1 volume -mserver=master:9333 -port=8080 -ip=volume -preStopSeconds=1 -max=10000"\n depends_on:\n - master\n volumes:\n - ./tls:/etc/seaweedfs/tls\n env_file:\n - ${ENV_FILE:-dev.env}\n filer:\n image: chrislusf/seaweedfs:local\n ports:\n - 8888:8888\n - 18888:18888\n command: '-v=1 filer -ip.bind=0.0.0.0 -master="master:9333"'\n depends_on:\n - master\n - volume\n volumes:\n - ./tls:/etc/seaweedfs/tls\n env_file:\n - ${ENV_FILE:-dev.env}\n\n iam:\n image: chrislusf/seaweedfs:local\n ports:\n - 8111:8111\n command: '-v=1 iam -filer="filer:8888" -master="master:9333"'\n depends_on:\n - master\n - volume\n - filer\n volumes:\n - ./tls:/etc/seaweedfs/tls\n\n s3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8333:8333\n command: '-v=1 s3 -filer="filer:8888" -ip.bind=s3'\n depends_on:\n - master\n - volume\n - filer\n volumes:\n - ./tls:/etc/seaweedfs/tls\n env_file:\n - ${ENV_FILE:-dev.env}\n\n mount:\n image: chrislusf/seaweedfs:local\n privileged: true\n cap_add:\n - SYS_ADMIN\n mem_limit: 4096m\n command: '-v=4 mount -filer="filer:8888" -dirAutoCreate -dir=/mnt/seaweedfs -cacheCapacityMB=100 -concurrentWriters=128'\n volumes:\n - ./tls:/etc/seaweedfs/tls\n env_file:\n - ${ENV_FILE:-dev.env}\n depends_on:\n - master\n - volume\n - filer\n | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-dev-compose.yml | local-dev-compose.yml | YAML | 1,797 | 0.7 | 0 | 0 | awesome-app | 429 | 2024-01-23T17:24:25.734812 | BSD-3-Clause | false | 6364a70f268b20c6556900f5f9725ce9 |
version: '3.9'\n\nservices:\n master0:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n command: "-v=4 master -volumeSizeLimitMB 100 -raftHashicorp -electionTimeout 1s -ip=master0 -port=9333 -peers=master1:9334,master2:9335 -mdir=/data"\n volumes:\n - ./master/0:/data\n environment:\n WEED_MASTER_VOLUME_GROWTH_COPY_1: 1\n WEED_MASTER_VOLUME_GROWTH_COPY_2: 2\n WEED_MASTER_VOLUME_GROWTH_COPY_OTHER: 1\n master1:\n image: chrislusf/seaweedfs:local\n ports:\n - 9334:9334\n - 19334:19334\n command: "-v=4 master -volumeSizeLimitMB 100 -raftHashicorp -electionTimeout 1s -ip=master1 -port=9334 -peers=master0:9333,master2:9335 -mdir=/data"\n volumes:\n - ./master/1:/data\n environment:\n WEED_MASTER_VOLUME_GROWTH_COPY_1: 1\n WEED_MASTER_VOLUME_GROWTH_COPY_2: 2\n WEED_MASTER_VOLUME_GROWTH_COPY_OTHER: 1\n master2:\n image: chrislusf/seaweedfs:local\n ports:\n - 9335:9335\n - 19335:19335\n command: "-v=4 master -volumeSizeLimitMB 100 -raftHashicorp -electionTimeout 1s -ip=master2 -port=9335 -peers=master0:9333,master1:9334 -mdir=/data"\n volumes:\n - ./master/2:/data\n environment:\n WEED_MASTER_VOLUME_GROWTH_COPY_1: 1\n WEED_MASTER_VOLUME_GROWTH_COPY_2: 2\n WEED_MASTER_VOLUME_GROWTH_COPY_OTHER: 1\n volume1:\n image: chrislusf/seaweedfs:local\n ports:\n - 8080:8080\n - 18080:18080\n command: 'volume -dataCenter=dc1 -rack=v1 -mserver="master0:9333,master1:9334,master2:9335" -port=8080 -ip=volume1 -publicUrl=localhost:8080 -preStopSeconds=1'\n depends_on:\n - master0\n - master1\n volume2:\n image: chrislusf/seaweedfs:local\n ports:\n - 8082:8082\n - 18082:18082\n command: 'volume -dataCenter=dc2 -rack=v2 -mserver="master0:9333,master1:9334,master2:9335" -port=8082 -ip=volume2 -publicUrl=localhost:8082 -preStopSeconds=1'\n depends_on:\n - master0\n - master1\n volume3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8083:8083\n - 18083:18083\n command: 'volume -dataCenter=dc3 -rack=v3 -mserver="master0:9333,master1:9334,master2:9335" -port=8083 -ip=volume3 -publicUrl=localhost:8083 -preStopSeconds=1'\n depends_on:\n - master0\n - master1\n filer:\n image: chrislusf/seaweedfs:local\n ports:\n - 8888:8888\n - 18888:18888\n - 8111:8111\n command: 'filer -defaultReplicaPlacement=100 -iam -master="master0:9333,master1:9334,master2:9335"'\n depends_on:\n - master0\n - master1\n - volume1\n - volume2\n s3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8333:8333\n command: '-v=9 s3 -ip.bind="s3" -filer="filer:8888"'\n depends_on:\n - master0\n - master1\n - volume1\n - volume2\n - filer | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-hashicorp-raft-compose.yml | local-hashicorp-raft-compose.yml | YAML | 2,777 | 0.7 | 0 | 0 | react-lib | 137 | 2024-01-14T07:19:12.938613 | MIT | false | 0b38bf31c0525be4f718337cb03204e8 |
version: '3.9'\n\nservices:\n master:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n command: "master -ip=master"\n volume:\n image: chrislusf/seaweedfs:local\n ports:\n - 8080:8080\n - 18080:18080\n command: "volume -mserver=master:9333 -port=8080 -ip=volume"\n depends_on:\n - master\n mysql:\n image: percona/percona-server:5.7\n ports:\n - 3306:3306\n volumes:\n - ./seaweedfs.sql:/docker-entrypoint-initdb.d/seaweedfs.sql\n environment:\n - MYSQL_ROOT_PASSWORD=secret\n - MYSQL_DATABASE=seaweedfs\n - MYSQL_PASSWORD=secret\n - MYSQL_USER=seaweedfs\n filer:\n image: chrislusf/seaweedfs:local\n ports:\n - 8888:8888\n - 18888:18888\n environment:\n - WEED_MYSQL_HOSTNAME=mysql\n - WEED_MYSQL_PORT=3306\n - WEED_MYSQL_DATABASE=seaweedfs\n - WEED_MYSQL_USERNAME=seaweedfs\n - WEED_MYSQL_PASSWORD=secret\n - WEED_MYSQL_ENABLED=true\n - WEED_MYSQL_CONNECTION_MAX_IDLE=5\n - WEED_MYSQL_CONNECTION_MAX_OPEN=75\n # "refresh" connection every 10 minutes, eliminating mysql closing "old" connections\n - WEED_MYSQL_CONNECTION_MAX_LIFETIME_SECONDS=600\n # enable usage of memsql as filer backend\n - WEED_MYSQL_INTERPOLATEPARAMS=true\n - WEED_LEVELDB2_ENABLED=false\n command: '-v 9 filer -master="master:9333"'\n depends_on:\n - master\n - volume\n - mysql\n ingress:\n image: jwilder/nginx-proxy:alpine\n ports:\n - "80:80"\n volumes:\n - /var/run/docker.sock:/tmp/docker.sock:ro\n - ./nginx/proxy.conf:/etc/nginx/proxy.conf\n s3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8333:8333\n command: '-v 9 s3 -filer="filer:8888"'\n depends_on:\n - master\n - volume\n - filer\n environment:\n - VIRTUAL_HOST=ingress\n - VIRTUAL_PORT=8333\n registry:\n image: registry:2\n environment:\n REGISTRY_HTTP_ADDR: "0.0.0.0:5001" # seaweedfs s3\n REGISTRY_LOG_LEVEL: "debug"\n REGISTRY_STORAGE: "s3"\n REGISTRY_STORAGE_S3_REGION: "us-east-1"\n REGISTRY_STORAGE_S3_REGIONENDPOINT: "http://ingress"\n REGISTRY_STORAGE_S3_BUCKET: "registry"\n REGISTRY_STORAGE_S3_ACCESSKEY: "some_access_key1"\n REGISTRY_STORAGE_S3_SECRETKEY: "some_secret_key1"\n REGISTRY_STORAGE_S3_V4AUTH: "true"\n REGISTRY_STORAGE_S3_SECURE: "false"\n REGISTRY_STORAGE_S3_SKIPVERIFY: "true"\n REGISTRY_STORAGE_S3_ROOTDIRECTORY: "/"\n REGISTRY_STORAGE_DELETE_ENABLED: "true"\n REGISTRY_STORAGE_REDIRECT_DISABLE: "true"\n REGISTRY_VALIDATION_DISABLED: "true"\n ports:\n - 5001:5001\n depends_on:\n - s3\n - ingress | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-k8s-compose.yml | local-k8s-compose.yml | YAML | 2,674 | 0.8 | 0 | 0.021505 | vue-tools | 217 | 2024-05-27T00:17:49.049224 | MIT | false | d1c0f02eb921a759ddba2cd84c6212b1 |
version: '3.9'\n\nservices:\n master:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n command: "master -ip=master -volumeSizeLimitMB=1024"\n volume:\n image: chrislusf/seaweedfs:local\n ports:\n - 8080:8080\n - 18080:18080\n command: "volume -mserver=master:9333 -port=8080 -ip=volume -max=0 -preStopSeconds=1"\n depends_on:\n - master\n s3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8888:8888\n - 18888:18888\n - 8333:8333\n command: '-v 1 filer -master="master:9333" -s3 -s3.config=/etc/seaweedfs/s3.json -s3.port=8333'\n volumes:\n - ./s3.json:/etc/seaweedfs/s3.json\n depends_on:\n - master\n - volume\n minio-gateway-s3:\n image: minio/minio\n ports:\n - 9000:9000\n command: 'minio gateway s3 http://s3:8333'\n restart: on-failure\n environment:\n MINIO_ACCESS_KEY: "some_access_key1"\n MINIO_SECRET_KEY: "some_secret_key1"\n depends_on:\n - s3\n minio-warp:\n image: minio/warp\n command: 'mixed --duration=5m --obj.size=3mb --autoterm'\n restart: on-failure\n environment:\n WARP_HOST: "minio-gateway-s3:9000"\n WARP_ACCESS_KEY: "some_access_key1"\n WARP_SECRET_KEY: "some_secret_key1"\n depends_on:\n - minio-gateway-s3 | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-minio-gateway-compose.yml | local-minio-gateway-compose.yml | YAML | 1,282 | 0.8 | 0 | 0 | python-kit | 837 | 2025-01-29T05:59:31.890672 | GPL-3.0 | false | e4f3499297ea00e4e1d58180e57444e5 |
version: '3.9'\n\nservices:\n master:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n command: "master -ip=master"\n volume:\n image: chrislusf/seaweedfs:local\n ports:\n - 8080:8080\n - 18080:18080\n command: "volume -mserver=master:9333 -port=8080 -ip=volume"\n depends_on:\n - master\n s3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8888:8888\n - 18888:18888\n - 8333:8333\n command: '-v 9 filer -master="master:9333" -s3'\n depends_on:\n - master\n - volume\n nextcloud:\n image: nextcloud:23.0.5-apache\n environment:\n - OBJECTSTORE_S3_HOST=s3\n - OBJECTSTORE_S3_BUCKET=nextcloud\n - OBJECTSTORE_S3_KEY=some_access_key1\n - OBJECTSTORE_S3_SECRET=some_secret_key1\n - OBJECTSTORE_S3_PORT=8333\n - OBJECTSTORE_S3_SSL=false\n - OBJECTSTORE_S3_USEPATH_STYLE=true\n - SQLITE_DATABASE=nextcloud\n - NEXTCLOUD_ADMIN_USER=admin\n - NEXTCLOUD_ADMIN_PASSWORD=admin\n ports:\n - 80:80\n depends_on:\n - s3 | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-nextcloud-compose.yml | local-nextcloud-compose.yml | YAML | 1,049 | 0.7 | 0 | 0 | vue-tools | 783 | 2024-08-30T15:06:53.551271 | Apache-2.0 | false | 58ef41e1512be8a47061c59cff9a4b6e |
version: '3.9'\n\nservices:\n master:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n command: "master -ip=master -volumeSizeLimitMB=1024"\n volume:\n image: chrislusf/seaweedfs:local\n ports:\n - 8080:8080\n - 18080:18080\n command: "volume -mserver=master:9333 -port=8080 -ip=volume -max=0 -preStopSeconds=1"\n depends_on:\n - master\n s3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8888:8888\n - 18888:18888\n - 8333:8333\n command: '-v 9 filer -master="master:9333" -s3 -s3.config=/etc/seaweedfs/s3.json -s3.port=8333'\n volumes:\n - ./s3.json:/etc/seaweedfs/s3.json\n depends_on:\n - master\n - volume\n minio:\n image: minio/minio\n ports:\n - 9000:9000\n command: 'minio server /data'\n environment:\n MINIO_ACCESS_KEY: "some_access_key1"\n MINIO_SECRET_KEY: "some_secret_key1"\n depends_on:\n - master\n registry1:\n image: registry:2\n environment:\n REGISTRY_HTTP_ADDR: "0.0.0.0:5001" # seaweedfs s3\n REGISTRY_LOG_LEVEL: "debug"\n REGISTRY_STORAGE: "s3"\n REGISTRY_STORAGE_S3_REGION: "us-east-1"\n REGISTRY_STORAGE_S3_REGIONENDPOINT: "http://s3:8333"\n REGISTRY_STORAGE_S3_BUCKET: "registry"\n REGISTRY_STORAGE_S3_ACCESSKEY: "some_access_key1"\n REGISTRY_STORAGE_S3_SECRETKEY: "some_secret_key1"\n REGISTRY_STORAGE_S3_V4AUTH: "true"\n REGISTRY_STORAGE_S3_SECURE: "false"\n REGISTRY_STORAGE_S3_SKIPVERIFY: "true"\n REGISTRY_STORAGE_S3_ROOTDIRECTORY: "/"\n REGISTRY_STORAGE_DELETE_ENABLED: "true"\n REGISTRY_STORAGE_REDIRECT_DISABLE: "true"\n REGISTRY_VALIDATION_DISABLED: "true"\n ports:\n - 5001:5001\n depends_on:\n - s3\n - minio\n registry2:\n image: registry:2\n environment:\n REGISTRY_HTTP_ADDR: "0.0.0.0:5002" # minio\n REGISTRY_LOG_LEVEL: "debug"\n REGISTRY_STORAGE: "s3"\n REGISTRY_STORAGE_S3_REGION: "us-east-1"\n REGISTRY_STORAGE_S3_REGIONENDPOINT: "http://minio:9000"\n REGISTRY_STORAGE_S3_BUCKET: "registry"\n REGISTRY_STORAGE_S3_ACCESSKEY: "some_access_key1"\n REGISTRY_STORAGE_S3_SECRETKEY: "some_secret_key1"\n REGISTRY_STORAGE_S3_V4AUTH: "true"\n REGISTRY_STORAGE_S3_SECURE: "false"\n REGISTRY_STORAGE_S3_SKIPVERIFY: "true"\n REGISTRY_STORAGE_S3_ROOTDIRECTORY: "/"\n REGISTRY_STORAGE_DELETE_ENABLED: "true"\n REGISTRY_STORAGE_REDIRECT_DISABLE: "true"\n REGISTRY_VALIDATION_DISABLED: "true"\n ports:\n - 5002:5002\n depends_on:\n - s3\n - minio | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-registry-compose.yml | local-registry-compose.yml | YAML | 2,556 | 0.8 | 0 | 0 | awesome-app | 705 | 2024-02-26T05:21:52.354260 | MIT | false | 19484d0b09263e96ce32c97c0e543cc1 |
version: '3.9'\n\nservices:\n master:\n image: chrislusf/seaweedfs:local\n ports:\n - 9333:9333\n - 19333:19333\n command: "master -ip=master"\n volume:\n image: chrislusf/seaweedfs:local\n ports:\n - 8080:8080\n - 18080:18080\n command: "volume -mserver=master:9333 -port=8080 -ip=volume -preStopSeconds=1"\n depends_on:\n - master\n filer:\n image: chrislusf/seaweedfs:local\n ports:\n - 8888:8888\n - 18888:18888\n command: '-v=9 filer -master="master:9333"'\n restart: on-failure\n volumes:\n - ./notification.toml:/etc/seaweedfs/notification.toml\n depends_on:\n - master\n - volume\n - rabbitmq\n - replicate\n environment:\n RABBIT_SERVER_URL: "amqp://guest:guest@rabbitmq:5672/"\n replicate:\n image: chrislusf/seaweedfs:local\n command: '-v=9 filer.replicate'\n restart: on-failure\n volumes:\n - ./notification.toml:/etc/seaweedfs/notification.toml\n - ./replication.toml:/etc/seaweedfs/replication.toml\n depends_on:\n - rabbitmq\n environment:\n RABBIT_SERVER_URL: "amqp://guest:guest@rabbitmq:5672/"\n s3:\n image: chrislusf/seaweedfs:local\n ports:\n - 8333:8333\n command: 's3 -filer="filer:8888"'\n depends_on:\n - master\n - volume\n - filer\n rabbitmq:\n image: rabbitmq:3.8.10-management-alpine\n ports:\n - 5672:5672\n - 15671:15671\n - 15672:15672\n environment:\n RABBITMQ_SERVER_ADDITIONAL_ERL_ARGS: "-rabbit log_levels [{connection,error},{queue,debug}]" | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-replicate-compose.yml | local-replicate-compose.yml | YAML | 1,530 | 0.8 | 0 | 0 | node-utils | 219 | 2023-09-10T01:49:18.820473 | Apache-2.0 | false | 6b6c19d17d0cc25bc2d84fd2835692af |
version: '3.9'\nservices:\n node1:\n image: chrislusf/seaweedfs:local\n command: "server -master -volume -filer"\n ports:\n - 8888:8888\n - 18888:18888\n healthcheck:\n test: [ "CMD", "curl", "--fail", "-I", "http://localhost:9333/cluster/healthz" ]\n interval: 1s\n start_period: 10s\n timeout: 30s\n mount1:\n image: chrislusf/seaweedfs:local\n privileged: true\n command: "mount -filer=node1:8888 -dir=/mnt -dirAutoCreate"\n healthcheck:\n test: [ "CMD", "curl", "--fail", "-I", "http://node1:8888/" ]\n interval: 1s\n start_period: 10s\n timeout: 30s\n depends_on:\n node1:\n condition: service_healthy\n node2:\n image: chrislusf/seaweedfs:local\n ports:\n - 7888:8888\n - 17888:18888\n command: "server -master -volume -filer"\n healthcheck:\n test: [ "CMD", "curl", "--fail", "-I", "http://localhost:9333/cluster/healthz" ]\n interval: 1s\n start_period: 10s\n timeout: 30s\n mount2:\n image: chrislusf/seaweedfs:local\n privileged: true\n command: "mount -filer=node2:8888 -dir=/mnt -dirAutoCreate"\n healthcheck:\n test: [ "CMD", "curl", "--fail", "-I", "http://node2:8888/" ]\n interval: 1s\n start_period: 10s\n timeout: 30s\n depends_on:\n node2:\n condition: service_healthy\n sync:\n image: chrislusf/seaweedfs:local\n command: "-v=4 filer.sync -a=node1:8888 -b=node2:8888 -a.debug -b.debug"\n depends_on:\n mount1:\n condition: service_healthy\n mount2:\n condition: service_healthy\n | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\local-sync-mount-compose.yml | local-sync-mount-compose.yml | YAML | 1,559 | 0.8 | 0 | 0 | vue-tools | 172 | 2023-12-25T14:04:52.796489 | Apache-2.0 | false | 3022d89054f783f373548ad2ea0c271c |
# 2021-01-30 16:25:30\nversion: '3.8'\n\nservices:\n\n etcd:\n image: gasparekatapy/etcd\n networks:\n - net\n deploy:\n mode: replicated\n replicas: 3\n\n master:\n image: chrislusf/seaweedfs:local\n environment:\n WEED_MASTER_FILER_DEFAULT: "filer:8888"\n WEED_MASTER_SEQUENCER_TYPE: "raft"\n ports:\n - "9333:9333"\n - "19333:19333"\n networks:\n - net\n command:\n - 'master'\n - '-resumeState=true'\n - '-ip=master'\n - '-port=9333'\n deploy:\n mode: replicated\n replicas: 1\n\n filer:\n image: chrislusf/seaweedfs:local\n environment:\n WEED_LEVELDB2_ENABLED: "false"\n WEED_ETCD_ENABLED: "true"\n WEED_ETCD_SERVERS: "etcd:2379"\n ports:\n - target: 8888\n published: 8888\n protocol: tcp\n mode: host\n - target: 18888\n published: 18888\n protocol: tcp\n mode: host\n networks:\n - net\n command:\n - 'filer'\n - '-ip=filer'\n - '-port=8888'\n - '-port.readonly=28888'\n - '-master=master:9333'\n - '-disableDirListing=true'\n deploy:\n mode: replicated\n replicas: 1\n\n volume:\n image: chrislusf/seaweedfs:local\n ports:\n - target: 8080\n published: 8080\n protocol: tcp\n mode: host\n - target: 18080\n published: 18080\n protocol: tcp\n mode: host\n networks:\n - net\n command:\n - 'volume'\n - '-mserver=master:9333'\n - '-port=8080'\n deploy:\n mode: global\n\n ###########################################################################\n\nnetworks:\n net:\n | dataset_sample\yaml\seaweedfs_seaweedfs\docker\compose\swarm-etcd.yml | swarm-etcd.yml | YAML | 1,618 | 0.8 | 0 | 0.025974 | react-lib | 632 | 2024-01-25T07:44:34.945367 | GPL-3.0 | false | dd483b4609ffaeef690f255a9e842001 |
global:\n scrape_interval: 30s\n scrape_timeout: 10s\n\nscrape_configs:\n - job_name: services\n metrics_path: /metrics\n static_configs:\n - targets:\n - 'prometheus:9090'\n - 'master:9324'\n - 'volume:9325'\n - 'filer:9326'\n - 's3:9327' | dataset_sample\yaml\seaweedfs_seaweedfs\docker\prometheus\prometheus.yml | prometheus.yml | YAML | 285 | 0.7 | 0 | 0 | awesome-app | 141 | 2024-09-14T08:59:50.967153 | BSD-3-Clause | false | 0b7bab32aa88a913a94e8148159d3691 |
# Artifact Hub repository metadata file\n#\n# Some settings like the verified publisher flag or the ignored packages won't\n# be applied until the next time the repository is processed. Please keep in\n# mind that the repository won't be processed if it has not changed since the\n# last time it was processed. Depending on the repository kind, this is checked\n# in a different way. For Helm http based repositories, we consider it has\n# changed if the `index.yaml` file changes. For git based repositories, it does\n# when the hash of the last commit in the branch you set up changes. This does\n# NOT apply to ownership claim operations, which are processed immediately.\n#\n\nrepositoryID: 5b2f1fe2-20e5-486e-9746-183484642aa2\n# owners: # (optional, used to claim repository ownership)\n# - name: username\n# email: email\n | dataset_sample\yaml\seaweedfs_seaweedfs\k8s\charts\artifacthub-repo.yml | artifacthub-repo.yml | YAML | 819 | 0.8 | 0.125 | 0.933333 | vue-tools | 538 | 2023-10-12T17:25:28.966567 | Apache-2.0 | false | 364f154d02974dfd2abf0ff40a1aab44 |
server:\n port: 8080\n\nftp:\n port: 2222\n passive-address: localhost\n passive-ports: 30000-30999\n\nhdfs:\n uri: seaweedfs://localhost:8888\n\nseaweedFs:\n enable: true\n access: direct # direct/filerProxy/publicUrl\n replication: "000" | dataset_sample\yaml\seaweedfs_seaweedfs\other\hdfs-over-ftp\src\main\resources\application.yml | application.yml | YAML | 233 | 0.8 | 0 | 0 | react-lib | 441 | 2024-02-27T04:43:24.944742 | BSD-3-Clause | false | 12792fbcefb29f6f90202e9aeb69e3d1 |
github: sebastianbergmann\nliberapay: sebastianbergmann\nthanks_dev: u/gh/sebastianbergmann\ntidelift: "packagist/sebastian/comparator"\n | dataset_sample\yaml\sebastianbergmann_comparator\.github\FUNDING.yml | FUNDING.yml | YAML | 133 | 0.7 | 0 | 0 | vue-tools | 248 | 2024-07-01T08:43:57.996423 | BSD-3-Clause | false | 35ba97965e994596c332d4df415794a2 |
# https://help.github.com/en/categories/automating-your-workflow-with-github-actions\n\non:\n - "pull_request"\n - "push"\n\nname: "CI"\n\nenv:\n COMPOSER_ROOT_VERSION: "7.0.x-dev"\n\npermissions:\n contents: read\n\njobs:\n coding-guidelines:\n name: Coding Guidelines\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Run PHP-CS-Fixer\n run: ./tools/php-cs-fixer fix --dry-run --show-progress=dots --using-cache=no --verbose\n\n static-analysis:\n name: Static Analysis\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.4\n coverage: none\n\n - name: Install dependencies with Composer\n run: ./tools/composer update --no-interaction --no-ansi --no-progress\n\n - name: Run PHPStan\n run: ./tools/phpstan analyse --no-progress --error-format=github\n\n tests:\n name: Tests\n\n runs-on: ubuntu-latest\n\n strategy:\n fail-fast: false\n matrix:\n php-version:\n - "8.3"\n - "8.4"\n - "8.5"\n\n steps:\n - name: "Checkout"\n uses: "actions/checkout@v4"\n\n - name: "Install PHP with extensions"\n uses: "shivammathur/setup-php@v2"\n with:\n php-version: "${{ matrix.php-version }}"\n coverage: "xdebug"\n\n - name: "Install dependencies with Composer"\n run: "./tools/composer update --no-ansi --no-interaction --no-progress"\n\n - name: "Run tests with phpunit/phpunit"\n run: "vendor/bin/phpunit --log-junit junit.xml --coverage-clover=coverage.xml"\n\n - name: Upload test results to Codecov.io\n if: ${{ !cancelled() }}\n uses: codecov/test-results-action@v1\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n\n - name: Upload code coverage data to Codecov.io\n uses: codecov/codecov-action@v4\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n | dataset_sample\yaml\sebastianbergmann_comparator\.github\workflows\ci.yml | ci.yml | YAML | 2,167 | 0.8 | 0.010753 | 0.014493 | node-utils | 790 | 2024-08-22T03:13:49.287298 | MIT | false | cacfde004a07cb2673e6be974f1275fd |
github: sebastianbergmann\nliberapay: sebastianbergmann\nthanks_dev: u/gh/sebastianbergmann\ntidelift: "packagist/sebastian/diff"\n | dataset_sample\yaml\sebastianbergmann_diff\.github\FUNDING.yml | FUNDING.yml | YAML | 127 | 0.7 | 0 | 0 | node-utils | 218 | 2024-08-30T04:58:58.727704 | BSD-3-Clause | false | 8e608f192a4043138b28246d7912a07d |
# https://help.github.com/en/categories/automating-your-workflow-with-github-actions\n\non:\n - "pull_request"\n - "push"\n\nname: "CI"\n\npermissions:\n contents: read\n\nenv:\n COMPOSER_ROOT_VERSION: "7.0.x-dev"\n\njobs:\n coding-guidelines:\n name: Coding Guidelines\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Run PHP-CS-Fixer\n run: ./tools/php-cs-fixer fix --dry-run --show-progress=dots --using-cache=no --verbose\n\n static-analysis:\n name: Static Analysis\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Install dependencies with Composer\n run: ./tools/composer update --no-interaction --no-ansi --no-progress\n\n - name: Run PHPStan\n run: ./tools/phpstan analyse --no-progress --error-format=github\n\n tests:\n name: Tests\n\n runs-on: ubuntu-latest\n\n strategy:\n fail-fast: false\n matrix:\n php-version:\n - "8.3"\n - "8.4"\n - "8.5"\n\n steps:\n - name: "Checkout"\n uses: "actions/checkout@v4"\n\n - name: "Install PHP with extensions"\n uses: "shivammathur/setup-php@v2"\n with:\n php-version: "${{ matrix.php-version }}"\n coverage: "xdebug"\n ini-values: memory_limit=-1\n\n - name: "Install dependencies with Composer"\n run: "./tools/composer update --no-ansi --no-interaction --no-progress"\n\n - name: "Run tests with PHPUnit"\n run: "vendor/bin/phpunit --log-junit junit.xml --coverage-clover=coverage.xml"\n\n - name: Upload test results to Codecov.io\n if: ${{ !cancelled() }}\n uses: codecov/test-results-action@v1\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n\n - name: Upload code coverage data to Codecov.io\n uses: codecov/codecov-action@v4\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n | dataset_sample\yaml\sebastianbergmann_diff\.github\workflows\ci.yml | ci.yml | YAML | 2,197 | 0.8 | 0.010638 | 0.014286 | awesome-app | 817 | 2025-05-17T02:46:20.482657 | MIT | false | a02762f972bae6431bb0cb078ab59bdd |
github: sebastianbergmann\nliberapay: sebastianbergmann\nthanks_dev: u/gh/sebastianbergmann\ntidelift: "packagist/sebastian/environment"\n | dataset_sample\yaml\sebastianbergmann_environment\.github\FUNDING.yml | FUNDING.yml | YAML | 134 | 0.7 | 0 | 0 | node-utils | 149 | 2024-12-18T16:24:35.527855 | BSD-3-Clause | false | 5f9e3713059e221d9b9d1106c00ef3fe |
# https://help.github.com/en/categories/automating-your-workflow-with-github-actions\n\non:\n - "pull_request"\n - "push"\n\nname: "CI"\n\nenv:\n COMPOSER_ROOT_VERSION: "8.0.x-dev"\n\npermissions:\n contents: read\n\njobs:\n coding-guidelines:\n name: Coding Guidelines\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Run PHP-CS-Fixer\n run: ./tools/php-cs-fixer fix --dry-run --show-progress=dots --using-cache=no --verbose\n\n static-analysis:\n name: Static Analysis\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Install dependencies with Composer\n run: ./tools/composer update --no-interaction --no-ansi --no-progress\n\n - name: Run PHPStan\n run: ./tools/phpstan analyse --no-progress --error-format=github\n\n tests:\n name: Tests\n\n runs-on: ubuntu-latest\n\n strategy:\n fail-fast: false\n matrix:\n php-version:\n - "8.3"\n - "8.4"\n - "8.5"\n\n steps:\n - name: "Checkout"\n uses: "actions/checkout@v4"\n\n - name: "Install PHP with extensions"\n uses: "shivammathur/setup-php@v2"\n with:\n php-version: "${{ matrix.php-version }}"\n coverage: "xdebug"\n\n - name: "Install dependencies with Composer"\n run: "./tools/composer update --no-ansi --no-interaction --no-progress"\n\n - name: "Run tests with PHPUnit"\n run: "vendor/bin/phpunit --log-junit junit.xml --coverage-clover=coverage.xml"\n\n - name: Upload test results to Codecov.io\n if: ${{ !cancelled() }}\n uses: codecov/test-results-action@v1\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n\n - name: Upload code coverage data to Codecov.io\n uses: codecov/codecov-action@v4\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n | dataset_sample\yaml\sebastianbergmann_environment\.github\workflows\ci.yml | ci.yml | YAML | 2,159 | 0.8 | 0.010753 | 0.014493 | node-utils | 50 | 2024-01-02T12:17:24.345784 | Apache-2.0 | false | 61b709c4b36337ba2796f66743d5219e |
github: sebastianbergmann\nliberapay: sebastianbergmann\nthanks_dev: u/gh/sebastianbergmann\ntidelift: "packagist/sebastian/global-state"\n | dataset_sample\yaml\sebastianbergmann_global-state\.github\FUNDING.yml | FUNDING.yml | YAML | 135 | 0.7 | 0 | 0 | react-lib | 946 | 2024-03-06T08:20:59.847843 | BSD-3-Clause | false | 62bffb145ed8fb3c2301d761da9dcb22 |
# https://help.github.com/en/categories/automating-your-workflow-with-github-actions\n\non:\n - "pull_request"\n - "push"\n\nname: "CI"\n\npermissions:\n contents: read\n\nenv:\n COMPOSER_ROOT_VERSION: "8.0.x-dev"\n\njobs:\n coding-guidelines:\n name: Coding Guidelines\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Run PHP-CS-Fixer\n run: ./tools/php-cs-fixer fix --dry-run --show-progress=dots --using-cache=no --verbose\n\n static-analysis:\n name: Static Analysis\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Install dependencies with Composer\n run: ./tools/composer update --no-interaction --no-ansi --no-progress\n\n - name: Run PHPStan\n run: ./tools/phpstan analyse --no-progress --error-format=github\n\n tests:\n name: Tests\n\n runs-on: ubuntu-latest\n\n strategy:\n fail-fast: false\n matrix:\n php-version:\n - "8.3"\n - "8.4"\n - "8.5"\n\n steps:\n - name: "Checkout"\n uses: "actions/checkout@v4"\n\n - name: "Install PHP with extensions"\n uses: "shivammathur/setup-php@v2"\n with:\n php-version: "${{ matrix.php-version }}"\n coverage: "xdebug"\n\n - name: "Install dependencies with Composer"\n run: "./tools/composer update --no-ansi --no-interaction --no-progress"\n\n - name: "Run tests with PHPUnit"\n run: "vendor/bin/phpunit --log-junit junit.xml --coverage-clover=coverage.xml"\n\n - name: Upload test results to Codecov.io\n if: ${{ !cancelled() }}\n uses: codecov/test-results-action@v1\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n\n - name: Upload code coverage data to Codecov.io\n uses: codecov/codecov-action@v4\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n | dataset_sample\yaml\sebastianbergmann_global-state\.github\workflows\ci.yml | ci.yml | YAML | 2,159 | 0.8 | 0.010753 | 0.014493 | react-lib | 107 | 2024-01-22T10:11:28.594808 | GPL-3.0 | false | 905c1ac3ef5740c00f237f8f02457e83 |
github: sebastianbergmann\nliberapay: sebastianbergmann\nthanks_dev: u/gh/sebastianbergmann\ntidelift: "packagist/sebastian/object-enumerator"\n | dataset_sample\yaml\sebastianbergmann_object-enumerator\.github\FUNDING.yml | FUNDING.yml | YAML | 140 | 0.7 | 0 | 0 | awesome-app | 576 | 2025-05-07T04:16:49.669265 | Apache-2.0 | false | fb5c7f8df633bebd49449de7eb767874 |
# https://help.github.com/en/categories/automating-your-workflow-with-github-actions\n\non:\n - "pull_request"\n - "push"\n\nname: "CI"\n\nenv:\n COMPOSER_ROOT_VERSION: "7.0.x-dev"\n\npermissions:\n contents: read\n\njobs:\n coding-guidelines:\n name: Coding Guidelines\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Run PHP-CS-Fixer\n run: ./tools/php-cs-fixer fix --dry-run --show-progress=dots --using-cache=no --verbose\n\n static-analysis:\n name: Static Analysis\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Install dependencies with Composer\n run: ./tools/composer update --no-interaction --no-ansi --no-progress\n\n - name: Run PHPStan\n run: ./tools/phpstan analyse --no-progress --error-format=github\n\n tests:\n name: Tests\n\n runs-on: ubuntu-latest\n\n strategy:\n fail-fast: false\n matrix:\n php-version:\n - "8.3"\n - "8.4"\n - "8.5"\n\n steps:\n - name: "Checkout"\n uses: "actions/checkout@v4"\n\n - name: "Install PHP with extensions"\n uses: "shivammathur/setup-php@v2"\n with:\n php-version: "${{ matrix.php-version }}"\n coverage: "xdebug"\n\n - name: "Install dependencies with Composer"\n run: "./tools/composer update --no-ansi --no-interaction --no-progress"\n\n - name: "Run tests with PHPUnit"\n run: "vendor/bin/phpunit --log-junit junit.xml --coverage-clover=coverage.xml"\n\n - name: Upload test results to Codecov.io\n if: ${{ !cancelled() }}\n uses: codecov/test-results-action@v1\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n\n - name: Upload code coverage data to Codecov.io\n uses: codecov/codecov-action@v4\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n | dataset_sample\yaml\sebastianbergmann_object-enumerator\.github\workflows\ci.yml | ci.yml | YAML | 2,159 | 0.8 | 0.010753 | 0.014493 | node-utils | 121 | 2023-11-24T03:20:01.877841 | GPL-3.0 | false | b8efa6bfb113f1bb705385927fc9e0e4 |
github: sebastianbergmann\nliberapay: sebastianbergmann\nthanks_dev: u/gh/sebastianbergmann\ntidelift: "packagist/sebastian/object-reflector"\n | dataset_sample\yaml\sebastianbergmann_object-reflector\.github\FUNDING.yml | FUNDING.yml | YAML | 139 | 0.7 | 0 | 0 | vue-tools | 793 | 2024-11-05T07:26:37.828114 | MIT | false | dd474d33b1b347c24263434c097440ba |
# https://help.github.com/en/categories/automating-your-workflow-with-github-actions\n\non:\n - "pull_request"\n - "push"\n\nname: "CI"\n\nenv:\n COMPOSER_ROOT_VERSION: "5.0.x-dev"\n\npermissions:\n contents: read\n\njobs:\n coding-guidelines:\n name: Coding Guidelines\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Run PHP-CS-Fixer\n run: ./tools/php-cs-fixer fix --dry-run --show-progress=dots --using-cache=no --verbose\n\n static-analysis:\n name: Static Analysis\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Install dependencies with Composer\n run: ./tools/composer update --no-interaction --no-ansi --no-progress\n\n - name: Run PHPStan\n run: ./tools/phpstan analyse --no-progress --error-format=github\n\n tests:\n name: Tests\n\n runs-on: ubuntu-latest\n\n strategy:\n fail-fast: false\n matrix:\n php-version:\n - "8.3"\n - "8.4"\n - "8.5"\n\n steps:\n - name: "Checkout"\n uses: "actions/checkout@v4"\n\n - name: "Install PHP with extensions"\n uses: "shivammathur/setup-php@v2"\n with:\n php-version: "${{ matrix.php-version }}"\n coverage: "xdebug"\n\n - name: "Install dependencies with Composer"\n run: "./tools/composer update --no-ansi --no-interaction --no-progress"\n\n - name: "Run tests with PHPUnit"\n run: "vendor/bin/phpunit --log-junit junit.xml --coverage-clover=coverage.xml"\n\n - name: Upload test results to Codecov.io\n if: ${{ !cancelled() }}\n uses: codecov/test-results-action@v1\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n\n - name: Upload code coverage data to Codecov.io\n uses: codecov/codecov-action@v4\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n | dataset_sample\yaml\sebastianbergmann_object-reflector\.github\workflows\ci.yml | ci.yml | YAML | 2,159 | 0.8 | 0.010753 | 0.014493 | react-lib | 809 | 2023-10-03T16:01:18.958041 | GPL-3.0 | false | ed54bc9e0fb7e909a46d1d9daee06774 |
github: sebastianbergmann\nliberapay: sebastianbergmann\nthanks_dev: u/gh/sebastianbergmann\ntidelift: "packagist/phpunit/php-timer"\n | dataset_sample\yaml\sebastianbergmann_php-timer\.github\FUNDING.yml | FUNDING.yml | YAML | 130 | 0.7 | 0 | 0 | node-utils | 101 | 2025-07-03T00:13:42.901931 | BSD-3-Clause | false | 0cc1ef6fc23ff6ad1e88a11cee369a86 |
# https://help.github.com/en/categories/automating-your-workflow-with-github-actions\n\non:\n - "pull_request"\n - "push"\n\nname: "CI"\n\nenv:\n COMPOSER_ROOT_VERSION: "8.0.x-dev"\n\npermissions:\n contents: read\n\njobs:\n coding-guidelines:\n name: Coding Guidelines\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Run PHP-CS-Fixer\n run: ./tools/php-cs-fixer fix --dry-run --show-progress=dots --using-cache=no --verbose\n\n static-analysis:\n name: Static Analysis\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Install dependencies with Composer\n run: ./tools/composer update --no-interaction --no-ansi --no-progress\n\n - name: Run PHPStan\n run: ./tools/phpstan analyse --no-progress --error-format=github\n\n tests:\n name: Tests\n\n runs-on: ubuntu-latest\n\n strategy:\n fail-fast: false\n matrix:\n php-version:\n - "8.3"\n - "8.4"\n - "8.5"\n\n steps:\n - name: "Checkout"\n uses: "actions/checkout@v4"\n\n - name: "Install PHP with extensions"\n uses: "shivammathur/setup-php@v2"\n with:\n php-version: "${{ matrix.php-version }}"\n coverage: "xdebug"\n\n - name: "Install dependencies with Composer"\n run: "./tools/composer update --no-ansi --no-interaction --no-progress"\n\n - name: "Run tests with PHPUnit"\n run: "vendor/bin/phpunit --log-junit junit.xml --coverage-clover=coverage.xml"\n\n - name: Upload test results to Codecov.io\n if: ${{ !cancelled() }}\n uses: codecov/test-results-action@v1\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n\n - name: Upload code coverage data to Codecov.io\n uses: codecov/codecov-action@v4\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n | dataset_sample\yaml\sebastianbergmann_php-timer\.github\workflows\ci.yml | ci.yml | YAML | 2,159 | 0.8 | 0.010753 | 0.014493 | node-utils | 156 | 2025-06-22T21:52:30.308682 | BSD-3-Clause | false | 61b709c4b36337ba2796f66743d5219e |
github: sebastianbergmann\nliberapay: sebastianbergmann\nthanks_dev: u/gh/sebastianbergmann\ntidelift: "packagist/phpunit/phpunit"\ncustom: https://phpunit.de/sponsors.html\n | dataset_sample\yaml\sebastianbergmann_phpunit\.github\FUNDING.yml | FUNDING.yml | YAML | 169 | 0.8 | 0 | 0 | awesome-app | 37 | 2024-05-26T08:59:27.004227 | Apache-2.0 | false | 0ffec6504e6e51668936e9ab83e9b09b |
github: sebastianbergmann\nliberapay: sebastianbergmann\nthanks_dev: u/gh/sebastianbergmann\ntidelift: "packagist/sebastian/recursion-context"\n | dataset_sample\yaml\sebastianbergmann_recursion-context\.github\FUNDING.yml | FUNDING.yml | YAML | 140 | 0.7 | 0 | 0 | react-lib | 425 | 2024-05-25T05:22:08.373400 | MIT | false | dbcb795b25baf12c49b040a2c3c66c9c |
# https://help.github.com/en/categories/automating-your-workflow-with-github-actions\n\non:\n - "pull_request"\n - "push"\n\nname: "CI"\n\nenv:\n COMPOSER_ROOT_VERSION: "7.0.x-dev"\n\npermissions:\n contents: read\n\njobs:\n coding-guidelines:\n name: Coding Guidelines\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Run PHP-CS-Fixer\n run: ./tools/php-cs-fixer fix --dry-run --show-progress=dots --using-cache=no --verbose\n\n static-analysis:\n name: Static Analysis\n\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Install PHP\n uses: shivammathur/setup-php@v2\n with:\n php-version: 8.3\n coverage: none\n\n - name: Install dependencies with Composer\n run: ./tools/composer update --no-interaction --no-ansi --no-progress\n\n - name: Run PHPStan\n run: ./tools/phpstan analyse --no-progress --error-format=github\n\n tests:\n name: Tests\n\n runs-on: ubuntu-latest\n\n strategy:\n fail-fast: false\n matrix:\n php-version:\n - "8.3"\n - "8.4"\n - "8.5"\n\n steps:\n - name: "Checkout"\n uses: "actions/checkout@v4"\n\n - name: "Install PHP with extensions"\n uses: "shivammathur/setup-php@v2"\n with:\n php-version: "${{ matrix.php-version }}"\n coverage: "xdebug"\n\n - name: "Install dependencies with Composer"\n run: "./tools/composer update --no-ansi --no-interaction --no-progress"\n\n - name: "Run tests with PHPUnit"\n run: "vendor/bin/phpunit --log-junit junit.xml --coverage-clover=coverage.xml"\n\n - name: Upload test results to Codecov.io\n if: ${{ !cancelled() }}\n uses: codecov/test-results-action@v1\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n\n - name: Upload code coverage data to Codecov.io\n uses: codecov/codecov-action@v4\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n | dataset_sample\yaml\sebastianbergmann_recursion-context\.github\workflows\ci.yml | ci.yml | YAML | 2,159 | 0.8 | 0.010753 | 0.014493 | react-lib | 64 | 2023-10-12T20:19:32.230470 | Apache-2.0 | false | b8efa6bfb113f1bb705385927fc9e0e4 |
contact_links:\n - name: Ask for help\n url: https://github.com/serilog/serilog/wiki/Usage-help\n about: Ask the community for help on how to use Serilog\n | dataset_sample\yaml\serilog_serilog\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 158 | 0.8 | 0.5 | 0 | vue-tools | 504 | 2025-06-06T03:06:52.913439 | BSD-3-Clause | false | f852618073f9a6341852bc1ffb531c49 |
# If this file is renamed, the incrementing run attempt number will be reset.\n\nname: CI\n\non:\n push:\n branches: [ "dev", "main" ]\n pull_request:\n branches: [ "dev", "main" ]\n\nenv:\n CI_BUILD_NUMBER_BASE: ${{ github.run_number }}\n CI_TARGET_BRANCH: ${{ github.head_ref || github.ref_name }}\n\njobs:\n build:\n\n # The build must run on Windows so that .NET Framework targets can be built and tested.\n runs-on: windows-latest\n\n permissions:\n contents: write\n\n steps:\n - uses: actions/checkout@v4\n - name: Setup\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 9.0.x\n - name: Compute build number\n shell: bash\n run: |\n echo "CI_BUILD_NUMBER=$(($CI_BUILD_NUMBER_BASE+2300))" >> $GITHUB_ENV\n - name: Build and Publish\n env:\n DOTNET_CLI_TELEMETRY_OPTOUT: true\n NUGET_API_KEY: ${{ secrets.NUGET_API_KEY }}\n GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n shell: pwsh\n run: |\n ./Build.ps1\n | dataset_sample\yaml\serilog_serilog\.github\workflows\ci.yml | ci.yml | YAML | 1,023 | 0.8 | 0 | 0.058824 | vue-tools | 847 | 2024-03-15T17:10:12.538572 | MIT | false | 7bedddeec31459350d7c30769320d7e1 |
name: Benchmark OrmLite \n\non: workflow_dispatch\n\njobs:\n benchmark-ormlite:\n runs-on: ubuntu-latest\n services:\n postgres:\n image: postgres\n env:\n POSTGRES_USER: postgres\n POSTGRES_PASSWORD: p@55wOrd\n POSTGRES_DB: test\n ports:\n - 5432:5432\n options: >-\n --health-cmd pg_isready\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n sqlserver:\n image: mcr.microsoft.com/mssql/server:2022-latest\n env:\n ACCEPT_EULA: Y\n SA_PASSWORD: p@55wOrd\n MSSQL_PID: Developer\n ports:\n - 1433:1433\n mysql:\n image: mysql:latest\n env:\n MYSQL_ROOT_PASSWORD: p@55wOrd\n MYSQL_DATABASE: test\n ports:\n - 3306:3306\n steps:\n - name: Create Databases\n run: |\n sudo apt-get update\n sudo apt-get install -y postgresql-client mysql-client sqlcmd\n PGPASSWORD=p@55wOrd psql -h localhost -U postgres -tc "SELECT 'CREATE DATABASE test' WHERE NOT EXISTS (SELECT FROM pg_database WHERE datname = 'test')"\n mysql -h 127.0.0.1 --password=p@55wOrd --user root -e "CREATE DATABASE IF NOT EXISTS test"\n sqlcmd -U sa -P p@55wOrd -Q "CREATE DATABASE test"\n\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Build\n working-directory: ServiceStack.OrmLite/build\n run: dotnet build ./build.proj\n\n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n\n - name: Benchmark OrmLite\n working-directory: ServiceStack.OrmLite/tests/ServiceStack.OrmLite.Benchmarks\n run: dotnet run -c Release\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\benchmarks-ormlite.yml | benchmarks-ormlite.yml | YAML | 1,871 | 0.7 | 0 | 0 | vue-tools | 120 | 2023-07-17T21:42:32.325870 | Apache-2.0 | false | d6513acb906870cc86f3a81a2c1ac324 |
name: Build Aws\n\non:\n push:\n paths:\n - 'ServiceStack.Aws/**'\n - '.github/workflows/build-aws.yml'\n\njobs:\n build-aws:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Build\n working-directory: ServiceStack.Aws/build\n run: dotnet build ./build.proj\n\n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n echo "AWS_ACCESS_KEY=${{ secrets.AWS_ACCESS_KEY }}" >> $GITHUB_ENV\n echo "AWS_SECRET_KEY=${{ secrets.AWS_SECRET_KEY }}" >> $GITHUB_ENV\n\n - name: Aws Tests\n run: dotnet test --framework net8.0 ./ServiceStack.Aws/tests/ServiceStack.Aws.Tests/ServiceStack.Aws.Tests.csproj --logger 'trx;LogFileName=results.trx'\n \n - name: Test Report\n uses: dorny/test-reporter@v1\n if: (success() || failure()) && steps.test_integration.outcome != 'skipped'\n with:\n name: Text Tests\n path: ./ServiceStack.Aws/tests/ServiceStack.Aws.Tests/TestResults/results.trx\n reporter: dotnet-trx\n only-summary: 'true'\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-aws.yml | build-aws.yml | YAML | 1,272 | 0.8 | 0.025 | 0 | vue-tools | 453 | 2025-04-21T09:32:33.746400 | MIT | false | 580d6424f3ccf6825ed443763e5b84c5 |
name: Build Azure\n\non:\n push:\n paths:\n - 'ServiceStack.Azure/**'\n - '.github/workflows/build-azure.yml'\n\npermissions:\n contents: read\n\njobs:\n build-azure:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Build\n working-directory: ServiceStack.Azure/build\n run: dotnet build ./build.proj\n\n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n\n# Azure tests need mocking.\n# - name: Azure Tests\n# run: dotnet test --framework net8.0 ./ServiceStack.Azure/tests/ServiceStack.Azure.Tests/ServiceStack.Azure.Tests.csproj --logger 'trx;LogFileName=results.trx'\n#\n# - name: Test Report\n# uses: dorny/test-reporter@v1\n# if: (success() || failure()) && steps.test_integration.outcome != 'skipped'\n# with:\n# name: Text Tests\n# path: ./ServiceStack.Azure/tests/ServiceStack.Azure.Tests/TestResults/results.trx\n# reporter: dotnet-trx\n# only-summary: 'true'\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-azure.yml | build-azure.yml | YAML | 1,200 | 0.8 | 0.02381 | 0.342857 | vue-tools | 307 | 2024-12-29T12:11:16.533781 | BSD-3-Clause | false | 61f2ede2ad99e725e8fc8ff31b3812af |
name: Build Blazor\n\non:\n push:\n paths:\n - 'ServiceStack.Blazor/**'\n - '.github/workflows/build-blazor.yml'\n\npermissions:\n contents: read\n\njobs:\n build-blazor:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Build\n working-directory: ServiceStack.Blazor/build\n run: dotnet build ./build.proj\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-blazor.yml | build-blazor.yml | YAML | 499 | 0.8 | 0 | 0 | react-lib | 816 | 2023-07-18T21:23:05.797244 | BSD-3-Clause | false | 59a103b723fe1994fd047cc0e475ade4 |
name: Build Core\n\non:\n push:\n paths:\n - 'ServiceStack.Core/**'\n - '.github/workflows/build-core.yml'\n\npermissions:\n contents: read\n\njobs:\n build-servicestack-core:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n \n - name: Build\n working-directory: ServiceStack.Core/build\n run: dotnet build ./build.proj \n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-core.yml | build-core.yml | YAML | 512 | 0.8 | 0 | 0 | awesome-app | 298 | 2025-01-28T18:39:35.865730 | BSD-3-Clause | false | 3c14f15db1d12161c9010981806eacc7 |
name: Build Logging\n\non:\n push:\n paths:\n - 'ServiceStack.Logging/**'\n - '.github/workflows/build-logging.yml'\n\npermissions:\n contents: read\n\njobs:\n build-logging:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Build\n working-directory: ServiceStack.Logging/build\n run: dotnet build ./build.proj\n\n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n \n# net472 only package, need to upgrade if possible to run tests or use different runner.\n# - name: Logging Tests\n# run: dotnet test --framework net8.0 ./ServiceStack.Logging/tests/ServiceStack.Logging.Tests/ServiceStack.Logging.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n\n# - name: Test Report\n# uses: dorny/test-reporter@v1\n# if: (success() || failure()) && steps.test_integration.outcome != 'skipped'\n# with:\n# name: Text Tests\n# path: ./ServiceStack.Logging/tests/ServiceStack.Logging.Tests/TestResults/results.trx\n# reporter: dotnet-trx\n# only-summary: 'true'\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-logging.yml | build-logging.yml | YAML | 1,293 | 0.8 | 0.046512 | 0.323529 | vue-tools | 921 | 2024-10-08T12:16:42.219529 | GPL-3.0 | false | c3ad879249207e25f70ba9399e2ef2c0 |
name: Build OrmLite\n\non:\n push:\n paths:\n - 'ServiceStack.OrmLite/**'\n - '.github/workflows/build-ormlite.yml'\n\njobs:\n build-ormlite:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n \n - name: Build\n working-directory: ServiceStack.OrmLite/build\n run: dotnet build ./build.proj\n \n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n \n - name: Tests Setup\n working-directory: ServiceStack.OrmLite/tests\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests.Setup/ServiceStack.OrmLite.Tests.Setup.csproj \n\n - name: Test Sqlite\n working-directory: ServiceStack.OrmLite/tests\n env:\n ORMLITE_DIALECT: Sqlite\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.SqliteTests/ServiceStack.OrmLite.SqliteTests.csproj --logger 'trx;LogFileName=test-results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: OrmLite Tests\n only-summary: 'true'\n path: ./ServiceStack.OrmLite/tests/ServiceStack.OrmLite.SqliteTests/TestResults/test-results.trx\n reporter: dotnet-trx\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-ormlite.yml | build-ormlite.yml | YAML | 1,447 | 0.8 | 0.022222 | 0 | node-utils | 993 | 2025-01-13T17:11:12.985405 | MIT | false | bd4f184cf00271e2a49962c5d70967b4 |
name: Build Redis\n\non:\n push:\n paths:\n - 'ServiceStack.Redis/**'\n - '.github/workflows/build-redis.yml'\n\njobs:\n build-redis:\n runs-on: ubuntu-latest\n services:\n redis:\n image: redis\n options: >-\n --health-cmd "redis-cli ping"\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n ports:\n - 6379:6379\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n \n - name: Build\n working-directory: ServiceStack.Redis/build\n run: dotnet build ./build.proj\n \n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n \n - name: Test Without Integration\n working-directory: ServiceStack.Redis/tests\n run: dotnet test --framework net8.0 ./ServiceStack.Redis.Tests/ServiceStack.Redis.Tests.csproj --filter TestCategory\!=Integration --logger 'trx;LogFileName=non-integration-results.trx'\n\n - name: Non-Integration Tests Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: Redis Non-Integration Tests \n path: ./ServiceStack.Redis/tests/ServiceStack.Redis.Tests/TestResults/non-integration-results.trx\n reporter: dotnet-trx\n only-summary: 'true'\n \n\n - name: Test With Integration\n id: test_integration\n working-directory: ServiceStack.Redis/tests\n run: dotnet test --framework net8.0 ./ServiceStack.Redis.Tests/ServiceStack.Redis.Tests.csproj --filter TestCategory=Integration --logger 'trx;LogFileName=integration-results.trx'\n\n - name: Integration Tests Report\n uses: dorny/test-reporter@v1\n if: (success() || failure()) && steps.test_integration.outcome != 'skipped' \n with:\n name: Redis Integration Tests\n path: ./ServiceStack.Redis/tests/ServiceStack.Redis.Tests/TestResults/integration-results.trx\n reporter: dotnet-trx\n only-summary: 'true' | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-redis.yml | build-redis.yml | YAML | 2,178 | 0.8 | 0.031746 | 0 | python-kit | 18 | 2023-10-13T20:31:49.531679 | BSD-3-Clause | false | 6bdf8a0b322aeb285a402865e65dd988 |
name: Build ServiceStack\n\non: \n push:\n paths:\n - 'ServiceStack/**'\n - '.github/workflows/build-servicestack.yml'\n\njobs:\n build-servicestack:\n runs-on: ubuntu-latest\n permissions:\n checks: write\n contents: read\n services:\n redis:\n image: redis\n options: >-\n --health-cmd "redis-cli ping"\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n ports:\n - 6379:6379\n sqlserver:\n image: mcr.microsoft.com/mssql/server:2022-latest\n env:\n ACCEPT_EULA: Y\n SA_PASSWORD: Test!tesT\n MSSQL_PID: Developer\n ports:\n - 48501:1433\n postgres:\n image: postgres\n env:\n POSTGRES_USER: postgres\n POSTGRES_PASSWORD: test\n POSTGRES_DB: test\n ports:\n - 48303:5432\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n \n - name: Build\n working-directory: ServiceStack/build\n run: dotnet build ./build.proj\n\n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n echo "CI_RABBITMQ=gistlyn.com:45672" >> $GITHUB_ENV\n echo "CI_DYNAMODB=http://gistlyn.com:48000" >> $GITHUB_ENV\n\n - name: Test Extensions\n run: dotnet test --framework net8.0 ./ServiceStack/tests/ServiceStack.Extensions.Tests/ServiceStack.Extensions.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: Common Tests\n only-summary: 'true'\n path: ./ServiceStack/tests/ServiceStack.Extensions.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n - name: Test Common\n run: dotnet test --framework net8.0 ./ServiceStack/tests/ServiceStack.Common.Tests/ServiceStack.Common.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: Common Tests\n only-summary: 'true'\n path: ./ServiceStack/tests/ServiceStack.Common.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n - name: Test ServiceModels\n run: dotnet test --framework net8.0 ./ServiceStack/tests/ServiceStack.ServiceModel.Tests/ServiceStack.ServiceModel.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: ServiceModels Tests\n only-summary: 'true'\n path: ./ServiceStack/tests/ServiceStack.ServiceModel.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n - name: Test WebHost Endpoints\n env:\n PGSQL_CONNECTION: Server=localhost;Port=48303;User Id=postgres;Password=test;Database=test;Pooling=true;MinPoolSize=0;MaxPoolSize=200\n MSSQL_CONNECTION: Server=localhost,48501;Database=master;User Id=sa;Password=Test!tesT;MultipleActiveResultSets=True;\n run: dotnet test --framework net8.0 ./ServiceStack/tests/ServiceStack.WebHost.Endpoints.Tests/ServiceStack.WebHost.Endpoints.Tests.csproj --logger 'trx;LogFileName=results.trx' --logger 'console;verbosity=detailed'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: WebHost.Endpoints Tests\n only-summary: 'true'\n path: ./ServiceStack/tests/ServiceStack.WebHost.Endpoints.Tests/TestResults/results.trx\n reporter: dotnet-trx\n \n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-servicestack.yml | build-servicestack.yml | YAML | 3,789 | 0.8 | 0.036697 | 0 | python-kit | 661 | 2024-12-22T02:53:09.161572 | GPL-3.0 | false | dfbb69473710602631629bbe3e4faf39 |
name: Build Stripe\n\non:\n push:\n paths:\n - 'ServiceStack.Stripe/**'\n - '.github/workflows/build-stripe.yml'\n\njobs:\n build-stripe:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Build\n working-directory: ServiceStack.Stripe/build\n run: dotnet build ./build.proj\n\n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n\n - name: Stripe Tests\n run: dotnet test --framework net8.0 ./ServiceStack.Stripe/tests/ServiceStack.Stripe.Tests/ServiceStack.Stripe.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: (success() || failure()) && steps.test_integration.outcome != 'skipped'\n with:\n name: Stripe Tests\n path: ./ServiceStack.Stripe/tests/ServiceStack.Stripe.Tests/TestResults/results.trx\n reporter: dotnet-trx\n only-summary: 'true'\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-stripe.yml | build-stripe.yml | YAML | 1,143 | 0.8 | 0.026316 | 0 | awesome-app | 374 | 2024-08-09T22:14:43.988874 | BSD-3-Clause | false | 8cfeb54a1d5049559e929192f3927dfe |
name: Build ServiceStack.Text\n\non:\n push:\n paths:\n - 'ServiceStack.Text/**'\n - '.github/workflows/build-text.yml'\n\njobs:\n build-text:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n \n - name: Build\n working-directory: ServiceStack.Text/build\n run: dotnet build ./build.proj\n \n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n \n - name: Text tests\n run: dotnet test --framework net8.0 ./ServiceStack.Text/tests/ServiceStack.Text.Tests/ServiceStack.Text.Tests.csproj --logger 'trx;LogFileName=test-results.trx'\n\n - name: Text Tests Report\n uses: dorny/test-reporter@v1\n if: (success() || failure()) && steps.test_integration.outcome != 'skipped'\n with:\n name: Text Tests\n path: ./ServiceStack.Text/tests/ServiceStack.Text.Tests/TestResults/test-results.trx\n reporter: dotnet-trx\n only-summary: 'true'\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\build-text.yml | build-text.yml | YAML | 1,171 | 0.8 | 0.026316 | 0 | python-kit | 399 | 2024-03-21T23:54:28.003775 | Apache-2.0 | false | 609b3025a460a83215a0597e7a8c3cf8 |
name: A Feedz Publish\non: workflow_dispatch\n\nenv:\n FEED_URL: https://f.feedz.io/servicestack/pre-release/nuget/index.json\n\njobs:\n build:\n runs-on: ubuntu-latest\n steps:\n # Checkout the repo\n - uses: actions/checkout@v4\n\n # Setup .NET SDK\n - name: Setup .NET\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.*'\n\n - name: Download artifact\n uses: dawidd6/action-download-artifact@v2\n with:\n workflow: pre-release-pack.yml\n # workflow_conclusion: success\n name: ServiceStack Packages\n commit: ${{ github.sha }}\n path: ./build/staging\n \n - name: Check output\n working-directory: ./build\n run: |\n cd staging\n export number_of_packages=$(ls -1 | wc -l)\n echo "number_of_packages=${number_of_packages}" >> $GITHUB_ENV\n dotnet nuget add source ${{ env.FEED_URL }} -n pre-release -u pre-release -p ${{ secrets.FEEDZ_TOKEN }} --store-password-in-clear-text\n\n - name: Clear packages\n working-directory: ./build/staging\n shell: bash\n run: |\n ls -1 | grep .nupkg$ | sed -E 's/(.*)\.([0-9]+\.[0-9]+\.[0-9]+).nupkg$/\1 \2/' | while read line\n do \n dotnet nuget delete ${line} --source ${{ env.FEED_URL }} --api-key ${{ secrets.FEEDZ_TOKEN }} --non-interactive || true\n done\n\n - name: Push to GitHub\n working-directory: ./build/staging\n run: |\n # Check if more than 73 packages \n if [[ ${number_of_packages} -gt 73 ]]; then\n echo "Publishing to Azure Artifacts"\n dotnet nuget push '*.nupkg' --source ${{ env.FEED_URL }} --api-key ${{ secrets.FEEDZ_TOKEN }} --skip-duplicate\n else\n echo 'Less files than expected, skipping push'\n exit 1\n fi\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\feedz-push.yml | feedz-push.yml | YAML | 1,875 | 0.8 | 0.053571 | 0.081633 | react-lib | 435 | 2025-02-26T03:42:07.495298 | BSD-3-Clause | false | d2f733f5e86607530be4b66935e92472 |
name: A GitHub Publish\npermissions:\n packages: write\n contents: write\n\non: workflow_dispatch\n\njobs:\n github-push:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Download artifact\n uses: dawidd6/action-download-artifact@v2\n with:\n workflow: pre-release-pack.yml\n # workflow_conclusion: success\n name: ServiceStack Packages\n commit: ${{ github.sha }}\n path: ./build/staging\n \n - name: Check output\n working-directory: ./build\n run: |\n cd staging\n export number_of_packages=$(ls -1 | wc -l)\n echo "number_of_packages=${number_of_packages}" >> $GITHUB_ENV\n \n - name: Clear packages\n working-directory: ./build/staging\n shell: bash\n run: |\n echo ${{ secrets.GITHUB_TOKEN }} | gh auth login --with-token\n cp ../clear-github-packages.sh .\n chmod +x ./clear-github-packages.sh\n ./clear-github-packages.sh\n \n - name: Push to GitHub\n working-directory: ./build/staging\n run: |\n # Check if more than 73 packages \n if [[ ${number_of_packages} -gt 73 ]]; then\n echo "Pushing to GitHub Packages"\n dotnet nuget push '*.nupkg' --source https://nuget.pkg.github.com/ServiceStack/index.json --api-key ${{ secrets.GITHUB_TOKEN }}\n else\n echo 'Less files than expected, skipping push'\n exit 1\n fi\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\github-push.yml | github-push.yml | YAML | 1,639 | 0.8 | 0.037037 | 0.042553 | vue-tools | 420 | 2025-01-07T13:41:54.879950 | Apache-2.0 | false | cf7474c412f06783adb6702dcfc0822b |
name: Integration Tests OrmLite Community Providers\n\non: workflow_dispatch\n\njobs:\n integration-ormlite-community:\n runs-on: ubuntu-latest\n services:\n redis:\n image: redis\n options: >-\n --health-cmd "redis-cli ping"\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n ports:\n - 6379:6379\n firebird:\n image: jacobalberty/firebird:v4.0.0\n env:\n ISC_PASSWORD: Test!tesT\n FIREBIRD_DATABASE: test.gdb\n ports:\n - 48101:3050\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Build\n working-directory: ServiceStack.OrmLite/build\n run: dotnet build ./build.proj\n\n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n \n - name: Tests Firebird Setup\n working-directory: ServiceStack.OrmLite/tests\n if: success() || failure()\n env:\n ORMLITE_DIALECT: Firebird\n FIREBIRD_CONNECTION: User=test;Password=Test!tesT;Database=/firebird/data/test.gdb;DataSource=localhost;Port=48101;Dialect=3;charset=ISO8859_1;MinPoolSize=0;MaxPoolSize=100;\n EnableLegacyClientAuth: true\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests.Setup/ServiceStack.OrmLite.Tests.Setup.csproj\n\n - name: Test Firebird OrmLite\n working-directory: ServiceStack.OrmLite/tests\n if: success() || failure()\n env:\n ORMLITE_DIALECT: Firebird\n FIREBIRD_CONNECTION: User=test;Password=Test!tesT;Database=/firebird/data/test.gdb;DataSource=localhost;Port=48101;Dialect=3;charset=ISO8859_1;MinPoolSize=0;MaxPoolSize=100;\n EnableLegacyClientAuth: true\n run: |\n dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests/ServiceStack.OrmLite.Tests.csproj --logger 'trx;LogFileName=results.trx' --filter "FullyQualifiedName=ServiceStack.OrmLite.Tests.OrderByTests"\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: OrmLite Firebird Tests\n only-summary: 'true'\n path: ./ServiceStack.OrmLite/tests/ServiceStack.OrmLite.Tests/TestResults/results.trx\n reporter: dotnet-trx\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\integration-ormlite-community-providers.yml | integration-ormlite-community-providers.yml | YAML | 2,446 | 0.7 | 0.044776 | 0 | awesome-app | 321 | 2024-01-02T19:08:08.023744 | BSD-3-Clause | false | 2a295b919e5de4e893a340fac591ae34 |
name: Integration Tests OrmLite \n\non: workflow_dispatch\n\njobs:\n integration-ormlite:\n runs-on: ubuntu-latest\n services:\n redis:\n image: redis\n options: >-\n --health-cmd "redis-cli ping"\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n ports:\n - 6379:6379\n postgres:\n image: postgres\n env:\n POSTGRES_USER: postgres\n POSTGRES_PASSWORD: test\n POSTGRES_DB: test\n ports:\n - 48303:5432\n options: >-\n --health-cmd pg_isready\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n sqlserver:\n image: mcr.microsoft.com/mssql/server:2017-latest\n env:\n ACCEPT_EULA: Y\n SA_PASSWORD: Test!tesT\n MSSQL_PID: Developer\n ports:\n - 48501:1433\n mysql:\n image: mysql:8.0.28\n env:\n MYSQL_ROOT_PASSWORD: Test!tesT\n MYSQL_DATABASE: test\n ports:\n - 48205:3306\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Build\n working-directory: ServiceStack.OrmLite/build\n run: dotnet build ./build.proj\n\n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n\n - name: Test Sqlite\n working-directory: ServiceStack.OrmLite/tests\n env:\n ORMLITE_DIALECT: Sqlite\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests/ServiceStack.OrmLite.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: OrmLite Sqlite Tests\n only-summary: 'true'\n path: ./ServiceStack.OrmLite/tests/ServiceStack.OrmLite.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n - name: Tests Postgres Setup\n working-directory: ServiceStack.OrmLite/tests\n if: success() || failure()\n env:\n ORMLITE_DIALECT: PostgreSql11\n PGSQL_CONNECTION: Server=localhost;Port=48303;User Id=postgres;Password=test;Database=test;Pooling=true;MinPoolSize=0;MaxPoolSize=200\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests.Setup/ServiceStack.OrmLite.Tests.Setup.csproj\n\n - name: Test Postgres OrmLite\n working-directory: ServiceStack.OrmLite/tests\n if: success() || failure()\n env:\n ORMLITE_DIALECT: PostgreSql11\n PGSQL_CONNECTION: Server=localhost;Port=48303;User Id=postgres;Password=test;Database=test;Pooling=true;MinPoolSize=0;MaxPoolSize=200\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests/ServiceStack.OrmLite.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: OrmLite PG Tests\n only-summary: 'true'\n path: ./ServiceStack.OrmLite/tests/ServiceStack.OrmLite.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n - name: Tests SQL Server Setup\n working-directory: ServiceStack.OrmLite/tests\n if: success() || failure()\n env:\n ORMLITE_DIALECT: SqlServer2017\n MSSQL_CONNECTION: Server=localhost,48501;Database=master;User Id=sa;Password=Test!tesT;MultipleActiveResultSets=True;\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests.Setup/ServiceStack.OrmLite.Tests.Setup.csproj\n\n\n - name: Test SQL Server OrmLite\n working-directory: ServiceStack.OrmLite/tests\n if: success() || failure()\n env:\n ORMLITE_DIALECT: SqlServer2017\n MSSQL_CONNECTION: Server=localhost,48501;Database=master;User Id=sa;Password=Test!tesT;MultipleActiveResultSets=True;\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests/ServiceStack.OrmLite.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: OrmLite MS SQL Tests\n only-summary: 'true'\n path: ./ServiceStack.OrmLite/tests/ServiceStack.OrmLite.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n - name: Tests MySql Setup\n working-directory: ServiceStack.OrmLite/tests\n if: success() || failure()\n env:\n ORMLITE_DIALECT: MySql\n MYSQL_CONNECTION: Server=localhost;Port=48205;Database=test;UID=root;Password=Test!tesT\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests.Setup/ServiceStack.OrmLite.Tests.Setup.csproj\n\n - name: Test MySql OrmLite\n working-directory: ServiceStack.OrmLite/tests\n if: success() || failure()\n env:\n ORMLITE_DIALECT: MySql\n MYSQL_CONNECTION: Server=localhost;Port=48205;Database=test;UID=root;Password=Test!tesT\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.Tests/ServiceStack.OrmLite.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: OrmLite MySql Tests\n only-summary: 'true'\n path: ./ServiceStack.OrmLite/tests/ServiceStack.OrmLite.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\integration-ormlite.yml | integration-ormlite.yml | YAML | 5,572 | 0.7 | 0.065789 | 0 | awesome-app | 826 | 2025-06-27T08:28:35.868244 | Apache-2.0 | false | 71addce3fab91a620c1bea4ee0e5cb7c |
name: A MyGet Publish\n\non: workflow_dispatch\n\njobs:\n myget-push:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Download artifact\n uses: dawidd6/action-download-artifact@v2\n with:\n workflow: pre-release-pack.yml\n # workflow_conclusion: success\n name: ServiceStack Packages\n commit: ${{ github.sha }}\n path: ./build/staging\n \n - name: Check output\n working-directory: ./build\n run: |\n cd staging\n export number_of_packages=$(ls -1 | wc -l)\n echo "number_of_packages=${number_of_packages}" >> $GITHUB_ENV\n \n - name: Push to MyGet\n working-directory: ./build/staging\n run: |\n # Check if more than 73 packages \n if [[ ${number_of_packages} -gt 73 ]]; then\n echo "Pushing to MyGet"\n dotnet nuget push '*.nupkg' -s 'https://www.myget.org/F/servicestack/api/v2/package' -k '${{ secrets.MYGET_APIKEY }}' --skip-duplicate\n else\n echo 'Fewer files than expected, skipping push'\n exit 1\n fi\n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\myget-push.yml | myget-push.yml | YAML | 1,268 | 0.8 | 0.047619 | 0.055556 | node-utils | 679 | 2025-03-16T17:49:13.929711 | BSD-3-Clause | false | 402bb4a1edcc69fd476294f939c194d6
name: NuGet Pack\n\non: workflow_dispatch\n\njobs:\n build-test-all:\n runs-on: ubuntu-latest\n services:\n redis:\n image: redis\n ports:\n - 6379:6379\n sqlserver:\n image: mcr.microsoft.com/mssql/server:2017-latest\n env:\n ACCEPT_EULA: Y\n SA_PASSWORD: Test!tesT\n MSSQL_PID: Developer\n ports:\n - 48501:1433\n postgres:\n image: postgres\n env:\n POSTGRES_USER: postgres\n POSTGRES_PASSWORD: test\n POSTGRES_DB: test\n ports:\n - 48303:5432\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n \n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n echo "AWS_ACCESS_KEY=${{ secrets.AWS_ACCESS_KEY }}" >> $GITHUB_ENV\n echo "AWS_SECRET_KEY=${{ secrets.AWS_SECRET_KEY }}" >> $GITHUB_ENV\n\n# Individual build steps to make it easy to catch issues.\n\n - name: Build Text\n working-directory: ServiceStack.Text/build\n run: dotnet build ./build.proj\n\n - name: Text tests\n run: dotnet test --framework net8.0 ./ServiceStack.Text/tests/ServiceStack.Text.Tests/ServiceStack.Text.Tests.csproj --logger 'trx;LogFileName=test-results.trx'\n\n - name: Text Tests Report\n uses: dorny/test-reporter@v1\n if: (success() || failure()) && steps.test_integration.outcome != 'skipped'\n with:\n name: Text Tests\n path: ./ServiceStack.Text/tests/ServiceStack.Text.Tests/TestResults/test-results.trx\n reporter: dotnet-trx\n only-summary: 'true'\n \n - name: Build\n working-directory: ServiceStack.Redis/build\n run: dotnet build ./build.proj\n\n - name: Env setup\n run: |\n echo "SERVICESTACK_LICENSE=${{ secrets.SERVICESTACK_LICENSE }}" >> $GITHUB_ENV\n echo "CI_RABBITMQ=gistlyn.com:45672" >> $GITHUB_ENV\n echo "CI_DYNAMODB=http://gistlyn.com:48000" >> $GITHUB_ENV\n\n - name: Redis Test Without Integration\n working-directory: ServiceStack.Redis/tests\n run: dotnet test --framework net8.0 ./ServiceStack.Redis.Tests/ServiceStack.Redis.Tests.csproj --filter 
TestCategory\!=Integration --logger 'trx;LogFileName=non-integration-results.trx'\n\n - name: Redis Non-Integration Tests Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: Redis Non-Integration Tests\n path: ./ServiceStack.Redis/tests/ServiceStack.Redis.Tests/TestResults/non-integration-results.trx\n reporter: dotnet-trx\n only-summary: 'true'\n\n\n - name: Redis Test With Integration\n id: test_integration\n working-directory: ServiceStack.Redis/tests\n run: dotnet test --framework net8.0 ./ServiceStack.Redis.Tests/ServiceStack.Redis.Tests.csproj --filter TestCategory=Integration --logger 'trx;LogFileName=integration-results.trx'\n\n - name: Redis Integration Tests Report\n uses: dorny/test-reporter@v1\n if: (success() || failure()) && steps.test_integration.outcome != 'skipped'\n with:\n name: Redis Integration Tests\n path: ./ServiceStack.Redis/tests/ServiceStack.Redis.Tests/TestResults/integration-results.trx\n reporter: dotnet-trx\n only-summary: 'true'\n \n - name: Build OrmLite\n working-directory: ServiceStack.OrmLite/build\n run: dotnet build ./build.proj \n \n - name: Test Sqlite\n working-directory: ServiceStack.OrmLite/tests\n env:\n ORMLITE_DIALECT: Sqlite\n run: dotnet test --framework net8.0 ./ServiceStack.OrmLite.SqliteTests/ServiceStack.OrmLite.SqliteTests.csproj --logger 'trx;LogFileName=test-results.trx'\n\n - name: Test Sqlite Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: OrmLite Sqlite Tests\n only-summary: 'true'\n path: ./ServiceStack.OrmLite/tests/ServiceStack.OrmLite.SqliteTests/TestResults/test-results.trx\n reporter: dotnet-trx\n\n - name: Build ServiceStack\n working-directory: ServiceStack/build\n run: dotnet build ./build.proj\n\n - name: Test ServiceStack.Common\n run: dotnet test --framework net8.0 ./ServiceStack/tests/ServiceStack.Common.Tests/ServiceStack.Common.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: 
dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: ServiceStack.Common Tests\n only-summary: 'true'\n path: ./ServiceStack/tests/ServiceStack.Common.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n - name: Test ServiceStack.ServiceModels\n run: dotnet test --framework net8.0 ./ServiceStack/tests/ServiceStack.ServiceModel.Tests/ServiceStack.ServiceModel.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: ServiceModels Tests\n only-summary: 'true'\n path: ./ServiceStack/tests/ServiceStack.ServiceModel.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n - name: Test ServiceStack.WebHost Endpoints\n env:\n PGSQL_CONNECTION: Server=localhost;Port=48303;User Id=postgres;Password=test;Database=test;Pooling=true;MinPoolSize=0;MaxPoolSize=200\n MSSQL_CONNECTION: Server=localhost,48501;Database=master;User Id=sa;Password=Test!tesT;MultipleActiveResultSets=True;Encrypt=false;TrustServerCertificate=true;\n run: dotnet test --framework net8.0 ./ServiceStack/tests/ServiceStack.WebHost.Endpoints.Tests/ServiceStack.WebHost.Endpoints.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: success() || failure()\n with:\n name: ServiceStack.WebHost.Endpoints Tests\n only-summary: 'true'\n path: ./ServiceStack/tests/ServiceStack.WebHost.Endpoints.Tests/TestResults/results.trx\n reporter: dotnet-trx\n\n - name: Build Aws\n working-directory: ServiceStack.Aws/build\n run: dotnet build ./build.proj\n\n - name: Aws Tests\n run: dotnet test --framework net8.0 ./ServiceStack.Aws/tests/ServiceStack.Aws.Tests/ServiceStack.Aws.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: (success() || failure()) && steps.test_integration.outcome != 'skipped'\n with:\n name: Aws Tests\n path: 
./ServiceStack.Aws/tests/ServiceStack.Aws.Tests/TestResults/results.trx\n reporter: dotnet-trx\n only-summary: 'true'\n\n - name: Build Azure\n working-directory: ServiceStack.Azure/build\n run: dotnet build ./build.proj\n \n - name: Build Blazor\n working-directory: ServiceStack.Blazor/build\n run: dotnet build ./build.proj\n\n - name: Build Logging\n working-directory: ServiceStack.Logging/build\n run: dotnet build ./build.proj\n \n - name: Build Core\n working-directory: ServiceStack.Core/build\n run: dotnet build ./build.proj\n \n - name: Build Stripe\n working-directory: ServiceStack.Stripe/build\n run: dotnet build ./build.proj\n\n - name: Stripe Tests\n run: dotnet test --framework net8.0 ./ServiceStack.Stripe/tests/ServiceStack.Stripe.Tests/ServiceStack.Stripe.Tests.csproj --logger 'trx;LogFileName=results.trx'\n\n - name: Test Report\n uses: dorny/test-reporter@v1\n if: (success() || failure()) && steps.test_integration.outcome != 'skipped'\n with:\n name: Stripe Tests\n path: ./ServiceStack.Stripe/tests/ServiceStack.Stripe.Tests/TestResults/results.trx\n reporter: dotnet-trx\n only-summary: 'true'\n \n nuget-pack:\n needs: build-test-all\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup .NET\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.*'\n \n - name: Rebuild All\n working-directory: ./build\n run: |\n chmod +x ./build-all.sh\n chmod +x ../ServiceStack/build/build.sh\n chmod +x ../ServiceStack.Aws/build/build.sh\n chmod +x ../ServiceStack.Azure/build/build.sh\n chmod +x ../ServiceStack.Blazor/build/build.sh\n chmod +x ../ServiceStack.Logging/build/build.sh\n chmod +x ../ServiceStack.OrmLite/build/build.sh\n chmod +x ../ServiceStack.Redis/build/build.sh\n chmod +x ../ServiceStack.Stripe/build/build.sh\n chmod +x ../ServiceStack.Text/build/build.sh\n chmod +x ../ServiceStack.Core/build/build.sh\n ./build-all.sh\n\n - name: Stage output\n working-directory: ./build\n run: |\n chmod +x ./stage-output.sh\n 
./stage-output.sh\n cd staging\n export number_of_packages=$(ls -1 | wc -l)\n echo "number_of_packages=${number_of_packages}" >> $GITHUB_ENV\n\n - name: Check number of packages\n if: env.number_of_packages < 73\n run: |\n echo "Fewer packages produced than expected, failing."\n exit 1\n\n - uses: actions/upload-artifact@v4\n with:\n name: ServiceStack Packages\n retention-days: 1\n path: ./build/staging/*.nupkg | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\nuget-pack.yml | nuget-pack.yml | YAML | 9,569 | 0.8 | 0.043478 | 0.004673 | vue-tools | 879 | 2025-04-13T01:39:51.364655 | GPL-3.0 | false | 18b28836d9dc87cc90c29858a5a0bf4b
name: NuGet Publish\n\non: workflow_dispatch\n\njobs:\n nuget-push:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Download artifact\n uses: dawidd6/action-download-artifact@v2\n with:\n workflow: nuget-pack.yml\n # workflow_conclusion: success\n name: ServiceStack Packages\n commit: ${{ github.sha }}\n path: ./build/staging\n\n - name: Check output\n working-directory: ./build\n run: |\n cd staging\n export number_of_packages=$(ls -1 | wc -l)\n echo "number_of_packages=${number_of_packages}" >> $GITHUB_ENV\n\n - name: Push to NuGet\n working-directory: ./build/staging\n run: |\n # Check if more than 73 packages \n if [[ ${number_of_packages} -gt 73 ]]; then\n echo "Pushing to NuGet"\n dotnet nuget push '*.nupkg' -s 'https://api.nuget.org/v3/index.json' -k '${{ secrets.NUGET_APIKEY }}' --skip-duplicate\n else\n echo 'Fewer files than expected, skipping push'\n exit 1\n fi\n \n \n | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\nuget-push.yml | nuget-push.yml | YAML | 1,249 | 0.8 | 0.045455 | 0.055556 | python-kit | 420 | 2023-10-04T23:17:41.003441 | BSD-3-Clause | false | c1915b43f7f0f4056664af93db5a89c2
name: A Pre Release Pack\n\non: workflow_dispatch\n\npermissions:\n contents: read\n\njobs:\n pre-release-pack:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Setup dotnet\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: '8.0.100'\n \n\n - name: Build All\n working-directory: ./build\n run: |\n chmod +x ./build-all.sh\n chmod +x ../ServiceStack/build/build.sh\n chmod +x ../ServiceStack.Aws/build/build.sh\n chmod +x ../ServiceStack.Azure/build/build.sh\n chmod +x ../ServiceStack.Blazor/build/build.sh\n chmod +x ../ServiceStack.Logging/build/build.sh\n chmod +x ../ServiceStack.OrmLite/build/build.sh\n chmod +x ../ServiceStack.Redis/build/build.sh\n chmod +x ../ServiceStack.Stripe/build/build.sh\n chmod +x ../ServiceStack.Text/build/build.sh\n chmod +x ../ServiceStack.Core/build/build.sh\n ./build-all.sh\n\n - name: Stage output\n working-directory: ./build\n run: |\n chmod +x ./stage-output.sh\n ./stage-output.sh\n cd staging\n export number_of_packages=$(ls -1 | wc -l)\n echo "number_of_packages=${number_of_packages}" >> $GITHUB_ENV\n\n\n - name: Check number of packages\n if: env.number_of_packages < 73\n run: |\n echo "Fewer packages produced (${{ env.number_of_packages }}) than expected (>=73), failing."\n ls -1 ./build/staging/*.nupkg\n exit 1\n\n - uses: actions/upload-artifact@v4\n with:\n name: ServiceStack Packages\n retention-days: 1\n path: ./build/staging/*.nupkg | dataset_sample\yaml\ServiceStack_ServiceStack\.github\workflows\pre-release-pack.yml | pre-release-pack.yml | YAML | 1,704 | 0.8 | 0.018182 | 0 | vue-tools | 755 | 2024-05-12T18:26:59.974702 | Apache-2.0 | false | 7a42228794efcbafa0ff25b3a253151c
version: "3.9"\nservices:\n redis:\n image: redis\n ports:\n - 6379:6379\n postgres:\n image: postgres\n environment:\n POSTGRES_USER: test\n POSTGRES_PASSWORD: test\n POSTGRES_DB: test\n ports:\n - 48303:5432\n sqlserver:\n image: mcr.microsoft.com/mssql/server:2017-latest\n environment:\n ACCEPT_EULA: Y\n SA_PASSWORD: Test!tesT\n MSSQL_PID: Developer\n ports:\n - 48501:1433\n mysql:\n image: mysql:8.0.28\n environment:\n MYSQL_ROOT_PASSWORD: Test!tesT\n MYSQL_DATABASE: test\n MYSQL_ROOT_HOST: '%'\n MYSQL_AUTH_PLUGIN: mysql_native_password\n ports:\n - 48205:3306\n healthcheck:\n test: ["CMD", "mysqladmin", "ping", "-h", "localhost", "-u", "root", "--password=Test!tesT"]\n interval: 5s\n retries: 20\n# firebird:\n# image: jacobalberty/firebird:v4.0.0\n# environment:\n# ISC_PASSWORD: Test!tesT\n# FIREBIRD_USER: test\n# FIREBIRD_PASSWORD: Test!tesT\n# FIREBIRD_DATABASE: test.gdb\n# EnableLegacyClientAuth: true\n# ports:\n# - 48101:3050\n\n test-run-mysql:\n image: mcr.microsoft.com/dotnet/sdk:6.0\n depends_on:\n mysql:\n condition: service_healthy\n environment:\n ORMLITE_DIALECT: MySql\n MYSQL_CONNECTION: Server=mysql;Port=3306;Database=test;UID=root;Password=Test!tesT\n SERVICESTACK_LICENSE: \n volumes:\n - ../../:/servicestack\n working_dir: /servicestack/ServiceStack.OrmLite/tests\n command: ["dotnet", "test", "--framework", "net6.0", "./ServiceStack.OrmLite.Tests/ServiceStack.OrmLite.Tests.csproj"]\n \n test-run-postgres:\n image: mcr.microsoft.com/dotnet/sdk:6.0\n depends_on:\n postgres:\n condition: service_healthy\n environment:\n ORMLITE_DIALECT: PostgreSQL\n POSTGRES_CONNECTION: Host=postgres;Port=5432;Database=test;User ID=test;Password=test;\n SERVICESTACK_LICENSE: \n volumes:\n - ../../:/servicestack\n working_dir: /servicestack/ServiceStack.OrmLite/tests\n command: ["dotnet", "test", "--framework", "net6.0", "./ServiceStack.OrmLite.Tests/ServiceStack.OrmLite.Tests.csproj"]\n\n test-run-sqlserver:\n image: mcr.microsoft.com/dotnet/sdk:6.0\n 
depends_on:\n sqlserver:\n condition: service_healthy\n environment:\n ORMLITE_DIALECT: SqlServer\n SQLSERVER_CONNECTION: Server=sqlserver,1433;Database=test;User ID=SA;Password=Test!tesT;\n SERVICESTACK_LICENSE: \n volumes:\n - ../../:/servicestack\n working_dir: /servicestack/ServiceStack.OrmLite/tests\n command: ["dotnet", "test", "--framework", "net6.0", "./ServiceStack.OrmLite.Tests/ServiceStack.OrmLite.Tests.csproj"]\n \n \n | dataset_sample\yaml\ServiceStack_ServiceStack\ServiceStack.OrmLite\build\docker-compose.yml | docker-compose.yml | YAML | 2,659 | 0.8 | 0 | 0.119048 | python-kit | 748 | 2023-08-28T08:37:27.036925 | Apache-2.0 | false | 2703d2a1525296e571564e5e27127d69 |
# this file creates docker instances of all the db engines required for running integration tests \n# before running tests, from a command line in this directory type `docker-compose up -d` to start database engines\n# to remove all db engines after testing, type `docker-compose down`\n\nversion: '3'\n\nservices:\n\n firebird3:\n image: "jacobalberty/firebird:3.0.4"\n networks:\n - servicestack_test\n ports: \n - "48101:3050"\n environment: \n FIREBIRD_DATABASE: "test.gdb"\n FIREBIRD_USER: "test"\n ISC_PASSWORD: "masterkey"\n EnableWireCrypt: "true"\n\n mysql5.5:\n image: mariadb:5.5\n networks:\n - servicestack_test\n ports:\n - "48201:3306"\n environment:\n MYSQL_DATABASE: "test"\n MYSQL_USER: "test"\n MYSQL_ROOT_PASSWORD: "test"\n\n mysql10.1:\n image: mariadb:10.1\n networks:\n - servicestack_test\n ports:\n - "48202:3306"\n environment:\n MYSQL_DATABASE: "test"\n MYSQL_USER: "test"\n MYSQL_ROOT_PASSWORD: "test"\n\n mysql10.2:\n image: mariadb:10.2\n networks:\n - servicestack_test\n ports:\n - "48203:3306"\n environment:\n MYSQL_DATABASE: "test"\n MYSQL_USER: "test"\n MYSQL_ROOT_PASSWORD: "test"\n\n mysql10.3:\n image: mariadb:10.3\n networks:\n - servicestack_test\n ports:\n - "48204:3306"\n environment:\n MYSQL_DATABASE: "test"\n MYSQL_USER: "test"\n MYSQL_ROOT_PASSWORD: "test"\n\n mysql10.4:\n image: mariadb:10.4\n networks:\n - servicestack_test\n ports:\n - "48205:3306"\n environment:\n MYSQL_DATABASE: "test"\n MYSQL_USER: "test"\n MYSQL_ROOT_PASSWORD: "test"\n\n postgres9:\n image: postgres:9-alpine\n networks:\n - servicestack_test\n ports:\n - "48301:5432"\n environment:\n POSTGRES_USER: "test"\n POSTGRES_DB: "test"\n POSTGRES_PASSWORD: "test"\n\n postgres10:\n image: postgres:10-alpine\n networks:\n - servicestack_test\n ports:\n - "48302:5432"\n environment:\n POSTGRES_USER: "test"\n POSTGRES_DB: "test"\n POSTGRES_PASSWORD: "test"\n\n postgres11:\n image: postgres:11-alpine\n networks:\n - servicestack_test\n ports:\n - "48303:5432"\n environment:\n 
POSTGRES_USER: "test"\n POSTGRES_DB: "test"\n POSTGRES_PASSWORD: "test" \n\n oracle11:\n image: epiclabs/docker-oracle-xe-11g\n networks:\n - servicestack_test\n ports:\n - "48401:1521"\n environment:\n ORACLE_ALLOW_REMOTE: "true"\n ORACLE_PASSWORD : "test"\n \n MSSqlServer2019:\n image: mcr.microsoft.com/mssql/server:2019-latest\n networks:\n - servicestack_test\n ports:\n - "48501:1433"\n environment:\n MSSQL_PID: "Express"\n ACCEPT_EULA: "Y"\n SA_PASSWORD : "Test!tesT"\n \nnetworks:\n servicestack_test: | dataset_sample\yaml\ServiceStack_ServiceStack\ServiceStack.OrmLite\src\Docker\docker-compose.yml | docker-compose.yml | YAML | 2,760 | 0.95 | 0.007692 | 0.025641 | python-kit | 49 | 2024-12-14T09:07:32.003980 | BSD-3-Clause | false | c27fd0ced44c106c856ab9046c520c83 |
# this file creates docker instances of all the db engines required for running integration tests\n# before running tests, from a command line in this directory type `docker-compose up -d` to start database engines\n# to remove all db engines after testing, type `docker-compose down`\n\nversion: '3'\n\nservices:\n\n firebird3:\n image: "jacobalberty/firebird:3.0.4"\n ports:\n - "48101:3050"\n environment:\n FIREBIRD_DATABASE: "test.gdb"\n FIREBIRD_USER: "test"\n ISC_PASSWORD: "masterkey"\n EnableWireCrypt: "true"\n\n mysql10.4:\n image: mariadb:10.4\n ports:\n - "48205:3306"\n environment:\n MYSQL_DATABASE: "test"\n MYSQL_USER: "test"\n MYSQL_ROOT_PASSWORD: "test"\n\n postgres11:\n image: postgres:11-alpine\n ports:\n - "48303:5432"\n environment:\n POSTGRES_USER: "test"\n POSTGRES_DB: "test"\n POSTGRES_PASSWORD: "test"\n\n oracle11:\n image: epiclabs/docker-oracle-xe-11g\n ports:\n - "48401:1521"\n environment:\n ORACLE_ALLOW_REMOTE: "true"\n ORACLE_PASSWORD : "test"\n\n MSSqlServer2017:\n image: mcr.microsoft.com/mssql/server:2017-latest-ubuntu\n ports:\n - "48501:1433"\n environment:\n MSSQL_PID: "Express"\n ACCEPT_EULA: "Y"\n SA_PASSWORD : "Test!tesT"\n\n redis:\n image: redis\n ports:\n - "6379:6379"\n volumes:\n - "./data:/data"\n command: redis-server --appendonly yes\n\n dynamodb:\n image: amazon/dynamodb-local\n ports:\n - "48000:8000"\n\n rabbitmq:\n image: rabbitmq:3.7.5-management\n hostname: app-rabbitmq\n ports:\n - 45672:5672\n - 41672:15672\n volumes:\n - ./data/rabbitmq:/var/lib/rabbitmq/mnesia/rabbit@app-rabbitmq:cached\n environment:\n RABBITMQ_ERLANG_COOKIE: 6085e2412b6fa88647466c6a81c0cea0\n RABBITMQ_DEFAULT_USER: guest\n RABBITMQ_DEFAULT_PASS: guest\n RABBITMQ_DEFAULT_VHOST: /\n\n server:\n image: jetbrains/teamcity-server\n ports:\n - "2080:8111"\n volumes:\n - /root/docker/teamcity/server/data:/data/teamcity_server/datadir\n - /root/docker/teamcity/server/logs:/opt/teamcity/logs\n\n ci-agent:\n image: servicestack/ci-agent\n volumes:\n - 
/root/docker/build/ci-agent/conf:/data/teamcity_agent/conf\n environment:\n SERVER_URL: https://build.servicestack.net\n\n\nnetworks:\n default:\n driver: bridge | dataset_sample\yaml\ServiceStack_ServiceStack\ServiceStack.OrmLite\src\Docker\servicestack-ci\docker-compose.yml | docker-compose.yml | YAML | 2,337 | 0.95 | 0.010204 | 0.035294 | node-utils | 524 | 2025-07-01T02:19:59.028381 | BSD-3-Clause | false | c2aeb9f6cb5be3bf9e4d74ef4c43aae3 |
name: Build ShareX\n\non:\n push:\n branches:\n - "**"\n tags:\n - "v[0-9]+.[0-9]+.[0-9]+"\n paths-ignore:\n - "**/*.md"\n - "**/*.gitignore"\n - "**/*.gitattributes"\n\npermissions:\n contents: read\n\njobs:\n build:\n name: Build\n runs-on: windows-latest\n\n strategy:\n fail-fast: false\n matrix:\n configuration:\n - Release\n - Debug\n - Steam\n - MicrosoftStore\n - MicrosoftStoreDebug\n platform:\n - Any CPU\n\n env:\n SOLUTION_FILE_PATH: ShareX.sln\n ASSEMBLY_INFO_PATH: Directory.build.props\n\n outputs:\n APP_VERSION: ${{ env.APP_VERSION }}\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Add msbuild to PATH\n uses: microsoft/setup-msbuild@v2\n\n - name: Set APP_VERSION\n run: |\n $content = Get-Content "${{ env.ASSEMBLY_INFO_PATH }}" -Raw\n $pattern = '<Version>([0-9]+(?:\.[0-9]+){1,3})</Version>'\n $match = [regex]::Match($content, $pattern)\n $version = $match.Groups[1].Value\n if ($env:GITHUB_REF -eq "refs/heads/develop") {\n $version = "$version.$env:GITHUB_RUN_NUMBER"\n $content = [regex]::Replace($content, $pattern, "<Version>$version</Version>")\n Set-Content -Path "${{ env.ASSEMBLY_INFO_PATH }}" -Value "$content" -NoNewline\n }\n echo $version\n echo "APP_VERSION=$version" >> $env:GITHUB_ENV\n\n - name: Download API keys\n env:\n API_KEYS: ${{ secrets.API_KEYS }}\n if: env.API_KEYS != ''\n working-directory: ${{ env.GITHUB_WORKSPACE }}\n run: |\n Invoke-WebRequest -Uri "$env:API_KEYS" -OutFile "ShareX.UploadersLib\APIKeys\APIKeysLocal.cs"\n\n - name: Restore NuGet packages\n working-directory: ${{ env.GITHUB_WORKSPACE }}\n run: nuget restore "${{ env.SOLUTION_FILE_PATH }}" -Project2ProjectTimeOut 300\n\n - name: Build\n working-directory: ${{ env.GITHUB_WORKSPACE }}\n run: msbuild -m -p:Configuration="${{ matrix.configuration }}" -p:Platform="${{ matrix.platform }}" "${{ env.SOLUTION_FILE_PATH }}"\n\n - name: Setup\n working-directory: ${{ env.GITHUB_WORKSPACE }}\n run: |\n & "ShareX.Setup\bin\${{ matrix.configuration 
}}\ShareX.Setup.exe" -silent -job "${{ matrix.configuration }}"\n\n - name: Upload artifact (Setup)\n if: matrix.configuration == 'Release'\n uses: actions/upload-artifact@v4\n with:\n name: Setup\n path: Output\ShareX-${{ env.APP_VERSION }}-setup.exe\n\n - name: Upload artifact (Portable)\n if: matrix.configuration == 'Release'\n uses: actions/upload-artifact@v4\n with:\n name: Portable\n path: Output\ShareX-${{ env.APP_VERSION }}-portable.zip\n\n - name: Upload artifact (Debug)\n if: matrix.configuration == 'Debug'\n uses: actions/upload-artifact@v4\n with:\n name: Debug\n path: Output\ShareX-${{ env.APP_VERSION }}-debug.zip\n\n - name: Upload artifact (Steam)\n if: matrix.configuration == 'Steam'\n uses: actions/upload-artifact@v4\n with:\n name: Steam\n path: Output\ShareX-${{ env.APP_VERSION }}-Steam.zip\n\n - name: Upload artifact (MicrosoftStore)\n if: matrix.configuration == 'MicrosoftStore'\n uses: actions/upload-artifact@v4\n with:\n name: MicrosoftStore\n path: Output\ShareX-${{ env.APP_VERSION }}.appx\n\n - name: Upload artifact (MicrosoftStoreDebug)\n if: matrix.configuration == 'MicrosoftStoreDebug'\n uses: actions/upload-artifact@v4\n with:\n name: MicrosoftStoreDebug\n path: Output\ShareX-${{ env.APP_VERSION }}-debug.appx\n\n release:\n name: Release\n needs: build\n if: github.ref == 'refs/heads/develop' || startsWith(github.ref, 'refs/tags/v')\n runs-on: windows-latest\n\n permissions:\n contents: write\n\n env:\n REPO_DEV_BUILDS: ${{ github.repository_owner }}/DevBuilds\n RELEASE_BODY_PATH: RELEASE_BODY.md\n APP_VERSION: ${{ needs.build.outputs.APP_VERSION }}\n\n steps:\n - name: Download artifact (Setup)\n uses: actions/download-artifact@v4\n with:\n name: Setup\n path: Output\n\n - name: Download artifact (Portable)\n uses: actions/download-artifact@v4\n with:\n name: Portable\n path: Output\n\n - name: Download artifact (Debug)\n if: github.ref == 'refs/heads/develop'\n uses: actions/download-artifact@v4\n with:\n name: Debug\n path: 
Output\n\n - name: Download artifact (Steam)\n if: github.ref == 'refs/heads/develop'\n uses: actions/download-artifact@v4\n with:\n name: Steam\n path: Output\n\n - name: Download artifact (MicrosoftStore)\n if: github.ref == 'refs/heads/develop'\n uses: actions/download-artifact@v4\n with:\n name: MicrosoftStore\n path: Output\n\n - name: Download artifact (MicrosoftStoreDebug)\n if: github.ref == 'refs/heads/develop'\n uses: actions/download-artifact@v4\n with:\n name: MicrosoftStoreDebug\n path: Output\n\n - name: Create release body file\n run: |\n $checksums = Get-ChildItem -Path "Output\" -Recurse -File\n | Sort-Object -Property Name\n | ForEach-Object { "| $($_.Name) | ``$((Get-FileHash $_.FullName -Algorithm SHA256).Hash)`` |" }\n | Out-String\n $output = "| File | SHA256 |`r`n| --- | --- |`r`n$($checksums.Trim())"\n echo $output >> $env:GITHUB_STEP_SUMMARY\n if ($env:GITHUB_REF.StartsWith("refs/tags/v")) {\n $output = "**Changelog:** https://getsharex.com/changelog#$env:GITHUB_REF_NAME`r`n`r`n$output"\n }\n echo $output\n Set-Content -Path "${{ env.RELEASE_BODY_PATH }}" -Value "$output" -NoNewline\n\n - name: Release (Dev)\n env:\n CUSTOM_GITHUB_TOKEN: ${{ secrets.CUSTOM_GITHUB_TOKEN }}\n if: env.CUSTOM_GITHUB_TOKEN != '' && env.REPO_DEV_BUILDS != '' && github.ref == 'refs/heads/develop'\n uses: softprops/action-gh-release@975c1b265e11dd76618af1c374e7981f9a6ff44a\n with:\n repository: ${{ env.REPO_DEV_BUILDS }}\n token: ${{ env.CUSTOM_GITHUB_TOKEN }}\n tag_name: v${{ env.APP_VERSION }}\n name: ShareX ${{ env.APP_VERSION }} Dev\n body_path: ${{ env.RELEASE_BODY_PATH }}\n draft: false\n prerelease: false\n files: |\n Output/ShareX-${{ env.APP_VERSION }}-setup.exe\n Output/ShareX-${{ env.APP_VERSION }}-portable.zip\n Output/ShareX-${{ env.APP_VERSION }}-debug.zip\n Output/ShareX-${{ env.APP_VERSION }}-Steam.zip\n Output/ShareX-${{ env.APP_VERSION }}.appx\n Output/ShareX-${{ env.APP_VERSION }}-debug.appx\n\n - name: Release (Stable)\n env:\n 
CUSTOM_GITHUB_TOKEN: ${{ secrets.CUSTOM_GITHUB_TOKEN }}\n if: env.CUSTOM_GITHUB_TOKEN != '' && startsWith(github.ref, 'refs/tags/v')\n uses: softprops/action-gh-release@975c1b265e11dd76618af1c374e7981f9a6ff44a\n with:\n repository: ${{ github.repository }}\n token: ${{ env.CUSTOM_GITHUB_TOKEN }}\n tag_name: ${{ github.ref_name }}\n name: ShareX ${{ env.APP_VERSION }}\n body_path: ${{ env.RELEASE_BODY_PATH }}\n draft: false\n prerelease: true\n files: |\n Output/ShareX-${{ env.APP_VERSION }}-setup.exe\n Output/ShareX-${{ env.APP_VERSION }}-portable.zip | dataset_sample\yaml\ShareX_ShareX\.github\workflows\build.yml | build.yml | YAML | 7,686 | 0.8 | 0.069869 | 0 | react-lib | 331 | 2024-02-27T07:35:47.837989 | Apache-2.0 | false | 7321b981ab1e1dc40b81348e90f7ec0f |
name: Build ShareX (PR)\n\non:\n pull_request:\n branches:\n - "**"\n\npermissions:\n contents: read\n\njobs:\n build:\n name: Build\n runs-on: windows-latest\n\n strategy:\n fail-fast: false\n matrix:\n configuration:\n - Release\n - Debug\n - Steam\n - MicrosoftStore\n - MicrosoftStoreDebug\n platform:\n - Any CPU\n\n env:\n SOLUTION_FILE_PATH: ShareX.sln\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Add msbuild to PATH\n uses: microsoft/setup-msbuild@v2\n\n - name: Restore NuGet packages\n working-directory: ${{ env.GITHUB_WORKSPACE }}\n run: nuget restore "${{ env.SOLUTION_FILE_PATH }}" -Project2ProjectTimeOut 300\n\n - name: Build\n working-directory: ${{ env.GITHUB_WORKSPACE }}\n run: msbuild -m -p:Configuration="${{ matrix.configuration }}" -p:Platform="${{ matrix.platform }}" "${{ env.SOLUTION_FILE_PATH }}" | dataset_sample\yaml\ShareX_ShareX\.github\workflows\pr.yml | pr.yml | YAML | 978 | 0.7 | 0 | 0 | vue-tools | 301 | 2024-10-24T02:17:26.148099 | Apache-2.0 | false | 0f686b7c0b001c829cc39a5b01cc1d3c |
name: Close stale issues and PRs\n\non:\n schedule:\n - cron: "30 1 * * *"\n\npermissions:\n issues: write\n pull-requests: write\n\njobs:\n stale:\n runs-on: ubuntu-latest\n\n steps:\n - uses: actions/stale@v9\n with:\n stale-issue-message: ""\n stale-pr-message: ""\n days-before-issue-stale: 360\n days-before-pr-stale: 90\n days-before-issue-close: 7\n days-before-pr-close: 7\n stale-issue-label: "Stale"\n stale-pr-label: "Stale"\n exempt-issue-labels: "Bug,Enhancement"\n exempt-pr-labels: ""\n operations-per-run: 100 | dataset_sample\yaml\ShareX_ShareX\.github\workflows\stale.yml | stale.yml | YAML | 622 | 0.7 | 0 | 0 | node-utils | 436 | 2024-07-31T11:05:52.847691 | MIT | false | ff0342a07904a3080acfa5117f209695 |
version: 2.1\n\norbs:\n win: circleci/windows@5.0\n macos: circleci/macos@2.4.1\n\nexecutors:\n docker-rust:\n docker:\n # Note: Let CI use our MSRV, rather than latest\n - image: cimg/rust:1.85.0\n resource_class: small\n machine-ubuntu:\n machine:\n image: default\n docker_layer_caching: true\n resource_class: large\n\n# sccache steps are from this guide\n# https://medium.com/@edouard.oger/rust-caching-on-circleci-using-sccache-c996344f0115\ncommands:\n restore-cargo-and-sccache:\n parameters:\n sccache-size:\n description: Max size for the cache\n type: string\n # 500 MB is the recommended soft cap for a cache on circleci.\n # Large integration tests override this default.\n default: 500M\n steps:\n - run:\n name: Install sccache\n command: |\n SCCACHE_VERSION='v0.8.2'\n ls ~/.cargo/bin/sccache \\n || curl -L https://github.com/mozilla/sccache/releases/download/$SCCACHE_VERSION/sccache-$SCCACHE_VERSION-x86_64-unknown-linux-musl.tar.gz \\n | tar -xOz sccache-$SCCACHE_VERSION-x86_64-unknown-linux-musl/sccache \\n > ~/.cargo/bin/sccache \\n && chmod +x ~/.cargo/bin/sccache\n echo 'export RUSTC_WRAPPER=~/.cargo/bin/sccache' >> $BASH_ENV\n echo 'export SCCACHE_CACHE_SIZE=<< parameters.sccache-size >>' >> $BASH_ENV\n - run:\n name: Start sccache\n command: ~/.cargo/bin/sccache --start-server\n - restore_cache:\n name: Restore sccache cache\n keys: # Find latest cache for this job on same branch, else fall back to older cache.\n - sccache-cache-{{ .Environment.CIRCLE_JOB }}-{{ .Branch }}-\n - sccache-cache-{{ .Environment.CIRCLE_JOB }}-\n - restore_cache:\n name: Restore cargo registry cache\n keys: # This is saved once per lockfile update. 
If new lockfile, get an older cache.\n - cargo-{{ checksum "Cargo.lock" }}\n - cargo-\n save-sccache:\n steps:\n - run:\n name: Sccache stats\n command: sccache --show-stats\n - run:\n name: Prune old sccache items\n # Delete files that have not been accessed in 5 days\n command: |\n du -sh ~/.cache/sccache\n find ~/.cache/sccache -atime +5 | wc -l\n find ~/.cache/sccache -atime +5 -delete\n du -sh ~/.cache/sccache\n - run:\n name: Sccache stats\n command: sccache --show-stats\n - save_cache:\n name: Save sccache cache\n # We use {{ .Branch }}-{{ .Revision }} to upload a fresh cache for each commit on a branch.\n # If a new commit is built, it will fall back on the most recent cache from the same branch.\n key: sccache-cache-{{ .Environment.CIRCLE_JOB }}-{{ .Branch }}-{{ .Revision }}\n paths:\n - ~/.cache/sccache\n # This is only performed by the workspace clippy job, since it does the largest cargo fetch.\n # Other jobs might restore a slightly older copy of this cache.\n save-cargo-cache:\n steps:\n # Discard crates.io patches so that it produces a re-usable cache key checksum.\n - run:\n name: Restore Cargo.lock\n command: git restore Cargo.lock\n - save_cache:\n name: Save cargo cache\n # This is saved once per lockfile update.\n key: cargo-{{ checksum "Cargo.lock" }}\n paths:\n - ~/.cargo/registry/cache\n - ~/.cargo/registry/index\n - ~/.cargo/git/db\n apply-patches:\n steps:\n - run:\n name: Patch local crates\n command: ./scripts/apply-patches.sh\n install-rust:\n steps:\n - run:\n name: Install Rust (MSRV)\n # Note: Let CI use our MSRV, rather than latest\n command: which cargo || curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --default-toolchain 1.85.0\n install-cargo-make:\n steps:\n - run:\n name: Install cargo-make\n environment:\n VERSION: "0.37.23"\n command: |\n curl -OL https://github.com/sagiegurari/cargo-make/releases/download/$VERSION/cargo-make-v$VERSION-x86_64-unknown-linux-musl.zip && \\n unzip 
cargo-make-v$VERSION-x86_64-unknown-linux-musl.zip && \\n mv cargo-make-v$VERSION-x86_64-unknown-linux-musl/cargo-make ~/.cargo/bin && \\n rm -rf cargo-make-v$VERSION-x86_64-unknown-linux-musl*\n install-cargo-audit:\n steps:\n - run:\n name: Install cargo-audit\n environment:\n VERSION: "v0.21.0"\n command: |\n curl -L https://github.com/rustsec/rustsec/releases/download/cargo-audit/$VERSION/cargo-audit-x86_64-unknown-linux-musl-$VERSION.tgz \\n | tar -xOz cargo-audit-x86_64-unknown-linux-musl-$VERSION/cargo-audit \\n > ~/.cargo/bin/cargo-audit \\n && chmod +x ~/.cargo/bin/cargo-audit\n set-git-tag:\n steps:\n - run:\n name: Set git tag in BASH_ENV\n command: echo TAG=$(git describe --tags --abbrev=0) >> $BASH_ENV\n make-artifact:\n parameters:\n target:\n description: "Rust target to put in artifact"\n type: string\n suffix:\n description: "Suffix that is on the binary"\n type: string\n default: ""\n steps:\n - set-git-tag\n - run:\n name: Set binary directory in the environment\n command: |\n echo BIN_DIR=cargo-shuttle-<< parameters.target >>-$TAG >> $BASH_ENV\n - run:\n name: Make artifact\n command: |\n mkdir $BIN_DIR\n mv target/<< parameters.target >>/release/cargo-shuttle<< parameters.suffix >> $BIN_DIR/cargo-shuttle<< parameters.suffix >>\n mv target/<< parameters.target >>/release/shuttle<< parameters.suffix >> $BIN_DIR/shuttle<< parameters.suffix >>\n mv LICENSE $BIN_DIR/\n mv README.md $BIN_DIR/\n mkdir -p artifacts/<< parameters.target >>\n cp $BASH_ENV artifacts/<< parameters.target >>.env\n tar -cvzf artifacts/<< parameters.target >>/cargo-shuttle-$TAG-<< parameters.target >>.tar.gz $BIN_DIR\n # Persist the bash environment to the workspace as well, we need it for the release job.\n # Make sure the name is unique, since the binaries will be built in parallel.\n # https://discuss.circleci.com/t/share-environment-variable-between-different-job/45647/4\n - persist_to_workspace:\n root: artifacts\n paths:\n - << parameters.target >>/*\n - << 
parameters.target >>.env\n\njobs:\n workspace-fmt-clippy:\n executor: docker-rust\n resource_class: xlarge\n steps:\n - checkout\n - restore-cargo-and-sccache\n - install-cargo-make\n - run: cargo make ci-workspace\n - run: curl --proto '=https' --tlsv1.2 -LsSf https://github.com/1Password/typeshare/releases/download/v1.13.0/typeshare-cli-v1.13.0-installer.sh | sh\n - run: cargo make types\n - run: cargo make check-types\n - save-sccache\n - save-cargo-cache\n cargo-audit:\n executor: docker-rust\n steps:\n - checkout\n - install-cargo-make\n - install-cargo-audit\n - run: cargo make audit\n test-standalone:\n parameters:\n path:\n description: "Path to crate external from workspace"\n type: string\n executor: docker-rust\n steps:\n - checkout\n - restore-cargo-and-sccache\n - install-cargo-make\n - apply-patches\n - run: cargo make ci-standalone << parameters.path >>\n - save-sccache\n test-workspace-member:\n parameters:\n crate:\n description: Crate in workspace to test\n type: string\n executor: docker-rust\n steps:\n - install-rust\n - checkout\n - restore-cargo-and-sccache\n - install-cargo-make\n - run: cargo make test-member << parameters.crate >>\n - save-sccache\n test-workspace-member-with-integration:\n parameters:\n crate:\n description: Crate in workspace to test\n type: string\n resource_class:\n description: The resource type to use for the machine\n type: string\n sccache-size:\n type: string\n default: 500M\n executor: docker-rust\n resource_class: << parameters.resource_class >>\n steps:\n - install-rust\n - install-cargo-make\n - checkout\n - run: git submodule update --init\n - restore-cargo-and-sccache:\n sccache-size: << parameters.sccache-size >>\n - apply-patches\n - run: cargo make test-member << parameters.crate >>\n - run: cargo make test-member-integration << parameters.crate >>\n - save-sccache\n test-workspace-member-with-integration-docker:\n parameters:\n crate:\n description: "Crate in workspace to test"\n type: string\n 
resource_class:\n description: The resource type to use for the machine\n type: string\n sccache-size:\n type: string\n default: 500M\n # Using a machine image since tests will start a docker container\n executor: machine-ubuntu\n resource_class: << parameters.resource_class >>\n steps:\n - install-rust\n - install-cargo-make\n - checkout\n - run: git submodule update --init\n - restore-cargo-and-sccache:\n sccache-size: << parameters.sccache-size >>\n - apply-patches\n - run: cargo make test-member-integration-docker << parameters.crate >>\n - save-sccache\n build-binaries-linux:\n machine:\n image: default\n resource_class: << parameters.resource_class >>\n parameters:\n target:\n description: "Linux target to build for"\n type: string\n resource_class:\n description: "The resource type to use for the machine"\n type: string\n steps:\n - checkout\n - run: sudo apt update && sudo DEBIAN_FRONTEND=noninteractive apt install -y libssl-dev musl-tools clang\n - run:\n name: Install Rust\n # Note: Let binary build use latest\n command: curl --proto '=https' --tlsv1.3 https://sh.rustup.rs -sSf | bash -s -- -y --target << parameters.target >>\n - run:\n name: Build\n command: |\n # From https://github.com/briansmith/ring/issues/1414#issuecomment-1055177218\n export CC_aarch64_unknown_linux_musl=clang\n cargo build --release --package cargo-shuttle --target << parameters.target >>\n - make-artifact:\n target: << parameters.target >>\n build-binaries-windows:\n executor:\n name: win/default\n size: xlarge\n shell: bash.exe\n environment:\n CARGO_NET_GIT_FETCH_WITH_CLI: "true"\n steps:\n - checkout\n - run:\n name: Install Rust\n # Note: Let binary build use latest\n command: |\n choco uninstall rust # The one coming from choco interferes with the one coming from rustup\n wget -OutFile "C:\rustup-init.exe" https://static.rust-lang.org/rustup/dist/x86_64-pc-windows-msvc/rustup-init.exe\n C:\rustup-init.exe -y --target x86_64-pc-windows-msvc\n shell: powershell.exe\n - run:\n 
name: Build\n command: |\n cargo.exe build --release --package cargo-shuttle --target x86_64-pc-windows-msvc\n shell: powershell.exe\n - make-artifact:\n target: x86_64-pc-windows-msvc\n suffix: ".exe"\n build-binaries-mac:\n macos:\n xcode: 15.3.0\n resource_class: macos.m1.medium.gen1\n steps:\n - checkout\n # Necessary to build for Intel targets on m1.\n - macos/install-rosetta\n - run:\n name: Install Rust\n # Note: Let binary build use latest\n command: curl --proto '=https' https://sh.rustup.rs -sSf | bash -s -- -y --target x86_64-apple-darwin\n - run:\n name: Build\n command: |\n cargo build --release --package cargo-shuttle --target x86_64-apple-darwin\n - make-artifact:\n target: x86_64-apple-darwin\n publish-github-release-draft:\n docker:\n - image: cimg/go:1.19.3\n steps:\n - attach_workspace:\n at: artifacts\n - run:\n name: "Set tag in environment"\n command: |\n for file in artifacts/*.env; do\n cat artifacts/${file##*/} >> $BASH_ENV\n rm artifacts/${file##*/}\n done\n - run:\n name: "Publish Release on GitHub"\n # Since each binary is in a sub directory named after its target, we flatten\n # the artifacts directory before passing it to ghr\n command: |\n find artifacts -mindepth 2 -type f -exec mv -t artifacts {} +\n go install github.com/tcnksm/ghr@v0.16.2\n ghr -t ${GITHUB_TOKEN} -u ${CIRCLE_PROJECT_USERNAME} -r ${CIRCLE_PROJECT_REPONAME} -c ${CIRCLE_SHA1} -delete -draft ${TAG} ./artifacts/\n publish-crate:\n parameters:\n path:\n description: Path to crate to publish\n type: string\n executor: docker-rust\n steps:\n - checkout\n - run:\n name: Publish crate\n # Don't build the crate, we know that the crate should build based on other workflows\n # Ignore failure if the crate is already uploaded\n command: |\n set +e\n cargo publish --no-verify --manifest-path << parameters.path >>/Cargo.toml > /tmp/publish-out 2>&1\n ERR=$?\n set -e\n if [ $ERR != 0 ] && ! 
grep "already uploaded" /tmp/publish-out; then\n cat /tmp/publish-out\n exit 1\n fi\nworkflows:\n ci:\n jobs:\n - workspace-fmt-clippy\n - test-standalone:\n name: << matrix.path >>\n matrix:\n parameters:\n path:\n - resources/aws-rds\n - resources/openai\n - resources/opendal\n - resources/qdrant\n - resources/shared-db\n - resources/turso\n - services/shuttle-actix-web\n - services/shuttle-axum\n - services/shuttle-poem\n - services/shuttle-rocket\n - services/shuttle-salvo\n - services/shuttle-serenity\n - services/shuttle-thruster\n - services/shuttle-tide\n - services/shuttle-tower\n - services/shuttle-warp\n - test-workspace-member:\n name: << matrix.crate >>\n matrix:\n parameters:\n crate:\n - shuttle-codegen\n - shuttle-common\n - test-workspace-member-with-integration:\n name: << matrix.crate >>\n matrix:\n alias: test-workspace-member-with-integration-xlarge\n parameters:\n resource_class:\n - xlarge\n sccache-size:\n - 1G\n crate:\n - cargo-shuttle\n - test-workspace-member-with-integration-docker:\n name: << matrix.crate >> with docker\n matrix:\n alias: test-workspace-member-with-integration-docker-large\n parameters:\n resource_class:\n - large\n sccache-size:\n - 750M\n crate:\n - cargo-shuttle\n - cargo-audit:\n filters:\n branches:\n only: main\n build-binaries:\n jobs:\n - build-binaries-linux:\n name: build-binaries-x86_64-gnu\n target: x86_64-unknown-linux-gnu\n resource_class: xlarge\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - build-binaries-linux:\n name: build-binaries-x86_64-musl\n target: x86_64-unknown-linux-musl\n resource_class: xlarge\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - build-binaries-linux:\n name: build-binaries-aarch64\n target: aarch64-unknown-linux-musl\n resource_class: arm.xlarge\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - build-binaries-windows:\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - build-binaries-mac:\n filters:\n 
branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - publish-github-release-draft:\n requires:\n - build-binaries-x86_64-gnu\n - build-binaries-x86_64-musl\n - build-binaries-aarch64\n - build-binaries-windows\n - build-binaries-mac\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n publish-crates:\n jobs:\n - approve-publish-crates:\n type: approval\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - publish-crate:\n matrix:\n parameters:\n path:\n - codegen\n - common\n name: publish-<< matrix.path >>\n requires:\n - approve-publish-crates\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - publish-crate:\n matrix:\n parameters:\n path:\n - api-client\n - service\n name: publish-<< matrix.path >>\n requires:\n - publish-common\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - publish-crate:\n matrix:\n parameters:\n path:\n - resources/aws-rds\n - resources/openai\n - resources/opendal\n - resources/qdrant\n - resources/shared-db\n - resources/turso\n name: publish-<< matrix.path >>\n requires:\n - publish-service\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - publish-crate:\n matrix:\n parameters:\n path:\n - runtime\n - cargo-shuttle\n name: publish-<< matrix.path >>\n requires:\n - publish-api-client\n - publish-service\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n - publish-crate:\n matrix:\n parameters:\n path:\n - services/shuttle-actix-web\n - services/shuttle-axum\n - services/shuttle-poem\n - services/shuttle-rocket\n - services/shuttle-salvo\n - services/shuttle-serenity\n - services/shuttle-thruster\n - services/shuttle-tide\n - services/shuttle-tower\n - services/shuttle-warp\n name: publish-<< matrix.path >>\n requires:\n - publish-runtime\n filters:\n branches:\n ignore: /.*/\n tags:\n only: /v[\d\.]+/\n | dataset_sample\yaml\shuttle-hq_shuttle\.circleci\config.yml | config.yml | YAML | 19,435 | 0.95 | 0.022609 | 0.045534 | node-utils | 
381 | 2023-12-03T07:34:30.186652 | BSD-3-Clause | false | 2c8138095e0e974cb6089a02410051c4 |
blank_issues_enabled: true\ncontact_links:\n - name: Discord\n url: https://discord.gg/shuttle\n about: Feel free to reach out on our Discord should you have any questions!\n | dataset_sample\yaml\shuttle-hq_shuttle\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 176 | 0.8 | 0 | 0 | vue-tools | 604 | 2025-05-05T16:56:54.771595 | Apache-2.0 | false | 07eb6658b92391e2611f8880eeeaf6fd |
name: Feature suggestion\ndescription: Suggest a new feature\ntype: Feature\nlabels: ["S-Triage"]\nbody:\n - type: markdown\n attributes:\n value: Thanks for taking the time to suggest a feature!\n - type: textarea\n id: describe\n attributes:\n label: Describe the feature\n validations:\n required: true\n - type: textarea\n id: example\n attributes:\n label: Suggestion or example of how the feature would be used\n - type: checkboxes\n id: duplicate\n attributes:\n label: Duplicate declaration\n options:\n - label: I have searched the issues and this feature has not been requested before.\n required: true\n | dataset_sample\yaml\shuttle-hq_shuttle\.github\ISSUE_TEMPLATE\FEATURE-SUGGESTION.yml | FEATURE-SUGGESTION.yml | YAML | 663 | 0.85 | 0.04 | 0 | react-lib | 178 | 2024-06-04T08:56:36.455747 | MIT | false | 8150eddfa7a0d01af45f9db8da1b3011 |
name: Improvement suggestion\ndescription: Suggest an improvement to an existing feature\ntype: Improvement\nlabels: ["S-Triage"]\nbody:\n - type: markdown\n attributes:\n value: Thanks for taking the time to suggest an improvement!\n - type: textarea\n id: describe\n attributes:\n label: Describe the improvement\n validations:\n required: true\n - type: checkboxes\n id: duplicate\n attributes:\n label: Duplicate declaration\n options:\n - label: I have searched the issues and this improvement has not been requested before.\n required: true\n | dataset_sample\yaml\shuttle-hq_shuttle\.github\ISSUE_TEMPLATE\IMPROVEMENT.yml | IMPROVEMENT.yml | YAML | 589 | 0.85 | 0.047619 | 0 | node-utils | 793 | 2025-02-09T15:39:19.650434 | MIT | false | f474d25755b080d6f3f134517eae55f7 |
# Documentation: https://docs.codecov.io/docs/codecov-yaml\n\ncodecov:\n # Avoid "Missing base report"\n # https://github.com/codecov/support/issues/363\n # https://docs.codecov.io/docs/comparing-commits\n allow_coverage_offsets: true\n\n # Avoid Report Expired\n # https://docs.codecov.io/docs/codecov-yaml#section-expired-reports\n max_report_age: off\n\ncoverage:\n # Use integer precision\n # https://docs.codecov.com/docs/codecovyml-reference#coverageprecision\n precision: 0\n\n # Explicitly control coverage status checks\n # https://docs.codecov.com/docs/commit-status#disabling-a-status\n status:\n project: on\n patch: off\n | dataset_sample\yaml\SixLabors_ImageSharp\codecov.yml | codecov.yml | YAML | 645 | 0.8 | 0 | 0.555556 | react-lib | 382 | 2024-03-22T13:58:47.632314 | GPL-3.0 | false | 4a86c9ebfc678dd0f7493b00258baaf9 |
version: 2\nupdates:\n - package-ecosystem: "github-actions"\n directory: "/"\n schedule:\n interval: "weekly"\n | dataset_sample\yaml\SixLabors_ImageSharp\.github\dependabot.yml | dependabot.yml | YAML | 118 | 0.7 | 0 | 0 | python-kit | 666 | 2025-05-09T23:56:10.058691 | GPL-3.0 | false | 4334132ba6bcd78670bf574b6b4236f4 |
blank_issues_enabled: false\ncontact_links:\n - name: Questions\n url: https://github.com/SixLabors/ImageSharp/discussions/categories/q-a\n about: Ask the community for help.\n - name: Feature Request\n url: https://github.com/SixLabors/ImageSharp/discussions/categories/ideas\n about: Share ideas for new features for this project.\n | dataset_sample\yaml\SixLabors_ImageSharp\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 352 | 0.8 | 0.375 | 0 | vue-tools | 275 | 2025-04-03T16:36:16.770714 | Apache-2.0 | false | 3924100501f2a30d8bf0f2735056fbc9 |
name: "Bug Report"\ndescription: Create a report to help us improve the project. Issues are not guaranteed to be triaged.\nlabels: ["needs triage"]\nbody:\n- type: checkboxes\n attributes:\n label: Prerequisites\n options:\n - label: I have written a descriptive issue title\n required: true\n - label: I have verified that I am running the latest version of ImageSharp\n required: true\n - label: I have verified if the problem exist in both `DEBUG` and `RELEASE` mode\n required: true\n - label: I have searched [open](https://github.com/SixLabors/ImageSharp/issues) and [closed](https://github.com/SixLabors/ImageSharp/issues?q=is%3Aissue+is%3Aclosed) issues to ensure it has not already been reported\n required: true\n- type: input\n attributes:\n label: ImageSharp version\n validations:\n required: true\n- type: input\n attributes:\n label: Other ImageSharp packages and versions\n validations:\n required: true\n- type: input\n attributes:\n label: Environment (Operating system, version and so on)\n validations:\n required: true\n- type: input\n attributes:\n label: .NET Framework version\n validations:\n required: true\n- type: textarea\n attributes:\n label: Description\n description: A description of the bug\n validations:\n required: true\n- type: textarea\n attributes:\n label: Steps to Reproduce\n description: List of steps, sample code, failing test or link to a project that reproduces the behavior. Make sure you place a stack trace inside a code (```) block to avoid linking unrelated issues.\n validations:\n required: true\n- type: textarea\n attributes:\n label: Images\n description: Please upload images that can be used to reproduce issues in the area below. If the file type is not supported the file can be zipped and then uploaded instead.\n | dataset_sample\yaml\SixLabors_ImageSharp\.github\ISSUE_TEMPLATE\oss-bug-report.yml | oss-bug-report.yml | YAML | 1,841 | 0.95 | 0.019231 | 0 | react-lib | 744 | 2024-10-04T14:37:59.514176 | GPL-3.0 | false | 0302236264825578bd20fc71d62cbe78 |
name: CodeCoverage\n\non:\n schedule:\n # 2AM every Tuesday/Thursday\n - cron: "0 2 * * 2,4"\njobs:\n Build:\n strategy:\n matrix:\n options:\n - os: ubuntu-latest\n framework: net8.0\n runtime: -x64\n codecov: true\n\n runs-on: ${{matrix.options.os}}\n\n steps:\n\n - name: Install libgdi+, which is required for tests running on ubuntu\n if: ${{ contains(matrix.options.os, 'ubuntu') }}\n run: |\n sudo apt-get update\n sudo apt-get -y install libgdiplus libgif-dev libglib2.0-dev libcairo2-dev libtiff-dev libexif-dev\n\n - name: Git Config\n shell: bash\n run: |\n git config --global core.autocrlf false\n git config --global core.longpaths true\n\n - name: Git Checkout\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n submodules: recursive\n\n # See https://github.com/actions/checkout/issues/165#issuecomment-657673315\n - name: Git Create LFS FileList\n run: git lfs ls-files -l | cut -d' ' -f1 | sort > .lfs-assets-id\n\n - name: Git Setup LFS Cache\n uses: actions/cache@v4\n id: lfs-cache\n with:\n path: .git/lfs\n key: ${{ runner.os }}-lfs-${{ hashFiles('.lfs-assets-id') }}-v1\n\n - name: Git Pull LFS\n run: git lfs pull\n\n - name: NuGet Install\n uses: NuGet/setup-nuget@v2\n\n - name: NuGet Setup Cache\n uses: actions/cache@v4\n id: nuget-cache\n with:\n path: ~/.nuget\n key: ${{ runner.os }}-nuget-${{ hashFiles('**/*.csproj', '**/*.props', '**/*.targets') }}\n restore-keys: ${{ runner.os }}-nuget-\n\n - name: DotNet Setup\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: |\n 8.0.x\n\n - name: DotNet Build\n shell: pwsh\n run: ./ci-build.ps1 "${{matrix.options.framework}}"\n env:\n SIXLABORS_TESTING: True\n\n - name: DotNet Test\n shell: pwsh\n run: ./ci-test.ps1 "${{matrix.options.os}}" "${{matrix.options.framework}}" "${{matrix.options.runtime}}" "${{matrix.options.codecov}}"\n env:\n SIXLABORS_TESTING: True\n XUNIT_PATH: .\tests\ImageSharp.Tests # Required for xunit\n\n - name: Export Failed Output\n uses: actions/upload-artifact@v4\n if: failure()\n with:\n name: actual_output_${{ runner.os }}_${{ matrix.options.framework }}${{ matrix.options.runtime }}.zip\n path: tests/Images/ActualOutput/\n\n - name: Codecov Update\n uses: codecov/codecov-action@v4\n if: matrix.options.codecov == true && startsWith(github.repository, 'SixLabors')\n with:\n flags: unittests\n | dataset_sample\yaml\SixLabors_ImageSharp\.github\workflows\code-coverage.yml | code-coverage.yml | YAML | 2,746 | 0.95 | 0.053191 | 0.025641 | react-lib | 98 | 2023-12-13T11:34:31.572609 | BSD-3-Clause | false | fc96f4b683354833b8eafcf891a70fb9 |
# List the start up tasks. You can start them in parallel in multiple terminals.\n# https://www.gitpod.io/docs/config-start-tasks/\ntasks:\n - init: >\n make tools build\n command: make inspect\n\n# Enable prebuilds of your project to enable faster workspace start times.\n# https://www.gitpod.io/docs/prebuilds/#configure-the-github-app\ngithub:\n prebuilds:\n master: true\n branches: true\n pullRequests: true\n pullRequestsFromForks: true\n addCheck: true | dataset_sample\yaml\slimtoolkit_slim\.gitpod.yml | .gitpod.yml | YAML | 469 | 0.8 | 0 | 0.266667 | python-kit | 881 | 2024-02-20T14:33:23.313179 | Apache-2.0 | false | 51ba9a1c09089dfadb2f641d7559ca8e |
comment:\n behavior: default\n layout: diff\n require_changes: false\ncoverage:\n precision: 1\n range:\n - 50.0\n - 100.0\n round: down\n status:\n patch: false\n project: false\ngithub_checks: false\n | dataset_sample\yaml\solana-labs_solana\.codecov.yml | .codecov.yml | YAML | 203 | 0.85 | 0 | 0 | react-lib | 152 | 2024-05-13T06:16:35.988687 | BSD-3-Clause | false | 41223e29840a6ca6d969bf0cd67715f9 |
# https://doc.mergify.io/\npull_request_rules:\n - name: label changes from community\n conditions:\n - author≠@core-contributors\n - author≠@monorepo-maintainers\n - author≠@monorepo-write\n - author≠@monorepo-triage\n - author≠mergify[bot]\n - author≠dependabot[bot]\n - author≠github-actions[bot]\n actions:\n label:\n add:\n - community\n - need:merge-assist\n - name: request review for community changes\n conditions:\n - author≠@core-contributors\n - author≠@monorepo-maintainers\n - author≠@monorepo-write\n - author≠@monorepo-triage\n - author≠mergify[bot]\n - author≠dependabot[bot]\n - author≠github-actions[bot]\n # Only request reviews from the pr subscribers group if no one\n # has reviewed the community PR yet. These checks only match\n # reviewers with admin, write or maintain permission on the repository.\n - "#approved-reviews-by=0"\n - "#commented-reviews-by=0"\n - "#changes-requested-reviews-by=0"\n - "#review-requested=0"\n actions:\n request_reviews:\n teams:\n - "@solana-labs/community-pr-subscribers"\n - name: label changes from monorepo-triage\n conditions:\n - author≠@core-contributors\n - author≠mergify[bot]\n - author≠dependabot[bot]\n - author≠github-actions[bot]\n - author≠@monorepo-maintainers\n - author≠@monorepo-write\n - author=@monorepo-triage\n actions:\n label:\n add:\n - need:merge-assist\n - name: automatic merge (squash) on CI success\n conditions:\n - and:\n - status-success=buildkite/solana\n - status-success=ci-gate\n - label=automerge\n - label!=no-automerge\n - or:\n # only require docs checks if docs files changed\n - -files~=^docs/\n - status-success=build & deploy docs\n - or:\n - -files~=(\.rs|Cargo\.toml|Cargo\.lock|\.github/scripts/cargo-clippy-before-script\.sh|\.github/workflows/cargo\.yml)$\n - and:\n - or:\n - check-success=clippy-stable (macos-latest)\n - check-success=clippy-stable (macos-latest-large)\n - or:\n - check-success=clippy-nightly (macos-latest)\n - check-success=clippy-nightly (macos-latest-large)\n - or:\n - -files~=(\.rs|Cargo\.toml|Cargo\.lock|cargo-build-bpf|cargo-test-bpf|cargo-build-sbf|cargo-test-sbf|ci/downstream-projects/run-spl\.sh|\.github/workflows/downstream-project-spl\.yml)$\n - and:\n - status-success=cargo-test-sbf (token/program)\n - status-success=cargo-test-sbf (instruction-padding/program, token/program-2022, token/program-2022-test)\n - status-success=cargo-test-sbf (associated-token-account/program, associated-token-account/program-test)\n - status-success=cargo-test-sbf (token-upgrade/program)\n - status-success=cargo-test-sbf (feature-proposal/program)\n - status-success=cargo-test-sbf (governance/addin-mock/program, governance/program)\n - status-success=cargo-test-sbf (memo/program)\n - status-success=cargo-test-sbf (name-service/program)\n - status-success=cargo-test-sbf (stake-pool/program)\n - status-success=cargo-test-sbf (single-pool/program)\n actions:\n merge:\n method: squash\n - name: remove automerge label on CI failure\n conditions:\n - and:\n - label=automerge\n - "#status-failure!=0"\n - -merged\n actions:\n label:\n remove:\n - automerge\n comment:\n message: automerge label removed due to a CI failure\n - name: v1.17 feature-gate backport\n conditions:\n - label=v1.17\n - label=feature-gate\n actions:\n backport:\n assignees: &BackportAssignee\n - "{{ merged_by|replace('mergify[bot]', label|select('equalto', 'community')|first|default(author)|replace('community', '@solana-labs/community-pr-subscribers')) }}"\n title: "{{ destination_branch }}: {{ title }} (backport of #{{ number }})"\n ignore_conflicts: true\n labels:\n - feature-gate\n branches:\n - v1.17\n - name: v1.17 non-feature-gate backport\n conditions:\n - label=v1.17\n - label!=feature-gate\n actions:\n backport:\n assignees: *BackportAssignee\n title: "{{ destination_branch }}: {{ title }} (backport of #{{ number }})"\n ignore_conflicts: true\n branches:\n - v1.17\n - name: v1.17 backport warning comment\n conditions:\n - label=v1.17\n actions:\n comment:\n message: >\n Backports to the stable branch are to be avoided unless absolutely\n necessary for fixing bugs, security issues, and perf regressions.\n Changes intended for backport should be structured such that a\n minimum effective diff can be committed separately from any\n refactoring, plumbing, cleanup, etc that are not strictly\n necessary to achieve the goal. Any of the latter should go only\n into master and ride the normal stabilization schedule.\n - name: v1.18 feature-gate backport\n conditions:\n - label=v1.18\n - label=feature-gate\n actions:\n backport:\n assignees: *BackportAssignee\n title: "{{ destination_branch }}: {{ title }} (backport of #{{ number }})"\n ignore_conflicts: true\n labels:\n - feature-gate\n branches:\n - v1.18\n - name: v1.18 non-feature-gate backport\n conditions:\n - label=v1.18\n - label!=feature-gate\n actions:\n backport:\n assignees: *BackportAssignee\n title: "{{ destination_branch }}: {{ title }} (backport of #{{ number }})"\n ignore_conflicts: true\n branches:\n - v1.18\n - name: v1.18 backport warning comment\n conditions:\n - label=v1.18\n actions:\n comment:\n message: >\n Backports to the beta branch are to be avoided unless absolutely\n necessary for fixing bugs, security issues, and perf regressions.\n Changes intended for backport should be structured such that a\n minimum effective diff can be committed separately from any\n refactoring, plumbing, cleanup, etc that are not strictly\n necessary to achieve the goal. Any of the latter should go only\n into master and ride the normal stabilization schedule. Exceptions\n include CI/metrics changes, CLI improvements and documentation\n updates on a case by case basis.\n\ncommands_restrictions:\n # The author of copied PRs is the Mergify user.\n # Restrict `copy` access to Core Contributors\n copy:\n conditions:\n - author=@core-contributors\n | dataset_sample\yaml\solana-labs_solana\.mergify.yml | .mergify.yml | YAML | 6,819 | 0.95 | 0.038674 | 0.038889 | node-utils | 767 | 2023-10-29T18:09:35.606554 | BSD-3-Clause | false | a915ebe584ee2346e1ef910c78a52276 |
branches:\n only:\n - master\n - /^v\d+\.\d+/\n\nnotifications:\n email: false\n slack:\n on_success: change\n if: NOT type = pull_request\n secure: F4IjOE05MyaMOdPRL+r8qhs7jBvv4yDM3RmFKE1zNXnfUOqV4X38oQM1EI+YVsgpMQLj/pxnEB7wcTE4Bf86N6moLssEULCpvAuMVoXj4QbWdomLX+01WbFa6fLVeNQIg45NHrz2XzVBhoKOrMNnl+QI5mbR2AlS5oqsudHsXDnyLzZtd4Y5SDMdYG1zVWM01+oNNjgNfjcCGmOE/K0CnOMl6GPi3X9C34tJ19P2XT7MTDsz1/IfEF7fro2Q8DHEYL9dchJMoisXSkem5z7IDQkGzXsWdWT4NnndUvmd1MlTCE9qgoXDqRf95Qh8sB1Dz08HtvgfaosP2XjtNTfDI9BBYS15Ibw9y7PchAJE1luteNjF35EOy6OgmCLw/YpnweqfuNViBZz+yOPWXVC0kxnPIXKZ1wyH9ibeH6E4hr7a8o9SV/6SiWIlbYF+IR9jPXyTCLP/cc3sYljPWxDnhWFwFdRVIi3PbVAhVu7uWtVUO17Oc9gtGPgs/GrhOMkJfwQPXaudRJDpVZowxTX4x9kefNotlMAMRgq+Drbmgt4eEBiCNp0ITWgh17BiE1U09WS3myuduhoct85+FoVeaUkp1sxzHVtGsNQH0hcz7WcpZyOM+AwistJA/qzeEDQao5zi1eKWPbO2xAhi2rV1bDH6bPf/4lDBwLRqSiwvlWU=\n\nos: linux\ndist: bionic\nlanguage: minimal\n\njobs:\n include:\n - &release-artifacts\n if: type IN (api, cron) OR tag IS present\n name: "macOS release artifacts"\n os: osx\n osx_image: xcode12\n language: rust\n rust:\n - stable\n install:\n - source ci/rust-version.sh\n - PATH="/usr/local/opt/coreutils/libexec/gnubin:$PATH"\n - readlink -f .\n - brew install gnu-tar\n - PATH="/usr/local/opt/gnu-tar/libexec/gnubin:$PATH"\n - tar --version\n script:\n - source ci/env.sh\n - rustup set profile default\n - ci/publish-tarball.sh\n deploy:\n - provider: s3\n access_key_id: $AWS_ACCESS_KEY_ID\n secret_access_key: $AWS_SECRET_ACCESS_KEY\n bucket: release.solana.com\n region: us-west-1\n skip_cleanup: true\n acl: public_read\n local_dir: travis-s3-upload\n on:\n all_branches: true\n - provider: releases\n token: $GITHUB_TOKEN\n skip_cleanup: true\n file_glob: true\n file: travis-release-upload/*\n on:\n tags: true\n - <<: *release-artifacts\n name: "Windows release artifacts"\n os: windows\n install:\n - choco install openssl\n - export OPENSSL_DIR="C:\Program Files\OpenSSL-Win64"\n - source ci/rust-version.sh\n - PATH="/usr/local/opt/coreutils/libexec/gnubin:$PATH"\n - readlink -f .\n # Linux release artifacts are still built by ci/buildkite-secondary.yml\n #- <<: *release-artifacts\n # name: "Linux release artifacts"\n # os: linux\n # before_install:\n # - sudo apt-get install libssl-dev libudev-dev\n\n # docs pull request\n - name: "docs"\n if: type IN (push, pull_request) OR tag IS present\n language: node_js\n node_js:\n - "lts/*"\n\n services:\n - docker\n\n cache:\n directories:\n - ~/.npm\n\n before_install:\n - source ci/env.sh\n - .travis/channel_restriction.sh edge beta || travis_terminate 0\n - .travis/affects.sh docs/ .travis || travis_terminate 0\n - cd docs/\n - source .travis/before_install.sh\n\n script:\n - source .travis/script.sh\n | dataset_sample\yaml\solana-labs_solana\.travis.yml | .travis.yml | YAML | 3,101 | 0.8 | 0.031915 | 0.081395 | react-lib | 320 | 2025-02-07T23:14:19.904798 | GPL-3.0 | false | 40807eca405460090f04f6147c822edc |
# To get started with Dependabot version updates, you'll need to specify which\n# package ecosystems to update and where the package manifests are located.\n# Please see the documentation for all configuration options:\n# https://help.github.com/github/administering-a-repository/configuration-options-for-dependency-updates\n\nversion: 2\nupdates:\n- package-ecosystem: cargo\n directory: "/"\n schedule:\n interval: daily\n time: "01:00"\n timezone: America/Los_Angeles\n #labels:\n # - "automerge"\n open-pull-requests-limit: 0\n | dataset_sample\yaml\solana-labs_solana\.github\dependabot.yml | dependabot.yml | YAML | 531 | 0.8 | 0.125 | 0.4 | react-lib | 994 | 2023-11-26T10:05:16.027882 | Apache-2.0 | false | 4e303ed0b82ca9c01f8b7d35c37658e7 |
question:\n issues:\n # Post a comment, `{issue-author}` is an optional placeholder\n comment: >\n Hi @{issue-author},\n\n\n Thanks for your question!\n\n\n We want to make sure to keep signal strong in the GitHub issue tracker – to make sure\n that it remains the best place to track issues that affect the development of Solana itself.\n\n\n Questions like yours deserve a purpose-built Q&A forum. Unless there exists evidence that\n this is a bug with Solana itself, please post your question to the Solana Stack Exchange\n using this link: https://solana.stackexchange.com/questions/ask\n\n \n ---\n\n _This\n [automated message](https://github.com/solana-labs/solana/blob/master/.github/label-actions.yml)\n is a result of having added the ‘question’ tag_.\n\n # Close the issue\n close: true\n | dataset_sample\yaml\solana-labs_solana\.github\label-actions.yml | label-actions.yml | YAML | 863 | 0.8 | 0.037037 | 0.117647 | vue-tools | 923 | 2025-01-22T20:38:10.517729 | BSD-3-Clause | false | c3d5b0bac91f3e3cbbca6cd98667f1a4 |
[file: dataset_sample\yaml\solana-labs_solana\.github\ISSUE_TEMPLATE\2-feature-gate.yml]
name: Feature Gate Tracker
description: Track the development and status of an on-chain feature
title: "Feature Gate: "
labels: ["feature-gate"]
body:
  - type: markdown
    attributes:
      value: >
        Steps to add a new feature are outlined below. Note that these steps only cover
        the process of getting a feature into the core Solana code.

        - For features that are unambiguously good (ie bug fixes), these steps are sufficient.

        - For features that should go up for community vote (ie fee structure changes), more
        information on the additional steps to follow can be found at:
        <https://spl.solana.com/feature-proposal#feature-proposal-life-cycle>

        1. Generate a new keypair with `solana-keygen new --outfile feature.json --no-passphrase`
           - Keypairs should be held by core contributors only. If you're a non-core contributor going
           through these steps, the PR process will facilitate a keypair holder being picked. That
           person will generate the keypair, provide pubkey for PR, and ultimately enable the feature.

        2. Add a public module for the feature, specifying keypair pubkey as the id with
        `solana_sdk::declare_id!()` within the module. Additionally, add an entry to `FEATURE_NAMES` map.

        3. Add desired logic to check for and switch on feature availability.
  - type: input
    id: simd
    attributes:
      label: SIMD
      description: Solana IMprovement Document (SIMD)
      placeholder: Link to the https://github.com/solana-foundation/solana-improvement-documents document for this feature
    validations:
      required: true
  - type: textarea
    id: description
    attributes:
      label: Description
      placeholder: Describe why the new feature gate is needed and any necessary conditions for its activation
    validations:
      required: true
  - type: input
    id: id
    attributes:
      label: Feature ID
      description: The public key of the feature account
    validations:
      required: true
  - type: dropdown
    id: activation-method
    attributes:
      label: Activation Method
      options:
        - Single Core Contributor
        - Staked Validator Vote
    validations:
      required: true
  - type: textarea
    id: deployment
    attributes:
      label: Deployment Considerations
      placeholder: Describe any considerations for public-cluster deployment, including needed tests and metrics to be monitored
    validations:
      required: true
  - type: input
    id: beta-version
    attributes:
      label: Minimum Beta Version
      placeholder: Edit this response when feature has landed in a beta release
    validations:
      required: false
  - type: input
    id: stable-version
    attributes:
      label: Minimum Stable Version
      placeholder: Edit this response when feature has landed in a stable release
    validations:
      required: false
  - type: input
    id: testnet
    attributes:
      label: Testnet Activation Epoch
      placeholder: Edit this response when feature is activated on this cluster
    validations:
      required: false
  - type: input
    id: devnet
    attributes:
      label: Devnet Activation Epoch
      placeholder: Edit this response when feature is activated on this cluster
    validations:
      required: false
  - type: input
    id: mainnet-beta
    attributes:
      label: Mainnet-Beta Activation Epoch
      placeholder: Edit this response when feature is activated on this cluster
    validations:
      required: false
[meta: YAML | 3,559 bytes | quality 0.95 | complexity 0.080808 | doc 0 | react-lib | 773 stars | 2024-05-30 | Apache-2.0 | hash 7b948962ea8a698a97d2acd13f5b1dd1]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\cargo.yml]
name: Cargo

on:
  push:
    branches:
      - master
      - v[0-9]+.[0-9]+
  pull_request:
    branches:
      - master
      - v[0-9]+.[0-9]+
    paths:
      - "**.rs"
      - "**/Cargo.toml"
      - "**/Cargo.lock"
      - ".github/scripts/cargo-clippy-before-script.sh"
      - ".github/workflows/cargo.yml"

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

env:
  SHELL: /bin/bash
  SCCACHE_GHA_ENABLED: "true"
  RUSTC_WRAPPER: "sccache"

jobs:
  clippy-stable:
    strategy:
      matrix:
        os:
          - macos-latest-large
          - windows-latest
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4

      - uses: mozilla-actions/sccache-action@v0.0.3
        with:
          version: "v0.5.4"

      - shell: bash
        run: .github/scripts/cargo-clippy-before-script.sh ${{ runner.os }}

      - shell: bash
        run: |
          source ci/rust-version.sh stable
          rustup component add clippy --toolchain "$rust_stable"
          scripts/cargo-clippy-stable.sh

  clippy-nightly:
    strategy:
      matrix:
        os:
          - macos-latest-large
          - windows-latest
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4

      - uses: mozilla-actions/sccache-action@v0.0.3
        with:
          version: "v0.5.4"

      - shell: bash
        run: .github/scripts/cargo-clippy-before-script.sh ${{ runner.os }}

      - shell: bash
        run: |
          source ci/rust-version.sh nightly
          rustup component add clippy --toolchain "$rust_nightly"
          scripts/cargo-clippy-nightly.sh
[meta: YAML | 1,669 bytes | quality 0.7 | complexity 0 | doc 0 | node-utils | 130 stars | 2024-12-05 | GPL-3.0 | hash e141b7c33f3496f86f23c575b8a5a0ea]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\changelog-label.yml]
name: Require changelog entry

on:
  pull_request:
    types: [opened, synchronize, reopened, labeled, unlabeled]

jobs:
  check-changelog:
    if: contains(github.event.pull_request.labels.*.name, 'changelog')
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: Check if changes to CHANGELOG.md
        shell: bash
        env:
          BASE_REF: ${{ github.event.pull_request.base.ref }}
        run: .github/scripts/check-changelog.sh
[meta: YAML | 497 bytes | quality 0.7 | complexity 0.1 | doc 0 | node-utils | 272 stars | 2023-11-13 | MIT | hash 02c2742c3af36868274820113e951e76]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\client-targets.yml]
name: client_targets

on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
    paths:
      - "client/**"
      - "sdk/**"
      - ".github/workflows/client-targets.yml"
      - "ci/rust-version.sh"
      - "**/Cargo.toml"
      - "**/Cargo.lock"

env:
  CARGO_TERM_COLOR: always

jobs:
  android:
    strategy:
      matrix:
        os:
          - ubuntu-20.04
        target:
          - x86_64-linux-android
          - aarch64-linux-android
          - i686-linux-android
          - armv7-linux-androideabi
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v3

      - run: cargo install cargo-ndk@2.12.2

      - name: Setup Rust
        run: |
          source ci/rust-version.sh stable
          rustup target add --toolchain "$rust_stable" ${{ matrix.target }}

      - name: Stable build
        run: ./cargo stable ndk --target ${{ matrix.target }} build -p solana-client

  ios:
    strategy:
      matrix:
        os:
          - macos-11
        target:
          - aarch64-apple-ios
          - x86_64-apple-ios
          - aarch64-apple-darwin
          - x86_64-apple-darwin
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v3

      - name: Setup Rust
        run: |
          source ci/rust-version.sh stable
          rustup target add --toolchain "$rust_stable" ${{ matrix.target }}

      - name: Stable build
        run: ./cargo stable build --target ${{ matrix.target }} -p solana-client

  error_reporting:
    needs:
      - android
      - ios
    if: failure() && github.event_name == 'push'
    uses: ./.github/workflows/error-reporting.yml
    secrets:
      WEBHOOK: ${{ secrets.SLACK_ERROR_REPORTING_WEBHOOK }}
[meta: YAML | 1,723 bytes | quality 0.8 | complexity 0.013333 | doc 0 | awesome-app | 519 stars | 2025-02-20 | Apache-2.0 | hash 19024ab2481f457b45d9528c07ec8bd6]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\close-new-issues.yml]
name: Close new issues

on:
  issues:
    types: [opened, reopened]

jobs:
  comment-and-close:
    runs-on: ubuntu-latest
    steps:
      - shell: bash
        env:
          ISSUE_NUMBER: ${{ github.event.issue.number }}
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GH_REPO: ${{ github.repository }}
          COMMENT: >
            This repository is no longer in use. Please re-open this
            issue in the agave repo: https://github.com/anza-xyz/agave
        run: >
          gh issue close "$ISSUE_NUMBER" --comment "$COMMENT"
[meta: YAML | 554 bytes | quality 0.8 | complexity 0 | doc 0 | react-lib | 526 stars | 2024-07-23 | BSD-3-Clause | hash 5804737b94bc8365ee2403200d78f378]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\close-new-pull-requests.yml]
name: Close new pull requests

on:
  pull_request:
    types: [opened, reopened]

jobs:
  comment-and-close:
    runs-on: ubuntu-latest
    steps:
      - shell: bash
        env:
          PR_NUMBER: ${{ github.event.pull_request.number }}
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GH_REPO: ${{ github.repository }}
          COMMENT: >
            This repository is no longer in use. Please re-open this
            pull request in the agave repo: https://github.com/anza-xyz/agave
        run: >
          gh pr close "$PR_NUMBER" --comment "$COMMENT"
[meta: YAML | 572 bytes | quality 0.8 | complexity 0 | doc 0 | vue-tools | 540 stars | 2023-12-13 | Apache-2.0 | hash df25b5e12f96d7ab1fe86bc4aa35f3e0]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\crate-check.yml]
name: crate-check

on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
    paths:
      - "**/Cargo.toml"
      - ".github/workflows/crate-check.yml"

jobs:
  check:
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0

      - name: Get commit range (push)
        if: ${{ github.event_name == 'push' }}
        run: |
          echo "COMMIT_RANGE=${{ github.event.before }}..$GITHUB_SHA" >> $GITHUB_ENV

      - name: Get commit range (pull_request)
        if: ${{ github.event_name == 'pull_request' }}
        run: |
          echo "COMMIT_RANGE=${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}" >> $GITHUB_ENV

      - name: Setup Rust
        shell: bash
        run: |
          source ci/rust-version.sh stable
          rustup default $rust_stable

      - name: Install toml-cli
        shell: bash
        run: |
          cargo install toml-cli

      - run: |
          ci/check-crates.sh

  error_reporting:
    needs:
      - check
    if: failure() && github.event_name == 'push'
    uses: ./.github/workflows/error-reporting.yml
    secrets:
      WEBHOOK: ${{ secrets.SLACK_ERROR_REPORTING_WEBHOOK }}
[meta: YAML | 1,247 bytes | quality 0.7 | complexity 0.057692 | doc 0 | vue-tools | 722 stars | 2023-12-28 | GPL-3.0 | hash 4d035750392be91b25de5e9b4b765ae4]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\docs.yml]
name: docs

on:
  push:
    branches:
      - master
      - v[0-9]+.[0-9]+
    tags:
      - v[0-9]+.[0-9]+.[0-9]+
  pull_request:
    branches:
      - master
      - v[0-9]+.[0-9]+

jobs:
  check:
    outputs:
      continue: ${{ steps.check.outputs.need_to_build }}
    runs-on: ubuntu-20.04
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 0

      - name: Get commit range (push)
        if: ${{ github.event_name == 'push' }}
        run: |
          echo "COMMIT_RANGE=${{ github.event.before }}..$GITHUB_SHA" >> $GITHUB_ENV

      - name: Get commit range (pull_request)
        if: ${{ github.event_name == 'pull_request' }}
        run: |
          echo "COMMIT_RANGE=${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}" >> $GITHUB_ENV

      - name: Get file status
        run: |
          set +e
          git diff --name-only $COMMIT_RANGE | grep \
            -e '.github/workflows/docs.yml' \
            -e 'docs/**'
          echo "FILE_CHANGED=$?" >> $GITHUB_ENV

      - name: Check
        id: check
        shell: bash
        run: |
          source ci/env.sh
          eval "$(ci/channel-info.sh)"
          TAG=$CI_TAG

          echo "TAG: $TAG"
          echo "CHANNEL: $CHANNEL"
          echo "FILE_CHANGED: $FILE_CHANGED"

          echo need_to_build="$(
            if [ "$TAG" != '' ]
            then
              echo 1
            elif [ $FILE_CHANGED = 0 ] && ( [ "$CHANNEL" = "beta" ] || [ "$CHANNEL" = "edge" ] )
            then
              echo 1
            else
              echo 0
            fi
          )" >> $GITHUB_OUTPUT

  build_and_deploy:
    needs:
      - check
    if: >
      github.repository == 'solana-labs/solana' &&
      needs.check.outputs.continue == 1
    # the name is used by .mergify.yml as well
    name: build & deploy docs
    runs-on: ubuntu-20.04
    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Setup Node
        uses: actions/setup-node@v3
        with:
          node-version: 16

      - name: Build
        working-directory: docs
        run: |
          npm install
          ./build.sh
        env:
          VERCEL_TOKEN: ${{ secrets.VERCEL_TOKEN }}
          VERCEL_SCOPE: ${{ secrets.VERCEL_SCOPE }}

  error_reporting:
    needs:
      - check
      - build_and_deploy
    if: failure() && github.event_name == 'push'
    uses: ./.github/workflows/error-reporting.yml
    secrets:
      WEBHOOK: ${{ secrets.SLACK_ERROR_REPORTING_WEBHOOK }}
[meta: YAML | 2,556 bytes | quality 0.8 | complexity 0.04902 | doc 0.011111 | node-utils | 87 stars | 2023-10-10 | MIT | hash e007eabf33743a89e0174be7f929e726]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\downstream-project-anchor.yml]
name: Downstream Project - Anchor

on:
  push:
    branches:
      - master
      - v[0-9]+.[0-9]+
  pull_request:
    branches:
      - master
      - v[0-9]+.[0-9]+
    paths:
      - "**.rs"
      - "Cargo.toml"
      - "Cargo.lock"
      - "cargo-build-bpf"
      - "cargo-test-bpf"
      - "cargo-build-sbf"
      - "cargo-test-sbf"
      - "scripts/build-downstream-anchor-projects.sh"
      - ".github/scripts/purge-ubuntu-runner.sh"
      - ".github/scripts/downstream-project-spl-install-deps.sh"
      - ".github/workflows/downstream-project-anchor.yml"
  workflow_call:
    inputs:
      branch:
        required: false
        type: string
        default: "master"

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

env:
  SHELL: /bin/bash
  SCCACHE_GHA_ENABLED: "true"
  RUSTC_WRAPPER: "sccache"

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        version: ["v0.29.0"]
    steps:
      - uses: actions/checkout@v3

      - shell: bash
        run: |
          .github/scripts/purge-ubuntu-runner.sh

      - uses: mozilla-actions/sccache-action@v0.0.3
        with:
          version: "v0.5.4"

      - shell: bash
        run: |
          source .github/scripts/downstream-project-spl-install-deps.sh
          ./scripts/build-downstream-anchor-projects.sh ${{ matrix.version }}
[meta: YAML | 1,397 bytes | quality 0.85 | complexity 0 | doc 0 | awesome-app | 471 stars | 2024-08-25 | MIT | hash 903db6f9f4b500ee6d34ef38d2452954]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\downstream-project-spl-nightly.yml]
name: Downstream Project - SPL (Nightly)

on:
  schedule:
    - cron: "0 3 * * *"

jobs:
  main:
    # As this is a cron job, it is better to avoid running it for all the forks.
    # They are unlikely to benefit from these executions, and they could easily
    # eat up all the minutes GitHub allocates to free accounts.
    if: >
      github.event_name != 'schedule'
      || github.repository == 'solana-labs/solana'

    strategy:
      fail-fast: false
      matrix:
        branch:
          - master
    uses: ./.github/workflows/downstream-project-spl.yml
    with:
      branch: ${{ matrix.branch }}

  error_reporting:
    needs:
      - main
    if: failure()
    uses: ./.github/workflows/error-reporting.yml
    secrets:
      WEBHOOK: ${{ secrets.SLACK_ERROR_REPORTING_WEBHOOK }}
[meta: YAML | 794 bytes | quality 0.8 | complexity 0.096774 | doc 0.111111 | vue-tools | 792 stars | 2024-10-11 | MIT | hash 8855c2bcf2c30ec34844104689307247]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\downstream-project-spl.yml]
name: Downstream Project - SPL

on:
  push:
    branches:
      - master
      - v[0-9]+.[0-9]+
  pull_request:
    branches:
      - master
      - v[0-9]+.[0-9]+
    paths:
      - "**.rs"
      - "Cargo.toml"
      - "Cargo.lock"
      - "cargo-build-bpf"
      - "cargo-test-bpf"
      - "cargo-build-sbf"
      - "cargo-test-sbf"
      - "ci/downstream-projects/run-spl.sh"
      - ".github/workflows/downstream-project-spl.yml"
  workflow_call:
    inputs:
      branch:
        required: false
        type: string
        default: "master"

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

env:
  SHELL: /bin/bash
  SCCACHE_GHA_ENABLED: "true"
  RUSTC_WRAPPER: "sccache"

jobs:
  check:
    if: false
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - shell: bash
        run: |
          .github/scripts/purge-ubuntu-runner.sh

      - uses: mozilla-actions/sccache-action@v0.0.3
        with:
          version: "v0.5.4"

      - shell: bash
        run: |
          source .github/scripts/downstream-project-spl-common.sh
          source .github/scripts/downstream-project-spl-install-deps.sh

          cargo check

  test:
    if: false
    runs-on: ubuntu-latest
    strategy:
      matrix:
        arrays:
          [
            {
              test_paths: ["token/cli"],
              required_programs:
                [
                  "token/program",
                  "token/program-2022",
                  "associated-token-account/program",
                ],
            },
            {
              test_paths: ["single-pool/cli"],
              required_programs:
                [
                  "single-pool/program",
                ],
            },
            {
              test_paths: ["token-upgrade/cli"],
              required_programs:
                [
                  "token-upgrade/program",
                ],
            },
          ]
    steps:
      - uses: actions/checkout@v3

      - shell: bash
        run: |
          .github/scripts/purge-ubuntu-runner.sh

      - uses: mozilla-actions/sccache-action@v0.0.3
        with:
          version: "v0.5.4"

      - shell: bash
        run: |
          source .github/scripts/downstream-project-spl-common.sh
          source .github/scripts/downstream-project-spl-install-deps.sh

          programStr="${{ tojson(matrix.arrays.required_programs) }}"
          IFS=', ' read -ra programs <<<"${programStr//[\[\]$'\n'$'\r' ]/}"
          for program in "${programs[@]}"; do
            $CARGO_BUILD_SBF --manifest-path "$program"/Cargo.toml
          done

          testPathsStr="${{ tojson(matrix.arrays.test_paths) }}"
          IFS=', ' read -ra test_paths <<<"${testPathsStr//[\[\]$'\n'$'\r' ]/}"
          for test_path in "${test_paths[@]}"; do
            cargo test --manifest-path "$test_path"/Cargo.toml
          done

  cargo-test-sbf:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        programs:
          - [token/program]
          - [
              instruction-padding/program,
              token/program-2022,
              token/program-2022-test,
            ]
          - [
              associated-token-account/program,
              associated-token-account/program-test,
            ]
          - [token-upgrade/program]
          - [feature-proposal/program]
          - [governance/addin-mock/program, governance/program]
          - [memo/program]
          - [name-service/program]
          - [stake-pool/program]
          - [single-pool/program]

    steps:
      - uses: actions/checkout@v3

      - shell: bash
        run: |
          .github/scripts/purge-ubuntu-runner.sh

      - uses: mozilla-actions/sccache-action@v0.0.3
        with:
          version: "v0.5.4"

      - shell: bash
        run: |
          source .github/scripts/downstream-project-spl-common.sh

          programStr="${{ tojson(matrix.programs) }}"
          IFS=', ' read -ra programs <<<"${programStr//[\[\]$'\n'$'\r' ]/}"

          for program in "${programs[@]}"; do
            $CARGO_TEST_SBF --manifest-path "$program"/Cargo.toml
          done
[meta: YAML | 4,197 bytes | quality 0.95 | complexity 0.030864 | doc 0 | python-kit | 810 stars | 2023-08-19 | Apache-2.0 | hash 2cefb745b2308939191b89812f1be41a]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\error-reporting.yml]
name: error-reporting

on:
  workflow_call:
    secrets:
      WEBHOOK:
        required: true

jobs:
  slack:
    runs-on: ubuntu-20.04
    steps:
      - env:
          COMMIT_MESSAGE: ${{ github.event.head_commit.message }}
          COMMIT_AUTHOR_NAME: ${{ github.event.head_commit.author.name }}
        run: |
          curl -H "Content-Type: application/json" \
            -X POST ${{ secrets.WEBHOOK }} \
            -d '{
              "attachments": [
                {
                  "color": "#AC514C",
                  "text":
                    "*${{ github.repository }} (${{ github.workflow }})*
          '"$COMMIT_MESSAGE"' - _'"$COMMIT_AUTHOR_NAME"'_
          <${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}|View Build>",
                }
              ]
            }'
[meta: YAML | 816 bytes | quality 0.95 | complexity 0 | doc 0 | react-lib | 967 stars | 2023-11-30 | Apache-2.0 | hash 629a2ecfae009acc29bba8fd9c0adc15]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\increment-cargo-version-on-release.yml]
name: increment-cargo-version

on:
  release:
    types: [published]

jobs:
  check_compilation:
    name: Increment cargo version
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v3

      # This script confirms two assumptions:
      #   1) Tag should be branch.<patch_version>
      #   2) Tag should match the crate version numbers in the manifest files (which get incremented by the next step)
      - name: Confirm tag, branch, and cargo version numbers
        run: scripts/confirm-cargo-version-numbers-before-bump.sh ${{ github.event.release.target_commitish }} ${{ github.event.release.tag_name }}

      - name: Update Patch Version Numbers
        run: |
          OUTPUT=$(scripts/increment-cargo-version.sh patch)
          SOLANA_NEW_VERSION=$(sed -E 's/.* -> //' <<< $OUTPUT)
          echo "SOLANA_NEW_VERSION=$SOLANA_NEW_VERSION"
          echo "SOLANA_NEW_VERSION=$SOLANA_NEW_VERSION" >> $GITHUB_ENV

      - name: Cargo Tree
        run: ./scripts/cargo-for-all-lock-files.sh tree

      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v4
        with:
          commit-message: Bump Version to ${{ env.SOLANA_NEW_VERSION }}
          title: Bump Version to ${{ env.SOLANA_NEW_VERSION }}
          body: PR opened by Github Action
          branch: update-version-${{ env.SOLANA_NEW_VERSION }}
          base: ${{ github.event.release.target_commitish }}
          labels: automerge
[meta: YAML | 1,480 bytes | quality 0.8 | complexity 0.025641 | doc 0.090909 | node-utils | 80 stars | 2024-11-14 | MIT | hash 57313f191d28aca0edb6795e0d4ccb3b]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\label-actions.yml]
name: "Issue Label Actions"

on:
  issues:
    types: [labeled, unlabeled]

permissions:
  contents: read
  issues: write

jobs:
  action:
    runs-on: ubuntu-latest
    steps:
      - uses: dessant/label-actions@v2
[meta: YAML | 216 bytes | quality 0.7 | complexity 0 | doc 0 | vue-tools | 250 stars | 2025-06-22 | MIT | hash 1816a3d3e388f9ce9c79f3cd472be372]
[file: dataset_sample\yaml\solana-labs_solana\.github\workflows\manage-stale-issues-and-prs.yml]
name: "Manage stale issues and PRs"
on:
  # Chosen to be just before London wakes up and way past San Francisco's bedtime.
  schedule:
    - cron: "0 8 * * 1-5" # This is in UTC.
  # Do a dry-run (debug-only: true) whenever this workflow itself is changed.
  pull_request:
    paths:
      - .github/workflows/manage-stale-issues-and-prs.yml
    types:
      - opened
      - synchronize

permissions:
  issues: write
  pull-requests: write

jobs:
  stale:
    # Forks do not need to run this, especially on cron schedule.
    if: >
      github.event_name != 'schedule'
      || github.repository == 'solana-labs/solana'

    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@v6
        with:
          ascending: true # Spend API operations budget on older, more-likely-to-get-closed issues first
          close-issue-message: "" # Leave no comment when closing
          close-pr-message: "" # Leave no comment when closing
          days-before-issue-stale: 365
          days-before-pr-stale: 14
          days-before-close: 7
          debug-only: ${{ github.event_name == 'pull_request' }} # Dry-run when true.
          exempt-all-milestones: true # Milestones can sometimes last a month, so exempt issues attached to a milestone.
          exempt-issue-labels: blocked,do-not-close,feature-gate,security
          exempt-pr-labels: blocked,do-not-close,feature-gate,security
          # No actual changes get made in debug-only mode, so we can raise the operations ceiling.
          operations-per-run: ${{ github.event_name == 'pull_request' && 1000 || 900 }}
          stale-issue-label: stale
          stale-issue-message: "" # Leave no comment when marking as stale
          stale-pr-label: stale
          stale-pr-message: "" # Leave no comment when marking as stale
[meta: YAML | 1,797 bytes | quality 0.8 | complexity 0.022727 | doc 0.097561 | python-kit | 562 stars | 2023-11-08 | BSD-3-Clause | hash 2173648ed0353dd2dd895952c66e1653]