content stringlengths 1 103k ⌀ | path stringlengths 8 216 | filename stringlengths 2 179 | language stringclasses 15
values | size_bytes int64 2 189k | quality_score float64 0.5 0.95 | complexity float64 0 1 | documentation_ratio float64 0 1 | repository stringclasses 5
values | stars int64 0 1k | created_date stringdate 2023-07-10 19:21:08 2025-07-09 19:11:45 | license stringclasses 4
values | is_test bool 2
classes | file_hash stringlengths 32 32 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
contact_links:\n - name: Support questions & other\n url: https://github.com/meilisearch/meilisearch/discussions/new\n about: For any other question, open a discussion in this repository\n - name: Language support request & feedback\n url: https://github.com/meilisearch/product/discussions/categories/feedback-feature-proposal?discussions_q=label%3Aproduct%3Acore%3Atokenizer+category%3A%22Feedback+%26+Feature+Proposal%22\n about: The requests and feedback regarding Language support are not managed in this repository. Please upvote the related discussion in our dedicated product repository or open a new one if it doesn't exist.\n - name: Any other feature request & feedback\n url: https://github.com/meilisearch/product/discussions/categories/feedback-feature-proposal\n about: The feature requests and feedback regarding the already existing features are not managed in this repository. Please open a discussion in our dedicated product repository\n - name: Documentation issue\n url: https://github.com/meilisearch/documentation/issues/new\n about: For documentation issues, open an issue or a PR in the documentation repository\n | dataset_sample\yaml\meilisearch_meilisearch\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 1,155 | 0.8 | 0.076923 | 0 | node-utils | 256 | 2024-03-10T23:39:37.609780 | BSD-3-Clause | false | 3dea1ba7b66e945a68f396f5951d805a |
name: Bench (manual)\n\non:\n workflow_dispatch:\n inputs:\n workload:\n description: "The path to the workloads to execute (workloads/...)"\n required: true\n default: "workloads/movies.json"\n\nenv:\n WORKLOAD_NAME: ${{ github.event.inputs.workload }}\n\njobs:\n benchmarks:\n name: Run and upload benchmarks\n runs-on: benchmarks\n timeout-minutes: 180 # 3h\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n - name: Run benchmarks - workload ${WORKLOAD_NAME} - branch ${{ github.ref }} - commit ${{ github.sha }}\n run: |\n cargo xtask bench --api-key "${{ secrets.BENCHMARK_API_KEY }}" --dashboard-url "${{ vars.BENCHMARK_DASHBOARD_URL }}" --reason "Manual [Run #${{ github.run_id }}](https://github.com/meilisearch/meilisearch/actions/runs/${{ github.run_id }})" -- ${WORKLOAD_NAME}\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\bench-manual.yml | bench-manual.yml | YAML | 911 | 0.95 | 0 | 0 | react-lib | 320 | 2024-08-02T03:11:22.586115 | Apache-2.0 | false | 623509ef6b3ce08612f994f6b7197767 |
name: Bench (PR)\non:\n issue_comment:\n types: [created]\n\npermissions:\n issues: write\n\nenv:\n GH_TOKEN: ${{ secrets.MEILI_BOT_GH_PAT }}\n\njobs:\n run-benchmarks-on-comment:\n if: startsWith(github.event.comment.body, '/bench')\n name: Run and upload benchmarks\n runs-on: benchmarks\n timeout-minutes: 180 # 3h\n steps:\n - name: Check permissions\n id: permission\n env:\n PR_AUTHOR: ${{github.event.issue.user.login }}\n COMMENT_AUTHOR: ${{github.event.comment.user.login }}\n REPOSITORY: ${{github.repository}}\n PR_ID: ${{github.event.issue.number}}\n run: |\n PR_REPOSITORY=$(gh api /repos/"$REPOSITORY"/pulls/"$PR_ID" --jq .head.repo.full_name)\n if $(gh api /repos/"$REPOSITORY"/collaborators/"$PR_AUTHOR"/permission --jq .user.permissions.push)\n then\n echo "::notice title=Authentication success::PR author authenticated"\n else\n echo "::error title=Authentication error::PR author doesn't have push permission on this repository"\n exit 1\n fi\n if $(gh api /repos/"$REPOSITORY"/collaborators/"$COMMENT_AUTHOR"/permission --jq .user.permissions.push)\n then\n echo "::notice title=Authentication success::Comment author authenticated"\n else\n echo "::error title=Authentication error::Comment author doesn't have push permission on this repository"\n exit 1\n fi\n if [ "$PR_REPOSITORY" = "$REPOSITORY" ]\n then\n echo "::notice title=Authentication success::PR started from main repository"\n else\n echo "::error title=Authentication error::PR started from a fork"\n exit 1\n fi\n\n - name: Check for Command\n id: command\n uses: xt0rted/slash-command-action@v2\n with:\n command: bench\n reaction-type: "rocket"\n repo-token: ${{ env.GH_TOKEN }}\n\n - uses: xt0rted/pull-request-comment-branch@v3\n id: comment-branch\n with:\n repo_token: ${{ env.GH_TOKEN }}\n\n - uses: actions/checkout@v3\n if: success()\n with:\n fetch-depth: 0 # fetch full history to be able to get main commit sha\n ref: ${{ steps.comment-branch.outputs.head_ref }}\n\n - uses: 
dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n - name: Run benchmarks on PR ${{ github.event.issue.id }}\n run: |\n cargo xtask bench --api-key "${{ secrets.BENCHMARK_API_KEY }}" \\n --dashboard-url "${{ vars.BENCHMARK_DASHBOARD_URL }}" \\n --reason "[Comment](${{ github.event.comment.html_url }}) on [#${{ github.event.issue.number }}](${{ github.event.issue.html_url }})" \\n -- ${{ steps.command.outputs.command-arguments }} > benchlinks.txt\n\n - name: Send comment in PR\n run: |\n gh pr comment ${{github.event.issue.number}} --body-file benchlinks.txt\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\bench-pr.yml | bench-pr.yml | YAML | 2,984 | 0.8 | 0.073171 | 0 | react-lib | 510 | 2025-03-03T17:50:44.276777 | BSD-3-Clause | false | a0a571c1587b5ca3e3af5af5b9c41824 |
name: Indexing bench (push)\n\non:\n push:\n branches:\n - main\n\njobs:\n benchmarks:\n name: Run and upload benchmarks\n runs-on: benchmarks\n timeout-minutes: 180 # 3h\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n # Run benchmarks\n - name: Run benchmarks - Dataset ${BENCH_NAME} - Branch main - Commit ${{ github.sha }}\n run: |\n cargo xtask bench --api-key "${{ secrets.BENCHMARK_API_KEY }}" --dashboard-url "${{ vars.BENCHMARK_DASHBOARD_URL }}" --reason "Push on `main` [Run #${{ github.run_id }}](https://github.com/meilisearch/meilisearch/actions/runs/${{ github.run_id }})" -- workloads/*.json\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\bench-push-indexing.yml | bench-push-indexing.yml | YAML | 719 | 0.8 | 0 | 0.052632 | python-kit | 32 | 2024-02-22T20:45:45.478322 | Apache-2.0 | false | 0449848eb7457bf3416a0db2d3a24c58 |
name: Benchmarks (manual)\n\non:\n workflow_dispatch:\n inputs:\n dataset_name:\n description: "The name of the dataset used to benchmark (search_songs, search_wiki, search_geo or indexing)"\n required: false\n default: "search_songs"\n\nenv:\n BENCH_NAME: ${{ github.event.inputs.dataset_name }}\n\njobs:\n benchmarks:\n name: Run and upload benchmarks\n runs-on: benchmarks\n timeout-minutes: 4320 # 72h\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n # Set variables\n - name: Set current branch name\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/})" >> $GITHUB_OUTPUT\n id: current_branch\n - name: Set normalized current branch name # Replace `/` by `_` in branch name to avoid issues when pushing to S3\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/} | tr '/' '_')" >> $GITHUB_OUTPUT\n id: normalized_current_branch\n - name: Set shorter commit SHA\n shell: bash\n run: echo "short=$(echo $GITHUB_SHA | cut -c1-8)" >> $GITHUB_OUTPUT\n id: commit_sha\n - name: Set file basename with format "dataset_branch_commitSHA"\n shell: bash\n run: echo "basename=$(echo ${BENCH_NAME}_${{ steps.normalized_current_branch.outputs.name }}_${{ steps.commit_sha.outputs.short }})" >> $GITHUB_OUTPUT\n id: file\n\n # Run benchmarks\n - name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}\n run: |\n cd crates/benchmarks\n cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}\n\n # Generate critcmp files\n - name: Install critcmp\n uses: taiki-e/install-action@v2\n with:\n tool: critcmp\n - name: Export cripcmp file\n run: |\n critcmp --export ${{ steps.file.outputs.basename }} > ${{ steps.file.outputs.basename }}.json\n\n # Upload benchmarks\n - name: Upload ${{ steps.file.outputs.basename }}.json to DO Spaces # DigitalOcean Spaces = S3\n uses: BetaHuhn/do-spaces-action@v2\n with:\n 
access_key: ${{ secrets.DO_SPACES_ACCESS_KEY }}\n secret_key: ${{ secrets.DO_SPACES_SECRET_KEY }}\n space_name: ${{ secrets.DO_SPACES_SPACE_NAME }}\n space_region: ${{ secrets.DO_SPACES_SPACE_REGION }}\n source: ${{ steps.file.outputs.basename }}.json\n out_dir: critcmp_results\n\n # Helper\n - name: "README: compare with another benchmark"\n run: |\n echo "${{ steps.file.outputs.basename }}.json has just been pushed."\n echo 'How to compare this benchmark with another one?'\n echo ' - Check the available files with: ./benchmarks/scripts/list.sh'\n echo " - Run the following command: ./benchmaks/scripts/compare.sh <file-to-compare-with> ${{ steps.file.outputs.basename }}.json"\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\benchmarks-manual.yml | benchmarks-manual.yml | YAML | 3,013 | 0.95 | 0 | 0.074627 | node-utils | 515 | 2024-12-02T22:11:41.886964 | Apache-2.0 | false | 1a99638cf2beaca68153daf0f74fd6a7 |
name: Benchmarks (PR)\non: issue_comment\npermissions:\n issues: write\n\nenv:\n GH_TOKEN: ${{ secrets.MEILI_BOT_GH_PAT }}\n\njobs:\n run-benchmarks-on-comment:\n if: startsWith(github.event.comment.body, '/benchmark')\n name: Run and upload benchmarks\n runs-on: benchmarks\n timeout-minutes: 4320 # 72h\n steps:\n - name: Check permissions\n id: permission\n env:\n PR_AUTHOR: ${{github.event.issue.user.login }}\n COMMENT_AUTHOR: ${{github.event.comment.user.login }}\n REPOSITORY: ${{github.repository}}\n PR_ID: ${{github.event.issue.number}}\n run: |\n PR_REPOSITORY=$(gh api /repos/"$REPOSITORY"/pulls/"$PR_ID" --jq .head.repo.full_name)\n if $(gh api /repos/"$REPOSITORY"/collaborators/"$PR_AUTHOR"/permission --jq .user.permissions.push)\n then\n echo "::notice title=Authentication success::PR author authenticated"\n else\n echo "::error title=Authentication error::PR author doesn't have push permission on this repository"\n exit 1\n fi\n if $(gh api /repos/"$REPOSITORY"/collaborators/"$COMMENT_AUTHOR"/permission --jq .user.permissions.push)\n then\n echo "::notice title=Authentication success::Comment author authenticated"\n else\n echo "::error title=Authentication error::Comment author doesn't have push permission on this repository"\n exit 1\n fi\n if [ "$PR_REPOSITORY" = "$REPOSITORY" ]\n then\n echo "::notice title=Authentication success::PR started from main repository"\n else\n echo "::error title=Authentication error::PR started from a fork"\n exit 1\n fi\n\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n - name: Check for Command\n id: command\n uses: xt0rted/slash-command-action@v2\n with:\n command: benchmark\n reaction-type: "eyes"\n repo-token: ${{ env.GH_TOKEN }}\n\n - uses: xt0rted/pull-request-comment-branch@v3\n id: comment-branch\n with:\n repo_token: ${{ env.GH_TOKEN }}\n\n - uses: actions/checkout@v3\n if: success()\n with:\n fetch-depth: 0 # fetch full history to be able to get main commit sha\n ref: ${{ 
steps.comment-branch.outputs.head_ref }}\n\n # Set variables\n - name: Set current branch name\n shell: bash\n run: echo "name=$(git rev-parse --abbrev-ref HEAD)" >> $GITHUB_OUTPUT\n id: current_branch\n - name: Set normalized current branch name # Replace `/` by `_` in branch name to avoid issues when pushing to S3\n shell: bash\n run: echo "name=$(git rev-parse --abbrev-ref HEAD | tr '/' '_')" >> $GITHUB_OUTPUT\n id: normalized_current_branch\n - name: Set shorter commit SHA\n shell: bash\n run: echo "short=$(echo $GITHUB_SHA | cut -c1-8)" >> $GITHUB_OUTPUT\n id: commit_sha\n - name: Set file basename with format "dataset_branch_commitSHA"\n shell: bash\n run: echo "basename=$(echo ${{ steps.command.outputs.command-arguments }}_${{ steps.normalized_current_branch.outputs.name }}_${{ steps.commit_sha.outputs.short }})" >> $GITHUB_OUTPUT\n id: file\n\n # Run benchmarks\n - name: Run benchmarks - Dataset ${{ steps.command.outputs.command-arguments }} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}\n run: |\n cd crates/benchmarks\n cargo bench --bench ${{ steps.command.outputs.command-arguments }} -- --save-baseline ${{ steps.file.outputs.basename }}\n\n # Generate critcmp files\n - name: Install critcmp\n uses: taiki-e/install-action@v2\n with:\n tool: critcmp\n - name: Export cripcmp file\n run: |\n critcmp --export ${{ steps.file.outputs.basename }} > ${{ steps.file.outputs.basename }}.json\n\n # Upload benchmarks\n - name: Upload ${{ steps.file.outputs.basename }}.json to DO Spaces # DigitalOcean Spaces = S3\n uses: BetaHuhn/do-spaces-action@v2\n with:\n access_key: ${{ secrets.DO_SPACES_ACCESS_KEY }}\n secret_key: ${{ secrets.DO_SPACES_SECRET_KEY }}\n space_name: ${{ secrets.DO_SPACES_SPACE_NAME }}\n space_region: ${{ secrets.DO_SPACES_SPACE_REGION }}\n source: ${{ steps.file.outputs.basename }}.json\n out_dir: critcmp_results\n\n # Compute the diff of the benchmarks and send a message on the GitHub PR\n - name: 
Compute and send a message in the PR\n env:\n GITHUB_TOKEN: ${{ secrets.MEILI_BOT_GH_PAT }}\n run: |\n set -x\n export base_ref=$(git merge-base origin/main ${{ steps.comment-branch.outputs.head_ref }} | head -c8)\n export base_filename=$(echo ${{ steps.command.outputs.command-arguments }}_main_${base_ref}.json)\n export bench_name=$(echo ${{ steps.command.outputs.command-arguments }})\n echo "Here are your $bench_name benchmarks diff 👊" >> body.txt\n echo '```' >> body.txt\n ./benchmarks/scripts/compare.sh $base_filename ${{ steps.file.outputs.basename }}.json >> body.txt\n echo '```' >> body.txt\n gh pr comment ${{ steps.current_branch.outputs.name }} --body-file body.txt\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\benchmarks-pr.yml | benchmarks-pr.yml | YAML | 5,362 | 0.8 | 0.047244 | 0.043103 | react-lib | 144 | 2025-01-14T01:37:28.177320 | BSD-3-Clause | false | c992f027b373d9b86447ba610a1725ba |
name: Benchmarks of indexing (push)\n\non:\n push:\n branches:\n - main\n\nenv:\n INFLUX_TOKEN: ${{ secrets.INFLUX_TOKEN }}\n BENCH_NAME: "indexing"\n\njobs:\n benchmarks:\n name: Run and upload benchmarks\n runs-on: benchmarks\n timeout-minutes: 4320 # 72h\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n # Set variables\n - name: Set current branch name\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/})" >> $GITHUB_OUTPUT\n id: current_branch\n - name: Set normalized current branch name # Replace `/` by `_` in branch name to avoid issues when pushing to S3\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/} | tr '/' '_')" >> $GITHUB_OUTPUT\n id: normalized_current_branch\n - name: Set shorter commit SHA\n shell: bash\n run: echo "short=$(echo $GITHUB_SHA | cut -c1-8)" >> $GITHUB_OUTPUT\n id: commit_sha\n - name: Set file basename with format "dataset_branch_commitSHA"\n shell: bash\n run: echo "basename=$(echo ${BENCH_NAME}_${{ steps.normalized_current_branch.outputs.name }}_${{ steps.commit_sha.outputs.short }})" >> $GITHUB_OUTPUT\n id: file\n\n # Run benchmarks\n - name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}\n run: |\n cd crates/benchmarks\n cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}\n\n # Generate critcmp files\n - name: Install critcmp\n uses: taiki-e/install-action@v2\n with:\n tool: critcmp\n - name: Export cripcmp file\n run: |\n critcmp --export ${{ steps.file.outputs.basename }} > ${{ steps.file.outputs.basename }}.json\n\n # Upload benchmarks\n - name: Upload ${{ steps.file.outputs.basename }}.json to DO Spaces # DigitalOcean Spaces = S3\n uses: BetaHuhn/do-spaces-action@v2\n with:\n access_key: ${{ secrets.DO_SPACES_ACCESS_KEY }}\n secret_key: ${{ secrets.DO_SPACES_SECRET_KEY }}\n space_name: ${{ secrets.DO_SPACES_SPACE_NAME 
}}\n space_region: ${{ secrets.DO_SPACES_SPACE_REGION }}\n source: ${{ steps.file.outputs.basename }}.json\n out_dir: critcmp_results\n\n # Upload benchmarks to influxdb\n - name: Upload ${{ steps.file.outputs.basename }}.json to influxDB\n run: telegraf --config https://eu-central-1-1.aws.cloud2.influxdata.com/api/v2/telegrafs/08b52e34a370b000 --once --debug\n\n # Helper\n - name: "README: compare with another benchmark"\n run: |\n echo "${{ steps.file.outputs.basename }}.json has just been pushed."\n echo 'How to compare this benchmark with another one?'\n echo ' - Check the available files with: ./benchmarks/scripts/list.sh'\n echo " - Run the following command: ./benchmaks/scipts/compare.sh <file-to-compare-with> ${{ steps.file.outputs.basename }}.json"\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\benchmarks-push-indexing.yml | benchmarks-push-indexing.yml | YAML | 3,087 | 0.8 | 0 | 0.088235 | awesome-app | 367 | 2025-06-23T13:12:21.648595 | GPL-3.0 | false | ab3fb4e16be7f61b0971c9fb3e68174e |
name: Benchmarks of search for geo (push)\n\non:\n push:\n branches:\n - main\n\nenv:\n BENCH_NAME: "search_geo"\n INFLUX_TOKEN: ${{ secrets.INFLUX_TOKEN }}\n\njobs:\n benchmarks:\n name: Run and upload benchmarks\n runs-on: benchmarks\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n # Set variables\n - name: Set current branch name\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/})" >> $GITHUB_OUTPUT\n id: current_branch\n - name: Set normalized current branch name # Replace `/` by `_` in branch name to avoid issues when pushing to S3\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/} | tr '/' '_')" >> $GITHUB_OUTPUT\n id: normalized_current_branch\n - name: Set shorter commit SHA\n shell: bash\n run: echo "short=$(echo $GITHUB_SHA | cut -c1-8)" >> $GITHUB_OUTPUT\n id: commit_sha\n - name: Set file basename with format "dataset_branch_commitSHA"\n shell: bash\n run: echo "basename=$(echo ${BENCH_NAME}_${{ steps.normalized_current_branch.outputs.name }}_${{ steps.commit_sha.outputs.short }})" >> $GITHUB_OUTPUT\n id: file\n\n # Run benchmarks\n - name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}\n run: |\n cd crates/benchmarks\n cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}\n\n # Generate critcmp files\n - name: Install critcmp\n uses: taiki-e/install-action@v2\n with:\n tool: critcmp\n - name: Export cripcmp file\n run: |\n critcmp --export ${{ steps.file.outputs.basename }} > ${{ steps.file.outputs.basename }}.json\n\n # Upload benchmarks\n - name: Upload ${{ steps.file.outputs.basename }}.json to DO Spaces # DigitalOcean Spaces = S3\n uses: BetaHuhn/do-spaces-action@v2\n with:\n access_key: ${{ secrets.DO_SPACES_ACCESS_KEY }}\n secret_key: ${{ secrets.DO_SPACES_SECRET_KEY }}\n space_name: ${{ secrets.DO_SPACES_SPACE_NAME }}\n space_region: ${{ 
secrets.DO_SPACES_SPACE_REGION }}\n source: ${{ steps.file.outputs.basename }}.json\n out_dir: critcmp_results\n\n # Upload benchmarks to influxdb\n - name: Upload ${{ steps.file.outputs.basename }}.json to influxDB\n run: telegraf --config https://eu-central-1-1.aws.cloud2.influxdata.com/api/v2/telegrafs/08b52e34a370b000 --once --debug\n\n # Helper\n - name: "README: compare with another benchmark"\n run: |\n echo "${{ steps.file.outputs.basename }}.json has just been pushed."\n echo 'How to compare this benchmark with another one?'\n echo ' - Check the available files with: ./benchmarks/scripts/list.sh'\n echo " - Run the following command: ./benchmaks/scipts/compare.sh <file-to-compare-with> ${{ steps.file.outputs.basename }}.json"\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\benchmarks-push-search-geo.yml | benchmarks-push-search-geo.yml | YAML | 3,063 | 0.8 | 0.013158 | 0.089552 | python-kit | 469 | 2025-01-14T11:48:56.665350 | MIT | false | a34aebb0ae941c406131f32bc98e81de |
name: Benchmarks of search for songs (push)\n\non:\n push:\n branches:\n - main\n\nenv:\n BENCH_NAME: "search_songs"\n INFLUX_TOKEN: ${{ secrets.INFLUX_TOKEN }}\n\njobs:\n benchmarks:\n name: Run and upload benchmarks\n runs-on: benchmarks\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n # Set variables\n - name: Set current branch name\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/})" >> $GITHUB_OUTPUT\n id: current_branch\n - name: Set normalized current branch name # Replace `/` by `_` in branch name to avoid issues when pushing to S3\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/} | tr '/' '_')" >> $GITHUB_OUTPUT\n id: normalized_current_branch\n - name: Set shorter commit SHA\n shell: bash\n run: echo "short=$(echo $GITHUB_SHA | cut -c1-8)" >> $GITHUB_OUTPUT\n id: commit_sha\n - name: Set file basename with format "dataset_branch_commitSHA"\n shell: bash\n run: echo "basename=$(echo ${BENCH_NAME}_${{ steps.normalized_current_branch.outputs.name }}_${{ steps.commit_sha.outputs.short }})" >> $GITHUB_OUTPUT\n id: file\n\n # Run benchmarks\n - name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}\n run: |\n cd crates/benchmarks\n cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}\n\n # Generate critcmp files\n - name: Install critcmp\n uses: taiki-e/install-action@v2\n with:\n tool: critcmp\n - name: Export cripcmp file\n run: |\n critcmp --export ${{ steps.file.outputs.basename }} > ${{ steps.file.outputs.basename }}.json\n\n # Upload benchmarks\n - name: Upload ${{ steps.file.outputs.basename }}.json to DO Spaces # DigitalOcean Spaces = S3\n uses: BetaHuhn/do-spaces-action@v2\n with:\n access_key: ${{ secrets.DO_SPACES_ACCESS_KEY }}\n secret_key: ${{ secrets.DO_SPACES_SECRET_KEY }}\n space_name: ${{ secrets.DO_SPACES_SPACE_NAME }}\n space_region: 
${{ secrets.DO_SPACES_SPACE_REGION }}\n source: ${{ steps.file.outputs.basename }}.json\n out_dir: critcmp_results\n\n # Upload benchmarks to influxdb\n - name: Upload ${{ steps.file.outputs.basename }}.json to influxDB\n run: telegraf --config https://eu-central-1-1.aws.cloud2.influxdata.com/api/v2/telegrafs/08b52e34a370b000 --once --debug\n\n # Helper\n - name: "README: compare with another benchmark"\n run: |\n echo "${{ steps.file.outputs.basename }}.json has just been pushed."\n echo 'How to compare this benchmark with another one?'\n echo ' - Check the available files with: ./benchmarks/scripts/list.sh'\n echo " - Run the following command: ./benchmaks/scipts/compare.sh <file-to-compare-with> ${{ steps.file.outputs.basename }}.json"\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\benchmarks-push-search-songs.yml | benchmarks-push-search-songs.yml | YAML | 3,067 | 0.8 | 0.013158 | 0.089552 | react-lib | 856 | 2025-04-23T23:46:49.295437 | MIT | false | dcbb8999339fa799f8bd5c1c02e81ec8 |
name: Benchmarks of search for Wikipedia articles (push)\n\non:\n push:\n branches:\n - main\n\nenv:\n BENCH_NAME: "search_wiki"\n INFLUX_TOKEN: ${{ secrets.INFLUX_TOKEN }}\n\njobs:\n benchmarks:\n name: Run and upload benchmarks\n runs-on: benchmarks\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n # Set variables\n - name: Set current branch name\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/})" >> $GITHUB_OUTPUT\n id: current_branch\n - name: Set normalized current branch name # Replace `/` by `_` in branch name to avoid issues when pushing to S3\n shell: bash\n run: echo "name=$(echo ${GITHUB_REF#refs/heads/} | tr '/' '_')" >> $GITHUB_OUTPUT\n id: normalized_current_branch\n - name: Set shorter commit SHA\n shell: bash\n run: echo "short=$(echo $GITHUB_SHA | cut -c1-8)" >> $GITHUB_OUTPUT\n id: commit_sha\n - name: Set file basename with format "dataset_branch_commitSHA"\n shell: bash\n run: echo "basename=$(echo ${BENCH_NAME}_${{ steps.normalized_current_branch.outputs.name }}_${{ steps.commit_sha.outputs.short }})" >> $GITHUB_OUTPUT\n id: file\n\n # Run benchmarks\n - name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}\n run: |\n cd crates/benchmarks\n cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}\n\n # Generate critcmp files\n - name: Install critcmp\n uses: taiki-e/install-action@v2\n with:\n tool: critcmp\n - name: Export cripcmp file\n run: |\n critcmp --export ${{ steps.file.outputs.basename }} > ${{ steps.file.outputs.basename }}.json\n\n # Upload benchmarks\n - name: Upload ${{ steps.file.outputs.basename }}.json to DO Spaces # DigitalOcean Spaces = S3\n uses: BetaHuhn/do-spaces-action@v2\n with:\n access_key: ${{ secrets.DO_SPACES_ACCESS_KEY }}\n secret_key: ${{ secrets.DO_SPACES_SECRET_KEY }}\n space_name: ${{ secrets.DO_SPACES_SPACE_NAME }}\n 
space_region: ${{ secrets.DO_SPACES_SPACE_REGION }}\n source: ${{ steps.file.outputs.basename }}.json\n out_dir: critcmp_results\n\n # Upload benchmarks to influxdb\n - name: Upload ${{ steps.file.outputs.basename }}.json to influxDB\n run: telegraf --config https://eu-central-1-1.aws.cloud2.influxdata.com/api/v2/telegrafs/08b52e34a370b000 --once --debug\n\n # Helper\n - name: "README: compare with another benchmark"\n run: |\n echo "${{ steps.file.outputs.basename }}.json has just been pushed."\n echo 'How to compare this benchmark with another one?'\n echo ' - Check the available files with: ./benchmarks/scripts/list.sh'\n echo " - Run the following command: ./benchmaks/scipts/compare.sh <file-to-compare-with> ${{ steps.file.outputs.basename }}.json"\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\benchmarks-push-search-wiki.yml | benchmarks-push-search-wiki.yml | YAML | 3,079 | 0.8 | 0.013158 | 0.089552 | node-utils | 155 | 2025-05-23T07:06:19.941559 | MIT | false | 9674e2dc14d1bb65528851b4454887db |
name: PR Milestone Check\n\non:\n pull_request:\n types: [opened, reopened, edited, synchronize, milestoned, demilestoned]\n branches:\n - "main"\n - "release-v*.*.*"\n\njobs:\n check-milestone:\n name: Check PR Milestone\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout code\n uses: actions/checkout@v3\n\n - name: Validate PR milestone\n uses: actions/github-script@v7\n with:\n github-token: ${{ secrets.GITHUB_TOKEN }}\n script: |\n // Get PR number directly from the event payload\n const prNumber = context.payload.pull_request.number;\n\n // Get PR details\n const { data: prData } = await github.rest.pulls.get({\n owner: 'meilisearch',\n repo: 'meilisearch',\n pull_number: prNumber\n });\n\n // Get base branch name\n const baseBranch = prData.base.ref;\n console.log(`Base branch: ${baseBranch}`);\n\n // Get PR milestone\n const prMilestone = prData.milestone;\n if (!prMilestone) {\n core.setFailed('PR must have a milestone assigned');\n return;\n }\n console.log(`PR milestone: ${prMilestone.title}`);\n\n // Validate milestone format: vx.y.z\n const milestoneRegex = /^v\d+\.\d+\.\d+$/;\n if (!milestoneRegex.test(prMilestone.title)) {\n core.setFailed(`Milestone "${prMilestone.title}" does not follow the required format vx.y.z`);\n return;\n }\n\n // For main branch PRs, check if the milestone is the highest one\n if (baseBranch === 'main') {\n // Get all milestones\n const { data: milestones } = await github.rest.issues.listMilestones({\n owner: 'meilisearch',\n repo: 'meilisearch',\n state: 'open',\n sort: 'due_on',\n direction: 'desc'\n });\n\n // Sort milestones by version number (vx.y.z)\n const sortedMilestones = milestones\n .filter(m => milestoneRegex.test(m.title))\n .sort((a, b) => {\n const versionA = a.title.substring(1).split('.').map(Number);\n const versionB = b.title.substring(1).split('.').map(Number);\n\n // Compare major version\n if (versionA[0] !== versionB[0]) return versionB[0] - versionA[0];\n // Compare minor version\n if (versionA[1] 
!== versionB[1]) return versionB[1] - versionA[1];\n // Compare patch version\n return versionB[2] - versionA[2];\n });\n\n if (sortedMilestones.length === 0) {\n core.setFailed('No valid milestones found in the repository. Please create at least one milestone with the format vx.y.z');\n return;\n }\n\n const highestMilestone = sortedMilestones[0];\n console.log(`Highest milestone: ${highestMilestone.title}`);\n\n if (prMilestone.title !== highestMilestone.title) {\n core.setFailed(`PRs targeting the main branch must use the highest milestone (${highestMilestone.title}), but this PR uses ${prMilestone.title}`);\n return;\n }\n } else {\n // For release branches, the milestone should match the branch version\n const branchVersion = baseBranch.substring(8); // remove 'release-'\n if (prMilestone.title !== branchVersion) {\n core.setFailed(`PRs targeting release branch "${baseBranch}" must use the matching milestone "${branchVersion}", but this PR uses "${prMilestone.title}"`);\n return;\n }\n }\n\n console.log('PR milestone validation passed!');\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\check-valid-milestone.yml | check-valid-milestone.yml | YAML | 3,910 | 0.95 | 0.09 | 0.141176 | python-kit | 767 | 2023-08-10T23:12:55.733606 | Apache-2.0 | false | 7982893170bc543f5673cbe6f7f54f98 |
name: Comment when db change labels are added\n\non:\n pull_request:\n types: [labeled]\n\nenv:\n MESSAGE: |\n ### Hello, I'm a bot 🤖 \n\n You are receiving this message because you declared that this PR make changes to the Meilisearch database.\n Depending on the nature of the change, additional actions might be required on your part. The following sections detail the additional actions depending on the nature of the change, please copy the relevant section in the description of your PR, and make sure to perform the required actions.\n\n Thank you for contributing to Meilisearch :heart:\n\n ## This PR makes forward-compatible changes\n\n *Forward-compatible changes are changes to the database such that databases created in an older version of Meilisearch are still valid in the new version of Meilisearch. They usually represent additive changes, like adding a new optional attribute or setting.*\n\n - [ ] Detail the change to the DB format and why they are forward compatible\n - [ ] Forward-compatibility: A database created before this PR and using the features touched by this PR was able to be opened by a Meilisearch produced by the code of this PR.\n\n\n ## This PR makes breaking changes\n\n *Breaking changes are changes to the database such that databases created in an older version of Meilisearch need changes to remain valid in the new version of Meilisearch. This typically happens when the way to store the data changed (change of database, new required key, etc). This can also happen due to breaking changes in the API of an experimental feature. 
⚠️ This kind of changes are more difficult to achieve safely, so proceed with caution and test dumpless upgrade right before merging the PR.*\n\n - [ ] Detail the changes to the DB format,\n - [ ] which are compatible, and why\n - [ ] which are not compatible, why, and how they will be fixed up in the upgrade\n - [ ] /!\ Ensure all the read operations still work!\n - If the change happened in milli, you may need to check the version of the database before doing any read operation\n - If the change happened in the index-scheduler, make sure the new code can immediately read the old database\n - If the change happened in the meilisearch-auth database, reach out to the team; we don't know yet how to handle these changes\n - [ ] Write the code to go from the old database to the new one\n - If the change happened in milli, the upgrade function should be written and called [here](https://github.com/meilisearch/meilisearch/blob/3fd86e8d76d7d468b0095d679adb09211ca3b6c0/crates/milli/src/update/upgrade/mod.rs#L24-L47)\n - If the change happened in the index-scheduler, we've never done it yet, but the right place to do it should be [here](https://github.com/meilisearch/meilisearch/blob/3fd86e8d76d7d468b0095d679adb09211ca3b6c0/crates/index-scheduler/src/scheduler/process_upgrade/mod.rs#L13)\n - [ ] Write an integration test [here](https://github.com/meilisearch/meilisearch/blob/main/crates/meilisearch/tests/upgrade/mod.rs) ensuring you can read the old database, upgrade to the new database, and read the new database as expected\n \n\njobs:\n add-comment:\n runs-on: ubuntu-latest\n if: github.event.label.name == 'db change'\n steps:\n - name: Add comment\n uses: actions/github-script@v7\n with:\n github-token: ${{ secrets.GITHUB_TOKEN }}\n script: |\n const message = process.env.MESSAGE;\n github.rest.issues.createComment({\n issue_number: context.issue.number,\n owner: context.repo.owner,\n repo: context.repo.repo,\n body: message\n })\n | 
dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\db-change-comments.yml | db-change-comments.yml | YAML | 3,721 | 0.95 | 0.052632 | 0.113636 | awesome-app | 275 | 2024-02-08T05:07:59.777848 | GPL-3.0 | false | f1bdf669af77a5a940fe91449288e6b9 |
name: Check db change labels\n\non:\n pull_request:\n types: [opened, synchronize, reopened, labeled, unlabeled]\n\nenv:\n GH_TOKEN: ${{ secrets.MEILI_BOT_GH_PAT }}\n\njobs:\n check-labels:\n runs-on: ubuntu-latest\n steps:\n - name: Checkout code\n uses: actions/checkout@v3\n - name: Check db change labels\n id: check_labels\n run: |\n URL=/repos/meilisearch/meilisearch/pulls/${{ github.event.pull_request.number }}/labels\n echo ${{ github.event.pull_request.number }}\n echo $URL\n LABELS=$(gh api -H "Accept: application/vnd.github+json" -H "X-GitHub-Api-Version: 2022-11-28" /repos/meilisearch/meilisearch/issues/${{ github.event.pull_request.number }}/labels -q .[].name)\n if [[ ! "$LABELS" =~ "db change" && ! "$LABELS" =~ "no db change" ]]; then\n echo "::error::Pull request must contain either the 'db change' or 'no db change' label."\n exit 1\n else\n echo "The label is set"\n fi\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\db-change-missing.yml | db-change-missing.yml | YAML | 1,009 | 0.7 | 0.035714 | 0 | node-utils | 849 | 2024-12-22T04:49:05.552252 | GPL-3.0 | false | 3348d567b4af0472d81f12dd387c50b2 |
name: Create issue to upgrade dependencies\n\non:\n schedule:\n # Run the first of the month, every 6 months\n - cron: '0 0 1 */6 *'\n workflow_dispatch:\n\njobs:\n create-issue:\n runs-on: ubuntu-latest\n env:\n ISSUE_TEMPLATE: issue-template.md\n GH_TOKEN: ${{ secrets.MEILI_BOT_GH_PAT }}\n steps:\n - uses: actions/checkout@v3\n - name: Download the issue template\n run: curl -s https://raw.githubusercontent.com/meilisearch/engine-team/main/issue-templates/dependency-issue.md > $ISSUE_TEMPLATE\n - name: Create issue\n run: |\n gh issue create \\n --title 'Upgrade dependencies' \\n --label 'dependencies,maintenance' \\n --body-file $ISSUE_TEMPLATE\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\dependency-issue.yml | dependency-issue.yml | YAML | 713 | 0.8 | 0 | 0.045455 | react-lib | 495 | 2024-08-27T00:44:47.115246 | MIT | false | 006924eabb1923feb8c79bb6065fe106 |
name: Run the indexing fuzzer\n\non:\n push:\n branches:\n - main\n\njobs:\n fuzz:\n name: Setup the action\n runs-on: ubuntu-latest\n timeout-minutes: 4320 # 72h\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n\n # Run benchmarks\n - name: Run the fuzzer\n run: |\n cargo run --release --bin fuzz-indexing\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\fuzzer-indexing.yml | fuzzer-indexing.yml | YAML | 417 | 0.8 | 0 | 0.052632 | react-lib | 554 | 2024-01-17T11:55:08.638945 | Apache-2.0 | false | 616bfad9010fe08d9aa7f1dd7635609f |
name: Milestone's workflow\n\n# /!\ No git flow is handled here\n\n# For each Milestone created (not opened!), and if the release is NOT a patch release (only the patch changed)\n# - the roadmap issue is created, see https://github.com/meilisearch/engine-team/blob/main/issue-templates/roadmap-issue.md\n# - the changelog issue is created, see https://github.com/meilisearch/engine-team/blob/main/issue-templates/changelog-issue.md\n# - update the ruleset to add the current release version to the list of allowed versions and be able to use the merge queue.\n\n# For each Milestone closed\n# - the `release_version` label is created\n# - this label is applied to all issues/PRs in the Milestone\n\non:\n milestone:\n types: [created, closed]\n\nenv:\n MILESTONE_VERSION: ${{ github.event.milestone.title }}\n MILESTONE_URL: ${{ github.event.milestone.html_url }}\n MILESTONE_DUE_ON: ${{ github.event.milestone.due_on }}\n GH_TOKEN: ${{ secrets.MEILI_BOT_GH_PAT }}\n\njobs:\n # -----------------\n # MILESTONE CREATED\n # -----------------\n\n get-release-version:\n if: github.event.action == 'created'\n runs-on: ubuntu-latest\n outputs:\n is-patch: ${{ steps.check-patch.outputs.is-patch }}\n steps:\n - uses: actions/checkout@v3\n - name: Check if this release is a patch release only\n id: check-patch\n run: |\n echo version: $MILESTONE_VERSION\n if [[ $MILESTONE_VERSION =~ ^v[0-9]+\.[0-9]+\.0$ ]]; then\n echo 'This is NOT a patch release'\n echo "is-patch=false" >> $GITHUB_OUTPUT\n elif [[ $MILESTONE_VERSION =~ ^v[0-9]+\.[0-9]+\.[0-9]+$ ]]; then\n echo 'This is a patch release'\n echo "is-patch=true" >> $GITHUB_OUTPUT\n else\n echo "Not a valid format of release, check the Milestone's title."\n echo 'Should be vX.Y.Z'\n exit 1\n fi\n\n create-roadmap-issue:\n needs: get-release-version\n # Create the roadmap issue if the release is not only a patch release\n if: github.event.action == 'created' && needs.get-release-version.outputs.is-patch == 'false'\n runs-on: ubuntu-latest\n 
env:\n ISSUE_TEMPLATE: issue-template.md\n steps:\n - uses: actions/checkout@v3\n - name: Download the issue template\n run: curl -s https://raw.githubusercontent.com/meilisearch/engine-team/main/issue-templates/roadmap-issue.md > $ISSUE_TEMPLATE\n - name: Replace all empty occurrences in the templates\n run: |\n # Replace all <<version>> occurrences\n sed -i "s/<<version>>/$MILESTONE_VERSION/g" $ISSUE_TEMPLATE\n\n # Replace all <<milestone_id>> occurrences\n milestone_id=$(echo $MILESTONE_URL | cut -d '/' -f 7)\n sed -i "s/<<milestone_id>>/$milestone_id/g" $ISSUE_TEMPLATE\n\n # Replace release date if exists\n if [[ ! -z $MILESTONE_DUE_ON ]]; then\n date=$(echo $MILESTONE_DUE_ON | cut -d 'T' -f 1)\n sed -i "s/Release date\: 20XX-XX-XX/Release date\: $date/g" $ISSUE_TEMPLATE\n fi\n - name: Create the issue\n run: |\n gh issue create \\n --title "$MILESTONE_VERSION ROADMAP" \\n --label 'epic,impacts docs,impacts integrations,impacts cloud' \\n --body-file $ISSUE_TEMPLATE \\n --milestone $MILESTONE_VERSION\n\n create-changelog-issue:\n needs: get-release-version\n # Create the changelog issue if the release is not only a patch release\n if: github.event.action == 'created' && needs.get-release-version.outputs.is-patch == 'false'\n runs-on: ubuntu-latest\n env:\n ISSUE_TEMPLATE: issue-template.md\n steps:\n - uses: actions/checkout@v3\n - name: Download the issue template\n run: curl -s https://raw.githubusercontent.com/meilisearch/engine-team/main/issue-templates/changelog-issue.md > $ISSUE_TEMPLATE\n - name: Replace all empty occurrences in the templates\n run: |\n # Replace all <<version>> occurrences\n sed -i "s/<<version>>/$MILESTONE_VERSION/g" $ISSUE_TEMPLATE\n\n # Replace all <<milestone_id>> occurrences\n milestone_id=$(echo $MILESTONE_URL | cut -d '/' -f 7)\n sed -i "s/<<milestone_id>>/$milestone_id/g" $ISSUE_TEMPLATE\n - name: Create the issue\n run: |\n gh issue create \\n --title "Create release changelogs for $MILESTONE_VERSION" \\n --label 'impacts 
docs,documentation' \\n --body-file $ISSUE_TEMPLATE \\n --milestone $MILESTONE_VERSION \\n --assignee curquiza\n\n create-update-version-issue:\n needs: get-release-version\n # Create the update-version issue even if the release is a patch release\n if: github.event.action == 'created'\n runs-on: ubuntu-latest\n env:\n ISSUE_TEMPLATE: issue-template.md\n steps:\n - uses: actions/checkout@v3\n - name: Download the issue template\n run: curl -s https://raw.githubusercontent.com/meilisearch/engine-team/main/issue-templates/update-version-issue.md > $ISSUE_TEMPLATE\n - name: Create the issue\n run: |\n gh issue create \\n --title "Update version in Cargo.toml for $MILESTONE_VERSION" \\n --label 'maintenance' \\n --body-file $ISSUE_TEMPLATE \\n --milestone $MILESTONE_VERSION\n\n create-update-openapi-issue:\n needs: get-release-version\n # Create the openAPI issue if the release is not only a patch release\n if: github.event.action == 'created' && needs.get-release-version.outputs.is-patch == 'false'\n runs-on: ubuntu-latest\n env:\n ISSUE_TEMPLATE: issue-template.md\n steps:\n - uses: actions/checkout@v3\n - name: Download the issue template\n run: curl -s https://raw.githubusercontent.com/meilisearch/engine-team/main/issue-templates/update-openapi-issue.md > $ISSUE_TEMPLATE\n - name: Create the issue\n run: |\n gh issue create \\n --title "Update Open API file for $MILESTONE_VERSION" \\n --label 'maintenance' \\n --body-file $ISSUE_TEMPLATE \\n --milestone $MILESTONE_VERSION\n\n update-ruleset:\n runs-on: ubuntu-latest\n if: github.event.action == 'created'\n steps:\n - uses: actions/checkout@v3\n - name: Install jq\n run: |\n sudo apt-get update\n sudo apt-get install -y jq\n - name: Update ruleset\n env:\n # gh api repos/meilisearch/meilisearch/rulesets --jq '.[] | {name: .name, id: .id}'\n RULESET_ID: 4253297\n BRANCH_NAME: ${{ github.event.inputs.branch_name }}\n run: |\n echo "RULESET_ID: ${{ env.RULESET_ID }}"\n echo "BRANCH_NAME: ${{ env.BRANCH_NAME }}"\n\n # 
Get current ruleset conditions\n CONDITIONS=$(gh api repos/meilisearch/meilisearch/rulesets/${{ env.RULESET_ID }} --jq '{ conditions: .conditions }')\n\n # Update the conditions by appending the milestone version\n UPDATED_CONDITIONS=$(echo $CONDITIONS | jq '.conditions.ref_name.include += ["refs/heads/release-'${{ env.MILESTONE_VERSION }}'"]')\n\n # Update the ruleset from stdin (-)\n echo $UPDATED_CONDITIONS |\n gh api repos/meilisearch/meilisearch/rulesets/${{ env.RULESET_ID }} \\n --method PUT \\n -H "Accept: application/vnd.github+json" \\n -H "X-GitHub-Api-Version: 2022-11-28" \\n --input -\n\n # ----------------\n # MILESTONE CLOSED\n # ----------------\n\n create-release-label:\n if: github.event.action == 'closed'\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v3\n - name: Create the ${{ env.MILESTONE_VERSION }} label\n run: |\n label_description="PRs/issues solved in $MILESTONE_VERSION"\n if [[ ! -z $MILESTONE_DUE_ON ]]; then\n date=$(echo $MILESTONE_DUE_ON | cut -d 'T' -f 1)\n label_description="$label_description released on $date"\n fi\n\n gh api repos/meilisearch/meilisearch/labels \\n --method POST \\n -H "Accept: application/vnd.github+json" \\n -f name="$MILESTONE_VERSION" \\n -f description="$label_description" \\n -f color='ff5ba3'\n\n labelize-all-milestone-content:\n if: github.event.action == 'closed'\n needs: create-release-label\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v3\n - name: Add label ${{ env.MILESTONE_VERSION }} to all PRs in the Milestone\n run: |\n prs=$(gh pr list --search milestone:"$MILESTONE_VERSION" --limit 1000 --state all --json number --template '{{range .}}{{tablerow (printf "%v" .number)}}{{end}}')\n for pr in $prs; do\n gh pr edit $pr --add-label $MILESTONE_VERSION\n done\n - name: Add label ${{ env.MILESTONE_VERSION }} to all issues in the Milestone\n run: |\n issues=$(gh issue list --search milestone:"$MILESTONE_VERSION" --limit 1000 --state all --json number --template '{{range 
.}}{{tablerow (printf "%v" .number)}}{{end}}')\n for issue in $issues; do\n gh issue edit $issue --add-label $MILESTONE_VERSION\n done\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\milestone-workflow.yml | milestone-workflow.yml | YAML | 9,159 | 0.8 | 0.102679 | 0.133663 | react-lib | 399 | 2024-12-04T08:52:48.121320 | GPL-3.0 | false | eefd408734d35ca1f21523cb9095473c |
name: Publish to APT & Homebrew\n\non:\n release:\n types: [released]\n\njobs:\n check-version:\n name: Check the version validity\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v3\n - name: Check release validity\n run: bash .github/scripts/check-release.sh\n\n debian:\n name: Publish debian package\n runs-on: ubuntu-latest\n needs: check-version\n container:\n # Use ubuntu-22.04 to compile with glibc 2.35\n image: ubuntu:22.04\n steps:\n - name: Install needed dependencies\n run: |\n apt-get update && apt-get install -y curl\n apt-get install build-essential -y\n - uses: dtolnay/rust-toolchain@1.85\n - name: Install cargo-deb\n run: cargo install cargo-deb\n - uses: actions/checkout@v3\n - name: Build deb package\n run: cargo deb -p meilisearch -o target/debian/meilisearch.deb\n - name: Upload debian pkg to release\n uses: svenstaro/upload-release-action@2.7.0\n with:\n repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}\n file: target/debian/meilisearch.deb\n asset_name: meilisearch.deb\n tag: ${{ github.ref }}\n - name: Upload debian pkg to apt repository\n run: curl -F package=@target/debian/meilisearch.deb https://${{ secrets.GEMFURY_PUSH_TOKEN }}@push.fury.io/meilisearch/\n\n homebrew:\n name: Bump Homebrew formula\n runs-on: ubuntu-latest\n needs: check-version\n steps:\n - name: Create PR to Homebrew\n uses: mislav/bump-homebrew-formula-action@v3\n with:\n formula-name: meilisearch\n formula-path: Formula/m/meilisearch.rb\n env:\n COMMITTER_TOKEN: ${{ secrets.HOMEBREW_COMMITTER_TOKEN }}\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\publish-apt-brew-pkg.yml | publish-apt-brew-pkg.yml | YAML | 1,728 | 0.8 | 0 | 0.019608 | vue-tools | 710 | 2023-08-21T13:36:26.542596 | Apache-2.0 | false | d14d0f67d182b68ce398128856d322af |
name: Publish binaries to GitHub release\n\non:\n workflow_dispatch:\n schedule:\n - cron: "0 2 * * *" # Every day at 2:00am\n release:\n types: [published]\n\njobs:\n check-version:\n name: Check the version validity\n runs-on: ubuntu-latest\n # No need to check the version for dry run (cron)\n steps:\n - uses: actions/checkout@v3\n # Check if the tag has the v<number>.<number>.<number> format.\n # If yes, it means we are publishing an official release.\n # If no, we are releasing an RC, so no need to check the version.\n - name: Check tag format\n if: github.event_name == 'release'\n id: check-tag-format\n run: |\n escaped_tag=$(printf "%q" ${{ github.ref_name }})\n\n if [[ $escaped_tag =~ ^v[0-9]+\.[0-9]+\.[0-9]+$ ]]; then\n echo "stable=true" >> $GITHUB_OUTPUT\n else\n echo "stable=false" >> $GITHUB_OUTPUT\n fi\n - name: Check release validity\n if: github.event_name == 'release' && steps.check-tag-format.outputs.stable == 'true'\n run: bash .github/scripts/check-release.sh\n\n publish-linux:\n name: Publish binary for Linux\n runs-on: ubuntu-latest\n needs: check-version\n container:\n # Use ubuntu-22.04 to compile with glibc 2.35\n image: ubuntu:22.04\n steps:\n - uses: actions/checkout@v3\n - name: Install needed dependencies\n run: |\n apt-get update && apt-get install -y curl\n apt-get install build-essential -y\n - uses: dtolnay/rust-toolchain@1.85\n - name: Build\n run: cargo build --release --locked\n # No need to upload binaries for dry run (cron)\n - name: Upload binaries to release\n if: github.event_name == 'release'\n uses: svenstaro/upload-release-action@2.7.0\n with:\n repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}\n file: target/release/meilisearch\n asset_name: meilisearch-linux-amd64\n tag: ${{ github.ref }}\n\n publish-macos-windows:\n name: Publish binary for ${{ matrix.os }}\n runs-on: ${{ matrix.os }}\n needs: check-version\n strategy:\n fail-fast: false\n matrix:\n os: [macos-13, windows-2022]\n include:\n - os: macos-13\n artifact_name: 
meilisearch\n asset_name: meilisearch-macos-amd64\n - os: windows-2022\n artifact_name: meilisearch.exe\n asset_name: meilisearch-windows-amd64.exe\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n - name: Build\n run: cargo build --release --locked\n # No need to upload binaries for dry run (cron)\n - name: Upload binaries to release\n if: github.event_name == 'release'\n uses: svenstaro/upload-release-action@2.7.0\n with:\n repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}\n file: target/release/${{ matrix.artifact_name }}\n asset_name: ${{ matrix.asset_name }}\n tag: ${{ github.ref }}\n\n publish-macos-apple-silicon:\n name: Publish binary for macOS silicon\n runs-on: macos-13\n needs: check-version\n strategy:\n matrix:\n include:\n - target: aarch64-apple-darwin\n asset_name: meilisearch-macos-apple-silicon\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Installing Rust toolchain\n uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n target: ${{ matrix.target }}\n - name: Cargo build\n uses: actions-rs/cargo@v1\n with:\n command: build\n args: --release --target ${{ matrix.target }}\n - name: Upload the binary to release\n # No need to upload binaries for dry run (cron)\n if: github.event_name == 'release'\n uses: svenstaro/upload-release-action@2.7.0\n with:\n repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}\n file: target/${{ matrix.target }}/release/meilisearch\n asset_name: ${{ matrix.asset_name }}\n tag: ${{ github.ref }}\n\n publish-aarch64:\n name: Publish binary for aarch64\n runs-on: ubuntu-latest\n needs: check-version\n env:\n DEBIAN_FRONTEND: noninteractive\n container:\n # Use ubuntu-22.04 to compile with glibc 2.35\n image: ubuntu:22.04\n strategy:\n matrix:\n include:\n - target: aarch64-unknown-linux-gnu\n asset_name: meilisearch-linux-aarch64\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install needed dependencies\n run: |\n apt-get update -y && apt 
upgrade -y\n apt-get install -y curl build-essential gcc-aarch64-linux-gnu\n - name: Set up Docker for cross compilation\n run: |\n apt-get install -y curl apt-transport-https ca-certificates software-properties-common\n curl -fsSL https://download.docker.com/linux/ubuntu/gpg | apt-key add -\n add-apt-repository "deb [arch=$(dpkg --print-architecture)] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"\n apt-get update -y && apt-get install -y docker-ce\n - name: Installing Rust toolchain\n uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n target: ${{ matrix.target }}\n - name: Configure target aarch64 GNU\n ## Environment variable is not passed using env:\n ## LD gold won't work with MUSL\n # env:\n # JEMALLOC_SYS_WITH_LG_PAGE: 16\n # RUSTFLAGS: '-Clink-arg=-fuse-ld=gold'\n run: |\n echo '[target.aarch64-unknown-linux-gnu]' >> ~/.cargo/config\n echo 'linker = "aarch64-linux-gnu-gcc"' >> ~/.cargo/config\n echo 'JEMALLOC_SYS_WITH_LG_PAGE=16' >> $GITHUB_ENV\n - name: Install a default toolchain that will be used to build cargo cross\n run: |\n rustup default stable\n - name: Cargo build\n uses: actions-rs/cargo@v1\n with:\n command: build\n use-cross: true\n args: --release --target ${{ matrix.target }}\n env:\n CROSS_DOCKER_IN_DOCKER: true\n - name: List target output files\n run: ls -lR ./target\n - name: Upload the binary to release\n # No need to upload binaries for dry run (cron)\n if: github.event_name == 'release'\n uses: svenstaro/upload-release-action@2.7.0\n with:\n repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}\n file: target/${{ matrix.target }}/release/meilisearch\n asset_name: ${{ matrix.asset_name }}\n tag: ${{ github.ref }}\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\publish-binaries.yml | publish-binaries.yml | YAML | 6,619 | 0.8 | 0.096774 | 0.083799 | react-lib | 322 | 2023-08-26T01:44:06.476048 | GPL-3.0 | false | d2e59f7c4c1811cc9697813eeac94767 |
name: Publish images to Docker Hub\n\non:\n push:\n # Will run for every tag pushed except `latest`\n # When the `latest` git tag is created with this [CI](../latest-git-tag.yml)\n # we don't need to create a Docker `latest` image again.\n # The `latest` Docker image push is already done in this CI when releasing a stable version of Meilisearch.\n tags-ignore:\n - latest\n # Both `schedule` and `workflow_dispatch` build the nightly tag\n schedule:\n - cron: '0 23 * * *' # Every day at 11:00pm\n workflow_dispatch:\n\njobs:\n docker:\n runs-on: docker\n steps:\n - uses: actions/checkout@v3\n\n # If we are running a cron or manual job ('schedule' or 'workflow_dispatch' event), it means we are publishing the `nightly` tag, so not considered stable.\n # If we have pushed a tag, and the tag has the v<number>.<number>.<number> format, it means we are publishing an official release, so considered stable.\n # In this situation, we need to set `output.stable` to create/update the following tags (additionally to the `vX.Y.Z` Docker tag):\n # - a `vX.Y` (without patch version) Docker tag\n # - a `latest` Docker tag\n # For any other tag pushed, this is not considered stable.\n - name: Define if stable and latest release\n id: check-tag-format\n env:\n # To avoid request limit with the .github/scripts/is-latest-release.sh script\n GITHUB_PATH: ${{ secrets.MEILI_BOT_GH_PAT }}\n run: |\n escaped_tag=$(printf "%q" ${{ github.ref_name }})\n echo "latest=false" >> $GITHUB_OUTPUT\n\n if [[ ${{ github.event_name }} != 'push' ]]; then\n echo "stable=false" >> $GITHUB_OUTPUT\n elif [[ $escaped_tag =~ ^v[0-9]+\.[0-9]+\.[0-9]+$ ]]; then\n echo "stable=true" >> $GITHUB_OUTPUT\n echo "latest=$(sh .github/scripts/is-latest-release.sh)" >> $GITHUB_OUTPUT\n else\n echo "stable=false" >> $GITHUB_OUTPUT\n fi\n\n # Check only the validity of the tag for stable releases (not for pre-releases or other tags)\n - name: Check release validity\n if: steps.check-tag-format.outputs.stable == 'true'\n 
run: bash .github/scripts/check-release.sh\n\n - name: Set build-args for Docker buildx\n id: build-metadata\n run: |\n # Extract commit date\n commit_date=$(git show -s --format=%cd --date=iso-strict ${{ github.sha }})\n\n echo "date=$commit_date" >> $GITHUB_OUTPUT\n\n - name: Set up QEMU\n uses: docker/setup-qemu-action@v3\n\n - name: Set up Docker Buildx\n uses: docker/setup-buildx-action@v3\n\n - name: Login to Docker Hub\n uses: docker/login-action@v3\n with:\n username: ${{ secrets.DOCKERHUB_USERNAME }}\n password: ${{ secrets.DOCKERHUB_TOKEN }}\n\n - name: Docker meta\n id: meta\n uses: docker/metadata-action@v5\n with:\n images: getmeili/meilisearch\n # Prevent `latest` to be updated for each new tag pushed.\n # We need latest and `vX.Y` tags to only be pushed for the stable Meilisearch releases.\n flavor: latest=false\n tags: |\n type=ref,event=tag\n type=raw,value=nightly,enable=${{ github.event_name != 'push' }}\n type=semver,pattern=v{{major}}.{{minor}},enable=${{ steps.check-tag-format.outputs.stable == 'true' }}\n type=semver,pattern=v{{major}},enable=${{ steps.check-tag-format.outputs.stable == 'true' }}\n type=raw,value=latest,enable=${{ steps.check-tag-format.outputs.stable == 'true' && steps.check-tag-format.outputs.latest == 'true' }}\n\n - name: Build and push\n uses: docker/build-push-action@v6\n with:\n push: true\n platforms: linux/amd64,linux/arm64\n tags: ${{ steps.meta.outputs.tags }}\n build-args: |\n COMMIT_SHA=${{ github.sha }}\n COMMIT_DATE=${{ steps.build-metadata.outputs.date }}\n GIT_TAG=${{ github.ref_name }}\n\n # /!\ Don't touch this without checking with Cloud team\n - name: Send CI information to Cloud team\n # Do not send if nightly build (i.e. 
'schedule' or 'workflow_dispatch' event)\n if: github.event_name == 'push'\n uses: peter-evans/repository-dispatch@v3\n with:\n token: ${{ secrets.MEILI_BOT_GH_PAT }}\n repository: meilisearch/meilisearch-cloud\n event-type: cloud-docker-build\n client-payload: '{ "meilisearch_version": "${{ github.ref_name }}", "stable": "${{ steps.check-tag-format.outputs.stable }}" }'\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\publish-docker-images.yml | publish-docker-images.yml | YAML | 4,608 | 0.8 | 0.103774 | 0.193548 | vue-tools | 871 | 2024-08-03T22:45:09.557710 | BSD-3-Clause | false | 03b5f6c1c15558b6b67bdbb0cedfde72 |
name: Update Meilisearch version in Cargo.toml\n\non:\n workflow_dispatch:\n inputs:\n new_version:\n description: "The new version (vX.Y.Z)"\n required: true\n\nenv:\n NEW_VERSION: ${{ github.event.inputs.new_version }}\n NEW_BRANCH: update-version-${{ github.event.inputs.new_version }}\n GH_TOKEN: ${{ secrets.MEILI_BOT_GH_PAT }}\n\njobs:\n update-version-cargo-toml:\n name: Update version in Cargo.toml\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v3\n - uses: dtolnay/rust-toolchain@1.85\n with:\n profile: minimal\n - name: Install sd\n run: cargo install sd\n - name: Update Cargo.toml file\n run: |\n raw_new_version=$(echo $NEW_VERSION | cut -d 'v' -f 2)\n new_string="version = \"$raw_new_version\""\n sd '^version = "\d+.\d+.\w+"$' "$new_string" Cargo.toml\n - name: Build Meilisearch to update Cargo.lock\n run: cargo build\n - name: Commit and push the changes to the ${{ env.NEW_BRANCH }} branch\n uses: EndBug/add-and-commit@v9\n with:\n message: "Update version for the next release (${{ env.NEW_VERSION }}) in Cargo.toml"\n new_branch: ${{ env.NEW_BRANCH }}\n - name: Create the PR pointing to ${{ github.ref_name }}\n run: |\n gh pr create \\n --title "Update version for the next release ($NEW_VERSION) in Cargo.toml" \\n --body '⚠️ This PR is automatically generated. Check the new version is the expected one and Cargo.lock has been updated before merging.' \\n --label 'skip changelog' \\n --milestone $NEW_VERSION \\n --base $GITHUB_REF_NAME\n | dataset_sample\yaml\meilisearch_meilisearch\.github\workflows\update-cargo-toml-version.yml | update-cargo-toml-version.yml | YAML | 1,680 | 0.85 | 0.044444 | 0 | python-kit | 925 | 2023-08-05T07:49:16.464010 | Apache-2.0 | false | 3adac6b7c536d029a790899e83ec3fa0 |
global:\n scrape_interval: 15s # By default, scrape targets every 15 seconds.\n\n # Attach these labels to any time series or alerts when communicating with\n # external systems (federation, remote storage, Alertmanager).\n external_labels:\n monitor: 'codelab-monitor'\n\n# A scrape configuration containing exactly one endpoint to scrape:\n# Here it's Prometheus itself.\nscrape_configs:\n # The job name is added as a label `job=<job_name>` to any timeseries scraped from this config.\n - job_name: 'meilisearch'\n\n # Override the global default and scrape targets from this job every 5 seconds.\n scrape_interval: 5s\n\n static_configs:\n - targets: ['localhost:7700'] | dataset_sample\yaml\meilisearch_meilisearch\assets\prometheus-basic-scraper.yml | prometheus-basic-scraper.yml | YAML | 682 | 0.8 | 0 | 0.4 | python-kit | 547 | 2023-09-13T16:16:23.538405 | BSD-3-Clause | false | 1df5c02f3c5d8dd8801ab1edccb0100d |
name: storytelling\nchannels:\n - conda-forge\ndependencies:\n - python=3.5\n - pygpu\n - mkl-service\n - pip:\n - forbiddenfruit \n - azureml-defaults\n - scipy\n - nltk\n - sklearn\n - Cython\n - theano\n - nose\n - https://github.com/Lasagne/Lasagne/archive/master.zip\n - scikit-image\n - Pillow\n - azureml-sdk\n - gensim\n - opencv-python\n \n | dataset_sample\yaml\microsoft_ailab\Pix2Story\environment.yml | environment.yml | YAML | 375 | 0.8 | 0 | 0 | vue-tools | 108 | 2024-09-08T20:25:43.945819 | MIT | false | 4200bd3f39864ab9bb65aae3c3775978 |
# Conda environment specification. The dependencies defined in this file will\n\n# be automatically provisioned for runs with userManagedDependencies=False.\n\n\n# Details about the Conda environment file format:\n\n# https://conda.io/docs/user-guide/tasks/manage-environments.html#create-env-file-manually\n\n\nname: project_environment\ndependencies:\n # The python interpreter version.\n\n # Currently Azure ML only supports 3.5.2 and later.\n\n- python=3.5.5\n\n- pip:\n # Required packages for AzureML execution, history, and data preparation.\n\n - azureml-defaults\n - scikit-image\n - nose\n - nltk\n - Cython\n - sklearn\n - Pillow\n - azureml-sdk\n - gensim\n - opencv-python==3.3.0.9\n- scipy\n- pandas\n | dataset_sample\yaml\microsoft_ailab\Pix2Story\source\config_deploy\myenv.yml | myenv.yml | YAML | 697 | 0.8 | 0.060606 | 0.304348 | awesome-app | 55 | 2025-03-26T16:25:17.417209 | MIT | false | b4d89960f6adc3919ec92e5d20840886 |
steps:\n- script: |\n chmod +x build.linux.sh\n ./build.linux.sh\n- task: PublishBuildArtifacts@1\n displayName: Publish Linux package assets\n inputs:\n pathtoPublish: $(Build.SourcesDirectory)/Installers/Linux/build\n artifactName: Package\n artifactType: container | dataset_sample\yaml\microsoft_ailab\Snip-Insights\.vsts.linux.yml | .vsts.linux.yml | YAML | 275 | 0.7 | 0 | 0 | vue-tools | 919 | 2024-05-20T16:14:42.473827 | Apache-2.0 | false | 6e259d8a4fec2e2cf24dbebdcf7b8b16 |
steps:\n- script: |\n chmod +x build.mac.sh\n ./build.mac.sh\n- task: PublishBuildArtifacts@1\n displayName: Publish Mac package assets\n inputs:\n pathtoPublish: $(Build.SourcesDirectory)/Installers/Mac/build\n artifactName: Package\n artifactType: container | dataset_sample\yaml\microsoft_ailab\Snip-Insights\.vsts.mac.yml | .vsts.mac.yml | YAML | 267 | 0.7 | 0 | 0 | react-lib | 608 | 2025-04-06T13:03:48.682281 | Apache-2.0 | false | 44ee43443726d3649701800c2b0aa20e |
steps:\n- powershell: |\n ./build.windows.ps1\n- task: PublishBuildArtifacts@1\n displayName: Publish Windows package assets\n inputs:\n pathtoPublish: $(Build.SourcesDirectory)/bin/x64/Release\n artifactName: Package\n artifactType: container | dataset_sample\yaml\microsoft_ailab\Snip-Insights\.vsts.windows.yml | .vsts.windows.yml | YAML | 249 | 0.7 | 0 | 0 | node-utils | 975 | 2024-10-03T20:35:32.959638 | BSD-3-Clause | false | d93941b8d5109f321fda2a9fd0f649de |
services:\n garnet:\n image: 'ghcr.io/microsoft/garnet'\n ulimits:\n memlock: -1\n ports:\n - "6379:6379"\n # To avoid docker NAT, consider `host` mode.\n # https://docs.docker.com/compose/compose-file/compose-file-v3/#network_mode\n # network_mode: "host"\n volumes:\n - garnetdata:/data\nvolumes:\n garnetdata:\n | dataset_sample\yaml\microsoft_garnet\docker-compose.yml | docker-compose.yml | YAML | 338 | 0.8 | 0 | 0.214286 | vue-tools | 984 | 2025-02-14T20:40:59.906028 | Apache-2.0 | false | 04a124f407db087a22ca9a0a7c070272 |
trigger:\n branches:\n exclude:\n - continuousbenchmark\npr:\n branches:\n exclude:\n - continuousbenchmark\nresources:\n repositories:\n - repository: self\n type: git\njobs: \n - job: Phase_1 \n displayName: Assessment\n cancelTimeoutInMinutes: 1\n pool:\n name: Azure Pipelines\n vmImage: windows-latest\n steps:\n - checkout: self\n clean: False\n submodules: recursive\n persistCredentials: True\n - task: securedevelopmentteam.vss-secure-development-tools.build-task-policheck.PoliCheck@2\n displayName: 'Run PoliCheck'\n inputs:\n targetType: F\n optionsFC: 1\n optionsUEPATH: '$(Build.SourcesDirectory)/.azure/pipelines/policheck-exclusion.xml'\n optionsSEV: '1|2|3|4'\n - task: securedevelopmentteam.vss-secure-development-tools.build-task-postanalysis.PostAnalysis@2\n displayName: 'Policheck Break Build'\n inputs:\n GdnBreakAllTools: false\n GdnBreakGdnToolPoliCheck: true\n - task: securedevelopmentteam.vss-secure-development-tools.build-task-report.SdtReport@2\n displayName: 'Create Security Analysis Report'\n inputs:\n GdnExportTsvFile: true\n GdnExportAllTools: false\n GdnExportGdnToolPoliCheck: true\n GdnExportGdnToolPoliCheckSeverity: Error\n - task: securedevelopmentteam.vss-secure-development-tools.build-task-publishsecurityanalysislogs.PublishSecurityAnalysisLogs@3\n displayName: 'Publish Security Analysis Logs'\n - task: securedevelopmentteam.vss-secure-development-tools.build-task-postanalysis.PostAnalysis@2\n displayName: 'Post Analysis'\n inputs:\n GdnBreakAllTools: false\n GdnBreakGdnToolPoliCheck: true\n | dataset_sample\yaml\microsoft_garnet\.azure\pipelines\azure-pipelines-compliance-policheck.yml | azure-pipelines-compliance-policheck.yml | YAML | 1,664 | 0.7 | 0 | 0 | node-utils | 570 | 2023-10-10T21:12:33.393909 | MIT | false | f2983e006b32a168f23bb5d0ba30bc79 |
trigger:\n branches:\n exclude:\n - continuousbenchmark\npr:\n branches:\n exclude:\n - continuousbenchmark\nvariables:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Release'\nresources:\n repositories:\n - repository: self\n type: git\njobs: \n - job: Phase_1 \n displayName: Assessment\n cancelTimeoutInMinutes: 1\n pool:\n name: Azure Pipelines\n vmImage: windows-latest\n steps:\n - checkout: self\n clean: False\n submodules: recursive\n persistCredentials: True\n - task: UseDotNet@2\n displayName: 'Use .NET Core sdk 8.0.x'\n inputs:\n version: 8.0.x\n - task: UseDotNet@2\n displayName: 'Use .NET Core sdk 9.0.x'\n inputs:\n version: 9.0.x\n - task: NuGetToolInstaller@1\n displayName: Nuget Tool Installer\n inputs:\n versionspec: '*'\n checkLatest: true\n - task: NuGetAuthenticate@1\n displayName: 'NuGet Authenticate'\n - task: DotNetCoreCLI@2\n displayName: dotnet build\n inputs:\n projects: '**/Garnet.*.csproj'\n arguments: '-c Release'\n - task: CopyFiles@2\n displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)'\n inputs:\n Contents: '**\bin\AnyCPU\$(BuildConfiguration)\**\*'\n TargetFolder: '$(Build.ArtifactStagingDirectory)'\n - task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0\n displayName: Component Detection\n inputs:\n detectorsToRun: NuGet,Npm\n scanType: 'Register'\n verbosity: 'Verbose'\n alertWarningLevel: 'High'\n - task: securedevelopmentteam.vss-secure-development-tools.build-task-credscan.CredScan@3\n name: CredScan6 \n displayName: Run CredScan\n continueOnError: True\n inputs:\n suppressionsFile: $(Build.SourcesDirectory)\.azure\pipelines\credscan-exclusion.json\n debugMode: false\n folderSuppression: false\n verboseOutput: true\n - task: securedevelopmentteam.vss-secure-development-tools.build-task-report.SdtReport@2\n name: SdtReport1\n displayName: 'Create Security Analysis Report'\n inputs:\n GdnExportTsvFile: true\n - task: 
securedevelopmentteam.vss-secure-development-tools.build-task-publishsecurityanalysislogs.PublishSecurityAnalysisLogs@3\n name: PublishSecurityAnalysisLogs12\n displayName: Publish Security Analysis Logs\n inputs:\n TargetPath: '\\my\share\$(Build.DefinitionName)\$(Build.BuildNumber)'\n AntiMalware: true\n APIScan: true\n CodesignValidation: true\n CredScan: true\n FortifySCA: true\n FxCop: true\n ModernCop: true\n MSRD: true\n SDLNativeRules: true\n Semmle: true\n TSLint: true\n WebScout: true\n - task: securedevelopmentteam.vss-secure-development-tools.build-task-postanalysis.PostAnalysis@2\n name: PostAnalysis13\n displayName: Post Analysis\n inputs:\n GdnBreakAllTools: false\n GdnBreakGdnToolCredScan: true\n GdnBreakGdnToolFxCop: true\n GdnBreakGdnToolFxCopSeverity: Error\n GdnBreakGdnToolSemmle: true\n | dataset_sample\yaml\microsoft_garnet\.azure\pipelines\azure-pipelines-compliance.yml | azure-pipelines-compliance.yml | YAML | 3,072 | 0.7 | 0 | 0 | vue-tools | 910 | 2024-04-04T21:59:35.204967 | GPL-3.0 | false | 1049b341cee3f11e9f836d5c33eb9d5a |
######################################\n# NOTE: Before running this pipeline to generate the GitHub Release and new nuget packages, update the VersionPrefix value in Version.props file\n# NOTE: When Version.props file is modified, this pipeline will automatically get triggered\n# NOTE: If this pipeline is run against a branch that isn't main, the GitHub Release task and NuGet push task will be skipped\n# NOTE: If this pipeline is manually started, a "Proof of Presence" (from a SAW machine) will be required before this pipeline can proceed.\n###################################### \ntrigger:\n branches:\n include:\n - main\n paths:\n include:\n - Version.props\nresources:\n repositories:\n - repository: self\n type: git\n ref: refs/heads/main\n\npool:\n vmImage: 'windows-latest'\n\njobs:\n- job: Phase_1\n displayName: Assessment\n cancelTimeoutInMinutes: 1\n pool:\n name: Azure Pipelines\n vmImage: 'windows-latest'\n steps:\n - checkout: self\n clean: False\n submodules: recursive\n persistCredentials: True\n\n - task: PowerShell@2\n displayName: 'Extract version number from Version.props'\n inputs:\n filePath: .azure/pipelines/extract_version.ps1\n workingDirectory: .azure/pipelines\n\n - task: UseDotNet@2\n displayName: Use .NET Core sdk 6.0.x - needed for code signing\n inputs:\n version: 6.0.x\n - task: UseDotNet@2\n displayName: Use .NET Core sdk 8.0.x\n inputs:\n version: 8.0.x\n - task: UseDotNet@2\n displayName: Use .NET Core sdk 9.0.x\n inputs:\n version: 9.0.x\n\n - task: DotNetCoreCLI@2\n displayName: dotnet build\n inputs:\n projects: '**/Garnet*.csproj'\n arguments: -c Release\n\n - task: PowerShell@2\n displayName: 'Publish the GarnetServer binaries'\n inputs:\n filePath: .azure/pipelines/createbinaries.ps1 \n arguments: 1\n workingDirectory: .azure/pipelines\n\n - task: EsrpCodeSigning@5\n displayName: Sign the binaries for nuget and zipped files\n inputs:\n ConnectedServiceName: 'GarnetCodeSignConn'\n UseMSIAuthentication: false \n 
AppRegistrationClientId: 'ff81d1e2-748b-49e1-a771-b57246668c2b'\n AppRegistrationTenantId: '975f013f-7f24-47e8-a7d3-abc4752bf346'\n AuthAKVName: 'GarnetCodeSignKeyVault'\n AuthSignCertName: 'garnet-codesign-signing-cert'\n AuthCertName: 'garnet-codesign-auth-cert'\n ServiceEndpointUrl: 'https://api.esrp.microsoft.com/api/v2'\n FolderPath: .\n Pattern: Garnet*.dll,Tsavorite*.dll,Garnet*.exe,HdrHistogram.dll,native_device.dll,*Lua.dll\n signConfigType: inlineSignParams\n inlineOperation: >-\n [\n {\n "KeyCode" : "CP-230012",\n "OperationCode" : "SigntoolSign",\n "Parameters" : {\n "OpusName" : "Microsoft",\n "OpusInfo" : "http://www.microsoft.com",\n "FileDigest" : "/fd \"SHA256\"",\n "PageHash" : "/NPH",\n "TimeStamp" : "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"\n },\n "ToolName" : "sign",\n "ToolVersion" : "1.0"\n },\n {\n "KeyCode" : "CP-230012",\n "OperationCode" : "SigntoolVerify",\n "Parameters" : {},\n "ToolName" : "sign",\n "ToolVersion" : "1.0"\n }\n ]\n SessionTimeout: 20\n VerboseLogin: true \n\n - task: CmdLine@2\n displayName: nuget pack Garnet Library\n inputs:\n workingDirectory: $(System.DefaultWorkingDirectory)\n script: 'dotnet pack --no-build --no-restore --output $(Build.ArtifactStagingDirectory) -p:PackageVersion=$(Build.BuildNumber) /p:Configuration=Release libs/host/Garnet.host.csproj'\n\n - task: CmdLine@2\n displayName: nuget pack Garnet Server\n inputs:\n workingDirectory: $(System.DefaultWorkingDirectory)\n script: 'dotnet pack --no-build --no-restore --output $(Build.ArtifactStagingDirectory) -p:PackageVersion=$(Build.BuildNumber) /p:Configuration=Release main/GarnetServer/GarnetServer.csproj'\n\n - task: EsrpCodeSigning@5\n displayName: Sign the NuGet Packages \n inputs:\n ConnectedServiceName: 'GarnetCodeSignConn'\n UseMSIAuthentication: false \n AppRegistrationClientId: 'ff81d1e2-748b-49e1-a771-b57246668c2b'\n AppRegistrationTenantId: '975f013f-7f24-47e8-a7d3-abc4752bf346'\n AuthAKVName: 
'GarnetCodeSignKeyVault'\n AuthSignCertName: 'garnet-codesign-signing-cert'\n AuthCertName: 'garnet-codesign-auth-cert'\n ServiceEndpointUrl: 'https://api.esrp.microsoft.com/api/v2'\n FolderPath: $(Build.ArtifactStagingDirectory)\n Pattern: Microsoft.Garnet.*.nupkg, garnet-server.*.nupkg\n signConfigType: inlineSignParams\n inlineOperation: >-\n [\n {\n "KeyCode" : "CP-401405",\n "OperationCode" : "NuGetSign",\n "Parameters" : {\n "OpusName" : "Microsoft",\n "OpusInfo" : "http://www.microsoft.com",\n "FileDigest" : "/fd \"SHA256\"",\n "PageHash" : "/NPH",\n "TimeStamp" : "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"\n },\n "ToolName" : "sign",\n "ToolVersion" : "1.0"\n },\n {\n "KeyCode" : "CP-401405",\n "OperationCode" : "NuGetVerify",\n "Parameters" : {},\n "ToolName" : "sign",\n "ToolVersion" : "1.0"\n }\n ]\n SessionTimeout: 20\n VerboseLogin: true\n\n # Do after Nuget Pack so not part of Nuget Pack\n - task: PowerShell@2\n displayName: 'Zip the GarnetServer binaries'\n inputs:\n filePath: .azure/pipelines/createbinaries.ps1 \n arguments: 2\n workingDirectory: .azure/pipelines\n\n - task: CopyFiles@2\n displayName: 'Copy Zipped Files (both net80 and net90 in zipped file) to Artifacts dir: $(Build.artifactstagingdirectory)'\n inputs:\n Contents: '**'\n SourceFolder: '$(Build.SourcesDirectory)/main/GarnetServer/bin/Release/publish/output'\n TargetFolder: $(build.artifactstagingdirectory)\n\n - task: PublishBuildArtifacts@1\n displayName: 'Publish Artifact: drop'\n\n - task: GitHubRelease@1\n displayName: 'Create the GitHub release'\n condition: eq(variables['Build.SourceBranchName'], 'main')\n inputs: \n action: 'create'\n gitHubConnection: ADO_to_Github_ServiceConnection\n tagSource: userSpecifiedTag\n tag: 'v$(Build.BuildNumber)'\n title: 'Garnet v$(Build.BuildNumber)'\n releaseNotesSource: inline\n releaseNotesInline: |\n Get NuGet binaries at:\n * Library: https://www.nuget.org/packages/Microsoft.Garnet\n * Tool: 
https://www.nuget.org/packages/garnet-server\n\n More information at:\n * https://microsoft.github.io/garnet\n * https://github.com/microsoft/garnet\n * https://www.microsoft.com/en-us/research/project/garnet\n \n assets: |\n $(Build.ArtifactStagingDirectory)/*.nupkg\n $(Build.ArtifactStagingDirectory)/*.zip\n $(Build.ArtifactStagingDirectory)/*.tar.xz\n $(Build.ArtifactStagingDirectory)/*.7z\n\n - task: NuGetCommand@2\n displayName: 'Push both packages to NuGet.org'\n condition: eq(variables['Build.SourceBranchName'], 'main')\n inputs:\n command: push\n packagesToPush: '$(Build.ArtifactStagingDirectory)/**/*.nupkg'\n nuGetFeedType: external\n publishFeedCredentials: GarnetADO_to_Nuget | dataset_sample\yaml\microsoft_garnet\.azure\pipelines\azure-pipelines-external-release.yml | azure-pipelines-external-release.yml | YAML | 7,670 | 0.95 | 0.009524 | 0.061856 | python-kit | 559 | 2025-02-22T15:28:54.581176 | BSD-3-Clause | false | 5934085d16049ef183b7b8d7197d3751 |
######################################\n# NOTE: Before running this pipeline to generate a new nuget package, update the version string in two places\n# 1) update the name: string below (line 6) -- this is the version for the nuget package (e.g. 1.9.52)\n# 2) update \libs\host\GarnetServer.cs readonly string version (~line 45) -- NOTE - these two values need to be the same \n###################################### \nname: 1.9.53\ntrigger: none\nresources:\n repositories:\n - repository: self\n type: git\n ref: refs/heads/main\njobs:\n- job: Phase_1\n displayName: Assessment\n cancelTimeoutInMinutes: 1\n pool:\n name: Azure Pipelines\n steps:\n - checkout: self\n clean: False\n submodules: recursive\n persistCredentials: True\n - task: UseDotNet@2\n displayName: Use .NET Core sdk 8.0.x\n inputs:\n version: 8.0.x\n - task: NuGetToolInstaller@1\n displayName: Nuget Tool Installer\n inputs:\n versionspec: '*'\n checkLatest: true\n - task: NuGetAuthenticate@1\n displayName: 'NuGet Authenticate'\n - task: DotNetCoreCLI@2\n displayName: dotnet build\n inputs:\n projects: '**/Garnet.*.csproj'\n arguments: -c Release\n - task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@2\n displayName: Sign the binaries\n enabled: True\n inputs:\n ConnectedServiceName: Garnet Code Signing\n FolderPath: .\n Pattern: Garnet.server.dll,Garnet.client.dll,Garnet.common.dll,Garnet.cluster.dll,Garnet.host.dll,HdrHistogram.dll,Tsavorite.core.dll,Tsavorite.devices.AzureStorageDevice.dll,native_device.dll\n signConfigType: inlineSignParams\n inlineOperation: >-\n [\n {\n "keyCode": "CP-230012",\n "operationSetCode": "SigntoolSign",\n "parameters": [\n {\n "parameterName": "OpusName",\n "parameterValue": "Microsoft"\n },\n {\n "parameterName": "OpusInfo",\n "parameterValue": "http://www.microsoft.com"\n },\n {\n "parameterName": "FileDigest",\n "parameterValue": "/fd \"SHA256\""\n },\n {\n "parameterName": "PageHash",\n 
"parameterValue": "/NPH"\n },\n {\n "parameterName": "TimeStamp",\n "parameterValue": "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"\n }\n ],\n "toolName": "sign",\n "toolVersion": "1.0"\n },\n {\n "keyCode": "CP-230012",\n "operationSetCode": "SigntoolVerify",\n "parameters": [ ],\n "toolName": "sign",\n "toolVersion": "1.0"\n }\n ]\n - task: CmdLine@2\n displayName: Command Line Script\n inputs:\n script: dir\n - task: CopyFiles@2\n displayName: 'Copy Files to: $(build.artifactstagingdirectory)'\n inputs:\n Contents: '**/bin/AnyCPU/$(BuildConfiguration)/**/*'\n TargetFolder: $(build.artifactstagingdirectory)\n - task: NuGetCommand@2\n displayName: nuget pack Garnet\n enabled: True\n inputs:\n command: custom\n arguments: pack Garnet.nuspec -OutputDirectory $(Build.ArtifactStagingDirectory) -Properties Configuration=$(BuildConfiguration) -Symbols -SymbolPackageFormat snupkg -version $(Build.BuildNumber) -Verbosity Detailed\n - task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@2\n displayName: Sign the NuGet Package\n enabled: True\n inputs:\n ConnectedServiceName: Garnet Code Signing\n FolderPath: $(Build.ArtifactStagingDirectory)\n Pattern: Microsoft.Garnet.*.nupkg\n signConfigType: inlineSignParams\n inlineOperation: >-\n [\n {\n "keyCode": "CP-401405",\n "operationSetCode": "NuGetSign",\n "parameters": [ ],\n "toolName": "sign",\n "toolVersion": "1.0"\n },\n {\n "keyCode": "CP-401405",\n "operationSetCode": "NuGetVerify",\n "parameters": [ ],\n "toolName": "sign",\n "toolVersion": "1.0"\n }\n ]\n - task: PublishBuildArtifacts@1\n displayName: 'Publish Artifact: drop'\n enabled: True\n - task: NuGetCommand@2\n displayName: Internal NuGet push\n enabled: True\n inputs:\n command: push\n searchPatternPush: $(Build.ArtifactStagingDirectory)/**/*.nupkg\n feedPublish: f1985af0-e833-489c-9266-0d9999734615/d2a662ea-a7a6-4cf7-ab9d-8b8dc36082cd\n | 
dataset_sample\yaml\microsoft_garnet\.azure\pipelines\azure-pipelines-internal-release.yml | azure-pipelines-internal-release.yml | YAML | 4,780 | 0.8 | 0.007299 | 0.036496 | awesome-app | 253 | 2024-07-13T13:45:39.805190 | BSD-3-Clause | false | c4781a5104b88afbe8ed2c9d31398aaf |
resources:\n repositories:\n - repository: GarnetGithubRepo\n type: github\n name: microsoft/garnet\n endpoint: ADO_to_Github_ServiceConnection\n trigger:\n branches:\n include:\n - main\n\npool:\n vmImage: 'ubuntu-latest'\n\njobs:\n- job: MirrorGithubToADO\n displayName: 'Mirror GitHub main to Azure DevOps main'\n pool:\n vmImage: 'ubuntu-latest'\n steps:\n - checkout: GarnetGithubRepo\n clean: true\n persistCredentials: true\n\n - script: |\n # Set up Git identity \n git config user.email "$(GIT_USER_EMAIL)"\n git config user.name "$(GIT_USER_NAME)"\n git config --global credential.helper 'cache --timeout=3600'\n\n # Fetch the latest changes from the GitHub main branch\n git fetch origin main\n\n # Check out main branch locally\n # git checkout main\n\n # Ensure the local main branch is checked out and reset to match origin/main\n git checkout -B main origin/main\n\n # Reset the main branch to match the origin/main branch\n # git reset --hard origin/main\n\n # Push the updated main branch to Azure DevOps - make sure to get the config files each push so compliant with security policy\n git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" remote add vso https://dev.azure.com/msresearch/_git/Garnet\n git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" fetch vso cfs-remediation\n git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" cherry-pick 0bae2b8a7d3dc9bfc59d25acf30b470ff97f6a7f\n git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push --force https://dev.azure.com/msresearch/_git/Garnet main\n\n displayName: 'Mirror GitHub main to Azure DevOps main'\n env:\n GIT_USER_EMAIL: $(GIT_USER_EMAIL)\n GIT_USER_NAME: $(GIT_USER_NAME)\n SYSTEM_ACCESSTOKEN: $(System.AccessToken) | dataset_sample\yaml\microsoft_garnet\.azure\pipelines\azure-pipelines-mirror.yml | azure-pipelines-mirror.yml | YAML | 1,874 | 0.8 | 0 | 0.181818 | awesome-app | 653 | 2023-08-02T05:21:22.788856 | BSD-3-Clause | false | 
579f019f58b4af68d9511c16d96debf8 |
trigger:\n branches:\n include:\n - refs/heads/main\n\nschedules:\n- cron: "0 7 * * *" # 7:00 UTC is 11:00PM Pacific\n branches:\n include:\n - refs/heads/main\n always: true\n\nresources:\n repositories:\n - repository: self\n type: git\n\nvariables:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Release'\n\njobs: \n- job: 'Linux'\n pool:\n name: GarnetPerfCI_Linux_Pool\n vmImage: GarnetPerfCI_LinuxImage\n displayName: 'Linux'\n timeoutInMinutes: 75\n\n steps:\n - task: UseDotNet@2\n displayName: Use .NET 8.0\n inputs:\n packageType: 'sdk'\n version: '8.0.x'\n\n - bash: |\n sudo npm install -g azurite\n sudo mkdir azurite\n sudo azurite --silent --location azurite --debug azurite/debug.log &\n displayName: 'Install and Run Azurite'\n \n - task: NuGetAuthenticate@1\n displayName: Nuget Authenticate\n\n - task: UseDotNet@2 # Optional if the .NET Core SDK is already installed\n displayName: Use .NET\n\n - script: dotnet restore\n displayName: .NET Restore\n\n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Cluster binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Garnet binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Garnet Server'\n inputs:\n command: 'build'\n projects: '**/GarnetServer.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Resp.benchmark'\n inputs:\n command: 'build'\n projects: '**/Resp.benchmark.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: PowerShell@2\n displayName: '1: Resp.Benchmark ONLINE; Get Set; 1 Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: 
'./test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Online_GetSet_1Thr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '2: Resp.Benchmark ONLINE; Get Set; Max Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Online_GetSet_MaxThr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '3: Resp.Benchmark ONLINE; ZADD ZREM; 1 Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Online_ZADDZREM_1Thr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '4: Resp.Benchmark ONLINE; ZADD ZREM; Max Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Online_ZADDZREM_MaxThr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '5: Resp.Benchmark OFFLINE; Get; 1 Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Offline_Get_1Thr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '6: Resp.Benchmark OFFLINE; Get; Max Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Offline_Get_MaxThr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '7: Resp.Benchmark OFFLINE; ZADDREM; 1 Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: 
'./test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Offline_ZADDREM_1Thr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '8: Resp.Benchmark OFFLINE; ZADDREM; Max Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Offline_ZADDREM_MaxThr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: CopyFiles@2\n displayName: 'Copy Results Files to: $(build.artifactstagingdirectory)'\n inputs:\n Contents: '**/test/PerfRegressionTesting/results/**/*'\n TargetFolder: $(build.artifactstagingdirectory)\n\n - task: CopyFiles@2\n displayName: 'Copy ErrorLog Files to: $(build.artifactstagingdirectory)'\n inputs:\n Contents: '**/test/PerfRegressionTesting/errorlog/**/*'\n TargetFolder: $(build.artifactstagingdirectory)\n\n - task: PublishBuildArtifacts@1\n displayName: 'Publish Artifact Output Files: LogResultFiles'\n inputs:\n ArtifactName: LogAndResultFiles\n\n- job: Windows \n displayName: Windows\n cancelTimeoutInMinutes: 1\n\n pool:\n name: GarnetPerfCI_Pool\n vmImage: GarnetPerfCI_WindowsImage\n steps:\n - checkout: self\n clean: False\n submodules: recursive\n persistCredentials: True\n \n - task: UseDotNet@2\n displayName: 'Use .NET Core sdk 8.0.x'\n inputs:\n version: 8.0.x\n \n - task: NuGetToolInstaller@1\n displayName: Nuget Tool Installer\n inputs:\n versionspec: '*'\n checkLatest: true\n\n - task: NuGetAuthenticate@1\n displayName: 'NuGet Authenticate'\n\n - task: DotNetCoreCLI@2\n displayName: 'Build Garnet Cluster binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - script: nuget restore\n displayName: Nuget Restore\n\n - task: DotNetCoreCLI@2\n displayName: 'Build Garnet binaries and tests $(buildConfiguration)'\n inputs:\n command: 
'build'\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Build Garnet Server $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/GarnetServer.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Resp.benchmark'\n inputs:\n command: 'build'\n projects: '**/Resp.benchmark.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: PowerShell@2\n displayName: 'Run Tsavorite Benchmark Test'\n enabled: false\n inputs:\n targetType: filePath\n filePath: './libs/storage/Tsavorite/cs/benchmark/scripts/run_benchmark.ps1'\n arguments: '$(Build.SourcesDirectory)'\n workingDirectory: libs/storage/Tsavorite/cs/benchmark/scripts\n\n - task: PowerShell@2\n displayName: '1: Resp.Benchmark ONLINE; Get Set; 1 Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Online_GetSet_1Thr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '2: Resp.Benchmark ONLINE; Get Set; Max Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Online_GetSet_MaxThr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '3: Resp.Benchmark ONLINE; ZADD ZREM; 1 Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Online_ZADDZREM_1Thr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '4: Resp.Benchmark ONLINE; ZADD ZREM; Max Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: 
'./test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Online_ZADDZREM_MaxThr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '5: Resp.Benchmark OFFLINE; Get; 1 Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Offline_Get_1Thr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '6: Resp.Benchmark OFFLINE; Get; Max Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Offline_Get_MaxThr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '7: Resp.Benchmark OFFLINE; ZADDREM; 1 Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Offline_ZADDREM_1Thr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: PowerShell@2\n displayName: '8: Resp.Benchmark OFFLINE; ZADDREM; Max Thread'\n continueOnError: true\n enabled: true\n inputs:\n targetType: filePath\n filePath: './test/PerfRegressionTesting/run_benchmark.ps1'\n arguments: 'ConfigFiles\CI_Config_Offline_ZADDREM_MaxThr.json'\n workingDirectory: test/PerfRegressionTesting\n\n - task: CopyFiles@2\n displayName: 'Copy Results Files to: $(build.artifactstagingdirectory)'\n inputs:\n Contents: '**/test/PerfRegressionTesting/results/**/*'\n TargetFolder: $(build.artifactstagingdirectory)\n\n - task: CopyFiles@2\n displayName: 'Copy ErrorLog Files to: $(build.artifactstagingdirectory)'\n inputs:\n Contents: '**/test/PerfRegressionTesting/errorlog/**/*'\n TargetFolder: $(build.artifactstagingdirectory)\n\n - task: PublishBuildArtifacts@1\n displayName: 'Publish Artifact Output Files: 
LogResultFiles'\n inputs:\n ArtifactName: LogAndResultFiles\n | dataset_sample\yaml\microsoft_garnet\.azure\pipelines\azure-pipelines-performance.yml | azure-pipelines-performance.yml | YAML | 10,632 | 0.8 | 0.00295 | 0 | awesome-app | 942 | 2024-02-12T10:28:03.179422 | Apache-2.0 | false | cc4aaeeb3000e4bb8f81e3063c45cc70 |
trigger:\n paths:\n include:\n - libs/storage/Tsavorite/*\n \nvariables:\n solution: 'Garnet.sln'\n RunAzureTests: 'yes'\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Release'\n\njobs:\n- job: 'csharpWindows'\n pool:\n vmImage: windows-latest\n displayName: 'C# (Windows)'\n timeoutInMinutes: 125\n\n steps:\n - task: UseDotNet@2\n displayName: Use .NET 8.0\n inputs:\n packageType: 'sdk'\n version: '8.0.x'\n\n - task: NodeTool@0\n displayName: Node Tool\n inputs:\n versionSpec: 14.x\n\n - script : npm install -g azurite\n displayName: Install Azurite\n\n - script : start /B azurite\n displayName: Start Azurite\n\n - task: NuGetAuthenticate@1\n displayName: Nuget Authenticate\n\n - task: NuGetToolInstaller@1\n displayName: Nuget Tool Installer\n inputs:\n versionspec: '*'\n checkLatest: true\n\n - script: nuget restore\n displayName: Nuget Restore\n\n - task: DotNetCoreCLI@2\n displayName: 'Build Tsavorite binaries and tests'\n inputs:\n command: 'build'\n projects: '**/Tsavorite.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Tsavorite'\n continueOnError: true\n inputs:\n command: test\n projects: '**/Tsavorite.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect "Code coverage" --settings $(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings'\n \n - task: DotNetCoreCLI@2\n displayName: 'Build Garnet Cluster binaries and tests'\n continueOnError: true\n inputs:\n command: 'build'\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet Cluster'\n continueOnError: true\n inputs:\n command: test\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect "Code coverage" --settings 
$(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings'\n\n - task: DotNetCoreCLI@2\n displayName: 'Build Garnet binaries and tests $(buildConfiguration)'\n continueOnError: true\n inputs:\n command: 'build'\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet'\n continueOnError: true\n inputs:\n command: test\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect "Code coverage" --settings $(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings'\n \n - task: PublishTestResults@2\n displayName: 'Publish Test Results'\n continueOnError: true\n inputs:\n testRunner: VSTest\n testResultsFiles: '**/*.trx'\n searchFolder: '$(Agent.TempDirectory)'\n\n- job: 'csharpLinux'\n pool:\n vmImage: ubuntu-latest\n displayName: 'C# (Linux)'\n timeoutInMinutes: 125\n\n steps:\n - task: UseDotNet@2\n displayName: Use .NET 8.0\n inputs:\n packageType: 'sdk'\n version: '8.0.x'\n\n - bash: |\n sudo npm install -g azurite\n sudo mkdir azurite\n sudo azurite --silent --location azurite --debug azurite/debug.log &\n displayName: 'Install and Run Azurite'\n \n - task: NuGetAuthenticate@1\n displayName: Nuget Authenticate\n\n - task: UseDotNet@2 # Optional if the .NET Core SDK is already installed\n displayName: Use .NET\n\n - script: dotnet restore\n displayName: .NET Restore\n\n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Tsavorite binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Tsavorite.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Tsavorite on $(buildConfiguration)'\n continueOnError: true\n inputs:\n command: test\n projects: '**/Tsavorite.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect 
"Code coverage" --settings $(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings'\n \n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Garnet binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet $(buildConfiguration)'\n continueOnError: true\n inputs:\n command: test\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect "Code coverage" --settings $(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings'\n publishTestResults: true\n \n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Cluster binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet Cluster $(buildConfiguration)'\n continueOnError: true\n inputs:\n command: test\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect "Code coverage" --settings $(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings'\n\n - task: PublishTestResults@2\n displayName: 'Publish Test Results'\n inputs:\n testResultsFormat: 'VSTest'\n testResultsFiles: '*.trx'\n searchFolder: '$(Agent.TempDirectory)' | dataset_sample\yaml\microsoft_garnet\.azure\pipelines\azure-pipelines-tsavorite-codecoverage.yml | azure-pipelines-tsavorite-codecoverage.yml | YAML | 5,783 | 0.8 | 0.038462 | 0 | awesome-app | 311 | 2023-12-10T01:05:14.964409 | BSD-3-Clause | false | eab8302358f970337d7d31947be440a7 |
trigger:\n paths:\n include:\n - libs/storage/Tsavorite/*\n \nvariables:\n solution: 'Garnet.sln'\n RunAzureTests: 'yes'\n\njobs:\n- job: 'csharpWindows'\n pool:\n vmImage: windows-latest\n displayName: 'C# (Windows)'\n timeoutInMinutes: '175'\n\n strategy:\n maxParallel: 2\n matrix:\n AnyCPU-Debug:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Debug'\n AnyCPU-Release:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Release'\n\n steps:\n - task: UseDotNet@2\n displayName: Use .NET 8.0\n inputs:\n packageType: 'sdk'\n version: '8.0.x'\n\n - task: UseDotNet@2\n displayName: Use .NET 9.0\n inputs:\n packageType: 'sdk'\n version: '9.0.x'\n\n - task: NodeTool@0\n displayName: Node Tool\n inputs:\n versionSpec: 14.x\n\n - script : npm install -g azurite\n displayName: Install Azurite\n\n - script : start /B azurite\n displayName: Start Azurite\n\n - task: NuGetAuthenticate@1\n displayName: Nuget Authenticate\n\n - task: NuGetToolInstaller@1\n displayName: Nuget Tool Installer\n inputs:\n versionspec: '*'\n checkLatest: true\n\n - script: nuget restore\n displayName: Nuget Restore\n\n - task: DotNetCoreCLI@2\n displayName: 'Build Tsavorite binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Tsavorite.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Tsavorite on $(buildConfiguration)'\n inputs:\n command: test\n projects: '**/Tsavorite.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed"'\n \n - task: DotNetCoreCLI@2\n displayName: 'Build Garnet Cluster binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet Cluster on $(buildConfiguration)'\n inputs:\n command: test\n projects: '**/Garnet.test.cluster.csproj'\n arguments: 
'--configuration $(buildConfiguration) --logger:"console;verbosity=detailed"'\n\n - task: DotNetCoreCLI@2\n displayName: 'Build Garnet binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet on $(buildConfiguration)'\n inputs:\n command: test\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed"'\n \n - task: PublishTestResults@2\n displayName: 'Publish Test Results'\n inputs:\n testRunner: VSTest\n testResultsFiles: '**/*.trx'\n searchFolder: '$(Agent.TempDirectory)'\n\n- job: 'csharpLinux'\n pool:\n vmImage: ubuntu-latest\n displayName: 'C# (Linux)'\n timeoutInMinutes: '175'\n\n strategy:\n maxParallel: 2\n matrix:\n AnyCPU-Debug:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Debug'\n AnyCPU-Release:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Release'\n\n steps:\n - task: UseDotNet@2\n displayName: Use .NET 8.0\n inputs:\n packageType: 'sdk'\n version: '8.0.x'\n\n - task: UseDotNet@2\n displayName: Use .NET 9.0\n inputs:\n packageType: 'sdk'\n version: '9.0.x'\n\n - bash: |\n sudo npm install -g azurite\n sudo mkdir azurite\n sudo azurite --silent --location azurite --debug azurite\debug.log &\n displayName: 'Install and Run Azurite'\n \n - task: NuGetAuthenticate@1\n displayName: Nuget Authenticate\n\n - task: UseDotNet@2 # Optional if the .NET Core SDK is already installed\n displayName: Use .NET\n\n - script: dotnet restore\n displayName: .NET Restore\n\n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Tsavorite binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Tsavorite.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Tsavorite on $(buildConfiguration)'\n inputs:\n command: test\n projects: 
'**/Tsavorite.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed"'\n \n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Garnet binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet $(buildConfiguration)'\n inputs:\n command: test\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed"'\n publishTestResults: true\n \n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Cluster binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet Cluster $(buildConfiguration)'\n inputs:\n command: test\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed"'\n\n - task: PublishTestResults@2\n displayName: 'Publish Test Results'\n inputs:\n testResultsFormat: 'VSTest'\n testResultsFiles: '*.trx'\n searchFolder: '$(Agent.TempDirectory)'\n | dataset_sample\yaml\microsoft_garnet\.azure\pipelines\azure-pipelines-tsavorite.yml | azure-pipelines-tsavorite.yml | YAML | 5,662 | 0.8 | 0.034314 | 0 | node-utils | 782 | 2024-05-23T04:14:52.757400 | MIT | false | 915b7ddd2a8c83e8cb8d9694178e544e |
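This pipeline triggers only on changes matching `libs/storage/Tsavorite/*`, while the companion `azure-pipelines.yml` excludes exactly those paths, so the two triggers partition changed files between the Tsavorite and Garnet suites. A rough Python sketch of that split; `fnmatch` only approximates Azure DevOps path-filter matching (its `*` crosses `/` separators), and the non-Tsavorite path shown is just the solution file from the pipeline's own variables:

```python
# Sketch: complementary include/exclude path filters between the
# Tsavorite pipeline (include) and the main pipeline (exclude).
from fnmatch import fnmatch

TSAVORITE_GLOB = "libs/storage/Tsavorite/*"

def tsavorite_pipeline_runs(path: str) -> bool:
    return fnmatch(path, TSAVORITE_GLOB)

def main_pipeline_runs(path: str) -> bool:
    # exclude: the same glob, negated
    return not tsavorite_pipeline_runs(path)

changed = [
    "libs/storage/Tsavorite/cs/test/Tsavorite.test.csproj",
    "Garnet.sln",
]
print([p for p in changed if tsavorite_pipeline_runs(p)])
print([p for p in changed if main_pipeline_runs(p)])
```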
trigger:\n paths:\n exclude:\n - libs/storage/Tsavorite/*\n \nvariables:\n solution: 'Garnet.sln'\n RunAzureTests: 'yes'\n\njobs:\n- job: 'csharpWindows'\n pool:\n vmImage: windows-latest\n displayName: 'C# (Windows)'\n timeoutInMinutes: '120'\n\n strategy:\n maxParallel: '2'\n matrix:\n AnyCPU-Debug:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Debug'\n AnyCPU-Release:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Release'\n\n steps:\n - task: UseDotNet@2\n displayName: Use .NET 8.0\n inputs:\n packageType: 'sdk'\n version: '8.0.x'\n\n - task: UseDotNet@2\n displayName: Use .NET 9.0\n inputs:\n packageType: 'sdk'\n version: '9.0.x'\n\n - task: NodeTool@0\n displayName: Node Tool\n inputs:\n versionSpec: 14.x\n\n - script : npm install -g azurite\n displayName: Install Azurite\n\n - script : start /B azurite\n displayName: Start Azurite\n\n - task: NuGetAuthenticate@1\n displayName: Nuget Authenticate\n\n - task: NuGetToolInstaller@1\n displayName: Nuget Tool Installer\n inputs:\n versionspec: '*'\n checkLatest: true\n \n - script: nuget restore\n displayName: Nuget Restore\n\n - task: DotNetCoreCLI@2\n displayName: 'Build Garnet Cluster binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet Cluster on $(buildConfiguration)'\n inputs:\n command: test\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect "Code coverage" --settings $(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings -- NUnit.DisplayName=FullName'\n \n - task: DotNetCoreCLI@2\n displayName: 'Build Garnet binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests 
for Garnet on $(buildConfiguration)'\n inputs:\n command: test\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect "Code coverage" --settings $(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings -- NUnit.DisplayName=FullName'\n\n - task: NuGetCommand@2\n displayName: Pack nuget package to test nuspec is correct\n condition: and (succeeded(), eq(variables.buildConfiguration, 'Release'))\n inputs:\n command: custom\n arguments: 'pack Garnet.nuspec -OutputDirectory $(Build.ArtifactStagingDirectory) -Properties -Symbols -SymbolPackageFormat snupkg -version $(Build.BuildNumber) -Verbosity Detailed'\n\n - task: PublishTestResults@2\n displayName: 'Publish Test Results'\n inputs:\n testRunner: VSTest\n testResultsFiles: '**/*.trx'\n searchFolder: '$(Agent.TempDirectory)'\n\n- job: 'csharpLinux'\n pool:\n vmImage: ubuntu-latest\n displayName: 'C# (Linux)'\n timeoutInMinutes: '120'\n\n strategy:\n maxParallel: 2\n matrix:\n AnyCPU-Debug:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Debug'\n AnyCPU-Release:\n buildPlatform: 'Any CPU'\n buildConfiguration: 'Release'\n\n steps:\n - task: UseDotNet@2\n displayName: Use .NET 8.0\n inputs:\n packageType: 'sdk'\n version: '8.0.x'\n - task: UseDotNet@2\n displayName: Use .NET 9.0\n inputs:\n packageType: 'sdk'\n version: '9.0.x'\n\n - bash: |\n sudo npm install -g azurite\n sudo mkdir azurite\n sudo azurite --silent --location azurite --debug azurite\debug.log &\n displayName: 'Install and Run Azurite'\n \n - task: NuGetAuthenticate@1\n displayName: Nuget Authenticate\n\n - task: UseDotNet@2 # Optional if the .NET Core SDK is already installed\n displayName: Use .NET\n\n - script: dotnet restore\n displayName: .NET Restore\n\n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Cluster binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.cluster.csproj'\n arguments: 
'--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet Cluster $(buildConfiguration)'\n inputs:\n command: test\n projects: '**/Garnet.test.cluster.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect "Code coverage" --settings $(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings -- NUnit.DisplayName=FullName'\n\n - task: DotNetCoreCLI@2\n displayName: '.NET Core build Garnet binaries and tests $(buildConfiguration)'\n inputs:\n command: 'build'\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration)'\n\n - task: DotNetCoreCLI@2\n displayName: 'Run Tests for Garnet $(buildConfiguration)'\n inputs:\n command: test\n projects: '**/Garnet.test.csproj'\n arguments: '--configuration $(buildConfiguration) --logger:"console;verbosity=detailed" --collect "Code coverage" --settings $(Build.SourcesDirectory)\.azure\pipelines\CodeCoverage.runsettings -- NUnit.DisplayName=FullName'\n publishTestResults: true\n \n - task: PublishTestResults@2\n displayName: 'Publish Test Results'\n inputs:\n testResultsFormat: 'VSTest'\n testResultsFiles: '*.trx'\n searchFolder: '$(Agent.TempDirectory)'\n | dataset_sample\yaml\microsoft_garnet\.azure\pipelines\azure-pipelines.yml | azure-pipelines.yml | YAML | 5,561 | 0.8 | 0.027473 | 0 | vue-tools | 430 | 2025-03-12T17:34:11.541702 | BSD-3-Clause | false | d47489a3b109bfd1f6d23c0e5c9040ae |
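The Windows job above gates its NuGet pack step with `condition: and (succeeded(), eq(variables.buildConfiguration, 'Release'))`. A minimal Python sketch of that gating logic, purely for illustration:

```python
# Sketch of the pack-step condition: pack only when all prior steps
# succeeded AND the matrix configuration is 'Release'.
def should_pack(succeeded: bool, build_configuration: str) -> bool:
    return succeeded and build_configuration == "Release"

print(should_pack(True, "Release"))   # packs
print(should_pack(True, "Debug"))     # skipped: Debug leg of the matrix
print(should_pack(False, "Release"))  # skipped: an earlier step failed
```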
name: Bug report\ndescription: File a bug report\ntitle: "Bug title"\nlabels: []\nbody: \n - type: textarea\n validations:\n required: true\n attributes:\n label: Describe the bug\n description: Please enter a short, clear description of the bug.\n - type: textarea\n validations:\n required: true\n attributes:\n label: Steps to reproduce the bug\n description: Please provide any required setup and steps to reproduce the behavior.\n placeholder: |\n 1. Go to '...'\n 2. Click on '....'\n - type: textarea\n attributes:\n label: Expected behavior\n description: Please provide a description of what you expected to happen\n - type: textarea\n attributes:\n label: Screenshots\n description: If applicable, add screenshots here to help explain your problem\n - type: textarea\n attributes:\n label: Release version\n description: Please input release version\n placeholder: |\n v1.0.5\n - type: textarea\n attributes:\n label: IDE\n description: Which IDE versions did you see the issue on?\n placeholder: |\n Visual Studio 2022-preview\n - type: textarea\n attributes:\n label: OS version\n description: Which OS versions did you see the issue on?\n placeholder: |\n Windows 11 (22000)\n - type: textarea\n attributes:\n label: Additional context\n description: Enter any other applicable info here\n | dataset_sample\yaml\microsoft_garnet\.github\ISSUE_TEMPLATE\bug_report.yml | bug_report.yml | YAML | 1,435 | 0.85 | 0 | 0 | python-kit | 406 | 2024-02-03T03:32:59.001691 | Apache-2.0 | false | e2c5104e9c5259880bc7fe75e9a07bfa |
blank_issues_enabled: false\ncontact_links:\n - name: Ask a Question\n url: https://github.com/microsoft/garnet/discussions/new/choose\n about: Please ask and answer questions about Garnet here.\n - name: Read the Docs\n url: https://microsoft.github.io/garnet/\n about: Be sure you've read the docs! | dataset_sample\yaml\microsoft_garnet\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 317 | 0.8 | 0.142857 | 0 | node-utils | 812 | 2024-06-22T11:32:00.136406 | BSD-3-Clause | false | 8650d6955c671e6c8a07858c5a81dc05
name: Feature request\ndescription: Suggest an idea for Garnet\ntitle: "Feature request title"\nbody:\n - type: dropdown\n validations:\n required: true\n attributes:\n label: Feature request type\n options:\n - sample request\n - enhancement\n - type: textarea\n validations:\n required: true\n attributes:\n label: Is your feature request related to a problem? Please describe\n description: Please provide a description of the sample request or enhancement\n placeholder: |\n A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]\n - type: textarea\n validations:\n required: true\n attributes:\n label: Describe the solution you'd like\n description: A clear and concise description of what you want to happen\n - type: textarea\n attributes:\n label: Describe alternatives you've considered\n description: A clear and concise description of any alternative solutions or features you've considered\n - type: textarea\n attributes:\n label: Additional context\n description: Enter any other applicable info here\n placeholder: |\n Add any other context or screenshots about the feature request here. | dataset_sample\yaml\microsoft_garnet\.github\ISSUE_TEMPLATE\feature_request.yml | feature_request.yml | YAML | 1,244 | 0.85 | 0.028571 | 0 | python-kit | 3 | 2025-04-11T11:11:05.848026 | GPL-3.0 | false | e1c4ca26e7db69a188a94b35c8678e9e
name: Garnet CI BDN.benchmark\non:\n workflow_dispatch:\n push:\n branches:\n - main\n\nenv:\n DOTNET_SKIP_FIRST_TIME_EXPERIENCE: 1\n DOTNET_NOLOGO: true\n\npermissions:\n actions: write\n deployments: write # permission to deploy GitHub pages website\n contents: write # permission to update benchmark contents in continuousbenchmark branch\n\njobs:\n changes:\n name: Check for changes\n runs-on: ubuntu-latest\n steps:\n - name: Check out code\n uses: actions/checkout@v4\n - name: Apply filter\n uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 #v3 for security reasons have pinned tag (commit SHA) for 3rd party\n id: filter\n with:\n filters: |\n website:\n - 'website/**'\n bdnbenchmark:\n - '!((*.md)|(website/**))'\n \n # Job to build and run BDN.benchmark\n build-run-bdnbenchmark:\n name: BDNBenchmark\n needs: [changes]\n runs-on: ${{ matrix.os }}\n strategy:\n fail-fast: false\n matrix:\n os: [ ubuntu-latest, windows-latest ]\n framework: [ 'net8.0', 'net9.0' ]\n configuration: [ 'Release' ]\n test: [ 'Operations.BasicOperations', 'Operations.ObjectOperations', 'Operations.HashObjectOperations', 'Operations.SortedSetOperations', 'Cluster.ClusterMigrate', 'Cluster.ClusterOperations', 'Lua.LuaScripts', 'Lua.LuaScriptCacheOperations','Lua.LuaRunnerOperations','Operations.CustomOperations', 'Operations.RawStringOperations', 'Operations.ScriptOperations', 'Operations.ModuleOperations', 'Operations.PubSubOperations', 'Operations.SetOperations', 'Network.BasicOperations', 'Network.RawStringOperations' ]\n steps:\n - name: Check out code\n uses: actions/checkout@v4\n - name: Setup .NET\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: | \n 8.0.x\n 9.0.x\n - name: Install dependencies\n run: dotnet restore\n\n - name: Run BDN.benchmark Perf Test\n run: ./test/BDNPerfTests/run_bdnperftest.ps1 ${{ matrix.test }} ${{ matrix.framework }}\n shell: pwsh\n continue-on-error: false\n \n - name: Upload test results to artifacts\n uses: 
actions/upload-artifact@v4\n with:\n name: Results-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}-${{ matrix.test }}\n path: |\n ./test/BDNPerfTests/results\n if: ${{ always() }}\n\n - name: Upload Error Log to artifacts\n uses: actions/upload-artifact@v4\n with:\n name: ErrorLog-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}-${{ matrix.test }}\n path: |\n ./test/BDNPerfTests/errorlog\n if: ${{ always() }}\n\n # Run `github-action-benchmark` action for the Continuous Benchmarking Charts (https://microsoft.github.io/garnet/charts/)\n - name: Store benchmark result for charts\n uses: benchmark-action/github-action-benchmark@d48d326b4ca9ba73ca0cd0d59f108f9e02a381c7 # v1 for security reasons have pinned tag (commit SHA) for 3rd party\n with:\n name: ${{matrix.test}} (${{matrix.os}} ${{matrix.framework}} ${{matrix.configuration}})\n tool: 'benchmarkdotnet'\n output-file-path: ./test/BDNPerfTests/BenchmarkDotNet.Artifacts/results/BDN.benchmark.${{ matrix.test }}-report-full-compressed.json\n github-token: ${{ secrets.GITHUB_TOKEN }}\n auto-push: ${{ github.event_name == 'push' && github.ref == 'refs/heads/main' }}\n save-data-file: ${{ github.event_name == 'push' && github.ref == 'refs/heads/main' }}\n summary-always: true\n gh-pages-branch: 'continuousbenchmark'\n benchmark-data-dir-path: 'website/static/charts'\n max-items-in-chart: 50\n alert-threshold: '110%'\n comment-always: true\n fail-on-alert: false | dataset_sample\yaml\microsoft_garnet\.github\workflows\ci-bdnbenchmark.yml | ci-bdnbenchmark.yml | YAML | 3,868 | 0.8 | 0.095745 | 0.022989 | vue-tools | 437 | 2024-07-16T13:09:34.010560 | MIT | false | e08ed97d2e21dbc907b24cf384b4c3cc |
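The `github-action-benchmark` step above sets `alert-threshold: '110%'` with `fail-on-alert: false`, i.e. a regression comment fires roughly when the new timing exceeds 110% of the previous run (a >10% slowdown) without failing the workflow. A minimal sketch of that threshold rule; the numbers are made up for illustration and the exact comparison semantics belong to the action, not this code:

```python
# Sketch: alert when the current benchmark time exceeds the previous
# time by more than the configured percentage threshold.
def regression_alert(prev_ns: float, curr_ns: float, threshold_pct: float = 110.0) -> bool:
    return curr_ns > prev_ns * threshold_pct / 100.0

print(regression_alert(100.0, 115.0))  # 15% slower -> alert
print(regression_alert(100.0, 105.0))  # within threshold -> no alert
```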
name: Garnet .NET CI\non:\n workflow_dispatch:\n push:\n branches:\n - main\n pull_request:\n branches:\n - main\n \nenv:\n DOTNET_SKIP_FIRST_TIME_EXPERIENCE: 1\n DOTNET_NOLOGO: true\n\npermissions:\n contents: read\n\njobs:\n changes:\n name: Check for changes\n runs-on: ubuntu-latest # don't need matrix to test where the changes were made in code\n permissions:\n pull-requests: read\n contents: read\n outputs:\n tsavorite: ${{ steps.filter.outputs.tsavorite }}\n website: ${{ steps.filter.outputs.website }}\n garnet: ${{ steps.filter.outputs.garnet }}\n steps:\n - name: Check out code\n uses: actions/checkout@v4\n - name: Apply filter\n uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 #v3 for security reasons have pinned tag (commit SHA) for 3rd party\n id: filter\n with:\n filters: |\n tsavorite:\n - 'libs/storage/Tsavorite/**'\n website:\n - 'website/**'\n garnet:\n - '!((*.md)|(website/**))'\n \n format-garnet:\n name: Format Garnet\n needs: changes\n runs-on: ubuntu-latest\n if: needs.changes.outputs.garnet == 'true'\n steps:\n - name: Check out code\n uses: actions/checkout@v4\n - name: Setup .NET 8.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 8.0.x\n - name: Setup .NET 9.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 9.0.x\n - name: Install dependencies\n run: dotnet restore Garnet.sln\n - name: Check style format\n run: dotnet format Garnet.sln --no-restore --verify-no-changes --verbosity diagnostic\n\n format-tsavorite:\n name: Format Tsavorite\n needs: changes\n runs-on: ubuntu-latest\n if: needs.changes.outputs.tsavorite == 'true'\n steps:\n - name: Check out code\n uses: actions/checkout@v4\n - name: Setup .NET 8.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 8.0.x\n - name: Setup .NET 9.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 9.0.x\n - name: Install dependencies\n run: dotnet restore libs/storage/Tsavorite/cs/Tsavorite.sln\n - name: Check style format\n run: dotnet format 
libs/storage/Tsavorite/cs/Tsavorite.sln --no-restore --verify-no-changes --verbosity diagnostic\n \n # Job to build and test Garnet code\n build-test-garnet:\n name: Garnet\n needs: [changes, format-garnet]\n runs-on: ${{ matrix.os }}\n strategy:\n fail-fast: false\n matrix:\n os: [ ubuntu-latest, windows-latest ]\n framework: [ 'net8.0' , 'net9.0']\n configuration: [ 'Debug', 'Release' ]\n test: [ 'Garnet.test', 'Garnet.test.cluster' ]\n if: needs.changes.outputs.garnet == 'true'\n steps:\n - name: Check out code\n uses: actions/checkout@v4\n - name: Setup .NET 8.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 8.0.x\n - name: Setup .NET 9.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 9.0.x\n - name: Install dependencies\n run: dotnet restore\n - name: Build Garnet\n run: dotnet build --configuration ${{ matrix.configuration }}\n - name: Run tests ${{ matrix.test }}\n run: dotnet test test/${{ matrix.test }} -f ${{ matrix.framework }} --configuration ${{ matrix.configuration }} --logger "console;verbosity=detailed" --logger trx --results-directory "GarnetTestResults-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}-${{ matrix.test }}" -- NUnit.DisplayName=FullName\n timeout-minutes: 45 \n - name: Upload test results\n uses: actions/upload-artifact@v4\n with:\n name: dotnet-garnet-results-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}-${{ matrix.test }}\n path: GarnetTestResults-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}-${{ matrix.test }}\n if: ${{ always() }}\n\n # Job to build and test Tsavorite code (only if there were changes to it)\n build-test-tsavorite: \n name: Tsavorite\n needs: [changes, format-tsavorite]\n runs-on: ${{ matrix.os }}\n strategy:\n fail-fast: false\n matrix:\n os: [ ubuntu-latest, windows-latest ]\n framework: [ 'net8.0', 'net9.0' ]\n configuration: [ 'Debug', 'Release' ]\n if: needs.changes.outputs.tsavorite == 'true' \n steps:\n - name: Check out 
code\n uses: actions/checkout@v4\n - name: Set workaround for libaio on Ubuntu 24.04 (see https://askubuntu.com/questions/1512196/libaio1-on-noble/1512197#1512197)\n run: |\n sudo ln -s /usr/lib/x86_64-linux-gnu/libaio.so.1t64 /usr/lib/x86_64-linux-gnu/libaio.so.1\n if: ${{ matrix.os == 'ubuntu-latest' }}\n - name: Set environment variable for Linux\n run: echo "RunAzureTests=yes" >> $GITHUB_ENV\n if: ${{ matrix.os == 'ubuntu-latest' }}\n - name: Set environment variable for Windows\n run: echo ("RunAzureTests=yes") >> $env:GITHUB_ENV\n if: ${{ matrix.os == 'windows-latest' }}\n - name: Setup .NET 8.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 8.0.x\n - name: Setup .NET 9.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 9.0.x\n - name: Setup Node.js for Azurite\n uses: actions/setup-node@v4\n with:\n node-version: 20\n - name: Install and Run Azurite\n shell: bash\n run: |\n npm install -g azurite\n azurite &\n - name: Install dependencies\n run: dotnet restore\n - name: Build Tsavorite\n run: dotnet build libs/storage/Tsavorite/cs/test/Tsavorite.test.csproj --configuration ${{ matrix.configuration }}\n - name: Run Tsavorite tests\n run: dotnet test libs/storage/Tsavorite/cs/test/Tsavorite.test.csproj -f ${{ matrix.framework }} --configuration ${{ matrix.configuration }} --logger "console;verbosity=detailed" --logger trx --results-directory "TsavoriteTestResults-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}"\n timeout-minutes: 45 \n - name: Upload test results\n uses: actions/upload-artifact@v4\n with:\n name: dotnet-tsavorite-results-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}\n path: TsavoriteTestResults-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}\n if: ${{ always() }}\n\n build-website:\n name: Build Website\n needs: changes \n runs-on: ubuntu-latest\n defaults:\n run:\n working-directory: website\n if: needs.changes.outputs.website == 'true' \n steps:\n - uses: 
actions/checkout@v4\n - uses: actions/setup-node@v4\n with:\n node-version: 20\n cache: yarn\n cache-dependency-path: ./website/yarn.lock\n - name: Install dependencies\n run: yarn install --frozen-lockfile\n - name: Build website\n run: yarn build\n\n pipeline-success:\n name: Garnet CI (Complete)\n runs-on: ubuntu-latest\n needs: [ build-test-garnet, build-test-tsavorite, build-website ]\n steps:\n - run: echo Done!\n if: ${{ !(failure() || cancelled()) }}\n | dataset_sample\yaml\microsoft_garnet\.github\workflows\ci.yml | ci.yml | YAML | 7,311 | 0.8 | 0.092233 | 0.010152 | react-lib | 650 | 2025-02-16T03:29:39.617534 | Apache-2.0 | false | 9d9621ad6b3b82b8a120acc2a402b41d |
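The `build-test-garnet` job's strategy matrix above fans out to the cross product of its axes, so each run of this workflow launches 2 OSes x 2 frameworks x 2 configurations x 2 test projects = 16 jobs. A short sketch of that expansion:

```python
# Sketch: GitHub Actions expands a strategy matrix to the cross
# product of its axes (before any include/exclude adjustments).
from itertools import product

oses = ["ubuntu-latest", "windows-latest"]
frameworks = ["net8.0", "net9.0"]
configurations = ["Debug", "Release"]
tests = ["Garnet.test", "Garnet.test.cluster"]

jobs = [
    {"os": o, "framework": f, "configuration": c, "test": t}
    for o, f, c, t in product(oses, frameworks, configurations, tests)
]
print(len(jobs))  # 16
```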
#\n# Auto generated YML file to run CodeQL except for build steps at the bottom which needed to be manual since using net90\n#\nname: "CodeQL Advanced"\n\non:\n workflow_dispatch:\n push:\n branches: [ "main" ]\n pull_request:\n branches: [ "main" ]\n schedule:\n - cron: '38 11 * * 1'\n\njobs:\n analyze:\n name: Analyze (${{ matrix.language }})\n # Runner size impacts CodeQL analysis time. To learn more, please see:\n # - https://gh.io/recommended-hardware-resources-for-running-codeql\n # - https://gh.io/supported-runners-and-hardware-resources\n # - https://gh.io/using-larger-runners (GitHub.com only)\n # Consider using larger runners or machines with greater resources for possible analysis time improvements.\n runs-on: ubuntu-latest\n permissions:\n # required for all workflows\n security-events: write\n\n # required to fetch internal or private CodeQL packs\n packages: read\n\n # only required for workflows in private repositories\n actions: read\n contents: read\n\n strategy:\n fail-fast: false\n matrix:\n include:\n - language: csharp\n build-mode: manual\n - language: javascript-typescript\n build-mode: none\n # CodeQL supports the following values keywords for 'language': 'c-cpp', 'csharp', 'go', 'java-kotlin', 'javascript-typescript', 'python', 'ruby', 'swift'\n # Use `c-cpp` to analyze code written in C, C++ or both\n # Use 'java-kotlin' to analyze code written in Java, Kotlin or both\n # Use 'javascript-typescript' to analyze code written in JavaScript, TypeScript or both\n # To learn more about changing the languages that are analyzed or customizing the build mode for your analysis,\n # see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/customizing-your-advanced-setup-for-code-scanning.\n # If you are analyzing a compiled language, you can modify the 'build-mode' for that language to customize how\n # your codebase is analyzed, see 
https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages\n steps:\n - name: Check if manual build mode\n if: ${{ matrix.build-mode == 'manual' }}\n run: echo "Manual build mode detected"\n\n - name: Checkout repository\n uses: actions/checkout@v4\n\n # Initializes the CodeQL tools for scanning.\n - name: Initialize CodeQL\n uses: github/codeql-action/init@v3\n with:\n languages: ${{ matrix.language }}\n build-mode: ${{ matrix.build-mode }}\n # If you wish to specify custom queries, you can do so here or in a config file.\n # By default, queries listed here will override any specified in a config file.\n # Prefix the list here with "+" to use these queries and those in the config file.\n\n # For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs\n # queries: security-extended,security-and-quality\n\n # If the analyze step fails for one of the languages you are analyzing with\n # "We were unable to automatically build your code", modify the matrix above\n # to set the build mode to "manual" for that language. 
Then modify this step\n # to build your code.\n # ℹ️ Command-line programs to run using the OS shell.\n # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun\n - name: Setup .NET 8.0\n if: ${{ matrix.build-mode == 'manual' }}\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 8.0.x\n\n - name: Setup .NET 9.0\n if: ${{ matrix.build-mode == 'manual' }}\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 9.0.x\n\n - name: Install dependencies\n if: ${{ matrix.build-mode == 'manual' }}\n run: dotnet restore\n\n - name: Build Garnet net80\n if: ${{ matrix.build-mode == 'manual' }}\n run: dotnet build -f net8.0\n\n - name: Build Garnet net90\n if: ${{ matrix.build-mode == 'manual' }}\n run: dotnet build -f net9.0\n\n - name: Perform CodeQL Analysis\n uses: github/codeql-action/analyze@v3\n with:\n category: "/language:${{matrix.language}}"\n | dataset_sample\yaml\microsoft_garnet\.github\workflows\codeql.yml | codeql.yml | YAML | 4,430 | 0.95 | 0.228571 | 0.340659 | vue-tools | 838 | 2024-05-14T23:49:13.931835 | MIT | false | 0e09916f47a9a4b3e1b857b2e3ad54df |
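The CodeQL workflow above runs on `cron: '38 11 * * 1'`. Decoded as a standard five-field cron expression (minute, hour, day-of-month, month, day-of-week, with 1 = Monday), that is Mondays at 11:38 UTC. A tiny sketch of the field split:

```python
# Sketch: splitting a five-field cron expression into named fields.
FIELDS = ["minute", "hour", "day_of_month", "month", "day_of_week"]

def parse_cron(expr: str) -> dict:
    values = expr.split()
    assert len(values) == 5, "expected five cron fields"
    return dict(zip(FIELDS, values))

schedule = parse_cron("38 11 * * 1")
print(schedule["day_of_week"], schedule["hour"], schedule["minute"])  # Mondays, 11:38
```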
name: Deploy to GitHub Pages\n\non:\n workflow_dispatch:\n push:\n branches:\n - main\n paths:\n - '.github/workflows/deploy-website.yml'\n - 'website/**'\n workflow_run:\n workflows: [Garnet CI BDN.benchmark]\n types:\n - completed\npermissions:\n contents: write\n\njobs:\n deploy:\n name: Deploy to GitHub Pages\n runs-on: ubuntu-latest\n defaults:\n run:\n working-directory: website\n steps:\n - uses: actions/checkout@v4\n with:\n ref: main\n - uses: actions/checkout@v4\n with:\n ref: continuousbenchmark\n sparse-checkout: |\n website/static/charts\n path: continuousbenchmark\n - name: Copy charts\n run: |\n mkdir -p static/charts\n cp ../continuousbenchmark/website/static/charts/* static/charts\n - uses: actions/setup-node@v4\n with:\n node-version: 20\n cache: yarn\n cache-dependency-path: ./website/yarn.lock\n\n - name: Install dependencies\n run: yarn install --frozen-lockfile\n - name: Build website\n run: yarn build\n\n # Popular action to deploy to GitHub Pages:\n # Docs: https://github.com/peaceiris/actions-gh-pages#%EF%B8%8F-docusaurus\n - name: Deploy to GitHub Pages\n uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e #v4 - for security reasons have pinned tag (commit SHA) for 3rd party\n with:\n github_token: ${{ secrets.GITHUB_TOKEN }}\n # Build output to publish to the `gh-pages` branch:\n publish_dir: ./website/build\n # The following lines assign commit authorship to the official\n # GH-Actions bot for deploys to `gh-pages` branch:\n # https://github.com/actions/checkout/issues/13#issuecomment-724415212\n # The GH actions bot is used by default if you didn't specify the two fields.\n # You can swap them out with your own user credentials.\n user_name: github-actions[bot]\n user_email: 41898282+github-actions[bot]@users.noreply.github.com\n | dataset_sample\yaml\microsoft_garnet\.github\workflows\deploy-website.yml | deploy-website.yml | YAML | 2,093 | 0.8 | 0.0625 | 0.133333 | react-lib | 220 | 2023-09-25T09:50:56.560207 | 
MIT | false | f9a999ad4472eb7fdd65466d968efb23 |
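The deploy workflow above sparse-checks-out `website/static/charts` from the `continuousbenchmark` branch and copies the charts into the website's static folder before the Docusaurus build. A Python sketch of that copy step; the directory layout mirrors the workflow paths, but the chart filename is a fabricated example:

```python
# Sketch of the 'Copy charts' step: mkdir -p static/charts, then copy
# every file from the sparse checkout into the website tree.
import shutil
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
src = root / "continuousbenchmark" / "website" / "static" / "charts"
dst = root / "website" / "static" / "charts"
src.mkdir(parents=True)
(src / "Operations.BasicOperations.json").write_text("{}")  # example chart data

dst.mkdir(parents=True, exist_ok=True)   # mkdir -p static/charts
for chart in src.iterdir():              # cp .../charts/* static/charts
    shutil.copy(chart, dst / chart.name)

print(sorted(p.name for p in dst.iterdir()))
```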
name: Garnet Docker images for Linux\n\non:\n workflow_dispatch: # allow manual run\n workflow_run:\n workflows: ['Garnet .NET CI']\n types: [completed]\n branches: [main]\n push:\n tags: 'v*'\n\njobs:\n docker:\n strategy:\n matrix:\n include:\n - dockerfile: Dockerfile\n image: ghcr.io/${{ github.repository }}\n os: ubuntu-latest\n - dockerfile: Dockerfile.alpine\n image: ghcr.io/${{ github.repository }}-alpine\n os: ubuntu-latest\n - dockerfile: Dockerfile.ubuntu\n image: ghcr.io/${{ github.repository }}-jammy\n os: ubuntu-latest\n - dockerfile: Dockerfile.chiseled\n image: ghcr.io/${{ github.repository }}-jammy-chiseled\n os: ubuntu-latest\n - dockerfile: Dockerfile.cbl-mariner\n image: ghcr.io/${{ github.repository }}-cbl-mariner2.0\n os: ubuntu-latest\n runs-on: ${{ matrix.os }}\n permissions:\n contents: read\n packages: write\n if: ${{ github.event.workflow_run.conclusion == 'success' || github.event_name != 'workflow_run' }}\n steps:\n -\n name: Checkout\n uses: actions/checkout@v4\n -\n name: Docker meta\n id: meta\n uses: docker/metadata-action@8e1d5461f02b7886d3c1a774bfbd873650445aa2 # was v5 but now v6 with this commit for security reasons have pinned tag (commit SHA) for 3rd party\n with:\n images: ${{ matrix.image }}\n tags: |\n # generate Docker tags based on the following events/attributes\n type=ref,event=branch\n type=ref,event=pr\n type=semver,pattern={{version}}\n type=semver,pattern={{major}}.{{minor}}\n type=semver,pattern={{major}}\n type=sha\n # set latest tag for default branch\n type=raw,value=latest,enable={{is_default_branch}}\n -\n name: Set up QEMU\n uses: docker/setup-qemu-action@4574d27a4764455b42196d70a065bc6853246a25 # v3 for security reasons have pinned tag (commit SHA) for 3rd party\n -\n name: Set up Docker Buildx\n uses: docker/setup-buildx-action@f7ce87c1d6bead3e36075b2ce75da1f6cc28aaca # v3 for security reasons have pinned tag (commit SHA) for 3rd party\n -\n name: Login to GitHub Container Registry\n if: github.event_name 
!= 'pull_request'\n uses: docker/login-action@327cd5a69de6c009b9ce71bce8395f28e651bf99 # was v3 but now v6 with this commit for security reasons have pinned tag (commit SHA) for 3rd party\n\n with:\n registry: ghcr.io\n username: ${{ github.actor }}\n password: ${{ secrets.GITHUB_TOKEN }}\n -\n name: Build and push\n uses: docker/build-push-action@ca877d9245402d1537745e0e356eab47c3520991 # v5 for security reasons have pinned tag (commit SHA) for 3rd party\n with:\n file: ${{ matrix.dockerfile }}\n provenance: false\n platforms: linux/amd64,linux/arm64\n push: ${{ github.event_name != 'pull_request' }}\n tags: ${{ steps.meta.outputs.tags }}\n labels: ${{ steps.meta.outputs.labels }}\n | dataset_sample\yaml\microsoft_garnet\.github\workflows\docker-linux.yml | docker-linux.yml | YAML | 3,172 | 0.8 | 0.17284 | 0.025641 | node-utils | 500 | 2024-04-27T18:58:59.778241 | BSD-3-Clause | false | ccd1033c4b8519c3b8d7a8df4a3a2cc3 |
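The `docker/metadata-action` patterns above (`type=semver,pattern={{version}}`, `{{major}}.{{minor}}`, `{{major}}`) mean a pushed git tag like `v1.2.3` yields the image tags `1.2.3`, `1.2`, and `1`, alongside branch and sha tags. A minimal re-implementation of just the semver expansion, for illustration only:

```python
# Sketch: expanding a vX.Y.Z git tag into the three semver image tags
# produced by the metadata-action patterns in the workflow above.
def semver_tags(git_tag: str) -> list[str]:
    version = git_tag.lstrip("v")
    major, minor, patch = version.split(".")
    return [f"{major}.{minor}.{patch}", f"{major}.{minor}", major]

print(semver_tags("v1.2.3"))  # ['1.2.3', '1.2', '1']
```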
name: Garnet Docker images for Windows\n\non:\n workflow_dispatch: # allow manual run\n workflow_run:\n workflows: ['Garnet .NET CI']\n types: [completed]\n branches: [main]\n push:\n tags: 'v*'\n\npermissions:\n contents: read\n packages: write\n\njobs:\n docker:\n runs-on: windows-latest\n if: ${{ github.event.workflow_run.conclusion == 'success' || github.event_name != 'workflow_run' }}\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n \n - name: Docker meta\n id: meta\n uses: docker/metadata-action@8e1d5461f02b7886d3c1a774bfbd873650445aa2 # v6 (was v5); pinned to a commit SHA for security (3rd-party action)\n with:\n images: ghcr.io/${{ github.repository }}-nanoserver-ltsc2022\n tags: |\n # generate Docker tags based on the following events/attributes\n type=ref,event=branch\n type=ref,event=pr\n type=semver,pattern={{version}}\n type=semver,pattern={{major}}.{{minor}}\n type=semver,pattern={{major}}\n type=sha\n # set latest tag for default branch\n type=raw,value=latest,enable={{is_default_branch}}\n \n - name: Login to GitHub Container Registry\n if: github.event_name != 'pull_request'\n uses: docker/login-action@327cd5a69de6c009b9ce71bce8395f28e651bf99 # v6 (was v3); pinned to a commit SHA for security (3rd-party action)\n with:\n registry: ghcr.io\n username: ${{ github.actor }}\n password: ${{ secrets.GITHUB_TOKEN }}\n \n - name: Build and push\n if: github.event_name != 'pull_request'\n run: |\n docker build -f Dockerfile.nanoserver `\n --tag ${{ fromJSON(steps.meta.outputs.json).tags[0] }} `\n --tag ${{ fromJSON(steps.meta.outputs.json).tags[1] }} `\n --tag ${{ fromJSON(steps.meta.outputs.json).tags[2] }} .\n\n docker push ${{ fromJSON(steps.meta.outputs.json).tags[0] }}\n docker push ${{ fromJSON(steps.meta.outputs.json).tags[1] }}\n docker push ${{ fromJSON(steps.meta.outputs.json).tags[2] }}\n | dataset_sample\yaml\microsoft_garnet\.github\workflows\docker-windows.yml | docker-windows.yml | YAML | 2,198 | 0.8 | 0.155172 | 0.039216 | awesome-app | 145 | 2024-10-26T10:08:13.842706 | BSD-3-Clause | false | 63f26ece2ed9fb50fc4e71e309978151 |
name: Garnet Helm Chart\n\non:\n workflow_dispatch: # allow manual run\n push:\n branches:\n - main\n paths:\n - 'charts/*/Chart.yaml'\n\npermissions:\n contents: write\n packages: write\n pull-requests: write\n \njobs:\n helm-chart:\n runs-on: ubuntu-latest\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n\n - name: Configure git\n run: |\n git config user.name "${GITHUB_ACTOR}"\n git config user.email "${GITHUB_ACTOR}@users.noreply.github.com"\n\n - name: Install helm\n uses: azure/setup-helm@fe7b79cd5ee1e45176fcad797de68ecaf3ca4814 # v4; pinned to a commit SHA for security (3rd-party action)\n env:\n GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}"\n\n - name: Install helm-docs\n env:\n HELM_DOCS_VERSION: "1.14.2"\n run: |\n cd /tmp\n wget https://github.com/norwoodj/helm-docs/releases/download/v"${HELM_DOCS_VERSION}"/helm-docs_"${HELM_DOCS_VERSION}"_Linux_x86_64.tar.gz\n tar -xvf helm-docs_"${HELM_DOCS_VERSION}"_Linux_x86_64.tar.gz\n sudo mv helm-docs /usr/local/bin\n\n - name: Set helm chart appVersion from Version.props\n run: |\n export VERSION_PROPS=$(awk -F'[<>]' '/VersionPrefix/{print $3}' Version.props)\n sed -i -e 's#Version.props#"'${VERSION_PROPS}'"#g' charts/garnet/Chart.yaml\n\n - name: Helm lint, helm-docs and helm package\n run: |\n mkdir .cr-release-packages\n for chart in $(find charts -mindepth 1 -maxdepth 1 -type d); do\n if [ -z "${chart:-}" ]; then\n break\n fi\n helm lint "${chart}"\n helm-docs --document-dependency-values --chart-search-root "${chart}"\n helm package "${chart}" --dependency-update --destination .cr-release-packages\n done\n\n - name: Discard changes on the charts/garnet/Chart.yaml file\n run: |\n git checkout -- charts/garnet/Chart.yaml\n\n - name: Create Pull Request\n uses: peter-evans/create-pull-request@67ccf781d68cd99b580ae25a5c18a1cc84ffff1f # v7; pinned to a commit SHA for security (3rd-party action)\n with:\n add-paths: charts/garnet/README.md\n token: ${{ secrets.GITHUB_TOKEN }}\n committer: github-actions[bot] <${{ github.actor }}@users.noreply.github.com>\n author: ${{ github.actor }} <${{ github.actor_id }}+${{ github.actor }}@users.noreply.github.com>\n signoff: false\n branch: helm-docs-gen\n delete-branch: true\n title: '[helm-chart] Update charts/garnet/README.md by helm-docs'\n body: |\n - Update charts/garnet/README.md\n\n Auto-generated by [create-pull-request][1]\n\n [1]: https://github.com/peter-evans/create-pull-request\n labels: |\n helm-chart\n automated pr\n\n - name: Login to GHCR\n env:\n GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}"\n run: |\n echo "${GITHUB_TOKEN}" | helm registry login ghcr.io --username "${GITHUB_ACTOR}" --password-stdin\n\n - name: Push charts to GHCR\n run: |\n shopt -s nullglob\n for pkg in .cr-release-packages/*.tgz; do\n if [ -z "${pkg:-}" ]; then\n break\n fi\n helm push "${pkg}" "oci://ghcr.io/${GITHUB_REPOSITORY_OWNER}/helm-charts"\n done\n | dataset_sample\yaml\microsoft_garnet\.github\workflows\helm-chart.yml | helm-chart.yml | YAML | 3,458 | 0.8 | 0.08 | 0 | node-utils | 124 | 2024-10-16T03:16:24.591780 | Apache-2.0 | false | 2c60572eaffa1e808b90fe512ba258fd |
name: Locker - Lock stale issues and PRs\non:\n schedule:\n - cron: '0 9 * * *' # Once per day, early morning PT\n\n workflow_dispatch:\n # Manual triggering through the GitHub UI, API, or CLI\n inputs:\n daysSinceClose:\n required: true\n default: "60"\n daysSinceUpdate:\n required: true\n default: "60"\n\npermissions:\n issues: write\n pull-requests: write\n\njobs:\n main:\n runs-on: ubuntu-latest\n steps:\n - name: Checkout Actions\n uses: actions/checkout@v4\n with:\n repository: "microsoft/vscode-github-triage-actions"\n path: ./actions\n ref: cd16cd2aad6ba2da74bb6c6f7293adddd579a90e # locker action commit sha\n - name: Install Actions\n run: npm install --production --prefix ./actions\n - name: Run Locker\n uses: ./actions/locker\n with:\n daysSinceClose: ${{ fromJson(inputs.daysSinceClose || 60) }}\n daysSinceUpdate: ${{ fromJson(inputs.daysSinceUpdate || 60) }}\n | dataset_sample\yaml\microsoft_garnet\.github\workflows\locker.yml | locker.yml | YAML | 1,002 | 0.95 | 0 | 0.030303 | react-lib | 488 | 2024-07-22T13:57:58.790282 | Apache-2.0 | false | c00c9c01da0e33211a063db6432bdf1c |
name: Garnet Nightly Tests\non:\n schedule:\n - cron: '0 7 * * *' # Runs at 07:00 UTC, which is 11:00 PM PST\n workflow_dispatch:\n \nenv:\n DOTNET_SKIP_FIRST_TIME_EXPERIENCE: 1\n DOTNET_NOLOGO: true\n\npermissions:\n contents: read\n \njobs:\n build-test-garnet:\n name: Garnet\n runs-on: ${{ matrix.os }}\n strategy:\n fail-fast: false\n matrix:\n os: [ ubuntu-22.04, ubuntu-24.04, windows-2022, windows-2019 ]\n framework: [ 'net8.0', 'net9.0' ]\n configuration: [ 'Debug', 'Release' ]\n test: [ 'Garnet.test', 'Garnet.test.cluster' ]\n steps:\n - name: Check out code\n uses: actions/checkout@v4\n - name: Setup .NET 8.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 8.0.x\n - name: Setup .NET 9.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 9.0.x\n - name: Install dependencies\n run: dotnet restore\n - name: Check style format\n run: dotnet format --verify-no-changes --verbosity diagnostic\n - name: Build Garnet\n run: dotnet build --configuration ${{ matrix.configuration }}\n - name: Run tests ${{ matrix.test }}\n run: dotnet test test/${{ matrix.test }} -f ${{ matrix.framework }} --logger "console;verbosity=detailed" --logger trx --results-directory "GarnetTestResults-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}-${{ matrix.test }}"\n timeout-minutes: 45 \n - name: Upload test results\n uses: actions/upload-artifact@v4\n with:\n name: dotnet-garnet-results-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}-${{ matrix.test }}\n path: GarnetTestResults-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}-${{ matrix.test }}\n if: ${{ always() }}\n\n # Job to build and test Tsavorite code\n build-test-tsavorite: \n name: Tsavorite\n runs-on: ${{ matrix.os }}\n strategy:\n fail-fast: false\n matrix:\n os: [ ubuntu-22.04, ubuntu-24.04, windows-2022, windows-2019 ]\n framework: [ 'net8.0', 'net9.0' ]\n configuration: [ 'Debug', 'Release' ]\n steps:\n - name: Check out code\n uses: 
actions/checkout@v4\n - name: Set workaround for libaio on Ubuntu 24.04 (see https://askubuntu.com/questions/1512196/libaio1-on-noble/1512197#1512197)\n run: |\n sudo ln -s /usr/lib/x86_64-linux-gnu/libaio.so.1t64 /usr/lib/x86_64-linux-gnu/libaio.so.1\n if: ${{ matrix.os == 'ubuntu-24.04' }} \n - name: Set environment variable for Linux\n run: echo "RunAzureTests=yes" >> $GITHUB_ENV\n if: ${{ matrix.os == 'ubuntu-24.04' }}\n - name: Set environment variable for Windows\n run: echo ("RunAzureTests=yes") >> $env:GITHUB_ENV\n if: ${{ startsWith(matrix.os, 'windows') }} # the matrix uses windows-2022/windows-2019, so a windows-latest equality check would never match\n - name: Setup .NET 8.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 8.0.x\n - name: Setup .NET 9.0\n uses: actions/setup-dotnet@v4\n with:\n dotnet-version: 9.0.x\n - name: Setup Node.js for Azurite\n uses: actions/setup-node@v4\n with:\n node-version: '20'\n - name: Install and Run Azurite\n shell: bash\n run: |\n npm install -g azurite\n azurite &\n - name: Install dependencies\n run: dotnet restore\n - name: Format\n run: dotnet format --verify-no-changes --verbosity diagnostic\n - name: Build Tsavorite\n run: dotnet build libs/storage/Tsavorite/cs/test/Tsavorite.test.csproj --configuration ${{ matrix.configuration }}\n - name: Run Tsavorite tests\n run: dotnet test libs/storage/Tsavorite/cs/test/Tsavorite.test.csproj -f ${{ matrix.framework }} --logger "console;verbosity=detailed" --logger trx --results-directory "TsavoriteTestResults-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}"\n timeout-minutes: 45 \n - name: Upload test results\n uses: actions/upload-artifact@v4\n with:\n name: dotnet-tsavorite-results-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}\n path: TsavoriteTestResults-${{ matrix.os }}-${{ matrix.framework }}-${{ matrix.configuration }}\n if: ${{ always() }}\n\n pipeline-success:\n name: Garnet Nightly (Complete)\n runs-on: ubuntu-latest\n needs: [ build-test-garnet, build-test-tsavorite ]\n steps:\n - run: echo Done!\n if: ${{ !(failure() || cancelled()) }}\n \n | dataset_sample\yaml\microsoft_garnet\.github\workflows\nightly.yml | nightly.yml | YAML | 4,513 | 0.8 | 0.086957 | 0.009174 | python-kit | 107 | 2023-09-28T05:22:04.747095 | BSD-3-Clause | false | 3f22ff7127839126f8e5ccea9fabc490 |
badrishc:\n name: Badrish Chandramouli\n title: Partner Research Manager, Microsoft Research\n url: https://badrish.net\n image_url: https://badrish.net/assets/icons/badrish4.jpg\n\nhkhalid:\n name: Hamdaan Khalid\n title: Software Engineer, Azure Resource Graph\n url: https://hamdaan-rails-personal.herokuapp.com/\n image_url: https://media.licdn.com/dms/image/v2/D5603AQEB5k6B-_kYcg/profile-displayphoto-shrink_800_800/profile-displayphoto-shrink_800_800/0/1713142460509?e=1743033600&v=beta&t=efEhRJq1SLgi09uCSUQJN3ssq-_cwljG0ysUc54GcSc\n | dataset_sample\yaml\microsoft_garnet\website\blog\authors.yml | authors.yml | YAML | 538 | 0.8 | 0 | 0 | vue-tools | 682 | 2025-04-07T19:35:41.706919 | GPL-3.0 | false | 599d12fe18d77884f796401f3404ac96 |
dirs: \n - .\nignorePatterns:\n - pattern: "/github/"\n - pattern: "./actions"\n - pattern: "./blob"\n - pattern: "./issues"\n - pattern: "./discussions"\n - pattern: "./pulls"\n - pattern: "https://platform.openai.com"\n - pattern: "https://outlook.office.com/bookings"\nexcludedFiles:\n # Files that are temporarily excluded because they contain links that are temporarily unavailable.\n - ./dotnet/src/Experimental/Process.IntegrationTestRunner.Dapr/README.md # "Cannot reach https://docs.dapr.io/getting-started/install-dapr-selfhost/. Status: 404"\n - ./python/DEV_SETUP.md # "Cannot reach https://code.visualstudio.com/docs/editor/workspaces. Status: 404"\nexcludedDirs:\n # Folders which include links to localhost, since it's not ignored with regular expressions\n - ./python/samples/demos/telemetry\n - ./python/samples/demos/process_with_dapr\n - ./dotnet/samples/Demos/ProcessWithDapr\n - ./dotnet/samples/Demos/CopilotAgentPlugins\n # Exclude the decisions folder that contains documents with links prone to becoming broken and temporarily unavailable. This should be removed when the link checker is implemented as a background job that does not block PRs.\n - ./docs/decisions/\nbaseUrl: https://github.com/microsoft/semantic-kernel/\naliveStatusCodes: \n - 200\n - 206\n - 429\n - 500\n - 503\nuseGitIgnore: true\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\.linkspector.yml | .linkspector.yml | YAML | 1,412 | 0.8 | 0 | 0.096774 | react-lib | 688 | 2023-08-08T07:43:17.730297 | Apache-2.0 | false | 206847ca9be781705c2f2c3baeb88b1e |
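The `ignorePatterns` entries in the linkspector config above are regular expressions matched against each discovered link. A minimal Python sketch of that filtering semantics follows — an illustration only, not linkspector's actual implementation, and the pattern list is abridged from the config:

```python
import re

# Abridged from the ignorePatterns list above; each pattern is treated
# as a regular expression searched against the candidate link URL.
IGNORE_PATTERNS = [
    r"/github/",
    r"https://platform.openai.com",
    r"https://outlook.office.com/bookings",
]

def should_check(url: str) -> bool:
    """Return True when a link is NOT matched by any ignore pattern."""
    return not any(re.search(p, url) for p in IGNORE_PATTERNS)
```

With these patterns, OpenAI platform links and anything containing `/github/` are skipped, while ordinary documentation links are still checked.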
# To get started with Dependabot version updates, you'll need to specify which\n# package ecosystems to update and where the package manifests are located.\n# Please see the documentation for all configuration options:\n# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates\n\nversion: 2\nupdates:\n # Maintain dependencies for nuget\n - package-ecosystem: "nuget"\n directory: "dotnet/"\n schedule:\n interval: "weekly"\n day: "monday"\n ignore:\n # For all System.* and Microsoft.Extensions/Bcl.* packages, ignore all major version updates\n - dependency-name: "System.*"\n update-types: ["version-update:semver-major"]\n - dependency-name: "Microsoft.Extensions.*"\n update-types: ["version-update:semver-major"]\n - dependency-name: "Microsoft.Bcl.*"\n update-types: ["version-update:semver-major"]\n - dependency-name: "Moq"\n labels:\n - ".NET"\n - "dependencies"\n\n # Maintain dependencies for nuget\n - package-ecosystem: "nuget"\n directory: "samples/dotnet"\n schedule:\n interval: "weekly"\n day: "monday"\n\n # Maintain dependencies for npm\n - package-ecosystem: "npm"\n directory: "samples/apps"\n schedule:\n interval: "weekly"\n day: "monday"\n\n # Maintain dependencies for pip\n - package-ecosystem: "pip"\n directory: "python/"\n schedule:\n interval: "weekly"\n day: "monday"\n labels:\n - "python"\n - "dependencies"\n\n # Maintain dependencies for github-actions\n - package-ecosystem: "github-actions"\n # Workflow files stored in the\n # default location of `.github/workflows`\n directory: "/"\n schedule:\n interval: "weekly"\n day: "monday"\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\dependabot.yml | dependabot.yml | YAML | 1,734 | 0.8 | 0.12069 | 0.226415 | react-lib | 95 | 2024-10-18T00:04:02.152642 | BSD-3-Clause | false | 9a83b87f8ad12d7e64e117bb4a52fadf |
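The dependabot config above combines wildcard `dependency-name` patterns with `update-types` filters — e.g. `System.*` packages skip only major bumps, while `Moq` (listed with no `update-types`) is ignored for all updates. A small Python sketch of that matching logic, as an illustration of the rule semantics rather than Dependabot's real implementation:

```python
from fnmatch import fnmatch

# Ignore rules mirroring the nuget section of the config above:
# (dependency-name pattern, set of ignored update types, or None for all).
IGNORE_RULES = [
    ("System.*", {"version-update:semver-major"}),
    ("Microsoft.Extensions.*", {"version-update:semver-major"}),
    ("Microsoft.Bcl.*", {"version-update:semver-major"}),
    ("Moq", None),  # no update-types listed -> every update ignored
]

def update_type(old: str, new: str) -> str:
    """Classify a version bump as semver major/minor/patch."""
    o, n = old.split("."), new.split(".")
    if o[0] != n[0]:
        return "version-update:semver-major"
    if o[1] != n[1]:
        return "version-update:semver-minor"
    return "version-update:semver-patch"

def is_ignored(name: str, old: str, new: str) -> bool:
    for pattern, types in IGNORE_RULES:
        if fnmatch(name, pattern):
            if types is None or update_type(old, new) in types:
                return True
    return False
```

Under these rules a `System.Text.Json` 7.x → 8.x bump is suppressed, 8.0.0 → 8.0.1 still surfaces, and `Moq` updates never appear.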
# Add 'kernel' label to any change within Connectors, Extensions, Skills, and tests directories\nkernel:\n - dotnet/src/Connectors/**/*\n - dotnet/src/Extensions/**/*\n - dotnet/src/Skills/**/*\n - dotnet/src/IntegrationTests/**/*\n - dotnet/src/SemanticKernel.UnitTests/**/*\n\n# Add 'kernel.core' label to any change within the 'SemanticKernel', 'SemanticKernel.Abstractions', or 'SemanticKernel.MetaPackage' directories\nkernel.core:\n - dotnet/src/SemanticKernel/**/*\n - dotnet/src/SemanticKernel.Abstractions/**/*\n - dotnet/src/SemanticKernel.MetaPackage/**/*\n\n# Add 'python' label to any change within the 'python' directory\npython:\n - python/**/*\n\n# Add 'java' label to any change within the 'java' directory\njava:\n - java/**/*\n\n# Add 'samples' label to any change within the 'samples' directory\nsamples:\n - samples/**/*\n\n# Add '.NET' label to any change within samples or kernel 'dotnet' directories.\n.NET:\n - dotnet/**/*\n\n# Add 'copilot chat' label to any change within the 'samples/apps/copilot-chat-app' directory\ncopilot chat:\n - samples/apps/copilot-chat-app/**/*\n\n# Add 'documentation' label to any change within the 'docs' directory, or any '.md' files\ndocumentation:\n - docs/**/*\n - '**/*.md'\n\n# Add 'memory' label to any memory connectors in dotnet/ or python/\nmemory:\n - dotnet/src/Connectors/Connectors.Memory.*/**/*\n - python/semantic_kernel/connectors/memory/**/*\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\labeler.yml | labeler.yml | YAML | 1,392 | 0.8 | 0 | 0.257143 | vue-tools | 867 | 2024-08-16T01:03:11.759433 | Apache-2.0 | false | 17fc1b0f4bdbee1589d4a5f8e0c8c36b |
name: Close inactive issues\non:\n schedule:\n - cron: "30 1 * * *"\n\njobs:\n close-issues:\n runs-on: ubuntu-latest\n permissions:\n issues: write\n pull-requests: write\n steps:\n - uses: actions/stale@v5\n with:\n days-before-issue-stale: 90\n days-before-issue-close: 14\n stale-issue-label: "stale"\n stale-issue-message: "This issue is stale because it has been open for 90 days with no activity."\n close-issue-message: "This issue was closed because it has been inactive for 14 days since being marked as stale."\n days-before-pr-stale: -1\n days-before-pr-close: -1\n repo-token: ${{ secrets.GITHUB_TOKEN }}\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\close-inactive-issues.yml | close-inactive-issues.yml | YAML | 705 | 0.7 | 0.090909 | 0 | vue-tools | 814 | 2024-08-13T04:48:37.325938 | MIT | false | 472bab88b86b214ea1af1287778c8085 |
# CodeQL is the code analysis engine developed by GitHub to automate security checks.\n# The results are shown as code scanning alerts in GitHub. For more details, visit:\n# https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/about-code-scanning-with-codeql\n\nname: "CodeQL"\n\non:\n push:\n # TODO: Add "feature*" back in again, once we determine the cause of the ongoing CodeQL failures.\n branches: ["main", "experimental*", "*-development"]\n schedule:\n - cron: "17 11 * * 2"\n\njobs:\n analyze:\n name: Analyze\n runs-on: ubuntu-latest\n permissions:\n actions: read\n contents: read\n security-events: write\n\n strategy:\n fail-fast: false\n matrix:\n language: ["csharp", "python"]\n # CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]\n # Use only 'java' to analyze code written in Java, Kotlin or both\n # Use only 'javascript' to analyze code written in JavaScript, TypeScript or both\n # Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n # Initializes the CodeQL tools for scanning.\n - name: Initialize CodeQL\n uses: github/codeql-action/init@v3\n with:\n languages: ${{ matrix.language }}\n # If you wish to specify custom queries, you can do so here or in a config file.\n # By default, queries listed here will override any specified in a config file.\n # Prefix the list here with "+" to use these queries and those in the config file.\n\n # Details on CodeQL's query packs refer to : https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs\n # queries: security-extended,security-and-quality\n\n # Autobuild attempts to build any compiled languages (C/C++, C#, Go, or Java).\n # If this step fails, then you should 
remove it and run the build manually (see below)\n - name: Autobuild\n if: ${{ matrix.language != 'java' }}\n uses: github/codeql-action/autobuild@v3\n\n # ℹ️ Command-line programs to run using the OS shell.\n # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun\n\n # If the Autobuild fails above, remove it and uncomment the following three lines,\n # then modify them (or add more) to build your code; refer to the example below for guidance.\n\n # - run: |\n # echo "Run, Build Application using script"\n # ./location_of_script_within_repo/buildscript.sh\n\n - name: Perform CodeQL Analysis\n uses: github/codeql-action/analyze@v3\n with:\n category: "/language:${{matrix.language}}"\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\codeql-analysis.yml | codeql-analysis.yml | YAML | 3,002 | 0.8 | 0.101449 | 0.403509 | node-utils | 564 | 2025-01-26T19:25:36.201180 | BSD-3-Clause | false | b0712a0a378f932aaa85ba198c43f79d |
name: Generate PR Description\n\non:\n issue_comment:\n types: [created]\n\njobs:\n generate-pr-description:\n permissions:\n pull-requests: write\n statuses: write\n runs-on: ubuntu-latest\n if: github.event.issue.pull_request && contains(github.event.comment.body, '/sk generate-pr-description')\n steps:\n - name: Get PR branch\n uses: xt0rted/pull-request-comment-branch@v3\n id: comment-branch\n\n - name: Set latest commit status as pending\n uses: myrotvorets/set-commit-status-action@v2.0.1\n with:\n sha: ${{ steps.comment-branch.outputs.head_sha }}\n token: ${{ secrets.GITHUB_TOKEN }}\n status: pending\n\n - name: Generate PR description\n uses: mkarle/skonsole-generate-pr-description@v1\n with:\n pull-request-number: ${{ github.event.issue.number }}\n pull-request-diff-url: ${{ github.event.issue.pull_request.diff_url }}\n token: ${{ secrets.GITHUB_TOKEN }}\n update-type: ${{ contains(github.event.comment.body, 'replace') && 'replace' || (contains(github.event.comment.body, 'prefix') && 'prefix' || 'suffix') }}\n env: # Set Azure credentials secret as an input\n AZURE_OPENAI_CHAT_DEPLOYMENT_NAME: ${{ vars.AZUREOPENAI__CHAT__DEPLOYMENTNAME }}\n AZURE_OPENAI_API_ENDPOINT: ${{ secrets.AZUREOPENAI__ENDPOINT }}\n AZURE_OPENAI_API_KEY: ${{ secrets.AZUREOPENAI__APIKEY }}\n \n - name: Set latest commit status as ${{ job.status }}\n uses: myrotvorets/set-commit-status-action@v2.0.1\n if: always()\n with:\n sha: ${{ steps.comment-branch.outputs.head_sha }}\n token: ${{ secrets.GITHUB_TOKEN }}\n status: ${{ job.status }}\n\n - name: Add comment to PR\n uses: actions/github-script@v6\n if: always()\n with:\n script: |\n const name = '${{ github.workflow }}';\n const url = '${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}';\n const success = '${{ job.status }}' === 'success';\n const body = `${name}: ${success ? 
'succeeded ✅' : 'failed ❌'}\n${url}`;\n\n await github.rest.issues.createComment({\n issue_number: context.issue.number,\n owner: context.repo.owner,\n repo: context.repo.repo,\n body: body\n })\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\generate-pr-description.yml | generate-pr-description.yml | YAML | 2,405 | 0.8 | 0.04918 | 0 | react-lib | 941 | 2024-05-25T15:36:23.636106 | Apache-2.0 | false | b7af52d8afbcb54a7bb180a9550ef943 |
name: Label Discussions\n\non:\n discussion:\n types:\n - created\n\njobs:\n label_discussions:\n runs-on: ubuntu-latest\n permissions:\n discussions: write\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n\n steps:\n - name: Always add "triage" label\n if: >\n github.event.discussion.node_id != '' &&\n (github.event.discussion.category.name == 'General' ||\n github.event.discussion.category.name == 'Ideas' ||\n github.event.discussion.category.name == 'Q&A')\n uses: octokit/graphql-action@v2.x\n with:\n query: |\n mutation($labelableId: ID!) {\n addLabelsToLabelable(\n input: {\n labelableId: $labelableId\n labelIds: ["LA_kwDOJDJ_Yc8AAAABU2klmQ"]\n }\n ) {\n clientMutationId\n }\n }\n variables: |\n {\n "labelableId": "${{ github.event.discussion.node_id }}"\n }\n\n - name: Conditionally add "python" label\n if: >\n github.event.discussion.node_id != '' &&\n (github.event.discussion.category.name == 'General' ||\n github.event.discussion.category.name == 'Ideas' ||\n github.event.discussion.category.name == 'Q&A') &&\n (contains(github.event.discussion.body, 'python') ||\n contains(github.event.discussion.body, 'Python') ||\n contains(github.event.discussion.title, 'python') ||\n contains(github.event.discussion.title, 'Python'))\n uses: octokit/graphql-action@v2.x\n with:\n query: |\n mutation($labelableId: ID!) 
{\n addLabelsToLabelable(\n input: {\n labelableId: $labelableId\n labelIds: ["LA_kwDOJDJ_Yc8AAAABO0f2Lg"]\n }\n ) {\n clientMutationId\n }\n }\n variables: |\n {\n "labelableId": "${{ github.event.discussion.node_id }}"\n }\n\n - name: Conditionally add ".NET" label\n if: >\n github.event.discussion.node_id != '' &&\n (github.event.discussion.category.name == 'General' ||\n github.event.discussion.category.name == 'Ideas' ||\n github.event.discussion.category.name == 'Q&A') &&\n (contains(github.event.discussion.body, '.net') ||\n contains(github.event.discussion.body, '.NET') ||\n contains(github.event.discussion.title, '.net') ||\n contains(github.event.discussion.title, '.NET') ||\n contains(github.event.discussion.body, 'dotnet') ||\n contains(github.event.discussion.body, 'DotNet') ||\n contains(github.event.discussion.title, 'dotnet') ||\n contains(github.event.discussion.title, 'DotNet') ||\n contains(github.event.discussion.body, 'c#') ||\n contains(github.event.discussion.body, 'C#') ||\n contains(github.event.discussion.title, 'c#') ||\n contains(github.event.discussion.title, 'C#') ||\n contains(github.event.discussion.body, 'csharp') ||\n contains(github.event.discussion.body, 'CSharp') ||\n contains(github.event.discussion.title, 'csharp') ||\n contains(github.event.discussion.title, 'CSharp'))\n uses: octokit/graphql-action@v2.x\n with:\n query: |\n mutation($labelableId: ID!) {\n addLabelsToLabelable(\n input: {\n labelableId: $labelableId\n labelIds: ["LA_kwDOJDJ_Yc8AAAABN_r7Lw"]\n }\n ) {\n clientMutationId\n }\n }\n variables: |\n {\n "labelableId": "${{ github.event.discussion.node_id }}"\n }\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\label-discussions.yml | label-discussions.yml | YAML | 3,874 | 0.8 | 0.028037 | 0 | node-utils | 595 | 2024-08-28T17:38:16.369740 | MIT | false | 7e0944e002bc0cf7a0a676c95586451c |
name: Label issues\non:\n issues:\n types:\n - reopened\n - opened\n\njobs:\n label_issues:\n name: "Issue: add labels"\n if: ${{ github.event.action == 'opened' || github.event.action == 'reopened' }}\n runs-on: ubuntu-latest\n permissions:\n issues: write\n steps:\n - uses: actions/github-script@v6\n with:\n github-token: ${{ secrets.GH_ACTIONS_PR_WRITE }}\n script: |\n // Get the issue body and title\n const body = context.payload.issue.body\n let title = context.payload.issue.title\n \n // Define the labels array\n let labels = ["triage"]\n \n // Check if the body or the title contains the word 'python' (case-insensitive)\n if ((body != null && body.match(/python/i)) || (title != null && title.match(/python/i))) {\n // Add the 'python' label to the array\n labels.push("python")\n }\n \n // Check if the body or the title contains the word 'java' (case-insensitive)\n if ((body != null && body.match(/java/i)) || (title != null && title.match(/java/i))) {\n // Add the 'java' label to the array\n labels.push("java")\n }\n \n // Check if the body or the title contains the words 'dotnet', '.net', 'c#' or 'csharp' (case-insensitive; the dot is escaped so '.net' does not match arbitrary words ending in 'net')\n if ((body != null && body.match(/\.net/i)) || (title != null && title.match(/\.net/i)) ||\n (body != null && body.match(/dotnet/i)) || (title != null && title.match(/dotnet/i)) ||\n (body != null && body.match(/C#/i)) || (title != null && title.match(/C#/i)) ||\n (body != null && body.match(/csharp/i)) || (title != null && title.match(/csharp/i))) {\n // Add the '.NET' label to the array\n labels.push(".NET")\n }\n\n // Add the labels to the issue\n github.rest.issues.addLabels({\n issue_number: context.issue.number,\n owner: context.repo.owner,\n repo: context.repo.repo,\n labels: labels\n });\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\label-issues.yml | label-issues.yml | YAML | 2,167 | 0.8 | 0.12963 | 0.1875 | react-lib | 998 | 2023-12-26T15:09:52.308552 | Apache-2.0 | false | c2fcfba26566139476438949001885c1 |
name: Create Issue when Needs Port label is added\non:\n issues:\n types: [labeled]\n pull_request_target:\n types: [labeled]\n\njobs:\n create_issue:\n if: contains(github.event.pull_request.labels.*.name, 'needs_port_to_dotnet') || contains(github.event.pull_request.labels.*.name, 'needs_port_to_python') || contains(github.event.issue.labels.*.name, 'needs_port_to_dotnet') || contains(github.event.issue.labels.*.name, 'needs_port_to_python')\n name: "Create Issue"\n continue-on-error: true\n runs-on: ubuntu-latest\n permissions:\n issues: write\n pull-requests: read\n env:\n GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n GH_REPO: ${{ github.repository }}\n\n steps:\n - name: Create dotnet issue\n if: contains(github.event.pull_request.labels.*.name, 'needs_port_to_dotnet') || contains(github.event.issue.labels.*.name, 'needs_port_to_dotnet')\n env:\n ORIGINAL_URL: ${{ github.event.issue.html_url || github.event.pull_request.html_url }}\n ORIGINAL_BODY: ${{ github.event.issue.body || github.event.pull_request.body }}\n ORIGINAL_TITLE: ${{ github.event.issue.title || github.event.pull_request.title }}\n ORIGINAL_NUMBER: ${{ github.event.issue.number || github.event.pull_request.number }}\n run: |\n {\n echo "# Original issue"\n echo "$ORIGINAL_URL"\n echo "## Description"\n echo "$ORIGINAL_BODY"\n echo "\n Relates to #$ORIGINAL_NUMBER"\n } > issue_body.md\n new_issue_url=$(gh issue create \\n --title "Port python feature: ${{ github.event.issue.title || github.event.pull_request.title }}" \\n --label ".NET" \\n --body-file issue_body.md)\n - name: Create python issue\n if: contains(github.event.pull_request.labels.*.name, 'needs_port_to_python') || contains(github.event.issue.labels.*.name, 'needs_port_to_python')\n env:\n ORIGINAL_URL: ${{ github.event.issue.html_url || github.event.pull_request.html_url }}\n ORIGINAL_BODY: ${{ github.event.issue.body || github.event.pull_request.body }}\n ORIGINAL_TITLE: ${{ github.event.issue.title || github.event.pull_request.title 
}}\n ORIGINAL_NUMBER: ${{ github.event.issue.number || github.event.pull_request.number }}\n run: |\n {\n echo "# Original issue"\n echo "$ORIGINAL_URL"\n echo "## Description"\n echo "$ORIGINAL_BODY"\n echo "\n Relates to #$ORIGINAL_NUMBER"\n } > issue_body.md\n new_issue_url=$(gh issue create \\n --title "Port dotnet feature: ${{ github.event.issue.title || github.event.pull_request.title }}" \\n --label "python" \\n --body-file issue_body.md)\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\label-needs-port.yml | label-needs-port.yml | YAML | 2,783 | 0.8 | 0.050847 | 0 | python-kit | 414 | 2025-03-15T23:39:22.800349 | Apache-2.0 | false | 0a037bef3fa3c072980ffa42c905404e |
# This workflow will triage pull requests and apply a label based on the\n# paths that are modified in the pull request.\n#\n# To use this workflow, you will need to set up a .github/labeler.yml\n# file with configuration. For more information, see:\n# https://github.com/actions/labeler\n\nname: Label pull request\non: [pull_request_target]\n\njobs:\n add_label:\n runs-on: ubuntu-latest\n permissions:\n contents: read\n pull-requests: write\n\n steps:\n - uses: actions/labeler@v4\n with:\n repo-token: "${{ secrets.GH_ACTIONS_PR_WRITE }}"\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\label-pr.yml | label-pr.yml | YAML | 566 | 0.8 | 0 | 0.333333 | python-kit | 6 | 2025-01-07T19:32:36.539222 | Apache-2.0 | false | fb0f6fa5ed445da2441d19e32e983122 |
name: Label title prefix\non:\n issues:\n types: [labeled]\n pull_request_target:\n types: [labeled]\n\njobs:\n add_title_prefix:\n name: "Issue/PR: add title prefix"\n continue-on-error: true\n runs-on: ubuntu-latest\n permissions:\n issues: write\n pull-requests: write\n\n steps:\n - uses: actions/github-script@v6\n name: "Issue/PR: update title"\n with:\n github-token: ${{ secrets.GITHUB_TOKEN }}\n script: |\n let prefixLabels = {\n "python": "Python",\n "java": "Java",\n ".NET": ".Net"\n };\n\n function addTitlePrefix(title, prefix)\n {\n // Update the title based on the label and prefix\n // Check if the title starts with the prefix (case-sensitive)\n if (!title.startsWith(prefix + ": ")) {\n // If not, check if the first word is the label (case-insensitive)\n if (title.match(new RegExp(`^${prefix}`, 'i'))) {\n // If yes, replace it with the prefix (case-sensitive)\n title = title.replace(new RegExp(`^${prefix}`, 'i'), prefix);\n } else {\n // If not, prepend the prefix to the title\n title = prefix + ": " + title;\n }\n }\n\n return title;\n }\n\n const labelAdded = context.payload.label.name\n\n // Check if the issue or PR has the label\n if (labelAdded in prefixLabels) {\n let prefix = prefixLabels[labelAdded];\n switch(context.eventName) {\n case 'issues':\n github.rest.issues.update({\n issue_number: context.issue.number,\n owner: context.repo.owner,\n repo: context.repo.repo,\n title: addTitlePrefix(context.payload.issue.title, prefix)\n });\n break\n\n case 'pull_request_target':\n github.rest.pulls.update({\n pull_number: context.issue.number,\n owner: context.repo.owner,\n repo: context.repo.repo,\n title: addTitlePrefix(context.payload.pull_request.title, prefix)\n });\n break\n default:\n core.setFailed('Unrecognized eventName: ' + context.eventName);\n }\n }\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\label-title-prefix.yml | label-title-prefix.yml | YAML | 2,485 | 0.95 | 0.109589 | 0.090909 | python-kit | 616 | 2023-09-18T04:31:16.771891 | MIT | false | 33f0cf2558c632a60ae10ff51bf93c89 |
name: Check .md links\n\non:\n workflow_dispatch:\n pull_request:\n branches: ["main"]\n\npermissions:\n contents: read\n\njobs:\n markdown-link-check:\n runs-on: ubuntu-22.04\n # check out the latest version of the code\n steps:\n - uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n # Checks the status of hyperlinks in all files\n - name: Run linkspector\n uses: umbrelladocs/action-linkspector@v1\n with:\n reporter: local\n filter_mode: nofilter\n fail_on_error: true\n config_file: ".github/.linkspector.yml"\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\markdown-link-check.yml | markdown-link-check.yml | YAML | 602 | 0.8 | 0 | 0.086957 | python-kit | 28 | 2023-09-07T23:09:03.045204 | MIT | false | 32af3bf333171061c7105e76040d2359 |
name: Merge Gatekeeper\n\non:\n pull_request:\n branches: [ "main", "feature*" ]\n merge_group:\n branches: ["main"]\n\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}\n cancel-in-progress: true\n\njobs:\n merge-gatekeeper:\n runs-on: ubuntu-latest\n # Restrict permissions of the GITHUB_TOKEN.\n # Docs: https://docs.github.com/en/actions/using-jobs/assigning-permissions-to-jobs\n permissions:\n checks: read\n statuses: read\n steps:\n - name: Run Merge Gatekeeper\n # NOTE: v1 is updated to reflect the latest v1.x.y. Please use any tag/branch that suits your needs:\n # https://github.com/upsidr/merge-gatekeeper/tags\n # https://github.com/upsidr/merge-gatekeeper/branches\n uses: upsidr/merge-gatekeeper@v1\n if: github.event_name == 'pull_request'\n with:\n token: ${{ secrets.GITHUB_TOKEN }}\n timeout: 3600\n interval: 30\n ignored: python-tests-coverage\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\merge-gatekeeper.yml | merge-gatekeeper.yml | YAML | 1,016 | 0.8 | 0.03125 | 0.172414 | python-kit | 471 | 2025-02-08T18:48:35.357676 | GPL-3.0 | false | 1cc3e3dfc226701434dda639f920fc44 |
name: Python Build Assets\n\non:\n release:\n types: [published]\n\npermissions:\n contents: write\n id-token: write\n\njobs:\n python-build-assets:\n if: github.event_name == 'release' && startsWith(github.event.release.tag_name, 'python-')\n name: Python Build Assets and add to Release\n runs-on: ubuntu-latest\n environment: "integration"\n env:\n UV_PYTHON: "3.10"\n steps:\n - uses: actions/checkout@v4\n - name: Set up uv\n uses: astral-sh/setup-uv@v5\n with:\n version: "0.5.x"\n enable-cache: true\n cache-suffix: ${{ runner.os }}-${{ matrix.python-version }}\n cache-dependency-glob: "**/uv.lock"\n - name: Check version\n run: |\n echo "Building and uploading Python package version: ${{ github.event.release.tag_name }}"\n - name: Build the package\n run: cd python && make build\n - name: Release\n uses: softprops/action-gh-release@v2\n with:\n files: |\n python/dist/*\n - name: Azure Login\n uses: azure/login@v2\n with:\n client-id: ${{ secrets.AZURE_CLIENT_ID }}\n tenant-id: ${{ secrets.AZURE_TENANT_ID }}\n subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}\n - name: Start DevOps pipeline\n uses: azure/cli@v2\n with:\n inlineScript: |\n az pipelines run --id ${{ vars.ADO_PYTHON_RELEASE_ID }} --org ${{ vars.ADO_ORG }} --project ${{ vars.ADO_PROJECT_NAME }} --parameters tag=${{ github.event.release.tag_name }} delay=0\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\python-build.yml | python-build.yml | YAML | 1,545 | 0.8 | 0.020833 | 0 | python-kit | 961 | 2023-09-01T07:14:24.385104 | GPL-3.0 | false | 6637ff0edcfeef0e39e1a10105b8d57e |
name: Python Code Quality Checks\non:\n workflow_dispatch:\n pull_request:\n branches: ["main", "feature*"]\n paths:\n - "python/**"\n\njobs:\n pre-commit:\n if: "!cancelled()"\n strategy:\n fail-fast: false\n matrix:\n python-version: ["3.10"]\n runs-on: ubuntu-latest\n continue-on-error: true\n defaults:\n run:\n working-directory: python\n env:\n # Configure a constant location for the uv cache\n UV_CACHE_DIR: /tmp/.uv-cache\n UV_PYTHON: ${{ matrix.python-version }}\n steps:\n - uses: actions/checkout@v4\n - name: Set up uv\n uses: astral-sh/setup-uv@v5\n with:\n version: "0.5.x"\n enable-cache: true\n cache-suffix: ${{ runner.os }}-${{ matrix.python-version }}\n cache-dependency-glob: "**/uv.lock"\n - name: Install the project\n run: uv sync --all-extras --dev\n - uses: pre-commit/action@v3.0.1\n name: Run Pre-Commit Hooks\n with:\n extra_args: --config python/.pre-commit-config.yaml --all-files\n - name: Run Mypy\n run: uv run mypy -p semantic_kernel --config-file mypy.ini\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\python-lint.yml | python-lint.yml | YAML | 1,144 | 0.8 | 0.04878 | 0.025 | awesome-app | 340 | 2024-01-18T18:59:47.473555 | GPL-3.0 | false | 72de6bdb5799b3757ce59d73514f482c |
name: Python Start Release on ADO\n\non:\n workflow_dispatch:\n inputs:\n tag:\n description: "Tag to release"\n required: true\n\npermissions:\n contents: read\n id-token: write\n\njobs:\n python-start-ado-release-job:\n name: Trigger ADO Pipeline for Python Release\n runs-on: ubuntu-latest\n environment: "integration"\n steps:\n - name: Azure Login\n uses: azure/login@v2\n with:\n client-id: ${{ secrets.AZURE_CLIENT_ID }}\n tenant-id: ${{ secrets.AZURE_TENANT_ID }}\n subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}\n - name: Start DevOps pipeline\n uses: azure/cli@v2\n with:\n inlineScript: |\n az pipelines run --id ${{ vars.ADO_PYTHON_RELEASE_ID }} --org ${{ vars.ADO_ORG }} --project ${{ vars.ADO_PROJECT_NAME }} --parameters tag=${{ inputs.tag }} delay=0\n | dataset_sample\yaml\microsoft_semantic-kernel\.github\workflows\python-manual-release.yml | python-manual-release.yml | YAML | 868 | 0.85 | 0.033333 | 0 | awesome-app | 508 | 2024-05-10T17:56:04.140556 | GPL-3.0 | false | de4822b3106e52c22a13887e206c22c6 |
openapi: 3.0.1\ninfo:\n title: APOD - Subset\n description: 'This endpoint structures the APOD imagery and associated metadata so that it can be repurposed for other applications. In addition, if the concept_tags parameter is set to True, then keywords derived from the image explanation are returned. These keywords could be used as auto-generated hashtags for twitter or instagram feeds; but generally help with discoverability of relevant imagery'\n contact:\n email: evan.t.yates@nasa.gov\n license:\n name: Apache 2.0\n url: http://www.apache.org/licenses/LICENSE-2.0.html\n version: 1.0.0\nservers:\n - url: https://api.nasa.gov/planetary\n - url: http://api.nasa.gov/planetary\npaths:\n /apod:\n get:\n tags:\n - request tag\n summary: Returns images\n description: Returns the picture of the day\n operationId: apod\n parameters:\n - name: date\n in: query\n description: The date of the APOD image to retrieve\n style: form\n explode: false\n schema:\n type: string\n - name: hd\n in: query\n description: Retrieve the URL for the high resolution image\n style: form\n explode: false\n schema:\n type: boolean\n responses:\n '200':\n description: successful operation\n content:\n application/json:\n schema:\n type: array\n items: { }\n security:\n - api_key: [ ]\ncomponents:\n securitySchemes:\n api_key:\n type: apiKey\n name: api_key\n in: query | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\samples\Concepts\Resources\Plugins\CopilotAgentPlugins\AstronomyPlugin\astronomy-openapi.yml | astronomy-openapi.yml | YAML | 1,601 | 0.8 | 0.078431 | 0 | python-kit | 660 | 2024-11-08T07:51:38.140255 | GPL-3.0 | false | e45935b39d69d839d4fa5a03294308c1 |
openapi: 3.0.1\ninfo:\n title: OData Service for namespace microsoft.graph - Subset\n description: This OData service is located at https://graph.microsoft.com/v1.0\n version: v1.0\nservers:\n - url: https://graph.microsoft.com/v1.0\npaths:\n /me/messages:\n get:\n tags:\n - me.message\n summary: Get the messages in the signed-in user's mailbox\n description: Get the messages in the signed-in user's mailbox (including the Deleted Items and Clutter folders). Depending on the page size and mailbox data, getting messages from a mailbox can incur multiple requests. The default page size is 10 messages. Use $top to customize the page size, within the range of 1 and 1000. To improve the operation response time, use $select to specify the exact properties you need; see example 1 below. Fine-tune the values for $select and $top, especially when you must use a larger page size, as returning a page with hundreds of messages each with a full response payload may trigger the gateway timeout (HTTP 504). To get the next page of messages, simply apply the entire URL returned in @odata.nextLink to the next get-messages request. This URL includes any query parameters you may have specified in the initial request. Do not try to extract the $skip value from the @odata.nextLink URL to manipulate responses. This API uses the $skip value to keep count of all the items it has gone through in the user's mailbox to return a page of message-type items. It's therefore possible that even in the initial response, the $skip value is larger than the page size. For more information, see Paging Microsoft Graph data in your app. Currently, this operation returns message bodies in only HTML format. \nThere are two scenarios where an app can get messages in another user's mail folder\n operationId: me_ListMessages\n parameters:\n - name: includeHiddenMessages\n in: query\n description: Include Hidden Messages\n style: form\n explode: false\n schema:\n type: string\n - $ref: '#/components/parameters/top'\n - $ref: '#/components/parameters/skip'\n - $ref: '#/components/parameters/search'\n - $ref: '#/components/parameters/filter'\n - $ref: '#/components/parameters/count'\n - name: $orderby\n in: query\n description: Order items by property values\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n - name: $select\n in: query\n description: Select properties to be returned\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n - name: $expand\n in: query\n description: Expand related entities\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n responses:\n 2XX:\n $ref: '#/components/responses/microsoft.graph.messageCollectionResponse'\n x-ms-pageable:\n nextLinkName: '@odata.nextLink'\n operationName: listMore\n itemName: value\n /me/sendMail:\n post:\n tags:\n - me.user.Actions\n summary: Invoke action sendMail\n description: 'Send the message specified in the request body using either JSON or MIME format. When using JSON format, you can include a file attachment in the same sendMail action call. When using MIME format: This method saves the message in the Sent Items folder. Alternatively, create a draft message to send later. \nTo learn more about the steps involved in the backend before a mail is delivered to recipients, see here.'\n operationId: me_sendMail\n requestBody:\n $ref: '#/components/requestBodies/sendMailRequestBody'\n responses:\n '204':\n description: Success\ncomponents:\n schemas:\n microsoft.graph.message:\n title: message\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n categories:\n type: array\n items:\n type: string\n nullable: true\n description: The categories associated with the item\n changeKey:\n type: string\n description: 'Identifies the version of the item. Every time the item is changed, changeKey changes as well. This allows Exchange to apply changes to the correct version of the object. Read-only.'\n nullable: true\n createdDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n lastModifiedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n bccRecipients:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n description: 'The Bcc: recipients for the message.'\n body:\n $ref: '#/components/schemas/microsoft.graph.itemBody'\n bodyPreview:\n type: string\n description: The first 255 characters of the message body. \nIt is in text format.\n nullable: true\n ccRecipients:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n description: 'The Cc: recipients for the message.'\n conversationId:\n type: string\n description: The ID of the conversation the email belongs to.\n nullable: true\n conversationIndex:\n type: string\n description: Indicates the position of the message within the conversation.\n format: base64url\n nullable: true\n flag:\n $ref: '#/components/schemas/microsoft.graph.followupFlag'\n from:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n hasAttachments:\n type: boolean\n description: 'Indicates whether the message has attachments. This property doesn''t include inline attachments, so if a message contains only inline attachments, this property is false. To verify the existence of inline attachments, parse the body property to look for a src attribute, such as <IMG src=''cid:image001.jpg@01D26CD8.6C05F070''>.'\n nullable: true\n importance:\n $ref: '#/components/schemas/microsoft.graph.importance'\n inferenceClassification:\n $ref: '#/components/schemas/microsoft.graph.inferenceClassificationType'\n internetMessageHeaders:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.internetMessageHeader'\n description: A collection of message headers defined by RFC5322. The set includes message headers indicating the network path taken by a message from the sender to the recipient. It can also contain custom message headers that hold app data for the message. Returned only on applying a $select query option. Read-only.\n internetMessageId:\n type: string\n description: The message ID in the format specified by RFC2822.\n nullable: true\n isDeliveryReceiptRequested:\n type: boolean\n description: Indicates whether a read receipt is requested for the message.\n nullable: true\n isDraft:\n type: boolean\n description: Indicates whether the message is a draft. \nA message is a draft if it hasn't been sent yet.\n nullable: true\n isRead:\n type: boolean\n description: Indicates whether the message has been read.\n nullable: true\n isReadReceiptRequested:\n type: boolean\n description: Indicates whether a read receipt is requested for the message.\n nullable: true\n parentFolderId:\n type: string\n description: The unique identifier for the message's parent mailFolder.\n nullable: true\n receivedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The date and time the message was received. The date and time information uses ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z.'\n format: date-time\n nullable: true\n replyTo:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n description: The email addresses to use when replying.\n sender:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n sentDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The date and time the message was sent. The date and time information uses ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z.'\n format: date-time\n nullable: true\n subject:\n type: string\n description: The subject of the message.\n nullable: true\n toRecipients:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n description: 'The To: recipients for the message.'\n uniqueBody:\n $ref: '#/components/schemas/microsoft.graph.itemBody'\n webLink:\n type: string\n description: 'The URL to open the message in Outlook on the web.You can append an ispopout argument to the end of the URL to change how the message is displayed. \nIf ispopout is not present or if it is set to 1, then the message is shown in a popout window. If ispopout is set to 0, the browser shows the message in the Outlook on the web review pane.The message opens in the browser if you are signed in to your mailbox via Outlook on the web. You are prompted to sign in if you are not already signed in with the browser.This URL cannot be accessed from within an iFrame.'\n nullable: true\n attachments:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.attachment'\n description: The fileAttachment and itemAttachment attachments for the message.\n extensions:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.extension'\n description: The collection of open extensions defined for the message. Nullable.\n multiValueExtendedProperties:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.multiValueLegacyExtendedProperty'\n description: The collection of multi-value extended properties defined for the message. Nullable.\n singleValueExtendedProperties:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.singleValueLegacyExtendedProperty'\n description: The collection of single-value extended properties defined for the message. Nullable.\n microsoft.graph.recipient:\n title: recipient\n required:\n - '@odata.type'\n type: object\n properties:\n emailAddress:\n $ref: '#/components/schemas/microsoft.graph.emailAddress'\n '@odata.type':\n type: string\n discriminator:\n propertyName: '@odata.type'\n microsoft.graph.itemBody:\n title: itemBody\n required:\n - '@odata.type'\n type: object\n properties:\n content:\n type: string\n description: The content of the item.\n nullable: true\n contentType:\n $ref: '#/components/schemas/microsoft.graph.bodyType'\n '@odata.type':\n type: string\n description: The body of the message. It can be in HTML or text format. \nFind out about safe HTML in a message body.\n microsoft.graph.followupFlag:\n title: followupFlag\n required:\n - '@odata.type'\n type: object\n properties:\n completedDateTime:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n dueDateTime:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n flagStatus:\n $ref: '#/components/schemas/microsoft.graph.followupFlagStatus'\n startDateTime:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n '@odata.type':\n type: string\n description: 'The flag value that indicates the status, start date, due date, or completion date for the message.'\n microsoft.graph.importance:\n title: importance\n enum:\n - low\n - normal\n - high\n type: string\n description: 'The importance of the message. The possible values are: low, normal, and high.'\n microsoft.graph.inferenceClassificationType:\n title: inferenceClassificationType\n enum:\n - focused\n - other\n type: string\n description: 'The classification of the message for the user, based on inferred relevance or importance, or on an explicit override. The possible values are: focused or other.'\n microsoft.graph.internetMessageHeader:\n title: internetMessageHeader\n required:\n - '@odata.type'\n type: object\n properties:\n name:\n type: string\n description: Represents the key in a key-value pair.\n nullable: true\n value:\n type: string\n description: The value in a key-value pair.\n nullable: true\n '@odata.type':\n type: string\n microsoft.graph.attachment:\n title: attachment\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. \nRead-only.\n '@odata.type':\n type: string\n contentType:\n type: string\n description: The MIME type.\n nullable: true\n isInline:\n type: boolean\n description: 'true if the attachment is an inline attachment; otherwise, false.'\n lastModifiedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n name:\n type: string\n description: The attachment's file name.\n nullable: true\n size:\n maximum: 2147483647\n minimum: -2147483648\n type: number\n description: The length of the attachment in bytes.\n format: int32\n microsoft.graph.extension:\n title: extension\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n microsoft.graph.multiValueLegacyExtendedProperty:\n title: multiValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n value:\n type: array\n items:\n type: string\n nullable: true\n description: A collection of property values.\n microsoft.graph.singleValueLegacyExtendedProperty:\n title: singleValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. \nRead-only.\n '@odata.type':\n type: string\n value:\n type: string\n description: A property value.\n nullable: true\n microsoft.graph.messageCollectionResponse:\n title: Base collection pagination and count responses\n type: object\n properties:\n '@odata.count':\n type: integer\n format: int64\n nullable: true\n '@odata.nextLink':\n type: string\n nullable: true\n value:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.message'\n microsoft.graph.emailAddress:\n title: emailAddress\n required:\n - '@odata.type'\n type: object\n properties:\n address:\n type: string\n description: The email address of the person or entity.\n nullable: true\n name:\n type: string\n description: The display name of the person or entity.\n nullable: true\n '@odata.type':\n type: string\n description: The recipient's email address.\n microsoft.graph.bodyType:\n title: bodyType\n enum:\n - text\n - html\n type: string\n description: The type of the content. Possible values are text and html.\n microsoft.graph.dateTimeTimeZone:\n title: dateTimeTimeZone\n required:\n - '@odata.type'\n type: object\n properties:\n dateTime:\n type: string\n description: 'A single point of time in a combined date and time representation ({date}T{time}; for example, 2017-08-29T04:00:00.0000000).'\n timeZone:\n type: string\n description: 'Represents a time zone, for example, ''Pacific Standard Time''. See below for more possible values.'\n nullable: true\n '@odata.type':\n type: string\n description: The date and time that the follow-up was finished.\n microsoft.graph.followupFlagStatus:\n title: followupFlagStatus\n enum:\n - notFlagged\n - complete\n - flagged\n type: string\n description: 'The status for follow-up for an item. \nPossible values are notFlagged, complete, and flagged.'\n responses:\n microsoft.graph.messageCollectionResponse:\n description: Retrieved collection\n content:\n application/json:\n schema:\n $ref: '#/components/schemas/microsoft.graph.messageCollectionResponse'\n parameters:\n top:\n name: $top\n in: query\n description: Show only the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n example: 50\n skip:\n name: $skip\n in: query\n description: Skip the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n search:\n name: $search\n in: query\n description: Search items by search phrases\n style: form\n explode: false\n schema:\n type: string\n filter:\n name: $filter\n in: query\n description: Filter items by property values\n style: form\n explode: false\n schema:\n type: string\n count:\n name: $count\n in: query\n description: Include count of items\n style: form\n explode: false\n schema:\n type: boolean\n requestBodies:\n sendMailRequestBody:\n description: Action parameters\n content:\n application/json:\n schema:\n type: object\n properties:\n Message:\n $ref: '#/components/schemas/microsoft.graph.message'\n SaveToSentItems:\n type: boolean\n default: false\n nullable: true\n required: true\n | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\samples\Concepts\Resources\Plugins\CopilotAgentPlugins\AstronomyPlugin\messages-openapi.yml | messages-openapi.yml | YAML | 20,759 | 0.95 | 0.064327 | 0 | vue-tools | 392 | 2024-05-30T14:00:55.375612 | GPL-3.0 | false | a39ce52b4dbd09812535090331e93cd5 |
openapi: 3.0.4\ninfo:\n title: OData Service for namespace microsoft.graph - Subset\n description: This OData service is located at https://graph.microsoft.com/v1.0\n version: v1.0\nservers:\n - url: https://graph.microsoft.com/v1.0\npaths:\n /me/calendar/events:\n get:\n tags:\n - me.calendar\n summary: List events\n description: "Retrieve a list of events in a calendar. The calendar can be one for a user, or the default calendar of a Microsoft 365 group. The list of events contains single instance meetings and series masters. To get expanded event instances, you can get the calendar view, or\nget the instances of an event."\n operationId: me_calendar_ListEvents\n parameters:\n - $ref: '#/components/parameters/top'\n - $ref: '#/components/parameters/skip'\n - $ref: '#/components/parameters/search'\n - $ref: '#/components/parameters/filter'\n - $ref: '#/components/parameters/count'\n - name: $orderby\n in: query\n description: Order items by property values\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n - name: $select\n in: query\n description: Select properties to be returned\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n - name: $expand\n in: query\n description: Expand related entities\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n responses:\n 2XX:\n $ref: '#/components/responses/microsoft.graph.eventCollectionResponse'\n x-ms-pageable:\n nextLinkName: '@odata.nextLink'\n operationName: listMore\n itemName: value\n post:\n tags:\n - me.calendar\n summary: Create new navigation property to events for me\n operationId: me_calendar_CreateEvents\n requestBody:\n description: New navigation property\n content:\n application/json:\n schema:\n $ref: '#/components/schemas/microsoft.graph.event'\n required: true\n responses:\n 2XX:\n description: Created navigation property.\n content:\n application/json:\n 
schema:\n $ref: '#/components/schemas/microsoft.graph.event'\ncomponents:\n schemas:\n microsoft.graph.event:\n title: event\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n categories:\n type: array\n items:\n type: string\n nullable: true\n description: The categories associated with the item\n changeKey:\n type: string\n description: 'Identifies the version of the item. Every time the item is changed, changeKey changes as well. This allows Exchange to apply changes to the correct version of the object. Read-only.'\n nullable: true\n createdDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n lastModifiedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n allowNewTimeProposals:\n type: boolean\n description: 'true if the meeting organizer allows invitees to propose a new time when responding; otherwise, false. Optional. 
Default is true.'\n nullable: true\n attendees:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.attendee'\n description: The collection of attendees for the event.\n body:\n $ref: '#/components/schemas/microsoft.graph.itemBody'\n bodyPreview:\n type: string\n description: The preview of the message associated with the event. It is in text format.\n nullable: true\n end:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n hasAttachments:\n type: boolean\n description: Set to true if the event has attachments.\n nullable: true\n hideAttendees:\n type: boolean\n description: 'When set to true, each attendee only sees themselves in the meeting request and meeting Tracking list. Default is false.'\n nullable: true\n iCalUId:\n type: string\n description: A unique identifier for an event across calendars. This ID is different for each occurrence in a recurring series. Read-only.\n nullable: true\n importance:\n $ref: '#/components/schemas/microsoft.graph.importance'\n isAllDay:\n type: boolean\n description: 'Set to true if the event lasts all day. If true, regardless of whether it''s a single-day or multi-day event, start and end time must be set to midnight and be in the same time zone.'\n nullable: true\n isCancelled:\n type: boolean\n description: Set to true if the event has been canceled.\n nullable: true\n isDraft:\n type: boolean\n description: 'Set to true if the user has updated the meeting in Outlook but has not sent the updates to attendees. Set to false if all changes have been sent, or if the event is an appointment without any attendees.'\n nullable: true\n isOnlineMeeting:\n type: boolean\n description: 'True if this event has online meeting information (that is, onlineMeeting points to an onlineMeetingInfo resource), false otherwise. Default is false (onlineMeeting is null). Optional. After you set isOnlineMeeting to true, Microsoft Graph initializes onlineMeeting. 
Subsequently Outlook ignores any further changes to isOnlineMeeting, and the meeting remains available online.'\n nullable: true\n isOrganizer:\n type: boolean\n description: Set to true if the calendar owner (specified by the owner property of the calendar) is the organizer of the event (specified by the organizer property of the event). This also applies if a delegate organized the event on behalf of the owner.\n nullable: true\n isReminderOn:\n type: boolean\n description: Set to true if an alert is set to remind the user of the event.\n nullable: true\n location:\n $ref: '#/components/schemas/microsoft.graph.location'\n locations:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.location'\n description: 'The locations where the event is held or attended from. The location and locations properties always correspond with each other. If you update the location property, any prior locations in the locations collection would be removed and replaced by the new location value.'\n onlineMeeting:\n $ref: '#/components/schemas/microsoft.graph.onlineMeetingInfo'\n onlineMeetingProvider:\n $ref: '#/components/schemas/microsoft.graph.onlineMeetingProviderType'\n onlineMeetingUrl:\n type: string\n description: 'A URL for an online meeting. The property is set only when an organizer specifies in Outlook that an event is an online meeting such as Skype. Read-only.To access the URL to join an online meeting, use joinUrl which is exposed via the onlineMeeting property of the event. The onlineMeetingUrl property will be deprecated in the future.'\n nullable: true\n organizer:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n originalEndTimeZone:\n type: string\n description: The end time zone that was set when the event was created. 
A value of tzone://Microsoft/Custom indicates that a legacy custom time zone was set in desktop Outlook.\n nullable: true\n originalStart:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'Represents the start time of an event when it is initially created as an occurrence or exception in a recurring series. This property is not returned for events that are single instances. Its date and time information is expressed in ISO 8601 format and is always in UTC. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n originalStartTimeZone:\n type: string\n description: The start time zone that was set when the event was created. A value of tzone://Microsoft/Custom indicates that a legacy custom time zone was set in desktop Outlook.\n nullable: true\n recurrence:\n $ref: '#/components/schemas/microsoft.graph.patternedRecurrence'\n reminderMinutesBeforeStart:\n maximum: 2147483647\n minimum: -2147483648\n type: number\n description: The number of minutes before the event start time that the reminder alert occurs.\n format: int32\n nullable: true\n responseRequested:\n type: boolean\n description: 'Default is true, which represents the organizer would like an invitee to send a response to the event.'\n nullable: true\n responseStatus:\n $ref: '#/components/schemas/microsoft.graph.responseStatus'\n sensitivity:\n $ref: '#/components/schemas/microsoft.graph.sensitivity'\n seriesMasterId:\n type: string\n description: 'The ID for the recurring series master item, if this event is part of a recurring series.'\n nullable: true\n showAs:\n $ref: '#/components/schemas/microsoft.graph.freeBusyStatus'\n start:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n subject:\n type: string\n description: The text of the event's subject line.\n nullable: true\n transactionId:\n type: string\n 
description: 'A custom identifier specified by a client app for the server to avoid redundant POST operations in case of client retries to create the same event. This is useful when low network connectivity causes the client to time out before receiving a response from the server for the client''s prior create-event request. After you set transactionId when creating an event, you cannot change transactionId in a subsequent update. This property is only returned in a response payload if an app has set it. Optional.'\n nullable: true\n type:\n $ref: '#/components/schemas/microsoft.graph.eventType'\n webLink:\n type: string\n description: 'The URL to open the event in Outlook on the web. Outlook on the web opens the event in the browser if you are signed in to your mailbox. Otherwise, Outlook on the web prompts you to sign in. This URL cannot be accessed from within an iFrame.'\n nullable: true\n attachments:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.attachment'\n description: 'The collection of FileAttachment, ItemAttachment, and referenceAttachment attachments for the event. Navigation property. Read-only. Nullable.'\n calendar:\n $ref: '#/components/schemas/microsoft.graph.calendar'\n extensions:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.extension'\n description: The collection of open extensions defined for the event. Nullable.\n instances:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.event'\n description: 'The occurrences of a recurring series, if the event is a series master. This property includes occurrences that are part of the recurrence pattern, and exceptions that have been modified, but does not include occurrences that have been cancelled from the series. Navigation property. Read-only. 
Nullable.'\n multiValueExtendedProperties:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.multiValueLegacyExtendedProperty'\n description: The collection of multi-value extended properties defined for the event. Read-only. Nullable.\n singleValueExtendedProperties:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.singleValueLegacyExtendedProperty'\n description: The collection of single-value extended properties defined for the event. Read-only. Nullable.\n microsoft.graph.attendee:\n title: attendee\n required:\n - '@odata.type'\n type: object\n properties:\n emailAddress:\n $ref: '#/components/schemas/microsoft.graph.emailAddress'\n '@odata.type':\n type: string\n type:\n $ref: '#/components/schemas/microsoft.graph.attendeeType'\n proposedNewTime:\n $ref: '#/components/schemas/microsoft.graph.timeSlot'\n status:\n $ref: '#/components/schemas/microsoft.graph.responseStatus'\n microsoft.graph.itemBody:\n title: itemBody\n required:\n - '@odata.type'\n type: object\n properties:\n content:\n type: string\n description: The content of the item.\n nullable: true\n contentType:\n $ref: '#/components/schemas/microsoft.graph.bodyType'\n '@odata.type':\n type: string\n description: The body of the message associated with the event. It can be in HTML or text format.\n microsoft.graph.dateTimeTimeZone:\n title: dateTimeTimeZone\n required:\n - '@odata.type'\n type: object\n properties:\n dateTime:\n type: string\n description: 'A single point of time in a combined date and time representation ({date}T{time}; for example, 2017-08-29T04:00:00.0000000).'\n timeZone:\n type: string\n description: 'Represents a time zone, for example, ''Pacific Standard Time''. See below for more possible values.'\n nullable: true\n '@odata.type':\n type: string\n description: 'The date, time, and time zone that the event ends. 
By default, the end time is in UTC.'\n microsoft.graph.importance:\n title: importance\n enum:\n - low\n - normal\n - high\n type: string\n description: 'The importance of the event. The possible values are: low, normal, high.'\n microsoft.graph.location:\n title: location\n required:\n - '@odata.type'\n type: object\n properties:\n address:\n $ref: '#/components/schemas/microsoft.graph.physicalAddress'\n coordinates:\n $ref: '#/components/schemas/microsoft.graph.outlookGeoCoordinates'\n displayName:\n type: string\n description: The name associated with the location.\n nullable: true\n locationEmailAddress:\n type: string\n description: Optional email address of the location.\n nullable: true\n locationType:\n $ref: '#/components/schemas/microsoft.graph.locationType'\n locationUri:\n type: string\n description: Optional URI representing the location.\n nullable: true\n uniqueId:\n type: string\n description: For internal use only.\n nullable: true\n uniqueIdType:\n $ref: '#/components/schemas/microsoft.graph.locationUniqueIdType'\n '@odata.type':\n type: string\n description: The location of the event.\n discriminator:\n propertyName: '@odata.type'\n microsoft.graph.onlineMeetingInfo:\n title: onlineMeetingInfo\n required:\n - '@odata.type'\n type: object\n properties:\n conferenceId:\n type: string\n description: The ID of the conference.\n nullable: true\n joinUrl:\n type: string\n description: The external link that launches the online meeting. 
This is a URL that clients launch into a browser and will redirect the user to join the meeting.\n nullable: true\n phones:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.phone'\n description: All of the phone numbers associated with this conference.\n quickDial:\n type: string\n description: The preformatted quick dial for this call.\n nullable: true\n tollFreeNumbers:\n type: array\n items:\n type: string\n nullable: true\n description: The toll free numbers that can be used to join the conference.\n tollNumber:\n type: string\n description: The toll number that can be used to join the conference.\n nullable: true\n '@odata.type':\n type: string\n description: 'Details for an attendee to join the meeting online. Default is null. Read-only. After you set the isOnlineMeeting and onlineMeetingProvider properties to enable a meeting online, Microsoft Graph initializes onlineMeeting. When set, the meeting remains available online, and you cannot change the isOnlineMeeting, onlineMeetingProvider, and onlineMeeting properties again.'\n microsoft.graph.onlineMeetingProviderType:\n title: onlineMeetingProviderType\n enum:\n - unknown\n - skypeForBusiness\n - skypeForConsumer\n - teamsForBusiness\n type: string\n description: 'Represents the online meeting service provider. By default, onlineMeetingProvider is unknown. The possible values are unknown, teamsForBusiness, skypeForBusiness, and skypeForConsumer. Optional. After you set onlineMeetingProvider, Microsoft Graph initializes onlineMeeting. 
Subsequently you cannot change onlineMeetingProvider again, and the meeting remains available online.'\n microsoft.graph.recipient:\n title: recipient\n required:\n - '@odata.type'\n type: object\n properties:\n emailAddress:\n $ref: '#/components/schemas/microsoft.graph.emailAddress'\n '@odata.type':\n type: string\n description: The organizer of the event.\n discriminator:\n propertyName: '@odata.type'\n mapping:\n '#microsoft.graph.attendeeBase': '#/components/schemas/microsoft.graph.attendeeBase'\n '#microsoft.graph.attendee': '#/components/schemas/microsoft.graph.attendee'\n microsoft.graph.patternedRecurrence:\n title: patternedRecurrence\n required:\n - '@odata.type'\n type: object\n properties:\n pattern:\n $ref: '#/components/schemas/microsoft.graph.recurrencePattern'\n range:\n $ref: '#/components/schemas/microsoft.graph.recurrenceRange'\n '@odata.type':\n type: string\n description: The recurrence pattern for the event.\n microsoft.graph.responseStatus:\n title: responseStatus\n required:\n - '@odata.type'\n type: object\n properties:\n response:\n $ref: '#/components/schemas/microsoft.graph.responseType'\n time:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The date and time when the response was returned. It uses ISO 8601 format and is always in UTC time. 
For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n '@odata.type':\n type: string\n description: Indicates the type of response sent in response to an event message.\n microsoft.graph.sensitivity:\n title: sensitivity\n enum:\n - normal\n - personal\n - private\n - confidential\n type: string\n description: 'Possible values are: normal, personal, private, confidential.'\n microsoft.graph.freeBusyStatus:\n title: freeBusyStatus\n enum:\n - unknown\n - free\n - tentative\n - busy\n - oof\n - workingElsewhere\n type: string\n description: 'The status to show. Possible values are: free, tentative, busy, oof, workingElsewhere, unknown.'\n microsoft.graph.eventType:\n title: eventType\n enum:\n - singleInstance\n - occurrence\n - exception\n - seriesMaster\n type: string\n description: 'The event type. Possible values are: singleInstance, occurrence, exception, seriesMaster. Read-only'\n microsoft.graph.attachment:\n title: attachment\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n contentType:\n type: string\n description: The MIME type.\n nullable: true\n isInline:\n type: boolean\n description: 'true if the attachment is an inline attachment; otherwise, false.'\n lastModifiedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. 
For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n name:\n type: string\n description: The attachment's file name.\n nullable: true\n size:\n maximum: 2147483647\n minimum: -2147483648\n type: number\n description: The length of the attachment in bytes.\n format: int32\n microsoft.graph.calendar:\n description: The calendar that contains the event. Navigation property. Read-only.\n microsoft.graph.extension:\n title: extension\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n microsoft.graph.multiValueLegacyExtendedProperty:\n title: multiValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n value:\n type: array\n items:\n type: string\n nullable: true\n description: A collection of property values.\n microsoft.graph.singleValueLegacyExtendedProperty:\n title: singleValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. 
Read-only.\n '@odata.type':\n type: string\n value:\n type: string\n description: A property value.\n nullable: true\n microsoft.graph.eventCollectionResponse:\n title: Base collection pagination and count responses\n type: object\n properties:\n '@odata.count':\n type: integer\n format: int64\n nullable: true\n '@odata.nextLink':\n type: string\n nullable: true\n value:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.event'\n microsoft.graph.emailAddress:\n title: emailAddress\n required:\n - '@odata.type'\n type: object\n properties:\n address:\n type: string\n description: The email address of the person or entity.\n nullable: true\n name:\n type: string\n description: The display name of the person or entity.\n nullable: true\n '@odata.type':\n type: string\n description: The recipient's email address.\n microsoft.graph.attendeeType:\n title: attendeeType\n enum:\n - required\n - optional\n - resource\n type: string\n description: 'The type of attendee. The possible values are: required, optional, resource. Currently if the attendee is a person, findMeetingTimes always considers the person is of the Required type.'\n microsoft.graph.timeSlot:\n title: timeSlot\n required:\n - '@odata.type'\n type: object\n properties:\n end:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n start:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n '@odata.type':\n type: string\n description: 'An alternate date/time proposed by the attendee for a meeting request to start and end. If the attendee hasn''t proposed another time, then this property isn''t included in a response of a GET event.'\n microsoft.graph.bodyType:\n title: bodyType\n enum:\n - text\n - html\n type: string\n description: The type of the content. 
Possible values are text and html.\n microsoft.graph.physicalAddress:\n title: physicalAddress\n required:\n - '@odata.type'\n type: object\n properties:\n city:\n type: string\n description: The city.\n nullable: true\n countryOrRegion:\n type: string\n description: 'The country or region. It''s a free-format string value, for example, ''United States''.'\n nullable: true\n postalCode:\n type: string\n description: The postal code.\n nullable: true\n state:\n type: string\n description: The state.\n nullable: true\n street:\n type: string\n description: The street.\n nullable: true\n '@odata.type':\n type: string\n description: The street address of the location.\n microsoft.graph.outlookGeoCoordinates:\n title: outlookGeoCoordinates\n required:\n - '@odata.type'\n type: object\n properties:\n accuracy:\n type: number\n description: 'The accuracy of the latitude and longitude. As an example, the accuracy can be measured in meters, such as the latitude and longitude are accurate to within 50 meters.'\n format: double\n nullable: true\n altitude:\n type: number\n description: The altitude of the location.\n format: double\n nullable: true\n altitudeAccuracy:\n type: number\n description: The accuracy of the altitude.\n format: double\n nullable: true\n latitude:\n type: number\n description: The latitude of the location.\n format: double\n nullable: true\n longitude:\n type: number\n description: The longitude of the location.\n format: double\n nullable: true\n '@odata.type':\n type: string\n description: The geographic coordinates and elevation of the location.\n microsoft.graph.locationType:\n title: locationType\n enum:\n - default\n - conferenceRoom\n - homeAddress\n - businessAddress\n - geoCoordinates\n - streetAddress\n - hotel\n - restaurant\n - localBusiness\n - postalAddress\n type: string\n description: 'The type of location. 
The possible values are: default, conferenceRoom, homeAddress, businessAddress, geoCoordinates, streetAddress, hotel, restaurant, localBusiness, postalAddress. Read-only.'\n microsoft.graph.locationUniqueIdType:\n title: locationUniqueIdType\n enum:\n - unknown\n - locationStore\n - directory\n - private\n - bing\n type: string\n description: For internal use only.\n microsoft.graph.phone:\n title: phone\n required:\n - '@odata.type'\n type: object\n properties:\n language:\n type: string\n nullable: true\n number:\n type: string\n description: The phone number.\n nullable: true\n region:\n type: string\n nullable: true\n type:\n $ref: '#/components/schemas/microsoft.graph.phoneType'\n '@odata.type':\n type: string\n microsoft.graph.recurrencePattern:\n title: recurrencePattern\n required:\n - '@odata.type'\n type: object\n properties:\n dayOfMonth:\n maximum: 2147483647\n minimum: -2147483648\n type: number\n description: The day of the month on which the event occurs. Required if type is absoluteMonthly or absoluteYearly.\n format: int32\n daysOfWeek:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.dayOfWeek'\n description: 'A collection of the days of the week on which the event occurs. The possible values are: sunday, monday, tuesday, wednesday, thursday, friday, saturday. If type is relativeMonthly or relativeYearly, and daysOfWeek specifies more than one day, the event falls on the first day that satisfies the pattern. Required if type is weekly, relativeMonthly, or relativeYearly.'\n firstDayOfWeek:\n $ref: '#/components/schemas/microsoft.graph.dayOfWeek'\n index:\n $ref: '#/components/schemas/microsoft.graph.weekIndex'\n interval:\n maximum: 2147483647\n minimum: -2147483648\n type: number\n description: 'The number of units between occurrences, where units can be in days, weeks, months, or years, depending on the type. 
Required.'\n format: int32\n month:\n maximum: 2147483647\n minimum: -2147483648\n type: number\n description: The month in which the event occurs. This is a number from 1 to 12.\n format: int32\n type:\n $ref: '#/components/schemas/microsoft.graph.recurrencePatternType'\n '@odata.type':\n type: string\n description: 'The frequency of an event. For access reviews: Do not specify this property for a one-time access review. Only interval, dayOfMonth, and type (weekly, absoluteMonthly) properties of recurrencePattern are supported.'\n microsoft.graph.recurrenceRange:\n title: recurrenceRange\n required:\n - '@odata.type'\n type: object\n properties:\n endDate:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])$'\n type: string\n description: 'The date to stop applying the recurrence pattern. Depending on the recurrence pattern of the event, the last occurrence of the meeting may not be this date. Required if type is endDate.'\n format: date\n nullable: true\n numberOfOccurrences:\n maximum: 2147483647\n minimum: -2147483648\n type: number\n description: The number of times to repeat the event. Required and must be positive if type is numbered.\n format: int32\n recurrenceTimeZone:\n type: string\n description: 'Time zone for the startDate and endDate properties. Optional. If not specified, the time zone of the event is used.'\n nullable: true\n startDate:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])$'\n type: string\n description: 'The date to start applying the recurrence pattern. The first occurrence of the meeting may be this date or later, depending on the recurrence pattern of the event. Must be the same value as the start property of the recurring event. 
Required.'\n format: date\n nullable: true\n type:\n $ref: '#/components/schemas/microsoft.graph.recurrenceRangeType'\n '@odata.type':\n type: string\n description: The duration of an event.\n microsoft.graph.responseType:\n title: responseType\n enum:\n - none\n - organizer\n - tentativelyAccepted\n - accepted\n - declined\n - notResponded\n type: string\n description: 'The response type. Possible values are: none, organizer, tentativelyAccepted, accepted, declined, notResponded. To differentiate between none and notResponded: none – from organizer''s perspective. This value is used when the status of an attendee/participant is reported to the organizer of a meeting. notResponded – from attendee''s perspective. Indicates the attendee has not responded to the meeting request. Clients can treat notResponded == none. As an example, if attendee Alex hasn''t responded to a meeting request, getting Alex'' response status for that event in Alex'' calendar returns notResponded. Getting Alex'' response from the calendar of any other attendee or the organizer''s returns none. Getting the organizer''s response for the event in anybody''s calendar also returns none.'\n microsoft.graph.phoneType:\n title: phoneType\n enum:\n - home\n - business\n - mobile\n - other\n - assistant\n - homeFax\n - businessFax\n - otherFax\n - pager\n - radio\n type: string\n description: 'The type of phone number. The possible values are: home, business, mobile, other, assistant, homeFax, businessFax, otherFax, pager, radio.'\n microsoft.graph.dayOfWeek:\n title: dayOfWeek\n enum:\n - sunday\n - monday\n - tuesday\n - wednesday\n - thursday\n - friday\n - saturday\n type: string\n microsoft.graph.weekIndex:\n title: weekIndex\n enum:\n - first\n - second\n - third\n - fourth\n - last\n type: string\n description: 'Specifies on which instance of the allowed days specified in daysOfWeek the event occurs, counted from the first instance in the month. 
The possible values are: first, second, third, fourth, last. Default is first. Optional and used if type is relativeMonthly or relativeYearly.'\n microsoft.graph.recurrencePatternType:\n title: recurrencePatternType\n enum:\n - daily\n - weekly\n - absoluteMonthly\n - relativeMonthly\n - absoluteYearly\n - relativeYearly\n type: string\n description: 'The recurrence pattern type: daily, weekly, absoluteMonthly, relativeMonthly, absoluteYearly, relativeYearly. Required. For more information, see values of type property.'\n microsoft.graph.recurrenceRangeType:\n title: recurrenceRangeType\n enum:\n - endDate\n - noEnd\n - numbered\n type: string\n description: 'The recurrence range. The possible values are: endDate, noEnd, numbered. Required.'\n responses:\n microsoft.graph.eventCollectionResponse:\n description: Retrieved collection\n content:\n application/json:\n schema:\n $ref: '#/components/schemas/microsoft.graph.eventCollectionResponse'\n parameters:\n top:\n name: $top\n in: query\n description: Show only the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n example: 50\n skip:\n name: $skip\n in: query\n description: Skip the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n search:\n name: $search\n in: query\n description: Search items by search phrases\n style: form\n explode: false\n schema:\n type: string\n filter:\n name: $filter\n in: query\n description: Filter items by property values\n style: form\n explode: false\n schema:\n type: string\n count:\n name: $count\n in: query\n description: Include count of items\n style: form\n explode: false\n schema:\n type: boolean | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\samples\Concepts\Resources\Plugins\CopilotAgentPlugins\CalendarPlugin\calendar-openapi.yml | calendar-openapi.yml | YAML | 36,959 | 0.95 | 0.060573 | 0 | vue-tools | 467 | 2025-06-07T04:05:28.238372 | GPL-3.0 | false | f3e84c04354950eb7e6f2e36c356f71d |
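The date-time properties in the calendar schema above (originalStart, lastModifiedDateTime, responseStatus.time) all reuse one ISO 8601 `pattern:` constraint. As an illustrative check only, not part of the spec, the pattern can be exercised directly; the helper name below is an assumption, while the regex and the sample timestamp are copied from the schema:

```python
import re

# ISO 8601 date-time pattern copied verbatim from the schema's
# `pattern:` constraints (e.g. originalStart, lastModifiedDateTime).
ISO_DATETIME = re.compile(
    r'^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])'
    r'T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]'
    r'([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'
)

def is_graph_datetime(value: str) -> bool:
    """True if `value` satisfies the schema's date-time regex."""
    return ISO_DATETIME.match(value) is not None

# The example quoted throughout the spec: midnight UTC on Jan 1, 2014.
assert is_graph_datetime('2014-01-01T00:00:00Z')
# A month of 13 violates the (0[1-9]|1[012]) month group.
assert not is_graph_datetime('2014-13-01T00:00:00Z')
```

Note that the pattern also admits a fractional-second part of up to twelve digits and a numeric UTC offset, matching the `2017-08-29T04:00:00.0000000` style used in the dateTimeTimeZone description.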
openapi: 3.0.1\ninfo:\n title: OData Service for namespace microsoft.graph - Subset\n description: This OData service is located at https://graph.microsoft.com/v1.0\n version: v1.0\nservers:\n - url: https://graph.microsoft.com/v1.0\npaths:\n /me/contacts:\n get:\n tags:\n - me.contact\n summary: List contacts\n description: 'Get a contact collection from the default contacts folder of the signed-in user. There are two scenarios where an app can get contacts in another user''s contact folder:'\n operationId: me_ListContacts\n parameters:\n - $ref: '#/components/parameters/top'\n - $ref: '#/components/parameters/skip'\n - $ref: '#/components/parameters/search'\n - $ref: '#/components/parameters/filter'\n - $ref: '#/components/parameters/count'\n - name: $orderby\n in: query\n description: Order items by property values\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n - name: $select\n in: query\n description: Select properties to be returned\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n - name: $expand\n in: query\n description: Expand related entities\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n responses:\n 2XX:\n $ref: '#/components/responses/microsoft.graph.contactCollectionResponse'\n x-ms-pageable:\n nextLinkName: '@odata.nextLink'\n operationName: listMore\n itemName: value\ncomponents:\n schemas:\n microsoft.graph.contactCollectionResponse:\n title: Base collection pagination and count responses\n type: object\n properties:\n '@odata.count':\n type: integer\n format: int64\n nullable: true\n '@odata.nextLink':\n type: string\n nullable: true\n value:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.contact'\n microsoft.graph.contact:\n title: contact\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an 
entity. Read-only.\n '@odata.type':\n type: string\n categories:\n type: array\n items:\n type: string\n nullable: true\n description: The categories associated with the item\n changeKey:\n type: string\n description: 'Identifies the version of the item. Every time the item is changed, changeKey changes as well. This allows Exchange to apply changes to the correct version of the object. Read-only.'\n nullable: true\n createdDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n lastModifiedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n assistantName:\n type: string\n description: The name of the contact's assistant.\n nullable: true\n birthday:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The contact''s birthday. The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. 
For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n businessAddress:\n $ref: '#/components/schemas/microsoft.graph.physicalAddress'\n businessHomePage:\n type: string\n description: The business home page of the contact.\n nullable: true\n businessPhones:\n type: array\n items:\n type: string\n nullable: true\n description: The contact's business phone numbers.\n children:\n type: array\n items:\n type: string\n nullable: true\n description: The names of the contact's children.\n companyName:\n type: string\n description: The name of the contact's company.\n nullable: true\n department:\n type: string\n description: The contact's department.\n nullable: true\n displayName:\n type: string\n description: 'The contact''s display name. You can specify the display name in a create or update operation. Note that later updates to other properties may cause an automatically generated value to overwrite the displayName value you have specified. 
To preserve a pre-existing value, always include it as displayName in an update operation.'\n nullable: true\n emailAddresses:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.emailAddress'\n description: The contact's email addresses.\n fileAs:\n type: string\n description: The name the contact is filed under.\n nullable: true\n generation:\n type: string\n description: The contact's suffix.\n nullable: true\n givenName:\n type: string\n description: The contact's given name.\n nullable: true\n homeAddress:\n $ref: '#/components/schemas/microsoft.graph.physicalAddress'\n homePhones:\n type: array\n items:\n type: string\n nullable: true\n description: The contact's home phone numbers.\n imAddresses:\n type: array\n items:\n type: string\n nullable: true\n description: The contact's instant messaging (IM) addresses.\n initials:\n type: string\n description: The contact's initials.\n nullable: true\n jobTitle:\n type: string\n description: The contact’s job title.\n nullable: true\n manager:\n type: string\n description: The name of the contact's manager.\n nullable: true\n middleName:\n type: string\n description: The contact's middle name.\n nullable: true\n mobilePhone:\n type: string\n description: The contact's mobile phone number.\n nullable: true\n nickName:\n type: string\n description: The contact's nickname.\n nullable: true\n officeLocation:\n type: string\n description: The location of the contact's office.\n nullable: true\n otherAddress:\n $ref: '#/components/schemas/microsoft.graph.physicalAddress'\n parentFolderId:\n type: string\n description: The ID of the contact's parent folder.\n nullable: true\n personalNotes:\n type: string\n description: The user's notes about the contact.\n nullable: true\n profession:\n type: string\n description: The contact's profession.\n nullable: true\n spouseName:\n type: string\n description: The name of the contact's spouse/partner.\n nullable: true\n surname:\n type: string\n description: The 
contact's surname.\n nullable: true\n title:\n type: string\n description: The contact's title.\n nullable: true\n yomiCompanyName:\n type: string\n description: The phonetic Japanese company name of the contact.\n nullable: true\n yomiGivenName:\n type: string\n description: The phonetic Japanese given name (first name) of the contact.\n nullable: true\n yomiSurname:\n type: string\n description: The phonetic Japanese surname (last name) of the contact.\n nullable: true\n extensions:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.extension'\n description: The collection of open extensions defined for the contact. Read-only. Nullable.\n multiValueExtendedProperties:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.multiValueLegacyExtendedProperty'\n description: The collection of multi-value extended properties defined for the contact. Read-only. Nullable.\n photo:\n $ref: '#/components/schemas/microsoft.graph.profilePhoto'\n singleValueExtendedProperties:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.singleValueLegacyExtendedProperty'\n description: The collection of single-value extended properties defined for the contact. Read-only. Nullable.\n microsoft.graph.physicalAddress:\n title: physicalAddress\n required:\n - '@odata.type'\n type: object\n properties:\n city:\n type: string\n description: The city.\n nullable: true\n countryOrRegion:\n type: string\n description: 'The country or region. 
It''s a free-format string value, for example, ''United States''.'\n nullable: true\n postalCode:\n type: string\n description: The postal code.\n nullable: true\n state:\n type: string\n description: The state.\n nullable: true\n street:\n type: string\n description: The street.\n nullable: true\n '@odata.type':\n type: string\n description: The contact's business address.\n microsoft.graph.emailAddress:\n title: emailAddress\n required:\n - '@odata.type'\n type: object\n properties:\n address:\n type: string\n description: The email address of the person or entity.\n nullable: true\n name:\n type: string\n description: The display name of the person or entity.\n nullable: true\n '@odata.type':\n type: string\n microsoft.graph.extension:\n title: extension\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n microsoft.graph.multiValueLegacyExtendedProperty:\n title: multiValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n value:\n type: array\n items:\n type: string\n nullable: true\n description: A collection of property values.\n microsoft.graph.profilePhoto:\n description: Optional contact picture. You can get or set a photo for a contact.\n microsoft.graph.singleValueLegacyExtendedProperty:\n title: singleValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. 
Read-only.\n '@odata.type':\n type: string\n value:\n type: string\n description: A property value.\n nullable: true\n responses:\n microsoft.graph.contactCollectionResponse:\n description: Retrieved collection\n content:\n application/json:\n schema:\n $ref: '#/components/schemas/microsoft.graph.contactCollectionResponse'\n parameters:\n top:\n name: $top\n in: query\n description: Show only the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n example: 50\n skip:\n name: $skip\n in: query\n description: Skip the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n search:\n name: $search\n in: query\n description: Search items by search phrases\n style: form\n explode: false\n schema:\n type: string\n filter:\n name: $filter\n in: query\n description: Filter items by property values\n style: form\n explode: false\n schema:\n type: string\n count:\n name: $count\n in: query\n description: Include count of items\n style: form\n explode: false\n schema:\n type: boolean | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\samples\Concepts\Resources\Plugins\CopilotAgentPlugins\ContactsPlugin\contacts-openapi.yml | contacts-openapi.yml | YAML | 13,509 | 0.95 | 0.024876 | 0 | vue-tools | 577 | 2023-09-25T22:04:29.614454 | MIT | false | 3f99430786ff9e8b2f8d0a98b689820b |
openapi: 3.0.1\ninfo:\n title: OData Service for namespace microsoft.graph - Subset\n description: This OData service is located at https://graph.microsoft.com/v1.0\n version: v1.0\nservers:\n - url: https://graph.microsoft.com/v1.0\npaths:\n "/me/drive/root/children/{driveItem-id}/content":\n get:\n tags:\n - drives.driveItem\n summary: Get content for the navigation property items from drives\n description: "The content stream, if the item represents a file."\n operationId: drives_GetItemsContent\n parameters:\n - name: $format\n in: query\n description: Format of the content\n style: form\n explode: false\n schema:\n type: string\n responses:\n 2XX:\n description: Retrieved media content\n content:\n application/octet-stream:\n schema:\n type: string\n format: binary\n parameters:\n - name: driveItem-id\n in: path\n description: The unique identifier of driveItem\n required: true\n style: simple\n schema:\n type: string\ncomponents: {}\n | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\samples\Concepts\Resources\Plugins\CopilotAgentPlugins\DriveItemPlugin\driveitem-openapi.yml | driveitem-openapi.yml | YAML | 1,153 | 0.95 | 0.075 | 0 | node-utils | 861 | 2024-05-04T00:34:14.610088 | Apache-2.0 | false | 3a16fee4a557adef0dba0aa138410898 |
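The driveItem spec exposes a single content operation: GET on `/me/drive/root/children/{driveItem-id}/content`, returning the raw file bytes as `application/octet-stream`. A sketch of filling the path template; the streaming call in the comment is illustrative, not prescribed by the spec:

```python
GRAPH_BASE = "https://graph.microsoft.com/v1.0"  # server URL from the spec

def content_url(drive_item_id: str) -> str:
    """Fill the required {driveItem-id} path parameter of the content
    endpoint. The operation returns binary file content, so a real
    client should stream the response rather than buffer it."""
    return f"{GRAPH_BASE}/me/drive/root/children/{drive_item_id}/content"

# With a real bearer token one might stream the bytes to disk, e.g.:
# resp = requests.get(content_url(item_id),
#                     headers={"Authorization": f"Bearer {token}"},
#                     stream=True)
```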
openapi: 3.0.1\ninfo:\n title: OData Service for namespace microsoft.graph - Subset\n description: This OData service is located at https://graph.microsoft.com/v1.0\n version: v1.0\nservers:\n - url: https://graph.microsoft.com/v1.0\npaths:\n /me/messages:\n get:\n tags:\n - me.message\n summary: Get the messages in the signed-in user\u0026apos;s mailbox\n description: Get the messages in the signed-in user\u0026apos;s mailbox (including the Deleted Items and Clutter folders). Depending on the page size and mailbox data, getting messages from a mailbox can incur multiple requests. The default page size is 10 messages. Use $top to customize the page size, within the range of 1 and 1000. To improve the operation response time, use $select to specify the exact properties you need; see example 1 below. Fine-tune the values for $select and $top, especially when you must use a larger page size, as returning a page with hundreds of messages each with a full response payload may trigger the gateway timeout (HTTP 504). To get the next page of messages, simply apply the entire URL returned in @odata.nextLink to the next get-messages request. This URL includes any query parameters you may have specified in the initial request. Do not try to extract the $skip value from the @odata.nextLink URL to manipulate responses. This API uses the $skip value to keep count of all the items it has gone through in the user\u0026apos;s mailbox to return a page of message-type items. It\u0026apos;s therefore possible that even in the initial response, the $skip value is larger than the page size. For more information, see Paging Microsoft Graph data in your app. Currently, this operation returns message bodies in only HTML format. 
There are two scenarios where an app can get messages in another user\u0026apos;s mail folder\n operationId: me_ListMessages\n parameters:\n - name: includeHiddenMessages\n in: query\n description: Include Hidden Messages\n style: form\n explode: false\n schema:\n type: string\n - $ref: '#/components/parameters/top'\n - $ref: '#/components/parameters/skip'\n - $ref: '#/components/parameters/search'\n - $ref: '#/components/parameters/filter'\n - $ref: '#/components/parameters/count'\n - name: $orderby\n in: query\n description: Order items by property values\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n - name: $select\n in: query\n description: Select properties to be returned\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n - name: $expand\n in: query\n description: Expand related entities\n style: form\n explode: false\n schema:\n uniqueItems: true\n type: array\n items:\n type: string\n responses:\n 2XX:\n $ref: '#/components/responses/microsoft.graph.messageCollectionResponse'\n x-ms-pageable:\n nextLinkName: '@odata.nextLink'\n operationName: listMore\n itemName: value\n /me/sendMail:\n post:\n tags:\n - me.user.Actions\n summary: Invoke action sendMail\n description: 'Send the message specified in the request body using either JSON or MIME format. When using JSON format, you can include a file attachment in the same sendMail action call. When using MIME format: This method saves the message in the Sent Items folder. Alternatively, create a draft message to send later. 
To learn more about the steps involved in the backend before a mail is delivered to recipients, see here.'\n operationId: me_sendMail\n requestBody:\n $ref: '#/components/requestBodies/sendMailRequestBody'\n responses:\n '204':\n description: Success\ncomponents:\n schemas:\n microsoft.graph.message:\n title: message\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n categories:\n type: array\n items:\n type: string\n nullable: true\n description: The categories associated with the item\n changeKey:\n type: string\n description: 'Identifies the version of the item. Every time the item is changed, changeKey changes as well. This allows Exchange to apply changes to the correct version of the object. Read-only.'\n nullable: true\n createdDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n lastModifiedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n bccRecipients:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n description: 'The Bcc: recipients for the message.'\n body:\n $ref: '#/components/schemas/microsoft.graph.itemBody'\n bodyPreview:\n type: string\n description: The first 255 characters of the message body. 
It is in text format.\n nullable: true\n ccRecipients:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n description: 'The Cc: recipients for the message.'\n conversationId:\n type: string\n description: The ID of the conversation the email belongs to.\n nullable: true\n conversationIndex:\n type: string\n description: Indicates the position of the message within the conversation.\n format: base64url\n nullable: true\n flag:\n $ref: '#/components/schemas/microsoft.graph.followupFlag'\n from:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n hasAttachments:\n type: boolean\n description: 'Indicates whether the message has attachments. This property doesn''t include inline attachments, so if a message contains only inline attachments, this property is false. To verify the existence of inline attachments, parse the body property to look for a src attribute, such as <IMG src=''cid:image001.jpg@01D26CD8.6C05F070''>.'\n nullable: true\n importance:\n $ref: '#/components/schemas/microsoft.graph.importance'\n inferenceClassification:\n $ref: '#/components/schemas/microsoft.graph.inferenceClassificationType'\n internetMessageHeaders:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.internetMessageHeader'\n description: A collection of message headers defined by RFC5322. The set includes message headers indicating the network path taken by a message from the sender to the recipient. It can also contain custom message headers that hold app data for the message. Returned only on applying a $select query option. Read-only.\n internetMessageId:\n type: string\n description: The message ID in the format specified by RFC2822.\n nullable: true\n isDeliveryReceiptRequested:\n type: boolean\n description: Indicates whether a read receipt is requested for the message.\n nullable: true\n isDraft:\n type: boolean\n description: Indicates whether the message is a draft. 
A message is a draft if it hasn't been sent yet.\n nullable: true\n isRead:\n type: boolean\n description: Indicates whether the message has been read.\n nullable: true\n isReadReceiptRequested:\n type: boolean\n description: Indicates whether a read receipt is requested for the message.\n nullable: true\n parentFolderId:\n type: string\n description: The unique identifier for the message's parent mailFolder.\n nullable: true\n receivedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The date and time the message was received. The date and time information uses ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z.'\n format: date-time\n nullable: true\n replyTo:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n description: The email addresses to use when replying.\n sender:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n sentDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The date and time the message was sent. The date and time information uses ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z.'\n format: date-time\n nullable: true\n subject:\n type: string\n description: The subject of the message.\n nullable: true\n toRecipients:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.recipient'\n description: 'The To: recipients for the message.'\n uniqueBody:\n $ref: '#/components/schemas/microsoft.graph.itemBody'\n webLink:\n type: string\n description: 'The URL to open the message in Outlook on the web.You can append an ispopout argument to the end of the URL to change how the message is displayed. 
If ispopout is not present or if it is set to 1, then the message is shown in a popout window. If ispopout is set to 0, the browser shows the message in the Outlook on the web review pane.The message opens in the browser if you are signed in to your mailbox via Outlook on the web. You are prompted to sign in if you are not already signed in with the browser.This URL cannot be accessed from within an iFrame.'\n nullable: true\n attachments:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.attachment'\n description: The fileAttachment and itemAttachment attachments for the message.\n extensions:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.extension'\n description: The collection of open extensions defined for the message. Nullable.\n multiValueExtendedProperties:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.multiValueLegacyExtendedProperty'\n description: The collection of multi-value extended properties defined for the message. Nullable.\n singleValueExtendedProperties:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.singleValueLegacyExtendedProperty'\n description: The collection of single-value extended properties defined for the message. Nullable.\n microsoft.graph.recipient:\n title: recipient\n required:\n - '@odata.type'\n type: object\n properties:\n emailAddress:\n $ref: '#/components/schemas/microsoft.graph.emailAddress'\n '@odata.type':\n type: string\n discriminator:\n propertyName: '@odata.type'\n microsoft.graph.itemBody:\n title: itemBody\n required:\n - '@odata.type'\n type: object\n properties:\n content:\n type: string\n description: The content of the item.\n nullable: true\n contentType:\n $ref: '#/components/schemas/microsoft.graph.bodyType'\n '@odata.type':\n type: string\n description: The body of the message. It can be in HTML or text format. 
Find out about safe HTML in a message body.\n microsoft.graph.followupFlag:\n title: followupFlag\n required:\n - '@odata.type'\n type: object\n properties:\n completedDateTime:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n dueDateTime:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n flagStatus:\n $ref: '#/components/schemas/microsoft.graph.followupFlagStatus'\n startDateTime:\n $ref: '#/components/schemas/microsoft.graph.dateTimeTimeZone'\n '@odata.type':\n type: string\n description: 'The flag value that indicates the status, start date, due date, or completion date for the message.'\n microsoft.graph.importance:\n title: importance\n enum:\n - low\n - normal\n - high\n type: string\n description: 'The importance of the message. The possible values are: low, normal, and high.'\n microsoft.graph.inferenceClassificationType:\n title: inferenceClassificationType\n enum:\n - focused\n - other\n type: string\n description: 'The classification of the message for the user, based on inferred relevance or importance, or on an explicit override. The possible values are: focused or other.'\n microsoft.graph.internetMessageHeader:\n title: internetMessageHeader\n required:\n - '@odata.type'\n type: object\n properties:\n name:\n type: string\n description: Represents the key in a key-value pair.\n nullable: true\n value:\n type: string\n description: The value in a key-value pair.\n nullable: true\n '@odata.type':\n type: string\n microsoft.graph.attachment:\n title: attachment\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. 
Read-only.\n '@odata.type':\n type: string\n contentType:\n type: string\n description: The MIME type.\n nullable: true\n isInline:\n type: boolean\n description: 'true if the attachment is an inline attachment; otherwise, false.'\n lastModifiedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n name:\n type: string\n description: The attachment's file name.\n nullable: true\n size:\n maximum: 2147483647\n minimum: -2147483648\n type: number\n description: The length of the attachment in bytes.\n format: int32\n microsoft.graph.extension:\n title: extension\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n microsoft.graph.multiValueLegacyExtendedProperty:\n title: multiValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n value:\n type: array\n items:\n type: string\n nullable: true\n description: A collection of property values.\n microsoft.graph.singleValueLegacyExtendedProperty:\n title: singleValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. 
Read-only.\n '@odata.type':\n type: string\n value:\n type: string\n description: A property value.\n nullable: true\n microsoft.graph.messageCollectionResponse:\n title: Base collection pagination and count responses\n type: object\n properties:\n '@odata.count':\n type: integer\n format: int64\n nullable: true\n '@odata.nextLink':\n type: string\n nullable: true\n value:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.message'\n microsoft.graph.emailAddress:\n title: emailAddress\n required:\n - '@odata.type'\n type: object\n properties:\n address:\n type: string\n description: The email address of the person or entity.\n nullable: true\n name:\n type: string\n description: The display name of the person or entity.\n nullable: true\n '@odata.type':\n type: string\n description: The recipient's email address.\n microsoft.graph.bodyType:\n title: bodyType\n enum:\n - text\n - html\n type: string\n description: The type of the content. Possible values are text and html.\n microsoft.graph.dateTimeTimeZone:\n title: dateTimeTimeZone\n required:\n - '@odata.type'\n type: object\n properties:\n dateTime:\n type: string\n description: 'A single point of time in a combined date and time representation ({date}T{time}; for example, 2017-08-29T04:00:00.0000000).'\n timeZone:\n type: string\n description: 'Represents a time zone, for example, ''Pacific Standard Time''. See below for more possible values.'\n nullable: true\n '@odata.type':\n type: string\n description: The date and time that the follow-up was finished.\n microsoft.graph.followupFlagStatus:\n title: followupFlagStatus\n enum:\n - notFlagged\n - complete\n - flagged\n type: string\n description: 'The status for follow-up for an item. 
Possible values are notFlagged, complete, and flagged.'\n responses:\n microsoft.graph.messageCollectionResponse:\n description: Retrieved collection\n content:\n application/json:\n schema:\n $ref: '#/components/schemas/microsoft.graph.messageCollectionResponse'\n parameters:\n top:\n name: $top\n in: query\n description: Show only the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n example: 50\n skip:\n name: $skip\n in: query\n description: Skip the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n search:\n name: $search\n in: query\n description: Search items by search phrases\n style: form\n explode: false\n schema:\n type: string\n filter:\n name: $filter\n in: query\n description: Filter items by property values\n style: form\n explode: false\n schema:\n type: string\n count:\n name: $count\n in: query\n description: Include count of items\n style: form\n explode: false\n schema:\n type: boolean\n requestBodies:\n sendMailRequestBody:\n description: Action parameters\n content:\n application/json:\n schema:\n type: object\n properties:\n Message:\n $ref: '#/components/schemas/microsoft.graph.message'\n SaveToSentItems:\n type: boolean\n default: false\n nullable: true\n required: true\n | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\samples\Concepts\Resources\Plugins\CopilotAgentPlugins\MessagesPlugin\messages-openapi.yml | messages-openapi.yml | YAML | 20,759 | 0.95 | 0.064327 | 0 | python-kit | 762 | 2024-03-16T09:05:43.366357 | BSD-3-Clause | false | a39ce52b4dbd09812535090331e93cd5 |
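The messages spec's description is explicit about paging: apply the `@odata.nextLink` URL verbatim and never reconstruct `$skip` yourself. A small sketch of that loop, with `fetch_page` standing in for any HTTP call that returns a decoded `messageCollectionResponse` body (the function name is illustrative):

```python
def iter_messages(fetch_page, first_url):
    """Page through a message collection by following '@odata.nextLink',
    as the spec advises. `fetch_page` maps a URL to the parsed JSON body
    of a messageCollectionResponse ('value' array plus optional links)."""
    url = first_url
    while url:
        page = fetch_page(url)
        for message in page.get("value", []):
            yield message
        url = page.get("@odata.nextLink")  # absent on the last page
```

Usage: `list(iter_messages(lambda u: session.get(u).json(), f"{base}/me/messages?$top=100"))`, where `session` carries the bearer token.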
version: '3.8'\n\nservices:\n quality-check:\n build:\n context: .\n dockerfile: Dockerfile\n secrets:\n - hf_token\n ports:\n - "8080:80"\n secrets:\n - hf_token\nsecrets:\n hf_token:\n file: .env/hf_token.txt\n | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\samples\Demos\QualityCheck\python-server\docker-compose.yml | docker-compose.yml | YAML | 243 | 0.7 | 0 | 0 | python-kit | 762 | 2025-06-20T22:00:04.248592 | Apache-2.0 | false | 12d40255f3de9e87cce8d45e61f9680c
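The compose file wires the `hf_token` secret twice: once as a build-time secret and once at the service level, both pointing at the top-level `secrets:` block backed by `.env/hf_token.txt`. A small consistency check over a parsed compose document (represented as a plain dict here to stay dependency-free; it handles only the short list syntax for secret references):

```python
def undeclared_secrets(compose: dict) -> set:
    """Return secret names referenced by services (service-level or
    build-time) that are missing from the top-level `secrets:` block."""
    declared = set(compose.get("secrets", {}))
    referenced = set()
    for svc in compose.get("services", {}).values():
        referenced.update(svc.get("secrets", []))
        referenced.update(svc.get("build", {}).get("secrets", []))
    return referenced - declared

compose = {
    "services": {
        "quality-check": {
            "build": {"context": ".", "secrets": ["hf_token"]},
            "secrets": ["hf_token"],
        }
    },
    "secrets": {"hf_token": {"file": ".env/hf_token.txt"}},
}
```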
name: breakfast_flow\ngoal: Make breakfast\nsteps:\n - goal: Make coffee\n plugins:\n - MakeCoffeePlugin\n requires:\n - coffee_bean\n provides:\n - coffee\n\n - goal: Select coffee bean\n plugins:\n - CoffeeRecommendationPlugin\n provides:\n - coffee_bean\n completionType: AtLeastOnce\n\n - goal: Recipe\n plugins:\n - WebSearchPlugin\n - CalorieCalculatorPlugin\n - HealthCheckPlugin\n provides:\n - ingredients\n completionType: AtLeastOnce\n transitionMessage: Do you want to add one more recipe?\n\n - goal: Cook\n plugins:\n - CookPlugin\n - WebSearchPlugin\n requires:\n - coffee\n - ingredients\n provides:\n - breakfast\n\n - flowName: lunch_flow\n completionType: Optional\n startingMessage: Would you like to prepare the lunch as well?\n\nprovides:\n - breakfast\n | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\src\Experimental\Orchestration.Flow.UnitTests\TestData\Flow\flow.yml | flow.yml | YAML | 853 | 0.85 | 0 | 0 | awesome-app | 334 | 2023-07-17T08:22:15.782676 | BSD-3-Clause | true | 9eb521ecffdd42ad6597522adfe3879b
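The flow above links steps through `requires`/`provides` variables: for example, Cook consumes `coffee` and `ingredients` produced by other steps, and steps may produce a variable after the step that needs it (the orchestrator resolves ordering). A sketch of a coverage check that every required variable is provided by some step in the flow (the helper and data layout are illustrative, not part of the Flow Orchestration API):

```python
def unsatisfied_requires(flow: dict) -> set:
    """Return variables some step requires that no step in the flow
    provides. Only overall coverage is checked, not ordering, since
    steps may legally provide variables out of declaration order."""
    provided, required = set(), set()
    for step in flow.get("steps", []):
        provided.update(step.get("provides", []))
        required.update(step.get("requires", []))
    return required - provided

flow = {
    "steps": [
        {"goal": "Make coffee", "requires": ["coffee_bean"], "provides": ["coffee"]},
        {"goal": "Select coffee bean", "provides": ["coffee_bean"]},
        {"goal": "Recipe", "provides": ["ingredients"]},
        {"goal": "Cook", "requires": ["coffee", "ingredients"], "provides": ["breakfast"]},
    ]
}
```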
Read-only.\n '@odata.type':\n type: string\n contentType:\n type: string\n description: The MIME type.\n nullable: true\n isInline:\n type: boolean\n description: 'true if the attachment is an inline attachment; otherwise, false.'\n lastModifiedDateTime:\n pattern: '^[0-9]{4,}-(0[1-9]|1[012])-(0[1-9]|[12][0-9]|3[01])T([01][0-9]|2[0-3]):[0-5][0-9]:[0-5][0-9]([.][0-9]{1,12})?(Z|[+-][0-9][0-9]:[0-9][0-9])$'\n type: string\n description: 'The Timestamp type represents date and time information using ISO 8601 format and is always in UTC time. For example, midnight UTC on Jan 1, 2014 is 2014-01-01T00:00:00Z'\n format: date-time\n nullable: true\n name:\n type: string\n description: The attachment's file name.\n nullable: true\n size:\n maximum: 2147483647\n minimum: -2147483648\n type: number\n description: The length of the attachment in bytes.\n format: int32\n microsoft.graph.extension:\n title: extension\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n microsoft.graph.multiValueLegacyExtendedProperty:\n title: multiValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. Read-only.\n '@odata.type':\n type: string\n value:\n type: array\n items:\n type: string\n nullable: true\n description: A collection of property values.\n microsoft.graph.singleValueLegacyExtendedProperty:\n title: singleValueLegacyExtendedProperty\n required:\n - '@odata.type'\n type: object\n properties:\n id:\n type: string\n description: The unique identifier for an entity. 
Read-only.\n '@odata.type':\n type: string\n value:\n type: string\n description: A property value.\n nullable: true\n microsoft.graph.messageCollectionResponse:\n title: Base collection pagination and count responses\n type: object\n properties:\n '@odata.count':\n type: integer\n format: int64\n nullable: true\n '@odata.nextLink':\n type: string\n nullable: true\n value:\n type: array\n items:\n $ref: '#/components/schemas/microsoft.graph.message'\n microsoft.graph.emailAddress:\n title: emailAddress\n required:\n - '@odata.type'\n type: object\n properties:\n address:\n type: string\n description: The email address of the person or entity.\n nullable: true\n name:\n type: string\n description: The display name of the person or entity.\n nullable: true\n '@odata.type':\n type: string\n description: The recipient's email address.\n microsoft.graph.bodyType:\n title: bodyType\n enum:\n - text\n - html\n type: string\n description: The type of the content. Possible values are text and html.\n microsoft.graph.dateTimeTimeZone:\n title: dateTimeTimeZone\n required:\n - '@odata.type'\n type: object\n properties:\n dateTime:\n type: string\n description: 'A single point of time in a combined date and time representation ({date}T{time}; for example, 2017-08-29T04:00:00.0000000).'\n timeZone:\n type: string\n description: 'Represents a time zone, for example, ''Pacific Standard Time''. See below for more possible values.'\n nullable: true\n '@odata.type':\n type: string\n description: The date and time that the follow-up was finished.\n microsoft.graph.followupFlagStatus:\n title: followupFlagStatus\n enum:\n - notFlagged\n - complete\n - flagged\n type: string\n description: 'The status for follow-up for an item. 
Possible values are notFlagged, complete, and flagged.'\n responses:\n microsoft.graph.messageCollectionResponse:\n description: Retrieved collection\n content:\n application/json:\n schema:\n $ref: '#/components/schemas/microsoft.graph.messageCollectionResponse'\n parameters:\n top:\n name: $top\n in: query\n description: Show only the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n example: 50\n skip:\n name: $skip\n in: query\n description: Skip the first n items\n style: form\n explode: false\n schema:\n minimum: 0\n type: integer\n search:\n name: $search\n in: query\n description: Search items by search phrases\n style: form\n explode: false\n schema:\n type: string\n filter:\n name: $filter\n in: query\n description: Filter items by property values\n style: form\n explode: false\n schema:\n type: string\n count:\n name: $count\n in: query\n description: Include count of items\n style: form\n explode: false\n schema:\n type: boolean\n requestBodies:\n sendMailRequestBody:\n description: Action parameters\n content:\n application/json:\n schema:\n type: object\n properties:\n Message:\n $ref: '#/components/schemas/microsoft.graph.message'\n SaveToSentItems:\n type: boolean\n default: false\n nullable: true\n required: true\n | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\src\Functions\Functions.UnitTests\OpenApi\TestPlugins\messages-openapi.yml | messages-openapi.yml | YAML | 20,759 | 0.95 | 0.064327 | 0 | node-utils | 87 | 2025-02-18T13:35:10.249212 | MIT | true | a39ce52b4dbd09812535090331e93cd5 |
---\nversion: '3.4'\nservices:\n weaviate:\n\n image: semitechnologies/weaviate:1.21.2\n links:\n - "contextionary:contextionary"\n ports:\n - 8080:8080\n restart: on-failure:0\n environment:\n LOG_LEVEL: "debug"\n CONTEXTIONARY_URL: contextionary:9999\n QUERY_DEFAULTS_LIMIT: 25\n AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'\n PERSISTENCE_DATA_PATH: "./data"\n DEFAULT_VECTORIZER_MODULE: text2vec-contextionary\n ENABLE_MODULES: text2vec-contextionary,backup-filesystem,generative-openai\n BACKUP_FILESYSTEM_PATH: "/tmp/backups"\n CLUSTER_GOSSIP_BIND_PORT: "7100"\n CLUSTER_DATA_BIND_PORT: "7101"\n AUTHENTICATION_APIKEY_ALLOWED_KEYS: 'my-secret-key'\n healthcheck:\n test: [ "CMD", "curl", "-f", "http://localhost:8080/v1" ]\n interval: 1m\n timeout: 10s\n retries: 5\n start_period: 5s\n contextionary:\n image: semitechnologies/contextionary:en0.16.0-v1.2.0\n environment:\n LOG_LEVEL: "debug"\n OCCURRENCE_WEIGHT_LINEAR_FACTOR: 0.75\n EXTENSIONS_STORAGE_MODE: weaviate\n EXTENSIONS_STORAGE_ORIGIN: http://weaviate:8080\n NEIGHBOR_OCCURRENCE_IGNORE_PERCENTILE: 5\n ENABLE_COMPOUND_SPLITTING: 'false'\n | dataset_sample\yaml\microsoft_semantic-kernel\dotnet\src\IntegrationTests\Connectors\Memory\Weaviate\docker-compose.yml | docker-compose.yml | YAML | 1,220 | 0.8 | 0 | 0 | python-kit | 250 | 2025-01-22T15:41:38.168442 | GPL-3.0 | true | 596db006143f58c0e3b93ac9b3ae86c1 |
# yaml-language-server: $schema=https://golangci-lint.run/jsonschema/custom-gcl.jsonschema.json\n\nversion: v2.0.2\n\ndestination: ./_tools\n\nplugins:\n - module: 'github.com/microsoft/typescript-go/_tools'\n import: 'github.com/microsoft/typescript-go/_tools/customlint'\n path: ./_tools\n | dataset_sample\yaml\microsoft_typescript-go\.custom-gcl.yml | .custom-gcl.yml | YAML | 288 | 0.95 | 0 | 0.142857 | vue-tools | 645 | 2025-02-23T22:12:44.894076 | GPL-3.0 | false | 1587a292ff12b43fded9a668694efe78 |
# yaml-language-server: $schema=https://golangci-lint.run/jsonschema/golangci.jsonschema.json\n\nversion: '2'\n\nrun:\n allow-parallel-runners: true\n\nlinters:\n default: none\n enable:\n # Enabled\n - asasalint\n - bidichk\n - bodyclose\n - canonicalheader\n - copyloopvar\n - customlint\n - durationcheck\n - errcheck\n - errchkjson\n - errname\n - errorlint\n - fatcontext\n - gocheckcompilerdirectives\n - goprintffuncname\n - govet\n - grouper\n - inamedparam\n - ineffassign\n - intrange\n - makezero\n - mirror\n - misspell\n - musttag\n - nakedret\n - nolintlint\n - paralleltest\n - perfsprint\n - predeclared\n - reassign\n - testableexamples\n - tparallel\n - unconvert\n - usestdlibvars\n - usetesting\n\n # Need to add headers\n # - goheader\n\n # Check\n # - exhaustive\n # - gocritic\n # - gosec\n # - revive\n # - staticcheck\n # - testifylint\n # - unparam\n # - unused\n\n settings:\n custom:\n customlint:\n type: module\n\n exclusions:\n presets:\n - comments\n - std-error-handling\n\nissues:\n max-issues-per-linter: 0\n max-same-issues: 0\n | dataset_sample\yaml\microsoft_typescript-go\.golangci.yml | .golangci.yml | YAML | 1,165 | 0.8 | 0 | 0.203125 | awesome-app | 224 | 2024-09-06T13:17:08.010739 | MIT | false | c3d23c679f19c0f571625739b01a5613 |
version: 2\nupdates:\n - package-ecosystem: "cargo"\n directory: "/"\n schedule:\n interval: "daily"\n - package-ecosystem: "github-actions"\n directory: "/"\n schedule:\n interval: "daily"\n | dataset_sample\yaml\microsoft_windows-rs\.github\dependabot.yml | dependabot.yml | YAML | 205 | 0.7 | 0 | 0 | vue-tools | 319 | 2025-06-14T06:20:20.045396 | GPL-3.0 | false | 60b8abf18096f3ee049522e0cbfce57a |
name: Fix environment\ndescription: GitHub VMs aren't configured correctly\nruns:\n using: "composite"\n steps:\n - name: Configure Cargo for GNU\n shell: pwsh\n run: |\n Add-Content $env:USERPROFILE\.cargo\config @"\n [target.x86_64-pc-windows-gnu]\n linker = `"C:\\msys64\\mingw64\\bin\\x86_64-w64-mingw32-gcc.exe`"\n ar = `"C:\\msys64\\mingw64\\bin\\ar.exe`"\n [target.i686-pc-windows-gnu]\n linker = `"C:\\msys64\\mingw32\\bin\\i686-w64-mingw32-gcc.exe`"\n ar = `"C:\\msys64\\mingw32\\bin\\ar.exe`"\n "@\n if: contains(matrix.target, 'windows-gnu')\n - name: Configure environment\n shell: pwsh\n run: |\n $vs_root = & "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe" `\n -latest -property installationPath -format value\n\n switch -Wildcard ("${{ matrix.target }}")\n {\n "*-pc-windows-gnu"\n {\n "C:\msys64\mingw64\bin;C:\msys64\mingw32\bin" >> $env:GITHUB_PATH\n }\n "i686*"\n {\n "${env:ProgramFiles(x86)}\Windows Kits\10\bin\10.0.26100.0\x86" >> $env:GITHUB_PATH\n ((Resolve-Path "$vs_root\VC\Tools\MSVC\*\bin\Hostx86\x86")\n | Sort-Object -Descending | Select -First 1).ToString() >> $env:GITHUB_PATH\n }\n "x86_64*"\n {\n "${env:ProgramFiles(x86)}\Windows Kits\10\bin\10.0.26100.0\x64" >> $env:GITHUB_PATH\n ((Resolve-Path "$vs_root\VC\Tools\MSVC\*\bin\Hostx64\x64")\n | Sort-Object -Descending | Select -First 1).ToString() >> $env:GITHUB_PATH\n }\n "aarch64*"\n {\n "${env:ProgramFiles(x86)}\Windows Kits\10\bin\10.0.26100.0\arm64" >> $env:GITHUB_PATH\n ((Resolve-Path "$vs_root\VC\Tools\MSVC\*\bin\Hostx64\arm64")\n | Sort-Object -Descending | Select -First 1).ToString() >> $env:GITHUB_PATH\n }\n "*"\n {\n (Join-Path $env:GITHUB_WORKSPACE "target\debug\deps").ToString() >> $env:GITHUB_PATH\n (Join-Path $env:GITHUB_WORKSPACE "target\test\debug\deps").ToString() >> $env:GITHUB_PATH\n "INCLUDE=${env:ProgramFiles(x86)}\Windows Kits\10\include\10.0.26100.0\winrt;${env:ProgramFiles(x86)}\Windows Kits\10\include\10.0.26100.0\cppwinrt" `\n >> $env:GITHUB_ENV\n }\n }\n # Workaround to address several issues with windows-2022 runners:\n # - Old mingw-w64-* binutils that prevent deterministic builds.\n # - Missing llvm-dlltool in the native Windows LLVM package.\n # - Missing mingw-w64 compiler packages.\n - name: Update packages\n shell: pwsh\n run: |\n C:\msys64\usr\bin\pacman.exe --sync --refresh --noconfirm mingw-w64-x86_64-binutils\n C:\msys64\usr\bin\pacman.exe --sync --refresh --noconfirm mingw-w64-x86_64-llvm\n C:\msys64\usr\bin\pacman.exe --sync --refresh --noconfirm mingw-w64-i686-gcc\n C:\msys64\usr\bin\pacman.exe --sync --refresh --noconfirm mingw-w64-x86_64-gcc\n if: ${{ !contains(matrix.runner, 'windows-11-arm') }}\n | dataset_sample\yaml\microsoft_windows-rs\.github\actions\fix-environment\action.yml | action.yml | YAML | 3,117 | 0.8 | 0.059701 | 0.060606 | react-lib | 180 | 2025-03-17T02:14:48.623245 | Apache-2.0 | false | 089fc2597e4d3d59c5feb7c03a14ef99 |
name: Bug report\ndescription: File a bug report\nlabels: [bug]\nbody:\n - type: textarea\n attributes:\n label: Summary\n description: >\n Please provide a short summary of the bug, along with any information\n you feel is relevant to replicating the bug.\n - type: textarea\n attributes:\n label: Crate manifest\n description: >\n Please provide the Cargo manifest (Cargo.toml) needed to compile your crate.\n render: TOML\n - type: textarea\n attributes:\n label: Crate code\n description: >\n Please provide all code needed to reproduce the issue.\n render: Rust\n | dataset_sample\yaml\microsoft_windows-rs\.github\ISSUE_TEMPLATE\bug_report.yml | bug_report.yml | YAML | 626 | 0.7 | 0 | 0 | python-kit | 463 | 2023-07-11T08:33:21.084644 | MIT | false | 2dcbe60714a5116b562ae6845de9e90c |
blank_issues_enabled: false\ncontact_links:\n - name: Windows API question\n url: https://docs.microsoft.com/en-us/answers/topics/windows-api.html\n about: Please ask and answer questions about the Windows API on Microsoft Q&A\n | dataset_sample\yaml\microsoft_windows-rs\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 230 | 0.8 | 0 | 0 | node-utils | 195 | 2024-02-25T23:19:36.425519 | GPL-3.0 | false | f896302582e1d24f124f2efec3eb34da |
name: Feature request\ndescription: Suggest an improvement\nlabels: [enhancement]\nbody:\n - type: textarea\n attributes:\n label: Suggestion\n description: >\n Please share your suggestion here. Be sure to include all necessary context.\n | dataset_sample\yaml\microsoft_windows-rs\.github\ISSUE_TEMPLATE\feature_request.yml | feature_request.yml | YAML | 251 | 0.7 | 0 | 0 | react-lib | 366 | 2024-10-09T10:28:04.475410 | Apache-2.0 | false | 52ab0b9bc9d32f2a8f93a892a27cc079 |
name: clippy\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n check:\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Update toolchain\n run: rustup update --no-self-update nightly && rustup default nightly-x86_64-pc-windows-msvc\n - name: Add toolchain target\n run: rustup target add x86_64-pc-windows-msvc\n - name: Install clippy\n run: rustup component add clippy\n - name: Fix environment\n uses: ./.github/actions/fix-environment\n - name: Check cppwinrt\n run: cargo clippy -p cppwinrt\n - name: Check windows\n run: cargo clippy -p windows\n - name: Check windows-bindgen\n run: cargo clippy -p windows-bindgen\n - name: Check windows-collections\n run: cargo clippy -p windows-collections\n - name: Check windows-core\n run: cargo clippy -p windows-core\n - name: Check windows-ecma335\n run: cargo clippy -p windows-ecma335\n - name: Check windows-future\n run: cargo clippy -p windows-future\n - name: Check windows-implement\n run: cargo clippy -p windows-implement\n - name: Check windows-interface\n run: cargo clippy -p windows-interface\n - name: Check windows-link\n run: cargo clippy -p windows-link\n - name: Check windows-numerics\n run: cargo clippy -p windows-numerics\n - name: Check windows-registry\n run: cargo clippy -p windows-registry\n - name: Check windows-result\n run: cargo clippy -p windows-result\n - name: Check windows-strings\n run: cargo clippy -p windows-strings\n - name: Check windows-sys\n run: cargo clippy -p windows-sys\n - name: Check windows-targets\n run: cargo clippy -p windows-targets\n - name: Check windows-version\n run: cargo clippy -p windows-version\n - name: Check helpers\n run: cargo clippy -p helpers\n - name: Check tool_bindgen\n run: cargo clippy -p tool_bindgen\n - name: Check tool_bindings\n run: cargo clippy -p tool_bindings\n - name: Check tool_gnu\n run: cargo clippy -p tool_gnu\n - name: Check tool_license\n run: cargo clippy -p tool_license\n - name: Check tool_merge\n run: cargo clippy -p tool_merge\n - name: Check tool_msvc\n run: cargo clippy -p tool_msvc\n - name: Check tool_standalone\n run: cargo clippy -p tool_standalone\n - name: Check tool_test_all\n run: cargo clippy -p tool_test_all\n - name: Check tool_workspace\n run: cargo clippy -p tool_workspace\n - name: Check tool_yml\n run: cargo clippy -p tool_yml | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\clippy.yml | clippy.yml | YAML | 2,865 | 0.8 | 0 | 0 | node-utils | 197 | 2024-01-31T10:48:39.780848 | GPL-3.0 | false | 03dbaca174b27a09e61084f86787476c |
name: cross\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\nenv:\n LLVM-MINGW-TOOLCHAIN-NAME: llvm-mingw-20220906-ucrt-ubuntu-18.04-x86_64\n\njobs:\n check:\n strategy:\n matrix:\n #\n # i686 cross compilation requires mingw packages that are configured\n # to use DWARF-2 exception handling (vice SJLJ).\n #\n # The mingw package ecosystem is fragmented so we avoid that target\n # for now. (e.g. macOS hosts will require mingw recompile)\n #\n # See also:\n # https://github.com/rust-lang/rust/issues/79577\n # https://sourceforge.net/p/mingw-w64/wiki2/Exception%20Handling\n #\n image: [macos-14, ubuntu-22.04]\n target: [x86_64-pc-windows-gnu, aarch64-pc-windows-gnullvm, x86_64-pc-windows-gnullvm, i686-pc-windows-gnullvm]\n runs-on: ${{ matrix.image }}\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Update toolchain\n run: rustup update --no-self-update stable && rustup default stable\n\n - name: Add toolchain target\n run: rustup target add ${{ matrix.target }}\n\n - name: Install gcc-mingw-w64-x86-64\n run: sudo apt-get install -y gcc-mingw-w64-x86-64\n if: startsWith(matrix.image, 'ubuntu-') && matrix.target == 'x86_64-pc-windows-gnu'\n\n - name: Install mingw-w64\n run: brew install mingw-w64\n if: startsWith(matrix.image, 'macos-') && matrix.target == 'x86_64-pc-windows-gnu'\n\n - name: LLVM MinGW toolchain cache configuration\n id: cache-llvm-mingw-toolchain\n uses: actions/cache@v4\n if: startsWith(matrix.image, 'ubuntu-') && contains(matrix.target, 'gnullvm')\n with:\n path: ${{ env.LLVM-MINGW-TOOLCHAIN-NAME }}\n key: ${{ env.LLVM-MINGW-TOOLCHAIN-NAME }}\n\n - name: Install LLVM MinGW toolchain\n if: startsWith(matrix.image, 'ubuntu-') && contains(matrix.target, 'gnullvm') && steps.cache-llvm-mingw-toolchain.outputs.cache-hit != 'true'\n run: |\n curl -L -o ${{ env.LLVM-MINGW-TOOLCHAIN-NAME }}.tar.xz https://github.com/mstorsjo/llvm-mingw/releases/download/20220906/${{ env.LLVM-MINGW-TOOLCHAIN-NAME }}.tar.xz\n tar -xf ${{ env.LLVM-MINGW-TOOLCHAIN-NAME }}.tar.xz\n echo "${{ env.LLVM-MINGW-TOOLCHAIN-NAME }}/bin" >> $GITHUB_PATH\n \n - name: Add LLVM MinGW toolchain to PATH\n if: startsWith(matrix.image, 'ubuntu-') && contains(matrix.target, 'gnullvm')\n run: |\n echo "${{ env.LLVM-MINGW-TOOLCHAIN-NAME }}/bin" >> $GITHUB_PATH\n\n - name: Test\n shell: pwsh\n if: startsWith(matrix.image, 'ubuntu-') && contains(matrix.target, 'gnullvm') || endsWith(matrix.target, 'gnu')\n run: |\n cargo test --no-run --target ${{ matrix.target }} -p test_win32\n if (-Not (Resolve-Path "target/*/debug/deps/test_win32-*.exe" | Test-Path)) {\n throw "Failed to find test_win32 executable."\n }\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\cross.yml | cross.yml | YAML | 3,106 | 0.95 | 0.097561 | 0.157143 | react-lib | 703 | 2025-06-28T22:44:06.950804 | GPL-3.0 | false | c23826cf169e3208b85afbc77cfd9055 |
name: debugger_visualizer\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\nenv:\n RUSTFLAGS: --cfg windows_debugger_visualizer\n\njobs:\n check:\n runs-on: windows-2022\n\n strategy:\n matrix:\n include:\n - target: x86_64-pc-windows-msvc\n - target: i686-pc-windows-msvc\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Update toolchain\n run: rustup update --no-self-update nightly && rustup default nightly-${{ matrix.target }}\n - name: Add toolchain target\n run: rustup target add ${{ matrix.target }}\n\n - name: Test\n run: cargo test -p test_debugger_visualizer -- --test-threads=1\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\debugger_visualizer.yml | debugger_visualizer.yml | YAML | 828 | 0.8 | 0 | 0 | node-utils | 255 | 2024-10-13T09:03:19.249020 | MIT | false | 37b62c4510d343671d99094707319c02 |
name: docs\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n windows:\n name: windows\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Check\n run: cargo doc --no-deps -p windows\n\n windows-sys:\n name: windows-sys\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Check\n run: cargo doc --no-deps -p windows-sys\n\n other-crates:\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Check\n run: >\n cargo doc --no-deps\n -p windows-bindgen\n -p windows-core\n -p cppwinrt\n -p windows-implement\n -p windows-interface\n -p windows-registry\n -p windows-result\n -p windows-strings\n -p windows-targets\n -p windows-version\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\doc.yml | doc.yml | YAML | 1,066 | 0.8 | 0 | 0 | awesome-app | 985 | 2023-12-11T07:20:42.535809 | GPL-3.0 | false | ba8460cf6b1299243c987a9c5676754d |
name: fmt\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n check:\n runs-on: ubuntu-22.04\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Update toolchain\n run: rustup update --no-self-update stable && rustup default stable\n - name: Check\n run: cargo fmt --all -- --check\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\fmt.yml | fmt.yml | YAML | 487 | 0.8 | 0 | 0 | react-lib | 385 | 2023-11-09T12:35:01.146621 | BSD-3-Clause | false | 026b15c99929c5f7a3a2fa02fe4e3a64 |
name: gen\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n check:\n name: tool_${{ matrix.tool }}\n runs-on: ubuntu-22.04\n strategy:\n matrix:\n tool: [bindgen, bindings, yml, license, standalone, workspace]\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Update toolchain\n run: rustup update --no-self-update stable && rustup default stable\n - name: Run\n run: cargo run -p tool_${{ matrix.tool }} --release\n - name: Check\n shell: bash\n run: |\n git add -N .\n git diff --exit-code || (echo '::error::Generated `tool_${{ matrix.tool }}` is out-of-date. Please run `cargo run -p tool_${{ matrix.tool }}`'; exit 1)\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\gen.yml | gen.yml | YAML | 878 | 0.8 | 0 | 0 | awesome-app | 244 | 2024-11-04T13:55:10.362162 | Apache-2.0 | false | f0ae538de73f2917f85c91bf8feabfcb |
name: lib\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n check:\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Fix environment\n uses: ./.github/actions/fix-environment\n\n - name: Build gnu libs\n shell: cmd\n run: |\n set PATH=C:\msys64\mingw64\bin;%PATH%\n cargo run -p tool_gnu -- all\n\n - name: Find Visual Studio\n id: visual-studio\n shell: pwsh\n run: |\n $path = & "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe" `\n -latest -property installationPath -format value\n "install_path=$path" | Out-File -FilePath $env:GITHUB_OUTPUT -Append\n\n - name: Build i686_msvc\n shell: cmd\n run: |\n call "${{steps.visual-studio.outputs.install_path}}\VC\Auxiliary\Build\vcvars32.bat" x86\n cargo run -p tool_msvc\n\n - name: Build x86_64_msvc\n shell: cmd\n run: |\n call "${{steps.visual-studio.outputs.install_path}}\VC\Auxiliary\Build\vcvars32.bat" amd64\n cargo run -p tool_msvc\n\n - name: Build aarch64_msvc\n shell: cmd\n run: |\n call "${{steps.visual-studio.outputs.install_path}}\VC\Auxiliary\Build\vcvars32.bat" amd64_arm64\n cargo run -p tool_msvc\n\n - name: Upload libs\n uses: actions/upload-artifact@v4\n with:\n name: libs\n path: crates/targets/*/lib/*\n\n - name: Check diff\n shell: bash\n run: |\n git add -N .\n git diff --exit-code crates/targets || (echo '::error::Generated target libs are out-of-date.'; exit 1)\n\n - name: Check dumpbin\n shell: pwsh\n run: |\n $VisualStudioRoot = "${{steps.visual-studio.outputs.install_path}}"\n $DumpbinPath = Resolve-Path "$VisualStudioRoot\VC\Tools\MSVC\*\bin\*\x86\dumpbin.exe" |\n Select -ExpandProperty Path -First 1\n $Tests = @(\n [Tuple]::Create("aarch64_msvc","AA64"),\n [Tuple]::Create("x86_64_msvc","8664"),\n [Tuple]::Create("i686_msvc","14C")\n )\n foreach($Test in $Tests) {\n $Target = $Test.Item1\n $Magic = $Test.Item2\n $Output = [string](& $DumpbinPath /headers crates/targets/$Target/lib/windows.0.53.0.lib)\n if($Output -match "Machine\s*: $Magic" -ne $True) {\n Write-Error "Import lib check failed for $Target ($Magic)."\n Exit 1\n }\n }\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\lib.yml | lib.yml | YAML | 2,690 | 0.8 | 0.022727 | 0 | react-lib | 941 | 2024-12-22T14:07:26.823694 | MIT | false | 1387cb63a5aff7fc1e9fd8cd2a6bf3c2 |
name: linux\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n check:\n runs-on: ubuntu-22.04\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Update toolchain\n run: rustup update --no-self-update stable && rustup default stable\n - name: Run cargo test\n run: cargo test -p test_linux --target x86_64-unknown-linux-gnu\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\linux.yml | linux.yml | YAML | 530 | 0.8 | 0 | 0 | react-lib | 627 | 2024-07-30T20:59:22.034081 | Apache-2.0 | false | 6cfa5b3439f5a3af14585bdbcd8faa1d |
name: miri\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n check:\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Update toolchain\n run: rustup update --no-self-update nightly && rustup default nightly-x86_64-pc-windows-msvc\n - name: Add toolchain target\n run: rustup target add x86_64-pc-windows-msvc\n - name: Add miri\n run: rustup toolchain install nightly --component miri\n - name: Check string literals\n run: cargo miri test -p test_strings --test literals | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\miri.yml | miri.yml | YAML | 725 | 0.8 | 0 | 0 | awesome-app | 469 | 2025-06-13T20:03:25.032066 | Apache-2.0 | false | 287e774f799601c390847d08fa3ee2aa |
name: msrv\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n check:\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check cppwinrt\n run: cargo check -p cppwinrt --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows\n run: cargo check -p windows --features Globalization,Win32_Graphics_Direct2D\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-bindgen\n run: cargo check -p windows-bindgen --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-collections\n run: cargo check -p windows-collections --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-core\n run: cargo check -p windows-core --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.82 && rustup default 1.82\n - name: Check windows-ecma335\n run: cargo check -p windows-ecma335 --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-future\n run: cargo check -p windows-future --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-implement\n run: cargo check -p windows-implement --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-interface\n run: cargo check -p windows-interface --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.71 && rustup default 1.71\n - name: Check windows-link\n run: cargo check -p windows-link --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-numerics\n run: cargo check -p windows-numerics --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-registry\n run: cargo check -p windows-registry --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-result\n run: cargo check -p windows-result --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-strings\n run: cargo check -p windows-strings --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.60 && rustup default 1.60\n - name: Check windows-sys\n run: cargo check -p windows-sys --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.60 && rustup default 1.60\n - name: Check windows-targets\n run: cargo check -p windows-targets --all-features\n - name: Rust version\n run: rustup update --no-self-update 1.74 && rustup default 1.74\n - name: Check windows-version\n run: cargo check -p windows-version --all-features | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\msrv.yml | msrv.yml | YAML | 3,635 | 0.8 | 0 | 0 | python-kit | 82 | 2024-11-10T22:33:21.760806 | GPL-3.0 | false | db9d03da9ab992d0ab6788490ed10a70 |
name: no-default-features\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n check:\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Update toolchain\n run: rustup update --no-self-update nightly && rustup default nightly-x86_64-pc-windows-msvc\n - name: Add toolchain target\n run: rustup target add x86_64-pc-windows-msvc\n - name: Fix environment\n uses: ./.github/actions/fix-environment\n - name: Check cppwinrt\n run: cargo check -p cppwinrt --no-default-features\n - name: Check windows\n run: cargo check -p windows --no-default-features\n - name: Check windows-bindgen\n run: cargo check -p windows-bindgen --no-default-features\n - name: Check windows-collections\n run: cargo check -p windows-collections --no-default-features\n - name: Check windows-core\n run: cargo check -p windows-core --no-default-features\n - name: Check windows-ecma335\n run: cargo check -p windows-ecma335 --no-default-features\n - name: Check windows-future\n run: cargo check -p windows-future --no-default-features\n - name: Check windows-implement\n run: cargo check -p windows-implement --no-default-features\n - name: Check windows-interface\n run: cargo check -p windows-interface --no-default-features\n - name: Check windows-link\n run: cargo check -p windows-link --no-default-features\n - name: Check windows-numerics\n run: cargo check -p windows-numerics --no-default-features\n - name: Check windows-registry\n run: cargo check -p windows-registry --no-default-features\n - name: Check windows-result\n run: cargo check -p windows-result --no-default-features\n - name: Check windows-strings\n run: cargo check -p windows-strings --no-default-features\n - name: Check windows-sys\n run: cargo check -p windows-sys --no-default-features\n - name: Check windows-targets\n run: cargo check -p windows-targets 
--no-default-features\n - name: Check windows-version\n run: cargo check -p windows-version --no-default-features | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\no-default-features.yml | no-default-features.yml | YAML | 2,351 | 0.8 | 0 | 0 | awesome-app | 477 | 2025-05-17T10:26:51.975728 | MIT | false | bf3b950261e1ee827d9d44749e57b872 |
name: no_std\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\njobs:\n check:\n strategy:\n matrix:\n rust: [stable, nightly]\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n - name: Prepare\n run: rustup update --no-self-update ${{ matrix.rust }} && rustup default ${{ matrix.rust }}\n - name: Check\n run: cargo check -p test_no_std\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\no_std.yml | no_std.yml | YAML | 565 | 0.8 | 0 | 0 | python-kit | 314 | 2024-01-04T05:49:20.187686 | BSD-3-Clause | false | 11d73c3072fa549a49bdbc4f9a6d1e2a |
name: slim_errors\n\non:\n pull_request:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n push:\n paths-ignore:\n - '.github/ISSUE_TEMPLATE/**'\n - 'web/**'\n branches:\n - master\n\nenv:\n RUSTFLAGS: --cfg windows_slim_errors\n\njobs:\n check:\n strategy:\n matrix:\n include:\n - target: x86_64-pc-windows-msvc\n - target: i686-pc-windows-msvc\n - target: x86_64-pc-windows-gnu\n - target: i686-pc-windows-gnu\n runs-on: windows-2022\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n \n - name: Update toolchain\n run: rustup update --no-self-update nightly && rustup default nightly-${{ matrix.target }}\n \n - name: Add toolchain target\n run: rustup target add ${{ matrix.target }}\n\n - name: Fix environment\n uses: ./.github/actions/fix-environment\n \n - name: Test\n run: cargo test -p test_result\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\slim_errors.yml | slim_errors.yml | YAML | 1,018 | 0.8 | 0 | 0 | vue-tools | 647 | 2024-03-18T10:02:52.497302 | MIT | false | caa121fa5bba31d3c7a120d0e036e069 |
name: Mark stale issues and pull requests\n\non:\n schedule:\n - cron: '0 0 * * *'\n\njobs:\n stale:\n\n runs-on: ubuntu-22.04\n permissions:\n issues: write\n pull-requests: write\n\n steps:\n - uses: actions/stale@v9\n with:\n repo-token: ${{ secrets.GITHUB_TOKEN }}\n days-before-stale: 10\n days-before-close: 5\n stale-issue-message: 'This issue is stale because it has been open 10 days with no activity. Remove stale label or comment or this will be closed in 5 days.'\n stale-pr-message: 'This pull request is stale because it has been open 10 days with no activity. Remove stale label or comment or this will be closed in 5 days.'\n stale-issue-label: 'no-issue-activity'\n stale-pr-label: 'no-pr-activity'\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\stale.yml | stale.yml | YAML | 772 | 0.7 | 0 | 0 | python-kit | 168 | 2025-06-07T08:48:42.574320 | Apache-2.0 | false | 43db4f7477958c9c53c5409e3331082d |
name: web\n\npermissions:\n contents: write\n\non:\n pull_request:\n paths: [web/**]\n types: [opened, synchronize, reopened, closed]\n push:\n paths: [web/**]\n branches: [master]\n workflow_dispatch:\n\njobs:\n web:\n if: github.repository == 'microsoft/windows-rs'\n runs-on: ubuntu-22.04\n steps:\n - uses: actions/checkout@v4\n with:\n path: current\n\n - uses: actions/setup-node@v4\n with:\n node-version: 'latest'\n\n - name: Build sites\n shell: pwsh\n run: |\n cd current/web\n $sites = Get-ChildItem -Directory\n foreach ($site in $sites) {\n Write-Host "Building $($site.Name)"\n ./build.ps1 $site.Name\n }\n\n - uses: actions/checkout@v4\n with:\n ref: gh-pages\n path: gh-pages\n\n - name: Update gh-pages\n shell: pwsh\n run: |\n cd gh-pages\n\n $sites = Get-ChildItem ../current/web -Directory\n foreach ($site in $sites) {\n Write-Host "Deploying $($site.Name)"\n\n if ("${{ github.event_name }}" -eq "pull_request") {\n if ("${{ github.event.action }}" -eq "closed") {\n Remove-Item -Recurse -Force -ErrorAction Ignore "$($site.Name)/preview/pr-${{ github.event.number }}"\n } else {\n if (Test-Path "$($site.FullName)/build") {\n $previewPath = "$($site.Name)/preview/pr-${{ github.event.number }}"\n New-Item -ItemType Directory -Force $previewPath | Out-Null\n Remove-Item -Recurse -Force -ErrorAction Ignore "$previewPath/*"\n Copy-Item "$($site.FullName)/build/*" $previewPath -Recurse -Force\n }\n }\n } elseif ("${{ github.ref }}" -eq "refs/heads/master") {\n if (Test-Path "$($site.FullName)/build") {\n New-Item -ItemType Directory -Force $site.Name | Out-Null\n Get-ChildItem "$($site.Name)/*" -Exclude "preview" | Remove-Item -Recurse -Force -ErrorAction Ignore\n Copy-Item "$($site.FullName)/build/*" $site.Name -Recurse -Force\n }\n }\n }\n\n git config user.name "github-actions[bot]"\n git config user.email "github-actions[bot]@users.noreply.github.com"\n git remote set-url origin https://github.com/${{ github.repository }}\n\n $status = git status --porcelain\n 
if ($status) {\n git add .\n git commit -m "Update from ${{ github.sha }}"\n git push\n }\n | dataset_sample\yaml\microsoft_windows-rs\.github\workflows\web.yml | web.yml | YAML | 2,695 | 0.8 | 0.074074 | 0 | vue-tools | 514 | 2024-10-04T10:23:37.609671 | MIT | false | ee821819f10d5e98e897fb4ef40b50a0 |
run:\n go: "1.22"\n build-tags:\n - dynamic\n - test\n\nlinters:\n disable-all: true\n enable:\n - gosimple\n - govet\n - ineffassign\n - staticcheck\n - decorder\n - depguard\n - gofmt\n - goimports\n - gosec\n - revive\n - unconvert\n - misspell\n - typecheck\n - durationcheck\n - forbidigo\n - gci\n - whitespace\n - gofumpt\n - gocritic\n\nlinters-settings:\n gci:\n sections:\n - standard\n - default\n - prefix(github.com/milvus-io)\n custom-order: true\n govet:\n enable: # add extra linters\n - nilness\n gofumpt:\n module-path: github.com/milvus-io\n goimports:\n local-prefixes: github.com/milvus-io\n revive:\n rules:\n - name: unused-parameter\n disabled: true\n - name: var-naming\n severity: warning\n disabled: false\n arguments:\n - ["ID", "IDS"] # Allow list\n - name: context-as-argument\n severity: warning\n disabled: false\n arguments:\n - allowTypesBefore: "*testing.T"\n - name: datarace\n severity: warning\n disabled: false\n - name: duplicated-imports\n severity: warning\n disabled: false\n - name: waitgroup-by-value\n severity: warning\n disabled: false\n - name: indent-error-flow\n severity: warning\n disabled: false\n arguments:\n - "preserveScope"\n - name: range-val-in-closure\n severity: warning\n disabled: false\n - name: range-val-address\n severity: warning\n disabled: false\n - name: string-of-int\n severity: warning\n disabled: false\n misspell:\n locale: US\n gocritic:\n enabled-checks:\n - ruleguard\n settings:\n ruleguard:\n failOnError: true\n rules: "rules.go"\n depguard:\n rules:\n main:\n deny:\n - pkg: "errors"\n desc: not allowed, use github.com/cockroachdb/errors\n - pkg: "github.com/pkg/errors"\n desc: not allowed, use github.com/cockroachdb/errors\n - pkg: "github.com/pingcap/errors"\n desc: not allowed, use github.com/cockroachdb/errors\n - pkg: "golang.org/x/xerrors"\n desc: not allowed, use github.com/cockroachdb/errors\n - pkg: "github.com/go-errors/errors"\n desc: not allowed, use github.com/cockroachdb/errors\n - pkg: 
"io/ioutil"\n desc: ioutil is deprecated after 1.16, 1.17, use os and io package instead\n - pkg: "github.com/tikv/client-go/rawkv"\n desc: not allowed, use github.com/tikv/client-go/v2/txnkv\n - pkg: "github.com/tikv/client-go/v2/rawkv"\n desc: not allowed, use github.com/tikv/client-go/v2/txnkv\n - pkg: "github.com/gogo/protobuf"\n desc: "not allowed, gogo protobuf is deprecated"\n - pkg: "github.com/golang/protobuf/proto"\n desc: "not allowed, protobuf v1 is deprecated, use google.golang.org/protobuf/proto instead"\n forbidigo:\n forbid:\n - '^time\.Tick$'\n - 'return merr\.Err[a-zA-Z]+'\n - 'merr\.Wrap\w+\(\)\.Error\(\)'\n - '\.(ErrorCode|Reason) = '\n - 'Reason:\s+\w+\.Error\(\)'\n - 'errors.New\((.+)\.GetReason\(\)\)'\n - 'commonpb\.Status\{[\s\n]*ErrorCode:[\s\n]*.+[\s\S\n]*?\}'\n - 'os\.Open\(.+\)'\n - 'os\.ReadFile\(.+\)'\n - 'os\.WriteFile\(.+\)'\n - "runtime.NumCPU"\n - "runtime.GOMAXPROCS(0)"\n #- 'fmt\.Print.*' WIP\n\nissues:\n exclude-dirs:\n - build\n - configs\n - deployments\n - docs\n - scripts\n - internal/core\n - cmake_build\n - mmap\n - data\n - ci\n exclude-files:\n - partial_search_test.go\n exclude-use-default: false\n exclude-rules:\n - path: .+_test\.go\n linters:\n - forbidigo\n - path: mocks\/(.)+mock_(.+)\.go\n text: "don't use an underscore in package name"\n exclude:\n - should have a package comment\n - should have comment\n - should be of the form\n - should not use dot imports\n - which can be annoying to use\n # Binds to all network interfaces\n - G102\n # Use of unsafe calls should be audited\n - G103\n # Errors unhandled\n - G104\n # file/folder Permission\n - G301\n - G302\n # Potential file inclusion via variable\n - G304\n # Deferring unsafe method like *os.File Close\n - G307\n # TLS MinVersion too low\n - G402\n # Use of weak random number generator math/rand\n - G404\n # Unused parameters\n - SA1019\n # defer return errors\n - SA5001\n # TODO: cleanup following exclusions, added on golangci-lint upgrade\n - sloppyLen\n - 
dupSubExpr\n - assignOp\n - ifElseChain\n - elseif\n - commentFormatting\n - exitAfterDefer\n - captLocal\n - singleCaseSwitch\n - typeSwitchVar\n - indent-error-flow\n - appendAssign\n - deprecatedComment\n - SA9009\n - SA1006\n - S1009\n - offBy1\n - unslice \n # Integer overflow conversion\n - G115\n\n # Maximum issues count per one linter. Set to 0 to disable. Default is 50.\n max-issues-per-linter: 0\n # Maximum count of issues with the same text. Set to 0 to disable. Default is 3.\n max-same-issues: 0\n\nservice:\n # use the fixed version to not introduce new linters unexpectedly\n golangci-lint-version: 1.55.2\n | dataset_sample\yaml\milvus-io_milvus\.golangci.yml | .golangci.yml | YAML | 5,284 | 0.95 | 0 | 0.079208 | python-kit | 90 | 2024-04-05T22:23:04.907684 | MIT | false | 558fcd2d5a43add9db921fc2f0a4faa1 |
# Configuration File for CodeCov\ncodecov:\n require_ci_to_pass: no\n notify:\n require_ci_to_pass: no\n wait_for_ci: false\n ci:\n - jenkins.milvus.io:18080\ncoverage:\n precision: 2\n round: down\n range: "70...100"\n\n status:\n project:\n default:\n target: 77%\n threshold: 0% # Allow the coverage to drop by threshold% and still post a success status.\n patch:\n default:\n target: 80% # target of patch diff\n threshold: 0%\n if_ci_failed: error # success, failure, error, ignore\n\ncomment:\n layout: "reach, diff, flags, components, files"\n behavior: default\n require_changes: false\n branches: # branch names that can post comment\n - master\n\ncomponent_management:\n default_rules: # default rules that will be inherited by all components\n statuses:\n - type: project # every component that doesn't have a status defined will have a project type one\n target: auto\n branches:\n - "!main"\n\n individual_components:\n - component_id: client\n name: Client\n paths:\n - client/**\n\n - component_id: core\n name: Core\n paths:\n - internal/core/**\n\n - component_id: go\n name: Go\n paths:\n - pkg/**\n - internal/**\n - "!internal/core/**" # Exclude core component\n\nignore:\n - "LICENSES"\n - ".git"\n - "*.yml"\n - "*.md"\n - "docs/.*"\n - "**/*.pb.go"\n - "**/*.proto"\n - "internal/metastore/db/dbmodel/mocks/.*"\n - "**/mock_*.go"\n | dataset_sample\yaml\milvus-io_milvus\codecov.yml | codecov.yml | YAML | 1,496 | 0.95 | 0.013699 | 0.016667 | node-utils | 465 | 2023-09-18T12:42:55.681407 | Apache-2.0 | false | c28d93e3f340cfb98956e6db5ba67061 |
version: '3.5'\n\nx-ccache: &ccache\n CCACHE_COMPILERCHECK: content\n CCACHE_COMPRESS: 1\n CCACHE_COMPRESSLEVEL: 5\n CCACHE_MAXSIZE: 2G\n CCACHE_DIR: /ccache\n\nservices:\n builder:\n image: ${IMAGE_REPO}/milvus-env:${OS_NAME}-${DATE_VERSION}\n # Build devcontainer\n build:\n context: .\n dockerfile: build/docker/builder/cpu/${OS_NAME}/Dockerfile\n args:\n TARGETARCH: ${IMAGE_ARCH}\n cache_from:\n - ${IMAGE_REPO}/milvus-env:${OS_NAME}-${LATEST_DATE_VERSION}\n platform: linux/${IMAGE_ARCH}\n shm_size: 2G\n # expose 19530 port so we can directly access milvus inside build container\n # ports:\n # - "19530:19530"\n environment:\n <<: *ccache\n OS_NAME: ${OS_NAME}\n PULSAR_ADDRESS: ${PULSAR_ADDRESS}\n ETCD_ENDPOINTS: ${ETCD_ENDPOINTS}\n MINIO_ADDRESS: ${MINIO_ADDRESS}\n CONAN_USER_HOME: /home/milvus\n AZURE_STORAGE_CONNECTION_STRING: ${AZURITE_CONNECTION_STRING}\n ENABLE_GCP_NATIVE: ${ENABLE_GCP_NATIVE}\n volumes: &builder-volumes\n - .:/go/src/github.com/milvus-io/milvus:delegated\n - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${IMAGE_ARCH}-${OS_NAME}-ccache:/ccache:delegated\n - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${IMAGE_ARCH}-${OS_NAME}-go-mod:/go/pkg/mod:delegated\n - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${IMAGE_ARCH}-${OS_NAME}-vscode-extensions:/home/milvus/.vscode-server/extensions:delegated\n - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${IMAGE_ARCH}-${OS_NAME}-conan:/home/milvus/.conan:delegated\n working_dir: '/go/src/github.com/milvus-io/milvus'\n depends_on:\n - etcd\n - minio\n - pulsar\n - azurite\n - gcpnative\n # Command\n command: &builder-command >\n /bin/bash -c "\n make check-proto-product && make verifiers && make unittest"\n\n gpubuilder:\n image: ${IMAGE_REPO}/milvus-env:gpu-${OS_NAME}-${GPU_DATE_VERSION}\n # Build devcontainer\n build:\n context: .\n dockerfile: build/docker/builder/gpu/${OS_NAME}/Dockerfile\n args:\n TARGETARCH: ${IMAGE_ARCH}\n cache_from:\n - ${IMAGE_REPO}/milvus-env:gpu-${OS_NAME}-${LATEST_GPU_DATE_VERSION}\n # user: {{ CURRENT_ID }}\n 
shm_size: 2G\n # expose 19530 port so we can directly access milvus inside build container\n # ports:\n # - "19530:19530"\n environment:\n <<: *ccache\n OS_NAME: ${OS_NAME}\n PULSAR_ADDRESS: ${PULSAR_ADDRESS}\n ETCD_ENDPOINTS: ${ETCD_ENDPOINTS}\n MINIO_ADDRESS: ${MINIO_ADDRESS}\n CONAN_USER_HOME: /home/milvus\n AZURE_STORAGE_CONNECTION_STRING: ${AZURITE_CONNECTION_STRING}\n ENABLE_GCP_NATIVE: ${ENABLE_GCP_NATIVE}\n volumes: &builder-volumes-gpu\n - .:/go/src/github.com/milvus-io/milvus:delegated\n - ${DOCKER_VOLUME_DIRECTORY:-.docker-gpu}/${OS_NAME}-ccache:/ccache:delegated\n - ${DOCKER_VOLUME_DIRECTORY:-.docker-gpu}/${OS_NAME}-go-mod:/go/pkg/mod:delegated\n - ${DOCKER_VOLUME_DIRECTORY:-.docker-gpu}/${OS_NAME}-vscode-extensions:/home/milvus/.vscode-server/extensions:delegated\n - ${DOCKER_VOLUME_DIRECTORY:-.docker-gpu}/${OS_NAME}-conan:/home/milvus/.conan:delegated\n working_dir: '/go/src/github.com/milvus-io/milvus'\n depends_on:\n - etcd\n - minio\n - pulsar\n - azurite\n - gcpnative\n # Command\n command: &builder-command-gpu >\n /bin/bash -c "\n make check-proto-product && make verifiers && make unittest"\n\n etcd:\n image: milvusdb/etcd:3.5.5-r2\n environment:\n - ALLOW_NONE_AUTHENTICATION=yes\n - ETCD_AUTO_COMPACTION_MODE=revision\n - ETCD_AUTO_COMPACTION_RETENTION=1000\n - ETCD_QUOTA_BACKEND_BYTES=4294967296\n - ETCD_SNAPSHOT_COUNT=50000\n healthcheck:\n test: [ 'CMD', '/opt/bitnami/scripts/etcd/healthcheck.sh' ]\n interval: 30s\n timeout: 20s\n retries: 3\n\n pulsar:\n image: apachepulsar/pulsar:2.8.2\n command: |\n /bin/bash -c \\n "bin/apply-config-from-env.py conf/standalone.conf && \\n exec bin/pulsar standalone --no-functions-worker --no-stream-storage"\n environment:\n # 10MB\n - PULSAR_PREFIX_maxMessageSize=10485760\n # this is 104857600 + 10240 (padding)\n - nettyMaxFrameSizeBytes=104867840\n - PULSAR_GC=-XX:+UseG1GC\n\n minio:\n image: minio/minio:RELEASE.2023-03-20T20-16-18Z\n environment:\n MINIO_ACCESS_KEY: minioadmin\n MINIO_SECRET_KEY: 
minioadmin\n command: minio server /minio_data\n healthcheck:\n test: [ 'CMD', 'curl', '-f', 'http://localhost:9000/minio/health/live' ]\n interval: 30s\n timeout: 20s\n retries: 3\n\n azurite:\n image: mcr.microsoft.com/azure-storage/azurite\n command: azurite-blob --blobHost 0.0.0.0\n\n jaeger:\n image: jaegertracing/all-in-one:latest\n\n gcpnative:\n image: fsouza/fake-gcs-server\n command: -scheme http -public-host storage.gcs.127.0.0.1.nip.io:4443 -external-url "http://storage.gcs.127.0.0.1.nip.io:4443"\n hostname: storage.gcs.127.0.0.1.nip.io\n ports:\n - "4443:4443"\n\nnetworks:\n default:\n name: milvus_dev\n | dataset_sample\yaml\milvus-io_milvus\docker-compose.yml | docker-compose.yml | YAML | 5,008 | 0.95 | 0 | 0.092857 | node-utils | 885 | 2024-06-09T10:14:30.774204 | BSD-3-Clause | false | a580d86d0266b2012959d703fe073996 |